Science.gov

Sample records for additional big images

  1. Cincinnati Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Love, Lonnie J.

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X and shows a cost reduction of over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  2. BigView Image Viewing on Tiled Displays

    NASA Technical Reports Server (NTRS)

    Sandstrom, Timothy

    2007-01-01

    BigView allows for interactive panning and zooming of images of arbitrary size on desktop PCs running Linux. Additionally, it can work in a multi-screen environment where multiple PCs cooperate to view a single, large image. Using this software, one can explore on relatively modest machines images such as the Mars Orbiter Camera mosaic [92,160 x 33,280 pixels]. The images must first be converted into paged format, where the image is stored in 256 x 256 pages to allow rapid movement of pixels into texture memory. The format contains an image pyramid: a set of scaled versions of the original image. Each scaled image is 1/2 the size of the previous, starting with the original down to the smallest, which fits into a single 256 x 256 page.
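
    As a small illustration (not BigView code) of the paged pyramid described above, the following sketch counts the 256 x 256 pages at each pyramid level for the quoted Mars Orbiter Camera mosaic dimensions, halving the image until it fits in a single page; the function name and the use of ceiling division are assumptions.

```python
# Hedged sketch of the paged image-pyramid layout described in the abstract:
# 256x256 tiles, each level half the size of the previous, down to one tile.
import math

def pyramid_levels(width, height, page=256):
    """Yield (level, width, height, pages_x, pages_y) until one page remains."""
    level = 0
    while True:
        pages_x = math.ceil(width / page)
        pages_y = math.ceil(height / page)
        yield level, width, height, pages_x, pages_y
        if pages_x == 1 and pages_y == 1:
            break
        width, height = max(1, width // 2), max(1, height // 2)
        level += 1

# Mars Orbiter Camera mosaic dimensions quoted in the abstract
for lvl, w, h, px, py in pyramid_levels(92160, 33280):
    print(f"level {lvl}: {w}x{h} px -> {px}x{py} pages ({px * py} tiles)")
```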

  3. BIG FROG WILDERNESS STUDY AREA AND ADDITIONS, TENNESSEE AND GEORGIA.

    USGS Publications Warehouse

    Slack, John F.; Gazdik, Gertrude C.

    1984-01-01

    A mineral-resource survey was made of the Big Frog Wilderness Study Area and additions, Tennessee-Georgia. Geochemical sampling found traces of gold, zinc, copper, and arsenic in rocks, stream sediments, and panned concentrates, but not in sufficient quantities to indicate the presence of deposits of these metals. The results of the survey indicate that there is little promise for the occurrence of metallic mineral deposits within the study area. The only apparent resources are nonmetallic commodities including rock suitable for construction materials, and small amounts of sand and gravel; however, these commodities are found in abundance outside the study area. A potential may exist for oil and natural gas at great depths, but this cannot be evaluated by the present study.

  4. AirMSPI PODEX Big Sur Ellipsoid Images

    Atmospheric Science Data Center

    2013-12-11

    AirMSPI browse images from the PODEX 2013 campaign: Big Sur target, 02/03/2013, ellipsoid-projected. For more information, see the Data Product Specifications (DPS).

  5. Small Art Images--Big Art Learning

    ERIC Educational Resources Information Center

    Stephens, Pam

    2005-01-01

    When small art images are incorporated into the curriculum, students are afforded opportunities to slow down, observe minute details, and communicate ideas about art and artists. This sort of purposeful art contemplation takes students beyond the day-to-day educational practice. It is through these sorts of art activities that students develop…

  6. Big Area Additive Manufacturing of High Performance Bonded NdFeB Magnets

    PubMed Central

    Li, Ling; Tirado, Angelica; Nlebedim, I. C.; Rios, Orlando; Post, Brian; Kunc, Vlastimil; Lowden, R. R.; Lara-Curzio, Edgar; Fredette, Robert; Ormerod, John; Lograsso, Thomas A.; Paranthaman, M. Parans

    2016-01-01

    Additive manufacturing allows for the production of complex parts with minimum material waste, offering an effective technique for fabricating permanent magnets, which frequently involve critical rare earth elements. In this report, we demonstrate a novel method - Big Area Additive Manufacturing (BAAM) - to fabricate isotropic near-net-shape NdFeB bonded magnets with magnetic and mechanical properties comparable to or better than those of traditional injection-molded magnets. The starting polymer magnet composite pellets consist of 65 vol% isotropic NdFeB powder and 35 vol% polyamide (Nylon-12). The density of the final BAAM magnet product reached 4.8 g/cm3, and the room-temperature magnetic properties are: intrinsic coercivity Hci = 688.4 kA/m, remanence Br = 0.51 T, and energy product (BH)max = 43.49 kJ/m3 (5.47 MGOe). In addition, tensile tests performed on four dog-bone-shaped specimens yielded an average ultimate tensile strength of 6.60 MPa and an average failure strain of 4.18%. Scanning electron microscopy images of the fracture surfaces indicate that the failure is primarily related to the debonding of the magnetic particles from the polymer binder. The present method significantly simplifies the manufacturing of near-net-shape bonded magnets and enables efficient use of rare earth elements, thus contributing towards enriching the supply of critical materials. PMID:27796339
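
    A quick unit check of the reported energy product (not taken from the paper's analysis): converting 43.49 kJ/m3 to MGOe with the standard factor 1 MGOe = 100/(4π) kJ/m3 reproduces the quoted 5.47 MGOe.

```python
# Unit conversion check for the reported (BH)max, using the standard
# SI-to-CGS factor 1 MGOe = 100/(4*pi) kJ/m^3 ≈ 7.9577 kJ/m^3.
import math

bh_max_kj_per_m3 = 43.49                       # value reported in the abstract
kj_per_m3_per_mgoe = 100.0 / (4.0 * math.pi)
print(bh_max_kj_per_m3 / kj_per_m3_per_mgoe)   # ≈ 5.47 MGOe, matching the abstract
```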

  7. Neural Computations for Biosonar Imaging in the Big Brown Bat

    NASA Astrophysics Data System (ADS)

    Saillant, Prestor Augusto

    1995-11-01

    The study of the intimate relationship between space and time has taken many forms, ranging from the Theory of Relativity down to the problem of avoiding traffic jams. However, nowhere has this relationship been more fully developed and exploited than in dolphins and bats, which have the ability to utilize biosonar. This thesis describes research on the behavioral and computational basis of echolocation carried out in order to explore the neural mechanisms which may account for the space-time constructs which are of psychological importance to the big brown bat. The SCAT (Spectrogram Correlation and Transformation) computational model was developed to provide a framework for understanding the computational requirements of FM echolocation as determined from psychophysical experiments (i.e., high resolution imaging) and neurobiological constraints (Saillant et al., 1993). The second part of the thesis consisted in developing a new behavioral paradigm for simultaneously studying acoustic behavior and flight behavior of big brown bats in pursuit of stationary or moving targets. In the third part of the thesis a complete acoustic "artificial bat" was constructed, making use of the SCAT process. The development of the artificial bat allowed us to begin experimentation with real world echoes from various targets, in order to gain a better appreciation for the additional complexities and sources of information encountered by bats in flight. Finally, the continued development of the SCAT model has allowed a deeper understanding of the phenomenon of "time expansion" and of the phenomenon of phase sensitivity in the ultrasonic range. Time expansion, first predicted through the use of the SCAT model, and later found in auditory local evoked potential recordings, opens up a new realm of information processing and representation in the brain which as of yet has not been considered. It seems possible, from the work in the auditory system, that time expansion may provide a novel

  8. Utility of Big Area Additive Manufacturing (BAAM) For The Rapid Manufacture of Customized Electric Vehicles

    SciTech Connect

    Love, Lonnie J.

    2015-08-01

    This Oak Ridge National Laboratory (ORNL) Manufacturing Development Facility (MDF) technical collaboration project was conducted in two phases as a CRADA with Local Motors Inc. Phase 1 was previously reported as Advanced Manufacturing of Complex Cyber Mechanical Devices through Community Engagement and Micro-manufacturing and demonstrated the integration of components onto a prototype body part for a vehicle. Phase 2 was reported as Utility of Big Area Additive Manufacturing (BAAM) for the Rapid Manufacture of Customized Electric Vehicles and demonstrated the high-profile live printing of an all-electric vehicle using ORNL's Big Area Additive Manufacturing (BAAM) technology. This demonstration generated considerable national attention and successfully demonstrated the capabilities of the BAAM system as developed by ORNL and Cincinnati, Inc. and the feasibility of additive manufacturing of a full-scale electric vehicle as envisioned by the CRADA partner Local Motors, Inc.

  9. Material Development for Tooling Applications Using Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Drye, Tom; Franc, Alan

    2015-03-01

    Techmer Engineered Solutions (TES) is working with Oak Ridge National Laboratory (ORNL) to develop materials and evaluate their use for ORNL's recently developed Big Area Additive Manufacturing (BAAM) system for tooling applications. The first phase of the project established the performance of some commercially available polymer compositions deposited with the BAAM system. Carbon fiber reinforced ABS demonstrated a tensile strength of nearly 10 ksi, which is sufficient for a number of low-temperature tooling applications.

  10. AirMSPI PODEX BigSur Terrain Images

    Atmospheric Science Data Center

    2013-12-13

    AirMSPI browse images from the PODEX 2013 campaign: Big Sur target (Big Sur, California), 02/03/2013, terrain-projected. For more information, see the Data Product Specifications (DPS).

  11. Body image and personality among British men: associations between the Big Five personality domains, drive for muscularity, and body appreciation.

    PubMed

    Benford, Karis; Swami, Viren

    2014-09-01

    The present study examined associations between the Big Five personality domains and measures of men's body image. A total of 509 men from the community in London, UK, completed measures of drive for muscularity, body appreciation, the Big Five domains, and subjective social status, and provided their demographic details. The results of a hierarchical regression showed that, once the effects of participant body mass index (BMI) and subjective social status had been accounted for, men's drive for muscularity was significantly predicted by Neuroticism (β=.29). In addition, taking into account the effects of BMI and subjective social status, men's body appreciation was significantly predicted by Neuroticism (β=-.35) and Extraversion (β=.12). These findings highlight potential avenues for the development of intervention approaches based on the relationship between the Big Five personality traits and body image.

  12. Research on image matching method of big data image of three-dimensional reconstruction

    NASA Astrophysics Data System (ADS)

    Zhang, Chunsen; Qiu, Zhenguo; Zhu, Shihuan; Wang, Xiqi; Xu, Xiaolei; Zhong, Sidong

    2015-12-01

    Image matching is a core step of three-dimensional reconstruction. With the development of computer processing technology, retrieving the image to be matched from large data image sets acquired in different formats, at different scales, and at different locations places new demands on image matching. To establish three-dimensional reconstruction based on image matching from big data images, this paper puts forward a new, effective matching method based on a visual bag-of-words model. The main technologies include building the bag-of-words model and image matching. First, we extract SIFT feature points from the images in the database and cluster the feature points to generate the bag-of-words model. We then establish inverted files based on the bag of words; the inverted files list all images corresponding to each visual word. We perform image matching only among images under the same word to improve matching efficiency. Finally, we build the three-dimensional model from the matched images. Experimental results indicate that this method is able to improve matching efficiency and is suitable for the requirements of big data reconstruction.
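
    A minimal sketch of the retrieval scheme described above (SIFT descriptors, a k-means visual vocabulary, and an inverted file from visual words to images) is given below. It assumes OpenCV built with SIFT support and scikit-learn; the vocabulary size, function names, and grayscale loading are illustrative choices, not the authors' implementation.

```python
# Hedged sketch of bag-of-words image retrieval: SIFT features, a k-means
# vocabulary, and an inverted index from visual words to image ids.
import cv2
import numpy as np
from collections import defaultdict
from sklearn.cluster import KMeans

def build_vocabulary(image_paths, n_words=200):
    sift = cv2.SIFT_create()
    descriptors = []
    for path in image_paths:
        img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, des = sift.detectAndCompute(img, None)
        if des is not None:
            descriptors.append(des)
    vocab = KMeans(n_clusters=n_words, n_init=10).fit(np.vstack(descriptors))
    return sift, vocab

def build_inverted_index(image_paths, sift, vocab):
    index = defaultdict(set)                 # visual word -> set of image ids
    for img_id, path in enumerate(image_paths):
        img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, des = sift.detectAndCompute(img, None)
        if des is None:
            continue
        for word in vocab.predict(des):
            index[word].add(img_id)
    return index
```

    Candidate matches for a query image are then restricted to images sharing its visual words, which is what cuts down the number of pairwise SIFT comparisons.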

  13. The Big Five of Personality and structural imaging revisited: a VBM - DARTEL study.

    PubMed

    Liu, Wei-Yin; Weber, Bernd; Reuter, Martin; Markett, Sebastian; Chu, Woei-Chyn; Montag, Christian

    2013-05-01

    The present study focuses on the neurostructural foundations of the human personality. In a large sample of 227 healthy human individuals (168 women and 59 men), we used MRI to examine the relationship between personality traits and both regional gray and white matter volume, while controlling for age and sex. Personality was assessed using the German version of the NEO Five-Factor Inventory that measures individual differences in the 'Big Five of Personality': extraversion, neuroticism, agreeableness, conscientiousness, and openness to experience. In contrast to most previous studies on neural correlates of the Big Five, we used improved processing strategies: white and gray matter were independently assessed by segmentation steps before data analysis. In addition, customized sex-specific diffeomorphic anatomical registration using exponentiated Lie algebra templates was used. Our results did not show significant correlations between any dimension of the Big Five and regional gray matter volume. However, among others, higher conscientiousness scores correlated significantly with reductions in regional white matter volume in different brain areas, including the right insula, putamen, caudate, and left fusiformis. These correlations were driven by the female subsample. The present study suggests that many results from the literature on the neurostructural basis of personality should be reviewed carefully in light of findings obtained when the sample size is larger, imaging methods are rigorously applied, and sex-related and age-related effects are controlled.

  14. a Hadoop-Based Distributed Framework for Efficient Managing and Processing Big Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.

    2015-07-01

    Various sensors from airborne and satellite platforms are producing large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and other applications. However, it is challenging to efficiently store, query, and process such big data because the tasks are both data- and computing-intensive. In this paper, a Hadoop-based framework is proposed to manage and process big remote sensing data in a distributed and parallel manner. In particular, remote sensing data can be fetched directly from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo toolbox, a ready-to-use tool for large image processing, is integrated into MapReduce to provide a rich set of image processing operations. With the integration of HDFS, the Orfeo toolbox, and MapReduce, these remote sensing images can be processed directly and in parallel in a scalable computing environment. The experimental results show that the proposed framework can efficiently manage and process such big remote sensing data.
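
    The abstract describes wiring the Orfeo Toolbox into MapReduce over HDFS. As a hedged illustration only (not the authors' framework), a Hadoop Streaming mapper in Python could shell out to an Orfeo command-line application for each scene; the band-math expression, band indices, temporary paths, and output naming below are all assumptions.

```python
#!/usr/bin/env python3
# Illustrative Hadoop Streaming mapper: read one HDFS image path per line,
# pull the scene locally, run an Orfeo Toolbox operation, push the result back.
# The specific otbcli_BandMath expression and ".ndvi.tif" naming are assumptions.
import os
import subprocess
import sys
import tempfile

def process(hdfs_path):
    workdir = tempfile.mkdtemp()
    local_in = os.path.join(workdir, "scene.tif")
    local_out = os.path.join(workdir, "ndvi.tif")
    subprocess.run(["hdfs", "dfs", "-get", hdfs_path, local_in], check=True)
    subprocess.run(["otbcli_BandMath", "-il", local_in, "-out", local_out,
                    "-exp", "(im1b4-im1b3)/(im1b4+im1b3+0.0001)"], check=True)
    subprocess.run(["hdfs", "dfs", "-put", local_out, hdfs_path + ".ndvi.tif"],
                   check=True)

if __name__ == "__main__":
    for line in sys.stdin:                 # mapper input: one HDFS image path per line
        path = line.strip()
        if path:
            process(path)
            print(f"{path}\tdone")         # key/value output consumed by the reducer
```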

  15. Big-deep-smart data in imaging for guiding materials design.

    PubMed

    Kalinin, Sergei V; Sumpter, Bobby G; Archibald, Richard K

    2015-10-01

    Harnessing big data, deep data, and smart data from state-of-the-art imaging might accelerate the design and realization of advanced functional materials. Here we discuss new opportunities in materials design enabled by the availability of big data in imaging and data analytics approaches, including their limitations, in material systems of practical interest. We specifically focus on how these tools might help realize new discoveries in a timely manner. Such methodologies are particularly appropriate to explore in light of continued improvements in atomistic imaging, modelling and data analytics methods.

  16. Big Surveys, Big Data Centres

    NASA Astrophysics Data System (ADS)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  17. Big cat scan: magnetic resonance imaging of the tiger.

    PubMed

    Snow, Thomas M; Litster, Annette L; Gregory, Richard J W

    2004-03-01

    In August 2002, we performed MRI scans on a female juvenile Bengal tiger. We present the clinical course, imaging and autopsy findings, and some comparative anatomy of the tiger brain and skull. Magnetic resonance images of a tiger have not previously been published.

  18. Big cat scan: magnetic resonance imaging of the tiger.

    PubMed

    Snow, Thomas M; Litster, Annette L; Gregory, Richard J W

    2004-03-01

    In August 2002, we performed MRI scans on a female juvenile Bengal tiger. We present the clinical course, imaging and autopsy findings, and some comparative anatomy of the tiger brain and skull. Magnetic resonance images of a tiger have not previously been published. PMID:15027932

  19. Forensic detection of noise addition in digital images

    NASA Astrophysics Data System (ADS)

    Cao, Gang; Zhao, Yao; Ni, Rongrong; Ou, Bo; Wang, Yongbin

    2014-03-01

    We proposed a technique to detect the global addition of noise to a digital image. As an anti-forensics tool, noise addition is typically used to disguise the visual traces of image tampering or to remove the statistical artifacts left behind by other operations. As such, the blind detection of noise addition has become imperative as well as beneficial to authenticate the image content and recover the image processing history, which is the goal of general forensics techniques. Specifically, the special image blocks, including constant and strip ones, are used to construct the features for identifying noise addition manipulation. The influence of noising on blockwise pixel value distribution is formulated and analyzed formally. The methodology of detectability recognition followed by binary decision is proposed to ensure the applicability and reliability of noising detection. Extensive experimental results demonstrate the efficacy of our proposed noising detector.

  20. Application of imaging geodesy to assess kinematics of the Medicine Wheel Landslide, Big Horn Mountains, Wyoming

    NASA Astrophysics Data System (ADS)

    Held, B. M.; Gomez, F.; Corley, J.

    2012-12-01

    Radar interferometry provides a means of imaging the spatially varying kinematics of slow mass movements, such as earthflows. These observations provide critical constraints for understanding earthflow mechanics when considered with topography and meteorological forcings. This study focuses on the Medicine Wheel Landslide located on the western side of the Big Horn Mountains in Wyoming. The mass movement is 1 km by 1 km on a 7-degree slope and consists of multiple rotational slumps within the larger mass, which is detaching along the Mowry and Thermopolis Shales. The slide regularly damages the highway that intersects it. Preliminary results from a combination of satellite-based and ground-based radar interferometry document variations in the rates of down-slope movement. Satellite interferometric synthetic aperture radar (InSAR) analysis utilized L-band data acquired by the ALOS PALSAR system. The resulting interferograms indicated that the Medicine Wheel Slide is active even during the summer months, moving up to 7.5 cm from July to October 2011. This movement corresponds with record amounts of snowfall during the 2010-2011 winter. Ground-based interferometric radar (GBIR) was chosen for this application due to its millimeter-scale sensitivity to motion and its ability to perform observations from the same point. The GBIR observation point was chosen to maximize line-of-sight sensitivity to motion and to provide a more complete view of the entire slide surface. In addition, ground-truth measurements of displacement are provided by a network of 20 monuments measured using rapid static GPS, acquired during the same observation campaign as the GBIR imaging. Seasonal amounts of movement derived from radar interferometry and ground-truthed by GPS can then be compared with meteorological data to determine whether the relationship shown for 2011 persists.

  1. An inexpensive x-ray imaging system of big visual field

    NASA Astrophysics Data System (ADS)

    Yu, Chunyu; Qing, Baowang; Chang, Benkang

    2006-01-01

    In this paper, a new kind of x-ray imaging system designed by our laboratory is introduced in detail. Different from the traditional x-ray imaging system, its image intensifier is a combined one: the system's main components are an intensifying screen and a brightness intensifier, coupled by a lens. Compared with the traditional x-ray imaging system, it has the advantages of low cost, a big visual field, and convenient installation. The paper first describes the structure and imaging principle of the new system, then discusses the requirements for its key components. Finally, we present images of a foot and a bag obtained with the system. The results indicate that the system performs satisfactorily, and its low price makes it affordable for typical users such as small and medium-sized hospitals. With regard to imaging performance, the system can be applied to security checking, medical treatment, nondestructive testing, and many other fields of science and technology.

  2. Body image and personality: associations between the Big Five Personality Factors, actual-ideal weight discrepancy, and body appreciation.

    PubMed

    Swami, Viren; Tran, Ulrich S; Brooks, Louise Hoffmann; Kanaan, Laura; Luesse, Ellen-Marlene; Nader, Ingo W; Pietschnig, Jakob; Stieger, Stefan; Voracek, Martin

    2013-04-01

    Studies have suggested associations between personality dimensions and body image constructs, but these have not been conclusively established. In two studies, we examined direct associations between the Big Five dimensions and two body image constructs, actual-ideal weight discrepancy and body appreciation. In Study 1, 950 women completed measures of both body image constructs and a brief measure of the Big Five dimensions. In Study 2, 339 women completed measures of the body image constructs and a more reliable measure of the Big Five. Both studies showed that Neuroticism was significantly associated with actual-ideal weight discrepancy (positively) and body appreciation (negatively) once the effects of body mass index and social status had been accounted for. These results are consistent with the suggestion that Neuroticism is a trait of public health significance requiring attention by body image scholars.

  3. Research on three-dimensional positioning method of big data image under bag of words model guidance

    NASA Astrophysics Data System (ADS)

    Zhang, Chunsen; Wang, Xiqi; Qiu, Zhenguo; Zhu, Shihuan; Xu, Xiaolei; Zhong, Sidong

    2015-12-01

    In order to retrieve the positioning image efficiently and quickly from a large number of different images and thus realize three-dimensional spatial positioning, this article proposes, based on photogrammetry and computer vision theory, a new method for three-dimensional positioning of big data images under bag-of-words model guidance. The method consists of two parts: image retrieval and spatial positioning. First, image retrieval is completed through feature extraction, K-means clustering, bag-of-words model building, and other steps, which improves the efficiency of image matching. Second, the interior and exterior orientation elements are obtained through image matching, building the projection relationship, and calculating the projection matrix, and the spatial orientation is thereby realized. The experimental results show that the proposed method can retrieve the target image efficiently and achieve spatial orientation accurately, offering a useful exploration of spatial positioning based on big data images.

  4. Imaging requirements for medical applications of additive manufacturing.

    PubMed

    Huotilainen, Eero; Paloheimo, Markku; Salmi, Mika; Paloheimo, Kaija-Stiina; Björkstrand, Roy; Tuomi, Jukka; Markkola, Antti; Mäkitie, Antti

    2014-02-01

    Additive manufacturing (AM), formerly known as rapid prototyping, is steadily shifting its focus from industrial prototyping to medical applications as AM processes, bioadaptive materials, and medical imaging technologies develop, and the benefits of the techniques gain wider knowledge among clinicians. This article gives an overview of the main requirements for medical imaging affected by needs of AM, as well as provides a brief literature review from existing clinical cases concentrating especially on the kind of radiology they required. As an example application, a pair of CT images of the facial skull base was turned into 3D models in order to illustrate the significance of suitable imaging parameters. Additionally, the model was printed into a preoperative medical model with a popular AM device. Successful clinical cases of AM are recognized to rely heavily on efficient collaboration between various disciplines - notably operating surgeons, radiologists, and engineers. The single main requirement separating tangible model creation from traditional imaging objectives such as diagnostics and preoperative planning is the increased need for anatomical accuracy in all three spatial dimensions, but depending on the application, other specific requirements may be present as well. This article essentially intends to narrow the potential communication gap between radiologists and engineers who work with projects involving AM by showcasing the overlap between the two disciplines.

  5. Estimating classification images with generalized linear and additive models.

    PubMed

    Knoblauch, Kenneth; Maloney, Laurence T

    2008-12-22

    Conventional approaches to modeling classification image data can be described in terms of a standard linear model (LM). We show how the problem can be characterized as a Generalized Linear Model (GLM) with a Bernoulli distribution. We demonstrate via simulation that this approach is more accurate in estimating the underlying template in the absence of internal noise. With increasing internal noise, however, the advantage of the GLM over the LM decreases and GLM is no more accurate than LM. We then introduce the Generalized Additive Model (GAM), an extension of GLM that can be used to estimate smooth classification images adaptively. We show that this approach is more robust to the presence of internal noise, and finally, we demonstrate that GAM is readily adapted to estimation of higher order (nonlinear) classification images and to testing their significance.
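
    Since the abstract frames classification-image estimation as a Bernoulli GLM, a minimal sketch of that idea is to fit a logistic regression to simulated noise fields and compare the recovered weights with the generating template. The simulation setup, array sizes, and the use of scikit-learn's LogisticRegression with a very weak penalty as a stand-in for an unpenalized GLM are assumptions, not the authors' code.

```python
# Toy Bernoulli-GLM classification image: simulate yes/no responses driven by a
# hidden linear template applied to noise fields, then recover the template.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, n_pixels = 4000, 64             # e.g. an 8x8 classification image
template = rng.normal(size=n_pixels)      # hidden observer template
noise = rng.normal(size=(n_trials, n_pixels))
p_yes = 1.0 / (1.0 + np.exp(-(noise @ template)))
responses = rng.random(n_trials) < p_yes  # simulated binary decisions

glm = LogisticRegression(C=1e6, max_iter=1000).fit(noise, responses)
estimate = glm.coef_.ravel()
print(np.corrcoef(estimate, template)[0, 1])   # recovered template vs. truth
```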

  6. Scalable splitting algorithms for big-data interferometric imaging in the SKA era

    NASA Astrophysics Data System (ADS)

    Onose, Alexandru; Carrillo, Rafael E.; Repetti, Audrey; McEwen, Jason D.; Thiran, Jean-Philippe; Pesquet, Jean-Christophe; Wiaux, Yves

    2016-11-01

    In the context of next-generation radio telescopes, like the Square Kilometre Array (SKA), the efficient processing of large-scale data sets is extremely important. Convex optimization tasks under the compressive sensing framework have recently emerged and provide both enhanced image reconstruction quality and scalability to increasingly larger data sets. We focus herein mainly on scalability and propose two new convex optimization algorithmic structures able to solve the convex optimization tasks arising in radio-interferometric imaging. They rely on proximal splitting and forward-backward iterations and can be seen, by analogy, with the CLEAN major-minor cycle, as running sophisticated CLEAN-like iterations in parallel in multiple data, prior, and image spaces. Both methods support any convex regularization function, in particular, the well-studied ℓ1 priors promoting image sparsity in an adequate domain. Tailored for big-data, they employ parallel and distributed computations to achieve scalability, in terms of memory and computational requirements. One of them also exploits randomization, over data blocks at each iteration, offering further flexibility. We present simulation results showing the feasibility of the proposed methods as well as their advantages compared to state-of-the-art algorithmic solvers. Our MATLAB code is available online on GitHub.
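
    The forward-backward iterations with an ℓ1 prior mentioned above can be illustrated with a plain ISTA-style loop: a gradient (forward) step on the data-fidelity term followed by a soft-thresholding (proximal, backward) step. This toy dense-matrix example is only a sketch of the splitting idea; the random test problem, step size, and regularization value are assumptions, and the paper's distributed, block-randomized structure is not reproduced.

```python
# Minimal forward-backward (ISTA-style) iteration for
# min_x 0.5*||y - A x||^2 + lam*||x||_1 on a small synthetic problem.
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, y, lam, n_iter=200):
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2            # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                      # forward (gradient) step
        x = soft_threshold(x - step * grad, step * lam)  # backward (prox) step
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(80, 200))
x_true = np.zeros(200)
x_true[rng.choice(200, 8, replace=False)] = rng.normal(size=8)
y = A @ x_true + 0.01 * rng.normal(size=80)
print(np.linalg.norm(forward_backward(A, y, lam=0.05) - x_true))
```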

  7. Comparison of additive image fusion vs. feature-level image fusion techniques for enhanced night driving

    NASA Astrophysics Data System (ADS)

    Bender, Edward J.; Reese, Colin E.; Van Der Wal, Gooitzen S.

    2003-02-01

    The Night Vision & Electronic Sensors Directorate (NVESD) has conducted a series of image fusion evaluations under the Head-Tracked Vision System (HTVS) program. The HTVS is a driving system for both wheeled and tracked military vehicles, wherein dual-waveband sensors are directed in a more natural head-slewed imaging mode. The HTVS consists of thermal and image-intensified TV sensors, a high-speed gimbal, a head-mounted display, and a head tracker. A series of NVESD field tests over the past two years has investigated the degree to which additive (A+B) image fusion of these sensors enhances overall driving performance. Additive fusion employs a single (but user adjustable) fractional weighting for all the features of each sensor's image. More recently, NVESD and Sarnoff Corporation have begun a cooperative effort to evaluate and refine Sarnoff's "feature-level" multi-resolution (pyramid) algorithms for image fusion. This approach employs digital processing techniques to select at each image point only the sensor with the strongest features, and to utilize only those features to reconstruct the fused video image. This selection process is performed simultaneously at multiple scales of the image, which are combined to form the reconstructed fused image. All image fusion techniques attempt to combine the "best of both sensors" in a single image. Typically, thermal sensors are better for detecting military threats and targets, while image-intensified sensors provide more natural scene cues and detect cultural lighting. This investigation will address the differences between additive fusion and feature-level image fusion techniques for enhancing the driver's overall situational awareness.
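
    Additive (A+B) fusion with a single user-adjustable weight, as contrasted above with feature-level pyramid fusion, reduces to a weighted sum of the two co-registered sensor images. The sketch below assumes 8-bit, single-band thermal and image-intensified frames of the same size; the function name and default weight are illustrative.

```python
# Toy additive (A+B) fusion: one global weight applied to all features of each
# sensor image, in contrast to per-feature pyramid fusion.
import numpy as np

def additive_fusion(thermal, intensified, alpha=0.5):
    """alpha weights the thermal channel; (1 - alpha) weights the I2 channel."""
    fused = alpha * thermal.astype(float) + (1.0 - alpha) * intensified.astype(float)
    return np.clip(fused, 0, 255).astype(np.uint8)

# Example with random stand-in frames of the same size
a = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
b = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
fused = additive_fusion(a, b, alpha=0.6)
```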

  8. A Big Data Analytics Pipeline for the Analysis of TESS Full Frame Images

    NASA Astrophysics Data System (ADS)

    Wampler-Doty, Matthew; Pierce Doty, John

    2015-12-01

    We present a novel method for producing a catalogue of extra-solar planets and transients using the full frame image data from TESS. Our method involves (1) creating a fast Monte Carlo simulation of the TESS science instruments, (2) using the simulation to create a labeled dataset consisting of exoplanets with various orbital durations as well as transients (such as tidal disruption events), (3) using supervised machine learning to find optimal matched filters, Support Vector Machines (SVMs) and statistical classifiers (i.e. naïve Bayes and Markov Random Fields) to detect astronomical objects of interest and (4) “Big Data” analysis to produce a catalogue based on the TESS data. We will apply the resulting methods to all stars in the full frame images. We hope that by providing libraries that conform to industry standards of Free Open Source Software we may invite researchers from the astronomical community as well as the wider data-analytics community to contribute to our effort.

  9. Study on clear stereo image pair acquisition method for small objects with big vertical size in SLM vision system.

    PubMed

    Wang, Yuezong; Jin, Yan; Wang, Lika; Geng, Benliang

    2016-05-01

    A microscopic vision system with a stereo light microscope (SLM) has been applied to surface profile measurement. If the vertical size of a small object exceeds the depth of field, its images will contain both clear and fuzzy regions. Hence, in order to obtain clear stereo images, we propose a microscopic sequence image fusion method suitable for an SLM vision system. First, a procedure to capture and align the image sequence is designed, which outputs aligned stereo images. Second, we decompose the stereo image sequence by wavelet analysis and obtain a series of high- and low-frequency coefficients at different resolutions. Fused stereo images are then output based on the high- and low-frequency coefficient fusion rules proposed in this article. The results show that Δw1 (Δw2) and ΔZ of stereo images in a sequence have a linear relationship. Hence, a procedure for image alignment is necessary before image fusion. In contrast with other image fusion methods, our method outputs clear fused stereo images with better performance, is suitable for an SLM vision system, and is very helpful for avoiding the image blur caused by the large vertical size of small objects.
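
    A rough sketch of the wavelet-domain fusion step described above (decompose both aligned images, combine coefficients, reconstruct) is shown below, assuming PyWavelets. The specific fusion rule (average the approximation band, keep the larger-magnitude detail coefficients), wavelet choice, and decomposition level are assumptions rather than the rules proposed in the article.

```python
# Illustrative wavelet-domain fusion of two aligned grayscale images.
import numpy as np
import pywt

def fuse_pair(a, b):
    # keep the coefficient with the larger magnitude
    return np.where(np.abs(a) >= np.abs(b), a, b)

def wavelet_fuse(img1, img2, wavelet="db2", level=3):
    c1 = pywt.wavedec2(img1.astype(float), wavelet, level=level)
    c2 = pywt.wavedec2(img2.astype(float), wavelet, level=level)
    fused = [(c1[0] + c2[0]) / 2.0]                       # approximation: average
    for (h1, v1, d1), (h2, v2, d2) in zip(c1[1:], c2[1:]):
        fused.append((fuse_pair(h1, h2), fuse_pair(v1, v2), fuse_pair(d1, d2)))
    return pywt.waverec2(fused, wavelet)
```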

  10. Study on clear stereo image pair acquisition method for small objects with big vertical size in SLM vision system.

    PubMed

    Wang, Yuezong; Jin, Yan; Wang, Lika; Geng, Benliang

    2016-05-01

    A microscopic vision system with a stereo light microscope (SLM) has been applied to surface profile measurement. If the vertical size of a small object exceeds the depth of field, its images will contain both clear and fuzzy regions. Hence, in order to obtain clear stereo images, we propose a microscopic sequence image fusion method suitable for an SLM vision system. First, a procedure to capture and align the image sequence is designed, which outputs aligned stereo images. Second, we decompose the stereo image sequence by wavelet analysis and obtain a series of high- and low-frequency coefficients at different resolutions. Fused stereo images are then output based on the high- and low-frequency coefficient fusion rules proposed in this article. The results show that Δw1 (Δw2) and ΔZ of stereo images in a sequence have a linear relationship. Hence, a procedure for image alignment is necessary before image fusion. In contrast with other image fusion methods, our method outputs clear fused stereo images with better performance, is suitable for an SLM vision system, and is very helpful for avoiding the image blur caused by the large vertical size of small objects. PMID:26970109

  11. An Improved InSAR Image Co-Registration Method for Pairs with Relatively Big Distortions or Large Incoherent Areas.

    PubMed

    Chen, Zhenwei; Zhang, Lei; Zhang, Guo

    2016-09-17

    Co-registration is one of the most important steps in interferometric synthetic aperture radar (InSAR) data processing. The standard offset-measurement method based on cross-correlating uniformly distributed patches takes no account of specific geometric transformation between images or characteristics of ground scatterers. Hence, it is inefficient and difficult to obtain satisfying co-registration results for image pairs with relatively big distortion or large incoherent areas. Given this, an improved co-registration strategy is proposed in this paper which takes both the geometric features and image content into consideration. Firstly, some geometric transformations including scale, flip, rotation, and shear between images were eliminated based on the geometrical information, and the initial co-registration polynomial was obtained. Then the registration points were automatically detected by integrating the signal-to-clutter-ratio (SCR) thresholds and the amplitude information, and a further co-registration process was performed to refine the polynomial. Several comparison experiments were carried out using 2 TerraSAR-X data from the Hong Kong airport and 21 PALSAR data from the Donghai Bridge. Experiment results demonstrate that the proposed method brings accuracy and efficiency improvements for co-registration and processing abilities in the cases of big distortion between images or large incoherent areas in the images. For most co-registrations, the proposed method can enhance the reliability and applicability of co-registration and thus promote the automation to a higher level.
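
    The standard offset-measurement step that the paper improves on, cross-correlating uniformly distributed patches, can be sketched as follows, assuming scikit-image and two roughly aligned amplitude images as NumPy arrays. The patch size, grid step, and upsampling factor are illustrative, and the proposed geometric and SCR-based refinements are not implemented here.

```python
# Illustrative patch-offset measurement by (sub-pixel) cross-correlation.
import numpy as np
from skimage.registration import phase_cross_correlation

def patch_offsets(master, slave, patch=128, step=256):
    offsets = []
    rows, cols = master.shape
    for r in range(0, rows - patch, step):
        for c in range(0, cols - patch, step):
            shift, error, _ = phase_cross_correlation(
                master[r:r + patch, c:c + patch],
                slave[r:r + patch, c:c + patch],
                upsample_factor=16)            # sub-pixel offset estimate
            offsets.append((r, c, shift[0], shift[1], error))
    return offsets                             # (row, col, dr, dc, error) per patch
```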

  12. An Improved InSAR Image Co-Registration Method for Pairs with Relatively Big Distortions or Large Incoherent Areas.

    PubMed

    Chen, Zhenwei; Zhang, Lei; Zhang, Guo

    2016-01-01

    Co-registration is one of the most important steps in interferometric synthetic aperture radar (InSAR) data processing. The standard offset-measurement method based on cross-correlating uniformly distributed patches takes no account of specific geometric transformation between images or characteristics of ground scatterers. Hence, it is inefficient and difficult to obtain satisfying co-registration results for image pairs with relatively big distortion or large incoherent areas. Given this, an improved co-registration strategy is proposed in this paper which takes both the geometric features and image content into consideration. Firstly, some geometric transformations including scale, flip, rotation, and shear between images were eliminated based on the geometrical information, and the initial co-registration polynomial was obtained. Then the registration points were automatically detected by integrating the signal-to-clutter-ratio (SCR) thresholds and the amplitude information, and a further co-registration process was performed to refine the polynomial. Several comparison experiments were carried out using 2 TerraSAR-X data from the Hong Kong airport and 21 PALSAR data from the Donghai Bridge. Experiment results demonstrate that the proposed method brings accuracy and efficiency improvements for co-registration and processing abilities in the cases of big distortion between images or large incoherent areas in the images. For most co-registrations, the proposed method can enhance the reliability and applicability of co-registration and thus promote the automation to a higher level. PMID:27649207

  13. An Improved InSAR Image Co-Registration Method for Pairs with Relatively Big Distortions or Large Incoherent Areas

    PubMed Central

    Chen, Zhenwei; Zhang, Lei; Zhang, Guo

    2016-01-01

    Co-registration is one of the most important steps in interferometric synthetic aperture radar (InSAR) data processing. The standard offset-measurement method based on cross-correlating uniformly distributed patches takes no account of specific geometric transformation between images or characteristics of ground scatterers. Hence, it is inefficient and difficult to obtain satisfying co-registration results for image pairs with relatively big distortion or large incoherent areas. Given this, an improved co-registration strategy is proposed in this paper which takes both the geometric features and image content into consideration. Firstly, some geometric transformations including scale, flip, rotation, and shear between images were eliminated based on the geometrical information, and the initial co-registration polynomial was obtained. Then the registration points were automatically detected by integrating the signal-to-clutter-ratio (SCR) thresholds and the amplitude information, and a further co-registration process was performed to refine the polynomial. Several comparison experiments were carried out using 2 TerraSAR-X data from the Hong Kong airport and 21 PALSAR data from the Donghai Bridge. Experiment results demonstrate that the proposed method brings accuracy and efficiency improvements for co-registration and processing abilities in the cases of big distortion between images or large incoherent areas in the images. For most co-registrations, the proposed method can enhance the reliability and applicability of co-registration and thus promote the automation to a higher level. PMID:27649207

  14. Imaging Structure, Stratigraphy and Groundwater with Ground-Penetrating Radar on the Big Island, Hawaii

    NASA Astrophysics Data System (ADS)

    Shapiro, S. R.; Tchakirides, T. F.; Brown, L. D.

    2004-12-01

    A series of exploratory ground-penetrating radar (GPR) surveys were carried out on the Big Island, Hawaii in March of 2004 to evaluate the efficacy of using GPR to address hydrological, volcanological, and tectonic issues in extrusive basaltic materials. Target sites included beach sands, nearshore lava flows, well-developed soil covers, lava tubes, and major fault zones. Surveys were carried out with a Sensors and Software Pulse Ekko 100, which was equipped with 50, 100, and 200 MHz antennae. Both reflection profiles and CMP expanding spreads were collected at most sites to provide both structural detail and in situ velocity estimation. In general, the volcanic rocks exhibited propagation velocities of ca 0.09-0.10 m/ns, a value which we interpret to reflect the large air-filled porosity of the media. Penetration in the nearshore area was expectedly small (less than 1 m), which we attribute to seawater infiltration. However, surveys in the volcanics away from the coast routinely probed to depths of 10 m or greater, even at 100 MHz. While internal layering and lava tubes could be identified from individual profiles, the complexity of returns suggests that 3D imaging is required before detailed stratigraphy can be usefully interpreted. A pilot 3D survey over a lava tube complex supports this conclusion, although it was prematurely terminated by bad weather. Although analysis of the CMP data does not show a clear systematic variation in radar velocity with age of flow, the dataset is too limited to support any firm conclusions on this point. Unusually distinct, subhorizontal reflectors on several profiles seem to mark groundwater. In one case, the water seems to lie within a lava tube with an air-filled roof zone. Surveys over part of the controversial Hilana fault zone clearly image the fault as a steeply dipping feature in the subsurface, albeit only to depths of a few meters. The results suggest, however, that deeper extensions of the faults could be mapped by
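
    As a quick consistency check on the numbers quoted above (not part of the survey itself), the reported propagation velocity of about 0.09-0.10 m/ns implies a two-way travel time of roughly 200 ns for a reflector at the ~10 m penetration depth.

```python
# Two-way travel-time check from the quoted GPR velocity and depth.
velocity = 0.10          # m/ns, upper end of the reported range
depth = 10.0             # m, the reported penetration depth away from the coast
two_way_time = 2.0 * depth / velocity
print(f"{two_way_time:.0f} ns")   # ≈ 200 ns
```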

  15. Data Processing of the magnetograms for the Near InfraRed Imaging Spectropolarimeter at Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Ahn, Kwangsu; Cao, Wenda; Shumko, Sergiy; Chae, Jongchul

    2016-05-01

    We present the processing of the vector magnetograms from the Near InfraRed Imaging Spectropolarimeter (NIRIS) at Big Bear Solar Observatory. NIRIS is the successor to an older magnetograph system at BBSO and is equipped with a new infrared detector and an improved Fabry-Perot filter system. While the new hardware brings several upgrades, it also poses some challenges as the data acquisition rate increases and we deal with a larger detector array. The overall process includes dark and flat correction, image alignment, de-stretch, Stokes parameter selection, calibration of instrumental crosstalk, and Milne-Eddington inversion.
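
    The first step in the listed pipeline, dark and flat correction, is generic detector reduction; a minimal sketch under the usual assumptions (a matching dark frame, and a flat normalized by its median) is shown below. The function name and normalization choice are illustrative, not the NIRIS reduction code.

```python
# Generic dark/flat correction step, as a sketch only.
import numpy as np

def dark_flat_correct(raw, dark, flat):
    """Subtract the dark frame, then divide by the median-normalized flat."""
    flat_norm = (flat - dark) / np.median(flat - dark)
    return (raw - dark) / flat_norm
```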

  16. Echo-delay resolution in sonar images of the big brown bat, Eptesicus fuscus

    PubMed Central

    Simmons, James A.; Ferragamo, Michael J.; Moss, Cynthia F.

    1998-01-01

    Echolocating big brown bats (Eptesicus fuscus) broadcast ultrasonic frequency-modulated (FM) biosonar sounds (20–100 kHz frequencies; 10–50 μs periods) and perceive target range from echo delay. Knowing the acuity for delay resolution is essential to understand how bats process echoes because they perceive target shape and texture from the delay separation of multiple reflections. Bats can separately perceive the delays of two concurrent electronically generated echoes arriving as little as 2 μs apart, thus resolving reflecting points as close together as 0.3 mm in range (two-point threshold). This two-point resolution is roughly five times smaller than the shortest periods in the bat’s sounds. Because the bat’s broadcasts are 2,000–4,500 μs long, the echoes themselves overlap and interfere with each other, to merge together into a single sound whose spectrum is shaped by their mutual interference depending on the size of the time separation. To separately perceive the delays of overlapping echoes, the bat has to recover information about their very small delay separation that was transferred into the spectrum when the two echoes interfered with each other, thus explicitly reconstructing the range profile of targets from the echo spectrum. However, the bat’s 2-μs resolution limit is so short that the available spectral cues are extremely limited. Resolution of delay seems overly sharp just for interception of flying insects, which suggests that the bat’s biosonar images are of higher quality to suit a wider variety of orientation tasks, and that biosonar echo processing is correspondingly more sophisticated than has been suspected. PMID:9770540
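
    As a sanity check of the figures quoted above (not from the paper), a 2 μs two-point delay resolution corresponds to roughly 0.3 mm of range, taking the speed of sound in air as about 343 m/s and dividing by two for the two-way path.

```python
# Back-of-envelope check: echo-delay resolution to range resolution.
speed_of_sound = 343.0        # m/s, room-temperature air (assumption)
delta_t = 2e-6                # s, the bat's two-point delay resolution
delta_range = speed_of_sound * delta_t / 2.0   # divide by 2 for the two-way path
print(f"{delta_range * 1000:.2f} mm")          # ≈ 0.34 mm, i.e. ~0.3 mm as stated
```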

  17. Echo-delay resolution in sonar images of the big brown bat, Eptesicus fuscus.

    PubMed

    Simmons, J A; Ferragamo, M J; Moss, C F

    1998-10-13

    Echolocating big brown bats (Eptesicus fuscus) broadcast ultrasonic frequency-modulated (FM) biosonar sounds (20-100 kHz frequencies; 10-50 microseconds periods) and perceive target range from echo delay. Knowing the acuity for delay resolution is essential to understand how bats process echoes because they perceive target shape and texture from the delay separation of multiple reflections. Bats can separately perceive the delays of two concurrent electronically generated echoes arriving as little as 2 microseconds apart, thus resolving reflecting points as close together as 0.3 mm in range (two-point threshold). This two-point resolution is roughly five times smaller than the shortest periods in the bat's sounds. Because the bat's broadcasts are 2,000-4,500 microseconds long, the echoes themselves overlap and interfere with each other, to merge together into a single sound whose spectrum is shaped by their mutual interference depending on the size of the time separation. To separately perceive the delays of overlapping echoes, the bat has to recover information about their very small delay separation that was transferred into the spectrum when the two echoes interfered with each other, thus explicitly reconstructing the range profile of targets from the echo spectrum. However, the bat's 2-microseconds resolution limit is so short that the available spectral cues are extremely limited. Resolution of delay seems overly sharp just for interception of flying insects, which suggests that the bat's biosonar images are of higher quality to suit a wider variety of orientation tasks, and that biosonar echo processing is correspondingly more sophisticated than has been suspected.

  18. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  19. Additional bounds on the pre-big-bang-nucleosynthesis expansion by means of γ-rays from the galactic centre

    NASA Astrophysics Data System (ADS)

    Donato, Fiorenza; Fornengo, Nicolao; Schelke, Mia

    2007-03-01

    The possibility to use γ-ray data from the galactic centre (GC) to constrain the cosmological evolution of the Universe in a phase prior to primordial nucleosynthesis, namely around the time of cold dark matter (CDM) decoupling, is analysed. The basic idea is that in a modified cosmological scenario, where the Hubble expansion rate is enhanced with respect to the standard case, the CDM decoupling occurred earlier and the relic abundance of a given dark matter (DM) candidate is enhanced. This implies that the present amount of CDM in the Universe may be explained by a weakly interacting massive particle (WIMP) which possesses an annihilation cross section that is (much) larger than that in standard cosmology. This enhanced annihilation implies larger fluxes of indirect detection signals of CDM. We show that the HESS measurements can set bounds for WIMPs heavier than a few hundred GeV, depending on the actual DM halo profile. These results are complementary to those obtained in a previous analysis based on cosmic antiprotons. For a Moore DM profile, γ-ray data limit the maximal Hubble rate enhancement to be below a factor of 100. Moreover, for the same profile, a WIMP in the 1-10 TeV mass range is not compatible with a cosmological scenario with an enhanced expansion rate prior to big bang nucleosynthesis (BBN). Less steep DM profiles provide less stringent bounds, depending on the cosmological scenario.

  20. Direct laser additive fabrication system with image feedback control

    DOEpatents

    Griffith, Michelle L.; Hofmeister, William H.; Knorovsky, Gerald A.; MacCallum, Danny O.; Schlienger, M. Eric; Smugeresky, John E.

    2002-01-01

    A closed-loop, feedback-controlled direct laser fabrication system is disclosed. The feedback refers to the actual growth conditions obtained by real-time analysis of thermal radiation images. The resulting system can fabricate components with severalfold improvement in dimensional tolerances and surface finish.

  1. Big Images and Big Ideas!

    ERIC Educational Resources Information Center

    McCullagh, John; Greenwood, Julian

    2011-01-01

    In this digital age, is primary science being left behind? Computer microscopes provide opportunities to transform science lessons into highly exciting learning experiences and to shift enquiry and discovery back into the hands of the children. A class of 5- and 6-year-olds was just one group of children involved in the Digitally Resourced…

  2. Optimal addition of images for detection and photometry

    NASA Technical Reports Server (NTRS)

    Fischer, Philippe; Kochanski, Greg P.

    1994-01-01

    In this paper we describe weighting techniques used for the optimal coaddition of charge-coupled device (CCD) frames with differing characteristics. Optimal means maximum signal-to-noise ratio (S/N) for stellar objects. We derive formulas for four applications: (1) object detection via matched filter, (2) object detection identical to DAOFIND, (3) aperture photometry, and (4) ALLSTAR profile-fitting photometry. We have included examples involving 21 frames for which either the sky brightness or image resolution varied by a factor of 3. The gains in S/N were modest for most of the examples, except for DAOFIND detection with varying image resolution, which exhibited a substantial S/N increase. Even though the only consideration was maximizing S/N, the image resolution was seen to improve for most of the variable-resolution examples. Also discussed are empirical fits for the weighting and the availability of the program, WEIGHT, used to generate the weighting for the individual frames. Finally, we include appendices describing the effects of clipping algorithms and a scheme for star/galaxy and cosmic-ray/star discrimination.
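
    The paper derives application-specific weights (detection vs. aperture vs. profile-fitting photometry); the sketch below only shows the generic inverse-variance flavor of S/N-optimal coaddition, with each frame scaled by an assumed relative throughput before the weighted mean. The function name, inputs, and the simple w ∝ f/σ² rule are illustrative, not the WEIGHT program.

```python
# Generic S/N-weighted coaddition sketch: weight each frame by its relative
# source flux scale divided by its per-pixel noise variance.
import numpy as np

def coadd(frames, flux_scale, sky_sigma):
    frames = np.asarray(frames, dtype=float)       # shape (n_frames, ny, nx)
    f = np.asarray(flux_scale, dtype=float)        # relative throughput per frame
    s = np.asarray(sky_sigma, dtype=float)         # per-pixel noise per frame
    w = f / s**2
    w /= w.sum()
    # Divide each frame by its flux scale so the source has unit amplitude,
    # then form the weighted mean over frames.
    return np.tensordot(w, frames / f[:, None, None], axes=1)
```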

  3. BigNeuron: Large-scale 3D Neuron Reconstruction from Optical Microscopy Images

    PubMed Central

    Peng, Hanchuan; Hawrylycz, Michael; Roskams, Jane; Hill, Sean; Spruston, Nelson; Meijering, Erik; Ascoli, Giorgio A.

    2016-01-01

    Understanding the structure of single neurons is critical for understanding how they function within neural circuits. BigNeuron is a new community effort that combines modern bioimaging informatics, recent leaps in labeling and microscopy, and the widely recognized need for openness and standardization to provide a community resource for automated reconstruction of dendritic and axonal morphology of single neurons. PMID:26182412

  4. I Can Create Mental Images to Retell and Infer Big Ideas

    ERIC Educational Resources Information Center

    Miller, Debbie

    2013-01-01

    As teachers, we are always reflecting on and refining our craft. In this article, the author shares how her understanding and implementation of comprehension strategy instruction has evolved over the past ten years. These shifts include her current thinking about the gradual release of responsibility instructional model, how content and big ideas…

  5. BigNeuron: Large-Scale 3D Neuron Reconstruction from Optical Microscopy Images.

    PubMed

    Peng, Hanchuan; Hawrylycz, Michael; Roskams, Jane; Hill, Sean; Spruston, Nelson; Meijering, Erik; Ascoli, Giorgio A

    2015-07-15

    Understanding the structure of single neurons is critical for understanding how they function within neural circuits. BigNeuron is a new community effort that combines modern bioimaging informatics, recent leaps in labeling and microscopy, and the widely recognized need for openness and standardization to provide a community resource for automated reconstruction of dendritic and axonal morphology of single neurons.

  6. Big data for health.

    PubMed

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled. PMID:26173222

  7. Big data for health.

    PubMed

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  8. Big Society, Big Deal?

    ERIC Educational Resources Information Center

    Thomson, Alastair

    2011-01-01

    Political leaders like to put forward guiding ideas or themes which pull their individual decisions into a broader narrative. For John Major it was Back to Basics, for Tony Blair it was the Third Way and for David Cameron it is the Big Society. While Mr. Blair relied on Lord Giddens to add intellectual weight to his idea, Mr. Cameron's legacy idea…

  9. Rapid and retrievable recording of big data of time-lapse 3D shadow images of microbial colonies.

    PubMed

    Ogawa, Hiroyuki; Nasu, Senshi; Takeshige, Motomu; Saito, Mikako; Matsuoka, Hideaki

    2015-01-01

    We previously developed an automatic colony count system based on time-lapse shadow image analysis (TSIA). Here this system has been upgraded and applied to practical rapid decision making. A microbial sample was spread on/in an agar plate 90 mm in diameter as homogeneously as possible. For several strains, most colonies appeared within a limited time span. Consequently, the number of colonies reached a steady level (Nstdy) and then remained unchanged until the end of a long culture time, giving the confirmed value (Nconf). The equivalence of Nstdy and Nconf, as well as the difference between the times required to determine Nstdy and Nconf, were statistically significant at p < 0.001. Nstdy meets the requirements of practical routines that treat a large number of plates. Any difference between Nstdy and Nconf may be elucidated by means of the retrievable big data. Therefore Nconf is valid for official documentation. PMID:25975590

  10. Rapid and retrievable recording of big data of time-lapse 3D shadow images of microbial colonies.

    PubMed

    Ogawa, Hiroyuki; Nasu, Senshi; Takeshige, Motomu; Saito, Mikako; Matsuoka, Hideaki

    2015-05-15

    We previously developed an automatic colony count system based on time-lapse shadow image analysis (TSIA). Here this system has been upgraded and applied to practical rapid decision making. A microbial sample was spread on/in an agar plate 90 mm in diameter as homogeneously as possible. For several strains, most colonies appeared within a limited time span. Consequently, the number of colonies reached a steady level (Nstdy) and then remained unchanged until the end of a long culture time, giving the confirmed value (Nconf). The equivalence of Nstdy and Nconf, as well as the difference between the times required to determine Nstdy and Nconf, were statistically significant at p < 0.001. Nstdy meets the requirements of practical routines that treat a large number of plates. Any difference between Nstdy and Nconf may be elucidated by means of the retrievable big data. Therefore Nconf is valid for official documentation.

  11. Artificial intelligence in medicine and cardiac imaging: harnessing big data and advanced computing to provide personalized medical diagnosis and treatment.

    PubMed

    Dilsizian, Steven E; Siegel, Eliot L

    2014-01-01

    Although advances in information technology in the past decade have come in quantum leaps in nearly every aspect of our lives, they seem to be coming at a slower pace in the field of medicine. However, the implementation of electronic health records (EHR) in hospitals is increasing rapidly, accelerated by the meaningful use initiatives associated with the Centers for Medicare & Medicaid Services EHR Incentive Programs. The transition to electronic medical records and availability of patient data has been associated with increases in the volume and complexity of patient information, as well as an increase in medical alerts, with resulting "alert fatigue" and increased expectations for rapid and accurate diagnosis and treatment. Unfortunately, these increased demands on health care providers create greater risk for diagnostic and therapeutic errors. In the near future, artificial intelligence (AI)/machine learning will likely assist physicians with the differential diagnosis of disease, with suggestions and recommendations for treatment options, and, in the case of medical imaging, with cues in image interpretation. Mining and advanced analysis of "big data" in health care provide the potential not only to perform "in silico" research but also to provide "real time" diagnostic and (potentially) therapeutic recommendations based on empirical data. "On demand" access to high-performance computing and large health care databases will support and sustain our ability to achieve personalized medicine. The IBM Jeopardy! Challenge, which pitted the best all-time human players against the Watson computer, captured the imagination of millions of people across the world and demonstrated the potential to apply AI approaches to a wide variety of subject matter, including medicine. The combination of AI, big data, and massively parallel computing offers the potential to create a revolutionary way of practicing evidence-based, personalized medicine.

  12. Application of a New Method for Analyzing Images: Two-Dimensional Non-Linear Additive Decomposition

    SciTech Connect

    MA Zaccaria; DM Drudnoy; JE Stasenko

    2006-07-05

    This paper documents the application of a new image processing algorithm, two-dimensional non-linear additive decomposition (NLAD), which is used to identify regions in a digital image whose gray-scale (or color) intensity is different from the surrounding background. Standard image segmentation algorithms exist that allow users to segment images based on gray-scale intensity and/or shape. However, these processing techniques do not adequately account for the image noise and lighting variation that typically occurs across an image. NLAD is designed to separate image noise and background from artifacts, thereby providing the ability to consistently evaluate images. The decomposition techniques used in this algorithm are based on the concepts of mathematical morphology. NLAD emulates the human capability of visually separating an image into different levels of resolution components, denoted as "coarse", "fine", and "intermediate". Very little resolution information overlaps any two of the component images. This method can easily determine and/or remove trends and noise from an image. NLAD has several additional advantages over conventional image processing algorithms, including no need for a transformation from one space to another, such as is done with Fourier transforms, and since only finite summations are required, the calculational effort is neither extensive nor complicated.
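
    As a rough illustration of the kind of morphology-based, additive multi-resolution split described above (the abstract does not give the algorithm's details), the following Python sketch separates an image into fine, intermediate, and coarse components using grayscale openings and closings at two structuring-element sizes; the function name and window sizes are illustrative assumptions, not the authors' implementation.

      # Illustrative sketch only: a morphology-based coarse/intermediate/fine split,
      # loosely in the spirit of the NLAD description (not the authors' algorithm).
      import numpy as np
      from scipy import ndimage

      def nonlinear_additive_decomposition(image, small=3, large=15):
          """Split an image into fine, intermediate, and coarse components
          using grayscale morphological smoothing at two scales."""
          image = image.astype(float)
          # Morphological smoothing (opening followed by closing) at a small scale
          smooth_small = ndimage.grey_closing(
              ndimage.grey_opening(image, size=small), size=small)
          # Morphological smoothing at a larger scale applied to the small-scale result
          smooth_large = ndimage.grey_closing(
              ndimage.grey_opening(smooth_small, size=large), size=large)
          fine = image - smooth_small                  # high-resolution detail and noise
          intermediate = smooth_small - smooth_large   # mid-scale structure
          coarse = smooth_large                        # background and lighting trends
          # Additivity: fine + intermediate + coarse reconstructs the original exactly
          return fine, intermediate, coarse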

  13. Breast Imaging in the Era of Big Data: Structured Reporting and Data Mining

    PubMed Central

    Margolies, Laurie R.; Pandey, Gaurav; Horowitz, Eliot R.; Mendelson, David S.

    2016-01-01

    OBJECTIVE The purpose of this article is to describe structured reporting and the development of large databases for use in data mining in breast imaging. CONCLUSION The results of millions of breast imaging examinations are reported with structured tools based on the BI-RADS lexicon. Much of these data are stored in accessible media. Robust computing power creates great opportunity for data scientists and breast imagers to collaborate to improve breast cancer detection and optimize screening algorithms. Data mining can create knowledge, but the questions asked and their complexity require extremely powerful and agile databases. New data technologies can facilitate outcomes research and precision medicine. PMID:26587797

  14. Images of Paris: Big C Culture for the Nonspeaker of French.

    ERIC Educational Resources Information Center

    Spangler, May; York, Holly U.

    2002-01-01

    Discusses a course offered in both French and English at Emory University in Atlanta, Georgia that is based on the study of representations of Paris from the Middle Ages to the present. It uses architecture as a point of departure and explores the myth of Paris as expressed through a profusion of images in literature, painting, and film.…

  15. Mapping fetal brain development in utero using magnetic resonance imaging: the Big Bang of brain mapping.

    PubMed

    Studholme, Colin

    2011-08-15

    The development of tools to construct and investigate probabilistic maps of the adult human brain from magnetic resonance imaging (MRI) has led to advances in both basic neuroscience and clinical diagnosis. These tools are increasingly being applied to brain development in adolescence and childhood, and even to neonatal and premature neonatal imaging. Even earlier in development, parallel advances in clinical fetal MRI have led to its growing use as a tool in challenging medical conditions. This has motivated new engineering developments encompassing optimal fast MRI scans and techniques derived from computer vision, the combination of which allows full 3D imaging of the moving fetal brain in utero without sedation. These promise to provide a new and unprecedented window into early human brain growth. This article reviews the developments that have led us to this point, examines the current state of the art in the fields of fast fetal imaging and motion correction, and describes the tools to analyze dynamically changing fetal brain structure. New methods to deal with developmental tissue segmentation and the construction of spatiotemporal atlases are examined, together with techniques to map fetal brain growth patterns.

  16. A Feature-based Approach to Big Data Analysis of Medical Images

    PubMed Central

    Toews, Matthew; Wachinger, Christian; Estepar, Raul San Jose; Wells, William M.

    2015-01-01

    This paper proposes an inference method well-suited to large sets of medical images. The method is based upon a framework where distinctive 3D scale-invariant features are indexed efficiently to identify approximate nearest-neighbor (NN) feature matches in O(log N) computational complexity in the number of images N. It thus scales well to large data sets, in contrast to methods based on pair-wise image registration or feature matching requiring O(N) complexity. Our theoretical contribution is a density estimator based on a generative model that generalizes kernel density estimation and K-nearest neighbor (KNN) methods. The estimator can be used for on-the-fly queries, without requiring explicit parametric models or an off-line training phase. The method is validated on a large multi-site data set of 95,000,000 features extracted from 19,000 lung CT scans. Subject-level classification identifies all images of the same subjects across the entire data set despite deformation due to breathing state, including unintentional duplicate scans. State-of-the-art performance is achieved in predicting chronic pulmonary obstructive disorder (COPD) severity across the 5-category GOLD clinical rating, with an accuracy of 89% if both exact and one-off predictions are considered correct. PMID:26221685
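
    The abstract describes indexing scale-invariant descriptors for approximate nearest-neighbor matching in sub-linear time. The sketch below illustrates that general pattern with a KD-tree over synthetic descriptors and a simple subject-voting query; the descriptor dimensions, data, and helper names are placeholders, not the paper's code or data.

      # Illustrative sketch: indexing feature descriptors for fast nearest-neighbor
      # queries, in the spirit of (not identical to) the framework described above.
      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(0)

      # Synthetic stand-ins: 100,000 64-dimensional descriptors from a feature bank,
      # each tagged with the subject/image it came from.
      descriptors = rng.normal(size=(100_000, 64)).astype(np.float32)
      subject_ids = rng.integers(0, 1_000, size=100_000)

      tree = cKDTree(descriptors)    # build once, reuse for on-the-fly queries

      def match_query_image(query_descriptors, k=5):
          """Return the subjects whose features most often appear among the
          k nearest neighbors of the query image's descriptors."""
          # eps > 0 allows an approximate (faster) nearest-neighbor search
          _, idx = tree.query(query_descriptors, k=k, eps=0.5)
          votes = np.bincount(subject_ids[idx].ravel(), minlength=1_000)
          return np.argsort(votes)[::-1][:3]           # top-3 candidate subjects

      query = rng.normal(size=(200, 64)).astype(np.float32)
      print(match_query_image(query))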

  17. A Feature-Based Approach to Big Data Analysis of Medical Images.

    PubMed

    Toews, Matthew; Wachinger, Christian; Estepar, Raul San Jose; Wells, William M

    2015-01-01

    This paper proposes an inference method well-suited to large sets of medical images. The method is based upon a framework where distinctive 3D scale-invariant features are indexed efficiently to identify approximate nearest-neighbor (NN) feature matches in O(log N) computational complexity in the number of images N. It thus scales well to large data sets, in contrast to methods based on pair-wise image registration or feature matching requiring O(N) complexity. Our theoretical contribution is a density estimator based on a generative model that generalizes kernel density estimation and K-nearest neighbor (KNN) methods. The estimator can be used for on-the-fly queries, without requiring explicit parametric models or an off-line training phase. The method is validated on a large multi-site data set of 95,000,000 features extracted from 19,000 lung CT scans. Subject-level classification identifies all images of the same subjects across the entire data set despite deformation due to breathing state, including unintentional duplicate scans. State-of-the-art performance is achieved in predicting chronic pulmonary obstructive disorder (COPD) severity across the 5-category GOLD clinical rating, with an accuracy of 89% if both exact and one-off predictions are considered correct. PMID:26221685

  18. Unstructured medical image query using big data - An epilepsy case study.

    PubMed

    Istephan, Sarmad; Siadat, Mohammad-Reza

    2016-02-01

    Big data technologies are critical to the medical field, which requires new frameworks to leverage them. Such frameworks would help medical experts test hypotheses by querying huge volumes of unstructured medical data to provide better patient care. The objective of this work is to implement and examine the feasibility of having such a framework to provide efficient querying of unstructured data in unlimited ways. The feasibility study was conducted specifically in the epilepsy field. The proposed framework evaluates a query in two phases. In phase 1, structured data is used to filter the clinical data warehouse. In phase 2, feature extraction modules are executed on the unstructured data in a distributed manner via Hadoop to complete the query. Three modules have been created: volume comparer, surface-to-volume conversion, and average intensity. The framework allows for user-defined modules to be imported to provide unlimited ways to process the unstructured data, hence potentially extending the application of this framework beyond the epilepsy field. Two types of criteria were used to validate the feasibility of the proposed framework: the ability/accuracy of fulfilling an advanced medical query and the efficiency that Hadoop provides. For the first criterion, the framework executed an advanced medical query that spanned both structured and unstructured data with accurate results. For the second criterion, different architectures were explored to evaluate the performance of various Hadoop configurations and were compared to a traditional Single Server Architecture (SSA). The surface-to-volume conversion module performed up to 40 times faster than the SSA (using a 20 node Hadoop cluster) and the average intensity module performed up to 85 times faster than the SSA (using a 40 node Hadoop cluster). Furthermore, the 40 node Hadoop cluster executed the average intensity module on 10,000 models in 3 h, which was not even practical for the SSA. The current study is
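
    A minimal sketch of the two-phase query pattern described above (structured filtering followed by distributed feature extraction) is given below; it uses a local process pool in place of Hadoop, and the record fields and example module are illustrative assumptions rather than the framework's actual interface.

      # Illustrative sketch of the two-phase query pattern: filter on structured
      # fields first, then run a feature-extraction module over the matching
      # unstructured items in parallel (Hadoop in the paper; a process pool here).
      from concurrent.futures import ProcessPoolExecutor
      import numpy as np

      def average_intensity(volume):
          """Example phase-2 module: mean voxel intensity of a 3D volume."""
          return float(np.mean(volume))

      def run_query(records, min_age, threshold):
          # Phase 1: structured filter (here, an age field) narrows the candidate set.
          candidates = [r for r in records if r["age"] >= min_age]
          # Phase 2: feature extraction on the unstructured data, in parallel.
          with ProcessPoolExecutor() as pool:
              values = list(pool.map(average_intensity, [r["mri"] for r in candidates]))
          return [r["id"] for r, v in zip(candidates, values) if v > threshold]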

  19. Unstructured medical image query using big data - An epilepsy case study.

    PubMed

    Istephan, Sarmad; Siadat, Mohammad-Reza

    2016-02-01

    Big data technologies are critical to the medical field, which requires new frameworks to leverage them. Such frameworks would help medical experts test hypotheses by querying huge volumes of unstructured medical data to provide better patient care. The objective of this work is to implement and examine the feasibility of having such a framework to provide efficient querying of unstructured data in unlimited ways. The feasibility study was conducted specifically in the epilepsy field. The proposed framework evaluates a query in two phases. In phase 1, structured data is used to filter the clinical data warehouse. In phase 2, feature extraction modules are executed on the unstructured data in a distributed manner via Hadoop to complete the query. Three modules have been created: volume comparer, surface-to-volume conversion, and average intensity. The framework allows for user-defined modules to be imported to provide unlimited ways to process the unstructured data, hence potentially extending the application of this framework beyond the epilepsy field. Two types of criteria were used to validate the feasibility of the proposed framework: the ability/accuracy of fulfilling an advanced medical query and the efficiency that Hadoop provides. For the first criterion, the framework executed an advanced medical query that spanned both structured and unstructured data with accurate results. For the second criterion, different architectures were explored to evaluate the performance of various Hadoop configurations and were compared to a traditional Single Server Architecture (SSA). The surface-to-volume conversion module performed up to 40 times faster than the SSA (using a 20 node Hadoop cluster) and the average intensity module performed up to 85 times faster than the SSA (using a 40 node Hadoop cluster). Furthermore, the 40 node Hadoop cluster executed the average intensity module on 10,000 models in 3 h, which was not even practical for the SSA. The current study is

  20. Biomass estimator for NIR image with a few additional spectral band images taken from light UAS

    NASA Astrophysics Data System (ADS)

    Pölönen, Ilkka; Salo, Heikki; Saari, Heikki; Kaivosoja, Jere; Pesonen, Liisa; Honkavaara, Eija

    2012-05-01

    A novel way to produce biomass estimates will offer possibilities for precision farming. Fertilizer prediction maps can be made based on accurate biomass estimation generated by a novel biomass estimator. By using this knowledge, a variable rate of fertilizer can be applied during the growing season. The innovation consists of a light UAS, a high spatial resolution camera, and VTT's novel spectral camera. A few properly selected spectral wavelengths with NIR images and point clouds extracted by automatic image matching have been used in the estimation. The spectral wavelengths were chosen from the green, red, and NIR channels.
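
    The abstract does not specify the estimator itself; as a hedged illustration of how selected spectral bands and point-cloud heights might be combined into a biomass model, the sketch below computes a standard NDVI-type index and fits a simple linear model to synthetic plot data. The band choices, features, and coefficients are assumptions, not the authors' method.

      # Illustrative sketch only: combining a spectral vegetation index with a
      # canopy-height feature for biomass regression. The index, features, and
      # linear model are generic stand-ins, not the estimator in the paper.
      import numpy as np

      def ndvi(nir, red, eps=1e-6):
          """Normalized difference vegetation index per pixel."""
          nir = nir.astype(float)
          red = red.astype(float)
          return (nir - red) / (nir + red + eps)

      def plot_features(nir, red, canopy_height):
          """Aggregate per-plot predictors: mean NDVI and mean canopy height."""
          return np.array([ndvi(nir, red).mean(), canopy_height.mean()])

      # Fit a simple linear biomass model from reference plots (synthetic numbers).
      rng = np.random.default_rng(1)
      X = rng.uniform([0.2, 0.1], [0.9, 0.8], size=(30, 2))        # [NDVI, height] per plot
      y = 2.5 * X[:, 0] + 4.0 * X[:, 1] + rng.normal(0, 0.1, 30)   # synthetic biomass values
      coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
      print("biomass ~= %.2f*NDVI + %.2f*height + %.2f" % tuple(coef))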

  1. Three-dimensional oxygen isotope imaging of convective fluid flow around the Big Bonanza, Comstock lode mining district, Nevada

    USGS Publications Warehouse

    Criss, R.E.; Singleton, M.J.; Champion, D.E.

    2000-01-01

    Oxygen isotope analyses of propylitized andesites from the Con Virginia and California mines allow construction of a detailed, three-dimensional image of the isotopic surfaces produced by the convective fluid flows that deposited the famous Big Bonanza orebody. On a set of intersecting maps and sections, the δ18O isopleths clearly show the intricate and conformable relationship of the orebody to a deep, ~500 m gyre of meteoric-hydrothermal fluid that circulated along and above the Comstock fault, near the contact of the Davidson Granodiorite. The core of this gyre (δ18O = 0 to 3.8‰) encompasses the bonanza and is almost totally surrounded by rocks having much lower δ18O values (–1.0 to –4.4‰). This deep gyre may represent a convective longitudinal roll superimposed on a large unicellular meteoric-hydrothermal system, producing a complex flow field with both radial and longitudinal components that is consistent with experimentally observed patterns of fluid convection in permeable media.

  2. Classification Of Multi-Classed Stochastic Images Buried In Additive Noise

    NASA Astrophysics Data System (ADS)

    Gu, Zu-Han; Lee, Sing H.

    1987-01-01

    The Optimal Correlation Filter for the discrimination or classification of multi-class stochastic images buried in additive noise is designed. We consider noise in images as the (K+1)th class of stochastic image so that the K-class with noise problem becomes a problem of (K+1)-classes: K-class without noise plus the (K+1)th class of noise. Experimental verifications with both low frequency background noise and high frequency shot noise show that the new filter design is reliable.

  3. Big Heart Data: Advancing Health Informatics through Data Sharing in Cardiovascular Imaging

    PubMed Central

    Suinesiaputra, Avan; Medrano-Gracia, Pau; Cowan, Brett R.; Young, Alistair A.

    2015-01-01

    The burden of heart disease is rapidly worsening due to increasing prevalence of obesity and diabetes. Data sharing and open database resources for heart health informatics are important for advancing our understanding of cardiovascular function, disease progression and therapeutics. Data sharing enables valuable information, often obtained at considerable expense and effort, to be re-used beyond the specific objectives of the original study. Many government funding agencies and journal publishers are requiring data re-use, and are providing mechanisms for data curation and archival. Tools and infrastructure are available to archive anonymous data from a wide range of studies, from descriptive epidemiological data to gigabytes of imaging data. Meta-analyses can be performed to combine raw data from disparate studies to obtain unique comparisons or to enhance statistical power. Open benchmark datasets are invaluable for validating data analysis algorithms and objectively comparing results. This review provides a rationale for increased data sharing and surveys recent progress in the cardiovascular domain. We also highlight the potential of recent large cardiovascular epidemiological studies enabling collaborative efforts to facilitate data sharing, algorithms benchmarking, disease modeling and statistical atlases. PMID:25415993

  4. Big heart data: advancing health informatics through data sharing in cardiovascular imaging.

    PubMed

    Suinesiaputra, Avan; Medrano-Gracia, Pau; Cowan, Brett R; Young, Alistair A

    2015-07-01

    The burden of heart disease is rapidly worsening due to the increasing prevalence of obesity and diabetes. Data sharing and open database resources for heart health informatics are important for advancing our understanding of cardiovascular function, disease progression and therapeutics. Data sharing enables valuable information, often obtained at considerable expense and effort, to be reused beyond the specific objectives of the original study. Many government funding agencies and journal publishers are requiring data reuse, and are providing mechanisms for data curation and archival. Tools and infrastructure are available to archive anonymous data from a wide range of studies, from descriptive epidemiological data to gigabytes of imaging data. Meta-analyses can be performed to combine raw data from disparate studies to obtain unique comparisons or to enhance statistical power. Open benchmark datasets are invaluable for validating data analysis algorithms and objectively comparing results. This review provides a rationale for increased data sharing and surveys recent progress in the cardiovascular domain. We also highlight the potential of recent large cardiovascular epidemiological studies enabling collaborative efforts to facilitate data sharing, algorithms benchmarking, disease modeling and statistical atlases. PMID:25415993

  5. The BigBoss Experiment

    SciTech Connect

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; et al.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  6. Image and compositional characteristics of the LDEF Big Guy impact crater

    SciTech Connect

    Bunch, T.E.; Paque, J.M.; Zolensky, M.

    1995-02-01

    A 5.2 mm crater in Al-metal represents the largest found on LDEF. The authors have examined this crater by field emission scanning electron microscopy (FESEM), energy dispersive spectroscopy (EDS) and time-of-flight/secondary ion mass spectroscopy (TOF-SIMS) in order to determine if there is any evidence of impactor residue. Droplet and dome-shaped columns, along with flow features, are evidence of melting. EDS from the crater cavity and rim show Mg, C, O and variable amounts of Si, in addition to Al. No evidence for a chondritic impactor was found, and it is hypothesized that the crater may be the result of impact with space debris.

  7. Astronomy in the Cloud: Using MapReduce for Image Co-Addition

    NASA Astrophysics Data System (ADS)

    Wiley, K.; Connolly, A.; Gardner, J.; Krughoff, S.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.

    2011-03-01

    In the coming decade, astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. The study of these sources will involve computation challenges such as anomaly detection and classification and moving-object tracking. Since such studies benefit from the highest-quality data, methods such as image co-addition, i.e., astrometric registration followed by per-pixel summation, will be a critical preprocessing step prior to scientific investigation. With a requirement that these images be analyzed on a nightly basis to identify moving sources such as potentially hazardous asteroids or transient objects such as supernovae, these data streams present many computational challenges. Given the quantity of data involved, the computational load of these problems can only be addressed by distributing the workload over a large number of nodes. However, the high data throughput demanded by these applications may present scalability challenges for certain storage architectures. One scalable data-processing method that has emerged in recent years is MapReduce, and in this article we focus on its popular open-source implementation called Hadoop. In the Hadoop framework, the data are partitioned among storage attached directly to worker nodes, and the processing workload is scheduled in parallel on the nodes that contain the required input data. A further motivation for using Hadoop is that it allows us to exploit cloud computing resources: i.e., platforms where Hadoop is offered as a service. We report on our experience of implementing a scalable image-processing pipeline for the SDSS imaging database using Hadoop. This multiterabyte imaging data set provides a good testbed for algorithm development, since its scope and structure approximate future surveys. First, we describe MapReduce and how we adapted image co-addition to the MapReduce framework. Then we describe a number of optimizations to our basic approach
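
    As an illustration of how co-addition maps onto the MapReduce pattern described above, the following sketch simulates the map phase (emitting registered pixel values keyed by sky coordinate) and the reduce phase (per-pixel aggregation) in plain Python; the actual pipeline runs on Hadoop, and the tiny synthetic exposures and integer offsets here are placeholders.

      # Illustrative sketch: the map/reduce structure of image co-addition,
      # simulated in plain Python (the paper uses Hadoop; this is only the pattern).
      from collections import defaultdict
      import numpy as np

      def map_phase(image_id, image, offset):
          """Map: 'register' an image by its integer sky offset and emit
          (sky_pixel_coordinate, value) pairs."""
          dy, dx = offset
          for (y, x), value in np.ndenumerate(image):
              yield (y + dy, x + dx), value

      def reduce_phase(pairs):
          """Reduce: per-pixel aggregation (sum and count) over all emitted values."""
          sums, counts = defaultdict(float), defaultdict(int)
          for key, value in pairs:
              sums[key] += value
              counts[key] += 1
          return {key: sums[key] / counts[key] for key in sums}   # mean co-add

      # Two tiny overlapping 'exposures' with known offsets (synthetic data).
      images = {"exp1": (np.ones((3, 3)), (0, 0)), "exp2": (2 * np.ones((3, 3)), (1, 1))}
      pairs = (pair for name, (img, off) in images.items() for pair in map_phase(name, img, off))
      coadd = reduce_phase(pairs)
      print(coadd[(1, 1)])    # overlap region: mean of 1.0 and 2.0 -> 1.5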

  8. How Big Is Too Big?

    ERIC Educational Resources Information Center

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  9. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    PubMed

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features from both, while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered each modality using a coarse graining technique to speed the registration of 2D histology sections to high resolution 3D μCT datasets. Once registered, histomorphometric qualitative and quantitative bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15 %). This technique demonstrated the importance of the location of the histological section, showing that an offset of up to 30 % can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D printed titanium lattice implants.

  10. Terahertz imaging and tomography as efficient instruments for testing polymer additive manufacturing objects.

    PubMed

    Perraud, J B; Obaton, A F; Bou-Sleiman, J; Recur, B; Balacey, H; Darracq, F; Guillet, J P; Mounaix, P

    2016-05-01

    Additive manufacturing (AM) technology is not only used to make 3D objects but also for rapid prototyping. In industry and laboratories, quality controls for these objects are necessary though difficult to implement compared to classical methods of fabrication because the layer-by-layer printing allows for very complex object manufacturing that is unachievable with standard tools. Furthermore, AM can induce unknown or unexpected defects. Consequently, we demonstrate terahertz (THz) imaging as an innovative method for 2D inspection of polymer materials. Moreover, THz tomography may be considered as an alternative to x-ray tomography and cheaper 3D imaging for routine control. This paper proposes an experimental study of 3D polymer objects obtained by additive manufacturing techniques. This approach allows us to characterize defects and to control dimensions by volumetric measurements on 3D data reconstructed by tomography.

  11. Thermal imaging for assessment of electron-beam freeform fabrication (EBF3) additive manufacturing deposits

    NASA Astrophysics Data System (ADS)

    Zalameda, Joseph N.; Burke, Eric R.; Hafley, Robert A.; Taminger, Karen M.; Domack, Christopher S.; Brewer, Amy; Martin, Richard E.

    2013-05-01

    Additive manufacturing is a rapidly growing field where 3-dimensional parts can be produced layer by layer. NASA's electron beam freeform fabrication (EBF3) technology is being evaluated to manufacture metallic parts in a space environment. The benefits of EBF3 technology are weight savings to support space missions, rapid prototyping in a zero gravity environment, and improved vehicle readiness. The EBF3 system is composed of 3 main components: electron beam gun, multi-axis position system, and metallic wire feeder. The electron beam is used to melt the wire and the multi-axis positioning system is used to build the part layer by layer. To ensure a quality deposit, a near infrared (NIR) camera is used to image the melt pool and solidification areas. This paper describes the calibration and application of a NIR camera for temperature measurement. In addition, image processing techniques are presented for deposit assessment metrics.

  12. Terahertz imaging and tomography as efficient instruments for testing polymer additive manufacturing objects.

    PubMed

    Perraud, J B; Obaton, A F; Bou-Sleiman, J; Recur, B; Balacey, H; Darracq, F; Guillet, J P; Mounaix, P

    2016-05-01

    Additive manufacturing (AM) technology is not only used to make 3D objects but also for rapid prototyping. In industry and laboratories, quality controls for these objects are necessary though difficult to implement compared to classical methods of fabrication because the layer-by-layer printing allows for very complex object manufacturing that is unachievable with standard tools. Furthermore, AM can induce unknown or unexpected defects. Consequently, we demonstrate terahertz (THz) imaging as an innovative method for 2D inspection of polymer materials. Moreover, THz tomography may be considered as an alternative to x-ray tomography and cheaper 3D imaging for routine control. This paper proposes an experimental study of 3D polymer objects obtained by additive manufacturing techniques. This approach allows us to characterize defects and to control dimensions by volumetric measurements on 3D data reconstructed by tomography. PMID:27140357

  13. High resolution seismic-reflection imaging of shallow deformation beneath the northeast margin of the Manila high at Big Lake, Arkansas

    USGS Publications Warehouse

    Odum, J.K.; Stephenson, W.J.; Williams, R.A.; Worley, D.M.; Guccione, M.J.; Van Arsdale, R.B.

    2001-01-01

    The Manila high, an elliptical area 19 km long (N-S) by 6 km wide (E-W) located west-southwest of Big Lake, Arkansas, has less than 3 m of topographic relief. Geomorphic, stratigraphic and chronology data indicate that Big Lake formed during at least two periods of Holocene uplift and subsequent damming of the south-flowing Little River. Age data of an organic mat located at the base of an upper lacustrine deposit indicate an abrupt, possibly tectonic, formation of the present Big Lake between AD 1640 and 1950. We acquired 7 km of high-resolution seismic-reflection data across the northeastern margin of the Manila high to examine its near-surface bedrock structure and possible association with underlying structures such as the Blytheville arch. The sense of displacement and character of the imaged faults support interpretations of either a northwest-trending, 1.5-km-wide block of uplifted strata or a series of parallel northeast-trending faults that bound horst and graben structures. We interpret deformation of the Manila high to result from faulting generated by the reactivation of right-lateral strike-slip fault motion along this portion of the Blytheville arch. The most recent uplift of the Manila high may have occurred during the December 16, 1811, New Madrid earthquake. Published by Elsevier Science B.V.

  14. Focal masses in a non-cirrhotic liver: The additional benefit of CEUS over baseline imaging.

    PubMed

    Chiorean, L; Cantisani, V; Jenssen, C; Sidhu, P S; Baum, U; Dietrich, C F

    2015-09-01

    Incidentally detected focal liver lesions are commonly encountered in clinical practice, presenting a challenge in the daily department workflow. Guidelines for the management of incidental focal liver lesions have been published, but comments, illustrations, and recommendations regarding practical issues are crucial. The unique features of contrast-enhanced ultrasound in the non-invasive, real-time assessment of focal liver lesion enhancement throughout the vascular phases have allowed an impressive improvement in the diagnostic accuracy of ultrasound. We highlight the additional benefit of contrast-enhanced ultrasound over conventional B-mode ultrasound imaging in the detection, characterization, and differential and final diagnosis of focal liver lesions, as well as for liver metastases screening. The current roles of cross-sectional imaging are explained in detail, with indications and limitations for each procedure. The advantages of CEUS, such as the absence of ionizing radiation, cost benefits, non-iodinated contrast agents, and repeatability, are also described, ultimately improving patient management.

  15. Color reproductivity improvement with additional virtual color filters for WRGB image sensor

    NASA Astrophysics Data System (ADS)

    Kawada, Shun; Kuroda, Rihito; Sugawa, Shigetoshi

    2013-02-01

    We have developed a high-accuracy color reproduction method based on the estimated spectral reflectance of objects using additional virtual color filters for a wide dynamic range WRGB color filter CMOS image sensor. The four virtual color filters are created by multiplying the spectral sensitivity of the White pixel by Gaussian functions with different central wavelengths and standard deviations, and the virtual sensor outputs of those filters are estimated from the four real output signals of the WRGB image sensor. The accuracy of color reproduction was evaluated with a Macbeth Color Checker (MCC), and the averaged value of the color difference ΔEab over the 24 colors was 1.88 with our approach.
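
    A hedged sketch of the virtual-filter construction described above (multiplying a White-pixel sensitivity curve by Gaussian windows) is shown below; all spectral curves, centers, and widths are synthetic placeholders rather than the sensor's measured characteristics.

      # Illustrative sketch: building 'virtual' color filters by multiplying a
      # White-pixel spectral sensitivity by Gaussian windows. All curves are
      # synthetic placeholders, not the sensor characteristics from the paper.
      import numpy as np

      wavelengths = np.arange(400, 701, 5)                              # nm
      white_sensitivity = np.exp(-((wavelengths - 550) / 180.0) ** 2)   # broad, synthetic

      def virtual_filter(center_nm, sigma_nm):
          """Virtual sensitivity = White sensitivity x Gaussian window."""
          gauss = np.exp(-0.5 * ((wavelengths - center_nm) / sigma_nm) ** 2)
          return white_sensitivity * gauss

      # Four virtual filters with different centers and widths (illustrative values).
      virtual_filters = [virtual_filter(c, s) for c, s in
                         [(450, 25), (510, 30), (570, 30), (630, 35)]]

      def virtual_output(reflectance_spectrum, filt):
          """Estimated virtual-pixel output for a given object reflectance."""
          return np.trapz(reflectance_spectrum * filt, wavelengths)

      flat_gray = np.full_like(wavelengths, 0.5, dtype=float)
      print([round(virtual_output(flat_gray, f), 2) for f in virtual_filters])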

  16. Active Contours Using Additive Local and Global Intensity Fitting Models for Intensity Inhomogeneous Image Segmentation

    PubMed Central

    Soomro, Shafiullah; Kim, Jeong Heon; Soomro, Toufique Ahmed

    2016-01-01

    This paper introduces an improved region-based active contour method with a level set formulation. The proposed energy functional integrates both local and global intensity fitting terms in an additive formulation. The local intensity fitting term exerts a local force that pulls the contour and confines it to object boundaries. In turn, the global intensity fitting term drives the movement of the contour at a distance from the object boundaries. The global intensity term is based on the global division algorithm, which can capture the intensity information of an image better than the Chan-Vese (CV) model. Both local and global terms are combined to construct an energy function based on a level set formulation to segment images with intensity inhomogeneity. Experimental results show that the proposed method performs better both qualitatively and quantitatively compared to other state-of-the-art methods. PMID:27800011
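
    To illustrate the additive local-plus-global intensity fitting idea (not the paper's exact functional), the sketch below evaluates a Chan-Vese-style global term and a Gaussian-window local term for a given binary segmentation and combines them with a weight; the weight and window size are illustrative choices.

      # Illustrative sketch: an additive local + global intensity-fitting data term
      # for a region defined by a binary mask (not the paper's exact functional).
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def fitting_energy(image, inside, omega=0.5, sigma=3.0, eps=1e-8):
          """omega * E_local + (1 - omega) * E_global for a given segmentation."""
          image = image.astype(float)
          inside = inside.astype(float)
          outside = 1.0 - inside

          # Global (Chan-Vese-style) fitting: one mean intensity per region.
          c1 = (image * inside).sum() / (inside.sum() + eps)
          c2 = (image * outside).sum() / (outside.sum() + eps)
          e_global = ((image - c1) ** 2 * inside + (image - c2) ** 2 * outside).sum()

          # Local fitting: spatially varying means from a Gaussian window,
          # which tolerates intensity inhomogeneity.
          f1 = gaussian_filter(image * inside, sigma) / (gaussian_filter(inside, sigma) + eps)
          f2 = gaussian_filter(image * outside, sigma) / (gaussian_filter(outside, sigma) + eps)
          e_local = ((image - f1) ** 2 * inside + (image - f2) ** 2 * outside).sum()

          return omega * e_local + (1.0 - omega) * e_global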

  17. Big Dreams

    ERIC Educational Resources Information Center

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  18. Big Opportunities and Big Concerns of Big Data in Education

    ERIC Educational Resources Information Center

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  19. A method for predicting DCT-based denoising efficiency for grayscale images corrupted by AWGN and additive spatially correlated noise

    NASA Astrophysics Data System (ADS)

    Rubel, Aleksey S.; Lukin, Vladimir V.; Egiazarian, Karen O.

    2015-03-01

    Results of denoising based on the discrete cosine transform are obtained for a wide class of images corrupted by additive noise. Three types of noise are analyzed: additive white Gaussian noise and additive spatially correlated Gaussian noise with middle and high correlation levels. The TID2013 image database and some additional images are taken as test images. A conventional DCT filter and BM3D are used as denoising techniques. Denoising efficiency is described by the PSNR and PSNR-HVS-M metrics. Within the hard-thresholding denoising mechanism, DCT-spectrum coefficient statistics are used to characterize images and, subsequently, the denoising efficiency for them. Denoising efficiency results are fitted to these statistics, and efficient approximations are obtained. It is shown that the obtained approximations provide high accuracy of prediction of denoising efficiency.
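
    For context, a minimal sketch of the conventional blockwise DCT filter with hard thresholding referred to above is given below; the 8x8 block size and the 2.7*sigma threshold are common illustrative choices, not the settings or the prediction model from the paper.

      # Illustrative sketch: conventional blockwise DCT denoising with hard
      # thresholding (the baseline filter discussed above).
      import numpy as np
      from scipy.fft import dctn, idctn

      def dct_hard_threshold_denoise(noisy, sigma, block=8):
          """Denoise a grayscale image by zeroing small DCT coefficients
          in non-overlapping block x block tiles."""
          noisy = noisy.astype(float)
          out = np.zeros_like(noisy)
          threshold = 2.7 * sigma                          # common rule of thumb
          h, w = noisy.shape
          for y in range(0, h - block + 1, block):
              for x in range(0, w - block + 1, block):
                  tile = noisy[y:y + block, x:x + block]
                  coeffs = dctn(tile, norm="ortho")
                  coeffs[np.abs(coeffs) < threshold] = 0.0  # hard thresholding
                  out[y:y + block, x:x + block] = idctn(coeffs, norm="ortho")
          return out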

  20. Big Bang Circus

    NASA Astrophysics Data System (ADS)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  1. Live 3D image overlay for arterial duct closure with Amplatzer Duct Occluder II additional size.

    PubMed

    Goreczny, Sebastian; Morgan, Gareth J; Dryzek, Pawel

    2016-03-01

    Despite several reports describing echocardiography for the guidance of ductal closure, two-dimensional angiography remains the mainstay imaging tool; three-dimensional rotational angiography has the potential to overcome some of the drawbacks of standard angiography, and reconstructed image overlay provides reliable guidance for device placement. We describe arterial duct closure solely from a venous approach guided by live three-dimensional image overlay.

  2. Big data in multiple sclerosis: development of a web-based longitudinal study viewer in an imaging informatics-based eFolder system for complex data analysis and management

    NASA Astrophysics Data System (ADS)

    Ma, Kevin; Wang, Ximing; Lerner, Alex; Shiroishi, Mark; Amezcua, Lilyana; Liu, Brent

    2015-03-01

    In the past, we have developed and displayed a multiple sclerosis eFolder system for patient data storage, image viewing, and automatic lesion quantification results stored in DICOM-SR format. The web-based system aims to be integrated into DICOM-compliant clinical and research environments to aid clinicians in patient treatments and disease tracking. This year, we have further developed the eFolder system to handle big data analysis and data mining in today's medical imaging field. The database has been updated to allow data mining and data look-up from DICOM-SR lesion analysis contents. Longitudinal studies are tracked, and any changes in lesion volumes and brain parenchyma volumes are calculated and shown on the web-based user interface as graphical representations. Longitudinal lesion characteristic changes are compared with patients' disease history, including treatments, symptom progressions, and any other changes in the disease profile. The image viewer is updated such that imaging studies can be viewed side-by-side to allow visual comparisons. We aim to use the web-based medical imaging informatics eFolder system to demonstrate big data analysis in medical imaging, and use the analysis results to predict MS disease trends and patterns in Hispanic and Caucasian populations in our pilot study. The discovery of disease patterns among the two ethnicities is a big data analysis result that will help lead to personalized patient care and treatment planning.

  3. Solving the Big Data (BD) Problem in Advanced Manufacturing (Subcategory for work done at Georgia Tech. Study Process and Design Factors for Additive Manufacturing Improvement)

    SciTech Connect

    Clark, Brett W.; Diaz, Kimberly A.; Ochiobi, Chinaza Darlene; Paynabar, Kamran

    2015-09-01

    3D printing, originally known as additive manufacturing, is a process of making 3-dimensional solid objects from a CAD file. This groundbreaking technology is widely used for industrial and biomedical purposes such as building objects, tools, body parts and cosmetics. An important benefit of 3D printing is cost reduction and manufacturing flexibility; complex parts are built at a fraction of the price. However, layer-by-layer printing of complex shapes adds error due to surface roughness. Any such error results in poor quality products with inaccurate dimensions. The main purpose of this research is to measure the amount of printing error for parts with different geometric shapes and to analyze it to find optimal printing settings that minimize the error. We use a Design of Experiments framework, and focus on studying parts with cone and ellipsoid shapes. We found that the orientation and the shape of the parts have a significant effect on the printing error. From our analysis, we also determined the optimal orientation that gives the least printing error.

  4. Additional value of biplane transoesophageal imaging in assessment of mitral valve prostheses.

    PubMed Central

    Groundstroem, K; Rittoo, D; Hoffman, P; Bloomfield, P; Sutherland, G R

    1993-01-01

    OBJECTIVES--To determine whether biplane transoesophageal imaging offers advantages in the evaluation of mitral prostheses when compared with standard single transverse plane imaging or the precordial approach in suspected prosthetic dysfunction. DESIGN--Prospective study of patients with a mitral valve prosthesis in situ using precordial and biplane transoesophageal ultrasonography. SETTING--Tertiary cardiac referral centre. SUBJECTS--67 consecutive patients with suspected dysfunction of a mitral valve prosthesis (16 had bioprostheses and 51 mechanical prostheses) who underwent precordial, transverse plane, and biplane transoesophageal echocardiography. Correlative invasive confirmation from surgery or angiography, or both, was available in 44 patients. MAIN OUTCOME MEASURES--Number, type, and site of leak according to the three means of scanning. RESULTS--Transverse plane transoesophageal imaging alone identified all 31 medial/lateral paravalvar leaks but only 24/30 of the anterior/posterior leaks. Combining the information from both imaging planes confirmed that biplane scanning identified all paravalvar leaks. Five of the six patients with prosthetic valve endocarditis, all three with valvar thrombus or obstruction, and all three with mitral annulus rupture were diagnosed from transverse plane imaging alone. Longitudinal plane imaging alone enabled diagnosis of the remaining case of prosthetic endocarditis and a further case of subvalvar pannus formation. CONCLUSIONS--Transverse plane transoesophageal imaging was superior to longitudinal imaging in identifying medial and lateral lesions around the sewing ring of a mitral valve prosthesis. Longitudinal plane imaging was superior in identifying anterior and posterior lesions. Biplane imaging is therefore an important development in the study of mitral prosthesis function. PMID:8398497

  5. Application of Tapping-Mode Scanning Probe Electrospray Ionization to Mass Spectrometry Imaging of Additives in Polymer Films

    PubMed Central

    Shimazu, Ryo; Yamoto, Yoshinari; Kosaka, Tomoya; Kawasaki, Hideya; Arakawa, Ryuichi

    2014-01-01

    We report the application of tapping-mode scanning probe electrospray ionization (t-SPESI) to mass spectrometry imaging of industrial materials. The t-SPESI parameters including tapping solvent composition, solvent flow rate, number of tapping at each spot, and step-size were optimized using a quadrupole mass spectrometer to improve mass spectrometry (MS) imaging of thin-layer chromatography (TLC) and additives in polymer films. Spatial resolution of approximately 100 μm was achieved by t-SPESI imaging mass spectrometry using a fused-silica capillary (50 μm i.d., 150 μm o.d.) with the flow rate set at 0.2 μL/min. This allowed us to obtain discriminable MS imaging profiles of three dyes separated by TLC and the additive stripe pattern of a PMMA model film depleted by UV irradiation. PMID:26819894

  6. Big Sky Carbon Atlas

    DOE Data Explorer

    The Big Sky Carbon Atlas is an online geoportal designed for you to discover, interpret, and access geospatial data and maps relevant to decision support and education on carbon sequestration in the Big Sky Region. In serving as the public face of the Partnership's spatial Data Libraries, the Atlas provides a gateway to geographic information characterizing CO2 sources, potential geologic sinks, terrestrial carbon fluxes, civil and energy infrastructure, energy use, and related themes. In addition to directly serving the BSCSP and its stakeholders, the Atlas feeds regional data to the NatCarb Portal, contributing to a national perspective on carbon sequestration. Established components of the Atlas include a gallery of thematic maps and an interactive map that allows you to: • Navigate and explore regional characterization data through a user-friendly interface • Print your map views or publish them as PDFs • Identify technical references relevant to specific areas of interest • Calculate straight-line or pipeline-constrained distances from point sources of CO2 to potential geologic sink features • Download regional data layers (feature under development) (Acknowledgment to the Big Sky Carbon Sequestration Partnership (BSCSP); see home page at http://www.bigskyco2.org/)

  7. Satellite-based land use mapping: comparative analysis of Landsat-8, Advanced Land Imager, and big data Hyperion imagery

    NASA Astrophysics Data System (ADS)

    Pervez, Wasim; Uddin, Vali; Khan, Shoab Ahmad; Khan, Junaid Aziz

    2016-04-01

    Until recently, Landsat technology has suffered from low signal-to-noise ratio (SNR) and comparatively poor radiometric resolution, which resulted in limited application for inland water and land use/cover mapping. The new generation of Landsat, the Landsat Data Continuity Mission carrying the Operational Land Imager (OLI), has improved SNR and high radiometric resolution. This study evaluated the utility of orthoimagery from OLI in comparison with the Advanced Land Imager (ALI) and hyperspectral Hyperion (after preprocessing) with respect to spectral profiling of classes, land use/cover classification, classification accuracy assessment, classifier selection, study area selection, and other applications. For each data source, the support vector machine (SVM) model outperformed the spectral angle mapper (SAM) classifier in terms of class discrimination accuracy (i.e., water, built-up area, mixed forest, shrub, and bare soil). Using the SVM classifier, Hyperion hyperspectral orthoimagery achieved higher overall accuracy than OLI and ALI. However, OLI outperformed both hyperspectral Hyperion and multispectral ALI using the SAM classifier, and with the SVM classifier outperformed ALI in terms of overall accuracy and individual classes. The results show that the new generation of Landsat achieved higher accuracies in mapping compared with the previous Landsat multispectral satellite series.
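
    As a small illustration of one of the two classifiers compared above, the sketch below implements the spectral angle mapper (SAM) decision rule for pixel spectra; the reference spectra and band values are synthetic placeholders, not derived from the Landsat, ALI, or Hyperion data used in the study.

      # Illustrative sketch: the spectral angle mapper (SAM) rule used as one of the
      # two classifiers above. Reference spectra here are synthetic placeholders.
      import numpy as np

      def spectral_angle(pixel, reference, eps=1e-12):
          """Angle (radians) between a pixel spectrum and a class reference spectrum."""
          cos = np.dot(pixel, reference) / (
              np.linalg.norm(pixel) * np.linalg.norm(reference) + eps)
          return np.arccos(np.clip(cos, -1.0, 1.0))

      def sam_classify(pixels, references):
          """Assign each pixel to the class whose reference spectrum makes the
          smallest spectral angle with it. pixels: (N, bands); references: (C, bands)."""
          angles = np.array([[spectral_angle(p, r) for r in references] for p in pixels])
          return angles.argmin(axis=1)

      # Tiny synthetic example: 3 bands, 2 classes (e.g., 'water' vs 'bare soil').
      references = np.array([[0.05, 0.04, 0.02],    # water-like: dark, decreasing
                             [0.20, 0.30, 0.40]])   # soil-like: bright, increasing
      pixels = np.array([[0.06, 0.05, 0.02], [0.22, 0.28, 0.41]])
      print(sam_classify(pixels, references))        # expected: [0, 1]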

  8. SU-E-J-06: Additional Imaging Guidance Dose to Patient Organs Resulting From X-Ray Tubes Used in CyberKnife Image Guidance System

    SciTech Connect

    Sullivan, A; Ding, G

    2015-06-15

    Purpose: The use of image-guided radiation therapy (IGRT) has become increasingly common, but the additional radiation exposure resulting from repeated image guidance procedures raises concerns. Although there are many studies reporting imaging dose from different image guidance devices, imaging dose for the CyberKnife Robotic Radiosurgery System is not available. This study provides estimated organ doses resulting from image guidance procedures on the CyberKnife system. Methods: Commercially available Monte Carlo software, PCXMC, was used to calculate average organ doses resulting from the x-ray tubes used in the CyberKnife system. There are seven imaging protocols with kVp ranging from 60 to 120 kV and 15 mAs for treatment sites in the cranium, head and neck, thorax, and abdomen. The output of each imaging protocol was measured at the treatment isocenter. For each site and protocol, adult body sizes ranging from anorexic to extremely obese were simulated since organ dose depends on patient size. Doses for all organs within the imaging field-of-view of each site were calculated for a single image acquisition from both of the orthogonal x-ray tubes. Results: Average organ doses were <1.0 mGy for every treatment site and imaging protocol. For a given organ, dose increases as kV increases or body size decreases. Higher doses are typically reported for skeletal components, such as the skull, ribs, or clavicles, than for soft-tissue organs. Typical organ doses due to a single exposure are estimated as 0.23 mGy to the brain, 0.29 mGy to the heart, 0.08 mGy to the kidneys, etc., depending on the imaging protocol and site. Conclusion: The organ doses vary with treatment site, imaging protocol and patient size. Although the organ dose from a single image acquisition resulting from two orthogonal beams is generally insignificant, the sum of repeated image acquisitions (>100) could reach 10–20 cGy for a typical treatment fraction.

  9. Enhancement of Glossiness Perception by Retinal-Image Motion: Additional Effect of Head-Yoked Motion Parallax

    PubMed Central

    Tani, Yusuke; Araki, Keisuke; Nagai, Takehiro; Koida, Kowa; Nakauchi, Shigeki; Kitazaki, Michiteru

    2013-01-01

    It has been argued that when an observer moves, a contingent retinal-image motion of a stimulus would strengthen the perceived glossiness. This would be attributed to the veridical perception of three-dimensional structure by motion parallax. However, it has not been investigated whether the effect of motion parallax is more than that of retinal-image motion of the stimulus. Using a magnitude estimation method, we examine in this paper whether cross-modal coordination of the stimulus change and the observer's motion (i.e., motion parallax) is essential or the retinal-image motion alone is sufficient for enhancing the perceived glossiness. Our data show that a retinal-image motion simulating motion parallax without head motion strengthened the perceived glossiness but that its effect was weaker than that of motion parallax with head motion. These results suggest the existence of an additional effect of the cross-modal coordination between vision and proprioception on glossiness perception. That is, motion parallax enhances the perception of glossiness, in addition to retinal-image motions of specular surfaces. PMID:23336006

  10. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Navidi, Fatemeh; Beard, Daniel A.; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  11. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  12. Big data for bipolar disorder.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  13. Big data for bipolar disorder.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process. PMID:27068058

  14. Testing a Gender Additive Model: The Role of Body Image in Adolescent Depression

    ERIC Educational Resources Information Center

    Bearman, Sarah Kate; Stice, Eric

    2008-01-01

    Despite consistent evidence that adolescent girls are at greater risk of developing depression than adolescent boys, risk factor models that account for this difference have been elusive. The objective of this research was to examine risk factors proposed by the "gender additive" model of depression that attempts to partially explain the increased…

  15. Thermal Imaging for Assessment of Electron-Beam Free Form Fabrication (EBF(sup 3)) Additive Manufacturing Welds

    NASA Technical Reports Server (NTRS)

    Zalameda, Joseph N.; Burke, Eric R.; Hafley, Robert A.; Taminger, Karen M.; Domack, Christopher S.; Brewer, Amy R.; Martin, Richard E.

    2013-01-01

    Additive manufacturing is a rapidly growing field where 3-dimensional parts can be produced layer by layer. NASA's electron beam free-form fabrication (EBF(sup 3)) technology is being evaluated to manufacture metallic parts in a space environment. The benefits of EBF(sup 3) technology are weight savings to support space missions, rapid prototyping in a zero gravity environment, and improved vehicle readiness. The EBF(sup 3) system is composed of 3 main components: an electron beam gun, a multi-axis positioning system, and a metallic wire feeder. The electron beam is used to melt the wire and the multi-axis positioning system is used to build the part layer by layer. To ensure a quality weld, a near infrared (NIR) camera is used to image the melt pool and solidification areas. This paper describes the calibration and application of a NIR camera for temperature measurement. In addition, image processing techniques are presented for weld assessment metrics.

  16. Assessing the use of an infrared spectrum hyperpixel array imager to measure temperature during additive and subtractive manufacturing

    NASA Astrophysics Data System (ADS)

    Whitenton, Eric; Heigel, Jarred; Lane, Brandon; Moylan, Shawn

    2016-05-01

    Accurate non-contact temperature measurement is important to optimize manufacturing processes. This applies to both additive (3D printing) and subtractive (material removal by machining) manufacturing. Performing accurate single wavelength thermography suffers numerous challenges. A potential alternative is hyperpixel array hyperspectral imaging. Focusing on metals, this paper discusses issues involved such as unknown or changing emissivity, inaccurate greybody assumptions, motion blur, and size of source effects. The algorithm which converts measured thermal spectra to emissivity and temperature uses a customized multistep non-linear equation solver to determine the best-fit emission curve. Emissivity dependence on wavelength may be assumed uniform or have a relationship typical for metals. The custom software displays residuals for intensity, temperature, and emissivity to gauge the correctness of the greybody assumption. Initial results are shown from a laser powder-bed fusion additive process, as well as a machining process. In addition, the effects of motion blur are analyzed, which occurs in both additive and subtractive manufacturing processes. In a laser powder-bed fusion additive process, the scanning laser causes the melt pool to move rapidly, causing a motion blur-like effect. In machining, measuring temperature of the rapidly moving chip is a desirable goal to develop and validate simulations of the cutting process. A moving slit target is imaged to characterize how the measured temperature values are affected by motion of a measured target.
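
    The core of the temperature calculation described above is a best fit of an emission curve to each measured spectrum. As an illustration only, here is a minimal sketch (not the authors' solver) that assumes a wavelength-independent emissivity and fits Planck's law with scipy; the wavelength band, temperature, and noise level are hypothetical.

```python
# Minimal sketch: fit a greybody Planck curve to a measured thermal spectrum
# to recover temperature T and a wavelength-independent emissivity eps.
# All names and the synthetic spectrum are illustrative, not the paper's code.
import numpy as np
from scipy.optimize import curve_fit

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def planck(wl_m, T):
    """Spectral radiance of a blackbody at wavelength wl_m (m), temperature T (K)."""
    return (2 * H * C**2 / wl_m**5) / np.expm1(H * C / (wl_m * KB * T))

def greybody(wl_m, T, eps):
    """Greybody: uniform emissivity eps scaling the blackbody curve."""
    return eps * planck(wl_m, T)

# Hypothetical measured spectrum over 1.0-2.5 um with 2% multiplicative noise
rng = np.random.default_rng(0)
wl = np.linspace(1.0e-6, 2.5e-6, 64)
measured = greybody(wl, 1650.0, 0.35) * (1 + 0.02 * rng.standard_normal(wl.size))

# Best-fit emission curve: solve for T and eps simultaneously
(T_fit, eps_fit), _ = curve_fit(greybody, wl, measured, p0=(1200.0, 0.5),
                                bounds=([300.0, 0.01], [3500.0, 1.0]))
print(f"T = {T_fit:.0f} K, emissivity = {eps_fit:.2f}")
```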

  17. Determination of detergent and dispersant additives in gasoline by ring-oven and near infrared hyperspectral imaging.

    PubMed

    Rodrigues e Brito, Lívia; da Silva, Michelle P F; Rohwedder, Jarbas J R; Pasquini, Celio; Honorato, Fernanda A; Pimentel, Maria Fernanda

    2015-03-10

    A method using the ring-oven technique for pre-concentration in filter paper discs and near infrared hyperspectral imaging is proposed to identify four detergent and dispersant additives, and to determine their concentration in gasoline. Different approaches were used to select the best image data processing in order to gather the relevant spectral information. This was attained by selecting the pixels of the region of interest (ROI), using a pre-calculated threshold value of the PCA scores arranged as histograms, to select the spectra set; summing up the selected spectra to achieve representativeness; and compensating for the superimposed filter paper spectral information, also supported by scores histograms for each individual sample. The best classification model was achieved using linear discriminant analysis and genetic algorithm (LDA/GA), whose correct classification rate in the external validation set was 92%. Previous classification of the type of additive present in the gasoline is necessary to define the PLS model required for its quantitative determination. Considering that two of the additives studied present high spectral similarity, a PLS regression model was constructed to predict their content in gasoline, while two additional models were used for the remaining additives. The results for the external validation of these regression models showed a mean percentage error of prediction varying from 5 to 15%.
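
    To illustrate the quantitative step described above, the sketch below builds a PLS regression model that predicts an additive's concentration from pre-processed NIR spectra. It is a minimal example on synthetic spectra, not the authors' calibration; the number of latent variables, the concentration range, and the spectra themselves are placeholders.

```python
# Minimal sketch: PLS regression to predict an additive's concentration from
# pre-processed NIR spectra (one summed spectrum per ring-oven spot).
# The spectra and concentrations below are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 40, 200
concentration = rng.uniform(50, 400, n_samples)          # hypothetical units, e.g. mg/L
pure_spectrum = np.exp(-((np.arange(n_wavelengths) - 120) / 25) ** 2)
X = np.outer(concentration, pure_spectrum) + rng.normal(0, 0.5, (n_samples, n_wavelengths))

# Cross-validated prediction with 5 latent variables
pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, concentration, cv=5).ravel()
rmsecv = np.sqrt(np.mean((y_cv - concentration) ** 2))
mean_pct_error = np.mean(np.abs(y_cv - concentration) / concentration) * 100
print(f"RMSECV = {rmsecv:.1f}, mean % error = {mean_pct_error:.1f}%")
```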

  18. Flatbed scanners as a source of imaging. Brightness assessment and additives determination in a nickel electroplating bath.

    PubMed

    Vidal, M; Amigo, J M; Bro, R; Ostra, M; Ubide, C; Zuriarrain, J

    2011-05-23

    Desktop flatbed scanners are very well-known devices that can provide digitized information from flat surfaces. They are practically present in most laboratories as part of the computer support. Several quality levels can be found in the market, but all of them can be considered tools with high performance and low cost. The present paper shows how the information obtained with a scanner from a flat surface can be used, with good results, for exploratory and quantitative purposes through image analysis. It provides cheap analytical measurements for assessment of quality parameters of coated metallic surfaces and monitoring of electrochemical coating bath lives. The samples used were steel sheets nickel-plated in an electrodeposition bath. The quality of the final deposit depends on the bath conditions and, especially, on the concentration of the additives in the bath. Some additives degrade over the bath life, and so does the quality of the plate finish. Analysis of the scanner images can be used to follow the evolution of the metal deposit and the concentration of additives in the bath. Principal component analysis (PCA) is applied to find significant differences in the coating of sheets, to find directions of maximum variability, and to identify odd samples. The results found compare favorably with those obtained by means of specular reflectance (SR), which is used here as a reference technique. The concentration of the additives SPB and SA-1 over a nickel bath life can also be followed using image data handled with algorithms such as partial least squares (PLS) regression and support vector regression (SVR). The quantitative results obtained with these and other algorithms are compared. All this opens new qualitative and quantitative possibilities for flatbed scanners.

  19. Temperature Profile and Imaging Analysis of Laser Additive Manufacturing of Stainless Steel

    NASA Astrophysics Data System (ADS)

    Islam, M.; Purtonen, T.; Piili, H.; Salminen, A.; Nyrhilä, O.

    Powder bed fusion is a laser additive manufacturing (LAM) technology which is used to manufacture parts layer-wise from powdered metallic materials. The technology has advanced vastly in recent years, and current systems can be used to manufacture functional parts for, e.g., the aerospace industry. The performance and accuracy of the systems have also improved, but certain difficulties in the powder fusion process are reducing the final quality of the parts. One of these is commonly known as the balling phenomenon. The aim of this study was to define some of the process characteristics in powder bed fusion by performing comparative studies with two different test setups. This was done by comparing measured temperature profiles and on-line photography of the process. The material used during the research was EOS PH1 stainless steel. Both of the test systems were equipped with 200 W single mode fiber lasers. The main result of the research was that some of the process instabilities result from the energy input during the process.

  20. Five Big Ideas

    ERIC Educational Resources Information Center

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  1. Big data are coming to psychiatry: a general introduction.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Bauer, Michael

    2015-12-01

    Big data are coming to the study of bipolar disorder and all of psychiatry. Data are coming from providers and payers (including EMR, imaging, insurance claims and pharmacy data), from omics (genomic, proteomic, and metabolomic data), and from patients and non-providers (data from smart phone and Internet activities, sensors and monitoring tools). Analysis of the big data will provide unprecedented opportunities for exploration, descriptive observation, hypothesis generation, and prediction, and the results of big data studies will be incorporated into clinical practice. Technical challenges remain in the quality, analysis and management of big data. This paper discusses some of the fundamental opportunities and challenges of big data for psychiatry.

  2. Big data are coming to psychiatry: a general introduction.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Bauer, Michael

    2015-12-01

    Big data are coming to the study of bipolar disorder and all of psychiatry. Data are coming from providers and payers (including EMR, imaging, insurance claims and pharmacy data), from omics (genomic, proteomic, and metabolomic data), and from patients and non-providers (data from smart phone and Internet activities, sensors and monitoring tools). Analysis of the big data will provide unprecedented opportunities for exploration, descriptive observation, hypothesis generation, and prediction, and the results of big data studies will be incorporated into clinical practice. Technical challenges remain in the quality, analysis and management of big data. This paper discusses some of the fundamental opportunities and challenges of big data for psychiatry. PMID:26440506

  3. Big Data

    PubMed Central

    SOBEK, MATTHEW; CLEVELAND, LARA; FLOOD, SARAH; HALL, PATRICIA KELLY; KING, MIRIAM L.; RUGGLES, STEVEN; SCHROEDER, MATTHEW

    2011-01-01

    The Minnesota Population Center (MPC) provides aggregate data and microdata that have been integrated and harmonized to maximize crosstemporal and cross-spatial comparability. All MPC data products are distributed free of charge through an interactive Web interface that enables users to limit the data and metadata being analyzed to samples and variables of interest to their research. In this article, the authors describe the integrated databases available from the MPC, report on recent additions and enhancements to these data sets, and summarize new online tools and resources that help users to analyze the data over time. They conclude with a description of the MPC’s newest and largest infrastructure project to date: a global population and environment data network. PMID:21949459

  4. Deficits in Agency in Schizophrenia, and Additional Deficits in Body Image, Body Schema, and Internal Timing, in Passivity Symptoms

    PubMed Central

    Graham, Kyran T.; Martin-Iverson, Mathew T.; Holmes, Nicholas P.; Jablensky, Assen; Waters, Flavie

    2014-01-01

    Individuals with schizophrenia, particularly those with passivity symptoms, may not feel in control of their actions, believing them to be controlled by external agents. Cognitive operations that contribute to these symptoms may include abnormal processing in agency as well as body representations that deal with body schema and body image. However, these operations in schizophrenia are not fully understood, and the questions of general versus specific deficits in individuals with different symptom profiles remain unanswered. Using the projected-hand illusion (a digital video version of the rubber-hand illusion) with synchronous and asynchronous stroking (500 ms delay), and a hand laterality judgment task, we assessed sense of agency, body image, and body schema in 53 people with clinically stable schizophrenia (with a current, past, and no history of passivity symptoms) and 48 healthy controls. The results revealed a stable trait in schizophrenia with no difference between clinical subgroups (sense of agency) and some quantitative (specific) differences depending on the passivity symptom profile (body image and body schema). Specifically, a reduced sense of self-agency was a common feature of all clinical subgroups. However, subgroup comparisons showed that individuals with passivity symptoms (both current and past) had significantly greater deficits on tasks assessing body image and body schema, relative to the other groups. In addition, patients with current passivity symptoms failed to demonstrate the normal reduction in body illusion typically seen with a 500 ms delay in visual feedback (asynchronous condition), suggesting internal timing problems. Altogether, the results underscore self-abnormalities in schizophrenia, provide evidence for both trait abnormalities and state changes specific to passivity symptoms, and point to a role for internal timing deficits as a mechanistic explanation for external cues becoming a possible source of self-body input

  5. Big Spherules near 'Victoria'

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This frame from the microscopic imager on NASA's Mars Exploration Rover Opportunity shows spherules up to about 5 millimeters (one-fifth of an inch) in diameter. The camera took this image during the 924th Martian day, or sol, of Opportunity's Mars-surface mission (Aug. 30, 2006), when the rover was about 200 meters (650 feet) north of 'Victoria Crater.'

    Opportunity discovered spherules like these, nicknamed 'blueberries,' at its landing site in 'Eagle Crater,' and investigations determined them to be iron-rich concretions that formed inside deposits soaked with groundwater. However, such concretions were much smaller or absent at the ground surface along much of the rover's trek of more than 5 kilometers (3 miles) southward to Victoria. The big ones showed up again when Opportunity got to the ring, or annulus, of material excavated and thrown outward by the impact that created Victoria Crater. Researchers hypothesize that some layer beneath the surface in Victoria's vicinity was once soaked with water long enough to form the concretions, that the crater-forming impact dispersed some material from that layer, and that Opportunity might encounter that layer in place if the rover drives down into the crater.

  6. Magnetic resonance imaging in partial epilepsy: additional abnormalities shown with the fluid attenuated inversion recovery (FLAIR) pulse sequence.

    PubMed Central

    Bergin, P S; Fish, D R; Shorvon, S D; Oatridge, A; deSouza, N M; Bydder, G M

    1995-01-01

    Thirty-six patients with a history of partial epilepsy had MRI of the brain performed with conventional T1 and T2 weighted pulse sequences as well as the fluid attenuated inversion recovery (FLAIR) sequence. Abnormalities were found in 20 cases (56%), in whom there were 25 lesions or groups of lesions. Twenty-four of these lesions were more conspicuous with the FLAIR sequence than with any of the conventional sequences. In 11 of these 20 cases, lesions thought to be of aetiological importance were only seen with the FLAIR sequence. In eight this was a solitary lesion. In the other three, an additional and apparently significant lesion (or lesions) was only seen with the FLAIR sequence when another lesion had been identified with both conventional and FLAIR sequences. The 11 additional lesions or groups of lesions were seen in the hippocampus, amygdala, cortex, or subcortical and periventricular regions. No lesion was found with any pulse sequence in 16 (44%) of the original group of 36 patients. In the eight cases where a lesion was seen only with the FLAIR sequence, localisation was concordant with the electroclinical features. Two of the eight patients with solitary lesions seen only on the FLAIR sequence underwent surgery, after which there was pathological confirmation of the abnormality identified with imaging. In one patient with a congenital cavernoma, the primary lesion was best seen with a contrast enhanced T1 weighted spin echo sequence. In this selected series, the FLAIR sequence increased the yield of MRI examinations of the brain by 30%. PMID:7738550

  7. Dual of big bang and big crunch

    SciTech Connect

    Bak, Dongsu

    2007-01-15

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory.

  8. The Big Loser.

    ERIC Educational Resources Information Center

    Marks, Daniel

    1999-01-01

    Presents an activity in which the subject is the identity of the team in the greatest jeopardy of becoming the big loser in a basketball tournament. Explores several facts about the big loser, offering them in a hierarchy appropriate for creating various short- and long-term projects for a high school mathematics class. (ASK)

  9. Implementing Big History.

    ERIC Educational Resources Information Center

    Welter, Mark

    2000-01-01

    Contends that world history should be taught as "Big History," a view that includes all space and time beginning with the Big Bang. Discusses five "Cardinal Questions" that serve as a course structure and address the following concepts: perspectives, diversity, change and continuity, interdependence, and causes. (CMK)

  10. Big data: the management revolution.

    PubMed

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  11. Big data, big knowledge: big data for personalized healthcare.

    PubMed

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority. PMID:26218867

  12. Big data, big knowledge: big data for personalized healthcare.

    PubMed

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  13. Facile preparation and biological imaging of luminescent polymeric nanoprobes with aggregation-induced emission characteristics through Michael addition reaction.

    PubMed

    Lv, Qiulan; Wang, Ke; Xu, Dazhuang; Liu, Meiying; Wan, Qing; Huang, Hongye; Liang, Shangdong; Zhang, Xiaoyong; Wei, Yen

    2016-09-01

    Water-dispersible nanomaterials based on aggregation-induced emission (AIE) dyes have recently attracted increasing attention in the biomedical fields because of their unique optical properties and outstanding performance as imaging and therapeutic agents. Methods for conjugating hydrophilic polymers with AIE dyes, which overcome the hydrophobic nature of the dyes and make them widely usable in biomedicine, have been extensively explored previously. Although great advances have been made in the fabrication and biomedical applications of AIE-active polymeric nanoprobes, facile and efficient strategies for fabricating biodegradable AIE-active nanoprobes are still highly desirable. In this work, amphiphilic biodegradable fluorescent organic nanoparticles (PLL-TPE-O-E FONs) have been fabricated for the first time by conjugation of the AIE dye tetraphenylethene acrylate (TPE-O-E) with Poly-l-Lysine (PLL) through a facile one-step Michael addition reaction, carried out under rather mild conditions: an air atmosphere, near room temperature, and the absence of metal catalysts or hazardous reagents. Owing to their unique AIE properties, these amphiphilic copolymers tend to self-assemble into highly luminescent, water-dispersible nanoparticles with sizes ranging from 400 to 600 nm. Laser scanning microscopy and cytotoxicity results revealed that PLL-TPE-O-E FONs can be internalized into the cytoplasm with negligible cytotoxicity, which implies that PLL-TPE-O-E FONs are promising for biological applications. PMID:27311129

  14. Big Data in industry

    NASA Astrophysics Data System (ADS)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation where the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, and so on. The challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years; it is also recognized in the business world and, increasingly, in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. The paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.

  15. The Big Bang Theory

    ScienceCinema

    Lincoln, Don

    2016-07-12

    The Big Bang is the name of the most respected theory of the creation of the universe. Basically, the theory says that the universe was once smaller and denser and has been expanding for eons. One common misconception is that the Big Bang theory says something about the instant that set the expansion into motion; however, this isn’t true. In this video, Fermilab’s Dr. Don Lincoln talks about the Big Bang theory and sketches some speculative ideas about what caused the universe to come into existence.

  16. The Big Bang Theory

    SciTech Connect

    Lincoln, Don

    2014-09-30

    The Big Bang is the name of the most respected theory of the creation of the universe. Basically, the theory says that the universe was once smaller and denser and has been expanding for eons. One common misconception is that the Big Bang theory says something about the instant that set the expansion into motion; however, this isn’t true. In this video, Fermilab’s Dr. Don Lincoln talks about the Big Bang theory and sketches some speculative ideas about what caused the universe to come into existence.

  17. Genesis of the big bang

    NASA Astrophysics Data System (ADS)

    Alpher, Ralph A.; Herman, Robert

    The authors of this volume have been intimately connected with the conception of the big bang model since 1947. Following the late George Gamov's ideas in 1942 and more particularly in 1946 that the early universe was an appropriate site for the synthesis of the elements, they became deeply involved in the question of cosmic nucleosynthesis and particularly the synthesis of the light elements. In the course of this work they developed a general relativistic model of the expanding universe with physics folded in, which led in a progressive, logical sequence to their prediction of the existence of a present cosmic background radiation some seventeen years before the observation of such radiation was reported by Penzias and Wilson. In addition, they carried out with James W. Follin, Jr., a detailed study of the physics of what was then considered to be the very early universe, starting a few seconds after the big bang, which still provides a methodology for studies of light element nucleosynthesis. Because of their involvement, they bring a personal perspective to the subject. They present a picture of what is now believed to be the state of knowledge about the evolution of the expanding universe and delineate the story of the development of the big bang model as they have seen and lived it from their own unique vantage point.

  18. The Big Bang Singularity

    NASA Astrophysics Data System (ADS)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetrical assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetrical assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.

  19. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    ERIC Educational Resources Information Center

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own "big" words and dreams. During the one…

  20. Improvement in perception of image sharpness through the addition of noise and its relationship with memory texture

    NASA Astrophysics Data System (ADS)

    Wan, Xiazi; Kobayashi, Hiroyuki; Aoki, Naokazu

    2015-03-01

    In a preceding study, we investigated the effects of image noise on the perception of image sharpness using white noise and one- and two-dimensional single-frequency sinusoidal patterns as stimuli. This study extends our preceding study by evaluating natural color images, rather than black-and-white patterns. The results showed that the effect of noise in improving image sharpness perception is more evident in blurred images than in sharp images. This is consistent with the results of the preceding study. In another preceding study, we proposed "memory texture" to explain the preferred granularity of images, as a concept similar to "memory color" for preferred color reproduction. We observed individual differences in the type of memory texture for each object, that is, white or 1/f noise. This study discusses the relationship between the improvement in sharpness perception obtained by adding noise and the memory texture, taking its individual differences into account. We found that memory texture is one of the elements that affect sharpness perception.

  1. Will Big Data Mean the End of Privacy?

    ERIC Educational Resources Information Center

    Pence, Harry E.

    2015-01-01

    Big Data is currently a hot topic in the field of technology, and many campuses are considering the addition of this topic into their undergraduate courses. Big Data tools are not just playing an increasingly important role in many commercial enterprises; they are also combining with new digital devices to dramatically change privacy. This article…

  2. Uniform Big Bang-Chaotic Big Crunch optimization

    NASA Astrophysics Data System (ADS)

    Alatas, Bilal

    2011-09-01

    This study proposes methods to improve the convergence of the novel optimization method, Big Bang-Big Crunch (BB-BC). Uniform population method has been used to generate uniformly distributed random points in the Big Bang phase. Chaos has been utilized to rapidly shrink those points to a single representative point via a center of mass in the Big Crunch phase. The proposed algorithm has been named as Uniform Big Bang-Chaotic Big Crunch (UBB-CBC). The performance of the UBB-CBC optimization algorithm demonstrates superiority over the BB-BC optimization for the benchmark functions.
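
    For readers unfamiliar with the baseline algorithm, the sketch below implements a plain Big Bang-Big Crunch loop (uniform scatter, then collapse to a fitness-weighted center of mass) on a standard sphere benchmark. It is an illustrative sketch of the baseline only, not the proposed UBB-CBC variant; the population size, iteration count, and shrink schedule are assumptions.

```python
# Minimal sketch of a baseline Big Bang-Big Crunch (BB-BC) loop: candidates
# scatter over the search box, then collapse to a fitness-weighted center of
# mass. Parameters and the sphere benchmark are illustrative only.
import numpy as np

def sphere(x):
    # Classic benchmark: f(x) = sum(x_i^2), minimum 0 at the origin
    return np.sum(x ** 2, axis=1)

def bb_bc(objective, dim=5, pop=50, iters=100, lo=-5.0, hi=5.0, seed=0):
    rng = np.random.default_rng(seed)
    # Initial Big Bang: uniform random population over the search box
    x = rng.uniform(lo, hi, (pop, dim))
    for k in range(1, iters + 1):
        f = objective(x)
        # Big Crunch: fitness-weighted center of mass (minimization)
        w = 1.0 / (f + 1e-12)
        center = (w[:, None] * x).sum(axis=0) / w.sum()
        # Big Bang: re-scatter around the center, shrinking with iteration k
        spread = rng.standard_normal((pop, dim)) * (hi - lo) / k
        x = np.clip(center + spread, lo, hi)
    f = objective(x)
    return x[np.argmin(f)], f.min()

best_x, best_f = bb_bc(sphere)
print(best_f)  # should approach 0 on the sphere benchmark
```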

  3. Offshore oil & gas markets heating up: Gulf of Mexico rising from `Dead Sea` image; healthy Gulf, North Sea markets combine for big impact

    SciTech Connect

    Simmons, M.R.

    1995-09-01

    Only three years ago, Gulf of Mexico drilling activity was so moribund that it was termed the Dead Sea. But the market has changed so there is now effectively 100 percent utilization in several important categories of offshore rigs, and almost every type of offshore rig is now getting higher use and better rates. What makes these changes so profound is that few industry participants saw this tightness developing, and almost no one predicted that it would occur so soon. Even the largest offshore contractors were pleasantly surprised as they watched their key drilling markets tighten so quickly after many years of vast oversupply. Today, while neither the Gulf of Mexico nor the North Sea could be described as booming, they are not falling apart either. The combination of both markets merely being normal at the same time has made a big impact on the worldwide supply and demand for offshore drilling. The need for steady and increasing offshore oil and gas production has never been so high. The technology now in place is allowing the development of offshore areas deemed almost impossible less than a decade ago. Also, the vast excess supply of offshore equipment is gone for many forms of drilling, and the need for steadily higher dayrates is real and will merely increase over time.

  4. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    PubMed

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.

  5. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    PubMed

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed. PMID:21859221

  6. Post-lumpectomy CT-guided tumor bed delineation for breast boost and partial breast irradiation: Can additional pre- and postoperative imaging reduce interobserver variability?

    PubMed Central

    DEN HARTOGH, MARISKA D.; PHILIPPENS, MARIELLE E.P.; VAN DAM, IRIS E.; KLEYNEN, CATHARINA E.; TERSTEEG, ROBBERT J.H.A.; KOTTE, ALEXIS N.T.J.; VAN VULPEN, MARCO; VAN ASSELEN, BRAM; VAN DEN BONGARD, DESIRÉE H.J.G.

    2015-01-01

    For breast boost radiotherapy or accelerated partial breast irradiation, the tumor bed (TB) is delineated by the radiation oncologist on a planning computed tomography (CT) scan. The aim of the present study was to investigate whether the interobserver variability (IOV) of the TB delineation is reduced by providing the radiation oncologist with additional magnetic resonance imaging (MRI) or CT scans. A total of 14 T1-T2 breast cancer patients underwent a standard planning CT in the supine treatment position following lumpectomy, as well as additional pre- and postoperative imaging in the same position. Post-lumpectomy TBs were independently delineated by four breast radiation oncologists on standard postoperative CT and on CT registered to an additional imaging modality. The additional imaging modalities used were postoperative MRI, preoperative contrast-enhanced (CE)-CT and preoperative CE-MRI. A cavity visualization score (CVS) was assigned to each standard postoperative CT by each observer. In addition, the conformity index (CI), volume and distance between centers of mass (dCOM) of the TB delineations were calculated. On CT, the median CI was 0.57, with a median volume of 22 cm3 and dCOM of 5.1 mm. The addition of postoperative MRI increased the median TB volume significantly to 28 cm3 (P<0.001), while the CI (P=0.176) and dCOM (P=0.110) were not affected. The addition of preoperative CT or MRI increased the TB volume to 26 and 25 cm3, respectively (both P<0.001), while the CI increased to 0.58 and 0.59 (both P<0.001) and the dCOM decreased to 4.7 mm (P=0.004) and 4.6 mm (P=0.001), respectively. In patients with CVS≤3, the median CI was 0.40 on CT, which was significantly increased by all additional imaging modalities, up to 0.52, and was accompanied by a median volume increase up to 6 cm3. In conclusion, the addition of postoperative MRI, preoperative CE-CT or preoperative CE-MRI did not result in a considerable reduction in the IOV in postoperative CT
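
    The abstract reports a conformity index (CI) and the distance between centers of mass (dCOM) without defining them. One commonly used pairwise form, stated here only as an assumption about the metrics involved, is:

```latex
% Commonly used pairwise definitions (an assumption; the abstract does not
% state which variant was used):
\[
  \mathrm{CI}(A,B) = \frac{\lvert A \cap B \rvert}{\lvert A \cup B \rvert},
  \qquad
  d_{\mathrm{COM}} = \left\lVert \mathbf{c}_A - \mathbf{c}_B \right\rVert_2,
  \qquad
  \mathbf{c}_X = \frac{1}{\lvert X \rvert}\int_X \mathbf{r}\,\mathrm{d}V,
\]
% where A and B are two observers' tumor-bed volumes and c_X is the center
% of mass of volume X.
```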

  7. Big data in biomedicine.

    PubMed

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science.

  8. Bayesian big bang

    NASA Astrophysics Data System (ADS)

    Daum, Fred; Huang, Jim

    2011-09-01

    We show that the flow of particles corresponding to Bayes' rule has a number of striking similarities with the big bang, including cosmic inflation and cosmic acceleration. We derive a PDE for this flow using a log-homotopy from the prior probability density to the posterior probability density. We solve this PDE using the gradient of the solution to Poisson's equation, which is computed using an exact Green's function and the standard Monte Carlo approximation of integrals. The resulting flow is analogous to Coulomb's law in electromagnetics. We have used no physics per se to derive this flow, but rather we have only used Bayes' rule and the definition of normalized probability and a log-homotopy parameter that could be interpreted as time. The details of this big bang resemble very recent theories much more closely than the so-called new inflation models, which postulate enormous inflation immediately after the big bang.
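
    The log-homotopy mentioned in the abstract can be written, under the usual notation for particle-flow filters (an assumption here, since the abstract does not give the formula), as:

```latex
% Log-homotopy from prior to posterior (notation assumed, not quoted from
% the paper): g(x) is the prior density, h(x) the likelihood, and
% lambda in [0,1] acts as pseudo-time.
\[
  \log p(x,\lambda) = \log g(x) + \lambda \log h(x) - \log K(\lambda),
\]
% with K(lambda) a normalizing constant. At lambda = 0 the density is the
% prior; at lambda = 1 it is the Bayes posterior, so the particle flow in
% lambda carries samples from the prior to the posterior.
```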

  9. Big Questions: Missing Antimatter

    SciTech Connect

    Lincoln, Don

    2013-08-27

    Einstein's equation E = mc2 is often said to mean that energy can be converted into matter. More accurately, energy can be converted to matter and antimatter. During the first moments of the Big Bang, the universe was smaller, hotter and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However, when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he doesn't answer it. The answer, as in all Big Mysteries, is still unknown and one of the leading research topics of contemporary science.

  10. Big Questions: Missing Antimatter

    ScienceCinema

    Lincoln, Don

    2016-07-12

    Einstein's equation E = mc2 is often said to mean that energy can be converted into matter. More accurately, energy can be converted to matter and antimatter. During the first moments of the Big Bang, the universe was smaller, hotter and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However, when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he doesn't answer it. The answer, as in all Big Mysteries, is still unknown and one of the leading research topics of contemporary science.

  11. Coma morphology and dust-emission pattern of periodic comet Halley. III - Additional high-resolution images taken in 1910

    NASA Technical Reports Server (NTRS)

    Larson, S. M.; Sekanina, Z.

    1985-01-01

    High-resolution photographic images of comet Halley obtained at Lick, Helwan, Lowell, and Vienna observatories during May-June 1910 are analyzed using the image-processing algorithm of Larson and Sekanina (1984). The results are presented in tables and compared with analyses of Mt. Wilson plates for the same period (Sekanina and Larson, 1984), and good general agreement is noted. Consideration is given to the position of the rotation pole, features indicative of the nucleus spin rate, the spokelike structure observed on May 21-22, and the expanding gas halo.

  12. Evidence of big bang turbulence

    NASA Astrophysics Data System (ADS)

    Gibson, Carl H.

    2002-11-01

    Chaotic, eddy-like motions dominated by inertial-vortex forces begin at Planck scales in a hot big-bang-turbulence (BBT) cosmological model where this version of the quantum-gravitational-dynamics epoch produces not only the first space-time-energy of the universe but the first high Reynolds number turbulence and turbulent mixing with Kolmogorov and Batchelor-Obukhov-Corrsin velocity and temperature gradient spectra. Strong-force-freeze-out and inflation produced the first fossil-temperature-turbulence by stretching the fluctuations beyond the horizon scale ct of causal connection for light speed c and time t. Recent Cosmic Background Imager spectra of the cosmic microwave background (CMB) temperature anisotropies at high wavenumbers support the prediction that fossil BBT fluctuation patterns imprinted by nucleosynthesis on light element densities and the associated Sachs-Wolfe temperature fluctuations should not decay by thermal diffusion as expected if the CMB anisotropies were acoustic as commonly assumed. Extended Self Similarity coefficients of the CMB anisotropies exactly match those of high Reynolds number turbulence (Bershadskii and Sreenivasan 2002), supporting the conclusion that fossil big-bang-turbulence seeded nucleosynthesis of light elements and the first hydro-gravitational structure formation.
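
    For reference, the Kolmogorov velocity spectrum invoked above has the standard inertial-range form (stated here only as the textbook formula, not the paper's result):

```latex
% Standard Kolmogorov inertial-range velocity spectrum referenced above:
% k is the wavenumber, epsilon the viscous dissipation rate, and C_K the
% Kolmogorov constant (approximately 1.5).
\[
  E(k) = C_K\, \varepsilon^{2/3}\, k^{-5/3}.
\]
```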

  13. Astronomical surveys and big data

    NASA Astrophysics Data System (ADS)

    Mickaelian, Areg M.

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in the γ-ray range, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in the radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  14. A novel material detection algorithm based on 2D GMM-based power density function and image detail addition scheme in dual energy X-ray images.

    PubMed

    Pourghassem, Hossein

    2012-01-01

    Material detection is a vital need in dual energy X-ray luggage inspection systems used for security at airports and strategic places. In this paper, a novel material detection algorithm based on trainable statistical models using the 2-dimensional power density function (PDF) of three material categories in dual energy X-ray images is proposed. In this algorithm, the PDF of each material category is estimated as a statistical model from the transmission measurement values of the low and high energy X-ray images by Gaussian Mixture Models (GMM). The material label of each object pixel is determined from the probability of its low- and high-energy transmission values under the PDFs of the three material categories (metallic, organic, and mixed materials). The performance of the material detection algorithm is improved by a maximum voting scheme over an image neighborhood as a post-processing stage. As pre-processing, the high and low energy X-ray images are enhanced using background removal and denoising stages. To improve the discrimination capability of the proposed algorithm, the details of the low and high energy X-ray images are added to a constructed color image that uses three colors (orange, blue, and green) to represent the organic, metallic, and mixed materials. The proposed algorithm is evaluated on real images captured from a commercial dual energy X-ray luggage inspection system. The results show that the proposed algorithm detects metallic, organic, and mixed materials with acceptable accuracy.
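
    The per-pixel labeling step described above amounts to fitting one mixture model per material class over (low-energy, high-energy) transmission pairs and assigning each pixel to the most likely class. The sketch below illustrates that step only (no pre-processing, neighborhood voting, or color construction); the class means, component counts, and training data are synthetic assumptions.

```python
# Minimal sketch of the per-pixel labeling step: one Gaussian mixture per
# material class over (low-energy, high-energy) transmission pairs, then each
# pixel takes the class with the highest likelihood. Training data are
# synthetic placeholders.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
classes = ["organic", "metallic", "mixed"]
means = {"organic": (0.7, 0.8), "metallic": (0.2, 0.5), "mixed": (0.45, 0.65)}

# Fit one 2-D GMM per material category from labeled training pixels
models = {}
for name in classes:
    train = rng.normal(means[name], 0.05, size=(500, 2))
    models[name] = GaussianMixture(n_components=3, random_state=0).fit(train)

def label_pixels(pixels_low_high):
    """Assign each (low, high) transmission pair to the most likely material."""
    scores = np.column_stack([models[name].score_samples(pixels_low_high)
                              for name in classes])
    return np.array(classes)[scores.argmax(axis=1)]

test = rng.normal((0.68, 0.79), 0.05, size=(5, 2))
print(label_pixels(test))  # expected mostly "organic"
```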

  15. Unsupervised Tensor Mining for Big Data Practitioners.

    PubMed

    Papalexakis, Evangelos E; Faloutsos, Christos

    2016-09-01

    Multiaspect data are ubiquitous in modern Big Data applications. For instance, different aspects of a social network are the different types of communication between people, the time stamp of each interaction, and the location associated to each individual. How can we jointly model all those aspects and leverage the additional information that they introduce to our analysis? Tensors, which are multidimensional extensions of matrices, are a principled and mathematically sound way of modeling such multiaspect data. In this article, our goal is to popularize tensors and tensor decompositions to Big Data practitioners by demonstrating their effectiveness, outlining challenges that pertain to their application in Big Data scenarios, and presenting our recent work that tackles those challenges. We view this work as a step toward a fully automated, unsupervised tensor mining tool that can be easily and broadly adopted by practitioners in academia and industry.
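
    As a concrete illustration of the tensor decompositions the article advocates, the sketch below fits a rank-R CP (CANDECOMP/PARAFAC) model to a three-way tensor with a plain alternating-least-squares loop. The "people x interaction-type x time" tensor is synthetic, and the code is a minimal sketch, not the authors' toolkit.

```python
# Minimal sketch of a rank-R CP decomposition of a 3-way tensor via
# alternating least squares (ALS). The random tensor and rank are illustrative.
import numpy as np

def cp_als(X, rank, iters=200, seed=0):
    """Return factor matrices A, B, C with X approx sum_r a_r o b_r o c_r."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(iters):
        # Each update solves a linear least-squares problem for one factor
        A = np.linalg.solve((B.T @ B) * (C.T @ C), np.einsum('ijk,jr,kr->ir', X, B, C).T).T
        B = np.linalg.solve((A.T @ A) * (C.T @ C), np.einsum('ijk,ir,kr->jr', X, A, C).T).T
        C = np.linalg.solve((A.T @ A) * (B.T @ B), np.einsum('ijk,ir,jr->kr', X, A, B).T).T
    return A, B, C

# Build a synthetic rank-3 "people x interaction-type x time" tensor, then recover it
rng = np.random.default_rng(42)
true = [rng.random((dim, 3)) for dim in (30, 4, 20)]
X = np.einsum('ir,jr,kr->ijk', *true)
A, B, C = cp_als(X, rank=3)
rel_err = np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', A, B, C)) / np.linalg.norm(X)
print(f"relative reconstruction error: {rel_err:.2e}")  # typically small
```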

  16. Unsupervised Tensor Mining for Big Data Practitioners.

    PubMed

    Papalexakis, Evangelos E; Faloutsos, Christos

    2016-09-01

    Multiaspect data are ubiquitous in modern Big Data applications. For instance, different aspects of a social network are the different types of communication between people, the time stamp of each interaction, and the location associated to each individual. How can we jointly model all those aspects and leverage the additional information that they introduce to our analysis? Tensors, which are multidimensional extensions of matrices, are a principled and mathematically sound way of modeling such multiaspect data. In this article, our goal is to popularize tensors and tensor decompositions to Big Data practitioners by demonstrating their effectiveness, outlining challenges that pertain to their application in Big Data scenarios, and presenting our recent work that tackles those challenges. We view this work as a step toward a fully automated, unsupervised tensor mining tool that can be easily and broadly adopted by practitioners in academia and industry. PMID:27642720

  17. Differentiation between Glioblastoma Multiforme and Primary Cerebral Lymphoma: Additional Benefits of Quantitative Diffusion-Weighted MR Imaging

    PubMed Central

    Li, Chien Feng; Chen, Tai Yuan; Shu, Ginger; Kuo, Yu Ting; Lee, Yu Chang

    2016-01-01

    The differentiation between glioblastoma multiforme (GBM) and primary cerebral lymphoma (PCL) is important because the treatments are substantially different. The purpose of this article is to describe the MR imaging characteristics of GBM and PCL with emphasis on the quantitative ADC analysis in the tumor necrosis, the most strongly-enhanced tumor area, and the peritumoral edema. This retrospective cohort study collected 104 GBM (WHO grade IV) patients and 22 immune-competent PCL (diffuse large B cell lymphoma) patients. All these patients had pretreatment brain MR DWI and ADC imaging. Analysis of conventional MR imaging and quantitative ADC measurement including the tumor necrosis (ADCn), the most strongly-enhanced tumor area (ADCt), and the peritumoral edema (ADCe) were done. ROC analysis with optimal cut-off values and area-under-the ROC curve (AUC) was performed. For conventional MR imaging, there are statistical differences in tumor size, tumor location, tumor margin, and the presence of tumor necrosis between GBM and PCL. Quantitative ADC analysis shows that GBM tended to have significantly (P<0.05) higher ADC in the most strongly-enhanced area (ADCt) and lower ADC in the peritumoral edema (ADCe) as compared with PCL. Excellent AUC (0.94) with optimal sensitivity of 90% and specificity of 86% for differentiating between GBM and PCL was obtained by combination of ADC in the tumor necrosis (ADCn), the most strongly-enhanced tumor area (ADCt), and the peritumoral edema (ADCe). Besides, there are positive ADC gradients in the peritumoral edema in a subset of GBMs but not in the PCLs. Quantitative ADC analysis in these three areas can thus be implemented to improve diagnostic accuracy for these two brain tumor types. The histological correlation of the ADC difference deserves further investigation. PMID:27631626
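
    The ROC analysis reported above can be reproduced in outline as follows: score each case with some combination of ADCn, ADCt, and ADCe, compute the AUC, and select the cut-off maximizing Youden's J. The sketch below uses synthetic scores with the study's group sizes; the score distributions and the way the three ADC values are combined are assumptions, not study data.

```python
# Minimal sketch of the ROC/AUC step: score each case, compute AUC via the
# Mann-Whitney formulation, and choose the cut-off maximizing Youden's J.
# The ADC-like scores below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
score_gbm = rng.normal(1.10, 0.15, 104)   # hypothetical combined ADC score, GBM cases
score_pcl = rng.normal(0.85, 0.15, 22)    # hypothetical combined ADC score, PCL cases

# AUC = P(score_GBM > score_PCL), counting ties as 1/2
diff = score_gbm[:, None] - score_pcl[None, :]
auc = ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

# Optimal cut-off by Youden's J = sensitivity + specificity - 1
thresholds = np.sort(np.concatenate([score_gbm, score_pcl]))
sens = np.array([(score_gbm >= t).mean() for t in thresholds])
spec = np.array([(score_pcl < t).mean() for t in thresholds])
best = (sens + spec - 1).argmax()
print(f"AUC = {auc:.2f}, cut-off = {thresholds[best]:.2f}, "
      f"sensitivity = {sens[best]:.2f}, specificity = {spec[best]:.2f}")
```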

  18. Differentiation between Glioblastoma Multiforme and Primary Cerebral Lymphoma: Additional Benefits of Quantitative Diffusion-Weighted MR Imaging.

    PubMed

    Ko, Ching Chung; Tai, Ming Hong; Li, Chien Feng; Chen, Tai Yuan; Chen, Jeon Hor; Shu, Ginger; Kuo, Yu Ting; Lee, Yu Chang

    2016-01-01

    The differentiation between glioblastoma multiforme (GBM) and primary cerebral lymphoma (PCL) is important because the treatments are substantially different. The purpose of this article is to describe the MR imaging characteristics of GBM and PCL with emphasis on the quantitative ADC analysis in the tumor necrosis, the most strongly-enhanced tumor area, and the peritumoral edema. This retrospective cohort study collected 104 GBM (WHO grade IV) patients and 22 immune-competent PCL (diffuse large B cell lymphoma) patients. All these patients had pretreatment brain MR DWI and ADC imaging. Analysis of conventional MR imaging and quantitative ADC measurement including the tumor necrosis (ADCn), the most strongly-enhanced tumor area (ADCt), and the peritumoral edema (ADCe) were done. ROC analysis with optimal cut-off values and area-under-the ROC curve (AUC) was performed. For conventional MR imaging, there are statistical differences in tumor size, tumor location, tumor margin, and the presence of tumor necrosis between GBM and PCL. Quantitative ADC analysis shows that GBM tended to have significantly (P<0.05) higher ADC in the most strongly-enhanced area (ADCt) and lower ADC in the peritumoral edema (ADCe) as compared with PCL. Excellent AUC (0.94) with optimal sensitivity of 90% and specificity of 86% for differentiating between GBM and PCL was obtained by combination of ADC in the tumor necrosis (ADCn), the most strongly-enhanced tumor area (ADCt), and the peritumoral edema (ADCe). Besides, there are positive ADC gradients in the peritumoral edema in a subset of GBMs but not in the PCLs. Quantitative ADC analysis in these three areas can thus be implemented to improve diagnostic accuracy for these two brain tumor types. The histological correlation of the ADC difference deserves further investigation. PMID:27631626
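
    As a minimal illustration of the kind of ROC analysis described above, the sketch below fits a logistic model that combines the three ADC measurements per case and reports the AUC. The cohort sizes match the abstract, but the ADC values are synthetic stand-ins and scikit-learn is assumed to be available; this is not the study's analysis code.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      n_gbm, n_pcl = 104, 22      # cohort sizes from the abstract
      # Columns: ADCn (necrosis), ADCt (strongest enhancement), ADCe (peritumoral edema).
      gbm = np.column_stack([rng.normal(1.9, 0.4, n_gbm),
                             rng.normal(1.1, 0.2, n_gbm),    # assumed higher in GBM
                             rng.normal(1.3, 0.2, n_gbm)])   # assumed lower in GBM
      pcl = np.column_stack([rng.normal(1.6, 0.4, n_pcl),
                             rng.normal(0.9, 0.2, n_pcl),
                             rng.normal(1.5, 0.2, n_pcl)])
      X = np.vstack([gbm, pcl])
      y = np.concatenate([np.ones(n_gbm), np.zeros(n_pcl)])

      model = LogisticRegression(max_iter=1000).fit(X, y)
      auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
      print(f"AUC of the combined ADCn/ADCt/ADCe model: {auc:.2f}")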

  19. A Sobering Big Idea

    ERIC Educational Resources Information Center

    Wineburg, Sam

    2006-01-01

    Since Susan Adler, Alberta Dougan, and Jesus Garcia like "big ideas," the author offers one to ponder: young people in this country can not read with comprehension. The saddest thing about this crisis is that it is no secret. The 2001 results of the National Assessment of Educational Progress (NAEP) for reading, published in every major newspaper,…

  20. The Big Sky inside

    ERIC Educational Resources Information Center

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  1. The Big Fish

    ERIC Educational Resources Information Center

    DeLisle, Rebecca; Hargis, Jace

    2005-01-01

    The Killer Whale, Shamu jumps through hoops and splashes tourists in hopes for the big fish, not because of passion, desire or simply the enjoyment of doing so. What would happen if those fish were obsolete? Would this killer whale be able to find the passion to continue to entertain people? Or would Shamu find other exciting activities to do…

  2. Big-City Rules

    ERIC Educational Resources Information Center

    Gordon, Dan

    2011-01-01

    When it comes to implementing innovative classroom technology programs, urban school districts face significant challenges stemming from their big-city status. These range from large bureaucracies, to scalability, to how to meet the needs of a more diverse group of students. Because of their size, urban districts tend to have greater distance…

  3. A Big Bang Lab

    ERIC Educational Resources Information Center

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that can not be set up in lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  4. Big Enough for Everyone?

    ERIC Educational Resources Information Center

    Coote, Anna

    2010-01-01

    The UK's coalition government wants to build a "Big Society." The Prime Minister says "we are all in this together" and building it is the responsibility of every citizen as well as every government department. The broad vision is welcome, but everything depends on how the vision is translated into policy and practice. The government aims to put…

  5. The big bang

    NASA Astrophysics Data System (ADS)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The Big Bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  6. Thinking Big, Aiming High

    ERIC Educational Resources Information Center

    Berkeley, Viv

    2010-01-01

    What do teachers, providers and policymakers need to do in order to support disabled learners to "think big and aim high"? That was the question put to delegates at NIACE's annual disability conference. Some clear themes emerged, with delegates raising concerns about funding, teacher training, partnership-working and employment for disabled…

  7. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    PubMed

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived-notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines. (PsycINFO Database Record

  8. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    PubMed

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived-notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines. (PsycINFO Database Record PMID:26348599

  9. Big Data and Chemical Education

    ERIC Educational Resources Information Center

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  10. Business and Science - Big Data, Big Picture

    NASA Astrophysics Data System (ADS)

    Rosati, A.

    2013-12-01

    Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, they define what it means. But business is very different than science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  11. Computer image analysis: an additional tool for the identification of processed poultry and mammal protein containing bones.

    PubMed

    Pinotti, L; Fearn, T; Gulalp, S; Campagnoli, A; Ottoboni, M; Baldi, A; Cheli, F; Savoini, G; Dell'Orto, V

    2013-01-01

    The aims of this study were (1) to evaluate the potential of image analysis measurements, in combination with the official analytical methods for the detection of constituents of animal origin in feedstuffs, to distinguish between poultry versus mammals; and (2) to identify possible markers that can be used in routine analysis. For this purpose, 14 mammal and seven poultry samples and a total of 1081 bone fragment lacunae were analysed by combining the microscopic methods with computer image analysis. The distribution of 30 different measured size and shape bone lacunae variables were studied both within and between the two zoological classes. In all cases a considerable overlap between classes meant that classification of individual lacunae was problematic, though a clear separation in the means did allow successful classification of samples on the basis of averages. The variables most useful for classification were those related to size, lacuna area for example. The approach shows considerable promise but will need further study using a larger number of samples with a wider range.
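
    The point that individual lacunae overlap but sample-level averages separate can be made with a toy classifier that labels a sample by the mean area of its measured lacunae. All numbers below (area distributions, the 60 square-micron cutoff, even the direction of the difference) are invented placeholders, not the study's calibrated values.

      import numpy as np

      def classify_sample(lacuna_areas_um2, cutoff_um2=60.0):
          """Toy rule: call the sample 'mammal' if its mean lacuna area exceeds a cutoff."""
          return "mammal" if np.mean(lacuna_areas_um2) > cutoff_um2 else "poultry"

      rng = np.random.default_rng(2)
      # Individual lacunae overlap heavily between the two simulated samples,
      # but the per-sample means still separate them.
      sample_a = rng.normal(70, 25, size=80)   # assumed larger lacunae on average
      sample_b = rng.normal(50, 25, size=80)   # assumed smaller lacunae on average
      print(classify_sample(sample_a), classify_sample(sample_b))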

  12. Optical image hiding based on computational ghost imaging

    NASA Astrophysics Data System (ADS)

    Wang, Le; Zhao, Shengmei; Cheng, Weiwen; Gong, Longyan; Chen, Hanwu

    2016-05-01

    Image hiding schemes play an important role in the era of big data. They provide copyright protection for digital images. In this paper, we propose a novel image hiding scheme based on computational ghost imaging that offers strong robustness and high security. The watermark is encrypted with the configuration of a computational ghost imaging system, and the random speckle patterns compose a secret key. The least-significant-bit algorithm is adopted to embed the watermark, and both the second-order correlation algorithm and the compressed sensing (CS) algorithm are used to extract it. The experimental and simulation results show that authorized users can recover the watermark with the secret key. The watermark image cannot be retrieved when the eavesdropping ratio is less than 45% with the second-order correlation algorithm, or less than 20% with the TVAL3 CS reconstruction algorithm. In addition, the proposed scheme is robust against 'salt and pepper' noise and image cropping degradations.
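
    Of the pipeline sketched above, only the least-significant-bit embedding step is easy to show compactly; the ghost-imaging encryption, speckle-pattern key, and compressed-sensing recovery are not reproduced here. A minimal sketch of LSB embedding and extraction on synthetic 8-bit arrays:

      import numpy as np

      def lsb_embed(host_u8, watermark_bits):
          """Overwrite the least significant bit of the first pixels with watermark bits."""
          flat = host_u8.flatten()                       # copy of the host image
          n = watermark_bits.size
          flat[:n] = (flat[:n] & 0xFE) | watermark_bits
          return flat.reshape(host_u8.shape)

      def lsb_extract(stego_u8, n_bits):
          return stego_u8.flatten()[:n_bits] & 1

      rng = np.random.default_rng(3)
      host = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # synthetic host image
      bits = rng.integers(0, 2, size=512, dtype=np.uint8)          # stand-in for the encrypted watermark
      stego = lsb_embed(host, bits)
      assert np.array_equal(lsb_extract(stego, bits.size), bits)
      print("embedded and recovered", bits.size, "watermark bits")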

  13. Big Data and the Future of Radiology Informatics.

    PubMed

    Kansagra, Akash P; Yu, John-Paul J; Chatterjee, Arindam R; Lenchik, Leon; Chow, Daniel S; Prater, Adam B; Yeh, Jean; Doshi, Ankur M; Hawkins, C Matthew; Heilbrun, Marta E; Smith, Stacy E; Oselkin, Martin; Gupta, Pushpender; Ali, Sayed

    2016-01-01

    Rapid growth in the amount of data that is electronically recorded as part of routine clinical operations has generated great interest in the use of Big Data methodologies to address clinical and research questions. These methods can efficiently analyze and deliver insights from high-volume, high-variety, and high-growth rate datasets generated across the continuum of care, thereby forgoing the time, cost, and effort of more focused and controlled hypothesis-driven research. By virtue of an existing robust information technology infrastructure and years of archived digital data, radiology departments are particularly well positioned to take advantage of emerging Big Data techniques. In this review, we describe four areas in which Big Data is poised to have an immediate impact on radiology practice, research, and operations. In addition, we provide an overview of the Big Data adoption cycle and describe how academic radiology departments can promote Big Data development.

  14. Big Data and the Future of Radiology Informatics.

    PubMed

    Kansagra, Akash P; Yu, John-Paul J; Chatterjee, Arindam R; Lenchik, Leon; Chow, Daniel S; Prater, Adam B; Yeh, Jean; Doshi, Ankur M; Hawkins, C Matthew; Heilbrun, Marta E; Smith, Stacy E; Oselkin, Martin; Gupta, Pushpender; Ali, Sayed

    2016-01-01

    Rapid growth in the amount of data that is electronically recorded as part of routine clinical operations has generated great interest in the use of Big Data methodologies to address clinical and research questions. These methods can efficiently analyze and deliver insights from high-volume, high-variety, and high-growth rate datasets generated across the continuum of care, thereby forgoing the time, cost, and effort of more focused and controlled hypothesis-driven research. By virtue of an existing robust information technology infrastructure and years of archived digital data, radiology departments are particularly well positioned to take advantage of emerging Big Data techniques. In this review, we describe four areas in which Big Data is poised to have an immediate impact on radiology practice, research, and operations. In addition, we provide an overview of the Big Data adoption cycle and describe how academic radiology departments can promote Big Data development. PMID:26683510

  15. Small turbines, big unknown

    SciTech Connect

    Gipe, P.

    1995-07-01

    While financial markets focus on the wheeling and dealing of the big wind companies, the small wind turbine industry quietly keeps churning out its smaller but effective machines. Some, the micro turbines, are so small they can be carried by hand. Though worldwide sales of small wind turbines fall far short of even one large windpower plant, figures reach $8 million to $10 million annually and could be as much as twice that if batteries and engineering services are included.

  16. DARPA's Big Mechanism program.

    PubMed

    Cohen, Paul R

    2015-07-16

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  17. DARPA's Big Mechanism program

    NASA Astrophysics Data System (ADS)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  18. The Next Big Idea

    PubMed Central

    2013-01-01

    Abstract George S. Eisenbarth will remain in our memories as a brilliant scientist and great collaborator. His quest to discover the cause and prevention of type 1 (autoimmune) diabetes started from building predictive models based on immunogenetic markers. Despite his tremendous contributions to our understanding of the natural history of pre-type 1 diabetes and potential mechanisms, George left us with several big questions to answer before his quest is completed. PMID:23786296

  19. Big3. Editorial

    PubMed Central

    Lehmann, Christoph U.; Séroussi, Brigitte; Jaulent, Marie-Christine

    2014-01-01

    Summary Objectives To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. Methods A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model is provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. Results ‘Big Data’ has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have only just started to explore the opportunities that ‘Big Data’ will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics – some to a higher degree than others. It was our goal to provide a comprehensive view of the state of ‘Big Data’ today, explore its strengths and weaknesses as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions to the topic. Conclusions For the first time in its history, the IMIA Yearbook will be published in an open-access online format, allowing a broader readership, especially in resource-poor countries. For the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016. PMID:24853037

  20. A big first step.

    PubMed

    Jones, Howard W

    2004-11-01

    The singleton, term gestation, live birth rate per cycle initiated (BESST) endpoint proposed at the beginning of 2004 is a first big step which should be added to by the consideration of multiple pregnancy rates in relation to singleton rates, by recording of fetal reductions and of pregnancies resulting from cryopreserved material. After three or more steps we may have an accurate reporting system which helps patients to distinguish the pros and cons for singleton term delivery. PMID:15479704

  1. Big Data Technologies

    PubMed Central

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patient’s care processes and of single patient’s behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extract new knowledge from them. This article reviews the main concepts and definitions related to big data, it presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried on in the MOSAIC project, funded by the European Commission. PMID:25910540

  2. iPAINT: a general approach tailored to image the topology of interfaces with nanometer resolution

    PubMed Central

    Aloi, A.; Vilanova, N.

    2016-01-01

    Understanding interfacial phenomena in soft materials such as wetting, colloidal stability, coalescence, and friction warrants non-invasive imaging with nanometer resolution. Super-resolution microscopy has emerged as an attractive method to visualize nanostructures labeled covalently with fluorescent tags, but this is not amenable to all interfaces. Inspired by PAINT we developed a simple and general strategy to overcome this limitation, which we coin ‘iPAINT: interface Point Accumulation for Imaging in Nanoscale Topography’. It enables three-dimensional, sub-diffraction imaging of interfaces irrespective of their nature via reversible adsorption of polymer chains end-functionalized with photo-activatable moieties. We visualized model dispersions, emulsions, and foams with ∼20 nm and ∼3° accuracy demonstrating the general applicability of iPAINT to study solid/liquid, liquid/liquid and liquid/air interfaces. iPAINT thus broadens the scope of super-resolution microscopy paving the way for non-invasive, high-resolution imaging of complex soft materials. PMID:27055489

  3. Big Crater as Viewed by Pathfinder Lander

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft.

    The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft.

    This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each frame consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be.

    Mars Pathfinder is the second in NASA

  4. How Big is Earth?

    NASA Astrophysics Data System (ADS)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through the online learning environment provided by the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss and brainstorm solutions in a discussion forum. There is an ongoing database of student measurements and another database to collect data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
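
    Eratosthenes' calculation itself is short enough to show as a worked example: the noon shadow of a vertical stick gives the Sun's angle from the vertical, and that angle, as a fraction of 360 degrees, scales the north-south distance to a reference site up to the full circumference. The stick and shadow lengths below are hypothetical, chosen to reproduce the classical ~7.2 degree / ~800 km example.

      import math

      def shadow_angle_deg(stick_height_m, shadow_length_m):
          """Angle of the Sun from the vertical at local noon, from a vertical stick."""
          return math.degrees(math.atan2(shadow_length_m, stick_height_m))

      def circumference_km(angle_deg, baseline_km):
          """The angle is the fraction of a full circle subtended by the baseline."""
          return 360.0 / angle_deg * baseline_km

      angle = shadow_angle_deg(1.00, 0.126)        # roughly 7.2 degrees
      print(f"angle ~ {angle:.1f} deg -> circumference ~ {circumference_km(angle, 800):.0f} km")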

  5. Dark radiation emerging after big bang nucleosynthesis?

    SciTech Connect

    Fischler, Willy; Meyers, Joel

    2011-03-15

    We show how recent data from observations of the cosmic microwave background may suggest the presence of additional radiation density which appeared after big bang nucleosynthesis. We propose a general scheme by which this radiation could be produced from the decay of nonrelativistic matter, we place constraints on the properties of such matter, and we give specific examples of scenarios in which this general scheme may be realized.

  6. Solid-phase synthesis of graphene quantum dots from the food additive citric acid under microwave irradiation and their use in live-cell imaging.

    PubMed

    Zhuang, Qianfen; Wang, Yong; Ni, Yongnian

    2016-05-01

    The work demonstrated that solid citric acid, one of the most common food additives, can be converted to graphene quantum dots (GQDs) under microwave heating. The as-prepared GQDs were further characterized by various analytical techniques like transmission electron microscopy, atomic force microscopy, X-ray diffraction, X-ray photoelectron spectroscopy, Fourier transform infrared spectroscopy, fluorescence and UV-visible spectroscopy. Cytotoxicity of the GQDs was evaluated using HeLa cells. The result showed that the GQDs almost did not exhibit cytotoxicity at concentrations as high as 1000 µg mL(-1). In addition, it was found that the GQDs showed good solubility, excellent photostability, and excitation-dependent multicolor photoluminescence. Subsequently, the multicolor GQDs were successfully used as a fluorescence light-up probe for live-cell imaging.

  7. The International Big History Association

    ERIC Educational Resources Information Center

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big History" is a…

  8. The Big Read: Case Studies

    ERIC Educational Resources Information Center

    National Endowment for the Arts, 2009

    2009-01-01

    The Big Read evaluation included a series of 35 case studies designed to gather more in-depth information on the program's implementation and impact. The case studies gave readers a valuable first-hand look at The Big Read in context. Both formal and informal interviews, focus groups, attendance at a wide range of events--all showed how…

  9. Think Big, Bigger ... and Smaller

    ERIC Educational Resources Information Center

    Nisbett, Richard E.

    2010-01-01

    One important principle of social psychology, writes Nisbett, is that some big-seeming interventions have little or no effect. This article discusses a number of cases from the field of education that confirm this principle. For example, Head Start seems like a big intervention, but research has indicated that its effects on academic achievement…

  10. Towards Big Earth Data Analytics: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2013-04-01

    Big Data in the Earth sciences, the Tera- to Exabyte archives, are mostly made up of coverage data, whereby the term "coverage", according to ISO and OGC, is defined as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor timeseries, 2-D remote sensing imagery, 3-D x/y/t image timeseries and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as getting the Fourier transform of satellite images. As network bandwidth limits prohibit transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The EarthServer initiative, funded by EU FP7 eInfrastructures, unites 11 partners from computer and earth sciences to establish Big Earth Data Analytics. One key ingredient is flexibility for users to ask what they want, not impeded and complicated by system internals. The EarthServer answer to this is to use high-level query languages; these have proven tremendously successful on tabular and XML data, and we extend them with a central geo data structure, multi-dimensional arrays. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time and usually with manual intervention. The EarthServer approach is to perform a semantics-based dynamic distribution of query fragments based on network optimization and further criteria. The EarthServer platform is built around rasdaman, an Array DBMS enabling efficient storage and retrieval of any-size, any-type multi-dimensional raster data. In the project, rasdaman is being extended with several functionality and scalability features, including: support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data
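
    The queries such a platform evaluates are, conceptually, declarative array expressions: subset a coverage, apply per-cell band math, and aggregate. The sketch below is not rasdaman's query language; it is a plain NumPy analog of one such request (spatial subset, an NDVI-like band ratio, and a temporal mean) on a hypothetical x/y/t cube, meant only to make the idea of server-side array analytics concrete.

      import numpy as np

      rng = np.random.default_rng(4)
      t, ny, nx = 12, 100, 100                     # hypothetical (t, y, x) grid
      red = rng.uniform(0.05, 0.4, size=(t, ny, nx))
      nir = rng.uniform(0.2, 0.8, size=(t, ny, nx))

      # "Query": spatial subset, per-pixel band math, then a temporal aggregation.
      window = (slice(None), slice(20, 60), slice(30, 70))
      ndvi = (nir[window] - red[window]) / (nir[window] + red[window])
      print("mean NDVI over the subset:", float(ndvi.mean(axis=0).mean()))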

  11. The Rise of Big Data in Neurorehabilitation.

    PubMed

    Faroqi-Shah, Yasmeen

    2016-02-01

    In some fields, Big Data has been instrumental in analyzing, predicting, and influencing human behavior. However, Big Data approaches have so far been less central in speech-language pathology. This article introduces the concept of Big Data and provides examples of Big Data initiatives pertaining to adult neurorehabilitation. It also discusses the potential theoretical and clinical contributions that Big Data can make. The article also recognizes some impediments in building and using Big Data for scientific and clinical inquiry.

  12. Big-bounce genesis

    NASA Astrophysics Data System (ADS)

    Li, Changhong; Brandenberger, Robert H.; Cheung, Yeuk-Kwan E.

    2014-12-01

    We report on the possibility of using dark matter particle's mass and its interaction cross section as a smoking gun signal of the existence of a big bounce at the early stage in the evolution of our currently observed universe. A model independent study of dark matter production in the pre-bounce contraction and the post-bounce expansion epochs of the bounce universe reveals a new venue for achieving the observed relic abundance of our present universe, in which a significantly smaller amount of dark matter with a smaller cross section—as compared to the prediction of standard cosmology—is produced and the information about the bounce universe evolution is preserved by the out-of-thermal-equilibrium process. Once the value of dark matter mass and interaction cross section are obtained by direct detection in laboratories, this alternative route becomes a signature prediction of the bounce universe scenario.

  13. BIG DATA AND STATISTICS

    PubMed Central

    Rossell, David

    2016-01-01

    Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies. PMID:27722040

  14. Is There an Additional Value of {sup 11}C-Choline PET-CT to T2-weighted MRI Images in the Localization of Intraprostatic Tumor Nodules?

    SciTech Connect

    Van den Bergh, Laura; Koole, Michel; Isebaert, Sofie; Joniau, Steven; Deroose, Christophe M.; Oyen, Raymond; Lerut, Evelyne; Budiharto, Tom; Mottaghy, Felix; Bormans, Guy; Van Poppel, Hendrik; Haustermans, Karin

    2012-08-01

    Purpose: To investigate the additional value of {sup 11}C-choline positron emission tomography (PET)-computed tomography (CT) to T2-weighted (T2w) magnetic resonance imaging (MRI) for localization of intraprostatic tumor nodules. Methods and Materials: Forty-nine prostate cancer patients underwent T2w MRI and {sup 11}C-choline PET-CT before radical prostatectomy and extended lymphadenectomy. Tumor regions were outlined on the whole-mount histopathology sections and on the T2w MR images. Tumor localization was recorded in the basal, middle, and apical part of the prostate by means of an octant grid. To analyze {sup 11}C-choline PET-CT images, the same grid was used to calculate the standardized uptake values (SUV) per octant, after rigid registration with the T2w MR images for anatomic reference. Results: In total, 1,176 octants were analyzed. Sensitivity, specificity, and accuracy of T2w MRI were 33.5%, 94.6%, and 70.2%, respectively. For {sup 11}C-choline PET-CT, the mean SUV{sub max} of malignant octants was significantly higher than the mean SUV{sub max} of benign octants (3.69 {+-} 1.29 vs. 3.06 {+-} 0.97, p < 0.0001) which was also true for mean SUV{sub mean} values (2.39 {+-} 0.77 vs. 1.94 {+-} 0.61, p < 0.0001). A positive correlation was observed between SUV{sub mean} and absolute tumor volume (Spearman r = 0.3003, p = 0.0362). No correlation was found between SUVs and prostate-specific antigen, T-stage or Gleason score. The highest accuracy (61.1%) was obtained with a SUV{sub max} cutoff of 2.70, resulting in a sensitivity of 77.4% and a specificity of 44.9%. When both modalities were combined (PET-CT or MRI positive), sensitivity levels increased as a function of SUV{sub max} but at the cost of specificity. When only considering suspect octants on {sup 11}C-choline PET-CT (SUV{sub max} {>=} 2.70) and T2w MRI, 84.7% of these segments were in agreement with the gold standard, compared with 80.5% for T2w MRI alone. Conclusions: The additional value of {sup
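
    To make the reported operating point easy to reproduce in spirit, the sketch below computes the sensitivity and specificity of an SUVmax cutoff over a set of octants. Only the cutoff of 2.70 and the group means and standard deviations are taken from the abstract; the per-octant values are synthetic, so the resulting numbers will not match the paper exactly.

      import numpy as np

      def sens_spec(suv_max, is_malignant, cutoff=2.70):
          """Sensitivity/specificity of calling an octant malignant when SUVmax >= cutoff."""
          pred = suv_max >= cutoff
          sensitivity = np.mean(pred[is_malignant])       # true positive rate
          specificity = np.mean(~pred[~is_malignant])     # true negative rate
          return sensitivity, specificity

      rng = np.random.default_rng(5)
      malignant = rng.normal(3.69, 1.29, 400)             # synthetic malignant octants
      benign = rng.normal(3.06, 0.97, 776)                # synthetic benign octants
      suv = np.concatenate([malignant, benign])
      labels = np.concatenate([np.ones(400, bool), np.zeros(776, bool)])
      sens, spec = sens_spec(suv, labels)
      print(f"sensitivity {sens:.2f}, specificity {spec:.2f} at SUVmax >= 2.70")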

  15. Big bang and big crunch in matrix string theory

    SciTech Connect

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-04-15

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe.

  16. Big questions, big science: meeting the challenges of global ecology.

    PubMed

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets, than can be collected by a single investigator's or a single group of investigators' labs, or sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects.

  17. Magnetic resonance imaging: A potential tool in assessing the addition of hyperthermia to neoadjuvant therapy in patients with locally advanced breast cancer

    PubMed Central

    CRACIUNESCU, OANA I.; THRALL, DONALD E.; VUJASKOVIC, ZELJKO; DEWHIRST, MARK W.

    2010-01-01

    The poor overall survival for patients with locally advanced breast cancers has led over the past decade to the introduction of numerous neoadjuvant combined therapy regimens to down-stage the disease before surgery. At the same time, more evidence suggests the need for treatment individualisation with a wide variety of new targets for cancer therapeutics and also multi-modality therapies. In this context, early determination of whether the patient will fail to respond can enable the use of alternative therapies that can be more beneficial. The purpose of this review is to examine the potential role of magnetic resonance imaging (MRI) in early prediction of treatment response and prognosis of overall survival in locally advanced breast cancer patients enrolled on multi-modality therapy trials that include hyperthermia. The material is organised with a review of dynamic contrast-enhanced (DCE) MRI and diffusion-weighted (DW) MRI for characterisation of phenomenological parameters of tumour physiology and their potential role in estimating therapy response. Most of the work published in this field has focused on responses to neoadjuvant chemotherapy regimens alone, so the emphasis will be there; however, the available data that involve the addition of hyperthermia to the regimen will also be discussed. The review will also cover future directions, including the potential use of MR imaging techniques in establishing the role of hyperthermia alone in modifying the breast tumour microenvironment, together with specific challenges related to performing such studies. PMID:20849258

  18. The challenges of big data.

    PubMed

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest.

  19. Homogeneous and isotropic big rips?

    SciTech Connect

    Giovannini, Massimo

    2005-10-15

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behavior is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  20. Big Data and Ambulatory Care

    PubMed Central

    Thorpe, Jane Hyatt; Gray, Elizabeth Alexandra

    2015-01-01

    Big data is heralded as having the potential to revolutionize health care by making large amounts of data available to support care delivery, population health, and patient engagement. Critics argue that big data's transformative potential is inhibited by privacy requirements that restrict health information exchange. However, there are a variety of permissible activities involving use and disclosure of patient information that support care delivery and management. This article presents an overview of the legal framework governing health information, dispels misconceptions about privacy regulations, and highlights how ambulatory care providers in particular can maximize the utility of big data to improve care. PMID:25401945

  1. The challenges of big data

    PubMed Central

    2016-01-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  2. The NOAA Big Data Project

    NASA Astrophysics Data System (ADS)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  3. Big climate data analysis

    NASA Astrophysics Data System (ADS)

    Mudelsee, Manfred

    2015-04-01

    The Big Data era has begun also in the climate sciences, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014 Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Second edition. Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of the three subspaces Monte Carlo design, method and measure. The Monte Carlo design describes the data generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on basis of existing knowledge and experience with such types of estimation problems. The second task requires the computing power: explore the hyperspace to
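
    The resampling machinery behind such 'optimal estimation' can be illustrated in a few lines: the sketch below bootstraps a 95% confidence interval for the linear trend of a short synthetic temperature series. Real climate series are autocorrelated and would need block resampling rather than the simple pairwise resampling shown here; all numbers are invented.

      import numpy as np

      rng = np.random.default_rng(6)
      years = np.arange(1980, 2015)
      temp = 0.02 * (years - years[0]) + rng.normal(0, 0.15, years.size)  # synthetic series

      def slope(x, y):
          return np.polyfit(x, y, 1)[0]

      boot = []
      for _ in range(2000):
          idx = rng.integers(0, years.size, years.size)   # resample (year, temp) pairs
          boot.append(slope(years[idx], temp[idx]))
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"trend {slope(years, temp):.3f} deg/yr, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")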

  4. The BigBOSS Experiment

    SciTech Connect

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating; what is dark matter and what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support by the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  5. Challenges of Big Data Analysis.

    PubMed

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.

  6. Challenges of Big Data Analysis

    PubMed Central

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-01-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions. PMID:25419469
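
    One of the pitfalls named above, spurious correlation, is easy to demonstrate: when there are far more candidate variables than samples, some pure-noise variable will correlate strongly with any target by chance alone. A minimal simulation (dimensions chosen arbitrarily):

      import numpy as np

      rng = np.random.default_rng(7)
      n, p = 60, 5000                       # few samples, many candidate predictors
      X = rng.normal(size=(n, p))           # predictors: pure noise
      y = rng.normal(size=n)                # target: unrelated to X by construction

      corr = (X - X.mean(0)).T @ (y - y.mean()) / (n * X.std(0) * y.std())
      print("largest absolute correlation found by chance:", float(np.abs(corr).max()))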

  7. Powering Big Data for Nursing Through Partnership.

    PubMed

    Harper, Ellen M; Parkerson, Sara

    2015-01-01

    The Big Data Principles Workgroup (Workgroup) was established with support of the Healthcare Information and Management Systems Society. Building on the Triple Aim challenge, the Workgroup sought to identify Big Data principles, barriers, and challenges to nurse-sensitive data inclusion into Big Data sets. The product of this pioneering partnership Workgroup was the "Guiding Principles for Big Data in Nursing-Using Big Data to Improve the Quality of Care and Outcomes."

  8. The Economics of Big Area Additive Manufacturing

    SciTech Connect

    Post, Brian; Lloyd, Peter D; Lindahl, John; Lind, Randall F; Love, Lonnie J; Kunc, Vlastimil

    2016-01-01

    Case studies on the economics of Additive Manufacturing (AM) suggest that processing time is the dominant cost in manufacturing. Most additive processes have similar performance metrics: small part sizes, low production rates and expensive feedstocks. Big Area Additive Manufacturing is based on transitioning polymer extrusion technology from a wire to a pellet feedstock. Utilizing pellets significantly increases deposition speed and lowers material cost by utilizing low cost injection molding feedstock. The use of carbon fiber reinforced polymers eliminates the need for a heated chamber, significantly reducing machine power requirements and size constraints. We hypothesize that the increase in productivity coupled with decrease in feedstock and energy costs will enable AM to become more competitive with conventional manufacturing processes for many applications. As a test case, we compare the cost of using traditional fused deposition modeling (FDM) with BAAM for additively manufacturing composite tooling.
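
    The comparison described above reduces, to first order, to machine time (set by deposition rate) plus material cost (set by feedstock price). The sketch below implements that simple cost model; every number in it is a hypothetical placeholder rather than a value measured in the study.

      def part_cost(mass_kg, deposition_rate_kg_per_hr, feedstock_usd_per_kg,
                    machine_rate_usd_per_hr):
          """First-order additive-manufacturing part cost: machine time + material."""
          hours = mass_kg / deposition_rate_kg_per_hr
          return hours * machine_rate_usd_per_hr + mass_kg * feedstock_usd_per_kg

      # Hypothetical 50 kg composite tool.
      fdm = part_cost(50, 0.05, 200.0, 30.0)    # wire feedstock, slow deposition (assumed)
      baam = part_cost(50, 15.0, 8.0, 100.0)    # pellet feedstock, fast deposition (assumed)
      print(f"FDM ~${fdm:,.0f} vs BAAM ~${baam:,.0f} for the same part")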

  9. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  10. JPL Big Data Technologies for Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Jones, Dayton L.; D'Addario, L. R.; De Jong, E. M.; Mattmann, C. A.; Rebbapragada, U. D.; Thompson, D. R.; Wagstaff, K.

    2014-04-01

    During the past three years the Jet Propulsion Laboratory has been working on several technologies to deal with big data challenges facing next-generation radio arrays, among other applications. This program has focused on the following four areas: 1) We are investigating high-level ASIC architectures that reduce power consumption for cross-correlation of data from large interferometer arrays by one to two orders of magnitude. The cost of operations for the Square Kilometre Array (SKA), which may be dominated by the cost of power for data processing, is a serious concern. A large improvement in correlator power efficiency could have a major positive impact. 2) Data-adaptive algorithms (machine learning) for real-time detection and classification of fast transient signals in high volume data streams are being developed and demonstrated. Studies of the dynamic universe, particularly searches for fast (<< 1 second) transient events, require that data be analyzed rapidly and with robust RFI rejection. JPL, in collaboration with the International Center for Radio Astronomy Research in Australia, has developed a fast transient search system for eventual deployment on ASKAP. In addition, a real-time transient detection experiment is now running continuously and commensally on NRAO's Very Long Baseline Array. 3) Scalable frameworks for data archiving, mining, and distribution are being applied to radio astronomy. A set of powerful open-source Object Oriented Data Technology (OODT) tools is now available through Apache. OODT was developed at JPL for Earth science data archives, but it is proving to be useful for radio astronomy, planetary science, health care, Earth climate, and other large-scale archives. 4) We are creating automated, event-driven data visualization tools that can be used to extract information from a wide range of complex data sets. Visualization of complex data can be improved through algorithms that detect events or features of interest and autonomously

  11. Big Sky Carbon Sequestration Partnership

    SciTech Connect

    Susan Capalbo

    2005-12-31

    has significant potential to sequester large amounts of CO₂. Simulations conducted to evaluate the mineral trapping potential of mafic volcanic rock formations located in the Idaho province suggest that supercritical CO₂ is converted to solid carbonate mineral within a few hundred years and permanently entombs the carbon. Although MMV for this rock type may be challenging, a carefully chosen combination of geophysical and geochemical techniques should allow assessment of the fate of CO₂ in deep basalt-hosted aquifers. Terrestrial carbon sequestration relies on land management practices and technologies to remove atmospheric CO₂, which is then stored in trees, plants, and soil. This indirect sequestration can be implemented today and is on the front line of voluntary, market-based approaches to reduce CO₂ emissions. Initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil carbon (C) on rangelands and on forested, agricultural, and reclaimed lands. Rangelands can store up to an additional 0.05 mt C/ha/yr, while croplands average four times that amount. Estimates of the technical potential for soil sequestration in cropland within the region are in the range of 2.0 M mt C/yr over a 20-year time horizon. This is equivalent to approximately 7.0 M mt CO₂e/yr. The forestry sinks are well documented, and the potential in the Big Sky region ranges from 9-15 M mt CO₂ equivalent per year. Value-added benefits include enhanced yields, reduced erosion, and increased wildlife habitat. Thus the terrestrial sinks provide a viable, environmentally beneficial, and relatively low-cost sink that is available to sequester C in the current time frame. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts
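
    The abstract quotes soil-carbon potential both as carbon (C) and as CO₂-equivalent; the two are related by the molar-mass ratio of CO₂ to C, 44/12 ≈ 3.67. The short check below (plain unit conversion, using only the figure quoted above) confirms that about 2.0 M mt C/yr corresponds to the stated ~7.0 M mt CO₂e/yr.

        # Unit check: metric tonnes of carbon (C) to tonnes of CO2-equivalent.
        # Conversion factor is the molar-mass ratio of CO2 to C: 44.01 / 12.01 ~= 3.67.
        C_TO_CO2 = 44.01 / 12.01

        cropland_c_mt_per_yr = 2.0e6                     # 2.0 M mt C/yr, from the abstract
        co2e_mt_per_yr = cropland_c_mt_per_yr * C_TO_CO2
        print(f"{co2e_mt_per_yr / 1e6:.1f} M mt CO2e/yr")   # ~7.3, consistent with the quoted ~7.0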

  12. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-06-01

    soil C in the partnership region, and to design a risk/cost effectiveness framework to make comparative assessments of each viable sink, taking into account economic costs, offsetting benefits, scale of sequestration opportunities, spatial and time dimensions, environmental risks, and long term viability. Scientifically sound information on MMV is critical for public acceptance of these technologies. Two key deliverables were completed this quarter--a literature review/database to assess the soil carbon on rangelands, and the draft protocols, contracting options for soil carbon trading. To date, there has been little research on soil carbon on rangelands, and since rangeland constitutes a major land use in the Big Sky region, this is important in achieving a better understanding of terrestrial sinks. The protocols developed for soil carbon trading are unique and provide a key component of the mechanisms that might be used to efficiently sequester GHG and reduce CO₂ concentrations. Progress on other deliverables is noted in the PowerPoint presentations. A series of meetings held during the second quarter have laid the foundations for assessing the issues surrounding the implementation of a market-based setting for soil C credits. These meetings provide a connection to stakeholders in the region and a basis on which to draw for the DOE PEIS hearings. Finally, the education and outreach efforts have resulted in a comprehensive plan and process which serves as a guide for implementing the outreach activities under Phase I. While we are still working on the public website, we have made many presentations to stakeholders and policy makers, connections to other federal and state agencies concerned with GHG emissions, climate change, and efficient and environmentally-friendly energy production. In addition, we have laid plans for integration of our outreach efforts with the students, especially at the tribal colleges and at the universities involved in our partnership

  13. [Big data in official statistics].

    PubMed

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  14. [Big data in official statistics].

    PubMed

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany. PMID:26077871

  15. GEOSS: Addressing Big Data Challenges

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  16. Multiwavelength astronomy and big data

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.

    2016-09-01

    Two major characteristics of modern astronomy are multiwavelength (MW) studies (from γ-ray to radio) and big data (data acquisition, storage, and analysis). Present astronomical databases and archives contain billions of objects, both galactic and extragalactic, observed at various wavelengths, and the vast amount of data on them allows new studies and discoveries. Astronomers deal with big numbers. Surveys are the main source for the discovery of astronomical objects and the accumulation of observational data for further analysis, interpretation, and scientific results. We review the main characteristics of astronomical surveys, compare the photographic and digital eras of astronomical studies (including the development of wide-field observations), describe the present state of MW surveys, and discuss Big Data in astronomy and the related topics of Virtual Observatories and Computational Astrophysics. The review includes many numbers and data that can be compared to gain an overall understanding of the Universe, of cosmic numbers, and of their relationship to modern computational facilities.

  17. Big Data: Astronomical or Genomical?

    PubMed

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  18. Big Data: Astronomical or Genomical?

    PubMed Central

    Stephens, Zachary D.; Lee, Skylar Y.; Faghri, Faraz; Campbell, Roy H.; Zhai, Chengxiang; Efron, Miles J.; Iyer, Ravishankar; Schatz, Michael C.; Sinha, Saurabh; Robinson, Gene E.

    2015-01-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a “four-headed beast”—it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the “genomical” challenges of the next decade. PMID:26151137

  19. Big Data: Astronomical or Genomical?

    PubMed

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade. PMID:26151137

  20. Rasdaman for Big Spatial Raster Data

    NASA Astrophysics Data System (ADS)

    Hu, F.; Huang, Q.; Scheele, C. J.; Yang, C. P.; Yu, M.; Liu, K.

    2015-12-01

    Spatial raster data have grown exponentially over the past decade. Recent advancements in data acquisition technology, such as remote sensing, have allowed us to collect massive observation data of various spatial resolutions and domain coverage. The volume, velocity, and variety of such spatial data, along with the computationally intensive nature of spatial queries, pose a grand challenge to storage technologies for effective big data management. While high-performance computing platforms (e.g., cloud computing) can be used to solve the computing-intensive issues in big data analysis, data have to be managed in a way that is suitable for distributed parallel processing. Recently, rasdaman (raster data manager) has emerged as a scalable and cost-effective database solution to store and retrieve massive multi-dimensional arrays, such as sensor, image, and statistics data. Within this paper, the pros and cons of using rasdaman to manage and query spatial raster data will be examined and compared with other common approaches, including file-based systems, relational databases (e.g., PostgreSQL/PostGIS), and NoSQL databases (e.g., MongoDB and Hive). Earth Observing System (EOS) data collected from NASA's Atmospheric Science Data Center (ASDC) will be used and stored in these selected database systems, and a set of spatial and non-spatial queries will be designed to benchmark their performance on retrieving large-scale, multi-dimensional arrays of EOS data. Lessons learned from using rasdaman will be discussed as well.
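
    As a concrete illustration of the kind of array subsetting such a benchmark exercises, the sketch below sends a WCPS-style query to a rasdaman/petascope endpoint over HTTP. The endpoint URL, coverage name, and axis labels are placeholder assumptions (no real deployment is implied), and the exact request parameters may differ between rasdaman versions.

        # Hedged sketch: requesting a spatial subset of a coverage from a
        # rasdaman/petascope service via a WCPS query. Endpoint, coverage name,
        # and axis labels below are hypothetical placeholders.
        import requests

        ENDPOINT = "http://example.org/rasdaman/ows"      # hypothetical service URL
        wcps = ('for c in (EOS_Radiance) '                # hypothetical coverage name
                'return encode(c[Lat(30:40), Long(-100:-90)], "csv")')

        resp = requests.get(ENDPOINT, params={
            "service": "WCS", "version": "2.0.1",
            "request": "ProcessCoverages", "query": wcps,
        }, timeout=60)
        resp.raise_for_status()
        print(resp.text[:200])    # first part of the CSV-encoded subset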

  1. Big Sky Carbon Sequestration Partnership

    SciTech Connect

    Susan M. Capalbo

    2005-11-01

    state agencies concerned with GHG emissions, climate change, and efficient and environmentally-friendly energy production. In addition, the Partnership has plans for integration of our outreach efforts with students, especially at the tribal colleges and at the universities involved in our Partnership. This includes collaboration with MSU and with the U.S.-Norway Summer School, extended outreach efforts at LANL and INEEL, and with the student section of the ASME. Finally, the Big Sky Partnership was involved in key meetings and symposia in the 7th quarter, including the USDOE Wye Institute Conference on Carbon Sequestration and Capture (April 2005); the DOE/NETL Fourth Annual Conference on Carbon Capture and Sequestration (May 2005); the Coal Power Development Conference (Denver, June 2005); and meetings with our Phase II industry partners and Governor's staff.

  2. Big Data Goes Personal: Privacy and Social Challenges

    ERIC Educational Resources Information Center

    Bonomi, Luca

    2015-01-01

    The Big Data phenomenon is posing new challenges in our modern society. In addition to requiring information systems to effectively manage high-dimensional and complex data, the privacy and social implications associated with the data collection, data analytics, and service requirements create new important research problems. First, the high…

  3. Preliminary Geologic Map of the Big Pine Mountain Quadrangle, California

    USGS Publications Warehouse

    Vedder, J.G.; McLean, Hugh; Stanley, R.G.

    1995-01-01

    Reconnaissance geologic mapping of the San Rafael Primitive Area (now the San Rafael Wilderness) by Gower and others (1966) and Vedder and others (1967) showed a number of stratigraphic and structural ambiguities. To help resolve some of those problems, additional field work was done on parts of the Big Pine Mountain quadrangle during short intervals in 1981 and 1984, and in 1990-1994.

  4. Partnering with Big Pharma-What Academics Need to Know.

    PubMed

    Lipton, Stuart A; Nordstedt, Christer

    2016-04-21

    Knowledge of the parameters of drug development can greatly aid academic scientists hoping to partner with pharmaceutical companies. Here, we discuss the three major pillars of drug development (pharmacodynamics, pharmacokinetics, and toxicity studies), which, in addition to pre-clinical efficacy, are critical for partnering with Big Pharma to produce novel therapeutics.

  5. [Big Data- challenges and risks].

    PubMed

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by the connection, processing, and analysis of this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data, and present examples from health and other areas. However, there are several preconditions for the effective use of these opportunities: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues and the current actions toward solutions are also presented.

  6. [Big Data- challenges and risks].

    PubMed

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by the connection, processing, and analysis of this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data, and present examples from health and other areas. However, there are several preconditions for the effective use of these opportunities: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues and the current actions toward solutions are also presented. PMID:26614539

  7. Little Science to Big Science: Big Scientists to Little Scientists?

    ERIC Educational Resources Information Center

    Simonton, Dean Keith

    2010-01-01

    This article presents the author's response to Hisham B. Ghassib's essay entitled "Where Does Creativity Fit into a Productivist Industrial Model of Knowledge Production?" Professor Ghassib's (2010) essay presents a provocative portrait of how the little science of the Babylonians, Greeks, and Arabs became the Big Science of the modern industrial…

  8. Baryon symmetric big-bang cosmology. [matter-antimatter symmetry

    NASA Technical Reports Server (NTRS)

    Stecker, F. W.

    1978-01-01

    The framework of baryon-symmetric big-bang cosmology offers the greatest potential for deducing the evolution of the universe as a consequence of physical laws and processes with the minimum number of arbitrary assumptions as to initial conditions in the big-bang. In addition, it offers the possibility of explaining the photon-baryon ratio in the universe and how galaxies and galaxy clusters are formed, and also provides the only acceptable explanation at present for the origin of the cosmic gamma ray background radiation.

  9. Observational hints on the Big Bounce

    SciTech Connect

    Mielczarek, Jakub; Kurek, Aleksandra; Szydłowski, Marek; Kamionka, Michał

    2010-07-01

    In this paper we study possible observational consequences of bouncing cosmology. We consider a model in which a phase of inflation is preceded by a cosmic bounce. Although in this paper the bounce is attributed to loop quantum gravity, most of the results presented here apply to other bouncing cosmologies. We concentrate on the scenario where the scalar field, as a result of the contraction of the universe, is driven from the bottom of the potential well. The field is amplified, and finally the phase of standard slow-roll inflation is realized. Such an evolution modifies the standard inflationary spectrum of perturbations with additional oscillations and damping on large scales. We extract the parameters of the model from observations of the cosmic microwave background radiation; in particular, the value of the inflaton mass is m = (1.7±0.6)·10^13 GeV. Our analysis is based on the seven years of observations made by the WMAP satellite. We propose a new observational consistency check for the phase of slow-roll inflation. We investigate the conditions that have to be fulfilled to make observations of Big Bounce effects possible, translate them into requirements on the parameters of the model, and then place observational constraints on the model. Based on an assumption usually made in loop quantum cosmology, the Barbero-Immirzi parameter is constrained to γ < 1100 by the cosmological observations. We have compared the Big Bounce model with the standard Big Bang scenario and shown that the present observational data are not informative enough to distinguish between these models.

  10. Big Data for a Big Ocean at the NOAA National Oceanographic Data Center

    NASA Astrophysics Data System (ADS)

    Casey, K. S.

    2014-12-01

    Covering most of planet Earth, the vast, physically challenging ocean environment was once the sole domain of hardy, sea-going oceanographers. More recently, however, ocean observing systems have become more operational as well as more diverse. With observations coming from satellites, automated ship-based systems, autonomous underwater and airborne vehicles, in situ observing systems, and numerical models the field of oceanography is now clearly in the domain of Big Data. The NOAA National Oceanographic Data Center (NODC) and its partners around the world are addressing the entire range of Big Data issues for the ocean environment. A growing variety, volume, and velocity of incoming "Big Ocean" data streams are being managed through numerous approaches including the automated ingest and archive of incoming data; deployment of standardized, machine-consumable data discovery services; and interoperable data access, visualization, and subset mechanisms. In addition, support to the community of data producers to help them create more machine-ready ocean observation data streams is being provided and pilot projects to effectively incorporate commercial and hybrid cloud storage, access, and processing services into existing workflows and systems are being conducted. NODC is also engaging more actively than ever in the broader community of environmental data facilities to address these challenges. Details on these efforts at NODC and its partners will be provided and input sought on new and evolving user requirements.

  11. Big Dust Devils

    NASA Technical Reports Server (NTRS)

    2005-01-01

    28 January 2004 Northern Amazonis Planitia is famous for its frequent, large (> 1 km high) dust devils. They occur throughout the spring and summer seasons, and can be detected from orbit, even at the 240 meters (278 yards) per pixel resolution of the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide angle instruments. This red wide angle image shows a plethora of large dust devils. The arrow points to an example. Shadows cast by the towering columns of swirling dust point away from the direction of sunlight illumination (sun is coming from the left/lower left). This December 2004 scene covers an area more than 125 km (> 78 mi) across and is located near 37oN, 154oW.

  12. Big Explosives Experimental Facility - BEEF

    SciTech Connect

    2014-10-31

    The Big Explosives Experimental Facility or BEEF is a ten acre fenced high explosive testing facility that provides data to support stockpile stewardship and other national security programs. At BEEF conventional high explosives experiments are safely conducted providing sophisticated diagnostics such as high speed optics and x-ray radiography.

  13. China: Big Changes Coming Soon

    ERIC Educational Resources Information Center

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  14. True Randomness from Big Data

    NASA Astrophysics Data System (ADS)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
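
    As a rough illustration of seeded extraction from a large, imperfect source, the sketch below mixes blocks of raw data with independent random seeds and keeps only a short digest of each block. This is a generic hash-based sketch for intuition only; it is not the extractor construction from the paper and carries no formal randomness guarantee.

        # Generic seeded-extraction sketch (illustrative only; NOT the cited
        # paper's construction and not a proof of randomness quality).
        import hashlib, os

        def extract_bits(raw_blocks, out_bytes_per_block=8):
            """Mix each raw block with a fresh seed and keep a short digest."""
            out = bytearray()
            for block in raw_blocks:
                seed = os.urandom(16)                         # independent seed per block
                digest = hashlib.sha256(seed + block).digest()
                out += digest[:out_bytes_per_block]           # keep far fewer bits than we read
            return bytes(out)

        # Toy "big source": pretend these bytes came from logs or sensor dumps.
        source = [f"sensor-record-{i}".encode() * 100 for i in range(4)]
        print(extract_bits(source).hex())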

  15. Big6 Turbotools and Synthesis

    ERIC Educational Resources Information Center

    Tooley, Melinda

    2005-01-01

    The different tools that are helpful during the Synthesis stage, their role in boosting students' abilities in Synthesis, and the way in which they can be customized to meet the needs of each group of students are discussed. Big6 TurboTools offers several tools to help complete the task. In the Synthesis stage, these same tools along with Turbo Report and…

  16. True Randomness from Big Data

    PubMed Central

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  17. The Case for "Big History."

    ERIC Educational Resources Information Center

    Christian, David

    1991-01-01

    Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)

  18. Big Explosives Experimental Facility - BEEF

    ScienceCinema

    None

    2016-07-12

    The Big Explosives Experimental Facility or BEEF is a ten acre fenced high explosive testing facility that provides data to support stockpile stewardship and other national security programs. At BEEF conventional high explosives experiments are safely conducted providing sophisticated diagnostics such as high speed optics and x-ray radiography.

  19. Fossils of big bang turbulence

    NASA Astrophysics Data System (ADS)

    Gibson, C. H.

    2004-12-01

    A model is proposed connecting turbulence, fossil turbulence, and the big bang origin of the universe. While details are incomplete, the model is consistent with our knowledge of these processes and is supported by observations. Turbulence arises in a hot-big-bang quantum-gravitational-dynamics scenario at Planck scales. Chaotic, eddy-like motions produce an exothermic Planck particle cascade from 10^-35 m at 10^32 K to 10^8 larger, 10^4 cooler, quark-gluon scales. A Planck-Kerr instability gives high-Reynolds-number (Re ~ 10^6) turbulent combustion, space-time-energy-entropy and turbulent mixing. Batchelor-Obukhov-Corrsin turbulent-temperature fluctuations are preserved as the first fossil turbulence by inflation stretching the patterns beyond the horizon ct of causal connection faster than light speed c in time t ~ 10^-33 seconds. Fossil big-bang-temperature turbulence re-enters the horizon and imprints nucleosynthesis of H-He densities that seed fragmentation by gravity at 10^12 s in the low-Reynolds-number plasma before its transition to gas at t ~ 10^13 s and T ~ 3000 K. Multi-scaling coefficients of the cosmic-microwave-background (CMB) temperature anisotropies closely match those for high-Reynolds-number turbulence (Bershadskii and Sreenivasan 2002, 2003). CMB spectra support the interpretation that big-bang turbulence fossils triggered fragmentation of the viscous plasma at supercluster to galaxy mass scales from 10^46 to 10^42 kg (Gibson 1996, 2000, 2004ab).

  20. 1976 Big Thompson flood, Colorado

    USGS Publications Warehouse

    Jarrett, R. D.; Vandas, S.J.

    2006-01-01

    In the early evening of July 31, 1976, a large stationary thunderstorm released as much as 7.5 inches of rainfall in about an hour (about 12 inches in a few hours) in the upper reaches of the Big Thompson River drainage. This large amount of rainfall in such a short period of time produced a flash flood that caught residents and tourists by surprise. The immense volume of water that churned down the narrow Big Thompson Canyon scoured the river channel and destroyed everything in its path, including 418 homes, 52 businesses, numerous bridges, paved and unpaved roads, power and telephone lines, and many other structures. The tragedy claimed the lives of 144 people. Scores of other people narrowly escaped with their lives. The Big Thompson flood ranks among the deadliest of Colorado's recorded floods. It is one of several destructive floods in the United States that has shown the necessity of conducting research to determine the causes and effects of floods. The U.S. Geological Survey (USGS) conducts research and operates a Nationwide streamgage network to help understand and predict the magnitude and likelihood of large streamflow events such as the Big Thompson Flood. Such research and streamgage information are part of an ongoing USGS effort to reduce flood hazards and to increase public awareness.

  1. How do we identify big rivers? And how big is big?

    NASA Astrophysics Data System (ADS)

    Miall, Andrew D.

    2006-04-01

    "Big rivers" are the trunk rivers that carry the water and sediment load from major orogens, or that drain large areas of a continent. Identifying such rivers in the ancient record is a challenge. Some guidance may be provided by tectonic setting and sedimentological evidence, including the scale of architectural elements, and clues from provenance studies, but such data are not infallible guides to river magnitude. The scale of depositional elements is the most obvious clue to channel size, but evidence is typically sparse and inadequate, and may be misleading. For example, thick fining-upward successions may be tectonic cyclothems. Two examples of the analysis of large ancient river systems are discussed here in order to highlight problems of methodology and interpretation. The Hawkesbury Sandstone (Triassic) of the Sydney Basin, Australia, is commonly cited as the deposit of a large river, on the basis of abundant very large-scale crossbedding. An examination of very large outcrops of this unit, including a coastal cliff section 6 km long near Sydney, showed that even with 100% exposure there are ambiguities in the determination of channel scale. It was concluded in this case that the channel dimensions of the Hawkesbury rivers were about half the size of the modern Brahmaputra River. The tectonic setting of a major ancient fluvial system is commonly not a useful clue to river scale. The Hawkesbury Sandstone is a system draining transversely from a cratonic source into a foreland basin, whereas most large rivers in foreland basins flow axially and are derived mainly from the orogenic uplifts (e.g., the large tidally influenced rivers of the Athabasca Oil Sands, Alberta). Epeirogenic tilting of a continent by the dynamic topography process may generate drainages in unexpected directions. For example, analyses of detrital zircons in Upper Paleozoic-Mesozoic nonmarine successions in the SW United States suggests significant derivation from the Appalachian orogen

  2. A Spectrograph for BigBOSS

    NASA Astrophysics Data System (ADS)

    Carton, Pierre-Henri; Bebek, C.; Cazaux, S.; Ealet, A.; Eppelle, D.; Kneib, J.; Karst, P.; Levi, M.; Magneville, C.; Palanque-Delabrouille, N.; Ruhlmann-Kleider, V.; Schlegel, D.; Yeche, C.

    2012-01-01

    The BigBOSS spectrograph assembly handles the light from the fiber output to the detector, including the optics, gratings, mechanics, and cryostats. The 5000 fibers are split into 10 bundles of 500 fibers, and each bundle feeds one spectrograph. The full bandwidth from 0.36 µm to 1.05 µm is split into 3 bands. Each channel is composed of a collimator (doublet lenses), a VPH grating, and a 6-lens camera. The 500 fiber spectra are imaged onto a 4k x 4k detector by the F/2 camera, with each fiber core imaged onto 4 pixels. Each channel of the BigBOSS spectrograph will be equipped with a single-CCD camera, resulting in 30 cryostats in total for the instrument. Based on its experience with CCD cameras for projects like EROS and MegaCam, CEA/Saclay has designed small and autonomous cryogenic vessels which integrate cryo-cooling, CCD positioning, and slow-control interfacing capabilities. The use of a Linear Pulse Tube with its own control unit, both developed by Thales Cryogenics BV, will ensure versatility, reliability, and operational flexibility. CCDs will be cooled down to 140 K, with stability better than 1 K, and positioned within 15 µm along the optical axis and 50 µm in the XY plane. Slow-control units will be directly interfaced to an Ethernet network, which will allow them to be operated remotely. This spectrograph concept is very robust, with no moving mechanisms except the shutters, and its 30 channels achieve impressive compactness within a 3 m³ volume. Producing this number of channels will drive a quasi mass-production philosophy.
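
    A quick plausibility check of the geometry quoted above (back-of-the-envelope arithmetic using only numbers from the abstract) shows that 500 fibers, each imaged onto about 4 pixels, occupy roughly half of a 4k-pixel detector width, leaving room for gaps between adjacent spectra.

        # Back-of-the-envelope check of the BigBOSS spectrograph numbers quoted above.
        fibers_total    = 5000
        bundles         = 10
        fibers_per_chan = fibers_total // bundles    # 500 fibers per spectrograph channel
        px_per_fiber    = 4                          # each fiber core imaged onto ~4 pixels
        detector_px     = 4096                       # 4k x 4k CCD

        used_px = fibers_per_chan * px_per_fiber
        print(f"{fibers_per_chan} fibers x {px_per_fiber} px = {used_px} px "
              f"of {detector_px} ({used_px / detector_px:.0%} of the detector width)")
        # ~49%: the fiber traces fit with margin for separation between spectra.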

  3. Big Data: Implications for Health System Pharmacy.

    PubMed

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  4. Big City Education: Its Challenge to Governance.

    ERIC Educational Resources Information Center

    Haskew, Laurence D.

    This chapter traces the migration from farms to cities and the later movement from cities to suburbs and discusses the impact of the resulting big city environment on the governance of big city education. The author (1) suggests how local, State, and Federal governments can improve big city education; (2) discusses ways of planning for the future…

  5. A survey of big data research

    PubMed Central

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  6. Judging Big Deals: Challenges, Outcomes, and Advice

    ERIC Educational Resources Information Center

    Glasser, Sarah

    2013-01-01

    This article reports the results of an analysis of five Big Deal electronic journal packages to which Hofstra University's Axinn Library subscribes. COUNTER usage reports were used to judge the value of each Big Deal. Limitations of usage statistics are also discussed. In the end, the author concludes that four of the five Big Deals are good…

  7. Big Data: Implications for Health System Pharmacy.

    PubMed

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services. PMID:27559194

  8. A SWOT Analysis of Big Data

    ERIC Educational Resources Information Center

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  9. Big Sagebrush Seed Bank Densities Following Wildfires

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Big sagebrush (Artemisia sp.) is a critical shrub for such sagebrush-obligate species as sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush does not sprout after wildfires, and big sagebrush seed is generally sho...

  10. Big Data - Smart Health Strategies

    PubMed Central

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  11. Phosphazene additives

    DOEpatents

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  12. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $0.023 per pound of aluminum produced is projected for a 200 kA pot.

  13. Gravitational waves from the big bounce

    SciTech Connect

    Mielczarek, Jakub

    2008-11-15

    In this paper we investigate gravitational wave production during the big bounce phase inspired by loop quantum cosmology. We consider the influence of holonomy corrections on the equation for tensor modes and show that they act like an additional effective graviton mass, suppressing gravitational wave creation. However, such effects can be treated perturbatively. We investigate a simplified model without holonomy corrections to the equation for the modes and find its exact analytical solution. Assuming matter of the form ρ ∝ a^(-2), we calculate the full spectrum of gravitational waves from the big bounce phase. The spectrum obtained decreases to zero for the low-energy modes. On the basis of this observation we infer that this effect can lead to suppression of the low cosmic microwave background (CMB) multipoles and gives a potential way of testing loop quantum cosmology models. We also consider a scenario with a post-bounce inflationary phase. The power spectrum obtained gives a qualitative explanation of the CMB spectra, including low-multipole suppression.
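
    For orientation, the "effective graviton mass" wording above can be read against the standard conformal-time equation for tensor modes with an added mass term. The form below is a schematic illustration only; the symbols are the usual ones (h_k the tensor mode, a the scale factor, k the comoving wavenumber), and the holonomy-corrected equation in the paper may differ in detail.

        % Schematic tensor-mode equation in conformal time, with an effective-mass
        % term m_eff standing in for the holonomy correction (illustrative form only,
        % not necessarily the exact corrected equation from the cited paper).
        h_k'' + 2\frac{a'}{a}\, h_k' + \left(k^2 + m_{\mathrm{eff}}^2 a^2\right) h_k = 0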

  14. Big Bang Cosmic Titanic: Cause for Concern?

    NASA Astrophysics Data System (ADS)

    Gentry, Robert

    2013-04-01

    This abstract alerts physicists to a situation that, unless soon addressed, may yet affect PRL integrity. I refer to Stanley Brown's and DAE Robert Caldwell's rejection of PRL submission LJ12135, A Cosmic Titanic: Big Bang Cosmology Unravels Upon Discovery of Serious Flaws in Its Foundational Expansion Redshift Assumption, by their claim that BB is an established theory while ignoring our paper's Titanic, namely, that BB's foundational spacetime expansion redshifts assumption has now been proven to be irrefutably false because it is contradicted by our seminal discovery that GPS operation unequivocally proves that GR effects do not produce in-flight photon wavelength changes demanded by this central assumption. This discovery causes the big bang to collapse as quickly as did Ptolemaic cosmology when Copernicus discovered its foundational assumption was heliocentric, not geocentric. Additional evidence that something is amiss in PRL's treatment of LJ12135 comes from both Brown and EiC Gene Spouse agreeing to meet at my exhibit during last year's Atlanta APS to discuss this cover-up issue. Sprouse kept his commitment; Brown didn't. Question: If Brown could have refuted my claim of a cover-up, why didn't he come to present it before Gene Sprouse? I am appealing LJ12135's rejection.

  15. Big Data Issues under the Copernicus Programme

    NASA Astrophysics Data System (ADS)

    Schulte-Braucks, R. L.

    2014-12-01

    The Copernicus Programme of Earth observation satellites (http://copernicus.eu) will be affected by a growing volume of data and information. The first satellite (Sentinel 1A) has just been launched. Seven additional satellites are to be launched by the end of the decade. These will produce 8 TB of data per day, i.e. considerably more than can be downloaded via normal Internet connections. There is no definitive answer to the many challenges of big data, but there are gradual solutions for Copernicus in view of the progressive roll-out of the space infrastructure and the thematic services which the European Commission will develop. This presentation will outline several approaches to the big data issue. It will start from the needs of the Copernicus users, which are far from being homogeneous. As their needs are different, the European Commission and ESA will have to propose different solutions to fulfil these needs, taking into account the present and future state of technology. The presentation will discuss these solutions, both with regard to a better use of the network and with regard to hosted processing.
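
    To put the quoted 8 TB/day in perspective, the short conversion below (plain arithmetic, using only the figure from the abstract) gives the sustained network throughput needed merely to keep pace with ingestion, before any reprocessing or redistribution.

        # Sustained throughput implied by 8 TB of new data per day
        # (the 8 TB/day figure is from the abstract; the rest is unit conversion).
        tb_per_day = 8
        bytes_per_day = tb_per_day * 1e12
        seconds_per_day = 24 * 3600

        bits_per_s = bytes_per_day * 8 / seconds_per_day
        print(f"{bits_per_s / 1e6:.0f} Mbit/s sustained")   # ~740 Mbit/s, around the clock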

  16. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    ScienceCinema

    None

    2016-07-12

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  17. Solution of a braneworld big crunch/big bang cosmology

    SciTech Connect

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-11-15

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)^2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios.

  18. Antigravity and the big crunch/big bang transition

    NASA Astrophysics Data System (ADS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  19. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    SciTech Connect

    2009-10-13

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  20. Better big data.

    PubMed

    Al Kazzi, Elie S; Hutfless, Susan

    2015-01-01

    By 2018, Medicare payments will be tied to quality of care. The Centers for Medicare and Medicaid Services currently use quality-based metrics for some reimbursements through their different programs. Existing and future quality metrics will rely on risk adjustment to avoid unfairly punishing those who see the sickest, highest-risk patients. Despite the limitations of the data used for risk adjustment, there are potential solutions to improve the accuracy of these codes, such as calibrating data by merging databases and compiling information collected for multiple reporting programs. In addition, healthcare staff should be informed about the importance of risk adjustment for quality-of-care assessment and reimbursement. As the number of encounters tied to value-based reimbursements increases in inpatient and outpatient care, coupled with accurate data collection and utilization, the methods used for risk adjustment could be expanded to better account for differences in the care delivered in diverse settings.

  1. Big Questions: Dark Matter

    SciTech Connect

    Lincoln, Don

    2013-12-05

    Carl Sagan's oft-quoted statement that there are "billions and billions" of stars in the cosmos gives an idea of just how much "stuff" is in the universe. However scientists now think that in addition to the type of matter with which we are familiar, there is another kind of matter out there. This new kind of matter is called "dark matter" and there seems to be five times as much as ordinary matter. Dark matter interacts only with gravity, thus light simply zips right by it. Scientists are searching through their data, trying to prove that the dark matter idea is real. Fermilab's Dr. Don Lincoln tells us why we think this seemingly-crazy idea might not be so crazy after all.

  2. Big Questions: Dark Matter

    ScienceCinema

    Lincoln, Don

    2016-07-12

    Carl Sagan's oft-quoted statement that there are "billions and billions" of stars in the cosmos gives an idea of just how much "stuff" is in the universe. However scientists now think that in addition to the type of matter with which we are familiar, there is another kind of matter out there. This new kind of matter is called "dark matter" and there seems to be five times as much as ordinary matter. Dark matter interacts only with gravity, thus light simply zips right by it. Scientists are searching through their data, trying to prove that the dark matter idea is real. Fermilab's Dr. Don Lincoln tells us why we think this seemingly-crazy idea might not be so crazy after all.

  3. Big Data in Medicine is Driving Big Changes

    PubMed Central

    Verspoor, K.

    2014-01-01

    Summary Objectives To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  4. A systematic review of image segmentation methodology, used in the additive manufacture of patient-specific 3D printed models of the cardiovascular system

    PubMed Central

    Byrne, N; Velasco Forte, M; Tandon, A; Valverde, I

    2016-01-01

    Background Shortcomings in existing methods of image segmentation preclude the widespread adoption of patient-specific 3D printing as a routine decision-making tool in the care of those with congenital heart disease. We sought to determine the range of cardiovascular segmentation methods and how long each of these methods takes. Methods A systematic review of literature was undertaken. Medical imaging modality, segmentation methods, segmentation time, segmentation descriptive quality (SDQ) and segmentation software were recorded. Results In total, 136 studies met the inclusion criteria (1 clinical trial; 80 journal articles; 55 conference, technical and case reports). The most frequently used image segmentation methods were brightness thresholding, region growing and manual editing, as supported by the most popular piece of proprietary software: Mimics (Materialise NV, Leuven, Belgium, 1992–2015). The use of bespoke software developed by individual authors was not uncommon. SDQ indicated that reporting of image segmentation methods was generally poor with only one in three accounts providing sufficient detail for their procedure to be reproduced. Conclusions and implication of key findings Predominantly anecdotal and case reporting precluded rigorous assessment of risk of bias and strength of evidence. This review finds a reliance on manual and semi-automated segmentation methods which demand a high level of expertise and a significant time commitment on the part of the operator. In light of the findings, we have made recommendations regarding reporting of 3D printing studies. We anticipate that these findings will encourage the development of advanced image segmentation methods. PMID:27170842
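    As a hedged illustration only (not taken from any of the reviewed studies), the brightness-thresholding step that the review identifies as the most common segmentation method can be sketched in a few lines of Python; the array name, threshold value and use of SciPy are assumptions of this sketch.

      # Minimal sketch: brightness thresholding of a grayscale volume followed by
      # connected-component labelling. Threshold and array shape are illustrative.
      import numpy as np
      from scipy import ndimage

      def threshold_and_label(volume, threshold):
          mask = volume >= threshold               # brightness thresholding
          labels, n_regions = ndimage.label(mask)  # group connected voxels into regions
          return labels, n_regions

      volume = np.random.rand(64, 64, 64)          # stand-in for CT/MR image data
      labels, n_regions = threshold_and_label(volume, 0.8)

    In practice this step is typically followed by the manual editing and region growing the review also describes, before the surface is exported for 3D printing.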

  5. Image

    SciTech Connect

    Marsh, Amber; Harsch, Tim; Pitt, Julie; Firpo, Mike; Lekin, April; Pardes, Elizabeth

    2007-08-31

    The computer side of the IMAGE project consists of a collection of Perl scripts that perform a variety of tasks; scripts are available to insert, update and delete data from the underlying Oracle database, download data from NCBI's Genbank and other sources, and generate data files for download by interested parties. Web scripts make up the tracking interface, and various tools available on the project web-site (image.llnl.gov) that provide a search interface to the database.
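    The record above names the database operations (insert, update, delete) but not their implementation; purely as an illustration of that pattern, and not the project's actual Perl/Oracle code, a minimal Python sketch using the standard sqlite3 module might look as follows (the table and column names are invented).

      # Hypothetical tracking-table operations; the real IMAGE scripts were Perl
      # against an Oracle database, so names and schema here are illustrative only.
      import sqlite3

      conn = sqlite3.connect("image_tracking.db")
      conn.execute("CREATE TABLE IF NOT EXISTS clones (clone_id TEXT PRIMARY KEY, status TEXT)")
      conn.execute("INSERT OR REPLACE INTO clones VALUES (?, ?)", ("IMAGE:100001", "received"))
      conn.execute("UPDATE clones SET status = ? WHERE clone_id = ?", ("sequenced", "IMAGE:100001"))
      conn.execute("DELETE FROM clones WHERE status = ?", ("withdrawn",))
      conn.commit()
      conn.close()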

  6. Big data and ophthalmic research.

    PubMed

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research-including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research.

  7. District Bets Big on Standards

    ERIC Educational Resources Information Center

    Gewertz, Catherine

    2013-01-01

    The big clock in Dowan McNair-Lee's 8th grade classroom in the Stuart-Hobson Middle School is silent, but she can hear the minutes ticking away nonetheless. On this day, like any other, the clock is a constant reminder of how little time she has to prepare her students--for spring tests, and for high school and all that lies beyond it. The…

  8. EHR Big Data Deep Phenotyping

    PubMed Central

    Lenert, L.; Lopez-Campos, G.

    2014-01-01

    Summary Objectives Given the quickening speed of discovery of variant disease drivers from combined patient genotype and phenotype data, the objective is to provide methodology using big data technology to support the definition of deep phenotypes in medical records. Methods As the vast stores of genomic information increase with next generation sequencing, the importance of deep phenotyping increases. The growth of genomic data and adoption of Electronic Health Records (EHR) in medicine provides a unique opportunity to integrate phenotype and genotype data into medical records. The method by which collections of clinical findings and other health related data are leveraged to form meaningful phenotypes is an active area of research. Longitudinal data stored in EHRs provide a wealth of information that can be used to construct phenotypes of patients. We focus on a practical problem around data integration for deep phenotype identification within EHR data. The use of big data approaches are described that enable scalable markup of EHR events that can be used for semantic and temporal similarity analysis to support the identification of phenotype and genotype relationships. Conclusions Stead and colleagues’ 2005 concept of using light standards to increase the productivity of software systems by riding on the wave of hardware/processing power is described as a harbinger for designing future healthcare systems. The big data solution, using flexible markup, provides a route to improved utilization of processing power for organizing patient records in genotype and phenotype research. PMID:25123744

  9. Empowering Personalized Medicine with Big Data and Semantic Web Technology: Promises, Challenges, and Use Cases

    PubMed Central

    Panahiazar, Maryam; Taslimitehrani, Vahid; Jadhav, Ashutosh; Pathak, Jyotishman

    2015-01-01

    In healthcare, big data tools and technologies have the potential to create significant value by improving outcomes while lowering costs for each individual patient. Diagnostic images, genetic test results and biometric information are increasingly generated and stored in electronic health records presenting us with challenges in data that is by nature high volume, variety and velocity, thereby necessitating novel ways to store, manage and process big data. This presents an urgent need to develop new, scalable and expandable big data infrastructure and analytical methods that can enable healthcare providers access knowledge for the individual patient, yielding better decisions and outcomes. In this paper, we briefly discuss the nature of big data and the role of semantic web and data analysis for generating “smart data” which offer actionable information that supports better decision for personalized medicine. In our view, the biggest challenge is to create a system that makes big data robust and smart for healthcare providers and patients that can lead to more effective clinical decision-making, improved health outcomes, and ultimately, managing the healthcare costs. We highlight some of the challenges in using big data and propose the need for a semantic data-driven environment to address them. We illustrate our vision with practical use cases, and discuss a path for empowering personalized medicine using big data and semantic web technology. PMID:25705726

  10. Turning big bang into big bounce. I. Classical dynamics

    SciTech Connect

    Dzierzak, Piotr; Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz

    2009-11-15

    The big bounce (BB) transition within a flat Friedmann-Robertson-Walker model is analyzed in the setting of loop geometry underlying the loop cosmology. We solve the constraint of the theory at the classical level to identify physical phase space and find the Lie algebra of the Dirac observables. We express energy density of matter and geometrical functions in terms of the observables. It is the modification of classical theory by the loop geometry that is responsible for BB. The classical energy scale specific to BB depends on a parameter that should be fixed either by cosmological data or determined theoretically at quantum level, otherwise the energy scale stays unknown.

  11. Big A affair. [Big Ambejackmockamus Falls, Penobscot River, Maine

    SciTech Connect

    Laitin, J.

    1985-02-01

    This article describes the conflict between proponents of a hydroelectric power plant on Maine's Penobscot River and recreation interests. Great Northern Paper Company filed application to build a dam on a wild stretch of the river used by sports fishermen and white-water enthusiasts. Great Northern claimed that it needed to replace 380,000 barrels of oil and to forego the purchase of 36,000 megawatt-hours of electricity annually to improve its competitive edge in the marketplace. The controversy will be a big issue for Maine's Land Use Regulatory Commission. The final decision will hinge on the Commission's perception of the greater public benefit - hydropower or recreation.

  12. Sub-meter desiccation crack patterns imaged by Curiosity at Gale Crater on Mars shed additional light on former lakes evident from examined outcrops

    NASA Astrophysics Data System (ADS)

    Hallet, B.; Sletten, R. S.; Mangold, N.; Oehler, D. Z.; Williams, R. M. E.; Bish, D. L.; Heydari, E.; Rubin, D. M.; Rowland, S. K.

    2015-12-01

    Small-scale desiccation crack patterns (mudcrack-like arrays of uniform ~0.1 to 1 m polygonal domains separated by linear or curving cracks in exposed bedding) imaged by Curiosity in Gale Crater, Mars complement a wealth of diverse data obtained from exposures of sedimentary rocks that point to deposition "in fluvial, deltaic, and lacustrine environments" including an "intracrater lake system likely [to have] existed intermittently for thousands to millions of years …" (e.g. Grotzinger et al., 2015, Science, submitted). We interpret these mudcrack-like patterns, found on many of the bedrock exposures imaged by Curiosity, as desiccation cracks that developed in either of two ways: 1) at the soft sediment-air interface, like common mudcracks, or 2) at or below the sediment-water interface by synaeresis or diastasis (involving differential compaction). In the context of recent studies of terrestrial mudcracks, and of cracks formed experimentally in various wet powders as they lose moisture, these desiccation features reflect diverse aspects of the formative environment. If they formed as mudcracks, some of the lakes were shallow enough to permit the recurrent drying and wetting that can lead to the geometric regularity characteristic of several sets of mudcracks. Moreover, the water likely contained little suspended sediment, otherwise the mudcracks would have been buried too rapidly for the crack pattern to persist and to mature into regular polygonal patterns. The preservation of these desiccation crack patterns does not require, but does not exclude, deep burial and exhumation. Although invisible from satellites because of their size, a multitude of Mastcam and Navcam images reveals these informative features in considerable detail. These images complement much evidence, mostly from HiRISE data from several regions, suggesting that potential desiccation polygons on larger scales may be more common on the surface of Mars than generally recognized.

  13. "Big Data" and the Electronic Health Record

    PubMed Central

    Ross, M. K.; Wei, Wei

    2014-01-01

    Summary Objectives Implementation of Electronic Health Record (EHR) systems continues to expand. The massive number of patient encounters results in high amounts of stored data. Transforming clinical data into knowledge to improve patient care has been the goal of biomedical informatics professionals for many decades, and this work is now increasingly recognized outside our field. In reviewing the literature for the past three years, we focus on “big data” in the context of EHR systems and we report on some examples of how secondary use of data has been put into practice. Methods We searched PubMed database for articles from January 1, 2011 to November 1, 2013. We initiated the search with keywords related to “big data” and EHR. We identified relevant articles and additional keywords from the retrieved articles were added. Based on the new keywords, more articles were retrieved and we manually narrowed down the set utilizing predefined inclusion and exclusion criteria. Results Our final review includes articles categorized into the themes of data mining (pharmacovigilance, phenotyping, natural language processing), data application and integration (clinical decision support, personal monitoring, social media), and privacy and security. Conclusion The increasing adoption of EHR systems worldwide makes it possible to capture large amounts of clinical data. There is an increasing number of articles addressing the theme of “big data”, and the concepts associated with these articles vary. The next step is to transform healthcare big data into actionable knowledge. PMID:25123728

  14. Some experiences and opportunities for big data in translational research

    PubMed Central

    Chute, Christopher G.; Ullman-Cullere, Mollie; Wood, Grant M.; Lin, Simon M.; He, Min; Pathak, Jyotishman

    2014-01-01

    Health care has become increasingly information intensive. The advent of genomic data, integrated into patient care, significantly accelerates the complexity and amount of clinical data. Translational research in the present day increasingly embraces new biomedical discovery in this data-intensive world, thus entering the domain of “big data.” The Electronic Medical Records and Genomics consortium has taught us many lessons, while simultaneously advances in commodity computing methods enable the academic community to affordably manage and process big data. Although great promise can emerge from the adoption of big data methods and philosophy, the heterogeneity and complexity of clinical data, in particular, pose additional challenges for big data inferencing and clinical application. However, the ultimate comparability and consistency of heterogeneous clinical information sources can be enhanced by existing and emerging data standards, which promise to bring order to clinical data chaos. Meaningful Use data standards in particular have already simplified the task of identifying clinical phenotyping patterns in electronic health records. PMID:24008998

  15. A Solution to ``Too Big to Fail''

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-10-01

    It's a tricky business to reconcile simulations of our galaxy's formation with our current observations of the Milky Way and its satellites. In a recent study, scientists have addressed one discrepancy between simulations and observations: the so-called "too big to fail" problem. From Missing Satellites to Too Big to Fail: The favored model of the universe is the lambda-cold-dark-matter (ΛCDM) cosmological model. This model does a great job of correctly predicting the large-scale structure of the universe, but there are still a few problems with it on smaller scales. [Hubble image of UGC 5497, a dwarf galaxy associated with Messier 81. In the missing satellite problem, simulations of galaxy formation predict that there should be more such satellite galaxies than we observe. ESA/NASA] The first is the missing satellites problem: ΛCDM cosmology predicts that galaxies like the Milky Way should have significantly more satellite galaxies than we observe. A proposed solution to this problem is the argument that there may exist many more satellites than we've observed, but these dwarf galaxies have had their stars stripped from them during tidal interactions, which prevents us from being able to see them. This solution creates a new problem, though: the too-big-to-fail problem. This problem states that many of the satellites predicted by ΛCDM cosmology are simply so massive that there's no way they couldn't have visible stars. Another way of looking at it: the observed satellites of the Milky Way are not massive enough to be consistent with predictions from ΛCDM. [Artist's illustration of a supernova, a type of stellar feedback that can modify the dark-matter distribution of a satellite galaxy. NASA/CXC/M. Weiss] Density Profiles and Tidal Stirring: Led by Mihai Tomozeiu (University of Zurich), a team of scientists has published a study in which they propose a solution to the too-big-to-fail problem. By running detailed cosmological zoom simulations of our galaxy's formation, Tomozeiu and

  16. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    SciTech Connect

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  17. Big data and clinical research: focusing on the area of critical care medicine in mainland China

    PubMed Central

    2014-01-01

    Big data has long since found its way into clinical practice, beginning with the advent of the information technology era. Medical records and follow-up data can be more efficiently stored and extracted with information technology. Immediately after admission, a patient produces a large amount of data including laboratory findings, medications, fluid balance, progress notes and imaging findings. Clinicians and clinical investigators should make every effort to make full use of the big data that is being continuously generated by electronic medical record (EMR) systems and other healthcare databases. At this stage, more training courses on data management and statistical analysis are required before clinicians and clinical investigators can handle big data and translate them into advances in medical science. China is a large country with a population of 1.3 billion and can contribute greatly to clinical research by providing reliable and high-quality big data. PMID:25392827

  18. Big data and clinical research: focusing on the area of critical care medicine in mainland China.

    PubMed

    Zhang, Zhongheng

    2014-10-01

    Big data has long since found its way into clinical practice, beginning with the advent of the information technology era. Medical records and follow-up data can be more efficiently stored and extracted with information technology. Immediately after admission, a patient produces a large amount of data including laboratory findings, medications, fluid balance, progress notes and imaging findings. Clinicians and clinical investigators should make every effort to make full use of the big data that is being continuously generated by electronic medical record (EMR) systems and other healthcare databases. At this stage, more training courses on data management and statistical analysis are required before clinicians and clinical investigators can handle big data and translate them into advances in medical science. China is a large country with a population of 1.3 billion and can contribute greatly to clinical research by providing reliable and high-quality big data.

  19. Traffic information computing platform for big data

    SciTech Connect

    Duan, Zongtao Li, Ying Zheng, Xibin Liu, Yan Dai, Jiting Kang, Jun

    2014-10-06

    The big data environment creates data conditions for improving the quality of traffic information services. The target of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  20. Big data: an introduction for librarians.

    PubMed

    Hoy, Matthew B

    2014-01-01

    Modern life produces data at an astounding rate and shows no signs of slowing. This has led to new advances in data storage and analysis and the concept of "big data," that is, massive data sets that can yield surprising insights when analyzed. This column will briefly describe what big data is and why it is important. It will also briefly explore the possibilities and problems of big data and the implications it has for librarians. A list of big data projects and resources is also included.

  1. Urgent Call for Nursing Big Data.

    PubMed

    Delaney, Connie W

    2016-01-01

    The purpose of this panel is to expand internationally a National Action Plan for sharable and comparable nursing data for quality improvement and big data science. There is an urgent need to assure that nursing has sharable and comparable data for quality improvement and big data science. A national collaborative, Nursing Knowledge and Big Data Science, includes multi-stakeholder groups focused on a National Action Plan toward implementing and using sharable and comparable nursing big data. Panelists will share accomplishments and future plans with an eye toward international collaboration. This presentation is suitable for any audience attending the NI2016 conference. PMID:27332330

  2. Big data: an introduction for librarians.

    PubMed

    Hoy, Matthew B

    2014-01-01

    Modern life produces data at an astounding rate and shows no signs of slowing. This has led to new advances in data storage and analysis and the concept of "big data," that is, massive data sets that can yield surprising insights when analyzed. This column will briefly describe what big data is and why it is important. It will also briefly explore the possibilities and problems of big data and the implications it has for librarians. A list of big data projects and resources is also included. PMID:25023020

  3. The LHC's Next Big Mystery

    NASA Astrophysics Data System (ADS)

    Lincoln, Don

    2015-03-01

    When the sun rose over America on July 4, 2012, the world of science had radically changed. The Higgs boson had been discovered. Mind you, the press releases were more cautious than that, with "a new particle consistent with being the Higgs boson" being the carefully constructed phrase of the day. But, make no mistake, champagne corks were popped and backs were slapped. The data had spoken and a party was in order. Even if the observation turned out to be something other than the Higgs boson, the first big discovery from data taken at the Large Hadron Collider had been made.

  4. Big bang nucleosynthesis: An update

    SciTech Connect

    Olive, Keith A.

    2013-07-23

    An update on the standard model of big bang nucleosynthesis (BBN) is presented. With the value of the baryon-to-photon ratio determined to high precision by WMAP, standard BBN is a parameter-free theory. In this context, the theoretical prediction for the abundances of D, ⁴He, and ⁷Li is discussed and compared to their observational determination. While concordance for D and ⁴He is satisfactory, the prediction for ⁷Li exceeds the observational determination by a factor of about four. Possible solutions to this problem are discussed.

  5. The faces of Big Science.

    PubMed

    Schatz, Gottfried

    2014-06-01

    Fifty years ago, academic science was a calling with few regulations or financial rewards. Today, it is a huge enterprise confronted by a plethora of bureaucratic and political controls. This change was not triggered by specific events or decisions but reflects the explosive 'knee' in the exponential growth that science has sustained during the past three-and-a-half centuries. Coming to terms with the demands and benefits of 'Big Science' is a major challenge for today's scientific generation. Since its foundation 50 years ago, the European Molecular Biology Organization (EMBO) has been of invaluable help in meeting this challenge.

  6. Fitting ERGMs on big networks.

    PubMed

    An, Weihua

    2016-09-01

    The exponential random graph model (ERGM) has become a valuable tool for modeling social networks. In particular, ERGM provides great flexibility to account for both covariate effects on tie formations and endogenous network formation processes. However, there are both conceptual and computational issues for fitting ERGMs on big networks. This paper describes a framework and a series of methods (based on existing algorithms) to address these issues. It also outlines the advantages and disadvantages of the methods and the conditions to which they are most applicable. Selected methods are illustrated through examples. PMID:27480375
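    For readers unfamiliar with the model class, the standard ERGM form (a textbook expression, not quoted from this paper) is

      P_\theta(Y = y) \;=\; \frac{\exp\{\theta^{\top} g(y)\}}{\kappa(\theta)},
      \qquad
      \kappa(\theta) \;=\; \sum_{y' \in \mathcal{Y}} \exp\{\theta^{\top} g(y')\},

    where g(y) is a vector of network statistics and the normalizing constant κ(θ) sums over all possible graphs; it is this sum that makes fitting ERGMs on big networks computationally demanding and motivates the approximate methods surveyed in the article.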

  7. Theoretical and Experimental Investigation of Thermodynamics and Kinetics of Thiol-Michael Addition Reactions: A Case Study of Reversible Fluorescent Probes for Glutathione Imaging in Single Cells.

    PubMed

    Chen, Jianwei; Jiang, Xiqian; Carroll, Shaina L; Huang, Jia; Wang, Jin

    2015-12-18

    Density functional theory (DFT) was applied to study the thermodynamics and kinetics of reversible thiol-Michael addition reactions. M06-2X/6-31G(d) with the SMD solvation model can reliably predict the Gibbs free energy changes (ΔG) of thiol-Michael addition reactions with an error of less than 1 kcal·mol⁻¹ compared with the experimental benchmarks. Taking advantage of this computational model, the first reversible reaction-based fluorescent probe was developed that can monitor the changes in glutathione levels in single living cells.

  8. Image quality analyzer

    NASA Astrophysics Data System (ADS)

    Lukin, V. P.; Botugina, N. N.; Emaleev, O. N.; Antoshkin, L. V.; Konyaev, P. A.

    2012-07-01

    An image quality analyzer (IQA), used as a device for analyzing the efficiency of adaptive optics, is described. The analyzer estimates image quality according to three different criteria: contrast, sharpness, and a spectral criterion. The analyzer is currently in routine operation on the Big Solar Vacuum Telescope, where it allows the most contrasting images of the Sun to be selected during observations. It is further planned to use the analyzer as part of the ANGARA adaptive correction system.
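    As an illustrative sketch only (the IQA's actual metrics are not specified beyond their names), simple proxies for two of the three criteria, contrast and sharpness, can be computed as follows; the function names and the choice of SciPy's Laplacian filter are assumptions.

      # Contrast as the intensity standard deviation normalized by the mean, and
      # sharpness as the variance of the image Laplacian; both are common proxies.
      import numpy as np
      from scipy import ndimage

      def rms_contrast(img):
          img = img.astype(float)
          return img.std() / img.mean()

      def laplacian_sharpness(img):
          return ndimage.laplace(img.astype(float)).var()

    Selecting the most contrasting frame of the Sun from a burst of exposures, as the analyzer does at the telescope, then amounts to ranking frames by such scores.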

  9. Fires Burning near Big Sur, California

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Fires near Big Sur, Calif., continued to burn unchecked when the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument on NASA's Terra satellite captured this image on Sunday, June 29. In Northern California alone, fires have consumed more than 346,000 acres. At least 18,000 people have been deployed to attempt to extinguish or control the flames. Air quality as far away as San Francisco has been adversely impacted by the dense clouds of smoke and ash blowing towards the northwest. The satellite image combines a natural color portrayal of the landscape with thermal infrared data showing the active burning areas in red. The dark area in the lower right is a previous forest fire.

    ASTER is one of five Earth-observing instruments launched December 18, 1999, on NASA's Terra satellite. The instrument was built by Japan's Ministry of Economy, Trade and Industry. A joint U.S./Japan science team is responsible for validation and calibration of the instrument and the data products.

    The broad spectral coverage and high spectral resolution of ASTER provides scientists in numerous disciplines with critical information for surface mapping, and monitoring of dynamic conditions and temporal change. Example applications are: monitoring glacial advances and retreats; monitoring potentially active volcanoes; identifying crop stress; determining cloud morphology and physical properties; wetlands evaluation; thermal pollution monitoring; coral reef degradation; surface temperature mapping of soils and geology; and measuring surface heat balance.

    The U.S. science team is located at NASA's Jet Propulsion Laboratory, Pasadena, Calif. The Terra mission is part of NASA's Science Mission Directorate.

    Size: 35.4 by 57 kilometers (21.9 by 34.2 miles); Location: 36.1 degrees North latitude, 121.6 degrees West longitude; Orientation: North at top; Image Data: ASTER bands 3, 2, and 1; Original Data Resolution: 15 meters (49 feet); Dates Acquired: June 29

  10. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    ERIC Educational Resources Information Center

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  11. Three dimensional simulation for Big Hill Strategic Petroleum Reserve (SPR).

    SciTech Connect

    Ehgartner, Brian L.; Park, Byoung Yoon; Sobolik, Steven Ronald; Lee, Moo Yul

    2005-07-01

    3-D finite element analyses were performed to evaluate the structural integrity of caverns located at the Strategic Petroleum Reserve's Big Hill site. State-of-the-art analyses simulated the current site configuration and considered additional caverns. The addition of 5 caverns to account for a full site and a full dome containing 31 caverns were modeled. Operations including both normal and cavern workover pressures and cavern enlargement due to leaching were modeled to account for as many as 5 future oil drawdowns. Under the modeled conditions, caverns were placed very close to the edge of the salt dome. The web of salt separating the caverns and the web of salt between the caverns and edge of the salt dome were reduced due to leaching. The impacts on cavern stability, underground creep closure, surface subsidence and infrastructure, and well integrity were quantified. The analyses included recently derived damage criteria obtained from testing of Big Hill salt cores. The results show that from a structural viewpoint, many additional caverns can be safely added to Big Hill.

  12. Untapped Potential: Fulfilling the Promise of Big Brothers Big Sisters and the Bigs and Littles They Represent

    ERIC Educational Resources Information Center

    Bridgeland, John M.; Moore, Laura A.

    2010-01-01

    American children represent a great untapped potential in our country. For many young people, choices are limited and the goal of a productive adulthood is a remote one. This report paints a picture of who these children are, shares their insights and reflections about the barriers they face, and offers ways forward for Big Brothers Big Sisters as…

  13. Baryon symmetric big bang cosmology

    NASA Technical Reports Server (NTRS)

    Stecker, F. W.

    1978-01-01

    Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  14. The BigBOSS spectrograph

    NASA Astrophysics Data System (ADS)

    Jelinsky, Patrick; Bebek, Chris; Besuner, Robert; Carton, Pierre-Henri; Edelstein, Jerry; Lampton, Michael; Levi, Michael E.; Poppett, Claire; Prieto, Eric; Schlegel, David; Sholl, Michael

    2012-09-01

    BigBOSS is a proposed ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a 14,000 square degree galaxy and quasi-stellar object redshift survey. It consists of a 5,000-fiber-positioner focal plane feeding the spectrographs. The optical fibers are separated into ten 500-fiber slit heads at the entrance of ten identical spectrographs in a thermally insulated room. Each of the ten spectrographs has a spectral resolution (λ/Δλ) between 1500 and 4000 over a wavelength range from 360 - 980 nm. Each spectrograph uses two dichroic beam splitters to separate the spectrograph into three arms. It uses volume phase holographic (VPH) gratings for high efficiency and compactness. Each arm uses a 4096x4096 15 μm pixel charge coupled device (CCD) for the detector. We describe the requirements and current design of the BigBOSS spectrograph. Design trades (e.g. refractive versus reflective) and manufacturability are also discussed.

  15. A comparison of 3D poly(ε-caprolactone) tissue engineering scaffolds produced with conventional and additive manufacturing techniques by means of quantitative analysis of SR μ-CT images

    NASA Astrophysics Data System (ADS)

    Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.

    2013-07-01

    The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in order to guarantee its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in order to evaluate the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques to this aim. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a non-trivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. With the first technique it is possible to produce scaffolds with random, non-regular, rounded pore geometry. The AM technique instead is able to produce scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted and the resulting model can be used for the validation of the applied imaging and image analysis protocols. An SR μ-CT image analysis approach is reported here that is able to effectively and accurately reveal the differences in the pore- and throat-size distributions as well as the connectivity of both AM and SCPL scaffolds.
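    Purely as a sketch of the kind of quantitative analysis described (not the authors' protocol), a pore-size distribution can be estimated from a binarised μ-CT volume by labelling pore voxels and converting each pore's voxel count to an equivalent spherical diameter; the voxel size and function name below are assumptions.

      # Equivalent-diameter distribution of pores in a binary mask (True = pore voxel).
      import numpy as np
      from scipy import ndimage

      def pore_size_distribution(pore_mask, voxel_size_um=1.0):
          labels, n = ndimage.label(pore_mask)
          counts = np.bincount(labels.ravel())[1:]        # voxels per pore, background dropped
          volumes = counts * voxel_size_um ** 3           # pore volumes in cubic micrometres
          return (6.0 * volumes / np.pi) ** (1.0 / 3.0)   # equivalent spherical diameters

    Throat sizes and connectivity require additional morphological steps (e.g. skeletonisation or watershed partitioning) beyond this minimal example.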

  16. Big Data Application in Biomedical Research and Health Care: A Literature Review.

    PubMed

    Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing

    2016-01-01

    Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care.

  17. Big Data Application in Biomedical Research and Health Care: A Literature Review.

    PubMed

    Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing

    2016-01-01

    Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care. PMID:26843812

  18. Big Data Application in Biomedical Research and Health Care: A Literature Review

    PubMed Central

    Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing

    2016-01-01

    Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care. PMID:26843812

  19. Bioimage Informatics for Big Data.

    PubMed

    Peng, Hanchuan; Zhou, Jie; Zhou, Zhi; Bria, Alessandro; Li, Yujie; Kleissas, Dean Mark; Drenkow, Nathan G; Long, Brian; Liu, Xiaoxiao; Chen, Hanbo

    2016-01-01

    Bioimage informatics is a field wherein high-throughput image informatics methods are used to solve challenging scientific problems related to biology and medicine. When the image datasets become larger and more complicated, many conventional image analysis approaches are no longer applicable. Here, we discuss two critical challenges of large-scale bioimage informatics applications, namely, data accessibility and adaptive data analysis. We highlight case studies to show that these challenges can be tackled based on distributed image computing as well as machine learning of image examples in a multidimensional environment. PMID:27207370

  20. 3. EASTERLY VIEW OF THE DOWNSTREAM ELEVATION OF BIG DALTON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. EASTERLY VIEW OF THE DOWNSTREAM ELEVATION OF BIG DALTON DAM SHOWING THE SOUTHEAST END OF THE DAM, AND THE HOLLOW BAYS. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  1. 6. EASTERLY VIEW OF BIG DALTON DAM SHOWING THE SHELTER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. EASTERLY VIEW OF BIG DALTON DAM SHOWING THE SHELTER HOUSE IN THE BACKGROUND. PHOTO TAKEN FROM THE ACCESS ROAD LEADING TO THE CONTROL HOUSE. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  2. 2. NORTHERLY VIEW OF THE DOWNSTREAM ELEVATION OF BIG DALTON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. NORTHERLY VIEW OF THE DOWNSTREAM ELEVATION OF BIG DALTON DAM SHOWING THE NORTHWEST END OF THE DAM, THE CONTROL HOUSE, AND SPILLWAY CHUTE. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  3. Big sagebrush transplanting success in crested wheatgrass stands

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The conversion of formerly big sagebrush (Artemisia tridentata ssp. wyomingensis)/bunchgrass communities to annual grass dominance, primarily cheatgrass (Bromus tectorum), in Wyoming big sagebrush ecosystems has sparked the increasing demand to establish big sagebrush on disturbed rangelands. The e...

  4. 2. Big Creek Road, worm fence and road at trailhead. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Big Creek Road, worm fence and road at trailhead. - Great Smoky Mountains National Park Roads & Bridges, Big Creek Road, Between State Route 284 & Big Creek Campground, Gatlinburg, Sevier County, TN

  5. Big system: Interactive graphics for the engineer

    NASA Technical Reports Server (NTRS)

    Quenneville, C. E.

    1975-01-01

    The BCS Interactive Graphics System (BIG System) approach to graphics was presented, along with several significant engineering applications. The BIG System precompiler, the graphics support library, and the function requirements of graphics applications are discussed. It was concluded that graphics standardization and a device independent code can be developed to assure maximum graphic terminal transferability.

  6. Efficiency, Corporate Power, and the Bigness Complex.

    ERIC Educational Resources Information Center

    Adams, Walter; Brock, James W.

    1990-01-01

    Concludes that (1) the current infatuation with corporate bigness is void of credible empirical support; (2) disproportionate corporate size and industry concentration are incompatible with and destructive to good economic performance; and (3) structurally oriented antitrust policy must be revitalized to combat the burdens of corporate bigness.…

  7. An embedding for the big bang

    NASA Technical Reports Server (NTRS)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  8. In Search of the Big Bubble

    ERIC Educational Resources Information Center

    Simoson, Andrew; Wentzky, Bethany

    2011-01-01

    Freely rising air bubbles in water sometimes assume the shape of a spherical cap, a shape also known as the "big bubble". Is it possible to find some objective function involving a combination of a bubble's attributes for which the big bubble is the optimal shape? Following the basic idea of the definite integral, we define a bubble's surface as…

  9. A New Look at Big History

    ERIC Educational Resources Information Center

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spacial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  10. Epidemiology in the Era of Big Data

    PubMed Central

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  11. The Big bang and the Quantum

    NASA Astrophysics Data System (ADS)

    Ashtekar, Abhay

    2010-06-01

    General relativity predicts that space-time comes to an end and physics comes to a halt at the big-bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: The big bang is replaced by a quantum bounce; the 'horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in presence of a standard inflation potential, initial conditions are naturally set for a long, slow roll inflation independently of what happens in the pre-big bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.

  12. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offers different views of how approaches of correlation and causality offer complementary methods
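    The map-reduce pattern the abstract refers to can be shown with a toy, framework-free Python example (illustrative only; it is not Hadoop or Twister code, and the partitioned records are invented).

      # Count value occurrences across partitioned records: map emits (key, 1) pairs,
      # reduce sums the counts per key.
      from collections import defaultdict

      def map_phase(record):
          return [(value, 1) for value in record]

      def reduce_phase(pairs):
          totals = defaultdict(int)
          for key, count in pairs:
              totals[key] += count
          return dict(totals)

      partitions = [["A", "B", "A"], ["B", "C"]]
      mapped = [pair for record in partitions for pair in map_phase(record)]
      print(reduce_phase(mapped))   # {'A': 2, 'B': 2, 'C': 1}

    Iterative variants simply feed the reduced output back into another map step, which is the scheme frameworks such as Twister optimise for.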

  13. The big war over brackets.

    PubMed

    Alvarez, R O

    1994-01-01

    The Third Preparatory Committee Meeting for the International Conference on Population and Development (ICPD), PrepCom III, was held at UN headquarters in New York on April 4-22, 1994. It was the last big preparatory meeting leading to the ICPD to be held in Cairo, Egypt, in September 1994. The author attended the second week of meetings as the official delegate of the Institute for Social Studies and Action. Debates mostly focused upon reproductive health and rights, sexual health and rights, family planning, contraception, condom use, fertility regulation, pregnancy termination, and safe motherhood. The Vatican and its allies' preoccupation with discussing language which may imply abortion caused sustainable development, population, consumption patterns, internal and international migration, economic strategies, and budgetary allocations to be discussed less extensively than they should have been. The author describes points of controversy, the power of women at the meetings, and afterthoughts on the meetings.

  14. Exploring Relationships in Big Data

    NASA Astrophysics Data System (ADS)

    Mahabal, A.; Djorgovski, S. G.; Crichton, D. J.; Cinquini, L.; Kelly, S.; Colbert, M. A.; Kincaid, H.

    2015-12-01

    Big Data are characterized by several different 'V's: Volume, Veracity, Volatility, Value and so on. For many datasets, Volume inflated by redundant features often makes the data noisier and more difficult to extract Value from. This is especially true if one is comparing or combining different datasets and the metadata are diverse. We have been exploring ways to exploit such datasets through a variety of statistical machinery and visualization. We show how we have applied it to time series from large astronomical sky surveys. This was done in the Virtual Observatory framework. More recently we have been doing similar work for a completely different domain, viz. biology/cancer. The methodology reuse involves application to diverse datasets gathered through the various centers associated with the Early Detection Research Network (EDRN) for cancer, an initiative of the National Cancer Institute (NCI). Application to Geo datasets is a natural extension.

  15. Was the Big Bang hot?

    NASA Technical Reports Server (NTRS)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.
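    The abstract's reference to the frequency dependence of the dipole anisotropy can be made concrete with the standard first-order expressions (textbook relations, not formulae quoted from the paper):

      T(\theta) \simeq T_0 \left( 1 + \frac{v}{c}\cos\theta \right),
      \qquad
      \Delta I_\nu(\theta) \simeq \left.\frac{\partial B_\nu}{\partial T}\right|_{T_0} T_0 \frac{v}{c}\cos\theta,

    where v is the observer's velocity relative to the background, θ is the angle from the direction of motion, and B_ν is the Planck function; because ∂B_ν/∂T depends on the absolute temperature, measuring the dipole's spectrum constrains the absolute intensity of the background.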

  16. Island Universe or Big Galaxy?

    NASA Astrophysics Data System (ADS)

    Wolfschmidt, Gudrun

    In 1920, the "great debate" took place: Harlow Shapley defended his model of the "Big Galaxy", i.e. we live in a large galaxy and all nebulous objects belong to our galaxy. He got this result from the distribution of the globular nebulae. Heber D. Curtis on the other side analyzed novae and was then convinced that nebulae are far distant objects which are stellar systems themselves like our galaxy. The solution of the discussion was brought by Edwin P. Hubble who confirmed the interpretation of nebulae as extragalactic objects, i.e. galaxies, and introduced the red shift for getting the distance of galaxies. The resulting expansion of the universe led to a new cosmological world view.

  17. Evidence of the big fix

    NASA Astrophysics Data System (ADS)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2014-06-01

    We give evidence of the Big Fix. The theory of wormholes and the multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how this can be confirmed by experimental data, and we show that it is indeed true for the Higgs vacuum expectation value v_h. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings and the Higgs self-coupling are fixed when we vary v_h. It turns out that the existence of the atomic nuclei plays a crucial role in maximizing the entropy. This is reminiscent of the anthropic principle; however, in our case it is required by the fundamental law.

  18. Spectral observations of big objects

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.; Sargsyan, L. A.

    2010-12-01

    This is a summary and general analysis of optical spectroscopic data on 172 BIG (Byurakan-IRAS Galaxies) objects obtained with the BAO 2.6-m, SAO 6-m, and OHP 1.93-m telescopes. 102 galaxies with star formation regions, 29 galaxies with active nuclei, and 19 galaxies with a composite spectrum were identified. The spectra of 12 of the galaxies show signs of emission but do not allow a more precise determination of their activity class; 9 galaxies appear to have star formation rates that do not exceed normal levels, and 1 is an absorption galaxy. In order to establish the nature of these galaxies and the place they occupy in the general picture of the evolution of the universe, we compare them with 128 infrared galaxies.

  19. Big Mysteries: The Higgs Mass

    ScienceCinema

    Lincoln, Don

    2016-07-12

    With the discovery of what looks to be the Higgs boson, LHC researchers are turning their attention to the next big question, which is the predicted mass of the newly discovered particle. When the effects of quantum mechanics are taken into account, the mass of the Higgs boson should be incredibly high...perhaps upwards of a quadrillion times higher than what was observed. In this video, Fermilab's Dr. Don Lincoln explains how it is that the theory predicts that the mass is so large and gives at least one possible theoretical idea that might solve the problem. Whether the proposed idea is the answer or not, this question must be answered by experiments at the LHC or today's entire theoretical paradigm could be in jeopardy.

  20. Big Mysteries: The Higgs Mass

    SciTech Connect

    Lincoln, Don

    2014-04-28

    With the discovery of what looks to be the Higgs boson, LHC researchers are turning their attention to the next big question, which is the predicted mass of the newly discovered particle. When the effects of quantum mechanics are taken into account, the mass of the Higgs boson should be incredibly high...perhaps upwards of a quadrillion times higher than what was observed. In this video, Fermilab's Dr. Don Lincoln explains how it is that the theory predicts that the mass is so large and gives at least one possible theoretical idea that might solve the problem. Whether the proposed idea is the answer or not, this question must be answered by experiments at the LHC or today's entire theoretical paradigm could be in jeopardy.

  1. Big Bang nucleosynthesis in crisis\\?

    NASA Astrophysics Data System (ADS)

    Hata, N.; Scherrer, R. J.; Steigman, G.; Thomas, D.; Walker, T. P.; Bludman, S.; Langacker, P.

    1995-11-01

    A new evaluation of the constraint on the number of light neutrino species (Nν) from big bang nucleosynthesis suggests a discrepancy between the predicted light element abundances and those inferred from observations, unless the inferred primordial 4He abundance has been underestimated by 0.014+/-0.004 (1σ) or less than 10% (95% C.L.) of 3He survives stellar processing. With the quoted systematic errors in the observed abundances and a conservative chemical evolution parametrization, the best fit to the combined data is Nν=2.1+/-0.3 (1σ) and the upper limit is Nν<2.6 (95% C.L.). The data are inconsistent with the standard model (Nν=3) at the 98.6% C.L.

  2. Big Earth observation data analytics for land use and land cover change information

    NASA Astrophysics Data System (ADS)

    Câmara, Gilberto

    2015-04-01

    Current scientific methods for extracting information from Earth observation data lag far behind our capacity to build complex satellites. In response to this challenge, our work explores a new type of knowledge platform to improve the extraction of land use and land cover change information from big Earth Observation data sets. We take a space-time perspective of Earth Observation data, considering that each sensor revisits the same place at regular intervals. Sensor data can, in principle, be calibrated so that observations of the same place at different times are comparable, and each measure from a sensor is mapped into a three-dimensional array in space-time. To fully enable the use of space-time arrays for working with Earth Observation data, we use the SciDB array database. Arrays naturally fit the data structure of Earth Observation images, breaking the image-as-a-snapshot paradigm. Thus, entire collections of images can be stored as multidimensional arrays. However, array databases do not understand the specific nature of geographical data, and do not capture the meaning of and the differences between spatial and temporal dimensions. In our work, we have extended SciDB to include additional information about satellite image metadata, cartographical projections, and time. We are currently developing methods to extract land use and land cover information based on space-time analysis on array databases. Our experiments show these space-time methods give us significant improvements over current space-only remote sensing image processing methods. We have been able to capture tropical forest degradation and forest regrowth and also to distinguish between single-cropping and double-cropping practices in tropical agriculture.
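
    The space-time array described above can be illustrated, outside of SciDB, with a plain NumPy sketch; the array sizes and random values are placeholders, not the authors' data:

      # A calibrated image collection stored as one array indexed by
      # (time, row, col); values here are random stand-ins.
      import numpy as np

      n_times, n_rows, n_cols = 24, 100, 100          # e.g. 24 monthly composites
      cube = np.random.rand(n_times, n_rows, n_cols)

      # Time series for one ground location: a 1-D slice along the time axis.
      pixel_series = cube[:, 50, 50]

      # Simple space-time summaries of the kind a land-use classifier might
      # consume: per-pixel temporal mean and variance.
      temporal_mean = cube.mean(axis=0)
      temporal_var = cube.var(axis=0)
      print(pixel_series.shape, temporal_mean.shape, temporal_var.shape)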

  3. The Big Apple's Core: Exploring Manhattan

    ERIC Educational Resources Information Center

    Groce, Eric C.; Groce, Robin D.; Colby, Susan

    2005-01-01

    Children are exposed to a wide variety of images related to New York City through various media outlets. They may have seen glimpses of Manhattan by watching movies such as Spiderman or Stuart Little or by taking in annual television events such as the Macy's Thanksgiving Day Parade or the Times Square New Year's Eve celebration. Additionally,…

  4. Microsystems - The next big thing

    SciTech Connect

    STINNETT,REGAN W.

    2000-05-11

    Micro-Electro-Mechanical Systems (MEMS) is a big name for tiny devices that will soon make big changes in everyday life and the workplace. These and other types of Microsystems range in size from a few millimeters to a few microns, much smaller than a human hair. These Microsystems have the capability to enable new ways to solve problems in commercial applications ranging from automotive, aerospace, telecommunications, manufacturing equipment, medical diagnostics to robotics, and in national security applications such as nuclear weapons safety and security, battlefield intelligence, and protection against chemical and biological weapons. This broad range of applications of Microsystems reflects the broad capabilities of future Microsystems to provide the ability to sense, think, act, and communicate, all in a single integrated package. Microsystems have been called the next silicon revolution, but like many revolutions, they incorporate more elements than their predecessors. Microsystems do include MEMS components fabricated from polycrystalline silicon processed using techniques similar to those used in the manufacture of integrated electrical circuits. They also include optoelectronic components made from gallium arsenide and other semiconducting compounds from the III-V groups of the periodic table. Microsystems components are also being made from pure metals and metal alloys using the LIGA process, which utilizes lithography, etching, and casting at the micron scale. Generically, Microsystems are micron scale, integrated systems that have the potential to combine the ability to sense light, heat, pressure, acceleration, vibration, and chemicals with the ability to process the collected data using CMOS circuitry, execute an electrical, mechanical, or photonic response, and communicate either optically or with microwaves.

  5. Big, Dark Dunes Northeast of Syrtis Major

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Big sand dunes! Mars is home to some very large, windblown dunes. The dunes shown here rise to almost 100 meters (275 feet) at their crests. Unlike dunes on Earth, the larger dunes of Mars are composed of dark, rather than light grains. This is probably related to the composition of the sand, since different materials will have different brightnesses. For example, beaches on the island of Oahu in Hawaii are light colored because they consist of ground-up particles of seashells, while beaches in the southern shores of the island of Hawaii (the 'Big Island' in the Hawaiian island chain) are dark because they consist of sand derived from dark lava rock.

    The dunes in this picture taken by the Mars Orbiter Camera (MOC) are located on the floor of an old, 72 km (45 mi) diameter crater located northeast of Syrtis Major. The sand is being blown from the upper right toward the lower left. The surface that the dunes have been travelling across is pitted and cratered. The substrate is also hard and bright--i.e., it is composed of a material of different composition than the sand in the dunes. The dark streaks on the dune surfaces are a puzzle...at first glance one might conclude they are the result of holiday visitors with off-road vehicles. However, the streaks more likely result from passing dust devils or wind gusts that disturb the sand surface just enough to leave a streak. The image shown here covers an area approximately 2.6 km (1.6 mi) wide, and is illuminated from the lower right.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  6. [Big data in medicine and healthcare].

    PubMed

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today is often too large and heterogeneous, and changes too quickly, to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are increasingly executed electronically, consumers produce more and more data themselves - e.g. in social networks - and digitalization keeps increasing. Currently, several new trends towards new data sources and innovative data analysis are appearing in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and to organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  7. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell

    2014-01-01

    Additive manufacturing is a rapid prototyping technology that allows parts to be built in a series of thin layers from plastic, ceramics, and metallics. Metallic additive manufacturing is an emerging form of rapid prototyping that allows complex structures to be built using various metallic powders. Significant time and cost savings have also been observed using the metallic additive manufacturing compared with traditional techniques. Development of the metallic additive manufacturing technology has advanced significantly over the last decade, although many of the techniques to inspect parts made from these processes have not advanced significantly or have limitations. Several external geometry inspection techniques exist such as Coordinate Measurement Machines (CMM), Laser Scanners, Structured Light Scanning Systems, or even traditional calipers and gages. All of the aforementioned techniques are limited to external geometry and contours or must use a contact probe to inspect limited internal dimensions. This presentation will document the development of a process for real-time dimensional inspection technique and digital quality record of the additive manufacturing process using Infrared camera imaging and processing techniques.

  8. Processing Solutions for Big Data in Astronomy

    NASA Astrophysics Data System (ADS)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
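
    As a hedged illustration of the MapReduce-style processing surveyed above, here is a short PySpark sketch (Spark being the more recent framework the paper discusses); the catalogue file name and its two-column layout are assumptions made for the example:

      # Histogram of source magnitudes computed in parallel with Spark.
      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("magnitude-histogram").getOrCreate()
      sc = spark.sparkContext

      # Each line is assumed to hold "source_id magnitude".
      lines = sc.textFile("catalogue.txt")
      histogram = (lines.map(lambda line: line.split())
                        .map(lambda cols: (int(float(cols[1])), 1))
                        .reduceByKey(lambda a, b: a + b))

      print(histogram.collect())
      spark.stop()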

  9. Big data and the electronic health record.

    PubMed

    Peters, Steve G; Buntrock, James D

    2014-01-01

    The electronic medical record has evolved from a digital representation of individual patient results and documents to information of large scale and complexity. Big Data refers to new technologies providing management and processing capabilities, targeting massive and disparate data sets. For an individual patient, techniques such as Natural Language Processing allow the integration and analysis of textual reports with structured results. For groups of patients, Big Data offers the promise of large-scale analysis of outcomes, patterns, temporal trends, and correlations. The evolution of Big Data analytics moves us from description and reporting to forecasting, predictive modeling, and decision optimization.

  10. Big-Data RHEED analysis for understanding epitaxial film growth processes

    SciTech Connect

    Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P; Kalinin, Sergei V

    2014-10-28

    Reflection high energy electron diffraction (RHEED) has by now become a standard tool for in-situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite the widespread adoption and the wealth of information in a RHEED image, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the dataset are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of a RHEED image sequence. This approach is illustrated for growth of LaxCa1-xMnO3 films grown on etched (001) SrTiO3 substrates, but is universal. The multivariate methods, including principal component analysis and k-means clustering, provide insight into the relevant behaviors, the timing and nature of a disordered-to-ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, these studies open the pathway to using forward prediction methods to potentially allow significantly more control over the growth process and hence final film quality.
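
    A minimal sketch of the PCA-plus-k-means analysis described above, written with scikit-learn; the frame count, image size and random data are placeholders, not the authors' RHEED dataset:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      n_frames, height, width = 500, 64, 64
      frames = np.random.rand(n_frames, height, width)   # stand-in RHEED frames

      X = frames.reshape(n_frames, -1)                   # one row per frame
      scores = PCA(n_components=5).fit_transform(X)      # dominant temporal modes
      labels = KMeans(n_clusters=2, n_init=10).fit_predict(scores)

      # A change in cluster label along the frame axis would flag, for example,
      # a disordered-to-ordered growth transition.
      print(labels[:20])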

  11. ATLAS: Big Data in a Small Package

    NASA Astrophysics Data System (ADS)

    Denneau, Larry; Tonry, John

    2015-08-01

    For even small telescope projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (ATLAS; Tonry 2011) will robotically survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids (NEAs) on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards -- two 0.5 m F/2.0 telescopes -- each year the ATLAS system will obtain ~10^3 measurements of 10^9 astronomical sources to a photometric accuracy of <5%. This ever-growing dataset must be searched in real-time for moving objects then archived for further analysis, and alerts for newly discovered NEAs disseminated within tens of minutes from detection. ATLAS's all-sky coverage ensures it will discover many "rifle shot" near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly-trailed sources and discriminate them from the thousands of satellites and pieces of space junk that ATLAS will see each night. Additional interrogation will identify interesting phenomena from beyond the solar system occurring over millions of transient sources per night. The data processing and storage requirements for ATLAS demand a "big data" approach typical of commercial Internet enterprises. We describe our approach to deploying a nimble, scalable and reliable data processing infrastructure, and promote ATLAS as a stepping stone to eventual processing scales in the era of LSST.

  12. Big Bang Nucleosynthesis in the New Cosmology

    SciTech Connect

    Fields, Brian D.

    2008-01-24

    Big bang nucleosynthesis (BBN) describes the production of the lightest elements in the first minutes of cosmic time. We review the physics of cosmological element production, and the observations of the primordial element abundances. The comparison between theory and observation has heretofore provided our earliest probe of the universe, and given the best measure of the cosmic baryon content. However, BBN has now taken a new role in cosmology, in light of new precision measurements of the cosmic microwave background (CMB). Recent CMB anisotropy data yield a wealth of cosmological parameters; in particular, the baryon-to-photon ratio η = n_B/n_γ is measured to high precision. The confrontation between the BBN and CMB "baryometers" poses a new and stringent test of the standard cosmology; the status of this test is discussed. Moreover, it is now possible to recast the role of BBN by using the CMB to fix the baryon density and even some light element abundances. This strategy sharpens BBN into a more powerful probe of early universe physics, and of galactic nucleosynthesis processes. The impact of the CMB results on particle physics beyond the Standard Model, and on non-standard cosmology, is illustrated. Prospects for improvement of these bounds via additional astronomical observations and nuclear experiments are discussed, as is the lingering "lithium problem."

  13. Big bang nucleosynthesis in the new cosmology

    NASA Astrophysics Data System (ADS)

    Fields, B. D.

    2006-03-01

    Big bang nucleosynthesis (BBN) describes the production of the lightest elements in the first minutes of cosmic time. We review the physics of cosmological element production, and the observations of the primordial element abundances. The comparison between theory and observation has heretofore provided our earliest probe of the universe, and given the best measure of the cosmic baryon content. However, BBN has now taken a new role in cosmology, in light of new precision measurements of the cosmic microwave background (CMB). Recent CMB anisotropy data yield a wealth of cosmological parameters; in particular, the baryon-to-photon ratio η = n_B/n_γ is measured to high precision. The confrontation between the BBN and CMB “baryometers” poses a new and stringent test of the standard cosmology; the status of this test is discussed. Moreover, it is now possible to recast the role of BBN by using the CMB to fix the baryon density and even some light element abundances. This strategy sharpens BBN into a more powerful probe of early universe physics, and of galactic nucleosynthesis processes. The impact of the CMB results on particle physics beyond the Standard Model, and on non-standard cosmology, is illustrated. Prospects for improvement of these bounds via additional astronomical observations and nuclear experiments are discussed, as is the lingering “lithium problem.”

  14. NOAA Big Data Partnership RFI

    NASA Astrophysics Data System (ADS)

    de la Beaujardiere, J.

    2014-12-01

    In February 2014, the US National Oceanic and Atmospheric Administration (NOAA) issued a Big Data Request for Information (RFI) from industry and other organizations (e.g., non-profits, research laboratories, and universities) to assess capability and interest in establishing partnerships to position a copy of NOAA's vast data holdings in the Cloud, co-located with easy and affordable access to analytical capabilities. This RFI was motivated by a number of concerns. First, NOAA's data facilities do not necessarily have sufficient network infrastructure to transmit all available observations and numerical model outputs to all potential users, or sufficient infrastructure to support simultaneous computation by many users. Second, the available data are distributed across multiple services and data facilities, making it difficult to find and integrate data for cross-domain analysis and decision-making. Third, large datasets require users to have substantial network, storage, and computing capabilities of their own in order to fully interact with and exploit the latent value of the data. Finally, there may be commercial opportunities for value-added products and services derived from our data. Putting a working copy of data in the Cloud outside of NOAA's internal networks and infrastructures should reduce demands and risks on our systems, and should enable users to interact with multiple datasets and create new lines of business (much like the industries built on government-furnished weather or GPS data). The NOAA Big Data RFI therefore solicited information on technical and business approaches regarding possible partnership(s) that -- at no net cost to the government and minimum impact on existing data facilities -- would unleash the commercial potential of its environmental observations and model outputs. NOAA would retain the master archival copy of its data. Commercial partners would not be permitted to charge fees for access to the NOAA data they receive, but

  15. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is

  16. Big Crater as Viewed by Pathfinder Lander - Anaglyph

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft.

    The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft.

    This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each frame consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be.

    The anaglyph view of Big Crater was

  17. Interoperability Outlook in the Big Data Future

    NASA Astrophysics Data System (ADS)

    Kuo, K. S.; Ramachandran, R.

    2015-12-01

    The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file format by NASA's Earth Observing System Data Information System (EOSDIS) doubtlessly propelled interoperability of NASA Earth science data to unprecedented heights in the 1990s. However, we obviously still find it wanting two decades later. We believe the inadequate interoperability we experience is a result of the current practice that data are first packaged into files before distribution, and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has its/his/her own preference in the choice of data management practice as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis service right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center

  18. bwtool: a tool for bigWig files

    PubMed Central

    Pohl, Andy; Beato, Miguel

    2014-01-01

    BigWig files are a compressed, indexed, binary format for genome-wide signal data for calculations (e.g. GC percent) or experiments (e.g. ChIP-seq/RNA-seq read depth). bwtool is a tool designed to read bigWig files rapidly and efficiently, providing functionality for extracting data and summarizing it in several ways, globally or at specific regions. Additionally, the tool enables the conversion of the positions of signal data from one genome assembly to another, also known as ‘lifting’. We believe bwtool can be useful for the analyst frequently working with bigWig data, which is becoming a standard format to represent functional signals along genomes. The article includes supplementary examples of running the software. Availability and implementation: The C source code is freely available under the GNU public license v3 at http://cromatina.crg.eu/bwtool. Contact: andrew.pohl@crg.eu, andypohl@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24489365

  19. The big five personality traits: psychological entities or statistical constructs?

    PubMed

    Franić, Sanja; Borsboom, Denny; Dolan, Conor V; Boomsma, Dorret I

    2014-11-01

    The present study employed multivariate genetic item-level analyses to examine the ontology and the genetic and environmental etiology of the Big Five personality dimensions, as measured by the NEO Five Factor Inventory (NEO-FFI) [Costa and McCrae, Revised NEO personality inventory (NEO PI-R) and NEO five-factor inventory (NEO-FFI) professional manual, 1992; Hoekstra et al., NEO personality questionnaires NEO-PI-R, NEO-FFI: manual, 1996]. Common and independent pathway model comparison was used to test whether the five personality dimensions fully mediate the genetic and environmental effects on the items, as would be expected under the realist interpretation of the Big Five. In addition, the dimensionalities of the latent genetic and environmental structures were examined. Item scores of a population-based sample of 7,900 adult twins (including 2,805 complete twin pairs; 1,528 MZ and 1,277 DZ) on the Dutch version of the NEO-FFI were analyzed. Although both the genetic and the environmental covariance components display a 5-factor structure, applications of common and independent pathway modeling showed that they do not comply with the collinearity constraints entailed in the common pathway model. Implications for the substantive interpretation of the Big Five are discussed.

  20. Cosmic relics from the big bang

    SciTech Connect

    Hall, L.J.

    1988-12-01

    A brief introduction to the big bang picture of the early universe is given. Dark matter is discussed, particularly its implications for elementary particle physics. A classification scheme for dark matter relics is given. 21 refs., 11 figs., 1 tab.

  1. Quantum nature of the big bang.

    PubMed

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime. PMID:16712061

  2. Big Data and Analytics in Healthcare.

    PubMed

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  3. Quantum nature of the big bang.

    PubMed

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.

  4. Big Red Eye is Ready

    NASA Astrophysics Data System (ADS)

    2007-01-01

    The world's biggest infrared camera for Europe's newest telescope left the UK today for Chile. The 67 million pixel camera will equip VISTA - a UK-provided survey telescope being constructed in Chile for ESO. VISTA will map the infrared sky faster than any previous telescope, studying areas of the Universe that are hard to see in the visible due to either their cool temperature, surrounding dust or high redshift. ESO PR Photo 04a/07: The VISTA Camera. The 2.9-tonne VISTA camera has been designed and built by a consortium including the CCLRC Rutherford Appleton Laboratory, the UK Astronomy Technology Centre (UK ATC) in Edinburgh and the University of Durham. "The camera operates under vacuum at a temperature of -200 degrees Celsius, so in many ways it has been like designing an instrument for use in space, but with the additional constraint of having to survive an earthquake environment," said Kim Ward, the Camera Manager from the Rutherford Appleton Laboratory, who oversaw the technical challenges. "With a total of 67 million pixels, VISTA has a much larger number of infrared sensitive detectors than previous infrared instruments." VISTA is due to start scientific operations in the last quarter of 2007. "VISTA will be able to take images of sky areas each about 3 times as large as the full Moon," said Jim Emerson of Queen Mary, University of London, UK and VISTA's Principal Investigator. "This means it can survey quickly. The camera is crucial to carrying out VISTA's surveys which will provide statistical samples of objects and at the same time locate and characterise rare and variable objects, and perhaps most tantalisingly make discoveries of the as-yet unknown." The 4-m VISTA will survey large areas of the southern sky at near-infrared wavelengths to study objects that are not seen easily in optical light either because they are too cool, or are surrounded by dust (which infrared light penetrates much better than optical), or whose optical

  5. Big-bang nucleosynthesis revisited

    NASA Technical Reports Server (NTRS)

    Olive, Keith A.; Schramm, David N.; Steigman, Gary; Walker, Terry P.

    1989-01-01

    The homogeneous big-bang nucleosynthesis yields of D, He-3, He-4, and Li-7 are computed taking into account recent measurements of the neutron mean-life as well as updates of several nuclear reaction rates which primarily affect the production of Li-7. The extraction of primordial abundances from observation and the likelihood that the primordial mass fraction of He-4, Y_p, is less than or equal to 0.24 are discussed. Using the primordial abundances of D + He-3 and Li-7 we limit the baryon-to-photon ratio (η in units of 10^-10): 2.6 ≤ η_10 ≤ 4.3; which we use to argue that baryons contribute between 0.02 and 0.11 to the critical energy density of the universe. An upper limit to Y_p of 0.24 constrains the number of light neutrinos to N_ν ≤ 3.4, in excellent agreement with the LEP and SLC collider results. We turn this argument around to show that the collider limit of 3 neutrino species can be used to bound the primordial abundance of He-4: 0.235 ≤ Y_p ≤ 0.245.
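
    As a rough numerical check of the baryon-density argument above, the quoted η_10 range can be converted to a density parameter using the approximate relation η_10 ≈ 274 Ω_b h²; the conversion constant and the assumed range of the Hubble parameter h are modern conventions introduced here for illustration, so the numbers only approximately reproduce the paper's 0.02-0.11 window:

      # Convert the eta_10 bounds to Omega_b for a few assumed values of h.
      eta10_lo, eta10_hi = 2.6, 4.3

      omega_b_h2_lo = eta10_lo / 274.0   # Omega_b * h^2, lower bound
      omega_b_h2_hi = eta10_hi / 274.0   # Omega_b * h^2, upper bound

      for h in (0.4, 0.7, 1.0):          # assumed H0 / (100 km/s/Mpc)
          print(f"h={h}: Omega_b between {omega_b_h2_lo / h**2:.3f} "
                f"and {omega_b_h2_hi / h**2:.3f}")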

  6. Data Confidentiality Challenges in Big Data Applications

    SciTech Connect

    Yin, Jian; Zhao, Dongfang

    2015-12-15

    In this paper, we address the problem of data confidentiality in big data analytics. In many fields, many useful patterns can be extracted by applying machine learning techniques to big data. However, data confidentiality must be protected. In many scenarios, data confidentiality could well be a prerequisite for data to be shared. We present a scheme to provide provably secure data confidentiality and discuss various techniques to optimize the performance of such a system.

  7. Quality of Big Data in Healthcare

    SciTech Connect

    Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay

    2015-01-01

    The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.

  8. Dark energy, wormholes, and the big rip

    SciTech Connect

    Faraoni, V.; Israel, W.

    2005-03-15

    The time evolution of a wormhole in a Friedmann universe approaching the big rip is studied. The wormhole is modeled by a thin spherical shell accreting the superquintessence fluid--two different models are presented. Contrary to recent claims that the wormhole overtakes the expansion of the universe and engulfs it before the big rip is reached, it is found that the wormhole becomes asymptotically comoving with the cosmic fluid and the future evolution of the universe is fully causal.

  9. COBE looks back to the Big Bang

    NASA Technical Reports Server (NTRS)

    Mather, John C.

    1993-01-01

    An overview is presented of NASA-Goddard's Cosmic Background Explorer (COBE), the first NASA satellite designed to observe the primeval explosion of the universe. The spacecraft carries three extremely sensitive IR and microwave instruments designed to measure the faint residual radiation from the Big Bang and to search for the formation of the first galaxies. COBE's far IR absolute spectrophotometer has shown that the Big Bang radiation has a blackbody spectrum, proving that there was no large energy release after the explosion.

  10. Big data: survey, technologies, opportunities, and challenges.

    PubMed

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  11. Big Data: Survey, Technologies, Opportunities, and Challenges

    PubMed Central

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  12. Big data: survey, technologies, opportunities, and challenges.

    PubMed

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  13. Big biomedical data as the key resource for discovery science.

    PubMed

    Toga, Arthur W; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan D; Glusman, Gustavo; Heavner, Benjamin D; Dinov, Ivo D; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-11-01

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an "-ome to home" approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center's computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson's and Alzheimer's.

  14. Mosaicking Mexico - the Big Picture of Big Data

    NASA Astrophysics Data System (ADS)

    Hruby, F.; Melamed, S.; Ressl, R.; Stanley, D.

    2016-06-01

    The project presented in this article is to create a completely seamless and cloud-free mosaic of Mexico at a resolution of 5 m, using approximately 4,500 RapidEye images. To complete this project in a timely manner and with limited operators, a number of processing architectures were required to handle a data volume of 12 terabytes. This paper discusses the different operations realized to complete this project, which include preprocessing, mosaic generation and post-mosaic editing. Prior to mosaic generation, it was necessary to filter the 50,000 RapidEye images captured over Mexico between 2011 and 2014 to identify the top candidate images, based on season and cloud cover. Upon selecting the top candidate images, PCI Geomatics' GXL system was used to reproject, color balance and generate seamlines for the output 1 TB+ mosaic. This paper also discusses innovative techniques used by the GXL for color balancing large volumes of imagery with substantial radiometric differences. Furthermore, post-mosaicking steps such as exposure correction, cloud and cloud shadow elimination are presented.
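
    A hypothetical sketch of the candidate-selection step described above, ranking scenes per footprint by cloud cover and acquisition season; the field names, tile identifiers and scoring rule are illustrative assumptions, not part of the GXL workflow:

      from datetime import date

      scenes = [
          {"tile": "13QFB", "date": date(2013, 3, 14), "cloud_pct": 4.0},
          {"tile": "13QFB", "date": date(2012, 8, 2),  "cloud_pct": 22.0},
          {"tile": "14QLG", "date": date(2014, 2, 20), "cloud_pct": 1.5},
      ]

      PREFERRED_MONTHS = {11, 12, 1, 2, 3, 4}    # drier season, fewer clouds

      def score(scene):
          # Lower is better: cloud cover plus a penalty for out-of-season scenes.
          season_penalty = 0 if scene["date"].month in PREFERRED_MONTHS else 50
          return scene["cloud_pct"] + season_penalty

      best_per_tile = {}
      for s in sorted(scenes, key=score):        # best-scoring scene wins each tile
          best_per_tile.setdefault(s["tile"], s)
      print(best_per_tile)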

  15. Heat Exchange, Additive Manufacturing, and Neutron Imaging

    SciTech Connect

    Geoghegan, Patrick

    2015-02-23

    Researchers at the Oak Ridge National Laboratory have captured undistorted snapshots of refrigerants flowing through small heat exchangers, helping them to better understand heat transfer in heating, cooling and ventilation systems.

  16. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-06-30

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting which will showcase the architecture of the GIS framework and initial results for sources and sinks, discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop (see attached agenda). The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement

  17. Boosting Big National Lab Data

    SciTech Connect

    Kleese van Dam, Kerstin

    2013-02-21

    Introduction: Big data. Love it or hate it, solving the world’s most intractable problems requires the ability to make sense of huge and complex sets of data and do it quickly. Speeding up the process – from hours to minutes or from weeks to days – is key to our success. One major source of such big data are physical experiments. As many will know, these physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to research for example the validity of new drugs, the base cause for diseases, more efficient energy sources, new materials for every day goods, effective methods for environmental cleanup, the optimal ingredients composition for chocolate or determine how to preserve valuable antics. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influences the rate of scientific progress, industrial innovation and competitiveness. And gaining new groundbreaking insights, faster, is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house size detector systems in large experiments such as the Large Hadron Collider and the ‘Eye of Gaia’ billion pixel camera detector to high throughput genome sequencing. These developments have led to an exponential increase in data volumes, rates and variety produced by instruments used for experimental work. This increase is coinciding with a need to analyze the experimental results at the time they are collected. This speed is required to optimize the data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a

  18. Big bang nucleosynthesis: Present status

    NASA Astrophysics Data System (ADS)

    Cyburt, Richard H.; Fields, Brian D.; Olive, Keith A.; Yeh, Tsung-Han

    2016-01-01

    Big bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. A brief overview of the essentials of this physics is given, and new calculations presented of light-element abundances through 6Li and 7Li, with updated nuclear reactions and uncertainties including those in the neutron lifetime. Fits are provided for these results as a function of baryon density and of the number of neutrino flavors Nν. Recent developments are reviewed in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom Neff. These measurements allow for a tight test of BBN and cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. A new determination of the primordial helium abundance is included in our likelihood analysis. New D/H observations are now more precise than the corresponding theoretical predictions and are consistent with the standard model and the Planck baryon density. Moreover, D/H now provides a tight measurement of Nν when combined with the CMB baryon density and provides a 2σ upper limit Nν < 3.2. The new precision of the CMB and D/H observations together leaves D/H predictions as the largest source of uncertainties. Future improvement in BBN calculations will therefore rely on improved nuclear cross-section data. In contrast with D/H and 4He, 7Li predictions continue to disagree with observations, perhaps pointing to new physics. This paper concludes with a look at future directions including key nuclear reactions, astronomical observations, and theoretical issues.

  19. Pockmarks off Big Sur, California

    USGS Publications Warehouse

    Paull, C.; Ussler, W.; Maher, N.; Greene, H. Gary; Rehder, G.; Lorenson, T.; Lee, H.

    2002-01-01

    A pockmark field was discovered during EM-300 multi-beam bathymetric surveys on the lower continental slope off the Big Sur coast of California. The field contains ≈1500 pockmarks which are between 130 and 260 m in diameter, and typically are 8-12 m deep, located within a 560 km² area. To investigate the origin of these features, piston cores were collected from both the interior and the flanks of the pockmarks, and remotely operated vehicle (ROV) video observation and sampling transects were conducted which passed through 19 of the pockmarks. The water column within and above the pockmarks was sampled for methane concentration. Piston cores and ROV-collected push cores show that the pockmark field is composed of monotonous fine silts and clays and the cores within the pockmarks are indistinguishable from those outside the pockmarks. No evidence for either sediment winnowing or diagenetic alteration suggestive of fluid venting was obtained. ¹⁴C measurements of the organic carbon in the sediments indicate continuous sedimentation throughout the time resolution of the radiocarbon technique (≈45000 yr BP), with a sedimentation rate of ≈10 cm per 1000 yr both within and between the pockmarks. Concentrations of methane, dissolved inorganic carbon, sulfate, chloride, and ammonium in pore water extracted from within the cores are generally similar in composition to seawater and show little change with depth, suggesting low biogeochemical activity. These pore water chemical gradients indicate that neither significant accumulations of gas are likely to exist in the shallow subsurface (≈100 m) nor is active fluid advection occurring within the sampled sediments. Taken together the data indicate that these pockmarks are more than 45000 yr old, are presently inactive, and contain no indications of earlier fluid or gas venting events. © 2002 Elsevier Science B.V. All rights reserved.

  20. Breaking Barriers in Polymer Additive Manufacturing

    SciTech Connect

    Love, Lonnie J; Duty, Chad E; Post, Brian K; Lind, Randall F; Lloyd, Peter D; Kunc, Vlastimil; Peter, William H; Blue, Craig A

    2015-01-01

    Additive Manufacturing (AM) enables the creation of complex structures directly from a computer-aided design (CAD). There are limitations that prevent the technology from realizing its full potential. AM has been criticized for being slow and expensive with limited build size. Oak Ridge National Laboratory (ORNL) has developed a large scale AM system that improves upon each of these areas by more than an order of magnitude. The Big Area Additive Manufacturing (BAAM) system directly converts low cost pellets into a large, three-dimensional part at a rate exceeding 25 kg/h. By breaking these traditional barriers, it is possible for polymer AM to penetrate new manufacturing markets.

  1. Classification of Big Point Cloud Data Using Cloud Computing

    NASA Astrophysics Data System (ADS)

    Liu, K.; Boehm, J.

    2015-08-01

    Point cloud data plays a significant role in various geospatial applications as it conveys plentiful information which can be used for different types of analysis. Semantic analysis, an important example, aims to label points with different categories. In machine learning, this problem is called classification. In addition, processing point data is becoming more and more challenging due to the growing data volume. In this paper, we address point data classification in a big data context. The popular cluster computing framework Apache Spark is used throughout the experiments, and the promising results suggest a great potential of Apache Spark for large-scale point data processing.
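
    The abstract gives no code, so the following is only a minimal PySpark sketch of the kind of pipeline described, assuming the point cloud has already been flattened to a table with per-point columns x, y, z, intensity and an integer label; the file name, feature set, and choice of a random forest are illustrative assumptions rather than the authors' actual setup.

    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import RandomForestClassifier
    from pyspark.ml.evaluation import MulticlassClassificationEvaluator

    spark = SparkSession.builder.appName("point-cloud-classification").getOrCreate()

    # Hypothetical input: one row per point with columns x, y, z, intensity, label.
    points = spark.read.csv("points_with_features.csv", header=True, inferSchema=True)

    assembler = VectorAssembler(inputCols=["x", "y", "z", "intensity"], outputCol="features")
    data = assembler.transform(points)
    train, test = data.randomSplit([0.8, 0.2], seed=42)

    # Any Spark ML classifier would do; a random forest is used here for illustration.
    model = RandomForestClassifier(labelCol="label", featuresCol="features", numTrees=50).fit(train)

    accuracy = MulticlassClassificationEvaluator(
        labelCol="label", predictionCol="prediction", metricName="accuracy"
    ).evaluate(model.transform(test))
    print(f"Test accuracy: {accuracy:.3f}")

    spark.stop()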

  2. ALMA imaging of gas and dust in a galaxy protocluster at redshift 5.3: [C II] emission in 'typical' galaxies and dusty starbursts ≈1 billion years after the big bang

    SciTech Connect

    Riechers, Dominik A.; Carilli, Christopher L.; Capak, Peter L.; Yan, Lin; Scoville, Nicholas Z.; Smolčić, Vernesa; Schinnerer, Eva; Yun, Min; Cox, Pierre; Bertoldi, Frank; Karim, Alexander

    2014-12-01

    We report interferometric imaging of [C II]({sup 2} P {sub 3/2}→{sup 2} P {sub 1/2}) and OH({sup 2}Π{sub 1/2} J = 3/2→1/2) emission toward the center of the galaxy protocluster associated with the z = 5.3 submillimeter galaxy (SMG) AzTEC-3, using the Atacama Large (sub)Millimeter Array (ALMA). We detect strong [C II], OH, and rest-frame 157.7 μm continuum emission toward the SMG. The [C II]({sup 2} P {sub 3/2}→{sup 2} P {sub 1/2}) emission is distributed over a scale of 3.9 kpc, implying a dynamical mass of 9.7 × 10{sup 10} M {sub ☉}, and a star formation rate (SFR) surface density of Σ{sub SFR} = 530 M {sub ☉} yr{sup –1} kpc{sup –2}. This suggests that AzTEC-3 forms stars at Σ{sub SFR} approaching the Eddington limit for radiation pressure supported disks. We find that the OH emission is slightly blueshifted relative to the [C II] line, which may indicate a molecular outflow associated with the peak phase of the starburst. We also detect and dynamically resolve [C II]({sup 2} P {sub 3/2}→{sup 2} P {sub 1/2}) emission over a scale of 7.5 kpc toward a triplet of Lyman-break galaxies with moderate UV-based SFRs in the protocluster at ∼95 kpc projected distance from the SMG. These galaxies are not detected in the continuum, suggesting far-infrared SFRs of <18-54 M {sub ☉} yr{sup –1}, consistent with a UV-based estimate of 22 M {sub ☉} yr{sup –1}. The spectral energy distribution of these galaxies is inconsistent with nearby spiral and starburst galaxies, but resembles those of dwarf galaxies. This is consistent with expectations for young starbursts without significant older stellar populations. This suggests that these galaxies are significantly metal-enriched, but not heavily dust-obscured, 'normal' star-forming galaxies at z > 5, showing that ALMA can detect the interstellar medium in 'typical' galaxies in the very early universe.

  3. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

    We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t} where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus our model can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position, t, along a tract in the brain. In one example, the response is disease-status (case or control) and in a second example, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online.
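
    As a rough illustration of the model structure described above (not the authors' R implementation), the Python sketch below approximates the integral of F{X(t), t} by averaging over the sampling grid and represents F on a tensor-product B-spline basis fitted with a simple ridge penalty; the simulated data, basis sizes, and penalty value are assumptions made only for the example.

    import numpy as np
    from scipy.interpolate import BSpline

    def bspline_basis(x, low, high, n_basis, degree=3):
        """Open-uniform B-spline basis of size n_basis evaluated at the points x."""
        n_inner = n_basis - degree - 1
        inner = np.linspace(low, high, n_inner + 2)[1:-1]
        knots = np.concatenate([[low] * (degree + 1), inner, [high] * (degree + 1)])
        return BSpline(knots, np.eye(n_basis), degree)(np.clip(x, low, high))

    rng = np.random.default_rng(0)
    n, T = 200, 50                                             # subjects, sampling points along t
    t = np.linspace(0.0, 1.0, T)
    X = rng.normal(size=(n, T)).cumsum(axis=1) / np.sqrt(T)    # toy functional covariate
    F_true = lambda x, tt: np.sin(2 * x) * np.cos(np.pi * tt)  # "unknown" surface F(x, t)
    y = F_true(X, t).mean(axis=1) + rng.normal(scale=0.1, size=n)

    # Tensor-product B-spline design: F(x, t) = sum_kl theta_kl B_k(x) C_l(t),
    # with the integral over t approximated by the average over the grid.
    Kx, Kt = 8, 8
    Bx = bspline_basis(X.ravel(), X.min(), X.max(), Kx)        # (n*T, Kx)
    Bt = np.tile(bspline_basis(t, 0.0, 1.0, Kt), (n, 1))       # (n*T, Kt)
    tensor = (Bx[:, :, None] * Bt[:, None, :]).reshape(n * T, Kx * Kt)
    design = tensor.reshape(n, T, Kx * Kt).mean(axis=1)        # quadrature over t
    design = np.column_stack([np.ones(n), design])             # add an intercept

    # Ridge-penalized least squares (the paper uses roughness penalties instead).
    lam = 1.0
    coef = np.linalg.solve(design.T @ design + lam * np.eye(design.shape[1]),
                           design.T @ y)
    print("in-sample RMSE:", np.sqrt(np.mean((design @ coef - y) ** 2)))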

  4. Big Data in Caenorhabditis elegans: quo vadis?

    PubMed

    Hutter, Harald; Moerman, Donald

    2015-11-01

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell.

  5. Big Data in Caenorhabditis elegans: quo vadis?

    PubMed Central

    Hutter, Harald; Moerman, Donald

    2015-01-01

    A clear definition of what constitutes “Big Data” is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of “complete” data sets for this organism is actually rather small—not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein–protein interaction—important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. PMID:26543198

  6. Making a Difference. An Impact Study of Big Brothers/Big Sisters.

    ERIC Educational Resources Information Center

    Tierney, Joseph P.; And Others

    This report provides reliable evidence that mentoring programs can positively affect young people. The evidence is derived from research conducted at local affiliates of Big Brothers/Big Sisters of America (BB/BSA), the oldest, best-known, and arguably most sophisticated of the country's mentoring programs. Public/Private Ventures, Inc. conducted…

  7. Benchmarking Big Data Systems and the BigData Top100 List.

    PubMed

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TCP), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  8. Benchmarking Big Data Systems and the BigData Top100 List.

    PubMed

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TCP), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process. PMID:27447039

  9. Different pressor and bronchoconstrictor properties of human big-endothelin-1, 2 (1-38) and 3 in ketamine/xylazine-anaesthetized guinea-pigs.

    PubMed Central

    Gratton, J P; Rae, G A; Claing, A; Télémaque, S; D'Orléans-Juste, P

    1995-01-01

    1. In the present study, the precursors of endothelin-1, endothelin-2 and endothelin-3 were tested for their pressor and bronchoconstrictor properties in the anaesthetized guinea-pig. In addition, the effects of big-endothelin-1 and endothelin-1 were assessed under urethane or ketamine/xylazine anaesthesia. 2. When compared to ketamine/xylazine, urethane markedly depressed the pressor and bronchoconstrictor properties of endothelin-1 and big-endothelin-1. 3. Under ketamine/xylazine anaesthesia, the three endothelins induced a biphasic increase of mean arterial blood pressure. In contrast, big-endothelin-1, as well as big-endothelin-2 (1-38), induced only sustained increase in blood pressure whereas big-endothelin-3 was inactive at doses up to 25 nmol kg-1. 4. Big-endothelin-1, but not big-endothelin-2, induced a significant increase in airway resistance. Yet, endothelin-1, endothelin-2 and endothelin-3 were equipotent as bronchoconstrictor agents. 5. Big-endothelin-1, endothelin-1 and endothelin-2, but not big-endothelin-2, triggered a marked release of prostacyclin and thromboxane A2 from the guinea-pig perfused lung. 6. Our results suggest the presence of a phosphoramidon-sensitive endothelin-converting enzyme (ECE) which is responsible for the conversion of big-endothelin-1 and big-endothelin-2 to their active moieties, endothelin-1 and 2. However, the lack of bronchoconstrictor and eicosanoid-releasing properties of big-endothelin-2, as opposed to endothelin-2 or big-endothelin-1, suggests the presence of two distinct phosphoramidon-sensitive ECEs in the guinea-pig.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:7735698

  10. Small Things Draw Big Interest

    ERIC Educational Resources Information Center

    Green, Susan; Smith III, Julian

    2005-01-01

    Although the microscope is a basic tool in both physical and biological sciences, it is notably absent from most elementary school science programs. One reason teachers find it challenging to introduce microscopy at the elementary level is because children can have a hard time connecting the image of an object seen through a microscope with what…

  11. ATLAS: Big Data in a Small Package?

    NASA Astrophysics Data System (ADS)

    Denneau, Larry

    2016-01-01

    For even small astronomy projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (Tonry 2011) will survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards - two 0.5 m F/2.0 telescopes - each night the ATLAS system will measure nearly 10⁹ astronomical sources to a photometric accuracy of <5%, totaling 10¹² individual observations over its initial 3-year mission. This ever-growing dataset must be searched in real-time for moving objects and transients, then archived for further analysis, and alerts for newly discovered near-Earth asteroids (NEAs) disseminated within tens of minutes of detection. ATLAS's all-sky coverage ensures it will discover many `rifle shot' near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly-trailed sources and discriminate them from the thousands of low-Earth orbit (LEO) and geosynchronous orbit (GEO) satellites ATLAS will see each night. Additional interrogation will identify interesting phenomena from millions of transient sources per night beyond the solar system. The data processing and storage requirements for ATLAS demand a `big data' approach typical of commercial internet enterprises. We describe our experience in deploying a nimble, scalable and reliable data processing infrastructure, and suggest ATLAS as a stepping stone to the data processing capability needed as we enter the era of LSST.

  12. Transcriptome marker diagnostics using big data.

    PubMed

    Han, Henry; Liu, Ying

    2016-02-01

    Big omics data are challenging translational bioinformatics in an unprecedented way owing to their complexity and volume. How to employ big omics data to achieve a rivalling-clinical, reproducible disease diagnosis from a systems approach is an urgent problem to be solved in translational bioinformatics and machine learning. In this study, the authors propose a novel transcriptome marker diagnosis to tackle this problem using big RNA-seq data by systematically viewing the whole transcriptome as a profile marker. The systems diagnosis not only avoids the reproducibility issue of the existing gene-/network-marker-based diagnostic methods, but also achieves rivalling-clinical diagnostic results by extracting true signals from big RNA-seq data. Their method is a better fit for personalised diagnostics, attaining better diagnostic performance than competing methods by using systems-level information, and is a good candidate for clinical usage. To the best of their knowledge, it is the first study on this topic and will inspire more investigations in big omics data diagnostics.

  13. Volume and Value of Big Healthcare Data

    PubMed Central

    Dinov, Ivo D.

    2016-01-01

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. Moore's and Kryder's laws of exponential increase in computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions. PMID:26998309

  14. The Confluence of Exascale and Big Data

    NASA Astrophysics Data System (ADS)

    Dosanjh, Sudip

    2014-04-01

    Exascale computing has rightly received considerable attention within the high performance computing community. In many fields, scientific progress requires a thousand-fold increase in supercomputing performance over the next decade. Science needs include performing single simulations that span a large portion of an exascale system, as well as high-throughput computing. The big data problem has also received considerable attention, but is sometimes viewed as being orthogonal to exascale computing. This talk focuses on the confluence of exascale and big data. Exascale and big data face many similar technical challenges including increasing power/energy constraints, the growing mismatch between computing and data movement speeds, an explosion in concurrency and the reduced reliability of large computing systems. Even though exascale and data intensive systems might have different system-level architectures, the fundamental building blocks will be similar. Analyzing all the information produced by exascale simulations will also generate a big data problem. And finally, many experimental facilities are being inundated with large quantities of data as sensors and sequencers improve at rates that surpass Moore's Law. It is becoming increasingly difficult to analyze all of the data from a single experiment and it is often impossible to make comparisons across data sets. It will only be possible to accelerate scientific discovery if we bring together the high performance computing and big data communities.

  15. Peplography: an image restoration technique through scattering media

    NASA Astrophysics Data System (ADS)

    Cho, Myungjin; Cho, Ki-Ok; Kim, Youngjun

    2016-06-01

    In this paper, we propose an image restoration technique through scattering media. Under natural light, imaging through scattering media is a big challenge in many applications. To overcome this challenge, many methods have been reported such as non-invasive imaging, ghost imaging, and wavefront shaping. However, their results have not been sufficient for observers. In this paper, we estimate the scattering media by statistical estimation such as maximum likelihood estimation. By removing this estimated scattering media from the original image, we can obtain the image with only ballistic photons. Then, the ballistic photons can be detected using a photon counting imaging concept. In addition, since each basic color channel has its own wavelength, a color photon counting process can be implemented. To enhance the visual quality of the resulting image, a passive three-dimensional (3D) imaging technique such as integral imaging is used. To validate our method and demonstrate its performance, we carried out optical experiments and calculated the mean square error (MSE).
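
    As a hedged illustration of the photon counting step and the MSE metric mentioned above (not the authors' full peplography pipeline), a common model treats the detected counts at each pixel as Poisson-distributed with mean proportional to the normalized irradiance; the image size and photon budget below are illustrative assumptions.

    import numpy as np

    def photon_count_image(irradiance, n_photons, rng):
        """Photon-limited detection: Poisson counts with mean proportional to the
        normalized irradiance at each pixel."""
        p = irradiance / irradiance.sum()
        return rng.poisson(n_photons * p)

    def mse(reference, estimate):
        """Mean square error between max-normalized images."""
        ref = reference / reference.max()
        est = estimate / max(estimate.max(), 1)
        return np.mean((ref - est) ** 2)

    rng = np.random.default_rng(1)
    scene = rng.random((64, 64))          # stand-in for the ballistic-photon image
    counts = photon_count_image(scene, n_photons=5_000, rng=rng)
    print("MSE vs. reference:", mse(scene, counts))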

  16. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting which will showcase the architecture of the GIS framework and initial results for sources and sinks, discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification

  17. Elevation of neuron specific enolase and brain iron deposition on susceptibility-weighted imaging as diagnostic clues for beta-propeller protein-associated neurodegeneration in early childhood: Additional case report and review of the literature.

    PubMed

    Takano, Kyoko; Shiba, Naoko; Wakui, Keiko; Yamaguchi, Tomomi; Aida, Noriko; Inaba, Yuji; Fukushima, Yoshimitsu; Kosho, Tomoki

    2016-02-01

    Beta-propeller protein-associated neurodegeneration (BPAN), also known as static encephalopathy of childhood with neurodegeneration in adulthood (SENDA), is a subtype of neurodegeneration with brain iron accumulation (NBIA). BPAN is caused by mutations in an X-linked gene WDR45 that is involved in autophagy. BPAN is characterized by developmental delay or intellectual disability until adolescence or early adulthood, followed by severe dystonia, parkinsonism, and progressive dementia. Brain magnetic resonance imaging (MRI) shows iron deposition in the bilateral globus pallidus (GP) and substantia nigra (SN). Clinical manifestations and laboratory findings in early childhood are limited. We report a 3-year-old girl with BPAN who presented with severe developmental delay and characteristic facial features. In addition to chronic elevation of serum aspartate transaminase, lactate dehydrogenase, creatine kinase, and soluble interleukin-2 receptor, she had persistent elevation of neuron specific enolase (NSE) in serum and cerebrospinal fluid. MRI using susceptibility-weighted imaging (SWI) demonstrated iron accumulation in the GP and SN bilaterally. Targeted next-generation sequencing identified a de novo splice-site mutation, c.831-1G>C in WDR45, which resulted in aberrant splicing evidenced by reverse transcriptase-PCR. Persistent elevation of NSE and iron deposition on SWI may provide clues for diagnosis of BPAN in early childhood.

  18. Supramolecular polymerisation in water; elucidating the role of hydrophobic and hydrogen-bond interactions† †Electronic supplementary information (ESI) available: Experimental details, characterization by IR and UV spectroscopy and dynamic light scattering, video files of optical microscopy imaging. See DOI: 10.1039/c5sm02843d

    PubMed Central

    Leenders, Christianus M. A.; Baker, Matthew B.; Pijpers, Imke A. B.; Lafleur, René P. M.; Albertazzi, Lorenzo

    2016-01-01

    Understanding the self-assembly of small molecules in water is crucial for the development of responsive, biocompatible soft materials. Here, a family of benzene-1,3,5-tricarboxamide (BTA) derivatives that comprise a BTA moiety connected to an amphiphilic chain is synthesised with the aim to elucidate the role of hydrophobic and hydrogen-bonding interactions in the self-assembly of these BTAs. The amphiphilic chain consists of an alkyl chain with a length of 10, 11, or 12 methylene units, connected to a tetraethylene glycol (at the periphery). The results show that an undecyl spacer is the minimum length required for these BTAs to self-assemble into supramolecular polymers. Interestingly, exchange studies reveal only minor differences in exchange rates between BTAs containing undecyl or dodecyl spacers. Additionally, IR spectroscopy provides the first experimental evidence that hydrogen-bonding is operative and contributes to the stabilisation of the supramolecular polymers in water. PMID:26892482

  19. Adapting bioinformatics curricula for big data.

    PubMed

    Greene, Anna C; Giffin, Kristine A; Greene, Casey S; Moore, Jason H

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs.

  20. Implications of Big Data for cell biology

    PubMed Central

    Dolinski, Kara; Troyanskaya, Olga G.

    2015-01-01

    “Big Data” has surpassed “systems biology” and “omics” as the hottest buzzword in the biological sciences, but is there any substance behind the hype? Certainly, we have learned about various aspects of cell and molecular biology from the many individual high-throughput data sets that have been published in the past 15–20 years. These data, although useful as individual data sets, can provide much more knowledge when interrogated with Big Data approaches, such as applying integrative methods that leverage the heterogeneous data compendia in their entirety. Here we discuss the benefits and challenges of such Big Data approaches in biology and how cell and molecular biologists can best take advantage of them. PMID:26174066

  1. The dominance of big pharma: power.

    PubMed

    Edgar, Andrew

    2013-05-01

    The purpose of this paper is to provide a normative model for the assessment of the exercise of power by Big Pharma. By drawing on the work of Steven Lukes, it will be argued that while Big Pharma is overtly highly regulated, so that its power is indeed restricted in the interests of patients and the general public, the industry is still able to exercise what Lukes describes as a third dimension of power. This entails concealing the conflicts of interest and grievances that Big Pharma may have with the health care system, physicians and patients, crucially through rhetorical engagements with Patient Advocacy Groups that seek to shape public opinion, and also by marginalising certain groups, excluding them from debates over health care resource allocation. Three issues will be examined: the construction of a conception of the patient as expert patient or consumer; the phenomenon of disease mongering; the suppression or distortion of debates over resource allocation.

  2. Little Big Horn River Water Quality Project

    SciTech Connect

    Bad Bear, D.J.; Hooker, D.

    1995-10-01

    This report summarizes the accomplishments of the Water Quality Project on the Little Big Horn River during the summer of 1995. The majority of the summer was spent collecting data on the Little Big Horn River and then running a number of different tests on the water samples, which was done at Little Big Horn College in Crow Agency, Montana. The intention of this study is to perform stream quality analysis to gain an understanding of the quality of a selected portion of the river, to assess any impact that existing developments may be causing to the environment, and to gather baseline data that will provide information concerning the proposed development. Citizens of the reservation have expressed concern about the quality of the water on the reservation: surface water, groundwater, and well water.

  3. Big Science and the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Giudice, Gian Francesco

    2012-03-01

    The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.

  4. Human Neuroimaging as a “Big Data” Science

    PubMed Central

    Van Horn, John Darrell; Toga, Arthur W.

    2013-01-01

    The maturation of in vivo neuroimaging has led to incredible quantities of digital information about the human brain. While much is made of the data deluge in science, neuroimaging represents the leading edge of this onslaught of “big data”. A range of neuroimaging databasing approaches has streamlined the transmission, storage, and dissemination of data from such brain imaging studies. Yet few, if any, common solutions exist to support the science of neuroimaging. In this article, we discuss how modern neuroimaging research represents a multifactorial and broad-ranging data challenge, involving the growing size of the data being acquired; sociological and logistical sharing issues; infrastructural challenges for multi-site, multi-datatype archiving; and the means by which to explore and mine these data. As neuroimaging advances further into, e.g., aging, genetics, and age-related disease, new vision is needed to manage and process this information while marshalling these resources into novel results. Thus, “big data” can become “big” brain science. PMID:24113873

  5. Spatial Big Data Organization, Access and Visualization with ESSG

    NASA Astrophysics Data System (ADS)

    Wu, L. X.; Yu, J. Q.; Yang, Y. Z.; Jia, Y. J.

    2013-10-01

    Hundreds of spatial reference frames (SRFs) are in use, and the great differences among them have hindered the sharing of global data about planet Earth. A conceptual spheroid of radius 12,800 km and a spheroid degenerated octree grid method are applied to produce an earth system spatial grid (ESSG), whose natural characteristics make it suitable as a new common SRF. A triple CTA is designed as the ESSG-based data structure to organize the big data of planet Earth, and a 2D table with a unique label and unlimited records for time slices and attribute values is presented to record the data of each grid cell. The big data on planet Earth can hence be gridded and interrelated without discipline gaps and SRF obstacles. An integral data organization mode is designed, and three potential routes are presented for users to access shareable global data in a cloud environment. Furthermore, using the global crust, atmosphere, DEMs, and satellite imagery as examples, the integrated visualization of large global objects is demonstrated.
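
    A toy sketch of the per-cell record table described above is given below; the grid-label scheme, timestamps, and attribute names are illustrative assumptions rather than the paper's actual encoding.

    from collections import defaultdict

    class EarthGridTable:
        """Per-cell table: a unique grid label maps to time-sliced attribute records."""

        def __init__(self):
            self._records = defaultdict(list)   # label -> [(timestamp, attributes), ...]

        def add(self, label, timestamp, **attributes):
            self._records[label].append((timestamp, attributes))

        def slices(self, label):
            """Return the time-ordered records for one grid cell."""
            return sorted(self._records[label], key=lambda rec: rec[0])

    table = EarthGridTable()
    table.add("O3-12-45-7", "2013-01-01", elevation_m=812.0, ndvi=0.34)
    table.add("O3-12-45-7", "2013-07-01", elevation_m=812.0, ndvi=0.58)
    print(table.slices("O3-12-45-7"))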

  6. The Reliability and Validity of Big Five Inventory Scores with African American College Students

    ERIC Educational Resources Information Center

    Worrell, Frank C.; Cross, William E., Jr.

    2004-01-01

    This article describes a study that examined the reliability and validity of scores on the Big Five Inventory (BFI; O. P. John, E. M. Donahue, & R. L. Kentle, 1991) in a sample of 336 African American college students. Results from the study indicated moderate reliability and structural validity for BFI scores. Additionally, BFI subscales had few…

  7. Characterization of Stream Morphology and Sediment Yield for the Big Black and Tombigbee River Basins, Mississippi

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Three segments within the Big Black River Basin, and nine within the Tombigbee River Basin are on the Mississippi 303d list of water bodies as having impaired conditions for aquatic life due to sediment. An additional 56 reaches of channel are listed for biologic impairment between the two basins. ...

  8. Design and development of a medical big data processing system based on Hadoop.

    PubMed

    Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song

    2015-03-01

    Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data demonstrates tendencies in hospital information technology and shows great significance for hospital information systems that are designing and expanding services. Big data has four characteristics--Volume, Variety, Velocity and Value (the 4 Vs)--that make traditional systems incapable of processing these data on standalone machines. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel with large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop framework that can scale up from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover some features of hospital information system user behaviors. This paper studies user behaviors regarding various data produced by different hospital information systems for daily work. In this paper, we also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient data processing with medical big data in healthcare services and clinical research compared with single nodes. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations.

  9. Design and development of a medical big data processing system based on Hadoop.

    PubMed

    Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song

    2015-03-01

    Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data demonstrates tendencies in hospital information technology and shows great significance for hospital information systems that are designing and expanding services. Big data has four characteristics--Volume, Variety, Velocity and Value (the 4 Vs)--that make traditional systems incapable of processing these data on standalone machines. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel with large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop framework that can scale up from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover some features of hospital information system user behaviors. This paper studies user behaviors regarding various data produced by different hospital information systems for daily work. In this paper, we also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient data processing with medical big data in healthcare services and clinical research compared with single nodes. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations. PMID:25666927
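
    As a rough sketch of the kind of MapReduce job these records describe (not the authors' actual code), a Hadoop Streaming job in Python that counts log events per hospital information system user might look as follows; the tab-separated log format and the file name job.py are assumptions.

    # job.py -- used both as mapper and reducer under Hadoop Streaming (assumed names).
    import sys

    def mapper():
        # Expects tab-separated log lines: user_id <TAB> system <TAB> timestamp ...
        for line in sys.stdin:
            fields = line.rstrip("\n").split("\t")
            if fields and fields[0]:
                print(f"{fields[0]}\t1")

    def reducer():
        # Hadoop Streaming sorts mapper output by key, so a running total suffices.
        current_user, count = None, 0
        for line in sys.stdin:
            user, value = line.rstrip("\n").split("\t")
            if user == current_user:
                count += int(value)
            else:
                if current_user is not None:
                    print(f"{current_user}\t{count}")
                current_user, count = user, int(value)
        if current_user is not None:
            print(f"{current_user}\t{count}")

    if __name__ == "__main__":
        role = sys.argv[1] if len(sys.argv) > 1 else "map"
        mapper() if role == "map" else reducer()

    Such a script would typically be launched through the Hadoop Streaming jar, passing the same file as both mapper (python3 job.py map) and reducer (python3 job.py reduce).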

  10. Energy scale of the Big Bounce

    SciTech Connect

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz

    2009-09-15

    We examine the nature of the cosmological Big Bounce transition within the loop geometry underlying loop quantum cosmology at classical and quantum levels. Our canonical quantization method is an alternative to the standard loop quantum cosmology. An evolution parameter we use has a clear interpretation. Our method opens the door for analyses of spectra of physical observables like the energy density and the volume operator. We find that one cannot determine the energy scale specific to the Big Bounce by making use of the loop geometry without an extra input from observational cosmology.

  11. How quantum is the big bang?

    PubMed

    Bojowald, Martin

    2008-06-01

    When quantum gravity is used to discuss the big bang singularity, the most important, though rarely addressed, question is what role genuine quantum degrees of freedom play. Here, complete effective equations are derived for isotropic models with an interacting scalar to all orders in the expansions involved. The resulting coupling terms show that quantum fluctuations do not affect the bounce much. Quantum correlations, however, do have an important role and could even eliminate the bounce. How quantum gravity regularizes the big bang depends crucially on properties of the quantum state. PMID:18643411

  12. Effective dynamics of the matrix big bang

    SciTech Connect

    Craps, Ben; Rajaraman, Arvind; Sethi, Savdeep

    2006-05-15

    We study the leading quantum effects in the recently introduced matrix big bang model. This amounts to a study of supersymmetric Yang-Mills theory compactified on the Milne orbifold. We find a one-loop potential that is attractive near the big bang. Surprisingly, the potential decays very rapidly at late times where it appears to be generated by D-brane effects. Usually, general covariance constrains the form of any effective action generated by renormalization group flow. However, the form of our one-loop potential seems to violate these constraints in a manner that suggests a connection between the cosmological singularity and long wavelength, late time physics.

  13. Harnessing the Heart of Big Data

    PubMed Central

    Scruggs, Sarah B.; Watson, Karol; Su, Andrew I.; Hermjakob, Henning; Yates, John R.; Lindsey, Merry L.; Ping, Peipei

    2015-01-01

    The exponential increase in Big Data generation combined with limited capitalization on the wealth of information embedded within Big Data has prompted us to revisit our scientific discovery paradigms. A successful transition into this digital era of medicine holds great promise for advancing fundamental knowledge in biology, innovating human health and driving personalized medicine; however, this will require a drastic shift of research culture in how we conceptualize science and use data. An e-transformation will require global adoption and synergism among computational science, biomedical research and clinical domains. PMID:25814682

  14. Livermore Big Trees Park: 1998 Results

    SciTech Connect

    Mac Queen, D; Gallegos, G; Surano, K

    2002-04-18

    This report is an in-depth study of results from environmental sampling conducted in 1998 by the Lawrence Livermore National Laboratory (LLNL) at Big Trees Park in the city of Livermore. The purpose of the sampling was to determine the extent and origin of plutonium found in soil at concentrations above fallout-background levels in the park. This report describes the sampling that was conducted, the chemical and radio-chemical analyses of the samples, the quality control assessments and statistical analyses of the analytical results, and LLNL's interpretations of the results. It includes a number of data analyses not presented in LLNL's previous reports on Big Trees Park.

  15. How quantum is the big bang?

    PubMed

    Bojowald, Martin

    2008-06-01

    When quantum gravity is used to discuss the big bang singularity, the most important, though rarely addressed, question is what role genuine quantum degrees of freedom play. Here, complete effective equations are derived for isotropic models with an interacting scalar to all orders in the expansions involved. The resulting coupling terms show that quantum fluctuations do not affect the bounce much. Quantum correlations, however, do have an important role and could even eliminate the bounce. How quantum gravity regularizes the big bang depends crucially on properties of the quantum state.

  16. Stellar photometry with big pixels

    SciTech Connect

    Buonanno, R.; Iannicola, G. (European Southern Observatory, Garching)

    1989-03-01

    New software for stellar photometry in crowded fields is presented. This software overcomes the limitations present in a traditional package like ROMAFOT when the pixel size of the detector is comparable to the scale length of point images. This is the case, for instance, with the Hubble Space Telescope-Wide Field Camera and, partially, with the Planetary Camera. The numerical solution presented here is compared to the technical solution of obtaining more exposures of the same field, each shifted by a fraction of a pixel. This software will be available in MIDAS. 11 refs.
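
    The abstract does not spell out the algorithm, so the sketch below shows one standard way of handling pixels whose size is comparable to the point-spread function: fitting a pixel-integrated Gaussian PSF in which each pixel value is the PSF integrated over the pixel area via error functions. It is a generic illustration, not the ROMAFOT-derived method itself, and the toy image and fit parameters are assumptions.

    import numpy as np
    from scipy.special import erf
    from scipy.optimize import curve_fit

    def pixel_integrated_gaussian(coords, flux, x0, y0, sigma, sky):
        """Flux per pixel for a Gaussian star integrated over unit-size pixels."""
        x, y = coords
        def cdf(u, u0):   # integral of a 1-D Gaussian over [u - 0.5, u + 0.5]
            a = (u - u0 + 0.5) / (np.sqrt(2) * sigma)
            b = (u - u0 - 0.5) / (np.sqrt(2) * sigma)
            return 0.5 * (erf(a) - erf(b))
        return flux * cdf(x, x0) * cdf(y, y0) + sky

    # Toy undersampled image: sigma comparable to the pixel size.
    ny = nx = 9
    yy, xx = np.mgrid[0:ny, 0:nx].astype(float)
    rng = np.random.default_rng(3)
    truth = pixel_integrated_gaussian((xx, yy), 5000.0, 4.3, 3.8, 0.9, 10.0)
    image = rng.poisson(truth).astype(float)

    popt, _ = curve_fit(pixel_integrated_gaussian,
                        (xx.ravel(), yy.ravel()), image.ravel(),
                        p0=(image.sum(), nx / 2, ny / 2, 1.0, np.median(image)))
    print("fitted flux, x0, y0, sigma, sky:", np.round(popt, 2))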

  17. Big Events in Greece and HIV Infection Among People Who Inject Drugs.

    PubMed

    Nikolopoulos, Georgios K; Sypsa, Vana; Bonovas, Stefanos; Paraskevis, Dimitrios; Malliori-Minerva, Melpomeni; Hatzakis, Angelos; Friedman, Samuel R

    2015-01-01

    Big Events are processes like macroeconomic transitions that have lowered social well-being in various settings in the past. Greece has been hit by the global crisis and experienced an HIV outbreak among people who inject drugs. Since the crisis began (2008), Greece has seen population displacement, inter-communal violence, cuts in governmental expenditures, and social movements. These may have affected normative regulation, networks, and behaviors. However, most pathways to risk remain unknown or unmeasured. We use what is known and unknown about the Greek HIV outbreak to suggest modifications in Big Events models and the need for additional research.

  18. Big Events in Greece and HIV Infection Among People Who Inject Drugs

    PubMed Central

    Nikolopoulos, Georgios K.; Sypsa, Vana; Bonovas, Stefanos; Paraskevis, Dimitrios; Malliori-Minerva, Melpomeni; Hatzakis, Angelos; Friedman, Samuel R.

    2015-01-01

    Big Events are processes like macroeconomic transitions that have lowered social well-being in various settings in the past. Greece has been hit by the global crisis and experienced an HIV outbreak among people who inject drugs. Since the crisis began (2008), Greece has seen population displacement, inter-communal violence, cuts in governmental expenditures, and social movements. These may have affected normative regulation, networks, and behaviors. However, most pathways to risk remain unknown or unmeasured. We use what is known and unknown about the Greek HIV outbreak to suggest modifications in Big Events models and the need for additional research. PMID:25723309

  19. Big Creek Hydroelectric System, East & West Transmission Line, 241-mile ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Big Creek Hydroelectric System, East & West Transmission Line, 241-mile transmission corridor extending between the Big Creek Hydroelectric System in the Sierra National Forest in Fresno County and the Eagle Rock Substation in Los Angeles, California, Visalia, Tulare County, CA

  20. What's the Big Sweat about Dehydration? (For Kids)

    MedlinePlus


  1. "Small Steps, Big Rewards": Preventing Type 2 Diabetes

    MedlinePlus

    These are the plain facts in "Small Steps. Big Rewards: Prevent Type 2 Diabetes," an education campaign ...

  2. Device Data Ingestion for Industrial Big Data Platforms with a Case Study.

    PubMed

    Ji, Cun; Shao, Qingshi; Sun, Jiao; Liu, Shijun; Pan, Li; Wu, Lei; Yang, Chenglei

    2016-01-01

    Despite having played a significant role in the Industry 4.0 era, the Internet of Things is currently faced with the challenge of how to ingest large-scale heterogeneous and multi-type device data. In response to this problem we present a heterogeneous device data ingestion model for an industrial big data platform. The model includes device templates and four strategies for data synchronization, data slicing, data splitting and data indexing, respectively. We can ingest device data from multiple sources with this heterogeneous device data ingestion model, which has been verified on our industrial big data platform. In addition, we present a case study on device data-based scenario analysis of industrial big data. PMID:26927121

  3. Device Data Ingestion for Industrial Big Data Platforms with a Case Study †

    PubMed Central

    Ji, Cun; Shao, Qingshi; Sun, Jiao; Liu, Shijun; Pan, Li; Wu, Lei; Yang, Chenglei

    2016-01-01

    Despite having played a significant role in the Industry 4.0 era, the Internet of Things is currently faced with the challenge of how to ingest large-scale heterogeneous and multi-type device data. In response to this problem we present a heterogeneous device data ingestion model for an industrial big data platform. The model includes device templates and four strategies for data synchronization, data slicing, data splitting and data indexing, respectively. We can ingest device data from multiple sources with this heterogeneous device data ingestion model, which has been verified on our industrial big data platform. In addition, we present a case study on device data-based scenario analysis of industrial big data. PMID:26927121

  4. Device Data Ingestion for Industrial Big Data Platforms with a Case Study.

    PubMed

    Ji, Cun; Shao, Qingshi; Sun, Jiao; Liu, Shijun; Pan, Li; Wu, Lei; Yang, Chenglei

    2016-02-26

    Despite having played a significant role in the Industry 4.0 era, the Internet of Things is currently faced with the challenge of how to ingest large-scale heterogeneous and multi-type device data. In response to this problem we present a heterogeneous device data ingestion model for an industrial big data platform. The model includes device templates and four strategies for data synchronization, data slicing, data splitting and data indexing, respectively. We can ingest device data from multiple sources with this heterogeneous device data ingestion model, which has been verified on our industrial big data platform. In addition, we present a case study on device data-based scenario analysis of industrial big data.
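
    A toy sketch of the ingestion ideas these records describe (a device template that normalizes heterogeneous readings, followed by a time-based data-slicing step) is shown below; the field names, template contents, and ten-minute window are illustrative assumptions, not the platform's actual design.

    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import Callable, Dict, List

    @dataclass
    class DeviceTemplate:
        device_type: str
        field_map: Dict[str, str]                  # raw field name -> canonical name
        parsers: Dict[str, Callable[[str], float]]

        def normalize(self, raw: Dict[str, str]) -> Dict[str, float]:
            out = {}
            for raw_name, canonical in self.field_map.items():
                if raw_name in raw:
                    out[canonical] = self.parsers.get(canonical, float)(raw[raw_name])
            return out

    def slice_by_window(records: List[dict], minutes: int = 10) -> Dict[datetime, List[dict]]:
        """Group normalized records into fixed time windows (the data-slicing step)."""
        window = timedelta(minutes=minutes)
        slices: Dict[datetime, List[dict]] = {}
        epoch = datetime(1970, 1, 1)
        for rec in records:
            start = epoch + window * ((rec["ts"] - epoch) // window)
            slices.setdefault(start, []).append(rec)
        return slices

    template = DeviceTemplate("vibration_sensor",
                              {"VIB_X": "vibration_x", "TEMPC": "temperature_c"},
                              {"temperature_c": float})
    raw = {"VIB_X": "0.42", "TEMPC": "36.1"}
    record = {"ts": datetime(2016, 1, 1, 8, 7), **template.normalize(raw)}
    print(slice_by_window([record]))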

  5. Geologic map of Big Bend National Park, Texas

    USGS Publications Warehouse

    Turner, Kenzie J.; Berry, Margaret E.; Page, William R.; Lehman, Thomas M.; Bohannon, Robert G.; Scott, Robert B.; Miggins, Daniel P.; Budahn, James R.; Cooper, Roger W.; Drenth, Benjamin J.; Anderson, Eric D.; Williams, Van S.

    2011-01-01

    The purpose of this map is to provide the National Park Service and the public with an updated digital geologic map of Big Bend National Park (BBNP). The geologic map report of Maxwell and others (1967) provides a fully comprehensive account of the important volcanic, structural, geomorphological, and paleontological features that define BBNP. However, the map is on a geographically distorted planimetric base and lacks topography, which has caused difficulty in conducting GIS-based data analyses and georeferencing the many geologic features investigated and depicted on the map. In addition, the map is outdated, excluding significant data from numerous studies that have been carried out since its publication more than 40 years ago. This report includes a modern digital geologic map that can be utilized with standard GIS applications to aid BBNP researchers in geologic data analysis, natural resource and ecosystem management, monitoring, assessment, inventory activities, and educational and recreational uses. The digital map incorporates new data, many revisions, and greater detail than the original map. Although some geologic issues remain unresolved for BBNP, the updated map serves as a foundation for addressing those issues. Funding for the Big Bend National Park geologic map was provided by the United States Geological Survey (USGS) National Cooperative Geologic Mapping Program and the National Park Service. The Big Bend mapping project was administered by staff in the USGS Geology and Environmental Change Science Center, Denver, Colo. Members of the USGS Mineral and Environmental Resources Science Center completed investigations in parallel with the geologic mapping project. Results of these investigations addressed some significant current issues in BBNP and the U.S.-Mexico border region, including contaminants and human health, ecosystems, and water resources. Funding for the high-resolution aeromagnetic survey in BBNP, and associated data analyses and

  6. Early experiences with big data at an academic medical center.

    PubMed

    Halamka, John D

    2014-07-01

    Beth Israel Deaconess Medical Center (BIDMC), an academic health care institution affiliated with Harvard University, has been an early adopter of electronic applications since the 1970s. Various departments of the medical center and the physician practice groups affiliated with it have implemented electronic health records, filmless imaging, and networked medical devices to such an extent that data storage at BIDMC now amounts to three petabytes and continues to grow at a rate of 25 percent a year. Initially, the greatest technical challenge was the cost and complexity of data storage. However, today the major focus is on transforming raw data into information, knowledge, and wisdom. This article discusses the data growth, increasing importance of analytics, and changing user requirements that have shaped the management of big data at BIDMC.

  7. Big-Time Fundraising for Today's Schools

    ERIC Educational Resources Information Center

    Levenson, Stanley

    2006-01-01

    In this enlightening book, nationally recognized author and fundraising consultant Stanley Levenson shows school leaders how to move away from labor-intensive, nickel-and-dime bake sales and car washes, and into the world of big-time fundraising. Following the model used by colleges and universities, the author presents a wealth of practical…

  8. Big-Time Sports in American Universities

    ERIC Educational Resources Information Center

    Clotfelter, Charles T.

    2011-01-01

    For almost a century, big-time college sports has been a wildly popular but consistently problematic part of American higher education. The challenges it poses to traditional academic values have been recognized from the start, but they have grown more ominous in recent decades, as cable television has become ubiquitous, commercial opportunities…

  9. Big physics quartet win government backing

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2014-09-01

    Four major physics-based projects are among 10 to have been selected by Japan’s Ministry of Education, Culture, Sports, Science and Technology for funding in the coming decade as part of its “roadmap” of big-science projects.

  10. Integrating "big data" into surgical practice.

    PubMed

    Mathias, Brittany; Lipori, Gigi; Moldawer, Lyle L; Efron, Philip A

    2016-02-01

    'Big data' is the next frontier of medicine. We now have the ability to generate and analyze large quantities of healthcare data. Although interpreting and integrating this information into clinical practice poses many challenges, the potential benefits of personalized medicine are seemingly without limit.

  11. A Big Problem for Magellan: Food Preservation

    ERIC Educational Resources Information Center

    Galvao, Cecilia; Reis, Pedro; Freire, Sofia

    2008-01-01

    In this paper, we present data related to how a Portuguese teacher developed the module "A big problem for Magellan: Food preservation." Students were asked to plan an investigation in order to identify which were the best food preservation methods in the XV and XVI centuries of Portuguese overseas navigation, and then establish a parallel between…

  12. Challenges of Big Data in Educational Assessment

    ERIC Educational Resources Information Center

    Gibson, David C.; Webb, Mary; Ifenthaler, Dirk

    2015-01-01

    This paper briefly discusses four measurement challenges of data science or "big data" in educational assessments that are enabled by technology: 1. Dealing with change over time via time-based data. 2. How a digital performance space's relationships interact with learner actions, communications and products. 3. How layers of…

  13. Big Island Demonstration Project - Black Liquor

    SciTech Connect

    2006-08-01

    Black liquor is a papermaking byproduct that also serves as a fuel for pulp and paper mills. This project involves the design, construction, and operation of a black liquor gasifier that will be integrated into Georgia-Pacific's Big Island facility in Virginia, a mill that has been in operation for more than 100 years.

  14. Marketing Your Library with the Big Read

    ERIC Educational Resources Information Center

    Johnson, Wendell G.

    2012-01-01

    The Big Read was developed by the National Endowment for the Arts to revitalize the role of culture in American society and encourage the reading of landmark literature. Each year since 2007, the DeKalb Public Library, Northern Illinois University, and Kishwaukee Community College have partnered to foster literacy in the community. This article…

  15. More on Sports and the Big6.

    ERIC Educational Resources Information Center

    Eisenberg, Mike

    1998-01-01

    Presents strategies for relating the Big6 information problem-solving process to sports to gain students' attention, sustain it, and make instruction relevant to their interests. Lectures by coaches, computer-based sports games, sports information sources, the use of technology in sports, and judging sports events are discussed. (LRW)

  16. Data Needs for Big City Schools.

    ERIC Educational Resources Information Center

    Eubanks, Eugene E.

    Public schools in the big cities and urban areas will become proportionally more minority and poor in the 1980's and 1990's. The traditional measures used to collect data on minority population have proved to be inaccurate. The following items are needed and will be of value to people working in urban public schools: (1) data which distinguish…

  17. The Big Ideas behind Whole System Reform

    ERIC Educational Resources Information Center

    Fullan, Michael

    2010-01-01

    Whole system reform means that every vital part of the system--school, community, district, and government--contributes individually and in concert to forward movement and success, using practice, not research, as the driver of reform. With this in mind, several "big ideas", based on successful implementation, informed Ontario's reform strategy:…

  18. Science Literacy Circles: Big Ideas about Science

    ERIC Educational Resources Information Center

    Devick-Fry, Jane; LeSage, Teresa

    2010-01-01

    Science literacy circles incorporate the organization of both science notebooks and literature circles to help K-8 students internalize big ideas about science. Using science literacy circles gives students opportunities to engage in critical thinking as they inductively develop understanding about science concepts. (Contains 1 table and 7…

  19. Big Broadband Connectivity in the United States

    ERIC Educational Resources Information Center

    Windhausen, John, Jr.

    2008-01-01

    The economic and social future of the United States depends on answering the growing demand for very high-speed broadband connectivity, a capability termed "big broadband." Failure to take on the challenge could lead to a decline in global competitiveness and an inability to educate students. (Contains 20 notes.)

  20. Big Bubbles in Boiling Liquids: Students' Views

    ERIC Educational Resources Information Center

    Costu, Bayram

    2008-01-01

    The aim of this study was to elicit students' conceptions about big bubbles in boiling liquids (water, ethanol and aqueous CuSO[subscript 4] solution). The study is based on twenty-four students at different ages and grades. The clinical interviews technique was conducted to solicit students' conceptions and the interviews were analyzed to…

  1. The Big Island of Hawaii

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Boasting snow-covered mountain peaks and tropical forest, the Island of Hawaii, the largest of the Hawaiian Islands, is stunning at any altitude. This false-color composite (processed to simulate true color) image of Hawaii was constructed from data gathered between 1999 and 2001 by the Enhanced Thematic Mapper plus (ETM+) instrument, flying aboard the Landsat 7 satellite. The Landsat data were processed by the National Oceanic and Atmospheric Administration (NOAA) to develop a land cover map. This map will be used as a baseline to chart changes in land use on the islands. Types of change include the construction of resorts along the coastal areas, and the conversion of sugar plantations to other crop types. Hawaii was created by a 'hotspot' beneath the ocean floor. Hotspots form in areas where superheated magma in the Earth's mantle breaks through the Earth's crust. Over the course of millions of years, the Pacific Tectonic Plate has slowly moved over this hotspot to form the entire Hawaiian Island archipelago. The black areas on the island (in this scene) that resemble a pair of sun-baked palm fronds are hardened lava flows formed by the active Mauna Loa Volcano. Just to the north of Mauna Loa is the dormant grayish Mauna Kea Volcano, which hasn't erupted in an estimated 3,500 years. A thin greyish plume of smoke is visible near the island's southeastern shore, rising from Kilauea, the most active volcano on Earth. Heavy rainfall and fertile volcanic soil have given rise to Hawaii's lush tropical forests, which appear as solid dark green areas in the image. The light green, patchy areas near the coasts are likely sugar cane plantations, pineapple farms, and human settlements. Courtesy of the NOAA Coastal Services Center Hawaii Land Cover Analysis project

  2. Black Hole Blows Big Bubble

    NASA Astrophysics Data System (ADS)

    2010-07-01

    astronomers understand the similarity between small black holes formed from exploded stars and the supermassive black holes at the centres of galaxies. Very powerful jets have been seen from supermassive black holes, but are thought to be less frequent in the smaller microquasar variety. The new discovery suggests that many of them may simply have gone unnoticed so far. The gas-blowing black hole is located 12 million light-years away, in the outskirts of the spiral galaxy NGC 7793 (eso0914b). From the size and expansion velocity of the bubble the astronomers have found that the jet activity must have been ongoing for at least 200 000 years. Note: [1] Astronomers do not yet have any means of measuring the size of the black hole itself. The smallest stellar black hole discovered so far has a radius of about 15 km. An average stellar black hole of about 10 solar masses has a radius of about 30 km, while a "big" stellar black hole may have a radius of up to 300 km. This is still much smaller than the jets, which extend out to 1000 light-years, or about 9000 million million km! More Information: This result appears in a paper published in this week's issue of the journal Nature (A 300 parsec long jet-inflated bubble around a powerful microquasar in the galaxy NGC 7793, by Manfred W. Pakull, Roberto Soria and Christian Motch). ESO, the European Southern Observatory, is the foremost intergovernmental astronomy organisation in Europe and the world's most productive astronomical observatory. It is supported by 14 countries: Austria, Belgium, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates

  3. Black Hole Blows Big Bubble

    NASA Astrophysics Data System (ADS)

    2010-07-01

    astronomers understand the similarity between small black holes formed from exploded stars and the supermassive black holes at the centres of galaxies. Very powerful jets have been seen from supermassive black holes, but are thought to be less frequent in the smaller microquasar variety. The new discovery suggests that many of them may simply have gone unnoticed so far. The gas-blowing black hole is located 12 million light-years away, in the outskirts of the spiral galaxy NGC 7793 (eso0914b). From the size and expansion velocity of the bubble the astronomers have found that the jet activity must have been ongoing for at least 200 000 years. Notes: [1] Astronomers do not yet have any means of measuring the size of the black hole itself. The smallest stellar black hole discovered so far has a radius of about 15 km. An average stellar black hole of about 10 solar masses has a radius of about 30 km, while a "big" stellar black hole may have a radius of up to 300 km. This is still much smaller than the jets, which extend out to several hundred light-years on each side of the black hole, or about several thousand million million km! More information: This result appears in a paper published in this week's issue of the journal Nature (A 300 parsec long jet-inflated bubble around a powerful microquasar in the galaxy NGC 7793, by Manfred W. Pakull, Roberto Soria and Christian Motch). ESO, the European Southern Observatory, is the foremost intergovernmental astronomy organisation in Europe and the world's most productive astronomical observatory. It is supported by 14 countries: Austria, Belgium, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising

  4. Assessment of acreage and vegetation change in Florida's Big Bend tidal wetlands using satellite imagery

    USGS Publications Warehouse

    Raabe, Ellen A.; Stumpf, Richard P.

    1997-01-01

    Fluctuations in sea level and impending development on the west coast of Florida have aroused concern for the relatively pristine tidal marshes of the Big Bend. Landsat Thematic Mapper (TM) images for 1986 and 1995 are processed and evaluated for signs of change. The images cover 250 km of Florida's Big Bend Gulf Coast, encompassing 160,000 acres of tidal marshes. Change is detected using the normalized difference vegetation index (NDVI) and land cover classification. The imagery shows negligible net loss or gain in the marsh over the 9-year period. However, regional changes in biomass are apparent and are due to natural disturbances such as low winter temperatures, fire, storm surge, and the conversion of forest to marsh. Within the marsh, the most prominent changes in NDVI and in land cover result from the recovery of mangroves from freezes, a decline of transitional upland vegetation, and susceptibility of the marsh edge and interior to variations in tidal flooding.
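
    The NDVI used in the abstract is the standard normalized difference of near-infrared and red reflectance. As a minimal, hedged illustration (synthetic arrays, an invented change threshold, and hypothetical band variable names), the following Python sketch differences NDVI between two dates the way a simple two-date change analysis might:

      # Minimal sketch of two-date NDVI change detection on co-registered band arrays.
      # The 0.2 change threshold and the random test data are illustrative only.
      import numpy as np

      def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
          """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
          nir, red = nir.astype(float), red.astype(float)
          return (nir - red) / np.maximum(nir + red, 1e-6)   # avoid divide-by-zero

      rng = np.random.default_rng(0)
      nir_1986, red_1986 = rng.uniform(0.2, 0.6, (100, 100)), rng.uniform(0.05, 0.2, (100, 100))
      nir_1995, red_1995 = rng.uniform(0.2, 0.6, (100, 100)), rng.uniform(0.05, 0.2, (100, 100))

      delta = ndvi(nir_1995, red_1995) - ndvi(nir_1986, red_1986)   # positive = biomass gain
      changed = np.abs(delta) > 0.2                                 # flag substantial change
      print(f"{changed.mean():.1%} of marsh pixels flagged as changed")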

  5. 8. NORTHERLY VIEW OF THE DOWNSTREAM ELEVATION OF BIG DALTON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. NORTHERLY VIEW OF THE DOWNSTREAM ELEVATION OF BIG DALTON DAM SHOWING THE HOLLOW BAYS 2, 3, 4, 5, AND 6 AND THE PLUNGE POOL IN THE FOREGROUND. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  6. Big Ideas in Primary Mathematics: Issues and Directions

    ERIC Educational Resources Information Center

    Askew, Mike

    2013-01-01

    This article is located within the literature arguing for attention to Big Ideas in teaching and learning mathematics for understanding. The focus is on surveying the literature of Big Ideas and clarifying what might constitute Big Ideas in the primary Mathematics Curriculum based on both theoretical and pragmatic considerations. This is…

  7. 76 FR 26240 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-06

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming..., 2011, and will begin at 10 a.m. ADDRESSES: The meeting will be held at the Big Horn County Weed...

  8. 78 FR 33326 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-04

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming... at 3:00 p.m. ADDRESSES: The meeting will be held at Big Horn County Weed and Pest Building,...

  9. 76 FR 47141 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-04

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ] ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming..., 2011 and will begin at 3 p.m. ADDRESSES: The meeting will be held at the Big Horn County Weed and...

  10. 76 FR 7810 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Lovell, Wyoming..., 2011, and will begin at 10 a.m. ADDRESSES: The meeting will be held at the Big Horn Federal...

  11. 77 FR 49779 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-17

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming... 11, 2012 and will begin at 3 p.m. ADDRESSES: The meeting will be held at the Big Horn County Weed...

  12. 75 FR 71069 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming... December 1, 2010, and will begin at 10 a.m. ADDRESSES: The meeting will be held at the Big Horn County...

  13. Sports and the Big6: The Information Advantage.

    ERIC Educational Resources Information Center

    Eisenberg, Mike

    1997-01-01

    Explores the connection between sports and the Big6 information problem-solving process and how sports provides an ideal setting for learning and teaching about the Big6. Topics include information aspects of baseball, football, soccer, basketball, figure skating, track and field, and golf; and the Big6 process applied to sports. (LRW)

  14. ["Big data" - large data, a lot of knowledge?].

    PubMed

    Hothorn, Torsten

    2015-01-28

    For the past several years, the term Big Data has described technologies for extracting knowledge from data. Applications of Big Data and their consequences are also increasingly discussed in the mass media. Because medicine is an empirical science, we discuss the meaning of Big Data and its potential for future medical research.

  15. 11. VIEW OF UPSTREAM ELEVATION OF BIG TUJUNGA DAM SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. VIEW OF UPSTREAM ELEVATION OF BIG TUJUNGA DAM SHOWING CONSTRUCTION OF THE ARCH, TAKEN ON NOVEMBER 26, 1930, (PHOTOGRAPHER UNKNOWN). PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON JUNE 5, 1973, BY PHOTOGRAPHER GATSON OF L.A. COUNTY PUBLIC WORKS. - Big Tujunga Dam, 809 West Big Tujunga Road, Sunland, Los Angeles County, CA

  16. 14. VIEW OF UPSTREAM ELEVATION SHOWING CONSTRUCTION OF BIG TUJUNGA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. VIEW OF UPSTREAM ELEVATION SHOWING CONSTRUCTION OF BIG TUJUNGA DAM, TAKEN ON MAY 27, 1931, (PHOTOGRAPHER UNKNOWN). PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON JUNE 5, 1973, BY PHOTOGRAPHER GATSON OF L.A. COUNTY PUBLIC WORKS. - Big Tujunga Dam, 809 West Big Tujunga Road, Sunland, Los Angeles County, CA

  17. 12. VIEW OF UPSTREAM ELEVATION OF BIG TUJUNGA DAM SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. VIEW OF UPSTREAM ELEVATION OF BIG TUJUNGA DAM SHOWING CONSTRUCTION OF THE ARCH, TAKEN ON JANUARY 28, 1931, (PHOTOGRAPHER UNKNOWN). PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON JUNE 5, 1973, BY PHOTOGRAPHER GATSON OF L.A. COUNTY PUBLIC WORKS. - Big Tujunga Dam, 809 West Big Tujunga Road, Sunland, Los Angeles County, CA

  18. 7. SOUTHEAST VIEW OF BIG DALTON DAM SHOWING THE MULTIPLE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. SOUTHEAST VIEW OF BIG DALTON DAM SHOWING THE MULTIPLE ARCHES, AN UPSTREAM VIEW OF THE PARAPET WALL ALONG THE CREST OF THE DAM, AND THE SHELTER HOUSE AT THE EAST END OF THE DAM. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  19. 11. VIEW OF UPSTREAM ELEVATION OF BIG DALTON DAM SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. VIEW OF UPSTREAM ELEVATION OF BIG DALTON DAM SHOWING CONSTRUCTION OF THE ARCH WALLS, TAKEN ON SEPTEMBER 11, 1928 (PHOTOGRAPHER UNKNOWN). PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON 6/5/1973 BY PHOTOGRAPHER GATSON OF L.A. COUNTY PUBLIC WORKS. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  20. 15. UPSTREAM VIEW (PHOTOGRAPHER UNKNOWN) SHOWING BIG DALTON DAM NEAR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. UPSTREAM VIEW (PHOTOGRAPHER UNKNOWN) SHOWING BIG DALTON DAM NEAR FULL CAPACITY AFTER CONSTRUCTION. PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON 2-15-1973 BY PHOTOGRAPHER D. MEIER OF L.A. COUNTY PUBLIC WORKS. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  1. 16. AERIAL VIEW OF BIG DALTON DAM TAKEN ON 2-16-1962 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. AERIAL VIEW OF BIG DALTON DAM TAKEN ON 2-16-1962 BY L.A. COUNTY PUBLIC WORKS PHOTOGRAPHER SINGER. PHOTO SHOWS THE RESERVOIR NEAR FULL CAPACITY AND WATER BEING RELEASED ON THE DOWNSTREAM SIDE. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  2. 13. VIEW OF DOWNSTREAM ELEVATION OF BIG DALTON DAM SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. VIEW OF DOWNSTREAM ELEVATION OF BIG DALTON DAM SHOWING CONSTRUCTION OF THE ARCHES AND ARCH WALLS TAKEN IN 1928-1929 (PHOTOGRAPHER UNKNOWN). PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON 2-15-1973 BY PHOTOGRAPHER D. MEIER OF L.A. COUNTY PUBLIC WORKS. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  3. Clarity and causality needed in claims about Big Gods.

    PubMed

    Watts, Joseph; Bulbulia, Joseph; Gray, Russell D; Atkinson, Quentin D

    2016-01-01

    We welcome Norenzayan et al.'s claim that the prosocial effects of beliefs in supernatural agents extend beyond Big Gods. To date, however, supporting evidence has focused on the Abrahamic Big God, making generalisations difficult. We discuss a recent study that highlights the need for clarity about the causal path by which supernatural beliefs affect the evolution of big societies. PMID:26948745

  4. View of New Big Oak Flat Road seen from Old ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of New Big Oak Flat Road seen from Old Wawona Road near location of photograph HAER CA-148-17. Note road cuts, alignment, and tunnels. Devils Dance Floor at left distance. Looking northwest - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  5. 76 FR 7837 - Big Rivers Electric Corporation; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing Take notice that on February 4, 2011, Big Rivers Electric Corporation (Big Rivers) filed a notice of cancellation of its...

  6. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    PubMed

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: big data will give you new insights, allow you to become more efficient, and/or solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the silver bullet it has been touted to be. Here our main concern is the overall impact of big data; its current manifestation is constructing a Maginot Line in science in the 21st century. Big data is no longer simply "lots of data" as a phenomenon; the big data paradigm puts the spirit of the Maginot Line into lots of data. Overall, big data is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking to problem definition in order to address science challenges.

  7. The Ethics of Big Data: Current and Foreseeable Issues in Biomedical Contexts.

    PubMed

    Mittelstadt, Brent Daniel; Floridi, Luciano

    2016-04-01

    The capacity to collect and analyse data is growing exponentially. Referred to as 'Big Data', this scientific, social and technological trend has helped create destabilising amounts of information, which can challenge accepted social and ethical norms. Big Data remains a fuzzy idea, emerging across social, scientific, and business contexts sometimes seemingly related only by the gigantic size of the datasets being considered. As is often the case with the cutting edge of scientific and technological progress, understanding of the ethical implications of Big Data lags behind. In order to bridge such a gap, this article systematically and comprehensively analyses academic literature concerning the ethical implications of Big Data, providing a watershed for future ethical investigations and regulations. Particular attention is paid to biomedical Big Data due to the inherent sensitivity of medical information. By means of a meta-analysis of the literature, a thematic narrative is provided to guide ethicists, data scientists, regulators and other stakeholders through what is already known or hypothesised about the ethical risks of this emerging and innovative phenomenon. Five key areas of concern are identified: (1) informed consent, (2) privacy (including anonymisation and data protection), (3) ownership, (4) epistemology and objectivity, and (5) 'Big Data Divides' created between those who have or lack the necessary resources to analyse increasingly large datasets. Critical gaps in the treatment of these themes are identified with suggestions for future research. Six additional areas of concern are then suggested which, although related have not yet attracted extensive debate in the existing literature. It is argued that they will require much closer scrutiny in the immediate future: (6) the dangers of ignoring group-level ethical harms; (7) the importance of epistemology in assessing the ethics of Big Data; (8) the changing nature of fiduciary relationships that


  9. Multi-Scale Change Detection Research of Remotely Sensed Big Data in CyberGIS

    NASA Astrophysics Data System (ADS)

    Xing, J.; Sieber, R.

    2015-12-01

    Big remotely sensed data, with its heterogeneity of satellite platforms and file formats and its increasing volumes and velocities, offers new types of analyses. This makes big remotely sensed data a good candidate for CyberGIS, the aim of which is to enable knowledge discovery from big data in the cloud. We apply CyberGIS to feature-based multi-scale land use/cover change (LUCC) detection. There have been attempts to do multi-scale LUCC; however, these studies used small data and could not consider the mismatch between multi-scale analysis and computational scale. They have yet to consider the possibilities for scalar research across numerous temporal and spatial scales afforded by big data, especially if we want to advance beyond pixel-based analysis and also reduce preprocessing requirements. We create a geospatial cyberinfrastructure (GCI) to handle multi-spatio-temporal scale change detection. We first clarify the different meanings of scale in CyberGIS and LUCC to derive a feature scope layer in the GCI based on Stommel modelling. Our analysis layer contains a multi-scale segmentation-based method built on normalized cut image segmentation and wavelet-based image scaling algorithms. Our computer resource utilization layer uses Wang and Armstrong's (2009) method, mainly for memory, I/O and CPU time. Our case is urban-rural change detection in the Greater Montreal Area (5 time periods, 2006-2012, 100 virtual machines), covering 36,000 km2 at resolutions varying from 0.6 m to 38 m. We present a ground-truthed accuracy assessment of a change matrix composed of 6 feature classes at 12 different spatio-temporal scales, and the performance of the change detection GCI for multi-scale LUCC study.
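
    As a rough, hedged illustration of what "multi-scale" means computationally (the paper's actual normalized-cut segmentation and wavelet scaling are not reproduced here), the sketch below simply re-aggregates two co-registered images to coarser grids by block averaging and reports the changed fraction at each scale. All thresholds and data are invented.

      # Illustrative multi-scale change summary: block averaging stands in for the
      # wavelet-based image scaling described in the abstract; numbers are synthetic.
      import numpy as np

      def downscale(img: np.ndarray, factor: int) -> np.ndarray:
          """Average non-overlapping factor x factor blocks (a coarser spatial scale)."""
          h = (img.shape[0] // factor) * factor
          w = (img.shape[1] // factor) * factor
          return img[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

      def changed_fraction(img_a: np.ndarray, img_b: np.ndarray, factor: int, thresh: float) -> float:
          """Fraction of coarse cells whose mean value differs by more than thresh."""
          a, b = downscale(img_a, factor), downscale(img_b, factor)
          return float((np.abs(a - b) > thresh).mean())

      rng = np.random.default_rng(1)
      t1, t2 = rng.random((512, 512)), rng.random((512, 512))
      for factor in (1, 4, 16, 64):        # pixel scale up toward landscape scale
          print(factor, round(changed_fraction(t1, t2, factor, thresh=0.1), 3))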

  10. Nonsingular big bounces and the evolution of linear fluctuations

    NASA Astrophysics Data System (ADS)

    Hwang, Jai-Chan; Noh, Hyerim

    2002-06-01

    We consider the evolutions of linear fluctuations as the background Friedmann world model goes from contracting to expanding phases through smooth and nonsingular bouncing phases. As long as gravity dominates over the pressure gradient in the perturbation equation, the growing mode in the expanding phase is characterized by a conserved amplitude; we call this a C mode. In spherical geometry with a pressureless medium, we show that there exists a special gauge-invariant combination Φ which stays constant throughout the evolution from the big bang to the big crunch, with the same value even after the bounce: it characterizes the coefficient of the C mode. We show this result by using a bounce model where the pressure gradient term is negligible during the bounce; this requires the additional presence of exotic matter. In such a bounce, even in more general situations for the equation of state before and after the bounce, the C mode in the expanding phase is affected only by the C mode in the contracting phase; thus the growing mode in the contracting phase decays away as the world model enters the expanding phase. When the background curvature plays a significant role during the bounce, the pressure gradient term becomes important and we cannot trace the C mode in the expanding phase to the one before the bounce. In such situations, perturbations in a fluid bounce model show exponential instability, whereas perturbations in a scalar field bounce model show oscillatory behavior.
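
    For orientation only: in flat Friedmann models the combination below is the standard large-scale conserved quantity for adiabatic perturbations (Φ the metric perturbation, H the Hubble parameter, w the equation-of-state parameter). Whether it coincides with the authors' gauge-invariant Φ in their spherical, pressureless bounce model cannot be determined from the abstract alone.

      % Standard super-horizon conserved curvature perturbation in a flat FRW background,
      % shown as background context; not necessarily the combination used in the paper.
      \[
        \zeta \;\equiv\; \Phi \;+\; \frac{2}{3}\,\frac{H^{-1}\dot{\Phi} + \Phi}{1 + w},
        \qquad
        \dot{\zeta} \simeq 0 \quad \text{on scales far outside the horizon.}
      \]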

  11. Primordial comets: big bang nucleosynthesis, dark matter and life

    NASA Astrophysics Data System (ADS)

    Sheldon, Robert B.

    2015-09-01

    Primordial comets are comets made of Big Bang synthesized materials—water, ammonium, and carbon ices. These are the basic elements for life, so that these comets can be colonized by cyanobacteria that grow and bioengineer it for life dispersal. In addition, should they exist in large enough quantities, they would easily satisfy the qualifications for dark matter: low albedo with low visibility, gravitationally femtolensing, galactic negative viscosity, early galaxy formation seeds, and a self-interaction providing cosmic structure. The major arguments against their existence are the absence of metals (elements heavier than He) in ancient Population III stars, and the stringent requirements put on the Big Bang (BB) baryonic density by the BB nucleosynthesis (BBN) models. We argue that CI chondrites, hyperbolic comets, and carbon-enriched Pop III stars are all evidence for primordial comets. The BBN models provide the greater obstacle, but we argue that they crucially omit the magnetic field in their homogeneous, isotropic, "ideal baryon gas" model. Should large magnetic fields exist, not only would they undermine the 1-D models, but if their magnitude exceeds some critical field/density ratio, then the neutrino interacts with the fields, changing the equilibrium ratio of protons to neutrons. Since BBN models are strongly dependent on this ratio, magnetic fields have the potential to radically change the production of C, N, and O (CNO) to produce primordial comets. Then the universe from the earliest moments is not only seeded for galaxy formation, but it is seeded with the ingredients for life.

  12. Big Data Archives: Replication and synchronizing on a large scale

    NASA Astrophysics Data System (ADS)

    King, T. A.; Walker, R. J.

    2015-12-01

    Modern data archives present unique challenges to replication and synchronization because of their large size. We collect more digital information today than at any time before, and the volume of data collected is continuously increasing. Some of these data are from unique observations, like those from planetary missions, that should be preserved for use by future generations. In addition, data from NASA missions are considered federal records and must be retained. While the data may be stored on resilient hardware (i.e., RAID systems), they must also be protected from local or regional disasters. Meeting this challenge requires creating multiple copies. This task is complicated by the fact that new data are constantly being added, creating what are called "active archives". Having reliable, high-performance tools for replicating and synchronizing active archives in a timely fashion is critical to preservation of the data. When archives were smaller, tools like bbcp, rsync and rcp worked fairly well. While these tools are effective, they are not optimized for synchronizing big data archives, and their poor performance at scale led us to develop a new tool designed specifically for big data archives. It combines the best features of git, bbcp, rsync and rcp. We call this tool "Mimic", and we discuss the design of the tool, performance comparisons and its use at NASA's Planetary Plasma Interactions (PPI) Node of the Planetary Data System (PDS).
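
    Mimic itself is not described in enough detail here to reproduce; as a rough illustration of the checksum-based, copy-only-what-changed behaviour that synchronization tools of this kind share with rsync, the Python sketch below mirrors an archive directory one way. Paths, function names, and the hashing choice are assumptions, and this is not the PDS tool.

      # Hypothetical one-way synchronization of an "active archive": copy a file only
      # when it is missing from the mirror or its content hash differs. Illustrative only.
      import hashlib
      import shutil
      from pathlib import Path

      def sha256(path: Path) -> str:
          """Content hash of a file, read in 1 MiB chunks to handle large products."""
          digest = hashlib.sha256()
          with path.open("rb") as f:
              for chunk in iter(lambda: f.read(1 << 20), b""):
                  digest.update(chunk)
          return digest.hexdigest()

      def sync(source: Path, mirror: Path) -> int:
          """Return the number of files copied from source into mirror."""
          copied = 0
          for src in source.rglob("*"):
              if not src.is_file():
                  continue
              dst = mirror / src.relative_to(source)
              if not dst.exists() or sha256(src) != sha256(dst):
                  dst.parent.mkdir(parents=True, exist_ok=True)
                  shutil.copy2(src, dst)
                  copied += 1
          return copied

      # Example call with hypothetical paths:
      # print(sync(Path("/archive/ppi"), Path("/mirror/ppi")), "files updated")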

  13. High School Students as Mentors: Findings from the Big Brothers Big Sisters School-Based Mentoring Impact Study

    ERIC Educational Resources Information Center

    Herrera, Carla; Kauh, Tina J.; Cooney, Siobhan M.; Grossman, Jean Baldwin; McMaken, Jennifer

    2008-01-01

    High schools have recently become a popular source of mentors for school-based mentoring (SBM) programs. The high school Bigs program of Big Brothers Big Sisters of America, for example, currently involves close to 50,000 high-school-aged mentors across the country. While the use of these young mentors has several potential advantages, their age…

  14. 'Big Data' can make a big difference: Applying Big Data to National Scale Change Analyses

    NASA Astrophysics Data System (ADS)

    Mueller, N. R.; Curnow, S.; Melrose, R.; Purss, M. B.; Lewis, A.

    2013-12-01

    The traditional method of change detection in remote sensing is based on acquiring a pair of images and conducting a set of analyses to determine what is different between them. The end result is a single change analysis for a single time period. While this may be repeated several times, it is generally a time-consuming, often manual process providing a series of snapshots of change. As datasets become larger and time series analyses become more sophisticated, these traditional methods of analysis are unviable. The Geoscience Australia 'Data Cube' provides a 25-year time series of all Landsat-5 and Landsat-7 data for the entire Australian continent. Each image is orthorectified to a standard set of pixel locations and is fully calibrated to a measure of surface reflectance (the 25m Australian Reflectance Grid [ARG25]). These surface reflectance measurements are directly comparable between different scenes, regardless of whether they are sourced from the Landsat-5 TM instrument or the Landsat-7 ETM+. The advantage of the Data Cube environment lies in the ability to apply an algorithm to every pixel across Australia (some 10^13 pixels) in a consistent way, enabling change analysis for every acquired observation. This provides a framework to analyse change through time on a scene-to-scene basis, and across national-scale areas for the entire duration of the archive. Two examples of applications of the Data Cube are described here: surface water extent mapping across Australia, and vegetation condition mapping across the Murray-Darling Basin, Australia's largest river system. Ongoing water mapping and vegetation condition mapping are required by the Australian government to produce information products for a range of requirements including ecological monitoring and emergency management risk planning. With a 25-year archive of Landsat-5 and Landsat-7 imagery hosted on an efficient High Performance Computing (HPC) environment, high speed analyses of long time
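
    As a hedged sketch of the per-pixel, whole-time-series style of analysis a data cube enables (the agency's actual water-mapping algorithm is not reproduced here), the Python fragment below thresholds a common water index over a synthetic (time, y, x) stack and reports how often each pixel looks wet. The index, threshold, and data are illustrative assumptions.

      # Illustrative per-pixel time-series analysis over a (time, y, x) reflectance stack,
      # in the spirit of continental-scale surface water mapping; data are synthetic.
      import numpy as np

      def water_frequency(green: np.ndarray, swir: np.ndarray) -> np.ndarray:
          """Fraction of observations in which each pixel is classed as water, using the
          modified normalized difference water index (Green - SWIR) / (Green + SWIR) > 0."""
          mndwi = (green - swir) / np.maximum(green + swir, 1e-6)
          return (mndwi > 0.0).mean(axis=0)          # average over the time axis

      rng = np.random.default_rng(2)
      green = rng.uniform(0.0, 0.4, (50, 200, 200))  # 50 acquisitions of a 200 x 200 tile
      swir = rng.uniform(0.0, 0.4, (50, 200, 200))
      frequency = water_frequency(green, swir)
      print("pixels wet in more than 80% of observations:", int((frequency > 0.8).sum()))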

  15. SETI as a part of Big History

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2014-08-01

    Big History is an emerging academic discipline which examines history scientifically from the Big Bang to the present. It uses a multidisciplinary approach based on combining numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It is taught at some universities. In a series of recent papers ([11] through [15] and [17] through [18]) and in a book [16], we developed a new mathematical model embracing Darwinian Evolution (RNA to Humans; see, in particular, [17]) and Human History (Aztecs to USA; see [16]), and then we extrapolated even that into the future up to ten million years (see [18]), the minimum time required for a civilization to expand to the whole Milky Way (Fermi paradox). In this paper, we further extend that model into the past so as to let it start at the Big Bang (13.8 billion years ago), thus merging Big History, Evolution on Earth and SETI (the modern Search for ExtraTerrestrial Intelligence) into a single body of knowledge of a statistical type. Our idea is that the Geometric Brownian Motion (GBM), so far used as the key stochastic process of financial mathematics (Black-Scholes models and the related 1997 Nobel Prize in Economics!), may be successfully applied to the whole of Big History. In particular, in this paper we derive Big History Theory based on GBMs: just as the GBM is the “movie” unfolding in time, so the Statistical Drake Equation is its “still picture”, static in time, and the GBM is the time-extension of the Drake Equation. Darwinian Evolution on Earth may be easily described as an increasing GBM in the number of living species on Earth over the last 3.5 billion years. The first of them was RNA 3.5 billion years ago, and now 50
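
    For reference, the geometric Brownian motion the abstract builds on is the standard stochastic process below; the drift mu, volatility sigma, and initial value N_0 are generic symbols, not the fitted values from the cited papers.

      % Standard geometric Brownian motion: defining SDE, explicit solution, and mean,
      % with W(t) a standard Wiener process; shown for orientation only.
      \[
        dN(t) = \mu\, N(t)\, dt + \sigma\, N(t)\, dW(t),
        \qquad
        N(t) = N_{0} \exp\!\Big[\Big(\mu - \tfrac{\sigma^{2}}{2}\Big)t + \sigma\, W(t)\Big],
        \qquad
        \mathbb{E}[N(t)] = N_{0}\, e^{\mu t}.
      \]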

  16. The Interplay of "Big Five" Personality Factors and Metaphorical Schemas: A Pilot Study with 20 Lung Transplant Recipients

    ERIC Educational Resources Information Center

    Goetzmann, Lutz; Moser, Karin S.; Vetsch, Esther; Grieder, Erhard; Klaghofer, Richard; Naef, Rahel; Russi, Erich W.; Boehler, Annette; Buddeberg, Claus

    2007-01-01

    The aim of the present study was to investigate the interplay between personality factors and metaphorical schemas. The "Big Five" personality factors of 20 patients after lung transplantation were examined with the NEO-FFI. Patients were questioned about their social network, and self- and body-image. The interviews were assessed with metaphor…

  17. Comparative proteomics and metallomics studies in Arabidopsis thaliana leaf tissues: evaluation of the selenium addition in transgenic and nontransgenic plants using two-dimensional difference gel electrophoresis and laser ablation imaging.

    PubMed

    Maciel, Bruna C M; Barbosa, Herbert S; Pessôa, Gustavo S; Salazar, Marcela M; Pereira, Gonçalo A G; Gonçalves, Danieli C; Ramos, Carlos H I; Arruda, Marco A Z

    2014-04-01

    The main goal of this work is to evaluate differential protein species in transgenic (T) and nontransgenic (NT) Arabidopsis thaliana plants after their cultivation in the presence or absence of sodium selenite. The transgenic line was obtained through insertion of CaMV 35S controlling the nptII gene. Comparative proteomics through 2D-DIGE is carried out in four different groups (NT × T; NT × Se-NT (where Se is selenium); Se-NT × Se-T; and T × Se-T). Although no differential proteins are found in the T × Se-T group, for the others 68 differential proteins (applying a regulation factor ≥1.5) are found, and 27 of them are accurately characterized by ESI-MS/MS. These proteins are classified into metabolism, energy, signal transduction, and disease/defense categories, and some of them are involved in the glycolysis pathway, Photosystems I and II, and ROS combat. Additionally, laser ablation imaging is used to evaluate the Se and sulfur distribution in leaves of the different groups, corroborating some of the results related to proteins involved in the glycolysis pathway. From these results, it is possible to conclude that the genetic modification also confers resistance to oxidative stress on the plant.

  18. Big Data - What is it and why it matters.

    PubMed

    Tattersall, Andy; Grant, Maria J

    2016-06-01

    Big data, like MOOCs, altmetrics and open access, is a term that has been commonplace in the library community for some time; yet, despite its prevalence, many in the library and information sector remain unsure of the relationship between big data and their roles. This editorial explores what big data could mean for the day-to-day practice of health library and information workers, presenting examples of big data in action, considering the ethics of accessing big data sets, and the potential for new roles for library and information workers.

  19. Pinna nobilis: A big bivalve with big haemocytes?

    PubMed

    Matozzo, V; Pagano, M; Spinelli, A; Caicci, F; Faggio, C

    2016-08-01

    The fan mussel Pinna nobilis (Linnaeus, 1758) is one of the biggest bivalves worldwide. Currently, no updated information is available in the literature concerning the morpho-functional aspects of haemocytes from this bivalve species. Consequently, in this study, we characterised P. nobilis haemocytes from both a morphological and functional point of view. The mean number of haemocytes was about 5 × 10^5 cells per mL of haemolymph, and the cell viability was about 92-100%. Two haemocyte types were distinguished under the light microscope: granulocytes (51.6%), with evident cytoplasmic granules, and hyalinocytes (48.4%), with a few granules. The granules of the granulocytes were mainly lysosomes, as indicated by the in vivo staining with Neutral Red. Haemocytes were further distinguished into basophils (83.75%), acidophils (14.75%) and neutrophils (1.5%). After adhesion to slides and fixation, the cell diameter was approximately 10 μm for granulocytes and 7 μm for hyalinocytes. The granulocytes and hyalinocytes were both positive to the Periodic Acid-Schiff reaction for carbohydrates. Only granulocytes were able to phagocytise yeast cells. The phagocytic index (6%) increased significantly up to twofold after preincubation of yeast in cell-free haemolymph, suggesting that haemolymph has opsonising properties. In addition, haemocytes produce superoxide anion and acid and alkaline phosphatases. Summarising, this preliminary study indicates that both the granulocytes and hyalinocytes circulate in the haemolymph of P. nobilis and that they are active immunocytes. PMID:27346153


  1. Methodological challenges and analytic opportunities for modeling and interpreting Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    2016-01-01

    Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'. PMID:26918190

  2. The big data-big model (BDBM) challenges in ecological research

    NASA Astrophysics Data System (ADS)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies in the ecological community. Many sensor networks have been established to collect data. For example, satellites such as Terra and OCO-2, among others, have collected data relevant to the global carbon cycle. Thousands of manipulative field experiments have been conducted to examine the feedback of the terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations across the nation, will generate large volumes of ecological data every day. The raw data from the sensors in those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe the major processes underlying complex system dynamics. Ecological system models, despite greatly simplifying the real systems, are still complex in order to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. Those challenges include interoperability of multiple

  3. Anticipated Changes in Conducting Scientific Data-Analysis Research in the Big-Data Era

    NASA Technical Reports Server (NTRS)

    Kuo, Kwo-Sen; Seablom, Michael; Clune, Thomas; Ramachandran, Rahul

    2014-01-01

    A Big-Data environment is one that is capable of orchestrating quick-turnaround analyses involving large volumes of data for numerous simultaneous users. Based on our experiences with a prototype Big-Data analysis environment, we anticipate some important changes in research behaviors and processes while conducting scientific data-analysis research in the near future as such Big-Data environments become the mainstream. The first anticipated change will be the reduced effort and difficulty in most parts of the data management process. A Big-Data analysis environment is likely to house most of the data required for a particular research discipline along with appropriate analysis capabilities. This will reduce the need for researchers to download local copies of data. In turn, this also reduces the need for compute and storage procurement by individual researchers or groups, as well as associated maintenance and management afterwards. It is almost certain that Big-Data environments will require a different "programming language" to fully exploit the latent potential. In addition, the process of extending the environment to provide new analysis capabilities will likely be more involved than, say, compiling a piece of new or revised code. We thus anticipate that researchers will require support from dedicated organizations associated with the environment that are composed of professional software engineers and data scientists. A major benefit will likely be that such extensions are of higher quality and broader applicability than ad hoc changes by physical scientists. Another anticipated significant change is improved collaboration among the researchers using the same environment. Since the environment is homogeneous within itself, many barriers to collaboration are minimized or eliminated. For example, data and analysis algorithms can be seamlessly shared, reused and re-purposed. In conclusion, we will be able to achieve a new level of scientific productivity in the Big

  4. Big Data Analytics for Disaster Preparedness and Response of Mobile Communication Infrastructure during Natural Hazards

    NASA Astrophysics Data System (ADS)

    Zhong, L.; Takano, K.; Ji, Y.; Yamada, S.

    2015-12-01

    The disruption of telecommunications is one of the most critical disasters during natural hazards. With the rapid expansion of mobile communications, the mobile communication infrastructure plays a fundamental role in disaster response and recovery activities. For this reason, its disruption will lead to loss of life and property due to information delays and errors. Therefore, disaster preparedness and response of the mobile communication infrastructure itself is quite important. In many past disasters, the disruption of mobile communication networks has usually been caused by network congestion followed by long-term power outages. In order to reduce this disruption, knowledge of communication demands during disasters is necessary, and big data analytics provides a very promising way to predict those demands by analyzing the large amount of operational data from mobile users in a large-scale mobile network. Under the US-Japan collaborative project on 'Big Data and Disaster Research (BDD)' supported by the Japan Science and Technology Agency (JST) and the National Science Foundation (NSF), we are going to investigate the application of big data techniques to the disaster preparedness and response of mobile communication infrastructure. Specifically, in this research, we consider exploiting the large amount of operational information from mobile users to predict communication needs at different times and locations. By incorporating other data, such as the shake distribution of an estimated major earthquake and the power outage map, we are able to provide predictions of stranded people whose safety cannot be confirmed, or who cannot ask for help, due to network disruption. In addition, this result could further help network operators assess the vulnerability of their infrastructure and make suitable decisions for disaster preparedness and response. In this presentation, we are going to introduce the

  5. Anticipated Changes in Conducting Scientific Data-Analysis Research in the Big-Data Era

    NASA Astrophysics Data System (ADS)

    Kuo, Kwo-Sen; Seablom, Michael; Clune, Thomas; Ramachandran, Rahul

    2014-05-01

    A Big-Data environment is one that is capable of orchestrating quick-turnaround analyses involving large volumes of data for numerous simultaneous users. Based on our experiences with a prototype Big-Data analysis environment, we anticipate some important changes in research behaviors and processes while conducting scientific data-analysis research in the near future as such Big-Data environments become the mainstream. The first anticipated change will be the reduced effort and difficulty in most parts of the data management process. A Big-Data analysis environment is likely to house most of the data required for a particular research discipline along with appropriate analysis capabilities. This will reduce the need for researchers to download local copies of data. In turn, this also reduces the need for compute and storage procurement by individual researchers or groups, as well as associated maintenance and management afterwards. It is almost certain that Big-Data environments will require a different "programming language" to fully exploit the latent potential. In addition, the process of extending the environment to provide new analysis capabilities will likely be more involved than, say, compiling a piece of new or revised code. We thus anticipate that researchers will require support from dedicated organizations associated with the environment that are composed of professional software engineers and data scientists. A major benefit will likely be that such extensions are of higher-quality and broader applicability than ad hoc changes by physical scientists. Another anticipated significant change is improved collaboration among the researchers using the same environment. Since the environment is homogeneous within itself, many barriers to collaboration are minimized or eliminated. For example, data and analysis algorithms can be seamlessly shared, reused and re-purposed. In conclusion, we will be able to achieve a new level of scientific productivity in the

  6. EDITORIAL: Big challenges and nanosolutions Big challenges and nanosolutions

    NASA Astrophysics Data System (ADS)

    Demming, Anna

    2011-07-01

    Population increases have triggered a number of concerns over the impact of human activity on the global environment. In addition, these anxieties are exacerbated by the trend towards high levels of energy consumption and waste generation in developed nations. Pollutants that figure highly in environmental debate include greenhouse gases from fuel combustion and waste decomposition [1] and nitrogen from fertilisers [2]. In fact, human activity is transforming the nitrogen cycle at a record pace [3], and the pressure on available natural resources is mounting. As a collaboration of researchers in Saudi Arabia and the US explain in this issue, 26 countries across the world do not have sufficient water resources to sustain agriculture and economic development, and approximately one billion people lack access to safe drinking water [4]. They also point out a number of ways the potential of nanoscience and technology can be harnessed to tackle the problem. The key to managing pollutants is their detection. The biodegradation of waste in landfill sites can generate a build-up of a number of greenhouse and other gases. Olfactometry using the human expertise of a trained test panel is not a viable option for continuous monitoring of potentially odourless gases on industrial scales with any valid objectivity. Researchers in Italy have fabricated forest-like structures of carbon nanotubes loaded with metal nanoparticles and unmodified nanotubes on low-cost iron-coated alumina substrates [1]. The structure was able to detect NO2 in a multicomponent gas mixture of CO2, CH4, H2, NH3, CO and NO2 with sensitivity better than one part per million. Nanostructures exhibit a number of properties that lend themselves to sensing applications. They often have unique electrical properties that are readily affected by their environment. Such features were exploited by researchers in China who created nanoporous structures in ZnO sheets that can detect formaldehyde and ammonia, the

  7. 76 FR 29786 - Environmental Impact Statement for the Big Cypress National Preserve Addition, Florida

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-23

    ... Floodplain Statement of Findings for the General Management Plan/Wilderness Study/ Off-Road Vehicle... availability of the Record of Decision (ROD) and Floodplain Statement of Findings for the General Management... Floodplain Statement of Findings. ADDRESSES: The ROD is available online at...

  8. The good body: when big is better.

    PubMed

    Cassidy, C M

    1991-09-01

    An important cultural question is, "What is a 'good'--desirable, beautiful, impressive--body?" The answers are legion; here I examine why bigger bodies represent survival skill, and how this power symbolism is embodied by behaviors that guide larger persons toward the top of the social hierarchy. Bigness is a complex concept comprising tallness, boniness, muscularity and fattiness. Data show that most people worldwide want to be big--both tall and fat. Those who achieve the ideal are disproportionately among the society's most socially powerful. In the food-secure West, fascination with power and the body has not waned, but has been redefined such that thinness is desired. This apparent anomaly is resolved by realizing that thinness in the midst of abundance--as long as one is also tall and muscular--still projects the traditional message of power, and brings such social boons as upward mobility. PMID:1961102

  9. Big Crunch-based omnidirectional light concentrators

    NASA Astrophysics Data System (ADS)

    Smolyaninov, Igor I.; Hung, Yu-Ju

    2014-12-01

    Omnidirectional light concentration remains an unsolved problem despite such important practical applications as the design of efficient mobile photovoltaic cells. Recently developed optical black hole designs offer partial solutions to this problem. However, even these solutions are not truly omnidirectional since they do not exhibit a horizon, and at large enough incidence angles the light may be trapped into quasi-stationary orbits around such imperfect optical black holes. Here, we propose and realize experimentally another gravity-inspired design of a broadband omnidirectional light concentrator based on the cosmological Big Crunch solutions. By mimicking the Big Crunch spacetime via a corresponding effective optical metric, we make sure that every photon world line terminates in a single point.

  10. Spectral Observations of BIG Objects. III

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.

    2004-07-01

    Results of spectral observations of 66 objects from the BIG (Byurakan IRAS Galaxies) sample, made with the 1.93 m telescope at the Observatoire de Haute Provence (OHP, France), are presented. Emission lines are observed in 64 of the galaxies. The redshifts are determined; the radial velocities, distances, and absolute magnitudes are calculated; the spectral line parameters are determined; diagnostic diagrams are constructed; the objects are classified according to activity type; and their IR and far-IR luminosities are calculated. Of the 66 objects (corresponding to 61 IRAS sources), 6 are Sy2, 2 are LINERs, 8 are AGN (Sy2 or LINER), 10 are composite, 34 are HII, and 4 are Em of undetermined type. It is found that IRAS 07479+7832 = BIG d141a is an ultraluminous IR galaxy (ULIG), and 21 are LIGs.
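
    The record above summarizes a standard reduction chain (redshift to radial velocity, distance, and absolute magnitude). A minimal worked example of that chain, using the low-redshift approximations, is sketched below; the redshift, apparent magnitude, and Hubble constant are made-up values for illustration, not numbers from the BIG sample.

        # Illustrative conversion from a measured redshift to radial velocity,
        # distance (low-z Hubble law), and absolute magnitude. All input values
        # are assumed examples.
        import math

        c = 299792.458          # speed of light, km/s
        H0 = 70.0               # Hubble constant, km/s/Mpc (assumed)
        z = 0.03                # example redshift
        m_apparent = 15.2       # example apparent magnitude

        v = c * z                           # radial velocity, km/s (v = cz for small z)
        d_mpc = v / H0                      # distance in Mpc
        d_pc = d_mpc * 1.0e6                # distance in parsecs
        M = m_apparent - 5.0 * math.log10(d_pc / 10.0)   # absolute magnitude

        print(f"v = {v:.0f} km/s, d = {d_mpc:.1f} Mpc, M = {M:.2f}")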

  11. FAST TRACK COMMUNICATION: Big Bounce and inhomogeneities

    NASA Astrophysics Data System (ADS)

    Brizuela, David; Mena Marugán, Guillermo A.; Pawłowski, Tomasz

    2010-03-01

    The dynamics of an inhomogeneous universe is studied with the methods of loop quantum cosmology, via a so-called hybrid quantization, as an example of the quantization of vacuum cosmological spacetimes containing gravitational waves (Gowdy spacetimes). The analysis of this model with an infinite number of degrees of freedom, performed at the effective level, shows that (i) the initial Big Bang singularity is replaced (as in the case of homogeneous cosmological models) by a Big Bounce, joining deterministically two large universes, (ii) the universe size at the bounce is at least of the same order of magnitude as that of the background homogeneous universe and (iii) for each gravitational wave mode, the difference in amplitude at very early and very late times has a vanishing statistical average when the bounce dynamics is strongly dominated by the inhomogeneities, whereas this average is positive when the dynamics is in a near-vacuum regime, so that statistically the inhomogeneities are amplified.

  12. Big Data and Deep data in scanning and electron microscopies: functionality from multidimensional data sets

    SciTech Connect

    Belianinov, Alex; Vasudevan, Rama K; Strelcov, Evgheni; Steed, Chad A; Yang, Sang Mo; Tselev, Alexander; Jesse, Stephen; Biegalski, Michael D; Shipman, Galen M; Symons, Christopher T; Borisevich, Albina Y; Archibald, Richard K; Kalinin, Sergei

    2015-01-01

    The development of electron, and scanning probe microscopies in the second half of the twentieth century have produced spectacular images of internal structure and composition of matter with, at nanometer, molecular, and atomic resolution. Largely, this progress was enabled by computer-assisted methods of microscope operation, data acquisition and analysis. The progress in imaging technologies in the beginning of the twenty first century has opened the proverbial floodgates of high-veracity information on structure and functionality. High resolution imaging now allows information on atomic positions with picometer precision, allowing for quantitative measurements of individual bond length and angles. Functional imaging often leads to multidimensional data sets containing partial or full information on properties of interest, acquired as a function of multiple parameters (time, temperature, or other external stimuli). Here, we review several recent applications of the big and deep data analysis methods to visualize, compress, and translate this data into physically and chemically relevant information from imaging data.
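
    As a concrete, heavily simplified illustration of the multidimensional-data compression mentioned above, the sketch below applies principal component analysis to a synthetic spectral-imaging cube. The array shape and the choice of PCA are assumptions for illustration, not the authors' actual pipeline.

        # Minimal "deep data" step: compress a (rows x cols x spectrum) data set
        # into a few component maps with PCA. Data are synthetic.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        data = rng.normal(size=(64, 64, 100))        # 64x64 pixels, 100-point spectra
        flat = data.reshape(-1, data.shape[-1])      # one spectrum per pixel

        pca = PCA(n_components=5)
        scores = pca.fit_transform(flat)             # compressed representation
        maps = scores.reshape(64, 64, 5)             # spatial map for each component

        print(pca.explained_variance_ratio_)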

  13. Big Bend National Park, TX, USA, Mexico

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Sierra del Carmen of Mexico, across the Rio Grande River from Big Bend National Park, TX, (28.5N, 104.0W) is centered in this photo. The Rio Grande River bisects the scene; Mexico to the east, USA to the west. The thousand ft. Boquillas limestone cliff on the Mexican side of the river changes colors from white to pink to lavender at sunset. This severely eroded sedimentary landscape was once an ancient seabed later overlaid with volcanic activity.

  14. Big Data Challenges for Large Radio Arrays

    NASA Technical Reports Server (NTRS)

    Jones, Dayton L.; Wagstaff, Kiri; Thompson, David; D'Addario, Larry; Navarro, Robert; Mattmann, Chris; Majid, Walid; Lazio, Joseph; Preston, Robert; Rebbapragada, Umaa

    2012-01-01

    Future large radio astronomy arrays, particularly the Square Kilometre Array (SKA), will be able to generate data at rates far higher than can be analyzed or stored affordably with current practices. This is, by definition, a "big data" problem, and requires an end-to-end solution if future radio arrays are to reach their full scientific potential. Similar data processing, transport, storage, and management challenges face next-generation facilities in many other fields.

  15. Funding big research with small money.

    PubMed

    Hickey, Joanne V; Koithan, Mary; Unruh, Lynn; Lundmark, Vicki

    2014-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the guest authors introduce crowdsourcing as a strategy for funding big research with small money. PMID:24853791

  16. Funding big research with small money.

    PubMed

    Hickey, Joanne V; Koithan, Mary; Unruh, Lynn; Lundmark, Vicki

    2014-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the guest authors introduce crowdsourcing as a strategy for funding big research with small money.

  17. Can big business save health care?

    PubMed

    Dunn, Philip

    2007-01-01

    Corporate America has decided to stop bellyaching about the cost and quality of the health care it helps buy for its employees. Now it's taking concrete action. Large employers such as Wal-Mart, Oracle, Cisco, BP America and many, many others are pressuring providers to meet performance standards, adopt information technology and transform the efficiency of their operations. Big Business wants value for its buck, and it's now putting money where its mouth is.

  18. Can big business save health care?

    PubMed

    Dunn, Philip

    2007-01-01

    Corporate America has decided to stop bellyaching about the cost and quality of the health care it helps buy for its employees. Now it's taking concrete action. Large employers such as Wal-Mart, Oracle, Cisco, BP America and many, many others are pressuring providers to meet performance standards, adopt information technology and transform the efficiency of their operations. Big Business wants value for its buck, and it's now putting money where its mouth is. PMID:17302135

  19. EDITORIAL: Big challenges and nanosolutions Big challenges and nanosolutions

    NASA Astrophysics Data System (ADS)

    Demming, Anna

    2011-07-01

    Population increases have triggered a number of concerns over the impact of human activity on the global environment. In addition, these anxieties are exacerbated by the trend towards high levels of energy consumption and waste generation in developed nations. Pollutants that figure highly in environmental debate include greenhouse gases from fuel combustion and waste decomposition [1] and nitrogen from fertilisers [2]. In fact, human activity is transforming the nitrogen cycle at a record pace [3], and the pressure on available natural resources is mounting. As a collaboration of researchers in Saudi Arabia and the US explain in this issue, 26 countries across the world do not have sufficient water resources to sustain agriculture and economic development, and approximately one billion people lack access to safe drinking water [4]. They also point out a number of ways the potential of nanoscience and technology can be harnessed to tackle the problem. The key to managing pollutants is their detection. The biodegradation of waste in landfill sites can generate a build-up of a number of greenhouse and other gases. Olfactometry using the human expertise of a trained test panel is not a viable option for continuous monitoring of potentially odourless gases on industrial scales with any valid objectivity. Researchers in Italy have fabricated forest-like structures of carbon nanotubes loaded with metal nanoparticles and unmodified nanotubes on low-cost iron-coated alumina substrates [1]. The structure was able to detect NO2 in a multicomponent gas mixture of CO2, CH4, H2, NH3, CO and NO2 with sensitivity better than one part per million. Nanostructures exhibit a number of properties that lend themselves to sensing applications. They often have unique electrical properties that are readily affected by their environment. Such features were exploited by researchers in China who created nanoporous structures in ZnO sheets that can detect formaldehyde and ammonia, the

  20. Big Data Analytics for Prostate Radiotherapy.

    PubMed

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose-volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the "RadoncSpace") in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches to a cohort of hypofractionated prostate cancer patients taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches.
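
    A hedged sketch of the kind of outcome modeling and cross-validation discussed above is given below: a logistic model relating a dose metric and a clinical covariate to a binary toxicity outcome. The feature names and all data are synthetic placeholders, not the reviewed cohort or its variables.

        # Synthetic outcome-modeling example with 5-fold cross-validated AUC.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        n = 200
        X = np.column_stack([
            rng.normal(70, 5, n),      # e.g. a dose-volume metric in Gy (assumed)
            rng.normal(65, 8, n),      # e.g. patient age (assumed)
        ])
        y = (X[:, 0] + rng.normal(0, 5, n) > 72).astype(int)   # synthetic toxicity label

        model = LogisticRegression()
        scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
        print("cross-validated AUC:", scores.mean())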

  1. Statistical methods and computing for big data

    PubMed Central

    Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing

    2016-01-01

    Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with a focus on the open-source R language and R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay. PMID:27695593
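
    The "online updating for stream data" idea mentioned above can be sketched in a few lines: a model is updated chunk by chunk, so no portion of the stream needs to be held in memory at once. The article works with R tooling; the Python sketch below is only an illustration of the concept under assumed synthetic data.

        # Incremental (online) logistic regression over a simulated data stream.
        import numpy as np
        from sklearn.linear_model import SGDClassifier

        rng = np.random.default_rng(42)
        clf = SGDClassifier(loss="log_loss")   # logistic loss; older sklearn uses loss="log"
        classes = np.array([0, 1])

        for _ in range(50):                    # 50 arriving chunks of the stream
            X_chunk = rng.normal(size=(1000, 3))
            y_chunk = (X_chunk[:, 0] + 0.5 * X_chunk[:, 1] > 0).astype(int)
            clf.partial_fit(X_chunk, y_chunk, classes=classes)

        print(clf.coef_, clf.intercept_)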

  2. Vertical landscraping, a big regionalism for Dubai.

    PubMed

    Wilson, Matthew

    2010-01-01

    Dubai's ecologic and economic complications are exacerbated by six years of accelerated expansion, a fixed top-down approach to urbanism and the construction of iconic single-phase mega-projects. With recent construction delays, project cancellations and growing landscape issues, Dubai's tower typologies have been unresponsive to changing environmental, socio-cultural and economic patterns (BBC, 2009; Gillet, 2009; Lewis, 2009). In this essay, a theory of "Big Regionalism" guides an argument for an economically and ecologically linked tower typology called the Condenser. This phased "box-to-tower" typology is part of a greater Landscape Urbanist strategy called Vertical Landscraping. Within this strategy, the Condenser's role is to densify the city, facilitating the creation of ecologic voids that order the urban region. Delineating "Big Regional" principles, the Condenser provides a time-based, global-local urban growth approach that weaves Bigness into a series of urban-regional, economic and ecological relationships, builds upon the environmental performance of the city's regional architecture and planning, promotes a continuity of Dubai's urban history, and responds to its landscape issues while condensing development. These speculations permit consideration of the overlooked opportunities embedded within Dubai's mega-projects and their long-term impact on the urban morphology. PMID:21132951

  3. Adapting bioinformatics curricula for big data

    PubMed Central

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S.

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469

  4. Vertical landscraping, a big regionalism for Dubai.

    PubMed

    Wilson, Matthew

    2010-01-01

    Dubai's ecologic and economic complications are exacerbated by six years of accelerated expansion, a fixed top-down approach to urbanism and the construction of iconic single-phase mega-projects. With recent construction delays, project cancellations and growing landscape issues, Dubai's tower typologies have been unresponsive to changing environmental, socio-cultural and economic patterns (BBC, 2009; Gillet, 2009; Lewis, 2009). In this essay, a theory of "Big Regionalism" guides an argument for an economically and ecologically linked tower typology called the Condenser. This phased "box-to-tower" typology is part of a greater Landscape Urbanist strategy called Vertical Landscraping. Within this strategy, the Condenser's role is to densify the city, facilitating the creation of ecologic voids that order the urban region. Delineating "Big Regional" principles, the Condenser provides a time-based, global-local urban growth approach that weaves Bigness into a series of urban-regional, economic and ecological relationships, builds upon the environmental performance of the city's regional architecture and planning, promotes a continuity of Dubai's urban history, and responds to its landscape issues while condensing development. These speculations permit consideration of the overlooked opportunities embedded within Dubai's mega-projects and their long-term impact on the urban morphology.

  5. Statistical methods and computing for big data

    PubMed Central

    Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing

    2016-01-01

    Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with a focus on the open-source R language and R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay.

  6. Bohmian quantization of the big rip

    NASA Astrophysics Data System (ADS)

    Pinto-Neto, Nelson; Pantoja, Diego Moraes

    2009-10-01

    It is shown in this paper that minisuperspace quantization of homogeneous and isotropic geometries with phantom scalar fields, when examined in the light of the Bohm-de Broglie interpretation of quantum mechanics, does not eliminate, in general, the classical big rip singularity present in the classical model. For some values of the Hamilton-Jacobi separation constant present in a class of quantum state solutions of the Wheeler-DeWitt equation, the big rip can be either completely eliminated or may still constitute a future attractor for all expanding solutions. This is contrary to the conclusion presented in [M. P. Dabrowski, C. Kiefer, and B. Sandhofer, Phys. Rev. D 74, 044022 (2006), doi:10.1103/PhysRevD.74.044022], using a different interpretation of the wave function, where the big rip singularity is completely eliminated (“smoothed out”) through quantization, independently of such a separation constant and for all members of the above mentioned class of solutions. This is an example of the very peculiar situation where different interpretations of the same quantum state of a system are predicting different physical facts, instead of just giving different descriptions of the same observable facts: in fact, there is nothing more observable than the fate of the whole Universe.

  7. When small is better than BIG

    SciTech Connect

    McDaniel, Hunter; Beard, Matthew C; Wheeler, Lance M; Pietryga, Jeffrey M

    2013-07-18

    Representing the Center for Advanced Solar Photophysics (CASP), this document is one of the entries in the Ten Hundred and One Word Challenge and was awarded “Overall Winner Runner-up and People’s Choice Winner.” As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE: energy. The mission of CASP is to explore and exploit the unique physics of nanostructured materials to boost the efficiency of solar energy conversion through novel light-matter interactions, controlled excited-state dynamics, and engineered carrier-carrier coupling.

  8. Combining Human Computing and Machine Learning to Make Sense of Big (Aerial) Data for Disaster Response.

    PubMed

    Ofli, Ferda; Meier, Patrick; Imran, Muhammad; Castillo, Carlos; Tuia, Devis; Rey, Nicolas; Briant, Julien; Millet, Pauline; Reinhard, Friedrich; Parkan, Matthew; Joost, Stéphane

    2016-03-01

    Aerial imagery captured via unmanned aerial vehicles (UAVs) is playing an increasingly important role in disaster response. Unlike satellite imagery, aerial imagery can be captured and processed within hours rather than days. In addition, the spatial resolution of aerial imagery is an order of magnitude higher than the imagery produced by the most sophisticated commercial satellites today. Both the United States Federal Emergency Management Agency (FEMA) and the European Commission's Joint Research Center (JRC) have noted that aerial imagery will inevitably present a big data challenge. The purpose of this article is to get ahead of this future challenge by proposing a hybrid crowdsourcing and real-time machine learning solution to rapidly process large volumes of aerial data for disaster response in a time-sensitive manner. Crowdsourcing can be used to annotate features of interest in aerial images (such as damaged shelters and roads blocked by debris). These human-annotated features can then be used to train a supervised machine learning system to learn to recognize such features in new unseen images. In this article, we describe how this hybrid solution for image analysis can be implemented as a module (i.e., Aerial Clicker) to extend an existing platform called Artificial Intelligence for Disaster Response (AIDR), which has already been deployed to classify microblog messages during disasters using its Text Clicker module and in response to Cyclone Pam, a category 5 cyclone that devastated Vanuatu in March 2015. The hybrid solution we present can be applied to both aerial and satellite imagery and has applications beyond disaster response such as wildlife protection, human rights, and archeological exploration. As a proof of concept, we recently piloted this solution using very high-resolution aerial photographs of a wildlife reserve in Namibia to support rangers with their wildlife conservation efforts (SAVMAP project, http://lasig.epfl.ch/savmap ). The
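
    A hedged sketch of the crowdsourcing-plus-machine-learning loop described above follows: crowd-provided labels for image tiles train a classifier that then scores new, unseen tiles. The feature vectors and labels are synthetic stand-ins for real annotated aerial-image tiles, and the random-forest model is an assumed choice, not the AIDR/Aerial Clicker implementation.

        # Train on crowd-labelled tiles, then score tiles from new imagery.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(7)

        # Pretend each tile is summarized by a small feature vector (e.g. colour
        # and texture statistics); crowd workers labelled tiles as damaged (1) or not (0).
        X_labeled = rng.normal(size=(500, 16))
        y_labeled = (X_labeled[:, 0] > 0.3).astype(int)

        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_labeled, y_labeled)

        X_new = rng.normal(size=(10, 16))          # tiles from newly captured imagery
        print(clf.predict_proba(X_new)[:, 1])      # probability a tile shows damage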

  9. Combining Human Computing and Machine Learning to Make Sense of Big (Aerial) Data for Disaster Response.

    PubMed

    Ofli, Ferda; Meier, Patrick; Imran, Muhammad; Castillo, Carlos; Tuia, Devis; Rey, Nicolas; Briant, Julien; Millet, Pauline; Reinhard, Friedrich; Parkan, Matthew; Joost, Stéphane

    2016-03-01

    Aerial imagery captured via unmanned aerial vehicles (UAVs) is playing an increasingly important role in disaster response. Unlike satellite imagery, aerial imagery can be captured and processed within hours rather than days. In addition, the spatial resolution of aerial imagery is an order of magnitude higher than the imagery produced by the most sophisticated commercial satellites today. Both the United States Federal Emergency Management Agency (FEMA) and the European Commission's Joint Research Center (JRC) have noted that aerial imagery will inevitably present a big data challenge. The purpose of this article is to get ahead of this future challenge by proposing a hybrid crowdsourcing and real-time machine learning solution to rapidly process large volumes of aerial data for disaster response in a time-sensitive manner. Crowdsourcing can be used to annotate features of interest in aerial images (such as damaged shelters and roads blocked by debris). These human-annotated features can then be used to train a supervised machine learning system to learn to recognize such features in new unseen images. In this article, we describe how this hybrid solution for image analysis can be implemented as a module (i.e., Aerial Clicker) to extend an existing platform called Artificial Intelligence for Disaster Response (AIDR), which has already been deployed to classify microblog messages during disasters using its Text Clicker module and in response to Cyclone Pam, a category 5 cyclone that devastated Vanuatu in March 2015. The hybrid solution we present can be applied to both aerial and satellite imagery and has applications beyond disaster response such as wildlife protection, human rights, and archeological exploration. As a proof of concept, we recently piloted this solution using very high-resolution aerial photographs of a wildlife reserve in Namibia to support rangers with their wildlife conservation efforts (SAVMAP project, http://lasig.epfl.ch/savmap ). The

  10. Big Data Visual Analytics for Exploratory Earth System Simulation Analysis

    SciTech Connect

    Steed, Chad A.; Ricciuto, Daniel M.; Shipman, Galen M.; Smith, Brian E.; Thornton, Peter E.; Wang, Dali; Shi, Xiaoying; Williams, Dean N.

    2013-12-01

    Rapid increases in high performance computing are feeding the development of larger and more complex data sets in climate research, which sets the stage for so-called big data analysis challenges. However, conventional climate analysis techniques are inadequate in dealing with the complexities of today's data. In this paper, we describe and demonstrate a visual analytics system, called the Exploratory Data analysis ENvironment (EDEN), with specific application to the analysis of complex earth system simulation data sets. EDEN represents the type of interactive visual analysis tools that are necessary to transform data into insight, thereby improving critical comprehension of earth system processes. In addition to providing an overview of EDEN, we describe real-world studies using both point ensembles and global Community Land Model Version 4 (CLM4) simulations.
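
    EDEN itself is a dedicated interactive tool; as a rough, static stand-in for the multivariate-view style it exemplifies, the sketch below draws a parallel-coordinates plot of a few ensemble members with pandas and matplotlib. The variable names and values are invented for illustration.

        # Static parallel-coordinates view of a small, made-up ensemble.
        import pandas as pd
        import matplotlib.pyplot as plt
        from pandas.plotting import parallel_coordinates

        df = pd.DataFrame({
            "member": ["run1", "run2", "run3", "run4"],
            "GPP": [120.0, 135.0, 128.0, 110.0],     # assumed variable names and units
            "LAI": [3.1, 3.6, 3.4, 2.9],
            "SoilC": [14.2, 15.0, 14.7, 13.8],
        })

        parallel_coordinates(df, class_column="member")
        plt.ylabel("value (native units)")
        plt.show()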

  11. Fast algorithm for relaxation processes in big-data systems

    NASA Astrophysics Data System (ADS)

    Hwang, S.; Lee, D.-S.; Kahng, B.

    2014-10-01

    Relaxation processes driven by a Laplacian matrix can be found in many real-world big-data systems, for example, in search engines on the World Wide Web and the dynamic load-balancing protocols in mesh networks. To numerically implement such processes, a fast-running algorithm for the calculation of the pseudoinverse of the Laplacian matrix is essential. Here we propose an algorithm which computes quickly and efficiently the pseudoinverse of Markov chain generator matrices satisfying the detailed-balance condition, a general class of matrices including the Laplacian. The algorithm utilizes the renormalization of the Gaussian integral. In addition to its applicability to a wide range of problems, the algorithm outperforms other algorithms in its ability to compute within a manageable computing time arbitrary elements of the pseudoinverse of a matrix of size millions by millions. Therefore our algorithm can be used very widely in analyzing the relaxation processes occurring on large-scale networked systems.
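
    For orientation, the quantity the proposed algorithm targets can be computed by brute force on a small graph: the Moore-Penrose pseudoinverse of the graph Laplacian. The dense np.linalg.pinv call below is only the O(n^3) baseline, not the renormalization-based method of the paper, and the example graph is arbitrary.

        # Reference computation of the pseudoinverse of a small graph Laplacian.
        import numpy as np

        # Adjacency matrix of a small undirected graph (assumed example).
        A = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)

        D = np.diag(A.sum(axis=1))      # degree matrix
        L = D - A                       # graph Laplacian
        L_pinv = np.linalg.pinv(L)      # pseudoinverse; feasible only for small n

        print(np.allclose(L @ L_pinv @ L, L))   # defining property of the pseudoinverse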

  12. Quark mass variation constraints from Big Bang nucleosynthesis

    SciTech Connect

    Bedaque, P; Luu, T; Platter, L

    2010-12-13

    We study the impact of a variation of the quark masses at the time of Big Bang nucleosynthesis (BBN) on the primordial abundances of the light elements. In order to navigate through the particle and nuclear physics required to connect quark masses to binding energies and reaction rates in a model-independent way, we use lattice QCD data and a hierarchy of effective field theories. We find that the measured ⁴He abundances put a bound of -1% ≲ δm_q/m_q ≲ 0.7%. The effect of quark mass variations on the deuterium abundances can be largely compensated by changes of the baryon-to-photon ratio η. Including the bounds on the variation of η coming from WMAP results and some additional assumptions narrows the range of allowed values of δm_q/m_q somewhat.

  13. Big bad data: law, public health, and biomedical databases.

    PubMed

    Hoffman, Sharona; Podgurski, Andy

    2013-03-01

    The accelerating adoption of electronic health record (EHR) systems will have far-reaching implications for public health research and surveillance, which in turn could lead to changes in public policy, statutes, and regulations. The public health benefits of EHR use can be significant. However, researchers and analysts who rely on EHR data must proceed with caution and understand the potential limitations of EHRs. Because of clinicians' workloads, poor user-interface design, and other factors, EHR data can be erroneous, miscoded, fragmented, and incomplete. In addition, public health findings can be tainted by the problems of selection bias, confounding bias, and measurement bias. These flaws may become all the more troubling and important in an era of electronic "big data," in which a massive amount of information is processed automatically, without human checks. Thus, we conclude the paper by outlining several regulatory and other interventions to address data analysis difficulties that could result in invalid conclusions and unsound public health policies.

  14. Building Simulation Modelers are we big-data ready?

    SciTech Connect

    Sanyal, Jibonananda; New, Joshua Ryan

    2014-01-01

    Recent advances in computing and sensor technologies have pushed the amount of data we collect or generate to limits previously unheard of. Sub-minute resolution data from dozens of channels is becoming increasingly common and is expected to increase with the prevalence of non-intrusive load monitoring. Experts are running larger building simulation experiments and are faced with increasingly complex data sets from which to derive meaningful insight. This paper focuses on the data management challenges that building modeling experts may face in data collected from a large array of sensors, or generated from running a large number of building energy/performance simulations. The paper highlights the technical difficulties that were encountered and overcome in order to run 3.5 million EnergyPlus simulations on supercomputers and generate over 200 TB of simulation output. This extreme case involved development of technologies and insights that will be beneficial to modelers in the immediate future. The paper discusses different database technologies (including relational databases, columnar storage, and schema-less Hadoop) in order to contrast the advantages and disadvantages of employing each for storage of EnergyPlus output. Scalability, analysis requirements, and the adaptability of these database technologies are discussed. Additionally, unique attributes of EnergyPlus output are highlighted which make data-entry non-trivial for multiple simulations. Practical experience regarding cost-effective strategies for big-data storage is provided. The paper also discusses network performance issues when transferring large amounts of data across a network to different computing devices. Practical issues involving lag, bandwidth, and methods for synchronizing or transferring logical portions of the data are presented. A cornerstone of big data is its use for analytics; data is useless unless information can be meaningfully derived from it. In addition to technical

  15. Big Soda Lake (Nevada). 2. Pelagic sulfate reduction

    USGS Publications Warehouse

    Smith, Richard L.; Oremland, Ronald S.

    1987-01-01

    The epilimnion of hypersaline, alkaline, meromictic Big Soda Lake contains an average 58 mmol sulfate liter−1 and 0.4 µmol dissolved iron liter−1. The monimolimnion, which is permanently anoxic, has a sulfide concentration ranging seasonally from 4 to 7 mmol liter−1. Depth profiles of sulfate reduction in the monimolimnion, assayed with a 35S tracer technique and in situ incubations, demonstrated that sulfate reduction occurs within the water column of this extreme environment. The average rate of reduction in the monimolimnion was 3 µmol sulfate liter−1 d−1 in May compared to 0.9 in October. These values are comparable to rates of sulfate reduction reported for anoxic waters of more moderate environments. Sulfate reduction also occurred in the anoxic zone of the mixolimnion, though at significantly lower rates (0.025–0.090 µmol liter−1 d−1 at 25 m). Additions of FeS (1.0 mmol liter−1) doubled the endogenous rate of sulfate reduction in the monimolimnion, while MnS and kaolinite had no effect. These results suggest that sulfate reduction in Big Soda Lake is iron limited and controlled by seasonal variables other than temperature. Estimates of the organic carbon mineralized by sulfate reduction exceed measured fluxes of particulate organic carbon sinking from the mixolimnion. Thus, additional sources of electron donors (other than those derived from the sinking of pelagic autotrophs) may also fuel monimolimnetic sulfate reduction in the lake.

  16. Additive manufacturing of polymer-derived ceramics.

    PubMed

    Eckel, Zak C; Zhou, Chaoyin; Martin, John H; Jacobsen, Alan J; Carter, William B; Schaedler, Tobias A

    2016-01-01

    The extremely high melting point of many ceramics adds challenges to additive manufacturing as compared with metals and polymers. Because ceramics cannot be cast or machined easily, three-dimensional (3D) printing enables a big leap in geometrical flexibility. We report preceramic monomers that are cured with ultraviolet light in a stereolithography 3D printer or through a patterned mask, forming 3D polymer structures that can have complex shape and cellular architecture. These polymer structures can be pyrolyzed to a ceramic with uniform shrinkage and virtually no porosity. Silicon oxycarbide microlattice and honeycomb cellular materials fabricated with this approach exhibit higher strength than ceramic foams of similar density. Additive manufacturing of such materials is of interest for propulsion components, thermal protection systems, porous burners, microelectromechanical systems, and electronic device packaging.

  17. Additive manufacturing of polymer-derived ceramics.

    PubMed

    Eckel, Zak C; Zhou, Chaoyin; Martin, John H; Jacobsen, Alan J; Carter, William B; Schaedler, Tobias A

    2016-01-01

    The extremely high melting point of many ceramics adds challenges to additive manufacturing as compared with metals and polymers. Because ceramics cannot be cast or machined easily, three-dimensional (3D) printing enables a big leap in geometrical flexibility. We report preceramic monomers that are cured with ultraviolet light in a stereolithography 3D printer or through a patterned mask, forming 3D polymer structures that can have complex shape and cellular architecture. These polymer structures can be pyrolyzed to a ceramic with uniform shrinkage and virtually no porosity. Silicon oxycarbide microlattice and honeycomb cellular materials fabricated with this approach exhibit higher strength than ceramic foams of similar density. Additive manufacturing of such materials is of interest for propulsion components, thermal protection systems, porous burners, microelectromechanical systems, and electronic device packaging. PMID:26721993

  18. Additive manufacturing of polymer-derived ceramics

    NASA Astrophysics Data System (ADS)

    Eckel, Zak C.; Zhou, Chaoyin; Martin, John H.; Jacobsen, Alan J.; Carter, William B.; Schaedler, Tobias A.

    2016-01-01

    The extremely high melting point of many ceramics adds challenges to additive manufacturing as compared with metals and polymers. Because ceramics cannot be cast or machined easily, three-dimensional (3D) printing enables a big leap in geometrical flexibility. We report preceramic monomers that are cured with ultraviolet light in a stereolithography 3D printer or through a patterned mask, forming 3D polymer structures that can have complex shape and cellular architecture. These polymer structures can be pyrolyzed to a ceramic with uniform shrinkage and virtually no porosity. Silicon oxycarbide microlattice and honeycomb cellular materials fabricated with this approach exhibit higher strength than ceramic foams of similar density. Additive manufacturing of such materials is of interest for propulsion components, thermal protection systems, porous burners, microelectromechanical systems, and electronic device packaging.

  19. Classification Algorithms for Big Data Analysis, a Map Reduce Approach

    NASA Astrophysics Data System (ADS)

    Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.

    2015-03-01

    For many years, the scientific community has been concerned with how to increase the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data being generated every day by remote sensors raises further challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest, and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes and different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
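
    As a local, hedged stand-in for the map-reduce classification pattern described above, the sketch below trains one classifier per data partition (the "map" step) and combines their votes (the "reduce" step). This is not the ICP: Data Mining Package itself, which runs WEKA learners on Hadoop; the data and the use of a linear SVM are assumptions for illustration.

        # Partition-wise training with a majority-vote combine step.
        import numpy as np
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(3)
        X = rng.normal(size=(6000, 10))
        y = (X[:, 0] - X[:, 1] > 0).astype(int)

        # "Map": train one SVM per data partition.
        partitions = np.array_split(np.arange(len(X)), 4)
        models = [LinearSVC().fit(X[idx], y[idx]) for idx in partitions]

        # "Reduce": majority vote over the per-partition models.
        X_test = rng.normal(size=(5, 10))
        votes = np.stack([m.predict(X_test) for m in models])
        prediction = (votes.mean(axis=0) >= 0.5).astype(int)
        print(prediction)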

  20. Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography

    NASA Astrophysics Data System (ADS)

    Jesse, S.; Chi, M.; Belianinov, A.; Beekman, C.; Kalinin, S. V.; Borisevich, A. Y.; Lupini, A. R.

    2016-05-01

    Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. Here, we discuss the application of so-called “big-data” methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. However, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy.
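
    A hedged illustration of the unsupervised clustering step described above is given below: k-means applied to per-position signal vectors with no prior information about the material. The synthetic array shape and the two-cluster choice are assumptions standing in for a real ptychographic data set.

        # k-means clustering of per-scan-position signatures into a domain map.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(5)
        # 32 x 32 scan positions, each with a 50-element local diffraction signature.
        data = rng.normal(size=(32, 32, 50))
        flat = data.reshape(-1, data.shape[-1])

        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(flat)
        domain_map = labels.reshape(32, 32)    # cluster label per scan position
        print(np.bincount(labels))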

  1. Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography

    DOE PAGES

    Jesse, S.; Chi, M.; Belianinov, A.; Beekman, C.; Kalinin, S. V.; Borisevich, A. Y.; Lupini, A. R.

    2016-05-23

    Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. In this paper, we discuss the application of so-called “big-data” methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. Finally, however, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy.

  2. Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography.

    PubMed

    Jesse, S; Chi, M; Belianinov, A; Beekman, C; Kalinin, S V; Borisevich, A Y; Lupini, A R

    2016-05-23

    Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. Here, we discuss the application of so-called "big-data" methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. However, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy.

  3. Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography

    PubMed Central

    Jesse, S.; Chi, M.; Belianinov, A.; Beekman, C.; Kalinin, S. V.; Borisevich, A. Y.; Lupini, A. R.

    2016-01-01

    Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. Here, we discuss the application of so-called “big-data” methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. However, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy. PMID:27211523

  4. Occurrence and transport of nitrogen in the Big Sunflower River, northwestern Mississippi, October 2009-June 2011

    USGS Publications Warehouse

    Barlow, Jeannie R.B.; Coupe, Richard H.

    2014-01-01

    The Big Sunflower River Basin, located within the Yazoo River Basin, is subject to large annual inputs of nitrogen from agriculture, atmospheric deposition, and point sources. Understanding how nutrients are transported in, and downstream from, the Big Sunflower River is key to quantifying their eutrophying effects on the Gulf. Recent results from two Spatially Referenced Regressions on Watershed attributes (SPARROW models), which include the Big Sunflower River, indicate minimal losses of nitrogen in stream reaches typical of the main channels of major river systems. If SPARROW assumptions of relatively conservative transport of nitrogen are correct and surface-water losses through the bed of the Big Sunflower River are negligible, then options for managing nutrient loads to the Gulf of Mexico may be limited. Simply put, if every pound of nitrogen entering the Delta is eventually delivered to the Gulf, then the only effective nutrient management option in the Delta is to reduce inputs. If, on the other hand, it can be shown that processes within river channels of the Mississippi Delta act to reduce the mass of nitrogen in transport, other hydrologic approaches may be designed to further limit nitrogen transport. Direct validation of existing SPARROW models for the Delta is a first step in assessing the assumptions underlying those models. In order to characterize spatial and temporal variability of nitrogen in the Big Sunflower River Basin, water samples were collected at four U.S. Geological Survey gaging stations located on the Big Sunflower River between October 1, 2009, and June 30, 2011. Nitrogen concentrations were generally highest at each site during the spring of the 2010 water year and the fall and winter of the 2011 water year. Additionally, the dominant form of nitrogen varied between sites. For example, in samples collected from the most upstream site (Clarksdale), the concentration of organic nitrogen was generally higher than the concentrations of

  5. "small problems, Big Trouble": An Art and Science Collaborative Exhibition Reflecting Seemingly small problems Leading to Big Threats

    NASA Astrophysics Data System (ADS)

    Waller, J. L.; Brey, J. A.

    2014-12-01

    disasters continues to inspire new chapters in their "Layers: Places in Peril" exhibit! A slide show includes images of paintings for "small problems, Big Trouble". Brey and Waller will lead a discussion on their process of incorporating broader collaboration with geoscientists and others in an educational art exhibition.

  6. Big Data, Big Problems: Incorporating Mission, Values, and Culture in Provider Affiliations.

    PubMed

    Shaha, Steven H; Sayeed, Zain; Anoushiravani, Afshin A; El-Othmani, Mouhanad M; Saleh, Khaled J

    2016-10-01

    This article explores how integration of data from clinical registries and electronic health records produces a quality impact within orthopedic practices. Data are differentiated from information, and several types of data that are collected and used in orthopedic outcome measurement are defined. Furthermore, the concept of comparative effectiveness and its impact on orthopedic clinical research are assessed. This article places emphasis on how the concept of big data produces health care challenges balanced with benefits that may be faced by patients and orthopedic surgeons. Finally, essential characteristics of an electronic health record that interlinks musculoskeletal care and big data initiatives are reviewed.

  7. Big Data, Big Problems: Incorporating Mission, Values, and Culture in Provider Affiliations.

    PubMed

    Shaha, Steven H; Sayeed, Zain; Anoushiravani, Afshin A; El-Othmani, Mouhanad M; Saleh, Khaled J

    2016-10-01

    This article explores how integration of data from clinical registries and electronic health records produces a quality impact within orthopedic practices. Data are differentiated from information, and several types of data that are collected and used in orthopedic outcome measurement are defined. Furthermore, the concept of comparative effectiveness and its impact on orthopedic clinical research are assessed. This article places emphasis on how the concept of big data produces health care challenges balanced with benefits that may be faced by patients and orthopedic surgeons. Finally, essential characteristics of an electronic health record that interlinks musculoskeletal care and big data initiatives are reviewed. PMID:27637659

  8. 10 Aspects of the Big Five in the Personality Inventory for DSM-5

    PubMed Central

    DeYoung, Colin. G.; Carey, Bridget E.; Krueger, Robert F.; Ross, Scott R.

    2015-01-01

    DSM-5 includes a dimensional model of personality pathology, operationalized in the Personality Inventory for DSM-5 (PID-5), with 25 facets grouped into five higher-order factors resembling the Big Five personality dimensions. The present study tested how well these 25 facets could be integrated with the 10-factor structure of traits within the Big Five that is operationalized by the Big Five Aspect Scales (BFAS). In two healthy adult samples, 10-factor solutions largely confirmed our hypothesis that each of the 10 BFAS scales would be the highest loading BFAS scale on one and only one factor. Varying numbers of PID-5 scales were additional markers of each factor, and the overall factor structure in the first sample was well replicated in the second. Our results allow Cybernetic Big Five Theory (CB5T) to be brought to bear on manifestations of personality disorder, because CB5T offers mechanistic explanations of the 10 factors measured by the BFAS. Future research, therefore, may begin to test hypotheses derived from CB5T regarding the mechanisms that are dysfunctional in specific personality disorders. PMID:27032017
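
    A hedged sketch of the kind of joint factor analysis reported above is shown below: a 10-factor solution over a combined block of scale scores. The data are random placeholders, not BFAS or PID-5 responses, and the scikit-learn FactorAnalysis call with varimax rotation is an assumed, simplified stand-in for the authors' psychometric procedure.

        # Ten-factor solution over a combined block of (synthetic) scale scores.
        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(11)
        n_participants, n_scales = 300, 35     # e.g. 10 BFAS + 25 PID-5 scale scores
        scores = rng.normal(size=(n_participants, n_scales))

        fa = FactorAnalysis(n_components=10, rotation="varimax", random_state=0)
        fa.fit(scores)
        loadings = fa.components_.T            # scales x factors loading matrix
        print(loadings.shape)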

  9. The challenge of big data in public health: an opportunity for visual analytics.

    PubMed

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.

  10. Ten aspects of the Big Five in the Personality Inventory for DSM-5.

    PubMed

    DeYoung, Colin G; Carey, Bridget E; Krueger, Robert F; Ross, Scott R

    2016-04-01

    Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) includes a dimensional model of personality pathology, operationalized in the Personality Inventory for DSM-5 (PID-5), with 25 facets grouped into 5 higher order factors resembling the Big Five personality dimensions. The present study tested how well these 25 facets could be integrated with the 10-factor structure of traits within the Big Five that is operationalized by the Big Five Aspect Scales (BFAS). In 2 healthy adult samples, 10-factor solutions largely confirmed our hypothesis that each of the 10 BFAS would be the highest loading BFAS on 1 and only 1 factor. Varying numbers of PID-5 scales were additional markers of each factor, and the overall factor structure in the first sample was well replicated in the second. Our results allow Cybernetic Big Five Theory (CB5T) to be brought to bear on manifestations of personality disorder, because CB5T offers mechanistic explanations of the 10 factors measured by the BFAS. Future research, therefore, may begin to test hypotheses derived from CB5T regarding the mechanisms that are dysfunctional in specific personality disorders.

  11. The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics

    PubMed Central

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376

  12. Ten aspects of the Big Five in the Personality Inventory for DSM-5.

    PubMed

    DeYoung, Colin G; Carey, Bridget E; Krueger, Robert F; Ross, Scott R

    2016-04-01

    Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) includes a dimensional model of personality pathology, operationalized in the Personality Inventory for DSM-5 (PID-5), with 25 facets grouped into 5 higher order factors resembling the Big Five personality dimensions. The present study tested how well these 25 facets could be integrated with the 10-factor structure of traits within the Big Five that is operationalized by the Big Five Aspect Scales (BFAS). In 2 healthy adult samples, 10-factor solutions largely confirmed our hypothesis that each of the 10 BFAS would be the highest loading BFAS on 1 and only 1 factor. Varying numbers of PID-5 scales were additional markers of each factor, and the overall factor structure in the first sample was well replicated in the second. Our results allow Cybernetic Big Five Theory (CB5T) to be brought to bear on manifestations of personality disorder, because CB5T offers mechanistic explanations of the 10 factors measured by the BFAS. Future research, therefore, may begin to test hypotheses derived from CB5T regarding the mechanisms that are dysfunctional in specific personality disorders. PMID:27032017

  13. Data management by using R: big data clinical research series.

    PubMed

    Zhang, Zhongheng

    2015-11-01

    Electronic medical record (EMR) systems have been widely adopted in clinical practice. Unlike traditional handwritten records, the EMR makes big data clinical research feasible. The most important feature of big data research is its real-world setting, and such research can provide information on virtually every aspect of healthcare. However, big data research requires data management skills that are rarely covered in medical education, which greatly hinders doctors from testing clinical hypotheses with EMR data. To address this gap, a series of articles introducing data management techniques is presented to guide clinicians into big data clinical research. This educational article first introduces basic knowledge of the R language, followed by data management skills for creating, recoding, and renaming variables. These basic skills are used in virtually every big data research project.
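
    The data-management steps named above (creating, recoding, and renaming variables) are taught in R in the article itself; no code is reproduced in this record. As a hedged illustration only, the sketch below shows the same three operations in Python with pandas on a hypothetical EMR extract; the column names and BMI cut-offs are assumptions, not the article's.

        # Illustrative sketch: create, recode, and rename variables on a hypothetical EMR extract.
        import pandas as pd

        emr = pd.DataFrame({
            "pat_id": [1, 2, 3],
            "weight_kg": [82.0, 65.5, 101.3],
            "height_m": [1.80, 1.62, 1.75],
            "sex": ["M", "F", "M"],
        })

        # 1. Create a new variable from existing ones.
        emr["bmi"] = emr["weight_kg"] / emr["height_m"] ** 2

        # 2. Recode variables (relabel a category and derive a binned category).
        emr["sex"] = emr["sex"].map({"M": "male", "F": "female"})
        emr["bmi_cat"] = pd.cut(emr["bmi"], bins=[0, 18.5, 25, 30, 100],
                                labels=["under", "normal", "over", "obese"])

        # 3. Rename variables.
        emr = emr.rename(columns={"pat_id": "patient_id", "weight_kg": "weight"})

        print(emr)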

  14. Data management by using R: big data clinical research series

    PubMed Central

    2015-01-01

    Electronic medical record (EMR) systems have been widely adopted in clinical practice. Unlike traditional handwritten records, the EMR makes big data clinical research feasible. The most important feature of big data research is its real-world setting, and such research can provide information on virtually every aspect of healthcare. However, big data research requires data management skills that are rarely covered in medical education, which greatly hinders doctors from testing clinical hypotheses with EMR data. To address this gap, a series of articles introducing data management techniques is presented to guide clinicians into big data clinical research. This educational article first introduces basic knowledge of the R language, followed by data management skills for creating, recoding, and renaming variables. These basic skills are used in virtually every big data research project. PMID:26697463

  15. Parallel and Scalable Big Data Analysis in the Earth Sciences with JuML

    NASA Astrophysics Data System (ADS)

    Goetz, M.

    2015-12-01

    Recent developments involving a significantly increasing number of sensors with better resolutions, across a wide variety of earth observation projects, continuously contribute to the availability of 'big data' in the earth sciences. Not only do the volume, velocity, and variety of the datasets pose increasing challenges for their analysis, but the complexity of the datasets (e.g. the high number of dimensions in hyper-spectral images) also requires algorithms that are able to scale. This contribution provides insights into the Juelich Machine Learning Library (JuML) and its contents, which have been actively used in several scientific use cases in the earth sciences. We discuss and categorize challenges related to 'big data' analysis and outline parallel algorithmic solutions driven by those use cases.
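
    This record does not reproduce JuML's API, so the sketch below does not attempt to; it is an illustrative Python/scikit-learn example of the kind of scalable analysis such a library targets, clustering a synthetic hyper-spectral cube with mini-batch k-means. The image size, band count, and number of clusters are all assumptions.

        # Illustrative sketch (not JuML's API): mini-batch k-means as one example of a
        # clustering algorithm that scales to high-dimensional earth-observation data.
        import numpy as np
        from sklearn.cluster import MiniBatchKMeans

        rng = np.random.default_rng(42)
        rows, cols, bands = 256, 256, 200            # synthetic "hyper-spectral" cube (assumed size)
        cube = rng.random((rows, cols, bands)).astype(np.float32)

        pixels = cube.reshape(-1, bands)             # one sample per pixel
        km = MiniBatchKMeans(n_clusters=8, batch_size=4096, random_state=0)
        labels = km.fit_predict(pixels).reshape(rows, cols)

        print("cluster sizes:", np.bincount(labels.ravel()))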

  16. Feature Extraction in Sequential Multimedia Images: with Applications in Satellite Images and On-line Videos

    NASA Astrophysics Data System (ADS)

    Liang, Yu-Li

    Multimedia data is increasingly important in scientific discovery and in people's daily lives. The content of massive multimedia collections is often diverse and noisy, and motion between frames is sometimes crucial in analyzing those data. Still images and videos are the most commonly used formats. Images are compact in size but do not contain motion information; videos record motion but are sometimes too big to be analyzed. Sequential images, which are sets of continuous images with a low frame rate, stand out because they are smaller than videos and still retain motion information. This thesis investigates features in different types of noisy sequential images and proposes solutions that intelligently combine multiple features to retrieve visual information from on-line videos and cloudy satellite images. The first task is detecting supraglacial lakes on the ice sheet in sequential satellite images. The dynamics of supraglacial lakes on the Greenland ice sheet deeply affect glacier movement, which is directly related to sea level rise and global environmental change. Detecting lakes on ice suffers from diverse image qualities and unexpected clouds. A new method is proposed to efficiently extract prominent lake candidates with irregular shapes and heterogeneous backgrounds, even in cloudy images. The proposed system fully automates the procedure and tracks lakes with high accuracy. We further cooperated with geoscientists to examine the tracked lakes, leading to new scientific findings. The second task is detecting obscene content in on-line video chat services, such as Chatroulette, that randomly match pairs of users in video chat sessions. A big problem encountered in such systems is the presence of flashers and obscene content. Because of the variety of obscene content and the unstable quality of videos captured by home web cameras, detecting misbehaving users is a highly challenging task. We propose SafeVchat, which is the first solution that achieves satisfactory
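
    The thesis's actual lake-detection method combines multiple features and is not detailed in this record. As a generic, hedged illustration of the first step such a pipeline might take, the Python sketch below thresholds dark (water-like) pixels in a single synthetic band and labels connected regions as lake candidates; the threshold and minimum-size values are arbitrary assumptions.

        # Illustrative sketch only: threshold a single band and keep large connected
        # dark regions as lake candidates. Not the thesis's multi-feature method.
        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(1)
        band = rng.random((400, 400))                  # stand-in for a reflectance band
        band[150:180, 200:260] = 0.05                  # a synthetic dark "lake"

        water_mask = band < 0.15                       # dark pixels as candidate water
        labels, n_regions = ndimage.label(water_mask)  # connected-component labeling
        sizes = ndimage.sum(water_mask, labels, index=range(1, n_regions + 1))

        min_pixels = 100                               # drop speckle-sized candidates
        lake_ids = [i + 1 for i, s in enumerate(sizes) if s >= min_pixels]
        print(f"{len(lake_ids)} lake candidate(s) kept out of {n_regions} regions")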

  17. Big Data Analytics for Prostate Radiotherapy

    PubMed Central

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose–volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the “RadoncSpace”) in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches onto a cohort of hypofractionated prostate cancer patients taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches. PMID:27379211
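
    As a hedged illustration of the cross-validated outcome modeling described above, the Python sketch below fits a logistic regression to synthetic "dose-volume plus clinical" features and scores it with 5-fold cross-validation. The features, cohort size, and labels are invented stand-ins for illustration, not the hypofractionated prostate cohort analyzed in the paper.

        # Illustrative sketch only: cross-validated outcome model on synthetic features.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(7)
        n_patients = 200
        X = np.column_stack([
            rng.normal(70, 5, n_patients),     # e.g., mean dose to an organ at risk (Gy), assumed
            rng.normal(30, 10, n_patients),    # e.g., volume receiving a high dose (%), assumed
            rng.integers(55, 85, n_patients),  # e.g., age, assumed
        ])
        # Synthetic toxicity labels loosely tied to the dose features (illustrative only).
        y = (X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 5, n_patients) > 82).astype(int)

        model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
        scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
        print("5-fold AUC: %.2f +/- %.2f" % (scores.mean(), scores.std()))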

  18. Big Impacts and Transient Oceans on Titan

    NASA Technical Reports Server (NTRS)

    Zahnle, K. J.; Korycansky, D. G.; Nixon, C. A.

    2014-01-01

    We have studied the thermal consequences of very big impacts on Titan [1]. Titan's thick atmosphere and volatile-rich surface cause it to respond to big impacts in a somewhat Earth-like manner. Here we construct a simple globally-averaged model that tracks the flow of energy through the environment in the weeks, years, and millennia after a big comet strikes Titan. The model Titan is endowed with 1.4 bars of N2 and 0.07 bars of CH4, methane lakes, a water ice crust, and enough methane underground to saturate the regolith to the surface. We assume that half of the impact energy is immediately available to the atmosphere and surface while the other half is buried at the site of the crater and is unavailable on time scales of interest. The atmosphere and surface are treated as isothermal. We make the simplifying assumptions that the crust is everywhere as methane-saturated as it was at the Huygens landing site, that the concentration of methane in the regolith is the same as it is at the surface, and that the crust is made of water ice. Heat flow into and out of the crust is approximated by step functions. If the impact is great enough, ice melts. The meltwater oceans cool to the atmosphere conductively through an ice lid while, at their base, melting their way into the interior, driven down in part by Rayleigh-Taylor instabilities between the dense water and the warm ice. Topography, CO2, and hydrocarbons other than methane are ignored. Methane and ethane clathrate hydrates are discussed quantitatively but not fully incorporated into the model.
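
    A back-of-envelope reconstruction of the energy-partition step can make the model concrete. In the Python sketch below, only the 50% energy partition and the ~1.4 bar N2 atmosphere follow the abstract; the impact energy, Titan constants, and ice properties are rough illustrative assumptions, not values from the paper.

        # Back-of-envelope sketch: half of a hypothetical impact energy warms an isothermal
        # N2 atmosphere or melts surface ice. All numeric values below are approximate or assumed.
        import math

        R_TITAN = 2.575e6              # m, Titan radius (approximate)
        G_TITAN = 1.35                 # m/s^2, surface gravity (approximate)
        P_SURFACE = 1.4e5              # Pa, N2 surface pressure (per the abstract)
        CP_N2 = 1.04e3                 # J/(kg K), N2 specific heat (approximate)
        L_FUSION_ICE = 3.34e5          # J/kg, latent heat of melting water ice
        RHO_ICE = 917.0                # kg/m^3

        E_IMPACT = 1.0e24              # J, hypothetical "very big" comet impact
        E_AVAILABLE = 0.5 * E_IMPACT   # half available to atmosphere and surface (per the model)

        area = 4.0 * math.pi * R_TITAN**2
        m_atm = (P_SURFACE / G_TITAN) * area      # total atmospheric mass from column mass * area

        # If all available energy warmed the atmosphere, the global temperature rise would be:
        dT = E_AVAILABLE / (m_atm * CP_N2)
        # If instead it all melted ice, the equivalent global melt layer would be:
        melt_depth = E_AVAILABLE / (L_FUSION_ICE * RHO_ICE * area)

        print(f"atmospheric warming if fully absorbed: ~{dT:.0f} K")
        print(f"equivalent global ice melt depth:      ~{melt_depth:.0f} m")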

  19. Big Data Analytics for Prostate Radiotherapy.

    PubMed

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose-volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the "RadoncSpace") in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches onto a cohort of hypofractionated prostate cancer patients taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches. PMID:27379211

  20. Analysis of Landsat TM data for active tectonics: the case of the Big Chino Fault, Arizona

    NASA Astrophysics Data System (ADS)

    Salvi, Stefano

    1994-12-01

    The Big Chino Valley is a 50 km-long tectonic depression of the Basin and Range province of the Southwestern United States. It is bordered on the NE side by an important normal fault, the Big Chino Fault. The activity of the latter has been hypothesised on the basis of a 20 m-high fault scarp and of local geomorphological studies. Moreover, a magnitude 4.9 earthquake that occurred in southern Arizona in 1976 has been attributed to this fault. The climate in the Big Chino Valley is semi-arid, with average rainfall of about 400 mm per year; a very sparse vegetation cover is present, making the area well suited to geo-lithologic applications of remote sensing data. The analysis of the TM spectral bands shows, in the short-wave infrared, a clear variation in the reflected radiance across the fault scarp. The available radar (SLAR) images also show a marked difference in response between the two sides of the fault. An explanation of these phenomena has been found in the interaction between geomorphic evolution, pedological composition, and the periodic occurrence of coseismic deformation along the fault. Other effects of the latter process have been investigated on colour D-stretched images, whose interpretation allowed two paleoseismic events of the Big Chino Fault to be detected. This work demonstrates that important information on the seismological parameters of active faults in arid and semi-arid climates can be extracted from the analysis of satellite spectral data in the visible and near-infrared.
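
    The "D-stretched" images mentioned above are commonly understood to be decorrelation-stretched images. Assuming that reading, the Python sketch below shows the standard decorrelation stretch on a synthetic three-band image: rotate the bands into principal components, equalize component variances, and rotate back, which exaggerates subtle inter-band colour differences. This illustrates the technique only and is not the paper's processing chain.

        # Illustrative decorrelation stretch on a synthetic 3-band image.
        import numpy as np

        rng = np.random.default_rng(3)
        rows, cols, bands = 128, 128, 3
        img = rng.random((rows, cols, bands)) * 255.0     # stand-in for three TM bands

        flat = img.reshape(-1, bands)
        mean = flat.mean(axis=0)
        centered = flat - mean

        cov = np.cov(centered, rowvar=False)
        _, eigvecs = np.linalg.eigh(cov)                  # principal components of band space

        pcs = centered @ eigvecs
        pcs /= pcs.std(axis=0)                            # equalize variance of each component
        stretched = (pcs @ eigvecs.T) * flat.std(axis=0) + mean  # back to band space, rescaled

        dstretch = stretched.reshape(rows, cols, bands)
        print("output range per band:", dstretch.min(axis=(0, 1)), dstretch.max(axis=(0, 1)))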