Science.gov

Sample records for additional big images

  1. Cincinnati Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Love, Lonnie J.

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing (BAAM), which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of printable parts by over 10X, and reduces cost by over 100X. ORNL worked with CI to transition the BAAM technology from a proof-of-principle demonstration (TRL 2-3) to a prototype product stage (TRL 7-8).

  2. BigView Image Viewing on Tiled Displays

    NASA Technical Reports Server (NTRS)

    Sandstrom, Timothy

    2007-01-01

    BigView allows for interactive panning and zooming of images of arbitrary size on desktop PCs running Linux. Additionally, it can work in a multi-screen environment where multiple PCs cooperate to view a single, large image. Using this software, one can explore, on relatively modest machines, images such as the Mars Orbiter Camera mosaic (92,160 x 33,280 pixels). The images must first be converted into a paged format, in which the image is stored in 256 x 256 pixel pages to allow rapid movement of pixels into texture memory. The format contains an image pyramid: a set of scaled versions of the original image. Each scaled image is 1/2 the size of the previous one, starting with the original down to the smallest, which fits into a single 256 x 256 page.
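
    The paged pyramid described above is easy to reason about quantitatively. The sketch below (Python; the function name is hypothetical, as BigView's converter is not described in this record) computes how many power-of-two levels and 256 x 256 pages such a format needs, using the mosaic cited in the abstract.

      import math

      PAGE = 256  # page (tile) edge length in pixels

      def tiling(width, height, page=PAGE):
          """Pages per pyramid level; each level halves the previous one
          until the whole image fits into a single page."""
          levels = []
          w, h = width, height
          while True:
              cols, rows = math.ceil(w / page), math.ceil(h / page)
              levels.append((w, h, cols * rows))
              if w <= page and h <= page:
                  break
              w, h = max(1, w // 2), max(1, h // 2)
          return levels

      # Mars Orbiter Camera mosaic cited in the abstract:
      for w, h, pages in tiling(92160, 33280):
          print(f"{w:6d} x {h:6d} -> {pages:6d} pages")

    For this mosaic the loop yields ten levels, from 46,800 pages at full resolution down to a single page, which is what makes panning and zooming interactive: only the visible pages of the appropriate level are streamed into texture memory.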

  3. BIG FROG WILDERNESS STUDY AREA AND ADDITIONS, TENNESSEE AND GEORGIA.

    USGS Publications Warehouse

    Slack, John F.; Gazdik, Gertrude C.

    1984-01-01

    A mineral-resource survey was made of the Big Frog Wilderness Study Area and additions, Tennessee-Georgia. Geochemical sampling found traces of gold, zinc, copper, and arsenic in rocks, stream sediments, and panned concentrates, but not in sufficient quantities to indicate the presence of deposits of these metals. The results of the survey indicate that there is little promise for the occurrence of metallic mineral deposits within the study area. The only apparent resources are nonmetallic commodities including rock suitable for construction materials, and small amounts of sand and gravel; however, these commodities are found in abundance outside the study area. A potential may exist for oil and natural gas at great depths, but this cannot be evaluated by the present study.

  4. AirMSPI PODEX Big Sur Ellipsoid Images

    Atmospheric Science Data Center

    2013-12-11

    AirMSPI browse images from the PODEX 2013 campaign: Big Sur target, 02/03/2013, ellipsoid-projected. For more information, see the Data Product Specifications (DPS).

  5. Building recognition based on big template in FLIR images

    NASA Astrophysics Data System (ADS)

    Zhang, Jiangwei; Niu, Zhaodong; Liu, Songlin; Liu, Fang; Chen, Zengping

    2014-10-01

    To enhance the robustness of building recognition in forward-looking infrared (FLIR) images, an effective method based on a big template is proposed. A big template is a set of small templates that together contain a large amount of surface-feature information; its information content cannot be matched by any single small template, which gives it advantages in overcoming noise interference and incompleteness and in avoiding erroneous judgments. First, a digital surface model (DSM) is used to build the big template, a distance transformation is applied to it, and the region of interest (ROI) is extracted by template matching between the big template and the contour of the real-time image. Second, corners are detected in the big template, a response function is defined from the gradients and phases of the corners and their neighborhoods, and a similarity measure based on the response function and an overlap ratio is designed, with which the template and the real-time image are matched accurately. Finally, a large set of image data is used to test the performance of the algorithm, and a criterion for selecting optimal parameters is designed. Test results indicate that the target matching ratio of the algorithm reaches 95%, effectively solving the problem of building recognition under noise disturbance, incompleteness, or a target that is partially out of view.
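
    The ROI extraction step (distance transform plus contour matching) follows the classic chamfer matching pattern. The following is a minimal sketch under that assumption, using OpenCV; the file names and Canny thresholds are hypothetical, and the exhaustive sliding loop is kept simple for clarity rather than speed.

      import cv2
      import numpy as np

      def chamfer_score_map(image, template_edges):
          """Slide a binary edge template over the image's edge distance
          transform; low scores mark likely matches."""
          edges = cv2.Canny(image, 50, 150)
          # Distance from every pixel to the nearest detected edge.
          dt = cv2.distanceTransform(255 - edges, cv2.DIST_L2, 3)
          ys, xs = np.nonzero(template_edges)   # template edge pixels
          th, tw = template_edges.shape
          out = np.full((dt.shape[0] - th + 1, dt.shape[1] - tw + 1), np.inf)
          for r in range(out.shape[0]):
              for c in range(out.shape[1]):
                  out[r, c] = dt[ys + r, xs + c].mean()
          return out

      frame = cv2.imread("flir_frame.png", cv2.IMREAD_GRAYSCALE)         # hypothetical
      template = cv2.imread("template_edges.png", cv2.IMREAD_GRAYSCALE) > 0
      scores = chamfer_score_map(frame, template)
      row, col = np.unravel_index(np.argmin(scores), scores.shape)       # ROI corner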

  6. The Big Addition Book. Activity Book [for Grades K-3].

    ERIC Educational Resources Information Center

    Daniel, Becky; Daniel, Charlie

    This book contains a variety of worksheets to teach and reinforce addition skills. Secret codes, magic squares, pyramids, coloring activities, and dot-to-dot challenges are included to motivate students. A two-page test of basic addition facts is provided, with the suggestion that the teacher time the test and give it repeatedly to assess…

  7. Neural Computations for Biosonar Imaging in the Big Brown Bat

    NASA Astrophysics Data System (ADS)

    Saillant, Prestor Augusto

    1995-11-01

    The study of the intimate relationship between space and time has taken many forms, ranging from the Theory of Relativity down to the problem of avoiding traffic jams. However, nowhere has this relationship been more fully developed and exploited than in dolphins and bats, which have the ability to utilize biosonar. This thesis describes research on the behavioral and computational basis of echolocation carried out in order to explore the neural mechanisms which may account for the space-time constructs which are of psychological importance to the big brown bat. The SCAT (Spectrogram Correlation and Transformation) computational model was developed to provide a framework for understanding the computational requirements of FM echolocation as determined from psychophysical experiments (i.e., high resolution imaging) and neurobiological constraints (Saillant et al., 1993). The second part of the thesis consisted of developing a new behavioral paradigm for simultaneously studying the acoustic behavior and flight behavior of big brown bats in pursuit of stationary or moving targets. In the third part of the thesis, a complete acoustic "artificial bat" was constructed, making use of the SCAT process. The development of the artificial bat allowed us to begin experimentation with real-world echoes from various targets, in order to gain a better appreciation for the additional complexities and sources of information encountered by bats in flight. Finally, the continued development of the SCAT model has allowed a deeper understanding of the phenomenon of "time expansion" and of the phenomenon of phase sensitivity in the ultrasonic range. Time expansion, first predicted through the use of the SCAT model, and later found in auditory local evoked potential recordings, opens up a new realm of information processing and representation in the brain which has not yet been considered. It seems possible, from the work in the auditory system, that time expansion may provide a novel

  8. Utility of Big Area Additive Manufacturing (BAAM) For The Rapid Manufacture of Customized Electric Vehicles

    SciTech Connect

    Love, Lonnie J.

    2015-08-01

    This Oak Ridge National Laboratory (ORNL) Manufacturing Demonstration Facility (MDF) technical collaboration project was conducted in two phases as a CRADA with Local Motors Inc. Phase 1 was previously reported as Advanced Manufacturing of Complex Cyber Mechanical Devices through Community Engagement and Micro-manufacturing and demonstrated the integration of components onto a prototype body part for a vehicle. Phase 2 was reported as Utility of Big Area Additive Manufacturing (BAAM) for the Rapid Manufacture of Customized Electric Vehicles and demonstrated the high-profile live printing of an all-electric vehicle using ORNL's Big Area Additive Manufacturing (BAAM) technology. This demonstration generated considerable national attention and successfully demonstrated the capabilities of the BAAM system as developed by ORNL and Cincinnati, Inc., and the feasibility of additive manufacturing of a full-scale electric vehicle as envisioned by the CRADA partner, Local Motors, Inc.

  9. UTILITY OF BIG AREA ADDITIVE MANUFACTURING (BAAM) FOR THE RAPID MANUFACTURE OF CUSTOMIZED ELECTRIC VEHICLES

    SciTech Connect

    Love, Lonnie J

    2015-08-01

    This Oak Ridge National Laboratory (ORNL) Manufacturing Demonstration Facility (MDF) technical collaboration project was conducted in two phases as a CRADA with Local Motors Inc. Phase 1 was previously reported as Advanced Manufacturing of Complex Cyber Mechanical Devices through Community Engagement and Micro-manufacturing and demonstrated the integration of components onto a prototype body part for a vehicle. Phase 2 was reported as Utility of Big Area Additive Manufacturing (BAAM) for the Rapid Manufacture of Customized Electric Vehicles and demonstrated the high-profile live printing of an all-electric vehicle using ORNL's Big Area Additive Manufacturing (BAAM) technology. This demonstration generated considerable national attention and successfully demonstrated the capabilities of the BAAM system as developed by ORNL and Cincinnati, Inc., and the feasibility of additive manufacturing of a full-scale electric vehicle as envisioned by the CRADA partner, Local Motors, Inc.

  10. [Utility of noise addition image made by using water phantom and image addition and subtraction software].

    PubMed

    Watanabe, Ryo; Ogawa, Masato; Mituzono, Hiroki; Aoki, Takahiro; Hayano, Mizuho; Watanabe, Yuka

    2010-08-20

    In optimizing exposures, it is very important to evaluate the impact of image noise on image quality. To do so, one must determine how much image noise will make the disease of interest invisible, but it is generally very difficult to acquire images of different quality levels in a clinical examination. A method has therefore been reported that creates a noise addition image by adding image noise to the raw data; however, that approach requires a special system, so it is difficult to implement in many facilities. We have devised a method to easily create a noise addition image using a water phantom and the image addition and subtraction software that accompanies the device. To create a noise addition image, we first make a noise image by subtracting two water phantom images with different SD. A noise addition image is then created by adding the noise image to the original image. With this method, simulation images with graded SD can be created from the original, and because the noise frequency content of the created noise addition image is the same as that of a real image, the relationship between image quality and SD in clinical images can be evaluated. Since a noise addition image can be created easily with only the image addition and subtraction software and a water phantom, this method of LDSI creation on image data can be implemented in many facilities. PMID:20953102
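
    The two arithmetic steps of the method are simple enough to state exactly. A minimal numpy sketch, assuming two registered acquisitions of the same uniform water phantom, is shown below; the sqrt(2) factor is the standard correction for the noise variance doubling when two independent images are subtracted.

      import numpy as np

      def make_noise_image(phantom_a, phantom_b):
          """Subtracting two acquisitions of the same uniform phantom
          cancels the structure and leaves pure noise; dividing by
          sqrt(2) restores the single-image noise SD."""
          return (phantom_a.astype(np.float64) - phantom_b) / np.sqrt(2.0)

      def make_noise_addition_image(original, noise_image, extra_sd):
          """Scale the noise image to the desired additional SD and add
          it to the clinical image, simulating a noisier acquisition."""
          scale = extra_sd / noise_image.std()
          return original + scale * noise_image

    Because the added noise comes from real acquisitions, it retains the scanner's actual noise frequency content, which is the property the abstract emphasizes.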

  11. Material Development for Tooling Applications Using Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Drye, Tom; Franc, Alan

    2015-03-01

    Techmer Engineered Solutions (TES) is working with Oak Ridge National Laboratory (ORNL) to develop materials and evaluate their use for ORNL's recently developed Big Area Additive Manufacturing (BAAM) system for tooling applications. The first phase of the project established the performance of some commercially available polymer compositions deposited with the BAAM system. Carbon fiber reinforced ABS demonstrated a tensile strength of nearly 10 ksi, which is sufficient for a number of low temperature tooling applications.

  12. AirMSPI PODEX BigSur Terrain Images

    Atmospheric Science Data Center

    2013-12-13

    AirMSPI browse images from the PODEX 2013 campaign: Big Sur target (Big Sur, California), 02/03/2013, terrain-projected. For more information, see the Data Product Specifications (DPS).

  13. First Results of the Near Real-Time Imaging Reconstruction System at Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Yang, G.; Denker, C.; Wang, H.

    2003-05-01

    The Near Real-Time Imaging Reconstruction system (RTIR) at Big Bear Solar Observatory (BBSO) is designed to obtain high spatial resolution solar images at a cadence of 1 minute by utilizing the power of parallel processing. With this system, we can compute near diffraction-limited images without saving the huge amounts of data involved in the speckle masking reconstruction algorithm. It enables us to monitor active regions and respond quickly to solar activity. In this poster we present the first results from our new 32-CPU Beowulf cluster system. The images are 1024 x 1024 pixels and the field of view (FOV) is 80'' x 80''. Our target is an active region with a complex magnetic configuration. We focus on pores and small spots in the active region with the goal of better understanding the formation of penumbral structure. In addition, we expect to study the evolution of active regions during solar flares.

  14. Research on image matching method of big data image of three-dimensional reconstruction

    NASA Astrophysics Data System (ADS)

    Zhang, Chunsen; Qiu, Zhenguo; Zhu, Shihuan; Wang, Xiqi; Xu, Xiaolei; Zhong, Sidong

    2015-12-01

    Image matching is the main step of three-dimensional reconstruction. With the development of computer processing technology, finding the images to be matched within large image datasets acquired in different formats, at different scales, and at different locations places new demands on image matching. To support three-dimensional reconstruction from big data images, this paper puts forward a new, effective matching method based on a visual bag-of-words model. The main steps are building the bag-of-words model and image matching. First, SIFT feature points are extracted from the images in the database and clustered to generate the bag-of-words model. We then build inverted files on the bag of words; the inverted files list, for each visual word, all images containing it. Matching is performed only among images indexed under the same words, which improves its efficiency. Finally, the three-dimensional model is built from the matched images. Experimental results indicate that this method improves matching efficiency and is suitable for the requirements of large-data reconstruction.
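
    A minimal sketch of the inverted-file idea is given below (Python, OpenCV SIFT plus scikit-learn k-means; the vocabulary size is an assumption, as the abstract does not state one). Only images that share visual words with the query are considered for detailed matching, which is where the efficiency gain comes from.

      import cv2
      import numpy as np
      from sklearn.cluster import KMeans

      N_WORDS = 500  # vocabulary size (assumed; not given in the paper)

      def build_index(images):
          """Extract SIFT descriptors, cluster them into visual words,
          and build an inverted file: word -> images containing it."""
          sift = cv2.SIFT_create()
          descs = [sift.detectAndCompute(im, None)[1] for im in images]
          vocab = KMeans(n_clusters=N_WORDS, n_init=4).fit(
              np.vstack([d for d in descs if d is not None]))
          index = {w: set() for w in range(N_WORDS)}
          for img_id, d in enumerate(descs):
              if d is not None:
                  for w in set(vocab.predict(d)):
                      index[w].add(img_id)
          return sift, vocab, index

      def candidates(sift, vocab, index, query):
          """Rank database images by the number of visual words they
          share with the query; only these need detailed matching."""
          _, d = sift.detectAndCompute(query, None)
          votes = {}
          for w in vocab.predict(d):
              for img_id in index[w]:
                  votes[img_id] = votes.get(img_id, 0) + 1
          return sorted(votes, key=votes.get, reverse=True)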

  15. a Hadoop-Based Distributed Framework for Efficient Managing and Processing Big Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.

    2015-07-01

    Various sensors on airborne and satellite platforms are producing large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and other uses. However, it is challenging to efficiently store, query, and process such big data because of its data- and computing-intensive nature. In this paper, a Hadoop-based framework is proposed to manage and process big remote sensing data in a distributed and parallel manner. In particular, remote sensing data can be fetched directly from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo Toolbox, a ready-to-use tool for large image processing, is integrated into MapReduce to provide a rich set of image processing operations. With the integration of HDFS, the Orfeo Toolbox, and MapReduce, these remote sensing images can be processed directly, in parallel, in a scalable computing environment. The experimental results show that the proposed framework can efficiently manage and process such big remote sensing data.
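
    As a rough illustration of the integration, a Hadoop Streaming mapper can hand each scene to an Orfeo Toolbox command-line application. The sketch below is an assumption about how such a mapper might look, not the paper's actual code; the HDFS paths and the NDVI band expression are illustrative only.

      #!/usr/bin/env python
      # Hadoop Streaming mapper: each stdin line is the HDFS path of one
      # remote sensing scene; fetch it, process it locally with an Orfeo
      # Toolbox CLI application, and write the result back to HDFS.
      import os
      import subprocess
      import sys

      for line in sys.stdin:
          hdfs_path = line.strip()
          if not hdfs_path:
              continue
          local_in = os.path.basename(hdfs_path)
          local_out = "ndvi_" + local_in
          subprocess.check_call(["hdfs", "dfs", "-get", hdfs_path, local_in])
          # otbcli_BandMath is a stock OTB application; the expression
          # computes an NDVI from red (b3) and near-infrared (b4) bands.
          subprocess.check_call([
              "otbcli_BandMath", "-il", local_in, "-out", local_out,
              "-exp", "(im1b4-im1b3)/(im1b4+im1b3)"])
          subprocess.check_call(
              ["hdfs", "dfs", "-put", "-f", local_out, "/results/" + local_out])
          print(hdfs_path + "\tok")  # emit key\tvalue for the reducer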

  16. Big Surveys, Big Data Centres

    NASA Astrophysics Data System (ADS)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees, and those surveys are mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage, and therefore tends to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management, and big data places additional, challenging requirements on it. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data, the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation, our successes and failures, and how we are planning in the next decade to create a workable and adaptable solution to support big data science.

  17. The caBIG annotation and image Markup project.

    PubMed

    Channin, David S; Mongkolwat, Pattanasak; Kleper, Vladimir; Sepukar, Kastubh; Rubin, Daniel L

    2010-04-01

    Image annotation and markup are at the core of medical interpretation in both the clinical and the research setting. Digital medical images are managed with the DICOM standard format. While DICOM contains a large amount of metadata about who was imaged and where and how the image was acquired, it says little about the content or meaning of the pixel data. An image annotation is the explanatory or descriptive information about the pixel data of an image that is generated by a human or machine observer. An image markup is the set of graphical symbols placed over the image to depict an annotation. While DICOM is the standard for medical image acquisition, manipulation, transmission, storage, and display, there are no standards for image annotation and markup. Many systems expect annotations to be reported verbally, while markups are stored in graphical overlays or proprietary formats. This makes it difficult to extract and compute with either of them. The goal of the Annotation and Image Markup (AIM) project is to develop a mechanism for modeling, capturing, and serializing image annotation and markup data that can be adopted as a standard by the medical imaging community. The AIM project produces both human- and machine-readable artifacts. This paper describes the AIM information model, schemas, software libraries, and tools so as to prepare researchers and developers for their use of AIM. PMID:19294468

  18. Project BIG (Black Image Growth). Model Cities Schools, Indianapolis Public Schools. Final Report.

    ERIC Educational Resources Information Center

    Sciara, Frank J.

    Project BIG (Black Image Growth) was designed as an attempt to build self-pride in black fourth grade children through the inclusion of black Indiana history in their curriculum. Commercially available materials could not be used since few of the project children possessed the necessary reading skills. An attempt was made to create a variety of…

  19. Diffraction-limited Polarimetry from the Infrared Imaging Magnetograph at Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Cao, Wenda; Jing, Ju; Ma, Jun; Xu, Yan; Wang, Haimin; Goode, Philip R.

    2006-06-01

    The Infrared Imaging Magnetograph (IRIM) system developed by Big Bear Solar Observatory (BBSO) has been put into preliminary operation. It is one of the first imaging spectropolarimeters working at 1565 nm and is used for the observations of the Sun at its opacity minimum, exposing the deepest photospheric layers. The tandem system, which includes a 4.2 nm interference filter, a unique 0.25 nm birefringent Lyot filter, and a Fabry-Pérot etalon, is capable of providing a bandpass as low as 0.01 nm in a telecentric configuration. A fixed quarter-wave plate and a nematic liquid crystal variable retarder are employed for analyzing the circular polarization of the Zeeman components. The longitudinal magnetic field is measured for the highly Zeeman-sensitive Fe I line at 1564.85 nm (Landé factor g=3). The polarimetric data were taken through a field of view of ~145''×145'' and were recorded by a 1024×1024 pixel, 14 bit HgCdTe CMOS focal plane array camera. Benefiting from the correlation tracking system and a newly developed adaptive optics system, the first imaging polarimetric observations at 1565 nm were made at the diffraction limit on 2005 July 1 using BBSO's 65 cm telescope. After comparing the magnetograms from IRIM with those taken by the Michelson Doppler Imager on board SOHO, it was found that all the magnetic features matched very well in both sets of magnetograms. In addition, Stokes V profiles obtained from the Fabry-Pérot etalon scan data provide access to both the true magnetic field strength and the filling factor of the small-scale magnetic flux elements. In this paper, we present the design, fabrication, and calibration of IRIM, as well as the results of the first scientific observations.

  20. Deconvolution of partially compensated solar images from additional wavefront sensing.

    PubMed

    Miura, Noriaki; Oh-Ishi, Akira; Kuwamura, Susumu; Baba, Naoshi; Ueno, Satoru; Nakatani, Yoshikazu; Ichimoto, Kiyoshi

    2016-04-01

    A technique for restoring solar images partially compensated with adaptive optics is developed. An additional wavefront sensor is installed in an adaptive optics system to acquire residual wavefront information simultaneously with a solar image. A point spread function is derived from the wavefront information and used to deconvolve the solar image. Successful image restorations are demonstrated when the estimated point spread functions have relatively high Strehl ratios. PMID:27139647
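
    The two steps the abstract describes map onto a short Fourier-optics sketch: form a PSF from the residual wavefront phase, then deconvolve. The version below uses a Wiener filter as the deconvolution stage for brevity; the paper's own restoration method may differ, and the noise constant k is an assumption.

      import numpy as np

      def psf_from_wavefront(pupil_mask, phase):
          """Fraunhofer model: PSF = |FFT of the complex pupil|^2.
          `phase` is the residual wavefront error in radians."""
          pupil = pupil_mask * np.exp(1j * phase)
          psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2
          return psf / psf.sum()

      def wiener_deconvolve(image, psf, k=1e-3):
          """Deconvolve with a Wiener filter; k is an assumed
          noise-to-signal ratio."""
          H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
          G = np.conj(H) / (np.abs(H) ** 2 + k)
          return np.real(np.fft.ifft2(np.fft.fft2(image) * G))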

  1. Forensic detection of noise addition in digital images

    NASA Astrophysics Data System (ADS)

    Cao, Gang; Zhao, Yao; Ni, Rongrong; Ou, Bo; Wang, Yongbin

    2014-03-01

    We propose a technique to detect the global addition of noise to a digital image. As an anti-forensics tool, noise addition is typically used to disguise the visual traces of image tampering or to remove the statistical artifacts left behind by other operations. As such, blind detection of noise addition has become imperative as well as beneficial for authenticating image content and recovering the image processing history, which is the goal of general forensics techniques. Specifically, special image blocks, including constant and strip blocks, are used to construct the features for identifying noise addition. The influence of noising on the blockwise pixel value distribution is formulated and analyzed formally. A methodology of detectability recognition followed by binary decision is proposed to ensure the applicability and reliability of noise detection. Extensive experimental results demonstrate the efficacy of the proposed noise detector.

  2. Big-data x-ray phase contrast imaging simulation challenges

    NASA Astrophysics Data System (ADS)

    Jimenez, Edward S.; Dagel, Amber L.

    2015-08-01

    This position paper describes a potential implementation of a large-scale simulation tool for grating-based X-ray Phase Contrast Imaging (XPCI) systems, along with the associated implementation challenges. The proposed implementation builds on the approach of Peterzol et al., in which each grating is treated as an object imaged in the field of view. Two main challenges exist. The first is the required sampling and information management in object space, because the micron-scale period of each grating must be propagated over significant distances. The second is maintaining numerical stability of the algorithms for imaging systems relevant to industrial applications. We present preliminary results of a numerical stability study using a simplified algorithm that performs Talbot imaging in a big-data context.

  3. Research on three-dimensional positioning method of big data image under bag of words model guidance

    NASA Astrophysics Data System (ADS)

    Zhang, Chunsen; Wang, Xiqi; Qiu, Zhenguo; Zhu, Shihuan; Xu, Xiaolei; Zhong, Sidong

    2015-12-01

    To retrieve positioning images efficiently and quickly from a large number of diverse images and thereby realize three-dimensional spatial positioning, this article proposes a new method for three-dimensional positioning of big data images guided by a bag-of-words model, based on photogrammetry and computer vision theory. The method consists of two parts: image retrieval and spatial positioning. First, image retrieval is completed through feature extraction, K-means clustering, bag-of-words model building, and related processes, improving the efficiency of image matching. Second, the interior and exterior orientation elements are obtained through image matching, building the projection relationship, and calculating the projection matrix, after which the spatial orientation is realized. The experimental results show that the proposed method retrieves the target image efficiently and achieves spatial orientation accurately, a useful step toward spatial positioning based on big data images.
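
    The "calculating the projection matrix" step is classically done with the Direct Linear Transform. A minimal numpy sketch under that assumption (the paper may use a different estimator) is shown below; it needs at least six 3D-2D correspondences.

      import numpy as np

      def dlt_projection_matrix(pts3d, pts2d):
          """Solve x ~ P X for the 3x4 projection matrix P via SVD:
          each correspondence contributes two homogeneous equations."""
          rows = []
          for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
              rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
              rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
          _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=np.float64))
          P = Vt[-1].reshape(3, 4)   # right null vector, defined up to scale
          return P / np.linalg.norm(P[2, :3])  # normalize so ||p3|| = 1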

  4. Stereoscopic high-speed imaging using additive colors

    NASA Astrophysics Data System (ADS)

    Sankin, Georgy N.; Piech, David; Zhong, Pei

    2012-04-01

    An experimental system for digital stereoscopic imaging with a high-speed color camera is described. Two bright-field image projections of a three-dimensional object are captured utilizing additive-color backlighting (blue and red). The two images are simultaneously combined on a two-dimensional image sensor using a set of dichromatic mirrors, and stored for off-line separation of each projection. This method has been demonstrated in analyzing cavitation bubble dynamics near boundaries. The technique may be useful for flow visualization and in machine vision applications.
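
    Recovering the two projections afterward is essentially a channel split. A minimal numpy sketch, assuming an RGB frame in which the red channel saw the red-backlit view and the blue channel the blue-backlit view, and ignoring spectral cross-talk between channels:

      import numpy as np

      def split_projections(frame_rgb):
          """One color frame -> two bright-field projections: the red
          channel carries the red-lit view, the blue channel the
          blue-lit view (channel cross-talk is ignored here)."""
          view_red = frame_rgb[..., 0].astype(np.float32)
          view_blue = frame_rgb[..., 2].astype(np.float32)
          return view_red, view_blue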

  5. ICORE: Image Co-addition with Optional Resolution Enhancement

    NASA Astrophysics Data System (ADS)

    Masci, Frank

    2013-02-01

    ICORE is a command-line driven co-addition, mosaicking, and resolution enhancement (HiRes) tool for creating science quality products from image data in FITS format and with World Coordinate System information following the FITS-WCS standard. It includes preparatory steps such as image background matching, photometric gain-matching, and pixel-outlier rejection. Co-addition and/or HiRes'ing can be performed in either the inertial WCS or in the rest frame of a moving object. Three interpolation methods are supported: overlap-area weighting, drizzle, and weighting by the detector Point Response Function (PRF). The latter enables the creation of matched-filtered products for optimal point-source detection, but most importantly allows for resolution enhancement using a spatially-dependent deconvolution method. This is a variant of the classic Richardson-Lucy algorithm with the added benefit to simultaneously register and co-add multiple images to optimize signal-to-noise and sampling of the instrumental PSF. It can assume real (or otherwise "flat") image priors, mitigate "ringing" artifacts, and assess the quality of image solutions using statistically-motivated convergence criteria. Uncertainties are also estimated and internally validated for all products. The software supports multithreading that can be configured for different architectures. Numerous example scripts are included (with test data) to co-add and/or HiRes image data from Spitzer-IRAC/MIPS, WISE, and Herschel-SPIRE.
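
    For reference, the classic Richardson-Lucy iteration on which the HiRes mode is said to be based looks as follows in numpy/scipy. This single-image form omits what ICORE adds: multi-frame registration and co-addition, spatially-dependent PRFs, and ringing mitigation.

      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(image, psf, n_iter=30, eps=1e-12):
          """est <- est * ( (image / (est * psf)) * psf_flipped ),
          where * denotes convolution."""
          est = np.full_like(image, image.mean(), dtype=np.float64)
          psf_flip = psf[::-1, ::-1]
          for _ in range(n_iter):
              blurred = fftconvolve(est, psf, mode="same")
              ratio = image / np.maximum(blurred, eps)
              est *= fftconvolve(ratio, psf_flip, mode="same")
          return est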

  6. The first light of the Infrared Imaging Magnetograph at Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Cao, Wenda; Ma, J.; Jing, J.; Xu, Y.; Denker, C.; Wang, H.; Goode, P.

    2006-06-01

    The InfraRed Imaging Magnetograph (IRIM) system developed by Big Bear Solar Observatory (BBSO) has been put into preliminary operation. It is one of the first imaging spectro-polarimeters working at 1565 nm, and is used for observations of the Sun at its opacity minimum, exposing the deepest photospheric layers. The tandem system of a 4.2 nm interference filter, a unique 0.25 nm birefringent Lyot filter, and a Fabry-Perot etalon is capable of providing a bandpass as low as 0.01 nm in a telecentric configuration. A fixed quarter-wave plate and a nematic liquid crystal variable retarder are employed for analyzing the circular polarization of the Zeeman components. The longitudinal magnetic field is measured for the highly Zeeman-sensitive Fe I line at 1564.85 nm (Landé factor g = 3). The polarimetric data, with a field of view (FOV) of 145" × 145", were recorded by a 1024 × 1024 pixel, 14-bit HgCdTe CMOS focal plane array camera. Benefiting from the Correlation Tracking (CT) system and the newly developed Adaptive Optics (AO) system, the first imaging polarimetric observations at 1565 nm were made at the diffraction limit on 1 July 2005 using BBSO's 65 cm telescope. After comparing the magnetograms from IRIM with those taken by the Michelson Doppler Imager (MDI) on board SOHO, it was found that all the magnetic features matched very well in both sets of magnetograms. Also, Stokes V profiles obtained from the Fabry-Perot etalon scanning data provide access to both the true magnetic field strength and the filling factor of the small-scale magnetic flux elements. In this paper, we present the design, fabrication, and calibration of IRIM, as well as the results of the first scientific observations.

  7. A Big Data Analytics Pipeline for the Analysis of TESS Full Frame Images

    NASA Astrophysics Data System (ADS)

    Wampler-Doty, Matthew; Pierce Doty, John

    2015-12-01

    We present a novel method for producing a catalogue of extra-solar planets and transients using the full frame image data from TESS. Our method involves (1) creating a fast Monte Carlo simulation of the TESS science instruments, (2) using the simulation to create a labeled dataset consisting of exoplanets with various orbital durations as well as transients (such as tidal disruption events), (3) using supervised machine learning to find optimal matched filters, Support Vector Machines (SVMs) and statistical classifiers (i.e. naïve Bayes and Markov Random Fields) to detect astronomical objects of interest and (4) “Big Data” analysis to produce a catalogue based on the TESS data. We will apply the resulting methods to all stars in the full frame images. We hope that by providing libraries that conform to industry standards of Free Open Source Software we may invite researchers from the astronomical community as well as the wider data-analytics community to contribute to our effort.

  8. Study on clear stereo image pair acquisition method for small objects with big vertical size in SLM vision system.

    PubMed

    Wang, Yuezong; Jin, Yan; Wang, Lika; Geng, Benliang

    2016-05-01

    A microscopic vision system with a stereo light microscope (SLM) has been applied to surface profile measurement. If the vertical size of a small object exceeds the depth of field, its images will contain both clear and fuzzy regions. Hence, to obtain clear stereo images, we propose a microscopic sequence image fusion method suitable for an SLM vision system. First, a procedure to capture and align an image sequence is designed, which outputs aligned stereo images. Second, we decompose the stereo image sequence by wavelet analysis and obtain a series of high- and low-frequency coefficients at different resolutions; fused stereo images are then output based on the high- and low-frequency coefficient fusion rules proposed in this article. The results show that Δw1 (Δw2) and ΔZ of stereo images in a sequence have a linear relationship, so a procedure for image alignment is necessary before image fusion. In contrast with other image fusion methods, our method outputs clear fused stereo images with better performance, is suitable for an SLM vision system, and helps avoid image blur caused by the large vertical size of small objects. PMID:26970109
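
    A minimal sketch of this family of fusion rules, using PyWavelets with a single-level Haar decomposition (the paper's exact wavelet, level count, and rules are not given in this record): average the low-frequency band and keep the larger-magnitude coefficient in each high-frequency band.

      import numpy as np
      import pywt

      def fuse_pair(img_a, img_b, wavelet="haar"):
          """Fuse two registered images of the same scene with
          different in-focus regions."""
          cA_a, det_a = pywt.dwt2(img_a.astype(np.float64), wavelet)
          cA_b, det_b = pywt.dwt2(img_b.astype(np.float64), wavelet)
          cA = (cA_a + cA_b) / 2.0                      # low frequency: average
          det = tuple(np.where(np.abs(da) >= np.abs(db), da, db)
                      for da, db in zip(det_a, det_b))  # high frequency: max-abs
          return pywt.idwt2((cA, det), wavelet)

      # A whole sequence can be fused by folding: reduce(fuse_pair, images).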

  9. Open source software projects of the caBIG In Vivo Imaging Workspace Software special interest group.

    PubMed

    Prior, Fred W; Erickson, Bradley J; Tarbox, Lawrence

    2007-11-01

    The cancer Biomedical Informatics Grid (caBIG) program was created by the National Cancer Institute to facilitate sharing of IT infrastructure, data, and applications among the National Cancer Institute-sponsored cancer research centers. The program was launched in February 2004 and now links more than 50 cancer centers. In April 2005, the In Vivo Imaging Workspace was added to promote the use of imaging in cancer clinical trials. At the inaugural meeting, four special interest groups (SIGs) were established. The Software SIG was charged with identifying projects that focus on open-source software for image visualization and analysis. To date, two projects have been defined by the Software SIG. The eXtensible Imaging Platform project has produced a rapid application development environment that researchers may use to create targeted workflows customized for specific research projects. The Algorithm Validation Tools project will provide a set of tools and data structures for capturing measurement information and the associated data needed to define a gold standard for a given database, against which change-analysis algorithms can be tested. Through these and future efforts, the caBIG In Vivo Imaging Workspace Software SIG endeavors to advance imaging informatics and provide new open-source software tools to advance cancer research. PMID:17846835

  10. Imaging Structure, Stratigraphy and Groundwater with Ground-Penetrating Radar on the Big Island, Hawaii

    NASA Astrophysics Data System (ADS)

    Shapiro, S. R.; Tchakirides, T. F.; Brown, L. D.

    2004-12-01

    A series of exploratory ground-penetrating radar (GPR) surveys was carried out on the Big Island, Hawaii, in March of 2004 to evaluate the efficacy of using GPR to address hydrological, volcanological, and tectonic issues in extrusive basaltic materials. Target sites included beach sands, nearshore lava flows, well-developed soil covers, lava tubes, and major fault zones. Surveys were carried out with a Sensors and Software PulseEKKO 100, which was equipped with 50, 100, and 200 MHz antennae. Both reflection profiles and CMP expanding spreads were collected at most sites to provide both structural detail and in situ velocity estimation. In general, the volcanic rocks exhibited propagation velocities of ca. 0.09-0.10 m/ns, a value which we interpret to reflect the large air-filled porosity of the media. Penetration in the nearshore area was expectedly small (less than 1 m), which we attribute to seawater infiltration. However, surveys in the volcanics away from the coast routinely probed to depths of 10 m or greater, even at 100 MHz. While internal layering and lava tubes could be identified from individual profiles, the complexity of returns suggests that 3D imaging is required before detailed stratigraphy can be usefully interpreted. A pilot 3D survey over a lava tube complex supports this conclusion, although it was prematurely terminated by bad weather. Although analysis of the CMP data does not show a clear systematic variation in radar velocity with age of flow, the dataset is too limited to support any firm conclusions on this point. Unusually distinct, subhorizontal reflectors on several profiles seem to mark groundwater. In one case, the water seems to lie within a lava tube with an air-filled roof zone. Surveys over part of the controversial Hilina fault zone clearly image the fault as a steeply dipping feature in the subsurface, albeit only to depths of a few meters. The results suggest, however, that deeper extensions of the faults could be mapped by

  11. Data Processing of the magnetograms for the Near InfraRed Imaging Spectropolarimeter at Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Ahn, Kwangsu; Cao, Wenda; Shumko, Sergiy; Chae, Jongchul

    2016-05-01

    We present the processing of vector magnetograms from the Near InfraRed Imaging Spectropolarimeter (NIRIS) at Big Bear Solar Observatory. NIRIS is the successor to an older magnetograph system at BBSO, equipped with a new infrared detector and an improved Fabry-Perot filter system. While the new hardware brings several upgrades, it also brings challenges as the data acquisition rate increases and a larger detector array must be handled. The overall processing includes dark and flat correction, image alignment, de-stretch, Stokes parameter selection, calibration of instrumental crosstalk, and Milne-Eddington inversion.
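
    The first step listed, dark and flat correction, follows the standard array-calibration formula; a minimal numpy sketch (frame names hypothetical):

      import numpy as np

      def dark_flat_correct(raw, dark, flat):
          """Subtract the dark frame, then divide by the normalized,
          dark-subtracted flat field."""
          flat_ds = flat.astype(np.float64) - dark
          flat_norm = flat_ds / np.median(flat_ds)
          return (raw.astype(np.float64) - dark) / np.maximum(flat_norm, 1e-6)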

  12. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  13. Big Images and Big Ideas!

    ERIC Educational Resources Information Center

    McCullagh, John; Greenwood, Julian

    2011-01-01

    In this digital age, is primary science being left behind? Computer microscopes provide opportunities to transform science lessons into highly exciting learning experiences and to shift enquiry and discovery back into the hands of the children. A class of 5- and 6-year-olds was just one group of children involved in the Digitally Resourced…

  14. BigNeuron: Large-scale 3D Neuron Reconstruction from Optical Microscopy Images

    PubMed Central

    Peng, Hanchuan; Hawrylycz, Michael; Roskams, Jane; Hill, Sean; Spruston, Nelson; Meijering, Erik; Ascoli, Giorgio A.

    2016-01-01

    Understanding the structure of single neurons is critical for understanding how they function within neural circuits. BigNeuron is a new community effort that combines modern bioimaging informatics, recent leaps in labeling and microscopy, and the widely recognized need for openness and standardization to provide a community resource for automated reconstruction of dendritic and axonal morphology of single neurons. PMID:26182412

  15. Direct laser additive fabrication system with image feedback control

    DOEpatents

    Griffith, Michelle L.; Hofmeister, William H.; Knorovsky, Gerald A.; MacCallum, Danny O.; Schlienger, M. Eric; Smugeresky, John E.

    2002-01-01

    A closed-loop, feedback-controlled direct laser fabrication system is disclosed. The feedback refers to the actual growth conditions obtained by real-time analysis of thermal radiation images. The resulting system can fabricate components with severalfold improvement in dimensional tolerances and surface finish.

  16. Fast Imaging Solar Spectrograph of the 1.6 Meter New Solar Telescope at Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Chae, Jongchul; Park, Hyung-Min; Ahn, Kwangsu; Yang, Heesu; Park, Young-Deuk; Nah, Jakyoung; Jang, Bi Ho; Cho, Kyung-Suk; Cao, Wenda; Goode, Philip R.

    2013-11-01

    For high resolution spectral observations of the Sun, particularly its chromosphere, we have developed a dual-band echelle spectrograph named the Fast Imaging Solar Spectrograph (FISS) and installed it on a vertical optical table in the Coudé Lab of the 1.6 meter New Solar Telescope at Big Bear Solar Observatory. This instrument can cover any part of the visible and near-infrared spectrum, but it usually records the Hα band and the Ca II 8542 Å band simultaneously using two CCD cameras, producing data well suited to the study of the structure and dynamics of the chromosphere and filaments/prominences. The instrument achieves high-quality imaging through a fast scan of the slit across the field of view with the aid of adaptive optics. We describe its design, specifications, and performance, as well as the data processing.

  17. Big Society, Big Deal?

    ERIC Educational Resources Information Center

    Thomson, Alastair

    2011-01-01

    Political leaders like to put forward guiding ideas or themes which pull their individual decisions into a broader narrative. For John Major it was Back to Basics, for Tony Blair it was the Third Way and for David Cameron it is the Big Society. While Mr. Blair relied on Lord Giddens to add intellectual weight to his idea, Mr. Cameron's legacy idea…

  18. Big data for health.

    PubMed

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled. PMID:26173222

  19. Rapid and retrievable recording of big data of time-lapse 3D shadow images of microbial colonies.

    PubMed

    Ogawa, Hiroyuki; Nasu, Senshi; Takeshige, Motomu; Saito, Mikako; Matsuoka, Hideaki

    2015-01-01

    We formerly developed an automatic colony count system based on time-lapse shadow image analysis (TSIA). Here this system has been upgraded and applied to practical rapid decision-making. A microbial sample was spread as homogeneously as possible on/in an agar plate 90 mm in diameter. For several strains, we found that most colonies appeared within a limited time span; consequently, the number of colonies reached a steady level (Nstdy) and remained unchanged until the end of a long culture time, giving the confirmed value (Nconf). The equivalence of Nstdy and Nconf, as well as the difference between the times required to determine Nstdy and Nconf, were statistically significant at p < 0.001. Nstdy meets the requirements of practical routines treating a large number of plates. The difference between Nstdy and Nconf, if any, may be elucidated by means of the retrievable big data. Nconf is therefore valid for official documentation. PMID:25975590

  20. Disturbed basal ice seen in radio echo images coincides with zones of big interlocking ice crystals.

    NASA Astrophysics Data System (ADS)

    Dahl-Jensen, Dorthe; Gogineni, Sivaprasad; Panton, Christian

    2014-05-01

    Improvements in depth-sounding radio echo sounding (RES) over the Antarctic and Greenland ice sheets have made it possible to map near-basal layers that were not 'seen' earlier because of the very strong attenuation through more than 3000 m of ice. The RES internal reflectors show that the near-basal ice at many locations has disturbed layering. At the locations where ice cores reach the bedrock, both in Greenland and Antarctica, studies of ice crystal size and orientation show that the near-basal ice has big, interlocking crystals, which suggests the ice is not actively deforming. These observations challenge the constitutive equations often used in ice sheet modelling, such as Glen's flow law. A discussion of the impact of the RES findings on ice sheet modelling, and on the quest to find the oldest ice in Antarctica based on the anisotropy of the basal ice, will follow.

  1. Breast Imaging in the Era of Big Data: Structured Reporting and Data Mining

    PubMed Central

    Margolies, Laurie R.; Pandey, Gaurav; Horowitz, Eliot R.; Mendelson, David S.

    2016-01-01

    OBJECTIVE The purpose of this article is to describe structured reporting and the development of large databases for use in data mining in breast imaging. CONCLUSION The results of millions of breast imaging examinations are reported with structured tools based on the BI-RADS lexicon. Many of these data are stored in accessible media. Robust computing power creates a great opportunity for data scientists and breast imagers to collaborate to improve breast cancer detection and optimize screening algorithms. Data mining can create knowledge, but the questions asked and their complexity require extremely powerful and agile databases. New data technologies can facilitate outcomes research and precision medicine. PMID:26587797

  2. 1024 × 1024 HgCdTe CMOS camera for infrared imaging magnetograph of Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Cao, W.; Xu, Y.; Denker, C.; Wang, H.

    2005-08-01

    The InfraRed Imaging Magnetograph (IRIM) is a two-dimensional narrow-band solar spectro-polarimeter currently being developed at Big Bear Solar Observatory (BBSO). It works in the near infrared (NIR) from 1.0 μm to 1.7 μm and possesses high temporal resolution, high spatial resolution, high spectral resolving power, and high magnetic sensitivity. As the detector of IRIM, the 1024 × 1024 HgCdTe TCM8600 CMOS camera manufactured by the Rockwell Scientific Company plays a very important role in acquiring high-precision solar spectropolarimetry data. To make the best use of it for solar observation, its characteristics were evaluated at BBSO and at the National Solar Observatory (NSO), Sacramento Peak, in October 2003. The paper presents a series of measured performance parameters, including linearity, readout noise, gain, full well capacity, hot pixels, dark, flat field, frame rate, vacuum, and low-temperature control, and shows some solar infrared narrow-band imaging observation results.

  3. Images of Paris: Big C Culture for the Nonspeaker of French.

    ERIC Educational Resources Information Center

    Spangler, May; York, Holly U.

    2002-01-01

    Discusses a course offered in both French and English at Emory University in Atlanta, Georgia that is based on the study of representations of Paris from the Middle Ages to the present. It uses architecture as a point of departure and explores the myth of Paris as expressed through a profusion of images in literature, painting, and film.…

  4. Unstructured medical image query using big data - An epilepsy case study.

    PubMed

    Istephan, Sarmad; Siadat, Mohammad-Reza

    2016-02-01

    Big data technologies are critical to the medical field, which requires new frameworks to leverage them. Such frameworks would help medical experts test hypotheses by querying huge volumes of unstructured medical data, providing better patient care. The objective of this work is to implement and examine the feasibility of a framework that provides efficient querying of unstructured data in unlimited ways. The feasibility study was conducted specifically in the epilepsy field. The proposed framework evaluates a query in two phases. In phase 1, structured data is used to filter the clinical data warehouse. In phase 2, feature extraction modules are executed on the unstructured data in a distributed manner via Hadoop to complete the query. Three modules have been created: volume comparer, surface to volume conversion, and average intensity. The framework allows user-defined modules to be imported, providing unlimited ways to process the unstructured data and hence potentially extending the application of this framework beyond the epilepsy field. Two types of criteria were used to validate the feasibility of the proposed framework: the ability/accuracy of fulfilling an advanced medical query, and the efficiency that Hadoop provides. For the first criterion, the framework executed an advanced medical query that spanned both structured and unstructured data, with accurate results. For the second criterion, different architectures were explored to evaluate the performance of various Hadoop configurations, which were compared to a traditional Single Server Architecture (SSA). The surface to volume conversion module performed up to 40 times faster than the SSA (using a 20 node Hadoop cluster), and the average intensity module performed up to 85 times faster than the SSA (using a 40 node Hadoop cluster). Furthermore, the 40 node Hadoop cluster executed the average intensity module on 10,000 models in 3 h, which was not even practical for the SSA. The current study is

  5. An additive and lossless watermarking method based on invariant image approximation and Haar wavelet transform.

    PubMed

    Pan, W; Coatrieux, G; Cuppens, N; Cuppens, F; Roux, Ch

    2010-01-01

    In this article, we propose a new additive lossless watermarking scheme that identifies the parts of the image that can be reversibly watermarked and conducts message embedding in the conventional Haar wavelet transform coefficients. Our approach uses an approximation of the image signal that is invariant to the watermark addition to classify the image and avoid over/underflows. The method has been tested on different sets of medical images and on some usual natural test images such as Lena. Experimental analysis conducted with respect to several aspects, including data hiding capacity and image quality preservation, shows that our method is among the most competitive existing lossless watermarking schemes in terms of high capacity and low distortion. PMID:21096246
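
    To make the embedding half concrete, here is a deliberately simplified numpy/PyWavelets sketch of additive embedding in Haar detail coefficients. It illustrates only where the message goes; it omits the invariant-approximation classification and over/underflow handling that make the published scheme lossless.

      import numpy as np
      import pywt

      def embed_bits(image, bits, strength=4.0):
          """Additively embed a bit string in the diagonal Haar detail
          band: +strength for a 1, -strength for a 0."""
          cA, (cH, cV, cD) = pywt.dwt2(image.astype(np.float64), "haar")
          flat = cD.copy().ravel()
          for i, b in enumerate(bits):
              flat[i] += strength if b else -strength
          return pywt.idwt2((cA, (cH, cV, flat.reshape(cD.shape))), "haar")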

  6. Three-dimensional oxygen isotope imaging of convective fluid flow around the Big Bonanza, Comstock lode mining district, Nevada

    USGS Publications Warehouse

    Criss, R.E.; Singleton, M.J.; Champion, D.E.

    2000-01-01

    Oxygen isotope analyses of propylitized andesites from the Con Virginia and California mines allow construction of a detailed, three-dimensional image of the isotopic surfaces produced by the convective fluid flows that deposited the famous Big Bonanza orebody. On a set of intersecting maps and sections, the δ18O isopleths clearly show the intricate and conformable relationship of the orebody to a deep, ~500 m gyre of meteoric-hydrothermal fluid that circulated along and above the Comstock fault, near the contact of the Davidson Granodiorite. The core of this gyre (δ18O = 0 to 3.8‰) encompasses the bonanza and is almost totally surrounded by rocks having much lower δ18O values (–1.0 to –4.4‰). This deep gyre may represent a convective longitudinal roll superimposed on a large unicellular meteoric-hydrothermal system, producing a complex flow field with both radial and longitudinal components that is consistent with experimentally observed patterns of fluid convection in permeable media.

  7. The VIsible and InfraRed Imaging Magnetograph (VIM-IRIM) at Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Cao, W.; Tritschler, A.; Denker, C.; Wang, H.; Shumko, S.; Ma, J.; Wang, J.; Marquette, B.

    2004-05-01

    The Visible-light and InfraRed Imaging Magnetographs (VIM-IRIM) are Fabry-Perot based filtergraphs working in a telecentric configuration, planned to upgrade the capability for measuring solar magnetic fields at BBSO. Both filtergraph instruments are designed to work with the combination of a narrow-band prefilter and a single Fabry-Perot etalon. VIM and IRIM will provide high temporal resolution, high spatial resolution (< 0.2"/pixel image scale), and high spectral resolution (< 0.1 Å) simultaneous observations at 600-700 nm and 1.0-1.6 μm, respectively, with a substantial field of view of 170". Modifications of the setup also allow scanning different spectral lines that cover the height range from the solar photosphere up to the solar chromosphere. Here we describe the optical setup and present first observations to demonstrate the feasibility of the instrument. After the instrument has proven to work as a 2D spectrometer, an upgrade to a 2D spectropolarimeter is planned.

  8. The BigBoss Experiment

    SciTech Connect

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  9. Handling Big Data in Medical Imaging: Iterative Reconstruction with Large-Scale Automated Parallel Computation

    PubMed Central

    Lee, Jae H.; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T.; Seo, Youngho

    2014-01-01

    The primary goal of this project is to implement an iterative statistical image reconstruction algorithm, in this case maximum likelihood expectation maximization (MLEM) as used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph analytic system running on top of Spark that handles graph and sparse linear algebra operations in parallel. The main advantage of implementing MLEM in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior background in computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains, with the goal of eventually making it usable in a clinical setting. PMID:27081299
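
    For reference, the MLEM update being ported is compact; a dense-matrix numpy sketch is below (the Spark/GraphX implementation distributes the same forward- and back-projection products over a sparse system matrix).

      import numpy as np

      def mlem(A, y, n_iter=50, eps=1e-12):
          """MLEM update: x <- x * A^T( y / (A x) ) / (A^T 1).
          A: system matrix (detector bins x voxels); y: measured counts."""
          x = np.ones(A.shape[1])
          sens = A.T @ np.ones(A.shape[0])      # sensitivity image, A^T 1
          for _ in range(n_iter):
              proj = A @ x                      # forward projection
              x *= (A.T @ (y / np.maximum(proj, eps))) / np.maximum(sens, eps)
          return x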

  10. Image and compositional characteristics of the LDEF Big Guy impact crater

    NASA Technical Reports Server (NTRS)

    Bunch, T. E.; Paque, Julie M.; Zolensky, Michael

    1995-01-01

    A 5.2 mm crater in Al-metal represents the largest found on LDEF. We have examined this crater by field emission scanning electron microscopy (FESEM), energy dispersive spectroscopy (EDS) and time-of-flight/secondary ion mass spectroscopy (TOF-SIMS) in order to determine if there is any evidence of impactor residue. Droplet and dome-shaped columns, along with flow features, are evidence of melting. EDS from the crater cavity and rim show Mg, C, O and variable amounts of Si, in addition to Al. No evidence for a chondritic impactor was found, and it is hypothesized that the crater may be the result of impact with space debris.

  11. Big heart data: advancing health informatics through data sharing in cardiovascular imaging.

    PubMed

    Suinesiaputra, Avan; Medrano-Gracia, Pau; Cowan, Brett R; Young, Alistair A

    2015-07-01

    The burden of heart disease is rapidly worsening due to the increasing prevalence of obesity and diabetes. Data sharing and open database resources for heart health informatics are important for advancing our understanding of cardiovascular function, disease progression and therapeutics. Data sharing enables valuable information, often obtained at considerable expense and effort, to be reused beyond the specific objectives of the original study. Many government funding agencies and journal publishers are requiring data reuse, and are providing mechanisms for data curation and archival. Tools and infrastructure are available to archive anonymous data from a wide range of studies, from descriptive epidemiological data to gigabytes of imaging data. Meta-analyses can be performed to combine raw data from disparate studies to obtain unique comparisons or to enhance statistical power. Open benchmark datasets are invaluable for validating data analysis algorithms and objectively comparing results. This review provides a rationale for increased data sharing and surveys recent progress in the cardiovascular domain. We also highlight the potential of recent large cardiovascular epidemiological studies enabling collaborative efforts to facilitate data sharing, algorithm benchmarking, disease modeling and statistical atlases. PMID:25415993

  12. Big Heart Data: Advancing Health Informatics through Data Sharing in Cardiovascular Imaging

    PubMed Central

    Suinesiaputra, Avan; Medrano-Gracia, Pau; Cowan, Brett R.; Young, Alistair A.

    2015-01-01

    The burden of heart disease is rapidly worsening due to the increasing prevalence of obesity and diabetes. Data sharing and open database resources for heart health informatics are important for advancing our understanding of cardiovascular function, disease progression and therapeutics. Data sharing enables valuable information, often obtained at considerable expense and effort, to be re-used beyond the specific objectives of the original study. Many government funding agencies and journal publishers are requiring data re-use, and are providing mechanisms for data curation and archival. Tools and infrastructure are available to archive anonymous data from a wide range of studies, from descriptive epidemiological data to gigabytes of imaging data. Meta-analyses can be performed to combine raw data from disparate studies to obtain unique comparisons or to enhance statistical power. Open benchmark datasets are invaluable for validating data analysis algorithms and objectively comparing results. This review provides a rationale for increased data sharing and surveys recent progress in the cardiovascular domain. We also highlight the potential of recent large cardiovascular epidemiological studies enabling collaborative efforts to facilitate data sharing, algorithm benchmarking, disease modeling and statistical atlases. PMID:25415993

  13. Biomass estimator for NIR image with a few additional spectral band images taken from light UAS

    NASA Astrophysics Data System (ADS)

    Pölönen, Ilkka; Salo, Heikki; Saari, Heikki; Kaivosoja, Jere; Pesonen, Liisa; Honkavaara, Eija

    2012-05-01

    A novel way to produce biomass estimates offers new possibilities for precision farming. Fertilizer prediction maps can be made on the basis of accurate biomass estimates generated by the novel estimator, and using this knowledge a variable rate of fertilizer can be applied during the growing season. The innovation consists of a light UAS, a high-spatial-resolution camera, and VTT's novel spectral camera. A few properly selected spectral wavelengths, together with NIR images and point clouds extracted by automatic image matching, were used in the estimation. The spectral wavelengths were chosen from the green, red, and NIR channels.

  14. Image and compositional characteristics of the LDEF Big Guy impact crater

    SciTech Connect

    Bunch, T.E.; Paque, J.M.; Zolensky, M.

    1995-02-01

    A 5.2 mm crater in Al-metal represents the largest found on LDEF. The authors have examined this crater by field emission scanning electron microscopy (FESEM), energy dispersive spectroscopy (EDS) and time-of-flight/secondary ion mass spectroscopy (TOF-SIMS) in order to determine if there is any evidence of impactor residue. Droplet and dome-shaped columns, along with flow features, are evidence of melting. EDS from the crater cavity and rim show Mg, C, O and variable amounts of Si, in addition to Al. No evidence for a chondritic impactor was found, and it is hypothesized that the crater may be the result of impact with space debris.

  15. Multidirectional curved integral imaging with large depth by additional use of a large-aperture lens.

    PubMed

    Shin, Dong-Hak; Lee, Byoungho; Kim, Eun-Soo

    2006-10-01

    We propose a curved integral imaging system with large depth achieved by the additional use of a large-aperture lens in a conventional large-depth integral imaging system. The additional large-aperture lens provides a multidirectional curvature effect and improves the viewing angle. The proposed system has a simple structure due to the use of well-fabricated, unmodified flat devices. To calculate the proper elemental images for the proposed system, we explain a modified computer-generated pickup technique based on an ABCD matrix and analyze an effective viewing zone in the proposed system. From experiments, we show that the proposed system has an improved viewing angle of more than 7 degrees compared with conventional integral imaging. PMID:16983427
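
    The pickup computation mentioned above rests on standard ray-transfer (ABCD) matrices. As a rough illustration only (the authors' modified pickup method is not reproduced here), composing a thin lens with free-space propagation looks like the following; the focal length and distance are arbitrary example values.

        import numpy as np

        def free_space(d):
            """Ray-transfer matrix for propagation over distance d."""
            return np.array([[1.0, d], [0.0, 1.0]])

        def thin_lens(f):
            """Ray-transfer matrix for a thin lens of focal length f."""
            return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

        # A ray is (height y, angle theta); matrices compose right to left:
        # a large-aperture lens (f = 200 mm) followed by 50 mm of free space.
        system = free_space(50.0) @ thin_lens(200.0)
        ray_in = np.array([5.0, 0.0])   # 5 mm off-axis, parallel to the axis
        ray_out = system @ ray_in       # height and angle after the system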

  16. A patch-based cross masking model for natural images with detail loss and additive defects

    NASA Astrophysics Data System (ADS)

    Liu, Yucheng; Allebach, Jan P.

    2015-03-01

    Visual masking is an effect whereby the content of an image reduces the detectability of a given target signal hidden in the image. The effect of visual masking has found application in numerous image processing and vision tasks. In the past few decades, much research has been conducted on visual masking, based on models optimized for artificial targets placed upon unnatural masks. Over the years, there has been a tendency to apply masking models to predict natural image quality and the detection threshold of distortion presented in natural images. However, to our knowledge few studies have been conducted to understand the generalizability of masking models to the different types of distortion presented in natural images. In this work, we measure the ability of natural image patches to mask three different types of distortion, and analyze the performance of a conventional gain control model in predicting the distortion detection threshold. We then propose a new masking model, in which detail loss and additive defects are modeled in two parallel vision channels and interact with each other via a cross-masking mechanism. We show that the proposed cross-masking model adapts better to the various image structures and distortions found in natural scenes.
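
    A conventional gain-control masking response of the kind evaluated above is often written as a power-law excitation divided by a pooled masking term. A minimal sketch follows; the exponents and the saturation constant are illustrative assumptions, not the paper's fitted values.

        import numpy as np

        def gain_control_response(target, mask_contrasts, p=2.4, q=2.0, b=0.05):
            """Divisive gain-control response to a target contrast.

            target         : contrast energy of the target signal
            mask_contrasts : contrast energies of the surrounding content
            The pooled mask activity divisively suppresses the response,
            raising the detection threshold on busy backgrounds.
            """
            pool = np.sum(np.asarray(mask_contrasts) ** q)
            return target ** p / (b + pool)

        # Same target, stronger masking content -> weaker response:
        print(gain_control_response(0.2, [0.1, 0.1]))
        print(gain_control_response(0.2, [0.5, 0.6]))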

  17. Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    Big Bear Solar Observatory (BBSO) is located at the end of a causeway in a mountain lake more than 2 km above sea level. The site has more than 300 sunny days a year and a natural inversion caused by the lake which makes for very clean images. BBSO is the only university observatory in the US making high-resolution observations of the Sun. Its daily images are posted at http://www.bbso.njit.e...

  18. In-line image analysis on the effects of additives in batch cooling crystallization

    NASA Astrophysics Data System (ADS)

    Qu, Haiyan; Louhi-Kultanen, Marjatta; Kallas, Juha

    2006-03-01

    The effects of two potassium salt additives, ethylene diamine tetra acetic acid dipotassium salt (EDTA) and potassium pyrophosphate (KPY), on the batch cooling crystallization of potassium dihydrogen phosphate (KDP) were investigated. The growth rates of certain crystal faces were determined from in-line images taken with an MTS particle image analysis (PIA) video microscope. An in-line image processing method was developed to characterize the size and shape of the crystals. The nucleation kinetics were studied by measuring the metastable zone width and the induction time. A significant promoting effect on both nucleation and growth of KDP was observed when EDTA was used as an additive. KPY, however, exhibited a strong inhibiting effect. The mechanism underlying the promoting effect of EDTA on crystal growth was further studied with a two-dimensional nucleation model. It is shown that the presence of EDTA increased the density of adsorbed molecules of the crystallizing solute on the crystal surface.

  19. Additional Merits of Two-dimensional Single Thick-slice Magnetic Resonance Myelography in Spinal Imaging

    PubMed Central

    Aggarwal, Abhishek; Azad, Rajiv; Ahmad, Armeen; Arora, Pankaj; Gupta, Puneet

    2012-01-01

    Objective: To validate the additional merits of two-dimensional (2D) single thick-slice Magnetic Resonance Myelography (MRM) in spinal imaging. Materials and Methods: 2D single thick-slice MRM was performed using a T2 half-Fourier acquisition single-shot turbo spin-echo (HASTE) sequence, in addition to routine magnetic resonance (MR) sequences of the spine, in 220 patients. The images were evaluated for additional diagnostic information in spinal and extra-spinal regions. A three-point grading system was adopted depending upon the utility of MRM in contributing to the detection of spinal or extra-spinal findings: grade 1 represented no contribution of MRM, while grade 3 indicated that MRM was essential to the detection of findings. Results: The utility of MRM in the spine was categorized as grade 3 in 10.9% of cases (24/220), grade 2 in 21.8% of cases (48/220), and grade 1 in 67.3% of cases (148/220). Thus, an overall additional merit of MRM in the spine was seen in 32.7% (72/220) of cases. In addition, extra-spinal pathologies were identified in 14.1% of cases (31/220). Conclusion: 2D single thick-slice MRM can have additional merits in spinal imaging when used as an adjunct to routine MR sequences. PMID:23393640

  20. Whole Language Using Big Books.

    ERIC Educational Resources Information Center

    Whyte, Sarah

    Designed as thematic units around Wright Company Big Books, the lessons in this guide demonstrate ways that Big Books can be used in a whole language first grade program. Each lesson indicates skill focus, needed materials, procedures, and additional thoughts or suggestions about the lesson. Units consist of: "Bedtime" (five lessons); "Monsters…

  1. Astronomy in the Cloud: Using MapReduce for Image Co-Addition

    NASA Astrophysics Data System (ADS)

    Wiley, K.; Connolly, A.; Gardner, J.; Krughoff, S.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.

    2011-03-01

    In the coming decade, astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. The study of these sources will involve computational challenges such as anomaly detection, classification, and moving-object tracking. Since such studies benefit from the highest-quality data, methods such as image co-addition, i.e., astrometric registration followed by per-pixel summation, will be a critical preprocessing step prior to scientific investigation. With a requirement that these images be analyzed on a nightly basis to identify moving sources such as potentially hazardous asteroids or transient objects such as supernovae, these data streams present many computational challenges. Given the quantity of data involved, the computational load of these problems can only be addressed by distributing the workload over a large number of nodes. However, the high data throughput demanded by these applications may present scalability challenges for certain storage architectures. One scalable data-processing method that has emerged in recent years is MapReduce, and in this article we focus on its popular open-source implementation called Hadoop. In the Hadoop framework, the data are partitioned among storage attached directly to worker nodes, and the processing workload is scheduled in parallel on the nodes that contain the required input data. A further motivation for using Hadoop is that it allows us to exploit cloud computing resources: i.e., platforms where Hadoop is offered as a service. We report on our experience of implementing a scalable image-processing pipeline for the SDSS imaging database using Hadoop. This multiterabyte imaging data set provides a good testbed for algorithm development, since its scope and structure approximate future surveys. First, we describe MapReduce and how we adapted image co-addition to the MapReduce framework. Then we describe a number of optimizations to our basic approach
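
    Stripped of Hadoop's machinery, co-addition maps naturally onto two functions: map registers each image onto a common sky grid and emits per-pixel (key, value) pairs, and reduce combines the values for each key. A toy single-process sketch, with astrometric registration reduced to a hypothetical integer offset:

        from collections import defaultdict
        import numpy as np

        def map_image(image, offset):
            """Emit (sky_pixel, flux) pairs for one registered image.
            Real astrometric registration is replaced by an integer shift."""
            oy, ox = offset
            for (y, x), flux in np.ndenumerate(image):
                yield ((y + oy, x + ox), flux)

        def reduce_coadd(pairs):
            """Combine flux and exposure count per sky pixel."""
            acc = defaultdict(lambda: [0.0, 0])
            for key, flux in pairs:
                acc[key][0] += flux
                acc[key][1] += 1
            return {k: s / n for k, (s, n) in acc.items()}  # mean co-add

        images = [np.random.rand(4, 4) for _ in range(3)]
        offsets = [(0, 0), (1, 0), (0, 1)]
        pairs = (p for img, off in zip(images, offsets)
                 for p in map_image(img, off))
        coadd = reduce_coadd(pairs)

    In Hadoop the map output would be shuffled by key across nodes, so each reducer sums one region of the sky independently.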

  2. Temperature of Solar Prominences Obtained with the Fast Imaging Solar Spectrograph on the 1.6 m New Solar Telescope at the Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Park, Hyungmin; Chae, Jongchul; Song, Donguk; Maurya, Ram Ajor; Yang, Heesu; Park, Young-Deuk; Jang, Bi-Ho; Nah, Jakyoung; Cho, Kyung-Suk; Kim, Yeon-Han; Ahn, Kwangsu; Cao, Wenda; Goode, Philip R.

    2013-11-01

    We observed solar prominences with the Fast Imaging Solar Spectrograph (FISS) at the Big Bear Solar Observatory on 30 June 2010 and 15 August 2011. To determine the temperature of the prominence material, we applied a nonlinear least-squares fitting of the radiative transfer model. From the Doppler broadening of the Hα and Ca II lines, we determined the temperature and nonthermal velocity separately. The ranges of temperature and nonthermal velocity were 4000-20,000 K and 4-11 km s⁻¹, respectively. We also found that the temperature varied considerably from point to point within a single prominence.
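
    The separation of temperature from nonthermal velocity exploits the mass dependence of thermal broadening: if each line's Doppler velocity width satisfies w² = 2kT/m + ξ², then two lines from species of different mass (hydrogen for Hα, calcium for Ca II) give two equations in the two unknowns. A minimal sketch with invented widths, not values from the paper:

        import numpy as np

        k_B = 1.380649e-23            # J/K
        m_H = 1.6735e-27              # kg, hydrogen
        m_Ca = 40.078 * 1.66054e-27   # kg, calcium

        def temperature_and_xi(w_H, w_Ca):
            """Solve w^2 = 2kT/m + xi^2 for two species.
            w_H, w_Ca : Doppler velocity widths in m/s.
            Returns (T in K, nonthermal velocity xi in m/s)."""
            T = (w_H**2 - w_Ca**2) / (2.0 * k_B * (1.0 / m_H - 1.0 / m_Ca))
            xi = np.sqrt(w_Ca**2 - 2.0 * k_B * T / m_Ca)
            return T, xi

        T, xi = temperature_and_xi(15e3, 8e3)   # illustrative widths
        print(f"T = {T:.0f} K, xi = {xi / 1e3:.1f} km/s")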

  3. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    PubMed

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features of both while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered each modality using a coarse-graining technique to speed the registration of 2D histology sections to high-resolution 3D μCT datasets. Once registered, qualitative and quantitative histomorphometric bone descriptors were directly correlated with 3D quantitative bone descriptors such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). The technique also demonstrated the importance of the location of the histological section, showing that an offset of up to 30% can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D-printed titanium lattice implants. PMID:27153828
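
    A rough sketch of the coarse-graining idea: downsample the μCT volume, score every candidate slice against the equally downsampled histology section with normalized cross-correlation, then search only near the coarse winner at full resolution. The array shapes and downsampling factor below are illustrative, not the paper's pipeline.

        import numpy as np

        def ncc(a, b):
            """Normalized cross-correlation of two equal-shape images."""
            a = (a - a.mean()) / (a.std() + 1e-12)
            b = (b - b.mean()) / (b.std() + 1e-12)
            return float((a * b).mean())

        def coarsen(img, f):
            """Block-average downsample by an integer factor f."""
            h, w = img.shape
            img = img[:h - h % f, :w - w % f]
            return img.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

        def find_slice(volume, section, f=4):
            """Index of the volume slice best matching a 2D section."""
            sec_c = coarsen(section, f)
            coarse_scores = [ncc(coarsen(volume[i], f), sec_c)
                             for i in range(volume.shape[0])]
            i0 = int(np.argmax(coarse_scores))            # coarse pass
            lo, hi = max(0, i0 - f), min(volume.shape[0], i0 + f + 1)
            fine_scores = [ncc(volume[i], section) for i in range(lo, hi)]
            return lo + int(np.argmax(fine_scores))       # fine pass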

  4. Terahertz imaging and tomography as efficient instruments for testing polymer additive manufacturing objects.

    PubMed

    Perraud, J B; Obaton, A F; Bou-Sleiman, J; Recur, B; Balacey, H; Darracq, F; Guillet, J P; Mounaix, P

    2016-05-01

    Additive manufacturing (AM) technology is used not only to make 3D objects but also for rapid prototyping. In industry and laboratories, quality control for these objects is necessary though difficult to implement compared with classical fabrication methods, because layer-by-layer printing allows the manufacture of very complex objects that are unachievable with standard tools. Furthermore, AM can induce unknown or unexpected defects. Consequently, we demonstrate terahertz (THz) imaging as an innovative method for 2D inspection of polymer materials. Moreover, THz tomography may be considered an alternative to x-ray tomography and a cheaper 3D imaging option for routine control. This paper proposes an experimental study of 3D polymer objects obtained by additive manufacturing techniques. This approach allows us to characterize defects and to control dimensions by volumetric measurements on 3D data reconstructed by tomography. PMID:27140357
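
    Reconstruction of such projection data can be prototyped with the standard Radon / filtered back-projection pair before moving to real THz sinograms; a minimal sketch using scikit-image with a stand-in phantom, not the authors' data:

        import numpy as np
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon, rescale

        phantom = rescale(shepp_logan_phantom(), 0.25)   # small 2D test object
        theta = np.linspace(0.0, 180.0, 90, endpoint=False)
        sinogram = radon(phantom, theta=theta)           # simulated projections
        recon = iradon(sinogram, theta=theta)            # filtered back-projection
        print(np.abs(recon - phantom).mean())            # reconstruction error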

  5. Clinical Outcome of Magnetic Resonance Imaging-Detected Additional Lesions in Breast Cancer Patients

    PubMed Central

    Ha, Gi-Won; Yi, Mi Suk; Lee, Byoung Kil; Jung, Sung Hoo

    2011-01-01

    Purpose The aim of this study was to investigate the clinical outcome of additional breast lesions identified with breast magnetic resonance imaging (MRI) in breast cancer patients. Methods A total of 153 patients who underwent breast MRI between July 2006 and March 2008 were retrospectively reviewed. Thirty-three patients (21.6%) were recommended for second-look ultrasound (US) for further characterization of additional lesions detected on breast MRI and these patients constituted our study population. Results Assessment for lesions detected on breast MRI consisted of the following: 25 benign lesions (73.5%), two indeterminate (5.9%), and seven malignant (20.6%) in 33 patients. Second-look US identified 12 additional lesions in 34 lesions (35.3%) and these lesions were confirmed by histological examination. Of the 12 lesions found in the 11 patients, six (50.0%) including one contralateral breast cancer were malignant. The surgical plan was altered in 18.2% (six of 33) of the patients. The use of breast MRI justified a change in treatment for four patients (66.7%) and caused two patients (33.3%) to undergo unwarranted additional surgical procedures. Conclusion Breast MRI identified additional multifocal or contralateral cancer which was not detected initially on conventional imaging in breast cancer patients. Breast MRI has become an indispensable modality in conjunction with conventional modalities for preoperative evaluation of patients with operable breast cancer. PMID:22031803

  6. Big bluestem

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Big Bluestem (Andropogon gerardii) is a warm season grass native to North America, accounting for 40% of the herbaceous biomass of the tall grass prairie, and a candidate for bioenergy feedstock production. The goal of this study was to measure among and within population genetic variation of natura...

  7. Big Dreams

    ERIC Educational Resources Information Center

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  8. Big Opportunities and Big Concerns of Big Data in Education

    ERIC Educational Resources Information Center

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  9. Big Bang Circus

    NASA Astrophysics Data System (ADS)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  10. Color reproductivity improvement with additional virtual color filters for WRGB image sensor

    NASA Astrophysics Data System (ADS)

    Kawada, Shun; Kuroda, Rihito; Sugawa, Shigetoshi

    2013-02-01

    We have developed a high-accuracy color reproduction method based on an estimated spectral reflectance of objects using additional virtual color filters for a wide dynamic range WRGB color filter CMOS image sensor. The four virtual color filters are created by multiplying the spectral sensitivity of the White pixel by Gaussian functions with different central wavelengths and standard deviations, and the virtual sensor outputs of those virtual filters are estimated from the four real output signals of the WRGB image sensor. The accuracy of color reproduction was evaluated with a Macbeth Color Checker (MCC), and the averaged color difference ΔEab over the 24 colors was 1.88 with our approach.
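
    The construction of the virtual filters reduces to multiplying the White pixel's spectral sensitivity by Gaussian windows of chosen center wavelength and width. A sketch with invented sensitivities and parameters (the paper's actual curves and the estimation step from the four real outputs are not reproduced):

        import numpy as np

        wl = np.arange(400, 701, 10)                       # wavelength grid [nm]
        white_sens = np.exp(-((wl - 550.0) / 180.0) ** 2)  # hypothetical White curve

        def virtual_filter(center_nm, sigma_nm):
            """Virtual color filter: White sensitivity times a Gaussian."""
            g = np.exp(-0.5 * ((wl - center_nm) / sigma_nm) ** 2)
            return white_sens * g

        # Four virtual filters at illustrative centers and widths:
        filters = [virtual_filter(c, s) for c, s in
                   [(450, 25), (510, 30), (570, 30), (630, 35)]]

        def virtual_outputs(reflectance):
            """Virtual sensor outputs for an object's spectral reflectance."""
            return [np.trapz(f * reflectance, wl) for f in filters]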

  11. Planetary rover navigation: improving visual odometry via additional images and multisensor fusion

    NASA Astrophysics Data System (ADS)

    Casalino, G.; Zereik, E.; Simetti, E.; Turetta, A.; Torelli, S.; Sperindé, A.

    2013-12-01

    Visual odometry (VO) is very important for a mobile robot, above all in a planetary scenario, to accurately estimate the rover's motion. The present work deals with the possibility of improving a previously developed VO technique by means of additional image processing, together with suitable mechanisms such as classical Extended/Iterated Kalman filtering and sequence estimators. The possible employment of both techniques is addressed and, consequently, a better-behaving integration scheme is proposed. Moreover, the possibility of exploiting other localization sensors is also investigated, leading to a final multisensor scheme.
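
    As a flavor of the Kalman-based integration discussed, here is a deliberately simplified one-dimensional linear Kalman filter fusing a wheel-odometry prediction with a VO-derived position measurement; the real scheme is multidimensional and uses the Extended/Iterated variants, and all noise variances below are assumed values.

        def kalman_fuse(x, P, u, z, q=0.04, r=0.01):
            """One predict/update cycle for 1-D rover position.

            x, P : prior state (position) and its variance
            u    : displacement predicted from wheel odometry
            z    : position measurement derived from visual odometry
            q, r : process and measurement noise variances (assumed)
            """
            x_pred, P_pred = x + u, P + q          # predict with odometry
            K = P_pred / (P_pred + r)              # Kalman gain
            x_new = x_pred + K * (z - x_pred)      # correct with VO
            P_new = (1.0 - K) * P_pred
            return x_new, P_new

        x, P = 0.0, 1.0
        for u, z in [(0.50, 0.48), (0.50, 1.02), (0.45, 1.50)]:
            x, P = kalman_fuse(x, P, u, z)
        print(x, P)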

  12. Big Bang, Big Data, Big Computers

    NASA Astrophysics Data System (ADS)

    Conference website: http://www.apc.univ-paris7.fr/APC/Conferences/Workshop_Big3/Home.html Observations of the Cosmic Microwave Background (CMB) radiation have transformed modern cosmology, propelling it into the high-precision, data-driven science it is today. CMB data analysis has been a cornerstone of this transformation, and it continues in this role as it prepares to meet what may be its ultimate challenge, posed by forthcoming data sets that are ever-growing in size and complexity and that are required by the new science goals set for the field. These include providing key pieces of information about the very early Universe: the Gaussianity of the initial conditions, the presence of primordial gravity waves, as well as constraints on large-scale structure formation and possibly the properties of dark energy. The sophistication of the data models involved is matched by the precision levels that must be attained to deliver robust detections and firm conclusions. The overall challenge is indeed breathtaking and, without a doubt, success will only be possible if the data analysis effort becomes truly interdisciplinary and capitalizes on the latest advances in statistics, applied mathematics, and computer science - all of which constitute veritable foundations of contemporary data analysis work.

  13. A method for predicting DCT-based denoising efficiency for grayscale images corrupted by AWGN and additive spatially correlated noise

    NASA Astrophysics Data System (ADS)

    Rubel, Aleksey S.; Lukin, Vladimir V.; Egiazarian, Karen O.

    2015-03-01

    Results of denoising based on the discrete cosine transform (DCT) are obtained for a wide class of images corrupted by additive noise. Three types of noise are analyzed: additive white Gaussian noise and additive spatially correlated Gaussian noise with middle and high correlation levels. The TID2013 image database and some additional images are taken as test images. A conventional DCT filter and BM3D are used as denoising techniques. Denoising efficiency is described by the PSNR and PSNR-HVS-M metrics. Within the hard-thresholding denoising mechanism, DCT-spectrum coefficient statistics are used to characterize images and, subsequently, denoising efficiency for them. Denoising efficiency results are fitted against these statistics and efficient approximations are obtained. It is shown that the obtained approximations provide high accuracy in predicting denoising efficiency.
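
    The conventional DCT filter referred to above is easy to state: transform each 8x8 block, zero the coefficients whose magnitude falls below a threshold proportional to the noise standard deviation, and invert. A sketch follows; the commonly used 2.7σ threshold and the non-overlapping blocking are simplifying assumptions.

        import numpy as np
        from scipy.fft import dctn, idctn

        def dct_denoise(img, sigma, block=8, beta=2.7):
            """Blockwise DCT hard-thresholding for AWGN of known sigma."""
            out = np.zeros_like(img, dtype=float)
            thr = beta * sigma
            h, w = img.shape
            for y in range(0, h - block + 1, block):
                for x in range(0, w - block + 1, block):
                    c = dctn(img[y:y+block, x:x+block], norm='ortho')
                    dc = c[0, 0]                  # preserve the DC term
                    c[np.abs(c) < thr] = 0.0      # hard threshold
                    c[0, 0] = dc
                    out[y:y+block, x:x+block] = idctn(c, norm='ortho')
            return out

        rng = np.random.default_rng(0)
        clean = np.tile(np.linspace(0.0, 255.0, 64), (64, 1))
        noisy = clean + rng.normal(0.0, 10.0, clean.shape)
        denoised = dct_denoise(noisy, sigma=10.0)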

  14. Additive controlled synthesis of gold nanorods (GNRs) for two-photon luminescence imaging of cancer cells

    NASA Astrophysics Data System (ADS)

    Zhu, Jing; Yong, Ken-Tye; Roy, Indrajit; Hu, Rui; Ding, Hong; Zhao, Lingling; Swihart, Mark T.; He, Guang S.; Cui, Yiping; Prasad, Paras N.

    2010-07-01

    Gold nanorods (GNRs) with a longitudinal surface plasmon resonance peak that is tunable from 600 to 1100 nm have been fabricated in a cetyltrimethylammonium bromide (CTAB) micellar medium using hydrochloric acid and silver nitrate as additives to control their shape and size. By manipulating the concentrations of silver nitrate and hydrochloric acid, the aspect ratio of the GNRs was reliably and reproducibly tuned from 2.5 to 8. The GNRs were first coated with polyelectrolyte multilayers and then bioconjugated to transferrin (Tf) to target pancreatic cancer cells. Two-photon imaging of the bioconjugated GNRs demonstrated receptor-mediated uptake of the bioconjugates into Panc-1 cells, which overexpress the transferrin receptor (TfR). The bioconjugated GNR formulation exhibited very low toxicity, suggesting that it is biocompatible and potentially suitable for targeted two-photon bioimaging.

  15. Solving the Big Data (BD) Problem in Advanced Manufacturing (Subcategory for work done at Georgia Tech. Study Process and Design Factors for Additive Manufacturing Improvement)

    SciTech Connect

    Clark, Brett W.; Diaz, Kimberly A.; Ochiobi, Chinaza Darlene; Paynabar, Kamran

    2015-09-01

    3D printing, originally known as additive manufacturing, is a process of making 3-dimensional solid objects from a CAD file. This groundbreaking technology is widely used for industrial and biomedical purposes, such as building objects, tools, body parts and cosmetics. An important benefit of 3D printing is cost reduction and manufacturing flexibility; complex parts can be built at a fraction of the price. However, layer-by-layer printing of complex shapes adds error due to surface roughness, and any such error results in poor-quality products with inaccurate dimensions. The main purpose of this research is to measure the printing error for parts with different geometric shapes and to analyze the measurements to find optimal printing settings that minimize the error. We use a Design of Experiments framework, focusing on parts with cone and ellipsoid shapes. We found that the orientation and the geometry of the parts have a significant effect on the printing error. From our analysis, we also determined the optimal orientation that gives the least printing error.
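
    A run of the kind described can be generated and analyzed with a coded two-level factorial design and a main-effects linear model; the factors, levels, and error responses below are made up for illustration and are not the study's data.

        import numpy as np
        from itertools import product

        # Coded factors: shape (-1 = cone, +1 = ellipsoid) and
        # orientation (-1 = horizontal, +1 = vertical); two replicates each.
        runs = [(s, o) for s, o in product([-1, 1], repeat=2)
                for _ in range(2)]
        y = np.array([0.21, 0.19, 0.35, 0.33, 0.12, 0.14, 0.28, 0.30])

        # Model: y = b0 + b1*shape + b2*orientation + b3*(shape*orientation)
        X = np.array([[1.0, s, o, s * o] for s, o in runs])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(dict(zip(["mean", "shape", "orientation", "interaction"],
                       beta.round(3))))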

  16. Big data in multiple sclerosis: development of a web-based longitudinal study viewer in an imaging informatics-based eFolder system for complex data analysis and management

    NASA Astrophysics Data System (ADS)

    Ma, Kevin; Wang, Ximing; Lerner, Alex; Shiroishi, Mark; Amezcua, Lilyana; Liu, Brent

    2015-03-01

    In the past, we have developed and presented a multiple sclerosis eFolder system for patient data storage, image viewing, and automatic lesion quantification, with results stored in DICOM-SR format. The web-based system aims to be integrated in DICOM-compliant clinical and research environments to aid clinicians in patient treatment and disease tracking. This year, we have further developed the eFolder system to handle big data analysis and data mining in today's medical imaging field. The database has been updated to allow data mining and data look-up from DICOM-SR lesion analysis contents. Longitudinal studies are tracked, and any changes in lesion volumes and brain parenchyma volumes are calculated and shown on the web-based user interface as graphical representations. Longitudinal changes in lesion characteristics are compared with the patient's disease history, including treatments, symptom progression, and any other changes in the disease profile. The image viewer is updated so that imaging studies can be viewed side by side to allow visual comparisons. We aim to use the web-based medical imaging informatics eFolder system to demonstrate big data analysis in medical imaging, and to use the analysis results to predict MS disease trends and patterns in Hispanic and Caucasian populations in our pilot study. The discovery of disease patterns among the two ethnicities is a big data analysis result that will help lead to personalized patient care and treatment planning.

  17. Studies of Microflares in RHESSI Hard X-Ray, Big Bear Solar Observatory Hα, and Michelson Doppler Imager Magnetograms

    NASA Astrophysics Data System (ADS)

    Liu, Chang; Qiu, Jiong; Gary, Dale E.; Krucker, Säm; Wang, Haimin

    2004-03-01

    In this paper, we present a study of the morphology of 12 microflares jointly observed by RHESSI in the energy range from 3 to 15 keV and by Big Bear Solar Observatory (BBSO) at the Hα line. They are A2-B3 events in GOES classification. From their time profiles, we find that all of these microflares are seen in soft X-ray, hard X-ray, and Hα wavelengths, and their temporal evolution resembles that of large flares. Co-aligned hard X-ray, Hα, and magnetic field observations show that the events all occurred in active regions and were located near magnetic neutral lines. In almost all of the events, the hard X-ray sources are elongated structures connecting two Hα bright kernels in opposite magnetic fields. These results suggest that, similar to large flares, the X-ray sources of the microflares represent emission from small magnetic loops and that the Hα bright kernels indicate emission at footpoints of these flare loops in the lower atmosphere. Among the 12 microflares, we include five events that are clearly associated with type III radio bursts as observed by the radio spectrometer on board Wind. Spectral fitting results indicate the nonthermal origin of the X-ray emission at over ~10 keV during the impulsive phase of all the events, and the photon spectra of the microflares associated with type III bursts are generally harder than those without type III bursts. TRACE observations at EUV wavelengths are available for five events in our list, and in two of these, coincident EUV jets are clearly identified to be spatially associated with the microflares. Such findings suggest that some microflares are produced by magnetic reconnection, which results in closed compact loops and open field lines. Electrons accelerated during the flare escape along the open field lines to interplanetary space.

  18. Big Sky Carbon Atlas

    DOE Data Explorer

    The Big Sky Carbon Atlas is an online geoportal designed for you to discover, interpret, and access geospatial data and maps relevant to decision support and education on carbon sequestration in the Big Sky Region. In serving as the public face of the Partnership's spatial Data Libraries, the Atlas provides a gateway to geographic information characterizing CO2 sources, potential geologic sinks, terrestrial carbon fluxes, civil and energy infrastructure, energy use, and related themes. In addition to directly serving the BSCSP and its stakeholders, the Atlas feeds regional data to the NatCarb Portal, contributing to a national perspective on carbon sequestration. Established components of the Atlas include a gallery of thematic maps and an interactive map that allows you to: • Navigate and explore regional characterization data through a user-friendly interface • Print your map views or publish them as PDFs • Identify technical references relevant to specific areas of interest • Calculate straight-line or pipeline-constrained distances from point sources of CO2 to potential geologic sink features • Download regional data layers (feature under development) (Acknowledgment to the Big Sky Carbon Sequestration Partnership (BSCSP); see home page at http://www.bigskyco2.org/)

  19. Live 3D image overlay for arterial duct closure with Amplatzer Duct Occluder II additional size.

    PubMed

    Goreczny, Sebastian; Morgan, Gareth J; Dryzek, Pawel

    2016-03-01

    Despite several reports describing echocardiographic guidance of ductal closure, two-dimensional angiography remains the mainstay imaging tool; three-dimensional rotational angiography has the potential to overcome some of the drawbacks of standard angiography, and reconstructed image overlay provides reliable guidance for device placement. We describe arterial duct closure solely from a venous approach, guided by live three-dimensional image overlay. PMID:26358032

  20. Additional value of biplane transoesophageal imaging in assessment of mitral valve prostheses.

    PubMed Central

    Groundstroem, K; Rittoo, D; Hoffman, P; Bloomfield, P; Sutherland, G R

    1993-01-01

    OBJECTIVES--To determine whether biplane transoesophageal imaging offers advantages in the evaluation of mitral prostheses when compared with standard single transverse plane imaging or the precordial approach in suspected prosthetic dysfunction. DESIGN--Prospective study of patients with a mitral valve prosthesis in situ using precordial and biplane transoesophageal ultrasonography. SETTING--Tertiary cardiac referral centre. SUBJECTS--67 consecutive patients with suspected dysfunction of a mitral valve prosthesis (16 had bioprostheses and 51 mechanical prostheses) who underwent precordial, transverse plane, and biplane transoesophageal echocardiography. Correlative invasive confirmation from surgery or angiography, or both, was available in 44 patients. MAIN OUTCOME MEASURES--Number, type, and site of leak according to the three means of scanning. RESULTS--Transverse plane transoesophageal imaging alone identified all 31 medial/lateral paravalvar leaks but only 24/30 of the anterior/posterior leaks. Combining the information from both imaging planes confirmed that biplane scanning identified all paravalvar leaks. Five of the six patients with prosthetic valve endocarditis, all three with valvar thrombus or obstruction, and all three with mitral annulus rupture were diagnosed from transverse plane imaging alone. Longitudinal plane imaging alone enabled diagnosis of the remaining case of prosthetic endocarditis and a further case of subvalvar pannus formation. CONCLUSIONS--Transverse plane transoesophageal imaging was superior to longitudinal imaging in identifying medial and lateral lesions around the sewing ring of a mitral valve prosthesis. Longitudinal plane imaging was superior in identifying anterior and posterior lesions. Biplane imaging is therefore an important development in the study of mitral prosthesis function. PMID:8398497

  1. Application of Tapping-Mode Scanning Probe Electrospray Ionization to Mass Spectrometry Imaging of Additives in Polymer Films

    PubMed Central

    Shimazu, Ryo; Yamoto, Yoshinari; Kosaka, Tomoya; Kawasaki, Hideya; Arakawa, Ryuichi

    2014-01-01

    We report the application of tapping-mode scanning probe electrospray ionization (t-SPESI) to mass spectrometry imaging of industrial materials. The t-SPESI parameters, including the tapping solvent composition, solvent flow rate, number of taps at each spot, and step size, were optimized using a quadrupole mass spectrometer to improve mass spectrometry (MS) imaging of thin-layer chromatography (TLC) plates and of additives in polymer films. A spatial resolution of approximately 100 μm was achieved by t-SPESI imaging mass spectrometry using a fused-silica capillary (50 μm i.d., 150 μm o.d.) with the flow rate set at 0.2 μL/min. This allowed us to obtain discriminable MS imaging profiles of three dyes separated by TLC and of the additive stripe pattern of a PMMA model film depleted by UV irradiation. PMID:26819894

  2. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Navidi, Fatemeh; Beard, Daniel A.; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  3. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  4. Satellite-based land use mapping: comparative analysis of Landsat-8, Advanced Land Imager, and big data Hyperion imagery

    NASA Astrophysics Data System (ADS)

    Pervez, Wasim; Uddin, Vali; Khan, Shoab Ahmad; Khan, Junaid Aziz

    2016-04-01

    Until recently, Landsat technology suffered from a low signal-to-noise ratio (SNR) and comparatively poor radiometric resolution, which limited its application to inland water and land use/cover mapping. The new generation of Landsat, the Landsat Data Continuity Mission carrying the Operational Land Imager (OLI), offers improved SNR and high radiometric resolution. This study evaluated the utility of orthoimagery from OLI in comparison with the Advanced Land Imager (ALI) and hyperspectral Hyperion (after preprocessing) with respect to spectral profiling of classes, land use/cover classification, classification accuracy assessment, classifier selection, study area selection, and other applications. For each data source, the support vector machine (SVM) model outperformed the spectral angle mapper (SAM) classifier in terms of class discrimination accuracy (i.e., water, built-up area, mixed forest, shrub, and bare soil). Using the SVM classifier, Hyperion hyperspectral orthoimagery achieved higher overall accuracy than OLI and ALI. However, OLI outperformed both hyperspectral Hyperion and multispectral ALI using the SAM classifier, and with the SVM classifier outperformed ALI in terms of overall accuracy and individual classes. The results show that the new generation of Landsat achieves higher mapping accuracies compared with the previous Landsat multispectral satellite series.
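
    Of the two classifiers compared, the spectral angle mapper is simple enough to state in a few lines: each pixel is assigned the class whose reference spectrum subtends the smallest angle with the pixel's spectrum. A sketch with placeholder spectra (the band count and class references are invented):

        import numpy as np

        def spectral_angle_mapper(pixels, refs):
            """Classify pixel spectra by minimum spectral angle.

            pixels : (n_pixels, n_bands) spectra
            refs   : (n_classes, n_bands) class reference spectra
            Returns (n_pixels,) class indices.
            """
            p = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
            r = refs / np.linalg.norm(refs, axis=1, keepdims=True)
            angles = np.arccos(np.clip(p @ r.T, -1.0, 1.0))
            return np.argmin(angles, axis=1)

        rng = np.random.default_rng(1)
        refs = rng.random((5, 200))      # e.g., water, built-up, forest, ...
        pixels = rng.random((1000, 200))
        labels = spectral_angle_mapper(pixels, refs)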

  5. Big data for bipolar disorder.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process. PMID:27068058

  6. Comparison of operation efficiency for the insert task when using stereoscopic images with additional lines, stereoscopic images, and a manipulator with force feedback

    NASA Astrophysics Data System (ADS)

    Matsunaga, Katsuya; Shidoji, Kazunori; Matsubara, Kenjiro

    1999-05-01

    It has been reported that operation efficiency in teleoperation using stereoscopic video images is lower than when using the naked eye in real environments. Here, the authors tried to improve the human-machine interface of this particular system to achieve higher operation efficiency with stereoscopic video images by adding other information. An experiment was carried out under the following four conditions: when the insert task was performed by subjects using conventional stereoscopic video images; when centering lines for the cylindrical objects and holes were added to the conventional stereoscopic video images; when force feedback was provided through the system manipulator as one object touched another; and when both the additional centering lines and the force feedback were provided. The subject's task was to insert a cylindrical object into a round hole. The completion time was measured from the starting signal to the moment the object was inserted into the hole. Completion time when additional lines were given was shorter than when force feedback was provided and when no additional information was provided. It was concluded that the additional visual information contributed more to recognition of the space than did the additional information about surface phenomena.

  7. Five Big Ideas

    ERIC Educational Resources Information Center

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  8. Enhancement of Glossiness Perception by Retinal-Image Motion: Additional Effect of Head-Yoked Motion Parallax

    PubMed Central

    Tani, Yusuke; Araki, Keisuke; Nagai, Takehiro; Koida, Kowa; Nakauchi, Shigeki; Kitazaki, Michiteru

    2013-01-01

    It has been argued that when an observer moves, a contingent retinal-image motion of a stimulus would strengthen the perceived glossiness. This would be attributed to the veridical perception of three-dimensional structure by motion parallax. However, it has not been investigated whether the effect of motion parallax is more than that of retinal-image motion of the stimulus. Using a magnitude estimation method, we examine in this paper whether cross-modal coordination of the stimulus change and the observer's motion (i.e., motion parallax) is essential or the retinal-image motion alone is sufficient for enhancing the perceived glossiness. Our data show that a retinal-image motion simulating motion parallax without head motion strengthened the perceived glossiness but that its effect was weaker than that of motion parallax with head motion. These results suggest the existence of an additional effect of the cross-modal coordination between vision and proprioception on glossiness perception. That is, motion parallax enhances the perception of glossiness, in addition to retinal-image motions of specular surfaces. PMID:23336006

  9. SU-E-J-06: Additional Imaging Guidance Dose to Patient Organs Resulting From X-Ray Tubes Used in CyberKnife Image Guidance System

    SciTech Connect

    Sullivan, A; Ding, G

    2015-06-15

    Purpose: The use of image-guided radiation therapy (IGRT) has become increasingly common, but the additional radiation exposure resulting from repeated image guidance procedures raises concerns. Although there are many studies reporting imaging dose from different image guidance devices, the imaging dose for the CyberKnife Robotic Radiosurgery System is not available. This study provides estimated organ doses resulting from image guidance procedures on the CyberKnife system. Methods: Commercially available Monte Carlo software, PCXMC, was used to calculate average organ doses resulting from the x-ray tubes used in the CyberKnife system. There are seven imaging protocols with tube potentials ranging from 60 to 120 kV at 15 mAs for treatment sites in the cranium, head and neck, thorax, and abdomen. The output of each imaging protocol was measured at the treatment isocenter. For each site and protocol, adult body sizes ranging from anorexic to extremely obese were simulated, since organ dose depends on patient size. Doses for all organs within the imaging field-of-view of each site were calculated for a single image acquisition from both of the orthogonal x-ray tubes. Results: Average organ doses were <1.0 mGy for every treatment site and imaging protocol. For a given organ, dose increases as kV increases or body size decreases. Higher doses are typically reported for skeletal components, such as the skull, ribs, or clavicles, than for soft-tissue organs. Typical organ doses due to a single exposure are estimated as 0.23 mGy to the brain, 0.29 mGy to the heart, 0.08 mGy to the kidneys, etc., depending on the imaging protocol and site. Conclusion: Organ doses vary with treatment site, imaging protocol, and patient size. Although the organ dose from a single image acquisition resulting from two orthogonal beams is generally insignificant, the sum of repeated image acquisitions (>100) could reach 10-20 cGy for a typical treatment fraction.

  10. Big Data

    PubMed Central

    SOBEK, MATTHEW; CLEVELAND, LARA; FLOOD, SARAH; HALL, PATRICIA KELLY; KING, MIRIAM L.; RUGGLES, STEVEN; SCHROEDER, MATTHEW

    2011-01-01

    The Minnesota Population Center (MPC) provides aggregate data and microdata that have been integrated and harmonized to maximize cross-temporal and cross-spatial comparability. All MPC data products are distributed free of charge through an interactive Web interface that enables users to limit the data and metadata being analyzed to the samples and variables of interest to their research. In this article, the authors describe the integrated databases available from the MPC, report on recent additions and enhancements to these data sets, and summarize new online tools and resources that help users analyze the data over time. They conclude with a description of the MPC's newest and largest infrastructure project to date: a global population and environment data network. PMID:21949459

  11. Big data are coming to psychiatry: a general introduction.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Bauer, Michael

    2015-12-01

    Big data are coming to the study of bipolar disorder and all of psychiatry. Data are coming from providers and payers (including EMR, imaging, insurance claims and pharmacy data), from omics (genomic, proteomic, and metabolomic data), and from patients and non-providers (data from smart phone and Internet activities, sensors and monitoring tools). Analysis of the big data will provide unprecedented opportunities for exploration, descriptive observation, hypothesis generation, and prediction, and the results of big data studies will be incorporated into clinical practice. Technical challenges remain in the quality, analysis and management of big data. This paper discusses some of the fundamental opportunities and challenges of big data for psychiatry. PMID:26440506

  12. Big Burst

    NASA Technical Reports Server (NTRS)

    2007-01-01

    What would a starburst look like if you could see it up close? Probably a lot like the Carina Nebula, a rather small region of one of the Galaxy's spiral arms: a complex of massive clouds of gas and dust, and a region where, about a million or two years ago, for some reason, an extraordinary number of very massive stars formed. And at only some 8500 light-years distant, it's relatively nearby. Such regions are of great interest to astronomers, since they are very young, and they show how massive stars form and how they create and disperse the elements necessary for life. The image above is a beautiful new study of the Carina Nebula in X-rays, taken by the XMM-Newton X-ray observatory. The X-ray colors represent X-ray energy, as usual: red means low-energy X-ray emission, green is somewhat higher in energy than red, and blue somewhat higher than green. Thus blue objects are either very high-energy objects or else very absorbed objects. Most of the point sources are massive stars, some are X-ray emitting binaries, and some are objects still to be identified. The clustering of the X-ray point sources is very evident, showing how massive stars like to form in groups. A number of interesting sources are identified. Interestingly, the Carina Nebula is immersed in a large diffuse glow of X-radiation. This X-ray glow might be produced by the combined winds of the massive stars colliding with the dense cold clouds in the nebula. Another interesting possibility: perhaps this emission represents an old supernova. But if so, which star died?

  13. Testing a Gender Additive Model: The Role of Body Image in Adolescent Depression

    ERIC Educational Resources Information Center

    Bearman, Sarah Kate; Stice, Eric

    2008-01-01

    Despite consistent evidence that adolescent girls are at greater risk of developing depression than adolescent boys, risk factor models that account for this difference have been elusive. The objective of this research was to examine risk factors proposed by the "gender additive" model of depression that attempts to partially explain the increased…

  14. Thermal Imaging for Assessment of Electron-Beam Free Form Fabrication (EBF³) Additive Manufacturing Welds

    NASA Technical Reports Server (NTRS)

    Zalameda, Joseph N.; Burke, Eric R.; Hafley, Robert A.; Taminger, Karen M.; Domack, Christopher S.; Brewer, Amy R.; Martin, Richard E.

    2013-01-01

    Additive manufacturing is a rapidly growing field in which 3-dimensional parts can be produced layer by layer. NASA's electron beam free-form fabrication (EBF³) technology is being evaluated to manufacture metallic parts in a space environment. The benefits of EBF³ technology are weight savings to support space missions, rapid prototyping in a zero gravity environment, and improved vehicle readiness. The EBF³ system is composed of 3 main components: an electron beam gun, a multi-axis positioning system, and a metallic wire feeder. The electron beam is used to melt the wire, and the multi-axis positioning system is used to build the part layer by layer. To ensure a quality weld, a near infrared (NIR) camera is used to image the melt pool and solidification areas. This paper describes the calibration and application of a NIR camera for temperature measurement. In addition, image processing techniques are presented for weld assessment metrics.

  15. Assessing the use of an infrared spectrum hyperpixel array imager to measure temperature during additive and subtractive manufacturing

    NASA Astrophysics Data System (ADS)

    Whitenton, Eric; Heigel, Jarred; Lane, Brandon; Moylan, Shawn

    2016-05-01

    Accurate non-contact temperature measurement is important for optimizing manufacturing processes. This applies to both additive (3D printing) and subtractive (material removal by machining) manufacturing. Performing accurate single-wavelength thermography suffers numerous challenges. A potential alternative is hyperpixel-array hyperspectral imaging. Focusing on metals, this paper discusses the issues involved, such as unknown or changing emissivity, inaccurate greybody assumptions, motion blur, and size-of-source effects. The algorithm that converts measured thermal spectra to emissivity and temperature uses a customized multistep nonlinear equation solver to determine the best-fit emission curve. Emissivity dependence on wavelength may be assumed uniform or may follow a relationship typical for metals. The custom software displays residuals for intensity, temperature, and emissivity to gauge the correctness of the greybody assumption. Initial results are shown from a laser powder-bed fusion additive process, as well as from a machining process. In addition, the effects of motion blur, which occur in both additive and subtractive manufacturing processes, are analyzed. In a laser powder-bed fusion additive process, the scanning laser causes the melt pool to move rapidly, producing a motion blur-like effect. In machining, measuring the temperature of the rapidly moving chip is a desirable goal for developing and validating simulations of the cutting process. A moving slit target is imaged to characterize how the measured temperature values are affected by motion of the target.
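
    The best-fit emission curve idea can be prototyped by least-squares fitting Planck's law scaled by a wavelength-independent (greybody) emissivity to a measured spectrum; the paper's solver is multistep and also admits wavelength-dependent emissivity, which this sketch does not. The temperature, band, and noise level below are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        h, c, k = 6.62607015e-34, 2.99792458e8, 1.380649e-23

        def planck(wl, T):
            """Blackbody spectral radiance at wavelength wl [m], temp T [K]."""
            return 2 * h * c**2 / wl**5 / np.expm1(h * c / (wl * k * T))

        def greybody(wl, T, eps):
            return eps * planck(wl, T)

        wl = np.linspace(1.0e-6, 5.0e-6, 80)   # 1-5 um band
        rng = np.random.default_rng(2)
        measured = greybody(wl, 1700.0, 0.35) * (1.0 + 0.02 * rng.standard_normal(wl.size))

        popt, _ = curve_fit(greybody, wl, measured, p0=[1000.0, 0.5],
                            bounds=([300.0, 0.0], [4000.0, 1.0]))
        print(f"T = {popt[0]:.0f} K, emissivity = {popt[1]:.2f}")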

  16. Determination of detergent and dispersant additives in gasoline by ring-oven and near infrared hyperspectral imaging.

    PubMed

    Rodrigues e Brito, Lívia; da Silva, Michelle P F; Rohwedder, Jarbas J R; Pasquini, Celio; Honorato, Fernanda A; Pimentel, Maria Fernanda

    2015-03-10

    A method using the ring-oven technique for pre-concentration on filter paper discs, combined with near infrared hyperspectral imaging, is proposed to identify four detergent and dispersant additives and to determine their concentration in gasoline. Different approaches were used to select the best image data processing in order to gather the relevant spectral information. This was attained by selecting the pixels of the region of interest (ROI) using a pre-calculated threshold value of the PCA scores arranged as histograms; summing up the selected spectra to achieve representativeness; and compensating for the superimposed filter paper spectral information, also supported by score histograms for each individual sample. The best classification model was achieved using linear discriminant analysis with a genetic algorithm (LDA/GA), whose correct classification rate on the external validation set was 92%. Prior classification of the type of additive present in the gasoline is necessary to define the PLS model required for its quantitative determination. Considering that two of the additives studied present high spectral similarity, a single PLS regression model was constructed to predict their content in gasoline, while two additional models were used for the remaining additives. The results of the external validation of these regression models showed a mean percentage error of prediction varying from 5 to 15%. PMID:25732308
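
    The quantification step pairs NIR spectra with known additive concentrations through partial least squares; a minimal sketch with synthetic spectra standing in for the pre-concentrated filter-disc measurements (the component count and noise level are arbitrary):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        n, bands = 120, 400
        conc = rng.uniform(50.0, 500.0, n)                # concentration (mg/L)
        signature = np.sin(np.linspace(0.0, 6.0, bands))  # fake additive band
        X = np.outer(conc, signature) + rng.normal(0.0, 5.0, (n, bands))

        X_tr, X_te, y_tr, y_te = train_test_split(X, conc, random_state=0)
        pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
        pred = pls.predict(X_te).ravel()
        mpe = 100.0 * np.mean(np.abs(pred - y_te) / y_te)
        print(f"mean percentage error: {mpe:.1f}%")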

  17. Scanning thermal microscopy probe capable of simultaneous electrical imaging and the addition of diamond tip

    NASA Astrophysics Data System (ADS)

    Brown, E.; Hao, L.; Cox, D. C.; Gallop, J. C.

    2008-03-01

    Scanning Thermal Microscopy (SThM) is a scanning probe technique that allows the mapping of the thermal properties and/or temperature of a substrate. Developments in this technique are of great importance for the study of thermal transport at the micron and nano scales, for instance to better understand heat transport in nano-electronic devices or energy transfer in biological systems. Here we describe: 1) the scanning technique developed to acquire simultaneous images of the topography and the thermal and electrical properties of the substrate using a commercially available Veeco SThM probe; and 2) how the SThM probe was modified by mounting a micron-sized diamond pyramid on its tip to improve its lateral and topographic resolution, together with tests on the performance of the modified probe.

  18. Big Spherules near 'Victoria'

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This frame from the microscopic imager on NASA's Mars Exploration Rover Opportunity shows spherules up to about 5 millimeters (one-fifth of an inch) in diameter. The camera took this image during the 924th Martian day, or sol, of Opportunity's Mars-surface mission (Aug. 30, 2006), when the rover was about 200 meters (650 feet) north of 'Victoria Crater.'

    Opportunity discovered spherules like these, nicknamed 'blueberries,' at its landing site in 'Eagle Crater,' and investigations determined them to be iron-rich concretions that formed inside deposits soaked with groundwater. However, such concretions were much smaller or absent at the ground surface along much of the rover's trek of more than 5 kilometers (3 miles) southward to Victoria. The big ones showed up again when Opportunity got to the ring, or annulus, of material excavated and thrown outward by the impact that created Victoria Crater. Researchers hypothesize that some layer beneath the surface in Victoria's vicinity was once soaked with water long enough to form the concretions, that the crater-forming impact dispersed some material from that layer, and that Opportunity might encounter that layer in place if the rover drives down into the crater.

  19. Temperature Profile and Imaging Analysis of Laser Additive Manufacturing of Stainless Steel

    NASA Astrophysics Data System (ADS)

    Islam, M.; Purtonen, T.; Piili, H.; Salminen, A.; Nyrhilä, O.

    Powder bed fusion is a laser additive manufacturing (LAM) technology used to manufacture parts layer by layer from powdered metallic materials. The technology has advanced vastly in recent years, and current systems can be used to manufacture functional parts for, e.g., the aerospace industry. The performance and accuracy of the systems have also improved, but certain difficulties in the powder fusion process reduce the final quality of the parts. One of these is commonly known as the balling phenomenon. The aim of this study was to define some of the process characteristics of powder bed fusion by performing comparative studies with two different test setups, comparing measured temperature profiles and on-line photography of the process. The material used in the research was EOS PH1 stainless steel. Both test systems were equipped with 200 W single-mode fiber lasers. The main result of the research was that some of the process instabilities result from the energy input during the process.

  20. Dual of big bang and big crunch

    SciTech Connect

    Bak, Dongsu

    2007-01-15

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory.

  1. Implementing Big History.

    ERIC Educational Resources Information Center

    Welter, Mark

    2000-01-01

    Contends that world history should be taught as "Big History," a view that includes all space and time beginning with the Big Bang. Discusses five "Cardinal Questions" that serve as a course structure and address the following concepts: perspectives, diversity, change and continuity, interdependence, and causes. (CMK)

  2. Big Ideas in Art

    ERIC Educational Resources Information Center

    Day, Kathleen

    2008-01-01

    In this article, the author shares how she was able to discover some big ideas about art education. She relates how she found great ideas to improve her teaching from the book "Rethinking Curriculum in Art." She also shares how she designed a "Big Idea" unit in her class.

  3. Big data: the management revolution.

    PubMed

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster. PMID:23074865

  4. Big data, big knowledge: big data for personalized healthcare.

    PubMed

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine become the research priority. PMID:26218867

  5. Big Data in industry

    NASA Astrophysics Data System (ADS)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure such as the exabyte, zettabyte, and yottabyte, the last of which measures the largest amounts of data. This growth creates a situation where classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, velocity, and variety of data that is generated continuously, much of it by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years, and it is increasingly recognized in the business world and in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service, by presenting a big data analytics service-oriented architecture. The paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.

  6. BigDog

    NASA Astrophysics Data System (ADS)

    Playter, R.; Buehler, M.; Raibert, M.

    2006-05-01

    BigDog's goal is to be the world's most advanced quadruped robot for outdoor applications. BigDog is aimed at the mission of a mechanical mule - a category with few competitors to date: power autonomous quadrupeds capable of carrying significant payloads, operating outdoors, with static and dynamic mobility, and fully integrated sensing. BigDog is about 1 m tall, 1 m long and 0.3 m wide, and weighs about 90 kg. BigDog has demonstrated walking and trotting gaits, as well as standing up and sitting down. Since its creation in the fall of 2004, BigDog has logged tens of hours of walking, climbing and running time. It has walked up and down 25 & 35 degree inclines and trotted at speeds up to 1.8 m/s. BigDog has walked at 0.7 m/s over loose rock beds and carried over 50 kg of payload. We are currently working to expand BigDog's rough terrain mobility through the creation of robust locomotion strategies and terrain sensing capabilities.

  7. The Big Bang Theory

    SciTech Connect

    Lincoln, Don

    2014-09-30

    The Big Bang is the name of the most respected theory of the creation of the universe. Basically, the theory says that the universe was once smaller and denser and has been expanding for eons. One common misconception is that the Big Bang theory says something about the instant that set the expansion into motion; however, this isn't true. In this video, Fermilab's Dr. Don Lincoln tells about the Big Bang theory and sketches some speculative ideas about what caused the universe to come into existence.

  8. Getting the most out of additional guidance information in deformable image registration by leveraging multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Alderliesten, Tanja; Bosman, Peter A. N.; Bel, Arjan

    2015-03-01

    Incorporating additional guidance information, e.g., landmark/contour correspondence, in deformable image registration is often desirable and is typically done by adding constraints or cost terms to the optimization function. Commonly, the choice between a "hard" constraint and a "soft" additional cost term, as well as the weighting of cost terms in the optimization function, is made on a trial-and-error basis. The aim of this study is to investigate the advantages of exploiting guidance information from a multi-objective optimization perspective. To this end, next to objectives related to match quality and amount of deformation, we define a third objective related to guidance information. Multi-objective optimization eliminates the need to tune a weighting of objectives in a single optimization function a priori, or to strictly fulfill hard guidance constraints. Instead, Pareto-efficient trade-offs between all objectives are found, effectively making the introduction of guidance information straightforward, independent of its type or scale. Further, since complete Pareto fronts also contain less interesting parts (i.e., solutions with near-zero deformation effort), we study how adaptive steering mechanisms can be incorporated to automatically focus more on solutions of interest. We performed experiments on artificial and real clinical data with large differences, including disappearing structures. Results show the substantial benefit of using additional guidance information. Moreover, compared to the 2-objective case, the additional computational cost is negligible. Finally, with the same computational budget, use of the adaptive steering mechanism provides superior solutions in the area of interest.
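
    For intuition, the Pareto-efficient trade-offs mentioned above can be extracted from a set of candidate registrations by a plain non-dominance test over their objective vectors (match quality, deformation effort, guidance mismatch), all to be minimized. The sketch below is a generic illustration under that assumption, not the optimizer used in the study.

        import numpy as np

        def pareto_front(objectives):
            """Boolean mask of non-dominated rows; lower values are better.

            A row is dominated if some other row is <= everywhere
            and strictly < in at least one objective.
            """
            n = objectives.shape[0]
            mask = np.ones(n, dtype=bool)
            for i in range(n):
                others = np.delete(objectives, i, axis=0)
                dominated = np.any(
                    np.all(others <= objectives[i], axis=1) &
                    np.any(others < objectives[i], axis=1))
                mask[i] = not dominated
            return mask

        # Hypothetical objective vectors for 200 candidate registrations:
        # (match cost, deformation magnitude, guidance mismatch)
        candidates = np.random.rand(200, 3)
        front = candidates[pareto_front(candidates)]
        print(front.shape[0], "Pareto-efficient solutions")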

  9. Genesis of the big bang

    NASA Astrophysics Data System (ADS)

    Alpher, Ralph A.; Herman, Robert

    The authors of this volume have been intimately connected with the conception of the big bang model since 1947. Following the late George Gamow's ideas in 1942, and more particularly in 1946, that the early universe was an appropriate site for the synthesis of the elements, they became deeply involved in the question of cosmic nucleosynthesis and particularly the synthesis of the light elements. In the course of this work they developed a general relativistic model of the expanding universe with the physics folded in, which led in a progressive, logical sequence to their prediction of the existence of a present cosmic background radiation, some seventeen years before the observation of such radiation was reported by Penzias and Wilson. In addition, they carried out with James W. Follin, Jr., a detailed study of the physics of what was then considered to be the very early universe, starting a few seconds after the big bang, which still provides a methodology for studies of light element nucleosynthesis. Because of their involvement, they bring a personal perspective to the subject. They present a picture of what is now believed to be the state of knowledge about the evolution of the expanding universe and delineate the story of the development of the big bang model as they have seen and lived it from their own unique vantage point.

  10. The Big Bang Singularity

    NASA Astrophysics Data System (ADS)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore, we compare and contrast the two geometries throughout.
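
    For reference, the FRW models mentioned above are usually written with the line element below, where a(t) is the scale factor and k = -1, 0, +1 sets the spatial curvature; this is the standard textbook form rather than notation taken from the thesis itself. A big bang singularity corresponds to a(t) going to zero at a finite time in the past.

        \[ ds^2 = -dt^2 + a(t)^2 \left[ \frac{dr^2}{1 - kr^2} + r^2 \left( d\theta^2 + \sin^2\theta \, d\phi^2 \right) \right] \]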

  11. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    ERIC Educational Resources Information Center

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own "big" words and dreams. During the one…

  12. Big bang theory under fire.

    NASA Astrophysics Data System (ADS)

    Mitchell, W. C.

    The very old big bang (BB) problems (of the singularity, smoothness, horizon, and flatness) and the failed solutions of inflation theory are presented, together with newer BB problems relating to missing mass (as required for a flat inflationary universe), the age of the universe, radiation from the "decoupling" ("smearing" of the blackbody spectrum), a contrived BB chronology, the abundances of light elements, and redshift anomalies. Newer still are problems regarding inconsistencies of redshift interpretation, curved space, inflation theory, and the decelerating expansion of a BB universe, along with some additional logical inconsistencies of BB theory.

  13. Personality and job performance: the Big Five revisited.

    PubMed

    Hurtz, G M; Donovan, J J

    2000-12-01

    Prior meta-analyses investigating the relation between the Big 5 personality dimensions and job performance have all contained a threat to construct validity, in that much of the data included within these analyses was not derived from actual Big 5 measures. In addition, these reviews did not address the relations between the Big 5 and contextual performance. Therefore, the present study sought to provide a meta-analytic estimate of the criterion-related validity of explicit Big 5 measures for predicting job performance and contextual performance. The results for job performance closely paralleled 2 of the previous meta-analyses, whereas analyses with contextual performance showed more complex relations among the Big 5 and performance. A more critical interpretation of the Big 5-performance relationship is presented, and suggestions for future research aimed at enhancing the validity of personality predictors are provided. PMID:11125652

  14. Will Big Data Mean the End of Privacy?

    ERIC Educational Resources Information Center

    Pence, Harry E.

    2015-01-01

    Big Data is currently a hot topic in the field of technology, and many campuses are considering the addition of this topic into their undergraduate courses. Big Data tools are not just playing an increasingly important role in many commercial enterprises; they are also combining with new digital devices to dramatically change privacy. This article…

  15. Facile preparation and biological imaging of luminescent polymeric nanoprobes with aggregation-induced emission characteristics through Michael addition reaction.

    PubMed

    Lv, Qiulan; Wang, Ke; Xu, Dazhuang; Liu, Meiying; Wan, Qing; Huang, Hongye; Liang, Shangdong; Zhang, Xiaoyong; Wei, Yen

    2016-09-01

    Water-dispersible nanomaterials based on aggregation-induced emission (AIE) dyes have recently attracted increasing attention in the biomedical field because of their unique optical properties and outstanding performance as imaging and therapeutic agents. Methods of conjugating hydrophilic polymers with AIE dyes, which overcome the hydrophobic nature of the dyes and make them widely usable in biomedicine, have previously been explored with great effort. Although great advances have been made in the fabrication and biomedical application of AIE-active polymeric nanoprobes, facile and efficient strategies for fabricating biodegradable AIE-active nanoprobes are still highly desirable. In this work, amphiphilic biodegradable fluorescent organic nanoparticles (PLL-TPE-O-E FONs) have been fabricated for the first time by conjugation of the AIE dye tetraphenylethene acrylate (TPE-O-E) with poly-L-lysine (PLL) through a facile one-step Michael addition reaction, carried out under rather mild conditions, including an air atmosphere, near-room temperature, and the absence of metal catalysts or hazardous reagents. Owing to their unique AIE properties, these amphiphilic copolymers tend to self-assemble into highly luminescent, water-dispersible nanoparticles with sizes ranging from 400 to 600 nm. Laser scanning microscopy and cytotoxicity results revealed that PLL-TPE-O-E FONs can be internalized into the cytoplasm with negligible cytotoxicity, which implies that PLL-TPE-O-E FONs are promising for biological applications. PMID:27311129

  16. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    PubMed

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed. PMID:21859221

  17. 'Big Crater' in 360-degree panorama

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The crater dubbed 'Big Crater,' approximately 2200 meters (7200 feet) away, was imaged by the Imager for Mars Pathfinder (IMP) as part of a 360-degree color panorama taken over sols 8, 9, and 10. 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona.

    Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is an operating division of the California Institute of Technology (Caltech). The Imager for Mars Pathfinder (IMP) was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator.

  18. Big Questions: Missing Antimatter

    SciTech Connect

    Lincoln, Don

    2013-08-27

    Einstein's equation E = mc² is often said to mean that energy can be converted into matter. More accurately, energy can be converted to matter and antimatter. During the first moments of the Big Bang, the universe was smaller, hotter and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However, when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he doesn't answer it. The answer, as in all Big Mysteries, is still unknown and one of the leading research topics of contemporary science.

  19. Big data in biomedicine.

    PubMed

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. PMID:24183925

  20. Bayesian big bang

    NASA Astrophysics Data System (ADS)

    Daum, Fred; Huang, Jim

    2011-09-01

    We show that the flow of particles corresponding to Bayes' rule has a number of striking similarities with the big bang, including cosmic inflation and cosmic acceleration. We derive a PDE for this flow using a log-homotopy from the prior probability density to the posterior probability density. We solve this PDE using the gradient of the solution to Poisson's equation, which is computed using an exact Green's function and the standard Monte Carlo approximation of integrals. The resulting flow is analogous to Coulomb's law in electromagnetics. We have used no physics per se to derive this flow; rather, we have only used Bayes' rule, the definition of normalized probability, and a log-homotopy parameter that could be interpreted as time. The details of this big bang resemble very recent theories much more closely than the so-called new inflation models, which postulate enormous inflation immediately after the big bang.
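
    The log-homotopy referred to here is conventionally written as below, with prior g(x), likelihood h(x), homotopy parameter lambda in [0, 1] playing the role of time, and normalization K(lambda); this is the standard particle-flow form, offered as background rather than quoted from the paper. At lambda = 0 the density is the prior, and at lambda = 1 it is the posterior given by Bayes' rule.

        \[ \log p(x, \lambda) = \log g(x) + \lambda \log h(x) - \log K(\lambda) \]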

  1. Big Questions: Missing Antimatter

    ScienceCinema

    Lincoln, Don

    2014-08-07

    Einstein's equation E = mc² is often said to mean that energy can be converted into matter. More accurately, energy can be converted to matter and antimatter. During the first moments of the Big Bang, the universe was smaller, hotter and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However, when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he doesn't answer it. The answer, as in all Big Mysteries, is still unknown and one of the leading research topics of contemporary science.

  2. Astronomical surveys and big data

    NASA Astrophysics Data System (ADS)

    Mickaelian, Areg M.

    Recent all-sky and large-area astronomical surveys and their catalogued data are reviewed over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, including Fermi-GLAST and INTEGRAL in the γ-ray range; ROSAT, XMM, and Chandra in X-rays; GALEX in the UV; SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range; 2MASS in the NIR; WISE and AKARI IRC in the MIR; IRAS and AKARI FIS in the FIR; NVSS and FIRST in the radio range; and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range is given, along with comparisons between various surveys: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  3. Evidence of big bang turbulence

    NASA Astrophysics Data System (ADS)

    Gibson, Carl H.

    2002-11-01

    Chaotic, eddy-like motions dominated by inertial-vortex forces begin at Planck scales in a hot big-bang-turbulence (BBT) cosmological model where this version of the quantum-gravitational-dynamics epoch produces not only the first space-time-energy of the universe but the first high Reynolds number turbulence and turbulent mixing with Kolmogorov and Batchelor-Obukhov-Corrsin velocity and temperature gradient spectra. Strong-force-freeze-out and inflation produced the first fossil-temperature-turbulence by stretching the fluctuations beyond the horizon scale ct of causal connection for light speed c and time t. Recent Cosmic Background Imager spectra of the cosmic microwave background (CMB) temperature anisotropies at high wavenumbers support the prediction that fossil BBT fluctuation patterns imprinted by nucleosynthesis on light element densities and the associated Sachs-Wolfe temperature fluctuations should not decay by thermal diffusion as expected if the CMB anisotropies were acoustic as commonly assumed. Extended Self Similarity coefficients of the CMB anisotropies exactly match those of high Reynolds number turbulence (Bershadskii and Sreenivasan 2002), supporting the conclusion that fossil big-bang-turbulence seeded nucleosynthesis of light elements and the first hydro-gravitational structure formation.
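
    The Kolmogorov velocity spectrum invoked above is the standard inertial-range form, with energy spectral density E(k) at wavenumber k, dissipation rate \varepsilon, and a universal constant C of about 1.5; it is quoted here as textbook background, not from the paper.

        \[ E(k) = C \, \varepsilon^{2/3} \, k^{-5/3} \]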

  4. A Big Bang Lab

    ERIC Educational Resources Information Center

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in a lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  5. Big-City Rules

    ERIC Educational Resources Information Center

    Gordon, Dan

    2011-01-01

    When it comes to implementing innovative classroom technology programs, urban school districts face significant challenges stemming from their big-city status. These range from large bureaucracies, to scalability, to how to meet the needs of a more diverse group of students. Because of their size, urban districts tend to have greater distance…

  6. Big Enough for Everyone?

    ERIC Educational Resources Information Center

    Coote, Anna

    2010-01-01

    The UK's coalition government wants to build a "Big Society." The Prime Minister says "we are all in this together" and building it is the responsibility of every citizen as well as every government department. The broad vision is welcome, but everything depends on how the vision is translated into policy and practice. The government aims to put…

  7. The big bang

    NASA Astrophysics Data System (ADS)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The Big Bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, continuing through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  8. Thinking Big, Aiming High

    ERIC Educational Resources Information Center

    Berkeley, Viv

    2010-01-01

    What do teachers, providers and policymakers need to do in order to support disabled learners to "think big and aim high"? That was the question put to delegates at NIACE's annual disability conference. Some clear themes emerged, with delegates raising concerns about funding, teacher training, partnership-working and employment for disabled…

  9. The Big Empty.

    ERIC Educational Resources Information Center

    Brook, Richard; Smith, Shelley; Tisdale, Mary

    1995-01-01

    Discusses "The Big Empty" or, the Great Basin. Suggests that it is not empty but rather a great ecosystem rich in plants, animals, and minerals. Presents information and activities to guide students in exploring the Great Basin in order to understand the ways in which such an arid and seemingly harsh environment can support so many living things.…

  10. A Sobering Big Idea

    ERIC Educational Resources Information Center

    Wineburg, Sam

    2006-01-01

    Since Susan Adler, Alberta Dougan, and Jesus Garcia like "big ideas," the author offers one to ponder: young people in this country cannot read with comprehension. The saddest thing about this crisis is that it is no secret. The 2001 results of the National Assessment of Educational Progress (NAEP) for reading, published in every major newspaper,…

  11. The Big Fish

    ERIC Educational Resources Information Center

    DeLisle, Rebecca; Hargis, Jace

    2005-01-01

    The Killer Whale, Shamu jumps through hoops and splashes tourists in hopes for the big fish, not because of passion, desire or simply the enjoyment of doing so. What would happen if those fish were obsolete? Would this killer whale be able to find the passion to continue to entertain people? Or would Shamu find other exciting activities to do…

  12. Big Bang Theory

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    The theory which asserts that the universe originated a finite time ago by expanding from an infinitely compressed state. According to this model, space, time and matter originated together, and the universe has been expanding ever since. Key stages in the history of the Big Bang universe are summarized below....

  13. The Big Sky inside

    ERIC Educational Resources Information Center

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  14. Business and Science - Big Data, Big Picture

    NASA Astrophysics Data System (ADS)

    Rosati, A.

    2013-12-01

    Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, it defines what the term means. But business is very different from science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in the business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  15. Dual-source dual-energy CT with additional tin filtration: Dose and image quality evaluation in phantoms and in-vivo

    PubMed Central

    Primak, Andrew N.; Giraldo, Juan Carlos Ramirez; Eusemann, Christian D.; Schmidt, Bernhard; Kantor, B.; Fletcher, Joel G.; McCollough, Cynthia H.

    2010-01-01

    Purpose: To investigate the effect on radiation dose and image quality of using additional spectral filtration for dual-energy CT (DECT) imaging with dual-source CT (DSCT). Materials and Methods: A commercial DSCT scanner was modified by adding tin filtration to the high-kV tube, and radiation output and noise were measured in water phantoms. Dose values for equivalent image noise were compared among DE modes with and without tin filtration and single-energy (SE) mode. To evaluate DECT material discrimination, the material-specific DE ratios for calcium and iodine were determined using images of anthropomorphic phantoms. Data were additionally acquired in 38 and 87 kg pigs, and noise in the linearly mixed and virtual non-contrast (VNC) images was compared between DE modes. Finally, abdominal DECT images from two patients of similar size undergoing clinically indicated CT were compared. Results: Adding tin filtration to the high-kV tube improved the DE contrast between iodine and calcium by as much as 290%. Pig data showed that the tin filtration had no effect on noise in the DECT mixed images but decreased noise by as much as 30% in the VNC images. Patient VNC images acquired using 100/140 kV with added tin filtration had improved image quality compared with those generated with 80/140 kV without tin filtration. Conclusion: Tin filtration of the high-kV tube of a DSCT scanner increases the ability of DECT to discriminate between calcium and iodine without increasing dose relative to SECT. Furthermore, use of 100/140 kV tube potentials allows improved DECT imaging of large patients. PMID:20966323
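
    The material-specific dual-energy ratio used for this kind of discrimination is commonly defined as the ratio of a material's CT numbers at the two tube potentials; the expression below is that conventional definition, stated as an assumption rather than taken from the paper. Iodine, whose attenuation is strongly energy dependent, yields a ratio well separated from that of calcium, and a wider separation between the two ratios means better material discrimination.

        \[ \mathrm{DE_{ratio}} = \frac{\text{CT number (HU) at low kV}}{\text{CT number (HU) at high kV}} \]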

  16. Big Data and Chemical Education

    ERIC Educational Resources Information Center

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  17. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    PubMed

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived; notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines. PMID:26348599

  18. Superconducting properties and magneto-optical imaging of Ba0.6K0.4Fe2As2 PIT wires with Ag addition

    NASA Astrophysics Data System (ADS)

    Ding, Qing-Ping; Prombood, Trirat; Tsuchiya, Yuji; Nakajima, Yasuyuki; Tamegai, Tsuyoshi

    2012-03-01

    We have fabricated (Ba,K)Fe2As2 superconducting wires through an ex situ powder-in-tube method. Silver was used as a chemical addition to improve the performance of these superconducting wires. The transport critical current densities (Jc) reached 1.3 × 10⁴ A cm⁻² and 1.0 × 10⁴ A cm⁻² at 4.2 K under self-field in the wires with and without Ag addition, respectively. We used a magneto-optical (MO) imaging technique to investigate the properties of grain boundaries in the (Ba,K)Fe2As2 superconducting wire with Ag addition. MO images show the weak links in the Fe-based superconducting wires for the first time. An intragranular Jc of 6.0 × 10⁴ A cm⁻² at 20 K is obtained from the MO image, which is consistent with the estimation from M-H measurement.
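
    Estimating Jc from an M-H loop, as mentioned at the end of the abstract, is commonly done with the extended Bean critical-state model for a rectangular sample of cross-section a × b (a ≤ b), where \Delta M is the width of the magnetization loop in emu/cm³ and lengths are in cm; this standard formula is given for orientation only and is not necessarily the exact expression the authors used.

        \[ J_c = \frac{20\,\Delta M}{a\,(1 - a/3b)} \quad [\mathrm{A\,cm^{-2}}] \]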

  19. Improvement in perception of image sharpness through the addition of noise and its relationship with memory texture

    NASA Astrophysics Data System (ADS)

    Wan, Xiazi; Kobayashi, Hiroyuki; Aoki, Naokazu

    2015-03-01

    In a preceding study, we investigated the effects of image noise on the perception of image sharpness using white noise and one- and two-dimensional single-frequency sinusoidal patterns as stimuli. This study extends that work by evaluating natural color images rather than black-and-white patterns. The results showed that the sharpening effect of added noise on perceived image sharpness is more evident in blurred images than in sharp ones, consistent with the results of the preceding study. In another preceding study, we proposed "memory texture" to explain the preferred granularity of images, as a concept similar to "memory color" for preferred color reproduction, and observed individual differences in the type of memory texture (white or 1/f noise) for each object. This study discusses the relationship between the improvement of sharpness perception by adding noise and memory texture, taking those individual differences into account. We found that memory texture is one of the elements that affect sharpness perception.

  20. Post-lumpectomy CT-guided tumor bed delineation for breast boost and partial breast irradiation: Can additional pre- and postoperative imaging reduce interobserver variability?

    PubMed Central

    den Hartogh, Mariska D.; Philippens, Marielle E.P.; van Dam, Iris E.; Kleynen, Catharina E.; Tersteeg, Robbert J.H.A.; Kotte, Alexis N.T.J.; van Vulpen, Marco; van Asselen, Bram; van den Bongard, Desirée H.J.G.

    2015-01-01

    For breast boost radiotherapy or accelerated partial breast irradiation, the tumor bed (TB) is delineated by the radiation oncologist on a planning computed tomography (CT) scan. The aim of the present study was to investigate whether the interobserver variability (IOV) of the TB delineation is reduced by providing the radiation oncologist with additional magnetic resonance imaging (MRI) or CT scans. A total of 14 T1-T2 breast cancer patients underwent a standard planning CT in the supine treatment position following lumpectomy, as well as additional pre- and postoperative imaging in the same position. Post-lumpectomy TBs were independently delineated by four breast radiation oncologists on standard postoperative CT and on CT registered to an additional imaging modality. The additional imaging modalities used were postoperative MRI, preoperative contrast-enhanced (CE) CT and preoperative CE-MRI. A cavity visualization score (CVS) was assigned to each standard postoperative CT by each observer. In addition, the conformity index (CI), volume and distance between centers of mass (dCOM) of the TB delineations were calculated. On CT, the median CI was 0.57, with a median volume of 22 cm³ and dCOM of 5.1 mm. The addition of postoperative MRI increased the median TB volume significantly to 28 cm³ (P<0.001), while the CI (P=0.176) and dCOM (P=0.110) were not affected. The addition of preoperative CT or MRI increased the TB volume to 26 and 25 cm³, respectively (both P<0.001), while the CI increased to 0.58 and 0.59 (both P<0.001) and the dCOM decreased to 4.7 mm (P=0.004) and 4.6 mm (P=0.001), respectively. In patients with CVS≤3, the median CI was 0.40 on CT; all additional imaging modalities increased it significantly, up to 0.52, accompanied by a median volume increase of up to 6 cm³. In conclusion, the addition of postoperative MRI, preoperative CE-CT or preoperative CE-MRI did not result in a considerable reduction of the IOV in postoperative CT-based TB delineation.
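
    For a pair of observers, a conformity index of the kind reported above is typically computed as the ratio of the intersection to the union of the delineated volumes (generalized over all observer pairs when there are more than two); the expression below is that common definition, stated as an assumption since the abstract does not spell it out. CI = 1 indicates perfect agreement and CI = 0 disjoint delineations.

        \[ CI = \frac{|V_A \cap V_B|}{|V_A \cup V_B|} \]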

  1. Analyzing Big Data with the Hybrid Interval Regression Methods

    PubMed Central

    Kao, Han-Ying

    2014-01-01

    Big data is a new trend at present, forcing significant impacts on information technologies. In big data applications, one of the most pressing issues is dealing with large-scale data sets that often require computation resources provided by public cloud services. How to analyze big data efficiently becomes a big challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. Recently, the SSVM was proposed as an alternative to the standard SVM and has been proved more efficient than the traditional SVM in processing large-scale data. In addition, a soft margin method is proposed to modify the excursion of the separation margin, making it effective in the gray zone, where the distribution of the data becomes hard to describe and the separation margin between classes is unclear. PMID:25143968
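
    The key ingredient of the SSVM is a smooth approximation of the plus function max(x, 0) that appears in the soft-margin SVM objective, which makes the problem solvable with fast gradient or Newton methods. The sketch below is a minimal illustration of that smoothing idea on a linear classifier, with assumed data and a fixed smoothing parameter alpha; it is not the authors' interval-regression pipeline.

        import numpy as np
        from scipy.optimize import minimize

        def smooth_plus(x, alpha=5.0):
            """Smooth approximation of max(x, 0):
            x + (1/alpha) * log(1 + exp(-alpha * x))."""
            return x + np.logaddexp(0.0, -alpha * x) / alpha

        def ssvm_objective(params, X, y, nu=1.0, alpha=5.0):
            """Smoothed soft-margin objective for a linear classifier."""
            w, b = params[:-1], params[-1]
            margins = 1.0 - y * (X @ w + b)   # positive -> margin violation
            loss = np.sum(smooth_plus(margins, alpha) ** 2)
            return 0.5 * (w @ w + b * b) + 0.5 * nu * loss

        # Hypothetical two-class data with labels in {-1, +1}
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(+1, 1, (50, 2))])
        y = np.hstack([-np.ones(50), np.ones(50)])

        res = minimize(ssvm_objective, x0=np.zeros(3), args=(X, y), method="BFGS")
        w, b = res.x[:-1], res.x[-1]
        print("training accuracy:", np.mean(np.sign(X @ w + b) == y))

    Because the smoothed objective is differentiable everywhere, quasi-Newton methods such as BFGS converge quickly, which is the property that makes the SSVM attractive for large-scale data.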

  2. Analyzing big data with the hybrid interval regression methods.

    PubMed

    Huang, Chia-Hui; Yang, Keng-Chieh; Kao, Han-Ying

    2014-01-01

    Big data is a new trend at present, forcing significant impacts on information technologies. In big data applications, one of the most pressing issues is dealing with large-scale data sets that often require computation resources provided by public cloud services. How to analyze big data efficiently becomes a big challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. Recently, the SSVM was proposed as an alternative to the standard SVM and has been proved more efficient than the traditional SVM in processing large-scale data. In addition, a soft margin method is proposed to modify the excursion of the separation margin, making it effective in the gray zone, where the distribution of the data becomes hard to describe and the separation margin between classes is unclear. PMID:25143968

  3. Big Data and the Future of Radiology Informatics.

    PubMed

    Kansagra, Akash P; Yu, John-Paul J; Chatterjee, Arindam R; Lenchik, Leon; Chow, Daniel S; Prater, Adam B; Yeh, Jean; Doshi, Ankur M; Hawkins, C Matthew; Heilbrun, Marta E; Smith, Stacy E; Oselkin, Martin; Gupta, Pushpender; Ali, Sayed

    2016-01-01

    Rapid growth in the amount of data that is electronically recorded as part of routine clinical operations has generated great interest in the use of Big Data methodologies to address clinical and research questions. These methods can efficiently analyze and deliver insights from high-volume, high-variety, and high-growth rate datasets generated across the continuum of care, thereby forgoing the time, cost, and effort of more focused and controlled hypothesis-driven research. By virtue of an existing robust information technology infrastructure and years of archived digital data, radiology departments are particularly well positioned to take advantage of emerging Big Data techniques. In this review, we describe four areas in which Big Data is poised to have an immediate impact on radiology practice, research, and operations. In addition, we provide an overview of the Big Data adoption cycle and describe how academic radiology departments can promote Big Data development. PMID:26683510

  4. Big3. Editorial

    PubMed Central

    Lehmann, Christoph U.; Séroussi, Brigitte; Jaulent, Marie-Christine

    2014-01-01

    Summary. Objectives: To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics, with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. Methods: A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided, in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. Results: 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths, weaknesses, and risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions on the topic. Conclusions: For the first time in its history, the IMIA Yearbook will be published in an open access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016. PMID:24853037

  5. Small turbines, big unknown

    SciTech Connect

    Gipe, P.

    1995-07-01

    While financial markets focus on the wheeling and dealing of the big wind companies, the small wind turbine industry quietly keeps churning out its smaller but effective machines. Some, the micro turbines, are so small they can be carried by hand. Though worldwide sales of small wind turbines fall far short of even one large windpower plant, figures reach $8 million to $10 million annually and could be as much as twice that if batteries and engineering services are included.

  6. The Next Big Idea

    PubMed Central

    2013-01-01

    Abstract George S. Eisenbarth will remain in our memories as a brilliant scientist and great collaborator. His quest to discover the cause and prevention of type 1 (autoimmune) diabetes started from building predictive models based on immunogenetic markers. Despite his tremendous contributions to our understanding of the natural history of pre-type 1 diabetes and potential mechanisms, George left us with several big questions to answer before his quest is completed. PMID:23786296

  7. DARPA's Big Mechanism program.

    PubMed

    Cohen, Paul R

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers. PMID:26178259

  8. DARPA's Big Mechanism program

    NASA Astrophysics Data System (ADS)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  9. A holographic big bang?

    NASA Astrophysics Data System (ADS)

    Afshordi, N.; Mann, R. B.; Pourhasan, R.

    2015-11-01

    We present a cosmological model in which the Universe emerges out of the collapse of a five-dimensional (5D) star as a spherical three-brane. The initial singularity of the big bang becomes hidden behind a causal horizon. Near scale-invariant primordial curvature perturbations can be induced on the brane via a thermal atmosphere that is in equilibrium with the brane, circumventing the need for a separate inflationary process and providing an important test of the model.

  10. Big Data Technologies

    PubMed Central

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities for diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patients' care processes and of individual patients' behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extracting new knowledge from them. This article reviews the main concepts and definitions related to big data, presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried out in the MOSAIC project, funded by the European Commission. PMID:25910540

  11. First Results from the Big Bear Solar Observatory's Digital Vectormagnetograph

    NASA Astrophysics Data System (ADS)

    Spirock, T. J.; Denker, C.; Chen, H.; Qiu, J.; Goode, P. R.; Wang, H.

    2000-05-01

    During the past three years, the Big Bear Solar Observatory has pursued an aggressive program to upgrade the observatory's instrumentation. At the forefront of this effort is the development of a highly sensitive, high-cadence, filter-based digital vector magnetograph for the observatory's 10" vacuum refractor, replacing the old video magnetograph and improving measurements of the Fe I line at 6301 Å. The hardware is being upgraded with a 512 x 512, 12-bit, 30 frames per second CCD camera and high-quality polarization optics. In addition, software tools are being written to aid instrument development by quickly evaluating images (bias, cross talk, etc.) and to generate near real-time vector magnetograms, which will aid space weather forecasting and the support of space weather missions. Data acquisition, data calibration, and flat-fielding methods will be discussed, and quiet-Sun and active-region magnetograms will be presented.

  12. Long-Term Seeing Characteristics at Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Denker, C.; Espinosa, O. D.; Nenow, J.; Marquette, W. H.

    2003-05-01

    We present observations of long-term seeing characteristics from June 1997 to September 2002 obtained with Seykora-type scintillometers at Big Bear Solar Observatory (BBSO). BBSO is an ideal site for ground-based campaign-style observations. Since BBSO is situated on a small island in a 2,000 m high mountain lake in the cloudless mountains of Southern California, it benefits from excellent seeing conditions all day long. The atmospheric turbulence that degrades images originates primarily in two layers: near the ground and at the level of the jet stream. BBSO's dome is located at the end of a 300 m long causeway jutting into the lake. Since the lake, with its cool waters, provides a natural inversion, and the dome has three kilometers of open water to its west, the boundary-layer seeing is effectively suppressed. In addition, the east-west orientation of the Big Bear Valley provides a natural channel for the prevailing winds from the west, resulting in a nearly laminar flow at the observatory site. We present a comparison of scintillometer data with climate data and analyze a one-year subset for local seeing variations near the lake shore and at the observatory island. We would like to thank Jacques Beckers and the National Solar Observatory for providing the scintillometer data. This work was supported by NSF under grants ATM 00-86999, ATM 00-76602, and ATM 02-36945 and by NASA under grant NAG 5-9682.

  13. The additional value of an oblique image plane for MRI of the anterior and posterior distal tibiofibular syndesmosis

    PubMed Central

    Ginai, Abida Z.; Wentink, Noortje; Hop, Wim C. J.; Beumer, Annechien

    2010-01-01

    Objective The optimal MRI scan planes of the collateral ligaments of the ankle have been described extensively, with the exception of the syndesmotic ligaments. We assessed the optimal scan plane for depicting the distal tibiofibular syndesmosis. Materials and Methods In order to determine the optimal oblique caudal-cranial and lateral-medial MRI scan plane, two fresh-frozen cadaveric ankles were used. The angle of the scan plane that demonstrated the anterior and posterior distal tibiofibular ligaments uninterrupted along their full length was determined. In a prospective study, this oblique scan plane was then used, in addition to the axial and coronal planes, for MRI scans of both ankles in 21 healthy volunteers. Two observers independently evaluated the anterior tibiofibular ligament (ATIFL) and posterior tibiofibular ligament (PTIFL) regarding the continuity of the individual fascicles, thickness, and wavy contour of the ligaments in both the axial and the oblique plane. Kappa was calculated to determine the interobserver agreement. McNemar's test was used to quantify the significance of differences between the two scan planes. Results In the axial plane the ATIFL was partly discontinuous in 31% (13/42) and completely discontinuous in 69% (29/42); in the oblique plane the ATIFL was continuous in 88% (37/42) and partly discontinuous in 12% (5/42). Compared with the axial plane, the oblique plane demonstrated significantly less discontinuity (p < 0.001), but not significantly less thickening (p = 1.00) or less wavy contour (p = 0.06) of the ATIFL. In the axial scan plane the PTIFL was continuous in 76% (32/42), partially discontinuous in 19% (8/42), and completely discontinuous in 5% (2/42); in the oblique plane the PTIFL was continuous in 100% (42/42). Compared with the axial plane, the oblique plane demonstrated significantly less discontinuity (p = 0.002), but not significantly less thickening (p = 1.00) or less wavy contour (p = 0.50) of the PTIFL. The
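
    For illustration only, the McNemar comparison reported above can be reproduced with a short Python sketch (assuming the statsmodels package is available). The paired 2 x 2 table below is reconstructed from the reported ATIFL marginals (42/42 ligaments at least partly discontinuous on axial images versus 5/42 on oblique images); it is our reconstruction, not data published by the authors.

        from statsmodels.stats.contingency_tables import mcnemar

        # Paired ATIFL readings per ligament (n = 42), reconstructed from the
        # reported marginals. Rows: axial (discontinuous, continuous);
        # columns: oblique (discontinuous, continuous).
        table = [[5, 37],
                 [0, 0]]

        result = mcnemar(table, exact=True)      # exact test on the 37-vs-0 discordant pairs
        print(f"p-value = {result.pvalue:.1e}")  # far below the reported p < 0.001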

  14. The trashing of Big Green

    SciTech Connect

    Felten, E.

    1990-11-26

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature.

  15. How Big is Earth?

    NASA Astrophysics Data System (ADS)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth? celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a known distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth? provides an online learning environment where students do science the same way Eratosthenes did. A notable precedent was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big is Earth? expands on The Eratosthenes Project through the online learning environment provided by the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss and brainstorm solutions in a discussion forum. There is an ongoing database of student measurements and another database collecting data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
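
    The arithmetic behind Eratosthenes' method is compact enough to state directly. In the Python sketch below, the 7.2-degree shadow angle and 800 km north-south baseline are round illustrative numbers, not project measurements:

        shadow_angle_deg = 7.2   # noon shadow angle at the northern site
        baseline_km = 800.0      # north-south distance between the two sites

        # The baseline spans shadow_angle/360 of the full meridian circle.
        circumference_km = baseline_km * 360.0 / shadow_angle_deg
        print(f"Estimated circumference: {circumference_km:.0f} km")  # 40000 km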

  16. Low Rates of Additional Cancer Detection by Magnetic Resonance Imaging in Newly Diagnosed Breast Cancer Patients Who Undergo Preoperative Mammography and Ultrasonography

    PubMed Central

    Kim, Jisun; Han, Wonshik; Moon, Hyeong-Gon; Ahn, Soo Kyung; Shin, Hee-Chul; You, Jee-Man; Chang, Jung Min; Cho, Nariya; Moon, Woo Kyung; Park, In-Ae

    2014-01-01

    Purpose We evaluated the efficacy of breast magnetic resonance imaging (MRI) for detecting additional malignancies in breast cancer patients newly diagnosed by breast ultrasonography and mammography. Methods We retrospectively reviewed the records of 1,038 breast cancer patients who underwent preoperative mammography, bilateral breast ultrasonography, and subsequent breast MRI between August 2007 and December 2010 at a single institution in Korea. MRI-detected additional lesions were defined as lesions detected by breast MRI that had gone undetected by mammography and ultrasonography and that would not otherwise have been identified. Results Among the 1,038 cases, 228 additional lesions (22.0%) and 30 additional malignancies (2.9%) were detected by breast MRI. Of these 228 lesions, 109 were suspected to be malignant (Breast Imaging-Reporting and Data System category 4 or 5) on breast MRI and second-look ultrasonography, and 30 were pathologically confirmed to be malignant (13.2%). Of these 30 lesions, 21 were ipsilateral to the main lesion and nine were contralateral. Fourteen lesions were in situ carcinomas and 16 were invasive carcinomas. The positive predictive value of breast MRI was 27.5% (30/109). No clinicopathological factors were significantly associated with additional malignant foci. Conclusion Breast MRI was useful in detecting additional malignancies in a small number of patients who underwent ultrasonography and mammography. PMID:25013439

  17. Dark radiation emerging after big bang nucleosynthesis?

    SciTech Connect

    Fischler, Willy; Meyers, Joel

    2011-03-15

    We show how recent data from observations of the cosmic microwave background may suggest the presence of additional radiation density which appeared after big bang nucleosynthesis. We propose a general scheme by which this radiation could be produced from the decay of nonrelativistic matter, we place constraints on the properties of such matter, and we give specific examples of scenarios in which this general scheme may be realized.

  18. The dropped big toe.

    PubMed

    Satku, K; Wee, J T; Kumar, V P; Ong, B; Pho, R W

    1992-03-01

    Surgical procedures for exposure of the upper third of the fibula have been known to cause weakness of the long extensor of the big toe post-operatively. The authors present three representative cases of surgically induced dropped big toe. From cadaveric dissection, an anatomic basis was found for this phenomenon. The tibialis anterior and extensor digitorum longus muscles have their origin at the proximal end of the leg and receive their first motor innervation from a branch that arises from the common peroneal or deep peroneal nerve at about the level of the neck of the fibula. However, the extensor hallucis longus muscle originates in the middle one-third of the leg and the nerves innervating this muscle run a long course in close proximity to the fibula for up to ten centimeters from a level below the neck of the fibula before entering the muscle. Surgical intervention in the proximal one-third of the fibula just distal to the origin of the first motor branch to the tibialis anterior and extensor digitorum longus muscles carries a risk of injury to the nerves innervating the extensor hallucis longus. PMID:1519891

  19. Optical image hiding based on computational ghost imaging

    NASA Astrophysics Data System (ADS)

    Wang, Le; Zhao, Shengmei; Cheng, Weiwen; Gong, Longyan; Chen, Hanwu

    2016-05-01

    Image hiding schemes play an important role in the big data era, providing copyright protection for digital images. In this paper, we propose a novel image hiding scheme based on computational ghost imaging that offers strong robustness and high security. The watermark is encrypted with the configuration of a computational ghost imaging system, and the random speckle patterns compose a secret key. A least significant bit (LSB) algorithm is adopted to embed the watermark, and both a second-order correlation algorithm and a compressed sensing (CS) algorithm are used to extract it. The experimental and simulation results show that authorized users can recover the watermark with the secret key. The watermark image could not be retrieved when the eavesdropping ratio is less than 45% with the second-order correlation algorithm, or less than 20% with the TVAL3 CS reconstruction algorithm. In addition, the proposed scheme is robust against 'salt and pepper' noise and image cropping degradations.
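
    A minimal Python/NumPy sketch of the least-significant-bit embedding step named above may help; the ghost-imaging encryption and the correlation-based extraction are omitted, and both the host image and the watermark bits are random stand-ins:

        import numpy as np

        def lsb_embed(host, bits):
            """Replace the least significant bit of the first bits.size pixels."""
            flat = host.flatten()                        # flatten() returns a copy
            flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
            return flat.reshape(host.shape)

        def lsb_extract(stego, n_bits):
            """Read back the first n_bits least significant bits."""
            return stego.flatten()[:n_bits] & 1

        host = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in host image
        bits = np.random.randint(0, 2, 256, dtype=np.uint8)         # stand-in watermark bits
        stego = lsb_embed(host, bits)
        assert np.array_equal(lsb_extract(stego, bits.size), bits)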

  20. Big Sisters: An Experimental Evaluation.

    ERIC Educational Resources Information Center

    Seidl, Fredrick W.

    1982-01-01

    Assessed the effects of participation in a Big Sisters' Program. The first part consisted of interviews (N=20) with pairs of Big Sisters-Little Sisters. The second part evaluated program effectiveness experimentally. Findings indicated positive relationships between pairs, and improved behavior of experimental girls versus controls. (RC)

  1. Think Big, Bigger ... and Smaller

    ERIC Educational Resources Information Center

    Nisbett, Richard E.

    2010-01-01

    One important principle of social psychology, writes Nisbett, is that some big-seeming interventions have little or no effect. This article discusses a number of cases from the field of education that confirm this principle. For example, Head Start seems like a big intervention, but research has indicated that its effects on academic achievement…

  2. The International Big History Association

    ERIC Educational Resources Information Center

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big History" is a…

  3. The Big Read: Case Studies

    ERIC Educational Resources Information Center

    National Endowment for the Arts, 2009

    2009-01-01

    The Big Read evaluation included a series of 35 case studies designed to gather more in-depth information on the program's implementation and impact. The case studies gave readers a valuable first-hand look at The Big Read in context. Both formal and informal interviews, focus groups, attendance at a wide range of events--all showed how…

  4. The Rise of Big Data in Neurorehabilitation.

    PubMed

    Faroqi-Shah, Yasmeen

    2016-02-01

    In some fields, Big Data has been instrumental in analyzing, predicting, and influencing human behavior. However, Big Data approaches have so far been less central in speech-language pathology. This article introduces the concept of Big Data and provides examples of Big Data initiatives pertaining to adult neurorehabilitation. It also discusses the potential theoretical and clinical contributions that Big Data can make. The article also recognizes some impediments in building and using Big Data for scientific and clinical inquiry. PMID:26882360

  5. Big-bounce genesis

    NASA Astrophysics Data System (ADS)

    Li, Changhong; Brandenberger, Robert H.; Cheung, Yeuk-Kwan E.

    2014-12-01

    We report on the possibility of using a dark matter particle's mass and its interaction cross section as a smoking-gun signal of the existence of a big bounce at the early stage in the evolution of our currently observed universe. A model-independent study of dark matter production in the pre-bounce contraction and the post-bounce expansion epochs of the bounce universe reveals a new venue for achieving the observed relic abundance of our present universe, in which a significantly smaller amount of dark matter with a smaller cross section—as compared to the prediction of standard cosmology—is produced, and the information about the bounce universe evolution is preserved by the out-of-thermal-equilibrium process. Once the values of the dark matter mass and interaction cross section are obtained by direct detection in laboratories, this alternative route becomes a signature prediction of the bounce universe scenario.

  6. Big cat genomics.

    PubMed

    O'Brien, Stephen J; Johnson, Warren E

    2005-01-01

    Advances in population and quantitative genomics, aided by the computational algorithms that employ genetic theory and practice, are now being applied to biological questions that surround free-ranging species not traditionally suitable for genetic enquiry. Here we review how applications of molecular genetic tools have been used to describe the natural history, present status, and future disposition of wild cat species. Insight into phylogenetic hierarchy, demographic contractions, geographic population substructure, behavioral ecology, and infectious diseases have revealed strategies for survival and adaptation of these fascinating predators. Conservation, stabilization, and management of the big cats are important areas that derive benefit from the genome resources expanded and applied to highly successful species, imperiled by an expanding human population. PMID:16124868

  7. Towards Big Earth Data Analytics: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2013-04-01

    Big Data in the Earth sciences, the Tera- to Exabyte archives, mostly are made up of coverage data, whereby the term "coverage", according to ISO and OGC, is defined as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor timeseries, 2-D remote sensing imagery, 3-D x/y/t image timeseries and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as getting the Fourier transform of satellite images. As network bandwidth limits prohibit transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The EarthServer initiative, funded by EU FP7 eInfrastructures, unites 11 partners from computer and earth sciences to establish Big Earth Data Analytics. One key ingredient is flexibility for users to ask what they want, not impeded and complicated by system internals. The EarthServer answer to this is to use high-level query languages; these have proven tremendously successful on tabular and XML data, and we extend them with a central geo data structure, multi-dimensional arrays. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time and usually with manual intervention. The EarthServer approach is to perform a semantics-based dynamic distribution of query fragments based on network optimization and further criteria. The EarthServer platform is built around rasdaman, an Array DBMS enabling efficient storage and retrieval of any-size, any-type multi-dimensional raster data. In the project, rasdaman is being extended with several functionality and scalability features, including: support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data
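
    As a small stand-alone illustration of the kind of on-demand array processing mentioned above (the Fourier transform of a satellite image), the NumPy sketch below runs the operation locally; in EarthServer such an operation would be expressed in the array query language and executed server-side, and the random array here merely stands in for a real scene:

        import numpy as np

        scene = np.random.rand(1024, 1024)             # stand-in for a satellite image tile
        spectrum = np.fft.fft2(scene)                  # 2-D discrete Fourier transform
        power = np.abs(np.fft.fftshift(spectrum))**2   # centered power spectrum
        print(power.shape)                             # (1024, 1024)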

  8. Big bang and big crunch in matrix string theory

    SciTech Connect

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-04-15

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe.

  9. Big bang and big crunch in matrix string theory

    NASA Astrophysics Data System (ADS)

    Bedford, J.; Papageorgakis, C.; Rodríguez-Gómez, D.; Ward, J.

    2007-04-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe.

  10. Some experiences and opportunities for big data in translational research.

    PubMed

    Chute, Christopher G; Ullman-Cullere, Mollie; Wood, Grant M; Lin, Simon M; He, Min; Pathak, Jyotishman

    2013-10-01

    Health care has become increasingly information intensive. The advent of genomic data, integrated into patient care, significantly accelerates the complexity and amount of clinical data. Translational research in the present day increasingly embraces new biomedical discovery in this data-intensive world, thus entering the domain of "big data." The Electronic Medical Records and Genomics consortium has taught us many lessons, while simultaneously advances in commodity computing methods enable the academic community to affordably manage and process big data. Although great promise can emerge from the adoption of big data methods and philosophy, the heterogeneity and complexity of clinical data, in particular, pose additional challenges for big data inferencing and clinical application. However, the ultimate comparability and consistency of heterogeneous clinical information sources can be enhanced by existing and emerging data standards, which promise to bring order to clinical data chaos. Meaningful Use data standards in particular have already simplified the task of identifying clinical phenotyping patterns in electronic health records. PMID:24008998

  11. Big questions, big science: meeting the challenges of global ecology.

    PubMed

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets, than can be collected by a single investigator's or a single group's labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, experience many pressures that push them toward counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects. PMID:25680334

  12. Synoptic Observing at Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Denker, C.; Naqvi, M.; Deng, N.; Tritschler, A.; Marquette, W. H.

    2007-05-01

    Synoptic solar observations in the chromospheric absorption lines Ca II K and Hα have a long tradition at Big Bear Solar Observatory (BBSO). The advent of the New Solar Telescope (NST) will shift the focus of BBSO's synoptic observing program toward high-resolution observations. We present an overview of the telescopes and instrumentation and show some of the most recent results. This includes Ca II K data to track solar irradiance variations, Hα full-disk data to monitor eruptive events, Dopplergrams from two-dimensional spectroscopy, as well as image restorations of diffraction-limited quality.

  13. Homogeneous and isotropic big rips?

    SciTech Connect

    Giovannini, Massimo

    2005-10-15

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behavior is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.
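
    For reference, in the homogeneous and isotropic limit with a constant barotropic index w < -1, the standard Friedmann equations give a scale factor that diverges at a finite time (a textbook phantom-energy result, not a formula taken from this paper):

        a(t) \propto \left( t_{\mathrm{rip}} - t \right)^{\frac{2}{3(1+w)}}, \qquad w < -1 .

    Because the exponent 2/(3(1+w)) is negative for w < -1, a(t) grows without bound as t approaches t_rip, which is the big rip discussed above.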

  14. The challenges of big data

    PubMed Central

    2016-01-01

    ABSTRACT The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  15. Big Data and Ambulatory Care

    PubMed Central

    Thorpe, Jane Hyatt; Gray, Elizabeth Alexandra

    2015-01-01

    Big data is heralded as having the potential to revolutionize health care by making large amounts of data available to support care delivery, population health, and patient engagement. Critics argue that big data's transformative potential is inhibited by privacy requirements that restrict health information exchange. However, there are a variety of permissible activities involving use and disclosure of patient information that support care delivery and management. This article presents an overview of the legal framework governing health information, dispels misconceptions about privacy regulations, and highlights how ambulatory care providers in particular can maximize the utility of big data to improve care. PMID:25401945

  16. The challenges of big data.

    PubMed

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  17. Rate Change Big Bang Theory

    NASA Astrophysics Data System (ADS)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  18. Big climate data analysis

    NASA Astrophysics Data System (ADS)

    Mudelsee, Manfred

    2015-04-01

    The Big Data era has begun also in the climate sciences, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014 Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Second edition. Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of the three subspaces Monte Carlo design, method and measure. The Monte Carlo design describes the data generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on basis of existing knowledge and experience with such types of estimation problems. The second task requires the computing power: explore the hyperspace to
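
    The bootstrap confidence intervals mentioned above can be illustrated with a short, generic Python sketch (synthetic data, not code from the cited book). For autocorrelated climate series a block-bootstrap variant would be needed; this sketch assumes independent observations:

        import numpy as np

        # Synthetic "anomaly" sample; a stand-in for a real climate series.
        rng = np.random.default_rng(0)
        sample = rng.normal(loc=0.3, scale=1.0, size=200)

        # Resample with replacement and recompute the mean 2000 times.
        boot_means = np.array([
            rng.choice(sample, size=sample.size, replace=True).mean()
            for _ in range(2000)
        ])
        lo, hi = np.percentile(boot_means, [2.5, 97.5])
        print(f"mean = {sample.mean():.3f}, 95% CI = [{lo:.3f}, {hi:.3f}]")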

  19. The NOAA Big Data Project

    NASA Astrophysics Data System (ADS)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  20. Navigating a Sea of Big Data

    NASA Astrophysics Data System (ADS)

    Kinkade, D.; Chandler, C. L.; Groman, R. C.; Shepherd, A.; Allison, M. D.; Rauch, S.; Wiebe, P. H.; Glover, D. M.

    2014-12-01

    Oceanographic research is evolving rapidly. New technologies, strategies, and related infrastructures have catalyzed a change in the nature of oceanographic data. Heterogeneous and complex data types can be produced and transferred at great speeds. This shift in volume, variety, and velocity of data produced has led to increased challenges in managing these Big Data. In addition, distributed research communities have greater needs for data quality control, discovery and public accessibility, and seamless integration for interdisciplinary study. Organizations charged with curating oceanographic data must also evolve to meet these needs and challenges, by employing new technologies and strategies. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created in 2006, to fulfill the data management needs of investigators funded by the NSF Ocean Sciences Biological and Chemical Sections and Polar Programs Antarctic Organisms and Ecosystems Program. Since its inception, the Office has had to modify internal systems and operations to address Big Data challenges to meet the needs of the ever-evolving oceanographic research community. Some enhancements include automated procedures replacing labor-intensive manual tasks, adoption of metadata standards facilitating machine client access, a geospatial interface and the use of Semantic Web technologies to increase data discovery and interoperability. This presentation will highlight some of the BCO-DMO advances that enable us to successfully fulfill our mission in a Big Data world.

  1. Big bang darkleosynthesis

    NASA Astrophysics Data System (ADS)

    Krnjaic, Gordan; Sigurdson, Kris

    2015-12-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV /dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S ≫ 3 / 2), whose discovery would be smoking gun evidence for dark nuclei.

  2. The BigBOSS Experiment

    SciTech Connect

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? And what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R&D support by the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  3. Solid-phase synthesis of graphene quantum dots from the food additive citric acid under microwave irradiation and their use in live-cell imaging.

    PubMed

    Zhuang, Qianfen; Wang, Yong; Ni, Yongnian

    2016-05-01

    The work demonstrated that solid citric acid, one of the most common food additives, can be converted to graphene quantum dots (GQDs) under microwave heating. The as-prepared GQDs were further characterized by various analytical techniques, including transmission electron microscopy, atomic force microscopy, X-ray diffraction, X-ray photoelectron spectroscopy, Fourier transform infrared spectroscopy, and fluorescence and UV-visible spectroscopy. Cytotoxicity of the GQDs was evaluated using HeLa cells. The results showed that the GQDs exhibited almost no cytotoxicity at concentrations as high as 1000 µg mL(-1). In addition, it was found that the GQDs showed good solubility, excellent photostability, and excitation-dependent multicolor photoluminescence. Subsequently, the multicolor GQDs were successfully used as a fluorescence light-up probe for live-cell imaging. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26310294

  4. Challenges of Big Data Analysis.

    PubMed

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features impact paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in high-confidence sets and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions. PMID:25419469
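
    The spurious-correlation phenomenon named above is easy to simulate: with far more variables than samples, some purely random predictor is always strongly correlated with the response. A minimal Python sketch (the sample sizes are illustrative):

        import numpy as np

        rng = np.random.default_rng(1)
        n, p = 50, 10_000                  # small sample, high dimension
        X = rng.standard_normal((n, p))    # random predictors
        y = rng.standard_normal(n)         # response, independent of every column of X

        # Pearson correlation of every column of X with y.
        Xc = X - X.mean(axis=0)
        yc = y - y.mean()
        corr = Xc.T @ yc / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
        print(f"max |correlation| among {p} null predictors: {np.abs(corr).max():.2f}")

    With n = 50 and p = 10,000 the maximum absolute correlation typically comes out near 0.6 even though every predictor is pure noise.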

  5. Challenges of Big Data Analysis

    PubMed Central

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-01-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features impact paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in high-confidence sets and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions. PMID:25419469

  6. The Economics of Big Area Additive Manufacturing

    SciTech Connect

    Post, Brian; Lloyd, Peter D; Lindahl, John; Lind, Randall F; Love, Lonnie J; Kunc, Vlastimil

    2016-01-01

    Case studies on the economics of Additive Manufacturing (AM) suggest that processing time is the dominant cost in manufacturing. Most additive processes have similar performance metrics: small part sizes, low production rates, and expensive feedstocks. Big Area Additive Manufacturing is based on transitioning polymer extrusion technology from a wire to a pellet feedstock. Utilizing pellets significantly increases deposition speed and lowers material cost by utilizing low-cost injection molding feedstock. The use of carbon fiber reinforced polymers eliminates the need for a heated chamber, significantly reducing machine power requirements and size constraints. We hypothesize that the increase in productivity, coupled with the decrease in feedstock and energy costs, will enable AM to become more competitive with conventional manufacturing processes for many applications. As a test case, we compare the cost of using traditional fused deposition modeling (FDM) with BAAM for additively manufacturing composite tooling.
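
    A toy cost model makes the argument concrete. Every number below (deposition rates, material prices, machine rates) is an illustrative assumption, not a figure from this study:

        # All numbers are illustrative assumptions, not figures from the study.
        def part_cost(mass_kg, rate_kg_per_h, material_cost_per_kg, machine_rate_per_h):
            """Material cost plus machine-time cost for one printed part."""
            hours = mass_kg / rate_kg_per_h
            return mass_kg * material_cost_per_kg + hours * machine_rate_per_h

        fdm = part_cost(10, rate_kg_per_h=0.05, material_cost_per_kg=100, machine_rate_per_h=20)
        baam = part_cost(10, rate_kg_per_h=5.0, material_cost_per_kg=5, machine_rate_per_h=50)
        print(f"FDM: ${fdm:,.0f}  BAAM: ${baam:,.0f}")  # deposition rate dominates the cost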

  7. The Big Bang Theory: What It Is, Where It Came From, and Why It Works

    NASA Astrophysics Data System (ADS)

    Fox, Karen C.

    2002-02-01

    A lively, accessible look at the Big Bang theory. This compelling book describes how the Big Bang theory arose, how it has evolved, and why it is the best theory so far to explain the current state of the universe. In addition to understanding the birth of the cosmos, readers will learn how the theory stands up to challenges and what it fails to explain. Karen Fox provides clear answers to some of the hardest questions, including: Why was the Big Bang theory accepted to begin with? Will the Big Bang theory last into the next century or even the next decade? Is the theory at odds with new scientific findings? One of the most well-known theories in modern science, the Big Bang is the most accurate model yet devised in humanity's tireless search for the ultimate moment of creation. The Big Bang Theory is the first title in a planned series on the major theories of modern science.

  8. Is There an Additional Value of 11C-Choline PET-CT to T2-weighted MRI Images in the Localization of Intraprostatic Tumor Nodules?

    SciTech Connect

    Van den Bergh, Laura; Koole, Michel; Isebaert, Sofie; Joniau, Steven; Deroose, Christophe M.; Oyen, Raymond; Lerut, Evelyne; Budiharto, Tom; Mottaghy, Felix; Bormans, Guy; Van Poppel, Hendrik; Haustermans, Karin

    2012-08-01

    Purpose: To investigate the additional value of 11C-choline positron emission tomography (PET)-computed tomography (CT) to T2-weighted (T2w) magnetic resonance imaging (MRI) for localization of intraprostatic tumor nodules. Methods and Materials: Forty-nine prostate cancer patients underwent T2w MRI and 11C-choline PET-CT before radical prostatectomy and extended lymphadenectomy. Tumor regions were outlined on the whole-mount histopathology sections and on the T2w MR images. Tumor localization was recorded in the basal, middle, and apical part of the prostate by means of an octant grid. To analyze 11C-choline PET-CT images, the same grid was used to calculate the standardized uptake values (SUV) per octant, after rigid registration with the T2w MR images for anatomic reference. Results: In total, 1,176 octants were analyzed. Sensitivity, specificity, and accuracy of T2w MRI were 33.5%, 94.6%, and 70.2%, respectively. For 11C-choline PET-CT, the mean SUVmax of malignant octants was significantly higher than the mean SUVmax of benign octants (3.69 ± 1.29 vs. 3.06 ± 0.97, p < 0.0001), which was also true for mean SUVmean values (2.39 ± 0.77 vs. 1.94 ± 0.61, p < 0.0001). A positive correlation was observed between SUVmean and absolute tumor volume (Spearman r = 0.3003, p = 0.0362). No correlation was found between SUVs and prostate-specific antigen, T-stage, or Gleason score. The highest accuracy (61.1%) was obtained with a SUVmax cutoff of 2.70, resulting in a sensitivity of 77.4% and a specificity of 44.9%. When both modalities were combined (PET-CT or MRI positive), sensitivity levels increased as a function of SUVmax but at the cost of specificity. When only considering suspect octants on 11C-choline PET-CT (SUVmax ≥ 2.70) and T2w MRI, 84.7% of these segments were in agreement with the gold standard, compared with 80.5% for T2w MRI alone. Conclusions: The additional value of
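
    As a quick consistency check on the reported T2w MRI figures (our own arithmetic, not the paper's): accuracy is the prevalence-weighted average of sensitivity and specificity, so the three reported values imply the fraction of octants that contained tumor.

        # Reported figures for T2w MRI.
        sens, spec, acc = 0.335, 0.946, 0.702

        # Accuracy is the prevalence-weighted average of sensitivity and
        # specificity: acc = sens*p + spec*(1 - p). Solve for p.
        p = (spec - acc) / (spec - sens)
        print(f"implied malignant-octant prevalence: {p:.1%}")  # about 40%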

  9. Big Sky Carbon Sequestration Partnership

    SciTech Connect

    Susan Capalbo

    2005-12-31

    has significant potential to sequester large amounts of CO2. Simulations conducted to evaluate the mineral trapping potential of mafic volcanic rock formations located in the Idaho province suggest that supercritical CO2 is converted to solid carbonate mineral within a few hundred years and permanently entombs the carbon. Although MMV for this rock type may be challenging, a carefully chosen combination of geophysical and geochemical techniques should allow assessment of the fate of CO2 in deep basalt-hosted aquifers. Terrestrial carbon sequestration relies on land management practices and technologies to remove atmospheric CO2, which is then stored in trees, plants, and soil. This indirect sequestration can be implemented today and is on the front line of voluntary, market-based approaches to reduce CO2 emissions. Initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil carbon (C) on rangelands and on forested, agricultural, and reclaimed lands. Rangelands can store up to an additional 0.05 mt C/ha/yr, while croplands can on average store four times that amount. Estimates of the technical potential for soil sequestration within the region's cropland are in the range of 2.0 M mt C/yr over a 20-year time horizon. This is equivalent to approximately 7.0 M mt CO2e/yr. The forestry sinks are well documented, and the potential in the Big Sky region ranges from 9-15 M mt CO2 equivalent per year. Value-added benefits include enhanced yields, reduced erosion, and increased wildlife habitat. Thus the terrestrial sinks provide a viable, environmentally beneficial, and relatively low-cost sink that is available to sequester C in the current time frame. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts
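
    The quoted conversion from carbon to CO2-equivalent follows from the ratio of the molecular weight of CO2 to the atomic weight of carbon (44/12), a standard factor rather than a figure unique to the report:

        2.0 \ \mathrm{M\,mt\,C/yr} \times \frac{44}{12} \approx 7.3 \ \mathrm{M\,mt\,CO_2e/yr},

    consistent with the approximately 7.0 M mt CO2e/yr quoted above.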

  10. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-06-01

    soil C in the partnership region, and to design a risk/cost effectiveness framework to make comparative assessments of each viable sink, taking into account economic costs, offsetting benefits, scale of sequestration opportunities, spatial and time dimensions, environmental risks, and long term viability. Scientifically sound information on MMV is critical for public acceptance of these technologies. Two key deliverables were completed this quarter--a literature review/database to assess the soil carbon on rangelands, and the draft protocols, contracting options for soil carbon trading. To date, there has been little research on soil carbon on rangelands, and since rangeland constitutes a major land use in the Big Sky region, this is important in achieving a better understanding of terrestrial sinks. The protocols developed for soil carbon trading are unique and provide a key component of the mechanisms that might be used to efficiently sequester GHG and reduce CO2 concentrations. Progress on other deliverables is noted in the PowerPoint presentations. A series of meetings held during the second quarter have laid the foundations for assessing the issues surrounding the implementation of a market-based setting for soil C credits. These meetings provide a connection to stakeholders in the region and a basis on which to draw for the DOE PEIS hearings. Finally, the education and outreach efforts have resulted in a comprehensive plan and process which serves as a guide for implementing the outreach activities under Phase I. While we are still working on the public website, we have made many presentations to stakeholders and policy makers, connections to other federal and state agencies concerned with GHG emissions, climate change, and efficient and environmentally-friendly energy production. In addition, we have laid plans for integration of our outreach efforts with the students, especially at the tribal colleges and at the universities involved in our partnership

  11. JPL Big Data Technologies for Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Jones, Dayton L.; D'Addario, L. R.; De Jong, E. M.; Mattmann, C. A.; Rebbapragada, U. D.; Thompson, D. R.; Wagstaff, K.

    2014-04-01

    During the past three years the Jet Propulsion Laboratory has been working on several technologies to deal with big data challenges facing next-generation radio arrays, among other applications. This program has focused on the following four areas: 1) We are investigating high-level ASIC architectures that reduce power consumption for cross-correlation of data from large interferometer arrays by one to two orders of magnitude. The cost of operations for the Square Kilometre Array (SKA), which may be dominated by the cost of power for data processing, is a serious concern. A large improvement in correlator power efficiency could have a major positive impact. 2) Data-adaptive algorithms (machine learning) for real-time detection and classification of fast transient signals in high volume data streams are being developed and demonstrated. Studies of the dynamic universe, particularly searches for fast (<< 1 second) transient events, require that data be analyzed rapidly and with robust RFI rejection. JPL, in collaboration with the International Center for Radio Astronomy Research in Australia, has developed a fast transient search system for eventual deployment on ASKAP. In addition, a real-time transient detection experiment is now running continuously and commensally on NRAO's Very Long Baseline Array. 3) Scalable frameworks for data archiving, mining, and distribution are being applied to radio astronomy. A set of powerful open-source Object Oriented Data Technology (OODT) tools is now available through Apache. OODT was developed at JPL for Earth science data archives, but it is proving to be useful for radio astronomy, planetary science, health care, Earth climate, and other large-scale archives. 4) We are creating automated, event-driven data visualization tools that can be used to extract information from a wide range of complex data sets. Visualization of complex data can be improved through algorithms that detect events or features of interest and autonomously

  12. [Big data in official statistics].

    PubMed

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany. PMID:26077871

  13. GEOSS: Addressing Big Data Challenges

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  14. Big Data: Astronomical or Genomical?

    PubMed Central

    Stephens, Zachary D.; Lee, Skylar Y.; Faghri, Faraz; Campbell, Roy H.; Zhai, Chengxiang; Efron, Miles J.; Iyer, Ravishankar; Schatz, Michael C.; Sinha, Saurabh; Robinson, Gene E.

    2015-01-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a “four-headed beast”—it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the “genomical” challenges of the next decade. PMID:26151137

  15. Big Data: Astronomical or Genomical?

    PubMed

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade. PMID:26151137

  16. Food additives

    MedlinePlus

    Food additives are substances that become part of a food product when they are added during the processing or making of that food. "Direct" food additives are often added during processing to: Add nutrients ...

  17. [Big Data- challenges and risks].

    PubMed

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by the connection, processing and analysis of these information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data, and present examples from health and other areas. However, there are several preconditions of the effective use of the opportunities: proper infrastructure, well defined regulatory environment with particular emphasis on data protection and privacy. These issues and the current actions for solution are also presented. PMID:26614539

  18. Big Sky Carbon Sequestration Partnership

    SciTech Connect

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research agenda in Carbon Sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  19. Little Science to Big Science: Big Scientists to Little Scientists?

    ERIC Educational Resources Information Center

    Simonton, Dean Keith

    2010-01-01

    This article presents the author's response to Hisham B. Ghassib's essay entitled "Where Does Creativity Fit into a Productivist Industrial Model of Knowledge Production?" Professor Ghassib's (2010) essay presents a provocative portrait of how the little science of the Babylonians, Greeks, and Arabs became the Big Science of the modern industrial…

  20. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  1. Rasdaman for Big Spatial Raster Data

    NASA Astrophysics Data System (ADS)

    Hu, F.; Huang, Q.; Scheele, C. J.; Yang, C. P.; Yu, M.; Liu, K.

    2015-12-01

    Spatial raster data have grown exponentially over the past decade. Recent advancements in data acquisition technology, such as remote sensing, have allowed us to collect massive observation data of various spatial resolutions and domain coverage. The volume, velocity, and variety of such spatial data, along with the computationally intensive nature of spatial queries, pose a grand challenge to storage technologies for effective big data management. While high performance computing platforms (e.g., cloud computing) can be used to solve the computing-intensive issues in big data analysis, data have to be managed in a way that is suitable for distributed parallel processing. Recently, rasdaman (raster data manager) has emerged as a scalable and cost-effective database solution to store and retrieve massive multi-dimensional arrays, such as sensor, image, and statistics data. Within this paper, the pros and cons of using rasdaman to manage and query spatial raster data will be examined and compared with other common approaches, including file-based systems, relational databases (e.g., PostgreSQL/PostGIS), and NoSQL databases (e.g., MongoDB and Hive). Earth Observing System (EOS) data collected from NASA's Atmospheric Science Data Center (ASDC) will be used and stored in these selected database systems, and a set of spatial and non-spatial queries will be designed to benchmark their performance on retrieving large-scale, multi-dimensional arrays of EOS data. Lessons learnt from using rasdaman will be discussed as well.
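
    The paper does not publish its benchmark code; a minimal sketch of the kind of timing harness such a comparison implies is below. The backend query functions are hypothetical placeholders -- each would issue the same spatial sub-setting request via rasql, SQL/PostGIS, or a NoSQL API.

        import time, statistics

        def benchmark(name, run_query, trials=10):
            """Time one spatial subset query against one backend."""
            latencies = []
            for _ in range(trials):
                t0 = time.perf_counter()
                run_query()  # e.g. fetch a lat/lon/time sub-array of EOS data
                latencies.append(time.perf_counter() - t0)
            print(f"{name}: median {statistics.median(latencies):.3f} s over {trials} runs")

        # Hypothetical adapters, one per storage system under test:
        # benchmark("rasdaman", lambda: rasdaman_subset(bbox, time_range))
        # benchmark("PostGIS",  lambda: postgis_subset(bbox, time_range))
        # benchmark("MongoDB",  lambda: mongo_subset(bbox, time_range))

    Running the identical query shape against every backend, and reporting a median over repeated trials, is what makes the retrieval comparison fair.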

  2. Baryon symmetric big-bang cosmology. [matter-antimatter symmetry

    NASA Technical Reports Server (NTRS)

    Stecker, F. W.

    1978-01-01

    The framework of baryon-symmetric big-bang cosmology offers the greatest potential for deducing the evolution of the universe as a consequence of physical laws and processes with the minimum number of arbitrary assumptions as to initial conditions in the big-bang. In addition, it offers the possibility of explaining the photon-baryon ratio in the universe and how galaxies and galaxy clusters are formed, and also provides the only acceptable explanation at present for the origin of the cosmic gamma ray background radiation.

  3. Tick-Borne Diseases: The Big Two

    MedlinePlus

    ... Feature: Ticks and Diseases Tick-borne Diseases: The Big Two Past Issues / Spring - Summer 2010 Table of ... to Remove a Tick / Tick-borne Diseases: The Big Two Spring / Summer 2010 Issue: Volume 5 Number ...

  4. Do Big Bottles Kickstart Infant Weight Issues?

    MedlinePlus

    ... nih.gov/medlineplus/news/fullstory_159241.html Do Big Bottles Kickstart Infant Weight Issues? Smaller baby bottles ... 2016 (HealthDay News) -- Feeding babies formula from a big bottle might put them at higher risk for ...

  5. Observational hints on the Big Bounce

    SciTech Connect

    Mielczarek, Jakub; Kurek, Aleksandra; Szydłowski, Marek; Kamionka, Michał

    2010-07-01

    In this paper we study possible observational consequences of bouncing cosmology. We consider a model where a phase of inflation is preceded by a cosmic bounce. While we consider here only a bounce due to loop quantum gravity, most of the results presented can be applied to other bouncing cosmologies. We concentrate on the scenario where the scalar field, as the result of the contraction of the universe, is driven from the bottom of the potential well. The field is amplified, and finally the phase of standard slow-roll inflation is realized. Such an evolution modifies the standard inflationary spectrum of perturbations by additional oscillations and damping on large scales. We extract the parameters of the model from observations of the cosmic microwave background radiation. In particular, the value of the inflaton mass is m = (1.7±0.6)·10^13 GeV. Our analysis is based on seven years of observations made by the WMAP satellite. We propose a new observational consistency check for the phase of slow-roll inflation. We investigate the conditions which have to be fulfilled to make observations of Big Bounce effects possible. We translate them into requirements on the parameters of the model and then put observational constraints on the model. Based on assumptions usually made in loop quantum cosmology, the Barbero-Immirzi parameter is constrained by the cosmological observations to γ < 1100. We have compared the Big Bounce model with the standard Big Bang scenario and showed that the present observational data are not informative enough to distinguish these models.

  6. Big Explosives Experimental Facility - BEEF

    SciTech Connect

    2014-10-31

    The Big Explosives Experimental Facility, or BEEF, is a ten-acre fenced high-explosives testing facility that provides data to support stockpile stewardship and other national security programs. At BEEF, conventional high-explosives experiments are conducted safely with sophisticated diagnostics such as high-speed optics and x-ray radiography.

  7. The Case for "Big History."

    ERIC Educational Resources Information Center

    Christian, David

    1991-01-01

    Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)

  8. Fossils of big bang turbulence

    NASA Astrophysics Data System (ADS)

    Gibson, C. H.

    2004-12-01

    A model is proposed connecting turbulence, fossil turbulence, and the big bang origin of the universe. While details are incomplete, the model is consistent with our knowledge of these processes and is supported by observations. Turbulence arises in a hot-big-bang quantum-gravitational-dynamics scenario at Planck scales. Chaotic, eddy-like motions produce an exothermic Planck-particle cascade from 10^-35 m at 10^32 K to 10^8 larger, 10^4 cooler, quark-gluon scales. A Planck-Kerr instability gives high-Reynolds-number (Re ~ 10^6) turbulent combustion, space-time-energy-entropy and turbulent mixing. Batchelor-Obukhov-Corrsin turbulent-temperature fluctuations are preserved as the first fossil turbulence by inflation stretching the patterns beyond the horizon ct of causal connection faster than light speed c in time t ~ 10^-33 seconds. Fossil big-bang temperature turbulence re-enters the horizon and imprints nucleosynthesis of H-He densities that seed fragmentation by gravity at 10^12 s in the low-Reynolds-number plasma before its transition to gas at t ~ 10^13 s and T ~ 3000 K. Multi-scaling coefficients of the cosmic microwave background (CMB) temperature anisotropies closely match those for high-Reynolds-number turbulence (Bershadskii and Sreenivasan 2002, 2003). CMB spectra support the interpretation that big-bang turbulence fossils triggered fragmentation of the viscous plasma at supercluster to galaxy mass scales from 10^46 to 10^42 kg (Gibson 1996, 2000, 2004ab).

  9. Big6 Turbotools and Synthesis

    ERIC Educational Resources Information Center

    Tooley, Melinda

    2005-01-01

    The different tools that are helpful during the Synthesis stage, their role in boosting students' abilities in Synthesis, and the ways in which they can be customized to meet the needs of each group of students are discussed. Big6 TurboTools offers several tools to help complete the task. In the Synthesis stage, these same tools along with Turbo Report and…

  10. Big Explosives Experimental Facility - BEEF

    ScienceCinema

    None

    2015-01-07

    The Big Explosives Experimental Facility, or BEEF, is a ten-acre fenced high-explosives testing facility that provides data to support stockpile stewardship and other national security programs. At BEEF, conventional high-explosives experiments are conducted safely with sophisticated diagnostics such as high-speed optics and x-ray radiography.

  11. 1976 Big Thompson flood, Colorado

    USGS Publications Warehouse

    Jarrett, R. D., (compiler); Vandas, S.J.

    2006-01-01

    In the early evening of July 31, 1976, a large stationary thunderstorm released as much as 7.5 inches of rainfall in about an hour (about 12 inches in a few hours) in the upper reaches of the Big Thompson River drainage. This large amount of rainfall in such a short period of time produced a flash flood that caught residents and tourists by surprise. The immense volume of water that churned down the narrow Big Thompson Canyon scoured the river channel and destroyed everything in its path, including 418 homes, 52 businesses, numerous bridges, paved and unpaved roads, power and telephone lines, and many other structures. The tragedy claimed the lives of 144 people. Scores of other people narrowly escaped with their lives. The Big Thompson flood ranks among the deadliest of Colorado's recorded floods. It is one of several destructive floods in the United States that have shown the necessity of conducting research to determine the causes and effects of floods. The U.S. Geological Survey (USGS) conducts research and operates a Nationwide streamgage network to help understand and predict the magnitude and likelihood of large streamflow events such as the Big Thompson Flood. Such research and streamgage information are part of an ongoing USGS effort to reduce flood hazards and to increase public awareness.

  12. China: Big Changes Coming Soon

    ERIC Educational Resources Information Center

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  13. Big Data for a Big Ocean at the NOAA National Oceanographic Data Center

    NASA Astrophysics Data System (ADS)

    Casey, K. S.

    2014-12-01

    Covering most of planet Earth, the vast, physically challenging ocean environment was once the sole domain of hardy, sea-going oceanographers. More recently, however, ocean observing systems have become more operational as well as more diverse. With observations coming from satellites, automated ship-based systems, autonomous underwater and airborne vehicles, in situ observing systems, and numerical models the field of oceanography is now clearly in the domain of Big Data. The NOAA National Oceanographic Data Center (NODC) and its partners around the world are addressing the entire range of Big Data issues for the ocean environment. A growing variety, volume, and velocity of incoming "Big Ocean" data streams are being managed through numerous approaches including the automated ingest and archive of incoming data; deployment of standardized, machine-consumable data discovery services; and interoperable data access, visualization, and subset mechanisms. In addition, support to the community of data producers to help them create more machine-ready ocean observation data streams is being provided and pilot projects to effectively incorporate commercial and hybrid cloud storage, access, and processing services into existing workflows and systems are being conducted. NODC is also engaging more actively than ever in the broader community of environmental data facilities to address these challenges. Details on these efforts at NODC and its partners will be provided and input sought on new and evolving user requirements.

  14. Big Dust Devils

    NASA Technical Reports Server (NTRS)

    2005-01-01

    28 January 2004 Northern Amazonis Planitia is famous for its frequent, large (> 1 km high) dust devils. They occur throughout the spring and summer seasons, and can be detected from orbit, even at the 240 meters (278 yards) per pixel resolution of the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide angle instruments. This red wide angle image shows a plethora of large dust devils. The arrow points to an example. Shadows cast by the towering columns of swirling dust point away from the direction of sunlight illumination (sun is coming from the left/lower left). This December 2004 scene covers an area more than 125 km (> 78 mi) across and is located near 37°N, 154°W.

  15. 3-T Breast Diffusion-Weighted MRI by Echo-Planar Imaging with Spectral Spatial Excitation or with Additional Spectral Inversion Recovery: An In Vivo Comparison of Image Quality

    PubMed Central

    Jacobsen, Megan C.; Dogan, Basak E.; Adrada, Beatriz E.; Plaxco, Jeri Sue; Wei, Wei; Son, Jong Bum; Hazle, John D.; Ma, Jingfei

    2015-01-01

    Objective To compare conventional DWI with spectral spatial excitation (cDWI) and an enhanced DWI with additional adiabatic spectral inversion recovery (eDWI) for 3-T breast MRI. Methods Twenty-four patients were enrolled and imaged with both cDWI and eDWI. Three breast radiologists scored the cDWI and eDWI images of each patient for fat-suppression quality, geometric distortion, visibility of normal structures and biopsy-proven lesions, and overall image quality. SNR, CNR, and ADC for evaluable tissues were measured. Statistical tests were performed for qualitative and quantitative comparisons. Results eDWI yielded significantly higher CNR and SNR on a lesion basis, and higher glandular CNR and SNR and muscle SNR on a patient basis. eDWI also yielded significantly higher qualitative scores in all categories. No significant difference was found in ADC values. Conclusion eDWI provided superior image quality and higher CNR and SNR on a lesion basis. eDWI can replace cDWI for 3-T breast DWI. PMID:25695868
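
    A minimal sketch of how ROI-based SNR and CNR are conventionally computed from such images (the paper's exact measurement protocol is not given here; the ROI arrays and the use of a background-noise ROI are assumptions):

        import numpy as np

        def snr(signal_roi, noise_roi):
            """SNR = mean signal intensity / standard deviation of background noise."""
            return signal_roi.mean() / noise_roi.std()

        def cnr(lesion_roi, tissue_roi, noise_roi):
            """CNR = difference of mean intensities, normalized by noise std."""
            return (lesion_roi.mean() - tissue_roi.mean()) / noise_roi.std()

    With definitions like these, better fat suppression raises CNR directly: residual fat signal inflates the background tissue mean and eats into the lesion-tissue contrast.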

  16. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-09

    ... Fish and Wildlife Service Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN... comprehensive conservation plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge...: r3planning@fws.gov . Include ``Big Stone Draft CCP/ EA'' in the subject line of the message. Fax:...

  17. Big Data: Implications for Health System Pharmacy.

    PubMed

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are inadequate. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. Strategies for managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services. PMID:27559194

  18. A survey of big data research

    PubMed Central

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  19. Big Sagebrush Seed Bank Densities Following Wildfires

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Big sagebrush (Artemisia sp.) is a critical shrub to such sagebrush-obligate species as sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush does not sprout after wildfires, and big sagebrush seed is generally sho...

  20. Big City Education: Its Challenge to Governance.

    ERIC Educational Resources Information Center

    Haskew, Laurence D.

    This chapter traces the migration from farms to cities and the later movement from cities to suburbs and discusses the impact of the resulting big city environment on the governance of big city education. The author (1) suggests how local, State, and Federal governments can improve big city education; (2) discusses ways of planning for the future…

  1. A SWOT Analysis of Big Data

    ERIC Educational Resources Information Center

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  2. Judging Big Deals: Challenges, Outcomes, and Advice

    ERIC Educational Resources Information Center

    Glasser, Sarah

    2013-01-01

    This article reports the results of an analysis of five Big Deal electronic journal packages to which Hofstra University's Axinn Library subscribes. COUNTER usage reports were used to judge the value of each Big Deal. Limitations of usage statistics are also discussed. In the end, the author concludes that four of the five Big Deals are good…

  3. A Spectrograph for BigBOSS

    NASA Astrophysics Data System (ADS)

    Carton, Pierre-Henri; Bebek, C.; Cazaux, S.; Ealet, A.; Eppelle, D.; Kneib, J.; Karst, P.; Levi, M.; Magneville, C.; Palanque-Delabrouille, N.; Ruhlmann-Kleider, V.; Schlegel, D.; Yeche, C.

    2012-01-01

    The BigBOSS spectrograph assembly conveys the light from the fiber output to the detector, and includes the optics, gratings, mechanics, and cryostats. The 5000 fibers are split into 10 bundles of 500 fibers, and each bundle feeds one spectrograph. The full bandwidth from 0.36 µm to 1.05 µm is split into 3 bands. Each channel is composed of a collimator (doublet lenses), a VPH grating, and a 6-lens camera. The 500 fiber spectra are imaged onto a 4k x 4k detector by the F/2 camera, with each fiber core imaged onto 4 pixels. Each channel of the BigBOSS spectrograph will be equipped with a single-CCD camera, resulting in 30 cryostats in total for the instrument. Based on its experience with CCD cameras for projects like EROS and MegaCam, CEA/Saclay has designed small, autonomous cryogenic vessels that integrate cryo-cooling, CCD positioning, and slow-control interfacing capabilities. The use of a linear pulse tube with its own control unit, both developed by Thales Cryogenics BV, will ensure versatility, reliability, and operational flexibility. CCDs will be cooled to 140 K, with stability better than 1 K, positioned within 15 µm along the optical axis and 50 µm in the XY plane. Slow-control machines will be interfaced directly to an Ethernet network, which will allow them to be operated remotely. The design leads to a very robust instrument with no moving mechanics except the shutters. The 30-channel instrument is impressively compact, with a volume of 3 m^3, and producing this number of identical channels will drive a quasi-mass-production approach.

  4. Gravitational waves from the big bounce

    SciTech Connect

    Mielczarek, Jakub

    2008-11-15

    In this paper we investigate gravitational wave production during the big bounce phase, inspired by loop quantum cosmology. We consider the influence of the holonomy corrections on the equation for tensor modes. We show that they act like an additional effective graviton mass, suppressing gravitational wave creation. However, such effects can be treated perturbatively. We investigate a simplified model without holonomy corrections to the equation for modes and find its exact analytical solution. Assuming matter of the form ρ ∝ a^-2, we calculate the full spectrum of the gravitational waves from the big bounce phase. The spectrum obtained decreases to zero for the low-energy modes. On the basis of this observation we infer that this effect can lead to low cosmic microwave background (CMB) multipole suppression and gives a potential way of testing loop quantum cosmology models. We also consider a scenario with a post-bounce inflationary phase. The power spectrum obtained gives a qualitative explanation of the CMB spectra, including low multipole suppression.

  5. Big Data Issues under the Copernicus Programme

    NASA Astrophysics Data System (ADS)

    Schulte-Braucks, R. L.

    2014-12-01

    The Copernicus Programme of Earth observation satellites (http://copernicus.eu) will be affected by a growing volume of data and information. The first satellite (Sentinel 1A) has just been launched, and seven additional satellites are to be launched by the end of the decade. These will produce 8 TB of data per day, i.e. considerably more than can be downloaded via normal Internet connections. There is no definitive answer to the many challenges of big data, but there are gradual solutions for Copernicus in view of the progressive roll-out of the space infrastructure and the thematic services which the European Commission will develop. This presentation will outline several approaches to the big data issue. It will start from the needs of the Copernicus users, which are far from homogeneous. As their needs differ, the European Commission and ESA will have to propose different solutions to fulfil them, taking into account the present and future state of technology. The presentation will discuss these solutions, both with regard to better use of the network and with regard to hosted processing.
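
    For scale, a rough back-of-the-envelope check of the sustained bandwidth implied by 8 TB/day (illustrative arithmetic only, assuming decimal terabytes):

        # Sustained network throughput needed to move 8 TB every day.
        tb_per_day = 8
        bits = tb_per_day * 1e12 * 8        # terabytes -> bits
        seconds = 24 * 3600                 # seconds per day
        print(f"{bits / seconds / 1e9:.2f} Gbit/s sustained")   # ~0.74 Gbit/s

    A sustained rate of roughly three quarters of a gigabit per second, around the clock, is indeed beyond what most users' Internet connections can absorb, which is why hosted processing close to the data is attractive.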

  6. Big Bang Cosmic Titanic: Cause for Concern?

    NASA Astrophysics Data System (ADS)

    Gentry, Robert

    2013-04-01

    This abstract alerts physicists to a situation that, unless soon addressed, may yet affect PRL integrity. I refer to Stanley Brown's and DAE Robert Caldwell's rejection of PRL submission LJ12135, A Cosmic Titanic: Big Bang Cosmology Unravels Upon Discovery of Serious Flaws in Its Foundational Expansion Redshift Assumption, by their claim that BB is an established theory while ignoring our paper's Titanic, namely, that BB's foundational spacetime expansion redshifts assumption has now been proven to be irrefutably false because it is contradicted by our seminal discovery that GPS operation unequivocally proves that GR effects do not produce in-flight photon wavelength changes demanded by this central assumption. This discovery causes the big bang to collapse as quickly as did Ptolemaic cosmology when Copernicus discovered its foundational assumption was heliocentric, not geocentric. Additional evidence that something is amiss in PRL's treatment of LJ12135 comes from both Brown and EiC Gene Sprouse agreeing to meet at my exhibit during last year's Atlanta APS meeting to discuss this cover-up issue. Sprouse kept his commitment; Brown didn't. Question: If Brown could have refuted my claim of a cover-up, why didn't he come to present it before Gene Sprouse? I am appealing LJ12135's rejection.

  7. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    SciTech Connect

    2009-10-13

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe.

  8. Solution of a braneworld big crunch/big bang cosmology

    SciTech Connect

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-11-15

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)^2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios.

  9. Antigravity and the big crunch/big bang transition

    NASA Astrophysics Data System (ADS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  10. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    ScienceCinema

    None

    2011-04-25

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe.

  11. Seismic reflection data imaging and interpretation from Braniewo2014 experiment using additional wide-angle refraction and reflection and well-logs data

    NASA Astrophysics Data System (ADS)

    Trzeciak, Maciej; Majdański, Mariusz; Białas, Sebastian; Gaczyński, Edward; Maksym, Andrzej

    2015-04-01

    The Braniewo2014 reflection and refraction experiment was carried out in cooperation between the Polish Oil and Gas Company (PGNiG) and the Institute of Geophysics (IGF), Polish Academy of Sciences, near the town of Braniewo in northern Poland. PGNiG acquired a 20-km-long reflection profile using vibroseis and dynamite shooting; the aim of the reflection survey was to characterise a Silurian shale-gas reservoir. IGF deployed 59 seismic stations along this profile and recorded additional full-spread wide-angle refraction and reflection data with offsets up to 12 km; the maximum offset in the reflection survey was 3 km. To improve the velocity information, two velocity logs from nearby deep boreholes were used. The main goal of the joint reflection-refraction interpretation was to find relations between the velocity field from reflection velocity analysis and from refraction tomography, and to build a velocity model consistent with both the reflection and refraction datasets. In this paper we present imaging results and velocity models from the Braniewo2014 experiment and the methodology we used.

  12. Big Questions: Dark Matter

    ScienceCinema

    Lincoln, Don

    2014-08-07

    Carl Sagan's oft-quoted statement that there are "billions and billions" of stars in the cosmos gives an idea of just how much "stuff" is in the universe. However scientists now think that in addition to the type of matter with which we are familiar, there is another kind of matter out there. This new kind of matter is called "dark matter" and there seems to be five times as much as ordinary matter. Dark matter interacts only with gravity, thus light simply zips right by it. Scientists are searching through their data, trying to prove that the dark matter idea is real. Fermilab's Dr. Don Lincoln tells us why we think this seemingly-crazy idea might not be so crazy after all.

  13. Big Questions: Dark Matter

    SciTech Connect

    Lincoln, Don

    2013-12-05

    Carl Sagan's oft-quoted statement that there are "billions and billions" of stars in the cosmos gives an idea of just how much "stuff" is in the universe. However scientists now think that in addition to the type of matter with which we are familiar, there is another kind of matter out there. This new kind of matter is called "dark matter" and there seems to be five times as much as ordinary matter. Dark matter interacts only with gravity, thus light simply zips right by it. Scientists are searching through their data, trying to prove that the dark matter idea is real. Fermilab's Dr. Don Lincoln tells us why we think this seemingly-crazy idea might not be so crazy after all.

  14. Big Data in Medicine is Driving Big Changes

    PubMed Central

    Verspoor, K.

    2014-01-01

    Summary Objectives To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  15. Food additives.

    PubMed

    Berglund, F

    1978-01-01

    The use of additives in food fulfils many purposes, as shown by the index issued by the Codex Committee on Food Additives: Acids, bases and salts; Preservatives; Antioxidants and antioxidant synergists; Anticaking agents; Colours; Emulsifiers; Thickening agents; Flour-treatment agents; Extraction solvents; Carrier solvents; Flavours (synthetic); Flavour enhancers; Non-nutritive sweeteners; Processing aids; Enzyme preparations. Many additives occur naturally in foods, but this does not exclude toxicity at higher levels. Some food additives are nutrients, or even essential nutrients, e.g. NaCl. Examples are known of food additives causing toxicity in man even when used according to regulations, e.g. cobalt in beer. In other instances, poisoning has been due to carry-over, e.g. by nitrate in cheese whey when used as artificial feed for infants. Poisonings also occur when a permitted substance is added at too high a level, by accident or carelessness, e.g. nitrite in fish. Finally, there are examples of hypersensitivity to food additives, e.g. to tartrazine and other food colours. The toxicological evaluation, based on animal feeding studies, may be complicated by impurities, e.g. orthotoluene-sulfonamide in saccharin; by transformation or disappearance of the additive in food processing or storage, e.g. bisulfite in raisins; by reaction products with food constituents, e.g. formation of ethylurethane from diethyl pyrocarbonate; and by metabolic transformation products, e.g. formation in the gut of cyclohexylamine from cyclamate. Metabolic end products may differ in experimental animals and in man: guanylic acid and inosinic acid are metabolized to allantoin in the rat but to uric acid in man. The magnitude of the safety margin in man of the Acceptable Daily Intake (ADI) is not identical to the "safety factor" used when calculating the ADI. The symptoms of Chinese Restaurant Syndrome, although not hazardous, furthermore illustrate that the whole ADI

  16. Big data and ophthalmic research.

    PubMed

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research-including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research. PMID:26844660

  17. District Bets Big on Standards

    ERIC Educational Resources Information Center

    Gewertz, Catherine

    2013-01-01

    The big clock in Dowan McNair-Lee's 8th grade classroom in the Stuart-Hobson Middle School is silent, but she can hear the minutes ticking away nonetheless. On this day, like any other, the clock is a constant reminder of how little time she has to prepare her students--for spring tests, and for high school and all that lies beyond it. The…

  18. EHR Big Data Deep Phenotyping

    PubMed Central

    Lenert, L.; Lopez-Campos, G.

    2014-01-01

    Summary Objectives Given the quickening speed of discovery of variant disease drivers from combined patient genotype and phenotype data, the objective is to provide methodology using big data technology to support the definition of deep phenotypes in medical records. Methods As the vast stores of genomic information increase with next generation sequencing, the importance of deep phenotyping increases. The growth of genomic data and adoption of Electronic Health Records (EHR) in medicine provides a unique opportunity to integrate phenotype and genotype data into medical records. The method by which collections of clinical findings and other health related data are leveraged to form meaningful phenotypes is an active area of research. Longitudinal data stored in EHRs provide a wealth of information that can be used to construct phenotypes of patients. We focus on a practical problem around data integration for deep phenotype identification within EHR data. The use of big data approaches is described that enable scalable markup of EHR events that can be used for semantic and temporal similarity analysis to support the identification of phenotype and genotype relationships. Conclusions Stead and colleagues’ 2005 concept of using light standards to increase the productivity of software systems by riding on the wave of hardware/processing power is described as a harbinger for designing future healthcare systems. The big data solution, using flexible markup, provides a route to improved utilization of processing power for organizing patient records in genotype and phenotype research. PMID:25123744
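
    A toy sketch of the kind of temporal-similarity comparison over marked-up EHR events the abstract describes (the event representation, the Jaccard-style score, and the 30-day window are illustrative assumptions, not the paper's method):

        from datetime import date

        def temporal_jaccard(events_a, events_b, window_days=30):
            """Each patient is a set of (code, date) clinical events.
            Score = codes shared within the time window / union of codes."""
            shared = {ca for ca, da in events_a
                          for cb, db in events_b
                          if ca == cb and abs((da - db).days) <= window_days}
            union = {c for c, _ in events_a} | {c for c, _ in events_b}
            return len(shared) / len(union) if union else 0.0

        a = {("E11.9", date(2020, 1, 5)), ("I10", date(2020, 2, 1))}
        b = {("E11.9", date(2020, 1, 20)), ("J45", date(2021, 3, 3))}
        print(temporal_jaccard(a, b))   # 0.33: one shared code within 30 days

    Scoring both what was recorded and when it was recorded is what lets similar clinical trajectories, not just similar code lists, group into candidate deep phenotypes.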

  19. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $0.023 per pound of aluminum produced is projected for a 200 kA pot.
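
    For context, a rough sketch of the daily output of a 200 kA pot via Faraday's law, which is what makes a per-pound saving add up (the current efficiency is an assumed typical value; this is not the report's own calculation):

        # Daily aluminum output of a 200 kA cell via Faraday's law.
        I = 200_000            # cell current, amperes
        t = 86_400             # seconds per day
        M = 26.98              # molar mass of Al, g/mol
        z, F = 3, 96_485       # electrons per Al3+ ion; Faraday constant, C/mol
        eff = 0.92             # assumed current efficiency
        grams = I * t * M / (z * F) * eff
        print(f"{grams / 453.6:.0f} lb/day")   # roughly 3,300 lb/day

    At the quoted $0.023 per pound, that output corresponds to savings on the order of $75 per pot per day, which compounds quickly across a smelter's potline.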

  20. Phosphazene additives

    SciTech Connect

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  1. Turning big bang into big bounce. I. Classical dynamics

    SciTech Connect

    Dzierzak, Piotr; Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz

    2009-11-15

    The big bounce (BB) transition within a flat Friedmann-Robertson-Walker model is analyzed in the setting of loop geometry underlying the loop cosmology. We solve the constraint of the theory at the classical level to identify physical phase space and find the Lie algebra of the Dirac observables. We express energy density of matter and geometrical functions in terms of the observables. It is the modification of classical theory by the loop geometry that is responsible for BB. The classical energy scale specific to BB depends on a parameter that should be fixed either by cosmological data or determined theoretically at quantum level, otherwise the energy scale stays unknown.

  2. 77 FR 19262 - Procurement List; Additions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... INFORMATION: Additions On 12/23/2011 (76 FR 80346), 1/6/2012 (77 FR 780) and 2/3/2012 (77 FR 5495-5496), the... Guard, 2800 Airport Ave B, Bldg. 62, Big Sky Diner, Great Falls, MT. NPA: Skils'kin, Spokane,...

  3. Empowering Personalized Medicine with Big Data and Semantic Web Technology: Promises, Challenges, and Use Cases

    PubMed Central

    Panahiazar, Maryam; Taslimitehrani, Vahid; Jadhav, Ashutosh; Pathak, Jyotishman

    2015-01-01

    In healthcare, big data tools and technologies have the potential to create significant value by improving outcomes while lowering costs for each individual patient. Diagnostic images, genetic test results, and biometric information are increasingly generated and stored in electronic health records, presenting us with data that are by nature high in volume, variety, and velocity, thereby necessitating novel ways to store, manage, and process big data. This presents an urgent need to develop new, scalable and expandable big data infrastructure and analytical methods that can enable healthcare providers to access knowledge for the individual patient, yielding better decisions and outcomes. In this paper, we briefly discuss the nature of big data and the role of the semantic web and data analysis for generating “smart data” which offer actionable information that supports better decisions for personalized medicine. In our view, the biggest challenge is to create a system that makes big data robust and smart for healthcare providers and patients, leading to more effective clinical decision-making, improved health outcomes, and, ultimately, managed healthcare costs. We highlight some of the challenges in using big data and propose the need for a semantic data-driven environment to address them. We illustrate our vision with practical use cases, and discuss a path for empowering personalized medicine using big data and semantic web technology. PMID:25705726

  4. “Big Data” and the Electronic Health Record

    PubMed Central

    Ross, M. K.; Wei, Wei

    2014-01-01

    Summary Objectives Implementation of Electronic Health Record (EHR) systems continues to expand. The massive number of patient encounters results in high amounts of stored data. Transforming clinical data into knowledge to improve patient care has been the goal of biomedical informatics professionals for many decades, and this work is now increasingly recognized outside our field. In reviewing the literature for the past three years, we focus on “big data” in the context of EHR systems and we report on some examples of how secondary use of data has been put into practice. Methods We searched the PubMed database for articles from January 1, 2011 to November 1, 2013. We initiated the search with keywords related to “big data” and EHR, identified relevant articles, and added additional keywords from the retrieved articles. Based on the new keywords, more articles were retrieved, and we manually narrowed down the set using predefined inclusion and exclusion criteria. Results Our final review includes articles categorized into the themes of data mining (pharmacovigilance, phenotyping, natural language processing), data application and integration (clinical decision support, personal monitoring, social media), and privacy and security. Conclusion The increasing adoption of EHR systems worldwide makes it possible to capture large amounts of clinical data. There is an increasing number of articles addressing the theme of “big data”, and the concepts associated with these articles vary. The next step is to transform healthcare big data into actionable knowledge. PMID:25123728
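
    A minimal sketch of the kind of date-bounded PubMed keyword search the methods describe, using NCBI's public E-utilities endpoint (the review's exact query terms and tooling are not specified; this is illustrative only):

        import json, urllib.parse, urllib.request

        def pubmed_count(term, mindate="2011/01/01", maxdate="2013/11/01"):
            """Return the number of PubMed hits for a term in a date window."""
            params = urllib.parse.urlencode({
                "db": "pubmed", "term": term, "retmode": "json",
                "datetype": "pdat", "mindate": mindate, "maxdate": maxdate,
            })
            url = f"https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?{params}"
            with urllib.request.urlopen(url) as resp:
                return int(json.load(resp)["esearchresult"]["count"])

        print(pubmed_count('"big data" AND "electronic health record"'))

    Iterating such queries with keywords harvested from the retrieved articles is one straightforward way to implement the snowballing step the authors describe.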

  5. Some experiences and opportunities for big data in translational research

    PubMed Central

    Chute, Christopher G.; Ullman-Cullere, Mollie; Wood, Grant M.; Lin, Simon M.; He, Min; Pathak, Jyotishman

    2014-01-01

    Health care has become increasingly information intensive. The advent of genomic data, integrated into patient care, significantly accelerates the complexity and amount of clinical data. Translational research in the present day increasingly embraces new biomedical discovery in this data-intensive world, thus entering the domain of “big data.” The Electronic Medical Records and Genomics consortium has taught us many lessons, while simultaneously advances in commodity computing methods enable the academic community to affordably manage and process big data. Although great promise can emerge from the adoption of big data methods and philosophy, the heterogeneity and complexity of clinical data, in particular, pose additional challenges for big data inferencing and clinical application. However, the ultimate comparability and consistency of heterogeneous clinical information sources can be enhanced by existing and emerging data standards, which promise to bring order to clinical data chaos. Meaningful Use data standards in particular have already simplified the task of identifying clinical phenotyping patterns in electronic health records. PMID:24008998

  6. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    SciTech Connect

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  7. Native perennial forb variation between mountain big sagebrush and Wyoming big sagebrush plant communities

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Big sagebrush (Artemisia tridentata Nutt.) occupies large portions of the western United States and provides valuable wildlife habitat. However, information quantifying differences in native perennial forb characteristics between mountain big sagebrush (A. tridentata spp. vaseyana (Rydb....

  8. Big.

    ERIC Educational Resources Information Center

    Everhart, Nancy; Everhart, Harry

    2000-01-01

    Presents information on the different types of data projectors, to help librarians make a smart selection for their library. Describes multimedia projectors; document cameras; large-screen televisions, monitors, and flat panels; electronic whiteboards, and identifies uses, manufacturers, and prices. Notes three important ways that data projectors…

  9. Big bang nucleosynthesis: An update

    SciTech Connect

    Olive, Keith A.

    2013-07-23

    An update on the standard model of big bang nucleosynthesis (BBN) is presented. With the value of the baryon-to-photon ratio determined to high precision by WMAP, standard BBN is a parameter-free theory. In this context, the theoretical prediction for the abundances of D, ^4He, and ^7Li is discussed and compared to their observational determination. While concordance for D and ^4He is satisfactory, the prediction for ^7Li exceeds the observational determination by a factor of about four. Possible solutions to this problem are discussed.

  10. Fitting ERGMs on big networks.

    PubMed

    An, Weihua

    2016-09-01

    The exponential random graph model (ERGM) has become a valuable tool for modeling social networks. In particular, ERGM provides great flexibility to account for both covariate effects on tie formation and endogenous network formation processes. However, there are both conceptual and computational issues for fitting ERGMs on big networks. This paper describes a framework and a series of methods (based on existing algorithms) to address these issues. It also outlines the advantages and disadvantages of the methods and the conditions under which they are most applicable. Selected methods are illustrated through examples. PMID:27480375
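
    For reference, the ERGM that the paper builds on has the standard exponential-family form (conventional notation, not copied from the article):

        P_\theta(Y = y) = \frac{\exp\{\theta^\top g(y)\}}{\kappa(\theta)},
        \qquad \kappa(\theta) = \sum_{y' \in \mathcal{Y}} \exp\{\theta^\top g(y')\}

    Here g(y) is a vector of network statistics (edge counts, triangles, covariate terms) and the normalizing constant κ(θ) sums over every possible graph on the node set, which is precisely what makes direct likelihood evaluation intractable on big networks and motivates approximate fitting methods.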

  11. The LHC's Next Big Mystery

    NASA Astrophysics Data System (ADS)

    Lincoln, Don

    2015-03-01

    When the sun rose over America on July 4, 2012, the world of science had radically changed. The Higgs boson had been discovered. Mind you, the press releases were more cautious than that, with "a new particle consistent with being the Higgs boson" being the carefully constructed phrase of the day. But, make no mistake, champagne corks were popped and backs were slapped. The data had spoken and a party was in order. Even if the observation turned out to be something other than the Higgs boson, the first big discovery from data taken at the Large Hadron Collider had been made.

  12. Traffic information computing platform for big data

    SciTech Connect

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technical characteristics of big data and traffic information services, a distributed traffic atomic-information computing platform architecture is proposed. Under the big data environment, this architecture helps to guarantee safe and efficient traffic operation, and enables more intelligent and personalized traffic information services for traffic information users.

  13. Urgent Call for Nursing Big Data.

    PubMed

    Delaney, Connie W

    2016-01-01

    The purpose of this panel is to expand internationally a National Action Plan for sharable and comparable nursing data for quality improvement and big data science. There is an urgent need to ensure that nursing has sharable and comparable data for quality improvement and big data science. A national collaborative, Nursing Knowledge and Big Data Science, includes multi-stakeholder groups focused on a National Action Plan toward implementing and using sharable and comparable nursing big data. Panelists will share accomplishments and future plans with an eye toward international collaboration. This presentation is suitable for any audience attending the NI2016 conference. PMID:27332330

  14. Big data: an introduction for librarians.

    PubMed

    Hoy, Matthew B

    2014-01-01

    Modern life produces data at an astounding rate and shows no signs of slowing. This has led to new advances in data storage and analysis and the concept of "big data," that is, massive data sets that can yield surprising insights when analyzed. This column will briefly describe what big data is and why it is important. It will also briefly explore the possibilities and problems of big data and the implications it has for librarians. A list of big data projects and resources is also included. PMID:25023020

  15. Image Gallery

    MedlinePlus

    The Image Gallery contains high-quality digital photographs available from ... Select a category below to view additional thumbnail images. Images are available for direct download in 2 ...

  16. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    ERIC Educational Resources Information Center

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  17. Image Data Mining for Pattern Classification and Visualization of Morphological Changes in Brain MR Images.

    PubMed

    Murakawa, Saki; Ikuta, Rie; Uchiyama, Yoshikazu; Shiraishi, Junji

    2016-02-01

    Hospital information systems (HISs) and picture archiving and communication systems (PACSs) are archiving large amounts of data (i.e., "big data") that are not being used. Therefore, many research projects in progress are trying to use "big data" for the development of early diagnosis, prediction of disease onset, and personalized therapies. In this study, we propose a new method of image data mining to identify regularities and abnormalities in large image data sets. We used 70 archived magnetic resonance (MR) images acquired using three-dimensional magnetization-prepared rapid acquisition with gradient echo (3D MP-RAGE). These images were obtained from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. For anatomical standardization of the data, we used the statistical parametric mapping (SPM) software. Using a similarity matrix based on cross-correlation coefficients (CCs) calculated from an anatomical region and a hierarchical clustering technique, we classified all the abnormal cases into five groups. The Z-score map identified the differences between a standard normal brain and each brain in the Alzheimer's groups. In addition, a scatter plot obtained from two similarity matrixes visualized the regularities and abnormalities in the image data sets. Image features identified using our method could be useful for understanding image findings associated with Alzheimer's disease. PMID:26902379
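
    A compact sketch of the clustering step as described: a cross-correlation similarity matrix over one anatomical region, fed to hierarchical clustering (the array shapes and the average-linkage choice are assumptions; the paper's exact settings are not given here):

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        def cluster_subjects(rois, n_groups=5):
            """rois: (n_subjects, n_voxels) intensities from one anatomical
            region of spatially normalized MR images."""
            sim = np.corrcoef(rois)                # n x n cross-correlation matrix
            dist = 1.0 - sim                       # similarity -> distance
            iu = np.triu_indices_from(dist, k=1)   # condensed form for linkage
            z = linkage(dist[iu], method="average")
            return fcluster(z, t=n_groups, criterion="maxclust")

    Cutting the dendrogram at five clusters mirrors the paper's grouping of the abnormal cases; the same similarity matrix can also feed the scatter-plot visualization the authors describe.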

  18. Fires Burning near Big Sur, California

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Fires near Big Sur, Calif., continued to burn unchecked when the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument on NASA's Terra satellite captured this image on Sunday, June 29. In Northern California alone, fires have consumed more than 346,000 acres. At least 18,000 people have been deployed to attempt to extinguish or control the flames. Air quality as far away as San Francisco has been adversely impacted by the dense clouds of smoke and ash blowing toward the northwest. The satellite image combines a natural color portrayal of the landscape with thermal infrared data showing the active burning areas in red. The dark area in the lower right is a previous forest fire.

    ASTER is one of five Earth-observing instruments launched December 18, 1999, on NASA's Terra satellite. The instrument was built by Japan's Ministry of Economy, Trade and Industry. A joint U.S./Japan science team is responsible for validation and calibration of the instrument and the data products.

    The broad spectral coverage and high spectral resolution of ASTER provide scientists in numerous disciplines with critical information for surface mapping and monitoring of dynamic conditions and temporal change. Example applications are: monitoring glacial advances and retreats; monitoring potentially active volcanoes; identifying crop stress; determining cloud morphology and physical properties; wetlands evaluation; thermal pollution monitoring; coral reef degradation; surface temperature mapping of soils and geology; and measuring surface heat balance.

    The U.S. science team is located at NASA's Jet Propulsion Laboratory, Pasadena, Calif. The Terra mission is part of NASA's Science Mission Directorate.

    Size: 35.4 by 57 kilometers (21.9 by 34.2 miles)
    Location: 36.1 degrees North latitude, 121.6 degrees West longitude
    Orientation: North at top
    Image Data: ASTER bands 3, 2, and 1
    Original Data Resolution: 15 meters (49 feet)
    Dates Acquired: June 29

  19. Progress on the Big Optical Array (BOA)

    NASA Astrophysics Data System (ADS)

    Armstrong, John T.

    1994-06-01

    The Navy Prototype Optical Interferometer (NPOI) is nearing the completion of the first phase of construction at the Lowell Observatory on Anderson Mesa, AZ. The NPOI comprises two subarrays, the Big Optical Array (BOA) and the USNO Astrometric Interferometer (AI), which share delay lines, the optics laboratory, the control system, and parts of the feed optics. We describe the design of and progress on the BOA, the imaging component of the NPOI. The AI is described elsewhere (Hutter, these proceedings). As of the date of this symposium, most of the civil engineering is complete, including the control and laboratory buildings and the concrete piers for the initial array. Three AI siderostats and associated feed pipes, three delay lines, the initial three-way beam combiner, and much of the control system are in place. First fringes are anticipated in April. By the end of 1994, four AI and two BOA siderostats, as well as three more delay lines, will be installed, making imaging with all six siderostats possible. The complete BOA will consist of six 50 cm siderostats and 30 siderostat stations in a Y with 251 m arms, with baseline lengths from 4 m to 437 m. Nearly redundant baseline lengths will allow fringe tracking on long baselines on which the visibilities are too low for detection in real time. A six-way beam combiner (Mozurkewich, these proceedings) will allow simultaneous measurements of 15 visibilities and nine of 10 independent closure phases. The output beams will feed 32-channel spectrometers covering the range from 450 to 900 nm. We anticipate tracking fringes on stars brighter than 10^m, imaging surfaces of stars brighter than 4^m, measuring stellar diameters to 0.18 milliarcsec (mas), and measuring binary orbits with major axes as small as 0.4 mas.
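
    As a quick order-of-magnitude check on the quoted 0.18 mas diameter precision, an interferometer's nominal angular resolution is roughly λ/B; a short Python sketch using the numbers in this abstract:

      import math

      wavelength = 450e-9   # m, blue end of the 450-900 nm range
      baseline = 437.0      # m, longest BOA baseline

      theta_rad = wavelength / baseline             # ~ lambda / B
      theta_mas = math.degrees(theta_rad) * 3.6e6   # rad -> milliarcsec
      print(f"{theta_mas:.2f} mas")                 # ~0.21 mas

    Model fitting to measured visibilities can push diameter estimates somewhat below this nominal resolution, which is consistent with the 0.18 mas figure.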

  20. Untapped Potential: Fulfilling the Promise of Big Brothers Big Sisters and the Bigs and Littles They Represent

    ERIC Educational Resources Information Center

    Bridgeland, John M.; Moore, Laura A.

    2010-01-01

    American children represent a great untapped potential in our country. For many young people, choices are limited and the goal of a productive adulthood is a remote one. This report paints a picture of who these children are, shares their insights and reflections about the barriers they face, and offers ways forward for Big Brothers Big Sisters as…

  1. Baryon symmetric big bang cosmology

    NASA Technical Reports Server (NTRS)

    Stecker, F. W.

    1978-01-01

    Both quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, followed by annihilation and a remaining excess of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; or (3) an extremely dense, high-temperature state with zero net baryon number, i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as it pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  2. Three dimensional simulation for Big Hill Strategic Petroleum Reserve (SPR).

    SciTech Connect

    Ehgartner, Brian L.; Park, Byoung Yoon; Sobolik, Steven Ronald; Lee, Moo Yul

    2005-07-01

    3-D finite element analyses were performed to evaluate the structural integrity of caverns located at the Strategic Petroleum Reserve's Big Hill site. State-of-the-art analyses simulated the current site configuration and considered additional caverns. Two expansions were modeled: the addition of 5 caverns to account for a full site, and a full dome containing 31 caverns. Operations including both normal and cavern workover pressures and cavern enlargement due to leaching were modeled to account for as many as 5 future oil drawdowns. Under the modeled conditions, caverns were placed very close to the edge of the salt dome. The web of salt separating the caverns and the web of salt between the caverns and the edge of the salt dome were reduced due to leaching. The impacts on cavern stability, underground creep closure, surface subsidence and infrastructure, and well integrity were quantified. The analyses included a recently derived damage criterion obtained from testing of Big Hill salt cores. The results show that, from a structural viewpoint, many additional caverns can be safely added to Big Hill.

  3. A systematic review of image segmentation methodology, used in the additive manufacture of patient-specific 3D printed models of the cardiovascular system

    PubMed Central

    Byrne, N; Velasco Forte, M; Tandon, A; Valverde, I

    2016-01-01

    Background Shortcomings in existing methods of image segmentation preclude the widespread adoption of patient-specific 3D printing as a routine decision-making tool in the care of those with congenital heart disease. We sought to determine the range of cardiovascular segmentation methods and how long each of these methods takes. Methods A systematic review of literature was undertaken. Medical imaging modality, segmentation methods, segmentation time, segmentation descriptive quality (SDQ) and segmentation software were recorded. Results In total, 136 studies met the inclusion criteria (1 clinical trial; 80 journal articles; 55 conference, technical and case reports). The most frequently used image segmentation methods were brightness thresholding, region growing and manual editing, as supported by the most popular piece of proprietary software: Mimics (Materialise NV, Leuven, Belgium, 1992–2015). The use of bespoke software developed by individual authors was not uncommon. SDQ indicated that reporting of image segmentation methods was generally poor, with only one in three accounts providing sufficient detail for the procedure to be reproduced. Conclusions and implications of key findings Predominantly anecdotal and case reporting precluded rigorous assessment of risk of bias and strength of evidence. This review finds a reliance on manual and semi-automated segmentation methods which demand a high level of expertise and a significant time commitment on the part of the operator. In light of the findings, we have made recommendations regarding reporting of 3D printing studies. We anticipate that these findings will encourage the development of advanced image segmentation methods. PMID:27170842
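
    For readers unfamiliar with the two methods the review found most often, the following Python sketch shows brightness thresholding followed by a crude stand-in for region growing (selecting the connected component around a seed). It uses scikit-image on entirely synthetic data; the shapes and seed location are assumptions for illustration only:

      import numpy as np
      from skimage import filters, measure

      # Toy stand-in for one slice of a cardiovascular image volume.
      rng = np.random.default_rng(1)
      image = rng.normal(loc=100.0, scale=10.0, size=(128, 128))
      image[40:80, 40:80] += 80.0        # bright "blood pool" region

      # Brightness thresholding with an automatically chosen level.
      level = filters.threshold_otsu(image)
      mask = image > level

      # Keep only the connected region containing a user-placed seed,
      # a simplified analogue of interactive region growing.
      labels = measure.label(mask)
      seed = (60, 60)
      region = labels == labels[seed]
      print(region.sum(), "pixels segmented")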

  4. Estimation of aerosol optical depth and additional atmospheric parameters for the calculation of apparent reflectance from radiance measured by the Airborne Visible/Infrared Imaging Spectrometer

    NASA Technical Reports Server (NTRS)

    Green, Robert O.; Conel, James E.; Roberts, Dar A.

    1993-01-01

    The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) measures spatial images of the total upwelling spectral radiance from 400 to 2500 nm through 10 nm spectral channels. Quantitative research and application objectives for surface investigations require inversion of the measured radiance to surface reflectance or surface-leaving radiance. To calculate apparent surface reflectance, estimates of atmospheric water vapor abundance, cirrus cloud effects, surface pressure elevation, and aerosol optical depth are required. Algorithms for the estimation of these atmospheric parameters from the AVIRIS data themselves are described. From these atmospheric parameters we show an example of the calculation of apparent surface reflectance from the AVIRIS-measured radiance using a radiative transfer code.
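
    The final inversion step can be summarized by the standard apparent (top-of-atmosphere) reflectance relation ρ = π L d² / (E₀ cos θ_s). The sketch below implements this textbook formula, not the AVIRIS processing code itself, and all numeric values are made up:

      import math

      def apparent_reflectance(radiance, e0, sun_zenith_deg, d_au=1.0):
          """Apparent reflectance from measured spectral radiance.

          radiance       : measured radiance, W m^-2 sr^-1 um^-1
          e0             : exoatmospheric solar irradiance, W m^-2 um^-1
          sun_zenith_deg : solar zenith angle, degrees
          d_au           : Earth-Sun distance, astronomical units
          """
          mu0 = math.cos(math.radians(sun_zenith_deg))
          return math.pi * radiance * d_au ** 2 / (e0 * mu0)

      # Example with plausible values for a 550 nm channel.
      print(apparent_reflectance(radiance=80.0, e0=1860.0,
                                 sun_zenith_deg=30.0))   # ~0.16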

  5. MR Imaging as an Additional Screening Modality for the Detection of Breast Cancer in Women Aged 50-75 Years with Extremely Dense Breasts: The DENSE Trial Study Design.

    PubMed

    Emaus, Marleen J; Bakker, Marije F; Peeters, Petra H M; Loo, Claudette E; Mann, Ritse M; de Jong, Mathijn D F; Bisschops, Robertus H C; Veltman, Jeroen; Duvivier, Katya M; Lobbes, Marc B I; Pijnappel, Ruud M; Karssemeijer, Nico; de Koning, Harry J; van den Bosch, Maurice A A J; Monninkhof, Evelyn M; Mali, Willem P Th M; Veldhuis, Wouter B; van Gils, Carla H

    2015-11-01

    Women with extremely dense breasts have an increased risk of breast cancer and lower mammographic tumor detectability. Nevertheless, in most countries, these women are currently screened with mammography only. Magnetic resonance (MR) imaging has the potential to improve breast cancer detection at an early stage because of its higher sensitivity. However, MR imaging is more expensive and is expected to be accompanied by an increase in the number of false-positive results and, possibly, an increase in overdiagnosis. To study the additional value of MR imaging, a randomized controlled trial (RCT) design is needed in which one group undergoes mammography and the other group undergoes mammography and MR imaging. With this design, it is possible to determine the proportion of interval cancers within each study arm. For this to be an effective screening strategy, the additional cancers detected at MR imaging screening must be accompanied by a subsequent reduction in interval cancers. The Dense Tissue and Early Breast Neoplasm Screening, or DENSE, trial is a multicenter RCT performed in the Dutch biennial population-based screening program (subject age range, 50-75 years). The study was approved by the Dutch Minister of Health, Welfare and Sport. In this study, mammographic density is measured by using a fully automated volumetric method. Participants with extremely dense breasts (American College of Radiology breast density category 4) and a negative result at mammography (Breast Imaging Reporting and Data System category 1 or 2) are randomly assigned to undergo additional MR imaging (n = 7237) or to be treated according to current practice (n = 28 948). Participants provide written informed consent before the MR imaging examination, which consists of dynamic breast MR imaging with gadolinium-based contrast medium and is intended to be performed for three consecutive screening rounds. The primary outcome is the difference in the proportions of interval cancers between the

  6. Results from the Big Bear Solar Observatory's New Digital Vector Magnetograph

    NASA Astrophysics Data System (ADS)

    Spirock, T. J.; Denker, C.; Varsik, J.; Shumko, S.; Qiu, J.; Gallagher, P.; Chae, J.; Goode, P.; Wang, H.

    2001-05-01

    During the past several years the Big Bear Solar Observatory has been involved in an aggressive program to modernize the observatory's instrumentation. At the forefront of this effort has been the upgrade of the observatory's digital vector magnetograph (DVMG), which has recently been integrated into the observatory's daily observing program. The DVMG, which is mounted on the observatory's 25 cm vacuum refractor, is a highly sensitive, high-cadence magnetograph which studies the Fe I line at 630.1 nm. An easy-to-use GUI observing tool has been written to aid instrument development and data acquisition. This tool automatically calibrates the data and generates near real-time vector magnetograms which will aid space weather forecasting and the support of space weather missions. Also, our plan is to integrate the DVMG data into the HESSI Synoptic Archive. The very sensitive quiet-Sun magnetograms produced by the DVMG will aid the study of small-scale magnetic reconnection at the intranetwork level and its possible contribution to the coronal heating problem. Quiet-Sun longitudinal and active-region vector magnetograms will be presented. Image quality issues such as bias, cross-talk, noise levels, and sensitivity will be discussed, in addition to the improvements gained in post-processing such as image selection and image alignment.

  7. An embedding for the big bang

    NASA Technical Reports Server (NTRS)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  8. What is beyond the big five?

    PubMed

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined. PMID:9728415
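
    The minimax screening described in this abstract can be made concrete with a small sketch: regress each candidate cluster on the five marker factors and report the multiple correlation, where low values suggest Big Five independence (a full replication would also compute reliability, omitted here). The data, seed, and variable names below are toy assumptions:

      import numpy as np

      def multiple_correlation(cluster, big_five):
          """Multiple R of a candidate cluster on Big Five marker scores."""
          X = np.column_stack([np.ones(len(cluster)), big_five])
          beta, *_ = np.linalg.lstsq(X, cluster, rcond=None)
          return np.corrcoef(X @ beta, cluster)[0, 1]

      rng = np.random.default_rng(2)
      big_five = rng.normal(size=(200, 5))     # 200 respondents, 5 factors
      religiousness = rng.normal(size=200)     # candidate cluster score
      print(f"multiple R = {multiple_correlation(religiousness, big_five):.2f}")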

  9. Structuring the Curriculum around Big Ideas

    ERIC Educational Resources Information Center

    Alleman, Janet; Knighton, Barbara; Brophy, Jere

    2010-01-01

    This article provides an inside look at Barbara Knighton's classroom teaching. She uses big ideas to guide her planning and instruction and gives other teachers suggestions for adopting the big idea approach and ways for making the approach easier. This article also represents a "small slice" of a dozen years of collaborative research,…

  10. Big system: Interactive graphics for the engineer

    NASA Technical Reports Server (NTRS)

    Quenneville, C. E.

    1975-01-01

    The BCS Interactive Graphics System (BIG System) approach to graphics was presented, along with several significant engineering applications. The BIG System precompiler, the graphics support library, and the function requirements of graphics applications are discussed. It was concluded that graphics standardization and a device-independent code can be developed to assure maximum graphic terminal transferability.

  11. Efficiency, Corporate Power, and the Bigness Complex.

    ERIC Educational Resources Information Center

    Adams, Walter; Brock, James W.

    1990-01-01

    Concludes that (1) the current infatuation with corporate bigness is void of credible empirical support; (2) disproportionate corporate size and industry concentration are incompatible with and destructive to good economic performance; and (3) structurally oriented antitrust policy must be revitalized to combat the burdens of corporate bigness.…

  12. A New Look at Big History

    ERIC Educational Resources Information Center

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spatial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  13. In Search of the Big Bubble

    ERIC Educational Resources Information Center

    Simoson, Andrew; Wentzky, Bethany

    2011-01-01

    Freely rising air bubbles in water sometimes assume the shape of a spherical cap, a shape also known as the "big bubble". Is it possible to find some objective function involving a combination of a bubble's attributes for which the big bubble is the optimal shape? Following the basic idea of the definite integral, we define a bubble's surface as…

  14. 2. Big Creek Road, worm fence and road at trailhead. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Big Creek Road, worm fence and road at trailhead. - Great Smoky Mountains National Park Roads & Bridges, Big Creek Road, Between State Route 284 & Big Creek Campground, Gatlinburg, Sevier County, TN

  15. Big sagebrush transplanting success in crested wheatgrass stands

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The conversion of formerly big sagebrush (Artemisia tridentata ssp. wyomingensis)/bunchgrass communities to annual grass dominance, primarily cheatgrass (Bromus tectorum), in Wyoming big sagebrush ecosystems has sparked increasing demand to establish big sagebrush on disturbed rangelands. The e...

  16. Old Big Oak Flat Road at intersection with New Tioga ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Old Big Oak Flat Road at intersection with New Tioga Road. Note gate for road to Tamarack Campground - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  17. View of Old Big Oak Flat Road in Talus Slope. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of Old Big Oak Flat Road in Talus Slope. Bridal Veil Falls at center distance. Looking east - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  18. Bioimage Informatics for Big Data.

    PubMed

    Peng, Hanchuan; Zhou, Jie; Zhou, Zhi; Bria, Alessandro; Li, Yujie; Kleissas, Dean Mark; Drenkow, Nathan G; Long, Brian; Liu, Xiaoxiao; Chen, Hanbo

    2016-01-01

    Bioimage informatics is a field wherein high-throughput image informatics methods are used to solve challenging scientific problems related to biology and medicine. When the image datasets become larger and more complicated, many conventional image analysis approaches are no longer applicable. Here, we discuss two critical challenges of large-scale bioimage informatics applications, namely, data accessibility and adaptive data analysis. We highlight case studies to show that these challenges can be tackled based on distributed image computing as well as machine learning of image examples in a multidimensional environment. PMID:27207370

  19. Epidemiology in the Era of Big Data

    PubMed Central

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  20. Big Data Application in Biomedical Research and Health Care: A Literature Review

    PubMed Central

    Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing

    2016-01-01

    Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care. PMID:26843812

  2. Was the Big Bang hot?

    NASA Technical Reports Server (NTRS)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.
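
    A standard first-order form of the dipole formula invoked here (a textbook expression, given for orientation; not necessarily the author's exact derivation) is

      T(\theta) \simeq T_0\,(1 + \beta\cos\theta), \qquad \beta = v/c,

    so the dipole's intensity signature \Delta I_\nu(\theta) \simeq (\partial B_\nu/\partial T)\,T_0\,\beta\cos\theta inherits the frequency dependence of \partial B_\nu/\partial T, which is what ties the measured dipole spectrum to the absolute temperature T_0 of the background.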

  3. Exploring Relationships in Big Data

    NASA Astrophysics Data System (ADS)

    Mahabal, A.; Djorgovski, S. G.; Crichton, D. J.; Cinquini, L.; Kelly, S.; Colbert, M. A.; Kincaid, H.

    2015-12-01

    Big Data are characterized by several different 'V's: Volume, Veracity, Volatility, Value, and so on. For many datasets, Volumes inflated by redundant features often make the data noisier and Value harder to extract. This is especially true if one is comparing or combining different datasets and the metadata are diverse. We have been exploring ways to exploit such datasets through a variety of statistical machinery and visualization. We show how we have applied these techniques to time series from large astronomical sky surveys. This was done in the Virtual Observatory framework. More recently we have been doing similar work for a completely different domain, viz. biology/cancer. The methodology reuse involves application to diverse datasets gathered through the various centers associated with the Early Detection Research Network (EDRN) for cancer, an initiative of the National Cancer Institute (NCI). Application to Geo datasets is a natural extension.

  4. Big Mysteries: The Higgs Mass

    SciTech Connect

    Lincoln, Don

    2014-04-28

    With the discovery of what looks to be the Higgs boson, LHC researchers are turning their attention to the next big question: the predicted mass of the newly discovered particle. When the effects of quantum mechanics are taken into account, the mass of the Higgs boson should be incredibly high...perhaps upwards of a quadrillion times higher than what was observed. In this video, Fermilab's Dr. Don Lincoln explains how it is that the theory predicts that the mass is so large and gives at least one possible theoretical idea that might solve the problem. Whether the proposed idea is the answer or not, this question must be answered by experiments at the LHC or today's entire theoretical paradigm could be in jeopardy.

  5. Big Bang nucleosynthesis in crisis\\?

    NASA Astrophysics Data System (ADS)

    Hata, N.; Scherrer, R. J.; Steigman, G.; Thomas, D.; Walker, T. P.; Bludman, S.; Langacker, P.

    1995-11-01

    A new evaluation of the constraint on the number of light neutrino species (Nν) from big bang nucleosynthesis suggests a discrepancy between the predicted light element abundances and those inferred from observations, unless the inferred primordial ⁴He abundance has been underestimated by 0.014±0.004 (1σ) or less than 10% (95% C.L.) of ³He survives stellar processing. With the quoted systematic errors in the observed abundances and a conservative chemical evolution parametrization, the best fit to the combined data is Nν = 2.1±0.3 (1σ) and the upper limit is Nν < 2.6 (95% C.L.). The data are inconsistent with the standard model (Nν = 3) at the 98.6% C.L.

  6. Evidence of the big fix

    NASA Astrophysics Data System (ADS)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2014-06-01

    We give evidence for the Big Fix. The theory of wormholes and multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how it can be confirmed by experimental data, and we show that it is indeed true for the Higgs vacuum expectation value v_h. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings, and the Higgs self-coupling are fixed when we vary v_h. It turns out that the existence of atomic nuclei plays a crucial role in maximizing the entropy. This is reminiscent of the anthropic principle; however, in our case it is required by the fundamental law.

  8. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data, as opposed to the more traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offers different views of how approaches of correlation and causality offer complementary methods

  9. Microsystems - The next big thing

    SciTech Connect

    STINNETT,REGAN W.

    2000-05-11

    Micro-Electro-Mechanical Systems (MEMS) is a big name for tiny devices that will soon make big changes in everyday life and the workplace. These and other types of Microsystems range in size from a few millimeters down to a few microns, much smaller than a human hair. These Microsystems have the capability to enable new ways to solve problems in commercial applications ranging from automotive, aerospace, telecommunications, manufacturing equipment, and medical diagnostics to robotics, and in national security applications such as nuclear weapons safety and security, battlefield intelligence, and protection against chemical and biological weapons. This broad range of applications reflects the broad capabilities of future Microsystems to provide the ability to sense, think, act, and communicate, all in a single integrated package. Microsystems have been called the next silicon revolution, but like many revolutions, they incorporate more elements than their predecessors. Microsystems do include MEMS components fabricated from polycrystalline silicon processed using techniques similar to those used in the manufacture of integrated electrical circuits. They also include optoelectronic components made from gallium arsenide and other semiconducting compounds from the III-V groups of the periodic table. Microsystems components are also being made from pure metals and metal alloys using the LIGA process, which utilizes lithography, etching, and casting at the micron scale. Generically, Microsystems are micron-scale, integrated systems that have the potential to combine the ability to sense light, heat, pressure, acceleration, vibration, and chemicals with the ability to process the collected data using CMOS circuitry, execute an electrical, mechanical, or photonic response, and communicate either optically or with microwaves.

  10. Image

    2007-08-31

    The computer side of the IMAGE project consists of a collection of Perl scripts that perform a variety of tasks; scripts are available to insert, update, and delete data from the underlying Oracle database, download data from NCBI's GenBank and other sources, and generate data files for download by interested parties. Web scripts make up the tracking interface, and various tools available on the project web-site (image.llnl.gov) provide a search interface to the database.

  11. Sub-meter desiccation crack patterns imaged by Curiosity at Gale Crater on Mars shed additional light on former lakes evident from examined outcrops

    NASA Astrophysics Data System (ADS)

    Hallet, B.; Sletten, R. S.; Mangold, N.; Oehler, D. Z.; Williams, R. M. E.; Bish, D. L.; Heydari, E.; Rubin, D. M.; Rowland, S. K.

    2015-12-01

    Small-scale desiccation crack patterns (mudcrack-like arrays of uniform ~0.1 to 1 m polygonal domains separated by linear or curving cracks in exposed bedding) imaged by Curiosity in Gale Crater, Mars complement a wealth of diverse data obtained from exposures of sedimentary rocks that point to deposition "in fluvial, deltaic, and lacustrine environments" including an "intracrater lake system likely [to have] existed intermittently for thousands to millions of years …" (e.g. Grotzinger et al., 2015, Science, submitted). We interpret these mudcrack-like patterns, found on many of the bedrock exposures imaged by Curiosity, as desiccation cracks that developed in either of two ways: 1) at the soft sediment-air interface, like common mudcracks, or 2) at or below the sediment-water interface by synaeresis or diastasis (involving differential compaction). In the context of recent studies of terrestrial mudcracks, and of cracks formed experimentally in various wet powders as they lose moisture, these desiccation features reflect diverse aspects of the formative environment. If they formed as mudcracks, some of the lakes were shallow enough to permit the recurrent drying and wetting that can lead to the geometric regularity characteristic of several sets of mudcracks. Moreover, the water likely contained little suspended sediment; otherwise, the mudcracks would have been buried too rapidly for the crack pattern to persist and mature into regular polygonal patterns. The preservation of these desiccation crack patterns does not require, but does not exclude, deep burial and exhumation. Although invisible from satellite because of their size, a multitude of Mastcam and Navcam images reveals these informative features in considerable detail. These images complement much evidence, mostly from HiRISE data from several regions, suggesting that potential desiccation polygons on larger scales may be more common on the surface of Mars than generally recognized.

  12. Big, Dark Dunes Northeast of Syrtis Major

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Big sand dunes! Mars is home to some very large, windblown dunes. The dunes shown here rise to almost 100 meters (275 feet) at their crests. Unlike dunes on Earth, the larger dunes of Mars are composed of dark, rather than light grains. This is probably related to the composition of the sand, since different materials will have different brightnesses. For example, beaches on the island of Oahu in Hawaii are light colored because they consist of ground-up particles of seashells, while beaches in the southern shores of the island of Hawaii (the 'Big Island' in the Hawaiian island chain) are dark because they consist of sand derived from dark lava rock.

    The dunes in this picture taken by the Mars Orbiter Camera (MOC) are located on the floor of an old crater, 72 km (45 mi) in diameter, northeast of Syrtis Major. The sand is being blown from the upper right toward the lower left. The surface that the dunes have been travelling across is pitted and cratered. The substrate is also hard and bright--i.e., it is composed of a material of different composition than the sand in the dunes. The dark streaks on the dune surfaces are a puzzle...at first glance one might conclude they are the result of holiday visitors with off-road vehicles. However, the streaks more likely result from passing dust devils or wind gusts that disturb the sand surface just enough to leave a streak. The image shown here covers an area approximately 2.6 km (1.6 mi) wide, and is illuminated from the lower right.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  13. Optical design of high-order adaptive optics for the NSO Dunn Solar Telescope and the Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Ren, Deqing; Hegwer, Steven L.; Rimmele, Thomas; Didkovsky, Leonid V.; Goode, Philip R.

    2003-02-01

    The National Solar Observatory (NSO) and the New Jersey Institute of Technology are jointly developing high-order solar Adaptive Optics (AO) to be deployed at both the Dunn Solar Telescope (DST) and the Big Bear Solar Telescope (BBST). These AO systems are expected to deliver first light at the end of 2003. We discuss the AO optical designs for both the DST and the BBST. The requirements for the optical design of the AO system are as follows: the optics must deliver diffraction-limited imaging at visible and near-infrared wavelengths over a 190"×190" field of view. The focal plane image must be flat over the entire field of view to accommodate a long slit and fast spectrograph. The wave-front sensor must be able to lock on solar structures such as granulation. Finally, the cost for the optical system must fit the limited budget. Additional design considerations are the desired high bandwidth for tip/tilt correction, which leads to a small, fast, off-the-shelf tip-tilt mirror system, and high throughput, i.e., a minimal number of optical surfaces. In order to eliminate pupil image wander on the wave-front sensor, both the deformable mirror and tip-tilt mirror are located at conjugate images of the telescope pupil. We discuss the details of the optical design for the high-order AO system, which will deliver high-resolution images over the 0.39-1.6 μm wavelength range.
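
    For context on the diffraction-limited requirement, the Rayleigh criterion θ ≈ 1.22 λ/D sets the target resolution. The aperture used below is an assumption (about 0.76 m is commonly quoted for the DST), not a figure from this paper:

      import math

      def diffraction_limit_arcsec(wavelength_m, aperture_m):
          """Rayleigh criterion, theta = 1.22 * lambda / D, in arcsec."""
          return math.degrees(1.22 * wavelength_m / aperture_m) * 3600.0

      # Aperture is an assumption (~0.76 m, often quoted for the DST).
      for wavelength in (0.39e-6, 1.6e-6):   # ends of the design range
          print(wavelength, diffraction_limit_arcsec(wavelength, 0.76))
      # ~0.13 arcsec at 390 nm, ~0.53 arcsec at 1.6 um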

  14. The Big Apple's Core: Exploring Manhattan

    ERIC Educational Resources Information Center

    Groce, Eric C.; Groce, Robin D.; Colby, Susan

    2005-01-01

    Children are exposed to a wide variety of images related to New York City through various media outlets. They may have seen glimpses of Manhattan by watching movies such as Spiderman or Stuart Little or by taking in annual television events such as the Macy's Thanksgiving Day Parade or the Times Square New Year's Eve celebration. Additionally,…

  15. Two-Dimensional Spectroscopy at Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Denker, Carsten; Deng, N.; Tritschler, A.

    2006-06-01

    Two-dimensional spectroscopy is an important tool to measure the physical parameters related to solar activity in both the photosphere and chromosphere. We present a description of the visible-light post-focus instrumentation at the Big Bear Solar Observatory (BBSO) including adaptive optics and image restoration. We report the first science observations obtained with two-dimensional spectroscopy during the 2005 observing season. In particular we discuss the properties of flows associated with a small delta-spot in solar active region NOAA 10756.

  16. [Big data in medicine and healthcare].

    PubMed

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today is often too large and heterogeneous and changes too quickly to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are increasingly executed electronically, consumers produce more and more data themselves (e.g., in social networks), and digitalization is ever increasing. Currently, several new trends towards new data sources and innovative data analysis are appearing in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data, and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and to organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare. PMID:26063521

  17. Pathfinder Landing Site Observed by Mars Orbiter Camera - 'Big Crater' in Stereo View

    NASA Technical Reports Server (NTRS)

    1998-01-01

    On its 256th orbit of Mars, the camera on-board the Mars Global Surveyor spacecraft successfully observed the vicinity of the Pathfinder landing site. The images shown are a stereoscopic image pair in anaglyph format, made from the overlapping area of MOC 25603 and 23703. This image is reproduced at a scale of 5 m (16.4 feet) per pixel. Image 23703 was acquired on 13 April at 7:50 AM PDT; Image 25603 was acquired on 22 April at 1:11 PM PDT. The P237 observation was made from a distance of 675 km while the P256 measurement was made from 800 km. The viewing angle for 23703 was 21.2°, for 25603, 30.67°, giving an angular difference of about 9.5°. Owing to the relief on 'Big Crater,' this relatively small angular difference was in this case sufficient to show good stereo parallax.

    The resolution of the MOC image that covered the Pathfinder landing site (MOC 25603) was about 3.3 m or 11 feet per pixel. The Pathfinder lander and airbags form a roughly equilateral triangle 5 m on a side. Noting that the camera has not yet been focussed (it needs to be in the stable temperature conditions of the low altitude, circular mapping orbit in order to achieve best focus) and the hazy atmospheric conditions, the effective scale of the image is probably closer to 5 m (16.4 feet). Thus, the scale of the image was insufficient to resolve the lander (more than one pixel is needed to resolve a feature). In addition, the relatively high sun angle of the image (the sun was 40° above the horizon) reduced the length of shadows (for example, only a few boulders are seen), also decreasing the ability to discriminate small features. Work continues to locate intermediate-scale features in the lander and orbiter images in the hope of identifying the precise landing site based on these comparisons.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA.

  18. Big Earth observation data analytics for land use and land cover change information

    NASA Astrophysics Data System (ADS)

    Câmara, Gilberto

    2015-04-01

    Current scientific methods for extracting information from Earth observation data lag far behind our capacity to build complex satellites. In response to this challenge, our work explores a new type of knowledge platform to improve the extraction of land use and land cover change information from big Earth Observation data sets. We take a space-time perspective of Earth Observation data, considering that each sensor revisits the same place at regular intervals. Sensor data can, in principle, be calibrated so that observations of the same place at different times are comparable, and each measurement from a sensor is mapped into a three-dimensional array in space-time. To fully enable the use of space-time arrays for working with Earth Observation data, we use the SciDB array database. Arrays naturally fit the data structure of Earth Observation images, breaking the image-as-a-snapshot paradigm. Thus, entire collections of images can be stored as multidimensional arrays. However, array databases do not understand the specific nature of geographical data, and do not capture the meaning and the differences between spatial and temporal dimensions. In our work, we have extended SciDB to include additional information about satellite image metadata, cartographical projections, and time. We are currently developing methods to extract land use and land cover information based on space-time analysis on array databases. Our experiments show these space-time methods give us significant improvements over current space-only remote sensing image processing methods. We have been able to capture tropical forest degradation and forest regrowth and also to distinguish between single-cropping and double-cropping practices in tropical agriculture.
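
    A minimal sketch of the space-time array idea follows, in pure NumPy with toy NDVI curves standing in for a calibrated image time series (a real deployment would read the cube from an array database such as SciDB; the threshold and shapes are illustrative assumptions):

      import numpy as np

      # Space-time cube indexed (time, y, x): 23 composites over one year.
      t = np.linspace(0.0, 1.0, 23)
      one_season = 0.2 + 0.5 * np.clip(np.sin(2 * np.pi * t), 0.0, None)
      two_seasons = 0.2 + 0.5 * np.clip(np.sin(4 * np.pi * t), 0.0, None)

      cube = np.empty((23, 2, 2))
      cube[:, :, 0] = one_season[:, None]    # left column: single cropping
      cube[:, :, 1] = two_seasons[:, None]   # right column: double cropping

      def growing_seasons(series, threshold=0.4):
          """Count upward threshold crossings in one pixel's time series."""
          above = series > threshold
          return int(np.count_nonzero(above[1:] & ~above[:-1]))

      # Apply the per-pixel time-series operation along the time axis.
      print(np.apply_along_axis(growing_seasons, 0, cube))
      # [[1 2]
      #  [1 2]]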

  19. Processing Solutions for Big Data in Astronomy

    NASA Astrophysics Data System (ADS)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
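
    The MapReduce schema mentioned above reduces to a map step that emits key-value pairs, a shuffle that groups them by key, and a reduce step that aggregates each group. A framework-free Python illustration of that flow (word count, the usual toy example):

      from collections import defaultdict
      from itertools import chain

      def map_phase(record):
          # Classic word-count mapper: emit (word, 1) pairs.
          for word in record.split():
              yield word.lower(), 1

      def reduce_phase(key, values):
          return key, sum(values)

      records = ["Big Data frameworks", "big data processing"]

      # Shuffle: group intermediate pairs by key, as the framework would.
      groups = defaultdict(list)
      for key, value in chain.from_iterable(map_phase(r) for r in records):
          groups[key].append(value)

      print(dict(reduce_phase(k, v) for k, v in groups.items()))
      # {'big': 2, 'data': 2, 'frameworks': 1, 'processing': 1}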

  20. Big data and the electronic health record.

    PubMed

    Peters, Steve G; Buntrock, James D

    2014-01-01

    The electronic medical record has evolved from a digital representation of individual patient results and documents to information of large scale and complexity. Big Data refers to new technologies providing management and processing capabilities, targeting massive and disparate data sets. For an individual patient, techniques such as Natural Language Processing allow the integration and analysis of textual reports with structured results. For groups of patients, Big Data offers the promise of large-scale analysis of outcomes, patterns, temporal trends, and correlations. The evolution of Big Data analytics moves us from description and reporting to forecasting, predictive modeling, and decision optimization. PMID:24887521
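
    As a toy illustration of integrating textual reports with structured results, the snippet below pulls one numeric finding out of free text with a regular expression. Production clinical NLP pipelines are far more sophisticated, and the report text and field name here are invented:

      import re

      report = "Echocardiogram today. LVEF estimated at 55%. No effusion."

      # Extract a left-ventricular ejection fraction so it can be stored
      # alongside structured results (hypothetical field name).
      match = re.search(r"LVEF\D{0,20}?(\d{1,3})\s*%", report)
      if match:
          print({"lvef_percent": int(match.group(1))})   # {'lvef_percent': 55}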

  1. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-01-04

    The Big Sky Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts during the first performance period fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first Partnership meeting the groundwork was put in place to provide an assessment of capture and storage capabilities for CO₂ utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Complementary to the efforts on evaluation of sources and sinks is the development of the Big Sky Partnership Carbon Cyberinfrastructure (BSP-CC) and a GIS Road Map for the Partnership. These efforts will put in place a map-based integrated information management system for our Partnership, with transferability to the national carbon sequestration effort. The Partnership recognizes the critical importance of measurement, monitoring, and verification (MMV) technologies to support not only carbon trading but other policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts begun in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best

  2. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO₂ utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification (MMV) technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is

  3. NOAA Big Data Partnership RFI

    NASA Astrophysics Data System (ADS)

    de la Beaujardiere, J.

    2014-12-01

    In February 2014, the US National Oceanic and Atmospheric Administration (NOAA) issued a Big Data Request for Information (RFI) from industry and other organizations (e.g., non-profits, research laboratories, and universities) to assess capability and interest in establishing partnerships to position a copy of NOAA's vast data holdings in the Cloud, co-located with easy and affordable access to analytical capabilities. This RFI was motivated by a number of concerns. First, NOAA's data facilities do not necessarily have sufficient network infrastructure to transmit all available observations and numerical model outputs to all potential users, or sufficient infrastructure to support simultaneous computation by many users. Second, the available data are distributed across multiple services and data facilities, making it difficult to find and integrate data for cross-domain analysis and decision-making. Third, large datasets require users to have substantial network, storage, and computing capabilities of their own in order to fully interact with and exploit the latent value of the data. Finally, there may be commercial opportunities for value-added products and services derived from our data. Putting a working copy of data in the Cloud outside of NOAA's internal networks and infrastructures should reduce demands and risks on our systems, and should enable users to interact with multiple datasets and create new lines of business (much like the industries built on government-furnished weather or GPS data). The NOAA Big Data RFI therefore solicited information on technical and business approaches regarding possible partnership(s) that -- at no net cost to the government and minimum impact on existing data facilities -- would unleash the commercial potential of its environmental observations and model outputs. NOAA would retain the master archival copy of its data. Commercial partners would not be permitted to charge fees for access to the NOAA data they receive, but

  4. Big Bang Nucleosynthesis in the New Cosmology

    SciTech Connect

    Fields, Brian D.

    2008-01-24

    Big bang nucleosynthesis (BBN) describes the production of the lightest elements in the first minutes of cosmic time. We review the physics of cosmological element production, and the observations of the primordial element abundances. The comparison between theory and observation has heretofore provided our earliest probe of the universe, and given the best measure of the cosmic baryon content. However, BBN has now taken a new role in cosmology, in light of new precision measurements of the cosmic microwave background (CMB). Recent CMB anisotropy data yield a wealth of cosmological parameters; in particular, the baryon-to-photon ratio η = n_B/n_γ is measured to high precision. The confrontation between the BBN and CMB "baryometers" poses a new and stringent test of the standard cosmology; the status of this test is discussed. Moreover, it is now possible to recast the role of BBN by using the CMB to fix the baryon density and even some light element abundances. This strategy sharpens BBN into a more powerful probe of early universe physics, and of galactic nucleosynthesis processes. The impact of the CMB results on particle physics beyond the Standard Model, and on non-standard cosmology, is illustrated. Prospects for improvement of these bounds via additional astronomical observations and nuclear experiments are discussed, as is the lingering "lithium problem."
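
    For orientation, the baryon-to-photon ratio quoted here maps onto the CMB-measured baryon density through a standard numerical conversion (a textbook relation, not specific to this paper):

      \eta_{10} \equiv 10^{10}\, n_B / n_\gamma \simeq 273.9\, \Omega_b h^2,

    so a CMB value of \Omega_b h^2 \approx 0.022 corresponds to \eta_{10} \approx 6.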

  5. ATLAS: Big Data in a Small Package

    NASA Astrophysics Data System (ADS)

    Denneau, Larry; Tonry, John

    2015-08-01

    For even small telescope projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (ATLAS; Tonry 2011) will robotically survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids (NEAs) on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards -- two 0.5 m F/2.0 telescopes -- each year the ATLAS system will obtain ~10³ measurements of 10⁹ astronomical sources to a photometric accuracy of <5%. This ever-growing dataset must be searched in real-time for moving objects then archived for further analysis, with alerts for newly discovered NEAs disseminated within tens of minutes of detection. ATLAS's all-sky coverage ensures it will discover many ``rifle shot'' near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly-trailed sources and discriminate them from the thousands of satellites and pieces of space junk that ATLAS will see each night. Additional interrogation will identify interesting phenomena from beyond the solar system occurring over millions of transient sources per night. The data processing and storage requirements for ATLAS demand a ``big data'' approach typical of commercial Internet enterprises. We describe our approach to deploying a nimble, scalable and reliable data processing infrastructure, and promote ATLAS as a steppingstone to eventual processing scales in the era of LSST.
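
    The scale quoted above (~10³ measurements of ~10⁹ sources per year) implies roughly 10¹² catalog rows annually. A back-of-envelope sketch follows; the bytes-per-measurement figure is purely an assumption for illustration, not a number from the paper.

        # Back-of-envelope for the ATLAS detection catalog growth rate.
        # The row size (bytes per photometric measurement) is assumed,
        # not taken from the paper.

        measurements_per_source_per_year = 1e3
        sources = 1e9
        bytes_per_measurement = 100  # assumed: position, flux, errors, metadata

        rows_per_year = measurements_per_source_per_year * sources
        catalog_bytes_per_year = rows_per_year * bytes_per_measurement

        print(f"rows/year: {rows_per_year:.1e}")                           # ~1e12
        print(f"catalog:   {catalog_bytes_per_year / 1e12:.0f} TB/year")   # ~100 TB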

  6. Big bang nucleosynthesis in the new cosmology

    NASA Astrophysics Data System (ADS)

    Fields, B. D.

    2006-03-01

    Big bang nucleosynthesis (BBN) describes the production of the lightest elements in the first minutes of cosmic time. We review the physics of cosmological element production, and the observations of the primordial element abundances. The comparison between theory and observation has heretofore provided our earliest probe of the universe, and given the best measure of the cosmic baryon content. However, BBN has now taken a new role in cosmology, in light of new precision measurements of the cosmic microwave background (CMB). Recent CMB anisotropy data yield a wealth of cosmological parameters; in particular, the baryon-to-photon ratio η = n B/n γ is measured to high precision. The confrontation between the BBN and CMB “baryometers” poses a new and stringent test of the standard cosmology; the status of this test is discussed. Moreover, it is now possible to recast the role of BBN by using the CMB to fix the baryon density and even some light element abundances. This strategy sharpens BBN into a more powerful probe of early universe physics, and of galactic nucleosynthesis processes. The impact of the CMB results on particle physics beyond the Standard Model, and on non-standard cosmology, is illustrated. Prospects for improvement of these bounds via additional astronomical observations and nuclear experiments are discussed, as is the lingering “lithium problem.”

  7. Images.

    ERIC Educational Resources Information Center

    Barr, Catherine, Ed.

    1997-01-01

    The theme of this month's issue is "Images"--from early paintings and statuary to computer-generated design. Resources on the theme include Web sites, CD-ROMs and software, videos, books, and others. A page of reproducible activities is also provided. Features include photojournalism, inspirational Web sites, art history, pop art, and myths. (AEF)

  8. Interoperability Outlook in the Big Data Future

    NASA Astrophysics Data System (ADS)

    Kuo, K. S.; Ramachandran, R.

    2015-12-01

    The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file formats by NASA's Earth Observing System Data Information System (EOSDIS) doubtlessly propelled interoperability of NASA Earth science data to unprecedented heights in the 1990s. Two decades later, however, interoperability still leaves much to be desired. We believe the inadequate interoperability we experience results from the current practice in which data are first packaged into files before distribution, and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution and researcher often has its own preferences in data management practice as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences among these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis services right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center

  9. Integrating BigBOSS with the Mayall Telescope

    NASA Astrophysics Data System (ADS)

    Besuner, Robert; Bebek, Chris; Dey, Arjun; Goble, Will; Joyce, Dick; Levi, Michael E.; Reil, Kevin; Schlegel, David; Sholl, Michael

    2012-09-01

    BigBOSS is a proposed ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of large scale structure. It consists of a fiber-fed multi-object spectrograph designed to be installed on the Mayall 4-meter telescope at Kitt Peak, Arizona. BigBOSS includes an optical corrector assembly and 5000-fiber-positioner focal plane assembly that replace the existing Mayall prime focus hardware. 40-meter-long optical fiber bundles are routed from the focal plane, through the telescope declination and right ascension pivots, to spectrographs in the thermally insulated FTS Laboratory, immediately adjacent to the telescope. Each of the ten spectrographs includes three separate spectral bands. The FTS Laboratory also houses support electronics, cooling, and vacuum equipment. The prime focus assembly includes mounts for the existing Mayall f/8 secondary mirror to allow observations with Cassegrain instruments. We describe the major elements of the BigBOSS instrument, plans for integrating with the Telescope, and proposed modifications and additions to existing Mayall facilities.

  10. The big five personality traits: psychological entities or statistical constructs?

    PubMed

    Franić, Sanja; Borsboom, Denny; Dolan, Conor V; Boomsma, Dorret I

    2014-11-01

    The present study employed multivariate genetic item-level analyses to examine the ontology and the genetic and environmental etiology of the Big Five personality dimensions, as measured by the NEO Five Factor Inventory (NEO-FFI) [Costa and McCrae, Revised NEO personality inventory (NEO PI-R) and NEO five-factor inventory (NEO-FFI) professional manual, 1992; Hoekstra et al., NEO personality questionnaires NEO-PI-R, NEO-FFI: manual, 1996]. Common and independent pathway model comparison was used to test whether the five personality dimensions fully mediate the genetic and environmental effects on the items, as would be expected under the realist interpretation of the Big Five. In addition, the dimensionalities of the latent genetic and environmental structures were examined. Item scores of a population-based sample of 7,900 adult twins (including 2,805 complete twin pairs; 1,528 MZ and 1,277 DZ) on the Dutch version of the NEO-FFI were analyzed. Although both the genetic and the environmental covariance components display a 5-factor structure, applications of common and independent pathway modeling showed that they do not comply with the collinearity constraints entailed in the common pathway model. Implications for the substantive interpretation of the Big Five are discussed. PMID:24162101

  11. bwtool: a tool for bigWig files

    PubMed Central

    Pohl, Andy; Beato, Miguel

    2014-01-01

    BigWig files are a compressed, indexed, binary format for genome-wide signal data for calculations (e.g. GC percent) or experiments (e.g. ChIP-seq/RNA-seq read depth). bwtool is a tool designed to read bigWig files rapidly and efficiently, providing functionality for extracting data and summarizing it in several ways, globally or at specific regions. Additionally, the tool enables the conversion of the positions of signal data from one genome assembly to another, also known as ‘lifting’. We believe bwtool can be useful for the analyst frequently working with bigWig data, which is becoming a standard format to represent functional signals along genomes. The article includes supplementary examples of running the software. Availability and implementation: The C source code is freely available under the GNU public license v3 at http://cromatina.crg.eu/bwtool. Contact: andrew.pohl@crg.eu, andypohl@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24489365
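
    bwtool itself is a C command-line program. As a rough illustration of the kind of per-region summary it produces, here is a sketch using the separate pyBigWig library (assumed to be installed); the file name and interval are hypothetical.

        # Sketch: summarize bigWig signal over a region, similar in spirit to
        # bwtool's summary operations. Uses the pyBigWig library (assumed
        # available); "signal.bw" is a hypothetical input file.
        import pyBigWig

        bw = pyBigWig.open("signal.bw")
        # Mean, min and max of the signal over an example interval
        # (stats() may return None entries if the region has no data).
        mean = bw.stats("chr1", 0, 100000, type="mean")[0]
        lo = bw.stats("chr1", 0, 100000, type="min")[0]
        hi = bw.stats("chr1", 0, 100000, type="max")[0]
        print(f"chr1:0-100000  mean={mean} min={lo} max={hi}")
        bw.close()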

  12. BigFoot: Bayesian alignment and phylogenetic footprinting with MCMC

    PubMed Central

    Satija, Rahul; Novák, Ádám; Miklós, István; Lyngsø, Rune; Hein, Jotun

    2009-01-01

    Background We have previously combined statistical alignment and phylogenetic footprinting to detect conserved functional elements without assuming a fixed alignment. Considering a probability-weighted distribution of alignments removes sensitivity to alignment errors, properly accommodates regions of alignment uncertainty, and increases the accuracy of functional element prediction. Our method utilized standard dynamic programming hidden Markov model algorithms to analyze up to four sequences. Results We present a novel approach, implemented in the software package BigFoot, for performing phylogenetic footprinting on greater numbers of sequences. We have developed a Markov chain Monte Carlo (MCMC) approach which samples both sequence alignments and locations of slowly evolving regions. We implement our method as an extension of the existing StatAlign software package and test it on well-annotated regions controlling the expression of the even-skipped gene in Drosophila and the α-globin gene in vertebrates. The results exhibit how adding additional sequences to the analysis has the potential to improve the accuracy of functional predictions, and demonstrate how BigFoot outperforms existing alignment-based phylogenetic footprinting techniques. Conclusion BigFoot extends a combined alignment and phylogenetic footprinting approach to analyze larger amounts of sequence data using MCMC. Our approach is robust to alignment error and uncertainty and can be applied to a variety of biological datasets. The source code and documentation are publicly available for download from PMID:19715598
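
    BigFoot's actual sampler jointly samples alignments and conservation states; as a generic illustration of the Metropolis-Hastings step that underlies any such MCMC (and emphatically not BigFoot's own code), consider the following sketch with a toy one-dimensional target.

        # Generic Metropolis-Hastings sketch -- illustrates the MCMC machinery
        # BigFoot builds on, not its actual alignment/footprinting sampler.
        import math
        import random

        def metropolis_hastings(log_post, propose, x0, n_steps):
            """Sample from a distribution given only its log-posterior."""
            x, lp = x0, log_post(x0)
            samples = []
            for _ in range(n_steps):
                x_new = propose(x)
                lp_new = log_post(x_new)
                # Accept with probability min(1, posterior ratio).
                if lp_new >= lp or random.random() < math.exp(lp_new - lp):
                    x, lp = x_new, lp_new
                samples.append(x)
            return samples

        # Toy target: standard normal; proposals are small random-walk steps.
        draws = metropolis_hastings(
            log_post=lambda x: -0.5 * x * x,
            propose=lambda x: x + random.gauss(0.0, 0.5),
            x0=0.0,
            n_steps=10000,
        )
        print(sum(draws) / len(draws))  # should be near 0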

  13. Big-bang nucleosynthesis revisited

    NASA Technical Reports Server (NTRS)

    Olive, Keith A.; Schramm, David N.; Steigman, Gary; Walker, Terry P.

    1989-01-01

    The homogeneous big-bang nucleosynthesis yields of D, He-3, He-4, and Li-7 are computed taking into account recent measurements of the neutron mean-life as well as updates of several nuclear reaction rates which primarily affect the production of Li-7. The extraction of primordial abundances from observation and the likelihood that the primordial mass fraction of He-4, Y_p, is less than or equal to 0.24 are discussed. Using the primordial abundances of D + He-3 and Li-7 we limit the baryon-to-photon ratio (η in units of 10⁻¹⁰) to 2.6 ≤ η₁₀ ≤ 4.3, which we use to argue that baryons contribute between 0.02 and 0.11 to the critical energy density of the universe. An upper limit to Y_p of 0.24 constrains the number of light neutrinos to N_ν ≤ 3.4, in excellent agreement with the LEP and SLC collider results. We turn this argument around to show that the collider limit of 3 neutrino species can be used to bound the primordial abundance of He-4: 0.235 ≤ Y_p ≤ 0.245.

  14. Heat Waves Pose Big Health Threats

    MedlinePlus

    Heat Waves Pose Big Health Threats: kids and the elderly are among those at greatest risk.

  15. Cosmic relics from the big bang

    SciTech Connect

    Hall, L.J.

    1988-12-01

    A brief introduction to the big bang picture of the early universe is given. Dark matter is discussed; particularly its implications for elementary particle physics. A classification scheme for dark matter relics is given. 21 refs., 11 figs., 1 tab.

  16. Do Big Bottles Kickstart Infant Weight Issues?

    MedlinePlus

    ... baby bottles might help prevent early obesity in formula-fed infants, study suggests. TUESDAY, June 7, 2016 (HealthDay News) -- Feeding babies formula from a big bottle might put them at ...

  17. Quantum nature of the big bang.

    PubMed

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime. PMID:16712061

  18. The caBIG terminology review process.

    PubMed

    Cimino, James J; Hayamizu, Terry F; Bodenreider, Olivier; Davis, Brian; Stafford, Grace A; Ringwald, Martin

    2009-06-01

    The National Cancer Institute (NCI) is developing an integrated biomedical informatics infrastructure, the cancer Biomedical Informatics Grid (caBIG), to support collaboration within the cancer research community. A key part of the caBIG architecture is the establishment of terminology standards for representing data. In order to evaluate the suitability of existing controlled terminologies, the caBIG Vocabulary and Data Elements Workspace (VCDE WS) working group has developed a set of criteria that serve to assess a terminology's structure, content, documentation, and editorial process. This paper describes the evolution of these criteria and the results of their use in evaluating four standard terminologies: the Gene Ontology (GO), the NCI Thesaurus (NCIt), the Common Terminology Criteria for Adverse Events (known as CTCAE), and the laboratory portion of the Logical Observation Identifiers Names and Codes (LOINC). The resulting caBIG criteria are presented as a matrix that may be applicable to any terminology standardization effort. PMID:19154797

  19. Big data in nephrology: friend or foe?

    PubMed

    Ketchersid, Terry

    2013-01-01

    The phrase 'big data' has arrived in today's lexicon with great fanfare and some degree of hyperbole. Generally speaking, big data refer to data sets that are too complex to be successfully interrogated using standard statistical software. A wide variety of business sectors has utilized big data to garner competitive advantage within their respective markets. Medicine and nephrology, in particular, have been late to this table. This is beginning to change, however, as data scientists begin to work with these large data sets, developing predictive models that permit us to peer into the future. Coupled with an expanding understanding of genomics, predictive models constructed with the assistance of big data may soon provide us with a powerful tool to use as we provide care to patients with renal disease. PMID:24496185

  20. 76 FR 7837 - Big Rivers Electric Corporation; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing Take notice that on February 4, 2011, Big Rivers Electric Corporation (Big Rivers) filed a notice of cancellation of its Second Revised and Restated Open Access Transmission Tariff. Big Rivers also requests waiver of the...

  1. 3. OVERVIEW CONTEXTUAL VIEW OF BIG CREEK NO. 3 COMPLEX ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. OVERVIEW CONTEXTUAL VIEW OF BIG CREEK NO. 3 COMPLEX SHOWING SWITCHRACKS AND SUPPORT BUILDINGS TO PHOTO RIGHT OF POWERHOUSE, SAN JOAQUIN RIVER FLOWING IN PHOTO CENTER TO LOWER RIGHT, AND PENSTOCKS AND STANDPIPES IN BACKGROUND ABOVE POWERHOUSE. VIEW TO EAST. - Big Creek Hydroelectric System, Powerhouse 3 Penstock Standpipes, Big Creek, Big Creek, Fresno County, CA

  2. 2. CONTEMPORARY PHOTOGRAPH OF BIG CREEK POWERHOUSE NO. 3 TAKEN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. CONTEMPORARY PHOTOGRAPH OF BIG CREEK POWERHOUSE NO. 3 TAKEN FROM SAME ANGLE AS CA-167-X-1. THREE ORIGINAL PENSTOCKS PLUS FOURTH AND FIFTH PENSTOCKS (VISIBLE TO LEFT OF ORIGINAL THREE), AND THREE ORIGINAL STANDPIPES COUPLED TO FOURTH STANDPIPE SHOWN BEHIND AND ABOVE POWERHOUSE BUILDING. VIEW TO NORTHEAST. - Big Creek Hydroelectric System, Powerhouse 3 Penstock Standpipes, Big Creek, Big Creek, Fresno County, CA

  3. A comparison of 3D poly(ε-caprolactone) tissue engineering scaffolds produced with conventional and additive manufacturing techniques by means of quantitative analysis of SR μ-CT images

    NASA Astrophysics Data System (ADS)

    Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.

    2013-07-01

    The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in order to guarantee its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in order to evaluate the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques for this purpose. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a non-trivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. With the first technique it is possible to produce scaffolds with random, non-regular, rounded pore geometry. The AM technique instead is able to produce scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted and the resulting model can be used for the validation of the applied imaging and image analysis protocols. Here we report an SR μ-CT image analysis approach that effectively and accurately reveals the differences in the pore- and throat-size distributions as well as the connectivity of both AM and SCPL scaffolds.

  4. Big Red Eye is Ready

    NASA Astrophysics Data System (ADS)

    2007-01-01

    The world's biggest infrared camera for Europe's newest telescope left the UK today for Chile. The 67 million pixel camera will equip VISTA - a UK-provided survey telescope being constructed in Chile for ESO. VISTA will map the infrared sky faster than any previous telescope, studying areas of the Universe that are hard to see in the visible due to either their cool temperature, surrounding dust or high redshift. [ESO PR Photo 04a/07: The VISTA Camera] The 2.9-tonne VISTA camera has been designed and built by a consortium including the CCLRC Rutherford Appleton Laboratory, the UK Astronomy Technology Centre (UK ATC) in Edinburgh and the University of Durham. "The camera operates under vacuum at a temperature of -200 degrees Celsius, so in many ways it has been like designing an instrument for use in space, but with the additional constraint of having to survive an earthquake environment," said Kim Ward, the Camera Manager from the Rutherford Appleton Laboratory, who oversaw the technical challenges. "With a total of 67 million pixels, VISTA has a much larger number of infrared sensitive detectors than previous infrared instruments." VISTA is due to start scientific operations in the last quarter of 2007. "VISTA will be able to take images of sky areas each about 3 times as large as the full Moon," said Jim Emerson of Queen Mary, University of London, UK and VISTA's Principal Investigator. "This means it can survey quickly. The camera is crucial to carrying out VISTA's surveys which will provide statistical samples of objects and at the same time locate and characterise rare and variable objects, and perhaps most tantalisingly make discoveries of the as-yet unknown." The 4-m VISTA will survey large areas of the southern sky at near-infrared wavelengths to study objects that are not seen easily in optical light either because they are too cool, or are surrounded by dust (which infrared light penetrates much better than optical), or whose optical

  5. COBE looks back to the Big Bang

    NASA Technical Reports Server (NTRS)

    Mather, John C.

    1993-01-01

    An overview is presented of NASA-Goddard's Cosmic Background Explorer (COBE), the first NASA satellite designed to observe the primeval explosion of the universe. The spacecraft carries three extremely sensitive IR and microwave instruments designed to measure the faint residual radiation from the Big Bang and to search for the formation of the first galaxies. COBE's far IR absolute spectrophotometer has shown that the Big Bang radiation has a blackbody spectrum, proving that there was no large energy release after the explosion.

  6. Dark energy, wormholes, and the big rip

    SciTech Connect

    Faraoni, V.; Israel, W.

    2005-03-15

    The time evolution of a wormhole in a Friedmann universe approaching the big rip is studied. The wormhole is modeled by a thin spherical shell accreting the superquintessence fluid--two different models are presented. Contrary to recent claims that the wormhole overtakes the expansion of the universe and engulfs it before the big rip is reached, it is found that the wormhole becomes asymptotically comoving with the cosmic fluid and the future evolution of the universe is fully causal.

  7. Data Confidentiality Challenges in Big Data Applications

    SciTech Connect

    Yin, Jian; Zhao, Dongfang

    2015-12-15

    In this paper, we address the problem of data confidentiality in big data analytics. In many fields, many useful patterns can be extracted by applying machine learning techniques to big data. However, data confidentiality must be protected. In many scenarios, data confidentiality could well be a prerequisite for data to be shared. We present a scheme to provide provably secure data confidentiality and discuss various techniques to optimize the performance of such a system.

  8. Quality of Big Data in Healthcare

    DOE PAGES

    Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay

    2015-01-01

    The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.

  9. Big-Data RHEED analysis for understanding epitaxial film growth processes

    SciTech Connect

    Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P; Kalinin, Sergei V

    2014-10-28

    Reflection high energy electron diffraction (RHEED) has by now become a standard tool for in-situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite the widespread adoption and the wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the dataset are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of RHEED image sequences. This approach is illustrated for growth of LaₓCa₁₋ₓMnO₃ films grown on etched (001) SrTiO₃ substrates, but is universal. The multivariate methods, including principal component analysis and k-means clustering, provide insight into the relevant behaviors, the timing and nature of a disordered-to-ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, these studies open the pathway to using forward prediction methods to potentially allow significantly more control over the growth process and hence final film quality.
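
    A minimal sketch of the multivariate pipeline described above, applied to a stack of frames flattened into vectors. Synthetic random frames stand in for real RHEED images, and scikit-learn is assumed to be available; this is illustrative, not the authors' code.

        # Sketch of PCA + k-means on an image sequence, in the spirit of the
        # multivariate RHEED analysis described above. Synthetic random frames
        # stand in for real data; scikit-learn is assumed to be installed.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        n_frames, height, width = 500, 64, 64
        frames = np.random.rand(n_frames, height, width)  # stand-in RHEED stack

        X = frames.reshape(n_frames, -1)                  # one row per frame

        # Reduce each frame to a handful of principal components...
        pca = PCA(n_components=5)
        scores = pca.fit_transform(X)

        # ...then cluster frames to look for distinct growth regimes.
        labels = KMeans(n_clusters=3, n_init=10).fit_predict(scores)
        print("explained variance:", pca.explained_variance_ratio_.round(3))
        print("frames per cluster:", np.bincount(labels))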

  10. Big Data: Survey, Technologies, Opportunities, and Challenges

    PubMed Central

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  11. Big data: survey, technologies, opportunities, and challenges.

    PubMed

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  12. Installing an additional emission quenching pathway in the design of iridium(III)-based phosphorogenic biomaterials for bioorthogonal labelling and imaging.

    PubMed

    Li, Steve Po-Yam; Yip, Alex Man-Hei; Liu, Hua-Wei; Lo, Kenneth Kam-Wing

    2016-10-01

    We report the synthesis, characterization, photophysical and electrochemical behaviour and biological labelling applications of new phosphorogenic bioorthogonal probes derived from iridium(III) polypyridine complexes containing a 1,2,4,5-tetrazine moiety. In contrast to common luminescent cyclometallated iridium(III) polypyridine complexes, these tetrazine complexes are almost non-emissive due to effective Förster resonance energy transfer (FRET) and/or photoinduced electron transfer (PET) from the excited iridium(III) polypyridine unit to the appended tetrazine moiety. However, they exhibited significant emission enhancement upon reacting with (1R,8S,9s)-bicyclo[6.1.0]non-4-yn-9-ylmethanol (BCN-OH) (ca. 19.5-121.9 fold) and BCN-modified bovine serum albumin (BCN-BSA) (ca. 140.8-1133.7 fold) as a result of the conversion of the tetrazine unit to a non-quenching pyridazine derivative. The complexes were applied to image azide-modified glycans in live cells using a homobifunctional crosslinker, 1,13-bis((1R,8S,9s)-bicyclo[6.1.0]non-4-yn-9-ylmethyloxycarbonylamino)-4,7,10-trioxatridecane (bis-BCN). PMID:27429251

  13. 78 FR 3911 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-17

    ... Fish and Wildlife Service Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN... (CCP) and finding of no significant impact (FONSI) for the environmental assessment (EA) for Big Stone.../FONSI on the planning Web site at http://www.fws.gov/midwest/planning/BigStoneNWR/index.html . A...

  14. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities, private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO₂ utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analyses underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification

  15. Boosting Big National Lab Data

    SciTech Connect

    Kleese van Dam, Kerstin

    2013-02-21

    Introduction: Big data. Love it or hate it, solving the world’s most intractable problems requires the ability to make sense of huge and complex sets of data and do it quickly. Speeding up the process – from hours to minutes or from weeks to days – is key to our success. One major source of such big data is physical experiments. As many will know, these physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to investigate, for example, the validity of new drugs, the root causes of diseases, more efficient energy sources, new materials for everyday goods, and effective methods for environmental cleanup, to find the optimal ingredient composition for chocolate, or to determine how to preserve valuable antiques. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influences the rate of scientific progress, industrial innovation and competitiveness. And gaining new groundbreaking insights, faster, is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house-sized detector systems in large experiments such as the Large Hadron Collider and the ‘Eye of Gaia’ billion-pixel camera detector to high-throughput genome sequencing. These developments have led to an exponential increase in data volumes, rates and variety produced by instruments used for experimental work. This increase is coinciding with a need to analyze the experimental results at the time they are collected. This speed is required to optimize the data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a

  16. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-06-30

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities, private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO₂ utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analyses underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop (see attached agenda). The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement

  17. Mosaicking Mexico - the Big Picture of Big Data

    NASA Astrophysics Data System (ADS)

    Hruby, F.; Melamed, S.; Ressl, R.; Stanley, D.

    2016-06-01

    The project presented in this article aims to create a completely seamless and cloud-free mosaic of Mexico at a resolution of 5 m, using approximately 4,500 RapidEye images. To complete this project in a timely manner and with limited operators, a number of processing architectures were required to handle a data volume of 12 terabytes. This paper will discuss the different operations realized to complete this project, which include preprocessing, mosaic generation and post-mosaic editing. Prior to mosaic generation, it was necessary to filter the 50,000 RapidEye images captured over Mexico between 2011 and 2014 to identify the top candidate images, based on season and cloud cover. Upon selecting the top candidate images, PCI Geomatics' GXL system was used to reproject, color balance and generate seamlines for the output 1 TB+ mosaic. This paper will also discuss innovative techniques used by the GXL for color balancing large volumes of imagery with substantial radiometric differences. Furthermore, post-mosaicking steps, such as exposure correction and cloud and cloud shadow elimination, will be presented.
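
    The GXL's color-balancing algorithms are proprietary; as a generic illustration of one common radiometric normalization building block, here is a minimal histogram-matching sketch in numpy. It is not PCI Geomatics' method, and the toy arrays stand in for real image tiles.

        # Generic histogram matching -- one common building block of radiometric
        # color balancing between overlapping scenes. Not PCI Geomatics' actual
        # algorithm; purely illustrative.
        import numpy as np

        def match_histogram(source: np.ndarray, reference: np.ndarray) -> np.ndarray:
            """Remap `source` band values so its histogram matches `reference`."""
            s_values, s_idx, s_counts = np.unique(
                source.ravel(), return_inverse=True, return_counts=True)
            r_values, r_counts = np.unique(reference.ravel(), return_counts=True)

            # Empirical CDFs of both images.
            s_cdf = np.cumsum(s_counts) / source.size
            r_cdf = np.cumsum(r_counts) / reference.size

            # For each source value, take the reference value at the same quantile.
            matched = np.interp(s_cdf, r_cdf, r_values)
            return matched[s_idx].reshape(source.shape)

        # Toy usage with random 8-bit bands standing in for image tiles.
        src = np.random.randint(0, 200, (256, 256))
        ref = np.random.randint(50, 255, (256, 256))
        balanced = match_histogram(src, ref)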

  18. DIRECT Distances to Nearby Galaxies Using Detached Eclipsing Binaries and Cepheids. VII. Additional Variables in the Field M33A Discovered with Image Subtraction

    NASA Astrophysics Data System (ADS)

    Mochejska, B. J.; Kaluzny, J.; Stanek, K. Z.; Sasselov, D. D.; Szentgyorgyi, A. H.

    2001-04-01

    DIRECT is a project to directly obtain the distances to two Local Group galaxies, M31 and M33, which occupy a crucial position near the bottom of the cosmological distance ladder. As the first step of the DIRECT project, we have searched for detached eclipsing binaries (DEBs) and new Cepheids in the M31 and M33 galaxies with 1 m class telescopes. In this paper, we present a catalog of variable stars discovered in the data from the follow-up observations of the DEB system D33J013346.2+304439.9 in field M33A (α = 23.55°, δ = 30.72°, J2000.0), collected with the Kitt Peak National Observatory's 2.1 m telescope. In our search covering an area of 108 arcmin², we have found 434 variable stars: 63 eclipsing binaries, 305 Cepheids, and 66 other periodic, possible long-period, or nonperiodic variables. Of these variables, 280 are newly discovered, mainly short-period and/or faint Cepheids. Their light curves were extracted using the ISIS image subtraction package. For 85% of the variables, we present light curves in standard V and B magnitudes, with the remaining 15% expressed in units of differential flux. We have discovered a population of first-overtone Cepheid candidates, and for eight of them we present strong arguments in favor of this interpretation. We also report on the detection of a nonlinearity in the KPNO T2KA and T1KA cameras. The catalog of variables, as well as their photometry (~7.8×10⁴ BV measurements) and finding charts, is available electronically via anonymous ftp and the World Wide Web. The complete set of the CCD frames is available upon request. Based on observations obtained with the 2.1 m telescope at the Kitt Peak National Observatory.
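
    ISIS implements optimal PSF-matched image subtraction (fitting a space-varying convolution kernel). As a heavily simplified sketch of the basic idea only -- match the reference to the science frame, subtract, flag strong residuals -- the following replaces the full kernel fit with a single Gaussian blur and a crude flux scaling; it is not the ISIS algorithm.

        # Heavily simplified difference imaging: blur and scale a reference frame
        # to roughly match the science frame, subtract, and flag residuals.
        # ISIS fits a full space-varying kernel; this sketch is illustrative only.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        def difference_image(science, reference, blur_sigma=1.0):
            ref_matched = gaussian_filter(reference, blur_sigma)
            scale = science.sum() / ref_matched.sum()  # crude flux match
            return science - scale * ref_matched

        rng = np.random.default_rng(0)
        ref = rng.normal(100.0, 5.0, (128, 128))
        sci = ref + rng.normal(0.0, 5.0, (128, 128))
        sci[64, 64] += 500.0                           # injected "variable"

        diff = difference_image(sci, ref)
        candidates = np.argwhere(np.abs(diff) > 5 * diff.std())
        print(candidates)  # should include the injected pixel (64, 64)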

  19. DIRECT Distances to Nearby Galaxies Using Detached Eclipsing Binaries and Cepheids. VIII. Additional Variables in the Field M33B Discovered with Image Subtraction

    NASA Astrophysics Data System (ADS)

    Mochejska, B. J.; Kaluzny, J.; Stanek, K. Z.; Sasselov, D. D.; Szentgyorgyi, A. H.

    2001-11-01

    DIRECT is a project to obtain directly the distances to two Local Group galaxies, M31 and M33, which occupy a crucial position near the bottom of the cosmological distance ladder. As the first step of the DIRECT project we have searched for detached eclipsing binaries (DEBs) and new Cepheids in the M31 and M33 galaxies with 1 m class telescopes. In this eighth paper we present a catalog of variable stars discovered in the data from the follow-up observations of DEB system D33J013337.0+303032.8 in field M33B (α = 23.48°, δ = 30.57°, J2000.0), collected with the Kitt Peak National Observatory 2.1 m telescope. In our search covering an area of 108 arcmin² we have found 895 variable stars: 96 eclipsing binaries, 349 Cepheids, and 450 other periodic, possibly long-period or nonperiodic variables. Of these variables 612 are newly discovered. Their light curves were extracted using the ISIS image subtraction package. For 77% of the variables we present light curves in standard V and B magnitudes, with the remaining 23% expressed in units of differential flux. We have discovered a population of first-overtone Cepheid candidates, and for six of them we present strong arguments in favor of this interpretation. The catalog of variables, as well as their photometry (about 9.2×10⁴ BV measurements) and finding charts, is available electronically via anonymous ftp and the World Wide Web. The complete set of the CCD frames is available upon request. Based on observations obtained with the 2.1 m telescope at Kitt Peak National Observatory, National Optical Astronomy Observatory, which is operated by the Association of Universities for Research in Astronomy, Inc., under cooperative agreement with the National Science Foundation.

  20. Big bang nucleosynthesis: Present status

    NASA Astrophysics Data System (ADS)

    Cyburt, Richard H.; Fields, Brian D.; Olive, Keith A.; Yeh, Tsung-Han

    2016-01-01

    Big bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. A brief overview of the essentials of this physics is given, and new calculations presented of light-element abundances through ⁶Li and ⁷Li, with updated nuclear reactions and uncertainties including those in the neutron lifetime. Fits are provided for these results as a function of baryon density and of the number of neutrino flavors Nν. Recent developments are reviewed in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom Neff. These measurements allow for a tight test of BBN and cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. A new determination of the primordial helium abundance is included in our likelihood analysis. New D/H observations are now more precise than the corresponding theoretical predictions and are consistent with the standard model and the Planck baryon density. Moreover, D/H now provides a tight measurement of Nν when combined with the CMB baryon density and provides a 2σ upper limit Nν < 3.2. The new precision of the CMB and D/H observations together leaves D/H predictions as the largest source of uncertainties. Future improvement in BBN calculations will therefore rely on improved nuclear cross-section data. In contrast with D/H and ⁴He, ⁷Li predictions continue to disagree with observations, perhaps pointing to new physics. This paper concludes with a look at future directions including key nuclear reactions, astronomical observations, and theoretical issues.

  1. The New Solar Telescope at Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Denker, C.; Marquette, W. H.; Varsik, J.; Wang, H.; Goode, P. R.; Moretto, G.; Kuhn, J.; Coulter, R.

    2004-05-01

    The New Solar Telescope (NST) at Big Bear Solar Observatory is the replacement of the current 65 cm vacuum telescope. We present the optical design of this novel off-axis telescope with a 1.6 m clear aperture. The NST has been designed to exploit the excellent seeing conditions at a lake-site observatory and provide data with a spatial resolution close to the telescope's diffraction limit from the visible to the near-infrared (NIR) wavelength region. The post-focus instrumentation is located in the Coudé-room, a new optical laboratory below the observing floor, which also hosts a high-order adaptive optics system. The main instruments are two imaging spectro-polarimeters for visible and NIR observations and a real-time image reconstruction system for visible-light multi-color photometry. This unique combination of instruments will realize its full potential in the studies of active region evolution and space weather forecasts.

  2. Infrared Observations from the New Solar Telescope at Big Bear

    NASA Astrophysics Data System (ADS)

    Goode, Philip R.; Cao, Wenda

    2013-10-01

    The 1.6 m clear aperture solar telescope in Big Bear is operational and with its adaptive optics (AO) system it provides diffraction limited solar imaging and polarimetry in the near-infrared (NIR). While the AO system is being upgraded to provide diffraction limited imaging at bluer wavelengths, the instrumentation and observations are concentrated in the NIR. The New Solar Telescope (NST) operates in campaigns, making it the ideal ground-based telescope to provide complementary/supplementary data to SDO and Hinode. The NST makes photometric observations in Hα (656.3 nm) and TiO (705.6 nm) among other lines. As well, the NST collects vector magnetograms in the 1565 nm lines and is beginning such observations in 1083.0 nm. Here we discuss the relevant NST instruments, including AO, and present some results that are germane to NASA solar missions.

  3. Big biomedical data as the key resource for discovery science.

    PubMed

    Toga, Arthur W; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan D; Glusman, Gustavo; Heavner, Benjamin D; Dinov, Ivo D; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-11-01

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an "-ome to home" approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center's computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson's and Alzheimer's. PMID:26198305

  4. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell

    2014-01-01

    Additive manufacturing is a rapid prototyping technology that allows parts to be built in a series of thin layers from plastic, ceramics, and metallics. Metallic additive manufacturing is an emerging form of rapid prototyping that allows complex structures to be built using various metallic powders. Significant time and cost savings have also been observed using metallic additive manufacturing compared with traditional techniques. Development of the metallic additive manufacturing technology has advanced significantly over the last decade, although many of the techniques to inspect parts made from these processes have not advanced significantly or have limitations. Several external geometry inspection techniques exist, such as coordinate measurement machines (CMM), laser scanners, structured light scanning systems, or even traditional calipers and gages. All of the aforementioned techniques are limited to external geometry and contours or must use a contact probe to inspect limited internal dimensions. This presentation will document the development of a real-time dimensional inspection technique and digital quality record for the additive manufacturing process using infrared camera imaging and processing techniques.
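
    A minimal sketch of the kind of dimensional check such an infrared pipeline might perform -- threshold a thermal frame and measure the hot region's extent. The frame, threshold, and pixel calibration are all synthetic assumptions, not details from the presentation.

        # Sketch of a dimensional check on a thermal frame: threshold the hot
        # region (the freshly deposited track) and measure its size in pixels.
        # Synthetic frame and calibration; not the presentation's actual pipeline.
        import numpy as np

        frame = np.full((240, 320), 25.0)   # background at ~25 C (assumed)
        frame[100:140, 80:240] = 300.0      # hot deposited track (assumed)

        hot = frame > 150.0                 # threshold between the two states
        rows, cols = np.nonzero(hot)
        height_px = rows.max() - rows.min() + 1
        width_px = cols.max() - cols.min() + 1

        mm_per_pixel = 0.2                  # assumed optical calibration
        print(f"track: {width_px * mm_per_pixel:.1f} mm x "
              f"{height_px * mm_per_pixel:.1f} mm")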

  5. Big Data in Caenorhabditis elegans: quo vadis?

    PubMed Central

    Hutter, Harald; Moerman, Donald

    2015-01-01

    A clear definition of what constitutes “Big Data” is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of “complete” data sets for this organism is actually rather small—not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein–protein interaction—important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. PMID:26543198

  6. Making a Difference in Schools: The Big Brothers Big Sisters School-Based Mentoring Impact Study

    ERIC Educational Resources Information Center

    Herrera, Carla; Grossman, Jean Baldwin; Kauh, Tina J.; Feldman, Amy F.; McMaken, Jennifer

    2007-01-01

    School-based mentoring is one of the fastest growing forms of mentoring in the US today; yet, few studies have rigorously examined its impacts. This landmark random assignment impact study of Big Brothers Big Sisters School-Based Mentoring is the first national study of this program model. It involves 10 agencies, 71 schools and 1,139 9- to…

  7. Making a Difference. An Impact Study of Big Brothers/Big Sisters.

    ERIC Educational Resources Information Center

    Tierney, Joseph P.; And Others

    This report provides reliable evidence that mentoring programs can positively affect young people. The evidence is derived from research conducted at local affiliates of Big Brothers/Big Sisters of America (BB/BSA), the oldest, best-known, and arguably most sophisticated of the country's mentoring programs. Public/Private Ventures, Inc. conducted…

  8. Serving, Learning and Mentoring through the Big Brothers Big Sisters Program

    ERIC Educational Resources Information Center

    Sivukamaran, Thillainatarajan; Holland, Glenda; Clark, Leonard J.

    2010-01-01

    This study describes the collaborative partnership between a Big Brothers Big Sisters organization, an elementary school and the College of Education at a public university. The partnership utilized a mentoring system consisting of elementary students, college students, elementary teachers and university faculty. Benefits of the various…

  9. Enhancement of β-catenin activity by BIG1 plus BIG2 via Arf activation and cAMP signals.

    PubMed

    Li, Chun-Chun; Le, Kang; Kato, Jiro; Moss, Joel; Vaughan, Martha

    2016-05-24

    Multifunctional β-catenin, with critical roles in both cell-cell adhesion and Wnt-signaling pathways, was among HeLa cell proteins coimmunoprecipitated by antibodies against brefeldin A-inhibited guanine nucleotide-exchange factors 1 and 2 (BIG1 or BIG2) that activate ADP-ribosylation factors (Arfs) by accelerating the replacement of bound GDP with GTP. BIG proteins also contain A-kinase anchoring protein (AKAP) sequences that can act as scaffolds for multimolecular assemblies that facilitate and limit cAMP signaling temporally and spatially. Direct interaction of BIG1 N-terminal sequence with β-catenin was confirmed using yeast two-hybrid assays and in vitro synthesized proteins. Depletion of BIG1 and/or BIG2 or overexpression of guanine nucleotide-exchange factor inactive mutant, but not wild-type, proteins interfered with β-catenin trafficking, leading to accumulation at perinuclear Golgi structures. Both phospholipase D activity and vesicular trafficking were required for effects of BIG1 and BIG2 on β-catenin activation. Levels of PKA-phosphorylated β-catenin S675 and β-catenin association with PKA, BIG1, and BIG2 were also diminished after BIG1/BIG2 depletion. Inferring a requirement for BIG1 and/or BIG2 AKAP sequence in PKA modification of β-catenin and its effect on transcription activation, we confirmed dependence of S675 phosphorylation and transcription coactivator function on BIG2 AKAP-C sequence. PMID:27162341

  10. Small Things Draw Big Interest

    ERIC Educational Resources Information Center

    Green, Susan; Smith III, Julian

    2005-01-01

    Although the microscope is a basic tool in both physical and biological sciences, it is notably absent from most elementary school science programs. One reason teachers find it challenging to introduce microscopy at the elementary level is because children can have a hard time connecting the image of an object seen through a microscope with what…

  11. New Digital Magnetograph at Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Wang, Haimin; Denker, Carsten; Spirock, Thomas; Yang, Shu; Goode, Philip

    1997-05-01

    A new magnetograph system has been installed and tested at Big Bear Solar Observatory. The system uses part of BBSO's existing VMG system: a quarter-wave plate, a ferro-electric liquid crystal to switch polarizations, and a 0.25 Å bandpass Zeiss filter tuned to Ca I 6103 Å. A 256 by 256 12-bit Dalsa camera is used as the detector and as the driver to switch the liquid crystal. The data rate of the camera is 90 frames/s. The camera is interfaced to a Pentium-166 with a Mutech imaging board for data acquisition and analysis. The computer has 128 MB of RAM; up to 700 live images can be stored in memory for quick post-exposure image processing (image selection and alignment). We have improved the sensitivity and spatial resolution significantly over the old BBSO VMG system for the following reasons: (1) The new digital image data are 12-bit, while the video signal is below 8 bits. Polarizations weaker than 1% cannot be detected by a single pair subtraction in the video system; the digital system can detect a polarization signal below 0.1% by a single pair subtraction. (2) The data rate of the digital system is 90 frames/s, while that of the video system is 30 frames/s, so the time difference between two polarizations is reduced in the new system. Under good seeing conditions, the data rate of 90 frames/s ensures that the wavefront distortions are "frozen" and approximately the same for the left and right circularly polarized image pairs. (3) Magnetograms are constructed after image selection and alignment. The same system has potential for further image processing, e.g., image de-stretch and speckle interferometry. Preliminary results will be presented at the meeting.
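
    The single-pair subtraction described above amounts to a normalized difference of the two circular-polarization frames. A minimal numpy sketch, with synthetic 12-bit-like frames standing in for real camera data:

        # Sketch of the single-pair magnetogram construction described above:
        # circular polarization as a normalized difference of the left- and
        # right-circularly polarized frames. Synthetic frames, illustrative only.
        import numpy as np

        rng = np.random.default_rng(1)
        base = rng.integers(1000, 3000, (256, 256)).astype(float)
        signal = 0.002 * base               # ~0.2% polarization signal (assumed)
        left = base + signal
        right = base - signal

        # V/I ~ (I_L - I_R) / (I_L + I_R); 12-bit depth is what makes
        # sub-percent signals resolvable in a single pair.
        v_over_i = (left - right) / (left + right)
        print(f"mean |V/I| = {np.abs(v_over_i).mean():.4f}")  # ~0.002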

  12. The Confluence of Exascale and Big Data

    NASA Astrophysics Data System (ADS)

    Dosanjh, Sudip

    2014-04-01

    Exascale computing has rightly received considerable attention within the high performance computing community. In many fields, scientific progress requires a thousand-fold increase in supercomputing performance over the next decade. Science needs include performing single simulations that span a large portion of an exascale system, as well as high-throughput computing. The big data problem has also received considerable attention, but is sometimes viewed as being orthogonal to exascale computing. This talk focuses on the confluence of exascale and big data. Exascale and big data face many similar technical challenges, including increasing power/energy constraints, the growing mismatch between computing and data movement speeds, an explosion in concurrency, and the reduced reliability of large computing systems. Even though exascale and data-intensive systems might have different system-level architectures, the fundamental building blocks will be similar. Analyzing all the information produced by exascale simulations will also generate a big data problem. And finally, many experimental facilities are being inundated with large quantities of data as sensors and sequencers improve at rates that surpass Moore's Law. It is becoming increasingly difficult to analyze all of the data from a single experiment, and it is often impossible to make comparisons across data sets. It will only be possible to accelerate scientific discovery if we bring together the high performance computing and big data communities.

  13. Volume and Value of Big Healthcare Data

    PubMed Central

    Dinov, Ivo D.

    2016-01-01

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. Moore's and Kryder's laws, which describe the exponential growth of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions such as: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions. PMID:26998309

  14. ATLAS: Big Data in a Small Package?

    NASA Astrophysics Data System (ADS)

    Denneau, Larry

    2016-01-01

    For even small astronomy projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (Tonry 2011) will survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards - two 0.5 m F/2.0 telescopes - each night the ATLAS system will measure nearly 10⁹ astronomical sources to a photometric accuracy of <5%, totaling 10¹² individual observations over its initial 3-year mission. This ever-growing dataset must be searched in real time for moving objects and transients, then archived for further analysis, and alerts for newly discovered near-Earth asteroids (NEAs) must be disseminated within tens of minutes of detection. ATLAS's all-sky coverage ensures it will discover many `rifle shot' near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly trailed sources and discriminate them from the thousands of low-Earth orbit (LEO) and geosynchronous orbit (GEO) satellites ATLAS will see each night. Additional interrogation will identify interesting phenomena from millions of transient sources per night beyond the solar system. The data processing and storage requirements for ATLAS demand a `big data' approach typical of commercial internet enterprises. We describe our experience in deploying a nimble, scalable and reliable data processing infrastructure, and suggest ATLAS as a stepping stone to the data processing capability needed as we enter the era of LSST.
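
    As a quick sanity check on the totals quoted above (our arithmetic, with an assumed ~365 observing nights per year):

      # ~10^9 sources measured nightly over a 3-year mission -> ~10^12 observations
      sources_per_night = 1e9
      nights = 3 * 365
      print(f"total observations ~ {sources_per_night * nights:.1e}")  # ~1.1e+12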

  15. Breaking Barriers in Polymer Additive Manufacturing

    SciTech Connect

    Love, Lonnie J; Duty, Chad E; Post, Brian K; Lind, Randall F; Lloyd, Peter D; Kunc, Vlastimil; Peter, William H; Blue, Craig A

    2015-01-01

    Additive Manufacturing (AM) enables the creation of complex structures directly from a computer-aided design (CAD). However, limitations have prevented the technology from realizing its full potential: AM has been criticized for being slow and expensive, with limited build size. Oak Ridge National Laboratory (ORNL) has developed a large-scale AM system that improves upon each of these areas by more than an order of magnitude. The Big Area Additive Manufacturing (BAAM) system directly converts low-cost pellets into a large, three-dimensional part at a rate exceeding 25 kg/h. By breaking these traditional barriers, it is possible for polymer AM to penetrate new manufacturing markets.

  16. The caBIG Terminology Review Process

    PubMed Central

    Cimino, James J.; Hayamizu, Terry F.; Bodenreider, Olivier; Davis, Brian; Stafford, Grace A.; Ringwald, Martin

    2009-01-01

    The National Cancer Institute (NCI) is developing an integrated biomedical informatics infrastructure, the cancer Biomedical Informatics Grid (caBIG®), to support collaboration within the cancer research community. A key part of the caBIG architecture is the establishment of terminology standards for representing data. In order to evaluate the suitability of existing controlled terminologies, the caBIG Vocabulary and Data Elements Workspace (VCDE WS) working group has developed a set of criteria that serve to assess a terminology's structure, content, documentation, and editorial process. This paper describes the evolution of these criteria and the results of their use in evaluating four standard terminologies: the Gene Ontology (GO), the NCI Thesaurus (NCIt), the Common Terminology Criteria for Adverse Events (CTCAE), and the laboratory portion of the Logical Observation Identifiers Names and Codes (LOINC). The resulting caBIG criteria are presented as a matrix that may be applicable to any terminology standardization effort. PMID:19154797

  17. Little Big Horn River Water Quality Project

    SciTech Connect

    Bad Bear, D.J.; Hooker, D.

    1995-10-01

    This report summarizes the accomplishments of the Water Quality Project on the Little Big Horn River during the summer of 1995. The majority of the summer was spent collecting data on the Little Big Horn River and then running a number of different tests on the water samples at Little Big Horn College in Crow Agency, Montana. The intention of this study is to perform stream-quality analyses to gain an understanding of the quality of a selected portion of the river, to assess any impact that existing developments may be causing to the environment, and to gather baseline data that will provide information concerning the proposed development. Citizens of the reservation have expressed concern about the quality of water on the reservation: surface water, groundwater, and well water.

  18. Big Science and the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Giudice, Gian Francesco

    2012-03-01

    The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.

  19. Implications of Big Data for cell biology

    PubMed Central

    Dolinski, Kara; Troyanskaya, Olga G.

    2015-01-01

    “Big Data” has surpassed “systems biology” and “omics” as the hottest buzzword in the biological sciences, but is there any substance behind the hype? Certainly, we have learned about various aspects of cell and molecular biology from the many individual high-throughput data sets that have been published in the past 15–20 years. These data, although useful as individual data sets, can provide much more knowledge when interrogated with Big Data approaches, such as applying integrative methods that leverage the heterogeneous data compendia in their entirety. Here we discuss the benefits and challenges of such Big Data approaches in biology and how cell and molecular biologists can best take advantage of them. PMID:26174066

  20. ALMA imaging of gas and dust in a galaxy protocluster at redshift 5.3: [C II] emission in 'typical' galaxies and dusty starbursts ≈1 billion years after the big bang

    SciTech Connect

    Riechers, Dominik A.; Carilli, Christopher L.; Capak, Peter L.; Yan, Lin; Scoville, Nicholas Z.; Smolčić, Vernesa; Schinnerer, Eva; Yun, Min; Cox, Pierre; Bertoldi, Frank; Karim, Alexander

    2014-12-01

    We report interferometric imaging of [C II](²P₃/₂→²P₁/₂) and OH(²Π₁/₂ J = 3/2→1/2) emission toward the center of the galaxy protocluster associated with the z = 5.3 submillimeter galaxy (SMG) AzTEC-3, using the Atacama Large (sub)Millimeter Array (ALMA). We detect strong [C II], OH, and rest-frame 157.7 μm continuum emission toward the SMG. The [C II](²P₃/₂→²P₁/₂) emission is distributed over a scale of 3.9 kpc, implying a dynamical mass of 9.7 × 10¹⁰ M☉, and a star formation rate (SFR) surface density of Σ_SFR = 530 M☉ yr⁻¹ kpc⁻². This suggests that AzTEC-3 forms stars at Σ_SFR approaching the Eddington limit for radiation pressure supported disks. We find that the OH emission is slightly blueshifted relative to the [C II] line, which may indicate a molecular outflow associated with the peak phase of the starburst. We also detect and dynamically resolve [C II](²P₃/₂→²P₁/₂) emission over a scale of 7.5 kpc toward a triplet of Lyman-break galaxies with moderate UV-based SFRs in the protocluster at ∼95 kpc projected distance from the SMG. These galaxies are not detected in the continuum, suggesting far-infrared SFRs of <18-54 M☉ yr⁻¹, consistent with a UV-based estimate of 22 M☉ yr⁻¹. The spectral energy distribution of these galaxies is inconsistent with nearby spiral and starburst galaxies, but resembles those of dwarf galaxies. This is consistent with expectations for young starbursts without significant older stellar populations. This suggests that these galaxies are significantly metal-enriched, but not heavily dust-obscured, 'normal' star-forming galaxies at z > 5, showing that ALMA can detect the interstellar medium in 'typical' galaxies in the very early universe.

  1. Energy scale of the Big Bounce

    SciTech Connect

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz

    2009-09-15

    We examine the nature of the cosmological Big Bounce transition within the loop geometry underlying loop quantum cosmology at classical and quantum levels. Our canonical quantization method is an alternative to standard loop quantum cosmology. The evolution parameter we use has a clear interpretation. Our method opens the door for analyses of spectra of physical observables like the energy density and the volume operator. We find that one cannot determine the energy scale specific to the Big Bounce by making use of the loop geometry without extra input from observational cosmology.

  2. How quantum is the big bang?

    PubMed

    Bojowald, Martin

    2008-06-01

    When quantum gravity is used to discuss the big bang singularity, the most important, though rarely addressed, question is what role genuine quantum degrees of freedom play. Here, complete effective equations are derived for isotropic models with an interacting scalar to all orders in the expansions involved. The resulting coupling terms show that quantum fluctuations do not affect the bounce much. Quantum correlations, however, do have an important role and could even eliminate the bounce. How quantum gravity regularizes the big bang depends crucially on properties of the quantum state. PMID:18643411

  3. Effective dynamics of the matrix big bang

    SciTech Connect

    Craps, Ben; Rajaraman, Arvind; Sethi, Savdeep

    2006-05-15

    We study the leading quantum effects in the recently introduced matrix big bang model. This amounts to a study of supersymmetric Yang-Mills theory compactified on the Milne orbifold. We find a one-loop potential that is attractive near the big bang. Surprisingly, the potential decays very rapidly at late times where it appears to be generated by D-brane effects. Usually, general covariance constrains the form of any effective action generated by renormalization group flow. However, the form of our one-loop potential seems to violate these constraints in a manner that suggests a connection between the cosmological singularity and long wavelength, late time physics.

  4. Livermore Big Trees Park: 1998 Results

    SciTech Connect

    Mac Queen, D; Gallegos, G; Surano, K

    2002-04-18

    This report is an in-depth study of results from environmental sampling conducted in 1998 by the Lawrence Livermore National Laboratory (LLNL) at Big Trees Park in the city of Livermore. The purpose of the sampling was to determine the extent and origin of plutonium found in soil at concentrations above fallout-background levels in the park. This report describes the sampling that was conducted, the chemical and radio-chemical analyses of the samples, the quality control assessments and statistical analyses of the analytical results, and LLNL's interpretations of the results. It includes a number of data analyses not presented in LLNL's previous reports on Big Trees Park.

  5. [Biobank in the age of big data].

    PubMed

    Zhang, Lianhai; Ji, Jiafu

    2015-01-01

    In the big data era, researchers pay more attention to the correlation between the biological information of patient samples and the related clinical information of diseases. Large-volume correlation analyses will help predict the initiation, development and outcome of specific diseases. A disease-related biobank is the core facility bridging the gap between the clinical information and the biological information of the disease. The volume, diversity, and especially the quality and standardization of samples and sample-related information will influence the outcome of big data prediction. Therefore, establishing a quality management system and implementing standard inspection and testing methods are urgent tasks for the continuous improvement of biobanks. PMID:25656021

  6. Harnessing the Heart of Big Data

    PubMed Central

    Scruggs, Sarah B.; Watson, Karol; Su, Andrew I.; Hermjakob, Henning; Yates, John R.; Lindsey, Merry L.; Ping, Peipei

    2015-01-01

    The exponential increase in Big Data generation, combined with limited capitalization on the wealth of information embedded within Big Data, has prompted us to revisit our scientific discovery paradigms. A successful transition into this digital era of medicine holds great promise for advancing fundamental knowledge in biology, innovating human health and driving personalized medicine; however, this will require a drastic shift of research culture in how we conceptualize science and use data. An e-transformation will require global adoption and synergism among computational science, biomedical research and clinical domains. PMID:25814682

  7. Big brake singularity is accommodated as an exotic quintessence field

    NASA Astrophysics Data System (ADS)

    Chimento, Luis P.; Richarte, Martín G.

    2016-02-01

    We describe a big brake singularity in terms of a modified Chaplygin gas equation of state p = (γ_m − 1)ρ + α γ_m ρ^(−n), accommodate this late-time event as an exotic quintessence model obtained from an energy-momentum tensor, and focus on the cosmological behavior of the exotic field, its kinetic energy, and the potential energy. At the background level the exotic field does not blow up, whereas its kinetic energy and potential both grow without limit near the future singularity. We evaluate the classical stability of this background solution by examining the scalar perturbations of the metric along with the inclusion of entropy perturbation in the perturbed pressure. Within the Newtonian gauge, the gravitational field approaches a constant near the singularity plus additional regular terms. When the perturbed exotic field is associated with α >0 the perturbed pressure and contrast density both diverge, whereas the perturbed exotic field and the divergence of the exotic field's velocity go to zero exponentially. When the perturbed exotic field is associated with α <0 the contrast density always blows up, but the perturbed pressure can remain bounded. In addition, the perturbed exotic field and the divergence of the exotic field's velocity vanish near the big brake singularity. We also briefly look at the behavior of the intrinsic entropy perturbation near the singular event.
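
    Set in LaTeX for readability, here are the equation of state and the limiting behavior that produces the big brake; the sign assumptions (n > 0, and α > 0 for the divergent branch) are ours, inferred from the abstract rather than stated in it.

      \[
        p = (\gamma_m - 1)\,\rho + \alpha\,\gamma_m\,\rho^{-n}, \qquad n > 0 .
      \]
      % As \rho \to 0 the \rho^{-n} term dominates, so for \alpha > 0 the pressure
      % diverges while the density vanishes: infinite deceleration at finite scale
      % factor, the hallmark of a big brake singularity.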

  8. Design and development of a medical big data processing system based on Hadoop.

    PubMed

    Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song

    2015-03-01

    Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data demonstrates tendencies in hospital information technology and shows great significance for hospital information systems that are designing and expanding services. Big data has four characteristics--Volume, Variety, Velocity and Value (the 4 Vs)--that make traditional systems incapable of processing these data on standalone machines. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel on large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop cluster that can scale from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover features of hospital information system user behaviors. This paper studies user behaviors reflected in the various data that different hospital information systems produce in daily work. We also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Compared with single nodes, our distributed algorithms show promise for efficient processing of medical big data in healthcare services and clinical research. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations. PMID:25666927
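
    A minimal sketch of what such a job can look like with Hadoop Streaming, which runs ordinary stdin/stdout programs as the mapper and reducer; the tab-separated HIS log format (timestamp, user_id, action) and all file names are our illustrative assumptions, not the system described in the paper.

      #!/usr/bin/env python
      # mapper.py -- emit one (action, 1) pair per hospital-information-system log line.
      import sys

      for line in sys.stdin:
          fields = line.rstrip("\n").split("\t")
          if len(fields) >= 3:              # timestamp, user_id, action
              print("%s\t1" % fields[2])

      #!/usr/bin/env python
      # reducer.py -- sum the counts per action; Hadoop sorts mapper output by key.
      import sys

      current, count = None, 0
      for line in sys.stdin:
          key, value = line.rstrip("\n").split("\t")
          if key != current:
              if current is not None:
                  print("%s\t%d" % (current, count))
              current, count = key, 0
          count += int(value)
      if current is not None:
          print("%s\t%d" % (current, count))

    A job like this would be submitted with something like: hadoop jar hadoop-streaming.jar -files mapper.py,reducer.py -mapper mapper.py -reducer reducer.py -input his_logs -output action_counts.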

  9. Characterization of Stream Morphology and Sediment Yield for the Big Black and Tombigbee River Basins, Mississippi

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Three segments within the Big Black River Basin and nine within the Tombigbee River Basin are on the Mississippi 303(d) list of water bodies as having impaired conditions for aquatic life due to sediment. An additional 56 reaches of channel are listed for biologic impairment between the two basins. ...

  10. The Reliability and Validity of Big Five Inventory Scores with African American College Students

    ERIC Educational Resources Information Center

    Worrell, Frank C.; Cross, William E., Jr.

    2004-01-01

    This article describes a study that examined the reliability and validity of scores on the Big Five Inventory (BFI; O. P. John, E. M. Donahue, & R. L. Kentle, 1991) in a sample of 336 African American college students. Results from the study indicated moderate reliability and structural validity for BFI scores. Additionally, BFI subscales had few…

  11. Human Neuroimaging as a “Big Data” Science

    PubMed Central

    Van Horn, John Darrell; Toga, Arthur W.

    2013-01-01

    The maturation of in vivo neuroimaging has led to incredible quantities of digital information about the human brain. While much is made of the data deluge in science, neuroimaging represents the leading edge of this onslaught of “big data”. A range of neuroimaging databasing approaches has streamlined the transmission, storage, and dissemination of data from such brain imaging studies. Yet few, if any, common solutions exist to support the science of neuroimaging. In this article, we discuss how modern neuroimaging research represents a multifactorial and broad-ranging data challenge, involving the growing size of the data being acquired; sociological and logistical sharing issues; infrastructural challenges for multi-site, multi-datatype archiving; and the means by which to explore and mine these data. As neuroimaging advances further, e.g., into aging, genetics, and age-related disease, new vision is needed to manage and process this information while marshalling these resources into novel results. Thus, “big data” can become “big” brain science. PMID:24113873

  12. Spatial Big Data Organization, Access and Visualization with ESSG

    NASA Astrophysics Data System (ADS)

    Wu, L. X.; Yu, J. Q.; Yang, Y. Z.; Jia, Y. J.

    2013-10-01

    There are hundreds of spatial reference frames (SRFs) in use, and the great differences among SRFs have blocked the sharing of global data about planet Earth. A conceptual spheroid of radius 12,800 km and a spheroid degenerated octree grid method are applied to produce an Earth system spatial grid (ESSG), whose natural characteristics suit it as a new common SRF. A triple-CTA data structure based on the ESSG is designed to organize the big data of planet Earth, and a 2D table, keyed by a unique label and holding unlimited records of time slices and attribute values, records the data of each grid cell, as sketched below. The big data on planet Earth can hence be gridded and interrelated without discipline gaps or SRF obstacles. An integral data organization mode is designed, and three potential routes are presented for users to access shareable global data in a cloud environment. Furthermore, taking the global crust, atmosphere, DEMs, and satellite imagery as examples, the integrated visualization of large global objects is demonstrated.
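
    A toy sketch of the per-cell table described above, assuming each grid cell carries a unique label plus an open-ended set of (time slice, attribute) records; the label format and attribute names are invented for illustration.

      from collections import defaultdict

      class ESSGCellTable:
          """Per-cell 2D table: one unique grid label per cell, with
          unlimited (time_slice, attribute) -> value records."""
          def __init__(self):
              self._table = defaultdict(dict)   # label -> {(time, attr): value}

          def put(self, label, time_slice, attribute, value):
              self._table[label][(time_slice, attribute)] = value

          def get(self, label, time_slice, attribute):
              return self._table[label].get((time_slice, attribute))

      # Example: sea-surface temperature for one (hypothetical) grid cell at two epochs.
      cells = ESSGCellTable()
      cells.put("0-3012-0211", "2013-01", "SST", 291.4)
      cells.put("0-3012-0211", "2013-02", "SST", 290.8)
      print(cells.get("0-3012-0211", "2013-01", "SST"))   # 291.4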

  13. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

    We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t}, where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components, as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position t along a tract in the brain. In one example the response is disease status (case or control); in a second, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online. PMID:24729671
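
    A minimal numpy sketch of the estimation idea under simplifying assumptions: identity link, a plain tensor-product polynomial basis standing in for the paper's penalized tensor-product B-splines, and no roughness penalty; all names and the simulated data are illustrative.

      import numpy as np

      def fgam_design(X, tgrid, deg_x=3, deg_t=3):
          """Design matrix for the FGAM linear predictor int F(X_i(t), t) dt,
          with F(x, t) = sum_jk theta_jk x^j t^k and the integral replaced
          by a quadrature sum over tgrid."""
          w = np.gradient(tgrid)                          # quadrature weights
          cols = [((X ** j) * (tgrid ** k) * w).sum(axis=1)
                  for j in range(deg_x + 1) for k in range(deg_t + 1)]
          return np.column_stack(cols)

      # Simulate Y_i = int (X_i(t))^2 * t dt + noise, then recover F(x,t) = x^2 t.
      rng = np.random.default_rng(0)
      tgrid = np.linspace(0.0, 1.0, 50)
      X = rng.standard_normal((200, 50)).cumsum(axis=1) * 0.1
      Y = ((X ** 2) * tgrid * np.gradient(tgrid)).sum(axis=1)
      Y += 0.01 * rng.standard_normal(200)
      theta, *_ = np.linalg.lstsq(fgam_design(X, tgrid), Y, rcond=None)
      j, k = divmod(int(np.argmax(np.abs(theta))), 4)     # deg_t + 1 = 4
      print(f"dominant basis term: x^{j} t^{k}")          # expect x^2 t^1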

  14. Big Events in Greece and HIV Infection Among People Who Inject Drugs

    PubMed Central

    Nikolopoulos, Georgios K.; Sypsa, Vana; Bonovas, Stefanos; Paraskevis, Dimitrios; Malliori-Minerva, Melpomeni; Hatzakis, Angelos; Friedman, Samuel R.

    2015-01-01

    Big Events are processes like macroeconomic transitions that have lowered social well-being in various settings in the past. Greece has been hit by the global crisis and experienced an HIV outbreak among people who inject drugs. Since the crisis began (2008), Greece has seen population displacement, inter-communal violence, cuts in governmental expenditures, and social movements. These may have affected normative regulation, networks, and behaviors. However, most pathways to risk remain unknown or unmeasured. We use what is known and unknown about the Greek HIV outbreak to suggest modifications in Big Events models and the need for additional research. PMID:25723309

  15. Doublet III Big Dee Project

    SciTech Connect

    Davis, L.G.; Luxon, J.L.

    1985-05-01

    The Doublet III tokamak is presently being reconfigured into a new larger dee-shaped plasma configuration. Experiments will begin in 1986 with a goal of high current, high beta plasma operation at moderate magnetic field. The existing toroidal field coil, Ohmic heating coil, and innermost plasma shaping coils will be retained. A new water-cooled vacuum vessel is being fabricated using a corrugated Inconel sandwich wall construction. Six new water-cooled copper poloidal field coils are also being fabricated. The resultant device along with additional power supplies will provide a capability for plasma currents of 3.5 MA for 1.5 s during the first phase of operations; the tokamak systems are designed for 5 MA operation with additional power systems. The four existing 80 keV, 3 MW neutral beam lines are being modified for optimum torus access and 0.7 s operation. These injectors will be upgraded to allow 5 s operation with new sources in 1987. The device has been designed to accommodate an additional 20 MW of ICRH and ECH power in the future. Limiters and vessel wall protection will be provided for initial operation with up to 40 MJ of input energy. Future installation of additional thermal armor will allow operation with up to 200 MJ of input energy over a 10 s period. Most of the existing diagnostics will be modified as required and reinstalled on the new vessel.

  16. Heat Exchange, Additive Manufacturing, and Neutron Imaging

    SciTech Connect

    Geoghegan, Patrick

    2015-02-23

    Researchers at the Oak Ridge National Laboratory have captured undistorted snapshots of refrigerants flowing through small heat exchangers, helping them to better understand heat transfer in heating, cooling and ventilation systems.

  17. [Research applications in digital radiology. Big data and co].

    PubMed

    Müller, H; Hanbury, A

    2016-02-01

    Medical imaging produces increasingly complex images (e.g. thinner slices and higher resolution) with more protocols, so that image reading has also become much more complex. More information needs to be processed and usually the number of radiologists available for these tasks has not increased to the same extent. The objective of this article is to present current research results from projects on the use of image data for clinical decision support. An infrastructure that can allow large volumes of data to be accessed is presented. In this way the best performing tools can be identified without the medical data having to leave secure servers. The text presents the results of the VISCERAL and Khresmoi EU-funded projects, which allow the analysis of previous cases from institutional archives to support decision-making and for process automation. The results also represent a secure evaluation environment for medical image analysis. This allows the use of data extracted from past cases to solve information needs occurring when diagnosing new cases. The presented research prototypes allow direct extraction of knowledge from the visual data of the images and to use this for decision support or process automation. Real clinical use has not been tested but several subjective user tests showed the effectiveness and efficiency of the process. The future in radiology will clearly depend on better use of the important knowledge in clinical image archives to automate processes and aid decision-making via big data analysis. This can help concentrate the work of radiologists towards the most important parts of diagnostics. PMID:26561024

  18. What's the Big Sweat about Dehydration? (For Kids)

    MedlinePlus

  19. Health Before a Stroke Is Big Predictor of Second Attack

    MedlinePlus

    Getting hypertension, cholesterol under ... the researchers explained. The findings weren't a big surprise to the study authors. "We found in ...

  20. Big Creek Hydroelectric System, East & West Transmission Line, 241-mile ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Big Creek Hydroelectric System, East & West Transmission Line, 241-mile transmission corridor extending between the Big Creek Hydroelectric System in the Sierra National Forest in Fresno County and the Eagle Rock Substation in Los Angeles, California, Visalia, Tulare County, CA

  1. "Small Steps, Big Rewards": Preventing Type 2 Diabetes

    MedlinePlus

    These are the plain facts in "Small Steps. Big Rewards: Prevent Type 2 Diabetes," an education campaign ...

  2. Device Data Ingestion for Industrial Big Data Platforms with a Case Study.

    PubMed

    Ji, Cun; Shao, Qingshi; Sun, Jiao; Liu, Shijun; Pan, Li; Wu, Lei; Yang, Chenglei

    2016-01-01

    Despite having played a significant role in the Industry 4.0 era, the Internet of Things is currently faced with the challenge of how to ingest large-scale heterogeneous and multi-type device data. In response to this problem, we present a heterogeneous device data ingestion model for an industrial big data platform. The model includes device templates and four strategies for data synchronization, data slicing, data splitting and data indexing, respectively; a toy sketch follows below. We can ingest device data from multiple sources with this heterogeneous device data ingestion model, which has been verified on our industrial big data platform. In addition, we present a case study on device data-based scenario analysis of industrial big data. PMID:26927121
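
    A toy sketch of a template-driven ingestion step combined with one of the four strategies (data slicing); the template fields, record layout, and window length are our illustrative assumptions, not the paper's actual model.

      from dataclasses import dataclass
      from typing import Dict, List

      @dataclass
      class DeviceTemplate:
          """Describes one device type so heterogeneous raw records can be
          normalized into a common shape before slicing and indexing."""
          device_type: str
          fields: Dict[str, str]            # raw field name -> canonical name

      def normalize(raw: dict, tpl: DeviceTemplate) -> dict:
          rec = {canon: raw[src] for src, canon in tpl.fields.items() if src in raw}
          rec["device_type"] = tpl.device_type
          return rec

      def slice_by_window(records: List[dict], window_s: int = 60) -> Dict[int, List[dict]]:
          """Data-slicing strategy: bucket records into fixed time windows
          so downstream splitting/indexing works on bounded chunks."""
          slices: Dict[int, List[dict]] = {}
          for r in records:
              slices.setdefault(r["ts"] // window_s, []).append(r)
          return slices

      tpl = DeviceTemplate("spindle_sensor", {"t": "ts", "rpm": "speed_rpm"})
      raw_stream = [{"t": 0, "rpm": 1200}, {"t": 61, "rpm": 1190}]
      print(slice_by_window([normalize(r, tpl) for r in raw_stream]))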

  4. Black Hole Blows Big Bubble

    NASA Astrophysics Data System (ADS)

    2010-07-01

    astronomers understand the similarity between small black holes formed from exploded stars and the supermassive black holes at the centres of galaxies. Very powerful jets have been seen from supermassive black holes, but are thought to be less frequent in the smaller microquasar variety. The new discovery suggests that many of them may simply have gone unnoticed so far. The gas-blowing black hole is located 12 million light-years away, in the outskirts of the spiral galaxy NGC 7793 (eso0914b). From the size and expansion velocity of the bubble the astronomers have found that the jet activity must have been ongoing for at least 200 000 years. Note: [1] Astronomers do not have yet any means of measuring the size of the black hole itself. The smallest stellar black hole discovered so far has a radius of about 15 km. An average stellar black hole of about 10 solar masses has a radius of about 30 km, while a "big" stellar black hole may have a radius of up to 300 km. This is still much smaller than the jets, which extend out to 1000 light-years, or about 9000 million million km! More Information: This result appears in a paper published in this week's issue of the journal Nature (A 300 parsec long jet-inflated bubble around a powerful microquasar in the galaxy NGC 7793, by Manfred W. Pakull, Roberto Soria and Christian Motch). ESO, the European Southern Observatory, is the foremost intergovernmental astronomy organisation in Europe and the world's most productive astronomical observatory. It is supported by 14 countries: Austria, Belgium, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates

  5. Black Hole Blows Big Bubble

    NASA Astrophysics Data System (ADS)

    2010-07-01

    astronomers understand the similarity between small black holes formed from exploded stars and the supermassive black holes at the centres of galaxies. Very powerful jets have been seen from supermassive black holes, but are thought to be less frequent in the smaller microquasar variety. The new discovery suggests that many of them may simply have gone unnoticed so far. The gas-blowing black hole is located 12 million light-years away, in the outskirts of the spiral galaxy NGC 7793 (eso0914b). From the size and expansion velocity of the bubble the astronomers have found that the jet activity must have been ongoing for at least 200 000 years. Notes [1] Astronomers do not have yet any means of measuring the size of the black hole itself. The smallest stellar black hole discovered so far has a radius of about 15 km. An average stellar black hole of about 10 solar masses has a radius of about 30 km, while a "big" stellar black hole may have a radius of up to 300 km. This is still much smaller than the jets, which extend out to several hundreds light years on each side of the black hole, or about several thousand million million km! More information This result appears in a paper published in this week's issue of the journal Nature (A 300 parsec long jet-inflated bubble around a powerful microquasar in the galaxy NGC 7793, by Manfred W. Pakull, Roberto Soria and Christian Motch). ESO, the European Southern Observatory, is the foremost intergovernmental astronomy organisation in Europe and the world's most productive astronomical observatory. It is supported by 14 countries: Austria, Belgium, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising

  8. A Big Problem for Magellan: Food Preservation

    ERIC Educational Resources Information Center

    Galvao, Cecilia; Reis, Pedro; Freire, Sofia

    2008-01-01

    In this paper, we present data related to how a Portuguese teacher developed the module "A big problem for Magellan: Food preservation." Students were asked to plan an investigation in order to identify which were the best food preservation methods in the XV and XVI centuries of Portuguese overseas navigation, and then establish a parallel between…

  9. Big Island Demonstration Project - Black Liquor

    SciTech Connect

    2006-08-01

    Black liquor is a papermaking byproduct that also serves as a fuel for pulp and paper mills. This project involves the design, construction, and operation of a black liquor gasifier that will be integrated into Georgia-Pacific's Big Island facility in Virginia, a mill that has been in operation for more than 100 years.

  10. Big Broadband Connectivity in the United States

    ERIC Educational Resources Information Center

    Windhausen, John, Jr.

    2008-01-01

    The economic and social future of the United States depends on answering the growing demand for very high-speed broadband connectivity, a capability termed "big broadband." Failure to take on the challenge could lead to a decline in global competitiveness and an inability to educate students. (Contains 20 notes.)

  11. Big-Time Sports in American Universities

    ERIC Educational Resources Information Center

    Clotfelter, Charles T.

    2011-01-01

    For almost a century, big-time college sports has been a wildly popular but consistently problematic part of American higher education. The challenges it poses to traditional academic values have been recognized from the start, but they have grown more ominous in recent decades, as cable television has become ubiquitous, commercial opportunities…

  12. Big-Time Fundraising for Today's Schools

    ERIC Educational Resources Information Center

    Levenson, Stanley

    2006-01-01

    In this enlightening book, nationally recognized author and fundraising consultant Stanley Levenson shows school leaders how to move away from labor-intensive, nickel-and-dime bake sales and car washes, and into the world of big-time fundraising. Following the model used by colleges and universities, the author presents a wealth of practical…

  13. Big Bubbles in Boiling Liquids: Students' Views

    ERIC Educational Resources Information Center

    Costu, Bayram

    2008-01-01

    The aim of this study was to elicit students' conceptions about big bubbles in boiling liquids (water, ethanol and aqueous CuSO[subscript 4] solution). The study is based on twenty-four students at different ages and grades. The clinical interviews technique was conducted to solicit students' conceptions and the interviews were analyzed to…

  14. Science Literacy Circles: Big Ideas about Science

    ERIC Educational Resources Information Center

    Devick-Fry, Jane; LeSage, Teresa

    2010-01-01

    Science literacy circles incorporate the organization of both science notebooks and literature circles to help K-8 students internalize big ideas about science. Using science literacy circles gives students opportunities to engage in critical thinking as they inductively develop understanding about science concepts. (Contains 1 table and 7…

  15. More on Sports and the Big6.

    ERIC Educational Resources Information Center

    Eisenberg, Mike

    1998-01-01

    Presents strategies for relating the Big6 information problem-solving process to sports to gain students' attention, sustain it, and make instruction relevant to their interests. Lectures by coaches, computer-based sports games, sports information sources, the use of technology in sports, and judging sports events are discussed. (LRW)

  16. Marketing Your Library with the Big Read

    ERIC Educational Resources Information Center

    Johnson, Wendell G.

    2012-01-01

    The Big Read was developed by the National Endowment for the Arts to revitalize the role of culture in American society and encourage the reading of landmark literature. Each year since 2007, the DeKalb Public Library, Northern Illinois University, and Kishwaukee Community College have partnered to foster literacy in the community. This article…

  17. Challenges of Big Data in Educational Assessment

    ERIC Educational Resources Information Center

    Gibson, David C.; Webb, Mary; Ifenthaler, Dirk

    2015-01-01

    This paper briefly discusses four measurement challenges of data science or "big data" in educational assessments that are enabled by technology: 1. Dealing with change over time via time-based data. 2. How a digital performance space's relationships interact with learner actions, communications and products. 3. How layers of…

  18. Big physics quartet win government backing

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2014-09-01

    Four major physics-based projects are among 10 to have been selected by Japan’s Ministry of Education, Culture, Sports, Science and Technology for funding in the coming decade as part of its “roadmap” of big-science projects.

  19. UV - BIG BEND NATIONAL PARK TX

    EPA Science Inventory

    Brewer 130 is located in Big Bend NP, measuring ultraviolet solar radiation. Irradiance and column ozone are derived from these data. Ultraviolet solar radiation is measured with a Brewer Mark IV, single-monochromator spectrophotometer manufactured by SCI-TEC Instruments, Inc. of...

  20. 1. OVERVIEW OF EXTREME EAST END OF BIG CREEK TOWN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. OVERVIEW OF EXTREME EAST END OF BIG CREEK TOWN ACROSS POWERHOUSE NO. 2 FOREBAY (POWERHOUSE NO. 1 AFTERBAY). TOWER CARRYING TRANSMISSION LINES FROM POWERHOUSE NO. 1 IS AT PHOTO CENTER. BEHIND TOWER IS BUILDING 103. TO PHOTO LEFT OF BUILDING 103 IS BUILDING 105. VIEW TO NORTH. - Big Creek Hydroelectric System, Big Creek Town, Operator House, Orchard Avenue south of Huntington Lake Road, Big Creek, Fresno County, CA

  1. The Big Island of Hawaii

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Boasting snow-covered mountain peaks and tropical forest, the Island of Hawaii, the largest of the Hawaiian Islands, is stunning at any altitude. This false-color composite (processed to simulate true color) image of Hawaii was constructed from data gathered between 1999 and 2001 by the Enhanced Thematic Mapper Plus (ETM+) instrument, flying aboard the Landsat 7 satellite. The Landsat data were processed by the National Oceanic and Atmospheric Administration (NOAA) to develop a land-cover map. This map will be used as a baseline to chart changes in land use on the islands. Types of change include the construction of resorts along the coastal areas, and the conversion of sugar plantations to other crop types. Hawaii was created by a 'hotspot' beneath the ocean floor. Hotspots form in areas where superheated magma in the Earth's mantle breaks through the Earth's crust. Over the course of millions of years, the Pacific Tectonic Plate has slowly moved over this hotspot to form the entire Hawaiian Island archipelago. The black areas on the island (in this scene) that resemble a pair of sun-baked palm fronds are hardened lava flows formed by the active Mauna Loa Volcano. Just to the north of Mauna Loa is the dormant grayish Mauna Kea Volcano, which hasn't erupted in an estimated 3,500 years. A thin grayish plume of smoke is visible near the island's southeastern shore, rising from Kilauea, the most active volcano on Earth. Heavy rainfall and fertile volcanic soil have given rise to Hawaii's lush tropical forests, which appear as solid dark green areas in the image. The light green, patchy areas near the coasts are likely sugar cane plantations, pineapple farms, and human settlements. Courtesy of the NOAA Coastal Services Center Hawaii Land Cover Analysis project

  2. Bayesian Model Selection in 'Big Data' Spectral Analysis

    NASA Astrophysics Data System (ADS)

    Fischer, Travis C.; Crenshaw, D. Michael; Baron, Fabien; Kloppenborg, Brian K.; Pope, Crystal L.

    2015-01-01

    As IFU observations and large spectral surveys continue to become more prevalent, the handling of thousands of spectra has become commonplace. Astronomers look at objects with increasingly complex emission-line structures, so establishing a method that easily allows for multiple-component analysis of these features in an automated fashion would be of great use to the community. Already used in exoplanet detection and interferometric image reconstruction, we present a new application of Bayesian model selection in `big data' spectral analysis. With this technique, multiple emission-line components are fitted in an automated fashion while the correct number of components in each spectrum is determined simultaneously, streamlining the line measurements for a large number of spectra into a single process.
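
    A minimal sketch of the selection step, using the Bayesian information criterion as a cheap stand-in for a full Bayesian evidence comparison when choosing how many Gaussian line components a spectrum supports; the toy spectrum, noise level, and starting guesses are all illustrative.

      import numpy as np
      from scipy.optimize import curve_fit

      def multi_gauss(x, *p):
          """Sum of Gaussians; p holds (amplitude, center, sigma) per component."""
          y = np.zeros_like(x)
          for a, c, s in zip(p[0::3], p[1::3], p[2::3]):
              y += a * np.exp(-0.5 * ((x - c) / s) ** 2)
          return y

      def best_n_components(x, flux, sigma, n_max=3):
          """Fit 1..n_max components; pick the fit with the lowest
          BIC = chi^2 + k ln N (k = 3 parameters per component)."""
          best = (np.inf, 0, None)
          for n in range(1, n_max + 1):
              p0 = []
              for i in range(n):            # crude, evenly offset starting guesses
                  p0 += [flux.max(), x[np.argmax(flux)] + 2.0 * i, 2.0]
              try:
                  popt, _ = curve_fit(multi_gauss, x, flux, p0=p0, sigma=sigma,
                                      maxfev=20000)
              except RuntimeError:          # non-convergent fit: skip this n
                  continue
              chi2 = np.sum(((flux - multi_gauss(x, *popt)) / sigma) ** 2)
              bic = chi2 + 3 * n * np.log(x.size)
              if bic < best[0]:
                  best = (bic, n, popt)
          return best

      # Toy spectrum: two blended emission lines plus noise.
      rng = np.random.default_rng(1)
      x = np.linspace(-20.0, 20.0, 400)
      flux = multi_gauss(x, 5, -2, 1.5, 3, 3, 2.0) + rng.normal(0, 0.3, x.size)
      bic, n, popt = best_n_components(x, flux, sigma=np.full(x.size, 0.3))
      print("selected number of components:", n)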

  3. Mining of Solar Big Data: Challenges and Opportunities

    NASA Astrophysics Data System (ADS)

    Angryk, R.

    2014-12-01

    The focus of this talk is on the challenges and opportunities linked to the big data analytics of the Solar Dynamics Observatory (www.nasa.gov/sdo/), which is a flagship of NASA's current "Living with a Star" program. The audience will first learn about the importance of solar data analysis, then about the complexity of the data maintained on the servers in our Data Mining Lab (dmlab.cs.montana.edu/). After that, our three ongoing research projects will be discussed: (1) the Content-based Image Retrieval (CBIR) system for solar data, (2) the development of machine learning techniques for automated validation and expansion of FFT's software modules, and (3) the search for spatio-temporal co-occurrence patterns among different types of solar activity. Finally, we will briefly talk about the future of our solar databases and data mining projects.

  4. Big Ideas in Primary Mathematics: Issues and Directions

    ERIC Educational Resources Information Center

    Askew, Mike

    2013-01-01

    This article is located within the literature arguing for attention to Big Ideas in teaching and learning mathematics for understanding. The focus is on surveying the literature of Big Ideas and clarifying what might constitute Big Ideas in the primary Mathematics Curriculum based on both theoretical and pragmatic considerations. This is…

  5. 76 FR 26240 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-06

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming..., 2011, and will begin at 10 a.m. ADDRESSES: The meeting will be held at the Big Horn County Weed...

  6. 78 FR 33326 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-04

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming... at 3:00 p.m. ADDRESSES: The meeting will be held at Big Horn County Weed and Pest Building,...

  7. 75 FR 141 - Big Rivers Electric Corporation; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-04

    ... Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing December 23, 2009. Take... Granting Petition for Declaratory Order and Granting Waivers,'' Big Rivers Elec. Corp., 128 FERC ¶ 61,264 (2009) (September 17 Order), Big Rivers Electric Corporation filed revised tariff sheets to its...

  8. 75 FR 49886 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-16

    ... DEPARTMENT OF AGRICULTURE Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in..., 2010, and will begin at 9 a.m. ADDRESSES: The meeting will be held at the Big Horn County...

  9. 76 FR 47141 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-04

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming..., 2011 and will begin at 3 p.m. ADDRESSES: The meeting will be held at the Big Horn County Weed and...

  10. 76 FR 7810 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Lovell, Wyoming..., 2011, and will begin at 10 a.m. ADDRESSES: The meeting will be held at the Big Horn Federal...

  11. 77 FR 49779 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-17

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming... 11, 2012 and will begin at 3 p.m. ADDRESSES: The meeting will be held at the Big Horn County Weed...

  12. 75 FR 71069 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming... December 1, 2010, and will begin at 10 a.m. ADDRESSES: The meeting will be held at the Big Horn County...

  13. View of New Big Oak Flat Road seen from Old ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of New Big Oak Flat Road seen from Old Wawona Road near location of photograph HAER CA-148-17. Note road cuts, alignment, and tunnels. Devils Dance Floor at left distance. Looking northwest - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  14. 9. View from middle adit Wawona Tunnel of Big Oak ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. View from middle adit Wawona Tunnel of Big Oak Flat Road with retaining walls at lower left and center left with east portal of tunnel #1. - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  15. Clarity and causality needed in claims about Big Gods.

    PubMed

    Watts, Joseph; Bulbulia, Joseph; Gray, Russell D; Atkinson, Quentin D

    2016-01-01

    We welcome Norenzayan et al.'s claim that the prosocial effects of beliefs in supernatural agents extend beyond Big Gods. To date, however, supporting evidence has focused on the Abrahamic Big God, making generalisations difficult. We discuss a recent study that highlights the need for clarity about the causal path by which supernatural beliefs affect the evolution of big societies. PMID:26948745

  16. 11. VIEW OF UPSTREAM ELEVATION OF BIG DALTON DAM SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. VIEW OF UPSTREAM ELEVATION OF BIG DALTON DAM SHOWING CONSTRUCTION OF THE ARCH WALLS, TAKEN ON SEPTEMBER 11, 1928 (PHOTOGRAPHER UNKNOWN). PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON 6/5/1973 BY PHOTOGRAPHER GATSON OF L.A. COUNTY PUBLIC WORKS. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  17. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    PubMed

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: Big data will give you new insights, allow you to become more efficient, and/or will solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the Silver Bullet that it has been touted to be. Here our main concern is the overall impact of big data; the current manifestation of big data is constructing a Maginot Line in science in the 21st century. Big data is not "lots of data" as a phenomenon anymore; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers and science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking in problem defining to address science challenges. PMID:25670967

  18. The Ethics of Big Data: Current and Foreseeable Issues in Biomedical Contexts.

    PubMed

    Mittelstadt, Brent Daniel; Floridi, Luciano

    2016-04-01

    The capacity to collect and analyse data is growing exponentially. Referred to as 'Big Data', this scientific, social and technological trend has helped create destabilising amounts of information, which can challenge accepted social and ethical norms. Big Data remains a fuzzy idea, emerging across social, scientific, and business contexts sometimes seemingly related only by the gigantic size of the datasets being considered. As is often the case with the cutting edge of scientific and technological progress, understanding of the ethical implications of Big Data lags behind. In order to bridge such a gap, this article systematically and comprehensively analyses academic literature concerning the ethical implications of Big Data, providing a watershed for future ethical investigations and regulations. Particular attention is paid to biomedical Big Data due to the inherent sensitivity of medical information. By means of a meta-analysis of the literature, a thematic narrative is provided to guide ethicists, data scientists, regulators and other stakeholders through what is already known or hypothesised about the ethical risks of this emerging and innovative phenomenon. Five key areas of concern are identified: (1) informed consent, (2) privacy (including anonymisation and data protection), (3) ownership, (4) epistemology and objectivity, and (5) 'Big Data Divides' created between those who have or lack the necessary resources to analyse increasingly large datasets. Critical gaps in the treatment of these themes are identified with suggestions for future research. Six additional areas of concern are then suggested which, although related, have not yet attracted extensive debate in the existing literature. It is argued that they will require much closer scrutiny in the immediate future: (6) the dangers of ignoring group-level ethical harms; (7) the importance of epistemology in assessing the ethics of Big Data; (8) the changing nature of fiduciary relationships that

  19. Focusing on the big picture.

    PubMed

    Chen, Ingfei

    2003-09-10

    As a postdoc in cognitive neuroscience who's also a neurology fellow, Adam Gazzaley is a meld of basic science expertise and clinical experience: He studies brain aging in people by using functional magnetic resonance imaging at the University of California (UC), Berkeley, and he also sees patients at UC San Francisco's Memory and Aging Center. The 34-year-old native New Yorker dives with equal fervor into scientific research and nature photography, two lenses for viewing a single world of discovery. Growing up in Queens, Gazzaley knew from age 7 that he wanted to become a scientist, and as a teenager, he commuted long hours to attend the Bronx High School of Science. He earned an M.D.-Ph.D. from the Mount Sinai School of Medicine in New York City. Gazzaley's hobby as a shutterbug periodically takes him on backpacking trips to document the beauty of the great outdoors. He sells fine-art prints of his photographs to individuals, hospitals, and clinics through his company, Wanderings Inc. PMID:12968055

  20. Assessment of acreage and vegetation change in Florida's Big Bend tidal wetlands using satellite imagery

    USGS Publications Warehouse

    Raabe, Ellen A.; Stumpf, Richard P.

    1997-01-01

    Fluctuations in sea level and impending development on the west coast of Florida have aroused concern for the relatively pristine tidal marshes of the Big Bend. Landsat Thematic Mapper (TM) images for 1986 and 1995 are processed and evaluated for signs of change. The images cover 250 km of Florida's Big Bend Gulf Coast, encompassing 160,000 acres of tidal marshes. Change is detected using the normalized difference vegetation index (NDVI) and land cover classification. The imagery shows negligible net loss or gain in the marsh over the 9-year period. However, regional changes in biomass are apparent and are due to natural disturbances such as low winter temperatures, fire, storm surge, and the conversion of forest to marsh. Within the marsh, the most prominent changes in NDVI and in land cover result from the recovery of mangroves from freezes, a decline of transitional upland vegetation, and susceptibility of the marsh edge and interior to variations in tidal flooding.
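
    The NDVI computation at the heart of this change analysis is simple to reproduce. Below is a minimal sketch, assuming two co-registered scenes are already available as red and near-infrared arrays; the synthetic inputs and the 0.2 change threshold are illustrative placeholders, not values from the study.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    with np.errstate(divide="ignore", invalid="ignore"):
        index = (nir - red) / (nir + red)
    return np.where(np.isfinite(index), index, 0.0)

# Synthetic stand-ins for two co-registered scenes
# (real use: Landsat TM band 3 = red, band 4 = near-infrared).
rng = np.random.default_rng(0)
red_1986, nir_1986 = rng.uniform(0.0, 1.0, (2, 100, 100))
red_1995, nir_1995 = rng.uniform(0.0, 1.0, (2, 100, 100))

delta = ndvi(red_1995, nir_1995) - ndvi(red_1986, nir_1986)
changed = np.abs(delta) > 0.2  # illustrative threshold, not the study's
print(f"{changed.mean():.1%} of pixels flagged as changed")
```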

  1. Geriatrics in Brazil: a big country with big opportunities.

    PubMed

    Garcez-Leme, Luiz E; Leme, Mariana Deckers; Espino, David V

    2005-11-01

    Brazil has approximately 180 million inhabitants, of whom 15.2 million are aged 60 and older and 1.9 million are aged 80 and older. By 2025, the Brazilian elderly population is expected to grow to more than 32 million. Brazil has many problems related to its geographic and population size. Great distances between major cities, marked cultural and racial heterogeneity between the various geographic regions, high poverty levels, and decreasing family size all combine to put pressure on the medical and social services that can be made available to the elder population. Fewer than 500 Brazilian physicians are certified as geriatricians, translating into one geriatrician for every 37,000 elderly Brazilians. Besides 15 geriatric medicine residencies, a large number of fellowship programs exist, and these programs are in high demand, with more than 20 candidates per position, indicating new opportunities for growth in elder care. In addition, geriatric initiatives such as the annual elder vaccination program and the elder statute, recently approved by the Brazilian Congress, indicate that geriatric care in Brazil is entering a new era of growth and development. Although the challenges remain great, there are opportunities for Brazilian geriatrics and gerontology. PMID:16274389

  2. High School Students as Mentors: Findings from the Big Brothers Big Sisters School-Based Mentoring Impact Study

    ERIC Educational Resources Information Center

    Herrera, Carla; Kauh, Tina J.; Cooney, Siobhan M.; Grossman, Jean Baldwin; McMaken, Jennifer

    2008-01-01

    High schools have recently become a popular source of mentors for school-based mentoring (SBM) programs. The high school Bigs program of Big Brothers Big Sisters of America, for example, currently involves close to 50,000 high-school-aged mentors across the country. While the use of these young mentors has several potential advantages, their age…

  3. Primordial comets: big bang nucleosynthesis, dark matter and life

    NASA Astrophysics Data System (ADS)

    Sheldon, Robert B.

    2015-09-01

    Primordial comets are comets made of Big Bang synthesized materials—water, ammonia, and carbon ices. These are the basic elements for life, so that these comets can be colonized by cyanobacteria that grow and bioengineer them for life dispersal. In addition, should they exist in large enough quantities, they would easily satisfy the qualifications for dark matter: low albedo with low visibility, gravitational femtolensing, galactic negative viscosity, early galaxy formation seeds, and a self-interaction providing cosmic structure. The major arguments against their existence are the absence of metals (elements heavier than He) in ancient Population III stars, and the stringent requirements put on the Big Bang (BB) baryonic density by the BB nucleosynthesis (BBN) models. We argue that CI chondrites, hyperbolic comets, and carbon-enriched Pop III stars are all evidence for primordial comets. The BBN models provide the greater obstacle, but we argue that they crucially omit the magnetic field in their homogeneous, isotropic, "ideal baryon gas" model. Should large magnetic fields exist, not only would they undermine the 1-D models, but if their magnitude exceeds some critical field/density ratio, then the neutrino interacts with the fields, changing the equilibrium ratio of protons to neutrons. Since BBN models are strongly dependent on this ratio, magnetic fields have the potential to radically change the production of C, N, and O (CNO) to produce primordial comets. Then the universe from the earliest moments is not only seeded for galaxy formation, but it is seeded with the ingredients for life.

  4. Nonsingular big bounces and the evolution of linear fluctuations

    NASA Astrophysics Data System (ADS)

    Hwang, Jai-Chan; Noh, Hyerim

    2002-06-01

    We consider the evolutions of linear fluctuations as the background Friedmann world model goes from contracting to expanding phases through smooth and nonsingular bouncing phases. As long as gravity dominates over the pressure gradient in the perturbation equation, the growing mode in the expanding phase is characterized by a conserved amplitude; we call this a C mode. In spherical geometry with a pressureless medium, we show that there exists a special gauge-invariant combination Φ which stays constant throughout the evolution from the big bang to the big crunch, with the same value even after the bounce: it characterizes the coefficient of the C mode. We show this result by using a bounce model where the pressure gradient term is negligible during the bounce; this requires the additional presence of exotic matter. In such a bounce, even in more general situations for the equation of state before and after the bounce, the C mode in the expanding phase is affected only by the C mode in the contracting phase; thus the growing mode in the contracting phase decays away as the world model enters the expanding phase. When the background curvature plays a significant role during the bounce, the pressure gradient term becomes important and we cannot trace the C mode in the expanding phase to the one before the bounce. In such situations, perturbations in a fluid bounce model show exponential instability, whereas perturbations in a scalar field bounce model show oscillatory behavior.

  5. Big Data Archives: Replication and synchronizing on a large scale

    NASA Astrophysics Data System (ADS)

    King, T. A.; Walker, R. J.

    2015-12-01

    Modern data archives provide unique challenges to replication and synchronization because of their large size. We collect more digital information today than at any time before, and the volume of data collected is continuously increasing. Some of these data are from unique observations, like those from planetary missions, that should be preserved for use by future generations. In addition, data from NASA missions are considered federal records and must be retained. While the data may be stored on resilient hardware (i.e. RAID systems), they must also be protected from local or regional disasters. Meeting this challenge requires creating multiple copies. This task is complicated by the fact that new data are constantly being added, creating what are called "active archives". Having reliable, high-performance tools for replicating and synchronizing active archives in a timely fashion is critical to preservation of the data. When archives were smaller, tools like bbcp, rsync and rcp worked fairly well. While these tools are effective, they are not optimized for synchronizing big data archives, and their poor performance at scale led us to develop a new tool designed specifically for big data archives. It combines the best features of git, bbcp, rsync and rcp. We call this tool "Mimic" and we discuss the design of the tool, performance comparisons and its use at NASA's Planetary Plasma Interactions (PPI) Node of the Planetary Data System (PDS).
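
    The abstract does not detail Mimic's internals, but the core operation of synchronizing an active archive — finding which files changed since the last pass and copying only those — can be sketched with a checksum manifest. This is a minimal single-threaded illustration in Python; the manifest format and function names are assumptions for this sketch, not the PPI Node's actual design.

```python
import hashlib
import json
import shutil
from pathlib import Path

def sha256sum(path: Path, chunk: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large archive files are never loaded whole."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            digest.update(block)
    return digest.hexdigest()

def sync(src: Path, dst: Path, manifest_file: Path) -> None:
    """Copy only the files whose checksum differs from the last recorded pass."""
    manifest = json.loads(manifest_file.read_text()) if manifest_file.exists() else {}
    for path in src.rglob("*"):
        if not path.is_file():
            continue
        rel = str(path.relative_to(src))
        checksum = sha256sum(path)
        if manifest.get(rel) != checksum:  # new or modified since the last pass
            target = dst / rel
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, target)
            manifest[rel] = checksum
    manifest_file.write_text(json.dumps(manifest, indent=2))
```

    A production tool for terabyte-scale active archives would additionally parallelize the hashing and transfers and verify copies after writing, which is where purpose-built tools gain their advantage over rsync and rcp at scale.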

  6. 'Big Data' can make a big difference: Applying Big Data to National Scale Change Analyses

    NASA Astrophysics Data System (ADS)

    Mueller, N. R.; Curnow, S.; Melrose, R.; Purss, M. B.; Lewis, A.

    2013-12-01

    The traditional method of change detection in remote sensing is based on acquiring a pair of images and conducting a set of analyses to determine what is different between them. The end result is a single change analysis for a single time period. While this may be repeated several times, it is generally a time consuming, often manual process providing a series of snapshots of change. As datasets become larger, and time series analyses become more sophisticated, these traditional methods of analysis are unviable. The Geoscience Australia 'Data Cube' provides a 25-year time series of all Landsat-5 and Landsat-7 data for the entire Australian continent. Each image is orthorectified to a standard set of pixel locations and is fully calibrated to a measure of surface reflectance (the 25m Australian Reflectance Grid [ARG25]). These surface reflectance measurements are directly comparable, between different scenes, and regardless of whether they are sourced from the Landsat-5 TM instrument or the Landsat-7 ETM+. The advantage of the Data Cube environment lies in the ability to apply an algorithm to every pixel across Australia (some 10^13 pixels) in a consistent way, enabling change analysis for every acquired observation. This provides a framework to analyse change through time on a scene to scene basis, and across national-scale areas for the entire duration of the archive. Two examples of applications of the Data Cube are described here: surface water extent mapping across Australia; and vegetation condition mapping across the Murray-Darling Basin, Australia's largest river system. Ongoing water mapping and vegetation condition mapping is required by the Australian government to produce information products for a range of requirements including ecological monitoring and emergency management risk planning. With a 25 year archive of Landsat-5 and Landsat-7 imagery hosted on an efficient High Performance Computing (HPC) environment, high speed analyses of long time
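
    To make the per-pixel, whole-archive style of analysis concrete, here is a hedged numpy sketch of the water-mapping case: given a time stack of per-acquisition water masks on a common pixel grid, it computes how often each pixel was observed as water. The synthetic stack and variable names are illustrative; a real Data Cube run streams tiles rather than holding the continent in memory.

```python
import numpy as np

# Hypothetical stack: one boolean water mask per acquisition (time, rows, cols),
# all resampled to the same 25 m pixel grid.
rng = np.random.default_rng(1)
water = rng.random((120, 512, 512)) > 0.9   # 120 acquisitions
valid = rng.random((120, 512, 512)) > 0.1   # cloud- and shadow-free observations

# Per-pixel frequency of water among valid observations only.
observed = valid.sum(axis=0)
wet = (water & valid).sum(axis=0)
frequency = np.divide(wet, observed,
                      out=np.zeros(observed.shape, dtype=np.float64),
                      where=observed > 0)
print("pixels inundated more than 25% of the time:", int((frequency > 0.25).sum()))
```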

  7. SETI as a part of Big History

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2014-08-01

    Big History is an emerging academic discipline which examines history scientifically from the Big Bang to the present. It uses a multidisciplinary approach based on combining numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It is taught at some universities. In a series of recent papers ([11] through [15] and [17] through [18]) and in a book [16], we developed a new mathematical model embracing Darwinian Evolution (RNA to Humans; see, in particular, [17]) and Human History (Aztecs to USA; see [16]), and then we extrapolated even that into the future up to ten million years (see [18]), the minimum time required for a civilization to expand to the whole Milky Way (Fermi paradox). In this paper, we further extend that model into the past so as to let it start at the Big Bang (13.8 billion years ago), thus merging Big History, Evolution on Earth and SETI (the modern Search for ExtraTerrestrial Intelligence) into a single body of knowledge of a statistical type. Our idea is that the Geometric Brownian Motion (GBM), so far used as the key stochastic process of financial mathematics (Black-Scholes models and the related 1997 Nobel Prize in Economics!), may be successfully applied to the whole of Big History. In particular, in this paper we derive Big History Theory based on GBMs: just as the GBM is the "movie" unfolding in time, so the Statistical Drake Equation is its "still picture", static in time, and the GBM is the time-extension of the Drake Equation. Darwinian Evolution on Earth may be easily described as an increasing GBM in the number of living species on Earth over the last 3.5 billion years. The first of them was RNA 3.5 billion years ago, and now 50
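
    As a toy illustration of the model's central object, the sketch below simulates Geometric Brownian Motion paths using the exact log-normal update; the drift and volatility values are placeholders, not Maccone's fitted parameters.

```python
import numpy as np

def gbm_paths(s0, mu, sigma, t_final, steps, n_paths, seed=0):
    """Simulate GBM, dS = mu*S dt + sigma*S dW, via its exact log-normal solution."""
    rng = np.random.default_rng(seed)
    dt = t_final / steps
    z = rng.standard_normal((n_paths, steps))
    log_increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
    log_paths = np.cumsum(log_increments, axis=1)
    start = np.zeros((n_paths, 1))  # every path begins at s0
    return s0 * np.exp(np.hstack([start, log_paths]))

# Placeholder parameters for an increasing GBM over 3.5 units of time
# (think billions of years of Darwinian Evolution).
paths = gbm_paths(s0=1.0, mu=1.5, sigma=0.5, t_final=3.5, steps=350, n_paths=1000)
print("mean final value:", paths[:, -1].mean())  # approaches s0 * exp(mu * t_final)
```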

  8. Supramolecular polymerisation in water; elucidating the role of hydrophobic and hydrogen-bond interactions

    PubMed Central

    Leenders, Christianus M. A.; Baker, Matthew B.; Pijpers, Imke A. B.; Lafleur, René P. M.; Albertazzi, Lorenzo

    2016-01-01

    Understanding the self-assembly of small molecules in water is crucial for the development of responsive, biocompatible soft materials. Here, a family of benzene-1,3,5-tricarboxamide (BTA) derivatives that comprise a BTA moiety connected to an amphiphilic chain is synthesised with the aim to elucidate the role of hydrophobic and hydrogen-bonding interactions in the self-assembly of these BTAs. The amphiphilic chain consists of an alkyl chain with a length of 10, 11, or 12 methylene units, connected to a tetraethylene glycol (at the periphery). The results show that an undecyl spacer is the minimum length required for these BTAs to self-assemble into supramolecular polymers. Interestingly, exchange studies reveal only minor differences in exchange rates between BTAs containing undecyl or dodecyl spacers. Additionally, IR spectroscopy provides the first experimental evidence that hydrogen-bonding is operative and contributes to the stabilisation of the supramolecular polymers in water. PMID:26892482

  9. Multi-Scale Change Detection Research of Remotely Sensed Big Data in CyberGIS

    NASA Astrophysics Data System (ADS)

    Xing, J.; Sieber, R.

    2015-12-01

    Big remotely sensed data, the heterogeneity of satellite platforms and file formats along with increasing volumes and velocities, offers new types of analyses. This makes big remotely sensed data a good candidate for CyberGIS, the aim of which is to enable knowledge discovery of big data in the cloud. We apply CyberGIS to feature-based multi-scale land use/cover change (LUCC) detection. There have been attempts to do multi-scale LUCC. However, studies were done with small data and could not consider the mismatch between multi-scale analysis and computational scale. They have yet to consider the possibilities for scalar research across numerous temporal and spatial scales afforded by big data, especially if we want to advance beyond pixel-based analysis and also reduce preprocessing requirements. We create a geospatial cyberinfrastructure (GCI) to handle multi-spatio-temporal scale change detection. We first clarify different meanings of scale in CyberGIS and LUCC to derive a feature scope layer in the GCI based on Stommel modelling. Our analysis layer contains a multi-scale segmentation-based method built on normalized cut image segmentation and wavelet-based image scaling algorithms. Our computer resource utilization layer uses Wang and Armstrong's (2009) method, mainly for memory, I/O and CPU time. Our case is urban-rural change detection in the Greater Montreal Area (5 time periods, 2006-2012, 100 virtual machines), covering 36,000 km^2 at resolutions varying from 0.6 m to 38 m. We present a ground-truthed accuracy assessment of a change matrix that is composed of 6 feature classes at 12 different spatio-temporal scales, and the performance of the change detection GCI for multi-scale LUCC study. The GCI allows us to extract and coordinate different types of changes by varying spatio-temporal scales from the big imagery datasets.
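
    The wavelet half of the analysis layer can be illustrated briefly: each level of a 2-D discrete wavelet transform halves the spatial resolution, and the approximation coefficients serve as the coarser-scale image. This sketch uses the PyWavelets package with a Haar wavelet on a synthetic tile; both choices are assumptions for illustration, not the authors' configuration.

```python
import numpy as np
import pywt

# Synthetic stand-in for a single-band image tile.
rng = np.random.default_rng(2)
image = rng.random((512, 512))

# Build a three-level pyramid: 512 -> 256 -> 128 -> 64 pixels per side.
# pywt.dwt2 returns (approximation, (horizontal, vertical, diagonal) details).
pyramid = [image]
for _ in range(3):
    approx, (horiz, vert, diag) = pywt.dwt2(pyramid[-1], "haar")
    pyramid.append(approx)

for level, img in enumerate(pyramid):
    print(f"level {level}: shape {img.shape}")
```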

  10. Pinna nobilis: A big bivalve with big haemocytes?

    PubMed

    Matozzo, V; Pagano, M; Spinelli, A; Caicci, F; Faggio, C

    2016-08-01

    The fan mussel Pinna nobilis (Linnaeus, 1758) is one of the biggest bivalves worldwide. Currently, no updated information is available in the literature concerning the morpho-functional aspects of haemocytes from this bivalve species. Consequently, in this study, we characterised P. nobilis haemocytes from both a morphological and functional point of view. The mean number of haemocytes was about 5 × 10^5 cells mL^-1 of haemolymph, and the cell viability was about 92-100%. Two haemocyte types were distinguished under the light microscope: granulocytes (51.6%), with evident cytoplasmic granules, and hyalinocytes (48.4%), with a few granules. The granules of the granulocytes were mainly lysosomes, as indicated by the in vivo staining with Neutral Red. Haemocytes were further distinguished into basophils (83.75%), acidophils (14.75%) and neutrophils (1.5%). After adhesion to slides and fixation, the cell diameter was approximately 10 μm for granulocytes and 7 μm for hyalinocytes. The granulocytes and hyalinocytes were both positive to the Periodic Acid-Schiff reaction for carbohydrates. Only granulocytes were able to phagocytise yeast cells. The phagocytic index (6%) increased significantly up to twofold after preincubation of yeast in cell-free haemolymph, suggesting that haemolymph has opsonising properties. In addition, haemocytes produce superoxide anion and acid and alkaline phosphatases. Summarising, this preliminary study indicates that both the granulocytes and hyalinocytes circulate in the haemolymph of P. nobilis and that they are active immunocytes. PMID:27346153

  11. Elevation of neuron specific enolase and brain iron deposition on susceptibility-weighted imaging as diagnostic clues for beta-propeller protein-associated neurodegeneration in early childhood: Additional case report and review of the literature.

    PubMed

    Takano, Kyoko; Shiba, Naoko; Wakui, Keiko; Yamaguchi, Tomomi; Aida, Noriko; Inaba, Yuji; Fukushima, Yoshimitsu; Kosho, Tomoki

    2016-02-01

    Beta-propeller protein-associated neurodegeneration (BPAN), also known as static encephalopathy of childhood with neurodegeneration in adulthood (SENDA), is a subtype of neurodegeneration with brain iron accumulation (NBIA). BPAN is caused by mutations in an X-linked gene WDR45 that is involved in autophagy. BPAN is characterized by developmental delay or intellectual disability until adolescence or early adulthood, followed by severe dystonia, parkinsonism, and progressive dementia. Brain magnetic resonance imaging (MRI) shows iron deposition in the bilateral globus pallidus (GP) and substantia nigra (SN). Clinical manifestations and laboratory findings in early childhood are limited. We report a 3-year-old girl with BPAN who presented with severe developmental delay and characteristic facial features. In addition to chronic elevation of serum aspartate transaminase, lactate dehydrogenase, creatine kinase, and soluble interleukin-2 receptor, she had persistent elevation of neuron specific enolase (NSE) in serum and cerebrospinal fluid. MRI using susceptibility-weighted imaging (SWI) demonstrated iron accumulation in the GP and SN bilaterally. Targeted next-generation sequencing identified a de novo splice-site mutation, c.831-1G>C in WDR45, which resulted in aberrant splicing evidenced by reverse transcriptase-PCR. Persistent elevation of NSE and iron deposition on SWI may provide clues for diagnosis of BPAN in early childhood. PMID:26481852

  12. Big Data - What is it and why it matters.

    PubMed

    Tattersall, Andy; Grant, Maria J

    2016-06-01

    Big data, like MOOCs, altmetrics and open access, is a term that has been commonplace in the library community for some time. Yet, despite its prevalence, many in the library and information sector remain unsure of the relationship between big data and their roles. This editorial explores what big data could mean for the day-to-day practice of health library and information workers, presenting examples of big data in action, considering the ethics of accessing big data sets and the potential for new roles for library and information workers. PMID:27168254

  13. The Visible--Light Magnetograph at the Big Bear Solar Observatory: Hardware and Software

    NASA Astrophysics Data System (ADS)

    Shumko, S.; Abramenko, V.; Denker, C.; Goode, P.; Tritschler, A.; Varsik, J.

    2005-12-01

    In this paper we report on the current status of the control and acquisition software package developed to control the visible-light imaging magnetograph (VIM) system at the Big Bear Solar Observatory (BBSO). The instrument is designed to perform high-spatial- and high-temporal-resolution observations of the solar photosphere and chromosphere utilizing the remodeled Coudé feed of the 65 cm vacuum telescope.

  14. Solution structure of leptospiral LigA4 Big domain.

    PubMed

    Mei, Song; Zhang, Jiahai; Zhang, Xuecheng; Tu, Xiaoming

    2015-11-13

    Pathogenic Leptospira species express immunoglobulin-like proteins which serve as adhesins to bind to the extracellular matrices of host cells. Leptospiral immunoglobulin-like protein A (LigA), a surface-exposed protein containing tandem repeats of bacterial immunoglobulin-like (Big) domains, has been proved to be involved in the interaction of pathogenic Leptospira with the mammalian host. In this study, the solution structure of the fourth Big domain of LigA (LigA4 Big domain) from Leptospira interrogans was solved by nuclear magnetic resonance (NMR). The structure of the LigA4 Big domain displays a bacterial immunoglobulin-like fold similar to those of other Big domains, implying some common structural aspects of the Big domain family. On the other hand, it displays some structural characteristics significantly different from the classic Ig-like domain. Furthermore, a Stains-all assay and NMR chemical shift perturbation revealed the Ca^2+-binding property of the LigA4 Big domain. PMID:26449456

  15. The big data-big model (BDBM) challenges in ecological research

    NASA Astrophysics Data System (ADS)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies in the ecological community. Many sensor networks have been established to collect data. For example, satellites such as Terra and OCO-2, among others, have collected data relevant to the global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedback of the terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from sensors from those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe major processes that underlie complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex in order to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. Those challenges include interoperability of multiple

  16. EDITORIAL: Big challenges and nanosolutions

    NASA Astrophysics Data System (ADS)

    Demming, Anna

    2011-07-01

    Population increases have triggered a number of concerns over the impact of human activity on the global environment. In addition, these anxieties are exacerbated by the trend towards high levels of energy consumption and waste generation in developed nations. Pollutants that figure highly in environmental debate include greenhouse gases from fuel combustion and waste decomposition [1] and nitrogen from fertilisers [2]. In fact, human activity is transforming the nitrogen cycle at a record pace [3], and the pressure on available natural resources is mounting. As a collaboration of researchers in Saudi Arabia and the US explain in this issue, 26 countries across the world do not have sufficient water resources to sustain agriculture and economic development, and approximately one billion people lack access to safe drinking water [4]. They also point out a number of ways the potential of nanoscience and technology can be harnessed to tackle the problem. The key to managing pollutants is their detection. The biodegradation of waste in landfill sites can generate a build-up of a number of greenhouse and other gases. Olfactometry using the human expertise of a trained test panel is not a viable option for continuous monitoring of potentially odourless gases on industrial scales with any valid objectivity. Researchers in Italy have fabricated forest-like structures of carbon nanotubes loaded with metal nanoparticles and unmodified nanotubes on low-cost iron-coated alumina substrates [1]. The structure was able to detect NO2 in a multicomponent gas mixture of CO2, CH4, H2, NH3, CO and NO2 with sensitivity better than one part per million. Nanostructures exhibit a number of properties that lend themselves to sensing applications. They often have unique electrical properties that are readily affected by their environment. Such features were exploited by researchers in China who created nanoporous structures in ZnO sheets that can detect formaldehyde and ammonia, the

  17. The Interplay of "Big Five" Personality Factors and Metaphorical Schemas: A Pilot Study with 20 Lung Transplant Recipients

    ERIC Educational Resources Information Center

    Goetzmann, Lutz; Moser, Karin S.; Vetsch, Esther; Grieder, Erhard; Klaghofer, Richard; Naef, Rahel; Russi, Erich W.; Boehler, Annette; Buddeberg, Claus

    2007-01-01

    The aim of the present study was to investigate the interplay between personality factors and metaphorical schemas. The "Big Five" personality factors of 20 patients after lung transplantation were examined with the NEO-FFI. Patients were questioned about their social network, and self- and body-image. The interviews were assessed with metaphor…

  18. Water quality time series for Big Melen stream (Turkey): its decomposition analysis and comparison to upstream.

    PubMed

    Karakaya, N; Evrendilek, F

    2010-06-01

    Big Melen stream is one of the major water resources providing 0.268 [corrected] km^3 year^-1 of drinking and municipal water for Istanbul. Monthly time series data between 1991 and 2004 for 25 chemical, biological, and physical water properties of Big Melen stream were separated into linear trend, seasonality, and error components using additive decomposition models. A water quality index (WQI) derived from 17 water quality variables was used to compare Aksu upstream and Big Melen downstream water quality. Twenty-six additive decomposition models of water quality time series data including WQI had R^2 values ranging from 88% for log(water temperature) (P < or = 0.001) to 3% for log(total dissolved solids) (P < or = 0.026). Linear trend models revealed that total hardness, calcium concentration, and log(nitrite concentration) had the highest rate of increase over time. Tukey's multiple comparison pointed to significant decreases in 17 water quality variables including WQI of Big Melen downstream relative to those of Aksu upstream (P < or = 0.001). Monitoring changes in water quality on the basis of watersheds through WQI and decomposition analysis of time series data paves the way for an adaptive management process of water resources that can be tailored in response to effectiveness and dynamics of management practices. PMID:19444637
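
    The additive decomposition used here separates each monthly series into trend, seasonal, and error components. A minimal sketch with statsmodels on a fabricated monthly series follows; the data are synthetic stand-ins, not the Big Melen measurements.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Fabricated monthly series, 1991-2004: linear trend + annual cycle + noise,
# mimicking the structure (not the values) of a water-quality variable.
idx = pd.date_range("1991-01-01", "2004-12-01", freq="MS")
t = np.arange(len(idx))
rng = np.random.default_rng(3)
series = pd.Series(
    0.02 * t                                # linear trend component
    + 2.0 * np.sin(2 * np.pi * t / 12)      # seasonality component
    + rng.normal(0.0, 0.5, len(idx)),       # error component
    index=idx,
)

result = seasonal_decompose(series, model="additive", period=12)
print(result.trend.dropna().head())   # estimated linear-trend component
print(result.seasonal.head(12))       # estimated annual cycle
```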

  19. The good body: when big is better.

    PubMed

    Cassidy, C M

    1991-09-01

    An important cultural question is, "What is a 'good'--desirable, beautiful, impressive--body?" The answers are legion; here I examine why bigger bodies represent survival skill, and how this power symbolism is embodied by behaviors that guide larger persons toward the top of the social hierarchy. Bigness is a complex concept comprising tallness, boniness, muscularity and fattiness. Data show that most people worldwide want to be big--both tall and fat. Those who achieve the ideal are disproportionately among the society's most socially powerful. In the food-secure West, fascination with power and the body has not waned, but has been redefined such that thinness is desired. This apparent anomaly is resolved by realizing that thinness in the midst of abundance--as long as one is also tall and muscular--still projects the traditional message of power, and brings such social boons as upward mobility. PMID:1961102

  20. Risk externalities and too big to fail

    NASA Astrophysics Data System (ADS)

    Taleb, Nassim N.; Tapiero, Charles S.

    2010-09-01

    This paper establishes the case for a fallacy of economies of scale in large aggregate institutions and the effects of scale risks. The problem of rogue trading and excessive risk taking is taken as a case example. Assuming (conservatively) that a firm's exposure and losses are limited to its capital while external losses are unbounded, we establish a condition for a firm not to be allowed to be too big to fail. In such a case, the second derivative of expected external losses with respect to the firm's capital at risk is positive. Examples and analytical results are obtained based on simplifying assumptions and focusing exclusively on the risk externalities that firms too big to fail can have.
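
    Stated symbolically, the condition reads as convexity of expected external losses in the firm's capital at risk. The following is a hedged transcription of the sentence above; the notation is mine, not the authors'.

```latex
% C        : the firm's capital at risk
% L_ext(C) : external losses the firm can impose on others (unbounded)
% A firm should not be allowed to become too big to fail when expected
% external losses are convex in C:
\[
  \frac{\partial^{2}}{\partial C^{2}} \, \mathbb{E}\!\left[ L_{\mathrm{ext}}(C) \right] > 0
\]
```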

  1. Big bang nucleosynthesis - Theories and observations

    NASA Astrophysics Data System (ADS)

    Boesgaard, A. M.; Steigman, G.

    The evidence in support of the nearly universally accepted hot big bang model of cosmology is almost exclusively related to the blackbody spectrum of the microwave background. Primordial nucleosynthesis provides a unique opportunity to test the assumptions of the 'standard' model. The present review provides a summary of the predictions of the standard model, taking into account also a critical evaluation of the implications of the current observational data. The standard hot big bang model is discussed, taking into account the epoch of nucleosynthesis, the primordial abundances, uncertainties in the predicted abundances, and possible variations on the theme of the standard model. The observed abundances are considered, giving attention to destruction and production during galactic evolution, deuterium, He-3, He-4, lithium, and an abundance summary. Predicted and observed abundances are compared, and cosmological constraints are examined.

  2. Methodological challenges and analytic opportunities for modeling and interpreting Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    2016-01-01

    Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'. PMID:26918190

  3. Anticipated Changes in Conducting Scientific Data-Analysis Research in the Big-Data Era

    NASA Technical Reports Server (NTRS)

    Kuo, Kwo-Sen; Seablom, Michael; Clune, Thomas; Ramachandran, Rahul

    2014-01-01

    A Big-Data environment is one that is capable of orchestrating quick-turnaround analyses involving large volumes of data for numerous simultaneous users. Based on our experiences with a prototype Big-Data analysis environment, we anticipate some important changes in research behaviors and processes while conducting scientific data-analysis research in the near future as such Big-Data environments become the mainstream. The first anticipated change will be the reduced effort and difficulty in most parts of the data management process. A Big-Data analysis environment is likely to house most of the data required for a particular research discipline along with appropriate analysis capabilities. This will reduce the need for researchers to download local copies of data. In turn, this also reduces the need for compute and storage procurement by individual researchers or groups, as well as associated maintenance and management afterwards. It is almost certain that Big-Data environments will require a different "programming language" to fully exploit the latent potential. In addition, the process of extending the environment to provide new analysis capabilities will likely be more involved than, say, compiling a piece of new or revised code. We thus anticipate that researchers will require support from dedicated organizations associated with the environment that are composed of professional software engineers and data scientists. A major benefit will likely be that such extensions are of higher quality and broader applicability than ad hoc changes by physical scientists. Another anticipated significant change is improved collaboration among the researchers using the same environment. Since the environment is homogeneous within itself, many barriers to collaboration are minimized or eliminated. For example, data and analysis algorithms can be seamlessly shared, reused and re-purposed. In conclusion, we will be able to achieve a new level of scientific productivity in the Big

  4. Big Data Analytics for Disaster Preparedness and Response of Mobile Communication Infrastructure during Natural Hazards

    NASA Astrophysics Data System (ADS)

    Zhong, L.; Takano, K.; Ji, Y.; Yamada, S.

    2015-12-01

    The disruption of telecommunications is one of the most critical disasters during natural hazards. With the rapid expansion of mobile communications, the mobile communication infrastructure plays a very fundamental role in disaster response and recovery activities. For this reason, its disruption will lead to loss of life and property, due to information delays and errors. Therefore, disaster preparedness and response of the mobile communication infrastructure itself is quite important. In many past disasters, the disruption of mobile communication networks has been caused by network congestion and subsequent long-term power outages. In order to reduce this disruption, knowledge of communication demands during disasters is necessary. Big data analytics provides a very promising way to predict communication demands by analyzing the large volume of operational data of mobile users in a large-scale mobile network. Under the US-Japan collaborative project on 'Big Data and Disaster Research (BDD)' supported by the Japan Science and Technology Agency (JST) and the National Science Foundation (NSF), we are going to investigate the application of big data techniques in the disaster preparedness and response of mobile communication infrastructure. Specifically, in this research, we exploit the large amount of operational information of mobile users to predict communication needs at different times and locations. By incorporating other data, such as the shake distribution of an estimated major earthquake and the power outage map, we are able to predict the distribution of stranded people who cannot confirm their safety or ask for help due to network disruption. In addition, this result could further help network operators assess the vulnerability of their infrastructure and make suitable decisions for disaster preparedness and response. In this presentation, we are going to introduce the

  5. Anticipated Changes in Conducting Scientific Data-Analysis Research in the Big-Data Era

    NASA Astrophysics Data System (ADS)

    Kuo, Kwo-Sen; Seablom, Michael; Clune, Thomas; Ramachandran, Rahul

    2014-05-01

    A Big-Data environment is one that is capable of orchestrating quick-turnaround analyses involving large volumes of data for numerous simultaneous users. Based on our experiences with a prototype Big-Data analysis environment, we anticipate some important changes in research behaviors and processes while conducting scientific data-analysis research in the near future as such Big-Data environments become the mainstream. The first anticipated change will be the reduced effort and difficulty in most parts of the data management process. A Big-Data analysis environment is likely to house most of the data required for a particular research discipline along with appropriate analysis capabilities. This will reduce the need for researchers to download local copies of data. In turn, this also reduces the need for compute and storage procurement by individual researchers or groups, as well as associated maintenance and management afterwards. It is almost certain that Big-Data environments will require a different "programming language" to fully exploit the latent potential. In addition, the process of extending the environment to provide new analysis capabilities will likely be more involved than, say, compiling a piece of new or revised code. We thus anticipate that researchers will require support from dedicated organizations associated with the environment that are composed of professional software engineers and data scientists. A major benefit will likely be that such extensions are of higher-quality and broader applicability than ad hoc changes by physical scientists. Another anticipated significant change is improved collaboration among the researchers using the same environment. Since the environment is homogeneous within itself, many barriers to collaboration are minimized or eliminated. For example, data and analysis algorithms can be seamlessly shared, reused and re-purposed. In conclusion, we will be able to achieve a new level of scientific productivity in the

  6. EDITORIAL: Big challenges and nanosolutions

    NASA Astrophysics Data System (ADS)

    Demming, Anna

    2011-07-01

    Population increases have triggered a number of concerns over the impact of human activity on the global environment. In addition, these anxieties are exacerbated by the trend towards high levels of energy consumption and waste generation in developed nations. Pollutants that figure highly in environmental debate include greenhouse gases from fuel combustion and waste decomposition [1] and nitrogen from fertilisers [2]. In fact, human activity is transforming the nitrogen cycle at a record pace [3], and the pressure on available natural resources is mounting. As a collaboration of researchers in Saudi Arabia and the US explain in this issue, 26 countries across the world do not have sufficient water resources to sustain agriculture and economic development, and approximately one billion people lack access to safe drinking water [4]. They also point out a number of ways the potential of nanoscience and technology can be harnessed to tackle the problem. The key to managing pollutants is their detection. The biodegradation of waste in landfill sites can generate a build-up of a number of greenhouse and other gases. Olfactometry using the human expertise of a trained test panel is not a viable option for continuous monitoring of potentially odourless gases on industrial scales with any valid objectivity. Researchers in Italy have fabricated forest-like structures of carbon nanotubes loaded with metal nanoparticles and unmodified nanotubes on low-cost iron-coated alumina substrates [1]. The structure was able to detect NO2 in a multicomponent gas mixture of CO2, CH4, H2, NH3, CO and NO2 with sensitivity better than one part per million. Nanostructures exhibit a number of properties that lend themselves to sensing applications. They often have unique electrical properties that are readily affected by their environment. Such features were exploited by researchers in China who created nanoporous structures in ZnO sheets that can detect formaldehyde and ammonia, the

  7. Livermore Big Trees Park: 1998 summary results

    SciTech Connect

    Gallegos, G; MacQueen, D; Surano, K

    1999-08-13

    This report summarizes work conducted in 1998 by the Lawrence Livermore National Laboratory (LLNL) to determine the extent and origin of plutonium at concentrations above background levels at Big Trees Park in the city of Livermore. This summary includes the project background and sections that explain the sampling, radiochemical and data analysis, and data interpretation. This report is a summary report only and is not intended as a rigorous technical or statistical analysis of the data.

  8. Big Data Challenges for Large Radio Arrays

    NASA Technical Reports Server (NTRS)

    Jones, Dayton L.; Wagstaff, Kiri; Thompson, David; D'Addario, Larry; Navarro, Robert; Mattmann, Chris; Majid, Walid; Lazio, Joseph; Preston, Robert; Rebbapragada, Umaa

    2012-01-01

    Future large radio astronomy arrays, particularly the Square Kilometre Array (SKA), will be able to generate data at rates far higher than can be analyzed or stored affordably with current practices. This is, by definition, a "big data" problem, and requires an end-to-end solution if future radio arrays are to reach their full scientific potential. Similar data processing, transport, storage, and management challenges face next-generation facilities in many other fields.

  9. Big bang models in string theory

    NASA Astrophysics Data System (ADS)

    Craps, Ben

    2006-11-01

    These proceedings are based on lectures delivered at the 'RTN Winter School on Strings, Supergravity and Gauge Theories', CERN, 16-20 January 2006. The school was mainly aimed at PhD students and young postdocs. The lectures start with a brief introduction to spacetime singularities and the string theory resolution of certain static singularities. Then they discuss attempts to resolve cosmological singularities in string theory, mainly focusing on two specific examples: the Milne orbifold and the matrix big bang.

  10. Can big business save health care?

    PubMed

    Dunn, Philip

    2007-01-01

    Corporate America has decided to stop bellyaching about the cost and quality of the health care it helps buy for its employees. Now it's taking concrete action. Large employers such as Wal-Mart, Oracle, Cisco, BP America and many, many others are pressuring providers to meet performance standards, adopt information technology and transform the efficiency of their operations. Big Business wants value for its buck, and it's now putting money where its mouth is. PMID:17302135

  11. Funding big research with small money.

    PubMed

    Hickey, Joanne V; Koithan, Mary; Unruh, Lynn; Lundmark, Vicki

    2014-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the guest authors introduce crowdsourcing as a strategy for funding big research with small money. PMID:24853791

  12. Big Bend National Park, TX, USA, Mexico

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Sierra del Carmen of Mexico, across the Rio Grande River from Big Bend National Park, TX, (28.5N, 104.0W) is centered in this photo. The Rio Grande River bisects the scene; Mexico to the east, USA to the west. The thousand ft. Boquillas limestone cliff on the Mexican side of the river changes colors from white to pink to lavender at sunset. This severely eroded sedimentary landscape was once an ancient seabed later overlaid with volcanic activity.

  13. Adapting bioinformatics curricula for big data.

    PubMed

    Greene, Anna C; Giffin, Kristine A; Greene, Casey S; Moore, Jason H

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469

  14. Bohmian quantization of the big rip

    SciTech Connect

    Pinto-Neto, Nelson; Pantoja, Diego Moraes

    2009-10-15

    It is shown in this paper that minisuperspace quantization of homogeneous and isotropic geometries with phantom scalar fields, when examined in the light of the Bohm-de Broglie interpretation of quantum mechanics, does not eliminate, in general, the classical big rip singularity present in the classical model. For some values of the Hamilton-Jacobi separation constant present in a class of quantum state solutions of the Wheeler-De Witt equation, the big rip can be either completely eliminated or may still constitute a future attractor for all expanding solutions. This is contrary to the conclusion presented in [M. P. Dabrowski, C. Kiefer, and B. Sandhofer, Phys. Rev. D 74, 044022 (2006)], using a different interpretation of the wave function, where the big rip singularity is completely eliminated ('smoothed out') through quantization, independently of such a separation constant and for all members of the above mentioned class of solutions. This is an example of the very peculiar situation where different interpretations of the same quantum state of a system are predicting different physical facts, instead of just giving different descriptions of the same observable facts: in fact, there is nothing more observable than the fate of the whole Universe.

  15. Adapting bioinformatics curricula for big data

    PubMed Central

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S.

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469

  16. Bohmian quantization of the big rip

    NASA Astrophysics Data System (ADS)

    Pinto-Neto, Nelson; Pantoja, Diego Moraes

    2009-10-01

    It is shown in this paper that minisuperspace quantization of homogeneous and isotropic geometries with phantom scalar fields, when examined in the light of the Bohm-de Broglie interpretation of quantum mechanics, does not eliminate, in general, the classical big rip singularity present in the classical model. For some values of the Hamilton-Jacobi separation constant present in a class of quantum state solutions of the Wheeler-De Witt equation, the big rip can be either completely eliminated or may still constitute a future attractor for all expanding solutions. This is contrary to the conclusion presented in [M. P. Dabrowski, C. Kiefer, and B. Sandhofer, Phys. Rev. D 74, 044022 (2006); doi:10.1103/PhysRevD.74.044022], using a different interpretation of the wave function, where the big rip singularity is completely eliminated (“smoothed out”) through quantization, independently of such a separation constant and for all members of the above mentioned class of solutions. This is an example of the very peculiar situation where different interpretations of the same quantum state of a system are predicting different physical facts, instead of just giving different descriptions of the same observable facts: in fact, there is nothing more observable than the fate of the whole Universe.

  17. Buckling analysis of Big Dee Vacuum Vessel

    SciTech Connect

    Lightner, S.; Gallix, R.

    1983-12-01

    A simplified three-dimensional shell buckling analysis of the GA Technologies Inc. Big Dee Vacuum Vessel (V/V) was performed using the finite element program TRICO. A coarse-mesh linear elastic model, which accommodated the support boundary conditions, was used to determine the buckling mode shape under a uniform external pressure. Using this buckling mode shape, refined models were used to calculate the linear buckling load (P_crit) more accurately. Several different designs of the Big Dee V/V were considered in this analysis. The supports for the V/V were equally-spaced radial pins at the outer diameter of the mid-plane. For all the cases considered, the buckling mode was axisymmetric in the toroidal direction. Therefore, it was possible to use only a small angular sector of a toric shell for the refined analysis. P_crit for the Big Dee is about 60 atm for a uniform external pressure. Also investigated in this analysis were the effects of geometrical imperfections and non-uniform pressure distributions.

  18. Quark mass variation constraints from Big Bang nucleosynthesis

    SciTech Connect

    Bedaque, P; Luu, T; Platter, L

    2010-12-13

    We study the impact of a variation of the quark masses at the time of Big Bang nucleosynthesis (BBN) on the primordial abundances of the light elements. In order to navigate through the particle and nuclear physics required to connect quark masses to binding energies and reaction rates in a model-independent way, we use lattice QCD data and a hierarchy of effective field theories. We find that the measured ⁴He abundances put a bound of −1% ≲ δm_q/m_q ≲ 0.7%. The effect of quark mass variations on the deuterium abundances can be largely compensated by changes of the baryon-to-photon ratio η. Including the bounds on the variation of η coming from WMAP results and some additional assumptions narrows the range of allowed values of δm_q/m_q somewhat.

  19. Big Data Visual Analytics for Exploratory Earth System Simulation Analysis

    SciTech Connect

    Steed, Chad A.; Ricciuto, Daniel M.; Shipman, Galen M.; Smith, Brian E.; Thornton, Peter E.; Wang, Dali; Shi, Xiaoying; Williams, Dean N.

    2013-12-01

    Rapid increases in high performance computing are feeding the development of larger and more complex data sets in climate research, which sets the stage for so-called big data analysis challenges. However, conventional climate analysis techniques are inadequate in dealing with the complexities of today's data. In this paper, we describe and demonstrate a visual analytics system, called the Exploratory Data analysis ENvironment (EDEN), with specific application to the analysis of complex earth system simulation data sets. EDEN represents the type of interactive visual analysis tools that are necessary to transform data into insight, thereby improving critical comprehension of earth system processes. In addition to providing an overview of EDEN, we describe real-world studies using both point ensembles and global Community Land Model Version 4 (CLM4) simulations.

  20. Improving survival after endometrial cancer: the big picture

    PubMed Central

    2015-01-01

    To improve survival in women with endometrial cancer, we need to look at the "big picture" beyond initial treatment. Although the majority of women will be diagnosed with early stage disease and are cured with surgery alone, there is a subgroup of women with advanced and high-risk early stage disease whose life expectancy may be prolonged with the addition of chemotherapy. Immunohistochemistry will help to identify those women with Lynch syndrome who will benefit from more frequent colorectal cancer surveillance and genetic counseling. If they happen to be diagnosed with colorectal cancer, this information has an important therapeutic implication. And finally, because the majority of women will survive their diagnosis of endometrial cancer, they remain at risk for breast and colorectal cancer, so these women should be counselled about screening for these cancers. These three interventions will contribute to improving the overall survival of women with endometrial cancer. PMID:26197859

  1. Building Simulation Modelers are we big-data ready?

    SciTech Connect

    Sanyal, Jibonananda; New, Joshua Ryan

    2014-01-01

    Recent advances in computing and sensor technologies have pushed the amount of data we collect or generate to limits previously unheard of. Sub-minute resolution data from dozens of channels is becoming increasingly common and is expected to increase with the prevalence of non-intrusive load monitoring. Experts are running larger building simulation experiments and are faced with increasingly complex data sets from which to derive meaningful insight. This paper focuses on the data management challenges that building modeling experts may face in data collected from a large array of sensors, or generated from running a large number of building energy/performance simulations. The paper highlights the technical difficulties that were encountered and overcome in order to run 3.5 million EnergyPlus simulations on supercomputers, generating over 200 TB of simulation output. This extreme case involved development of technologies and insights that will be beneficial to modelers in the immediate future. The paper discusses different database technologies (including relational databases, columnar storage, and schema-less Hadoop) in order to contrast the advantages and disadvantages of employing each for storage of EnergyPlus output. Scalability, analysis requirements, and the adaptability of these database technologies are discussed. Additionally, unique attributes of EnergyPlus output are highlighted which make data-entry non-trivial for multiple simulations. Practical experience regarding cost-effective strategies for big-data storage is provided. The paper also discusses network performance issues when transferring large amounts of data across a network to different computing devices. Practical issues involving lag, bandwidth, and methods for synchronizing or transferring logical portions of the data are presented. A cornerstone of big data is its use for analytics; data is useless unless information can be meaningfully derived from it. In addition to technical
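    As an illustration of why columnar storage suits this kind of access pattern, the sketch below writes multi-channel simulation output to a columnar file and later reads back a single channel; the channel names and file layout are invented for illustration and are not those of the paper's EnergyPlus pipeline (pandas with the pyarrow backend is assumed):

```python
# Sketch: columnar storage of multi-channel simulation output. Columnar
# formats such as Parquet let an analysis read only the channels it needs,
# which is the trade-off the paper weighs against relational and schema-less
# (Hadoop) storage. Requires the pyarrow (or fastparquet) backend.
import numpy as np
import pandas as pd

minutes = pd.date_range("2014-01-01", periods=1440, freq="min")
out = pd.DataFrame({
    "timestamp": minutes,
    # two of what would be dozens of sub-minute channels (synthetic values)
    "zone_temp_C": 21 + np.random.randn(1440) * 0.3,
    "hvac_power_kW": np.abs(np.random.randn(1440)) * 5,
})
out.to_parquet("run_000001.parquet")   # one file per simulation run

# A later analysis touches a single column without scanning the rest:
power = pd.read_parquet("run_000001.parquet", columns=["hvac_power_kW"])
print(power["hvac_power_kW"].mean())
```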

  2. 76 FR 29786 - Environmental Impact Statement for the Big Cypress National Preserve Addition, Florida

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-23

    ... Floodplain Statement of Findings for the General Management Plan/Wilderness Study/ Off-Road Vehicle... availability of the Record of Decision (ROD) and Floodplain Statement of Findings for the General Management... Floodplain Statement of Findings. ADDRESSES: The ROD is available online at...

  3. Addition goes where the big numbers are: evidence for a reversed operational momentum effect.

    PubMed

    Pinhas, Michal; Shaki, Samuel; Fischer, Martin H

    2015-08-01

    Number processing evokes spatial biases, both when dealing with single digits and in more complex mental calculations. Here we investigated whether these two biases have a common origin, by examining their flexibility. Participants pointed to the locations of arithmetic results on a visually presented line with an inverted, right-to-left number arrangement. We found directionally opposite spatial biases for mental arithmetic and for a parity task administered both before and after the arithmetic task. We discuss implications of this dissociation in our results for the task-dependent cognitive representation of numbers. PMID:25504457

  4. Using Data and Big Ideas: Teaching Distribution as an Instance of Repeated Addition. CRESST Report 734

    ERIC Educational Resources Information Center

    Vendlinski, Terry P.; Howard, Keith E.; Hemberg, Bryan C.; Vinyard, Laura; Martel, Annabel; Kyriacou, Elizabeth; Casper, Jennifer; Chai, Yourim; Phelan, Julia C.; Baker, Eva L.

    2008-01-01

    The inability of students to become proficient in algebra seems to be widespread in American schools. One of the reasons often cited for this inability is that instruction seldom builds on prior knowledge. Research suggests that teacher effectiveness is the most critical controllable variable in improving student achievement. This report details a…

  5. When small is better than BIG

    SciTech Connect

    McDaniel, Hunter; Beard, Matthew C; Wheeler, Lance M; Pietryga, Jeffrey M

    2013-07-18

    Representing the Center for Advanced Solar Photophysics (CASP), this document is one of the entries in the Ten Hundred and One Word Challenge and was awarded “Overall Winner Runner-up and People’s Choice Winner.” As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE: energy. The mission of CASP is to explore and exploit the unique physics of nanostructured materials to boost the efficiency of solar energy conversion through novel light-matter interactions, controlled excited-state dynamics, and engineered carrier-carrier coupling.

  6. Big Soda Lake (Nevada). 2. Pelagic sulfate reduction

    USGS Publications Warehouse

    Smith, Richard L.; Oremland, Ronald S.

    1987-01-01

    The epilimnion of hypersaline, alkaline, meromictic Big Soda Lake contains an average 58 mmol sulfate liter−1 and 0.4 µmol dissolved iron liter−1. The monimolimnion, which is permanently anoxic, has a sulfide concentration ranging seasonally from 4 to 7 mmol liter−1. Depth profiles of sulfate reduction in the monimolimnion, assayed with a ³⁵S tracer technique and in situ incubations, demonstrated that sulfate reduction occurs within the water column of this extreme environment. The average rate of reduction in the monimolimnion was 3 µmol sulfate liter−1 d−1 in May compared to 0.9 in October. These values are comparable to rates of sulfate reduction reported for anoxic waters of more moderate environments. Sulfate reduction also occurred in the anoxic zone of the mixolimnion, though at significantly lower rates (0.025–0.090 µmol liter−1 d−1 at 25 m). Additions of FeS (1.0 mmol liter−1) doubled the endogenous rate of sulfate reduction in the monimolimnion, while MnS and kaolinite had no effect. These results suggest that sulfate reduction in Big Soda Lake is iron limited and controlled by seasonal variables other than temperature. Estimates of the organic carbon mineralized by sulfate reduction exceed measured fluxes of particulate organic carbon sinking from the mixolimnion. Thus, additional sources of electron donors (other than those derived from the sinking of pelagic autotrophs) may also fuel monimolimnetic sulfate reduction in the lake.
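    The comparison between sulfate reduction and sinking organic carbon rests on the standard stoichiometry of organoclastic sulfate reduction, which consumes two moles of organic carbon per mole of sulfate. A back-of-envelope version of that conversion (illustrative only, not the authors' exact calculation):

```latex
% Organoclastic sulfate reduction consumes 2 C per sulfate:
2\,\mathrm{CH_2O} + \mathrm{SO_4^{2-}} \longrightarrow 2\,\mathrm{HCO_3^-} + \mathrm{H_2S}
% So the May rate implies, per liter and day:
3\ \mu\mathrm{mol\ SO_4^{2-}\,L^{-1}\,d^{-1}} \times 2
  \approx 6\ \mu\mathrm{mol\ C\,L^{-1}\,d^{-1}}\ \text{mineralized}
```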

  7. Big Data and Deep data in scanning and electron microscopies: functionality from multidimensional data sets

    SciTech Connect

    Belianinov, Alex; Vasudevan, Rama K; Strelcov, Evgheni; Steed, Chad A; Yang, Sang Mo; Tselev, Alexander; Jesse, Stephen; Biegalski, Michael D; Shipman, Galen M; Symons, Christopher T; Borisevich, Albina Y; Archibald, Richard K; Kalinin, Sergei

    2015-01-01

    The development of electron and scanning probe microscopies in the second half of the twentieth century has produced spectacular images of the internal structure and composition of matter at nanometer, molecular, and atomic resolution. Largely, this progress was enabled by computer-assisted methods of microscope operation, data acquisition and analysis. Progress in imaging technologies in the beginning of the twenty-first century has opened the proverbial floodgates of high-veracity information on structure and functionality. High resolution imaging now allows information on atomic positions with picometer precision, allowing for quantitative measurements of individual bond lengths and angles. Functional imaging often leads to multidimensional data sets containing partial or full information on properties of interest, acquired as a function of multiple parameters (time, temperature, or other external stimuli). Here, we review several recent applications of big and deep data analysis methods to visualize, compress, and translate imaging data into physically and chemically relevant information.

  8. Optical Investigation of a Connecting-rod Big End Bearing Model Under Dynamic Loads

    NASA Astrophysics Data System (ADS)

    Optasanu, V.; Bonneau, D.

    A new experimental device for optical investigation of the transient elastohydrodynamic behaviour of connecting-rod big end bearing models is presented. The photoelasticity method and the digital image correlation method are used to visualise isochromatic fringe patterns and displacement fields, respectively. The recording methods are validated in the dynamic regime. An isochromatic fringe pattern of the whole bearing is reconstructed from images taken of different regions of the model. An example of displacement visualisation at the cap/body interface of the bearing is presented.

  9. Occurrence and transport of nitrogen in the Big Sunflower River, northwestern Mississippi, October 2009-June 2011

    USGS Publications Warehouse

    Barlow, Jeannie R.B.; Coupe, Richard H.

    2014-01-01

    The Big Sunflower River Basin, located within the Yazoo River Basin, is subject to large annual inputs of nitrogen from agriculture, atmospheric deposition, and point sources. Understanding how nutrients are transported in, and downstream from, the Big Sunflower River is key to quantifying their eutrophying effects on the Gulf. Recent results from two Spatially Referenced Regressions on Watershed attributes (SPARROW models), which include the Big Sunflower River, indicate minimal losses of nitrogen in stream reaches typical of the main channels of major river systems. If SPARROW assumptions of relatively conservative transport of nitrogen are correct and surface-water losses through the bed of the Big Sunflower River are negligible, then options for managing nutrient loads to the Gulf of Mexico may be limited. Simply put, if every pound of nitrogen entering the Delta is eventually delivered to the Gulf, then the only effective nutrient management option in the Delta is to reduce inputs. If, on the other hand, it can be shown that processes within river channels of the Mississippi Delta act to reduce the mass of nitrogen in transport, other hydrologic approaches may be designed to further limit nitrogen transport. Direct validation of existing SPARROW models for the Delta is a first step in assessing the assumptions underlying those models. In order to characterize spatial and temporal variability of nitrogen in the Big Sunflower River Basin, water samples were collected at four U.S. Geological Survey gaging stations located on the Big Sunflower River between October 1, 2009, and June 30, 2011. Nitrogen concentrations were generally highest at each site during the spring of the 2010 water year and the fall and winter of the 2011 water year. Additionally, the dominant form of nitrogen varied between sites. For example, in samples collected from the most upstream site (Clarksdale), the concentration of organic nitrogen was generally higher than the concentrations of

  10. Combining Human Computing and Machine Learning to Make Sense of Big (Aerial) Data for Disaster Response.

    PubMed

    Ofli, Ferda; Meier, Patrick; Imran, Muhammad; Castillo, Carlos; Tuia, Devis; Rey, Nicolas; Briant, Julien; Millet, Pauline; Reinhard, Friedrich; Parkan, Matthew; Joost, Stéphane

    2016-03-01

    Aerial imagery captured via unmanned aerial vehicles (UAVs) is playing an increasingly important role in disaster response. Unlike satellite imagery, aerial imagery can be captured and processed within hours rather than days. In addition, the spatial resolution of aerial imagery is an order of magnitude higher than the imagery produced by the most sophisticated commercial satellites today. Both the United States Federal Emergency Management Agency (FEMA) and the European Commission's Joint Research Center (JRC) have noted that aerial imagery will inevitably present a big data challenge. The purpose of this article is to get ahead of this future challenge by proposing a hybrid crowdsourcing and real-time machine learning solution to rapidly process large volumes of aerial data for disaster response in a time-sensitive manner. Crowdsourcing can be used to annotate features of interest in aerial images (such as damaged shelters and roads blocked by debris). These human-annotated features can then be used to train a supervised machine learning system to learn to recognize such features in new unseen images. In this article, we describe how this hybrid solution for image analysis can be implemented as a module (i.e., Aerial Clicker) to extend an existing platform called Artificial Intelligence for Disaster Response (AIDR), which has already been deployed to classify microblog messages during disasters using its Text Clicker module and in response to Cyclone Pam, a category 5 cyclone that devastated Vanuatu in March 2015. The hybrid solution we present can be applied to both aerial and satellite imagery and has applications beyond disaster response such as wildlife protection, human rights, and archeological exploration. As a proof of concept, we recently piloted this solution using very high-resolution aerial photographs of a wildlife reserve in Namibia to support rangers with their wildlife conservation efforts (SAVMAP project, http://lasig.epfl.ch/savmap). The
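    A minimal sketch of the crowd-to-machine hand-off described above, with random vectors standing in for real image features; none of the names below come from AIDR or the Aerial Clicker, and a real system would extract visual features (or a CNN embedding) from each tile:

```python
# Volunteers label image tiles (e.g., "damaged shelter" vs. "no damage");
# those labels train a supervised model that then scores unseen tiles.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_labeled, n_features = 500, 64

X_crowd = rng.normal(size=(n_labeled, n_features))  # tile feature vectors
y_crowd = rng.integers(0, 2, size=n_labeled)        # crowd-supplied labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_crowd, y_crowd)

X_new = rng.normal(size=(10, n_features))           # tiles from a new flight
damage_prob = clf.predict_proba(X_new)[:, 1]        # triage score per tile
print(damage_prob.round(2))
```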

  11. "small problems, Big Trouble": An Art and Science Collaborative Exhibition Reflecting Seemingly small problems Leading to Big Threats

    NASA Astrophysics Data System (ADS)

    Waller, J. L.; Brey, J. A.

    2014-12-01

    disasters continues to inspire new chapters in their "Layers: Places in Peril" exhibit! A slide show includes images of paintings for "small problems, Big Trouble". Brey and Waller will lead a discussion on their process of incorporating broader collaboration with geoscientists and others in an educational art exhibition.

  12. Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography.

    PubMed

    Jesse, S; Chi, M; Belianinov, A; Beekman, C; Kalinin, S V; Borisevich, A Y; Lupini, A R

    2016-01-01

    Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. Here, we discuss the application of so-called "big-data" methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. However, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy. PMID:27211523
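    A minimal sketch of this kind of unsupervised analysis, assuming each probe position yields a high-dimensional signal that is flattened to one feature vector per pixel; the data here are synthetic, whereas the paper worked with BiFeO3 ptychography data:

```python
# k-means on flattened per-pixel signals: the algorithm is purely statistical
# and is given no prior information about the material or its phases, yet the
# cluster labels reproduce the two "domains" planted in the synthetic data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
nx, ny, nsig = 32, 32, 256            # scan grid and signal length (synthetic)
signals = rng.normal(size=(nx * ny, nsig))
signals[: nx * ny // 2] += 0.5        # imitate two structural domains

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(signals)
domain_map = labels.reshape(nx, ny)   # spatial map of the recovered domains
print(np.bincount(labels))
```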

  13. Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography

    PubMed Central

    Jesse, S.; Chi, M.; Belianinov, A.; Beekman, C.; Kalinin, S. V.; Borisevich, A. Y.; Lupini, A. R.

    2016-01-01

    Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. Here, we discuss the application of so-called “big-data” methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. However, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy. PMID:27211523

  14. Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography

    NASA Astrophysics Data System (ADS)

    Jesse, S.; Chi, M.; Belianinov, A.; Beekman, C.; Kalinin, S. V.; Borisevich, A. Y.; Lupini, A. R.

    2016-05-01

    Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. Here, we discuss the application of so-called “big-data” methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. However, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy.

  15. Classification Algorithms for Big Data Analysis, a Map Reduce Approach

    NASA Astrophysics Data System (ADS)

    Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.

    2015-03-01

    For many years, the scientific community has been concerned with how to increase the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data that is being generated every day by remote sensors raises more challenges to be overcome. In this work, a tool within the scope of InterIMAGE Cloud Platform (ICP), which is an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
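    A toy imitation of the map-reduce classification pattern described above, with Python multiprocessing standing in for Hadoop and scikit-learn's linear SVM standing in for WEKA's; this is a sketch of the general scheme, not the ICP: Data Mining Package itself:

```python
# "Map" tasks classify disjoint chunks of pixels with a shared trained model;
# a trivial "reduce" stitches the results. On Hadoop the chunks would be HDFS
# input splits distributed across the cluster.
import numpy as np
from multiprocessing import Pool
from sklearn.svm import LinearSVC

rng = np.random.default_rng(2)
X_train = rng.normal(size=(200, 8))
y_train = (X_train[:, 0] > 0).astype(int)
model = LinearSVC().fit(X_train, y_train)     # SVM, as in the paper's tests

chunks = [rng.normal(size=(1000, 8)) for _ in range(4)]  # four input splits

def map_task(chunk):
    return model.predict(chunk)               # classify one split

if __name__ == "__main__":
    with Pool(4) as pool:
        parts = pool.map(map_task, chunks)    # the "map" phase
    labels = np.concatenate(parts)            # the "reduce" phase
    print(labels.shape)
```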

  16. The New Solar Telescope in Big Bear: Polarimetry II

    NASA Astrophysics Data System (ADS)

    Cao, W.; Ahn, K.; Goode, P. R.; Shumko, S.; Gorceix, N.; Coulter, R.

    2011-04-01

    IRIM (Infrared Imaging Magnetograph) is one of the first imaging solar spectro-polarimeters working in the near infrared (NIR). IRIM is being installed and commissioned in the Coudé Lab of the 1.6-meter New Solar Telescope (NST) at Big Bear Solar Observatory (BBSO). This innovative system, which includes a 2.5 nm interference filter, a unique 0.25 nm birefringent Lyot filter, and a Fabry-Pérot etalon, is capable of providing a bandpass as low as 0.01 nm over a field-of-view of 50" in a telecentric configuration. An NIR waveplate rotating ahead of M3 in the NST serves as the polarimeter modulator, and ahead of it sits a calibration unit that reduces the polarization cross-talk induced by subsequent oblique mirrors. Dual-beam differential polarimetry is employed to minimize seeing-induced spurious polarization. Exploiting the unique advantages of the IR window and the capabilities of the NST with adaptive optics, IRIM will provide unprecedented solar spectro-polarimetry with high Zeeman sensitivity (10⁻³ I_c), high spatial resolution (0.2"), and high cadence (15 s). In this paper, we discuss the design, fabrication, and calibration of IRIM, as well as the results of the first light observations.

  17. Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography

    DOE PAGES

    Jesse, S.; Chi, M.; Belianinov, A.; Beekman, C.; Kalinin, S. V.; Borisevich, A. Y.; Lupini, A. R.

    2016-05-23

    Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. In this paper, we discuss the application of so-called “big-data” methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. Finally, however, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy.

  18. Data management by using R: big data clinical research series

    PubMed Central

    2015-01-01

    Electronic medical record (EMR) systems have been widely used in clinical practice. Unlike traditional hand-written record keeping, the EMR makes big data clinical research feasible. The most important feature of big data research is its real-world setting. Furthermore, big data research can provide all aspects of information related to healthcare. However, big data research requires skills in data management that are often lacking in the curriculum of medical education. This greatly hinders doctors from testing their clinical hypotheses using the EMR. To bridge this gap, a series of articles introducing data management techniques is put forward to guide clinicians toward big data clinical research. The present educational article first introduces some basic knowledge of the R language, followed by data management skills for creating new variables, recoding variables and renaming variables. These are very basic skills that may be used in every big data research project. PMID:26697463
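    The three skills named above are language-agnostic. As a minimal illustration (shown in Python/pandas rather than the article's R, to keep a single language across the examples added in this section), the equivalent operations on an invented EMR-style table might look like:

```python
# The article teaches these steps in R; the pandas analogue is shown here.
# Column names and values are invented for illustration.
import pandas as pd

emr = pd.DataFrame({"id": [1, 2, 3, 4],
                    "sbp": [118, 135, 152, 124]})    # systolic BP, mmHg

emr["hypertensive"] = emr["sbp"] >= 140              # create a new variable
emr["bp_grade"] = pd.cut(emr["sbp"],                 # recode a variable
                         bins=[0, 120, 140, float("inf")],
                         labels=["normal", "elevated", "high"])
emr = emr.rename(columns={"sbp": "systolic_bp"})     # rename a variable
print(emr)
```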

  19. Ten aspects of the Big Five in the Personality Inventory for DSM-5.

    PubMed

    DeYoung, Colin G; Carey, Bridget E; Krueger, Robert F; Ross, Scott R

    2016-04-01

    Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) includes a dimensional model of personality pathology, operationalized in the Personality Inventory for DSM-5 (PID-5), with 25 facets grouped into 5 higher order factors resembling the Big Five personality dimensions. The present study tested how well these 25 facets could be integrated with the 10-factor structure of traits within the Big Five that is operationalized by the Big Five Aspect Scales (BFAS). In 2 healthy adult samples, 10-factor solutions largely confirmed our hypothesis that each of the 10 BFAS would be the highest loading BFAS on 1 and only 1 factor. Varying numbers of PID-5 scales were additional markers of each factor, and the overall factor structure in the first sample was well replicated in the second. Our results allow Cybernetic Big Five Theory (CB5T) to be brought to bear on manifestations of personality disorder, because CB5T offers mechanistic explanations of the 10 factors measured by the BFAS. Future research, therefore, may begin to test hypotheses derived from CB5T regarding the mechanisms that are dysfunctional in specific personality disorders. PMID:27032017

  20. 10 Aspects of the Big Five in the Personality Inventory for DSM-5

    PubMed Central

    DeYoung, Colin. G.; Carey, Bridget E.; Krueger, Robert F.; Ross, Scott R.

    2015-01-01

    DSM-5 includes a dimensional model of personality pathology, operationalized in the Personality Inventory for DSM-5 (PID-5), with 25 facets grouped into five higher-order factors resembling the Big Five personality dimensions. The present study tested how well these 25 facets could be integrated with the 10-factor structure of traits within the Big Five that is operationalized by the Big Five Aspect Scales (BFAS). In two healthy adult samples, 10-factor solutions largely confirmed our hypothesis that each of the 10 BFAS scales would be the highest loading BFAS scale on one and only one factor. Varying numbers of PID-5 scales were additional markers of each factor, and the overall factor structure in the first sample was well replicated in the second. Our results allow Cybernetic Big Five Theory (CB5T) to be brought to bear on manifestations of personality disorder, because CB5T offers mechanistic explanations of the 10 factors measured by the BFAS. Future research, therefore, may begin to test hypotheses derived from CB5T regarding the mechanisms that are dysfunctional in specific personality disorders. PMID:27032017

  1. The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics

    PubMed Central

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376

  2. A Demonstration of Big Data Technology for Data Intensive Earth Science (Invited)

    NASA Astrophysics Data System (ADS)

    Kuo, K.; Clune, T.; Ramachandran, R.; Rushing, J.; Fekete, G.; Lin, A.; Doan, K.; Oloso, A. O.; Duffy, D.

    2013-12-01

    Big Data technologies exhibit great potential to change the way we conduct scientific investigations, especially analysis of voluminous and diverse data sets. Obviously, not all Big Data technologies are applicable to all aspects of scientific data analysis. Our NASA Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST) project, Automated Event Service (AES), pioneers the exploration of Big Data technologies for data intensive Earth science. Since Earth science data are largely stored and manipulated in the form of multidimensional arrays, the project first evaluates array performance of several candidate Big Data technologies, including MapReduce (Hadoop), SciDB, and a custom-built Polaris system, which have one important feature in common: shared nothing architecture. The evaluation finds SciDB to be the most promising. In this presentation, we demonstrate SciDB using a couple of use cases, each operating on a distinct data set in the regular latitude-longitude grid. The first use case is the discovery and identification of blizzards using NASA's Modern Era Retrospective-analysis for Research and Application (MERRA) data sets. The other finds diurnal signals in the same 8-year period using SSMI data from three different instruments with different equator crossing times by correlating their retrieved parameters. In addition, the AES project is also developing a collaborative component to enable the sharing of event queries and results. Preliminary capabilities will be presented as well.
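    For context, the shared-nothing array model these systems have in common partitions a multidimensional grid into chunks that are processed independently and combined afterwards. A toy sketch of that pattern in plain NumPy (no SciDB, Hadoop, or Polaris API is shown, and the grid is synthetic):

```python
# A global lat-lon field is split into chunks; each chunk is aggregated
# locally (as separate nodes would do in a shared-nothing system), and the
# partial results are combined into the global answer.
import numpy as np

grid = np.random.rand(180, 360)                # one 1-degree global field
chunks = [grid[i:i + 45, j:j + 90]             # 4 x 4 = 16 chunks
          for i in range(0, 180, 45)
          for j in range(0, 360, 90)]

partial = [(c.sum(), c.size) for c in chunks]  # per-chunk local aggregation
total, count = map(sum, zip(*partial))
print(total / count, grid.mean())              # assembled vs. direct global mean
```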

  3. New Digital Magnetograph At Big Bear Solar Observatory

    NASA Astrophysics Data System (ADS)

    Wang, H.; Denker, C.; Spirock, T.; Goode, P. R.; Yang, S.; Marquette, W.; Varsik, J.; Fear, R. J.; Nenow, J.; Dingley, D. D.

    1998-11-01

    A new digital magnetograph system has been installed and tested at Big Bear Solar Observatory. The system uses part of BBSO's existing videomagnetograph (VMG) system: a quarter wave plate, a ferro-electric liquid crystal to switch polarizations, and a 0.25 Å bandpass Zeiss filter tuned to Ca I 6103 Å. A new 256×256 pixel, 12-bit Dalsa camera is used as the detector and as the driver to switch the liquid crystal. The data rate of the camera is 90 frames s⁻¹. The camera is interfaced to a Pentium-166 PC with a muTech imaging board for data acquisition and analysis. The computer has 128 MByte of RAM, and up to 700 live images can be stored in memory for quick post-exposure image processing (image selection and alignment). We have significantly improved the sensitivity and spatial resolution over the old BBSO VMG system. In particular: (1) The new digital image data are 12 bit, while the video signal is digitized as 8 bits. Polarizations weaker than 1% cannot be detected by a single pair subtraction in the video system; the digital system can detect a polarization signal of about 0.3% by a single pair subtraction. (2) The data rate of the digital system is 90 frames s⁻¹, while that of the video system is 30 frames s⁻¹, so the time difference between the two polarizations is reduced in the new system. Under good seeing conditions, the data rate of 90 frames s⁻¹ ensures that most of the wavefront distortions are 'frozen' and fairly closely the same for the left and right circular polarized image pairs. (3) Magnetograms are constructed after image selection and alignment. We discuss the characteristics of this new system. We present the results of our first tests to reconstruct magnetograms with speckle interferometric techniques. We also present some preliminary results on the comparison of facular/micropore contrasts and magnetic field structure. The experiment with this small detector lays the groundwork for a larger format digital magnetograph system at BBSO, as well as a future Fabry
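    A sketch of the pair-subtraction measurement the abstract describes: the degree of circular polarization at each pixel is (L - R)/(L + R) for a left/right polarized image pair, and averaging many selected, aligned pairs pushes the noise floor below the single-pair limit. All frames and numbers below are synthetic and purely illustrative:

```python
# Recover a weak (~0.3%) circular polarization signal by averaging many
# left/right image pairs, mimicking the pair-subtraction magnetogram scheme.
import numpy as np

rng = np.random.default_rng(3)
true_pol = 0.003 * np.ones((256, 256))        # a 0.3% polarization signal
intensity = 1000.0

pairs = []
for _ in range(90):                           # ~1 s of data at 90 frames/s
    left = intensity * (1 + true_pol) + rng.normal(0, 2, (256, 256))
    right = intensity * (1 - true_pol) + rng.normal(0, 2, (256, 256))
    pairs.append((left - right) / (left + right))

magnetogram = np.mean(pairs, axis=0)          # average of aligned pairs
print(magnetogram.mean())                     # recovers ~0.003
```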

  4. String Theoretic Toy Models of the Big Bang

    NASA Astrophysics Data System (ADS)

    Michelson, Jeremy

    2006-03-01

    Recently, examples of toy cosmologies have been found that are exact solutions of String Theory. These solutions have the feature that the theoretical framework permits reliable calculation arbitrarily close to the big bang singularity. Thus one can understand both the big bang, and late time physics. I will describe these toy cosmologies, and how they fit into String Theory's chains of equivalences between gravitational and nongravitational theories. These equivalences are the means by which one theoretically probes the big bang.

  5. Big Impacts and Transient Oceans on Titan

    NASA Technical Reports Server (NTRS)

    Zahnle, K. J.; Korycansky, D. G.; Nixon, C. A.

    2014-01-01

    We have studied the thermal consequences of very big impacts on Titan [1]. Titan's thick atmosphere and volatile-rich surface cause it to respond to big impacts in a somewhat Earth-like manner. Here we construct a simple globally-averaged model that tracks the flow of energy through the environment in the weeks, years, and millennia after a big comet strikes Titan. The model Titan is endowed with 1.4 bars of N2 and 0.07 bars of CH4, methane lakes, a water ice crust, and enough methane underground to saturate the regolith to the surface. We assume that half of the impact energy is immediately available to the atmosphere and surface, while the other half is buried at the site of the crater and is unavailable on time scales of interest. The atmosphere and surface are treated as isothermal. We make the simplifying assumptions that the crust is everywhere as methane-saturated as it was at the Huygens landing site, that the concentration of methane in the regolith is the same as it is at the surface, and that the crust is made of water ice. Heat flow into and out of the crust is approximated by step functions. If the impact is great enough, ice melts. The meltwater oceans cool to the atmosphere conductively through an ice lid while melting their way into the interior at their base, driven down in part through Rayleigh-Taylor instabilities between the dense water and the warm ice. Topography, CO2, and hydrocarbons other than methane are ignored. Methane and ethane clathrate hydrates are discussed quantitatively but not fully incorporated into the model.
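    To make the globally-averaged energy accounting concrete, a back-of-envelope sketch is given below. Every number in it (impactor size, speed, density, and Titan's bulk atmospheric properties) is an illustrative assumption, not a value taken from the paper; only the half-energy rule follows the model described above:

```python
# Rough global temperature rise if half the kinetic energy of a large comet
# goes into an isothermal N2 atmosphere. Illustrative numbers only.
import math

r, rho, v = 25e3, 500.0, 10e3               # impactor radius (m), density, speed
energy = 0.5 * rho * (4 / 3) * math.pi * r**3 * v**2
available = 0.5 * energy                    # half buried at the crater site

g, P, R_titan = 1.35, 1.5e5, 2575e3         # gravity, surface pressure, radius
m_atm = (P / g) * 4 * math.pi * R_titan**2  # column mass times surface area
cp_n2 = 1040.0                              # J kg^-1 K^-1

dT = available / (m_atm * cp_n2)
print(f"{energy:.2e} J impact -> ~{dT:.0f} K global atmospheric warming")
```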

  6. Big Data Analytics for Prostate Radiotherapy

    PubMed Central

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose–volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the “RadoncSpace”) in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches onto a cohort of hypofractionated prostate cancer patients taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches. PMID:27379211
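    As a minimal sketch of the outcome-modeling and cross-validation workflow the review describes, consider a classifier over synthetic dose-volume-like features; all variables are invented, and a real model would draw on the clinical, dosimetric, and biological data types listed above:

```python
# Cross-validated outcome model: a classifier over mixed dosimetric and
# clinical variables, scored with k-fold cross-validation to check how well
# it generalizes. Features and labels are synthetic, not patient data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n = 200
X = np.column_stack([
    rng.normal(70, 5, n),     # e.g., mean dose to an organ at risk (Gy)
    rng.normal(0.3, 0.1, n),  # e.g., volume fraction above a dose threshold
    rng.integers(50, 85, n),  # e.g., patient age
])
y = (X[:, 0] + 30 * X[:, 1] + rng.normal(0, 5, n) > 80).astype(int)

model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(scores.mean())          # cross-validated AUC, as used for model checking
```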

  7. Big Data Analytics for Prostate Radiotherapy.

    PubMed

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose-volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the "RadoncSpace") in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches onto a cohort of hypofractionated prostate cancer patients taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches. PMID:27379211

  8. Nuclear Receptors, RXR, and the Big Bang.

    PubMed

    Evans, Ronald M; Mangelsdorf, David J

    2014-03-27

    Isolation of genes encoding the receptors for steroids, retinoids, vitamin D, and thyroid hormone and their structural and functional analysis revealed an evolutionarily conserved template for nuclear hormone receptors. This discovery sparked identification of numerous genes encoding related proteins, termed orphan receptors. Characterization of these orphan receptors and, in particular, of the retinoid X receptor (RXR) positioned nuclear receptors at the epicenter of the "Big Bang" of molecular endocrinology. This Review provides a personal perspective on nuclear receptors and explores their integrated and coordinated signaling networks that are essential for multicellular life, highlighting the RXR heterodimer and its associated ligands and transcriptional mechanism. PMID:24679540

  9. Probing the Big Bang with LEP

    NASA Technical Reports Server (NTRS)

    Schramm, David N.

    1990-01-01

    It is shown that LEP probes the Big Bang in two significant ways: (1) nucleosynthesis, and (2) dark matter constraints. In the first case, LEP verifies the cosmological standard model prediction on the number of neutrino types, thus strengthening the conclusion that the cosmological baryon density is approximately 6 percent of the critical value. In the second case, LEP shows that the remaining non-baryonic cosmological matter must be somewhat more massive and/or more weakly interacting than the favorite non-baryonic dark matter candidates of a few years ago.

  10. np → dγ for big bang nucleosynthesis

    SciTech Connect

    Jiunn-Wei Chen; Martin J. Savage

    1999-12-01

    The cross section for np → dγ is calculated at energies relevant to big-bang nucleosynthesis using the recently developed effective field theory that describes the two-nucleon sector. The E1 amplitude is computed up to N³LO and depends only upon nucleon-nucleon phase shift data. In contrast, the M1 contribution is determined by the cross section for cold neutron capture. The uncertainty in the calculation for nucleon energies up to E ≈ 1 MeV is estimated to be ≤ 4%.

  11. The Next Big Thing - Eric Haseltine

    ScienceCinema

    Eric Haseltine

    2010-01-08

    Eric Haseltine, Haseltine Partners president and former chief of Walt Disney Imagineering, presented "The Next Big Thing," on Sept. 11, at the ORNL. He described the four "early warning signs" that a scientific breakthrough is imminent, and then suggested practical ways to turn these insights into breakthrough innovations. Haseltine is former director of research at the National Security Agency and associate director for science and technology for the director of National Intelligence, former executive vice president of Walt Disney Imagineering and director of engineering for Hughes Aircraft. He has 15 patents in optics, special effects and electronic media, and more than 100 publications in science and technical journals, the web and Discover Magazine.

  12. Pre - big bang inflation requires fine tuning

    SciTech Connect

    Turner, Michael S.; Weinberg, Erick J.

    1997-10-01

    The pre-big-bang cosmology inspired by superstring theories has been suggested as an alternative to slow-roll inflation. We analyze, in both the Jordan and Einstein frames, the effect of spatial curvature on this scenario and show that too much curvature --- of either sign --- reduces the duration of the inflationary era to such an extent that the flatness and horizon problems are not solved. Hence, a fine-tuning of initial conditions is required to obtain enough inflation to solve the cosmological problems.

  13. The Next Big Thing - Eric Haseltine

    SciTech Connect

    Eric Haseltine

    2009-09-16

    Eric Haseltine, Haseltine Partners president and former chief of Walt Disney Imagineering, presented "The Next Big Thing," on Sept. 11, at the ORNL. He described the four "early warning signs" that a scientific breakthrough is imminent, and then suggested practical ways to turn these insights into breakthrough innovations. Haseltine is former director of research at the National Security Agency and associate director for science and technology for the director of National Intelligence, former executive vice president of Walt Disney Imagineering and director of engineering for Hughes Aircraft. He has 15 patents in optics, special effects and electronic media, and more than 100 publications in science and technical journals, the web and Discover Magazine.

  14. Probing the Big Bang with LEP

    SciTech Connect

    Schramm, D.N. (Fermi National Accelerator Lab., Batavia, IL)

    1990-06-01

    It is shown that LEP probes the Big Bang in two significant ways: (1) nucleosynthesis and (2) dark matter constraints. In the first case, LEP verifies the cosmological standard model prediction on the number of neutrino types, thus strengthening the conclusion that the cosmological baryon density is approximately 6% of the critical value. In the second case, LEP shows that the remaining non-baryonic cosmological matter must be somewhat more massive and/or more weakly interacting than the favorite non-baryonic dark matter candidates of a few years ago. 59 refs., 4 figs., 2 tabs.

  15. Nuclear Receptors, RXR & the Big Bang

    PubMed Central

    Evans, Ronald M.; Mangelsdorf, David J.

    2014-01-01

    Isolation of genes encoding the receptors for steroids, retinoids, vitamin D and thyroid hormone, and their structural and functional analysis revealed an evolutionarily conserved template for nuclear hormone receptors. This discovery sparked identification of numerous genes encoding related proteins, termed orphan receptors. Characterization of these orphan receptors, and in particular of the retinoid X receptor (RXR), positioned nuclear receptors at the epicenter of the “Big Bang” of molecular endocrinology. This review provides a personal perspective on nuclear receptors and explores their integrated and coordinated signaling networks that are essential for multi-cellular life, highlighting the RXR heterodimer and its associated ligands and transcriptional mechanism. PMID:24679540

  16. Ocean Networks Canada's "Big Data" Initiative

    NASA Astrophysics Data System (ADS)

    Dewey, R. K.; Hoeberechts, M.; Moran, K.; Pirenne, B.; Owens, D.

    2013-12-01

    Ocean Networks Canada operates two large undersea observatories that collect, archive, and deliver data in real time over the Internet. These data contribute to our understanding of the complex changes taking place on our ocean planet. Ocean Networks Canada's VENUS was the world's first cabled seafloor observatory to enable researchers anywhere to connect in real time to undersea experiments and observations. Its NEPTUNE observatory is the largest cabled ocean observatory, spanning a wide range of ocean environments. Most recently, we installed a new small observatory in the Arctic. Together, these observatories deliver "Big Data" across many disciplines in a cohesive manner using the Oceans 2.0 data management and archiving system that provides national and international users with open access to real-time and archived data while also supporting a collaborative work environment. Ocean Networks Canada operates these observatories to support science, innovation, and learning in four priority areas: study of the impact of climate change on the ocean; the exploration and understanding the unique life forms in the extreme environments of the deep ocean and below the seafloor; the exchange of heat, fluids, and gases that move throughout the ocean and atmosphere; and the dynamics of earthquakes, tsunamis, and undersea landslides. To date, the Ocean Networks Canada archive contains over 130 TB (collected over 7 years) and the current rate of data acquisition is ~50 TB per year. This data set is complex and diverse. Making these "Big Data" accessible and attractive to users is our priority. In this presentation, we share our experience as a "Big Data" institution where we deliver simple and multi-dimensional calibrated data cubes to a diverse pool of users. Ocean Networks Canada also conducts extensive user testing. Test results guide future tool design and development of "Big Data" products. We strive to bridge the gap between the raw, archived data and the needs and

  17. The New Big Science at the NSLS

    NASA Astrophysics Data System (ADS)

    Crease, Robert

    2016-03-01

    The term ``New Big Science'' refers to a phase shift in the kind of large-scale science that was carried out throughout the U.S. National Laboratory system, when large-scale materials science accelerators rather than high-energy physics accelerators became marquee projects at most major basic research laboratories in the post-Cold War era, accompanied by important changes in the character and culture of the research ecosystem at these laboratories. This talk explores some aspects of this phase shift at BNL's National Synchrotron Light Source.

  18. The Big Bang and Cosmic Inflation

    NASA Astrophysics Data System (ADS)

    Guth, Alan H.

    2014-03-01

    A summary is given of the key developments of cosmology in the 20th century, from the work of Albert Einstein to the emergence of the generally accepted hot big bang model. The successes of this model are reviewed, but emphasis is placed on the questions that the model leaves unanswered. The remainder of the paper describes the inflationary universe model, which provides plausible answers to a number of these questions. It also offers a possible explanation for the origin of essentially all the matter and energy in the observed universe.

  19. 'Big data' in pharmaceutical science: challenges and opportunities.

    PubMed

    Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John

    2014-05-01

    Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science. PMID:24962278

  20. Parallel and Scalable Big Data Analysis in the Earth Sciences with JuML

    NASA Astrophysics Data System (ADS)

    Goetz, M.

    2015-12-01

    The use of a rapidly increasing number of sensors with ever better resolution in a wide variety of earth observation projects continuously contributes to the availability of 'big data' in the earth sciences. Not only do the volume, velocity, and variety of these datasets pose increasing challenges for their analysis; the complexity of the datasets (e.g. the high number of dimensions in hyper-spectral images) also requires algorithms that are able to scale. This contribution will provide insights into the Juelich Machine Learning Library (JuML) and its contents, which have been actively used in several scientific use cases in the earth sciences. We discuss and categorize challenges related to 'big data' analysis and outline parallel algorithmic solutions driven by those use cases.