Sample records for big throughput camera

  1. Image Mosaicking Approach for a Double-Camera System in the GaoFen2 Optical Remote Sensing Satellite Based on the Big Virtual Camera.

    PubMed

    Cheng, Yufeng; Jin, Shuying; Wang, Mi; Zhu, Ying; Dong, Zhipeng

    2017-06-20

    The linear array push broom imaging mode is widely used for high resolution optical satellites (HROS). Using double-cameras attached by a high-rigidity support along with push broom imaging is one method to enlarge the field of view while ensuring high resolution. High accuracy image mosaicking is the key factor of the geometrical quality of complete stitched satellite imagery. This paper proposes a high accuracy image mosaicking approach based on the big virtual camera (BVC) in the double-camera system on the GaoFen2 optical remote sensing satellite (GF2). A big virtual camera can be built according to the rigorous imaging model of a single camera; then, each single image strip obtained by each TDI-CCD detector can be re-projected to the virtual detector of the big virtual camera coordinate system using forward-projection and backward-projection to obtain the corresponding single virtual image. After an on-orbit calibration and relative orientation, the complete final virtual image can be obtained by stitching the single virtual images together based on their coordinate information on the big virtual detector image plane. The paper subtly uses the concept of the big virtual camera to obtain a stitched image and the corresponding high accuracy rational function model (RFM) for concurrent post processing. Experiments verified that the proposed method can achieve seamless mosaicking while maintaining the geometric accuracy.
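
    To make the re-projection step concrete, here is a minimal sketch (not the GF2 production code): each pixel of the big virtual detector is backward-projected to the ground and then forward-projected into the real single-camera strip before resampling. The `virtual_to_ground` and `ground_to_real` mappings below are hypothetical affine stand-ins for the rigorous sensor models.

    ```python
    import numpy as np

    def reproject_to_virtual(strip, virtual_to_ground, ground_to_real, out_shape):
        """For every virtual-detector pixel: project to the ground (backward projection),
        then into the real CCD strip (forward projection), and resample."""
        out = np.zeros(out_shape, dtype=strip.dtype)
        for r in range(out_shape[0]):
            for c in range(out_shape[1]):
                x, y = virtual_to_ground(r, c)
                rr, cc = ground_to_real(x, y)
                rr, cc = int(round(rr)), int(round(cc))
                if 0 <= rr < strip.shape[0] and 0 <= cc < strip.shape[1]:
                    out[r, c] = strip[rr, cc]          # nearest-neighbour resampling for brevity
        return out

    # Toy usage with purely illustrative affine mappings (stand-ins for the rigorous models):
    strip = np.random.rand(100, 80)
    virtual_to_ground = lambda r, c: (10.0 + 0.5 * r, 20.0 + 0.5 * c)
    ground_to_real = lambda x, y: ((x - 10.0) / 0.5, (y - 20.0) / 0.5)
    mosaic_piece = reproject_to_virtual(strip, virtual_to_ground, ground_to_real, (100, 80))
    ```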

  2. Wide Field Imaging of the Hubble Deep Field-South Region III: Catalog

    NASA Technical Reports Server (NTRS)

    Palunas, Povilas; Collins, Nicholas R.; Gardner, Jonathan P.; Hill, Robert S.; Malumuth, Eliot M.; Rhodes, Jason; Teplitz, Harry I.; Woodgate, Bruce E.

    2002-01-01

    We present 1/2 square degree uBVRI imaging around the Hubble Deep Field - South. These data have been used in earlier papers to examine the QSO population and the evolution of the correlation function in the region around the HDF-S. The images were obtained with the Big Throughput Camera at CTIO in September 1998. The images reach 5 sigma limits of u approx. 24.4, B approx. 25.6, V approx. 25.3, R approx. 24.9 and I approx. 23.9. We present a catalog of approx. 22,000 galaxies. We also present number-magnitude counts and a comparison with other observations of the same field. The data presented here are available over the world wide web.

  3. MS-REDUCE: an ultrafast technique for reduction of big mass spectrometry data for high-throughput processing.

    PubMed

    Awan, Muaaz Gul; Saeed, Fahad

    2016-05-15

    Modern proteomics studies utilize high-throughput mass spectrometers which can produce data at an astonishing rate. These big mass spectrometry (MS) datasets can easily reach peta-scale level creating storage and analytic problems for large-scale systems biology studies. Each spectrum consists of thousands of peaks which have to be processed to deduce the peptide. However, only a small percentage of peaks in a spectrum are useful for peptide deduction as most of the peaks are either noise or not useful for a given spectrum. This redundant processing of non-useful peaks is a bottleneck for streaming high-throughput processing of big MS data. One way to reduce the amount of computation required in a high-throughput environment is to eliminate non-useful peaks. Existing noise removing algorithms are limited in their data-reduction capability and are compute intensive making them unsuitable for big data and high-throughput environments. In this paper we introduce a novel low-complexity technique based on classification, quantization and sampling of MS peaks. We present a novel data-reductive strategy for analysis of Big MS data. Our algorithm, called MS-REDUCE, is capable of eliminating noisy peaks as well as peaks that do not contribute to peptide deduction before any peptide deduction is attempted. Our experiments have shown up to 100× speedup over existing state-of-the-art noise elimination algorithms while maintaining comparably high quality matches. Using our approach we were able to process a million spectra in just under an hour on a moderate server. The developed tool and strategy have been made available to the wider proteomics and parallel computing community; the code can be found at https://github.com/pcdslab/MSREDUCE. Contact: fahad.saeed@wmich.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
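
    As a rough illustration of the classification/quantization/sampling idea (a generic sketch, not the published MS-REDUCE algorithm), one can bin peaks by intensity and keep only a sampled fraction from each bin:

    ```python
    import numpy as np

    def reduce_spectrum(mz, intensity, n_classes=4, keep_frac=0.2, rng=None):
        """Quantize peak intensities into classes and sample a fixed fraction from each class."""
        rng = np.random.default_rng(rng)
        edges = np.quantile(intensity, np.linspace(0, 1, n_classes + 1))
        classes = np.clip(np.searchsorted(edges, intensity, side="right") - 1, 0, n_classes - 1)
        keep = []
        for k in range(n_classes):
            idx = np.flatnonzero(classes == k)
            n_keep = max(1, int(keep_frac * idx.size)) if idx.size else 0
            if n_keep:
                keep.append(rng.choice(idx, size=n_keep, replace=False))
        keep = np.sort(np.concatenate(keep)) if keep else np.array([], dtype=int)
        return mz[keep], intensity[keep]

    # Example: keep roughly 20% of 5,000 synthetic peaks.
    mz = np.sort(np.random.uniform(100, 2000, 5000))
    inten = np.random.lognormal(mean=2.0, sigma=1.0, size=5000)
    mz_reduced, inten_reduced = reduce_spectrum(mz, inten)
    ```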

  4. Achievable Rate Estimation of IEEE 802.11ad Visual Big-Data Uplink Access in Cloud-Enabled Surveillance Applications.

    PubMed

    Kim, Joongheon; Kim, Jong-Kook

    2016-01-01

    This paper addresses the computation procedures for estimating the impact of interference in 60 GHz IEEE 802.11ad uplink access in order to construct a visual big-data database from randomly deployed surveillance camera sensing devices. The large-scale visual information acquired from surveillance camera devices will be used to organize the big-data database, i.e., this estimation is essential for constructing a centralized cloud-enabled surveillance database. This performance estimation study captures the interference impacts on the target cloud access points from multiple interference components generated by the 60 GHz wireless transmissions of nearby surveillance camera devices to their associated cloud access points. For this uplink interference scenario, the interference impacts on the main wireless transmission from a target surveillance camera device to its associated target cloud access point are measured and estimated for a number of settings, taking into account 60 GHz radiation characteristics and antenna radiation pattern models.
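
    For orientation only, a worked example of the kind of link-budget arithmetic such an estimation involves; the transmit powers, antenna gains, distances, and noise figure below are illustrative assumptions, not the paper's measurement settings.

    ```python
    import math

    def fspl_db(distance_m, freq_hz=60e9):
        """Free-space path loss in dB."""
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / 3e8)

    def rx_power_dbm(tx_dbm, antenna_gain_db, distance_m):
        return tx_dbm + antenna_gain_db - fspl_db(distance_m)

    noise_dbm = -174 + 10 * math.log10(2.16e9) + 10        # thermal noise over a 2.16 GHz channel + 10 dB NF
    signal_dbm = rx_power_dbm(10, 30, 20)                  # desired camera link: 20 m, high-gain beams
    interferers_dbm = [rx_power_dbm(10, 0, d) for d in (40, 60, 80)]   # off-boresight interferers
    i_plus_n_mw = sum(10 ** (p / 10) for p in interferers_dbm) + 10 ** (noise_dbm / 10)
    sinr_db = signal_dbm - 10 * math.log10(i_plus_n_mw)
    print(f"SINR ~ {sinr_db:.1f} dB")
    ```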

  5. A fast new catadioptric design for fiber-fed spectrographs

    NASA Astrophysics Data System (ADS)

    Saunders, Will

    2012-09-01

    The next generation of massively multiplexed multi-object spectrographs (DESpec, SUMIRE, BigBOSS, 4MOST, HECTOR) demand fast, efficient and affordable spectrographs, with higher resolutions (R = 3000-5000) than current designs. Beam-size is a (relatively) free parameter in the design, but the properties of VPH gratings are such that, for fixed resolution and wavelength coverage, the effect of beam-size on overall VPH efficiency is very small. For all-transmissive cameras, this suggests modest beam-sizes (say 80-150mm) to minimize costs; while for catadioptric (Schmidt-type) cameras, much larger beam-sizes (say 250mm+) are preferred to improve image quality and to minimize obstruction losses. Schmidt designs have benefits in terms of image quality, camera speed and scattered light performance, and recent advances such as MRF technology mean that the required aspherics are no longer a prohibitive cost or risk. The main objections to traditional Schmidt designs are the inaccessibility of the detector package, and the loss in throughput caused by it being in the beam. With expected count rates and current read-noise technology, the gain in camera speed allowed by Schmidt optics largely compensates for the additional obstruction losses. However, future advances in readout technology may erase most of this compensation. A new Schmidt/Maksutov-derived design is presented, which differs from previous designs in having the detector package outside the camera, and adjacent to the spectrograph pupil. The telescope pupil already contains a hole at its center, because of the obstruction from the telescope top-end. With a 250mm beam, it is possible to largely hide a 6cm × 6cm detector package and its dewar within this hole. This means that the design achieves a very high efficiency, competitive with transmissive designs. The optics are excellent, at least as good as classic Schmidt designs, allowing F/1.25 or even faster cameras. The principal hardware has been costed at $300K per arm, making the design affordable.
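
    A quick back-of-envelope check of the obstruction argument (illustrative arithmetic only): if the 6 cm × 6 cm detector package sat in an otherwise clear 250 mm beam, it would block roughly 7% of the beam area, which is why hiding it inside the existing central hole matters.

    ```python
    import math

    beam_diameter_mm = 250.0
    detector_side_mm = 60.0
    blocked = detector_side_mm ** 2 / (math.pi * (beam_diameter_mm / 2) ** 2)
    print(f"fractional obstruction if not hidden: {blocked:.1%}")   # about 7%
    ```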

  6. Cluster Lensing with the BTC

    NASA Astrophysics Data System (ADS)

    Fischer, P.

    1997-12-01

    Weak distortions of background galaxies are rapidly emerging as a powerful tool for the measurement of galaxy cluster mass distributions. Lensing-based studies have the advantage of being direct measurements of mass and are not model-dependent as are other techniques (X-ray, radial velocities). To date, studies have been limited by CCD field size, meaning that full coverage of the clusters out to the virial radii and beyond has not been possible. Probing this large-radius region is essential for testing models of large scale structure formation. New wide-field CCD mosaics, for the first time, allow mass measurements out to very large radius. We have obtained images for a sample of clusters with the "Big Throughput Camera" (BTC) on the CTIO 4m. This camera comprises four thinned SITe 2048 × 2048 CCDs, each 15 arcmin on a side, for a total area of one quarter of a square degree. We have developed an automated reduction pipeline which: 1) corrects for spatial distortions, 2) corrects for PSF anisotropy, 3) determines relative scaling and background levels, and 4) combines multiple exposures. In this poster we will present some preliminary results of our cluster lensing study. This will include radial mass and light profiles and 2-d mass and galaxy density maps.
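
    The four pipeline stages listed above could be organized as in the minimal sketch below; the stage bodies are placeholders (the real pipeline applies measured astrometric distortion and PSF models that are not reproduced here).

    ```python
    import numpy as np

    def correct_distortion(img):
        return img                                  # placeholder: resample onto a distortion-free grid

    def correct_psf_anisotropy(img):
        return img                                  # placeholder: circularize the PSF with a convolution kernel

    def scale_and_subtract_background(img, scale=1.0):
        return scale * (img - np.median(img))       # placeholder relative scaling + sky removal

    def combine(exposures):
        stack = [scale_and_subtract_background(correct_psf_anisotropy(correct_distortion(e)))
                 for e in exposures]
        return np.median(stack, axis=0)             # median combine rejects cosmic rays and outliers

    coadd = combine([np.random.normal(100.0, 5.0, (64, 64)) for _ in range(5)])
    ```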

  7. Big Data, Big Opportunities, and Big Challenges.

    PubMed

    Frelinger, Jeffrey A

    2015-11-01

    High-throughput assays have begun to revolutionize modern biology and medicine. The advent of cheap next-generation sequencing (NGS) has made it possible to interrogate cells and human populations as never before. Although this has allowed us to investigate the genetics, gene expression, and impacts of the microbiome, there remain both practical and conceptual challenges. These include data handling, storage, and statistical analysis, as well as an inherent problem of the analysis of heterogeneous cell populations.

  8. Low Cost Wireless Network Camera Sensors for Traffic Monitoring

    DOT National Transportation Integrated Search

    2012-07-01

    Many freeways and arterials in major cities in Texas are presently equipped with video detection cameras to collect data and help in traffic/incident management. In this study, carefully controlled experiments determined the throughput and output...

  9. Under surveillance: using cameras to improve care home practice.

    PubMed

    Pearce, Lynne

    2017-05-03

    When the owner of Bramley Court in Birmingham mooted the idea of installing surveillance cameras inside the care home, manager and nurse Ann Willey was unenthusiastic. 'It felt like Big Brother was watching you,' she says. But then she considered how cameras might reduce the risk of poor practice and the potential benefits for vulnerable residents.

  10. Making big sense from big data in toxicology by read-across.

    PubMed

    Hartung, Thomas

    2016-01-01

    Modern information technologies have made big data available in safety sciences, i.e., extremely large data sets that may be analyzed only computationally to reveal patterns, trends and associations. This happens by (1) compilation of large sets of existing data, e.g., as a result of the European REACH regulation, (2) the use of omics technologies and (3) systematic robotized testing in a high-throughput manner. All three approaches and some other high-content technologies leave us with big data--the challenge is now to make big sense of these data. Read-across, i.e., the local similarity-based intrapolation of properties, is gaining momentum with increasing data availability and consensus on how to process and report it. It is predominantly applied to in vivo test data as a gap-filling approach, but can similarly complement other incomplete datasets. Big data are first of all repositories for finding similar substances and ensure that the available data is fully exploited. High-content and high-throughput approaches similarly require focusing on clusters, in this case formed by underlying mechanisms such as pathways of toxicity. The closely connected properties, i.e., structural and biological similarity, create the confidence needed for predictions of toxic properties. Here, a new web-based tool under development called REACH-across, which aims to support and automate structure-based read-across, is presented among others.

  11. Use of a Fluorometric Imaging Plate Reader in high-throughput screening

    NASA Astrophysics Data System (ADS)

    Groebe, Duncan R.; Gopalakrishnan, Sujatha; Hahn, Holly; Warrior, Usha; Traphagen, Linda; Burns, David J.

    1999-04-01

    High-throughput screening (HTS) efforts at Abbott Laboratories have been greatly facilitated by the use of a Fluorometric Imaging Plate Reader (FLIPR). The FLIPR consists of an incubated cabinet with an integrated 96-channel pipettor and fluorometer. An argon laser is used to excite fluorophores in a 96-well microtiter plate, and the emitted fluorescence is imaged by a cooled CCD camera. The image data are downloaded from the camera and processed to average the signal from each well of the microtiter plate for each time point. The data are presented in real time on the computer screen, facilitating interpretation and trouble-shooting. In addition to fluorescence, the camera can also detect luminescence from firefly luciferase.
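
    The per-well averaging step is straightforward to sketch. The grid geometry below is an assumption (a regular 8 × 12 layout filling the frame), not the instrument's actual calibration.

    ```python
    import numpy as np

    def well_means(frames, n_rows=8, n_cols=12):
        """frames: array of shape (T, H, W). Returns (T, n_rows*n_cols) mean signal per well per time point."""
        T, H, W = frames.shape
        h, w = H // n_rows, W // n_cols
        out = np.empty((T, n_rows * n_cols))
        for i in range(n_rows):
            for j in range(n_cols):
                roi = frames[:, i * h:(i + 1) * h, j * w:(j + 1) * w]
                out[:, i * n_cols + j] = roi.mean(axis=(1, 2))
        return out

    traces = well_means(np.random.rand(50, 400, 600))   # 50 time points, toy image size
    ```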

  12. Privacy Challenges of Genomic Big Data.

    PubMed

    Shen, Hong; Ma, Jian

    2017-01-01

    With the rapid advancement of high-throughput DNA sequencing technologies, genomics has become a big data discipline where large-scale genetic information of human individuals can be obtained efficiently with low cost. However, such massive amount of personal genomic data creates tremendous challenge for privacy, especially given the emergence of direct-to-consumer (DTC) industry that provides genetic testing services. Here we review the recent development in genomic big data and its implications on privacy. We also discuss the current dilemmas and future challenges of genomic privacy.

  13. Deployment of Shaped Charges by a Semi-Autonomous Ground Vehicle

    DTIC Science & Technology

    2007-06-01

    lives on a daily basis. BigFoot seeks to replace the local human component by deploying and remotely detonating shaped charges to destroy IEDs...robotic arm to deploy and remotely detonate shaped charges. BigFoot incorporates improved communication range over previous Autonomous Ground Vehicles...and an updated user interface that includes controls for the arm and camera by interfacing multiple microprocessors. BigFoot is capable of avoiding

  14. From big data analysis to personalized medicine for all: challenges and opportunities.

    PubMed

    Alyass, Akram; Turcotte, Michelle; Meyre, David

    2015-06-27

    Recent advances in high-throughput technologies have led to the emergence of systems biology as a holistic science to achieve more precise modeling of complex diseases. Many predict the emergence of personalized medicine in the near future. We are, however, moving from two-tiered health systems to a two-tiered personalized medicine. Omics facilities are restricted to affluent regions, and personalized medicine is likely to widen the growing gap in health systems between high and low-income countries. This is mirrored by an increasing lag between our ability to generate and analyze big data. Several bottlenecks slow-down the transition from conventional to personalized medicine: generation of cost-effective high-throughput data; hybrid education and multidisciplinary teams; data storage and processing; data integration and interpretation; and individual and global economic relevance. This review provides an update of important developments in the analysis of big data and forward strategies to accelerate the global transition to personalized medicine.

  15. Acquisition of gamma camera and physiological data by computer.

    PubMed

    Hack, S N; Chang, M; Line, B R; Cooper, J A; Robeson, G H

    1986-11-01

    We have designed, implemented, and tested a new Research Data Acquisition System (RDAS) that permits a general purpose digital computer to acquire signals from both gamma camera sources and physiological signal sources concurrently. This system overcomes the limited multi-source, high speed data acquisition capabilities found in most clinically oriented nuclear medicine computers. The RDAS can simultaneously input signals from up to four gamma camera sources with a throughput of 200 kHz per source and from up to eight physiological signal sources with an aggregate throughput of 50 kHz. Rigorous testing has found the RDAS to exhibit acceptable linearity and timing characteristics. In addition, flood images obtained by this system were compared with flood images acquired by a commercial nuclear medicine computer system. National Electrical Manufacturers Association performance standards of the flood images were found to be comparable.

  16. [Medical big data and precision medicine: prospects of epidemiology].

    PubMed

    Song, J; Hu, Y H

    2016-08-10

    With the development of high-throughput technology, electronic medical record systems and big data technology, the value of medical data has attracted increasing attention. At the same time, the proposal of the Precision Medicine Initiative opens up prospects for medical big data. As a methods-oriented discipline, epidemiology focuses on exploiting the resources of existing big data and promoting the integration of translational research and knowledge to fully unlock the "black box" of the exposure-disease continuum, and thereby tries to accelerate realization of the ultimate goal of precision medicine. The overall purpose, however, is to translate the evidence from scientific research into improved health of the people.

  17. Machine learning for Big Data analytics in plants.

    PubMed

    Ma, Chuang; Zhang, Hao Helen; Wang, Xiangfeng

    2014-12-01

    Rapid advances in high-throughput genomic technology have enabled biology to enter the era of 'Big Data' (large datasets). The plant science community not only needs to build its own Big-Data-compatible parallel computing and data management infrastructures, but also to seek novel analytical paradigms to extract information from the overwhelming amounts of data. Machine learning offers promising computational and analytical solutions for the integrative analysis of large, heterogeneous and unstructured datasets on the Big-Data scale, and is gradually gaining popularity in biology. This review introduces the basic concepts and procedures of machine-learning applications and envisages how machine learning could interface with Big Data technology to facilitate basic research and biotechnology in the plant sciences. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. The Scope of Big Data in One Medicine: Unprecedented Opportunities and Challenges.

    PubMed

    McCue, Molly E; McCoy, Annette M

    2017-01-01

    Advances in high-throughput molecular biology and electronic health records (EHR), coupled with increasing computer capabilities have resulted in an increased interest in the use of big data in health care. Big data require collection and analysis of data at an unprecedented scale and represents a paradigm shift in health care, offering (1) the capacity to generate new knowledge more quickly than traditional scientific approaches; (2) unbiased collection and analysis of data; and (3) a holistic understanding of biology and pathophysiology. Big data promises more personalized and precision medicine for patients with improved accuracy and earlier diagnosis, and therapy tailored to an individual's unique combination of genes, environmental risk, and precise disease phenotype. This promise comes from data collected from numerous sources, ranging from molecules to cells, to tissues, to individuals and populations-and the integration of these data into networks that improve understanding of health and disease. Big data-driven science should play a role in propelling comparative medicine and "one medicine" (i.e., the shared physiology, pathophysiology, and disease risk factors across species) forward. Merging of data from EHR across institutions will give access to patient data on a scale previously unimaginable, allowing for precise phenotype definition and objective evaluation of risk factors and response to therapy. High-throughput molecular data will give insight into previously unexplored molecular pathophysiology and disease etiology. Investigation and integration of big data from a variety of sources will result in stronger parallels drawn at the molecular level between human and animal disease, allow for predictive modeling of infectious disease and identification of key areas of intervention, and facilitate step-changes in our understanding of disease that can make a substantial impact on animal and human health. However, the use of big data comes with significant challenges. Here we explore the scope of "big data," including its opportunities, its limitations, and what is needed to capitalize on big data in one medicine.

  19. The Scope of Big Data in One Medicine: Unprecedented Opportunities and Challenges

    PubMed Central

    McCue, Molly E.; McCoy, Annette M.

    2017-01-01

    Advances in high-throughput molecular biology and electronic health records (EHR), coupled with increasing computer capabilities have resulted in an increased interest in the use of big data in health care. Big data require collection and analysis of data at an unprecedented scale and represents a paradigm shift in health care, offering (1) the capacity to generate new knowledge more quickly than traditional scientific approaches; (2) unbiased collection and analysis of data; and (3) a holistic understanding of biology and pathophysiology. Big data promises more personalized and precision medicine for patients with improved accuracy and earlier diagnosis, and therapy tailored to an individual’s unique combination of genes, environmental risk, and precise disease phenotype. This promise comes from data collected from numerous sources, ranging from molecules to cells, to tissues, to individuals and populations—and the integration of these data into networks that improve understanding of health and disease. Big data-driven science should play a role in propelling comparative medicine and “one medicine” (i.e., the shared physiology, pathophysiology, and disease risk factors across species) forward. Merging of data from EHR across institutions will give access to patient data on a scale previously unimaginable, allowing for precise phenotype definition and objective evaluation of risk factors and response to therapy. High-throughput molecular data will give insight into previously unexplored molecular pathophysiology and disease etiology. Investigation and integration of big data from a variety of sources will result in stronger parallels drawn at the molecular level between human and animal disease, allow for predictive modeling of infectious disease and identification of key areas of intervention, and facilitate step-changes in our understanding of disease that can make a substantial impact on animal and human health. However, the use of big data comes with significant challenges. Here we explore the scope of “big data,” including its opportunities, its limitations, and what is needed to capitalize on big data in one medicine. PMID:29201868

  20. BIG: a large-scale data integration tool for renal physiology.

    PubMed

    Zhao, Yue; Yang, Chin-Rang; Raghuram, Viswanathan; Parulekar, Jaya; Knepper, Mark A

    2016-10-01

    Due to recent advances in high-throughput techniques, we and others have generated multiple proteomic and transcriptomic databases to describe and quantify gene expression, protein abundance, or cellular signaling on the scale of the whole genome/proteome in kidney cells. The existence of so much data from diverse sources raises the following question: "How can researchers find information efficiently for a given gene product over all of these data sets without searching each data set individually?" This is the type of problem that has motivated the "Big-Data" revolution in Data Science, which has driven progress in fields such as marketing. Here we present an online Big-Data tool called BIG (Biological Information Gatherer) that allows users to submit a single online query to obtain all relevant information from all indexed databases. BIG is accessible at http://big.nhlbi.nih.gov/.

  1. [Big Data: the great opportunities and challenges to microbiome and other biomedical research].

    PubMed

    Xu, Zhenjiang

    2015-02-01

    With the development of high-throughput technologies, biomedical data have been increasing exponentially. This brings enormous opportunities and challenges to biomedical researchers in how to effectively utilize big data. Big data differ from traditional data in many ways, often described by the 3Vs - volume, variety and velocity. From the perspective of biomedical research, the characteristics of big data, such as messiness, re-use and openness, are introduced here. Focusing on meta-analysis in microbiome research, the author discusses prospective principles in data collection, challenges of privacy protection in data management, and scalable tools for data analysis, with examples from real life.

  2. Big Data Bioinformatics

    PubMed Central

    GREENE, CASEY S.; TAN, JIE; UNG, MATTHEW; MOORE, JASON H.; CHENG, CHAO

    2017-01-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the “big data” era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both “machine learning” algorithms as well as “unsupervised” and “supervised” examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. PMID:27908398

  3. Big Data Bioinformatics

    PubMed Central

    GREENE, CASEY S.; TAN, JIE; UNG, MATTHEW; MOORE, JASON H.; CHENG, CHAO

    2017-01-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the “big data” era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both “machine learning” algorithms as well as “unsupervised” and “supervised” examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. PMID:24799088

  4. Big data bioinformatics.

    PubMed

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.

  5. BIG: a large-scale data integration tool for renal physiology

    PubMed Central

    Zhao, Yue; Yang, Chin-Rang; Raghuram, Viswanathan; Parulekar, Jaya

    2016-01-01

    Due to recent advances in high-throughput techniques, we and others have generated multiple proteomic and transcriptomic databases to describe and quantify gene expression, protein abundance, or cellular signaling on the scale of the whole genome/proteome in kidney cells. The existence of so much data from diverse sources raises the following question: “How can researchers find information efficiently for a given gene product over all of these data sets without searching each data set individually?” This is the type of problem that has motivated the “Big-Data” revolution in Data Science, which has driven progress in fields such as marketing. Here we present an online Big-Data tool called BIG (Biological Information Gatherer) that allows users to submit a single online query to obtain all relevant information from all indexed databases. BIG is accessible at http://big.nhlbi.nih.gov/. PMID:27279488

  6. The High-Throughput Analyses Era: Are We Ready for the Data Struggle?

    PubMed

    D'Argenio, Valeria

    2018-03-02

    Recent and rapid technological advances in molecular sciences have dramatically increased the ability to carry out high-throughput studies characterized by big data production. This, in turn, led to the consequent negative effect of highlighting the presence of a gap between data yield and their analysis. Indeed, big data management is becoming an increasingly important aspect of many fields of molecular research including the study of human diseases. Now, the challenge is to identify, within the huge amount of data obtained, that which is of clinical relevance. In this context, issues related to data interpretation, sharing and storage need to be assessed and standardized. Once this is achieved, the integration of data from different -omic approaches will improve the diagnosis, monitoring and therapy of diseases by allowing the identification of novel, potentially actionable biomarkers in view of personalized medicine.

  7. Big Brother Is Watching: Video Surveillance on Buses

    ERIC Educational Resources Information Center

    Sloggett, Joel

    2009-01-01

    Many school districts in North America have adopted policies to permit cameras on their properties and, when needed, on buses used to transport students. With regard to school buses, the camera is typically a tool for gathering information to monitor behavior or to help investigate a complaint about behavior. If a picture is worth a thousand…

  8. Quantifying the Onset and Progression of Plant Senescence by Color Image Analysis for High Throughput Applications

    PubMed Central

    Cai, Jinhai; Okamoto, Mamoru; Atieno, Judith; Sutton, Tim; Li, Yongle; Miklavcic, Stanley J.

    2016-01-01

    Leaf senescence, an indicator of plant age and ill health, is an important phenotypic trait for the assessment of a plant’s response to stress. Manual inspection of senescence, however, is time consuming, inaccurate and subjective. In this paper we propose an objective evaluation of plant senescence by color image analysis for use in a high throughput plant phenotyping pipeline. As high throughput phenotyping platforms are designed to capture whole-of-plant features, camera lenses and camera settings are inappropriate for the capture of fine detail. Specifically, plant colors in images may not represent true plant colors, leading to errors in senescence estimation. Our algorithm features a color distortion correction and image restoration step prior to a senescence analysis. We apply our algorithm to two time series of images of wheat and chickpea plants to quantify the onset and progression of senescence. We compare our results with senescence scores resulting from manual inspection. We demonstrate that our procedure is able to process images in an automated way for an accurate estimation of plant senescence even from color distorted and blurred images obtained under high throughput conditions. PMID:27348807
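
    A minimal sketch of a color-based senescence measure (illustrative only; the authors' method also corrects color distortion and restores blurred images before analysis): classify plant pixels by hue and report the non-green fraction. The hue thresholds are assumptions.

    ```python
    import numpy as np
    import colorsys

    def senescence_fraction(rgb, plant_mask, green_hue=(0.18, 0.45)):
        """rgb: HxWx3 floats in [0, 1]; plant_mask: boolean HxW selecting plant pixels."""
        pixels = rgb[plant_mask]
        if pixels.size == 0:
            return 0.0
        hues = np.array([colorsys.rgb_to_hsv(*p)[0] for p in pixels])
        green = (hues >= green_hue[0]) & (hues <= green_hue[1])
        return float(1.0 - green.mean())             # fraction of plant pixels no longer green

    # Toy call on a random image with every pixel treated as "plant".
    img = np.random.rand(32, 32, 3)
    print(senescence_fraction(img, np.ones((32, 32), dtype=bool)))
    ```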

  9. High throughput integrated thermal characterization with non-contact optical calorimetry

    NASA Astrophysics Data System (ADS)

    Hou, Sichao; Huo, Ruiqing; Su, Ming

    2017-10-01

    Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments limited by low throughput, where only one sample is examined at a time. This work reports an infrared-based optical calorimetry, with its theoretical foundation, that provides an integrated solution to characterize the thermal properties of materials with high throughput. By taking time-domain temperature information of spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). This method further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In this proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, have been determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
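
    A sketch under simple assumptions (constant heating power and a slope threshold chosen arbitrarily; not the paper's calibration): with steady heating, the melting temperature appears as a plateau in each sample's temperature-time trace, and the plateau duration times the heating power gives a rough latent-heat estimate.

    ```python
    import numpy as np

    def melting_point_and_latent_heat(t, T, power_w, mass_g, slope_tol=0.02):
        """t, T: time (s) and temperature (°C) arrays for one sample extracted from the IR image series."""
        dTdt = np.gradient(T, t)
        plateau = np.abs(dTdt) < slope_tol          # samples where the temperature rise stalls
        if not plateau.any():
            return None, None
        T_melt = float(np.median(T[plateau]))
        duration = float(t[plateau][-1] - t[plateau][0])
        latent_heat_j_per_g = power_w * duration / mass_g
        return T_melt, latent_heat_j_per_g
    ```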

  10. -Omic and Electronic Health Record Big Data Analytics for Precision Medicine.

    PubMed

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D; Venugopalan, Janani; Hoffman, Ryan; Wang, May D

    2017-02-01

    Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of healthcare. In this paper, we present -omic and EHR data characteristics, associated challenges, and data analytics including data preprocessing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Big data analytics is able to address -omic and EHR data challenges for paradigm shift toward precision medicine. Big data analytics makes sense of -omic and EHR data to improve healthcare outcome. It has long lasting societal impact.

  11. Modification of the Miyake-Apple technique for simultaneous anterior and posterior video imaging of wet laboratory-based corneal surgery.

    PubMed

    Tan, Johnson C H; Meadows, Howard; Gupta, Aanchal; Yeung, Sonia N; Moloney, Gregory

    2014-03-01

    The aim of this study was to describe a modification of the Miyake-Apple posterior video analysis for the simultaneous visualization of the anterior and posterior corneal surfaces during wet laboratory-based deep anterior lamellar keratoplasty (DALK). A human donor corneoscleral button was affixed to a microscope slide and placed onto a custom-made mounting box. A big bubble DALK was performed on the cornea in the wet laboratory. An 11-diopter intraocular lens was positioned over the aperture of the back camera of an iPhone. This served to video record the posterior view of the corneoscleral button during the big bubble formation. An overhead operating microscope with an attached video camcorder recorded the anterior view during the surgery. The anterior and posterior views of the wet laboratory-based DALK surgery were simultaneously captured and edited using video editing software. The formation of the big bubble can be studied. This video recording camera system has the potential to act as a valuable research and teaching tool in corneal lamellar surgery, especially in the behavior of the big bubble formation in DALK.

  12. Falling-incident detection and throughput enhancement in a multi-camera video-surveillance system.

    PubMed

    Shieh, Wann-Yun; Huang, Ju-Chin

    2012-09-01

    For many elderly people, unpredictable falling incidents may occur at the corner of stairs or along a long corridor due to body frailty. If the rescue of a fallen elder, who may have fainted, is delayed, more serious injury may follow. Traditional security or video surveillance systems require caregivers to monitor a centralized screen continuously, or require the elder to wear sensors to detect falling incidents, which wastes human resources or is inconvenient for elders. In this paper, we propose an automatic falling-detection algorithm and implement it in a multi-camera video surveillance system. The algorithm uses each camera to fetch images from the regions to be monitored and then applies falling-pattern recognition to determine whether a falling incident has occurred. If so, the system sends short messages to the people who need to be notified. The algorithm has been implemented on a DSP-based hardware acceleration board as a proof of functionality. Simulation results show that the accuracy of falling detection reaches at least 90% and that the throughput of a four-camera surveillance system can be improved by about 2.1 times. Copyright © 2011 IPEM. Published by Elsevier Ltd. All rights reserved.
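
    One common way to realize such a falling-pattern check is sketched below purely for illustration (the paper's recognition algorithm and DSP implementation are not reproduced): background-subtract each camera's frames and flag a fall when the segmented person's bounding box stays wide and low for several consecutive frames.

    ```python
    import numpy as np

    def bounding_box(mask):
        ys, xs = np.nonzero(mask)
        if len(ys) == 0:
            return None
        return ys.min(), ys.max(), xs.min(), xs.max()

    def detect_fall(frames, background, thresh=30, ratio=1.5, persist=10):
        """frames: iterable of HxW grayscale arrays from one camera; background: HxW reference frame."""
        low_posture = 0
        for frame in frames:
            mask = np.abs(frame.astype(int) - background.astype(int)) > thresh
            box = bounding_box(mask)
            if box is None:
                continue
            h = box[1] - box[0] + 1
            w = box[3] - box[2] + 1
            low_posture = low_posture + 1 if w > ratio * h else 0
            if low_posture >= persist:               # lying posture persisted long enough
                return True                          # e.g., trigger the short-message notification
        return False
    ```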

  13. -Omic and Electronic Health Records Big Data Analytics for Precision Medicine

    PubMed Central

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.

    2017-01-01

    Objective Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Conclusion Big data analytics is able to address –omic and EHR data challenges for paradigm shift towards precision medicine. Significance Big data analytics makes sense of –omic and EHR data to improve healthcare outcome. It has long lasting societal impact. PMID:27740470

  14. HARNESSING BIG DATA FOR PRECISION MEDICINE: INFRASTRUCTURES AND APPLICATIONS.

    PubMed

    Yu, Kun-Hsing; Hart, Steven N; Goldfeder, Rachel; Zhang, Qiangfeng Cliff; Parker, Stephen C J; Snyder, Michael

    2017-01-01

    Precision medicine is a health management approach that accounts for individual differences in genetic backgrounds and environmental exposures. With the recent advancements in high-throughput omics profiling technologies, collections of large study cohorts, and the developments of data mining algorithms, big data in biomedicine is expected to provide novel insights into health and disease states, which can be translated into personalized disease prevention and treatment plans. However, petabytes of biomedical data generated by multiple measurement modalities poses a significant challenge for data analysis, integration, storage, and result interpretation. In addition, patient privacy preservation, coordination between participating medical centers and data analysis working groups, as well as discrepancies in data sharing policies remain important topics of discussion. In this workshop, we invite experts in omics integration, biobank research, and data management to share their perspectives on leveraging big data to enable precision medicine.Workshop website: http://tinyurl.com/PSB17BigData; HashTag: #PSB17BigData.

  15. Crop 3D-a LiDAR based platform for 3D high-throughput crop phenotyping.

    PubMed

    Guo, Qinghua; Wu, Fangfang; Pang, Shuxin; Zhao, Xiaoqian; Chen, Linhai; Liu, Jin; Xue, Baolin; Xu, Guangcai; Li, Le; Jing, Haichun; Chu, Chengcai

    2018-03-01

    With a growing population and shrinking arable land, breeding has been considered an effective way to address the food crisis. As an important part of breeding, high-throughput phenotyping can accelerate the breeding process effectively. Light detection and ranging (LiDAR) is an active remote sensing technology that is capable of acquiring three-dimensional (3D) data accurately, and has great potential in crop phenotyping. Given that crop phenotyping based on LiDAR technology is not common in China, we developed a high-throughput crop phenotyping platform, named Crop 3D, which integrates a LiDAR sensor, a high-resolution camera, a thermal camera and a hyperspectral imager. Compared with traditional crop phenotyping techniques, Crop 3D can acquire multi-source phenotypic data over the whole crop growing period and extract plant height, plant width, leaf length, leaf width, leaf area, leaf inclination angle and other parameters for plant biology and genomics analysis. In this paper, we describe the designs, functions and testing results of the Crop 3D platform, and briefly discuss the potential applications and future development of the platform in phenotyping. We conclude that platforms integrating LiDAR and traditional remote sensing techniques might be the future trend of crop high-throughput phenotyping.

  16. Performance evaluation of throughput computing workloads using multi-core processors and graphics processors

    NASA Astrophysics Data System (ADS)

    Dave, Gaurav P.; Sureshkumar, N.; Blessy Trencia Lincy, S. S.

    2017-11-01

    The current trend in processor manufacturing focuses on multi-core architectures rather than increasing clock speed for performance improvement. Graphics processors have become commodity hardware for providing fast co-processing in computer systems. Developments in IoT, social networking web applications, and big data have created huge demand for data processing, and such throughput-intensive applications inherently contain data-level parallelism that is well suited to SIMD-architecture-based GPUs. This paper reviews the architectural aspects of multi/many-core processors and graphics processors. Different case studies are taken to compare the performance of throughput computing applications using shared-memory programming in OpenMP and CUDA API based programming.

  17. Mixel camera--a new push-broom camera concept for high spatial resolution keystone-free hyperspectral imaging.

    PubMed

    Høye, Gudrun; Fridman, Andrei

    2013-05-06

    Current high-resolution push-broom hyperspectral cameras introduce keystone errors to the captured data. Efforts to correct these errors in hardware severely limit the optical design, in particular with respect to light throughput and spatial resolution, while at the same time the residual keystone often remains large. The mixel camera solves this problem by combining a hardware component--an array of light mixing chambers--with a mathematical method that restores the hyperspectral data to its keystone-free form, based on the data that was recorded onto the sensor with large keystone. A Virtual Camera software, that was developed specifically for this purpose, was used to compare the performance of the mixel camera to traditional cameras that correct keystone in hardware. The mixel camera can collect at least four times more light than most current high-resolution hyperspectral cameras, and simulations have shown that the mixel camera will be photon-noise limited--even in bright light--with a significantly improved signal-to-noise ratio compared to traditional cameras. A prototype has been built and is being tested.

  18. FPGA Implementation of Stereo Disparity with High Throughput for Mobility Applications

    NASA Technical Reports Server (NTRS)

    Villalpando, Carlos Y.; Morfopolous, Arin; Matthies, Larry; Goldberg, Steven

    2011-01-01

    High speed stereo vision can allow unmanned robotic systems to navigate safely in unstructured terrain, but the computational cost can exceed the capacity of typical embedded CPUs. In this paper, we describe an end-to-end stereo computation co-processing system optimized for fast throughput that has been implemented on a single Virtex 4 LX160 FPGA. This system is capable of operating on images from a 1024 x 768 3CCD (true RGB) camera pair at 15 Hz. Data enters the FPGA directly from the cameras via Camera Link and is rectified, pre-filtered and converted into a disparity image all within the FPGA, incurring no CPU load. Once complete, a rectified image and the final disparity image are read out over the PCI bus, for a bandwidth cost of 68 MB/sec. Within the FPGA there are 4 distinct algorithms: Camera Link capture, Bilinear rectification, Bilateral subtraction pre-filtering and the Sum of Absolute Difference (SAD) disparity. Each module will be described in brief along with the data flow and control logic for the system. The system has been successfully fielded upon the Carnegie Mellon University's National Robotics Engineering Center (NREC) Crusher system during extensive field trials in 2007 and 2008 and is being implemented for other surface mobility systems at JPL.
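
    For reference, the SAD disparity kernel named above can be sketched in a few lines; the window size and search range here are arbitrary, and the FPGA pipeline's rectification and bilateral pre-filtering stages are omitted.

    ```python
    import numpy as np

    def sad_disparity(left, right, max_disp=32, win=5):
        """Brute-force sum-of-absolute-differences block matching on rectified grayscale images."""
        H, W = left.shape
        half = win // 2
        disp = np.zeros((H, W), dtype=np.uint8)
        for y in range(half, H - half):
            for x in range(half + max_disp, W - half):
                patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(int)
                best_cost, best_d = None, 0
                for d in range(max_disp):
                    cand = right[y - half:y + half + 1, x - d - half:x - d + half + 1].astype(int)
                    cost = np.abs(patch - cand).sum()
                    if best_cost is None or cost < best_cost:
                        best_cost, best_d = cost, d
                disp[y, x] = best_d
        return disp
    ```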

  19. BIG MAC: A bolometer array for mid-infrared astronomy, Center Director's Discretionary Fund

    NASA Technical Reports Server (NTRS)

    Telesco, C. M.; Decher, R.; Baugher, C.

    1985-01-01

    The infrared array referred to as Big Mac (for Marshall Array Camera) was designed for ground-based astronomical observations in the wavelength range 5 to 35 microns. It contains 20 discrete gallium-doped germanium bolometer detectors at a temperature of 1.4K. Each bolometer is irradiated by a square field mirror constituting a single pixel of the array. The mirrors are arranged contiguously in four columns and five rows, thus defining the array configuration. Big Mac utilizes cold reimaging optics and an up-looking dewar. The total Big Mac system also contains a telescope interface tube for mounting the dewar and a computer for data acquisition and processing. Initial astronomical observations at a major infrared observatory indicate that Big Mac performance is excellent, having achieved the design specifications and making this instrument an outstanding tool for astrophysics.

  20. Increasing Electrochemiluminescence Intensity of a Wireless Electrode Array Chip by Thousands of Times Using a Diode for Sensitive Visual Detection by a Digital Camera.

    PubMed

    Qi, Liming; Xia, Yong; Qi, Wenjing; Gao, Wenyue; Wu, Fengxia; Xu, Guobao

    2016-01-19

    A wireless electrochemiluminescence (ECL) electrode microarray chip, and the dramatic increase in ECL obtained by embedding a diode in an electromagnetic receiver coil, are reported here for the first time. The newly designed device consists of a chip and a transmitter. The chip has an electromagnetic receiver coil, a mini-diode, and a gold electrode array. The mini-diode can rectify alternating current into direct current and thus enhance ECL intensities by 18 thousand times, enabling sensitive visual detection using common cameras or smart phones as low-cost detectors. The detection limit of hydrogen peroxide using a digital camera is comparable to that using photomultiplier tube (PMT)-based detectors. Coupled with a PMT-based detector, the device can detect luminol with higher sensitivity, with linear ranges from 10 nM to 1 mM. Because of its advantages, including high sensitivity, high throughput, low cost, high portability, and simplicity, it is promising for point-of-care testing, drug screening, and high-throughput analysis.

  1. PRISM Spectrograph Optical Design

    NASA Technical Reports Server (NTRS)

    Chipman, Russell A.

    1995-01-01

    The objective of this contract is to explore optical design concepts for the PRISM spectrograph and produce a preliminary optical design. An exciting optical configuration has been developed which will allow both wavelength bands to be imaged onto the same detector array. At present the optical design is only partially complete because PRISM will require a fairly elaborate optical system to meet its specification for throughput (area*solid angle). The most complex part of the design, the spectrograph camera, is complete, providing proof of principle that a feasible design is attainable. This camera requires 3 aspheric mirrors to fit inside the 20x60 cm cross-section package. A complete design with reduced throughput (1/9th) has been prepared. The design documents the optical configuration concept. A suitable dispersing prism material, CdTe, has been identified for the prism spectrograph, after a comparison of many materials.

  2. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis.

    PubMed

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S

    2017-06-01

    Modern high-throughput biotechnologies such as microarray and next generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, hence the classical 'large p, small n' problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool in analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available and should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrated superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and the accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package "adaptiveHM", which is freely available from https://github.com/benliemory/adaptiveHM.
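
    The idea of letting historical data inform the prior can be illustrated with a toy empirical-Bayes shrinkage step. This is not the adaptiveHM implementation, just a generic sketch of shrinking noisy per-feature estimates toward a prior whose location and spread are estimated from historical measurements.

    ```python
    import numpy as np

    def shrink(means, sems, historical_values):
        """Shrink per-feature estimates toward a prior mean estimated from historical data."""
        mu0 = np.mean(historical_values)            # prior mean from historical data
        tau2 = np.var(historical_values)            # prior variance from historical data
        w = tau2 / (tau2 + sems ** 2)               # weight on the observed per-feature mean
        return w * means + (1 - w) * mu0

    obs = np.random.normal(0.5, 1.0, size=1000)     # toy per-feature estimates from the current experiment
    sems = np.full(1000, 0.8)                       # their standard errors
    hist = np.random.normal(0.4, 0.3, size=100000)  # toy historical measurements
    posterior_means = shrink(obs, sems, hist)
    ```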

  3. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis

    PubMed Central

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S.

    2016-01-01

    Modern high-throughput biotechnologies such as microarray and next generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, hence the classical ‘large p, small n’ problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool in analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available and should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrated superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and the accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package “adaptiveHM”, which is freely available from https://github.com/benliemory/adaptiveHM. PMID:28919931

  4. Big Data access and infrastructure for modern biology: case studies in data repository utility.

    PubMed

    Boles, Nathan C; Stone, Tyler; Bergeron, Charles; Kiehl, Thomas R

    2017-01-01

    Big Data is no longer solely the purview of big organizations with big resources. Today's routine tools and experimental methods can generate large slices of data. For example, high-throughput sequencing can quickly interrogate biological systems for the expression levels of thousands of different RNAs, examine epigenetic marks throughout the genome, and detect differences in the genomes of individuals. Multichannel electrophysiology platforms produce gigabytes of data in just a few minutes of recording. Imaging systems generate videos capturing biological behaviors over the course of days. Thus, any researcher now has access to a veritable wealth of data. However, the ability of any given researcher to utilize that data is limited by her/his own resources and skills for downloading, storing, and analyzing the data. In this paper, we examine the necessary resources required to engage Big Data, survey the state of modern data analysis pipelines, present a few data repository case studies, and touch on current institutions and programs supporting the work that relies on Big Data. © 2016 New York Academy of Sciences.

  5. Integration into Big Data: First Steps to Support Reuse of Comprehensive Toxicity Model Modules (SOT)

    EPA Science Inventory

    Data surrounding the needs of human disease and toxicity modeling are largely siloed limiting the ability to extend and reuse modules across knowledge domains. Using an infrastructure that supports integration across knowledge domains (animal toxicology, high-throughput screening...

  6. Profiling optimization for big data transfer over dedicated channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yun, D.; Wu, Qishi; Rao, Nageswara S

    The transfer of big data is increasingly supported by dedicated channels in high-performance networks, where transport protocols play an important role in maximizing application-level throughput and link utilization. The performance of transport protocols largely depends on their control parameter settings, but it is prohibitively time consuming to conduct an exhaustive search in a large parameter space to find the best set of parameter values. We propose FastProf, a stochastic approximation-based transport profiler, to quickly determine the optimal operational zone of a given data transfer protocol/method over dedicated channels. We implement and test the proposed method using both emulations based on real-life performance measurements and experiments over physical connections with short (2 ms) and long (380 ms) delays. Both the emulation and experimental results show that FastProf significantly reduces the profiling overhead while achieving end-to-end throughput performance comparable to the exhaustive search-based approach.
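
    To convey the flavor of a stochastic-approximation profiler (a toy sketch, not FastProf itself), the snippet below tunes a single transport parameter against a noisy, hypothetical throughput measurement using a Kiefer-Wolfowitz style finite-difference update.

    ```python
    import numpy as np

    def measure(streams):
        """Hypothetical noisy throughput measurement, peaking near 16 parallel streams."""
        return -(streams - 16.0) ** 2 + np.random.normal(0, 4)

    def profile(x0=4.0, iters=50, a=0.5, c=1.0):
        x = x0
        for k in range(1, iters + 1):
            ak, ck = a / k, c / k ** 0.25
            grad = (measure(x + ck) - measure(x - ck)) / (2 * ck)   # finite-difference gradient estimate
            x = float(np.clip(x + ak * grad, 1, 64))                # ascend toward higher throughput
        return round(x)

    best_streams = profile()
    ```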

  7. Big Data Analytics for Genomic Medicine

    PubMed Central

    He, Karen Y.; Ge, Dongliang; He, Max M.

    2017-01-01

    Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients’ genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights through examining large-scale various data sets. While integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure exhibit challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identify clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from the EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we also present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs. PMID:28212287

  8. Big Data Analytics for Genomic Medicine.

    PubMed

    He, Karen Y; Ge, Dongliang; He, Max M

    2017-02-15

    Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients' genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights by examining large and varied data sets. While the integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure pose challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identify clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from the EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we also present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs.
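
    The Big Data toolset itself is not reproduced in the abstract, so the snippet below is only an illustrative sketch of the kind of filtering step such a pipeline performs: intersecting NGS variant calls with a curated list of actionable genes and attaching patient context drawn from an EHR record. All gene names, thresholds, and record fields are assumptions.

    variants = [  # minimal VCF-like records: (gene, variant, allele frequency)
        ("EGFR", "p.L858R", 0.32),
        ("BRCA1", "c.68_69delAG", 0.48),
        ("TTN", "p.A123T", 0.51),
    ]
    actionable_genes = {"EGFR", "BRCA1", "KRAS"}          # toy curated knowledge base
    ehr_record = {"patient_id": "P001", "diagnosis": "lung adenocarcinoma"}

    def clinically_actionable(calls, genes, min_af=0.05):
        # Keep calls that fall in actionable genes and exceed a minimum allele frequency.
        return [c for c in calls if c[0] in genes and c[2] >= min_af]

    for gene, change, af in clinically_actionable(variants, actionable_genes):
        print(f"{ehr_record['patient_id']} ({ehr_record['diagnosis']}): "
              f"{gene} {change} at AF={af:.2f} flagged for review")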

  9. Improved high-throughput quantification of luminescent microplate assays using a common Western-blot imaging system.

    PubMed

    Hawkins, Liam J; Storey, Kenneth B

    2017-01-01

    Common Western-blot imaging systems have previously been adapted to measure signals from luminescent microplate assays. This can be a cost-saving measure, as Western-blot imaging systems are common laboratory equipment and could substitute for a dedicated luminometer if one is not otherwise available. One previously unrecognized limitation is that the signals captured by the cameras in these systems are not equal for all wells. Signals depend on the angle of incidence to the camera, and thus on the location of the well on the microplate. Here we show that:
    •The position of a well on a microplate significantly affects the signal captured by a common Western-blot imaging system from a luminescent assay.
    •The effect of well position can easily be corrected for.
    •This method can be applied to commercially available luminescent assays, allowing for high-throughput quantification of a wide range of biological processes and biochemical reactions.
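
    The correction described above can be illustrated with per-well correction factors estimated from a reference plate in which every well holds the same luminescent standard; wells far from the plate centre appear dimmer because of their angle to the camera, and scaling each well back to the plate mean removes that bias. The sketch below assumes such a reference plate and uses synthetic numbers; the published correction may differ in detail.

    import numpy as np

    rng = np.random.default_rng(0)
    rows, cols = 8, 12                                   # standard 96-well plate

    # Simulated reference plate: the true signal is uniform, but wells far from
    # the plate centre appear dimmer because of their angle to the camera.
    yy, xx = np.mgrid[0:rows, 0:cols]
    falloff = 1.0 - 0.004 * ((yy - 3.5) ** 2 + (xx - 5.5) ** 2)
    reference = 1000.0 * falloff * rng.normal(1.0, 0.02, (rows, cols))

    # Correction factor per well: scale each well back to the plate mean.
    correction = reference.mean() / reference

    # Applying the factors to a new plate removes the positional bias.
    sample_plate = 500.0 * falloff * rng.normal(1.0, 0.02, (rows, cols))
    corrected = sample_plate * correction
    print(f"CV before: {sample_plate.std() / sample_plate.mean():.3f}, "
          f"after: {corrected.std() / corrected.mean():.3f}")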

  10. Raspberry Pi-powered imaging for plant phenotyping.

    PubMed

    Tovar, Jose C; Hoyer, J Steen; Lin, Andy; Tielking, Allison; Callen, Steven T; Elizabeth Castillo, S; Miller, Michael; Tessman, Monica; Fahlgren, Noah; Carrington, James C; Nusinow, Dmitri A; Gehan, Malia A

    2018-03-01

    Image-based phenomics is a powerful approach to capture and quantify plant diversity. However, commercial platforms that make consistent image acquisition easy are often cost-prohibitive. To make high-throughput phenotyping methods more accessible, low-cost microcomputers and cameras can be used to acquire plant image data. We used low-cost Raspberry Pi computers and cameras to manage and capture plant image data. Detailed here are three different applications of Raspberry Pi-controlled imaging platforms for seed and shoot imaging. Images obtained from each platform were suitable for extracting quantifiable plant traits (e.g., shape, area, height, color) en masse using open-source image processing software such as PlantCV. This protocol describes three low-cost platforms for image acquisition that are useful for quantifying plant diversity. When coupled with open-source image processing tools, these imaging platforms provide viable low-cost solutions for incorporating high-throughput phenomics into a wide range of research programs.
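
    PlantCV supplies ready-made workflows for this step; the sketch below shows the same basic idea with plain OpenCV, assuming an input image captured by one of the Raspberry Pi platforms: segment the plant from the background, then measure projected area, bounding-box height, and average colour. The file name and colour thresholds are assumptions.

    import cv2
    import numpy as np

    img = cv2.imread("plant.png")                        # assumed Pi-camera image (BGR)
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

    # Segment green tissue; hue/saturation bounds must be tuned per crop and setup.
    mask = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    area_px = int(cv2.countNonZero(mask))                # projected shoot area (pixels)
    ys, xs = np.nonzero(mask)
    height_px = int(ys.max() - ys.min()) if area_px else 0
    mean_bgr = cv2.mean(img, mask=mask)[:3]              # average colour of plant pixels

    print({"area_px": area_px, "height_px": height_px, "mean_bgr": mean_bgr})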

  11. An Overview of High-performance Parallel Big Data transfers over multiple network channels with Transport Layer Security (TLS) and TLS plus Perfect Forward Secrecy (PFS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fang, Chin; Corttrell, R. A.

    This Technical Note provides an overview of high-performance parallel Big Data transfers with and without encryption for data in transit over multiple network channels. It shows that with the parallel approach, it is feasible to carry out high-performance parallel "encrypted" Big Data transfers without serious impact to throughput, although other impacts, e.g., energy consumption, should be investigated. It also explains our rationale for using a statistics-based approach for gaining understanding from test results and for improving the system. The presentation is high-level in nature. Nevertheless, at the end we pose some questions and identify potentially fruitful directions for future work.

  12. Applying computation biology and "big data" to develop multiplex diagnostics for complex chronic diseases such as osteoarthritis.

    PubMed

    Ren, Guomin; Krawetz, Roman

    2015-01-01

    The data explosion in the last decade is revolutionizing diagnostics research and the healthcare industry, offering both opportunities and challenges. These high-throughput "omics" techniques have generated more scientific data in the last few years than in the entire history of mankind. Here we present a brief summary of how "big data" have influenced early diagnosis of complex diseases. We will also review some of the most commonly used "omics" techniques and their applications in diagnostics. Finally, we will discuss the issues brought by these new techniques when translating laboratory discoveries to clinical practice.

  13. An Overview of the HST Advanced Camera for Surveys' On-orbit Performance

    NASA Astrophysics Data System (ADS)

    Hartig, G. F.; Ford, H. C.; Illingworth, G. D.; Clampin, M.; Bohlin, R. C.; Cox, C.; Krist, J.; Sparks, W. B.; De Marchi, G.; Martel, A. R.; McCann, W. J.; Meurer, G. R.; Sirianni, M.; Tsvetanov, Z.; Bartko, F.; Lindler, D. J.

    2002-05-01

    The Advanced Camera for Surveys (ACS) was installed in the HST on 7 March 2002 during the fourth servicing mission to the observatory, and is now beginning science operations. The ACS provides HST observers with a considerably more sensitive, higher-resolution camera with wider field and polarimetric, coronagraphic, low-resolution spectrographic and solar-blind FUV capabilities. We review selected results of the early verification and calibration program, comparing the achieved performance with the advertised specifications. Emphasis is placed on the optical characteristics of the camera, including image quality, throughput, geometric distortion and stray-light performance. More detailed analyses of various aspects of the ACS performance are presented in other papers at this meeting. This work was supported by a NASA contract and a NASA grant.

  14. Deadpool: A how-to-build guide

    USDA-ARS?s Scientific Manuscript database

    An easy-to-customize, low-cost, low disturbance proximal sensing cart for field-based high-throughput phenotyping is described. General dimensions and build guidelines are provided. The cart, named Deadpool, supports mounting multiple proximal sensors and cameras for characterizing plant traits grow...

  15. Officials nationwide give a green light to automated traffic enforcement

    DOT National Transportation Integrated Search

    2000-03-11

    There has been resistance to using cameras to automatically identify vehicles driven by motorists who run red lights and drive faster than the posted speed limits. Fairness, privacy, and "big brother" have been cited as reasons. The article examines ...

  16. Advanced Virus Detection Technologies Interest Group (AVDTIG): Efforts on High Throughput Sequencing (HTS) for Virus Detection.

    PubMed

    Khan, Arifa S; Vacante, Dominick A; Cassart, Jean-Pol; Ng, Siemon H S; Lambert, Christophe; Charlebois, Robert L; King, Kathryn E

    Several nucleic-acid based technologies have recently emerged with capabilities for broad virus detection. One of these, high throughput sequencing, has the potential for novel virus detection because this method does not depend upon prior viral sequence knowledge. However, the use of high throughput sequencing for testing biologicals poses greater challenges as compared to other newly introduced tests due to its technical complexities and big data bioinformatics. Thus, the Advanced Virus Detection Technologies Users Group was formed as a joint effort by regulatory and industry scientists to facilitate discussions and provide a forum for sharing data and experiences using advanced new virus detection technologies, with a focus on high throughput sequencing technologies. The group was initiated as a task force that was coordinated by the Parenteral Drug Association and subsequently became the Advanced Virus Detection Technologies Interest Group to continue efforts for using new technologies for detection of adventitious viruses with broader participation, including international government agencies, academia, and technology service providers. © PDA, Inc. 2016.

  17. Performance Evaluation of IEEE 802.11ah Networks With High-Throughput Bidirectional Traffic.

    PubMed

    Šljivo, Amina; Kerkhove, Dwight; Tian, Le; Famaey, Jeroen; Munteanu, Adrian; Moerman, Ingrid; Hoebeke, Jeroen; De Poorter, Eli

    2018-01-23

    So far, existing sub-GHz wireless communication technologies focused on low-bandwidth, long-range communication with large numbers of constrained devices. Although these characteristics are fine for many Internet of Things (IoT) applications, more demanding application requirements could not be met and legacy Internet technologies such as Transmission Control Protocol/Internet Protocol (TCP/IP) could not be used. This has changed with the advent of the new IEEE 802.11ah Wi-Fi standard, which is much more suitable for reliable bidirectional communication and high-throughput applications over a wide area (up to 1 km). The standard offers great possibilities for network performance optimization through a number of physical- and link-layer configurable features. However, given that the optimal configuration parameters depend on traffic patterns, the standard does not dictate how to determine them. Such a large number of configuration options can lead to sub-optimal or even incorrect configurations. Therefore, we investigated how two key mechanisms, Restricted Access Window (RAW) grouping and Traffic Indication Map (TIM) segmentation, influence scalability, throughput, latency and energy efficiency in the presence of bidirectional TCP/IP traffic. We considered both high-throughput video streaming traffic and large-scale reliable sensing traffic and investigated TCP behavior in both scenarios when the link layer introduces long delays. This article presents the relations between attainable throughput per station and attainable number of stations, as well as the influence of RAW, TIM and TCP parameters on both. We found that up to 20 continuously streaming IP-cameras can be reliably connected via IEEE 802.11ah with a maximum average data rate of 160 kbps, whereas 10 IP-cameras can achieve average data rates of up to 255 kbps over 200 m. Up to 6960 stations transmitting every 60 s can be connected over 1 km with no lost packets. The presented results enable the fine tuning of RAW and TIM parameters for throughput-demanding reliable applications (i.e., video streaming, firmware updates) on one hand, and very dense low-throughput reliable networks with bidirectional traffic on the other hand.

  18. Performance Evaluation of IEEE 802.11ah Networks With High-Throughput Bidirectional Traffic

    PubMed Central

    Kerkhove, Dwight; Tian, Le; Munteanu, Adrian; De Poorter, Eli

    2018-01-01

    So far, existing sub-GHz wireless communication technologies focused on low-bandwidth, long-range communication with large numbers of constrained devices. Although these characteristics are fine for many Internet of Things (IoT) applications, more demanding application requirements could not be met and legacy Internet technologies such as Transmission Control Protocol/Internet Protocol (TCP/IP) could not be used. This has changed with the advent of the new IEEE 802.11ah Wi-Fi standard, which is much more suitable for reliable bidirectional communication and high-throughput applications over a wide area (up to 1 km). The standard offers great possibilities for network performance optimization through a number of physical- and link-layer configurable features. However, given that the optimal configuration parameters depend on traffic patterns, the standard does not dictate how to determine them. Such a large number of configuration options can lead to sub-optimal or even incorrect configurations. Therefore, we investigated how two key mechanisms, Restricted Access Window (RAW) grouping and Traffic Indication Map (TIM) segmentation, influence scalability, throughput, latency and energy efficiency in the presence of bidirectional TCP/IP traffic. We considered both high-throughput video streaming traffic and large-scale reliable sensing traffic and investigated TCP behavior in both scenarios when the link layer introduces long delays. This article presents the relations between attainable throughput per station and attainable number of stations, as well as the influence of RAW, TIM and TCP parameters on both. We found that up to 20 continuously streaming IP-cameras can be reliably connected via IEEE 802.11ah with a maximum average data rate of 160 kbps, whereas 10 IP-cameras can achieve average data rates of up to 255 kbps over 200 m. Up to 6960 stations transmitting every 60 s can be connected over 1 km with no lost packets. The presented results enable the fine tuning of RAW and TIM parameters for throughput-demanding reliable applications (i.e., video streaming, firmware updates) on one hand, and very dense low-throughput reliable networks with bidirectional traffic on the other hand. PMID:29360798

  19. ACS Data Handbook v.6.0

    NASA Astrophysics Data System (ADS)

    Gonzaga, S.; et al.

    2011-03-01

    ACS was designed to provide a deep, wide-field survey capability from the visible to near-IR using the Wide Field Camera (WFC), high resolution imaging from the near-UV to near-IR with the now-defunct High Resolution Camera (HRC), and solar-blind far-UV imaging using the Solar Blind Camera (SBC). The discovery efficiency of ACS's Wide Field Channel (i.e., the product of WFC's field of view and throughput) is 10 times greater than that of WFPC2. The failure of ACS's CCD electronics in January 2007 brought a temporary halt to CCD imaging until Servicing Mission 4 in May 2009, when WFC functionality was restored. Unfortunately, the high-resolution optical imaging capability of HRC was not recovered.

  20. Training Students to Extract Value from Big Data: Summary of a Workshop

    ERIC Educational Resources Information Center

    Mellody, Maureen

    2014-01-01

    As the availability of high-throughput data-collection technologies, such as information-sensing mobile devices, remote sensing, internet log records, and wireless sensor networks has grown, science, engineering, and business have rapidly transitioned from striving to develop information from scant data to a situation in which the challenge is now…

  1. Comparing modelling techniques when designing VPH gratings for BigBOSS

    NASA Astrophysics Data System (ADS)

    Poppett, Claire; Edelstein, Jerry; Lampton, Michael; Jelinsky, Patrick; Arns, James

    2012-09-01

    BigBOSS is a Stage IV Dark Energy instrument based on the Baryon Acoustic Oscillations (BAO) and Redshift Space Distortions (RSD) techniques, using spectroscopic data of 20 million ELG and LRG galaxies at 0.5<=z<=1.6 in addition to several hundred thousand QSOs at 0.5<=z<=3.5. When designing BigBOSS instrumentation, it is imperative to maximize throughput whilst maintaining a resolving power of between R=1500 and 4000 over a wavelength range of 360-980 nm. Volume Phase Holographic (VPH) gratings have been identified as a key technology that will enable the efficiency requirement to be met; however, it is important to be able to accurately predict their performance. In this paper we quantitatively compare different modelling techniques in order to assess the parameter space over which they are more capable of accurately predicting measured performance. Finally we present baseline parameters for grating designs that are most suitable for the BigBOSS instrument.

  2. Big Data in industry

    NASA Astrophysics Data System (ADS)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, new units of measure such as the exabyte, zettabyte, and yottabyte have become necessary to describe the amount of data. This growth has created a situation in which classic systems for the collection, storage, processing, and visualization of data are losing the battle with the volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years, and it is increasingly recognized in the business world and in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics services-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  3. Big Sky and Greenhorn Drill Holes and CheMin X-ray Diffraction

    NASA Image and Video Library

    2015-12-17

    The graph at right presents information from the NASA Curiosity Mars rover's onboard analysis of rock powder drilled from the "Big Sky" and "Greenhorn" target locations, shown at left. X-ray diffraction analysis of the Greenhorn sample inside the rover's Chemistry and Mineralogy (CheMin) instrument revealed an abundance of silica in the form of noncrystalline opal. The broad hump in the background of the X-ray diffraction pattern for Greenhorn, compared to Big Sky, is diagnostic of opal. The image of Big Sky at upper left was taken by the rover's Mars Hand Lens Imager (MAHLI) camera the day the hole was drilled, Sept. 29, 2015, during the mission's 1,119th Martian day, or sol. The Greenhorn hole was drilled, and the MAHLI image at lower left was taken, on Oct. 18, 2015 (Sol 1137). http://photojournal.jpl.nasa.gov/catalog/PIA20272

  4. Big Sky and Greenhorn Drilling Area on Mount Sharp

    NASA Image and Video Library

    2015-12-17

    This view from the Mast Camera (Mastcam) on NASA's Curiosity Mars rover covers an area in "Bridger Basin" that includes the locations where the rover drilled a target called "Big Sky" on the mission's Sol 1119 (Sept. 29, 2015) and a target called "Greenhorn" on Sol 1137 (Oct. 18, 2015). The scene combines portions of several observations taken from sols 1112 to 1126 (Sept. 22 to Oct. 6, 2015) while Curiosity was stationed at Big Sky drilling site. The Big Sky drill hole is visible in the lower part of the scene. The Greenhorn target, in a pale fracture zone near the center of the image, had not yet been drilled when the component images were taken. Researchers selected this pair of drilling sites to investigate the nature of silica enrichment in the fracture zones of the area. http://photojournal.jpl.nasa.gov/catalog/PIA20270

  5. Cytotoxicity Test Based on Human Cells Labeled with Fluorescent Proteins: Fluorimetry, Photography, and Scanning for High-Throughput Assay.

    PubMed

    Kalinina, Marina A; Skvortsov, Dmitry A; Rubtsova, Maria P; Komarova, Ekaterina S; Dontsova, Olga A

    2018-06-01

    High- and medium-throughput assays are now routine methods for drug screening and toxicology investigations on mammalian cells. However, a simple and cost-effective analysis of cytotoxicity that can be carried out with commonly used laboratory equipment is still required. The developed cytotoxicity assays are based on human cell lines stably expressing eGFP, tdTomato, mCherry, or Katushka2S fluorescent proteins. Red fluorescent proteins exhibit a higher signal-to-noise ratio, due to less interference by medium autofluorescence, in comparison to green fluorescent protein. Measurements have been performed on a fluorescence scanner, a plate fluorimeter, and a camera photodocumentation system. For a 96-well plate assay, the sensitivity per well and the measurement duration were 250 cells and 15 min for the scanner, 500 cells and 2 min for the plate fluorimeter, and 1000 cells and less than 1 min for the camera detection. These sensitivities are similar to those of commonly used MTT (tetrazolium dye) assays. The scanner and the camera had not previously been applied to cytotoxicity evaluation. An image processing scheme for the high-resolution scanner is proposed that significantly diminishes the number of control wells, even for a library containing fluorescent substances. The suggested cytotoxicity assay has been verified by measuring the cytotoxicity of several well-known cytotoxic drugs and further applied to test a set of novel bacteriotoxic compounds in a medium-throughput format. The fluorescent signal of living cells is detected without disturbing them or adding any reagents, thus allowing investigation of time-dependent cytotoxicity effects on the same sample of cells. A fast, simple, and cost-effective assay is suggested for cytotoxicity evaluation based on mammalian cells expressing fluorescent proteins and commonly used laboratory equipment.
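
    A minimal sketch of the readout step for such an assay is shown below: per-well fluorescence is background-subtracted and normalized to the mean of untreated control wells to give a viability fraction. The plate layout, background value, and intensities are assumed numbers for illustration only.

    import numpy as np

    background = 50.0                                    # medium autofluorescence (a.u.)
    controls = np.array([2100, 1980, 2050, 2010], float) # untreated control wells
    treated = {"drug_A 10uM": 540.0, "drug_A 1uM": 1650.0, "drug_B 10uM": 1995.0}

    control_mean = controls.mean() - background
    for well, signal in treated.items():
        viability = (signal - background) / control_mean  # fraction of control signal
        print(f"{well}: viability = {viability:.0%}")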

  6. Use of big data in drug development for precision medicine

    PubMed Central

    Kim, Rosa S.; Goossens, Nicolas; Hoshida, Yujin

    2016-01-01

    Summary Drug development has been a costly and lengthy process with an extremely low success rate and lack of consideration of individual diversity in drug response and toxicity. Over the past decade, an alternative “big data” approach has been expanding at an unprecedented pace based on the development of electronic databases of chemical substances, disease gene/protein targets, functional readouts, and clinical information covering inter-individual genetic variations and toxicities. This paradigm shift has enabled systematic, high-throughput, and accelerated identification of novel drugs or repurposed indications of existing drugs for pathogenic molecular aberrations specifically present in each individual patient. The exploding interest from the information technology and direct-to-consumer genetic testing industries has been further facilitating the use of big data to achieve personalized Precision Medicine. Here we overview currently available resources and discuss future prospects. PMID:27430024

  7. [Relevance of big data for molecular diagnostics].

    PubMed

    Bonin-Andresen, M; Smiljanovic, B; Stuhlmüller, B; Sörensen, T; Grützkau, A; Häupl, T

    2018-04-01

    Big data analysis raises the expectation that computerized algorithms may extract new knowledge from otherwise unmanageable, vast data sets. What are the algorithms behind the big data discussion? In principle, high-throughput technologies in molecular research introduced big data, and the development and application of analysis tools, into the field of rheumatology some 15 years ago. This includes especially omics technologies, such as genomics, transcriptomics, and cytomics. Some basic methods of data analysis are provided along with the technology; however, functional analysis and interpretation require adaptation of existing software tools or development of new ones. For these steps, structuring and evaluating the data according to the biological context is extremely important and not only a mathematical problem. This aspect has to be considered much more for molecular big data than for data analyzed in health economics or epidemiology. Molecular data are structured first by the applied technology and present quantitative characteristics that follow the principles of their biological nature. These biological dependencies have to be integrated into software solutions, which may require networks of molecular big data of the same or even different technologies in order to achieve cross-technology confirmation. Increasingly extensive recording of molecular processes, including in individual patients, is generating personal big data and requires new strategies for data management in order to develop data-driven, individualized interpretation concepts. With this perspective in mind, translation of information derived from molecular big data will also require new specifications for education and professional competence.

  8. Improved spatial resolution of luminescence images acquired with a silicon line scanning camera

    NASA Astrophysics Data System (ADS)

    Teal, Anthony; Mitchell, Bernhard; Juhl, Mattias K.

    2018-04-01

    Luminescence imaging is currently being used to provide spatially resolved defect information in high-volume silicon solar cell production. One option for obtaining the high throughput required for on-the-fly detection is the use of silicon line scan cameras. However, when using a silicon-based camera, the spatial resolution is reduced as a result of weakly absorbed light scattering within the camera's chip. This paper addresses this issue by applying deconvolution with a measured point spread function. The paper extends the methods for determining the point spread function of a silicon area camera to a line scan camera with charge transfer. The improvement in resolution is quantified in the Fourier domain and in the spatial domain on an image of a multicrystalline silicon brick. It is found that light spreading beyond the active sensor area is significant in line scan sensors, but can be corrected for through normalization of the point spread function. The application of this method improves the raw data, allowing effective detection of spatially resolved defects in manufacturing.
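
    The paper's own correction pipeline is not reproduced here; the sketch below illustrates the general idea with a frequency-domain (Wiener-style) deconvolution of a one-dimensional line-scan profile using a measured PSF normalized to unit sum, which is the normalization step the authors highlight. The regularization constant and the toy edge profile are assumptions.

    import numpy as np

    def deconvolve(signal, psf, reg=1e-3):
        # Frequency-domain deconvolution with simple Wiener-style regularization.
        n = len(signal)
        psf = np.asarray(psf, float)
        psf = psf / psf.sum()               # include light spread beyond the sensor area
        H = np.fft.rfft(psf, n)
        S = np.fft.rfft(signal, n)
        est = S * np.conj(H) / (np.abs(H) ** 2 + reg)   # damp weakly transferred bands
        return np.fft.irfft(est, n)

    # Toy check: blur a sharp edge with an exponential-tail PSF (circularly, so the
    # FFT model is exact) and recover it; the restored error is noticeably smaller.
    true = np.r_[np.zeros(100), np.ones(100)]
    psf = np.exp(-np.arange(41) / 6.0)
    blurred = np.fft.irfft(np.fft.rfft(true) * np.fft.rfft(psf / psf.sum(), 200), 200)
    print(np.abs(blurred - true).max(), np.abs(deconvolve(blurred, psf) - true).max())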

  9. A Precision Metrology System for the Hubble Space Telescope Wide Field Camera 3 Instrument

    NASA Technical Reports Server (NTRS)

    Toland, Ronald W.

    2003-01-01

    The Wide Field Camera 3 (WFC3) instrument for the Hubble Space Telescope (HST) will replace the current Wide Field and Planetary Camera 2 (WFPC2). By providing higher throughput and sensitivity than WFPC2, and operating from the near-IR to the near-UV, WFC3 will once again bring the performance of HST above that from ground-based observatories. Crucial to the integration of the WFC3 optical bench is a pair of 2-axis cathetometers used to view targets which cannot be seen by other means when the bench is loaded into its enclosure. The setup and calibration of these cathetometers is described, along with results from a comparison of the cathetometer system with other metrology techniques.

  10. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    PubMed

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the biological data amount is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in life sciences. As a result, both biologists and computer scientists are facing the challenge of gaining a profound insight into the deepest biological functions from big biological data. This in turn requires massive computational resources. Therefore, high performance computing (HPC) platforms are highly needed as well as efficient and scalable algorithms that can take advantage of these platforms. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study to compare the efficiency of different computing platforms for handling the classical biological sequence alignment problem. At last we discuss the open issues in big biological data analytics.
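
    As a reference point for the alignment case study mentioned above, the computation every platform has to accelerate is a dynamic-programming recurrence; the small Smith-Waterman kernel below shows it in plain Python. The match, mismatch, and gap scores are conventional example values, not taken from the paper.

    def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
        # Local-alignment dynamic programming; returns the best local score.
        rows, cols = len(a) + 1, len(b) + 1
        H = [[0] * cols for _ in range(rows)]
        best = 0
        for i in range(1, rows):
            for j in range(1, cols):
                diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
                best = max(best, H[i][j])
        return best

    print(smith_waterman("ACACACTA", "AGCACACA"))   # best local alignment score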

  11. High-throughput screening for combinatorial thin-film library of thermoelectric materials.

    PubMed

    Watanabe, Masaki; Kita, Takuji; Fukumura, Tomoteru; Ohtomo, Akira; Ueno, Kazunori; Kawasaki, Masashi

    2008-01-01

    A high-throughput method has been developed to evaluate the Seebeck coefficient and electrical resistivity of combinatorial thin-film libraries of thermoelectric materials from room temperature to 673 K. Thin-film samples several millimeters in size were deposited on an integrated Al2O3 substrate with embedded lead wires and local heaters for measurement of the thermopower under a controlled temperature gradient. An infrared camera was used for real-time observation of the temperature difference Delta T between two electrical contacts on the sample to obtain the Seebeck coefficient. The Seebeck coefficient and electrical resistivity of constantan thin films were shown to be almost identical to standard data for bulk constantan. High-throughput screening was demonstrated for a thermoelectric Mg-Si-Ge combinatorial library.
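
    The measurement described above reduces to the relation S = -dV/dT, with the temperature difference read from the infrared camera; a short worked example with assumed readings follows.

    delta_v = -1.05e-4              # measured thermoelectric voltage across contacts (V)
    t_hot, t_cold = 305.2, 302.7    # contact temperatures read from the IR camera (K)

    seebeck = -delta_v / (t_hot - t_cold)                       # S = -dV/dT, in V/K
    print(f"Seebeck coefficient ~ {seebeck * 1e6:.0f} uV/K")    # ~42 uV/K for these values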

  12. Big defensins, a diverse family of antimicrobial peptides that follows different patterns of expression in hemocytes of the oyster Crassostrea gigas.

    PubMed

    Rosa, Rafael D; Santini, Adrien; Fievet, Julie; Bulet, Philippe; Destoumieux-Garzón, Delphine; Bachère, Evelyne

    2011-01-01

    Big defensin is an antimicrobial peptide composed of a highly hydrophobic N-terminal region and a cationic C-terminal region containing six cysteine residues involved in three internal disulfide bridges. While big defensin sequences have been reported in various mollusk species, few studies have been devoted to their sequence diversity, gene organization and their expression in response to microbial infections. Using the high-throughput Digital Gene Expression approach, we have identified in Crassostrea gigas oysters several sequences coding for big defensins induced in response to a Vibrio infection. We showed that the oyster big defensin family is composed of three members (named Cg-BigDef1, Cg-BigDef2 and Cg-BigDef3) that are encoded by distinct genomic sequences. All Cg-BigDefs contain a hydrophobic N-terminal domain and a cationic C-terminal domain that resembles vertebrate β-defensins. Both domains are encoded by separate exons. We found that big defensins form a group predominantly present in mollusks and closer to vertebrate defensins than to invertebrate and fungi CSαβ-containing defensins. Moreover, we showed that Cg-BigDefs are expressed in oyster hemocytes only and follow different patterns of gene expression. While Cg-BigDef3 is non-regulated, both Cg-BigDef1 and Cg-BigDef2 transcripts are strongly induced in response to bacterial challenge. Induction was dependent on pathogen associated molecular patterns but not damage-dependent. The inducibility of Cg-BigDef1 was confirmed by HPLC and mass spectrometry, since ions with a molecular mass compatible with mature Cg-BigDef1 (10.7 kDa) were present in immune-challenged oysters only. From our biochemical data, native Cg-BigDef1 would result from the elimination of a prepropeptide sequence and the cyclization of the resulting N-terminal glutamine residue into a pyroglutamic acid. We provide here the first report showing that big defensins form a family of antimicrobial peptides diverse not only in terms of sequences but also in terms of genomic organization and regulation of gene expression.

  13. Big Defensins, a Diverse Family of Antimicrobial Peptides That Follows Different Patterns of Expression in Hemocytes of the Oyster Crassostrea gigas

    PubMed Central

    Rosa, Rafael D.; Santini, Adrien; Fievet, Julie; Bulet, Philippe; Destoumieux-Garzón, Delphine; Bachère, Evelyne

    2011-01-01

    Background Big defensin is an antimicrobial peptide composed of a highly hydrophobic N-terminal region and a cationic C-terminal region containing six cysteine residues involved in three internal disulfide bridges. While big defensin sequences have been reported in various mollusk species, few studies have been devoted to their sequence diversity, gene organization and their expression in response to microbial infections. Findings Using the high-throughput Digital Gene Expression approach, we have identified in Crassostrea gigas oysters several sequences coding for big defensins induced in response to a Vibrio infection. We showed that the oyster big defensin family is composed of three members (named Cg-BigDef1, Cg-BigDef2 and Cg-BigDef3) that are encoded by distinct genomic sequences. All Cg-BigDefs contain a hydrophobic N-terminal domain and a cationic C-terminal domain that resembles vertebrate β-defensins. Both domains are encoded by separate exons. We found that big defensins form a group predominantly present in mollusks and closer to vertebrate defensins than to invertebrate and fungi CSαβ-containing defensins. Moreover, we showed that Cg-BigDefs are expressed in oyster hemocytes only and follow different patterns of gene expression. While Cg-BigDef3 is non-regulated, both Cg-BigDef1 and Cg-BigDef2 transcripts are strongly induced in response to bacterial challenge. Induction was dependent on pathogen associated molecular patterns but not damage-dependent. The inducibility of Cg-BigDef1 was confirmed by HPLC and mass spectrometry, since ions with a molecular mass compatible with mature Cg-BigDef1 (10.7 kDa) were present in immune-challenged oysters only. From our biochemical data, native Cg-BigDef1 would result from the elimination of a prepropeptide sequence and the cyclization of the resulting N-terminal glutamine residue into a pyroglutamic acid. Conclusions We provide here the first report showing that big defensins form a family of antimicrobial peptides diverse not only in terms of sequences but also in terms of genomic organization and regulation of gene expression. PMID:21980497

  14. High throughput imaging cytometer with acoustic focussing.

    PubMed

    Zmijan, Robert; Jonnalagadda, Umesh S; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn; Glynne-Jones, Peter

    2015-10-31

    We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high resolution, low noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however a longer device would remove this constraint.
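
    Two of the quoted figures can be sanity-checked with simple arithmetic: the residual motion blur is the untracked cell velocity times the exposure divided by the pixel size at the object plane, and the objects imaged per frame follow from the throughput and frame rate. The flow speed, tracking mismatch, and pixel size below are assumptions; the exposure and throughput figures are from the abstract.

    flow_speed_um_s = 20_000.0      # assumed cell speed in the channel (um/s)
    tracking_error = 0.01           # assumed 1% mismatch between galvo sweep and flow
    exposure_s = 10e-3              # exposure quoted in the abstract
    pixel_um = 1.0                  # assumed pixel size at the object plane (um)

    blur_px = flow_speed_um_s * tracking_error * exposure_s / pixel_um
    cells_per_frame = 208_000 / 80  # quoted throughput divided by quoted frame rate
    print(f"motion blur ~ {blur_px:.1f} px, objects per frame ~ {cells_per_frame:.0f}")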

  15. Camera Calibration with Radial Variance Component Estimation

    NASA Astrophysics Data System (ADS)

    Mélykuti, B.; Kruck, E. J.

    2014-11-01

    Camera calibration plays a more and more important role in recent times. Besides real digital aerial survey cameras, the photogrammetric market is dominated by a large number of non-metric digital cameras mounted on UAVs or other low-weight flying platforms. The in-flight calibration of those systems plays a significant role in considerably enhancing the geometric accuracy of survey photos. Photo measurements are expected to be more precise in the center of images than along the edges or in the corners. Using statistical methods, the accuracy of photo measurements has been analyzed as a function of the distance of points from the image center. This test provides a curve for the measurement precision as a function of the photo radius. A large number of camera types have been tested with well-penetrated point measurements in image space. The results of the tests show a general functional connection between accuracy and radial distance and provide a method for checking and enhancing the geometric capability of the cameras with respect to these results.

  16. Product Plan of New Generation System Camera "OLYMPUS PEN E-P1"

    NASA Astrophysics Data System (ADS)

    Ogawa, Haruo

    "OLYMPUS PEN E-P1", which is new generation system camera, is the first product of Olympus which is new standard "Micro Four-thirds System" for high-resolution mirror-less cameras. It continues good sales by the concept of "small and stylish design, easy operation and SLR image quality" since release on July 3, 2009. On the other hand, the half-size film camera "OLYMPUS PEN" was popular by the concept "small and stylish design and original mechanism" since the first product in 1959 and recorded sale number more than 17 million with 17 models. By the 50th anniversary topic and emotional value of the Olympus pen, Olympus pen E-P1 became big sales. I would like to explain the way of thinking of the product plan that included not only the simple functional value but also emotional value on planning the first product of "Micro Four-thirds System".

  17. Lights, camera, action: high-throughput plant phenotyping is ready for a close-up

    USDA-ARS?s Scientific Manuscript database

    Modern techniques for crop improvement rely on both DNA sequencing and accurate quantification of plant traits to identify genes and germplasm of interest. With rapid advances in DNA sequencing technologies, plant phenotyping is now a bottleneck in advancing crop yields [1,2]. Furthermore, the envir...

  18. Goodman High Throughput Spectrograph | SOAR

    Science.gov Websites

    The Goodman High Throughput Spectrograph on the SOAR telescope covers the 320-850 nm wavelength range. The instrument is described in Clemens et al. (2004, SPIE).

  19. High throughput imaging and analysis for biological interpretation of agricultural plants and environmental interaction

    NASA Astrophysics Data System (ADS)

    Hong, Hyundae; Benac, Jasenka; Riggsbee, Daniel; Koutsky, Keith

    2014-03-01

    High throughput (HT) phenotyping of crops is essential to increase yield in environments deteriorated by climate change. The controlled environment of a greenhouse offers an ideal platform to study the genotype-to-phenotype linkages for crop screening. Advanced imaging technologies are used to study plants' responses to resource limitations such as water and nutrient deficiency. Advanced imaging technologies coupled with automation make HT phenotyping in the greenhouse not only feasible, but practical. Monsanto has a state-of-the-art automated greenhouse (AGH) facility. Handling of the soil, pots, water, and nutrients is completely automated. Images of the plants are acquired by multiple hyperspectral and broadband cameras. The hyperspectral cameras cover wavelengths from visible light through short-wave infrared (SWIR). In-house developed software analyzes the images to measure plant morphological and biochemical properties. We measure phenotypic metrics such as plant area, height, and width as well as biomass. Hyperspectral imaging allows us to measure biochemical metrics such as chlorophyll, anthocyanin, and foliar water content. The last 4 years of AGH operations on crops such as corn, soybean, and cotton have demonstrated successful application of imaging and analysis technologies for high-throughput plant phenotyping. Using HT phenotyping, scientists have shown strong correlations to environmental conditions, such as water and nutrient deficits, as well as the ability to tease apart distinct differences in the genetic backgrounds of crops.
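
    One common way such reflectance measurements are turned into plant-health metrics is through normalized band-ratio indices; the sketch below computes a greenness index (NDVI) and a water index from assumed reflectance values at assumed band positions. It illustrates the general approach only and is not the proprietary AGH analysis.

    def normalized_index(band_a, band_b):
        return (band_a - band_b) / (band_a + band_b)

    # Assumed mean canopy reflectances at three illustrative band positions.
    reflectance = {"red_670nm": 0.08, "nir_800nm": 0.45, "swir_1450nm": 0.22}

    ndvi = normalized_index(reflectance["nir_800nm"], reflectance["red_670nm"])    # greenness
    ndwi = normalized_index(reflectance["nir_800nm"], reflectance["swir_1450nm"])  # water status
    print(f"NDVI = {ndvi:.2f}, NDWI = {ndwi:.2f}")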

  20. Intuitive web-based experimental design for high-throughput biomedical data.

    PubMed

    Friedrich, Andreas; Kenar, Erhan; Kohlbacher, Oliver; Nahnsen, Sven

    2015-01-01

    Big data bioinformatics aims at drawing biological conclusions from huge and complex biological datasets. Added value from the analysis of big data, however, is only possible if the data is accompanied by accurate metadata annotation. Particularly in high-throughput experiments intelligent approaches are needed to keep track of the experimental design, including the conditions that are studied as well as information that might be interesting for failure analysis or further experiments in the future. In addition to the management of this information, means for an integrated design and interfaces for structured data annotation are urgently needed by researchers. Here, we propose a factor-based experimental design approach that enables scientists to easily create large-scale experiments with the help of a web-based system. We present a novel implementation of a web-based interface allowing the collection of arbitrary metadata. To exchange and edit information we provide a spreadsheet-based, humanly readable format. Subsequently, sample sheets with identifiers and metainformation for data generation facilities can be created. Data files created after measurement of the samples can be uploaded to a datastore, where they are automatically linked to the previously created experimental design model.
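
    At its core, the factor-based design described above amounts to expanding the Cartesian product of factor levels into one sample entry per combination; the sketch below shows that expansion with assumed factors and a hypothetical identifier scheme. The web system adds metadata handling and facility-specific sample sheets on top of this idea.

    from itertools import product

    factors = {                      # assumed factors and levels for a toy study
        "genotype": ["wild-type", "mutant"],
        "treatment": ["control", "drug"],
        "timepoint_h": [0, 24],
    }

    design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
    for i, row in enumerate(design, start=1):
        print(f"SAMPLE-{i:03d}", row)   # 2 x 2 x 2 = 8 sample entries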

  1. DIVE: A Graph-based Visual Analytics Framework for Big Data

    PubMed Central

    Rysavy, Steven J.; Bromley, Dennis; Daggett, Valerie

    2014-01-01

    The need for data-centric scientific tools is growing; domains like biology, chemistry, and physics are increasingly adopting computational approaches. As a result, scientists must now deal with the challenges of big data. To address these challenges, we built a visual analytics platform named DIVE: Data Intensive Visualization Engine. DIVE is a data-agnostic, ontologically-expressive software framework capable of streaming large datasets at interactive speeds. Here we present the technical details of the DIVE platform, multiple usage examples, and a case study from the Dynameomics molecular dynamics project. We specifically highlight our novel contributions to structured data model manipulation and high-throughput streaming of large, structured datasets. PMID:24808197

  2. Big data mining powers fungal research: recent advances in fission yeast systems biology approaches.

    PubMed

    Wang, Zhe

    2017-06-01

    Biology research has entered the big data era. Systems biology approaches have therefore become powerful tools for obtaining the whole landscape of how cells separate, grow, and resist stresses. The fission yeast Schizosaccharomyces pombe is a wonderful unicellular eukaryote model; in particular, studying its division and metabolism can facilitate understanding of the molecular mechanisms of cancer and the discovery of anticancer agents. In this perspective, we discuss recently advanced fission yeast systems biology tools, mainly focusing on metabolomics profiling and metabolic modeling, the protein-protein interactome and genetic interaction network, DNA sequencing and applications, and high-throughput phenotypic screening. We therefore hope this review can be useful for interested fungal researchers as well as bioinformaticians.

  3. Lunar UV-visible-IR mapping interferometric spectrometer

    NASA Technical Reports Server (NTRS)

    Smith, W. Hayden; Haskin, L.; Korotev, R.; Arvidson, R.; Mckinnon, W.; Hapke, B.; Larson, S.; Lucey, P.

    1992-01-01

    Ultraviolet-visible-infrared mapping digital array scanned interferometers for lunar compositional surveys were developed. The research has defined a no-moving-parts, low-weight and low-power, high-throughput, and electronically adaptable digital array scanned interferometer that achieves measurement objectives encompassing and improving upon all the requirements defined by the LEXSWIG for lunar mineralogical investigation. In addition, LUMIS provides a new, important ultraviolet spectral mapping capability, a high-spatial-resolution line scan camera, and multispectral camera capabilities. An instrument configuration optimized for spectral mapping and imaging of the lunar surface is described, and spectral results in support of the instrument design are presented.

  4. Big Crater as Viewed by Pathfinder Lander

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft.

    The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft.

    This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each frame consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be.

    Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is a division of the California Institute of Technology (Caltech). The IMP was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator.

  5. Writing for the Big Screen: Literacy Experiences in a Moviemaking Project

    ERIC Educational Resources Information Center

    Bedard, Carol; Fuhrken, Charles

    2011-01-01

    An integrated language arts and technology program engaged students in reading and writing activities that funded an experience in moviemaking. With video cameras in hand, students, often working collaboratively, developed expanded views of the writing and revision processes as they created movies that mattered to them and found an audience beyond…

  6. Ultra-high throughput detection of single cell β-galactosidase activity in droplets using micro-optical lens array

    NASA Astrophysics Data System (ADS)

    Lim, Jiseok; Vrignon, Jérémy; Gruner, Philipp; Karamitros, Christos S.; Konrad, Manfred; Baret, Jean-Christophe

    2013-11-01

    We demonstrate the use of a hybrid microfluidic-micro-optical system for the screening of enzymatic activity at the single cell level. Escherichia coli β-galactosidase activity is revealed by a fluorogenic assay in 100 pl droplets. Individual droplets containing cells are screened by measuring their fluorescence signal using a high-speed camera. The measurement is parallelized over 100 channels equipped with microlenses and analyzed by image processing. A reinjection rate of 1 ml of emulsion per minute was reached corresponding to more than 105 droplets per second, an analytical throughput larger than those obtained using flow cytometry.
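
    The quoted analytical rate follows directly from the reinjection rate and the droplet volume, both given in the abstract; a one-line check:

    reinjection_ml_per_min = 1.0                 # from the abstract
    droplet_volume_pl = 100.0                    # from the abstract

    droplets_per_s = (reinjection_ml_per_min * 1e-3 / 60) / (droplet_volume_pl * 1e-12)
    print(f"{droplets_per_s:.2e} droplets/s")    # ~1.7e5, i.e. more than 1e5 per second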

  7. Calibration of Action Cameras for Photogrammetric Purposes

    PubMed Central

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-01-01

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and more importantly (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide-angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capture modes of a novel action camera, the GoPro Hero 3, which can provide still images at up to 12 Mp and video at up to 8 Mp resolution. PMID:25237898

  8. Calibration of action cameras for photogrammetric purposes.

    PubMed

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-09-18

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and more importantly (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide-angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capture modes of a novel action camera, the GoPro Hero 3, which can provide still images at up to 12 Mp and video at up to 8 Mp resolution.
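
    The abstract states that the calibration software uses OpenCV functions, so a condensed sketch of the standard OpenCV chessboard calibration and undistortion flow is given below. The image file names, board geometry, and the omission of a fisheye-specific model are assumptions; the authors' released software may differ in detail.

    import glob
    import cv2
    import numpy as np

    board = (9, 6)                                   # assumed inner-corner grid of the target
    objp = np.zeros((board[0] * board[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)   # in square units

    obj_points, img_points, size = [], [], None
    for path in glob.glob("calib_*.jpg"):            # assumed still frames from the camera
        gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, board)
        if found:
            obj_points.append(objp)
            img_points.append(corners)
            size = gray.shape[::-1]

    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, size, None, None)
    print("RMS reprojection error:", rms)

    # Undistort one frame with the recovered intrinsics and distortion coefficients.
    cv2.imwrite("undistorted.jpg", cv2.undistort(cv2.imread("calib_0.jpg"), K, dist))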

  9. Big Data Application in Biomedical Research and Health Care: A Literature Review.

    PubMed

    Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing

    2016-01-01

    Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care.

  10. Big Data Application in Biomedical Research and Health Care: A Literature Review

    PubMed Central

    Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing

    2016-01-01

    Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care. PMID:26843812

  11. The Selection and Placement Method of Materialized Views on Big Data Platform of Equipment Condition Assessment

    NASA Astrophysics Data System (ADS)

    Ma, Yan; Yao, Jinxia; Gu, Chao; Chen, Yufeng; Yang, Yi; Zou, Lida

    2017-05-01

    With the formation of the electric big data environment, more and more big data analyses are emerging. The complicated data analysis required for equipment condition assessment involves many join operations, which are time-consuming. To save time, materialized views are commonly used: part of the common and critical join results is placed on external storage, which avoids frequent join operations. In this paper we propose methods for selecting and placing materialized views that reduce the query time for electric transmission and transformation equipment and maximize the profit of service providers. In the selection method, we design a way to compute the value of non-leaf nodes based on the MVPP structure chart. In the placement method, we use relevance weights to place the selected materialized views, which helps reduce network transmission time. Our experiments show that the proposed selection and placement methods achieve high throughput and good optimization of query time for electric transmission and transformation equipment.
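
    The abstract does not give the exact MVPP valuation formula, so the sketch below only illustrates the general pattern of view selection: score each candidate join result and greedily materialize the best ones under a storage budget. The (frequency x saved cost / size) score and all names are illustrative assumptions, not the paper's method.

```python
# Illustrative greedy selection of materialized views under a storage budget.
# The benefit formula is a generic (frequency x saved-cost / size) heuristic,
# not the MVPP-based node valuation described in the paper.

def select_views(candidates, budget):
    """candidates: list of dicts with 'name', 'size', 'freq', 'saved_cost'."""
    scored = sorted(candidates,
                    key=lambda v: v["freq"] * v["saved_cost"] / v["size"],
                    reverse=True)
    chosen, used = [], 0
    for v in scored:
        if used + v["size"] <= budget:
            chosen.append(v["name"])
            used += v["size"]
    return chosen

# Hypothetical candidate join results (sizes in GB, freq = queries per day).
views = [
    {"name": "join_equipment_defects", "size": 40, "freq": 120, "saved_cost": 8.0},
    {"name": "join_sensor_weather",    "size": 25, "freq": 300, "saved_cost": 2.5},
    {"name": "join_full_history",      "size": 90, "freq": 10,  "saved_cost": 30.0},
]
print(select_views(views, budget=100))
```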

  12. Integrity, standards, and QC-related issues with big data in pre-clinical drug discovery.

    PubMed

    Brothers, John F; Ung, Matthew; Escalante-Chong, Renan; Ross, Jermaine; Zhang, Jenny; Cha, Yoonjeong; Lysaght, Andrew; Funt, Jason; Kusko, Rebecca

    2018-06-01

    The tremendous expansion of data analytics and public and private big datasets presents an important opportunity for pre-clinical drug discovery and development. In the field of life sciences, the growth of genetic, genomic, transcriptomic and proteomic data is partly driven by a rapid decline in experimental costs as biotechnology improves throughput, scalability, and speed. Yet far too many researchers tend to underestimate the challenges and consequences involving data integrity and quality standards. Given the effect of data integrity on scientific interpretation, these issues have significant implications during preclinical drug development. We describe standardized approaches for maximizing the utility of publicly available or privately generated biological data and address some of the common pitfalls. We also discuss the increasing interest to integrate and interpret cross-platform data. Principles outlined here should serve as a useful broad guide for existing analytical practices and pipelines and as a tool for developing additional insights into therapeutics using big data. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. Automated batch characterization of inkjet-printed elastomer lenses using a LEGO platform.

    PubMed

    Sung, Yu-Lung; Garan, Jacob; Nguyen, Hoang; Hu, Zhenyu; Shih, Wei-Chuan

    2017-09-10

    Small, self-adhesive, inkjet-printed elastomer lenses have enabled smartphone cameras to image and resolve microscopic objects. However, the performance of different lenses within a batch is affected by hard-to-control environmental variables. We present a cost-effective platform to perform automated batch characterization of 300 lens units simultaneously for quality inspection. The system was designed and configured with LEGO bricks, 3D printed parts, and a digital camera. The scheme presented here may become the basis of a high-throughput, in-line inspection tool for quality control purposes and can also be employed for optimization of the manufacturing process.

  14. Parallel separations using capillary electrophoresis on a multilane microchip with multiplexed laser-induced fluorescence detection.

    PubMed

    Nikcevic, Irena; Piruska, Aigars; Wehmeyer, Kenneth R; Seliskar, Carl J; Limbach, Patrick A; Heineman, William R

    2010-08-01

    Parallel separations using CE on a multilane microchip with multiplexed LIF detection is demonstrated. The detection system was developed to simultaneously record data on all channels using an expanded laser beam for excitation, a camera lens to capture emission, and a CCD camera for detection. The detection system enables monitoring of each channel continuously and distinguishing individual lanes without significant crosstalk between adjacent lanes. Multiple analytes can be determined in parallel lanes within a single microchip in a single run, leading to increased sample throughput. The pK(a) determination of small molecule analytes is demonstrated with the multilane microchip.

  15. Parallel separations using capillary electrophoresis on a multilane microchip with multiplexed laser induced fluorescence detection

    PubMed Central

    Nikcevic, Irena; Piruska, Aigars; Wehmeyer, Kenneth R.; Seliskar, Carl J.; Limbach, Patrick A.; Heineman, William R.

    2010-01-01

    Parallel separations using capillary electrophoresis on a multilane microchip with multiplexed laser induced fluorescence detection is demonstrated. The detection system was developed to simultaneously record data on all channels using an expanded laser beam for excitation, a camera lens to capture emission, and a CCD camera for detection. The detection system enables monitoring of each channel continuously and distinguishing individual lanes without significant crosstalk between adjacent lanes. Multiple analytes can be analyzed on parallel lanes within a single microchip in a single run, leading to increased sample throughput. The pKa determination of small molecule analytes is demonstrated with the multilane microchip. PMID:20737446

  16. A high-throughput next-generation sequencing-based method for detecting the mutational fingerprint of carcinogens

    PubMed Central

    Besaratinia, Ahmad; Li, Haiqing; Yoon, Jae-In; Zheng, Albert; Gao, Hanlin; Tommasi, Stella

    2012-01-01

    Many carcinogens leave a unique mutational fingerprint in the human genome. These mutational fingerprints manifest as specific types of mutations often clustering at certain genomic loci in tumor genomes from carcinogen-exposed individuals. To develop a high-throughput method for detecting the mutational fingerprint of carcinogens, we have devised a cost-, time- and labor-effective strategy, in which the widely used transgenic Big Blue® mouse mutation detection assay is made compatible with the Roche/454 Genome Sequencer FLX Titanium next-generation sequencing technology. As proof of principle, we have used this novel method to establish the mutational fingerprints of three prominent carcinogens with varying mutagenic potencies, including sunlight ultraviolet radiation, 4-aminobiphenyl and secondhand smoke that are known to be strong, moderate and weak mutagens, respectively. For verification purposes, we have compared the mutational fingerprints of these carcinogens obtained by our newly developed method with those obtained by parallel analyses using the conventional low-throughput approach, that is, standard mutation detection assay followed by direct DNA sequencing using a capillary DNA sequencer. We demonstrate that this high-throughput next-generation sequencing-based method is highly specific and sensitive to detect the mutational fingerprints of the tested carcinogens. The method is reproducible, and its accuracy is comparable with that of the currently available low-throughput method. In conclusion, this novel method has the potential to move the field of carcinogenesis forward by allowing high-throughput analysis of mutations induced by endogenous and/or exogenous genotoxic agents. PMID:22735701

  17. A high-throughput next-generation sequencing-based method for detecting the mutational fingerprint of carcinogens.

    PubMed

    Besaratinia, Ahmad; Li, Haiqing; Yoon, Jae-In; Zheng, Albert; Gao, Hanlin; Tommasi, Stella

    2012-08-01

    Many carcinogens leave a unique mutational fingerprint in the human genome. These mutational fingerprints manifest as specific types of mutations often clustering at certain genomic loci in tumor genomes from carcinogen-exposed individuals. To develop a high-throughput method for detecting the mutational fingerprint of carcinogens, we have devised a cost-, time- and labor-effective strategy, in which the widely used transgenic Big Blue mouse mutation detection assay is made compatible with the Roche/454 Genome Sequencer FLX Titanium next-generation sequencing technology. As proof of principle, we have used this novel method to establish the mutational fingerprints of three prominent carcinogens with varying mutagenic potencies, including sunlight ultraviolet radiation, 4-aminobiphenyl and secondhand smoke that are known to be strong, moderate and weak mutagens, respectively. For verification purposes, we have compared the mutational fingerprints of these carcinogens obtained by our newly developed method with those obtained by parallel analyses using the conventional low-throughput approach, that is, standard mutation detection assay followed by direct DNA sequencing using a capillary DNA sequencer. We demonstrate that this high-throughput next-generation sequencing-based method is highly specific and sensitive to detect the mutational fingerprints of the tested carcinogens. The method is reproducible, and its accuracy is comparable with that of the currently available low-throughput method. In conclusion, this novel method has the potential to move the field of carcinogenesis forward by allowing high-throughput analysis of mutations induced by endogenous and/or exogenous genotoxic agents.

  18. About possibility of temperature trace observing on a human skin through clothes by using computer processing of IR image

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Shestakov, Ivan L.; Blednov, Roman G.

    2017-05-01

    One of the urgent security problems is the detection of objects placed inside the human body. Obviously, for safety reasons one cannot use X-rays for such object detection widely and often. For this purpose, we propose to use a THz camera and an IR camera. Here we continue to investigate the possibility of using an IR camera for detecting a temperature trace on the human body. In contrast to a passive THz camera, the IR camera does not allow one to see an object under clothing very distinctly. Of course, this is a big disadvantage for security applications based on IR cameras. To find possible ways of overcoming this disadvantage, we perform experiments with an IR camera produced by FLIR and develop a novel approach for computer processing of the images it captures. This approach allows us to increase the effective temperature resolution of the IR camera as well as the effective sensitivity of the human eye. As a consequence, it becomes possible to see changes in human body temperature through clothing. We analyze IR images of a person who drinks water and eats chocolate, and we follow the temperature trace on the skin caused by the temperature change inside the body. Some experiments involve observing the temperature trace of objects placed behind thick overalls. The demonstrated results are very important for the detection of forbidden objects concealed inside the human body by non-destructive inspection without X-rays.

  19. Big Data: What Is It and Why Does It Matter?

    ERIC Educational Resources Information Center

    Waters, John K.

    2012-01-01

    Colleges and universities are swimming in an ever-widening sea of data. Human beings and machines together generate about 2.5 "quintillion" (10[superscript 18]) bytes every day, according to IBM's latest estimate. The sources of all that data are dizzyingly diverse: e-mail, blogs, click streams, security cameras, weather sensors, social networks,…

  20. msBiodat analysis tool, big data analysis for high-throughput experiments.

    PubMed

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, typically presented as a list of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information. The filtering can be enriched by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present the msBiodat Analysis Tool, a web-based application designed to bring proteomics closer to big data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat analysis tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl, or UniProt codes. The msBiodat analysis tool is a web-based application that allows researchers with any level of programming experience to take advantage of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. msBiodat analysis tool is freely available at http://msbiodata.irb.hr.

  1. About possibility of temperature trace observing on the human skin using commercially available IR camera

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Shestakov, Ivan L.; Blednov, Roman G.

    2016-09-01

    One of the urgent security problems is the detection of objects placed inside the human body. Obviously, for safety reasons one cannot use X-rays for such object detection widely and often. Three years ago, we demonstrated the principal possibility of seeing a temperature trace on the human skin, induced by eating food or drinking water, using a passive THz camera. However, such a camera is very expensive. Therefore, in practice it would be very convenient if an IR camera could be used for this purpose. In contrast to a passive THz camera, the IR camera does not allow one to see an object under clothing if the images it produces are used directly. Of course, this is a big disadvantage for security applications based on IR cameras. To overcome this disadvantage, we develop a novel approach for computer processing of IR camera images. It allows us to increase the effective temperature resolution of the IR camera as well as the effective sensitivity of the human eye. As a consequence, it becomes possible to see changes in human body temperature through clothing. We analyze IR images of a person who drinks water and eats chocolate, and we follow the temperature trace on the skin caused by the temperature change inside the body. Some experiments were made with measurements of the temperature of a body covered by a T-shirt. The results shown are very important for the detection of forbidden objects concealed inside the human body by non-destructive inspection without X-rays.

  2. Big Software for Big Data: Scaling Up Photometry for LSST (Abstract)

    NASA Astrophysics Data System (ADS)

    Rawls, M.

    2017-06-01

    (Abstract only) The Large Synoptic Survey Telescope (LSST) will capture mosaics of the sky every few nights, each containing more data than your computer's hard drive can store. As a result, the software to process these images is as critical to the science as the telescope and the camera. I discuss the algorithms and software being developed by the LSST Data Management team to handle such a large volume of data. All of our work is open source and available to the community. Once LSST comes online, our software will produce catalogs of objects and a stream of alerts. These will bring exciting new opportunities for follow-up observations and collaborations with LSST scientists.

  3. MSE spectrograph optical design: a novel pupil slicing technique

    NASA Astrophysics Data System (ADS)

    Spanò, P.

    2014-07-01

    The Maunakea Spectroscopic Explorer shall be mainly devoted to performing deep, wide-field spectroscopic surveys at spectral resolutions from ~2000 to ~20000, at visible and near-infrared wavelengths. Simultaneous spectral coverage at low resolution is required, while at high resolution only selected windows can be covered. Moreover, very high multiplexing (3200 objects) must be obtained at low resolution; at higher resolutions a reduced number of objects (~800) can be observed. To meet these demanding requirements, a fiber-fed multi-object spectrograph concept has been designed by pupil-slicing the collimated beam, followed by multiple dispersive and camera optics. Different resolution modes are obtained by introducing anamorphic lenslets in front of the fiber arrays. The spectrograph is able to switch between three resolution modes (2000, 6500, 20000) by removing the anamorphic lenses and exchanging gratings. Camera lenses are fixed in place to increase stability. To enhance throughput, first-order VPH gratings have been preferred over echelle gratings. Moreover, throughput is kept high over all wavelength ranges by splitting light into several arms with dichroic beamsplitters and optimizing efficiency for each channel by proper selection of glass materials, coatings, and grating parameters.

  4. High throughput field plant phenotyping facility at University of Nebraska-Lincoln and the first year experience

    NASA Astrophysics Data System (ADS)

    Ge, Y.; Bai, G.; Irmak, S.; Awada, T.; Stoerger, V.; Graef, G.; Scoby, D.; Schnable, J.

    2017-12-01

    The University of Nebraska-Lincoln high-throughput field plant phenotyping facility is a cable-robot-based system built on a 1-acre field. The sensor platform is tethered by eight cables via four poles at the corners of the field for precise control and positioning. The sensor modules on the platform include a 4-band RGB-NIR camera, a thermal infrared camera, a 3D LiDAR, VNIR spectrometers, and environmental sensors. These sensors are used to collect multifaceted physiological, structural, and chemical properties of plants from the field plots. A subsurface drip irrigation system is established in this field, which allows controlled amounts of water and fertilizer to be delivered to individual plots. An extensive soil moisture sensor network is also established to monitor soil water status and serve as a feedback loop for irrigation scheduling. In the first year of operation, the field was planted with maize and soybean. Weekly ground-truth data were collected from the plots to validate image and sensor data from the phenotyping system. This presentation will provide an overview of this state-of-the-art field plant phenotyping facility and present preliminary data from the first year of operation of the system.

  5. Bioinformatics clouds for big data manipulation.

    PubMed

    Dai, Lin; Gao, Xin; Guo, Yan; Xiao, Jingfa; Zhang, Zhang

    2012-11-28

    As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor.

  6. Analysis of the effect on optical equipment caused by solar position in target flight measure

    NASA Astrophysics Data System (ADS)

    Zhu, Shun-hua; Hu, Hai-bin

    2012-11-01

    Optical equipment is widely used to measure flight parameters in target flight performance tests, but the equipment is sensitive to the sun's rays. To avoid the sun's rays shining directly into the camera lens of the optical equipment when measuring target flight parameters, the angle between the observation direction and the line connecting the camera lens and the sun should be kept large. This article introduces a method to calculate the solar azimuth and altitude for the optical equipment at any time and any place on Earth, a model of the equipment's observation direction, and a model for calculating the angle between the observation direction and the line connecting the camera lens and the sun. A simulation of the effect of solar position on the optical equipment at different times, dates, months, and target flight directions is also given.
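
    The abstract does not spell out its formulas, so the sketch below uses standard textbook approximations for solar declination, hour angle, altitude, and azimuth (ignoring the equation of time and atmospheric refraction); the latitude, day of year, and hour in the example are arbitrary.

```python
import math

def solar_position(lat_deg, day_of_year, solar_hour):
    """Approximate solar altitude/azimuth in degrees.

    Simplified textbook model: ignores the equation of time and refraction.
    Azimuth is measured clockwise from north."""
    lat = math.radians(lat_deg)
    decl = math.radians(23.45 * math.sin(math.radians(360.0 / 365.0 * (284 + day_of_year))))
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    sin_alt = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    alt = math.asin(sin_alt)
    cos_az = ((math.sin(decl) - math.sin(alt) * math.sin(lat))
              / (math.cos(alt) * math.cos(lat)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if hour_angle > 0:               # afternoon: sun is west of the meridian
        az = 2 * math.pi - az
    return math.degrees(alt), math.degrees(az)

# Example: latitude 48.2 deg, day 172 (late June), 15:00 local solar time
print(solar_position(48.2, 172, 15.0))
```

    From the resulting altitude/azimuth pair and the equipment's pointing direction, the sun-avoidance angle is simply the angle between the two corresponding unit vectors.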

  7. Augmented reality in laser laboratories

    NASA Astrophysics Data System (ADS)

    Quercioli, Franco

    2018-05-01

    Laser safety glasses block visibility of the laser light. This is a big nuisance when a clear view of the beam path is required. A headset made up of a smartphone and a viewer can overcome this problem. The user looks at the image of the real world on the cellphone display, captured by its rear camera. An unimpeded and safe sight of the laser beam is then achieved. If the infrared blocking filter of the smartphone camera is removed, the spectral sensitivity of the CMOS image sensor extends in the near infrared region up to 1100 nm. This substantial improvement widens the usability of the device to many laser systems for industrial and medical applications, which are located in this spectral region. The paper describes this modification of a phone camera to extend its sensitivity beyond the visible and make a true augmented reality laser viewer.

  8. Spectral imaging of neurosurgical target tissues through operation microscope

    NASA Astrophysics Data System (ADS)

    Antikainen, Jukka; von Und Zu Fraunberg, Mikael; Orava, Joni; Jaaskelainen, Juha E.; Hauta-Kasari, Markku

    2011-11-01

    It has been noticed that spectral information can be used to analyze and separate different biological tissues. However, most spectral image acquisition studies are done in vitro. Usually the main restrictions for in vivo measurements are the size or the weight of the spectral camera. If the camera weighs too much, the surgical microscope cannot be stabilized; if the camera is too big, it will disturb the surgeon or even risk the safety of the patient. The main goal of this study was to develop an independent spectral imaging device that can be used to collect spectral information from neurosurgeries without the restrictions described above. The size of the imaging system is small enough not to disturb the surgeon during the surgery. The developed spectral imaging system is used to collect a spectral database that can be used for future imaging systems.

  9. GiNA, an Efficient and High-Throughput Software for Horticultural Phenotyping

    PubMed Central

    Diaz-Garcia, Luis; Covarrubias-Pazaran, Giovanny; Schlautman, Brandon; Zalapa, Juan

    2016-01-01

    Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed, but most of them are expensive, species-dependent, complex to use, and available only for major crops. To overcome such limitations, we present the open-source software GiNA, which is a simple and free tool for measuring horticultural traits such as shape- and color-related parameters of fruits, vegetables, and seeds. GiNA is multiplatform software available in both R and MATLAB® programming languages and uses conventional images from digital cameras with minimal requirements. It can process up to 11 different horticultural morphological traits such as length, width, two-dimensional area, volume, projected skin, surface area, RGB color, among other parameters. Different validation tests produced highly consistent results under different lighting conditions and camera setups making GiNA a very reliable platform for high-throughput phenotyping. In addition, five-fold cross validation between manually generated and GiNA measurements for length and width in cranberry fruits were 0.97 and 0.92. In addition, the same strategy yielded prediction accuracies above 0.83 for color estimates produced from images of cranberries analyzed with GiNA compared to total anthocyanin content (TAcy) of the same fruits measured with the standard methodology of the industry. Our platform provides a scalable, easy-to-use and affordable tool for massive acquisition of phenotypic data of fruits, seeds, and vegetables. PMID:27529547

  10. GiNA, an Efficient and High-Throughput Software for Horticultural Phenotyping.

    PubMed

    Diaz-Garcia, Luis; Covarrubias-Pazaran, Giovanny; Schlautman, Brandon; Zalapa, Juan

    2016-01-01

    Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed, but most of them are expensive, species-dependent, complex to use, and available only for major crops. To overcome such limitations, we present the open-source software GiNA, which is a simple and free tool for measuring horticultural traits such as shape- and color-related parameters of fruits, vegetables, and seeds. GiNA is multiplatform software available in both R and MATLAB® programming languages and uses conventional images from digital cameras with minimal requirements. It can process up to 11 different horticultural morphological traits such as length, width, two-dimensional area, volume, projected skin, surface area, RGB color, among other parameters. Different validation tests produced highly consistent results under different lighting conditions and camera setups making GiNA a very reliable platform for high-throughput phenotyping. In addition, five-fold cross validation between manually generated and GiNA measurements for length and width in cranberry fruits were 0.97 and 0.92. In addition, the same strategy yielded prediction accuracies above 0.83 for color estimates produced from images of cranberries analyzed with GiNA compared to total anthocyanin content (TAcy) of the same fruits measured with the standard methodology of the industry. Our platform provides a scalable, easy-to-use and affordable tool for massive acquisition of phenotypic data of fruits, seeds, and vegetables.
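
    As a rough illustration of the kind of shape and color traits GiNA reports, the sketch below segments fruits in a photo and measures length, width, area, and mean color per object with OpenCV. The file name, threshold choice, and size filter are assumptions; this is not GiNA's actual R/MATLAB code.

```python
import cv2
import numpy as np

# Minimal sketch of per-object shape/colour measurement (OpenCV 4 API).
img = cv2.imread("cranberries.jpg")                      # placeholder image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
_, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for c in contours:
    area = cv2.contourArea(c)
    if area < 500:                                       # skip small debris
        continue
    (cx, cy), (w, h), angle = cv2.minAreaRect(c)         # oriented bounding box
    length, width = max(w, h), min(w, h)
    single = np.zeros(mask.shape, dtype=np.uint8)
    cv2.drawContours(single, [c], -1, 255, -1)           # mask for this fruit only
    b, g, r, _ = cv2.mean(img, mask=single)              # mean BGR colour
    print(f"length={length:.1f}px width={width:.1f}px area={area:.0f}px^2 "
          f"RGB=({r:.0f},{g:.0f},{b:.0f})")
```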

  11. Light field reconstruction robust to signal dependent noise

    NASA Astrophysics Data System (ADS)

    Ren, Kun; Bian, Liheng; Suo, Jinli; Dai, Qionghai

    2014-11-01

    Capturing four-dimensional light field data sequentially using a coded aperture camera is an effective approach but suffers from a low signal-to-noise ratio. Although multiplexing can help raise the acquisition quality, noise is still a big issue, especially for fast acquisition. To address this problem, this paper proposes a noise-robust light field reconstruction method. First, a scene-dependent noise model is studied and incorporated into the light field reconstruction framework. Then, we derive an optimization algorithm for the final reconstruction. We build a prototype by hacking an off-the-shelf camera for data capture and prove the concept. The effectiveness of this method is validated with experiments on the real captured data.

  12. High throughput imaging cytometer with acoustic focussing

    PubMed Central

    Zmijan, Robert; Jonnalagadda, Umesh S.; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn

    2015-01-01

    We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high resolution, low noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however a longer device would remove this constraint. PMID:29456838

  13. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    NASA Astrophysics Data System (ADS)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, limited camera frame rate and thus imaging throughput makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assay. Here we present a high-throughput approach for label-free analysis of cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy at >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high throughput label-free cell cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including diseases pathogenesis.

  14. Experiments and Analyses of Data Transfers Over Wide-Area Dedicated Connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata

    Dedicated wide-area network connections are increasingly employed in high-performance computing and big data scenarios. One might expect the performance and dynamics of data transfers over such connections to be easy to analyze due to the lack of competing traffic. However, non-linear transport dynamics and end-system complexities (e.g., multi-core hosts and distributed filesystems) can in fact make analysis surprisingly challenging. We present extensive measurements of memory-to-memory and disk-to-disk file transfers over 10 Gbps physical and emulated connections with 0–366 ms round trip times (RTTs). For memory-to-memory transfers, profiles of both TCP and UDT throughput as a function of RTT show concave and convex regions; large buffer sizes and more parallel flows lead to wider concave regions, which are highly desirable. TCP and UDT both also display complex throughput dynamics, as indicated by their Poincare maps and Lyapunov exponents. For disk-to-disk transfers, we determine that high throughput can be achieved via a combination of parallel I/O threads, parallel network threads, and direct I/O mode. Our measurements also show that Lustre filesystems can be mounted over long-haul connections using LNet routers, although challenges remain in jointly optimizing file I/O and transport method parameters to achieve peak throughput.

  15. On Data Transfers Over Wide-Area Dedicated Connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Liu, Qiang

    Dedicated wide-area network connections are employed in big data and high-performance computing scenarios, since the absence of cross-traffic promises to make it easier to analyze and optimize data transfers over them. However, nonlinear transport dynamics and end-system complexity due to multi-core hosts and distributed file systems make these tasks surprisingly challenging. We present an overview of methods to analyze memory and disk file transfers using extensive measurements over 10 Gbps physical and emulated connections with 0–366 ms round trip times (RTTs). For memory transfers, we derive performance profiles of TCP and UDT throughput as a function of RTT, which show concave regions in contrast to the entirely convex regions predicted by previous models. These highly desirable concave regions can be expanded by utilizing large buffers and more parallel flows. We also present Poincaré maps and Lyapunov exponents of TCP and UDT throughput traces that indicate complex throughput dynamics. For disk file transfers, we show that throughput can be optimized using a combination of parallel I/O and network threads under direct I/O mode. Our initial throughput measurements of Lustre filesystems mounted over long-haul connections using LNet routers show convex profiles indicative of I/O limits.
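
    To make the Poincaré-map and Lyapunov-exponent analysis concrete, the sketch below builds a delay map (x[n+1] vs x[n]) from a throughput trace and computes a naive largest-Lyapunov-exponent estimate from nearest-neighbour divergence. The synthetic trace and the simplified estimator are assumptions for illustration, not the measurement pipeline used in these papers.

```python
import numpy as np

# Synthetic throughput trace (Gbps); replace with measured samples.
rng = np.random.default_rng(0)
x = 9.0 + np.cumsum(rng.normal(0, 0.05, 2000))

# Poincare (delay) map points: x[n+1] plotted against x[n].
pairs = np.column_stack([x[:-1], x[1:]])

def naive_lyapunov(series, horizon=20):
    """Crude largest-Lyapunov estimate: average log divergence rate of each
    point from its nearest (non-adjacent) neighbour over `horizon` steps."""
    n = len(series)
    total, count = 0.0, 0
    for i in range(n - horizon - 1):
        d = np.abs(series - series[i])
        d[max(0, i - 2):i + 3] = np.inf          # exclude the point itself and close-in-time samples
        j = int(np.argmin(d[: n - horizon]))
        d0 = abs(series[i] - series[j])
        dT = abs(series[i + horizon] - series[j + horizon])
        if d0 > 0 and dT > 0:
            total += np.log(dT / d0) / horizon
            count += 1
    return total / max(count, 1)

print("Poincare points:", pairs.shape, " lambda_max ~", round(naive_lyapunov(x), 4))
```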

  16. High-Throughput Block Optical DNA Sequence Identification.

    PubMed

    Sagar, Dodderi Manjunatha; Korshoj, Lee Erik; Hanson, Katrina Bethany; Chowdhury, Partha Pratim; Otoupal, Peter Britton; Chatterjee, Anushree; Nagpal, Prashant

    2018-01-01

    Optical techniques for molecular diagnostics or DNA sequencing generally rely on small molecule fluorescent labels, which utilize light with a wavelength of several hundred nanometers for detection. Developing a label-free optical DNA sequencing technique will require nanoscale focusing of light, a high-throughput and multiplexed identification method, and a data compression technique to rapidly identify sequences and analyze genomic heterogeneity for big datasets. Such a method should identify characteristic molecular vibrations using optical spectroscopy, especially in the "fingerprinting region" from ≈400-1400 cm -1 . Here, surface-enhanced Raman spectroscopy is used to demonstrate label-free identification of DNA nucleobases with multiplexed 3D plasmonic nanofocusing. While nanometer-scale mode volumes prevent identification of single nucleobases within a DNA sequence, the block optical technique can identify A, T, G, and C content in DNA k-mers. The content of each nucleotide in a DNA block can be a unique and high-throughput method for identifying sequences, genes, and other biomarkers as an alternative to single-letter sequencing. Additionally, coupling two complementary vibrational spectroscopy techniques (infrared and Raman) can improve block characterization. These results pave the way for developing a novel, high-throughput block optical sequencing method with lossy genomic data compression using k-mer identification from multiplexed optical data acquisition. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
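
    The block identification idea, summarizing each DNA k-mer by its A, T, G, and C content rather than by its exact letter order, can be shown in a few lines. The sketch below illustrates only that counting step; the block size and the example sequence are arbitrary.

```python
from collections import Counter

# Summarize non-overlapping k-mers ("blocks") by their base content.
def kmer_content(seq, k=10):
    blocks = []
    for i in range(0, len(seq) - k + 1, k):
        counts = Counter(seq[i:i + k])
        blocks.append((counts["A"], counts["T"], counts["G"], counts["C"]))
    return blocks

print(kmer_content("ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG", k=10))
```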

  17. High-throughput microfluidic line scan imaging for cytological characterization

    NASA Astrophysics Data System (ADS)

    Hutcheson, Joshua A.; Powless, Amy J.; Majid, Aneeka A.; Claycomb, Adair; Fritsch, Ingrid; Balachandran, Kartik; Muldoon, Timothy J.

    2015-03-01

    Imaging cells in a microfluidic chamber with an area scan camera is difficult due to motion blur and data loss during frame readout causing discontinuity of data acquisition as cells move at relatively high speeds through the chamber. We have developed a method to continuously acquire high-resolution images of cells in motion through a microfluidics chamber using a high-speed line scan camera. The sensor acquires images in a line-by-line fashion in order to continuously image moving objects without motion blur. The optical setup comprises an epi-illuminated microscope with a 40X oil immersion, 1.4 NA objective and a 150 mm tube lens focused on a microfluidic channel. Samples containing suspended cells fluorescently stained with 0.01% (w/v) proflavine in saline are introduced into the microfluidics chamber via a syringe pump; illumination is provided by a blue LED (455 nm). Images were taken of samples at the focal plane using an ELiiXA+ 8k/4k monochrome line-scan camera at a line rate of up to 40 kHz. The system's line rate and fluid velocity are tightly controlled to reduce image distortion and are validated using fluorescent microspheres. Image acquisition was controlled via MATLAB's Image Acquisition toolbox. Data sets comprise discrete images of every detectable cell which may be subsequently mined for morphological statistics and definable features by a custom texture analysis algorithm. This high-throughput screening method, comparable to cell counting by flow cytometry, provided efficient examination including counting, classification, and differentiation of saliva, blood, and cultured human cancer cells.
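
    The abstract notes that the line rate and fluid velocity must be tightly matched to avoid image distortion. A back-of-the-envelope version of that matching, assuming an example 5 µm sensor pixel pitch (not a specification from the paper) together with the quoted 40x magnification and 40 kHz line rate, looks like this:

```python
# Match camera line rate to fluid velocity so that one sensor line corresponds
# to one object-plane pixel. Pixel pitch and the resulting velocity are
# illustrative, not the paper's exact parameters.
magnification = 40          # 40x objective
pixel_pitch_um = 5.0        # assumed sensor pixel pitch (um)
line_rate_hz = 40_000       # maximum line rate quoted in the abstract

object_pixel_um = pixel_pitch_um / magnification        # sampling at the sample plane
matched_velocity_um_s = object_pixel_um * line_rate_hz  # distortion-free flow speed
print(f"object-plane sampling: {object_pixel_um:.3f} um/line")
print(f"matched fluid velocity: {matched_velocity_um_s / 1000:.1f} mm/s")
```

    Under these assumed numbers the distortion-free flow speed works out to a few millimetres per second; faster flow would stretch or compress objects along the scan direction unless the line rate is raised accordingly.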

  18. Multi-MHz laser-scanning single-cell fluorescence microscopy by spatiotemporally encoded virtual source array

    PubMed Central

    Wu, Jianglai; Tang, Anson H. L.; Mok, Aaron T. Y.; Yan, Wenwei; Chan, Godfrey C. F.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2017-01-01

    Apart from the spatial resolution enhancement, scaling of temporal resolution, equivalently the imaging throughput, of fluorescence microscopy is of equal importance in advancing cell biology and clinical diagnostics. Yet, this attribute has mostly been overlooked because of the inherent speed limitation of existing imaging strategies. To address the challenge, we employ an all-optical laser-scanning mechanism, enabled by an array of reconfigurable spatiotemporally-encoded virtual sources, to demonstrate ultrafast fluorescence microscopy at line-scan rate as high as 8 MHz. We show that this technique enables high-throughput single-cell microfluidic fluorescence imaging at 75,000 cells/second and high-speed cellular 2D dynamical imaging at 3,000 frames per second, outperforming the state-of-the-art high-speed cameras and the gold-standard laser scanning strategies. Together with its wide compatibility to the existing imaging modalities, this technology could empower new forms of high-throughput and high-speed biological fluorescence microscopy that was once challenged. PMID:28966855

  19. Overview of machine vision methods in x-ray imaging and microtomography

    NASA Astrophysics Data System (ADS)

    Buzmakov, Alexey; Zolotov, Denis; Chukalina, Marina; Nikolaev, Dmitry; Gladkov, Andrey; Ingacheva, Anastasia; Yakimchuk, Ivan; Asadchikov, Victor

    2018-04-01

    Digital X-ray imaging has become widely used in science, medicine, and non-destructive testing. This makes it possible to apply modern digital image analysis for automatic information extraction and interpretation. We give a short review of machine vision applications in scientific X-ray imaging and microtomography, including image processing, feature detection and extraction, image compression to increase camera throughput, microtomography reconstruction, visualization, and setup adjustment.

  20. Educating the People as a Digital Photographer and Camera Operator via Open Education System Studies from Turkey: Anadolu University Open Education Faculty Case

    ERIC Educational Resources Information Center

    Eryilmaz, Huseyin

    2010-01-01

    Today, photography and visual arts are very important in our modern life. Especially for the mass communication, the visual images and visual arts have very big importance. In modern societies, people must have knowledge about the visual things, such as photographs, cartoons, drawings, typography, etc. Briefly, the people need education on visual…

  1. Traffic monitoring with distributed smart cameras

    NASA Astrophysics Data System (ADS)

    Sidla, Oliver; Rosner, Marcin; Ulm, Michael; Schwingshackl, Gert

    2012-01-01

    The observation and monitoring of traffic with smart vision systems for the purpose of improving traffic safety has big potential. Today the automated analysis of traffic situations is still in its infancy--the patterns of vehicle motion and pedestrian flow in an urban environment are too complex to be fully captured and interpreted by a vision system. In this work we present steps towards a visual monitoring system which is designed to detect potentially dangerous traffic situations around a pedestrian crossing at a street intersection. The camera system is specifically designed to detect incidents in which the interaction of pedestrians and vehicles might develop into safety-critical encounters. The proposed system has been field-tested at a real pedestrian crossing in the City of Vienna for the duration of one year. It consists of a cluster of 3 smart cameras, each of which is built from a very compact PC hardware system in a weatherproof housing. Two cameras run vehicle detection and tracking software, and one camera runs a pedestrian detection and tracking module based on the HOG detection principle. All 3 cameras use sparse optical flow computation on a low-resolution video stream in order to estimate the motion path and speed of objects. Geometric calibration of the cameras allows us to estimate the real-world co-ordinates of detected objects and to link the cameras together into one common reference system. This work describes the foundation for the different object detection modalities (pedestrians, vehicles) and explains the system setup, its design, and the evaluation results we have achieved so far.
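
    The two detection building blocks named in the abstract, HOG-based pedestrian detection and sparse optical flow, are both available off the shelf in OpenCV. The sketch below wires them together on a video file; the file name and tracking parameters are assumptions, and this uses OpenCV's stock people detector, not the authors' embedded smart-camera software.

```python
import cv2

# HOG pedestrian detector with the pre-trained linear SVM shipped with OpenCV.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture("crossing.mp4")                   # placeholder video file
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Pedestrian candidates from the HOG + SVM people detector.
    boxes, _ = hog.detectMultiScale(gray, winStride=(8, 8))

    # Sparse optical flow: corners from the previous frame tracked into this one.
    corners = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                      qualityLevel=0.01, minDistance=7)
    tracked = 0
    if corners is not None:
        _, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, corners, None)
        tracked = int(status.sum())

    print(len(boxes), "pedestrian candidates,", tracked, "tracked flow points")
    prev_gray = gray
```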

  2. Advanced Multidimensional Separations in Mass Spectrometry: Navigating the Big Data Deluge

    PubMed Central

    May, Jody C.; McLean, John A.

    2017-01-01

    Hybrid analytical instrumentation constructed around mass spectrometry (MS) are becoming preferred techniques for addressing many grand challenges in science and medicine. From the omics sciences to drug discovery and synthetic biology, multidimensional separations based on MS provide the high peak capacity and high measurement throughput necessary to obtain large-scale measurements which are used to infer systems-level information. In this review, we describe multidimensional MS configurations as technologies which are big data drivers and discuss some new and emerging strategies for mining information from large-scale datasets. A discussion is included on the information content which can be obtained from individual dimensions, as well as the unique information which can be derived by comparing different levels of data. Finally, we discuss some emerging data visualization strategies which seek to make highly dimensional datasets both accessible and comprehensible. PMID:27306312

  3. Enriching semantic knowledge bases for opinion mining in big data applications.

    PubMed

    Weichselbraun, A; Gindl, S; Scharl, A

    2014-10-01

    This paper presents a novel method for contextualizing and enriching large semantic knowledge bases for opinion mining with a focus on Web intelligence platforms and other high-throughput big data applications. The method is not only applicable to traditional sentiment lexicons, but also to more comprehensive, multi-dimensional affective resources such as SenticNet. It comprises the following steps: (i) identify ambiguous sentiment terms, (ii) provide context information extracted from a domain-specific training corpus, and (iii) ground this contextual information to structured background knowledge sources such as ConceptNet and WordNet. A quantitative evaluation shows a significant improvement when using an enriched version of SenticNet for polarity classification. Crowdsourced gold standard data in conjunction with a qualitative evaluation sheds light on the strengths and weaknesses of the concept grounding, and on the quality of the enrichment process.

  4. Bioinformatics clouds for big data manipulation

    PubMed Central

    2012-01-01

    As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor. PMID:23190475

  5. Big Crater as Viewed by Pathfinder Lander - Anaglyph

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft.

    The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft.

    This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each frame consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be.

    The anaglyph view of Big Crater was produced by combining the left and right eye mosaics (above) by assigning the left eye view to the red color plane and the right eye view to the green and blue color planes (cyan), to produce a stereo anaglyph mosaic. This mosaic can be viewed in 3-D on your computer monitor or in color print form by wearing red-blue 3-D glasses.
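
    The red/cyan construction described above is easy to reproduce for any stereo pair: the left-eye image supplies the red channel and the right-eye image the green and blue channels. The sketch below assumes two pre-aligned mosaics saved under placeholder file names; it is not the actual IMP processing pipeline.

```python
import numpy as np
from PIL import Image

# Left-eye mosaic -> red channel; right-eye mosaic -> green and blue (cyan).
left = np.asarray(Image.open("left_mosaic.png").convert("RGB"))
right = np.asarray(Image.open("right_mosaic.png").convert("RGB"))

anaglyph = np.dstack([
    left[..., 0],     # red   <- left eye
    right[..., 1],    # green <- right eye
    right[..., 2],    # blue  <- right eye
])
Image.fromarray(anaglyph).save("big_crater_anaglyph.png")
```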

    Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is a division of the California Institute of Technology (Caltech). The IMP was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator.

  6. Big pharma screening collections: more of the same or unique libraries? The AstraZeneca-Bayer Pharma AG case.

    PubMed

    Kogej, Thierry; Blomberg, Niklas; Greasley, Peter J; Mundt, Stefan; Vainio, Mikko J; Schamberger, Jens; Schmidt, Georg; Hüser, Jörg

    2013-10-01

    In this study, the screening collections of two major pharmaceutical companies (AstraZeneca and Bayer Pharma AG) have been compared using a 2D molecular fingerprint and a nearest-neighbor approach. The results revealed a low overlap between the two collections in terms of compound identity and similarity. This emphasizes the value of screening multiple compound collections to expand the chemical space that can be accessed by high-throughput screening (HTS). Copyright © 2012 Elsevier Ltd. All rights reserved.
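
    A fingerprint-based nearest-neighbour comparison of two compound sets can be sketched with RDKit: hash each structure to a 2D Morgan (ECFP-like) bit vector and, for every compound in one collection, record the highest Tanimoto similarity to the other collection. The SMILES lists, fingerprint settings, and the 0.7 cut-off below are illustrative assumptions, not the study's actual descriptors or thresholds.

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

# Two tiny stand-in "collections" of compounds (SMILES strings).
set_a = ["CCO", "c1ccccc1O", "CC(=O)Nc1ccc(O)cc1"]
set_b = ["c1ccccc1N", "CC(=O)Oc1ccccc1C(=O)O", "CCO"]

def fingerprints(smiles_list):
    return [AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(s), 2, nBits=2048)
            for s in smiles_list]

fa, fb = fingerprints(set_a), fingerprints(set_b)
for smi, fp in zip(set_a, fa):
    best = max(DataStructs.TanimotoSimilarity(fp, other) for other in fb)
    overlap = "(overlapping)" if best >= 0.7 else ""
    print(f"{smi}: nearest-neighbour Tanimoto = {best:.2f} {overlap}")
```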

  7. New Content Addressable Memory (CAM) Technologies for Big Data and Intelligent Electronics Enabled by Magneto-Electric Ternary CAM

    DTIC Science & Technology

    2017-12-11

    provides ultra-low energy search operations. To improve throughput, the in-array pipeline scheme has been developed, allowing the MeTCAM to operate at a ... controlled magnetic tunnel junction (VC-MTJ), which not only reduces cell area (thus achieving higher density) but also eliminates standby energy. This ... Variations of the cell design are presented and evaluated. The results indicated a potential 90x improvement in energy efficiency and a 50x

  8. a R-Shiny Based Phenology Analysis System and Case Study Using Digital Camera Dataset

    NASA Astrophysics Data System (ADS)

    Zhou, Y. K.

    2018-05-01

    Accurate extraction of vegetation phenology information plays an important role in exploring the effects of climate change on vegetation. Repeated photos from digital cameras are a useful and huge data source for phenological analysis. Processing and mining phenological data is still a big challenge, and there is no single tool or universal solution for big data processing and visualization in the field of phenology extraction. In this paper, we propose an R-Shiny-based web application for extracting and analyzing vegetation phenological parameters. Its main functions include phenological site distribution visualization, ROI (region of interest) selection, vegetation index calculation and visualization, data filtering, growth trajectory fitting, and phenology parameter extraction. The long-term observation photography data from the Freemanwood site in 2013 are processed by this system as an example. The results show that: (1) the system is capable of analyzing large data volumes using a distributed framework; (2) the combination of multiple parameter extraction and growth curve fitting methods can effectively extract the key phenology parameters, although there are discrepancies between different combinations in particular study areas. Vegetation with a single growth peak is suitable for fitting the growth trajectory with the double logistic model, while vegetation with multiple growth peaks is better fitted with the spline method.
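
    The growth-trajectory fitting step can be illustrated with the double logistic form commonly used for camera-derived greenness (GCC) series. In the sketch below the time series is synthetic and the parameterization is one common variant, offered only as an assumption about what such a fit looks like, not the system's exact implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

# Double logistic: baseline + amplitude * (rising sigmoid - falling sigmoid).
def double_logistic(t, base, amp, sos, k1, eos, k2):
    return base + amp * (1.0 / (1.0 + np.exp(-k1 * (t - sos)))
                         - 1.0 / (1.0 + np.exp(-k2 * (t - eos))))

# Synthetic GCC time series sampled every 3 days.
doy = np.arange(1, 366, 3)
truth = double_logistic(doy, 0.33, 0.12, 120, 0.08, 280, 0.07)
gcc = truth + np.random.default_rng(1).normal(0, 0.005, doy.size)

p0 = [0.3, 0.1, 110, 0.05, 270, 0.05]                 # rough initial guess
params, _ = curve_fit(double_logistic, doy, gcc, p0=p0, maxfev=10000)
print("start of season ~ DOY %.0f, end of season ~ DOY %.0f" % (params[2], params[4]))
```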

  9. Reflective Filters Design for Self-Filtering Narrowband Ultraviolet Imaging Experiment Wide-Field Surveys (NUVIEWS) Project

    NASA Technical Reports Server (NTRS)

    Park, Jung- Ho; Kim, Jongmin; Zukic, Muamer; Torr, Douglas G.

    1994-01-01

    We report the design of multilayer reflective filters for the self-filtering cameras of the NUVIEWS project. Wide-angle self-filtering cameras were designed to image the C IV (154.9 nm) line emission and H2 Lyman band fluorescence (centered at 161 nm) over a 20 deg x 30 deg field of view. A key element of the filter design includes the development of pi-multilayers optimized to provide maximum reflectance at 154.9 nm and 161 nm for the respective cameras without significant spectral sensitivity to the large cone angle of the incident radiation. We applied self-filtering concepts to design NUVIEWS telescope filters that are composed of three reflective mirrors and one folding mirror. The filters, with narrow bandwidths of 6 and 8 nm at 154.9 and 161 nm, respectively, have net throughputs of more than 50% with average blocking of out-of-band wavelengths better than 3 x 10(exp -4)%.

  10. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends

    PubMed Central

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so called “big data” challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital those big data solutions are multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields. PMID:25383096

  11. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends.

    PubMed

    Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital those big data solutions are multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields.
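
    The Map and Reduce tasks described above are easiest to see as two small functions. The sketch below mimics the Hadoop-streaming pattern in plain Python, counting diagnosis codes in tab-separated clinical records; the input format and field positions are made-up assumptions, and a real deployment would run the two stages as separate streaming jobs over HDFS.

```python
from itertools import groupby

# Map: one record in, zero or more (key, value) pairs out.
def mapper(lines):
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 3:
            yield fields[2], 1               # emit (diagnosis_code, 1)

# Reduce: all values for one key in, aggregated result out.
def reducer(pairs):
    for code, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield code, sum(count for _, count in group)

if __name__ == "__main__":
    # Hypothetical tab-separated clinical records: patient, date, ICD code.
    records = ["p1\t2014-01-02\tE11.9", "p2\t2014-01-03\tI10", "p3\t2014-01-05\tE11.9"]
    for code, total in reducer(mapper(records)):
        print(code, total)
```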

  12. BigView Image Viewing on Tiled Displays

    NASA Technical Reports Server (NTRS)

    Sandstrom, Timothy

    2007-01-01

    BigView allows for interactive panning and zooming of images of arbitrary size on desktop PCs running Linux. Additionally, it can work in a multi-screen environment where multiple PCs cooperate to view a single, large image. Using this software, one can explore on relatively modest machines images such as the Mars Orbiter Camera mosaic [92,160 × 33,280 pixels]. The images must be first converted into paged format, where the image is stored in 256 × 256 pages to allow rapid movement of pixels into texture memory. The format contains an image pyramid: a set of scaled versions of the original image. Each scaled image is 1/2 the size of the previous, starting with the original down to the smallest, which fits into a single 256 × 256 page.
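
    A small sketch of the paged-pyramid bookkeeping described above (the 256 × 256 tile size is taken from the abstract; the level arithmetic is generic, not BigView's actual code): each level halves the previous one until the whole image fits in a single page, and each level is cut into fixed-size pages so any viewport can be served by loading only a few of them.

        import math

        PAGE = 256  # tile size used by the paged format

        def pyramid_levels(width, height):
            # Levels from full resolution down to one that fits in a single page.
            return 1 + max(0, math.ceil(math.log2(max(width, height) / PAGE)))

        def pages_at_level(width, height, level):
            # Dimensions and page grid of a pyramid level (level 0 = full resolution).
            w, h = max(1, width >> level), max(1, height >> level)
            return (w, h), (math.ceil(w / PAGE), math.ceil(h / PAGE))

        # Mars Orbiter Camera mosaic from the abstract: 92,160 x 33,280 pixels.
        w, h = 92160, 33280
        for lvl in range(pyramid_levels(w, h)):
            (lw, lh), (cols, rows) = pages_at_level(w, h, lvl)
            print(f"level {lvl}: {lw}x{lh} px, {cols}x{rows} pages")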

  13. New Airborne Sensors and Platforms for Solving Specific Tasks in Remote Sensing

    NASA Astrophysics Data System (ADS)

    Kemper, G.

    2012-07-01

    A huge number of small and medium-sized sensors has entered the market. Today's medium-format sensors reach 80 MPix and allow medium-sized projects to be flown, comparable with the first large-format digital cameras of about 6 years ago. New high-quality lenses and new developments in integration have prepared the market for photogrammetric work. Companies such as Phase One or Hasselblad and producers or integrators such as Trimble, Optec, and others have utilized these cameras for professional image production. In combination with small camera stabilizers they can also be used in small aircraft, keeping the equipment compact and easily transportable, e.g. for rapid-assessment purposes. The combination of different camera sensors enables multi- or hyperspectral installations, useful e.g. for agricultural or environmental projects. Arrays of oblique-viewing cameras are on the market as well; in many cases these are small and medium-format sensors combined as rotating or shifting devices, or simply as a fixed setup. Besides proper camera installation and integration, the software that controls the hardware and guides the pilot now has to handle many more tasks than a normal FMS did in the past. Small and relatively cheap laser scanners (e.g. Riegl) are on the market, and properly combining them with MS cameras, together with integrated planning and navigation, is a challenge that several software packages have addressed. Turnkey solutions are available, e.g. for monitoring power-line corridors, where taking images is just part of the job: thermal camera systems, laser scanning, and video capture must be combined with object-specific information stored in a database and linked when approaching each navigation point.

  14. Calculation for simulation of archery goal value using a web camera and ultrasonic sensor

    NASA Astrophysics Data System (ADS)

    Rusjdi, Darma; Abdurrasyid, Wulandari, Dewi Arianti

    2017-08-01

    Development of a digital indoor archery simulator based on embedded systems is a response to the limited availability of adequate fields or open space, especially in big cities. Developing the device requires a simulation that calculates the score achieved on the target, based on parabolic motion defined by the arrow's initial velocity and direction of motion as it travels to the target. The simulator device is complemented by an initial-velocity measurement using ultrasonic sensors and a direction measurement using a digital camera. The methodology follows research and development of application software using a modeling and simulation approach. The research objective is to create a simulation application that calculates the score of an arrow on the target; the benefit is a preliminary stage for the development of the archery simulator device. Implementing the score calculation in an application program produces an archery simulation game that can be used as a reference for developing a digital indoor archery simulator with embedded systems using ultrasonic sensors and web cameras. The developed application compares the simulated impact with the outer radius of the circle recorded by a camera from a distance of three meters.
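
    A hedged sketch of the parabolic-motion scoring idea described above (the shooting distance, target geometry, and ring spacing below are invented example values, not parameters from the paper): the measured initial speed and direction are propagated to the target plane and the offset from the centre is converted into a ring score.

        import math

        G = 9.81  # gravitational acceleration, m/s^2

        def impact_offset(v0, elevation_deg, azimuth_deg, distance, launch_height, target_center_height):
            # Offset of the arrow from the target centre (metres) when it crosses the target plane.
            el, az = math.radians(elevation_deg), math.radians(azimuth_deg)
            vx = v0 * math.cos(el) * math.cos(az)   # component toward the target
            vy = v0 * math.cos(el) * math.sin(az)   # sideways component
            vz = v0 * math.sin(el)                  # upward component
            t = distance / vx                       # time of flight to the target plane
            y = vy * t                              # horizontal offset
            z = launch_height + vz * t - 0.5 * G * t ** 2
            return y, z - target_center_height

        def score(y, z, ring_step=0.02):
            # 20 mm per ring is an assumed example value, not a figure from the paper.
            r = math.hypot(y, z)
            return max(0, 10 - int(r // ring_step))

        # Example shot: all numbers are illustrative only.
        dy, dz = impact_offset(v0=55.0, elevation_deg=1.0, azimuth_deg=0.2,
                               distance=18.0, launch_height=1.5, target_center_height=1.3)
        print(score(dy, dz))  # prints a ring score between 0 and 10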

  15. Passive stand-off terahertz imaging with 1 hertz frame rate

    NASA Astrophysics Data System (ADS)

    May, T.; Zieger, G.; Anders, S.; Zakosarenko, V.; Starkloff, M.; Meyer, H.-G.; Thorwirth, G.; Kreysa, E.

    2008-04-01

    Terahertz (THz) cameras are expected to be a powerful tool for future security applications. If such a technology is to be useful for typical security scenarios (e.g. airport check-in) it has to meet some minimum standards. A THz camera should record images at video rate from a safe distance (stand-off). Although active cameras are conceivable, a passive system has the benefit of concealed operation. Additionally, from an ethical perspective, the lack of exposure to a radiation source is a considerable advantage in public acceptance. Taking all these requirements into account, only cooled detectors are able to achieve the needed sensitivity. A big leap forward in detector performance and scalability was driven by the astrophysics community. Superconducting bolometers and midsized arrays of them have been developed and are in routine use. Although devices with many pixels are foreseeable, at present a device with an additional scanning optic is the most direct route to an imaging system with useful resolution. We demonstrate the capabilities of a concept for a passive terahertz video camera based on superconducting technology. The current prototype utilizes a small Cassegrain telescope with a gyrating secondary mirror to record 2 kilopixel THz images with 1 second frame rate.

  16. A Review of Recent Advancement in Integrating Omics Data with Literature Mining towards Biomedical Discoveries

    PubMed Central

    Raja, Kalpana; Patrick, Matthew; Gao, Yilin; Madu, Desmond; Yang, Yuyang

    2017-01-01

    In the past decade, the volume of “omics” data generated by the different high-throughput technologies has expanded exponentially. Managing, storing, and analyzing this big data has been a great challenge for researchers, especially when moving towards the goal of generating testable data-driven hypotheses, which has been the promise of the high-throughput experimental techniques. Different bioinformatics approaches have been developed to streamline the downstream analyses by providing independent information to interpret and provide biological inference. Text mining (also known as literature mining) is one of the commonly used approaches for automated generation of biological knowledge from the huge number of published articles. In this review paper, we discuss the recent advancement in approaches that integrate results from omics data and information generated from text mining approaches to uncover novel biomedical information. PMID:28331849

  17. Enriching semantic knowledge bases for opinion mining in big data applications

    PubMed Central

    Weichselbraun, A.; Gindl, S.; Scharl, A.

    2014-01-01

    This paper presents a novel method for contextualizing and enriching large semantic knowledge bases for opinion mining with a focus on Web intelligence platforms and other high-throughput big data applications. The method is not only applicable to traditional sentiment lexicons, but also to more comprehensive, multi-dimensional affective resources such as SenticNet. It comprises the following steps: (i) identify ambiguous sentiment terms, (ii) provide context information extracted from a domain-specific training corpus, and (iii) ground this contextual information to structured background knowledge sources such as ConceptNet and WordNet. A quantitative evaluation shows a significant improvement when using an enriched version of SenticNet for polarity classification. Crowdsourced gold standard data in conjunction with a qualitative evaluation sheds light on the strengths and weaknesses of the concept grounding, and on the quality of the enrichment process. PMID:25431524

  18. Comprehensive optical and data management infrastructure for high-throughput light-sheet microscopy of whole mouse brains.

    PubMed

    Müllenbroich, M Caroline; Silvestri, Ludovico; Onofri, Leonardo; Costantini, Irene; Hoff, Marcel Van't; Sacconi, Leonardo; Iannello, Giulio; Pavone, Francesco S

    2015-10-01

    Comprehensive mapping and quantification of neuronal projections in the central nervous system requires high-throughput imaging of large volumes with microscopic resolution. To this end, we have developed a confocal light-sheet microscope that has been optimized for three-dimensional (3-D) imaging of structurally intact clarified whole-mount mouse brains. We describe the optical and electromechanical arrangement of the microscope and give details on the organization of the microscope management software. The software orchestrates all components of the microscope, coordinates critical timing and synchronization, and has been written in a versatile and modular structure using the LabVIEW language. It can easily be adapted and integrated to other microscope systems and has been made freely available to the light-sheet community. The tremendous amount of data routinely generated by light-sheet microscopy further requires novel strategies for data handling and storage. To complete the full imaging pipeline of our high-throughput microscope, we further elaborate on big data management from streaming of raw images up to stitching of 3-D datasets. The mesoscale neuroanatomy imaged at micron-scale resolution in those datasets allows characterization and quantification of neuronal projections in unsectioned mouse brains.

  19. A compact imaging spectroscopic system for biomolecular detections on plasmonic chips.

    PubMed

    Lo, Shu-Cheng; Lin, En-Hung; Wei, Pei-Kuen; Tsai, Wan-Shao

    2016-10-17

    In this study, we demonstrate a compact imaging spectroscopic system for high-throughput detection of biomolecular interactions on plasmonic chips, based on a curved grating as the key element of light diffraction and light focusing. Both the curved grating and the plasmonic chips are fabricated on flexible plastic substrates using a gas-assisted thermal-embossing method. A fiber-coupled broadband light source and a camera are included in the system. Spectral resolution within 1 nm is achieved in sensing environmental index solutions and protein binding. The detected sensitivities of the plasmonic chip are comparable with those obtained using a commercial spectrometer. An extra one-dimensional scanning stage enables high-throughput detection of protein binding on a designed plasmonic chip consisting of several nanoslit arrays with different periods. The detected resonance wavelengths match well with the grating equation under an air environment. Wavelength shifts between 1 and 9 nm are detected for antigens of various concentrations binding with antibodies. A simple, mass-producible and cost-effective method has been demonstrated on the imaging spectroscopic system for real-time, label-free, highly sensitive and high-throughput screening of biomolecular interactions.
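
    For reference, the textbook grating-coupling condition for surface plasmons at normal incidence is one plausible reading of the "grating equation" mentioned above (this is the standard approximation, not necessarily the exact relation the authors fit; the gold permittivity below is an assumed round number):

        import math

        def spp_resonance_wavelength(period_nm, eps_metal, n_env=1.0, order=1):
            # Grating-coupled SPP condition at normal incidence (textbook form):
            # lambda = (P/m) * sqrt(eps_m * eps_d / (eps_m + eps_d)), with eps_d = n_env^2.
            eps_d = n_env ** 2
            return (period_nm / order) * math.sqrt((eps_metal * eps_d) / (eps_metal + eps_d))

        # Example with an assumed gold permittivity of about -25 near 800 nm, in air (n_env = 1).
        print(round(spp_resonance_wavelength(760, eps_metal=-25.0), 1))  # slightly above the 760 nm period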

  20. The Widening Gulf between Genomics Data Generation and Consumption: A Practical Guide to Big Data Transfer Technology.

    PubMed

    Feltus, Frank A; Breen, Joseph R; Deng, Juan; Izard, Ryan S; Konger, Christopher A; Ligon, Walter B; Preuss, Don; Wang, Kuang-Ching

    2015-01-01

    In the last decade, high-throughput DNA sequencing has become a disruptive technology and pushed the life sciences into a distributed ecosystem of sequence data producers and consumers. Given the power of genomics and declining sequencing costs, biology is an emerging "Big Data" discipline that will soon enter the exabyte data range when all subdisciplines are combined. These datasets must be transferred across commercial and research networks in creative ways since sending data without thought can have serious consequences on data processing time frames. Thus, it is imperative that biologists, bioinformaticians, and information technology engineers recalibrate data processing paradigms to fit this emerging reality. This review attempts to provide a snapshot of Big Data transfer across networks, which is often overlooked by many biologists. Specifically, we discuss four key areas: 1) data transfer networks, protocols, and applications; 2) data transfer security including encryption, access, firewalls, and the Science DMZ; 3) data flow control with software-defined networking; and 4) data storage, staging, archiving and access. A primary intention of this article is to orient the biologist in key aspects of the data transfer process in order to frame their genomics-oriented needs to enterprise IT professionals.

  1. Big Data: the challenge for small research groups in the era of cancer genomics

    PubMed Central

    Noor, Aisyah Mohd; Holmberg, Lars; Gillett, Cheryl; Grigoriadis, Anita

    2015-01-01

    In the past decade, cancer research has seen an increasing trend towards high-throughput techniques and translational approaches. The increasing availability of assays that utilise smaller quantities of source material and produce higher volumes of data output have resulted in the necessity for data storage solutions beyond those previously used. Multifactorial data, both large in sample size and heterogeneous in context, needs to be integrated in a standardised, cost-effective and secure manner. This requires technical solutions and administrative support not normally financially accounted for in small- to moderate-sized research groups. In this review, we highlight the Big Data challenges faced by translational research groups in the precision medicine era; an era in which the genomes of over 75 000 patients will be sequenced by the National Health Service over the next 3 years to advance healthcare. In particular, we have looked at three main themes of data management in relation to cancer research, namely (1) cancer ontology management, (2) IT infrastructures that have been developed to support data management and (3) the unique ethical challenges introduced by utilising Big Data in research. PMID:26492224

  2. SpUpNIC (Spectrograph Upgrade: Newly Improved Cassegrain) on the South African Astronomical Observatory's 74-inch telescope

    NASA Astrophysics Data System (ADS)

    Crause, Lisa A.; Carter, Dave; Daniels, Alroy; Evans, Geoff; Fourie, Piet; Gilbank, David; Hendricks, Malcolm; Koorts, Willie; Lategan, Deon; Loubser, Egan; Mouries, Sharon; O'Connor, James E.; O'Donoghue, Darragh E.; Potter, Stephen; Sass, Craig; Sickafoose, Amanda A.; Stoffels, John; Swanevelder, Pieter; Titus, Keegan; van Gend, Carel; Visser, Martin; Worters, Hannah L.

    2016-08-01

    SpUpNIC (Spectrograph Upgrade: Newly Improved Cassegrain) is the extensively upgraded Cassegrain Spectrograph on the South African Astronomical Observatory's 74-inch (1.9-m) telescope. The inverse-Cassegrain collimator mirrors and woefully inefficient Maksutov-Cassegrain camera optics have been replaced, along with the CCD and SDSU controller. All moving mechanisms are now governed by a programmable logic controller, allowing remote configuration of the instrument via an intuitive new graphical user interface. The new collimator produces a larger beam to match the optically faster Folded-Schmidt camera design and nine surface-relief diffraction gratings offer various wavelength ranges and resolutions across the optical domain. The new camera optics (a fused silica Schmidt plate, a slotted fold flat and a spherically figured primary mirror, both Zerodur, and a fused silica field-flattener lens forming the cryostat window) reduce the camera's central obscuration to increase the instrument throughput. The physically larger and more sensitive CCD extends the available wavelength range; weak arc lines are now detectable down to 325 nm and the red end extends beyond one micron. A rear-of-slit viewing camera has streamlined the observing process by enabling accurate target placement on the slit and facilitating telescope focus optimisation. An interactive quick-look data reduction tool further enhances the user-friendliness of SpUpNIC.

  3. Studies on a silicon-photomultiplier-based camera for Imaging Atmospheric Cherenkov Telescopes

    NASA Astrophysics Data System (ADS)

    Arcaro, C.; Corti, D.; De Angelis, A.; Doro, M.; Manea, C.; Mariotti, M.; Rando, R.; Reichardt, I.; Tescaro, D.

    2017-12-01

    Imaging Atmospheric Cherenkov Telescopes (IACTs) represent a class of instruments which are dedicated to the ground-based observation of cosmic VHE gamma ray emission based on the detection of the Cherenkov radiation produced in the interaction of gamma rays with the Earth atmosphere. One of the key elements of such instruments is a pixelized focal-plane camera consisting of photodetectors. To date, photomultiplier tubes (PMTs) have been the common choice given their high photon detection efficiency (PDE) and fast time response. Recently, silicon photomultipliers (SiPMs) are emerging as an alternative. This rapidly evolving technology has strong potential to become superior to that based on PMTs in terms of PDE, which would further improve the sensitivity of IACTs, and see a price reduction per square millimeter of detector area. We are working to develop a SiPM-based module for the focal-plane cameras of the MAGIC telescopes to probe this technology for IACTs with large focal plane cameras of an area of few square meters. We will describe the solutions we are exploring in order to balance a competitive performance with a minimal impact on the overall MAGIC camera design using ray tracing simulations. We further present a comparative study of the overall light throughput based on Monte Carlo simulations and considering the properties of the major hardware elements of an IACT.

  4. High throughput and quantitative approaches for measuring circadian rhythms in cyanobacteria using bioluminescence

    PubMed Central

    Shultzaberger, Ryan K.; Paddock, Mark L.; Katsuki, Takeo; Greenspan, Ralph J.; Golden, Susan S.

    2016-01-01

    The temporal measurement of a bioluminescent reporter has proven to be one of the most powerful tools for characterizing circadian rhythms in the cyanobacterium Synechococcus elongatus. Primarily, two approaches have been used to automate this process: (1) detection of cell culture bioluminescence in 96-well plates by a photomultiplier tube-based plate-cycling luminometer (TopCount Microplate Scintillation and Luminescence Counter, Perkin Elmer) and (2) detection of individual colony bioluminescence by iteratively rotating a Petri dish under a cooled CCD camera using a computer-controlled turntable. Each approach has distinct advantages. The TopCount provides a more quantitative measurement of bioluminescence, enabling the direct comparison of clock output levels among strains. The computer-controlled turntable approach has a shorter set-up time and greater throughput, making it a more powerful phenotypic screening tool. While the latter approach is extremely useful, only a few labs have been able to build such an apparatus because of technical hurdles involved in coordinating and controlling both the camera and the turntable, and in processing the resulting images. This protocol provides instructions on how to construct, use, and process data from a computer-controlled turntable to measure the temporal changes in bioluminescence of individual cyanobacterial colonies. Furthermore, we describe how to prepare samples for use with the TopCount to minimize experimental noise, and generate meaningful quantitative measurements of clock output levels for advanced analysis. PMID:25662451

  5. Translational Biomedical Informatics in the Cloud: Present and Future

    PubMed Central

    Chen, Jiajia; Qian, Fuliang; Yan, Wenying; Shen, Bairong

    2013-01-01

    Next generation sequencing and other high-throughput experimental techniques of recent decades have driven the exponential growth in publicly available molecular and clinical data. This information explosion has prepared the ground for the development of translational bioinformatics. The scale and dimensionality of data, however, pose obvious challenges in data mining, storage, and integration. In this paper we demonstrated the utility and promise of cloud computing for tackling the big data problems. We also outline our vision that cloud computing could be an enabling tool to facilitate translational bioinformatics research. PMID:23586054

  6. Application of machine learning methods in bioinformatics

    NASA Astrophysics Data System (ADS)

    Yang, Haoyu; An, Zheng; Zhou, Haotian; Hou, Yawen

    2018-05-01

    With the development of bioinformatics, high-throughput genomic technologies have enabled biology to enter the era of big data [1]. Bioinformatics is an interdisciplinary field covering the acquisition, management, analysis, interpretation, and application of biological information; it derives from the Human Genome Project. The field of machine learning, which aims to develop computer algorithms that improve with experience, holds promise to enable computers to assist humans in the analysis of large, complex data sets [2]. This paper analyzes and compares various machine learning algorithms and their applications in bioinformatics.

  7. Sunglint in Florida Bay taken by the Expedition Two crew

    NASA Image and Video Library

    2001-04-13

    ISS002-E-5466 (13 April 2001) --- From the International Space Station (ISS), an Expedition Two crew member photographed southern Florida, including Dade County with Miami and Miami Beach; Everglades National Park; Big Cypress National Reserve; and the Florida Keys and many other recognizable areas. The crew member, using a digital still camera on this same pass, also recorded imagery of the Lake Okeechobee area, just north of the area represented in this frame.

  8. Fast imaging diagnostics on the C-2U advanced beam-driven field-reversed configuration device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granstedt, E. M., E-mail: egranstedt@trialphaenergy.com; Petrov, P.; Knapp, K.

    2016-11-15

    The C-2U device employed neutral beam injection, end-biasing, and various particle fueling techniques to sustain a Field-Reversed Configuration (FRC) plasma. As part of the diagnostic suite, two fast imaging instruments with radial and nearly axial plasma views were developed using a common camera platform. To achieve the necessary viewing geometry, imaging lenses were mounted behind re-entrant viewports attached to welded bellows. During gettering, the vacuum optics were retracted and isolated behind a gate valve permitting their removal if cleaning was necessary. The axial view incorporated a stainless-steel mirror in a protective cap assembly attached to the vacuum-side of the viewport. For each system, a custom lens-based, high-throughput optical periscope was designed to relay the plasma image about half a meter to a high-speed camera. Each instrument also contained a remote-controlled filter wheel, set between shots to isolate a particular hydrogen or impurity emission line. The design of the camera platform, imaging performance, and sample data for each view is presented.

  9. Fast imaging diagnostics on the C-2U advanced beam-driven field-reversed configuration device

    NASA Astrophysics Data System (ADS)

    Granstedt, E. M.; Petrov, P.; Knapp, K.; Cordero, M.; Patel, V.

    2016-11-01

    The C-2U device employed neutral beam injection, end-biasing, and various particle fueling techniques to sustain a Field-Reversed Configuration (FRC) plasma. As part of the diagnostic suite, two fast imaging instruments with radial and nearly axial plasma views were developed using a common camera platform. To achieve the necessary viewing geometry, imaging lenses were mounted behind re-entrant viewports attached to welded bellows. During gettering, the vacuum optics were retracted and isolated behind a gate valve permitting their removal if cleaning was necessary. The axial view incorporated a stainless-steel mirror in a protective cap assembly attached to the vacuum-side of the viewport. For each system, a custom lens-based, high-throughput optical periscope was designed to relay the plasma image about half a meter to a high-speed camera. Each instrument also contained a remote-controlled filter wheel, set between shots to isolate a particular hydrogen or impurity emission line. The design of the camera platform, imaging performance, and sample data for each view is presented.

  10. Vaccines Meet Big Data: State-of-the-Art and Future Prospects. From the Classical 3Is (“Isolate–Inactivate–Inject”) Vaccinology 1.0 to Vaccinology 3.0, Vaccinomics, and Beyond: A Historical Overview

    PubMed Central

    Bragazzi, Nicola Luigi; Gianfredi, Vincenza; Villarini, Milena; Rosselli, Roberto; Nasr, Ahmed; Hussein, Amr; Martini, Mariano; Behzadifar, Masoud

    2018-01-01

    Vaccines are public health interventions aimed at preventing infection-related mortality, morbidity, and disability. While vaccines have been successfully designed for those infectious diseases preventable by preexisting neutralizing specific antibodies, for other communicable diseases, additional immunological mechanisms should be elicited to achieve full protection. “New vaccines” are particularly urgent in today's society, in which economic growth, globalization, and immigration are leading to the emergence/reemergence of old and new infectious agents at the animal–human interface. Conventional vaccinology (the so-called “vaccinology 1.0”) was officially born in 1796 thanks to the contribution of Edward Jenner. Entering the twenty-first century, vaccinology has shifted from a classical discipline in which serendipity and the Pasteurian principle of the three Is (isolate, inactivate, and inject) played a major role to a science characterized by rational design and planning (“vaccinology 3.0”). This shift has been possible thanks to Big Data, characterized by different dimensions, such as high volume, velocity, and variety of data. Big Data sources include new cutting-edge, high-throughput technologies, electronic registries, social media, and social networks, among others. The current mini-review aims at exploring the potential roles as well as pitfalls and challenges of Big Data in shaping future vaccinology, moving toward a tailored and personalized vaccine design and administration. PMID:29556492

  11. Vaccines Meet Big Data: State-of-the-Art and Future Prospects. From the Classical 3Is ("Isolate-Inactivate-Inject") Vaccinology 1.0 to Vaccinology 3.0, Vaccinomics, and Beyond: A Historical Overview.

    PubMed

    Bragazzi, Nicola Luigi; Gianfredi, Vincenza; Villarini, Milena; Rosselli, Roberto; Nasr, Ahmed; Hussein, Amr; Martini, Mariano; Behzadifar, Masoud

    2018-01-01

    Vaccines are public health interventions aimed at preventing infection-related mortality, morbidity, and disability. While vaccines have been successfully designed for those infectious diseases preventable by preexisting neutralizing specific antibodies, for other communicable diseases, additional immunological mechanisms should be elicited to achieve full protection. "New vaccines" are particularly urgent in today's society, in which economic growth, globalization, and immigration are leading to the emergence/reemergence of old and new infectious agents at the animal-human interface. Conventional vaccinology (the so-called "vaccinology 1.0") was officially born in 1796 thanks to the contribution of Edward Jenner. Entering the twenty-first century, vaccinology has shifted from a classical discipline in which serendipity and the Pasteurian principle of the three Is (isolate, inactivate, and inject) played a major role to a science characterized by rational design and planning ("vaccinology 3.0"). This shift has been possible thanks to Big Data, characterized by different dimensions, such as high volume, velocity, and variety of data. Big Data sources include new cutting-edge, high-throughput technologies, electronic registries, social media, and social networks, among others. The current mini-review aims at exploring the potential roles as well as pitfalls and challenges of Big Data in shaping future vaccinology, moving toward a tailored and personalized vaccine design and administration.

  12. Camera array based light field microscopy

    PubMed Central

    Lin, Xing; Wu, Jiamin; Zheng, Guoan; Dai, Qionghai

    2015-01-01

    This paper proposes a novel approach for high-resolution light field microscopy imaging by using a camera array. In this approach, we apply a two-stage relay system for expanding the aperture plane of the microscope into the size of an imaging lens array, and utilize a sensor array for acquiring different sub-aperture images formed by the corresponding imaging lenses. By combining the rectified and synchronized images from 5 × 5 viewpoints with our prototype system, we successfully recovered color light field videos for various fast-moving microscopic specimens with a spatial resolution of 0.79 megapixels at 30 frames per second, corresponding to an unprecedented data throughput of 562.5 MB/s for light field microscopy. We also demonstrated the use of the reported platform for different applications, including post-capture refocusing, phase reconstruction, 3D imaging, and optical metrology. PMID:26417490

  13. High performance gel imaging with a commercial single lens reflex camera

    NASA Astrophysics Data System (ADS)

    Slobodan, J.; Corbett, R.; Wye, N.; Schein, J. E.; Marra, M. A.; Coope, R. J. N.

    2011-03-01

    A high performance gel imaging system was constructed using a digital single lens reflex camera with epi-illumination to image 19 × 23 cm agarose gels with up to 10,000 DNA bands each. It was found to give equivalent performance to a laser scanner in this high throughput DNA fingerprinting application using the fluorophore SYBR Green®. The specificity and sensitivity of the imager and scanner were within 1% using the same band identification software. Low and high cost color filters were also compared and it was found that with care, good results could be obtained with inexpensive dyed acrylic filters in combination with more costly dielectric interference filters, but that very poor combinations were also possible. Methods for determining resolution, dynamic range, and optical efficiency for imagers are also proposed to facilitate comparison between systems.

  14. Virtual-stereo fringe reflection technique for specular free-form surface testing

    NASA Astrophysics Data System (ADS)

    Ma, Suodong; Li, Bo

    2016-11-01

    Due to their excellent ability to improve the performance of optical systems, free-form optics have attracted extensive interest in many fields, e.g. optical design of astronomical telescopes, laser beam expanders, spectral imagers, etc. However, compared with traditional simple optics, testing such optics is usually more complex and difficult, which has long been a big barrier to their manufacture and application. Fortunately, owing to the rapid development of electronic devices and computer vision technology, the fringe reflection technique (FRT), with the advantages of simple system structure, high measurement accuracy and large dynamic range, is becoming a powerful tool for specular free-form surface testing. In order to obtain absolute surface shape distributions of test objects, two or more cameras are often required in the conventional FRT, which makes the system structure more complex and the measurement cost much higher. Furthermore, high-precision synchronization between the cameras is also a troublesome issue. To overcome the aforementioned drawbacks, a virtual-stereo FRT for specular free-form surface testing is put forward in this paper. It is able to achieve absolute profiles with the help of only a single biprism and one camera, while avoiding the problems of stereo FRT based on binocular or multi-ocular cameras. Preliminary experimental results demonstrate the feasibility of the proposed technique.

  15. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Hui

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, we introduced laser-induced imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, we demonstrated LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 × 250 subunits/cm² for 40-μm wells. This experimental set-up also can screen solid catalysts via near infrared thermography detection.

  16. Comparison of Visual Quality after Implantation of Big Bag and Akreos Adapt Intraocular Lenses in Patients with High Myopia.

    PubMed

    Ma, Shengsheng; Zheng, Dongjian; Lin, Ling; Meng, Fanjian; Yuan, Yonggang

    2015-03-01

    To compare vision quality following phacoemulsification cataract extraction and implantation of a Big Bag or Akreos Adapt intraocular lens (IOL) in patients diagnosed with high myopia complicated with cataract. This was a randomized prospective controlled study. Patients with high myopia complicated with cataract, with axial length ≥ 28 mm and corneal astigmatism ≤ 1 D, were enrolled and randomly divided into the Big Bag and Akreos Adapt IOL groups. All patients underwent phacoemulsification cataract extraction and lens implantation. At 3 months after surgery, intraocular high-order aberration was measured by a Tracey-iTrace wavefront aberrometer at a pupil diameter of 5 mm in an absolutely dark room and statistically compared between the two groups. Images of the anterior segment of the eyes were photographed with a Scheimpflug camera using the Pentacam three-dimensional anterior segment analyzer. The tilt and decentration of the IOL were calculated with Image-Pro Plus 6.0 imaging analysis software and statistically compared between the two groups. In total, 127 patients (127 eyes), including 52 males and 75 females, were enrolled in this study. The total high-order aberration and coma in the Akreos Adapt group (59 eyes) were significantly higher compared with those in the Big Bag group (P < 0.05). The clover and spherical aberration did not differ between the two groups (P > 0.05). The horizontal and vertical decentration were significantly smaller in the Big Bag lens group than in the Akreos Adapt group (both P < 0.05), whereas the tilt of the IOL did not significantly differ between the two groups (P > 0.05). Both Big Bag and Akreos Adapt IOLs possess relatively good intraocular stability when implanted in patients with high myopia. Compared with the Akreos Adapt IOL, the Big Bag IOL produces smaller intraocular high-order aberrations. Coma is the major difference between the two groups.

  17. High throughput analysis of samples in flowing liquid

    DOEpatents

    Ambrose, W. Patrick; Grace, W. Kevin; Goodwin, Peter M.; Jett, James H.; Orden, Alan Van; Keller, Richard A.

    2001-01-01

    Apparatus and method enable imaging multiple fluorescent sample particles in a single flow channel. A flow channel defines a flow direction for samples in a flow stream and has a viewing plane perpendicular to the flow direction. A laser beam is formed as a ribbon having a width effective to cover the viewing plane. Imaging optics are arranged to view the viewing plane to form an image of the fluorescent sample particles in the flow stream, and a camera records the image formed by the imaging optics.

  18. X-ray intensity and source size characterizations for the 25 kV upgraded Manson source at Sandia National Laboratories

    NASA Astrophysics Data System (ADS)

    Loisel, G.; Lake, P.; Gard, P.; Dunham, G.; Nielsen-Weber, L.; Wu, M.; Norris, E.

    2016-11-01

    At Sandia National Laboratories, the x-ray generator Manson source model 5 was upgraded from 10 to 25 kV. The purpose of the upgrade is to drive higher characteristic photon energies with higher throughput. In this work we present characterization studies for the source size and the x-ray intensity when varying the source voltage for a series of K-, L-, and M-shell lines emitted from Al, Y, and Au elements composing the anode. We used a 2-pinhole camera to measure the source size and an energy dispersive detector to monitor the spectral content and intensity of the x-ray source. As the voltage increases, the source size is significantly reduced and line intensity is increased for the three materials. We can take advantage of the smaller source size and higher source throughput to effectively calibrate the suite of Z Pulsed Power Facility crystal spectrometers.

  19. High throughput dual-wavelength temperature distribution imaging via compressive imaging

    NASA Astrophysics Data System (ADS)

    Yao, Xu-Ri; Lan, Ruo-Ming; Liu, Xue-Feng; Zhu, Ge; Zheng, Fu; Yu, Wen-Kai; Zhai, Guang-Jie

    2018-03-01

    Thermal imaging is an essential tool in a wide variety of research areas. In this work we demonstrate high-throughput dual-wavelength temperature distribution imaging using a modified single-pixel camera without the requirement of a beam splitter (BS). A digital micro-mirror device (DMD) is utilized to display binary masks and split the incident radiation, which eliminates the necessity of a BS. Because the spatial resolution is dictated by the DMD, this thermal imaging system has the advantage of perfect spatial registration between the two images, which limits the need for pixel registration and fine adjustments. Two bucket detectors, which measure the total light intensity reflected from the DMD, are employed in this system and yield an improvement in the detection efficiency of the narrow-band radiation. A compressive imaging algorithm is utilized to achieve under-sampling recovery. A proof-of-principle experiment is presented to demonstrate the feasibility of this structure.
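
    A toy sketch of the single-pixel measurement model behind the system described above (assumptions: random binary DMD masks, a synthetic scene, and a ridge-regularized least-squares recovery standing in for the authors' compressive-sensing algorithm): each measurement is the bucket detector's response to one mask, and the image is recovered from fewer measurements than pixels.

        import numpy as np

        rng = np.random.default_rng(0)
        n_side = 16                      # 16x16 scene -> 256 unknowns
        n_pix = n_side * n_side
        n_meas = 128                     # under-sampled: half as many measurements as pixels

        # Ground-truth scene (a bright rectangle on a dark background).
        scene = np.zeros((n_side, n_side))
        scene[4:9, 5:11] = 1.0
        x_true = scene.ravel()

        # Random binary DMD masks; each row is one pattern shown on the mirror array.
        A = rng.integers(0, 2, size=(n_meas, n_pix)).astype(float)

        # Bucket-detector readings: total reflected intensity per mask, plus a little noise.
        y = A @ x_true + 0.01 * rng.standard_normal(n_meas)

        # Minimal recovery: ridge-regularized least squares (a stand-in for a true CS solver).
        lam = 1e-1
        x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_pix), A.T @ y)

        print("relative reconstruction error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))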

  20. Real-time Image Processing for Microscopy-based Label-free Imaging Flow Cytometry in a Microfluidic Chip.

    PubMed

    Heo, Young Jin; Lee, Donghyeon; Kang, Junsu; Lee, Keondo; Chung, Wan Kyun

    2017-09-14

    Imaging flow cytometry (IFC) is an emerging technology that acquires single-cell images at high throughput for analysis of a cell population. The rich information that comes from the high sensitivity and spatial resolution of a single-cell microscopic image is beneficial for single-cell analysis in various biological applications. In this paper, we present a fast image-processing pipeline (R-MOD: Real-time Moving Object Detector) based on deep learning for high-throughput microscopy-based label-free IFC in a microfluidic chip. The R-MOD pipeline acquires all single-cell images of cells in flow, and identifies the acquired images as a real-time process with minimum hardware that consists of a microscope and a high-speed camera. Experiments show that R-MOD is fast and accurate (500 fps and 93.3% mAP) and is expected to serve as a powerful tool for biomedical and clinical applications.

  1. A traffic situation analysis system

    NASA Astrophysics Data System (ADS)

    Sidla, Oliver; Rosner, Marcin

    2011-01-01

    The observation and monitoring of traffic with smart vision systems for the purpose of improving traffic safety has big potential. For example, embedded vision systems built into vehicles can be used as early warning systems, or stationary camera systems can modify the switching frequency of signals at intersections. Today the automated analysis of traffic situations is still in its infancy - the patterns of vehicle motion and pedestrian flow in an urban environment are too complex to be fully understood by a vision system. We present steps towards such a traffic monitoring system which is designed to detect potentially dangerous traffic situations, especially incidents in which the interaction of pedestrians and vehicles might develop into safety critical encounters. The proposed system is field-tested at a real pedestrian crossing in the City of Vienna for the duration of one year. It consists of a cluster of 3 smart cameras, each of which is built from a very compact PC hardware system in an outdoor capable housing. Two cameras run vehicle detection software including license plate detection and recognition, while one camera runs a complex pedestrian detection and tracking module based on the HOG detection principle. As a supplement, all 3 cameras use additional optical flow computation in a low-resolution video stream in order to estimate the motion path and speed of objects. This work describes the foundation for all 3 different object detection modalities (pedestrians, vehicles, license plates), and explains the system setup and its design.
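
    The pedestrian module is described as HOG-based; a minimal OpenCV sketch of that detection principle (not the deployed Vienna software; the image file name is a placeholder) looks like this:

        import cv2

        # OpenCV's built-in HOG descriptor with the default people detector (Dalal-Triggs style).
        hog = cv2.HOGDescriptor()
        hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

        frame = cv2.imread("crossing.jpg")  # one frame from the camera (placeholder file name)
        rects, weights = hog.detectMultiScale(frame, winStride=(8, 8), padding=(8, 8), scale=1.05)

        # Draw a box around each detected pedestrian and save the result.
        for (x, y, w, h) in rects:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imwrite("crossing_detections.jpg", frame)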

  2. STS-65 Earth observation of Hurricane Emilia taken aboard Columbia, OV-102

    NASA Image and Video Library

    1994-07-23

    STS-65 Earth observation of Hurricane Emilia in Pacific Ocean was taken aboard Columbia, Orbiter Vehicle (OV) 102. This vertical view, photographed with a handheld 70mm camera, reveals the well-defined eye of the hurricane as it moves westerly several hundred miles southeast of the big island of Hawaii. Early in the flight the crew was able to observe the evolution of the storm and there was some concern that it might eventually head toward the Hawaiian Islands. Fortunately it did not.

  3. PLAStiCC: Predictive Look-Ahead Scheduling for Continuous dataflows on Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumbhare, Alok; Simmhan, Yogesh; Prasanna, Viktor K.

    2014-05-27

    Scalable stream processing and continuous dataflow systems are gaining traction with the rise of big data due to the need for processing high velocity data in near real time. Unlike batch processing systems such as MapReduce and workflows, static scheduling strategies fall short for continuous dataflows due to the variations in the input data rates and the need for sustained throughput. The elastic resource provisioning of cloud infrastructure is valuable to meet the changing resource needs of such continuous applications. However, multi-tenant cloud resources introduce yet another dimension of performance variability that impacts the application’s throughput. In this paper we propose PLAStiCC, an adaptive scheduling algorithm that balances resource cost and application throughput using a prediction-based look-ahead approach. It not only addresses variations in the input data rates but also the underlying cloud infrastructure. In addition, we also propose several simpler static scheduling heuristics that operate in the absence of an accurate performance prediction model. These static and adaptive heuristics are evaluated through extensive simulations using performance traces obtained from public and private IaaS clouds. Our results show an improvement of up to 20% in the overall profit as compared to the reactive adaptation algorithm.
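
    As a rough illustration of the look-ahead idea (a toy heuristic only, not the PLAStiCC algorithm; the rates and VM capacity are invented), a scheduler that provisions for the peak rate predicted within a short horizon scales up before a burst arrives instead of reacting after throughput has already dropped:

        def lookahead_provision(predicted_rates, vm_capacity, horizon=3):
            # Size the cluster for the peak rate expected within the horizon,
            # instead of reacting only to the current rate.
            plan = []
            for t in range(len(predicted_rates)):
                peak = max(predicted_rates[t:t + horizon])
                plan.append(-(-peak // vm_capacity))   # ceiling division: VMs needed for the peak
            return plan

        # Predicted input rates (messages/s) and per-VM processing capacity.
        rates = [100, 120, 400, 380, 150, 90]
        print(lookahead_provision(rates, vm_capacity=100))  # [4, 4, 4, 4, 2, 1]: scales up before the burst at t=2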

  4. Large-scale protein-protein interactions detection by integrating big biosensing data with computational model.

    PubMed

    You, Zhu-Hong; Li, Shuai; Gao, Xin; Luo, Xin; Ji, Zhen

    2014-01-01

    Protein-protein interactions are the basis of biological functions, and studying these interactions on a molecular level is of crucial importance for understanding the functionality of a living cell. During the past decade, biosensors have emerged as an important tool for the high-throughput identification of proteins and their interactions. However, the high-throughput experimental methods for identifying PPIs are both time-consuming and expensive. On the other hand, high-throughput PPI data are often associated with high false-positive and high false-negative rates. Targeting these problems, we propose a method for PPI detection by integrating biosensor-based PPI data with a novel computational model. This method was developed based on the extreme learning machine algorithm combined with a novel protein sequence descriptor representation. When performed on the large-scale human protein interaction dataset, the proposed method achieved 84.8% prediction accuracy with 84.08% sensitivity at a specificity of 85.53%. We conducted more extensive experiments to compare the proposed method with a state-of-the-art technique, the support vector machine. The achieved results demonstrate that our approach is very promising for detecting new PPIs, and it can be a helpful supplement for biosensor-based PPI data detection.
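
    The classifier named in the abstract, an extreme learning machine, is simple enough to sketch: the hidden layer is random and fixed, and only the output weights are solved in closed form. The protein sequence descriptor itself is not reproduced here; random features and synthetic labels stand in for it.

        import numpy as np

        rng = np.random.default_rng(1)

        def elm_train(X, y, n_hidden=200, lam=1e-3):
            # Random input weights and biases are drawn once and never trained.
            W = rng.standard_normal((X.shape[1], n_hidden))
            b = rng.standard_normal(n_hidden)
            H = np.tanh(X @ W + b)
            # Output weights from regularized least squares (the only "learning" step).
            beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)
            return W, b, beta

        def elm_predict(X, model):
            W, b, beta = model
            return np.tanh(X @ W + b) @ beta

        # Stand-in data: 400 "protein pairs", 64-dimensional descriptors, binary interaction labels.
        X = rng.standard_normal((400, 64))
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)
        model = elm_train(X[:300], y[:300])
        acc = np.mean((elm_predict(X[300:], model) > 0.5) == y[300:])
        print("hold-out accuracy:", acc)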

  5. WIYN bench upgrade: a revitalized spectrograph

    NASA Astrophysics Data System (ADS)

    Bershady, M.; Barden, S.; Blanche, P.-A.; Blanco, D.; Corson, C.; Crawford, S.; Glaspey, J.; Habraken, S.; Jacoby, G.; Keyes, J.; Knezek, P.; Lemaire, P.; Liang, M.; McDougall, E.; Poczulp, G.; Sawyer, D.; Westfall, K.; Willmarth, D.

    2008-07-01

    We describe the redesign and upgrade of the versatile fiber-fed Bench Spectrograph on the WIYN 3.5m telescope. The spectrograph is fed by either the Hydra multi-object positioner or integral-field units (IFUs) at two other ports, and can be configured with an adjustable camera-collimator angle to use low-order and echelle gratings. The upgrade, including a new collimator, charge-coupled device (CCD) and modern controller, and volume-phase holographic gratings (VPHG), has high performance-to-cost ratio by combining new technology with a system reconfiguration that optimizes throughput while utilizing as much of the existing instrument as possible. A faster, all-refractive collimator enhances throughput by 60%, nearly eliminates the slit-function due to vignetting, and improves image quality to maintain instrumental resolution. Two VPH gratings deliver twice the diffraction efficiency of existing surface-relief gratings: A 740 l/mm grating (float-glass and post-polished) used in 1st and 2nd-order, and a large 3300 l/mm grating (spectral resolution comparable to the R2 echelle). The combination of collimator, high-quantum efficiency (QE) CCD, and VPH gratings yields throughput gain-factors of up to 3.5.

  6. The Widening Gulf between Genomics Data Generation and Consumption: A Practical Guide to Big Data Transfer Technology

    PubMed Central

    Feltus, Frank A.; Breen, Joseph R.; Deng, Juan; Izard, Ryan S.; Konger, Christopher A.; Ligon, Walter B.; Preuss, Don; Wang, Kuang-Ching

    2015-01-01

    In the last decade, high-throughput DNA sequencing has become a disruptive technology and pushed the life sciences into a distributed ecosystem of sequence data producers and consumers. Given the power of genomics and declining sequencing costs, biology is an emerging “Big Data” discipline that will soon enter the exabyte data range when all subdisciplines are combined. These datasets must be transferred across commercial and research networks in creative ways since sending data without thought can have serious consequences on data processing time frames. Thus, it is imperative that biologists, bioinformaticians, and information technology engineers recalibrate data processing paradigms to fit this emerging reality. This review attempts to provide a snapshot of Big Data transfer across networks, which is often overlooked by many biologists. Specifically, we discuss four key areas: 1) data transfer networks, protocols, and applications; 2) data transfer security including encryption, access, firewalls, and the Science DMZ; 3) data flow control with software-defined networking; and 4) data storage, staging, archiving and access. A primary intention of this article is to orient the biologist in key aspects of the data transfer process in order to frame their genomics-oriented needs to enterprise IT professionals. PMID:26568680

  7. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Machine learning and data mining advance predictive big data analysis in precision animal agriculture.

    PubMed

    Morota, Gota; Ventura, Ricardo V; Silva, Fabyano F; Koyama, Masanori; Fernando, Samodha C

    2018-04-14

    Precision animal agriculture is poised to rise to prominence in the livestock enterprise in the domains of management, production, welfare, sustainability, health surveillance, and environmental footprint. Considerable progress has been made in the use of tools to routinely monitor and collect information from animals and farms in a less laborious manner than before. These efforts have enabled the animal sciences to embark on information technology-driven discoveries to improve animal agriculture. However, the growing amount and complexity of data generated by fully automated, high-throughput data recording or phenotyping platforms, including digital images, sensor and sound data, unmanned systems, and information obtained from real-time noninvasive computer vision, pose challenges to the successful implementation of precision animal agriculture. The emerging fields of machine learning and data mining are expected to be instrumental in helping meet the daunting challenges facing global agriculture. Yet, their impact and potential in "big data" analysis have not been adequately appreciated in the animal science community, where this recognition has remained only fragmentary. To address such knowledge gaps, this article outlines a framework for machine learning and data mining and offers a glimpse into how they can be applied to solve pressing problems in animal sciences.

  8. Harnessing Big Data for Systems Pharmacology

    PubMed Central

    Xie, Lei; Draizen, Eli J.; Bourne, Philip E.

    2017-01-01

    Systems pharmacology aims to holistically understand mechanisms of drug actions to support drug discovery and clinical practice. Systems pharmacology modeling (SPM) is data driven. It integrates an exponentially growing amount of data at multiple scales (genetic, molecular, cellular, organismal, and environmental). The goal of SPM is to develop mechanistic or predictive multiscale models that are interpretable and actionable. The current explosions in genomics and other omics data, as well as the tremendous advances in big data technologies, have already enabled biologists to generate novel hypotheses and gain new knowledge through computational models of genome-wide, heterogeneous, and dynamic data sets. More work is needed to interpret and predict a drug response phenotype, which is dependent on many known and unknown factors. To gain a comprehensive understanding of drug actions, SPM requires close collaborations between domain experts from diverse fields and integration of heterogeneous models from biophysics, mathematics, statistics, machine learning, and semantic webs. This creates challenges in model management, model integration, model translation, and knowledge integration. In this review, we discuss several emergent issues in SPM and potential solutions using big data technology and analytics. The concurrent development of high-throughput techniques, cloud computing, data science, and the semantic web will likely allow SPM to be findable, accessible, interoperable, reusable, reliable, interpretable, and actionable. PMID:27814027

  9. Harnessing Big Data for Systems Pharmacology.

    PubMed

    Xie, Lei; Draizen, Eli J; Bourne, Philip E

    2017-01-06

    Systems pharmacology aims to holistically understand mechanisms of drug actions to support drug discovery and clinical practice. Systems pharmacology modeling (SPM) is data driven. It integrates an exponentially growing amount of data at multiple scales (genetic, molecular, cellular, organismal, and environmental). The goal of SPM is to develop mechanistic or predictive multiscale models that are interpretable and actionable. The current explosions in genomics and other omics data, as well as the tremendous advances in big data technologies, have already enabled biologists to generate novel hypotheses and gain new knowledge through computational models of genome-wide, heterogeneous, and dynamic data sets. More work is needed to interpret and predict a drug response phenotype, which is dependent on many known and unknown factors. To gain a comprehensive understanding of drug actions, SPM requires close collaborations between domain experts from diverse fields and integration of heterogeneous models from biophysics, mathematics, statistics, machine learning, and semantic webs. This creates challenges in model management, model integration, model translation, and knowledge integration. In this review, we discuss several emergent issues in SPM and potential solutions using big data technology and analytics. The concurrent development of high-throughput techniques, cloud computing, data science, and the semantic web will likely allow SPM to be findable, accessible, interoperable, reusable, reliable, interpretable, and actionable.

  10. Brain Connectivity as a DNA Sequencing Problem

    NASA Astrophysics Data System (ADS)

    Zador, Anthony

    The mammalian cortex consists of millions or billions of neurons, each connected to thousands of other neurons. Traditional methods for determining the brain connectivity rely on microscopy to visualize neuronal connections, but such methods are slow, labor-intensive and often lack single neuron resolution. We have recently developed a new method, MAPseq, to recast the determination of brain wiring into a form that can exploit the tremendous recent advances in high-throughput DNA sequencing. DNA sequencing technology has outpaced even Moore's law, so that the cost of sequencing the human genome has dropped from a billion dollars in 2001 to below a thousand dollars today. MAPseq works by introducing random sequences of DNA ("barcodes") to tag neurons uniquely. With MAPseq, we can determine the connectivity of over 50K single neurons in a single mouse cortex in about a week, an unprecedented throughput, ushering in the era of "big data" for brain wiring. We are now developing analytical tools and algorithms to make sense of these novel data sets.
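
    How safely random barcodes can tag neurons "uniquely" follows from the birthday-problem approximation; the 30-nucleotide barcode length below is an assumption for illustration, not a figure from the abstract:

        import math

        def collision_probability(n_neurons, barcode_length_nt=30):
            # Birthday approximation: P(collision) ~ 1 - exp(-n^2 / (2K)), with K = 4^L possible barcodes.
            k = 4 ** barcode_length_nt
            return 1.0 - math.exp(-n_neurons ** 2 / (2 * k))

        # 50,000 labeled neurons, as in the abstract.
        print(collision_probability(50_000))   # roughly 1e-9: barcode collisions are vanishingly rare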

  11. Sophia: A Expedient UMLS Concept Extraction Annotator.

    PubMed

    Divita, Guy; Zeng, Qing T; Gundlapalli, Adi V; Duvall, Scott; Nebeker, Jonathan; Samore, Matthew H

    2014-01-01

    An opportunity exists for meaningful concept extraction and indexing from large corpora of clinical notes in the Veterans Affairs (VA) electronic medical record. Currently available tools such as MetaMap, cTAKES and HITex do not scale up to address this big data need. Sophia, a rapid UMLS concept extraction annotator, was developed to fulfill a mandate and address extraction where high throughput is needed while preserving performance. We report on the development, testing and benchmarking of Sophia against MetaMap and cTAKES. Sophia demonstrated improved performance on recall as compared to cTAKES and MetaMap (0.71 vs 0.66 and 0.38). The overall f-score was similar to cTAKES and an improvement over MetaMap (0.53 vs 0.57 and 0.43). With regard to speed of processing records, we noted Sophia to be several fold faster than cTAKES and the scaled-out MetaMap service. Sophia offers a viable alternative for high-throughput information extraction tasks.

  12. Tier-2 Optimisation for Computational Density/Diversity and Big Data

    NASA Astrophysics Data System (ADS)

    Fay, R. B.; Bland, J.

    2014-06-01

    As the number of cores on chip continues to trend upwards and new CPU architectures emerge, increasing CPU density and diversity presents multiple challenges to site administrators. These include scheduling for massively multi-core systems (potentially including Graphics Processing Units (GPUs), integrated and dedicated, and Many Integrated Core (MIC) devices) to ensure a balanced throughput of jobs while preserving overall cluster throughput, the increasing complexity of developing for these heterogeneous platforms, and the challenge of managing this more complex mix of resources. In addition, meeting data demands as both dataset sizes increase and as the rate of demand scales with increased computational power requires additional performance from the associated storage elements. In this report, we evaluate one emerging technology, Solid State Drive (SSD) caching for RAID controllers, with consideration to its potential to assist in meeting evolving demand. We also briefly consider the broader developing trends outlined above in order to identify issues that may develop and assess what actions should be taken in the immediate term to address them.

  13. Sophia: A Expedient UMLS Concept Extraction Annotator

    PubMed Central

    Divita, Guy; Zeng, Qing T; Gundlapalli, Adi V.; Duvall, Scott; Nebeker, Jonathan; Samore, Matthew H.

    2014-01-01

    An opportunity exists for meaningful concept extraction and indexing from large corpora of clinical notes in the Veterans Affairs (VA) electronic medical record. Currently available tools such as MetaMap, cTAKES and HITEx do not scale up to address this big data need. Sophia, a rapid UMLS concept extraction annotator, was developed to fulfill a mandate and address extraction where high throughput is needed while preserving performance. We report on the development, testing and benchmarking of Sophia against MetaMap and cTAKES. Sophia demonstrated improved performance on recall as compared to cTAKES and MetaMap (0.71 vs 0.66 and 0.38). The overall f-score was similar to cTAKES and an improvement over MetaMap (0.53 vs 0.57 and 0.43). With regard to speed of processing records, we noted Sophia to be several-fold faster than cTAKES and the scaled-out MetaMap service. Sophia offers a viable alternative for high-throughput information extraction tasks. PMID:25954351

  14. Materials Informatics: The Materials ``Gene'' and Big Data

    NASA Astrophysics Data System (ADS)

    Rajan, Krishna

    2015-07-01

    Materials informatics provides the foundations for a new paradigm of materials discovery. It shifts our emphasis from one of solely searching among large volumes of data that may be generated by experiment or computation to one of targeted materials discovery via high-throughput identification of the key factors (i.e., “genes”) and via showing how these factors can be quantitatively integrated by statistical learning methods into design rules (i.e., “gene sequencing”) governing targeted materials functionality. However, a critical challenge in discovering these materials genes is the difficulty in unraveling the complexity of the data associated with numerous factors including noise, uncertainty, and the complex diversity of data that one needs to consider (i.e., Big Data). In this article, we explore one aspect of materials informatics, namely how one can efficiently explore for new knowledge in regimes of structure-property space, especially when no reasonable selection pathways based on theory or clear trends in observations exist among an almost infinite set of possibilities.

  15. Application of high performance asynchronous socket communication in power distribution automation

    NASA Astrophysics Data System (ADS)

    Wang, Ziyu

    2017-05-01

    With the development of information technology and Internet technology, and with the growing demand for electricity, the stable and reliable operation of the power system has long been the goal of power grid workers. With the advent of the era of big data, power data will gradually become an important means of guaranteeing the safe and reliable operation of the power grid. In the electric power industry, the key question is therefore how to receive the data transmitted by data acquisition devices efficiently and robustly, so that the power distribution automation system can make sound decisions quickly. In this paper, some existing problems in power system communication are analysed and, with the help of network technology, a set of solutions based on asynchronous socket technology is proposed for network communication that must sustain high concurrency and high throughput. The paper also looks forward to the development direction of power distribution automation in the era of big data and artificial intelligence.
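
    The paper's asynchronous socket solution is not shown in the abstract; as a hedged sketch of the general pattern it describes (high-concurrency reception of data from many acquisition devices), the code below runs a minimal asyncio TCP server. The newline-delimited framing and the processing hook are assumptions.

        # Minimal sketch of an asynchronous socket server for many concurrent
        # data-acquisition devices (illustrative only; framing and handler are assumed).
        import asyncio

        async def handle_device(reader, writer):
            peer = writer.get_extra_info("peername")
            try:
                while True:
                    frame = await reader.readline()          # assume newline-delimited frames
                    if not frame:
                        break                                # device disconnected
                    await process_measurement(peer, frame)   # hypothetical processing hook
            finally:
                writer.close()
                await writer.wait_closed()

        async def process_measurement(peer, frame):
            # placeholder: decode and hand off to the distribution-automation logic
            print(peer, frame.decode(errors="replace").strip())

        async def main(host="0.0.0.0", port=9000):
            server = await asyncio.start_server(handle_device, host, port)
            async with server:
                await server.serve_forever()

        if __name__ == "__main__":
            asyncio.run(main())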

  16. Diverse Grains in Mars Sandstone Target Big Arm

    NASA Image and Video Library

    2015-07-01

    This view of a sandstone target called "Big Arm" covers an area about 1.3 inches (33 millimeters) wide in detail that shows differing shapes and colors of sand grains in the stone. Three separate images taken by the Mars Hand Lens Imager (MAHLI) camera on NASA's Curiosity Mars rover, at different focus settings, were combined into this focus-merge view. The Big Arm target on lower Mount Sharp is at a location near "Marias Pass" where a mudstone bedrock is in contact with overlying sandstone bedrock. MAHLI recorded the component images on May 29, 2015, during the 999th Martian day, or sol, of Curiosity's work on Mars. The rounded shape of some grains visible here suggests they traveled long distances before becoming part of the sediment that later hardened into sandstone. Other grains are more angular and may have originated closer to the rock's current location. Lighter and darker grains may have different compositions. MAHLI was built by Malin Space Science Systems, San Diego. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Science Laboratory Project for the NASA Science Mission Directorate, Washington. http://photojournal.jpl.nasa.gov/catalog/PIA19677

  17. Toward a Low-Cost System for High-Throughput Image-Based Phenotyping of Root System Architecture

    NASA Astrophysics Data System (ADS)

    Davis, T. W.; Schneider, D. J.; Cheng, H.; Shaw, N.; Kochian, L. V.; Shaff, J. E.

    2015-12-01

    Root system architecture is being studied more closely for improved nutrient acquisition, stress tolerance and carbon sequestration by relating the genetic material that corresponds to preferential physical features. This information can help direct plant breeders in addressing the growing concerns regarding the global demand on crops and fossil fuels. Supporting this effort requires making high-throughput image-based phenotyping of plant roots, at the individual plant scale, simpler and more affordable. Our goal is to create an affordable and portable product for simple image collection, processing and management that will extend root phenotyping to institutions with limited funding (e.g., in developing countries). Thus, a new integrated system has been developed using the Raspberry Pi single-board computer. Similar to other 3D-based imaging platforms, the system utilizes a stationary camera to photograph a rotating crop root system (e.g., rice, maize or sorghum) that is suspended either in a gel or on a mesh (for hydroponics). In contrast, the new design takes advantage of powerful open-source hardware and software to reduce the system costs, simplify the imaging process, and manage the large datasets produced by the high-resolution photographs. A newly designed graphical user interface (GUI) unifies the system controls (e.g., adjusting camera and motor settings and orchestrating the motor motion with image capture), making it easier to accommodate a variety of experiments. During each imaging session, the metadata necessary for reproducing experiment results (e.g., plant type and age, growing conditions and treatments, camera settings) are collected in hierarchical data format files; a minimal sketch of such a capture session is given below. These metadata are searchable within the GUI and can be selected and extracted for further analysis. The GUI also supports an image previewer that performs limited image processing (e.g., thresholding and cropping). Root skeletonization, 3D reconstruction and trait calculation (e.g., rooting depth, rooting angle, total volume of roots) are being developed in conjunction with this project.
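
    As a hedged sketch of such a capture session (not the authors' software), the code below steps a rotation stage, photographs each view with a Raspberry Pi camera, and records session metadata in an HDF5 file via h5py. rotate_stage() is a placeholder for the motor driver, and all names and settings are hypothetical.

        # Illustrative capture loop (not the project's actual software): rotate the
        # root system in fixed steps, photograph each view, and record session
        # metadata in an HDF5 file.
        import time
        import h5py
        from picamera import PiCamera   # assumes a Raspberry Pi camera module

        def rotate_stage(degrees):
            """Placeholder for the stepper-motor driver; replace with real GPIO code."""
            pass

        def capture_session(out_prefix, n_views=72, metadata=None):
            camera = PiCamera(resolution=(2592, 1944))
            time.sleep(2)                               # let exposure/gain settle
            with h5py.File(out_prefix + "_meta.h5", "w") as meta:
                for key, value in (metadata or {}).items():
                    meta.attrs[key] = value             # e.g. plant type, age, treatment
                names = []
                for i in range(n_views):
                    rotate_stage(360.0 / n_views)       # hypothetical: step the motor
                    fname = f"{out_prefix}_{i:03d}.jpg"
                    camera.capture(fname)
                    names.append(fname)
                meta.create_dataset("image_files", data=[n.encode() for n in names])
            camera.close()

        # capture_session("rice_plant01", metadata={"species": "rice", "age_days": 14})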

  18. Tunable sensitivity phase detection of transmitted-type dual-channel guided-mode resonance sensor based on phase-shift interferometry.

    PubMed

    Kuo, Wen-Kai; Syu, Siang-He; Lin, Peng-Zhi; Yu, Hsin Her

    2016-02-01

    This paper reports on a transmitted-type dual-channel guided-mode resonance (GMR) sensor system that uses phase-shifting interferometry (PSI) to achieve tunable phase detection sensitivity. Five interference images are captured for the PSI phase calculation within ∼15 s by using a liquid crystal retarder and a USB web camera. The GMR sensor structure is formed by a nanoimprinting process, and the dual-channel sensor device structure for molding is fabricated using a 3D printer. By changing the rotation angle of the analyzer in front of the camera in the PSI system, the sensor detection sensitivity can be tuned. The proposed system may achieve high throughput as well as high sensitivity. The experimental results show that an optimal detection sensitivity of 6.82×10^-4 RIU can be achieved.
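
    The abstract does not give the phase-retrieval formula; one common five-frame choice is the Hariharan algorithm, shown below as a hedged numpy sketch under the assumption that the five interference images are acquired at nominal π/2 phase increments.

        # Hedged sketch: Hariharan five-step phase-shifting algorithm (one common
        # choice; the paper does not specify its exact PSI formula). Inputs are five
        # interference images I1..I5 taken at nominal pi/2 phase increments.
        import numpy as np

        def psi_phase_5step(i1, i2, i3, i4, i5):
            """Return the wrapped phase map in radians for each pixel."""
            return np.arctan2(2.0 * (i2 - i4), 2.0 * i3 - i1 - i5)

        # toy usage with synthetic fringes shifted by (k - 2) * pi/2
        x = np.linspace(0, 4 * np.pi, 256)
        true_phase = np.tile(x, (256, 1))
        frames = [1 + np.cos(true_phase + (k - 2) * np.pi / 2) for k in range(5)]
        phase = psi_phase_5step(*frames)        # wrapped into (-pi, pi]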

  19. A Radiation-Triggered Surveillance System for UF6 Cylinder Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, Michael M.; Myjak, Mitchell J.

    This report provides background information and representative scenarios for testing a prototype radiation-triggered surveillance system at an operating facility that handles uranium hexafluoride (UF6) cylinders. The safeguards objective is to trigger cameras using radiation, or radiation and motion, rather than motion alone, to reduce significantly the number of image files generated by a motion-triggered system. The authors recommend the use of radiation-triggered surveillance at all facilities where cylinder paths are heavily traversed by personnel. The International Atomic Energy Agency (IAEA) has begun using surveillance cameras in the feed and withdrawal areas of gas centrifuge enrichment plants (GCEPs). The cameras generate imagery using elapsed time or motion, but this creates problems in areas occupied 24/7 by personnel. Either motion- or interval-based triggering generates thousands of review files over the course of a month. Since inspectors must review the files to verify operator material-flow declarations, a plethora of files significantly extends the review process. The primary advantage of radiation-triggered surveillance is the opportunity to obtain full-time cylinder throughput verification versus what presently amounts to part-time verification. Cost savings should be substantial, as the IAEA presently uses frequent unannounced inspections to verify cylinder-throughput declarations. The use of radiation-triggered surveillance allows the IAEA to implement less frequent unannounced inspections for the purpose of flow verification, but its principal advantage is significantly shorter and more effective inspector video reviews.

  20. The NCI Genomic Data Commons as an engine for precision medicine.

    PubMed

    Jensen, Mark A; Ferretti, Vincent; Grossman, Robert L; Staudt, Louis M

    2017-07-27

    The National Cancer Institute Genomic Data Commons (GDC) is an information system for storing, analyzing, and sharing genomic and clinical data from patients with cancer. The recent high-throughput sequencing of cancer genomes and transcriptomes has produced a big data problem that precludes many cancer biologists and oncologists from gleaning knowledge from these data regarding the nature of malignant processes and the relationship between tumor genomic profiles and treatment response. The GDC aims to democratize access to cancer genomic data and to foster the sharing of these data to promote precision medicine approaches to the diagnosis and treatment of cancer.

  1. Opinion: Why we need a centralized repository for isotopic data

    USGS Publications Warehouse

    Pauli, Jonathan N.; Newsome, Seth D.; Cook, Joseph A.; Harrod, Chris; Steffan, Shawn A.; Baker, Christopher J. O.; Ben-David, Merav; Bloom, David; Bowen, Gabriel J.; Cerling, Thure E.; Cicero, Carla; Cook, Craig; Dohm, Michelle; Dharampal, Prarthana S.; Graves, Gary; Gropp, Robert; Hobson, Keith A.; Jordan, Chris; MacFadden, Bruce; Pilaar Birch, Suzanne; Poelen, Jorrit; Ratnasingham, Sujeevan; Russell, Laura; Stricker, Craig A.; Uhen, Mark D.; Yarnes, Christopher T.; Hayden, Brian

    2017-01-01

    Stable isotopes encode and integrate the origin of matter; thus, their analysis offers tremendous potential to address questions across diverse scientific disciplines (1, 2). Indeed, the broad applicability of stable isotopes, coupled with advancements in high-throughput analysis, has created a scientific field that is growing exponentially and generating data at a rate paralleling the explosive rise of DNA sequencing and genomics (3). Centralized data repositories, such as GenBank, have become increasingly important as a means for archiving information, and “Big Data” analytics of these resources are revolutionizing science and everyday life.

  2. Use of combinatorial chemistry to speed drug discovery.

    PubMed

    Rádl, S

    1998-10-01

    IBC's International Conference on Integrating Combinatorial Chemistry into the Discovery Pipeline was held September 14-15, 1998. The program started with a pre-conference workshop on High-Throughput Compound Characterization and Purification. The agenda of the main conference was divided into sessions of Synthesis, Automation and Unique Chemistries; Integrating Combinatorial Chemistry, Medicinal Chemistry and Screening; Combinatorial Chemistry Applications for Drug Discovery; and Information and Data Management. This meeting was an excellent opportunity to see how big pharma, biotech and service companies are addressing the current bottlenecks in combinatorial chemistry to speed drug discovery. (c) 1998 Prous Science. All rights reserved.

  3. Precision medicine in the age of big data: The present and future role of large-scale unbiased sequencing in drug discovery and development.

    PubMed

    Vicini, P; Fields, O; Lai, E; Litwack, E D; Martin, A-M; Morgan, T M; Pacanowski, M A; Papaluca, M; Perez, O D; Ringel, M S; Robson, M; Sakul, H; Vockley, J; Zaks, T; Dolsten, M; Søgaard, M

    2016-02-01

    High-throughput molecular and functional profiling of patients is a key driver of precision medicine. DNA and RNA characterization has been enabled at unprecedented cost and scale through rapid, disruptive progress in sequencing technology, but challenges persist in data management and interpretation. We analyze the state of the art of large-scale unbiased sequencing (LUS) in drug discovery and development, including technology, application, ethical, regulatory, policy and commercial considerations, and discuss issues of LUS implementation in clinical and regulatory practice. © 2015 American Society for Clinical Pharmacology and Therapeutics.

  4. A computational approach to real-time image processing for serial time-encoded amplified microscopy

    NASA Astrophysics Data System (ADS)

    Oikawa, Minoru; Hiyama, Daisuke; Hirayama, Ryuji; Hasegawa, Satoki; Endo, Yutaka; Sugie, Takahisa; Tsumura, Norimichi; Kuroshima, Mai; Maki, Masanori; Okada, Genki; Lei, Cheng; Ozeki, Yasuyuki; Goda, Keisuke; Shimobaba, Tomoyoshi

    2016-03-01

    High-speed imaging is an indispensable technique, particularly for identifying or analyzing fast-moving objects. The serial time-encoded amplified microscopy (STEAM) technique was proposed to enable us to capture images with a frame rate 1,000 times faster than conventional methods, such as CCD (charge-coupled device) cameras, allow. Applying this high-speed STEAM imaging technique to a real-time system, such as flow cytometry for a cell-sorting system, requires successively processing a large number of captured images with high throughput in real time. We are now developing a high-speed flow cytometer system that includes a STEAM camera. In this paper, we describe our approach to processing these large amounts of image data in real time. We use an analog-to-digital converter with up to 7.0 Gsamples/s and 8-bit resolution to capture the output voltage signal that carries grayscale images from the STEAM camera, so the direct data output from the STEAM camera generates 7.0 Gbyte/s continuously. We use a field-programmable gate array (FPGA) device as a digital signal pre-processor for image reconstruction and for finding objects in a microfluidic channel at high data rates in real time. We also utilize graphics processing unit (GPU) devices to accelerate the identification of the reconstructed images. We built a prototype system, which includes a STEAM camera, an FPGA device and a GPU device, and evaluated its performance in real-time identification of small particles (beads), serving as stand-ins for biological cells, flowing through a microfluidic channel.
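
    As a hedged, software-only illustration of the object-finding step described above (the real system performs this on raw ADC streams in FPGA hardware), the sketch below thresholds a reconstructed frame and extracts particle centroids with scipy; the threshold and size cut are arbitrary.

        # Illustrative object finding on a reconstructed STEAM frame (software-only
        # stand-in for the FPGA stage described in the abstract).
        import numpy as np
        from scipy import ndimage

        def find_particles(frame, threshold, min_pixels=10):
            """Return (particle_count, centroids) for connected bright regions."""
            mask = frame > threshold
            labels, n = ndimage.label(mask)                    # connected components
            sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
            keep = [i + 1 for i, s in enumerate(sizes) if s >= min_pixels]
            centroids = ndimage.center_of_mass(frame, labels, keep)
            return len(keep), centroids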

  5. Deep-UV-sensitive high-frame-rate backside-illuminated CCD camera developments

    NASA Astrophysics Data System (ADS)

    Dawson, Robin M.; Andreas, Robert; Andrews, James T.; Bhaskaran, Mahalingham; Farkas, Robert; Furst, David; Gershstein, Sergey; Grygon, Mark S.; Levine, Peter A.; Meray, Grazyna M.; O'Neal, Michael; Perna, Steve N.; Proefrock, Donald; Reale, Michael; Soydan, Ramazan; Sudol, Thomas M.; Swain, Pradyumna K.; Tower, John R.; Zanzucchi, Pete

    2002-04-01

    New applications for ultraviolet imaging are emerging in the fields of drug discovery and industrial inspection. High throughput is critical for these applications, where millions of drug combinations are analyzed in secondary screenings or high-rate inspection of small feature sizes over large areas is required. Sarnoff demonstrated in 1990 a back-illuminated, 1024 × 1024, 18 μm pixel, split-frame-transfer device running at > 150 frames per second with high sensitivity in the visible spectrum. Sarnoff designed, fabricated and delivered cameras based on these CCDs and is now extending this technology to devices with higher pixel counts and higher frame rates through CCD architectural enhancements. The high sensitivities obtained in the visible spectrum are being pushed into the deep UV to support these new medical and industrial inspection applications. Sarnoff has achieved measured quantum efficiencies > 55% at 193 nm, rising to 65% at 300 nm, and remaining almost constant out to 750 nm. Optimization of the sensitivity is being pursued to tailor the quantum efficiency for particular wavelengths. Characteristics of these high-frame-rate CCDs and cameras will be described, and results will be presented demonstrating high UV sensitivity down to 150 nm.

  6. A simple optical tweezers for trapping polystyrene particles

    NASA Astrophysics Data System (ADS)

    Shiddiq, Minarni; Nasir, Zulfa; Yogasari, Dwiyana

    2013-09-01

    Optical tweezers are an optical trap. For decades, they have been an optical tool that can trap and manipulate particles ranging from very small ones, like DNA, to larger ones, like bacteria. The trapping force comes from the radiation pressure of laser light focused onto a group of particles. Optical tweezers have been used in many research areas such as atomic physics, medical physics, biophysics, and chemistry. Here, a simple optical tweezers setup has been constructed using a modified Leybold laboratory optical microscope. The ocular lens of the microscope has been removed for laser light and digital camera access. Light from a Coherent diode laser with wavelength λ = 830 nm and power 50 mW is sent through an oil-immersion objective lens with magnification 100× and NA 1.25 to a cell made from microscope slides containing polystyrene particles. Polystyrene particles with sizes of 3 μm and 10 μm are used. A CMOS Thorlabs camera type DCC1545M with USB interface and a Thorlabs 35 mm camera lens are connected to a desktop computer and used to monitor the trapping and measure the stiffness of the trap. The camera is accompanied by software that enables the user to capture and save images. The images are analyzed using ImageJ and a Scion macro. The polystyrene particles have been trapped successfully. The stiffness of the trap depends on the size of the particles and the power of the laser: the stiffness increases linearly with power and decreases as the particle size grows.
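
    The abstract does not state how the stiffness is computed from the camera images; one common choice for camera-based measurements is the equipartition method, sketched below under that assumption (the tracking step that produces bead positions is omitted).

        # Hedged sketch: trap stiffness from tracked bead positions via equipartition,
        # k = k_B * T / var(x). The abstract does not specify that this method was used.
        import numpy as np

        K_B = 1.380649e-23          # Boltzmann constant, J/K

        def trap_stiffness(positions_m, temperature_k=295.0):
            """positions_m: 1-D array of bead displacements along one axis, in metres."""
            x = np.asarray(positions_m, dtype=float)
            return K_B * temperature_k / np.var(x - x.mean())

        # toy usage: ~30 nm rms fluctuations give k of a few micronewtons per metre
        rng = np.random.default_rng(0)
        print(trap_stiffness(rng.normal(0.0, 30e-9, 10000)))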

  7. MEGARA: the new multi-object and integral field spectrograph for GTC

    NASA Astrophysics Data System (ADS)

    Carrasco, E.; Páez, G.; Izazaga-Pérez, R.; Gil de Paz, A.; Gallego, J.; Iglesias-Páramo, J.

    2017-07-01

    MEGARA is an optical integral-field unit and multi-object spectrograph for the 10.4 m Gran Telescopio Canarias. Both observational modes will provide identical spectral resolutions of R_FWHM ≈ 6,000, 12,000 and 18,700. The spectrograph is a collimator-camera system. The unique characteristics of MEGARA in terms of throughput and versatility make this instrument the most efficient tool to date to analyze astrophysical objects at intermediate spectral resolutions. The instrument is currently at the telescope for on-sky commissioning. Here we describe the as-built main characteristics of the instrument.

  8. The UCSC genome browser and associated tools

    PubMed Central

    Haussler, David; Kent, W. James

    2013-01-01

    The UCSC Genome Browser (http://genome.ucsc.edu) is a graphical viewer for genomic data now in its 13th year. Since the early days of the Human Genome Project, it has presented an integrated view of genomic data of many kinds. Now home to assemblies for 58 organisms, the Browser presents visualization of annotations mapped to genomic coordinates. The ability to juxtapose annotations of many types facilitates inquiry-driven data mining. Gene predictions, mRNA alignments, epigenomic data from the ENCODE project, conservation scores from vertebrate whole-genome alignments and variation data may be viewed at any scale from a single base to an entire chromosome. The Browser also includes many other widely used tools, including BLAT, which is useful for alignments from high-throughput sequencing experiments. Private data uploaded as Custom Tracks and Data Hubs in many formats may be displayed alongside the rich compendium of precomputed data in the UCSC database. The Table Browser is a full-featured graphical interface, which allows querying, filtering and intersection of data tables. The Saved Session feature allows users to store and share customized views, enhancing the utility of the system for organizing multiple trains of thought. Binary Alignment/Map (BAM), Variant Call Format and the Personal Genome Single Nucleotide Polymorphisms (SNPs) data formats are useful for visualizing a large sequencing experiment (whole-genome or whole-exome), where the differences between the data set and the reference assembly may be displayed graphically. Support for high-throughput sequencing extends to compact, indexed data formats, such as BAM, bigBed and bigWig, allowing rapid visualization of large datasets from RNA-seq and ChIP-seq experiments via local hosting. PMID:22908213

  9. The UCSC genome browser and associated tools.

    PubMed

    Kuhn, Robert M; Haussler, David; Kent, W James

    2013-03-01

    The UCSC Genome Browser (http://genome.ucsc.edu) is a graphical viewer for genomic data now in its 13th year. Since the early days of the Human Genome Project, it has presented an integrated view of genomic data of many kinds. Now home to assemblies for 58 organisms, the Browser presents visualization of annotations mapped to genomic coordinates. The ability to juxtapose annotations of many types facilitates inquiry-driven data mining. Gene predictions, mRNA alignments, epigenomic data from the ENCODE project, conservation scores from vertebrate whole-genome alignments and variation data may be viewed at any scale from a single base to an entire chromosome. The Browser also includes many other widely used tools, including BLAT, which is useful for alignments from high-throughput sequencing experiments. Private data uploaded as Custom Tracks and Data Hubs in many formats may be displayed alongside the rich compendium of precomputed data in the UCSC database. The Table Browser is a full-featured graphical interface, which allows querying, filtering and intersection of data tables. The Saved Session feature allows users to store and share customized views, enhancing the utility of the system for organizing multiple trains of thought. Binary Alignment/Map (BAM), Variant Call Format and the Personal Genome Single Nucleotide Polymorphisms (SNPs) data formats are useful for visualizing a large sequencing experiment (whole-genome or whole-exome), where the differences between the data set and the reference assembly may be displayed graphically. Support for high-throughput sequencing extends to compact, indexed data formats, such as BAM, bigBed and bigWig, allowing rapid visualization of large datasets from RNA-seq and ChIP-seq experiments via local hosting.
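
    As a hedged aside on the indexed formats mentioned above, the snippet below reads summary and per-base signal from a bigWig file with the pyBigWig package (a third-party library, not a UCSC tool); the file name and coordinates are hypothetical.

        # Hedged sketch: reading signal from a bigWig track with pyBigWig (one of the
        # indexed formats the Browser supports). "coverage.bw" is a hypothetical file.
        import pyBigWig

        bw = pyBigWig.open("coverage.bw")
        print(bw.chroms())                                   # {chrom: length, ...}
        mean_signal = bw.stats("chr1", 1_000_000, 1_010_000, type="mean")[0]
        per_base = bw.values("chr1", 1_000_000, 1_000_100)   # list of per-base values
        bw.close()
        print(mean_signal, per_base[:5])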

  10. UKIRT observer's manual

    NASA Astrophysics Data System (ADS)

    Davies, J. K.

    1991-04-01

    The United Kingdom 3.8 m Infrared Telescope (UKIRT) located at the summit of Mauna Kea on the big island of Hawaii is described. Summit sky conditions are photometric more than half the time and spectroscopic more than three quarters of the time. Photometry through all atmospheric windows in the 1 to 30 micrometer range and spectroscopy in the 1 to 5 micrometer range are possible. The telescope is equipped with a 1 to 5 micrometer infrared camera housing a 58 by 62 element detector array. Other individual instruments and aspects of operation at the telescope are described.

  11. Dynamic modal characterization of musical instruments using digital holography

    NASA Astrophysics Data System (ADS)

    Demoli, Nazif; Demoli, Ivan

    2005-06-01

    This study shows that a dynamic modal characterization of musical instruments with membranes can be carried out using a low-cost device, and that the very informative results obtained can be presented as a movie. The proposed device is based on a digital holography technique using the quasi-Fourier configuration and the time-average principle. Its practical realization with a commercial digital camera and large plane mirrors allows relatively simple analysis of large vibrating surfaces. The experimental measurements given for a percussion instrument are supported by the mathematical formulation of the problem.

  12. Comparison between layers stacks of 67P/CG comet and spectrophotometric variability obtained from OSIRIS data

    NASA Astrophysics Data System (ADS)

    Ferrari, S.; Penasa, L.; La Forgia, F.; Massironi, M.; Naletto, G.; Lazzarin, M.; Fornasier, S.; Barucci, M. A.; Lucchetti, A.; Pajola, M.; Frattin, E.; Bertini, I.; Ferri, F.; Cremonese, G.

    2017-09-01

    The Rosetta/OSIRIS cameras unveiled the layered nature of comet 67P/Churyumov-Gerasimenko, suggesting that the comet's bilobate shape results from the low-velocity merging of two independent onion-like objects. Several physiographic regions in the southern hemisphere of the big lobe show stacks of layers forming high scarps, terraces and mesas. A spectrophotometric analysis of OSIRIS images based on multispectral data classifications was conducted in order to identify possible morphological, textural and/or compositional characteristics that make it possible to distinguish regional stacks of layers.

  13. Reduced dimensionality (3,2)D NMR experiments and their automated analysis: implications to high-throughput structural studies on proteins.

    PubMed

    Reddy, Jithender G; Kumar, Dinesh; Hosur, Ramakrishna V

    2015-02-01

    Protein NMR spectroscopy has expanded dramatically over the last decade into a powerful tool for the study of their structure, dynamics, and interactions. The primary requirement for all such investigations is sequence-specific resonance assignment. The demand now is to obtain this information as rapidly as possible and in all types of protein systems, stable/unstable, soluble/insoluble, small/big, structured/unstructured, and so on. In this context, we introduce here two reduced dimensionality experiments – (3,2)D-hNCOcanH and (3,2)D-hNcoCAnH – which enhance the previously described 2D NMR-based assignment methods quite significantly. Both the experiments can be recorded in just about 2-3 h each and hence would be of immense value for high-throughput structural proteomics and drug discovery research. The applicability of the method has been demonstrated using alpha-helical bovine apo calbindin-D9k P43M mutant (75 aa) protein. Automated assignment of this data using AUTOBA has been presented, which enhances the utility of these experiments. The backbone resonance assignments so derived are utilized to estimate secondary structures and the backbone fold using Web-based algorithms. Taken together, we believe that the method and the protocol proposed here can be used for routine high-throughput structural studies of proteins. Copyright © 2014 John Wiley & Sons, Ltd.

  14. Recent advances in high-throughput QCL-based infrared microspectral imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Rowlette, Jeremy A.; Fotheringham, Edeline; Nichols, David; Weida, Miles J.; Kane, Justin; Priest, Allen; Arnone, David B.; Bird, Benjamin; Chapman, William B.; Caffey, David B.; Larson, Paul; Day, Timothy

    2017-02-01

    The field of infrared spectral imaging and microscopy is advancing rapidly due in large measure to the recent commercialization of the first high-throughput, high-spatial-definition quantum cascade laser (QCL) microscope. Having speed, resolution and noise performance advantages while also eliminating the need for cryogenic cooling, its introduction has established a clear path to translating the well-established diagnostic capability of infrared spectroscopy into clinical and pre-clinical histology, cytology and hematology workflows. Demand for even higher throughput while maintaining high-spectral fidelity and low-noise performance continues to drive innovation in QCL-based spectral imaging instrumentation. In this talk, we will present for the first time, recent technological advances in tunable QCL photonics which have led to an additional 10X enhancement in spectral image data collection speed while preserving the high spectral fidelity and SNR exhibited by the first generation of QCL microscopes. This new approach continues to leverage the benefits of uncooled microbolometer focal plane array cameras, which we find to be essential for ensuring both reproducibility of data across instruments and achieving the high-reliability needed in clinical applications. We will discuss the physics underlying these technological advancements as well as the new biomedical applications these advancements are enabling, including automated whole-slide infrared chemical imaging on clinically relevant timescales.

  15. Xi-cam: Flexible High Throughput Data Processing for GISAXS

    NASA Astrophysics Data System (ADS)

    Pandolfi, Ronald; Kumar, Dinesh; Venkatakrishnan, Singanallur; Sarje, Abinav; Krishnan, Hari; Pellouchoud, Lenson; Ren, Fang; Fournier, Amanda; Jiang, Zhang; Tassone, Christopher; Mehta, Apurva; Sethian, James; Hexemer, Alexander

    With increasing capabilities and data demand for GISAXS beamlines, supporting software is under development to handle larger data rates, volumes, and processing needs. We aim to provide a flexible and extensible approach to GISAXS data treatment as a solution to these rising needs. Xi-cam is the CAMERA platform for data management, analysis, and visualization. The core of Xi-cam is an extensible plugin-based GUI platform which provides users an interactive interface to processing algorithms. Plugins are available for SAXS/GISAXS data and data series visualization, as well as forward modeling and simulation through HipGISAXS. With Xi-cam's advanced mode, data processing steps are designed as a graph-based workflow, which can be executed locally or remotely. Remote execution utilizes HPC or de-localized resources, allowing for effective reduction of high-throughput data. Xi-cam is open-source and cross-platform. The processing algorithms in Xi-cam include parallel cpu and gpu processing optimizations, also taking advantage of external processing packages such as pyFAI. Xi-cam is available for download online.
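
    As a hedged illustration of the kind of reduction step a Xi-cam workflow node might wrap (this is plain pyFAI usage, not Xi-cam's plugin API), the sketch below performs a one-dimensional azimuthal integration; the calibration and image file names are hypothetical.

        # Hedged sketch (not Xi-cam's plugin API): the kind of reduction step a
        # workflow node might wrap, using pyFAI for azimuthal integration.
        # "geometry.poni" and "scan.tif" are hypothetical input files.
        import fabio
        import pyFAI

        ai = pyFAI.load("geometry.poni")            # detector/beam geometry calibration
        image = fabio.open("scan.tif").data         # 2-D detector frame
        q, intensity = ai.integrate1d(image, 1000, unit="q_A^-1")   # 1-D I(q) profile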

  16. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods

    PubMed Central

    Wells, Darren M.; French, Andrew P.; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J.; Pridmore, Tony P.

    2012-01-01

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana. PMID:22527394

  17. A space- and time-resolved single photon counting detector for fluorescence microscopy and spectroscopy

    PubMed Central

    Michalet, X.; Siegmund, O.H.W.; Vallerga, J.V.; Jelinsky, P.; Millaud, J.E.; Weiss, S.

    2017-01-01

    We have recently developed a wide-field photon-counting detector having high temporal and high spatial resolution and capable of high throughput (the H33D detector). Its design is based on a 25 mm diameter multi-alkali photocathode producing one photoelectron per detected photon, which is then multiplied up to 10^7 times by a 3-microchannel-plate stack. The resulting electron cloud is proximity-focused on a cross delay line anode, which allows determining the incident photon position with high accuracy. The imaging and fluorescence lifetime measurement performances of the H33D detector installed on a standard epifluorescence microscope will be presented. We compare them to those of standard single-molecule detectors such as single-photon avalanche photodiodes (SPADs) or electron-multiplying cameras using model samples (fluorescent beads, quantum dots and live cells). Finally, we discuss the design and applications of future generations of H33D detectors for single-molecule imaging and high-throughput study of biomolecular interactions. PMID:29479130

  18. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods.

    PubMed

    Wells, Darren M; French, Andrew P; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J; Pridmore, Tony P

    2012-06-05

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana.

  19. X-ray intensity and source size characterizations for the 25 kV upgraded Manson source at Sandia National Laboratories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loisel, G., E-mail: gploise@sandia.gov; Lake, P.; Gard, P.

    2016-11-15

    At Sandia National Laboratories, the x-ray generator Manson source model 5 was upgraded from 10 to 25 kV. The purpose of the upgrade is to drive higher characteristic photon energies with higher throughput. In this work we present characterization studies of the source size and the x-ray intensity when varying the source voltage for a series of K-, L-, and M-shell lines emitted from the Al, Y, and Au elements composing the anode. We used a 2-pinhole camera to measure the source size and an energy dispersive detector to monitor the spectral content and intensity of the x-ray source. As the voltage increases, the source size is significantly reduced and line intensity is increased for the three materials. We can take advantage of the smaller source size and higher source throughput to effectively calibrate the suite of Z Pulsed Power Facility crystal spectrometers.

  20. Controlling high-throughput manufacturing at the nano-scale

    NASA Astrophysics Data System (ADS)

    Cooper, Khershed P.

    2013-09-01

    Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications (materials, energy, electronics, photonics, biomedical, etc.). Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12 in. wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high-quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes, guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.

  1. Big Data and Predictive Analytics: Applications in the Care of Children.

    PubMed

    Suresh, Srinivasan

    2016-04-01

    Emerging changes in the United States' healthcare delivery model have led to renewed interest in data-driven methods for managing quality of care. Analytics (Data plus Information) plays a key role in predictive risk assessment, clinical decision support, and various patient throughput measures. This article reviews the application of a pediatric risk score, which is integrated into our hospital's electronic medical record, and provides an early warning sign for clinical deterioration. Dashboards that are a part of disease management systems, are a vital tool in peer benchmarking, and can help in reducing unnecessary variations in care. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. High-Throughput Identification of Antimicrobial Peptides from Amphibious Mudskippers

    PubMed Central

    You, Xinxin; Bian, Chao; Chen, Shixi; Lv, Zhao; Qiu, Limei; Shi, Qiong

    2017-01-01

    Widespread existence of antimicrobial peptides (AMPs) has been reported in various animals with comprehensive biological activities, which is consistent with the important roles of AMPs as the first line of host defense system. However, no big-data-based analysis on AMPs from any fish species is available. In this study, we identified 507 AMP transcripts on the basis of our previously reported genomes and transcriptomes of two representative amphibious mudskippers, Boleophthalmus pectinirostris (BP) and Periophthalmus magnuspinnatus (PM). The former is predominantly aquatic with less time out of water, while the latter is primarily terrestrial with extended periods of time on land. Within these identified AMPs, 449 sequences are novel; 15 were reported in BP previously; 48 are identically overlapped between BP and PM; 94 were validated by mass spectrometry. Moreover, most AMPs presented differential tissue transcription patterns in the two mudskippers. Interestingly, we discovered two AMPs, hemoglobin β1 and amylin, with high inhibitions on Micrococcus luteus. In conclusion, our high-throughput screening strategy based on genomic and transcriptomic data opens an efficient pathway to discover new antimicrobial peptides for ongoing development of marine drugs. PMID:29165344

  3. High-Throughput Identification of Antimicrobial Peptides from Amphibious Mudskippers.

    PubMed

    Yi, Yunhai; You, Xinxin; Bian, Chao; Chen, Shixi; Lv, Zhao; Qiu, Limei; Shi, Qiong

    2017-11-22

    Widespread existence of antimicrobial peptides (AMPs) has been reported in various animals with comprehensive biological activities, which is consistent with the important roles of AMPs as the first line of host defense system. However, no big-data-based analysis on AMPs from any fish species is available. In this study, we identified 507 AMP transcripts on the basis of our previously reported genomes and transcriptomes of two representative amphibious mudskippers, Boleophthalmus pectinirostris (BP) and Periophthalmus magnuspinnatus (PM). The former is predominantly aquatic with less time out of water, while the latter is primarily terrestrial with extended periods of time on land. Within these identified AMPs, 449 sequences are novel; 15 were reported in BP previously; 48 are identically overlapped between BP and PM; 94 were validated by mass spectrometry. Moreover, most AMPs presented differential tissue transcription patterns in the two mudskippers. Interestingly, we discovered two AMPs, hemoglobin β1 and amylin, with high inhibitions on Micrococcus luteus . In conclusion, our high-throughput screening strategy based on genomic and transcriptomic data opens an efficient pathway to discover new antimicrobial peptides for ongoing development of marine drugs.

  4. Individual Biomarkers Using Molecular Personalized Medicine Approaches.

    PubMed

    Zenner, Hans P

    2017-01-01

    Molecular personalized medicine tries to generate individual predictive biomarkers to assist doctors in their decision making. These are thought to improve the efficacy and lower the toxicity of a treatment. The molecular basis of the desired high-precision prediction is modern "omex" technologies providing high-throughput bioanalytical methods. These include genomics and epigenomics, transcriptomics, proteomics, metabolomics, microbiomics, imaging, and functional analyses. In most cases, producing big data also requires a complex biomathematical analysis. Using molecular personalized medicine, the conventional physician's check of biomarker results may no longer be sufficient. By contrast, the physician may need to cooperate with the biomathematician to achieve the desired prediction on the basis of the analysis of individual big data typically produced by omex technologies. Identification of individual biomarkers using molecular personalized medicine approaches is thought to allow a decision-making for the precise use of a targeted therapy, selecting the successful therapeutic tool from a panel of preexisting drugs or medical products. This should avoid the treatment of nonresponders and responders that produces intolerable unwanted effects. © 2017 S. Karger AG, Basel.

  5. “Big Data” for breast cancer: where to look and what you will find

    PubMed Central

    Clare, Susan E; Shaw, Pamela L

    2016-01-01

    Accessing the massive amount of breast cancer data that are currently publicly available may seem daunting to the brand-new graduate student embarking on his/her first project or even to the seasoned lab leader who may wish to explore a new avenue of investigation. In this review, we provide an overview of data resources focusing on high-throughput data and on cancer-related data resources. Although not intended as an exhaustive list, the information included in this review will provide a jumping-off point with descriptions of and links to the various data resources of interest. The review is divided into six sections: (1) compendia of data resources; (2) biomolecular repository “Hubs”; (3) a list of cancer-related data resources, which provides information on contents of the resource and whether the resource enables upload and analysis of investigator-provided data; (4) a list of seminal publications containing specific breast cancer data, e.g., publications from METABRIC, Sanger, TCGA; (5) a list of journals focused on data science that include cancer-related “Big Data”; and (6) miscellaneous resources. PMID:28164152

  6. Efficient data management tools for the heterogeneous big data warehouse

    NASA Astrophysics Data System (ADS)

    Alekseev, A. A.; Osipova, V. V.; Ivanov, M. A.; Klimentov, A.; Grigorieva, N. V.; Nalamwar, H. S.

    2016-09-01

    Traditional RDBMSs have long served well for normalized data structures, but the technology is not optimal for data processing and analysis in data-intensive fields like social networks, the oil and gas industry, experiments at the Large Hadron Collider, etc. Several challenges have been raised recently concerning the scalability of data-warehouse-like workloads against transactional schemas, in particular for the analysis of archived data or the aggregation of data for summary and accounting purposes. The paper evaluates new database technologies like HBase, Cassandra, and MongoDB, commonly referred to as NoSQL databases, for handling messy, varied and large amounts of data. The evaluation considers the performance, throughput and scalability of the above technologies for several scientific and industrial use cases. This paper outlines the technologies and architectures needed for processing Big Data, as well as the description of the back-end application that implements data migration from an RDBMS to a NoSQL data warehouse, the NoSQL database organization, and how it could be useful for further data analytics.
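
    As a hedged sketch of the migration pattern the back-end application implements (not the authors' code), the snippet below copies rows from an RDBMS table into a MongoDB collection in batches; the table, database, and collection names are hypothetical.

        # Minimal illustrative migration from an RDBMS table to MongoDB (not the
        # authors' application). Table/collection names are hypothetical.
        import sqlite3
        from pymongo import MongoClient

        def migrate(sqlite_path, mongo_uri="mongodb://localhost:27017", batch=1000):
            src = sqlite3.connect(sqlite_path)
            src.row_factory = sqlite3.Row                      # rows behave like dicts
            dst = MongoClient(mongo_uri)["warehouse"]["measurements"]
            cur = src.execute("SELECT * FROM measurements")    # hypothetical table
            while True:
                rows = cur.fetchmany(batch)
                if not rows:
                    break
                dst.insert_many([dict(r) for r in rows])       # denormalised documents
            src.close()

        # migrate("experiment.db")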

  7. A Near-Optimal Distributed QoS Constrained Routing Algorithm for Multichannel Wireless Sensor Networks

    PubMed Central

    Lin, Frank Yeong-Sung; Hsiao, Chiu-Han; Yen, Hong-Hsu; Hsieh, Yu-Jen

    2013-01-01

    One of the important applications in Wireless Sensor Networks (WSNs) is video surveillance that includes the tasks of video data processing and transmission. Processing and transmission of image and video data in WSNs has attracted a lot of attention in recent years. This is known as Wireless Visual Sensor Networks (WVSNs). WVSNs are distributed intelligent systems for collecting image or video data with unique performance, complexity, and quality of service challenges. WVSNs consist of a large number of battery-powered and resource constrained camera nodes. End-to-end delay is a very important Quality of Service (QoS) metric for video surveillance application in WVSNs. How to meet the stringent delay QoS in resource constrained WVSNs is a challenging issue that requires novel distributed and collaborative routing strategies. This paper proposes a Near-Optimal Distributed QoS Constrained (NODQC) routing algorithm to achieve an end-to-end route with lower delay and higher throughput. A Lagrangian Relaxation (LR)-based routing metric that considers the “system perspective” and “user perspective” is proposed to determine the near-optimal routing paths that satisfy end-to-end delay constraints with high system throughput. The empirical results show that the NODQC routing algorithm outperforms others in terms of higher system throughput with lower average end-to-end delay and delay jitter. In this paper, for the first time, the algorithm shows how to meet the delay QoS and at the same time how to achieve higher system throughput in stringently resource constrained WVSNs.
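
    The NODQC algorithm itself is not reproduced in the abstract; as a hedged toy sketch of the underlying Lagrangian Relaxation idea only, the code below folds the end-to-end delay constraint into the edge weight with a multiplier lambda and re-runs a shortest-path search while the multiplier grows until the delay bound is met. The graph encoding, the cost metric, and the multiplier update rule are all simplifying assumptions.

        # Toy sketch of Lagrangian-relaxed delay-constrained routing (illustrative of
        # the general LR idea only, not the NODQC algorithm itself). Each edge has a
        # cost (inverse-throughput proxy) and a delay; Dijkstra runs on
        # cost + lam * delay, and lam grows until the end-to-end delay bound is met.
        import heapq

        def dijkstra(graph, src, dst, lam):
            """graph: {u: [(v, cost, delay), ...]}. Returns (cost, delay, path) or None."""
            best = {src: (0.0, 0.0, [src])}
            heap = [(0.0, src)]
            seen = set()
            while heap:
                _, u = heapq.heappop(heap)
                if u in seen:
                    continue
                seen.add(u)
                cu, du, pu = best[u]
                for v, c, d in graph.get(u, []):
                    if v in seen:
                        continue
                    w = (cu + c) + lam * (du + d)
                    if v not in best or w < best[v][0] + lam * best[v][1]:
                        best[v] = (cu + c, du + d, pu + [v])
                        heapq.heappush(heap, (w, v))
            return best.get(dst)

        def delay_constrained_path(graph, src, dst, max_delay, steps=20):
            lam = 0.0
            for _ in range(steps):
                found = dijkstra(graph, src, dst, lam)
                if found is None or found[1] <= max_delay:
                    return found
                lam = lam * 2 + 1.0          # crude multiplier update (illustrative)
            return found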

  8. Assessing Morphological and Physiological Properties of Forest Species Using High Throughput Plant Phenotyping and Imaging Techniques

    NASA Astrophysics Data System (ADS)

    Mazis, A.; Hiller, J.; Morgan, P.; Awada, T.; Stoerger, V.

    2017-12-01

    High-throughput plant phenotyping is increasingly being used to assess morphological and biophysical traits of economically important crops in agriculture. In this study, the potential application of this technique in natural resources management was assessed through the characterization of woody plant regeneration, establishment, growth, and responses to water and nutrient manipulations. Two woody species were selected for this study, Quercus prinoides and Quercus bicolor. Seeds were collected from trees growing at the edge of their natural distribution in Nebraska and Missouri, USA. Seeds were germinated in the greenhouse and transferred to the Nebraska Innovation Campus LemnaTec 3D high-throughput facility at the University of Nebraska-Lincoln. Seedlings subjected to water and N manipulations were imaged twice or three times a week using four cameras (visible, fluorescence, infrared and hyperspectral) throughout the growing season. Traditional leaf- to plant-level ecophysiological measurements were concurrently acquired to assess the relationship between the two techniques. These include gas exchange (LI-6400 and LI-6800, LI-COR Inc., Lincoln, NE), chlorophyll content, optical characteristics (Ocean Optics USB200), water and osmotic potentials, leaf area and weight, and carbon isotope ratio. In the presentation, we highlight results on the potential use of high-throughput plant phenotyping techniques to assess the morphology and physiology of woody species, including responses to water availability and nutrient manipulation, and its broader application under field conditions and in natural resources management. We also explore the capabilities imaging provides for modeling plant physiological and morphological growth and how it can complement current techniques.

  9. Integrating Microscopic Analysis into Existing Quality Assurance Processes

    NASA Astrophysics Data System (ADS)

    Frühberger, Peter; Stephan, Thomas; Beyerer, Jürgen

    When technical goods, like mainboards and other electronic components, are produced, quality assurance (QA) is very important. To achieve this goal, different optical microscopes can be used to analyze a variety of specimens and to gain comprehensive information by combining the acquired sensor data. In many industrial processes, cameras are used to examine these technical goods. Those cameras can analyze complete boards at once and offer a high level of accuracy when used for completeness checks. When small defects, e.g. soldered points, need to be examined in detail, those wide-area cameras are limited. Microscopes with large magnification need to be used to analyze such critical areas. But microscopes alone cannot fulfill this task within a limited time schedule, because microscopic analysis of complete motherboards of a certain size is time-demanding. Microscopes are limited concerning their depth of field and depth of focus, which is why additional components like XY moving tables need to be used to examine the complete surface. Yet today's industrial production quality standards require 100% inspection of the soldered components within a given time schedule. This level of quality, while keeping inspection time low, can only be achieved by combining multiple inspection devices in an optimized manner. This paper presents results and methods for combining industrial cameras with microscopy using a classification-based approach, with the aim of keeping already deployed QA processes in place while extending them to increase the quality level of the produced technical goods and maintain high throughput.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fahim, Farah; Deptuch, Grzegorz; Shenai, Alpana

    The Vertically Integrated Photon Imaging Chip - Large (VIPIC-L) is a large-area, small-pixel (65 μm), 3D-integrated, photon-counting ASIC with zero-suppressed or full-frame dead-time-less data readout. It features a data throughput of 14.4 Gbps per chip with a full-frame readout speed of 56 kframes/s in the imaging mode. VIPIC-L contains a 192 × 192 pixel array; the total size of the chip is 1.248 cm × 1.248 cm with only a 5 μm periphery, and it contains about 120M transistors. A 1.3M-pixel camera module will be developed by arranging a 6 × 6 array of 3D VIPIC-L's bonded to a large-area silicon sensor on the analog side and to a readout board on the digital side. The readout board hosts a bank of FPGAs, one per VIPIC-L, to allow processing of up to 0.7 Tbps of raw data produced by the camera.

  11. Astronomical Polarimetry with the RIT Polarization Imaging Camera

    NASA Astrophysics Data System (ADS)

    Vorobiev, Dmitry V.; Ninkov, Zoran; Brock, Neal

    2018-06-01

    In the last decade, imaging polarimeters based on micropolarizer arrays have been developed for use in terrestrial remote sensing and metrology applications. Micropolarizer-based sensors are dramatically smaller and more mechanically robust than other polarimeters with similar spectral response and snapshot capability. To determine the suitability of these new polarimeters for astronomical applications, we developed the RIT Polarization Imaging Camera to investigate the performance of these devices, with special attention to the low signal-to-noise regime. We characterized the device performance in the lab by determining the relative throughput, efficiency, and orientation of every pixel as a function of wavelength. Using the resulting pixel response model, we developed demodulation procedures for aperture photometry and imaging polarimetry observing modes. We found that, using the current calibration, RITPIC is capable of detecting polarization signals as small as ∼0.3%. The relative ease of data collection, calibration, and analysis provided by these sensors suggests that they may become an important tool for a number of astronomical targets.
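
    The demodulation math is not spelled out in the abstract; as a hedged illustration, the sketch below applies the conventional linear-Stokes reduction for a micropolarizer array, assuming a 2 × 2 superpixel layout of 0/45/90/135 degree analyzers and ignoring the per-pixel throughput, efficiency, and orientation calibration the authors describe.

        # Hedged sketch: linear-Stokes demodulation for a micropolarizer-array image,
        # assuming a 2x2 superpixel layout of 0/45/90/135 degree analyzers (the actual
        # RITPIC layout and per-pixel calibration are not reproduced here).
        import numpy as np

        def demodulate(frame):
            """frame: 2-D array with even dimensions. Returns (S0, S1, S2, dolp, aolp)."""
            i0   = frame[0::2, 0::2].astype(float)
            i45  = frame[0::2, 1::2].astype(float)
            i90  = frame[1::2, 1::2].astype(float)
            i135 = frame[1::2, 0::2].astype(float)
            s0 = 0.5 * (i0 + i45 + i90 + i135)
            s1 = i0 - i90
            s2 = i45 - i135
            dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)   # degree of linear pol.
            aolp = 0.5 * np.arctan2(s2, s1)                          # angle of linear pol.
            return s0, s1, s2, dolp, aolp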

  12. WFC3: In-Flight Performance Highlights

    NASA Astrophysics Data System (ADS)

    Kimble, Randy A.; MacKenty, J. W.; O'Connell, R. W.; Townsend, J. A.; WFC3 Team

    2010-01-01

    Wide Field Camera 3 (WFC3), a powerful new imager for the Hubble Space Telescope (HST), was successfully installed in the telescope in May 2009 during the first dramatic spacewalk of space shuttle flight STS-125, also known as HST Servicing Mission 4. This new camera offers unique observing capabilities in two channels spanning a broad wavelength range from the near ultraviolet to the near infrared (200-1000nm in the UV/Visible [UVIS] channel; 850-1700nm in the IR channel). After an initial outgassing period, WFC3 was cooled to its observing configuration in June. In the following months, a highly successful Servicing Mission Observatory Verification (SMOV4) program was executed, which has confirmed the exciting scientific potential of the instrument. Detailed performance results from the SMOV4 program are presented in a number of papers in this session. In this paper, we highlight some top-level performance assessments (throughput, limiting magnitudes, survey speeds) for WFC3, which is now actively engaged in the execution of forefront astronomical observing programs.

  13. Optical design of the PEPSI high-resolution spectrograph at LBT

    NASA Astrophysics Data System (ADS)

    Andersen, Michael I.; Spano, Paolo; Woche, Manfred; Strassmeier, Klaus G.; Beckert, Erik

    2004-09-01

    PEPSI is a high-resolution, fiber-fed echelle spectrograph with polarimetric capabilities for the LBT. In order to reach a maximum resolution of R = 120,000 in polarimetric mode and 300,000 in integral-light mode with high efficiency in the spectral range 390-1050 nm, we designed a white-pupil configuration with Maksutov collimators. Light is dispersed by an R4 31.6 lines/mm monolithic echelle grating mosaic and split into two arms through dichroics. The two arms, optimized for the spectral ranges 390-550 nm and 550-1050 nm, respectively, consist of Maksutov transfer collimators, VPH-grism cross dispersers, optimized dioptric cameras and 7.5K x 7.5K CCDs with 8 μm pixels. Fibers of different core sizes coupled to different image-slicers allow a high throughput, comparable to that of direct-feed instruments. The optical configuration with only spherical and cylindrical surfaces, except for one aspherical surface in each camera, reduces costs and guarantees high optical quality. PEPSI is under construction at AIP with first light expected in 2006.

  14. The "Missing Compounds" affair in functionality-driven material discovery

    NASA Astrophysics Data System (ADS)

    Zunger, Alex

    2014-03-01

    In the paradigm of "data-driven discovery," underlying one of the leading streams of the Materials Genome Initiative (MGI), one attempts to compute, in high-throughput style, as many of the properties as possible of the N (about 10⁵-10⁶) compounds listed in databases of previously known compounds. One then inspects the ensuing Big Data, searching for useful trends. The alternative and complementary paradigm of "functionality-directed search and optimization" used here searches instead for the n (n ≪ N) configurations and compositions that have the desired value of the target functionality. Examples include the use of genetic and other search methods that optimize the structure or identity of atoms on lattice sites, using atomistic electronic-structure (such as first-principles) approaches in search of a given electronic property. This addresses a few of the bottlenecks that have faced the alternative data-driven/high-throughput/Big Data philosophy: (i) when the configuration space is theoretically of infinite size, building a complete database as in data-driven discovery is impossible, yet searching for the optimum functionality is still a well-posed problem; (ii) the configuration space that we explore might include artificially grown, kinetically stabilized systems (such as 2D layer stacks, superlattices, colloidal nanostructures, and fullerenes) that are not listed in compound databases (used by data-driven approaches); (iii) a large fraction of chemically plausible compounds have not been experimentally synthesized, so in the data-driven approach these are often skipped. In our approach we search explicitly for such "Missing Compounds." It is likely that many interesting material properties will be found in cases (i)-(iii) that elude high-throughput searches based on databases encapsulating existing knowledge. I will illustrate (a) functionality-driven discovery of topological insulators and valley-split quantum-computer semiconductors, as well as (b) the use of "first-principles thermodynamics" to discern which of the previously "missing compounds" should, in fact, exist and in which structure. Synthesis efforts by the Poeppelmeier group at NU realized 20 never-before-made half-Heusler compounds out of the 20 predicted ones, in our predicted space groups. This type of theory-led experimental search for designed materials with target functionalities may shorten the current process of discovery of interesting functional materials. Supported by DOE, Office of Science, Energy Frontier Research Center for Inverse Design.
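    As a purely illustrative sketch of the functionality-directed paradigm, the snippet below runs a small genetic search over occupations of lattice sites toward a target property value; the property function is a stand-in placeholder rather than a first-principles calculation, and nothing here is taken from the talk itself.

```python
import random

# Toy inverse-design search: evolve binary occupations of N lattice sites so
# that a (placeholder) property lands on a target value.
N_SITES, POP, GENS, TARGET = 16, 40, 60, 3.5

def property_of(config):
    # Stand-in "functionality"; in practice this would be an atomistic
    # electronic-structure calculation.
    return sum(config) * 0.5 - 0.1 * sum(a * b for a, b in zip(config, config[1:]))

def fitness(config):
    return -abs(property_of(config) - TARGET)   # closer to the target is better

def mutate(config, rate=0.05):
    return [1 - s if random.random() < rate else s for s in config]

def crossover(a, b):
    cut = random.randrange(1, N_SITES)
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(N_SITES)] for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print(best, property_of(best))
```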

  15. Super Resolution Algorithm for CCTVs

    NASA Astrophysics Data System (ADS)

    Gohshi, Seiichi

    2015-03-01

    Recently, security cameras and CCTV systems have become an important part of our daily lives. The rising demand for such systems has created business opportunities in this field, especially in big cities. Analogue CCTV systems are being replaced by digital systems, and HDTV CCTV has become quite common. HDTV CCTV can achieve images with high contrast and decent quality if they are captured in daylight. However, an image captured at night does not always have sufficient contrast and resolution because of poor lighting conditions. CCTV systems depend on infrared light at night to compensate for insufficient lighting, thereby producing monochrome images and videos. However, these images and videos do not have high contrast and are blurred. We propose a nonlinear signal processing technique that significantly improves the visual and image quality (contrast and resolution) of low-contrast infrared images. The proposed method enables the use of infrared cameras for applications such as night shots and other poor-lighting environments.
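    The abstract does not give the algorithm itself, but the kind of nonlinear processing it refers to can be illustrated with a generic sketch (gamma stretching followed by unsharp masking); this is explicitly not the authors' method, only an example of nonlinear contrast and edge enhancement for low-contrast infrared frames.

```python
import numpy as np

def enhance_ir_frame(frame, gamma=0.6, amount=1.5, radius=3):
    """Generic nonlinear enhancement of an 8-bit grayscale infrared frame:
    a gamma tone stretch followed by unsharp masking.  Illustrative only,
    not the algorithm of the paper above."""
    x = frame.astype(float) / 255.0
    x = x ** gamma                                   # nonlinear tone stretch

    # cheap separable box blur as the low-pass image for unsharp masking
    k = np.ones(2 * radius + 1) / (2 * radius + 1)
    blur = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, x)
    blur = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, blur)

    sharpened = np.clip(x + amount * (x - blur), 0.0, 1.0)
    return (sharpened * 255).astype(np.uint8)
```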

  16. Hubble illuminates the universe

    NASA Technical Reports Server (NTRS)

    Maran, Stephen P.

    1992-01-01

    Latest observations by the Hubble Space Telescope (HST) are described, including the first 'parallel' observations (on January 6, 1992) by two of Hubble's instruments of two different targets at the same time. On this date, the faint-object camera made images of the quasar 3C 273 in Virgo, while the wide-field and planetary camera recorded an adjacent field. The new HST images include those of the nucleus and jet of M87, the giant elliptical galaxy at the heart of the Virgo cluster, where there appears to be a black hole of 2.6 billion solar masses, and an image of N66, a planetary nebula in the LMC. Other images yield evidence of 'blue stragglers' in the core of 47 Tucanae, a globular cluster about 16,000 light-years from Earth. The Goddard spectrograph recorded the spectrum of the star Capella at very high wavelength resolution, which made it possible to measure deuterium from the Big Bang.

  17. An experiment in big data: storage, querying and visualisation of data taken from the Liverpool Telescope's wide field cameras

    NASA Astrophysics Data System (ADS)

    Barnsley, R. M.; Steele, Iain A.; Smith, R. J.; Mawson, Neil R.

    2014-07-01

    The Small Telescopes Installed at the Liverpool Telescope (STILT) project has been in operation since March 2009, collecting data with three wide field unfiltered cameras: SkycamA, SkycamT and SkycamZ. To process the data, a pipeline was developed to automate source extraction, catalogue cross-matching, photometric calibration and database storage. In this paper, modifications and further developments to this pipeline will be discussed, including a complete refactor of the pipeline's codebase into Python, migration of the back-end database technology from MySQL to PostgreSQL, and changing the catalogue used for source cross-matching from USNO-B1 to APASS. In addition to this, details will be given relating to the development of a preliminary front-end to the source extracted database which will allow a user to perform common queries such as cone searches and light curve comparisons of catalogue and non-catalogue matched objects. Some next steps and future ideas for the project will also be presented.
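    A cone search of the kind mentioned above reduces to filtering sources by great-circle distance from a target position. The sketch below assumes a hypothetical in-memory list of source dictionaries with 'ra' and 'dec' keys; the actual STILT database performs such queries server side, and its schema is not described in the abstract.

```python
from math import radians, degrees, sin, cos, acos

def angular_separation_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation between two sky positions, in degrees."""
    ra1, dec1, ra2, dec2 = map(radians, (ra1, dec1, ra2, dec2))
    cos_sep = sin(dec1) * sin(dec2) + cos(dec1) * cos(dec2) * cos(ra1 - ra2)
    return degrees(acos(min(1.0, max(-1.0, cos_sep))))   # clamp for rounding safety

def cone_search(sources, ra0, dec0, radius_deg):
    """Return the sources within radius_deg of (ra0, dec0).

    `sources` is an iterable of dicts with 'ra' and 'dec' keys in degrees
    (a hypothetical schema used only for this illustration)."""
    return [s for s in sources
            if angular_separation_deg(s["ra"], s["dec"], ra0, dec0) <= radius_deg]
```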

  18. Digital impression-taking: Fundamentals and benefits in orthodontics.

    PubMed

    Lecocq, Guillaume

    2016-06-01

    The digital era has burst into our offices in a big way. 3D camera technology has improved, enabling us to record our impressions and the occlusion in a digital format file. This file can then be used to make set-ups and manufacture orthodontic devices. Like any new technology, it needs to be studied and understood in order to grasp it fully and master the information and digital flow which can be generated between one's office and any external party involved in treatment, such as laboratories or other colleagues. Copyright © 2016 CEO. Published by Elsevier Masson SAS. All rights reserved.

  19. Bubble driven quasioscillatory translational motion of catalytic micromotors.

    PubMed

    Manjare, Manoj; Yang, Bo; Zhao, Y-P

    2012-09-21

    A new quasioscillatory translational motion has been observed for big Janus catalytic micromotors with a fast CCD camera. Such motional behavior is found to coincide with both the bubble growth and burst processes resulting from the catalytic reaction, and the competition of the two processes generates a net forward motion. Detailed physical models have been proposed to describe the above processes. It is suggested that the bubble growth process imposes a growth force moving the micromotor forward, while the burst process induces an instantaneous local pressure depression pulling the micromotor backward. The theoretic predictions are consistent with the experimental data.

  20. Bubble Driven Quasioscillatory Translational Motion of Catalytic Micromotors

    NASA Astrophysics Data System (ADS)

    Manjare, Manoj; Yang, Bo; Zhao, Y.-P.

    2012-09-01

    A new quasioscillatory translational motion has been observed for big Janus catalytic micromotors with a fast CCD camera. Such motional behavior is found to coincide with both the bubble growth and burst processes resulting from the catalytic reaction, and the competition of the two processes generates a net forward motion. Detailed physical models have been proposed to describe the above processes. It is suggested that the bubble growth process imposes a growth force moving the micromotor forward, while the burst process induces an instantaneous local pressure depression pulling the micromotor backward. The theoretic predictions are consistent with the experimental data.

  1. VizieR Online Data Catalog: 1992-1997 binary star speckle measurements (Balega+, 1999)

    NASA Astrophysics Data System (ADS)

    Balega, I. I.; Balega, Y. Y.; Maksimov, A. F.; Pluzhnik, E. A.; Shkhagosheva, Z. U.; Vasyuk, V. A.

    2000-11-01

    We present the results of speckle interferometric measurements of binary stars made with the television photon-counting camera at the 6-m Big Azimuthal Telescope (BTA) and 1-m telescope of the Special Astrophysical Observatory (SAO) between August 1992 and May 1997. The data contain 89 observations of 62 star systems on the large telescope and 21 on the smaller one. For the 6-m aperture 18 systems remained unresolved. The measured angular separation ranged from 39 mas, two times above the BTA diffraction limit, to 1593 mas. (3 data files).

  2. Binary star speckle measurements during 1992-1997 from the SAO 6-m and 1-m telescopes in Zelenchuk

    NASA Astrophysics Data System (ADS)

    Balega, I. I.; Balega, Y. Y.; Maksimov, A. F.; Pluzhnik, E. A.; Shkhagosheva, Z. U.; Vasyuk, V. A.

    1999-12-01

    We present the results of speckle interferometric measurements of binary stars made with the television photon-counting camera at the 6-m Big Azimuthal Telescope (BTA) and 1-m telescope of the Special Astrophysical Observatory (SAO) between August 1992 and May 1997. The data contain 89 observations of 62 star systems on the large telescope and 21 on the smaller one. For the 6-m aperture 18 systems remained unresolved. The measured angular separation ranged from 39 mas, two times above the BTA diffraction limit, to 1593 mas.

  3. Southern Florida's River of Grass

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Florida's Everglades is a region of broad, slow-moving sheets of water flowing southward over low-lying areas from Lake Okeechobee to the Gulf of Mexico. In places this remarkable 'river of grass' is 80 kilometers wide. These images from the Multi-angle Imaging SpectroRadiometer show the Everglades region on January 16, 2002. Each image covers an area measuring 191 kilometers x 205 kilometers. The data were captured during Terra orbit 11072.

    On the left is a natural color view acquired by MISR's nadir camera. A portion of Lake Okeechobee is visible at the top, to the right of image center. South of the lake, whose name derives from the Seminole word for 'big water,' an extensive region of farmland known as the Everglades Agricultural Area is recognizable by its many clustered squares. Over half of the sugar produced in the United States is grown here. Urban areas along the east coast and in the northern part of the image extend to the boundaries of Big Cypress Swamp, situated north of Everglades National Park.

    The image on the right combines red-band data from the 46-degree backward, nadir and 46-degree forward-viewing camera angles to create a red, green, blue false-color composite. One of the interesting uses of the composite image is for detecting surface water. Wet surfaces appear blue in this rendition because sun glitter produces a greater signal at the forward camera's view angle. Wetlands visible in these images include a series of shallow impoundments called Water Conservation Areas which were built to speed water flow through the Everglades in times of drought. In parts of the Everglades, these levees and extensive systems such as the Miami and Tamiami Canals have altered the natural cycles of water flow. For example, the water volume of the Shark River Slough, a natural wetland which feeds Everglades National Park, is influenced by the Tamiami Canal. The unique and intrinsic value of the Everglades is now widely recognized, and efforts to restore the natural water cycles are underway.

  4. Chlorophyll fluorescence is a rigorous, high throughput tool to analyze the impacts of genotype, species, and stress on plant and ecosystem productivity

    NASA Astrophysics Data System (ADS)

    Ewers, B. E.; Pleban, J. R.; Aston, T.; Beverly, D.; Speckman, H. N.; Hosseini, A.; Bretfeld, M.; Edwards, C.; Yarkhunova, Y.; Weinig, C.; Mackay, D. S.

    2017-12-01

    Abiotic and biotic stresses reduce plant productivity, yet high-throughput characterization of plant responses across genotypes, species and stress conditions is limited by both instrumentation and data analysis techniques. Recent developments in chlorophyll a fluorescence measurement at leaf to landscape scales could improve our predictive understanding of plant responses to stressors. We analyzed the interaction of species and stress across two crop types, five gymnosperm and two angiosperm tree species from boreal and montane forests, grasses, forbs and shrubs from sagebrush steppe, and 30 tree species from seasonally wet tropical forest. We also analyzed chlorophyll fluorescence and gas exchange data from twelve Brassica rapa crop accessions and 120 recombinant inbred lines to investigate phenotypic responses to drought. These data represent more than 10,000 measurements of fluorescence and allow us to answer two questions: (1) are the measurements from high-throughput, handheld and drone-mounted instruments quantitatively similar to those from lower-throughput camera-based and gas-exchange instruments, and (2) do the measurements detect differences due to genotype, species and environmental stress on plants? We found through regression that the high- and low-throughput instruments agreed across both individual chlorophyll fluorescence components and calculated ratios, and were not different from a 1:1 relationship, with correlation greater than 0.9. We used hierarchical Bayesian modeling to test the second question. We found a linear relationship between the fluorescence-derived quantum yield of PSII and the quantum yield of CO2 assimilation from gas exchange, with a slope of ca. 0.1, indicating that the efficiency of the entire photosynthetic process was about 10% of PSII across genotypes, species and drought stress. Posterior estimates of quantum yield revealed that drought-treatment, genotype and species differences were preserved when accounting for measurement uncertainty. High-throughput handheld or drone-based measurements of chlorophyll fluorescence provide high-quality, quantitative data that can be used to not only connect genotype to phenotype but also quantify how vastly different plant species and genotypes respond to stress and change ecosystem productivity.

  5. Representing high throughput expression profiles via perturbation barcodes reveals compound targets.

    PubMed

    Filzen, Tracey M; Kutchukian, Peter S; Hermes, Jeffrey D; Li, Jing; Tudor, Matthew

    2017-02-01

    High throughput mRNA expression profiling can be used to characterize the response of cell culture models to perturbations such as pharmacologic modulators and genetic perturbations. As profiling campaigns expand in scope, it is important to homogenize, summarize, and analyze the resulting data in a manner that captures significant biological signals in spite of various noise sources such as batch effects and stochastic variation. We used the L1000 platform for large-scale profiling of 978 representative genes across thousands of compound treatments. Here, a method is described that uses deep learning techniques to convert the expression changes of the landmark genes into a perturbation barcode that reveals important features of the underlying data, performing better than the raw data in revealing important biological insights. The barcode captures compound structure and target information, and predicts a compound's high throughput screening promiscuity, to a higher degree than the original data measurements, indicating that the approach uncovers underlying factors of the expression data that are otherwise entangled or masked by noise. Furthermore, we demonstrate that visualizations derived from the perturbation barcode can be used to more sensitively assign functions to unknown compounds through a guilt-by-association approach, which we use to predict and experimentally validate the activity of compounds on the MAPK pathway. The demonstrated application of deep metric learning to large-scale chemical genetics projects highlights the utility of this and related approaches to the extraction of insights and testable hypotheses from big, sometimes noisy data.

  6. Representing high throughput expression profiles via perturbation barcodes reveals compound targets

    PubMed Central

    Kutchukian, Peter S.; Li, Jing; Tudor, Matthew

    2017-01-01

    High throughput mRNA expression profiling can be used to characterize the response of cell culture models to perturbations such as pharmacologic modulators and genetic perturbations. As profiling campaigns expand in scope, it is important to homogenize, summarize, and analyze the resulting data in a manner that captures significant biological signals in spite of various noise sources such as batch effects and stochastic variation. We used the L1000 platform for large-scale profiling of 978 representative genes across thousands of compound treatments. Here, a method is described that uses deep learning techniques to convert the expression changes of the landmark genes into a perturbation barcode that reveals important features of the underlying data, performing better than the raw data in revealing important biological insights. The barcode captures compound structure and target information, and predicts a compound’s high throughput screening promiscuity, to a higher degree than the original data measurements, indicating that the approach uncovers underlying factors of the expression data that are otherwise entangled or masked by noise. Furthermore, we demonstrate that visualizations derived from the perturbation barcode can be used to more sensitively assign functions to unknown compounds through a guilt-by-association approach, which we use to predict and experimentally validate the activity of compounds on the MAPK pathway. The demonstrated application of deep metric learning to large-scale chemical genetics projects highlights the utility of this and related approaches to the extraction of insights and testable hypotheses from big, sometimes noisy data. PMID:28182661

  7. Programming and Runtime Support to Blaze FPGA Accelerator Deployment at Datacenter Scale

    PubMed Central

    Huang, Muhuan; Wu, Di; Yu, Cody Hao; Fang, Zhenman; Interlandi, Matteo; Condie, Tyson; Cong, Jason

    2017-01-01

    With the end of CPU core scaling due to dark silicon limitations, customized accelerators on FPGAs have gained increased attention in modern datacenters due to their lower power, high performance and energy efficiency. Evidenced by Microsoft’s FPGA deployment in its Bing search engine and Intel’s $16.7 billion acquisition of Altera, integrating FPGAs into datacenters is considered one of the most promising approaches to sustain future datacenter growth. However, it is quite challenging for existing big data computing systems—like Apache Spark and Hadoop—to access the performance and energy benefits of FPGA accelerators. In this paper we design and implement Blaze to provide programming and runtime support for enabling easy and efficient deployments of FPGA accelerators in datacenters. In particular, Blaze abstracts FPGA accelerators as a service (FaaS) and provides a set of clean programming APIs for big data processing applications to easily utilize those accelerators. Our Blaze runtime implements an FaaS framework to efficiently share FPGA accelerators among multiple heterogeneous threads on a single node, and extends Hadoop YARN with accelerator-centric scheduling to efficiently share them among multiple computing tasks in the cluster. Experimental results using four representative big data applications demonstrate that Blaze greatly reduces the programming efforts to access FPGA accelerators in systems like Apache Spark and YARN, and improves the system throughput by 1.7× to 3× (and energy efficiency by 1.5× to 2.7×) compared to a conventional CPU-only cluster. PMID:28317049

  8. Programming and Runtime Support to Blaze FPGA Accelerator Deployment at Datacenter Scale.

    PubMed

    Huang, Muhuan; Wu, Di; Yu, Cody Hao; Fang, Zhenman; Interlandi, Matteo; Condie, Tyson; Cong, Jason

    2016-10-01

    With the end of CPU core scaling due to dark silicon limitations, customized accelerators on FPGAs have gained increased attention in modern datacenters due to their lower power, high performance and energy efficiency. Evidenced by Microsoft's FPGA deployment in its Bing search engine and Intel's $16.7 billion acquisition of Altera, integrating FPGAs into datacenters is considered one of the most promising approaches to sustain future datacenter growth. However, it is quite challenging for existing big data computing systems, like Apache Spark and Hadoop, to access the performance and energy benefits of FPGA accelerators. In this paper we design and implement Blaze to provide programming and runtime support for enabling easy and efficient deployments of FPGA accelerators in datacenters. In particular, Blaze abstracts FPGA accelerators as a service (FaaS) and provides a set of clean programming APIs for big data processing applications to easily utilize those accelerators. Our Blaze runtime implements an FaaS framework to efficiently share FPGA accelerators among multiple heterogeneous threads on a single node, and extends Hadoop YARN with accelerator-centric scheduling to efficiently share them among multiple computing tasks in the cluster. Experimental results using four representative big data applications demonstrate that Blaze greatly reduces the programming efforts to access FPGA accelerators in systems like Apache Spark and YARN, and improves the system throughput by 1.7× to 3× (and energy efficiency by 1.5× to 2.7×) compared to a conventional CPU-only cluster.

  9. Real-time Full-spectral Imaging and Affinity Measurements from 50 Microfluidic Channels using Nanohole Surface Plasmon Resonance†

    PubMed Central

    Lee, Si Hoon; Lindquist, Nathan C.; Wittenberg, Nathan J.; Jordan, Luke R.; Oh, Sang-Hyun

    2012-01-01

    With recent advances in high-throughput proteomics and systems biology, there is a growing demand for new instruments that can precisely quantify a wide range of receptor-ligand binding kinetics in a high-throughput fashion. Here we demonstrate a surface plasmon resonance (SPR) imaging spectroscopy instrument capable of extracting binding kinetics and affinities from 50 parallel microfluidic channels simultaneously. The instrument utilizes large-area (~cm²) metallic nanohole arrays as SPR sensing substrates and combines a broadband light source, a high-resolution imaging spectrometer and a low-noise CCD camera to extract spectral information from every channel in real time with a refractive index resolution of 7.7 × 10⁻⁶. To demonstrate the utility of our instrument for quantifying a wide range of biomolecular interactions, each parallel microfluidic channel is coated with a biomimetic supported lipid membrane containing ganglioside (GM1) receptors. The binding kinetics of cholera toxin b (CTX-b) to GM1 are then measured in a single experiment from 50 channels. By combining the highly parallel microfluidic device with large-area periodic nanohole array chips, our SPR imaging spectrometer system enables high-throughput, label-free, real-time SPR biosensing, and its full-spectral imaging capability combined with nanohole arrays could enable integration of SPR imaging with concurrent surface-enhanced Raman spectroscopy. PMID:22895607

  10. Challenges of Identifying Clinically Actionable Genetic Variants for Precision Medicine

    PubMed Central

    2016-01-01

    Advances in genomic medicine have the potential to change the way we treat human disease, but translating these advances into reality for improving healthcare outcomes depends essentially on our ability to discover disease- and/or drug-associated clinically actionable genetic mutations. Integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a big data infrastructure can provide an efficient and effective way to identify clinically actionable genetic variants for personalized treatments and reduce healthcare costs. We review bioinformatics processing of next-generation sequencing (NGS) data, bioinformatics infrastructures for implementing precision medicine, and bioinformatics approaches for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs. PMID:27195526

  11. Deploying a Proximal Sensing Cart to Identify Drought-Adaptive Traits in Upland Cotton for High-Throughput Phenotyping

    PubMed Central

    Thompson, Alison L.; Thorp, Kelly R.; Conley, Matthew; Andrade-Sanchez, Pedro; Heun, John T.; Dyer, John M.; White, Jeffery W.

    2018-01-01

    Field-based high-throughput phenotyping is an emerging approach to quantify difficult, time-sensitive plant traits in relevant growing conditions. Proximal sensing carts represent an alternative platform to more costly high-clearance tractors for phenotyping dynamic traits in the field. A proximal sensing cart and specifically a deployment protocol, were developed to phenotype traits related to drought tolerance in the field. The cart-sensor package included an infrared thermometer, ultrasonic transducer, multi-spectral reflectance sensor, weather station, and RGB cameras. The cart deployment protocol was evaluated on 35 upland cotton (Gossypium hirsutum L.) entries grown in 2017 at Maricopa, AZ, United States. Experimental plots were grown under well-watered and water-limited conditions using a (0,1) alpha lattice design and evaluated in June and July. Total collection time of the 0.87 hectare field averaged 2 h and 27 min and produced 50.7 MB and 45.7 GB of data from the sensors and RGB cameras, respectively. Canopy temperature, crop water stress index (CWSI), canopy height, normalized difference vegetative index (NDVI), and leaf area index (LAI) differed among entries and showed an interaction with the water regime (p < 0.05). Broad-sense heritability (H2) estimates ranged from 0.097 to 0.574 across all phenotypes and collections. Canopy cover estimated from RGB images increased with counts of established plants (r = 0.747, p = 0.033). Based on the cart-derived phenotypes, three entries were found to have improved drought-adaptive traits compared to a local adapted cultivar. These results indicate that the deployment protocol developed for the cart and sensor package can measure multiple traits rapidly and accurately to characterize complex plant traits under drought conditions. PMID:29868041
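    Of the cart-derived phenotypes listed above, the crop water stress index (CWSI) is the one most directly tied to the infrared thermometer. A minimal sketch of the standard normalized form is shown below; the wet and dry baselines are assumed inputs, and the study's exact formulation, derived from its weather-station and canopy-temperature data, may differ in detail.

```python
def crop_water_stress_index(t_canopy, t_wet, t_dry):
    """Normalized CWSI: 0 at the fully transpiring (wet) baseline,
    1 at the non-transpiring (dry) baseline.  Baselines are assumed
    inputs in this illustration."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

# example: canopy at 32 C with baselines of 26 C (wet) and 38 C (dry)
print(crop_water_stress_index(32.0, 26.0, 38.0))   # -> 0.5
```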

  12. Estimating rice yield related traits and quantitative trait loci analysis under different nitrogen treatments using a simple tower-based field phenotyping system with modified single-lens reflex cameras

    NASA Astrophysics Data System (ADS)

    Naito, Hiroki; Ogawa, Satoshi; Valencia, Milton Orlando; Mohri, Hiroki; Urano, Yutaka; Hosoi, Fumiki; Shimizu, Yo; Chavez, Alba Lucia; Ishitani, Manabu; Selvaraj, Michael Gomez; Omasa, Kenji

    2017-03-01

    Application of field-based high-throughput phenotyping (FB-HTP) methods for monitoring plant performance in real field conditions has a high potential to accelerate the breeding process. In this paper, we discuss the use of a simple tower-based remote sensing platform using modified single-lens reflex cameras for phenotyping yield traits in rice under different nitrogen (N) treatments over three years. This tower-based phenotyping platform has the advantages of simplicity, ease and stability in terms of introduction, maintenance and continual operation under field conditions. Out of six phenological stages of rice analyzed, the flowering stage was the most useful in the estimation of yield performance under field conditions. We found a high correlation between several vegetation indices (simple ratio (SR), normalized difference vegetation index (NDVI), transformed vegetation index (TVI), corrected transformed vegetation index (CTVI), soil-adjusted vegetation index (SAVI) and modified soil-adjusted vegetation index (MSAVI)) and multiple yield traits (panicle number, grain weight and shoot biomass) across three trials. Among all of the indices studied, SR exhibited the best performance with regard to the estimation of grain weight (R2 = 0.80). Under our tower-based field phenotyping system (TBFPS), we identified quantitative trait loci (QTL) for yield-related traits using a mapping population of chromosome segment substitution lines (CSSLs) and a single nucleotide polymorphism data set. Our findings suggest the TBFPS can be useful for the estimation of yield performance during early crop development. This can be a major opportunity for rice breeders who desire high-throughput phenotypic selection for yield performance traits.
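    Several of the vegetation indices named above have standard definitions in terms of red and near-infrared reflectance; the sketch below shows three of them (SR, NDVI, SAVI), assuming the band reflectances have already been extracted from the modified-SLR imagery.

```python
def vegetation_indices(nir, red, L=0.5):
    """Standard formulations of three of the indices named above,
    from near-infrared and red reflectance:
      SR   = NIR / Red
      NDVI = (NIR - Red) / (NIR + Red)
      SAVI = (1 + L) * (NIR - Red) / (NIR + Red + L)
    """
    sr = nir / red
    ndvi = (nir - red) / (nir + red)
    savi = (1.0 + L) * (nir - red) / (nir + red + L)
    return {"SR": sr, "NDVI": ndvi, "SAVI": savi}

print(vegetation_indices(nir=0.45, red=0.08))
# e.g. SR ~ 5.6, NDVI ~ 0.70, SAVI ~ 0.54
```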

  13. A simple 96 well microfluidic chip combined with visual and densitometry detection for resource-poor point of care testing

    PubMed Central

    Yang, Minghui; Sun, Steven; Kostov, Yordan

    2010-01-01

    There is a well-recognized need for low-cost biodetection technologies for resource-poor settings with minimal medical infrastructure. Lab-on-a-chip (LOC) technology has the ability to perform biological assays in such settings. The aim of this work is to develop a low-cost, high-throughput detection system for the analysis of 96 samples simultaneously outside the laboratory setting. To achieve this aim, several biosensing elements were combined: a syringe-operated ELISA lab-on-a-chip (ELISA-LOC), which integrates the fluid delivery system into a miniature 96-well plate; a simplified non-enzymatic reporter and detection approach using a gold nanoparticle-antibody conjugate as a secondary antibody and silver enhancement of the visual signal; and carbon nanotubes (CNTs) to increase primary antibody immobilization and improve assay sensitivity. Combined, these elements obviate the need for an ELISA washer, electrical power for operation and a sophisticated detector. We demonstrate the use of the device for detection of Staphylococcal enterotoxin B, a major foodborne toxin, using three modes of detection: visual inspection, a CCD camera and a document scanner. With visual detection or using a document scanner to measure the signal, the limit of detection (LOD) was 0.5 ng/ml. For precise quantitation of the signal using densitometry, the LOD was 0.1 ng/ml for the CCD analysis and 0.5 ng/ml for the document scanner. The observed sensitivity is in the same range as laboratory-based ELISA testing. The point-of-care device can analyze 96 samples simultaneously, permitting high-throughput diagnostics in the field and in resource-poor areas without ready access to laboratory facilities or electricity. PMID:21503269

  14. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Hui

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalyst screening and single cell study. First, the author introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, they demonstrated LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm² for 40-μm wells. This experimental set-up also can screen solid catalysts via near-infrared thermography detection. In the second part of this dissertation, the author used laser-induced native fluorescence coupled with capillary electrophoresis (LINF-CE) and microscope imaging to study single cell degranulation. On the basis of good temporal correlation with events observed through an optical microscope, they have identified individual peaks in the fluorescence electropherograms as serotonin released from the granular core on contact with the surrounding fluid.

  15. Preliminary optical design of PANIC, a wide-field infrared camera for CAHA

    NASA Astrophysics Data System (ADS)

    Cárdenas, M. C.; Rodríguez Gómez, J.; Lenzen, R.; Sánchez-Blanco, E.

    2008-07-01

    In this paper, we present the preliminary optical design of PANIC (PAnoramic Near Infrared camera for Calar Alto), a wide-field infrared imager for the Calar Alto 2.2 m telescope. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. A mosaic of four Teledyne Hawaii-2RG 2k x 2k detectors is used and will give a field of view of 31.9 arcmin x 31.9 arcmin. This cryogenic instrument has been optimized for the Y, J, H and K bands. Special care has been taken in the selection of the standard IR materials used for the optics in order to maximize the instrument throughput and to include the z band. The main challenges of this design are: to produce a well-defined internal pupil, which allows the thermal background to be reduced by a cryogenic pupil stop; the correction of off-axis aberrations due to the large field available; the correction of chromatic aberration because of the wide spectral coverage; and the capability of introducing narrow-band filters (~1%) into the system while minimizing the degradation of the filter passband without a collimated stage in the camera. We show the optomechanical error budget and compensation strategy that allows the as-built design to meet the optical performance requirements. Finally, we demonstrate the flexibility of the design by showing the performance of PANIC at the CAHA 3.5 m telescope.

  16. Assessing the Role of Livestock in Big Cat Prey Choice Using Spatiotemporal Availability Patterns

    PubMed Central

    Ghoddousi, Arash; Soofi, Mahmood; Kh. Hamidi, Amirhossein; Lumetsberger, Tanja; Egli, Lukas; Khorozyan, Igor; Kiabi, Bahram H.; Waltert, Matthias

    2016-01-01

    Livestock is represented in big cat diets throughout the world. Husbandry approaches aim to reduce depredation, which may influence patterns of prey choice, but whether felids have a preference for livestock or not often remains unclear as most studies ignore livestock availability. We assessed prey choice of the endangered Persian leopard (Panthera pardus saxicolor) in Golestan National Park, Iran, where conflict over livestock depredation occurs. We analyzed leopard diet (77 scats) and assessed wild and domestic prey abundance by line transect sampling (186 km), camera-trapping (2777 camera days), double-observer point-counts (64 scans) and questionnaire surveys (136 respondents). Based on interviews with 18 shepherds, we estimated monthly grazing time outside six villages with 96 conflict cases to obtain a small livestock (domestic sheep and goat) availability coefficient. Using this coefficient, which ranged between 0.40 and 0.63 for different villages, we estimated the numbers of sheep and goats available to leopard depredation. Leopard diet consisted mainly of wild boar (Sus scrofa) (50.2% biomass consumed), but bezoar goat (Capra aegagrus) was the most preferred prey species (Ij = 0.73), whereas sheep and goats were avoided (Ij = -0.54). When absolute sheep and goat numbers (~11250) were used instead of the corrected ones (~6392), avoidance of small livestock appeared to be even stronger (Ij = -0.71). We suggest that future assessments of livestock choice by felids should incorporate such case-specific corrections for spatiotemporal patterns of availability, which may vary with husbandry methods. Such an approach increases our understanding of human-felid conflict dynamics and the role of livestock in felid diets. PMID:27064680
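    The Ij values quoted above are consistent with Jacobs' selectivity index, which normalizes the proportion of a prey type in the diet against its availability; a minimal sketch, assuming that standard formulation, is given below.

```python
def jacobs_index(r, p):
    """Jacobs' selectivity index D = (r - p) / (r + p - 2*r*p),
    where r is the proportion of a prey type in the diet and p its
    proportional availability.  Ranges from -1 (complete avoidance)
    to +1 (complete preference)."""
    return (r - p) / (r + p - 2.0 * r * p)

# toy example: a prey type making up 10% of the diet but 40% of availability
print(jacobs_index(0.10, 0.40))   # -> about -0.71 (avoided)
```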

  17. Free-ranging domestic cats (Felis catus) on public lands: estimating density, activity, and diet in the Florida Keys

    USGS Publications Warehouse

    Cove, Michael V.; Gardner, Beth; Simons, Theodore R.; Kays, Roland; O'Connell, Allan F.

    2017-01-01

    Feral and free-ranging domestic cats (Felis catus) can have strong negative effects on small mammals and birds, particularly in island ecosystems. We deployed camera traps to study free-ranging cats in national wildlife refuges and state parks on Big Pine Key and Key Largo in the Florida Keys, USA, and used spatial capture–recapture models to estimate cat abundance, movement, and activities. We also used stable isotope analyses to examine the diet of cats captured on public lands. Top population models separated cats based on differences in movement and detection with three and two latent groups on Big Pine Key and Key Largo, respectively. We hypothesize that these latent groups represent feral, semi-feral, and indoor/outdoor house cats based on the estimated movement parameters of each group. Estimated cat densities and activity varied between the two islands, with relatively high densities (~4 cats/km2) exhibiting crepuscular diel patterns on Big Pine Key and lower densities (~1 cat/km2) exhibiting nocturnal diel patterns on Key Largo. These differences are most likely related to the higher proportion of house cats on Big Pine relative to Key Largo. Carbon and nitrogen isotope ratios from hair samples of free-ranging cats (n = 43) provided estimates of the proportion of wild and anthropogenic foods in cat diets. At the population level, cats on both islands consumed mostly anthropogenic foods (>80% of the diet), but eight individuals were effective predators of wildlife (>50% of the diet). We provide evidence that cat groups within a population move different distances, exhibit different activity patterns, and that individuals consume wildlife at different rates, which all have implications for managing this invasive predator.

  18. Skylab (SL)-3 View - North Central Wyoming (WY) - Southern Montana (MT)

    NASA Image and Video Library

    1973-08-15

    S73-35081 (July-September 1973) --- A view of approximately 3,600 square miles of north central Wyoming and southern Montana is seen in this Skylab 3 Earth Resources Experiments Package S190-B (five-inch Earth terrain camera) photograph taken from the Skylab space station in Earth orbit. The Big Horn River, flowing northward, crosses between the northwest-trending Big Horn Mountains and the Pryor Mountains. Yellowtail Reservoir, named after a former chief of the Crow Indian tribe, lies in the center of the picture, impounded by a dam across the Big Horn River. Note the small rectangular crop areas along the Big Horn River (upper right) and the strip farming (yellow) practiced on the rolling hills along the Big Horn River and its tributaries (upper left corner and right edge). The low sun angle enhances the structural features of the mountains as well as the drainage patterns in the adjacent basins. Rock formations appear in this color photograph as they would to the eye from this altitude. The distinctive redbeds can be traced along the front of the Pryor Mountains and indicate the folding that occurred during mountain building. EREP investigators, Dr. Houston of the University of Wyoming and Dr. Hoppin of the University of Iowa, will analyze the photograph and use the results in geological mapping and mineral resource studies. Lowell, Wyoming (lower left corner) and Hardin, Montana (upper right corner) can be recognized. Federal agencies participating with NASA on the EREP project are the Departments of Agriculture, Commerce, and Interior, the Environmental Protection Agency and the Corps of Engineers. All EREP photography is available to the public through the Department of the Interior's Earth Resources Observations Systems Data Center, Sioux Falls, South Dakota, 57198. (Alternate number SL3-86-337) Photo credit: NASA

  19. SPRAT: Spectrograph for the Rapid Acquisition of Transients

    NASA Astrophysics Data System (ADS)

    Piascik, A. S.; Steele, Iain A.; Bates, Stuart D.; Mottram, Christopher J.; Smith, R. J.; Barnsley, R. M.; Bolton, B.

    2014-07-01

    We describe the development of a low-cost, low-resolution (R ~ 350), high-throughput, long-slit spectrograph covering visible (4000-8000 Å) wavelengths. The spectrograph has been developed for fully robotic operation with the Liverpool Telescope (La Palma). The primary aim is to provide rapid spectral classification of faint (V ~ 20) transient objects detected by projects such as Gaia, iPTF (intermediate Palomar Transient Factory), LOFAR, and a variety of high-energy satellites. The design employs a volume phase holographic (VPH) transmission grating as the dispersive element combined with a prism pair (grism) in a linear optical path. One of two peak spectral sensitivities is selectable by rotating the grism. The VPH and prism combination and entrance slit are deployable, and when removed from the beam allow the collimator/camera pair to re-image the target field onto the detector. This mode of operation provides automatic acquisition of the target onto the slit prior to spectrographic observation through World Coordinate System fitting. The selection and characterisation of optical components to maximise photon throughput is described together with performance predictions.

  20. Advances in single-cell experimental design made possible by automated imaging platforms with feedback through segmentation.

    PubMed

    Crick, Alex J; Cammarota, Eugenia; Moulang, Katie; Kotar, Jurij; Cicuta, Pietro

    2015-01-01

    Live optical microscopy has become an essential tool for studying the dynamical behaviors and variability of single cells, and cell-cell interactions. However, experiments and data analysis in this area are often extremely labor intensive, and it has often not been achievable or practical to perform properly standardized experiments on a statistically viable scale. We have addressed this challenge by developing automated live-imaging platforms to help standardize experiments, increase throughput, and unlock previously impossible experiments. Our real-time cell tracking programs communicate in feedback with microscope and camera control software, and they are highly customizable, flexible, and efficient. As examples of our current research that utilize these automated platforms, we describe two quite different applications: egress-invasion interactions of malaria parasites and red blood cells, and imaging of immune cells which possess high motility and internal dynamics. The automated imaging platforms are able to track a large number of motile cells simultaneously, over hours or even days at a time, greatly increasing data throughput and opening up new experimental possibilities. Copyright © 2015 Elsevier Inc. All rights reserved.
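    The feedback loop described above (segment the live image, then steer acquisition from the result) can be sketched schematically as follows; `acquire_frame` and `move_stage` are hypothetical placeholder callbacks standing in for microscope and camera control calls, not a real vendor API.

```python
import numpy as np

def track_and_follow(acquire_frame, move_stage, n_iterations=100, threshold=0.5):
    """Schematic segmentation-feedback loop: acquire a frame, segment it with a
    simple intensity threshold, locate the object centroid, and re-center the
    stage on it.  Both callbacks are hypothetical stand-ins for hardware control."""
    for _ in range(n_iterations):
        frame = acquire_frame()                        # 2D float array (assumed)
        mask = frame > threshold * frame.max()         # crude segmentation
        if not mask.any():
            continue                                   # nothing detected this frame
        ys, xs = np.nonzero(mask)
        cy, cx = ys.mean(), xs.mean()                  # object centroid (pixels)
        dy = cy - frame.shape[0] / 2                   # offset from image center
        dx = cx - frame.shape[1] / 2
        move_stage(dx, dy)                             # feedback: re-center the target
```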

  1. High-throughput behavioral screening method for detecting auditory response defects in zebrafish.

    PubMed

    Bang, Pascal I; Yelick, Pamela C; Malicki, Jarema J; Sewell, William F

    2002-08-30

    We have developed an automated, high-throughput behavioral screening method for detecting hearing defects in zebrafish. Our assay monitors a rapid escape reflex in response to a loud sound. With this approach, 36 adult zebrafish, restrained in visually isolated compartments, can be simultaneously assessed for responsiveness to near-field 400 Hz sinusoidal tone bursts. Automated, objective determinations of responses are achieved with a computer program that obtains images at precise times relative to the acoustic stimulus. Images taken with a CCD video camera before and after stimulus presentation are subtracted to reveal a response to the sound. Up to 108 fish can be screened per hour. Over 6500 fish were tested to validate the reliability of the assay. We found that 1% of these animals displayed hearing deficits. The phenotypes of non-responders were further assessed with radiological analysis for defects in the gross morphology of the auditory system. Nearly all of those showed abnormalities in conductive elements of the auditory system: the swim bladder or Weberian ossicles. Copyright 2002 Elsevier Science B.V.
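    The image-subtraction step described above can be sketched as a simple thresholded frame difference; the threshold values below are arbitrary placeholders rather than the study's calibrated settings.

```python
import numpy as np

def detect_escape_response(pre_frame, post_frame, diff_threshold=25, pixel_count=50):
    """Decide whether a fish moved between the pre- and post-stimulus images.

    Pixels whose absolute difference exceeds `diff_threshold` are counted, and
    the fish is scored as a responder if enough pixels changed.  Thresholds are
    illustrative placeholders."""
    diff = np.abs(post_frame.astype(int) - pre_frame.astype(int))
    changed = int(np.count_nonzero(diff > diff_threshold))
    return changed >= pixel_count
```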

  2. High definition infrared chemical imaging of colorectal tissue using a Spero QCL microscope.

    PubMed

    Bird, B; Rowlette, J

    2017-04-10

    Mid-infrared microscopy has become a key technique in the field of biomedical science and spectroscopy. This label-free, non-destructive technique permits the visualisation of a wide range of intrinsic biochemical markers in tissues, cells and biofluids by detection of the vibrational modes of the constituent molecules. Together, infrared microscopy and chemometrics is a widely accepted method that can distinguish healthy and diseased states with high accuracy. However, despite the exponential growth of the field and its research world-wide, several barriers currently exist for its full translation into the clinical sphere, namely sample throughput and data management. The advent and incorporation of quantum cascade lasers (QCLs) into infrared microscopes could help propel the field over these remaining hurdles. Such systems offer several advantages over their FT-IR counterparts, a simpler instrument architecture, improved photon flux, use of room temperature camera systems, and the flexibility of a tunable illumination source. In this current study we explore the use of a QCL infrared microscope to produce high definition, high throughput chemical images useful for the screening of biopsied colorectal tissue.

  3. Advancing biomarker research: utilizing 'Big Data' approaches for the characterization and prevention of bipolar disorder.

    PubMed

    McIntyre, Roger S; Cha, Danielle S; Jerrell, Jeanette M; Swardfager, Walter; Kim, Rachael D; Costa, Leonardo G; Baskaran, Anusha; Soczynska, Joanna K; Woldeyohannes, Hanna O; Mansur, Rodrigo B; Brietzke, Elisa; Powell, Alissa M; Gallaugher, Ashley; Kudlow, Paul; Kaidanovich-Beilin, Oksana; Alsuwaidan, Mohammad

    2014-08-01

    To provide a strategic framework for the prevention of bipolar disorder (BD) that incorporates a 'Big Data' approach to risk assessment for BD. Computerized databases (e.g., Pubmed, PsychInfo, and MedlinePlus) were used to access English-language articles published between 1966 and 2012 with the search terms bipolar disorder, prodrome, 'Big Data', and biomarkers cross-referenced with genomics/genetics, transcriptomics, proteomics, metabolomics, inflammation, oxidative stress, neurotrophic factors, cytokines, cognition, neurocognition, and neuroimaging. Papers were selected from the initial search if the primary outcome(s) of interest was (were) categorized in any of the following domains: (i) 'omics' (e.g., genomics), (ii) molecular, (iii) neuroimaging, and (iv) neurocognitive. The current strategic approach to identifying individuals at risk for BD, with an emphasis on phenotypic information and family history, has insufficient predictive validity and is clinically inadequate. The heterogeneous clinical presentation of BD, as well as its pathoetiological complexity, suggests that it is unlikely that a single biomarker (or an exclusive biomarker approach) will sufficiently augment currently inadequate phenotypic-centric prediction models. We propose a 'Big Data'- bioinformatics approach that integrates vast and complex phenotypic, anamnestic, behavioral, family, and personal 'omics' profiling. Bioinformatic processing approaches, utilizing cloud- and grid-enabled computing, are now capable of analyzing data on the order of tera-, peta-, and exabytes, providing hitherto unheard of opportunities to fundamentally revolutionize how psychiatric disorders are predicted, prevented, and treated. High-throughput networks dedicated to research on, and the treatment of, BD, integrating both adult and younger populations, will be essential to sufficiently enroll adequate samples of individuals across the neurodevelopmental trajectory in studies to enable the characterization and prevention of this heterogeneous disorder. Advances in bioinformatics using a 'Big Data' approach provide an opportunity for novel insights regarding the pathoetiology of BD. The coordinated integration of research centers, inclusive of mixed-age populations, is a promising strategic direction for advancing this line of neuropsychiatric research. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  4. International Barcode of Life: Focus on big biodiversity in South Africa.

    PubMed

    Adamowicz, Sarah J; Hollingsworth, Peter M; Ratnasingham, Sujeevan; van der Bank, Michelle

    2017-11-01

    Participants in the 7th International Barcode of Life Conference (Kruger National Park, South Africa, 20-24 November 2017) share the latest findings in DNA barcoding research and its increasingly diversified applications. Here, we review prevailing trends synthesized from among 429 invited and contributed abstracts, which are collated in this open-access special issue of Genome. Hosted for the first time on the African continent, the 7th Conference places special emphasis on the evolutionary origins, biogeography, and conservation of African flora and fauna. Within Africa and elsewhere, DNA barcoding and related techniques are being increasingly used for wildlife forensics and for the validation of commercial products, such as medicinal plants and seafood species. A striking trend of the conference is the dramatic rise of studies on environmental DNA (eDNA) and on diverse uses of high-throughput sequencing techniques. Emerging techniques in these areas are opening new avenues for environmental biomonitoring, managing species-at-risk and invasive species, and revealing species interaction networks in unprecedented detail. Contributors call for the development of validated community standards for high-throughput sequence data generation and analysis, to enable the full potential of these methods to be realized for understanding and managing biodiversity on a global scale.

  5. Breast cancer: The translation of big genomic data to cancer precision medicine.

    PubMed

    Low, Siew-Kee; Zembutsu, Hitoshi; Nakamura, Yusuke

    2018-03-01

    Cancer is a complex genetic disease that develops from the accumulation of genomic alterations in which germline variations predispose individuals to cancer and somatic alterations initiate and trigger the progression of cancer. For the past 2 decades, genomic research has advanced remarkably, evolving from single-gene to whole-genome screening by using genome-wide association study and next-generation sequencing that contributes to big genomic data. International collaborative efforts have contributed to curating these data to identify clinically significant alterations that could be used in clinical settings. Focusing on breast cancer, the present review summarizes the identification of genomic alterations with high-throughput screening as well as the use of genomic information in clinical trials that match cancer patients to therapies, which further leads to cancer precision medicine. Furthermore, cancer screening and monitoring were enhanced greatly by the use of liquid biopsies. With the growing data complexity and size, there is much anticipation in exploiting deep machine learning and artificial intelligence to curate integrative "-omics" data to refine the current medical practice to be applied in the near future. © 2017 The Authors. Cancer Science published by John Wiley & Sons Australia, Ltd on behalf of Japanese Cancer Association.

  6. Multidisciplinary insight into clonal expansion of HTLV-1-infected cells in adult T-cell leukemia via modeling by deterministic finite automata coupled with high-throughput sequencing.

    PubMed

    Farmanbar, Amir; Firouzi, Sanaz; Park, Sung-Joon; Nakai, Kenta; Uchimaru, Kaoru; Watanabe, Toshiki

    2017-01-31

    Clonal expansion of leukemic cells leads to onset of adult T-cell leukemia (ATL), an aggressive lymphoid malignancy with a very poor prognosis. Infection with human T-cell leukemia virus type-1 (HTLV-1) is the direct cause of ATL onset, and integration of HTLV-1 into the human genome is essential for clonal expansion of leukemic cells. Therefore, monitoring clonal expansion of HTLV-1-infected cells via isolation of integration sites assists in analyzing infected individuals from early infection to the final stage of ATL development. However, because of the complex nature of clonal expansion, the underlying mechanisms have yet to be clarified. Combining computational/mathematical modeling with experimental and clinical data of integration site-based clonality analysis derived from next generation sequencing technologies provides an appropriate strategy to achieve a better understanding of ATL development. As a comprehensively interdisciplinary project, this study combined three main aspects: wet laboratory experiments, in silico analysis and empirical modeling. We analyzed clinical samples from HTLV-1-infected individuals with a broad range of proviral loads using a high-throughput methodology that enables isolation of HTLV-1 integration sites and accurate measurement of the size of infected clones. We categorized clones into four size groups, "very small", "small", "big", and "very big", based on the patterns of clonal growth and observed clone sizes. We propose an empirical formal model based on deterministic finite state automata (DFA) analysis of real clinical samples to illustrate patterns of clonal expansion. Through the developed model, we have translated biological data of clonal expansion into the formal language of mathematics and represented the observed clonality data with DFA. Our data suggest that combining experimental data (absolute size of clones) with DFA can describe the clonality status of patients. This kind of modeling provides a basic understanding as well as a unique perspective for clarifying the mechanisms of clonal expansion in ATL.
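    As a purely illustrative sketch of what a DFA over the four clone-size categories can look like, the snippet below moves a clone up or down one category per observed 'grow'/'shrink' event; the actual transition structure in the paper is derived from the clinical clonality data and is not reproduced here.

```python
# Hypothetical DFA over the four clone-size categories named above.
STATES = ["very small", "small", "big", "very big"]

TRANSITIONS = {}
for i, state in enumerate(STATES):
    TRANSITIONS[(state, "grow")]   = STATES[min(i + 1, len(STATES) - 1)]  # move up one category
    TRANSITIONS[(state, "shrink")] = STATES[max(i - 1, 0)]                # move down one category

def run_dfa(start, events):
    """Follow a sequence of 'grow'/'shrink' events from a starting category."""
    state = start
    for event in events:
        state = TRANSITIONS[(state, event)]
    return state

print(run_dfa("very small", ["grow", "grow", "shrink", "grow", "grow"]))  # -> 'very big'
```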

  7. WFPC2 Filters after 16 Years on Orbit

    NASA Astrophysics Data System (ADS)

    Lian Lim, Pey; Quijada, M.; Baggett, S.; Biretta, J.; MacKenty, J.; Boucarut, R.; Rice, S.; del Hoyo, J.

    2011-01-01

    Wide Field Planetary Camera 2 (WFPC2) was installed on Hubble Space Telescope (HST) in December 1993 during Servicing Mission 1 by the crew of Shuttle Mission STS-61. WFPC2 replaced Wide Field Planetary Camera 1 (WFPC1), providing improved UV performance, more advanced detectors, better contamination control, and its own corrective optics. After 16 years of exceptional service, WFPC2 was retired in May 2009 during Servicing Mission 4, when it was removed from HST in order to allow for the installation of Wide Field Camera 3 (WFC3). WFPC2 was carried back to Earth in the shuttle bay by the crew of Shuttle Mission STS-125. In a joint investigation by Goddard Space Flight Center (GSFC) and Space Telescope Science Institute (STScI), the Selectable Optical Filter Assembly (SOFA) of WFPC2 was extracted and the filter wheels removed and examined for any on-orbit changes. The filters were inspected, photographed and scanned with a spectrophotometer at GSFC. The data have been analyzed at STScI with a view towards understanding how prolonged exposure to the HST space environment affected the filters and what the resultant impacts are to WFPC2 calibrations. We will summarize our results from these post-SM4 laboratory studies, including a comparison of pre- to post-mission filter throughput measurements, evaluations of the UV filter red leaks, and assessment of the condition of the filter coatings.

  8. Using a 3D profiler and infrared camera to monitor oven loading in fully cooked meat operations

    NASA Astrophysics Data System (ADS)

    Stewart, John; Giorges, Aklilu

    2009-05-01

    Ensuring meat is fully cooked is an important food safety issue for operations that produce "ready to eat" products. In order to kill harmful pathogens like Salmonella, all of the product must reach a minimum threshold temperature. Producers typically overcook the majority of the product to ensure meat in the most difficult scenario reaches the desired temperature. A difficult scenario can be caused by an especially thick piece of meat or by a surge of product into the process. Overcooking wastes energy, degrades product quality, lowers the maximum throughput rate of the production line and decreases product yield. At typical production rates of 6000 lbs/hour, these losses from overcooking can have a significant cost impact on producers. A wide-area 3D camera coupled with a thermal camera was used to measure the thermal mass variability of chicken breasts in a cooking process. Several types of variability are considered, including time-varying thermal mass (mass × temperature / time), variation in individual product geometry and variation in product temperature. The automatic identification of product arrangement issues that affect cooking, such as overlapping product and folded products, is also addressed. A thermal model is used along with individual product geometry and oven cook profiles to predict the percentage of product that will be overcooked and to identify products that may not fully cook in a given process.
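
    A hedged sketch of the kind of computation implied by "thermal mass (mass × temperature / time)": combine a height map from the 3D profiler with a co-registered thermal image to estimate the thermal-mass loading per belt scan. The pixel area, product density, and frame period below are illustrative assumptions, not values from the paper.

```python
# Toy estimate of thermal-mass loading from a height map plus a thermal image.
import numpy as np

PIXEL_AREA_CM2 = 0.04      # assumed ground area seen by one pixel (cm^2)
DENSITY_G_PER_CM3 = 1.05   # assumed density of raw chicken breast
FRAME_PERIOD_S = 1.0       # assumed time between successive belt scans

def thermal_mass_per_frame(height_cm, temp_c):
    """Return sum(mass_i * temperature_i) for one scan of the belt (g*degC)."""
    mass_g = height_cm * PIXEL_AREA_CM2 * DENSITY_G_PER_CM3   # per-pixel mass
    return float(np.sum(mass_g * temp_c))

# Synthetic example: a 100x100 scan with ~2 cm of product at ~4 degC.
rng = np.random.default_rng(0)
height = np.clip(rng.normal(2.0, 0.3, (100, 100)), 0, None)
temp = rng.normal(4.0, 0.5, (100, 100))
rate = thermal_mass_per_frame(height, temp) / FRAME_PERIOD_S
print(f"thermal-mass loading: {rate:.0f} g*degC per second")
```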

  9. Stress myocardial perfusion imaging in the emergency department--new techniques for speed and diagnostic accuracy.

    PubMed

    Harrison, Sheri D; Harrison, Mark A; Duvall, W Lane

    2012-05-01

    Emergency room evaluations of patients presenting with chest pain continue to rise, and these evaluations, which often include cardiac imaging, are an increasing area of resource utilization in the current health system. Myocardial perfusion imaging (MPI) from the emergency department remains a vital component of the diagnosis or exclusion of coronary artery disease as the etiology of chest pain. Recent advances in camera technology and changes to the imaging protocols have allowed MPI to become a more efficient way of providing this diagnostic information. Compared with conventional SPECT, new high-efficiency CZT cameras provide a 3- to 5-fold increase in photon sensitivity, a 1.65-fold improvement in energy resolution and a 1.7- to 2.5-fold increase in spatial resolution. With stress-only imaging, rest images are eliminated if stress images are normal, as they provide no additional prognostic or diagnostic value, and cancelling the rest images shortens the length of the test, which is of particular importance to the ED population. The rapid but accurate triage of patients in an ED chest pain unit (CPU) is essential to their care, and stress-only imaging and new CZT cameras allow for shorter test times, lower radiation doses and lower costs while demonstrating good clinical outcomes. These changes to nuclear stress testing can allow for faster throughput of patients through the emergency department while providing a safe and efficient evaluation of chest pain.

  10. Packet based serial link realized in FPGA dedicated for high resolution infrared image transmission

    NASA Astrophysics Data System (ADS)

    Bieszczad, Grzegorz

    2015-05-01

    In this article, the external digital interface designed for a thermographic camera built at the Military University of Technology is described. The aim of the article is to illustrate challenges encountered during the design of a thermal vision camera, especially those related to infrared data processing and transmission. The article explains the main requirements for an interface transferring infrared or video digital data and describes the solution we elaborated based on the Low Voltage Differential Signaling (LVDS) physical layer and signaling scheme. The image transmission link is built using an FPGA with built-in high-speed serial transceivers achieving up to 2.5 Gbps throughput. Image transmission is realized using a proprietary packet protocol. The transmission protocol engine was described in VHDL and tested in FPGA hardware. The link is able to transmit 1280x1024@60Hz 24-bit video data over a single signal pair, and was tested by transmitting the thermal camera picture to a remote monitor. Constructing a dedicated video link reduces power consumption compared with solutions based on ASIC encoders and decoders for video links such as DVI or packet-based DisplayPort, while also reducing the wiring needed to establish the link to one pair. The article describes the functions of the modules integrated in the FPGA design: synchronization to the video source, packetizing of the video stream, interfacing the transceiver module, and dynamic clock generation for video standard conversion.
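
    As a back-of-the-envelope check (my own, not taken from the article), the stated video format does fit a single multi-gigabit serial lane once packet and 8b/10b line-coding overheads are included; the overhead figures below are assumptions.

```python
# Required line rate for 1280x1024 @ 60 Hz, 24 bit/pixel over one serial pair.
WIDTH, HEIGHT, FPS, BITS_PER_PIXEL = 1280, 1024, 60, 24

payload_bps = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL
packet_overhead = 1.05          # assumed ~5% for headers/CRC of the packet protocol
line_coding_overhead = 10 / 8   # 8b/10b encoding typical of FPGA transceivers

required_line_rate = payload_bps * packet_overhead * line_coding_overhead
print(f"payload:            {payload_bps / 1e9:.2f} Gbit/s")
print(f"required line rate: {required_line_rate / 1e9:.2f} Gbit/s")  # ~2.48 Gbit/s
```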

  11. Ultrafast Microfluidic Cellular Imaging by Optical Time-Stretch.

    PubMed

    Lau, Andy K S; Wong, Terence T W; Shum, Ho Cheung; Wong, Kenneth K Y; Tsia, Kevin K

    2016-01-01

    There is an unmet need in biomedicine for measuring a multitude of parameters of individual cells (i.e., high content) in a large population efficiently (i.e., high throughput). This is particularly driven by the emerging interest in bringing Big-Data analysis into this arena, encompassing pathology, drug discovery, rare cancer cell detection, and emulsion microdroplet assays, to name a few. This momentum is particularly evident in recent advancements in flow cytometry. They include scaling of the number of measurable colors from the labeled cells and incorporation of imaging capability to access the morphological information of the cells. However, an unspoken predicament appears in the current technologies: higher content comes at the expense of lower throughput, and vice versa. For example, to access additional spatial information of individual cells, imaging flow cytometers only achieve an imaging throughput of ~1000 cells/s, orders of magnitude slower than non-imaging flow cytometers. In this chapter, we introduce an entirely new imaging platform, namely optical time-stretch microscopy, for ultrahigh-speed, high-contrast, label-free single-cell imaging and analysis (in an ultrafast microfluidic flow up to 10 m/s) with an imaging line-scan rate as high as tens of MHz. Based on this technique, not only can morphological information of the individual cells be obtained in an ultrafast manner, but quantitative evaluation of cellular information (e.g., cell volume, mass, refractive index, stiffness, membrane tension) at the nanometer scale, based on the optical phase, is also possible. The technology can also be integrated with conventional fluorescence measurements widely adopted in non-imaging flow cytometers. Therefore, these two combinatorial and complementary measurement capabilities make this, in the long run, an attractive platform for addressing the pressing need to expand the "parameter space" in high-throughput single-cell analysis. This chapter provides general guidelines for constructing the optical system for time-stretch imaging, the fabrication and design of the microfluidic chip for ultrafast fluidic flow, and the image acquisition and processing.

  12. Bayesian inference with historical data-based informative priors improves detection of differentially expressed genes

    PubMed Central

    Li, Ben; Sun, Zhaonan; He, Qing; Zhu, Yu; Qin, Zhaohui S.

    2016-01-01

    Motivation: Modern high-throughput biotechnologies such as microarray are capable of producing a massive amount of information for each sample. However, in a typical high-throughput experiment, only a limited number of samples is assayed, leading to the classical ‘large p, small n’ problem. On the other hand, rapid propagation of these high-throughput technologies has resulted in a substantial collection of data, often carried out on the same platform and using the same protocol. It is highly desirable to utilize the existing data when performing analysis and inference on a new dataset. Results: Utilizing existing data can be carried out in a straightforward fashion under the Bayesian framework, in which the repository of historical data can be exploited to build informative priors and used in new data analysis. In this work, using microarray data, we investigate the feasibility and effectiveness of deriving informative priors from historical data and using them in the problem of detecting differentially expressed genes. Through simulation and real data analysis, we show that the proposed strategy significantly outperforms existing methods including the popular and state-of-the-art Bayesian hierarchical model-based approaches. Our work illustrates the feasibility and benefits of exploiting the increasingly available genomics big data in statistical inference and presents a promising practical strategy for dealing with the ‘large p, small n’ problem. Availability and implementation: Our method is implemented in R package IPBT, which is freely available from https://github.com/benliemory/IPBT. Contact: yuzhu@purdue.edu; zhaohui.qin@emory.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26519502
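
    A minimal sketch of the general idea, not the IPBT implementation: a large historical collection fixes a per-gene prior variance, toward which the new experiment's variance estimate is shrunk before computing a t-like statistic. The prior strength n0 and the synthetic data are assumptions.

```python
# Empirical-Bayes-style shrinkage of per-gene variances toward a historical prior.
import numpy as np

def moderated_stats(new_a, new_b, historical, n0=20.0):
    """new_a/new_b: genes x replicates for two conditions in the new study;
    historical: genes x samples matrix of past data on the same platform."""
    prior_var = historical.var(axis=1, ddof=1)            # per-gene prior variance
    diff = new_a.mean(axis=1) - new_b.mean(axis=1)
    n = new_a.shape[1] + new_b.shape[1]
    resid = np.concatenate([new_a - new_a.mean(axis=1, keepdims=True),
                            new_b - new_b.mean(axis=1, keepdims=True)], axis=1)
    pooled_var = resid.var(axis=1, ddof=2)                # pooled within-group variance
    post_var = (n0 * prior_var + (n - 2) * pooled_var) / (n0 + n - 2)  # shrinkage
    return diff / np.sqrt(post_var * (1.0 / new_a.shape[1] + 1.0 / new_b.shape[1]))

rng = np.random.default_rng(1)
hist = rng.normal(0, 1, (500, 200))        # 500 genes, 200 historical samples
a, b = rng.normal(0, 1, (500, 3)), rng.normal(0, 1, (500, 3))
b[:10] += 2.0                              # 10 truly differential genes
print(np.argsort(-np.abs(moderated_stats(a, b, hist)))[:10])
```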

  13. Building connected data standards to promote interdisciplinary research in the paleogeosciences- PalEON, Neotoma, THROUGHPUT

    NASA Astrophysics Data System (ADS)

    Goring, S. J.; Richard, S. M.; Williams, J. W.; Dawson, A.

    2017-12-01

    A broad array of data resources, across disciplines, are needed to study Earth system processes operating at multiple spatial or temporal scales. Data friction frequently delays this integrative and interdisciplinary research, while sustainable solutions may be hampered by academic incentives that reward research publication over technical "tool building". The paleogeosciences, in particular, often integrate data drawn from multiple sub-disciplines and from a range of long-tail and big data sources. Data friction can be lowered and the pace of scientific discovery accelerated through the development and adoption of data standards, both within the paleogeosciences and with allied disciplines. Using the PalEON Project (https://sites.nd.edu/paleonproject/) and the Neotoma Paleoecological Database (https://neotomadb.org) as focal case studies, we first illustrate the advances possible through data standardization. We then focus on new efforts in data standardization and on building linkages among paleodata resources underway through the EarthCube-funded Throughput project. A first step underway is to analyze existing standards across paleo-data repositories and identify ways in which the adoption of common standards can promote connectivity, reducing barriers to interdisciplinary research, especially for early career researchers. Experience indicates that standards tend to emerge by necessity and from a mixture of bottom-up and top-down processes. A common pathway is for conventions developed to solve specific problems within a community to be extended to address more general challenges. The Throughput project will identify, document, and promote such solutions to foster wider adoption of standards for data interchange and reduce data friction in the paleogeosciences.

  14. Mechanism Profiling of Hepatotoxicity Caused by Oxidative Stress Using Antioxidant Response Element Reporter Gene Assay Models and Big Data.

    PubMed

    Kim, Marlene Thai; Huang, Ruili; Sedykh, Alexander; Wang, Wenyi; Xia, Menghang; Zhu, Hao

    2016-05-01

    Hepatotoxicity accounts for a substantial number of drugs being withdrawn from the market. Using traditional animal models to detect hepatotoxicity is expensive and time-consuming. Alternative in vitro methods, in particular cell-based high-throughput screening (HTS) studies, have provided the research community with a large amount of data from toxicity assays. Among the various assays used to screen potential toxicants is the antioxidant response element beta lactamase reporter gene assay (ARE-bla), which identifies chemicals that have the potential to induce oxidative stress and was used to test > 10,000 compounds from the Tox21 program. The ARE-bla computational model and HTS data from a big data source (PubChem) were used to profile environmental and pharmaceutical compounds with hepatotoxicity data. Quantitative structure-activity relationship (QSAR) models were developed based on ARE-bla data. The models predicted the potential oxidative stress response for known liver toxicants when no ARE-bla data were available. Liver toxicants were used as probe compounds to search PubChem Bioassay and generate a response profile, which contained thousands of bioassays (> 10 million data points). By ranking the in vitro-in vivo correlations (IVIVCs), the most relevant bioassay(s) related to hepatotoxicity were identified. The liver toxicants profile contained the ARE-bla and relevant PubChem assays. Potential toxicophores for well-known toxicants were created by identifying chemical features that existed only in compounds with high IVIVCs. Profiling chemical IVIVCs created an opportunity to fully explore the source-to-outcome continuum of modern experimental toxicology using cheminformatics approaches and big data sources. Kim MT, Huang R, Sedykh A, Wang W, Xia M, Zhu H. 2016. Mechanism profiling of hepatotoxicity caused by oxidative stress using antioxidant response element reporter gene assay models and big data. Environ Health Perspect 124:634-641; http://dx.doi.org/10.1289/ehp.1509763.
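
    The ranking step can be illustrated with a toy example (synthetic data, not the paper's PubChem profile): given bioassay responses for a set of probe liver toxicants and their in vivo hepatotoxicity calls, assays are ranked by an in vitro-in vivo correlation score.

```python
# Rank synthetic bioassays by their correlation with an in vivo hepatotoxicity call.
import numpy as np

rng = np.random.default_rng(7)
n_compounds, n_assays = 60, 200
in_vivo = rng.integers(0, 2, n_compounds)                  # 1 = hepatotoxic probe compound
assays = rng.normal(0, 1, (n_compounds, n_assays))
assays[:, 0] += 1.5 * in_vivo                              # one truly relevant assay

def ivivc(assay_col, labels):
    """Point-biserial correlation between assay activity and the in vivo call."""
    return np.corrcoef(assay_col, labels)[0, 1]

scores = np.array([ivivc(assays[:, j], in_vivo) for j in range(n_assays)])
ranking = np.argsort(-np.abs(scores))
print("most hepatotoxicity-relevant assays:", ranking[:5])
```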

  15. Mechanism Profiling of Hepatotoxicity Caused by Oxidative Stress Using Antioxidant Response Element Reporter Gene Assay Models and Big Data

    PubMed Central

    Kim, Marlene Thai; Huang, Ruili; Sedykh, Alexander; Wang, Wenyi; Xia, Menghang; Zhu, Hao

    2015-01-01

    Background: Hepatotoxicity accounts for a substantial number of drugs being withdrawn from the market. Using traditional animal models to detect hepatotoxicity is expensive and time-consuming. Alternative in vitro methods, in particular cell-based high-throughput screening (HTS) studies, have provided the research community with a large amount of data from toxicity assays. Among the various assays used to screen potential toxicants is the antioxidant response element beta lactamase reporter gene assay (ARE-bla), which identifies chemicals that have the potential to induce oxidative stress and was used to test > 10,000 compounds from the Tox21 program. Objective: The ARE-bla computational model and HTS data from a big data source (PubChem) were used to profile environmental and pharmaceutical compounds with hepatotoxicity data. Methods: Quantitative structure–activity relationship (QSAR) models were developed based on ARE-bla data. The models predicted the potential oxidative stress response for known liver toxicants when no ARE-bla data were available. Liver toxicants were used as probe compounds to search PubChem Bioassay and generate a response profile, which contained thousands of bioassays (> 10 million data points). By ranking the in vitro–in vivo correlations (IVIVCs), the most relevant bioassay(s) related to hepatotoxicity were identified. Results: The liver toxicants profile contained the ARE-bla and relevant PubChem assays. Potential toxicophores for well-known toxicants were created by identifying chemical features that existed only in compounds with high IVIVCs. Conclusion: Profiling chemical IVIVCs created an opportunity to fully explore the source-to-outcome continuum of modern experimental toxicology using cheminformatics approaches and big data sources. Citation: Kim MT, Huang R, Sedykh A, Wang W, Xia M, Zhu H. 2016. Mechanism profiling of hepatotoxicity caused by oxidative stress using antioxidant response element reporter gene assay models and big data. Environ Health Perspect 124:634–641; http://dx.doi.org/10.1289/ehp.1509763 PMID:26383846

  16. A data analysis framework for biomedical big data: Application on mesoderm differentiation of human pluripotent stem cells

    PubMed Central

    Karlsson, Alexander; Riveiro, Maria; Améen, Caroline; Åkesson, Karolina; Andersson, Christian X.; Sartipy, Peter; Synnergren, Jane

    2017-01-01

    The development of high-throughput biomolecular technologies has resulted in generation of vast omics data at an unprecedented rate. This is transforming biomedical research into a big data discipline, where the main challenges relate to the analysis and interpretation of data into new biological knowledge. The aim of this study was to develop a framework for biomedical big data analytics, and apply it for analyzing transcriptomics time series data from early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. To this end, transcriptome profiling by microarray was performed on differentiating human pluripotent stem cells sampled at eleven consecutive days. The gene expression data was analyzed using the five-stage analysis framework proposed in this study, including data preparation, exploratory data analysis, confirmatory analysis, biological knowledge discovery, and visualization of the results. Clustering analysis revealed several distinct expression profiles during differentiation. Genes with an early transient response were strongly related to embryonic- and mesendoderm development, for example CER1 and NODAL. Pluripotency genes, such as NANOG and SOX2, exhibited substantial downregulation shortly after onset of differentiation. Rapid induction of genes related to metal ion response, cardiac tissue development, and muscle contraction were observed around day five and six. Several transcription factors were identified as potential regulators of these processes, e.g. POU1F1, TCF4 and TBP for muscle contraction genes. Pathway analysis revealed temporal activity of several signaling pathways, for example the inhibition of WNT signaling on day 2 and its reactivation on day 4. This study provides a comprehensive characterization of biological events and key regulators of the early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. The proposed analysis framework can be used to structure data analysis in future research, both in stem cell differentiation, and more generally, in biomedical big data analytics. PMID:28654683
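
    One exploratory step mentioned above, clustering per-gene profiles across the eleven sampled days, could look roughly like the sketch below; the cluster count, scaling choices, and synthetic data are assumptions rather than the study's settings.

```python
# Cluster per-gene time-series expression profiles over 11 days.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n_genes, n_days = 1000, 11
expr = rng.normal(0, 1, (n_genes, n_days))
expr[:200] += np.linspace(0, 3, n_days)          # a "late induction" pattern
expr[200:400] += 3 * np.exp(-np.arange(n_days))  # an "early transient" pattern

profiles = StandardScaler().fit_transform(expr.T).T   # z-score each gene across the days
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(profiles)
for k in range(4):
    print(f"cluster {k}: {np.sum(labels == k)} genes")
```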

  17. A data analysis framework for biomedical big data: Application on mesoderm differentiation of human pluripotent stem cells.

    PubMed

    Ulfenborg, Benjamin; Karlsson, Alexander; Riveiro, Maria; Améen, Caroline; Åkesson, Karolina; Andersson, Christian X; Sartipy, Peter; Synnergren, Jane

    2017-01-01

    The development of high-throughput biomolecular technologies has resulted in generation of vast omics data at an unprecedented rate. This is transforming biomedical research into a big data discipline, where the main challenges relate to the analysis and interpretation of data into new biological knowledge. The aim of this study was to develop a framework for biomedical big data analytics, and apply it for analyzing transcriptomics time series data from early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. To this end, transcriptome profiling by microarray was performed on differentiating human pluripotent stem cells sampled at eleven consecutive days. The gene expression data was analyzed using the five-stage analysis framework proposed in this study, including data preparation, exploratory data analysis, confirmatory analysis, biological knowledge discovery, and visualization of the results. Clustering analysis revealed several distinct expression profiles during differentiation. Genes with an early transient response were strongly related to embryonic- and mesendoderm development, for example CER1 and NODAL. Pluripotency genes, such as NANOG and SOX2, exhibited substantial downregulation shortly after onset of differentiation. Rapid induction of genes related to metal ion response, cardiac tissue development, and muscle contraction were observed around day five and six. Several transcription factors were identified as potential regulators of these processes, e.g. POU1F1, TCF4 and TBP for muscle contraction genes. Pathway analysis revealed temporal activity of several signaling pathways, for example the inhibition of WNT signaling on day 2 and its reactivation on day 4. This study provides a comprehensive characterization of biological events and key regulators of the early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. The proposed analysis framework can be used to structure data analysis in future research, both in stem cell differentiation, and more generally, in biomedical big data analytics.

  18. Big Data over a 100G network at Fermilab

    DOE PAGES

    Garzoglio, Gabriele; Mhashilkar, Parag; Kim, Hyunwoo; ...

    2014-06-11

    As the need for Big Data in science becomes ever more relevant, networks around the world are upgrading their infrastructure to support high-speed interconnections. To support its mission, the high-energy physics community, as a pioneer in Big Data, has always relied on the Fermi National Accelerator Laboratory to be at the forefront of storage and data movement. This need was reiterated in recent years with the data-taking rate of the major LHC experiments reaching tens of petabytes per year. At Fermilab, this regularly resulted in peaks of data movement on the Wide Area Network (WAN) in and out of the laboratory of about 30 Gbit/s, and on the Local Area Network (LAN) between storage and computational farms of 160 Gbit/s. To address these ever increasing needs, as of this year Fermilab is connected to the Energy Sciences Network (ESnet) through a 100 Gb/s link. To understand the optimal system- and application-level configuration to interface computational systems with the new high-speed interconnect, Fermilab has deployed a Network Research & Development facility connected to the ESnet 100G Testbed. For the past two years, the High Throughput Data Program (HTDP) has been using the Testbed to identify gaps in data movement middleware [5] when transferring data at these high speeds. The program has published evaluations of technologies typically used in High Energy Physics, such as GridFTP [4], XrootD [9], and Squid [8]. Furthermore, this work presents the new R&D facility and the continuation of the evaluation program.

  19. Big Data over a 100G network at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garzoglio, Gabriele; Mhashilkar, Parag; Kim, Hyunwoo

    As the need for Big Data in science becomes ever more relevant, networks around the world are upgrading their infrastructure to support high-speed interconnections. To support its mission, the high-energy physics community, as a pioneer in Big Data, has always relied on the Fermi National Accelerator Laboratory to be at the forefront of storage and data movement. This need was reiterated in recent years with the data-taking rate of the major LHC experiments reaching tens of petabytes per year. At Fermilab, this regularly resulted in peaks of data movement on the Wide Area Network (WAN) in and out of the laboratory of about 30 Gbit/s, and on the Local Area Network (LAN) between storage and computational farms of 160 Gbit/s. To address these ever increasing needs, as of this year Fermilab is connected to the Energy Sciences Network (ESnet) through a 100 Gb/s link. To understand the optimal system- and application-level configuration to interface computational systems with the new high-speed interconnect, Fermilab has deployed a Network Research & Development facility connected to the ESnet 100G Testbed. For the past two years, the High Throughput Data Program (HTDP) has been using the Testbed to identify gaps in data movement middleware [5] when transferring data at these high speeds. The program has published evaluations of technologies typically used in High Energy Physics, such as GridFTP [4], XrootD [9], and Squid [8]. Furthermore, this work presents the new R&D facility and the continuation of the evaluation program.

  20. Computational solutions to large-scale data management and analysis

    PubMed Central

    Schadt, Eric E.; Linderman, Michael D.; Sorenson, Jon; Lee, Lawrence; Nolan, Garry P.

    2011-01-01

    Today we can generate hundreds of gigabases of DNA and RNA sequencing data in a week for less than US$5,000. The astonishing rate of data generation by these low-cost, high-throughput technologies in genomics is being matched by that of other technologies, such as real-time imaging and mass spectrometry-based flow cytometry. Success in the life sciences will depend on our ability to properly interpret the large-scale, high-dimensional data sets that are generated by these technologies, which in turn requires us to adopt advances in informatics. Here we discuss how we can master the different types of computational environments that exist — such as cloud and heterogeneous computing — to successfully tackle our big data problems. PMID:20717155

  1. [Determination of radioactivity by smartphones].

    PubMed

    Hartmann, H; Freudenberg, R; Andreeff, M; Kotzerke, J

    2013-01-01

    The interest in the detection of radioactive materials has increased strongly after the accident at the Fukushima nuclear power plant and has led to a shortage of suitable measuring instruments. Smartphones equipped with a commercially available software tool can be used for dose rate measurements after a calibration specific to the camera module. We examined whether such measurements provide reliable data for typical activities and radionuclides in nuclear medicine. For the nuclides 99mTc (10 - 1000 MBq), 131I (3.7 - 1800 MBq, therapy capsule) and 68Ga (50 - 600 MBq), radioactivity with defined geometry was measured at different distances. The smartphones Milestone Droid 1 (Motorola) and HTC Desire (HTC Corporation) were compared with the standard instruments AD6 (automess) and DoseGUARD (AEA Technology). Measurements with the smartphones and the other devices show good agreement: a linear signal increase with rising activity and dose rate. A long-term measurement (131I, 729 MBq, 0.5 m, 60 min) showed a considerably higher variation (about 20%) in the smartphone readings compared with the AD6. For low dose rates (< 1 µGy/h), the sensitivity decreases so that measurements of, e.g., the natural radiation exposure do not yield valid results. The calibration of the camera responsivity has a large influence on the results because of the small detector area of the camera's semiconductor sensor. With commercial software, the camera module of a smartphone can be used for the measurement of radioactivity. Dose rates resulting from typical nuclear medicine procedures can be measured reliably (e.g., the discharge dose rate after radioiodine therapy).
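
    The calibration step described above amounts to fitting a linear response of camera-derived counts against a reference dose rate and inverting it for later measurements; the sketch below uses made-up numbers purely for illustration.

```python
# Linear calibration of camera count rate against a reference dose-rate instrument.
import numpy as np

reference_uGy_h = np.array([2, 5, 10, 20, 50, 100], dtype=float)   # reference dose rates
camera_counts_s = 0.8 * reference_uGy_h + np.random.default_rng(4).normal(0, 1, 6)

slope, intercept = np.polyfit(reference_uGy_h, camera_counts_s, 1)  # counts/s per uGy/h

def dose_rate_from_counts(counts_per_s):
    """Invert the linear calibration to report a dose rate in uGy/h."""
    return (counts_per_s - intercept) / slope

print(f"calibration: counts/s = {slope:.2f} * D + {intercept:.2f}")
print(f"40 counts/s -> {dose_rate_from_counts(40):.1f} uGy/h")
```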

  2. Machine Learning for Zwicky Transient Facility

    NASA Astrophysics Data System (ADS)

    Mahabal, Ashish; Zwicky Transient Facility, Catalina Real-Time Transient Survey

    2018-01-01

    The Zwicky Transient Facility (ZTF) will operate from 2018 to 2020, covering the accessible sky with its large 47-square-degree camera. The transient detection rate is expected to be about a million per night. ZTF is thus a perfect LSST prototype. The big difference is that all of the ZTF transients can be followed up by 4- to 8-m class telescopes. Given the large numbers, using human scanners to separate the genuine transients from artifacts is out of the question. That first step, as well as classifying the transients with minimal follow-up, requires machine learning. We describe the tools and plans to take on this task using follow-up facilities and knowledge gained from archival datasets.

  3. Development Tests of a Cryogenic Filter Wheel Assembly for the NIRCam Instrument

    NASA Technical Reports Server (NTRS)

    McCully, Sean; Clark, Charles; Schermerhorn, Michael; Trojanek, Filip; O'Hara, Mark; Williams, Jeff; Thatcher, John

    2006-01-01

    The James Webb Space Telescope is an infrared-optimized space telescope scheduled for launch in 2013. Its 6.5-m diameter primary mirror will collect light from some of the first galaxies formed after the big bang. The Near Infrared Camera (NIRCam) will detect the first light from these galaxies, provide the necessary tools for studying the formation of stars, aid in discovering planets around other stars, and adjust the wavefront error on the primary mirror. The instrument and its complement of mechanisms and optics will operate at a cryogenic temperature of 35 K. This paper describes tests and test results of the NIRCam Filter Wheel assembly prototype.

  4. Breeding to adapt agriculture to climate change: affordable phenotyping solutions.

    PubMed

    Araus, José L; Kefauver, Shawn C

    2018-05-28

    Breeding is one of the central pillars of adaptation of crops to climate change. However, phenotyping is a key bottleneck that is limiting breeding efficiency. The awareness of phenotyping as a breeding limitation is not only sustained by the lack of adequate approaches, but also by the perception that phenotyping is an expensive activity. Phenotyping is not just dependent on the choice of appropriate traits and tools (e.g. sensors) but relies on how these tools are deployed on their carrying platforms, the speed and volume of data extraction and analysis (throughput), the handling of spatial variability and characterization of environmental conditions, and finally how all the information is integrated and processed. Affordable high throughput phenotyping aims to achieve reasonably priced solutions for all the components comprising the phenotyping pipeline. This mini-review will cover current and imminent solutions for all these components, from the increasing use of conventional digital RGB cameras, within the category of sensors, to open-access cloud-structured data processing and the use of smartphones. Emphasis will be placed on field phenotyping, which is really the main application for day-to-day phenotyping. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Isotonic Regression Based-Method in Quantitative High-Throughput Screenings for Genotoxicity

    PubMed Central

    Fujii, Yosuke; Narita, Takeo; Tice, Raymond Richard; Takeda, Shunich

    2015-01-01

    Quantitative high-throughput screenings (qHTSs) for genotoxicity are conducted as part of comprehensive toxicology screening projects. The most widely used method is to compare the dose-response data of a wild-type and DNA repair gene knockout mutants, using model-fitting to the Hill equation (HE). However, this method performs poorly when the observed viability does not fit the equation well, as frequently happens in qHTS. More capable methods must be developed for qHTS where large data variations are unavoidable. In this study, we applied an isotonic regression (IR) method and compared its performance with HE under multiple data conditions. When dose-response data were suitable to draw HE curves with upper and lower asymptotes and experimental random errors were small, HE was better than IR, but when random errors were big, there was no difference between HE and IR. However, when the drawn curves did not have two asymptotes, IR showed better performance (p < 0.05, exact paired Wilcoxon test) with higher specificity (65% in HE vs. 96% in IR). In summary, IR performed similarly to HE when dose-response data were optimal, whereas IR clearly performed better in suboptimal conditions. These findings indicate that IR would be useful in qHTS for comparing dose-response data. PMID:26673567
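
    The comparison can be reproduced in outline with standard tools: a least-squares Hill-equation fit next to a shape-constrained isotonic fit. The synthetic dose-response data and parameter bounds below are assumptions, not the authors' qHTS settings.

```python
# Hill-equation fit versus isotonic regression on a synthetic dose-response curve.
import numpy as np
from scipy.optimize import curve_fit
from sklearn.isotonic import IsotonicRegression

def hill(dose, bottom, top, ec50, n):
    """Decreasing Hill curve: viability falls from `top` to `bottom` with dose."""
    return bottom + (top - bottom) / (1.0 + (dose / ec50) ** n)

doses = np.logspace(-2, 2, 15)                       # arbitrary concentration units
rng = np.random.default_rng(5)
viability = hill(doses, 10, 100, 1.0, 1.5) + rng.normal(0, 8, doses.size)

# Hill fit (can behave poorly when one or both asymptotes are not observed).
params, _ = curve_fit(hill, doses, viability, p0=[0.0, 100.0, 1.0, 1.0],
                      bounds=([0.0, 0.0, 1e-3, 0.1], [50.0, 200.0, 100.0, 10.0]),
                      maxfev=10000)
hill_fit = hill(doses, *params)

# Isotonic fit assumes only that viability is non-increasing with log dose.
iso_fit = IsotonicRegression(increasing=False).fit_transform(np.log10(doses), viability)

print("Hill RMSE:     ", np.sqrt(np.mean((viability - hill_fit) ** 2)).round(2))
print("Isotonic RMSE: ", np.sqrt(np.mean((viability - iso_fit) ** 2)).round(2))
```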

  6. [Research on Spectral Polarization Imaging System Based on Static Modulation].

    PubMed

    Zhao, Hai-bo; Li, Huan; Lin, Xu-ling; Wang, Zheng

    2015-04-01

    The main disadvantages of traditional spectral polarization imaging systems are a complex structure, moving parts, and low throughput. A novel spectral polarization imaging method is discussed, based on static polarization intensity modulation combined with Savart-polariscope interference imaging. The imaging system can obtain spectral information and the four Stokes polarization parameters in real time. Compared with conventional methods, the advantages of the imaging system are compactness, low mass, no moving parts, no electrical control, no slit, and large throughput. The system structure and the basic theory are introduced. An experimental system was established in the laboratory, consisting of reimaging optics, a polarization intensity modulation module, an interference imaging module, and a CCD data collection and processing module. The spectral range is visible and near-infrared (480-950 nm). A white board and a toy plane were imaged using the experimental system, verifying the ability to obtain spectral polarization imaging information. A calibration system for the static polarization modulation was set up; the statistical error of the degree-of-polarization measurement is less than 5%. The validity and feasibility of the basic principle are demonstrated by the experimental results. The spectral polarization data captured by the system can be applied to object identification, object classification and remote sensing detection.
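
    For orientation only, the snippet below shows the textbook division-of-time recipe for the four Stokes parameters from six intensity measurements; the static-modulation system described above recovers the same quantities from a single interferogram instead, so this is not the paper's demodulation, and the intensities are made up.

```python
# Classic six-measurement Stokes-parameter calculation and degree of polarization.
import numpy as np

def stokes(i0, i45, i90, i135, i_rcp, i_lcp):
    """Stokes vector from linear (0/45/90/135 deg) and circular (RCP/LCP) intensities."""
    s0 = i0 + i90
    s1 = i0 - i90
    s2 = i45 - i135
    s3 = i_rcp - i_lcp
    return np.array([s0, s1, s2, s3])

def degree_of_polarization(s):
    return np.sqrt(s[1] ** 2 + s[2] ** 2 + s[3] ** 2) / s[0]

s = stokes(1.0, 0.75, 0.5, 0.75, 0.8, 0.7)   # made-up intensities
print(s, f"DoP = {degree_of_polarization(s):.2f}")
```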

  7. Vision-based Nano Robotic System for High-throughput Non-embedded Cell Cutting

    NASA Astrophysics Data System (ADS)

    Shang, Wanfeng; Lu, Haojian; Wan, Wenfeng; Fukuda, Toshio; Shen, Yajing

    2016-03-01

    Cell cutting is a significant task in biology studies, but highly productive non-embedded cell cutting remains a big challenge for current techniques. This paper proposes a vision-based nano robotic system and then realizes automatic non-embedded cell cutting with this system. First, the nano robotic system is developed and integrated with a nanoknife inside an environmental scanning electron microscope (ESEM). Then, the positions of the nanoknife and the single cell are recognized, and the distance between them is calculated dynamically based on image processing. To guarantee the positioning accuracy and the working efficiency, we propose a distance-regulated speed-adapting strategy, in which the moving speed is adjusted intelligently based on the distance between the nanoknife and the target cell. The results indicate that automatic non-embedded cutting can be achieved within 1-2 min with low invasiveness, benefiting from the highly precise nanorobotic system and the sharp edge of the nanoknife. This research paves the way for high-throughput cell cutting under the cell's natural conditions, which is expected to make a significant impact on biology studies, especially for in-situ analysis at the cellular and subcellular scale, such as cell interaction investigation, neural signal transduction and minimally invasive cell surgery.
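
    The distance-regulated speed-adapting strategy can be caricatured as a simple lookup from the image-derived knife-to-cell distance to an approach speed; the thresholds and speeds below are illustrative assumptions, not the paper's parameters.

```python
# Toy distance-regulated speed adaptation for a knife approaching a target cell.
def select_speed(distance_um):
    """Return an approach speed (um/s) as a function of knife-to-cell distance."""
    if distance_um > 50.0:
        return 20.0          # coarse, fast approach
    if distance_um > 10.0:
        return 5.0           # intermediate speed
    return 0.5               # fine positioning near the cell

# Simulated approach loop driven by successive vision-based distance measurements.
position, target, dt = 80.0, 0.0, 0.1
while position - target > 0.1:
    position -= select_speed(position - target) * dt
print(f"final knife-target distance: {position - target:.2f} um")
```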

  8. Applying Evolutionary Genetics to Developmental Toxicology and Risk Assessment

    PubMed Central

    Leung, Maxwell C. K.; Procter, Andrew C.; Goldstone, Jared V.; Foox, Jonathan; DeSalle, Robert; Mattingly, Carolyn J.; Siddall, Mark E.; Timme-Laragy, Alicia R.

    2018-01-01

    Evolutionary thinking continues to challenge our views on health and disease. Yet, there is a communication gap between evolutionary biologists and toxicologists in recognizing the connections among developmental pathways, high-throughput screening, and birth defects in humans. To increase our capability in identifying potential developmental toxicants in humans, we propose to apply evolutionary genetics to improve the experimental design and data interpretation with various in vitro and whole-organism models. We review five molecular systems of stress response and update 18 consensual cell-cell signaling pathways that are the hallmark for early development, organogenesis, and differentiation; and revisit the principles of teratology in light of recent advances in high-throughput screening, big data techniques, and systems toxicology. Multiscale systems modeling plays an integral role in the evolutionary approach to cross-species extrapolation. Phylogenetic analysis and comparative bioinformatics are both valuable tools in identifying and validating the molecular initiating events that account for adverse developmental outcomes in humans. The discordance of susceptibility between test species and humans (ontogeny) reflects their differences in evolutionary history (phylogeny). This synthesis not only can lead to novel applications in developmental toxicity and risk assessment, but also can pave the way for applying an evo-devo perspective to the study of developmental origins of health and disease. PMID:28267574

  9. Characterization and Performance of the Cananea Near-infrared Camera (CANICA)

    NASA Astrophysics Data System (ADS)

    Devaraj, R.; Mayya, Y. D.; Carrasco, L.; Luna, A.

    2018-05-01

    We present details of the characterization and imaging performance of the Cananea Near-infrared Camera (CANICA) at the 2.1 m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located in Cananea, Sonora, México. CANICA has a HAWAII array with a HgCdTe detector of 1024 × 1024 pixels covering a field of view of 5.5 × 5.5 arcmin2 with a plate scale of 0.32 arcsec/pixel. The camera characterization involved measuring key detector parameters: conversion gain, dark current, readout noise, and linearity. The pixels in the detector have a full-well depth of 100,000 e‑ with the conversion gain measured to be 5.8 e‑/ADU. The time-dependent dark current was estimated to be 1.2 e‑/sec. Readout noise for the correlated double sampling (CDS) technique was measured to be 30 e‑/pixel. The detector shows 10% non-linearity close to the full-well depth. The non-linearity was corrected to within 1% for the CDS images. Full-field imaging performance was evaluated by measuring the point spread function, zeropoints, throughput, and limiting magnitude. The average zeropoint values are J = 20.52, H = 20.63, and K = 20.23. The saturation limit of the detector is about sixth magnitude in all the primary broadbands. CANICA on the 2.1 m OAGH telescope reaches background-limited magnitudes of J = 18.5, H = 17.6, and K = 16.0 for a signal-to-noise ratio of 10 with an integration time of 900 s.
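
    Two routine reductions sit behind numbers like those quoted above: converting ADU to electrons with the measured conversion gain, and turning an instrumental count rate into a magnitude with a filter zeropoint. The example count rates below are made up; only the gain and the J-band zeropoint come from the abstract.

```python
# ADU-to-electron conversion and zeropoint-based magnitudes for a NIR camera.
import numpy as np

GAIN_E_PER_ADU = 5.8          # conversion gain from the abstract
ZP_J = 20.52                  # J-band zeropoint from the abstract

def electrons(adu):
    """Convert a CDS frame (or count) in ADU to electrons."""
    return adu * GAIN_E_PER_ADU

def instrumental_mag(count_rate_adu_s, zeropoint):
    """Magnitude from a sky-subtracted source count rate (ADU/s)."""
    return zeropoint - 2.5 * np.log10(count_rate_adu_s)

print(instrumental_mag(100.0, ZP_J))   # a 100 ADU/s source -> J ~ 15.5
print(electrons(np.array([1000.0])))   # 1000 ADU -> 5800 e-
```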

  10. Provenance-aware optimization of workload for distributed data production

    NASA Astrophysics Data System (ADS)

    Makatun, Dzmitry; Lauret, Jérôme; Rudová, Hana; Šumbera, Michal

    2017-10-01

    Distributed data processing in High Energy and Nuclear Physics (HENP) is a prominent example of big data analysis. Having petabytes of data being processed at tens of computational sites with thousands of CPUs, standard job scheduling approaches either do not address the problem complexity well or are dedicated to one specific aspect of the problem only (CPU, network or storage). Previously, we developed a new job scheduling approach dedicated to distributed data production - an essential part of data processing in HENP (preprocessing in big data terminology). In this contribution, we discuss load balancing with multiple data sources and data replication, present recent improvements made to our planner, and provide results of simulations which demonstrate the advantage over standard scheduling policies for the new use case. Multiple sources, or provenance, are common in the computing models of many applications, where the data may be copied to several destinations. The initial input data set is hence already partially replicated to multiple locations, and the task of the scheduler is to maximize overall computational throughput considering possible data movements and CPU allocation. The studies have shown that our approach can provide a significant gain in overall computational performance across a wide range of simulations considering a realistic size of the computational Grid and various input data distributions.
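
    A toy sketch of the multi-source idea (not the authors' planner): each input file already has replicas at several sites, and work is sent to the replica holder with the most free CPU slots, falling back to a transfer only when every holder is saturated. Site names and capacities are hypothetical.

```python
# Greedy replica-aware assignment of data-production jobs to sites.
def assign(files_to_replicas, site_slots):
    """files_to_replicas: {file: [site, ...]}; site_slots: {site: free CPU slots}.
    Returns (plan, n_transfers); negative slot counts simply mean queued jobs."""
    plan, transfers = {}, 0
    for f, holders in files_to_replicas.items():
        best = max(holders, key=lambda s: site_slots[s])    # prefer a replica holder
        if site_slots[best] <= 0:                           # every holder is saturated
            best = max(site_slots, key=site_slots.get)      # schedule a WAN transfer instead
            transfers += 1
        plan[f] = best
        site_slots[best] -= 1
    return plan, transfers

replicas = {"run01": ["bnl", "cern"], "run02": ["cern"], "run03": ["bnl"], "run04": ["cern"]}
print(assign(replicas, {"bnl": 2, "cern": 1, "lbl": 3}))
```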

  11. Asymmetric author-topic model for knowledge discovering of big data in toxicogenomics.

    PubMed

    Chung, Ming-Hua; Wang, Yuping; Tang, Hailin; Zou, Wen; Basinger, John; Xu, Xiaowei; Tong, Weida

    2015-01-01

    The advancement of high-throughput screening technologies facilitates the generation of massive amounts of biological data, a big data phenomenon in biomedical science. Yet, researchers still heavily rely on keyword search and/or literature review to navigate the databases, and analyses are often done at rather small scale. As a result, the rich information of a database has not been fully utilized, particularly the information embedded in the interactions between data points, which is largely ignored and buried. For the past 10 years, probabilistic topic modeling has been recognized as an effective machine learning algorithm to annotate the hidden thematic structure of massive collections of documents. The analogy between a text corpus and large-scale genomic data enables the application of text mining tools, like probabilistic topic models, to explore hidden patterns of genomic data and, by extension, altered biological functions. In this paper, we developed a generalized probabilistic topic model to analyze a toxicogenomics dataset that consists of a large number of gene expression profiles from rat livers treated with drugs at multiple doses and time points. We discovered hidden patterns in gene expression associated with the effects of dose and time point of treatment. Finally, we illustrated the ability of our model to identify evidence for a potential reduction of animal use.
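
    The document-word analogy can be sketched with a standard LDA implementation in place of the authors' asymmetric author-topic model: treatments play the role of documents and discretized per-gene expression plays the role of word counts. The discretization and synthetic data are assumptions.

```python
# Standard LDA on a synthetic treatment-by-gene "count" matrix (document-word analogy).
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(11)
n_treatments, n_genes = 120, 800            # rows: drug/dose/time combinations
expression = rng.normal(0, 1, (n_treatments, n_genes))
expression[:40, :100] += 2.0                # a shared expression signature

# "Word counts": discrete up-regulation units per gene per treatment (crude discretization).
counts = np.clip(np.rint(expression), 0, None).astype(int)

lda = LatentDirichletAllocation(n_components=5, random_state=0).fit(counts)
top_genes = np.argsort(-lda.components_, axis=1)[:, :10]
print("top genes for topic 0:", top_genes[0])
```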

  12. clubber: removing the bioinformatics bottleneck in big data analyses.

    PubMed

    Miller, Maximilian; Zhu, Chengsheng; Bromberg, Yana

    2017-06-13

    With the advent of modern day high-throughput technologies, the bottleneck in biological discovery has shifted from the cost of doing experiments to that of analyzing results. clubber is our automated cluster-load balancing system developed for optimizing these "big data" analyses. Its plug-and-play framework encourages re-use of existing solutions for bioinformatics problems. clubber's goals are to reduce computation times and to facilitate use of cluster computing. The first goal is achieved by automating the balance of parallel submissions across available high performance computing (HPC) resources. Notably, the latter can be added on demand, including cloud-based resources, and/or featuring heterogeneous environments. The second goal of making HPCs user-friendly is facilitated by an interactive web interface and a RESTful API, allowing for job monitoring and result retrieval. We used clubber to speed up our pipeline for annotating molecular functionality of metagenomes. Here, we analyzed the Deepwater Horizon oil-spill study data to quantitatively show that the beach sands have not yet entirely recovered. Further, our analysis of the CAMI-challenge data revealed that microbiome taxonomic shifts do not necessarily correlate with functional shifts. These examples (21 metagenomes processed in 172 min) clearly illustrate the importance of clubber in the everyday computational biology environment.
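
    The load-balancing idea is easy to caricature: assign each job to whichever cluster currently has the smallest queued work relative to its capacity. The snippet below is such a toy balancer, not clubber's code; cluster names and slot counts are hypothetical.

```python
# Toy capacity-weighted load balancer for batches of analysis jobs.
import heapq

def balance(jobs, clusters):
    """clusters: {name: slots}. Returns {name: [job, ...]}, greedily balanced
    by the ratio of assigned jobs to available slots."""
    heap = [(0.0, name) for name in clusters]      # (load ratio, cluster)
    heapq.heapify(heap)
    assignment = {name: [] for name in clusters}
    for job in jobs:
        _, name = heapq.heappop(heap)              # least-loaded cluster
        assignment[name].append(job)
        heapq.heappush(heap, (len(assignment[name]) / clusters[name], name))
    return assignment

jobs = [f"metagenome_{i:02d}" for i in range(21)]
plan = balance(jobs, {"local_hpc": 64, "cloud": 128, "lab_server": 8})
print({name: len(assigned) for name, assigned in plan.items()})
```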

  13. clubber: removing the bioinformatics bottleneck in big data analyses

    PubMed Central

    Miller, Maximilian; Zhu, Chengsheng; Bromberg, Yana

    2018-01-01

    With the advent of modern day high-throughput technologies, the bottleneck in biological discovery has shifted from the cost of doing experiments to that of analyzing results. clubber is our automated cluster-load balancing system developed for optimizing these “big data” analyses. Its plug-and-play framework encourages re-use of existing solutions for bioinformatics problems. clubber’s goals are to reduce computation times and to facilitate use of cluster computing. The first goal is achieved by automating the balance of parallel submissions across available high performance computing (HPC) resources. Notably, the latter can be added on demand, including cloud-based resources, and/or featuring heterogeneous environments. The second goal of making HPCs user-friendly is facilitated by an interactive web interface and a RESTful API, allowing for job monitoring and result retrieval. We used clubber to speed up our pipeline for annotating molecular functionality of metagenomes. Here, we analyzed the Deepwater Horizon oil-spill study data to quantitatively show that the beach sands have not yet entirely recovered. Further, our analysis of the CAMI-challenge data revealed that microbiome taxonomic shifts do not necessarily correlate with functional shifts. These examples (21 metagenomes processed in 172 min) clearly illustrate the importance of clubber in the everyday computational biology environment. PMID:28609295

  14. Big Spherules near 'Victoria'

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This frame from the microscopic imager on NASA's Mars Exploration Rover Opportunity shows spherules up to about 5 millimeters (one-fifth of an inch) in diameter. The camera took this image during the 924th Martian day, or sol, of Opportunity's Mars-surface mission (Aug. 30, 2006), when the rover was about 200 meters (650 feet) north of 'Victoria Crater.'

    Opportunity discovered spherules like these, nicknamed 'blueberries,' at its landing site in 'Eagle Crater,' and investigations determined them to be iron-rich concretions that formed inside deposits soaked with groundwater. However, such concretions were much smaller or absent at the ground surface along much of the rover's trek of more than 5 kilometers (3 miles) southward to Victoria. The big ones showed up again when Opportunity got to the ring, or annulus, of material excavated and thrown outward by the impact that created Victoria Crater. Researchers hypothesize that some layer beneath the surface in Victoria's vicinity was once soaked with water long enough to form the concretions, that the crater-forming impact dispersed some material from that layer, and that Opportunity might encounter that layer in place if the rover drives down into the crater.

  15. Using Auditory Cues to Perceptually Extract Visual Data in Collaborative, Immersive Big-Data Display Systems

    NASA Astrophysics Data System (ADS)

    Lee, Wendy

    The advent of multisensory display systems, such as virtual and augmented reality, has fostered a new relationship between humans and space. Not only can these systems mimic real-world environments, they have the ability to create a new space typology made solely of data. In these spaces, two-dimensional information is displayed in three dimensions, requiring human senses to be used to understand virtual, attention-based elements. Studies in the field of big data have predominately focused on visual representations and extractions of information with little focus on sounds. The goal of this research is to evaluate the most efficient methods of perceptually extracting visual data using auditory stimuli in immersive environments. Using Rensselaer's CRAIVE-Lab, a virtual reality space with 360-degree panorama visuals and an array of 128 loudspeakers, participants were asked questions based on complex visual displays using a variety of auditory cues ranging from sine tones to camera shutter sounds. Analysis of the speed and accuracy of participant responses revealed that auditory cues that were more favorable for localization and were positively perceived were best for data extraction and could help create more user-friendly systems in the future.

  16. The big lobe of 67P/Churyumov-Gerasimenko comet: morphological and spectrophotometric evidences of layering as from OSIRIS data

    NASA Astrophysics Data System (ADS)

    Ferrari, Sabrina; Penasa, L.; La Forgia, F.; Massironi, M.; Naletto, G.; Lazzarin, M.; Fornasier, S.; Hasselmann, P. H.; Lucchetti, A.; Pajola, M.; Ferri, F.; Cambianica, P.; Oklay, N.; Tubiana, C.; Sierks, H.; Lamy, P. L.; Rodrigo, R.; Koschny, D.; Davidsson, B.; Barucci, M. A.; Bertaux, J.-L.; Bertini, I.; Bodewits, D.; Cremonese, G.; Da Deppo, V.; Debei, S.; De Cecco, M.; Deller, J.; Franceschi, M.; Frattin, E.; Fulle, M.; Groussin, O.; Gutiérrez, P. J.; Güttler, C.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Keller, H. U.; Knollenberg, J.; Kührt, E.; Küppers, M.; Lara, L. M.; López-Moreno, J. J.; Marzari, F.; Shi, X.; Simioni, E.; Thomas, N.; Vincent, J.-B.

    2018-06-01

    Between 2014 and 2016, ESA's Rosetta OSIRIS cameras acquired multi-filter images of the layered nucleus of comet 67P/Churyumov-Gerasimenko, ranging from ultraviolet to near-infrared wavelengths. No correlation between layer disposition and surface spectral variegation has been observed so far. This paper investigates possible spectral differences among outcropping layers of decametre thickness on the big lobe of the comet by means of the OSIRIS image dataset. A two-class Maximum Likelihood classification was applied to consolidated outcrops and their related deposits identified in post-perihelion multispectral images of the big lobe. We distinguished the multispectral data on the basis of the structural elevation given by the onion-shell Ellipsoidal Model of 67P. The spatial distribution of the two classes displays a clear dependence on the structural elevation, with the innermost class resulting over 50% brighter than the outermost one. Consolidated cometary materials located at different structural levels are characterized by different brightness and are revealed by the selective removal of large volumes. This variegation can be attributed to a different texture of the outcrop surface and/or to a different content of refractory materials.
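
    A two-class maximum-likelihood classification in the remote-sensing sense (Gaussian class-conditional densities, equal priors) can be sketched with scikit-learn's quadratic discriminant analysis; the filter reflectances and training regions below are synthetic, not OSIRIS data.

```python
# Two-class Gaussian maximum-likelihood classification of multispectral pixels.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(67)
# Training spectra (reflectance in three illustrative filters) for two classes.
inner = rng.normal([0.065, 0.070, 0.080], 0.005, (300, 3))   # brighter, inner class
outer = rng.normal([0.040, 0.045, 0.055], 0.005, (300, 3))   # darker, outer class
X = np.vstack([inner, outer])
y = np.array([0] * 300 + [1] * 300)

# Equal priors make QDA equivalent to the classic maximum-likelihood classifier.
clf = QuadraticDiscriminantAnalysis(priors=[0.5, 0.5]).fit(X, y)

pixels = rng.normal([0.06, 0.065, 0.075], 0.01, (5, 3))      # pixels to classify
print(clf.predict(pixels))
```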

  17. Uav Borne Low Altitude Photogrammetry System

    NASA Astrophysics Data System (ADS)

    Lin, Z.; Su, G.; Xie, F.

    2012-07-01

    In this paper, three major aspects of an Unmanned Aerial Vehicle (UAV) system for low-altitude aerial photogrammetry are discussed: the flying platform, the imaging sensor system, and the data processing software. First, according to the technical requirements on the minimum cruising speed, the shortest taxiing distance, the level of flight control, and performance in turbulent flight, the performance and suitability of the available UAV platforms (e.g., fixed-wing UAVs, unmanned helicopters, and unmanned airships) are compared and analyzed. Second, considering the restrictions on the payload weight of a platform and the resolution of a sensor, together with the exposure equation and the theory of optical information, emphasis is placed on the principles of designing self-calibrating and self-stabilizing combined wide-angle digital cameras (e.g., double-combined and four-combined cameras). Finally, a software package named MAP-AT, which accounts for the particularities of UAV platforms and sensors, is developed and introduced. Apart from the common functions of aerial image processing, MAP-AT puts particular effort into the automatic extraction, automatic checking, and operator-aided addition of tie points for images with large tilt angles. Based on the process for low-altitude photogrammetry with UAVs recommended in this paper, more than ten aerial photogrammetry missions have been accomplished; the accuracies of their aerial triangulation, digital orthophotos (DOM) and digital line graphs (DLG) meet the standard requirements for 1:2000, 1:1000 and 1:500 mapping.

  18. Innovative diagnostics for ITER physics addressed in JET

    NASA Astrophysics Data System (ADS)

    Murari, A.; Edlington, T.; Alfier, A.; Alonso, A.; Andrew, Y.; Arnoux, G.; Beurskens, M.; Coad, P.; Crombe, C.; Gauthier, E.; Giroud, C.; Hidalgo, C.; Hong, S.; Kempenaars, M.; Kiptily, V.; Loarer, T.; Meigs, A.; Pasqualotto, R.; Tala, T.; Contributors, JET-EFDA

    2008-12-01

    In recent years, JET's diagnostic capability has been significantly improved to widen the range of physical phenomena that can be studied and thus contribute to the understanding of some ITER-relevant issues. The most significant results reported in this paper concern plasma-wall interactions, the interplay between core and edge physics, and fast particles. A synergy between new infrared cameras, visible cameras and spectroscopy diagnostics has allowed a series of new aspects of plasma-wall interactions to be investigated. The power loads on the plasma-facing components of JET's main chamber have been assessed at steady state and during transient events like ELMs and disruptions. Evidence of filaments in the edge region of the plasma has been collected with a new fast visible camera and high-resolution Thomson scattering. Particular attention has also been devoted to the physics of detached plasmas and some new aspects of dust formation. The influence of the edge plasma on the core has been investigated with upgraded active spectroscopy, providing new information on momentum transport and on the effects of impurity injection on ELMs and ITBs and their interdependence. Given that JET is the only machine with a plasma volume big enough to confine the alphas, a coherent programme of diagnostic developments for energetic particles has been undertaken. With upgraded γ-ray spectroscopy and a new scintillator probe, it is now possible to study both the redistribution and the losses of fast particles in various plasma conditions.

  19. Microfluidic Imaging Flow Cytometry by Asymmetric-detection Time-stretch Optical Microscopy (ATOM).

    PubMed

    Tang, Anson H L; Lai, Queenie T K; Chung, Bob M F; Lee, Kelvin C M; Mok, Aaron T Y; Yip, G K; Shum, Anderson H C; Wong, Kenneth K Y; Tsia, Kevin K

    2017-06-28

    Scaling the number of measurable parameters, which allows for multidimensional data analysis and thus higher-confidence statistical results, has been the main trend in the advanced development of flow cytometry. Notably, adding high-resolution imaging capabilities allows for the complex morphological analysis of cellular/sub-cellular structures. This is not possible with standard flow cytometers. However, it is valuable for advancing our knowledge of cellular functions and can benefit life science research, clinical diagnostics, and environmental monitoring. Incorporating imaging capabilities into flow cytometry compromises the assay throughput, primarily due to the limitations on speed and sensitivity in the camera technologies. To overcome this speed or throughput challenge facing imaging flow cytometry while preserving the image quality, asymmetric-detection time-stretch optical microscopy (ATOM) has been demonstrated to enable high-contrast, single-cell imaging with sub-cellular resolution, at an imaging throughput as high as 100,000 cells/s. Based on the imaging concept of conventional time-stretch imaging, which relies on all-optical image encoding and retrieval through the use of ultrafast broadband laser pulses, ATOM further advances imaging performance by enhancing the image contrast of unlabeled/unstained cells. This is achieved by accessing the phase-gradient information of the cells, which is spectrally encoded into single-shot broadband pulses. Hence, ATOM is particularly advantageous in high-throughput measurements of single-cell morphology and texture - information indicative of cell types, states, and even functions. Ultimately, this could become a powerful imaging flow cytometry platform for the biophysical phenotyping of cells, complementing the current state-of-the-art biochemical-marker-based cellular assay. This work describes a protocol to establish the key modules of an ATOM system (from optical frontend to data processing and visualization backend), as well as the workflow of imaging flow cytometry based on ATOM, using human cells and micro-algae as the examples.

  20. Bayesian inference with historical data-based informative priors improves detection of differentially expressed genes.

    PubMed

    Li, Ben; Sun, Zhaonan; He, Qing; Zhu, Yu; Qin, Zhaohui S

    2016-03-01

    Modern high-throughput biotechnologies such as microarrays are capable of producing a massive amount of information for each sample. However, in a typical high-throughput experiment only a limited number of samples are assayed, giving rise to the classical 'large p, small n' problem. On the other hand, rapid propagation of these high-throughput technologies has resulted in a substantial collection of data, often carried out on the same platform and using the same protocol. It is highly desirable to utilize the existing data when performing analysis and inference on a new dataset. Utilizing existing data can be carried out in a straightforward fashion under the Bayesian framework, in which the repository of historical data can be exploited to build informative priors and used in new data analysis. In this work, using microarray data, we investigate the feasibility and effectiveness of deriving informative priors from historical data and using them in the problem of detecting differentially expressed genes. Through simulation and real data analysis, we show that the proposed strategy significantly outperforms existing methods, including the popular and state-of-the-art Bayesian hierarchical model-based approaches. Our work illustrates the feasibility and benefits of exploiting the increasingly available genomics big data in statistical inference and presents a promising practical strategy for dealing with the 'large p, small n' problem. Our method is implemented in the R package IPBT, which is freely available from https://github.com/benliemory/IPBT CONTACT: yuzhu@purdue.edu; zhaohui.qin@emory.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
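    The abstract describes building informative priors from historical microarray data and combining them with a new, small experiment. As a rough illustration of that idea (not the IPBT implementation), the sketch below moment-matches an inverse-gamma prior to gene-wise variances observed in historical datasets and uses it to shrink the variance estimate from a new experiment; the function names and numbers are placeholders.

```python
import numpy as np

def fit_inverse_gamma_prior(historical_vars):
    """Moment-match an inverse-gamma(a, b) prior to gene-wise variances
    seen in historical datasets (illustrative, not the IPBT estimator)."""
    m = historical_vars.mean()
    v = historical_vars.var()
    a = m**2 / v + 2.0          # shape parameter
    b = m * (a - 1.0)           # scale parameter
    return a, b

def moderated_variance(new_sample_var, n, a, b):
    """Posterior mean of a gene's variance after observing n new samples,
    shrinking the new sample variance toward the historical prior
    (conjugate inverse-gamma update for normal data)."""
    a_post = a + (n - 1) / 2.0
    b_post = b + (n - 1) * new_sample_var / 2.0
    return b_post / (a_post - 1.0)

# Usage with placeholder numbers: variances pooled from many past arrays.
hist = np.random.default_rng(1).gamma(shape=2.0, scale=0.5, size=10000)
a, b = fit_inverse_gamma_prior(hist)
print(moderated_variance(new_sample_var=0.8, n=4, a=a, b=b))
```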

  1. Investigating core genetic-and-epigenetic cell cycle networks for stemness and carcinogenic mechanisms, and cancer drug design using big database mining and genome-wide next-generation sequencing data.

    PubMed

    Li, Cheng-Wei; Chen, Bor-Sen

    2016-10-01

    Recent studies have demonstrated that cell cycle plays a central role in development and carcinogenesis. Thus, the use of big databases and genome-wide high-throughput data to unravel the genetic and epigenetic mechanisms underlying cell cycle progression in stem cells and cancer cells is a matter of considerable interest. Real genetic-and-epigenetic cell cycle networks (GECNs) of embryonic stem cells (ESCs) and HeLa cancer cells were constructed by applying system modeling, system identification, and big database mining to genome-wide next-generation sequencing data. Real GECNs were then reduced to core GECNs of HeLa cells and ESCs by applying principal genome-wide network projection. In this study, we investigated potential carcinogenic and stemness mechanisms for systems cancer drug design by identifying common core and specific GECNs between HeLa cells and ESCs. Integrating drug database information with the specific GECNs of HeLa cells could lead to identification of multiple drugs for cervical cancer treatment with minimal side-effects on the genes in the common core. We found that dysregulation of miR-29C, miR-34A, miR-98, and miR-215; and methylation of ANKRD1, ARID5B, CDCA2, PIF1, STAMBPL1, TROAP, ZNF165, and HIST1H2AJ in HeLa cells could result in cell proliferation and anti-apoptosis through NFκB, TGF-β, and PI3K pathways. We also identified 3 drugs, methotrexate, quercetin, and mimosine, which repressed the activated cell cycle genes, ARID5B, STK17B, and CCL2, in HeLa cells with minimal side-effects.

  2. Taming Big Data: An Information Extraction Strategy for Large Clinical Text Corpora.

    PubMed

    Gundlapalli, Adi V; Divita, Guy; Carter, Marjorie E; Redd, Andrew; Samore, Matthew H; Gupta, Kalpana; Trautner, Barbara

    2015-01-01

    Concepts of interest for clinical and research purposes are not uniformly distributed in clinical text available in electronic medical records. The purpose of our study was to identify filtering techniques to select 'high yield' documents for increased efficacy and throughput. Using two large corpora of clinical text, we demonstrate the identification of 'high yield' document sets in two unrelated domains: homelessness and indwelling urinary catheters. For homelessness, the high yield set includes homeless program and social work notes. For urinary catheters, concepts were more prevalent in notes from hospitalized patients; nursing notes accounted for a majority of the high yield set. This filtering will enable customization and refining of information extraction pipelines to facilitate extraction of relevant concepts for clinical decision support and other uses.
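    As an informal illustration of the filtering strategy described above, the sketch below computes, for hypothetical note-type metadata, the fraction of concept-bearing notes per note type and keeps only the 'high yield' types; the threshold and the (note_type, has_concept) representation are assumptions for the example, not the study's actual rule set.

```python
from collections import Counter

def high_yield_types(documents, min_hit_rate=0.10):
    """documents: iterable of (note_type, has_target_concept) pairs.
    Return the note types whose fraction of concept-bearing notes
    meets or exceeds the assumed yield threshold."""
    totals, hits = Counter(), Counter()
    for note_type, has_concept in documents:
        totals[note_type] += 1
        hits[note_type] += int(has_concept)
    return {t for t in totals if hits[t] / totals[t] >= min_hit_rate}

def filter_corpus(documents, keep_types):
    """Keep only documents from the high-yield note types before running
    the (more expensive) information extraction pipeline."""
    return [d for d in documents if d[0] in keep_types]
```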

  3. Biosignature Discovery for Substance Use Disorders Using Statistical Learning.

    PubMed

    Baurley, James W; McMahan, Christopher S; Ervin, Carolyn M; Pardamean, Bens; Bergen, Andrew W

    2018-02-01

    There are limited biomarkers for substance use disorders (SUDs). Traditional statistical approaches are identifying simple biomarkers in large samples, but clinical use cases are still being established. High-throughput clinical, imaging, and 'omic' technologies are generating data from SUD studies and may lead to more sophisticated and clinically useful models. However, analytic strategies suited for high-dimensional data are not regularly used. We review strategies for identifying biomarkers and biosignatures from high-dimensional data types. Focusing on penalized regression and Bayesian approaches, we address how to leverage evidence from existing studies and knowledge bases, using nicotine metabolism as an example. We posit that big data and machine learning approaches will considerably advance SUD biomarker discovery. However, translation to clinical practice will require integrated scientific efforts. Copyright © 2017 Elsevier Ltd. All rights reserved.
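    The review points to penalized regression as one family of strategies for high-dimensional biomarker data. A minimal sketch of that family, using scikit-learn's L1-penalized logistic regression on placeholder data, is shown below; the feature matrix, labels and regularization strength are purely illustrative assumptions, not data or settings from the cited work.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Placeholder high-dimensional data: 200 subjects x 5,000 'omic features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5000))
y = rng.integers(0, 2, size=200)          # e.g., hypothetical case/control labels

# An L1 (lasso) penalty drives most coefficients to zero, yielding a sparse biosignature.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
print(cross_val_score(model, X, y, cv=5).mean())   # cross-validated accuracy

model.fit(X, y)
selected = np.flatnonzero(model.coef_[0])          # indices of retained features
print(f"{selected.size} features retained")
```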

  4. The Eukaryotic Pathogen Databases: a functional genomic resource integrating data from human and veterinary parasites.

    PubMed

    Harb, Omar S; Roos, David S

    2015-01-01

    Over the past 20 years, advances in high-throughput biological techniques and the availability of computational resources, including fast Internet access, have resulted in an explosion of large genome-scale data sets ("big data"). While such data are readily available for download, personal use and analysis from a variety of repositories, such analysis often requires access to seldom-available computational skills. As a result, a number of databases have emerged to provide scientists with online tools enabling the interrogation of data without the need for sophisticated computational skills beyond basic familiarity with an Internet browser. This chapter focuses on the Eukaryotic Pathogen Databases (EuPathDB: http://eupathdb.org) Bioinformatic Resource Center (BRC) and illustrates some of the available tools and methods.

  5. A novel high-throughput imaging system for automated analyses of avoidance behavior in zebrafish larvae

    PubMed Central

    Pelkowski, Sean D.; Kapoor, Mrinal; Richendrfer, Holly A.; Wang, Xingyue; Colwill, Ruth M.; Creton, Robbert

    2011-01-01

    Early brain development can be influenced by numerous genetic and environmental factors, with long-lasting effects on brain function and behavior. The identification of these factors is facilitated by recent innovations in high-throughput screening. However, large-scale screening in whole organisms remains challenging, in particular when studying changes in brain function or behavior in vertebrate model systems. In this study, we present a novel imaging system for high-throughput analyses of behavior in zebrafish larvae. The three-camera system can image twelve multiwell plates simultaneously and is unique in its ability to provide local visual stimuli in the wells of a multiwell plate. The acquired images are converted into a series of coordinates, which characterize the location and orientation of the larvae. The developed imaging techniques were tested by measuring avoidance behaviors in seven-day-old zebrafish larvae. The system effectively quantified larval avoidance and revealed an increased edge preference in response to a blue or red ‘bouncing ball’ stimulus. Larvae also avoid a bouncing ball stimulus when it is counter-balanced with a stationary ball, but do not avoid blinking balls counter-balanced with a stationary ball. These results indicate that the seven-day-old larvae respond specifically to movement, rather than color, size, or local changes in light intensity. The imaging system and assays for measuring avoidance behavior may be used to screen for genetic and environmental factors that cause developmental brain disorders and for novel drugs that could prevent or treat these disorders. PMID:21549762

  6. A novel high-throughput imaging system for automated analyses of avoidance behavior in zebrafish larvae.

    PubMed

    Pelkowski, Sean D; Kapoor, Mrinal; Richendrfer, Holly A; Wang, Xingyue; Colwill, Ruth M; Creton, Robbert

    2011-09-30

    Early brain development can be influenced by numerous genetic and environmental factors, with long-lasting effects on brain function and behavior. The identification of these factors is facilitated by recent innovations in high-throughput screening. However, large-scale screening in whole organisms remains challenging, in particular when studying changes in brain function or behavior in vertebrate model systems. In this study, we present a novel imaging system for high-throughput analyses of behavior in zebrafish larvae. The three-camera system can image 12 multiwell plates simultaneously and is unique in its ability to provide local visual stimuli in the wells of a multiwell plate. The acquired images are converted into a series of coordinates, which characterize the location and orientation of the larvae. The developed imaging techniques were tested by measuring avoidance behaviors in seven-day-old zebrafish larvae. The system effectively quantified larval avoidance and revealed an increased edge preference in response to a blue or red 'bouncing ball' stimulus. Larvae also avoid a bouncing ball stimulus when it is counter-balanced with a stationary ball, but do not avoid blinking balls counter-balanced with a stationary ball. These results indicate that the seven-day-old larvae respond specifically to movement, rather than color, size, or local changes in light intensity. The imaging system and assays for measuring avoidance behavior may be used to screen for genetic and environmental factors that cause developmental brain disorders and for novel drugs that could prevent or treat these disorders. Copyright © 2011 Elsevier B.V. All rights reserved.
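    The two records above describe converting acquired images into coordinates that characterize larval location and orientation. The sketch below shows one way such a conversion could be done from a thresholded larva mask using second-order image moments; it is illustrative only and is not the published analysis pipeline.

```python
import numpy as np

def larva_pose(binary_mask):
    """Centroid and body-axis orientation (radians) of a single larva from a
    thresholded binary mask, via image moments. Illustrative sketch only."""
    ys, xs = np.nonzero(binary_mask)
    cx, cy = xs.mean(), ys.mean()                  # centroid (x, y) in pixels
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)  # principal-axis angle
    return (cx, cy), theta
```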

  7. GALAXIES IN THE YOUNG UNIVERSE

    NASA Technical Reports Server (NTRS)

    2002-01-01

    [left] This image of a small region of the constellation Sculptor, taken with a ground-based photographic sky survey camera, illustrates the extremely small angular size of a distant galaxy cluster in the night sky. Though this picture encompasses a piece of the sky about the width of the bowl of the Big Dipper, the cluster is so far away it fills a sky area only 1/10th the diameter of the Full Moon. The cluster members are not visible because they are so much fainter than foreground stars. [center] A NASA Hubble Space Telescope (HST) image of the farthest cluster of galaxies in the universe, located at a distance of 12 billion light-years. Because the light from these remote galaxies has taken 12 billion years to reach us, this image is a remarkable glimpse of the primeval universe, as it looked about two billion years after the Big Bang. The cluster contains 14 galaxies; the other objects are largely foreground galaxies. The galaxy cluster lies in front of quasar Q0000-263 in the constellation Sculptor. Presumably the brilliant core of an active galaxy, the quasar provides a beacon for searching for primordial galaxy clusters. The image is the full field view of the Wide Field and Planetary Camera-2, taken on September 6, 1994. The 4.7-hour exposure reveals objects down to 28.5 magnitude. [right] This enlargement shows one of the farthest normal galaxies yet detected (blob at center right), at a distance of 12 billion light-years (redshift of z=3.330). The galaxy lies 300 million light-years in front of the quasar Q0000-263 (z=4.11, large white blob and spike on left side of frame) and was detected because it absorbs some light from the quasar. The galaxy's spectrum reveals that vigorous star formation is taking place. Credit: Duccio Macchetto (ESA/STScI), Mauro Giavalisco (STScI), and NASA

  8. Corn and sorghum phenotyping using a fixed-wing UAV-based remote sensing system

    NASA Astrophysics Data System (ADS)

    Shi, Yeyin; Murray, Seth C.; Rooney, William L.; Valasek, John; Olsenholler, Jeff; Pugh, N. Ace; Henrickson, James; Bowden, Ezekiel; Zhang, Dongyan; Thomasson, J. Alex

    2016-05-01

    Recent development of unmanned aerial systems has created opportunities for automating field-based high-throughput phenotyping by lowering flight operational cost and complexity and allowing flexible revisit times and higher image resolution than satellite or manned airborne remote sensing. In this study, flights were conducted over corn and sorghum breeding trials in College Station, Texas, with a fixed-wing unmanned aerial vehicle (UAV) carrying two multispectral cameras and a high-resolution digital camera. The objectives were to establish the workflow and investigate the ability of UAV-based remote sensing to automate data collection of plant traits for developing genetic and physiological models. Most important among these traits were plant height and number of plants, which are currently collected manually at high labor cost. Vegetation indices were calculated for each breeding cultivar from mosaicked and radiometrically calibrated multi-band imagery in order to be correlated with ground-measured plant heights, populations and yield across high genetic-diversity breeding cultivars. Growth curves were profiled with the aerial time-series height and vegetation index data. The next step of this study will be to investigate the correlations between aerial measurements and ground truth measured manually in the field and from lab tests.
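    As a small illustration of the vegetation-index step mentioned above, the sketch below computes a per-pixel NDVI from calibrated red and near-infrared bands and averages it over a plot mask; the band names, the plot mask, and the choice of NDVI are assumptions for the example, since the paper does not specify which indices were used.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index per pixel from radiometrically
    calibrated near-infrared and red reflectance arrays."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

def plot_mean_index(index_image, plot_mask):
    """Average a vegetation index over the pixels of one breeding plot,
    e.g., to correlate with ground-measured height, population, or yield."""
    return float(index_image[plot_mask].mean())
```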

  9. Quantitative phase measurement for wafer-level optics

    NASA Astrophysics Data System (ADS)

    Qu, Weijuan; Wen, Yongfu; Wang, Zhaomin; Yang, Fang; Huang, Lei; Zuo, Chao

    2015-07-01

    Wafer-level optics are now widely used in smartphone cameras, mobile video conferencing, and medical equipment that requires tiny cameras. Extracting quantitative phase information has received increased interest as a way to quantify the quality of manufactured wafer-level optics, detect defective devices before packaging, and provide feedback for manufacturing process control, all at the wafer level for high-throughput microfabrication. We demonstrate two phase imaging methods, digital holographic microscopy (DHM) and the Transport-of-Intensity Equation (TIE), to measure the phase of wafer-level lenses. DHM is a laser-based interferometric method based on the interference of two wavefronts and can perform a phase measurement in a single shot. TIE, by contrast, requires a minimum of two measurements of the spatial intensity of the optical wave in closely spaced planes perpendicular to the direction of propagation; the phase is then retrieved directly by solving a second-order differential equation with a non-iterative, deterministic algorithm. Because TIE is a non-interferometric method, it can also be applied to partially coherent light. We demonstrate the capabilities and limitations of the two phase measurement methods for wafer-level optics inspection.
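    The TIE retrieval described above amounts to solving a second-order differential equation from two closely spaced intensity measurements. The sketch below is a minimal FFT-based solver under the common uniform-intensity approximation; it is an illustrative simplification, not the authors' implementation. Spatial frequencies are in cycles per pixel, so dz and wavelength must be expressed in consistent pixel-based units.

```python
import numpy as np

def tie_phase(I_minus, I_plus, dz, wavelength, reg=1e-6):
    """Minimal FFT-based Transport-of-Intensity phase retrieval under the
    uniform-intensity approximation (illustrative sketch only).
    I_minus, I_plus: intensity images recorded at -dz/2 and +dz/2 defocus."""
    k = 2.0 * np.pi / wavelength
    I0 = 0.5 * (I_minus + I_plus)                  # in-focus intensity estimate
    dIdz = (I_plus - I_minus) / dz                 # axial intensity derivative

    ny, nx = dIdz.shape
    fx = np.fft.fftfreq(nx)
    fy = np.fft.fftfreq(ny)
    FX, FY = np.meshgrid(fx, fy)
    lap = -4.0 * np.pi**2 * (FX**2 + FY**2)        # Fourier symbol of the Laplacian

    # Uniform-intensity TIE: laplacian(phi) = -(k / I0) * dI/dz
    rhs = -k * dIdz / I0.mean()
    phi = np.fft.ifft2(np.fft.fft2(rhs) / (lap - reg)).real  # regularized inversion
    return phi
```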

  10. Cloud Computing with Context Cameras

    NASA Astrophysics Data System (ADS)

    Pickles, A. J.; Rosing, W. E.

    2016-05-01

    We summarize methods and plans to monitor and calibrate photometric observations with our autonomous, robotic network of 2m, 1m and 40cm telescopes. These are sited globally to optimize our ability to observe time-variable sources. Wide field "context" cameras are aligned with our network telescopes and cycle every ˜2 minutes through BVr'i'z' filters, spanning our optical range. We measure instantaneous zero-point offsets and transparency (throughput) against calibrators in the 5-12m range from the all-sky Tycho2 catalog, and periodically against primary standards. Similar measurements are made for all our science images, with typical fields of view of ˜0.5 degrees. These are matched against Landolt, Stetson and Sloan standards, and against calibrators in the 10-17m range from the all-sky APASS catalog. Such measurements provide pretty good instantaneous flux calibration, often to better than 5%, even in cloudy conditions. Zero-point and transparency measurements can be used to characterize, monitor and inter-compare sites and equipment. When accurate calibrations of Target against Standard fields are required, monitoring measurements can be used to select truly photometric periods when accurate calibrations can be automatically scheduled and performed.
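    A minimal sketch of the zero-point and transparency bookkeeping described above is given below: the zero point is estimated as a clipped median of catalog-minus-instrumental magnitudes for matched calibrator stars, and a relative transparency follows from the offset against a photometric reference. The function names and the simple clipping rule are illustrative assumptions, not the production pipeline.

```python
import numpy as np

def zero_point(instrumental_mags, catalog_mags, sigma_clip=3.0):
    """Instantaneous photometric zero point from matched calibrator stars:
    median of (catalog - instrumental) magnitudes with one pass of sigma clipping."""
    diff = np.asarray(catalog_mags) - np.asarray(instrumental_mags)
    med, std = np.median(diff), np.std(diff)
    keep = np.abs(diff - med) < sigma_clip * std if std > 0 else np.ones_like(diff, bool)
    return float(np.median(diff[keep]))

def transparency(zp_now, zp_photometric):
    """Throughput relative to a photometric-night reference zero point
    (1.0 = fully photometric, <1.0 = light cloud or reduced throughput)."""
    return 10.0 ** (-0.4 * (zp_photometric - zp_now))
```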

  11. The ASTRO-H X-ray Observatory

    NASA Astrophysics Data System (ADS)

    Takahashi, Tadayuki; Mitsuda, Kazuhisa; Kelley, Richard; Aarts, Henri; Aharonian, Felix; Akamatsu, Hiroki; Akimoto, Fumie; Allen, Steve; Anabuki, Naohisa; Angelini, Lorella; Arnaud, Keith; Asai, Makoto; Audard, Marc; Awaki, Hisamitsu; Azzarello, Philipp; Baluta, Chris; Bamba, Aya; Bando, Nobutaka; Bautz, Mark; Blandford, Roger; Boyce, Kevin; Brown, Greg; Cackett, Ed; Chernyakova, Mara; Coppi, Paolo; Costantini, Elisa; de Plaa, Jelle; den Herder, Jan-Willem; DiPirro, Michael; Done, Chris; Dotani, Tadayasu; Doty, John; Ebisawa, Ken; Eckart, Megan; Enoto, Teruaki; Ezoe, Yuichiro; Fabian, Andrew; Ferrigno, Carlo; Foster, Adam; Fujimoto, Ryuichi; Fukazawa, Yasushi; Funk, Stefan; Furuzawa, Akihiro; Galeazzi, Massimiliano; Gallo, Luigi; Gandhi, Poshak; Gendreau, Keith; Gilmore, Kirk; Haas, Daniel; Haba, Yoshito; Hamaguchi, Kenji; Hatsukade, Isamu; Hayashi, Takayuki; Hayashida, Kiyoshi; Hiraga, Junko; Hirose, Kazuyuki; Hornschemeier, Ann; Hoshino, Akio; Hughes, John; Hwang, Una; Iizuka, Ryo; Inoue, Yoshiyuki; Ishibashi, Kazunori; Ishida, Manabu; Ishimura, Kosei; Ishisaki, Yoshitaka; Ito, Masayuki; Iwata, Naoko; Iyomoto, Naoko; Kaastra, Jelle; Kallman, Timothy; Kamae, Tuneyoshi; Kataoka, Jun; Katsuda, Satoru; Kawahara, Hajime; Kawaharada, Madoka; Kawai, Nobuyuki; Kawasaki, Shigeo; Khangaluyan, Dmitry; Kilbourne, Caroline; Kimura, Masashi; Kinugasa, Kenzo; Kitamoto, Shunji; Kitayama, Tetsu; Kohmura, Takayoshi; Kokubun, Motohide; Kosaka, Tatsuro; Koujelev, Alex; Koyama, Katsuji; Krimm, Hans; Kubota, Aya; Kunieda, Hideyo; LaMassa, Stephanie; Laurent, Philippe; Lebrun, Francois; Leutenegger, Maurice; Limousin, Olivier; Loewenstein, Michael; Long, Knox; Lumb, David; Madejski, Grzegorz; Maeda, Yoshitomo; Makishima, Kazuo; Marchand, Genevieve; Markevitch, Maxim; Matsumoto, Hironori; Matsushita, Kyoko; McCammon, Dan; McNamara, Brian; Miller, Jon; Miller, Eric; Mineshige, Shin; Minesugi, Kenji; Mitsuishi, Ikuyuki; Miyazawa, Takuya; Mizuno, Tsunefumi; Mori, Hideyuki; Mori, Koji; Mukai, Koji; Murakami, Toshio; Murakami, Hiroshi; Mushotzky, Richard; Nagano, Hosei; Nagino, Ryo; Nakagawa, Takao; Nakajima, Hiroshi; Nakamori, Takeshi; Nakazawa, Kazuhiro; Namba, Yoshiharu; Natsukari, Chikara; Nishioka, Yusuke; Nobukawa, Masayoshi; Nomachi, Masaharu; O'Dell, Steve; Odaka, Hirokazu; Ogawa, Hiroyuki; Ogawa, Mina; Ogi, Keiji; Ohashi, Takaya; Ohno, Masanori; Ohta, Masayuki; Okajima, Takashi; Okamoto, Atsushi; Okazaki, Tsuyoshi; Ota, Naomi; Ozaki, Masanobu; Paerels, Fritzs; Paltani, Stéphane; Parmar, Arvind; Petre, Robert; Pohl, Martin; Porter, F. 
Scott; Ramsey, Brian; Reis, Rubens; Reynolds, Christopher; Russell, Helen; Safi-Harb, Samar; Sakai, Shin-ichiro; Sameshima, Hiroaki; Sanders, Jeremy; Sato, Goro; Sato, Rie; Sato, Yohichi; Sato, Kosuke; Sawada, Makoto; Serlemitsos, Peter; Seta, Hiromi; Shibano, Yasuko; Shida, Maki; Shimada, Takanobu; Shinozaki, Keisuke; Shirron, Peter; Simionescu, Aurora; Simmons, Cynthia; Smith, Randall; Sneiderman, Gary; Soong, Yang; Stawarz, Lukasz; Sugawara, Yasuharu; Sugita, Hiroyuki; Sugita, Satoshi; Szymkowiak, Andrew; Tajima, Hiroyasu; Takahashi, Hiromitsu; Takeda, Shin-ichiro; Takei, Yoh; Tamagawa, Toru; Tamura, Takayuki; Tamura, Keisuke; Tanaka, Takaaki; Tanaka, Yasuo; Tashiro, Makoto; Tawara, Yuzuru; Terada, Yukikatsu; Terashima, Yuichi; Tombesi, Francesco; Tomida, Hiroshi; Tsuboi, Yohko; Tsujimoto, Masahiro; Tsunemi, Hiroshi; Tsuru, Takeshi; Uchida, Hiroyuki; Uchiyama, Yasunobu; Uchiyama, Hideki; Ueda, Yoshihiro; Ueno, Shiro; Uno, Shinichiro; Urry, Meg; Ursino, Eugenio; de Vries, Cor; Wada, Atsushi; Watanabe, Shin; Werner, Norbert; White, Nicholas; Yamada, Takahiro; Yamada, Shinya; Yamaguchi, Hiroya; Yamasaki, Noriko; Yamauchi, Shigeo; Yamauchi, Makoto; Yatsu, Yoichi; Yonetoku, Daisuke; Yoshida, Atsumasa; Yuasa, Takayuki

    2012-09-01

    The joint JAXA/NASA ASTRO-H mission is the sixth in a series of highly successful X-ray missions initiated by the Institute of Space and Astronautical Science (ISAS). ASTRO-H will investigate the physics of the high-energy universe via a suite of four instruments, covering a very wide energy range, from 0.3 keV to 600 keV. These instruments include a high-resolution, high-throughput spectrometer sensitive over 0.3-12 keV with high spectral resolution of ΔE ≦ 7 eV, enabled by a micro-calorimeter array located in the focal plane of thin-foil X-ray optics; hard X-ray imaging spectrometers covering 5-80 keV, located in the focal plane of multilayer-coated, focusing hard X-ray mirrors; a wide-field imaging spectrometer sensitive over 0.4-12 keV, with an X-ray CCD camera in the focal plane of a soft X-ray telescope; and a non-focusing Compton-camera type soft gamma-ray detector, sensitive in the 40-600 keV band. The simultaneous broad bandpass, coupled with high spectral resolution, will enable the pursuit of a wide variety of important science themes.

  12. Thermographic Patterns of the Upper and Lower Limbs: Baseline Data

    PubMed Central

    Cassar, Kevin; Camilleri, Kenneth P.; De Raffaele, Clifford; Mizzi, Stephen; Cristina, Stefania

    2015-01-01

    Objectives. To collect normative baseline data and identify any significant differences between hand and foot thermographic distribution patterns in a healthy adult population. Design. A single-centre, randomized, prospective study. Methods. Thermographic data was acquired using a FLIR camera for the data acquisition of both plantar and dorsal aspects of the feet, volar aspects of the hands, and anterior aspects of the lower limbs under controlled climate conditions. Results. There is general symmetry in skin temperature between the same regions in contralateral limbs, in terms of both magnitude and pattern. There was also minimal intersubject temperature variation with a consistent temperature pattern in toes and fingers. The thumb is the warmest digit with the temperature falling gradually between the 2nd and the 5th fingers. The big toe and the 5th toe are the warmest digits with the 2nd to the 4th toes being cooler. Conclusion. Measurement of skin temperature of the limbs using a thermal camera is feasible and reproducible. Temperature patterns in fingers and toes are consistent with similar temperatures in contralateral limbs in healthy subjects. This study provides the basis for further research to assess the clinical usefulness of thermography in the diagnosis of vascular insufficiency. PMID:25648145

  13. The role of three-dimensional high-definition laparoscopic surgery for gynaecology.

    PubMed

    Usta, Taner A; Gundogdu, Elif C

    2015-08-01

    This article reviews the potential benefits and disadvantages of new three-dimensional (3D) high-definition laparoscopic surgery for gynaecology. With the new-generation 3D high-definition laparoscopic vision systems (LVSs), operation time and the learning period are reduced and the procedural error margin is decreased. New-generation 3D high-definition LVSs reduce operation time for both novice and experienced surgeons. Headache, eye fatigue and nausea reported with first-generation systems are no more frequent than with two-dimensional (2D) LVSs. The system's higher cost, the obligation to wear glasses, and the big, heavy camera probe of some devices are the negative aspects that need to be improved. The loss of depth perception in 2D LVSs, and the adverse events associated with it, can be eliminated with 3D high-definition LVSs. By virtue of a faster learning curve, shorter operation time, reduced error margin, and the lack of the side-effects reported by surgeons with first-generation systems, 3D LVSs seem to be a strong competitor to classical laparoscopic imaging systems. Thanks to technological advancements, using lighter and smaller cameras and glasses-free monitors is in the near future.

  14. Boosting Big National Lab Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleese van Dam, Kerstin

    Introduction: Big data. Love it or hate it, solving the world’s most intractable problems requires the ability to make sense of huge and complex sets of data and do it quickly. Speeding up the process – from hours to minutes or from weeks to days – is key to our success. One major source of such big data are physical experiments. As many will know, these physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to research, for example, the validity of new drugs, the base cause for diseases, more efficient energy sources, new materials for everyday goods, effective methods for environmental cleanup, the optimal ingredients composition for chocolate, or determine how to preserve valuable antiques. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influences the rate of scientific progress, industrial innovation and competitiveness. And gaining new groundbreaking insights, faster, is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house-size detector systems in large experiments such as the Large Hadron Collider and the ‘Eye of Gaia’ billion-pixel camera detector to high-throughput genome sequencing. These developments have led to an exponential increase in data volumes, rates and variety produced by instruments used for experimental work. This increase is coinciding with a need to analyze the experimental results at the time they are collected. This speed is required to optimize the data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a tissue sample and the gradual effect is observed as more of the substance is injected, providing better insights into the natural processes that are occurring, as well as result-driven sampling adjustment to capture particularly interesting features as they emerge. The Department of Energy’s Pacific Northwest National Laboratory (PNNL) is recognized for its expertise in the development of new measurement techniques and their application to challenges of national importance. So it was obvious to us to address the need for in-situ analysis of large-scale experimental data. We have a wide range of experimental instruments on site, in facilities such as DOE’s national scientific user facility, the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL). Commonly, scientists would create an individual analysis pipeline for each of those instruments; but even the same type of instrument would not necessarily share the same analysis tools. With the rapid increase of data volumes and rates we were facing two key challenges: how to bring a wider set of capabilities to bear to achieve in-situ analysis, and how to do so across a wide range of heterogeneous instruments at affordable costs and in a reasonable timeframe. We decided to take an unconventional approach to the problem: rather than developing customized, one-off solutions for specific instruments, we wanted to explore whether a more common solution could be found that would go beyond shared, basic infrastructures such as data movement and workflow engines.

  15. DFT, Its Impact on Condensed Matter and on ``Materials-Genome'' Research

    NASA Astrophysics Data System (ADS)

    Scheffler, Matthias

    About 40 years ago, two seminal works demonstrated the power of density-functional theory (DFT) for real materials. These studies by Moruzzi, Janak, and Williams on metals and Yin and Cohen on semiconductors visualized the spatial distribution of electrons, predicted the equation of state of solids, crystal stability, pressure-induced phase transitions, and more. They also stressed the importance of identifying trends by looking at many systems (e.g. the whole transition-metal series). Since then, the field has seen numerous applications of DFT to solids, liquids, defects, surfaces, and interfaces providing important descriptions and explanations as well as predictions of experimentally not yet identified systems. About 10 years ago, G. Ceder and his group [Ref. 3 and references therein] started with high-throughput screening calculations in the spirit of what in 2011 became the ``Materials Genome Initiative''. The idea of high-throughput screening is old (a key example is the ammonia catalyst found by A. Mittasch at BASF more than 100 years ago), but it is now increasingly becoming clear that big data of materials does not only provide direct information but that the data is structured. This enables interpolation, (modest) extrapolation, and new routes towards understanding [Ref. 5 and references therein]. The amount of data created by ``computational materials science'' is significant. For instance, the NoMaD Repository (which includes also data from other repositories, e.g. AFLOWLIB and OQMD) now holds more than 18 million total-energy calculations. In fact, the amount of data of computational materials science is steadily increasing, and about a hundred million CPU core hours are nowadays used every day, worldwide, for DFT calculations for materials. The talk will summarize this enormous impact of DFT on materials science, and it will address the next steps, e.g. how to exploit big data of materials for doing forefront research, how to find (hidden) structure in the data in order to advance materials science, identify new scientific phenomena, and provide support towards industrial applications. The NOMAD Laboratory Center of Excellence, European Union's Horizon 2020 research and innovation program, Grant agreement no. 676580.

  16. Improving the Calibration of the SN Ia Anchor Datasets with a Bayesian Hierarchical Model

    NASA Astrophysics Data System (ADS)

    Currie, Miles; Rubin, David

    2018-01-01

    Inter-survey calibration remains one of the largest systematic uncertainties in SN Ia cosmology today. Ideally, each survey would measure their system throughputs and observe well characterized spectrophotometric standard stars, but many important surveys have not done so. For these surveys, we calibrate using tertiary survey stars tied to SDSS and Pan-STARRS. We improve on previous efforts by taking the spatially variable response of each telescope/camera into account, and using improved color transformations in the surveys’ natural instrumental photometric system. We use a global hierarchical model of the data, automatically providing a covariance matrix of magnitude offsets and bandpass shifts which reduces the systematic uncertainty in inter-survey calibration, thereby providing better cosmological constraints.

  17. A Near-Infrared Spectrometer to Measure Zodiacal Light Absorption Spectrum

    NASA Technical Reports Server (NTRS)

    Kutyrev, A. S.; Arendt, R.; Dwek, E.; Kimble, R.; Moseley, S. H.; Rapchun, D.; Silverberg, R. F.

    2010-01-01

    We have developed a high-throughput infrared spectrometer for zodiacal light Fraunhofer line measurements. The instrument is based on a cryogenic dual silicon Fabry-Perot etalon designed to achieve high signal-to-noise Fraunhofer line profile measurements. Very large aperture silicon Fabry-Perot etalons and fast camera optics make these measurements possible. The results of the absorption line profile measurements will provide a model-free measure of the zodiacal light intensity in the near infrared. Knowledge of the zodiacal light brightness is crucial for accurate subtraction of the zodiacal light foreground, which in turn is required for an accurate measure of the extragalactic background light. We present the final design of the instrument and the first results of its performance.

  18. Adaptive optics system application for solar telescope

    NASA Astrophysics Data System (ADS)

    Lukin, V. P.; Grigor'ev, V. M.; Antoshkin, L. V.; Botugina, N. N.; Emaleev, O. N.; Konyaev, P. A.; Kovadlo, P. G.; Krivolutskiy, N. P.; Lavrionova, L. N.; Skomorovski, V. I.

    2008-07-01

    The possibility of applying adaptive correction to ground-based solar astronomy is considered. Several experimental systems for image stabilization are described along with the results of their tests. Building on several years of our own work and on worldwide experience in solar adaptive optics (AO), we expect to obtain first light by the end of 2008 for the first Russian low-order ANGARA solar AO system on the Big Solar Vacuum Telescope (BSVT). The system comprises a 37-subaperture Shack-Hartmann wavefront sensor based on our modified correlation-tracker algorithm, a DALSTAR video camera, a 37-element deformable bimorph mirror, and a home-made fast tip-tilt mirror with a separate correlation tracker. Daytime turbulence at the BSVT site is too strong for full correction, so we plan to obtain a partial correction for part of the solar surface image.

  19. The European perspective for LSST

    NASA Astrophysics Data System (ADS)

    Gangler, Emmanuel

    2017-06-01

    LSST is a next-generation telescope that will produce an unprecedented data flow. The project goal is to deliver data products such as images and catalogs, thus enabling scientific analysis for a wide community of users. As a large-scale survey, LSST data will be complementary with other facilities in a wide range of scientific domains, including data from ESA or ESO. European countries have invested in LSST since 2007, in the construction of the camera as well as in the computing effort. The latter will be instrumental in designing the next step: how to distribute LSST data to Europe. Astroinformatics challenges for LSST indeed include not only the analysis of LSST big data, but also the practical efficiency of data access.

  20. Cardiac action potential imaging

    NASA Astrophysics Data System (ADS)

    Tian, Qinghai; Lipp, Peter; Kaestner, Lars

    2013-06-01

    Action potentials in cardiac myocytes have durations on the order of 100 milliseconds. In biomedical investigations, documenting merely the occurrence of action potentials is often not sufficient; recording the shape of an action potential allows a functional estimation of several molecular players. Therefore a temporal resolution of around 500 images per second is compulsory. In the past such measurements have been performed with photometric approaches, limiting the measurement to one cell at a time. In contrast, imaging allows several cells to be read out at a time, with additional spatial information. Recent developments in camera technologies allow acquisition with the required speed and sensitivity. We performed action potential imaging on isolated adult cardiomyocytes of guinea pigs utilizing the fluorescent membrane potential sensor di-8-ANEPPS and the latest electron-multiplying CCD as well as scientific CMOS cameras from several manufacturers. Furthermore, we characterized the signal-to-noise ratio of action potential signals for varying sets of cameras, dye concentrations and objective lenses. We ensured that di-8-ANEPPS itself did not alter action potentials by avoiding concentrations above 5 μM. Based on these results we can conclude that imaging is a reliable method to read out action potentials. Compared to conventional current-clamp experiments, this optical approach allows a much higher throughput and, owing to its contact-free concept, leaves the cell largely undisturbed. Action potential imaging based on isolated adult cardiomyocytes can be utilized in pharmacological cardiac safety screens, bearing numerous advantages over approaches based on heterologous expression of hERG channels in cell lines.
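    As a rough illustration of the signal-to-noise characterization mentioned above, the sketch below computes an SNR for a single fluorescence trace as the peak fractional change (dF/F0) over the baseline noise; this particular definition and the baseline window are assumptions for the example, not necessarily the paper's convention.

```python
import numpy as np

def action_potential_snr(trace, baseline_frames=100):
    """Signal-to-noise ratio of an ROI fluorescence trace: peak fractional
    change (dF/F0) divided by the standard deviation of the baseline.
    Illustrative definition only."""
    trace = np.asarray(trace, dtype=float)
    f0 = trace[:baseline_frames].mean()            # resting fluorescence
    dff = (trace - f0) / f0                        # fractional change
    noise = dff[:baseline_frames].std()            # baseline noise level
    return float(np.abs(dff).max() / noise)
```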

  1. Research highlights: microfluidics meets big data.

    PubMed

    Tseng, Peter; Weaver, Westbrook M; Masaeli, Mahdokht; Owsley, Keegan; Di Carlo, Dino

    2014-03-07

    In this issue we highlight a collection of recent work in which microfluidic parallelization and automation have been employed to address the increasing need for large amounts of quantitative data concerning cellular function--from correlating microRNA levels to protein expression, increasing the throughput and reducing the noise when studying protein dynamics in single cells, and understanding how signal dynamics encodes information. The painstaking dissection of cellular pathways one protein at a time appears to be coming to an end, leading to more rapid discoveries which will inevitably translate to better cellular control--in producing useful gene products and treating disease at the individual cell level. From these studies it is also clear that the development of large-scale mutant or fusion libraries, automation of microscopy, image analysis, and data extraction will be key components as microfluidics contributes its strengths to aid systems biology moving forward.

  2. MDTM: Optimizing Data Transfer using Multicore-Aware I/O Scheduling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Liang; Demar, Phil; Wu, Wenji

    2017-05-09

    Bulk data transfer is facing significant challenges in the coming era of big data. There are multiple performance bottlenecks along the end-to-end path from the source to destination storage system. The limitations of current generation data transfer tools themselves can have a significant impact on end-to-end data transfer rates. In this paper, we identify the issues that lead to underperformance of these tools, and present a new data transfer tool with an innovative I/O scheduler called MDTM. The MDTM scheduler exploits underlying multicore layouts to optimize throughput by reducing delay and contention for I/O reading and writing operations. With our evaluations, we show how MDTM successfully avoids NUMA-based congestion and significantly improves end-to-end data transfer rates across high-speed wide area networks.

  3. MDTM: Optimizing Data Transfer using Multicore-Aware I/O Scheduling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Liang; Demar, Phil; Wu, Wenji

    2017-01-01

    Bulk data transfer is facing significant challenges in the coming era of big data. There are multiple performance bottlenecks along the end-to-end path from the source to destination storage system. The limitations of current generation data transfer tools themselves can have a significant impact on end-to-end data transfer rates. In this paper, we identify the issues that lead to underperformance of these tools, and present a new data transfer tool with an innovative I/O scheduler called MDTM. The MDTM scheduler exploits underlying multicore layouts to optimize throughput by reducing delay and contention for I/O reading and writing operations. With our evaluations, we show how MDTM successfully avoids NUMA-based congestion and significantly improves end-to-end data transfer rates across high-speed wide area networks.

  4. Line-Focused Optical Excitation of Parallel Acoustic Focused Sample Streams for High Volumetric and Analytical Rate Flow Cytometry.

    PubMed

    Kalb, Daniel M; Fencl, Frank A; Woods, Travis A; Swanson, August; Maestas, Gian C; Juárez, Jaime J; Edwards, Bruce S; Shreve, Andrew P; Graves, Steven W

    2017-09-19

    Flow cytometry provides highly sensitive multiparameter analysis of cells and particles but has been largely limited to the use of a single focused sample stream. This limits the analytical rate to ∼50K particles/s and the volumetric rate to ∼250 μL/min. Despite the analytical prowess of flow cytometry, there are applications where these rates are insufficient, such as rare cell analysis in high cellular backgrounds (e.g., circulating tumor cells and fetal cells in maternal blood), detection of cells/particles in large dilute samples (e.g., water quality, urine analysis), or high-throughput screening applications. Here we report a highly parallel acoustic flow cytometer that uses an acoustic standing wave to focus particles into 16 parallel analysis points across a 2.3 mm wide optical flow cell. A line-focused laser and wide-field collection optics are used to excite and collect the fluorescence emission of these parallel streams onto a high-speed camera for analysis. With this instrument format and fluorescent microsphere standards, we obtain analysis rates of 100K/s and flow rates of 10 mL/min, while maintaining optical performance comparable to that of a commercial flow cytometer. The results with our initial prototype instrument demonstrate that the integration of key parallelizable components, including the line-focused laser, particle focusing using multinode acoustic standing waves, and a spatially arrayed detector, can increase analytical and volumetric throughputs by orders of magnitude in a compact, simple, and cost-effective platform. Such instruments will be of great value to applications in need of high-throughput yet sensitive flow cytometry analysis.

  5. Wavelength Scanning with a Tilting Interference Filter for Glow-Discharge Elemental Imaging.

    PubMed

    Storey, Andrew P; Ray, Steven J; Hoffmann, Volker; Voronov, Maxim; Engelhard, Carsten; Buscher, Wolfgang; Hieftje, Gary M

    2017-06-01

    Glow discharges have long been used for depth profiling and bulk analysis of solid samples. In addition, over the past decade, several methods of obtaining lateral surface elemental distributions have been introduced, each with its own strengths and weaknesses. Challenges for each of these techniques are acceptable optical throughput and added instrumental complexity. Here, these problems are addressed with a tilting-filter instrument. A pulsed glow discharge is coupled to an optical system comprising an adjustable-angle tilting filter, collimating and imaging lenses, and a gated, intensified charge-coupled device (CCD) camera, which together provide surface elemental mapping of solid samples. The tilting-filter spectrometer is instrumentally simpler, produces less image distortion, and achieves higher optical throughput than a monochromator-based instrument, but has a much more limited tunable spectral range and poorer spectral resolution. As a result, the tilting-filter spectrometer is limited to single-element or two-element determinations, and only when the target spectral lines fall within an appropriate spectral range and can be spectrally discerned. Spectral interferences that result from heterogeneous impurities can be flagged and overcome by observing the spatially resolved signal response across the available tunable spectral range. The instrument has been characterized and evaluated for the spatially resolved analysis of glow-discharge emission from selected but representative samples.

  6. Development of new photon-counting detectors for single-molecule fluorescence microscopy.

    PubMed

    Michalet, X; Colyer, R A; Scalia, G; Ingargiola, A; Lin, R; Millaud, J E; Weiss, S; Siegmund, Oswald H W; Tremsin, Anton S; Vallerga, John V; Cheng, A; Levi, M; Aharoni, D; Arisaka, K; Villa, F; Guerrieri, F; Panzeri, F; Rech, I; Gulinatti, A; Zappa, F; Ghioni, M; Cova, S

    2013-02-05

    Two optical configurations are commonly used in single-molecule fluorescence microscopy: point-like excitation and detection to study freely diffusing molecules, and wide field illumination and detection to study surface immobilized or slowly diffusing molecules. Both approaches have common features, but also differ in significant aspects. In particular, they use different detectors, which share some requirements but also have major technical differences. Currently, two types of detectors best fulfil the needs of each approach: single-photon-counting avalanche diodes (SPADs) for point-like detection, and electron-multiplying charge-coupled devices (EMCCDs) for wide field detection. However, there is room for improvements in both cases. The first configuration suffers from low throughput owing to the analysis of data from a single location. The second, on the other hand, is limited to relatively low frame rates and loses the benefit of single-photon-counting approaches. During the past few years, new developments in point-like and wide field detectors have started addressing some of these issues. Here, we describe our recent progress towards increasing the throughput of single-molecule fluorescence spectroscopy in solution using parallel arrays of SPADs. We also discuss our development of large area photon-counting cameras achieving subnanosecond resolution for fluorescence lifetime imaging applications at the single-molecule level.

  7. Development of new photon-counting detectors for single-molecule fluorescence microscopy

    PubMed Central

    Michalet, X.; Colyer, R. A.; Scalia, G.; Ingargiola, A.; Lin, R.; Millaud, J. E.; Weiss, S.; Siegmund, Oswald H. W.; Tremsin, Anton S.; Vallerga, John V.; Cheng, A.; Levi, M.; Aharoni, D.; Arisaka, K.; Villa, F.; Guerrieri, F.; Panzeri, F.; Rech, I.; Gulinatti, A.; Zappa, F.; Ghioni, M.; Cova, S.

    2013-01-01

    Two optical configurations are commonly used in single-molecule fluorescence microscopy: point-like excitation and detection to study freely diffusing molecules, and wide field illumination and detection to study surface immobilized or slowly diffusing molecules. Both approaches have common features, but also differ in significant aspects. In particular, they use different detectors, which share some requirements but also have major technical differences. Currently, two types of detectors best fulfil the needs of each approach: single-photon-counting avalanche diodes (SPADs) for point-like detection, and electron-multiplying charge-coupled devices (EMCCDs) for wide field detection. However, there is room for improvements in both cases. The first configuration suffers from low throughput owing to the analysis of data from a single location. The second, on the other hand, is limited to relatively low frame rates and loses the benefit of single-photon-counting approaches. During the past few years, new developments in point-like and wide field detectors have started addressing some of these issues. Here, we describe our recent progress towards increasing the throughput of single-molecule fluorescence spectroscopy in solution using parallel arrays of SPADs. We also discuss our development of large area photon-counting cameras achieving subnanosecond resolution for fluorescence lifetime imaging applications at the single-molecule level. PMID:23267185

  8. MultiSense: A Multimodal Sensor Tool Enabling the High-Throughput Analysis of Respiration.

    PubMed

    Keil, Peter; Liebsch, Gregor; Borisjuk, Ljudmilla; Rolletschek, Hardy

    2017-01-01

    The high-throughput analysis of respiratory activity has become an important component of many biological investigations. Here, a technological platform, denoted the "MultiSense tool," is described. The tool enables the parallel monitoring of respiration in 100 samples over an extended time period, by dynamically tracking the concentrations of oxygen (O2) and/or carbon dioxide (CO2) and/or pH within an airtight vial. Its flexible design supports the quantification of respiration based on either oxygen consumption or carbon dioxide release, thereby allowing for the determination of the physiologically significant respiratory quotient (the ratio between the quantities of CO2 released and O2 consumed). It requires an LED light source to be mounted above the sample, together with a CCD camera system, adjusted to enable the capture of analyte-specific wavelengths, and fluorescent sensor spots inserted into the sample vial. Here, a demonstration is given of the use of the MultiSense tool to quantify respiration in imbibing plant seeds, for which an appropriate step-by-step protocol is provided. The technology can be easily adapted for a wide range of applications, including the monitoring of gas exchange in any kind of liquid culture system (algae, embryo and tissue culture, cell suspensions, microbial cultures).
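    The respiratory quotient mentioned above is a simple ratio; a one-line helper makes the definition explicit (the numbers in the usage example are placeholders, not measurements from the paper):

```python
def respiratory_quotient(co2_released_mol, o2_consumed_mol):
    """RQ = moles of CO2 released / moles of O2 consumed.
    Values near 1.0 are typical of carbohydrate oxidation; ~0.7 of lipids."""
    return co2_released_mol / o2_consumed_mol

# Example: 0.8 umol CO2 released while 1.0 umol O2 consumed -> RQ = 0.8
print(respiratory_quotient(0.8e-6, 1.0e-6))
```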

  9. Automating fruit fly Drosophila embryo injection for high throughput transgenic studies

    NASA Astrophysics Data System (ADS)

    Cornell, E.; Fisher, W. W.; Nordmeyer, R.; Yegian, D.; Dong, M.; Biggin, M. D.; Celniker, S. E.; Jin, J.

    2008-01-01

    To decipher and manipulate the 14 000 identified Drosophila genes, there is a need to inject a large number of embryos with transgenes. We have developed an automated instrument for high throughput injection of Drosophila embryos. It was built on an inverted microscope, equipped with a motorized xy stage, autofocus, a charge coupled device camera, and an injection needle mounted on a high speed vertical stage. A novel, micromachined embryo alignment device was developed to facilitate the arrangement of a large number of eggs. The control system included intelligent and dynamic imaging and analysis software and an embryo injection algorithm imitating a human operator. Once the injection needle and embryo slide are loaded, the software automatically images and characterizes each embryo and subsequently injects DNA into all suitable embryos. The ability to program needle flushing and monitor needle status after each injection ensures reliable delivery of biomaterials. Using this instrument, we performed a set of transformation injection experiments. The robot achieved injection speeds and transformation efficiencies comparable to those of a skilled human injector. Because it can be programmed to allow injection at various locations in the embryo, such as the anterior pole or along the dorsal or ventral axes, this system is also suitable for injection of general biochemicals, including drugs and RNAi.

  10. AutoClickChem: click chemistry in silico.

    PubMed

    Durrant, Jacob D; McCammon, J Andrew

    2012-01-01

    Academic researchers and many in industry often lack the financial resources available to scientists working in "big pharma." High costs include those associated with high-throughput screening and chemical synthesis. In order to address these challenges, many researchers have in part turned to alternate methodologies. Virtual screening, for example, often substitutes for high-throughput screening, and click chemistry ensures that chemical synthesis is fast, cheap, and comparatively easy. Though both in silico screening and click chemistry seek to make drug discovery more feasible, it is not yet routine to couple these two methodologies. We here present a novel computer algorithm, called AutoClickChem, capable of performing many click-chemistry reactions in silico. AutoClickChem can be used to produce large combinatorial libraries of compound models for use in virtual screens. As the compounds of these libraries are constructed according to the reactions of click chemistry, they can be easily synthesized for subsequent testing in biochemical assays. Additionally, in silico modeling of click-chemistry products may prove useful in rational drug design and drug optimization. AutoClickChem is based on the pymolecule toolbox, a framework that may facilitate the development of future python-based programs that require the manipulation of molecular models. Both the pymolecule toolbox and AutoClickChem are released under the GNU General Public License version 3 and are available for download from http://autoclickchem.ucsd.edu.

  11. Extraction of drainage networks from large terrain datasets using high throughput computing

    NASA Astrophysics Data System (ADS)

    Gong, Jianya; Xie, Jibo

    2009-02-01

    Advanced digital photogrammetry and remote sensing technology produces large terrain datasets (LTD). How to process and use these LTD has become a big challenge for GIS users. Extracting drainage networks, which are basic for hydrological applications, from LTD is one of the typical applications of digital terrain analysis (DTA) in geographical information applications. Existing serial drainage algorithms cannot deal with large data volumes in a timely fashion, and few GIS platforms can process LTD beyond the GB size. High throughput computing (HTC), a distributed parallel computing mode, is proposed to improve the efficiency of drainage networks extraction from LTD. Drainage network extraction using HTC involves two key issues: (1) how to decompose the large DEM datasets into independent computing units and (2) how to merge the separate outputs into a final result. A new decomposition method is presented in which the large datasets are partitioned into independent computing units using natural watershed boundaries instead of using regular 1-dimensional (strip-wise) and 2-dimensional (block-wise) decomposition. Because the distribution of drainage networks is strongly related to watershed boundaries, the new decomposition method is more effective and natural. The method to extract natural watershed boundaries was improved by using multi-scale DEMs instead of single-scale DEMs. A HTC environment is employed to test the proposed methods with real datasets.
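    The decompose-and-merge pattern described above can be sketched as follows: label each natural watershed, treat every labeled region as an independent computing unit, and merge the per-unit outputs at the end. The sketch below is illustrative only; it uses a local process pool in place of a real high-throughput-computing scheduler, and the per-unit drainage routine is a placeholder rather than the authors' algorithm.

```python
import numpy as np
from multiprocessing import Pool

def extract_unit_network(args):
    """Placeholder per-unit computation: a serial drainage-extraction routine
    (e.g., flow direction and accumulation) would run on one watershed's tile."""
    unit_id, dem_tile = args
    # ... serial drainage extraction on dem_tile would go here ...
    return unit_id, {"streams": []}

def extract_by_watershed(dem, watershed_labels, workers=4):
    """Decompose the DEM along natural watershed boundaries, process each unit
    independently, and merge the per-unit results into one dictionary.
    Wrap the call in an `if __name__ == "__main__":` guard on spawn-based platforms."""
    units = []
    for uid in np.unique(watershed_labels):
        mask = watershed_labels == uid
        tile = np.where(mask, dem, np.nan)     # mask out cells of other watersheds
        units.append((int(uid), tile))
    with Pool(workers) as pool:
        results = dict(pool.map(extract_unit_network, units))
    return results                              # merge step: per-unit outputs combined
```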

  12. AutoClickChem: Click Chemistry in Silico

    PubMed Central

    Durrant, Jacob D.; McCammon, J. Andrew

    2012-01-01

    Academic researchers and many in industry often lack the financial resources available to scientists working in “big pharma.” High costs include those associated with high-throughput screening and chemical synthesis. In order to address these challenges, many researchers have in part turned to alternate methodologies. Virtual screening, for example, often substitutes for high-throughput screening, and click chemistry ensures that chemical synthesis is fast, cheap, and comparatively easy. Though both in silico screening and click chemistry seek to make drug discovery more feasible, it is not yet routine to couple these two methodologies. We here present a novel computer algorithm, called AutoClickChem, capable of performing many click-chemistry reactions in silico. AutoClickChem can be used to produce large combinatorial libraries of compound models for use in virtual screens. As the compounds of these libraries are constructed according to the reactions of click chemistry, they can be easily synthesized for subsequent testing in biochemical assays. Additionally, in silico modeling of click-chemistry products may prove useful in rational drug design and drug optimization. AutoClickChem is based on the pymolecule toolbox, a framework that may facilitate the development of future python-based programs that require the manipulation of molecular models. Both the pymolecule toolbox and AutoClickChem are released under the GNU General Public License version 3 and are available for download from http://autoclickchem.ucsd.edu. PMID:22438795

  13. A High-Throughput Assay for Rho Guanine Nucleotide Exchange Factors Based on the Transcreener GDP Assay.

    PubMed

    Reichman, Melvin; Schabdach, Amanda; Kumar, Meera; Zielinski, Tom; Donover, Preston S; Laury-Kleintop, Lisa D; Lowery, Robert G

    2015-12-01

    Ras homologous (Rho) family GTPases act as molecular switches controlling cell growth, movement, and gene expression by cycling between inactive guanosine diphosphate (GDP)- and active guanosine triphosphate (GTP)-bound conformations. Guanine nucleotide exchange factors (GEFs) positively regulate Rho GTPases by accelerating GDP dissociation to allow formation of the active, GTP-bound complex. Rho proteins are directly involved in cancer pathways, especially cell migration and invasion, and inhibiting GEFs holds potential as a therapeutic strategy to diminish Rho-dependent oncogenesis. Methods for measuring GEF activity suitable for high-throughput screening (HTS) are limited. We developed a simple, generic biochemical assay method for measuring GEF activity based on the fact that GDP dissociation is generally the rate-limiting step in the Rho GTPase catalytic cycle, and thus addition of a GEF causes an increase in steady-state GTPase activity. We used the Transcreener GDP Assay, which relies on selective immunodetection of GDP, to measure the GEF-dependent stimulation of steady-state GTP hydrolysis by small GTPases using Dbs (Dbl's big sister) as a GEF for Cdc42, RhoA, and RhoB. The assay is well suited for HTS, with a homogenous format and far red fluorescence polarization (FP) readout, and it should be broadly applicable to diverse Rho GEF/GTPase pairs. © 2015 Society for Laboratory Automation and Screening.

  14. Miniaturizing 3D assay for high-throughput drug and genetic screens for small patient-derived tumor samples (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Rotem, Asaf; Garraway, Levi; Su, Mei-Ju; Basu, Anindita; Regev, Aviv; Struhl, Kevin

    2017-02-01

    Three-dimensional growth conditions reflect the natural environment of cancer cells and are crucial for drug screens. We developed a 3D assay for cellular transformation that involves growth in low attachment (GILA) conditions and is strongly correlated with the 50-year-old benchmark assay, soft agar. Using GILA, we performed high-throughput screens for drugs and genes that selectively inhibit or increase transformation, but not proliferation. This phenotypic approach is complementary to our genetic approach, which utilizes single-cell RNA-sequencing of a patient sample to identify putative oncogenes that confer sensitivity to drugs designed to specifically inhibit the identified oncoprotein. Currently, we are dealing with a big challenge in our field: the limited number of cells that can be extracted from a biopsy. Small patient-derived samples are hard to test in traditional multiwell plates, so it is helpful to minimize the culture area and the experimental system. We designed a microfluidic device suitable for a limited number of cells and perform the assay using image analysis. We aim to test drugs on tumor cells outside of the patient's body and recommend the ideal treatment tailored to the individual. This device will help minimize biopsy-sampling volumes and minimize interventions in the patient's tumor.

  15. NASA Photographer Prepares to Film a Mercury Capsule

    NASA Image and Video Library

    1959-06-21

    National Aeronautics and Space Administration (NASA) photographer Arthur Laufman sets up a camera to film a Mercury capsule that was constructed by the Lewis Research Center staff. Lewis engineers and mechanics built two of the capsules for the upcoming Big Joe launches in September 1959. Big Joe was an attempt early in Project Mercury to use a full-scale Atlas booster to simulate the reentry of a mock-up Mercury capsule without actually placing it in orbit. The Photographic Branch, referred to as the Photo Lab, was part of the center’s Technical Reports Division. Originally the group performed normal and high-speed still image and motion picture photography. The photographers documented construction, performed publicity work, created images for reports, photographed data on manometer boards, and recorded test footage. Laufman joined the Photo Lab staff in 1948 and began producing full-length technical films as a tool to educate those outside of the agency on the research being conducted at Lewis. He worked with engineers to determine proper subjects for these films and develop a script. Laufman not only filmed tests, but also supporting footage of facilities, models, and staff members. He then edited the footage and added audio, visuals, and narration. The film masters were assigned standard identification numbers and added to the Photo Lab’s catalogue.

  16. Orthogonal strip HPGe planar SmartPET detectors in Compton configuration

    NASA Astrophysics Data System (ADS)

    Boston, H. C.; Gillam, J.; Boston, A. J.; Cooper, R. J.; Cresswell, J.; Grint, A. N.; Mather, A. R.; Nolan, P. J.; Scraggs, D. P.; Turk, G.; Hall, C. J.; Lazarus, I.; Berry, A.; Beveridge, T.; Lewis, R.

    2007-10-01

    The evolution of Germanium detector technology over the last decade has led to the possibility that such detectors can be employed in medical and security imaging. The excellent energy resolution coupled with the good position information that Germanium affords removes the necessity for the mechanical collimators that would be required in a conventional gamma camera system. By removing this constraint, the overall dose to the patient can be reduced or the throughput of the system can be increased. An additional benefit of excellent energy resolution is that tight gates can be placed on energies from either a multi-lined gamma source or from multi-nuclide sources, increasing the number of sources that can be used in medical imaging. In terms of security imaging, segmented Germanium gives directionality and excellent spectroscopic information.

  17. Evaluating Dense 3d Reconstruction Software Packages for Oblique Monitoring of Crop Canopy Surface

    NASA Astrophysics Data System (ADS)

    Brocks, S.; Bareth, G.

    2016-06-01

    Crop Surface Models (CSMs) are 2.5D raster surfaces representing absolute plant canopy height. Using multiple CSMs generated from data acquired at multiple time steps, crop surface monitoring is enabled. This makes it possible to monitor crop growth over time and can be used for monitoring in-field crop growth variability, which is useful in the context of high-throughput phenotyping. This study aims to evaluate several software packages for dense 3D reconstruction from multiple overlapping RGB images at field and plot scale. A summer barley field experiment located at the Campus Klein-Altendorf of the University of Bonn was observed by acquiring stereo images from an oblique angle using consumer-grade smart cameras. Two such cameras were mounted at an elevation of 10 m and acquired images for a period of two months during the growing period of 2014. The field experiment consisted of nine barley cultivars that were cultivated in multiple repetitions and nitrogen treatments. Manual plant height measurements were carried out at four dates during the observation period. The software packages Agisoft PhotoScan, VisualSfM with CMVS/PMVS2, and SURE are investigated. The point clouds are georeferenced through a set of ground control points. Where adequate results are reached, a statistical analysis is performed.

  18. Earth observations taken during the STS-103 mission

    NASA Image and Video Library

    1999-12-23

    STS103-730-032 (19-27 December 1999) --- One of the astronauts aboard the Earth-orbiting Space Shuttle Discovery used a handheld 70mm camera to capture the southern to middle Rocky Mountains in low sunlight. The middle Rockies include the Big Horn range of Wyoming (snow-capped range almost center of horizon) and the Uinta Mountains of northeastern Utah (snow-capped range left side of horizon). The southern Rockies include the Front Range, Sangre de Cristo Mountains, Sawatch Range, and the San Juan Mountains. The eastern ranges (Front Range, Sangre de Cristo) and western ranges (Sawatch, San Juan) are separated by intermontane basins. The southernmost basin (near center of the image) is the San Luis Valley of Colorado. On the eastern edge of the San Luis Valley are the Sangre de Cristo Mountains.

  19. Northwest corner of Wyoming

    NASA Image and Video Library

    1974-02-01

    SL4-138-3846 (February 1974) --- A near vertical view of the snow-covered northwest corner of Wyoming as seen from the Skylab space station in Earth orbit. A Skylab 4 crewman used a hand-held 70mm Hasselblad camera to take this picture. A small portion of Montana and Idaho is seen in this photograph also. The dark area is Yellowstone National Park. The largest body of water is Yellowstone Lake. The Absaroka Range is immediately east and northeast of Yellowstone Lake. The elongated range in the eastern part of the picture is the Big Horn Mountain range. The Wind River Range is at bottom center. The Grand Teton National Park area is almost straight south of Yellowstone Lake. Approximately 30 per cent of the state of Wyoming can be seen in this photograph. Photo credit: NASA

  20. Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Mather, John; Stockman, H. S.; Fisher, Richard R. (Technical Monitor)

    2000-01-01

    The Next Generation Space Telescope (NGST), planned for launch in 2009, will be an 8-m class radiatively cooled infrared telescope at the Lagrange point L2. It will cover the wavelength range from 0.6 to 28 microns with cameras and spectrometers, to observe the first luminous objects after the Big Bang, and the formation, growth, clustering, and evolution of galaxies, stars, and protoplanetary clouds, leading to better understanding of our own Origins. It will seek evidence of the cosmic dark matter through its gravitational effects. With an aperture three times greater than the Hubble Space Telescope, it will provide extraordinary advances in capabilities and enable the discovery of many new phenomena. It is a joint project of NASA, ESA, and CSA, and scientific operations will be provided by the Space Telescope Science Institute.

  1. Implementation of remote monitoring and managing switches

    NASA Astrophysics Data System (ADS)

    Leng, Junmin; Fu, Guo

    2010-12-01

    In order to strengthen the safety performance of the network and provide greater convenience and efficiency for operators and managers, a system for remote monitoring and managing of switches has been designed and implemented using advanced network technology and existing network resources. A fast-speed Internet Protocol Camera (FS IP Camera) is selected, which has a 32-bit RISC embedded processor and can support a number of protocols. An optimal image compression algorithm, Motion-JPEG, is adopted so that high-resolution images can be transmitted over narrow network bandwidth. The architecture of the whole monitoring and managing system is designed and implemented according to the current infrastructure of the network and switches. The control and administration software is designed accordingly. The dynamic webpage Java Server Pages (JSP) development platform is utilized in the system. A SQL (Structured Query Language) Server database is applied to save and access image information, network messages, and users' data. The reliability and security of the system is further strengthened by access control. The software in the system is made to be cross-platform so that multiple operating systems (UNIX, Linux, and Windows) are supported. The application of the system can greatly reduce manpower costs and allows problems to be found and solved quickly.

  2. Gigavision - A weatherproof, multibillion pixel resolution time-lapse camera system for recording and tracking phenology in every plant in a landscape

    NASA Astrophysics Data System (ADS)

    Brown, T.; Borevitz, J. O.; Zimmermann, C.

    2010-12-01

    We have developed a camera system that can record hourly, gigapixel (multi-billion pixel) scale images of an ecosystem in a 360x90 degree panorama. The “Gigavision” camera system is solar-powered and can wirelessly stream data to a server. Quantitative data collection from multiyear time-lapse gigapixel images is facilitated through an innovative web-based toolkit for recording time-series data on developmental stages (phenology) from any plant in the camera’s field of view. Gigapixel images enable time-series recording of entire landscapes with a resolution sufficient to record phenology from a majority of individuals in entire populations of plants. When coupled with next generation sequencing, quantitative population genomics can be performed in a landscape context, linking ecology and evolution in situ and in real time. The Gigavision camera system achieves gigapixel image resolution by recording rows and columns of overlapping megapixel images. These images are stitched together into a single gigapixel resolution image using commercially available panorama software. Hardware consists of a 5-18 megapixel resolution DSLR or network IP camera mounted on a pair of heavy-duty servo motors that provide pan-tilt capabilities. The servos and camera are controlled with a low-power Windows PC. Servo movement, power switching, and system status monitoring are enabled with Phidgets-brand sensor boards. System temperature, humidity, power usage, and battery voltage are all monitored at 5 minute intervals. All sensor data are uploaded via cellular or 802.11 wireless to an interactive online interface for easy remote monitoring of system status. Systems with direct internet connections upload the full-sized images directly to our automated stitching server, where they are stitched and available online for viewing within an hour of capture. Systems with cellular wireless upload an 80 megapixel “thumbnail” of each larger panorama, and full-sized images are manually retrieved at bi-weekly intervals. Our longer-term goal is to make gigapixel time-lapse datasets available online in an interactive interface that layers plant-level phenology data with gigapixel resolution images, genomic sequence data from individual plants, and weather and other abiotic sensor data. Co-visualization of all of these data types provides researchers with a powerful new tool for examining complex ecological interactions across scales from the individual to the ecosystem. We will present detailed phenostage data from more than 100 plants of multiple species from our Gigavision time-lapse camera at our “Big Blowout East” field site in the Indiana Dunes State Park, IN. This camera has been recording three to four 700 million pixel images a day since February 28, 2010. The camera field of view covers an area of about 7 hectares, resulting in an average image resolution of about 1 pixel per centimeter over the entire site. We will also discuss some of the many technological challenges of developing and maintaining these types of hardware systems, collecting quantitative data from gigapixel resolution time-lapse data, and effectively managing terabyte-sized datasets of millions of images.
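
    As a rough illustration of how such a pan-tilt capture grid can be derived, the sketch below computes overlapping (pan, tilt) positions from an assumed per-frame field of view and overlap fraction; the numbers are placeholders, not the Gigavision system's actual parameters.

```python
# Hypothetical pan-tilt grid for a gigapixel panorama built from overlapping frames.
def capture_grid(pan_range=360.0, tilt_range=90.0, fov_h=20.0, fov_v=15.0, overlap=0.3):
    """Yield (pan, tilt) angles in degrees that cover the panorama with overlapping frames."""
    step_h = fov_h * (1.0 - overlap)
    step_v = fov_v * (1.0 - overlap)
    tilt = 0.0
    while tilt < tilt_range:
        pan = 0.0
        while pan < pan_range:
            yield pan, tilt
            pan += step_h
        tilt += step_v

positions = list(capture_grid())
print(f"{len(positions)} frames per panorama")
```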

  3. Cell classification using big data analytics plus time stretch imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Jalali, Bahram; Chen, Claire L.; Mahjoubfar, Ata

    2016-09-01

    We show that blood cells can be classified with high accuracy and high throughput by combining machine learning with time stretch quantitative phase imaging. Our diagnostic system captures quantitative phase images in a flow microscope at millions of frames per second and extracts multiple biophysical features from individual cells including morphological characteristics, light absorption and scattering parameters, and protein concentration. These parameters form a hyperdimensional feature space in which supervised learning and cell classification is performed. We show binary classification of T-cells against colon cancer cells, as well as classification of algae cell strains with high and low lipid content. The label-free screening averts the negative impact of staining reagents on cellular viability or cell signaling. The combination of time stretch machine vision and learning offers unprecedented cell analysis capabilities for cancer diagnostics, drug development and liquid biopsy for personalized genomics.
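
    A minimal sketch of the supervised-learning step described above, assuming the biophysical features have already been extracted per cell; the random feature matrix and labels below are stand-ins for the real time-stretch measurements, and scikit-learn is used purely for illustration.

```python
# Toy supervised classification over a per-cell biophysical feature space.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 16))      # stand-in features: morphology, absorption, scattering, ...
y = rng.integers(0, 2, size=5000)    # stand-in labels, e.g. 0 = T-cell, 1 = colon cancer cell

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```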

  4. Cheminformatics in Drug Discovery, an Industrial Perspective.

    PubMed

    Chen, Hongming; Kogej, Thierry; Engkvist, Ola

    2018-05-18

    Cheminformatics has established itself as a core discipline within large scale drug discovery operations. It would be impossible to handle the amount of data generated today in a small molecule drug discovery project without persons skilled in cheminformatics. In addition, due to increased emphasis on "Big Data", machine learning and artificial intelligence, not only in society in general, but also in drug discovery, it is expected that the cheminformatics field will be even more important in the future. Traditional areas like virtual screening, library design and high-throughput screening analysis are highlighted in this review. Applying machine learning in drug discovery is an area that has become very important. Applications of machine learning in early drug discovery have been extended from predicting ADME properties and target activity to tasks like de novo molecular design and prediction of chemical reactions. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Intelligent Access to Sequence and Structure Databases (IASSD) - an interface for accessing information from major web databases.

    PubMed

    Ganguli, Sayak; Gupta, Manoj Kumar; Basu, Protip; Banik, Rahul; Singh, Pankaj Kumar; Vishal, Vineet; Bera, Abhisek Ranjan; Chakraborty, Hirak Jyoti; Das, Sasti Gopal

    2014-01-01

    With the advent of the age of big data and advances in high-throughput technology, accessing data has become one of the most important steps in the entire knowledge discovery process. Most users are not able to decipher the query result that is obtained when non-specific keywords or a combination of keywords are used. Intelligent Access to Sequence and Structure Databases (IASSD) is a desktop application for the Windows operating system. It is written in Java and utilizes the Web Services Description Language (WSDL) files and Jar files of the E-utilities of various databases such as the National Center for Biotechnology Information (NCBI) and the Protein Data Bank (PDB). Apart from that, IASSD allows the user to view protein structure using a Jmol application which supports conditional editing. The Jar file is freely available through e-mail from the corresponding author.
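
    For readers unfamiliar with the E-utilities that IASSD wraps, the hedged sketch below shows the same kind of programmatic keyword query against NCBI, written in Python with the requests library rather than the application's Java/WSDL stack; the database and search term are arbitrary examples.

```python
# Minimal keyword search against the NCBI E-utilities (esearch), for illustration only.
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def esearch(db, term, retmax=5):
    """Return the IDs of records matching a keyword query in an NCBI database."""
    r = requests.get(f"{EUTILS}/esearch.fcgi",
                     params={"db": db, "term": term, "retmax": retmax, "retmode": "json"})
    r.raise_for_status()
    return r.json()["esearchresult"]["idlist"]

print(esearch("protein", "hemoglobin AND human[orgn]"))
```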

  6. EUV local CDU healing performance and modeling capability towards 5nm node

    NASA Astrophysics Data System (ADS)

    Jee, Tae Kwon; Timoshkov, Vadim; Choi, Peter; Rio, David; Tsai, Yu-Cheng; Yaegashi, Hidetami; Koike, Kyohei; Fonseca, Carlos; Schoofs, Stijn

    2017-10-01

    Both local variability and optical proximity correction (OPC) errors are big contributors to the edge placement error (EPE) budget which is closely related to the device yield. The post-litho contact hole healing will be demonstrated to meet after-etch local variability specifications using a low dose, 30mJ/cm2 dose-to-size, positive tone developed (PTD) resist with relevant throughput in high volume manufacturing (HVM). The total local variability of the node 5nm (N5) contact holes will be characterized in terms of local CD uniformity (LCDU), local placement error (LPE), and contact edge roughness (CER) using a statistical methodology. The CD healing process has complex etch proximity effects, so the OPC prediction accuracy is challenging to meet EPE requirements for the N5. Thus, the prediction accuracy of an after-etch model will be investigated and discussed using ASML Tachyon OPC model.

  7. XRootD popularity on hadoop clusters

    NASA Astrophysics Data System (ADS)

    Meoni, Marco; Boccali, Tommaso; Magini, Nicolò; Menichetti, Luca; Giordano, Domenico; CMS Collaboration

    2017-10-01

    Performance data and metadata of the computing operations at the CMS experiment are collected through a distributed monitoring infrastructure, currently relying on a traditional Oracle database system. This paper shows how to harness Big Data architectures in order to improve the throughput and the efficiency of such monitoring. A large set of operational data - user activities, job submissions, resources, file transfers, site efficiencies, software releases, network traffic, machine logs - is being injected into a readily available Hadoop cluster via several data streamers. The collected metadata is further organized to run fast arbitrary queries; this offers the ability to test several MapReduce-based frameworks and measure the system speed-up when compared to the original database infrastructure. By leveraging a quality Hadoop data store and enabling an analytics framework on top, it is possible to design a mining platform to predict dataset popularity and discover patterns and correlations.
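
    The kind of popularity query such a Hadoop store enables might look like the hedged PySpark sketch below; the input path, record schema and field names are assumptions chosen for illustration, not the CMS monitoring code.

```python
# Hypothetical dataset-popularity aggregation over file-access records stored on HDFS.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("xrootd-popularity").getOrCreate()

# Assumed schema: one JSON record per file open with "dataset", "user" and "bytes_read" fields.
accesses = spark.read.json("hdfs:///monitoring/xrootd/*.json")

popularity = (accesses
              .groupBy("dataset")
              .agg(F.count("*").alias("n_accesses"),
                   F.countDistinct("user").alias("n_users"),
                   F.sum("bytes_read").alias("bytes_read"))
              .orderBy(F.desc("n_accesses")))

popularity.show(20)
```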

  8. Sequential data access with Oracle and Hadoop: a performance comparison

    NASA Astrophysics Data System (ADS)

    Baranowski, Zbigniew; Canali, Luca; Grancher, Eric

    2014-06-01

    The Hadoop framework has proven to be an effective and popular approach for dealing with "Big Data" and, thanks to its scaling ability and optimised storage access, Hadoop Distributed File System-based projects such as MapReduce or HBase are seen as candidates to replace traditional relational database management systems whenever scalable speed of data processing is a priority. But do these projects deliver in practice? Does migrating to Hadoop's "shared nothing" architecture really improve data access throughput? And, if so, at what cost? The authors answer these questions - addressing cost/performance as well as raw performance - based on a performance comparison between an Oracle-based relational database and Hadoop's distributed solutions like MapReduce or HBase for sequential data access. A key feature of our approach is the use of an unbiased data model, as certain data models can significantly favour one of the technologies tested.

  9. Toward Personalized Control of Human Gut Bacterial Communities.

    PubMed

    David, Lawrence A

    2018-01-01

    A key challenge in microbiology will be developing tools for manipulating human gut bacterial communities. Our ability to predict and control the dynamics of these communities is now in its infancy. To manage human gut microbiota, I am developing methods in three research domains. First, I am refining in vitro tools to experimentally study gut microbes at high throughput and in controlled settings. Second, I am adapting "big data" techniques to overcome statistical challenges confronting microbiota modeling. Third, I am testing study designs that can streamline human testing of microbiota manipulations. Assembling these methods creates new challenges, including training scientists who can work across disciplines such as engineering, ecology, and medicine. Nevertheless, I envision that overcoming these obstacles will enable my group to construct platforms that can personalize microbiota treatments, particularly ones based on diet. More broadly, I anticipate that such platforms will have applications across fields such as agriculture, biotechnology, and environmental management.

  10. Feature-Based Approach for the Registration of Pushbroom Imagery with Existing Orthophotos

    NASA Astrophysics Data System (ADS)

    Xiong, Weifeng

    Low-cost Unmanned Airborne Vehicles (UAVs) are rapidly becoming suitable platforms for acquiring remote sensing data for a wide range of applications. For example, a UAV-based mobile mapping system (MMS) is emerging as a novel phenotyping tool that delivers several advantages to alleviate the drawbacks of conventional manual plant trait measurements. Moreover, UAVs equipped with direct geo-referenced frame cameras and pushbroom scanners can acquire geospatial data for comprehensive high-throughput phenotyping. UAVs for mobile mapping platforms are low-cost and easy to use, can fly closer to the objects, and are filling an important gap between ground wheel-based and traditional manned-airborne platforms. However, consumer-grade UAVs are capable of carrying only equipment with a relatively light payload and their flying time is determined by a limited battery life. These restrictions of UAVs unfortunately force potential users to adopt lower-quality direct geo-referencing and imaging systems that may negatively impact the quality of the deliverables. Recent advances in sensor calibration and automated triangulation have made it feasible to obtain accurate mapping using low-cost camera systems equipped with consumer-grade GNSS/INS units. However, ortho-rectification of the data from a linear-array scanner is challenging for low-cost UAV systems, because the derived geo-location information from pushbroom sensors is quite sensitive to the performance of the implemented direct geo-referencing unit. This thesis presents a novel approach for improving the ortho-rectification of hyperspectral pushbroom scanner imagery with the aid of orthophotos generated from frame cameras through the identification of conjugate features while modeling the impact of residual artifacts in the direct geo-referencing information. The experimental results qualitatively and quantitatively proved the feasibility of the proposed methodology in improving the geo-referencing accuracy of real datasets collected over an agricultural field.

  11. Harnessing NGS and Big Data Optimally: Comparison of miRNA Prediction from Assembled versus Non-assembled Sequencing Data--The Case of the Grass Aegilops tauschii Complex Genome.

    PubMed

    Budak, Hikmet; Kantar, Melda

    2015-07-01

    MicroRNAs (miRNAs) are small, endogenous, non-coding RNA molecules that regulate gene expression at the post-transcriptional level. As high-throughput next generation sequencing (NGS) and Big Data rapidly accumulate for various species, efforts for in silico identification of miRNAs intensify. Surprisingly, the effect of the input genomic sequence on the robustness of miRNA prediction has not been evaluated in detail to date. In the present study, we performed a homology-based miRNA and isomiRNA prediction of the 5D chromosome of the bread wheat progenitor, Aegilops tauschii, using two distinct sequence data sets as input: (1) raw sequence reads obtained from the 454-GS FLX Titanium sequencing platform and (2) an assembly constructed from these reads. We also compared this method with a number of available plant sequence datasets. We report here the identification of 62 and 22 miRNAs from raw reads and the assembly, respectively, of which 16 were predicted with high confidence from both datasets. While raw reads promoted sensitivity with the high number of miRNAs predicted, 55% (12 out of 22) of the assembly-based predictions were supported by previous observations, bringing specificity forward compared to the read-based predictions, of which only 37% were supported. Importantly, raw reads could identify several repeat-related miRNAs that could not be detected with the assembly. However, raw reads could not capture 6 miRNAs, for which the stem-loops could only be covered by the relatively longer sequences from the assembly. In summary, the comparison of miRNA datasets obtained by these two strategies revealed that utilization of raw reads, as well as assemblies, for in silico prediction has distinct advantages and disadvantages. Consideration of these important nuances can benefit future miRNA identification efforts in the current age of NGS and Big Data driven life sciences innovation.

  12. Hierarchical video surveillance architecture: a chassis for video big data analytics and exploration

    NASA Astrophysics Data System (ADS)

    Ajiboye, Sola O.; Birch, Philip; Chatwin, Christopher; Young, Rupert

    2015-03-01

    There is increasing reliance on video surveillance systems for systematic derivation, analysis and interpretation of the data needed for predicting, planning, evaluating and implementing public safety. This is evident from the massive number of surveillance cameras deployed across public locations. For example, in July 2013, the British Security Industry Association (BSIA) reported that over 4 million CCTV cameras had been installed in Britain alone. The BSIA also reveals that only 1.5% of these are state owned. In this paper, we propose a framework that allows access to data from privately owned cameras, with the aim of increasing the efficiency and accuracy of public safety planning, security activities, and decision support systems that are based on video integrated surveillance systems. The accuracy of results obtained from government-owned public safety infrastructure would improve greatly if privately owned surveillance systems `expose' relevant video-generated metadata events, such as triggered alerts, and also permit query of a metadata repository. Subsequently, a police officer, for example, with an appropriate level of system permission can query unified video systems across a large geographical area such as a city or a country to predict the location of an interesting entity, such as a pedestrian or a vehicle. This becomes possible with our proposed novel hierarchical architecture, the Fused Video Surveillance Architecture (FVSA). At the high level, FVSA comprises a hardware framework that is supported by a multi-layer abstraction software interface. It presents video surveillance systems as an adapted computational grid of intelligent services, which is integration-enabled to communicate with other compatible systems in the Internet of Things (IoT).

  13. Automatically identifying, counting, and describing wild animals in camera-trap images with deep learning.

    PubMed

    Norouzzadeh, Mohammad Sadegh; Nguyen, Anh; Kosmala, Margaret; Swanson, Alexandra; Palmer, Meredith S; Packer, Craig; Clune, Jeff

    2018-06-19

    Having accurate, detailed, and up-to-date information about the location and behavior of animals in the wild would improve our ability to study and conserve ecosystems. We investigate the ability to automatically, accurately, and inexpensively collect such data, which could help catalyze the transformation of many fields of ecology, wildlife biology, zoology, conservation biology, and animal behavior into "big data" sciences. Motion-sensor "camera traps" enable collecting wildlife pictures inexpensively, unobtrusively, and frequently. However, extracting information from these pictures remains an expensive, time-consuming, manual task. We demonstrate that such information can be automatically extracted by deep learning, a cutting-edge type of artificial intelligence. We train deep convolutional neural networks to identify, count, and describe the behaviors of 48 species in the 3.2 million-image Snapshot Serengeti dataset. Our deep neural networks automatically identify animals with >93.8% accuracy, and we expect that number to improve rapidly in years to come. More importantly, if our system classifies only images it is confident about, our system can automate animal identification for 99.3% of the data while still performing at the same 96.6% accuracy as that of crowdsourced teams of human volunteers, saving >8.4 y (i.e., >17,000 h at 40 h/wk) of human labeling effort on this 3.2 million-image dataset. Those efficiency gains highlight the importance of using deep neural networks to automate data extraction from camera-trap images, reducing a roadblock for this widely used technology. Our results suggest that deep learning could enable the inexpensive, unobtrusive, high-volume, and even real-time collection of a wealth of information about vast numbers of animals in the wild. Copyright © 2018 the Author(s). Published by PNAS.
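
    The confidence-based triage described above (automate the sure cases, defer the rest to human volunteers) can be sketched as follows; the Dirichlet-sampled probabilities are stand-ins for CNN softmax outputs, and the 0.95 threshold is an arbitrary example value.

```python
# Toy confidence thresholding: keep automatic labels only where the model is confident.
import numpy as np

def triage(probabilities, threshold=0.95):
    """Split predictions into automated labels and indices of images deferred to humans."""
    confidence = probabilities.max(axis=1)
    predictions = probabilities.argmax(axis=1)
    automated = confidence >= threshold
    return predictions[automated], np.flatnonzero(~automated)

rng = np.random.default_rng(1)
probs = rng.dirichlet(np.ones(48) * 0.1, size=10_000)   # stand-in softmax outputs for 48 species
labels, deferred = triage(probs)
print(f"automated: {labels.size}, deferred to humans: {deferred.size}")
```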

  14. Beesmart - a Crowdsourcing Project with Smartphones

    NASA Astrophysics Data System (ADS)

    Gülch, E.; Uddin, S.; Willi, B.

    2016-06-01

    The project Beesmart aims at the derivation of a geolocation yield catalogue for honey bees by using a crowd-sourcing approach with the help of smartphones. Central issues are thus the design of a smartphone application (App2bee) and the design of software for flower recognition, which uses sensor information from the smartphone and information about blooming times to recognize and localise flowers. The implemented flower recognition is based on the "Minimal-bag-of-visual-Words" approach. A classification accuracy of about 60-70% can be reached, which is of course affected by the big variety of flowers, by the way images are taken, and by the actual image quality and resolution. The classification results are further improved by applying a priori a simple manual segmentation on the touch screen to put the focus in the image on the flower in question. The design and the functionality of App2bee are presented, followed by details on the communication, database and web-portal components. In a second part of the project, the classification of larger areas of flowers important for honey bees is investigated using a fixed-wing UAV system with two different types of cameras, an RGB digital camera and a NIR digital camera. It is certainly not possible to recognize single flowers, but it could be shown that larger fields of the same flower, such as Red Clover, can be classified with this approach. With the data available it was also possible to classify bare ground, roads, low pasture, high pasture and mixed pasture. For the high pasture it was possible to automatically identify clusters of flowers, like Yarrow.
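
    A hedged outline of a bag-of-visual-words classifier of the kind the flower recognition is based on; the ORB descriptors, the k-means vocabulary size and the SVM are assumptions chosen for the sketch, not the App2bee implementation.

```python
# Minimal bag-of-visual-words pipeline: local descriptors -> vocabulary -> histograms -> SVM.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

orb = cv2.ORB_create()

def descriptors(image):
    """Local feature descriptors for one grayscale uint8 image."""
    _, desc = orb.detectAndCompute(image, None)
    return desc if desc is not None else np.empty((0, 32), np.uint8)

def bovw_histogram(image, vocabulary):
    """Quantise an image's descriptors against the visual vocabulary."""
    desc = descriptors(image).astype(np.float32)
    hist = np.zeros(vocabulary.n_clusters)
    if len(desc):
        words = vocabulary.predict(desc)
        hist, _ = np.histogram(words, bins=np.arange(vocabulary.n_clusters + 1))
    return hist / max(hist.sum(), 1)

def train(images, labels, n_words=100):
    """Build the vocabulary from all training descriptors, then fit an SVM on the histograms."""
    all_desc = np.vstack([descriptors(im).astype(np.float32) for im in images])
    vocabulary = KMeans(n_clusters=n_words, n_init=4, random_state=0).fit(all_desc)
    features = np.array([bovw_histogram(im, vocabulary) for im in images])
    return vocabulary, SVC(kernel="rbf").fit(features, labels)
```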

  15. Curiosity Self-Portrait at Big Sky Drilling Site

    NASA Image and Video Library

    2015-10-13

    This self-portrait of NASA's Curiosity Mars rover shows the vehicle at the "Big Sky" site, where its drill collected the mission's fifth taste of Mount Sharp. The scene combines dozens of images taken during the 1,126th Martian day, or sol, of Curiosity's work on Mars (Oct. 6, 2015, PDT), by the Mars Hand Lens Imager (MAHLI) camera at the end of the rover's robotic arm. The rock drilled at this site is sandstone in the Stimson geological unit inside Gale Crater. The location is on cross-bedded sandstone in which the cross bedding is more evident in views from when the rover was approaching the area, such as PIA19818. The view is centered toward the west-northwest. It does not include the rover's robotic arm, though the shadow of the arm is visible on the ground. Wrist motions and turret rotations on the arm allowed MAHLI to acquire the mosaic's component images. The arm was positioned out of the shot in the images, or portions of images, that were used in this mosaic. This process was used previously in acquiring and assembling Curiosity self-portraits taken at sample-collection sites "Rocknest" (PIA16468), "John Klein" (PIA16937) and "Windjana" (PIA18390). This portrait of the rover was designed to show the Chemistry and Camera (ChemCam) instrument atop the rover appearing level. This causes the horizon to appear to tilt toward the left, but in reality it is fairly flat. For scale, the rover's wheels are 20 inches (50 centimeters) in diameter and about 16 inches (40 centimeters) wide. The drilled hole in the rock, appearing grey near the lower left corner of the image, is 0.63 inch (1.6 centimeters) in diameter. MAHLI was built by Malin Space Science Systems, San Diego. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Science Laboratory Project for the NASA Science Mission Directorate, Washington. JPL designed and built the project's Curiosity rover. http://photojournal.jpl.nasa.gov/catalog/PIA19920

  16. Clinical and laboratory applications of slide-based cytometry with the LSC, SFM, and the iCYTE imaging cytometer instruments

    NASA Astrophysics Data System (ADS)

    Bocsi, Jozsef; Luther, Ed; Mittag, Anja; Jensen, Ingo; Sack, Ulrich; Lenz, Dominik; Trezl, Lajos; Varga, Viktor S.; Molnar, Beea; Tarnok, Attila

    2004-06-01

    Background: Slide-based cytometry (SBC) is a technology for the rapid stoichiometric analysis of cells fixed to surfaces. Its applications are highly versatile and range from the clinic to high-throughput drug discovery. SBC is realized in different instruments such as the Laser Scanning Cytometer (LSC), the Scanning Fluorescent Microscope (SFM) and the novel inverted-microscope-based iCyte image cytometer (Compucyte Corp.). Methods: Fluorochrome-labeled specimens were immobilized on microscopic slides. They were placed on a conventional fluorescence microscope and analyzed by photomultipliers or a digital camera. Data comparable to flow cytometry were generated. In addition, each individual event could be visualized. Applications: The major advantage of these instruments is the combination of two features: a) the minimal sample volume needed, and b) the connection of fluorescence data and morphological information. Rare cells were detected, and the frequency of apoptosis by myricetin formaldehyde and H2O2 mixtures was determined. Conclusion: LSC, SFM and the novel iCyte have a wide spectrum of applicability in SBC and can be introduced as a standard technology for multiple settings. In addition, the iCyte and SFM instruments are suited for high-throughput screening by automation and may in future be adapted to telepathology due to their high-quality images. (This study was supported by the IZKF-Leipzig, Germany and T 034245 OTKA, Hungary)

  17. Scaling up high throughput field phenotyping of corn and soy research plots using ground rovers

    NASA Astrophysics Data System (ADS)

    Peshlov, Boyan; Nakarmi, Akash; Baldwin, Steven; Essner, Scott; French, Jasenka

    2017-05-01

    Crop improvement programs require large and meticulous selection processes that effectively and accurately collect and analyze data in order to generate quality plant products as efficiently as possible and to develop superior cropping and/or crop improvement methods. Typically, data collection for such testing is performed by field teams using hand-held instruments or manually-controlled devices. Although steps are taken to reduce error, the data collected in such manner can be unreliable due to human error and fatigue, which reduces the ability to make accurate selection decisions. Monsanto engineering teams have developed a high-clearance mobile platform (Rover) as a step towards high-throughput and high-accuracy phenotyping at an industrial scale. The rovers are equipped with GPS navigation, multiple cameras and sensors, and on-board computers to acquire data and compute plant vigor metrics per plot. The supporting IT systems enable automatic path planning, plot identification, image and point cloud data QA/QC and near real-time analysis, where results are streamed to enterprise databases for additional statistical analysis and product advancement decisions. Since the rover program was launched in North America in 2013, the number of research plots we can analyze in a growing season has expanded dramatically. This work describes some of the successes and challenges in scaling up the rover platform for automated phenotyping to enable science at scale.

  18. WF/PC internal molecular contamination during system thermal-vacuum test

    NASA Technical Reports Server (NTRS)

    Taylor, Daniel M.; Barengoltz, J.; Jenkins, T.; Leschly, K.; Triolo, J.

    1988-01-01

    During the recent system thermal vacuum test of the Wide-Field/Planetary Camera (WF/PC), instrumentation was added to the WF/PC to characterize the internal molecular contamination and verify the instrument throughput down to 1470 angstroms. Analysis of the data elements revealed two contaminants affecting the far-ultraviolet (FUV) performance of the WF/PC. One contaminant (heavy volatile) is correlated with the electronics and housing temperature, and the contamination is significantly reduced when the electronics are operated below plus 8 degrees to plus 10 degrees C. The other contaminant (light volatile) is controlled by the heat pipe temperature, and the contamination is significantly reduced when the Thermal Electric Cooler (TEC) hot-junction temperature is below minus 40 degrees to minus 50 degrees C. The utility of contamination sensors located behind instruments during system tests was demonstrated.

  19. A novel method for real-time edge-enhancement and its application to pattern recognition

    NASA Astrophysics Data System (ADS)

    Ge, Huayong; Bai, Enjian; Fan, Hong

    2010-11-01

    The coupling gain coefficient g is redefined and derived based on coupling theory, and its variation for different ΓL and r is analyzed. A new optical system is proposed for image edge-enhancement. It recycles the back signal to amplify the edge signal, which has the advantages of high throughput efficiency and brightness. The optical system is designed and built, and the edge-enhanced image of a hand bone is captured electronically by a CCD camera. The principle of optical correlation is demonstrated, and the 3-D correlation distribution of the letter H with and without edge-enhancement is simulated; the discrimination capability Iac and the full width at half-maximum intensity (FWHM) are compared for the two kinds of correlators. The analysis shows that edge-enhancement preprocessing can effectively improve the performance of the correlator.
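
    As a crude numerical analogy of why edge enhancement sharpens the correlation peak (it is not the optical system described above), the sketch below compares the FFT-based autocorrelation of a random pattern with and without a Laplacian edge filter applied first.

```python
# Toy comparison: correlation peak sharpness with and without edge enhancement.
import numpy as np
from scipy.ndimage import laplace

def correlate(a, b):
    """Circular cross-correlation computed via the Fourier transform."""
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))))

rng = np.random.default_rng(0)
pattern = rng.random((128, 128))
plain = correlate(pattern, pattern)
enhanced = correlate(laplace(pattern), laplace(pattern))

# Peak-to-mean-magnitude ratio as a crude stand-in for discrimination capability.
print(plain.max() / np.abs(plain).mean(), enhanced.max() / np.abs(enhanced).mean())
```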

  20. Effects of competitive prey capture on flight behavior and sonar beam pattern in paired big brown bats, Eptesicus fuscus.

    PubMed

    Chiu, Chen; Reddy, Puduru Viswanadha; Xian, Wei; Krishnaprasad, Perinkulam S; Moss, Cynthia F

    2010-10-01

    Foraging and flight behavior of echolocating bats were quantitatively analyzed in this study. Paired big brown bats, Eptesicus fuscus, competed for a single food item in a large laboratory flight room. Their sonar beam patterns and flight paths were recorded by a microphone array and two high-speed cameras, respectively. Bats often remained in nearly classical pursuit (CP) states when one bat is following another bat. A follower can detect and anticipate the movement of the leader, while the leader has the advantage of gaining access to the prey first. Bats in the trailing position throughout the trial were more successful in accessing the prey. In this study, bats also used their sonar beam to monitor the conspecific's movement and to track the prey. Each bat tended to use its sonar beam to track the prey when it was closer to the worm than to another bat. The trailing bat often directed its sonar beam toward the leading bat in following flight. When two bats flew towards each other, they tended to direct their sonar beam axes away from each other, presumably to avoid signal jamming. This study provides a new perspective on how echolocating bats use their biosonar system to coordinate their flight with conspecifics in a group and how they compete for the same food source with conspecifics.

  1. Benoit Mandelbrot, films, and me: a tribute

    NASA Astrophysics Data System (ADS)

    Lesmoir-Gordon, Nigel

    2015-03-01

    Back in the early 1990s I was researching topics for possible science TV documentaries and reading Ian Stewart's book Does God Play Dice? [1] I came across the chapter he called The Gingerbread Man. It was about the Mandelbrot set. Reading that chapter I could see that here was the perfect subject for a film. The Mandelbrot set is a fractal (Fig. 13.1) and fractal geometry is the geometry of nature (Figs. 13.2 to 13.4). So here I had the stunning visual beauty of the set itself and the natural world to point my camera at. It was a science story I could tell in pictures. It wasn't the Big Bang or gravity or quantum mechanics, it was something you could actually see and which could be filmed directly without recourse to endless and expensive CGI...

  2. Evaluation of the LLNL Spectrometer for Possible use with the NSTec Optical Streak Camera as a Light Gas Gun Diagnostic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Connor, J., Cradick, J.

    In fiscal year 2012, it was desired to combine a visible spectrometer with a streak camera to form a diagnostic system for recording time-resolved spectra generated in light gas gun experiments. Acquiring a new spectrometer was an option, but it was possible to borrow an existing unit for a period of months, which would be sufficient to evaluate it both “off-line” and in gas gun shots. If it proved adequate for this application, it could be duplicated (with possible modifications); if not, such testing would help determine needed specifications for another model. This report describes the evaluation of the spectrometer (separately and combined with the NSTec LO streak camera) for this purpose. Spectral and temporal resolutions were of primary interest. The first was measured with a monochromatic laser input. The second was ascertained by the combination of the spectrometer’s spatial resolution in the time-dispersive direction and the streak camera’s intrinsic temporal resolution. System responsivity was also important, and this was investigated by measuring the response of the spectrometer/camera system to black body input (the gas gun experiments are expected to be similar to a 3000K black body) as well as measuring the throughput of the spectrometer separately over a range of visible light provided by a monochromator. The flat field (in wavelength) was also measured, and the final part of the evaluation was actual fielding on two gas gun shots. No firm specifications for spectral or temporal resolution were defined, but these were desired to be in the 1–2 nm and 1–2 ns ranges, respectively, if possible. As seen below, these values were met or nearly met, depending on wavelength. Other performance parameters were also not given (threshold requirements), but the evaluations performed with laser, black body, and successful gas gun shots taken in aggregate indicate that the spectrometer is adequate for this purpose. Even so, some (relatively minor) opportunities for improvement were noticed, and these were documented for incorporation into any near-duplicate spectrometer that might be fabricated in the future.

  3. Metrology camera system of prime focus spectrograph for Subaru telescope

    NASA Astrophysics Data System (ADS)

    Wang, Shiang-Yu; Chou, Richard C. Y.; Huang, Pin-Jie; Ling, Hung-Hsu; Karr, Jennifer; Chang, Yin-Chang; Hu, Yen-Sang; Hsu, Shu-Fu; Chen, Hsin-Yo; Gunn, James E.; Reiley, Dan J.; Tamura, Naoyuki; Takato, Naruhisa; Shimono, Atsushi

    2016-08-01

    The Prime Focus Spectrograph (PFS) is a new optical/near-infrared multi-fiber spectrograph designed for the prime focus of the 8.2m Subaru telescope. PFS will cover a 1.3 degree diameter field with 2394 fibers to complement the imaging capabilities of Hyper Suprime-Cam. To retain high throughput, the final positioning accuracy between the fibers and the observing targets of PFS is required to be less than 10 microns. The metrology camera system (MCS) serves as the optical encoder of the fiber motors for the configuring of fibers. MCS provides the fiber positions with less than 5 microns error over the 45 cm focal plane. The information from MCS will be fed into the fiber positioner control system for closed-loop control. MCS will be located at the Cassegrain focus of the Subaru telescope in order to cover the whole focal plane with one 50M pixel Canon CMOS camera. It is a 380mm Schmidt-type telescope which generates a uniform spot size with a 10 micron FWHM across the field for reasonable sampling of the point spread function. Carbon fiber tubes are used to provide a stable structure over the operating conditions without focus adjustments. The CMOS sensor can be read in 0.8s to reduce the overhead for the fiber configuration. The positions of all fibers can be obtained within 0.5s after the readout of the frame. This enables the overall fiber configuration to take less than 2 minutes. MCS will be installed inside a standard Subaru Cassegrain box. All components that generate heat are located inside a glycol-cooled cabinet to reduce possible image motion due to heat. The optics and camera for MCS have been delivered and tested. The mechanical parts and supporting structure are ready as of spring 2016. The integration of MCS will start in the summer of 2016. In this report, the performance of the MCS components, the alignment and testing procedure, as well as the status of the PFS MCS, will be presented.
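
    The per-exposure spot measurement such a metrology camera performs can be sketched as intensity-weighted centroiding of back-illuminated fiber spots; the code below is a hedged illustration with a synthetic frame, not the PFS MCS software.

```python
# Toy sub-pixel centroiding of fiber spots on a detector frame.
import numpy as np
from scipy.ndimage import label, center_of_mass

def spot_centroids(frame, threshold):
    """Sub-pixel (row, col) centroids of all spots brighter than the threshold."""
    labelled, n_spots = label(frame > threshold)
    return center_of_mass(frame, labelled, index=range(1, n_spots + 1))

# Synthetic frame with one Gaussian spot centred near (20.3, 31.7).
y, x = np.mgrid[0:64, 0:64]
frame = np.exp(-(((y - 20.3) ** 2 + (x - 31.7) ** 2) / (2.0 * 2.0 ** 2)))
print(spot_centroids(frame, threshold=0.05))
```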

  4. A generalised random encounter model for estimating animal density with remote sensor data.

    PubMed

    Lucas, Tim C D; Moorcroft, Elizabeth A; Freeman, Robin; Rowcliffe, J Marcus; Jones, Kate E

    2015-05-01

    Wildlife monitoring technology is advancing rapidly and the use of remote sensors such as camera traps and acoustic detectors is becoming common in both the terrestrial and marine environments. Current methods to estimate abundance or density require individual recognition of animals or knowing the distance of the animal from the sensor, which is often difficult. A method without these requirements, the random encounter model (REM), has been successfully applied to estimate animal densities from count data generated from camera traps. However, count data from acoustic detectors do not fit the assumptions of the REM due to the directionality of animal signals. We developed a generalised REM (gREM), to estimate absolute animal density from count data from both camera traps and acoustic detectors. We derived the gREM for different combinations of sensor detection widths and animal signal widths (a measure of directionality). We tested the accuracy and precision of this model using simulations of different combinations of sensor detection widths and animal signal widths, number of captures and models of animal movement. We find that the gREM produces accurate estimates of absolute animal density for all combinations of sensor detection widths and animal signal widths. However, larger sensor detection and animal signal widths were found to be more precise. While the model is accurate for all capture efforts tested, the precision of the estimate increases with the number of captures. We found no effect of different animal movement models on the accuracy and precision of the gREM. We conclude that the gREM provides an effective method to estimate absolute animal densities from remote sensor count data over a range of sensor and animal signal widths. The gREM is applicable for count data obtained in both marine and terrestrial environments, visually or acoustically (e.g. big cats, sharks, birds, echolocating bats and cetaceans). As sensors such as camera traps and acoustic detectors become more ubiquitous, the gREM will be increasingly useful for monitoring unmarked animal populations across broad spatial, temporal and taxonomic scales.
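
    For orientation, the sketch below evaluates the original camera-trap REM of Rowcliffe et al. (2008), the special case that the gREM generalises: density is estimated from the encounter rate, animal speed, and the sensor's detection radius and angle. The gREM's own profile terms for different sensor and signal widths are not reproduced here, and the numbers are toy values.

```python
# Hedged illustration of the basic REM density estimator (the gREM's special case).
import math

def rem_density(captures, effort_days, speed_km_per_day, radius_km, angle_rad):
    """Animals per square km from unmarked count data: D = (y/t) * pi / (v * r * (2 + theta))."""
    encounter_rate = captures / effort_days
    return encounter_rate * math.pi / (speed_km_per_day * radius_km * (2.0 + angle_rad))

# Toy values: 40 captures in 1000 camera-days, 4 km/day travel, 10 m radius, 40-degree zone.
print(rem_density(40, 1000, 4.0, 0.01, math.radians(40)))
```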

  5. Crowd-Funded Micro-Grants for Genomics and “Big Data”: An Actionable Idea Connecting Small (Artisan) Science, Infrastructure Science, and Citizen Philanthropy

    PubMed Central

    Badr, Kamal F.; Dove, Edward S.; Endrenyi, Laszlo; Geraci, Christy Jo; Hotez, Peter J.; Milius, Djims; Neves-Pereira, Maria; Pang, Tikki; Rotimi, Charles N.; Sabra, Ramzi; Sarkissian, Christineh N.; Srivastava, Sanjeeva; Tims, Hesther; Zgheib, Nathalie K.; Kickbusch, Ilona

    2013-01-01

    Abstract Biomedical science in the 21st century is embedded in, and draws from, a digital commons and “Big Data” created by high-throughput Omics technologies such as genomics. Classic Edisonian metaphors of science and scientists (i.e., “the lone genius” or other narrow definitions of expertise) are ill equipped to harness the vast promises of the 21st century digital commons. Moreover, in medicine and life sciences, experts often under-appreciate the important contributions made by citizen scholars and lead users of innovations to design innovative products and co-create new knowledge. We believe there are a large number of users waiting to be mobilized so as to engage with Big Data as citizen scientists—only if some funding were available. Yet many of these scholars may not meet the meta-criteria used to judge expertise, such as a track record in obtaining large research grants or a traditional academic curriculum vitae. This innovation research article describes a novel idea and action framework: micro-grants, each worth $1000, for genomics and Big Data. Though a relatively small amount at first glance, this far exceeds the annual income of the “bottom one billion”—the 1.4 billion people living below the extreme poverty level defined by the World Bank ($1.25/day). We describe two types of micro-grants. Type 1 micro-grants can be awarded through established funding agencies and philanthropies that create micro-granting programs to fund a broad and highly diverse array of small artisan labs and citizen scholars to connect genomics and Big Data with new models of discovery such as open user innovation. Type 2 micro-grants can be funded by existing or new science observatories and citizen think tanks through crowd-funding mechanisms described herein. Type 2 micro-grants would also facilitate global health diplomacy by co-creating crowd-funded micro-granting programs across nation-states in regions facing political and financial instability, while sharing similar disease burdens, therapeutics, and diagnostic needs. We report the creation of ten Type 2 micro-grants for citizen science and artisan labs to be administered by the nonprofit Data-Enabled Life Sciences Alliance International (DELSA Global, Seattle). Our hope is that these micro-grants will spur novel forms of disruptive innovation and genomics translation by artisan scientists and citizen scholars alike. We conclude with a neglected voice from the global health frontlines, the American University of Iraq in Sulaimani, and suggest that many similar global regions are now poised for micro-grant enabled collective innovation to harness the 21st century digital commons. PMID:23574338

  6. Crowd-funded micro-grants for genomics and "big data": an actionable idea connecting small (artisan) science, infrastructure science, and citizen philanthropy.

    PubMed

    Özdemir, Vural; Badr, Kamal F; Dove, Edward S; Endrenyi, Laszlo; Geraci, Christy Jo; Hotez, Peter J; Milius, Djims; Neves-Pereira, Maria; Pang, Tikki; Rotimi, Charles N; Sabra, Ramzi; Sarkissian, Christineh N; Srivastava, Sanjeeva; Tims, Hesther; Zgheib, Nathalie K; Kickbusch, Ilona

    2013-04-01

    Biomedical science in the 21(st) century is embedded in, and draws from, a digital commons and "Big Data" created by high-throughput Omics technologies such as genomics. Classic Edisonian metaphors of science and scientists (i.e., "the lone genius" or other narrow definitions of expertise) are ill equipped to harness the vast promises of the 21(st) century digital commons. Moreover, in medicine and life sciences, experts often under-appreciate the important contributions made by citizen scholars and lead users of innovations to design innovative products and co-create new knowledge. We believe there are a large number of users waiting to be mobilized so as to engage with Big Data as citizen scientists-only if some funding were available. Yet many of these scholars may not meet the meta-criteria used to judge expertise, such as a track record in obtaining large research grants or a traditional academic curriculum vitae. This innovation research article describes a novel idea and action framework: micro-grants, each worth $1000, for genomics and Big Data. Though a relatively small amount at first glance, this far exceeds the annual income of the "bottom one billion"-the 1.4 billion people living below the extreme poverty level defined by the World Bank ($1.25/day). We describe two types of micro-grants. Type 1 micro-grants can be awarded through established funding agencies and philanthropies that create micro-granting programs to fund a broad and highly diverse array of small artisan labs and citizen scholars to connect genomics and Big Data with new models of discovery such as open user innovation. Type 2 micro-grants can be funded by existing or new science observatories and citizen think tanks through crowd-funding mechanisms described herein. Type 2 micro-grants would also facilitate global health diplomacy by co-creating crowd-funded micro-granting programs across nation-states in regions facing political and financial instability, while sharing similar disease burdens, therapeutics, and diagnostic needs. We report the creation of ten Type 2 micro-grants for citizen science and artisan labs to be administered by the nonprofit Data-Enabled Life Sciences Alliance International (DELSA Global, Seattle). Our hope is that these micro-grants will spur novel forms of disruptive innovation and genomics translation by artisan scientists and citizen scholars alike. We conclude with a neglected voice from the global health frontlines, the American University of Iraq in Sulaimani, and suggest that many similar global regions are now poised for micro-grant enabled collective innovation to harness the 21(st) century digital commons.

  7. Advanced EVA Suit Camera System Development Project

    NASA Technical Reports Server (NTRS)

    Mock, Kyla

    2016-01-01

    The National Aeronautics and Space Administration (NASA) at the Johnson Space Center (JSC) is developing a new extra-vehicular activity (EVA) suit known as the Advanced EVA Z2 Suit. All of the improvements to the EVA Suit provide the opportunity to update the technology of the video imagery. My summer internship project involved improving the video streaming capabilities of the cameras that will be used on the Z2 Suit for data acquisition. To accomplish this, I familiarized myself with the architecture of the camera that is currently being tested to be able to make improvements on the design. Because there is a lot of benefit to saving space, power, and weight on the EVA suit, my job was to use Altium Design to start designing a much smaller and simplified interface board for the camera's microprocessor and external components. This involved checking datasheets of various components and checking signal connections to ensure that this architecture could be used for both the Z2 suit and potentially other future projects. The Orion spacecraft is a specific project that may benefit from this condensed camera interface design. The camera's physical placement on the suit also needed to be determined and tested so that image resolution can be maximized. Many of the options of the camera placement may be tested along with other future suit testing. There are multiple teams that work on different parts of the suit, so the camera's placement could directly affect their research or design. For this reason, a big part of my project was initiating contact with other branches and setting up multiple meetings to learn more about the pros and cons of the potential camera placements we are analyzing. Collaboration with the multiple teams working on the Advanced EVA Z2 Suit is absolutely necessary and these comparisons will be used as further progress is made for the overall suit design. This prototype will not be finished in time for the scheduled Z2 Suit testing, so my time was also spent creating a case for the original interface board that is already being used. This design is being done by use of Creo 2. Due to time constraints, I may not be able to complete the 3-D printing portion of this design, but I was able to use my knowledge of the interface board and Altium Design to help in the task. As a side project, I assisted another intern in selecting and programming a microprocessor to control linear actuators. These linear actuators will be used to move various increments of polyethylene for controlled radiation testing. For this, we began the software portion of the project using the Arduino's coding environment to control an Arduino Due and H-Bridge components. Along with the obvious learning of computer programs such as Altium Design and Creo 2, I also acquired more skills with networking and collaborating with others, being able to multi-task because of responsibilities to work on various projects, and how to set realistic goals in the work place. Like many internship projects, this project will be continued and improved, so I also had the chance to improve my organization and communication skills as I documented all of my meetings and research. As a result of my internship at JSC, I desire to continue a career with NASA, whether that be through another internship or possibly a co-op. I am excited to return to my university and continue my education in electrical engineering because of all of my experiences at JSC.

  8. Mosaic3: a red-sensitive upgrade for the prime focus camera at the Mayall 4m telescope

    NASA Astrophysics Data System (ADS)

    Dey, Arjun; Rabinowitz, David; Karcher, Armin; Bebek, Chris; Baltay, Charles; Sprayberry, David; Valdes, Frank; Stupak, Bob; Donaldson, John; Emmet, Will; Hurteau, Tom; Abareshi, Behzad; Marshall, Bob; Lang, Dustin; Fitzpatrick, Mike; Daly, Phil; Joyce, Dick; Schlegel, David; Schweiker, Heidi; Allen, Lori; Blum, Bob; Levi, Michael

    2016-08-01

    The Dark Energy Spectroscopic Instrument (DESI) is under construction and will be used to measure the expansion history of the Universe using the Baryon Acoustic Oscillation (BAO) technique and the growth of structure using redshift-space distortions (RSD). The spectra of 30 million galaxies over 14000 sq deg will be measured over the course of the experiment. In order to provide spectroscopic targets for the DESI survey, we are carrying out a three-band (g,r,z ) imaging survey of the sky using the NOAO 4-m telescopes at Kitt Peak National Observatory (KPNO) and the Cerro Tololo Interamerican Observatory (CTIO). At KPNO, we will use an upgraded version of the Mayall 4m telescope prime focus camera, Mosaic3, to carry out a z-band survey of the Northern Galactic Cap at declinations δ>=+30 degrees. By equipping an existing Dewar with four 4kx4k fully depleted CCDs manufactured by the Lawrence Berkeley National Laboratory (LBNL), we increased the z-band throughput of the system by a factor of 1.6. These devices have the thickest active area fielded at a telescope. The Mosaic3 z-band survey will be complemented by g-band and r-band observations using the Bok telescope and 90 Prime imager on Kitt Peak. We describe the upgrade and performance of the Mosaic3 instrument and the scope of the northern survey.

  9. Science Goals for an All-sky Viewing Observatory in X-rays

    NASA Astrophysics Data System (ADS)

    Remillard, R. A.; Levine, A. M.; Morgan, E. H.; Bradt, H. V.

    2003-03-01

    We describe a concept for a NASA SMEX Mission that will provide a comprehensive investigation of cosmic explosions. These range from the short flashes at cosmological distances in Gamma-ray bursts, to the moments of relativistic mass ejections in Galactic microquasars, to the panorama of outbursts used to identify the stellar-scale black holes in our Galaxy. With an equatorial launch, an array of 31 cameras can cover 97% of the sky with an average exposure efficiency of 65%. Coded mask cameras with Xe detectors (1.5-12 keV) are chosen for their ability to distinguish thermal and non-thermal processes, while providing high throughput and msec time resolution to capture the detailed evolution of bright events. This mission, with 1' position accuracy, would provide a long-term solution to the critical needs for monitoring services for Chandra and GLAST, with possible overlap into the time frame for Constellation-X. The sky coverage would create additional science opportunities beyond the X-ray missions: "eyes" for LIGO and partnerships for time-variability with LOFAR and dedicated programs at optical observatories. Compared to the RXTE ASM, AVOX offers improvements by a factor of 40 in instantaneous sky coverage and a factor of 10 in sensitivity to faint X-ray sources (i.e. to 0.8 mCrab at 3 sigma in 1 day).

  10. Colorimetric analyzer based on mobile phone camera for determination of available phosphorus in soil.

    PubMed

    Moonrungsee, Nuntaporn; Pencharee, Somkid; Jakmunee, Jaroon

    2015-05-01

    A field-deployable colorimetric analyzer based on an "Android mobile phone" was developed for the determination of available phosphorus content in soil. An inexpensive mobile phone with an embedded digital camera was used to photograph the chemical solution under test. The method involved a reaction of the phosphorus (orthophosphate form), ammonium molybdate and potassium antimonyl tartrate to form phosphomolybdic acid, which was reduced by ascorbic acid to produce the intensely colored molybdenum blue. A software program was developed for the phone to record and analyze the RGB color of the picture. A light-tight box with an LED light to control illumination was fabricated to improve the precision and accuracy of the measurement. Under the optimum conditions, a calibration graph was created by measuring the blue color intensity of a series of standard phosphorus solutions (0.0-1.0 mg P L⁻¹); the calibration equation obtained was then retained by the program for the analysis of sample solutions. The results obtained from the proposed method agreed well with the spectrophotometric method; a detection limit of 0.01 mg P L⁻¹ and a sample throughput of about 40 h⁻¹ were achieved. The developed system provided good accuracy (RE < 5%) and precision (RSD < 2%, intra- and inter-day), fast and cheap analysis, and is especially convenient for in-field soil analysis of the phosphorus nutrient. Copyright © 2015 Elsevier B.V. All rights reserved.
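
    For illustration, a minimal sketch of the underlying calibration arithmetic: the mean blue-channel intensity is extracted from a photograph of each standard, fitted linearly against concentration, and the fit is inverted for unknown samples. The file names, crop region and linear model are assumptions for this sketch, not details published in the abstract.

```python
from PIL import Image
import numpy as np

def mean_blue(path, box=(100, 100, 300, 300)):
    """Mean blue-channel intensity of a crop of the solution image."""
    img = Image.open(path).convert("RGB").crop(box)
    return np.asarray(img, dtype=float)[:, :, 2].mean()

# Hypothetical standards: concentration (mg P L^-1) -> image file.
standards = {0.0: "std_0.0.jpg", 0.2: "std_0.2.jpg", 0.5: "std_0.5.jpg", 1.0: "std_1.0.jpg"}

conc = np.array(sorted(standards))
signal = np.array([mean_blue(standards[c]) for c in conc])

# Linear calibration: signal = slope * concentration + intercept.
slope, intercept = np.polyfit(conc, signal, 1)

def phosphorus(path):
    """Estimate the phosphorus concentration of an unknown sample image."""
    return (mean_blue(path) - intercept) / slope

print(f"Sample: {phosphorus('sample.jpg'):.3f} mg P L^-1")
```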

  11. Dual light-emitting diode-based multichannel microscopy for whole-slide multiplane, multispectral and phase imaging.

    PubMed

    Liao, Jun; Wang, Zhe; Zhang, Zibang; Bian, Zichao; Guo, Kaikai; Nambiar, Aparna; Jiang, Yutong; Jiang, Shaowei; Zhong, Jingang; Choma, Michael; Zheng, Guoan

    2018-02-01

    We report the development of a multichannel microscopy for whole-slide multiplane, multispectral and phase imaging. We use trinocular heads to split the beam path into 6 independent channels and employ a camera array for parallel data acquisition, achieving a maximum data throughput of approximately 1 gigapixel per second. To perform single-frame rapid autofocusing, we place 2 near-infrared light-emitting diodes (LEDs) at the back focal plane of the condenser lens to illuminate the sample from 2 different incident angles. A hot mirror is used to direct the near-infrared light to an autofocusing camera. For multiplane whole-slide imaging (WSI), we acquire 6 different focal planes of a thick specimen simultaneously. For multispectral WSI, we relay the 6 independent image planes to the same focal position and simultaneously acquire information at 6 spectral bands. For whole-slide phase imaging, we acquire images at 3 focal positions simultaneously and use the transport-of-intensity equation to recover the phase information. We also provide an open-source design to further increase the number of channels from 6 to 15. The reported platform provides a simple solution for multiplexed fluorescence imaging and multimodal WSI. Acquiring an instant focal stack without z-scanning may also enable fast 3-dimensional dynamic tracking of various biological samples. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
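
    The phase-imaging branch recovers phase from three simultaneously acquired focal planes via the transport-of-intensity equation (TIE). The sketch below solves the TIE with an FFT-based Poisson solver under the common uniform-intensity approximation; the wavelength, pixel size and defocus step would come from the instrument, and the authors' exact solver is not specified in the abstract.

```python
import numpy as np

def tie_phase(i_minus, i_focus, i_plus, dz, wavelength, pixel):
    """Recover phase from a 3-plane focal stack via the transport-of-intensity
    equation, assuming a nearly uniform in-focus intensity (common simplification)."""
    k = 2 * np.pi / wavelength
    didz = (i_plus - i_minus) / (2 * dz)       # axial intensity derivative
    rhs = -k * didz / i_focus.mean()           # Poisson equation right-hand side

    ny, nx = i_focus.shape
    kx, ky = np.meshgrid(2 * np.pi * np.fft.fftfreq(nx, d=pixel),
                         2 * np.pi * np.fft.fftfreq(ny, d=pixel))
    lap = -(kx**2 + ky**2)                     # Fourier symbol of the Laplacian
    lap[0, 0] = 1.0                            # avoid division by zero at DC

    phi_hat = np.fft.fft2(rhs) / lap
    phi_hat[0, 0] = 0.0                        # phase is defined up to a constant
    return np.real(np.fft.ifft2(phi_hat))
```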

  12. A cooled CCD camera-based protocol provides an effective solution for in vitro monitoring of luciferase.

    PubMed

    Afshari, Amirali; Uhde-Stone, Claudia; Lu, Biao

    2015-03-13

    Luciferase assay has become an increasingly important technique to monitor a wide range of biological processes. However, the mainstay protocols require a luminometer to acquire and process the data, therefore limiting its application to specialized research labs. To overcome this limitation, we have developed an alternative protocol that utilizes a commonly available cooled charge-coupled device (CCCD), instead of a luminometer for data acquiring and processing. By measuring activities of different luciferases, we characterized their substrate specificity, assay linearity, signal-to-noise levels, and fold-changes via CCCD. Next, we defined the assay parameters that are critical for appropriate use of CCCD for different luciferases. To demonstrate the usefulness in cultured mammalian cells, we conducted a case study to examine NFκB gene activation in response to inflammatory signals in human embryonic kidney cells (HEK293 cells). We found that data collected by CCCD camera was equivalent to those acquired by luminometer, thus validating the assay protocol. In comparison, The CCCD-based protocol is readily amenable to live-cell and high-throughput applications, offering fast simultaneous data acquisition and visual and quantitative data presentation. In conclusion, the CCCD-based protocol provides a useful alternative for monitoring luciferase reporters. The wide availability of CCCD will enable more researchers to use luciferases to monitor and quantify biological processes. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Modern Processing Capabilities of Analog Data from Documentation of the Great Omayyad Mosque in Aleppo, Syria, Damaged in Civil War

    NASA Astrophysics Data System (ADS)

    Pavelka, K.; Šedina, J.; Raeva, P.; Hůlková, M.

    2017-08-01

    In 1999, a big project for the documentation of the Great Omayyad mosque in Aleppo, Syria was conducted under UNESCO. At the end of the last century, analogue cameras such as the UMK Zeiss and the RolleiMetric system were still in use. Digital cameras and automatic digital data processing were just starting to be on the rise, and laser scanning was not yet relevant. In this situation, photogrammetric measurement relied on stereo technology for complicated situations and objects, and on single-image technology for creating photoplans. Hundreds of photogrammetric images were taken. Because data processing was carried out on digital stereo plotters or workstations, all analogue photos had to be converted to digital form using a photogrammetric scanner. The outputs were adequate for the end of the last century. Now, 19 years later, the photogrammetric materials still exist, but the technology and processing are completely different, and our original measurement is historical and quite obsolete. It was therefore decided to explore the possibilities of reprocessing the historical materials. Why? In the last few years there has been civil war in Syria, and the above-mentioned monument was severely damaged. The existing historical materials therefore provide a unique opportunity for possible future reconstruction. This paper describes the completion of the existing materials, their evaluation, and the possibilities of new processing with today's technologies.

  14. Ex vivo brain tumor analysis using spectroscopic optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Lenz, Marcel; Krug, Robin; Welp, Hubert; Schmieder, Kirsten; Hofmann, Martin R.

    2016-03-01

    A big challenge during neurosurgeries is to distinguish between healthy tissue and cancerous tissue, but currently a suitable non-invasive real time imaging modality is not available. Optical Coherence Tomography (OCT) is a potential technique for such a modality. OCT has a penetration depth of 1-2 mm and a resolution of 1-15 μm which is sufficient to illustrate structural differences between healthy tissue and brain tumor. Therefore, we investigated gray and white matter of healthy central nervous system and meningioma samples with a Spectral Domain OCT System (Thorlabs Callisto). Additional OCT images were generated after paraffin embedding and after the samples were cut into 10 μm thin slices for histological investigation with a bright field microscope. All samples were stained with Hematoxylin and Eosin. In all cases B-scans and 3D images were made. Furthermore, a camera image of the investigated area was made by the built-in video camera of our OCT system. For orientation, the backsides of all samples were marked with blue ink. The structural differences between healthy tissue and meningioma samples were most pronounced directly after removal. After paraffin embedding these differences diminished. A correlation between OCT en face images and microscopy images can be seen. In order to increase contrast, post processing algorithms were applied. Hence we employed Spectroscopic OCT, pattern recognition algorithms and machine learning algorithms such as k-means Clustering and Principal Component Analysis.
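
    As a rough illustration of the post-processing step, the sketch below applies PCA followed by k-means clustering to a hypothetical matrix of spectroscopic OCT features (one row per pixel or A-scan); the feature file, component count and cluster number are assumptions, not the authors' settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical input: one row per pixel/A-scan, one column per spectral feature
# (e.g. band energies from a short-time Fourier transform of the OCT signal).
features = np.load("oct_spectral_features.npy")   # shape (n_pixels, n_features)

# Reduce dimensionality, then group pixels into candidate tissue classes.
scores = PCA(n_components=3).fit_transform(features)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

# 'labels' can be reshaped to the B-scan grid to visualize a tentative
# healthy-versus-tumor segmentation (the class-to-tissue mapping is not automatic).
```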

  15. International Ultraviolet Explorer Final Archive

    NASA Technical Reports Server (NTRS)

    1997-01-01

    CSC processed IUE images through the Final Archive Data Processing System. Raw images were obtained from both NDADS and the IUEGTC optical disk platters for processing on the Alpha cluster, and from the IUEGTC optical disk platters for DECstation processing. Input parameters were obtained from the IUE database. Backup tapes of data to send to VILSPA were routinely made on the Alpha cluster. IPC handled more than 263 requests for priority NEWSIPS processing during the contract. Staff members also answered various questions and requests for information and sent copies of IUE documents to requesters. CSC implemented new processing capabilities into the NEWSIPS processing systems as they became available. In addition, steps were taken to improve efficiency and throughput whenever possible. The node TORTE was reconfigured as the I/O server for Alpha processing in May. The number of Alpha nodes used for the NEWSIPS processing queue was increased to a maximum of six in measured fashion in order to understand the dependence of throughput on the number of nodes and to be able to recognize when a point of diminishing returns was reached. With Project approval, generation of the VD FITS files was dropped in July. This action not only saved processing time but, even more significantly, also reduced the archive storage media requirements, and the time required to perform the archiving, drastically. The throughput of images verified through CDIVS and processed through NEWSIPS for the contract period is summarized below. The number of images of a given dispersion type and camera that were processed in any given month reflects several factors, including the availability of the required NEWSIPS software system, the availability of the corresponding required calibrations (e.g., the LWR high-dispersion ripple correction and absolute calibration), and the occurrence of reprocessing efforts such as that conducted to incorporate the updated SWP sensitivity-degradation correction in May.

  16. Development of a phenotyping platform for high throughput screening of nodal root angle in sorghum.

    PubMed

    Joshi, Dinesh C; Singh, Vijaya; Hunt, Colleen; Mace, Emma; van Oosterom, Erik; Sulman, Richard; Jordan, David; Hammer, Graeme

    2017-01-01

    In sorghum, the growth angle of nodal roots is a major component of root system architecture. It strongly influences the spatial distribution of roots of mature plants in the soil profile, which can impact drought adaptation. However, selection for nodal root angle in sorghum breeding programs has been restricted by the absence of a suitable high throughput phenotyping platform. The aim of this study was to develop a phenotyping platform for the rapid, non-destructive and digital measurement of nodal root angle of sorghum at the seedling stage. The phenotyping platform comprises 500 soil-filled root chambers (50 × 45 × 0.3 cm in size), made of transparent perspex sheets that were placed in metal tubs and covered with polycarbonate sheets. Around 3 weeks after sowing, once the first flush of nodal roots was visible, roots were imaged in situ using an imaging box that included two digital cameras that were remotely controlled by two Android tablets. Free software (openGelPhoto.tcl) allowed precise measurement of nodal root angle from the digital images. The reliability and efficiency of the platform were evaluated by screening a large nested association mapping population of sorghum and a set of hybrids in six independent experimental runs that included up to 500 plants each. The platform revealed extensive genetic variation and high heritability (repeatability) for nodal root angle. High genetic correlations and consistent ranking of genotypes across experimental runs confirmed the reproducibility of the platform. This low-cost, high-throughput root phenotyping platform requires no sophisticated equipment, is adaptable to most glasshouse environments and is well suited to dissect the genetic control of nodal root angle of sorghum. The platform is suitable for use in sorghum breeding programs aiming to improve drought adaptation through root system architecture manipulation.
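
    The measurement itself reduces to simple plane geometry: the root angle is the angle between a digitized root segment and the vertical. A minimal sketch, assuming two digitized image points per root (as an openGelPhoto.tcl-style digitization would provide); the coordinates are hypothetical.

```python
import math

def nodal_root_angle(base, tip):
    """Angle (degrees) between a root segment and the vertical axis,
    given two digitized image points (x, y) with y increasing downward."""
    dx = tip[0] - base[0]
    dy = tip[1] - base[1]
    return abs(math.degrees(math.atan2(dx, dy)))

# Hypothetical digitized points (pixels): root emerging near the stem base.
print(nodal_root_angle(base=(512, 100), tip=(650, 420)))  # ~23 degrees from vertical
```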

  17. Innovative Airborne Sensors for Disaster Management

    NASA Astrophysics Data System (ADS)

    Altan, M. O.; Kemper, G.

    2016-06-01

    Modern disaster management systems are based on three pillars: crisis preparedness, early warning, and the final crisis management. In all three, specialized data are needed to analyze existing structures, to support early warning, and to provide updates after a disaster happens that assist the crisis management organizations. How can new and innovative sensors assist in these tasks? Aerial images have frequently been used in the past for generating spatial data; however, in urban structures not all information can be extracted easily. Modern oblique camera systems already assist in the evaluation of building structures to define rescue paths, analyze building condition, and provide information on the stability of the urban fabric. For this application there is no need for a highly geometrically accurate sensor; even SLR-camera-based oblique systems such as the OI X5, which uses Nikon cameras, do a proper job. Such a camera also delivers valuable information after a disaster happens, allowing the degree of deformation to be assessed in order to estimate stability and usability for the population. Thermal data in combination with RGB give further information on building structure, damage and potential water intrusion. Under development is an oblique thermal sensor with 9 heads which enables nadir and oblique thermal data acquisition. Besides its use in searching for people, thermal imaging reveals anomalies caused by humidity in constructions (transpiration effects), damaged power lines, burning gas pipes and many other hazards. A major task is the data analysis, which should be automated and fast. This requires a good initial orientation and a proper relative adjustment of the individual sensors; with that in place, many modern software tools enable rapid data extraction. Automated analysis of the data before and after a disaster can highlight areas of significant change, and detecting such anomalies is the way to focus attention on the priority areas. Lidar also supports disaster management by analyzing changes in the DSM before and after the "event". An advantage of lidar is that, apart from rain and clouds, no weather conditions limit its use, and as an active sensor it can also be flown at night. The new mid-format cameras that make use of CMOS sensors (e.g. Phase One IXU1000) can capture data even under poor and difficult light conditions and may well become the first choice for remotely sensed data acquisition from aircraft and UAVs. UAVs will surely become an ever larger part of disaster management at the detailed level. Equipped today with live video cameras using RGB and thermal IR, they assist in looking inside and behind buildings, and can thus continue the aerial survey where airborne anomalies have been detected.

  18. DSD Characteristics of a Mid-Winter Tornadic Storm Using C-Band Polarimetric Radar and Two 2D-Video Disdrometers

    NASA Technical Reports Server (NTRS)

    Thurai, M.; Petersen, W. A.; Carey, L. A.

    2010-01-01

    Drop size distributions in an evolving tornadic storm are examined using C-band polarimetric radar observations and two 2D-video disdrometers. The EF-2 storm occurred in mid-winter (21 January 2010) in northern Alabama, USA, and caused widespread damage. The evolution of the storm occurred within the C-band radar coverage and moreover, several minutes prior to touchdown, the storm passed over a site where several disdrometers including two 2D video disdrometers (2DVD) had been installed. One of the 2DVDs is a low profile unit and the other is a new next generation compact unit currently undergoing performance evaluation. Analyses of the radar data indicate that the main region of precipitation should be treated as a "big-drop" regime case. Even the measured differential reflectivity values (i.e. without attenuation correction) were as high as 6-7 dB within regions of high reflectivity. Standard attenuation-correction methods using differential propagation phase have been "fine tuned" to be applicable to the "big drop" regime. The corrected reflectivity and differential reflectivity data are combined with the co-polar correlation coefficient and specific differential phase to determine the mass-weighted mean diameter, Dm, and the width of the mass spectrum, σM, as well as the intercept parameter, Nw. Significant areas of high Dm (3-4 mm) were retrieved within the main precipitation areas of the tornadic storm. The "big drop" regime assumption is substantiated by the two sets of 2DVD measurements. The Dm values calculated from 1-minute drop size distributions reached nearly 4 mm, whilst the maximum drop diameters were over 6 mm. The fall velocity measurements from the 2DVD indicate almost all hydrometeors to be fully melted at ground level. Drop shapes for this event are also being investigated from the 2DVD camera data.
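
    The quantities named above follow from standard moments of the drop size distribution: Dm is the ratio of the fourth to the third moment of N(D), and σM is the mass-weighted spread about Dm. A minimal sketch on an illustrative binned DSD (the bin values below are not measured data):

```python
import numpy as np

def dsd_moments(diam, nd, dd):
    """Mass-weighted mean diameter Dm (mm) and mass-spectrum width sigma_M (mm)
    from a binned drop size distribution N(D) [m^-3 mm^-1] with bin width dd (mm)."""
    m3 = np.sum(nd * diam**3 * dd)
    m4 = np.sum(nd * diam**4 * dd)
    dm = m4 / m3
    sigma_m = np.sqrt(np.sum(nd * (diam - dm)**2 * diam**3 * dd) / m3)
    return dm, sigma_m

# Illustrative 1-minute DSD with a pronounced big-drop tail.
diam = np.arange(0.25, 6.25, 0.25)     # bin centers, mm
nd = 8000 * np.exp(-1.5 * diam)        # exponential-like N(D)
dm, sigma_m = dsd_moments(diam, nd, dd=0.25)
print(f"Dm = {dm:.2f} mm, sigma_M = {sigma_m:.2f} mm")
```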

  19. Universal explosive detection system for homeland security applications

    NASA Astrophysics Data System (ADS)

    Lee, Vincent Y.; Bromberg, Edward E. A.

    2010-04-01

    L-3 Communications CyTerra Corporation has developed a high throughput universal explosive detection system (PassPort) to automatically screen the passengers in airports without requiring them to remove their shoes. The technical approach is based on the patented energetic material detection (EMD) technology. By analyzing the results of sample heating with an infrared camera, one can distinguish the deflagration or decomposition of an energetic material from other clutters such as flammables and general background substances. This becomes the basis of a universal explosive detection system that does not require a library and is capable of detecting trace levels of explosives with a low false alarm rate. The PassPort is a simple turnstile type device and integrates a non-intrusive aerodynamic sampling scheme that has been shown capable of detecting trace levels of explosives on shoes. A detailed description of the detection theory and the automated sampling techniques, as well as the field test results, will be presented.

  20. Transmitter diversity verification on ARTEMIS geostationary satellite

    NASA Astrophysics Data System (ADS)

    Mata Calvo, Ramon; Becker, Peter; Giggenbach, Dirk; Moll, Florian; Schwarzer, Malte; Hinz, Martin; Sodnik, Zoran

    2014-03-01

    Optical feeder links will become the extension of the terrestrial fiber communications towards space, increasing data throughput in satellite communications by overcoming the spectrum limitations of classical RF-links. The geostationary telecommunication satellite Alphasat and the satellites forming the EDRS-system will become the next generation for high-speed data-relay services. The ESA satellite ARTEMIS, precursor for geostationary orbit (GEO) optical terminals, is still a privileged experiment platform to characterize the turbulent channel and investigate the challenges of free-space optical communication to GEO. In this framework, two measurement campaigns were conducted with the scope of verifying the benefits of transmitter diversity in the uplink. To evaluate this mitigation technique, intensity measurements were carried out at both ends of the link. The scintillation parameter is calculated and compared to theory and, additionally, the Fried Parameter is estimated by using a focus camera to monitor the turbulence strength.
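
    The scintillation parameter referred to above is conventionally the normalized intensity variance, σI² = ⟨I²⟩/⟨I⟩² − 1, computed from the received-power time series. A minimal sketch with a hypothetical input file; the campaign's actual data-reduction chain is not reproduced here.

```python
import numpy as np

def scintillation_index(intensity):
    """Normalized intensity variance sigma_I^2 = <I^2>/<I>^2 - 1."""
    i = np.asarray(intensity, dtype=float)
    return i.var() / i.mean()**2   # algebraically equal to <I^2>/<I>^2 - 1

# Hypothetical received-power samples from one uplink channel.
power = np.loadtxt("artemis_uplink_power.txt")
print(f"sigma_I^2 = {scintillation_index(power):.3f}")
```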

  1. Chemiluminescence imaging ELISA using an imprinted polymer as the recognition element instead of an antibody.

    PubMed

    Surugiu, I; Danielsson, B; Ye, L; Mosbach, K; Haupt, K

    2001-02-01

    An imaging assay analogous to competitive enzyme immunoassays has been developed using a molecularly imprinted polymer instead of an antibody. The antigen 2,4-dichlorophenoxyacetic acid (2,4-D) was labeled with tobacco peroxidase, and the chemiluminescence reaction of luminol was used for detection. Microtiter plates (96 or 384 wells) were coated with polymer microspheres imprinted with 2,4-D, which were fixed in place by using poly(vinyl alcohol) as glue. In a competitive mode, the analyte-peroxidase conjugate was incubated with the free analyte in the microtiter plate, after which the bound fraction of the conjugate was quantified. After addition of the chemiluminescent substrates, light emission was measured in a high-throughput imaging format with a CCD camera. Calibration curves corresponding to analyte concentrations ranging from 0.01 to 100 microg/mL were obtained.

  2. Autonomous Docking Based on Infrared System for Electric Vehicle Charging in Urban Areas

    PubMed Central

    Pérez, Joshué; Nashashibi, Fawzi; Lefaudeux, Benjamin; Resende, Paulo; Pollard, Evangeline

    2013-01-01

    Electric vehicles are progressively being introduced in urban areas because of their ability to reduce air pollution, fuel consumption and noise nuisance. Nowadays, some big cities are launching the first electric car-sharing projects to clear traffic jams and enhance urban mobility, as an alternative to the classic public transportation systems. However, there are still some problems to be solved related to energy storage, electric charging and autonomy. In this paper, we present an autonomous docking system for electric vehicle recharging based on an on-board infrared camera that detects infrared beacons installed in the infrastructure. A visual servoing system coupled with an automatic controller allows the vehicle to dock accurately to the recharging booth in a street parking area. The results show good behavior of the implemented system, which is currently deployed as a real prototype system in the city of Paris. PMID:23429581

  3. Autonomous docking based on infrared system for electric vehicle charging in urban areas.

    PubMed

    Pérez, Joshué; Nashashibi, Fawzi; Lefaudeux, Benjamin; Resende, Paulo; Pollard, Evangeline

    2013-02-21

    Electric vehicles are progressively being introduced in urban areas because of their ability to reduce air pollution, fuel consumption and noise nuisance. Nowadays, some big cities are launching the first electric car-sharing projects to clear traffic jams and enhance urban mobility, as an alternative to the classic public transportation systems. However, there are still some problems to be solved related to energy storage, electric charging and autonomy. In this paper, we present an autonomous docking system for electric vehicle recharging based on an on-board infrared camera that detects infrared beacons installed in the infrastructure. A visual servoing system coupled with an automatic controller allows the vehicle to dock accurately to the recharging booth in a street parking area. The results show good behavior of the implemented system, which is currently deployed as a real prototype system in the city of Paris.

  4. A low-cost and portable realization on fringe projection three-dimensional measurement

    NASA Astrophysics Data System (ADS)

    Xiao, Suzhi; Tao, Wei; Zhao, Hui

    2015-12-01

    Fringe projection three-dimensional measurement is applied in a wide range of industrial applications. The traditional fringe projection system has the disadvantages of high cost, large size, and complicated calibration requirements. In this paper we introduce a low-cost and portable realization of three-dimensional measurement with a pico projector. It has the advantages of low cost, compact physical size, and flexible configuration. For the proposed fringe projection system, there is no restriction on the relative alignment of the camera and projector (parallelism or perpendicularity) during installation. Moreover, a plane-based calibration method is adopted that avoids critical requirements on the calibration setup, such as an additional gauge block or a precise linear z stage. The error sources present in the proposed system are also analyzed. The experimental results demonstrate the feasibility of the proposed low-cost and portable fringe projection system.
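
    Fringe projection recovers the surface through the phase of projected sinusoidal fringes. The sketch below shows one common choice, four-step phase shifting followed by simple unwrapping; the paper does not state which phase-retrieval algorithm it uses, so this is purely illustrative.

```python
import numpy as np

def four_step_phase(i0, i1, i2, i3):
    """Wrapped phase from four fringe images with 90-degree phase shifts:
    I_n = A + B*cos(phi + n*pi/2), n = 0..3."""
    return np.arctan2(i3 - i1, i0 - i2)

def unwrap_rows(wrapped):
    """Simple 1-D phase unwrapping along each image row (fine for smooth surfaces)."""
    return np.unwrap(wrapped, axis=1)

# i0..i3 would be the camera captures of the four shifted patterns;
# height is then proportional to the unwrapped phase after system calibration:
# height = scale * unwrap_rows(four_step_phase(i0, i1, i2, i3))
```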

  5. Traffic analysis and control using image processing

    NASA Astrophysics Data System (ADS)

    Senthilkumar, K.; Ellappan, Vijayan; Arun, A. R.

    2017-11-01

    This paper reviews the work on traffic analysis and control to date and presents an approach to regulating traffic using image processing and MATLAB. The concept compares captured images of the street with reference images in order to determine a traffic-level percentage and set the timing of the traffic signal accordingly, thereby reducing the time vehicles are stopped at traffic lights. The concept aims to solve real-life scenarios in the streets by enriching traffic lights with image receivers such as HD cameras and image processors. The input is then imported into MATLAB and used to calculate the traffic on the roads. The results are used to adjust the traffic-light timings on a particular street, in line with other similar proposals but with the added value of addressing a real, large-scale instance.
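
    A minimal sketch of the image-comparison idea (in Python rather than MATLAB): the fraction of pixels that differ from an empty-street reference image is taken as the traffic level and mapped to a green-signal duration. The threshold and timing rule are illustrative assumptions, not values from the paper.

```python
import numpy as np
from PIL import Image

def traffic_level(reference_path, current_path, threshold=30):
    """Percentage of road pixels that differ from the empty-street reference."""
    ref = np.asarray(Image.open(reference_path).convert("L"), dtype=float)
    cur = np.asarray(Image.open(current_path).convert("L"), dtype=float)
    changed = np.abs(cur - ref) > threshold
    return 100.0 * changed.mean()

def green_time(level, t_min=10, t_max=60):
    """Map a traffic level (0-100 %) to a green-signal duration in seconds."""
    return t_min + (t_max - t_min) * level / 100.0

level = traffic_level("empty_street.jpg", "current_frame.jpg")
print(f"traffic: {level:.0f}% -> green for {green_time(level):.0f} s")
```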

  6. The MoEDAL Experiment at the LHC

    NASA Astrophysics Data System (ADS)

    Pinfold, James L.

    2014-04-01

    In 2010 the CERN (European Centre for Particle Physics Research) Research Board unanimously approved MoEDAL, the 7th international experiment at the Large Hadron Collider (LHC), which is designed to search for avatars of new physics signified by highly ionizing particles. The MoEDAL detector is like a giant camera ready to reveal "photographic" evidence for new physics and also to actually trap long-lived new particles for further study. The MoEDAL experiment will significantly expand the horizon for discovery at the LHC, in a complementary way. A MoEDAL discovery would have revolutionary implications for our understanding of the microcosm, providing insights into such fundamental questions as: do magnetic monopoles exist, are there extra dimensions or new symmetries of nature; what is the mechanism for the generation of mass; what is the nature of dark matter and how did the big-bang unfurl at the earliest times.

  7. CCD Photometry of bright stars using objective wire mesh

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamiński, Krzysztof; Zgórz, Marika; Schwarzenberg-Czerny, Aleksander, E-mail: chrisk@amu.edu.pl

    2014-06-01

    Obtaining accurate photometry of bright stars from the ground remains problematic due to the danger of overexposing the target and/or the lack of suitable nearby comparison stars. The century-old method of using objective wire mesh to produce multiple stellar images seems promising for the precise CCD photometry of such stars. Furthermore, our tests on β Cep and its comparison star, differing by 5 mag, are very encouraging. Using a CCD camera and a 20 cm telescope with the objective covered by a plastic wire mesh, in poor weather conditions, we obtained differential photometry with a precision of 4.5 mmag per two minute exposure. Our technique is flexible and may be tuned to cover a range as big as 6-8 mag. We discuss the possibility of installing a wire mesh directly in the filter wheel.
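
    Once background-subtracted fluxes are summed for a mesh-attenuated image of the bright target and for the comparison star on the same frame, the differential-photometry step is the standard magnitude difference Δm = −2.5 log10(F_target/F_comparison). A minimal sketch with illustrative aperture sums:

```python
import numpy as np

def differential_magnitude(flux_target, flux_comparison):
    """Magnitude difference between target and comparison star,
    delta_m = -2.5 * log10(F_t / F_c), for background-subtracted aperture fluxes."""
    return -2.5 * np.log10(flux_target / flux_comparison)

# Illustrative aperture sums (ADU): a diffracted mesh image of the bright star
# versus the unattenuated comparison star on the same CCD frame.
print(differential_magnitude(1.2e6, 8.5e4))   # about -2.9 mag
```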

  8. High Throughput Screening of Esterases, Lipases and Phospholipases in Mutant and Metagenomic Libraries: A Review.

    PubMed

    Peña-García, Carlina; Martínez-Martínez, Mónica; Reyes-Duarte, Dolores; Ferrer, Manuel

    2016-01-01

    Nowadays, enzymes can be efficiently identified and screened from metagenomic resources or mutant libraries. A set of a few hundred new enzymes can be found using a simple substrate within a few months. Hence, the establishment of collections of enzymes is no longer a big hurdle. However, a key problem is the relatively low rate of positive hits and that a timeline of several years from the identification of a gene to the development of a process is the reality rather than the exception. Major problems are related to the time-consuming and cost-intensive screening process that only very few enzymes finally pass. Access to the highest possible enzyme and mutant diversity by different, but complementary approaches is increasingly important. The aim of this review is to deliver the state-of-the-art status of traditional and novel screening protocols for targeting lipases, esterases and phospholipases of industrial relevance, which can be applied at high throughput scale (HTS) for at least 200 distinct substrates, at a speed of more than 10⁵-10⁸ clones/day. We also review fine-tuning sequence analysis pipelines and in silico tools, which can further improve enzyme selection at an unprecedented speed (up to 10³⁰ enzymes). If the hit rate in an enzyme collection could be increased by HTS approaches, it can be expected that the subsequent, expensive and time-consuming enzyme optimization phase could also be significantly shortened, as the processes of enzyme-candidate selection by such methods can be adapted to conditions most likely similar to the ones needed at industrial scale.

  9. Real Time Target Tracking Using Dedicated Vision Hardware

    NASA Astrophysics Data System (ADS)

    Kambies, Keith; Walsh, Peter

    1988-03-01

    This paper describes a real-time vision target tracking system developed by Adaptive Automation, Inc. and delivered to NASA's Launch Equipment Test Facility, Kennedy Space Center, Florida. The target tracking system is part of the Robotic Application Development Laboratory (RADL) which was designed to provide NASA with a general purpose robotic research and development test bed for the integration of robot and sensor systems. One of the first RADL system applications is the closing of a position control loop around a six-axis articulated arm industrial robot using a camera and dedicated vision processor as the input sensor so that the robot can locate and track a moving target. The vision system is inside of the loop closure of the robot tracking system, therefore, tight throughput and latency constraints are imposed on the vision system that can only be met with specialized hardware and a concurrent approach to the processing algorithms. State of the art VME based vision boards capable of processing the image at frame rates were used with a real-time, multi-tasking operating system to achieve the performance required. This paper describes the high speed vision based tracking task, the system throughput requirements, the use of dedicated vision hardware architecture, and the implementation design details. Important to the overall philosophy of the complete system was the hierarchical and modular approach applied to all aspects of the system, hardware and software alike, so there is special emphasis placed on this topic in the paper.

  10. 1-Million droplet array with wide-field fluorescence imaging for digital PCR.

    PubMed

    Hatch, Andrew C; Fisher, Jeffrey S; Tovar, Armando R; Hsieh, Albert T; Lin, Robert; Pentoney, Stephen L; Yang, David L; Lee, Abraham P

    2011-11-21

    Digital droplet reactors are useful as chemical and biological containers to discretize reagents into picolitre or nanolitre volumes for analysis of single cells, organisms, or molecules. However, most DNA based assays require processing of samples on the order of tens of microlitres and contain as few as one to as many as millions of fragments to be detected. Presented in this work is a droplet microfluidic platform and fluorescence imaging setup designed to better meet high-throughput and high-dynamic-range needs by integrating multiple high-throughput droplet processing schemes on the chip. The design is capable of generating over 1-million, monodisperse, 50 picolitre droplets in 2-7 minutes that then self-assemble into high density 3-dimensional sphere packing configurations in a large viewing chamber for visualization and analysis. This device then undergoes on-chip polymerase chain reaction (PCR) amplification and fluorescence detection to digitally quantify the sample's nucleic acid contents. Wide-field fluorescence images are captured using a low cost 21-megapixel digital camera and macro-lens with an 8-12 cm² field-of-view at 1× to 0.85× magnification, respectively. We demonstrate both end-point and real-time imaging ability to perform on-chip quantitative digital PCR analysis of the entire droplet array. Compared to previous work, this highly integrated design yields a 100-fold increase in the number of on-chip digitized reactors with simultaneous fluorescence imaging for digital PCR based assays.
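
    Digital quantification from the droplet array follows Poisson statistics, since a fluorescence-positive droplet may contain more than one template: λ = −ln(1 − k/n) copies per droplet for k positives out of n droplets. A minimal sketch using the ~50 pL droplet volume quoted in the text and illustrative counts:

```python
import math

def copies_per_microlitre(n_positive, n_total, droplet_volume_pl=50.0):
    """Poisson-corrected template concentration from a digital PCR droplet count."""
    lam = -math.log(1.0 - n_positive / n_total)   # mean copies per droplet
    return lam / (droplet_volume_pl * 1e-6)       # 1 pL = 1e-6 uL

# Illustrative end-point count over a 1-million droplet array.
print(f"{copies_per_microlitre(120000, 1000000):.0f} copies/uL")
```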

  11. Matching brain-machine interface performance to space applications.

    PubMed

    Citi, Luca; Tonet, Oliver; Marinelli, Martina

    2009-01-01

    A brain-machine interface (BMI) is a particular class of human-machine interface (HMI). BMIs have so far been studied mostly as a communication means for people who have little or no voluntary control of muscle activity. For able-bodied users, such as astronauts, a BMI would only be practical if conceived as an augmenting interface. A method is presented for pointing out effective combinations of HMIs and applications of robotics and automation to space. Latency and throughput are selected as performance measures for a hybrid bionic system (HBS), that is, the combination of a user, a device, and a HMI. We classify and briefly describe HMIs and space applications and then compare the performance of classes of interfaces with the requirements of classes of applications, both in terms of latency and throughput. Regions of overlap correspond to effective combinations. Devices requiring simpler control, such as a rover, a robotic camera, or environmental controls are suitable to be driven by means of BMI technology. Free flyers and other devices with six degrees of freedom can be controlled, but only at low-interactivity levels. More demanding applications require conventional interfaces, although they could be controlled by BMIs once the same levels of performance as currently recorded in animal experiments are attained. Robotic arms and manipulators could be the next frontier for noninvasive BMIs. Integrating smart controllers in HBSs could improve interactivity and boost the use of BMI technology in space applications.

  12. Molecular pathological epidemiology: new developing frontiers of big data science to study etiologies and pathogenesis.

    PubMed

    Hamada, Tsuyoshi; Keum, NaNa; Nishihara, Reiko; Ogino, Shuji

    2017-03-01

    Molecular pathological epidemiology (MPE) is an integrative field that utilizes molecular pathology to incorporate interpersonal heterogeneity of a disease process into epidemiology. In each individual, the development and progression of a disease are determined by a unique combination of exogenous and endogenous factors, resulting in different molecular and pathological subtypes of the disease. Based on "the unique disease principle," the primary aim of MPE is to uncover an interactive relationship between a specific environmental exposure and disease subtypes in determining disease incidence and mortality. This MPE approach can provide etiologic and pathogenic insights, potentially contributing to precision medicine for personalized prevention and treatment. Although breast, prostate, lung, and colorectal cancers have been among the most commonly studied diseases, the MPE approach can be used to study any disease. In addition to molecular features, host immune status and microbiome profile likely affect a disease process, and thus serve as informative biomarkers. As such, further integration of several disciplines into MPE has been achieved (e.g., pharmaco-MPE, immuno-MPE, and microbial MPE), to provide novel insights into underlying etiologic mechanisms. With the advent of high-throughput sequencing technologies, available genomic and epigenomic data have expanded dramatically. The MPE approach can also provide a specific risk estimate for each disease subgroup, thereby enhancing the impact of genome-wide association studies on public health. In this article, we present recent progress of MPE, and discuss the importance of accounting for the disease heterogeneity in the era of big-data health science and precision medicine.

  13. BigQ: a NoSQL based framework to handle genomic variants in i2b2.

    PubMed

    Gabetta, Matteo; Limongelli, Ivan; Rizzo, Ettore; Riva, Alberto; Segagni, Daniele; Bellazzi, Riccardo

    2015-12-29

    Precision medicine requires the tight integration of clinical and molecular data. To this end, it is mandatory to define proper technological solutions able to manage the overwhelming amount of high throughput genomic data needed to test associations between genomic signatures and human phenotypes. The i2b2 Center (Informatics for Integrating Biology and the Bedside) has developed a widely internationally adopted framework to use existing clinical data for discovery research that can help the definition of precision medicine interventions when coupled with genetic data. i2b2 can be significantly advanced by designing efficient management solutions of Next Generation Sequencing data. We developed BigQ, an extension of the i2b2 framework, which integrates patient clinical phenotypes with genomic variant profiles generated by Next Generation Sequencing. A visual programming i2b2 plugin allows retrieving variants belonging to the patients in a cohort by applying filters on genomic variant annotations. We report an evaluation of the query performance of our system on more than 11 million variants, showing that the implemented solution scales linearly in terms of query time and disk space with the number of variants. In this paper we describe a new i2b2 web service composed of an efficient and scalable document-based database that manages annotations of genomic variants and of a visual programming plug-in designed to dynamically perform queries on clinical and genetic data. The system therefore allows managing the fast growing volume of genomic variants and can be used to integrate heterogeneous genomic annotations.
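
    To make the idea of annotation-based cohort filtering concrete, the sketch below shows what such a query could look like against a document-oriented store using pymongo. The collection layout, field names and thresholds are purely hypothetical; the abstract does not describe BigQ's actual schema or database engine.

```python
from pymongo import MongoClient

# Hypothetical document store of per-patient variant annotations.
variants = MongoClient("mongodb://localhost:27017")["bigq_demo"]["variants"]

cohort = ["patient_001", "patient_042", "patient_107"]   # i2b2 cohort (hypothetical IDs)

# Retrieve rare, protein-altering variants in a gene of interest for the cohort.
query = {
    "patient_id": {"$in": cohort},
    "annotations.gene": "BRCA2",
    "annotations.effect": {"$in": ["missense", "frameshift", "stop_gained"]},
    "annotations.allele_frequency": {"$lt": 0.01},
}
for doc in variants.find(query, {"_id": 0, "patient_id": 1, "chrom": 1, "pos": 1}):
    print(doc)
```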

  14. Screening the Molecular Framework Underlying Local Dendritic mRNA Translation

    PubMed Central

    Namjoshi, Sanjeev V.; Raab-Graham, Kimberly F.

    2017-01-01

    In the last decade, bioinformatic analyses of high-throughput proteomics and transcriptomics data have enabled researchers to gain insight into the molecular networks that may underlie lasting changes in synaptic efficacy. Development and utilization of these techniques have advanced the field of learning and memory significantly. It is now possible to move from the study of activity-dependent changes of a single protein to modeling entire network changes that require local protein synthesis. This data revolution has necessitated the development of alternative computational and statistical techniques to analyze and understand the patterns contained within. Thus, the focus of this review is to provide a synopsis of the journey and evolution toward big data techniques to address still unanswered questions regarding how synapses are modified to strengthen neuronal circuits. We first review the seminal studies that demonstrated the pivotal role played by local mRNA translation as the mechanism underlying the enhancement of enduring synaptic activity. In the interest of those who are new to the field, we provide a brief overview of molecular biology and biochemical techniques utilized for sample preparation to identify locally translated proteins using RNA sequencing and proteomics, as well as the computational approaches used to analyze these data. While many mRNAs have been identified, few have been shown to be locally synthesized. To this end, we review techniques currently being utilized to visualize new protein synthesis, a task that has proven to be the most difficult aspect of the field. Finally, we provide examples of future applications to test the physiological relevance of locally synthesized proteins identified by big data approaches. PMID:28286470

  15. Molecular pathological epidemiology: new developing frontiers of big data science to study etiologies and pathogenesis

    PubMed Central

    Hamada, Tsuyoshi; Keum, NaNa; Nishihara, Reiko; Ogino, Shuji

    2016-01-01

    Molecular pathological epidemiology (MPE) is an integrative field that utilizes molecular pathology to incorporate interpersonal heterogeneity of a disease process into epidemiology. In each individual, the development and progression of a disease are determined by a unique combination of exogenous and endogenous factors, resulting in different molecular and pathological subtypes of the disease. Based on “the unique disease principle,” the primary aim of MPE is to uncover an interactive relationship between a specific environmental exposure and disease subtypes in determining disease incidence and mortality. This MPE approach can provide etiologic and pathogenic insights, potentially contributing to precision medicine for personalized prevention and treatment. Although breast, prostate, lung, and colorectal cancers have been among the most commonly studied diseases, the MPE approach can be used to study any disease. In addition to molecular features, host immune status and microbiome profile likely affect a disease process, and thus serve as informative biomarkers. As such, further integration of several disciplines into MPE has been achieved (e.g., pharmaco-MPE, immuno-MPE, and microbial MPE), to provide novel insights into underlying etiologic mechanisms. With the advent of high-throughput sequencing technologies, available genomic and epigenomic data have expanded dramatically. The MPE approach can also provide a specific risk estimate for each disease subgroup, thereby enhancing the impact of genome-wide association studies on public health. In this article, we present recent progress of MPE, and discuss the importance of accounting for the disease heterogeneity in the era of big-data health science and precision medicine. PMID:27738762

  16. Radiomics in radiooncology - Challenging the medical physicist.

    PubMed

    Peeken, Jan C; Bernhofer, Michael; Wiestler, Benedikt; Goldberg, Tatyana; Cremers, Daniel; Rost, Burkhard; Wilkens, Jan J; Combs, Stephanie E; Nüsslin, Fridtjof

    2018-04-01

    Noticing the fast growing translation of artificial intelligence (AI) technologies to medical image analysis this paper emphasizes the future role of the medical physicist in this evolving field. Specific challenges are addressed when implementing big data concepts with high-throughput image data processing like radiomics and machine learning in a radiooncology environment to support clinical decisions. Based on the experience of our interdisciplinary radiomics working group, techniques for processing minable data, extracting radiomics features and associating this information with clinical, physical and biological data for the development of prediction models are described. A special emphasis was placed on the potential clinical significance of such an approach. Clinical studies demonstrate the role of radiomics analysis as an additional independent source of information with the potential to influence the radiooncology practice, i.e. to predict patient prognosis, treatment response and underlying genetic changes. Extending the radiomics approach to integrate imaging, clinical, genetic and dosimetric data ('panomics') challenges the medical physicist as member of the radiooncology team. The new field of big data processing in radiooncology offers opportunities to support clinical decisions, to improve predicting treatment outcome and to stimulate fundamental research on radiation response both of tumor and normal tissue. The integration of physical data (e.g. treatment planning, dosimetric, image guidance data) demands an involvement of the medical physicist in the radiomics approach of radiooncology. To cope with this challenge national and international organizations for medical physics should organize more training opportunities in artificial intelligence technologies in radiooncology. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  17. The Design and Application of Data Storage System in Miyun Satellite Ground Station

    NASA Astrophysics Data System (ADS)

    Xue, Xiping; Su, Yan; Zhang, Hongbo; Liu, Bin; Yao, Meijuan; Zhao, Shu

    2015-04-01

    China launched the Chang'E-3 satellite in 2013 and achieved, for the first time, a soft landing on the Moon with a Chinese lunar probe. In the Chang'E-3 mission, the Miyun satellite ground station used, for the first time, a SAN storage network system based on the Stornext sharing software. The system performance fully meets the data storage requirements of the Miyun ground station. The Stornext file system is a high-performance shared file system; it supports multiple servers running different operating systems accessing the file system at the same time, and supports access to data over a variety of topologies, such as SAN and LAN. Stornext focuses on data protection and big data management. Quantum has announced that it has sold more than 70,000 licenses of the Stornext file system worldwide, and its customer base is growing, which marks its leading position in big data management. The responsibilities of the Miyun satellite ground station are the reception of the Chang'E-3 downlink data and the management of local data storage. The station mainly carries out exploration mission management and the receiving and management of observation data, and provides comprehensive, centralized monitoring and control of the data receiving equipment. The ground station applied the SAN storage network system based on the Stornext shared software to receive and manage data reliably. The computer system of the Miyun ground station is composed of operational servers, application workstations and storage equipment, so the storage system needs a shared file system that supports heterogeneous operating systems. In practical applications, 10 nodes simultaneously write data to the file system through 16 channels, and the maximum data transfer rate of each channel is up to 15 MB/s; thus the network throughput of the file system must be no less than 240 MB/s. At the same time, the maximum size of a single data file is up to 810 GB. The storage system was therefore planned so that 10 nodes can simultaneously write data through 16 channels with 240 MB/s network throughput. As integrated, the sharing system can provide a simultaneous write speed of 1020 MB/s. When the master storage server fails, the backup storage server takes over the service; client read and write access is not affected, and the switchover time is less than 5 s. The designed and integrated storage system meets the users' requirements. However, an all-fiber configuration is too expensive in a SAN, and the SCSI hard disk transfer rate may still be the bottleneck in the development of the entire storage system. Stornext can provide users with efficient sharing, management and automatic archiving of large numbers of files, together with hardware solutions, and it occupies a leading position in big data management. Stornext is the most popular sharing software for storage, but it has drawbacks: firstly, the Stornext software is expensive and is licensed per site, so when the network scale is large the purchase cost becomes very high; secondly, configuring the Stornext parameters places high demands on the skills of the technical staff, and when a problem occurs it is difficult to troubleshoot.

  18. Applied Epidemiology and Public Health: Are We Training the Future Generations Appropriately?

    PubMed Central

    Brownson, Ross C.; Samet, Jonathan M.; Bensyl, Diana M.

    2017-01-01

    To extend the reach and relevance of epidemiology for public health practice, the science needs to be broadened beyond etiologic research, to link more strongly with emerging technologies and to acknowledge key societal transformations. This new focus for epidemiology and its implications for epidemiologic training can be considered in the context of macro trends affecting society, including a greater focus on upstream causes of disease, shifting demographics, the Affordable Care Act and health care system reform, globalization, the changing health communication environment, the growing centrality of team and transdisciplinary science, the emergence of translational sciences, a greater focus on accountability, big data, informatics, high-throughput technologies ("omics"), privacy changes, and the evolving funding environment. This commentary describes existing approaches to and competencies for training in epidemiology, maps macro trends with competencies, highlights an example of competency-based education in the Epidemic Intelligence Service of the Centers for Disease Control and Prevention, and suggests expanded and more dynamic training approaches. A re-examination of current approaches to epidemiologic training is needed. PMID:28038933

  19. Genomics pipelines and data integration: challenges and opportunities in the research setting

    PubMed Central

    Davis-Turak, Jeremy; Courtney, Sean M.; Hazard, E. Starr; Glen, W. Bailey; da Silveira, Willian; Wesselman, Timothy; Harbin, Larry P.; Wolf, Bethany J.; Chung, Dongjun; Hardiman, Gary

    2017-01-01

    Introduction: The emergence and mass utilization of high-throughput (HT) technologies, including sequencing technologies (genomics) and mass spectrometry (proteomics, metabolomics, lipids), has allowed geneticists, biologists, and biostatisticians to bridge the gap between genotype and phenotype on a massive scale. These new technologies have brought rapid advances in our understanding of cell biology, evolutionary history, microbial environments, and are increasingly providing new insights and applications towards clinical care and personalized medicine. Areas covered: The very success of this industry also translates into daunting big data challenges for researchers and institutions that extend beyond the traditional academic focus of algorithms and tools. The main obstacles revolve around analysis provenance, data management of massive datasets, ease of use of software, interpretability and reproducibility of results. Expert commentary: The authors review the challenges associated with implementing bioinformatics best practices in a large-scale setting, and highlight the opportunity for establishing bioinformatics pipelines that incorporate data tracking and auditing, enabling greater consistency and reproducibility for basic research, translational or clinical settings. PMID:28092471

  20. Identification of Nanoparticle Prototypes and Archetypes.

    PubMed

    Fernandez, Michael; Barnard, Amanda S

    2015-12-22

    High-throughput (HT) computational characterization of nanomaterials is poised to accelerate novel material breakthroughs. The number of possible nanomaterials is increasing exponentially along with their complexity, and so statistical and information technology will play a fundamental role in rationalizing nanomaterials HT data. We demonstrate that multivariate statistical analysis of heterogeneous ensembles can identify the truly significant nanoparticles and their most relevant properties. Virtual samples of diamond nanoparticles and graphene nanoflakes are characterized using clustering and archetypal analysis, where we find that saturated particles are defined by their geometry, while nonsaturated nanoparticles are defined by their carbon chemistry. At the convex hull of the nanostructure spaces, a combination of complex archetypes can efficiently describe a large number of members of the ensembles, whereas the regular shapes that are typically assumed to be representative can only describe a small set of the most regular morphologies. This approach provides a route toward the characterization of computationally intractable virtual nanomaterial spaces, which can aid nanomaterials discovery in the anticipated big data scenario.
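
    The distinction drawn above between prototypes (typical ensemble members) and archetypes (extreme members) can be illustrated on synthetic descriptor data. The sketch below, which assumes scikit-learn and SciPy, uses k-means centroids as stand-in prototypes and convex-hull vertices of a 2-D projection as stand-in archetypes; it is not the authors' archetypal-analysis implementation, and all data are invented.

    ```python
    # "Prototypes" vs "archetypes" on a synthetic nanoparticle descriptor set.
    # KMeans centroids stand in for prototypes; convex-hull vertices of a 2-D PCA
    # projection stand in for archetypes (true archetypal analysis solves a
    # constrained matrix factorization, which is not reproduced here).
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA
    from scipy.spatial import ConvexHull

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 8))          # 500 particles x 8 structural descriptors

    prototypes = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X).cluster_centers_

    X2 = PCA(n_components=2).fit_transform(X)
    archetype_idx = ConvexHull(X2).vertices   # extreme particles in the projected space

    print("prototype descriptors (cluster centroids):\n", prototypes.round(2))
    print("indices of archetype-like (extreme) particles:", archetype_idx)
    ```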

  1. Radiomics: a new application from established techniques

    PubMed Central

    Parekh, Vishwa; Jacobs, Michael A.

    2016-01-01

    The increasing use of biomarkers in cancer has led to the concept of personalized medicine for patients. Personalized medicine makes better diagnosis and treatment options available to clinicians. Radiological imaging techniques provide an opportunity to deliver unique data on different types of tissue. However, obtaining useful information from all radiological data is challenging in the era of “big data”. Recent advances in computational power and the use of genomics have generated a new area of research termed Radiomics. Radiomics is defined as the high-throughput extraction of quantitative imaging or texture features (radiomics) from imaging to decode tissue pathology, creating a high-dimensional data set for analysis. Radiomic features provide information about gray-scale patterns and inter-pixel relationships. In addition, shape and spectral properties can be extracted within the same regions of interest on radiological images. Moreover, these features can be further used to develop computational models using advanced machine learning algorithms that may serve as a tool for personalized diagnosis and treatment guidance. PMID:28042608
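
    As a concrete example of the kind of texture extraction described above, the sketch below computes a few standard gray-level co-occurrence matrix (GLCM) features; it assumes a recent scikit-image release (which spells the functions graycomatrix/graycoprops), the random array merely stands in for a radiological region of interest, and the feature set is illustrative rather than a full radiomics panel.

    ```python
    # Minimal radiomics-style texture extraction from a region of interest using
    # gray-level co-occurrence matrices (GLCM). The random "image" is only a
    # placeholder for a real radiological ROI.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(1)
    roi = rng.integers(0, 64, size=(64, 64), dtype=np.uint8)   # placeholder ROI, 64 gray levels

    glcm = graycomatrix(roi, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=64, symmetric=True, normed=True)

    features = {prop: graycoprops(glcm, prop).mean()
                for prop in ("contrast", "homogeneity", "energy", "correlation")}
    print(features)
    ```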

  2. Application of hydrophobic interaction displacement chromatography for an industrial protein purification.

    PubMed

    Sunasara, Khurram M; Xia, Fang; Gronke, Robert S; Cramer, Steven M

    2003-05-05

    Recently it has been established that low molecular weight displacers can be successfully employed for the purification of proteins in hydrophobic interaction chromatography (HIC) systems. This work investigates the utility of this technique for the purification of an industrial protein mixture. The study involved the separation of a mixture of three protein forms, that differed in the C-terminus, from their aggregate impurities while maintaining the same relative ratio of the three protein forms as in the feed. A batch high-throughput screening (HTS) technique was employed in concert with fluorescence spectroscopy for displacer screening in these HIC systems. This methodology was demonstrated to be an effective tool for identifying lead displacer candidates for a particular protein/stationary-phase system. In addition, these results indicate that surfactants can be employed at concentrations above their CMCs as effective displacers. Displacement of the recombinant proteins with PEG-3400 and the surfactant Big Chap was shown to increase the productivity as compared to the existing step-gradient elution process. Copyright 2003 Wiley Periodicals, Inc.

  3. Genomics pipelines and data integration: challenges and opportunities in the research setting.

    PubMed

    Davis-Turak, Jeremy; Courtney, Sean M; Hazard, E Starr; Glen, W Bailey; da Silveira, Willian A; Wesselman, Timothy; Harbin, Larry P; Wolf, Bethany J; Chung, Dongjun; Hardiman, Gary

    2017-03-01

    The emergence and mass utilization of high-throughput (HT) technologies, including sequencing technologies (genomics) and mass spectrometry (proteomics, metabolomics, lipids), has allowed geneticists, biologists, and biostatisticians to bridge the gap between genotype and phenotype on a massive scale. These new technologies have brought rapid advances in our understanding of cell biology, evolutionary history, microbial environments, and are increasingly providing new insights and applications towards clinical care and personalized medicine. Areas covered: The very success of this industry also translates into daunting big data challenges for researchers and institutions that extend beyond the traditional academic focus of algorithms and tools. The main obstacles revolve around analysis provenance, data management of massive datasets, ease of use of software, interpretability and reproducibility of results. Expert commentary: The authors review the challenges associated with implementing bioinformatics best practices in a large-scale setting, and highlight the opportunity for establishing bioinformatics pipelines that incorporate data tracking and auditing, enabling greater consistency and reproducibility for basic research, translational or clinical settings.

  4. Harnessing the Big Data Paradigm for ICME: Shifting from Materials Selection to Materials Enabled Design

    NASA Astrophysics Data System (ADS)

    Broderick, Scott R.; Santhanam, Ganesh Ram; Rajan, Krishna

    2016-08-01

    As the size of databases has significantly increased, whether through high throughput computation or through informatics-based modeling, the challenge of selecting the optimal material for specific design requirements has also arisen. Given the multiple, and often conflicting, design requirements, this selection process is not as trivial as sorting the database for a given property value. We suggest that the materials selection process should minimize selector bias, as well as take data uncertainty into account. For this reason, we discuss and apply decision theory for identifying chemical additions to Ni-base alloys. We demonstrate and compare results for both a computational array of chemistries and standard commercial superalloys. We demonstrate how we can use decision theory to select the best chemical additions for enhancing both property and processing, which would not otherwise be easily identifiable. This work is one of the first examples of introducing the mathematical framework of set theory and decision analysis into the domain of the materials selection process.
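
    One simple, bias-minimizing first step consistent with the selection problem described above is to discard dominated candidates before any weighting is applied. The sketch below keeps only Pareto-optimal alloys over three invented objectives; it illustrates the idea only and is not the authors' decision-analysis procedure.

    ```python
    # Keep only Pareto-optimal candidates (no other candidate is at least as good
    # on every objective and strictly better on one). The alloy data are invented.
    import numpy as np

    # columns: creep strength (maximize), cost penalty (minimize), processability (maximize)
    candidates = {
        "alloy_A": (1.00, 0.80, 0.60),
        "alloy_B": (0.90, 0.40, 0.70),
        "alloy_C": (0.70, 0.30, 0.90),
        "alloy_D": (0.60, 0.90, 0.50),
    }
    # convert to a "higher is better" matrix by negating the cost column
    M = np.array([[s, -c, p] for s, c, p in candidates.values()])

    def pareto_mask(scores):
        """True for rows not dominated by any other row."""
        keep = np.ones(len(scores), dtype=bool)
        for i in range(len(scores)):
            for j in range(len(scores)):
                if i != j and np.all(scores[j] >= scores[i]) and np.any(scores[j] > scores[i]):
                    keep[i] = False
                    break
        return keep

    names = np.array(list(candidates))
    print("Pareto-optimal candidates:", names[pareto_mask(M)])
    ```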

  5. The ALICE analysis train system

    NASA Astrophysics Data System (ADS)

    Zimmermann, Markus; ALICE Collaboration

    2015-05-01

    In the ALICE experiment hundreds of users are analyzing big datasets on a Grid system. High throughput and short turn-around times are achieved by a centralized system called the LEGO trains. This system combines analysis from different users in so-called analysis trains which are then executed within the same Grid jobs thereby reducing the number of times the data needs to be read from the storage systems. The centralized trains improve the performance, the usability for users and the bookkeeping in comparison to single user analysis. The train system builds upon the already existing ALICE tools, i.e. the analysis framework as well as the Grid submission and monitoring infrastructure. The entry point to the train system is a web interface which is used to configure the analysis and the desired datasets as well as to test and submit the train. Several measures have been implemented to reduce the time a train needs to finish and to increase the CPU efficiency.
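
    The core idea of the train, reading the data once and running many users' analyses on each event, can be sketched in a few lines; the function and wagon names below are purely illustrative and do not reflect the actual LEGO-train or ALICE analysis-framework API.

    ```python
    # Conceptual sketch of an "analysis train": several user analyses are attached
    # to one job so the (expensive) event data are read from storage only once.

    def read_events(dataset):
        """Stand-in for the expensive Grid/storage read."""
        for i in range(1_000):
            yield {"event_id": i, "n_tracks": i % 50, "pt_sum": (i % 50) * 0.4}

    def analysis_multiplicity(event, hist):
        hist[event["n_tracks"]] = hist.get(event["n_tracks"], 0) + 1

    def analysis_mean_pt(event, acc):
        acc["sum"] = acc.get("sum", 0.0) + event["pt_sum"]
        acc["n"] = acc.get("n", 0) + 1

    # "wagons" registered by different users; each keeps its own output container
    wagons = [(analysis_multiplicity, {}), (analysis_mean_pt, {})]

    for event in read_events("LHC_period_X"):      # single pass over the data
        for task, output in wagons:
            task(event, output)

    print("multiplicity bins filled:", len(wagons[0][1]))
    print("mean pt_sum per event   :", wagons[1][1]["sum"] / wagons[1][1]["n"])
    ```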

  6. Applied epidemiology and public health: are we training the future generations appropriately?

    PubMed

    Brownson, Ross C; Samet, Jonathan M; Bensyl, Diana M

    2017-02-01

    To extend the reach and relevance of epidemiology for public health practice, the science needs to be broadened beyond etiologic research to link more strongly with emerging technologies and to acknowledge key societal transformations. This new focus for epidemiology and its implications for epidemiologic training can be considered in the context of macro trends affecting society, including a greater focus on upstream causes of disease, shifting demographics, the Affordable Care Act and health care system reform, globalization, a changing health communication environment, growing centrality of team and transdisciplinary science, emergence of translational sciences, greater focus on accountability, big data, informatics, high-throughput technologies ("omics"), privacy changes, and the evolving funding environment. This commentary describes existing approaches to and competencies for training in epidemiology, maps macro trends with competencies, highlights an example of competency-based education in the Epidemic Intelligence Service of the Centers for Disease Control and Prevention, and suggests expanded and more dynamic training approaches. A reexamination of current approaches to epidemiologic training is needed. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. GOBLET: The Global Organisation for Bioinformatics Learning, Education and Training

    PubMed Central

    Atwood, Teresa K.; Bongcam-Rudloff, Erik; Brazas, Michelle E.; Corpas, Manuel; Gaudet, Pascale; Lewitter, Fran; Mulder, Nicola; Palagi, Patricia M.; Schneider, Maria Victoria; van Gelder, Celia W. G.

    2015-01-01

    In recent years, high-throughput technologies have brought big data to the life sciences. The march of progress has been rapid, leaving in its wake a demand for courses in data analysis, data stewardship, computing fundamentals, etc., a need that universities have not yet been able to satisfy—paradoxically, many are actually closing “niche” bioinformatics courses at a time of critical need. The impact of this is being felt across continents, as many students and early-stage researchers are being left without appropriate skills to manage, analyse, and interpret their data with confidence. This situation has galvanised a group of scientists to address the problems on an international scale. For the first time, bioinformatics educators and trainers across the globe have come together to address common needs, rising above institutional and international boundaries to cooperate in sharing bioinformatics training expertise, experience, and resources, aiming to put ad hoc training practices on a more professional footing for the benefit of all. PMID:25856076

  8. Experimental study on pool boiling of distilled water and HFE7500 fluid under microgravity

    NASA Astrophysics Data System (ADS)

    Yang, Yan-jie; Chen, Xiao-qian; Huang, Yi-yong; Li, Guang-yu

    2018-02-01

    An experimental study of bubble behavior and pool boiling heat transfer for distilled water and HFE7500 fluid under microgravity was conducted using the drop tower at the National Microgravity Laboratory of China (NMLC). Two MCH ceramic plates of 20 mm(L) × 10 mm(W) × 1.2 mm(H) were used as heaters. The evolution of nucleate boiling under microgravity was observed during the experiment. At the same heat flux, the bubbles of HFE7500 (which has the smaller contact angle) grew faster and larger, moved quickly over the heater surface, and merged with neighboring bubbles into a central large bubble more readily than those of distilled water. The whole coalescence process, from seven bubbles to one, was recorded with a video camera. For distilled water (with the larger contact angle), bubbles tended to remain at their nucleation sites on the heater surface, and the central large bubble grew at its nucleation site by absorbing smaller bubbles nearby. Compared with bubbles under normal gravity, the bubble radius under microgravity was about 1.4 times larger for distilled water and more than 6 times larger for HFE7500 by the end of the experiment. For distilled water, pool boiling heat transfer under microgravity was initially enhanced and then impeded; for HFE7500, microgravity impeded heat transfer from the heater to the liquid throughout the experiment.

  9. Efficient processing of multiple nested event pattern queries over multi-dimensional event streams based on a triaxial hierarchical model.

    PubMed

    Xiao, Fuyuan; Aritsugi, Masayoshi; Wang, Qing; Zhang, Rong

    2016-09-01

    For efficient and sophisticated analysis of complex event patterns that appear in streams of big data from health care information systems, and to support decision-making, a triaxial hierarchical model is proposed in this paper. Our triaxial hierarchical model is developed by focusing on hierarchies among nested event pattern queries with an event concept hierarchy, thereby allowing us to identify the relationships among the expressions and sub-expressions of the queries extensively. We devise a cost-based heuristic by means of the triaxial hierarchical model to find an optimised query execution plan in terms of the costs of both the operators and the communications between them. Using the triaxial hierarchical model, we can also calculate how to reuse the results of common sub-expressions across multiple queries. By integrating the optimised query execution plan with the reuse schemes, a multi-query optimisation strategy is developed to accomplish efficient processing of multiple nested event pattern queries. We present empirical studies in which the performance of the multi-query optimisation strategy was examined under various stream input rates and workloads. Specifically, the pattern-query workloads can be used to support the monitoring of patients' conditions, experiments with varying stream input rates correspond to changes in the number of patients a system must manage, and burst input rates correspond to sudden rushes of patients to be taken care of. The experimental results show that, in Workload 1, our proposal improves throughput by about 4 and 2 times compared with the related works, respectively; in Workload 2, by about 3 and 2 times, respectively; and in Workload 3, by about 6 times compared with the related work. The experimental results demonstrated that our proposal was able to process complex queries efficiently, which can support health information systems and further decision-making. Copyright © 2016 Elsevier B.V. All rights reserved.
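
    The reuse of common sub-expressions across queries, one ingredient of the optimisation described above, can be illustrated with simple memoisation: a shared sub-pattern is evaluated once per window and its result served to every query that needs it. The event types, window contents, and predicates below are invented for illustration.

    ```python
    # Memoised evaluation of a sub-expression shared by two pattern queries over
    # one stream window.
    from functools import lru_cache

    window = tuple(("HR_HIGH" if i % 7 == 0 else "BP_LOW" if i % 5 == 0 else "OK", i)
                   for i in range(100))        # (event_type, timestamp) pairs

    @lru_cache(maxsize=None)
    def match_type(event_type):
        """Shared sub-expression: timestamps of all events of this type in the window."""
        return tuple(t for typ, t in window if typ == event_type)

    # Query 1: HR_HIGH followed by BP_LOW within 10 time units; Query 2: late HR_HIGH.
    q1 = [(a, b) for a in match_type("HR_HIGH") for b in match_type("BP_LOW") if 0 < b - a <= 10]
    q2 = [a for a in match_type("HR_HIGH") if a > 50]

    info = match_type.cache_info()
    print(f"query 1: {len(q1)} matches, query 2: {len(q2)} matches")
    print(f"sub-expression calls: {info.hits + info.misses}, actually computed: {info.misses}")
    ```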

  10. White Dwarf Stars

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Peering deep inside a cluster of several hundred thousand stars, NASA's Hubble Space Telescope has uncovered the oldest burned-out stars in our Milky Way Galaxy, giving astronomers a fresh reading on the age of the universe.

    Located in the globular cluster M4, these small, burned-out stars -- called white dwarfs -- are about 12 to 13 billion years old. By adding the one billion years it took the cluster to form after the Big Bang, astronomers found that the age of the white dwarfs agrees with previous estimates that the universe is 13 to 14 billion years old.

    The images, including some taken by Hubble's Wide Field and Planetary Camera 2, are available online at

    http://oposite.stsci.edu/pubinfo/pr/2002/10/ or

    http://www.jpl.nasa.gov/images/wfpc .

    The camera was designed and built by NASA's Jet Propulsion Laboratory, Pasadena, Calif.

    In the top panel, a ground-based observatory snapped a panoramic view of the entire cluster, which contains several hundred thousand stars within a volume of 10 to 30 light-years across. The Kitt Peak National Observatory's 0.9-meter telescope took this picture in March 1995. The box at left indicates the region observed by the Hubble telescope.

    The Hubble telescope studied a small region of the cluster. A section of that region is seen in the picture at bottom left. A sampling of an even smaller region is shown at bottom right. This region is only about one light-year across. In this smaller region, Hubble pinpointed a number of faint white dwarfs. The blue circles indicate the dwarfs. It took nearly eight days of exposure time over a 67-day period to find these extremely faint stars.

    Globular clusters are among the oldest clusters of stars in the universe. The faintest and coolest white dwarfs within globular clusters can yield a globular cluster's age. Earlier Hubble observations showed that the first stars formed less than 1 billion years after the universe's birth in the big bang. So, finding the oldest stars puts astronomers within arm's reach of the universe's age.

    Hubble's Wide Field and Planetary Camera 2 made the observations from January through April 2001. These optical observations were combined to create the above images. Spectral data were also taken. M4 is 7,000 light-years away in the constellation Scorpius.

    The full press release on the latest findings is online at

    http://oposite.stsci.edu/pubinfo/pr/2002/10/pr.html .

    The Space Telescope Science Institute is operated by the Association of Universities for Research in Astronomy, Inc., for NASA under contract with the Goddard Space Flight Center, Greenbelt, Md. The Hubble Space Telescope is a project of international cooperation between the European Space Agency and NASA. The California Institute of Technology in Pasadena manages JPL for NASA.

  11. HCI∧2 framework: a software framework for multimodal human-computer interaction systems.

    PubMed

    Shen, Jie; Pantic, Maja

    2013-12-01

    This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a shared-memory-based data transport protocol for message delivery and a TCP-based system management protocol. The latter ensures that the integrity of system structure is maintained at runtime. With the inclusion of bridging modules, the HCI∧2 Framework is interoperable with other software frameworks including Psyclone and ActiveMQ. In addition to the core communication middleware, we also present the integrated development environment (IDE) of the HCI∧2 Framework. It provides a complete graphical environment to support every step in a typical MHCI system development process, including module development, debugging, packaging, and management, as well as the whole system management and testing. The quantitative evaluation indicates that our framework outperforms other similar tools in terms of average message latency and maximum data throughput under a typical single PC scenario. To demonstrate HCI∧2 Framework's capabilities in integrating heterogeneous modules, we present several example modules working with a variety of hardware and software. We also present an example of a full system developed using the proposed HCI∧2 Framework, which is called the CamGame system and represents a computer game based on hand-held marker(s) and low-cost camera(s).
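
    The publish/subscribe style on which the framework is built can be sketched with a minimal in-process broker: modules register callbacks for topics and receive every message published to them. This illustrates the P/S pattern only; the topic names and payloads are invented, and it is not the HCI∧2 Framework API.

    ```python
    # Minimal in-process publish/subscribe broker illustrating the P/S pattern.
    from collections import defaultdict

    class Broker:
        def __init__(self):
            self._subscribers = defaultdict(list)   # topic -> list of callbacks

        def subscribe(self, topic, callback):
            self._subscribers[topic].append(callback)

        def publish(self, topic, message):
            for callback in self._subscribers[topic]:
                callback(message)

    broker = Broker()
    broker.subscribe("camera/frame", lambda msg: print("gesture module got", msg))
    broker.subscribe("camera/frame", lambda msg: print("logger got", msg))
    broker.publish("camera/frame", {"frame_id": 42, "markers_detected": 2})
    ```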

  12. Calibration strategies for the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Gaug, Markus; Berge, David; Daniel, Michael; Doro, Michele; Förster, Andreas; Hofmann, Werner; Maccarone, Maria C.; Parsons, Dan; de los Reyes Lopez, Raquel; van Eldik, Christopher

    2014-08-01

    The Central Calibration Facilities workpackage of the Cherenkov Telescope Array (CTA) observatory for very high energy gamma ray astronomy defines the overall calibration strategy of the array, develops dedicated hardware and software for the overall array calibration and coordinates the calibration efforts of the different telescopes. The latter include LED-based light pulsers, and various methods and instruments to achieve a calibration of the overall optical throughput. On the array level, methods for the inter-telescope calibration and the absolute calibration of the entire observatory are being developed. Additionally, the atmosphere above the telescopes, used as a calorimeter, will be monitored constantly with state-of-the-art instruments to obtain a full molecular and aerosol profile up to the stratosphere. The aim is to provide a maximal uncertainty of 10% on the reconstructed energy-scale, obtained through various independent methods. Different types of LIDAR in combination with all-sky-cameras will provide the observatory with an online, intelligent scheduling system, which, if the sky is partially covered by clouds, gives preference to sources observable under good atmospheric conditions. Wide-field optical telescopes and Raman Lidars will provide online information about the height-resolved atmospheric extinction, throughout the field-of-view of the cameras, allowing for the correction of the reconstructed energy of each gamma-ray event. The aim is to maximize the duty cycle of the observatory, in terms of usable data, while reducing the dead time introduced by calibration activities to an absolute minimum.

  13. The HURRA filter: An easy method to eliminate collimator artifacts in high-energy gamma camera images.

    PubMed

    Perez-Garcia, H; Barquero, R

    The correct determination and delineation of tumor/organ size is crucial in 2-D imaging for 131I therapy. These images are usually obtained with a system composed of a gamma camera and a high-energy collimator, although this system can produce artifacts in the image. This article analyses these artifacts and describes a correction filter that can eliminate them. Using the free software ImageJ, a central profile through the image is obtained and analyzed. Two components can be seen in the fluctuation of the profile: one associated with the stochastic nature of the radiation plus electronic noise, and the other varying periodically with spatial position due to the collimator. These frequencies are obtained analytically and compared with the frequencies in the Fourier transform of the profile. A specially developed filter removes the artifacts in the 2-D Fourier transform of the DICOM image. The filter is tested on an image of a 15-cm-diameter Petri dish containing 131I radioactive water (large object), an image of a 131I clinical pill (small object), and images of the residual lesions of two patients treated with 3.7 GBq (100 mCi) and 4.44 GBq (120 mCi) of 131I, respectively, after thyroidectomy. The artifact is due to the hexagonal periodic structure of the collimator. On large objects the filter reduces the fluctuation from 5.8% to 3.5%. On small objects, the FWHM can be determined in the filtered image, while this is impossible in the unfiltered image. The definition of the tumor boundary and the visualization of the activity distribution inside patient lesions improve drastically when the filter is applied to the corresponding images obtained with a high-energy gamma camera. The HURRA filter thus removes high-energy collimator artifacts from planar images obtained with a gamma camera without reducing image resolution. It can be applied in any patient quantification study because the total number of counts remains invariant, and it makes possible the definition and delimitation of small uptakes, such as those present after 131I treatments. Copyright © 2016 Elsevier España, S.L.U. y SEMNIM. All rights reserved.
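
    The underlying filtering idea, suppressing the isolated Fourier-domain peaks produced by a periodic collimator pattern while leaving the low-frequency image content untouched, can be sketched on a synthetic image; the image model and peak-detection threshold below are illustrative and are not the published HURRA parameters.

    ```python
    # Notch out the isolated spectral peaks of a periodic artifact in the 2-D FFT
    # while keeping the low-frequency (lesion) content. Synthetic data only.
    import numpy as np

    rng = np.random.default_rng(0)
    ny, nx = 256, 256
    y, x = np.mgrid[0:ny, 0:nx]
    lesion = np.exp(-((x - 128) ** 2 + (y - 128) ** 2) / (2 * 20 ** 2))      # "uptake"
    pattern = 0.2 * np.cos(2 * np.pi * x / 8) * np.cos(2 * np.pi * y / 8)     # periodic artifact
    image = lesion + pattern + 0.05 * rng.normal(size=(ny, nx))

    F = np.fft.fftshift(np.fft.fft2(image))
    mag = np.abs(F)

    # keep the low-frequency core, then zero any remaining strong isolated peaks
    cy, cx = ny // 2, nx // 2
    core = np.hypot(y - cy, x - cx) < 10
    peaks = (mag > 0.1 * mag.max()) & ~core
    F[peaks] = 0

    filtered = np.real(np.fft.ifft2(np.fft.ifftshift(F)))
    print("residual RMS vs artifact-free image:",
          np.sqrt(np.mean((filtered - lesion) ** 2)).round(4))
    ```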

  14. Identifying candidate drivers of drug response in heterogeneous cancer by mining high throughput genomics data.

    PubMed

    Nabavi, Sheida

    2016-08-15

    With advances in technologies, huge amounts of multiple types of high-throughput genomics data are available. These data have tremendous potential to identify new and clinically valuable biomarkers to guide the diagnosis, assessment of prognosis, and treatment of complex diseases, such as cancer. Integrating, analyzing, and interpreting big and noisy genomics data to obtain biologically meaningful results, however, remains highly challenging. Mining genomics datasets by utilizing advanced computational methods can help to address these issues. To facilitate the identification of a short list of biologically meaningful genes as candidate drivers of anti-cancer drug resistance from an enormous amount of heterogeneous data, we employed statistical machine-learning techniques and integrated genomics datasets. We developed a computational method that integrates gene expression, somatic mutation, and copy number aberration data of sensitive and resistant tumors. In this method, an integrative method based on module network analysis is applied to identify potential driver genes. This is followed by cross-validation and a comparison of the results of sensitive and resistance groups to obtain the final list of candidate biomarkers. We applied this method to the ovarian cancer data from the cancer genome atlas. The final result contains biologically relevant genes, such as COL11A1, which has been reported as a cis-platinum resistant biomarker for epithelial ovarian carcinoma in several recent studies. The described method yields a short list of aberrant genes that also control the expression of their co-regulated genes. The results suggest that the unbiased data driven computational method can identify biologically relevant candidate biomarkers. It can be utilized in a wide range of applications that compare two conditions with highly heterogeneous datasets.
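
    A much-simplified illustration of the integration step, combining evidence from expression, mutation, and copy-number data into one ranked candidate list, is sketched below; the gene names and scores are fabricated, and the published module-network analysis and cross-validation are not reproduced.

    ```python
    # Toy rank-aggregation across three genomics data types so that no single
    # platform dominates the candidate list. All values are invented.
    import pandas as pd

    evidence = pd.DataFrame({
        "diff_expression": {"COL11A1": 2.8, "BRCA1": 1.1, "TP53": 0.4, "MYC": 1.9},
        "mutation_freq":   {"COL11A1": 0.05, "BRCA1": 0.20, "TP53": 0.55, "MYC": 0.02},
        "cna_freq":        {"COL11A1": 0.30, "BRCA1": 0.10, "TP53": 0.15, "MYC": 0.40},
    })

    ranks = evidence.rank(ascending=False)      # rank each gene within each data type
    candidates = ranks.mean(axis=1).sort_values()
    print(candidates)                           # lowest average rank = strongest combined evidence
    ```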

  15. RabbitQR: fast and flexible big data processing at LSST data rates using existing, shared-use hardware

    NASA Astrophysics Data System (ADS)

    Kotulla, Ralf; Gopu, Arvind; Hayashi, Soichi

    2016-08-01

    Processing astronomical data to science readiness was and remains a challenge, in particular in the case of multi-detector instruments such as wide-field imagers. One such instrument, the WIYN One Degree Imager, is available to the astronomical community at large, and, in order to be scientifically useful to its varied user community on a short timescale, provides its users fully calibrated data in addition to the underlying raw data. However, time-efficient re-processing of the often large datasets with improved calibration data and/or software requires more than just a large number of CPU cores and disk space. This is particularly relevant if all computing resources are general purpose and shared with a large number of users in a typical university setup. Our approach to address this challenge is a flexible framework, combining the best of both high performance (large number of nodes, internal communication) and high throughput (flexible/variable number of nodes, no dedicated hardware) computing. Based on the Advanced Message Queuing Protocol, we developed a Server-Manager-Worker framework. In addition to the server directing the workflow and the workers executing the actual work, the manager maintains a list of available workers, adds and/or removes individual workers from the worker pool, and re-assigns workers to different tasks. This provides the flexibility of optimizing the worker pool to the current task and workload, improves load balancing, and makes the most efficient use of the available resources. We present performance benchmarks and scaling tests, showing that, today and using existing, commodity shared-use hardware, we can process data with throughputs (including data reduction and calibration) approaching those expected in the early 2020s for future observatories such as the Large Synoptic Survey Telescope.
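
    A minimal AMQP worker in the spirit of the framework described above can be written with the pika client: it takes one task at a time from a queue, processes it, and acknowledges completion so the broker can dispatch the next one. The queue name, broker address, and processing step are placeholders; this is a generic sketch, not the RabbitQR Server-Manager-Worker protocol.

    ```python
    # Minimal AMQP worker using the pika client (assumes a RabbitMQ broker on localhost).
    import pika

    def process(body: bytes) -> None:
        # stand-in for one data-reduction task on a single exposure
        print(f"reducing {body.decode()}")

    def on_message(channel, method, properties, body):
        process(body)
        channel.basic_ack(delivery_tag=method.delivery_tag)   # report completion

    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="reduction_tasks", durable=True)
    channel.basic_qos(prefetch_count=1)                        # one task at a time per worker
    channel.basic_consume(queue="reduction_tasks", on_message_callback=on_message)
    channel.start_consuming()
    ```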

  16. Screening of 485 Pesticide Residues in Fruits and Vegetables by Liquid Chromatography-Quadrupole-Time-of-Flight Mass Spectrometry Based on TOF Accurate Mass Database and QTOF Spectrum Library.

    PubMed

    Pang, Guo-Fang; Fan, Chun-Lin; Chang, Qiao-Ying; Li, Jian-Xun; Kang, Jian; Lu, Mei-Ling

    2018-03-22

    This paper uses the LC-quadrupole-time-of-flight MS technique to evaluate the mass spectrometric behavior of 485 pesticides under different conditions and develops an accurate mass database and spectral library. A high-throughput screening and confirmation method has been developed for the 485 pesticides in fruits and vegetables. Through the optimization of parameters such as accurate mass, retention time window, and ionization form, the method has improved the accuracy of pesticide screening, thus avoiding false-positive and false-negative results. The method features a full scan of fragments, with 80% of the pesticides having more than 10 qualitative points, which helps increase qualitative accuracy. The abundant differences among fragment categories enable the effective separation and qualitative identification of isomeric pesticides. Four different fruits and vegetables (apples, grapes, celery, and tomatoes) were chosen to evaluate the efficiency of the method at three fortification levels of 5, 10, and 20 μg/kg, and satisfactory results were obtained. With this method, a national survey of pesticide residues was conducted between 2012 and 2015 on 12,551 samples of 146 different fruits and vegetables collected from 638 sampling points in 284 counties across 31 provincial capitals/cities directly under the central government, which provided scientific data support for ensuring the pesticide residue safety of the fruits and vegetables consumed daily by the public. Meanwhile, statistical analysis of the resulting big data further demonstrates the method to be fast, high-throughput, accurate, reliable, and highly informative.
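
    The accurate-mass screening step itself reduces to matching a measured m/z against database entries within a ppm tolerance, as in the short sketch below; the three database entries and the 5 ppm window are illustrative and are not taken from the paper's database.

    ```python
    # Accurate-mass matching against a small illustrative pesticide database.
    database = {
        "carbendazim":  192.0768,   # [M+H]+ m/z, illustrative entries
        "imidacloprid": 256.0596,
        "acetamiprid":  223.0745,
    }
    TOLERANCE_PPM = 5.0

    def screen(measured_mz):
        """Return database pesticides whose [M+H]+ lies within the ppm window."""
        return [name for name, mz in database.items()
                if abs(measured_mz - mz) / mz * 1e6 <= TOLERANCE_PPM]

    print(screen(192.0771))   # within 5 ppm of carbendazim
    print(screen(250.0000))   # no match
    ```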

  17. Evaluation of the NanoCHIP® Gastrointestinal Panel (GIP) Test for Simultaneous Detection of Parasitic and Bacterial Enteric Pathogens in Fecal Specimens

    PubMed Central

    Ken Dror, Shifra; Pavlotzky, Elsa; Barak, Mira

    2016-01-01

    Infectious gastroenteritis is a global health problem associated with high morbidity and mortality rates. Rapid and accurate diagnosis is crucial to allow appropriate and timely treatment. Current laboratory stool testing has a long turnaround time (TAT) and demands highly qualified personnel and multiple techniques. The need for high throughput and the number of possible enteric pathogens compel the implementation of a molecular approach that uses multiplex technology, without compromising performance requirements. In this work we evaluated the feasibility of the NanoCHIP® Gastrointestinal Panel (GIP) (Savyon Diagnostics, Ashdod, IL), a molecular microarray-based screening test, to be used in the routine workflow of our laboratory, a big outpatient microbiology laboratory. The NanoCHIP® GIP test provides simultaneous detection of nine major enteric bacteria and parasites: Campylobacter spp., Salmonella spp., Shigella spp., Giardia sp., Cryptosporidium spp., Entamoeba histolytica, Entamoeba dispar, Dientamoeba fragilis, and Blastocystis spp. The required high throughput was achieved by the NanoCHIP® detection system together with the MagNA Pure 96 DNA purification system (Roche Diagnostics Ltd., Switzerland). This combined system demonstrated a higher sensitivity and detection yield compared to the conventional methods in both retrospective and prospective samples. The identification of multiple parasites and bacteria in a single test also increased the efficiency of detecting mixed infections and reduced hands-on time and workload. In conclusion, the combination of these two automated systems is a proper response to the laboratory's needs in terms of improving workflow and turnaround time and minimizing human errors, and it can be efficiently integrated into the routine work of the laboratory. PMID:27447173

  18. New Algorithm and Software (BNOmics) for Inferring and Visualizing Bayesian Networks from Heterogeneous Big Biological and Genetic Data

    PubMed Central

    Gogoshin, Grigoriy; Boerwinkle, Eric

    2017-01-01

    Bayesian network (BN) reconstruction is a prototypical systems biology data analysis approach that has been successfully used to reverse engineer and model networks reflecting different layers of biological organization (ranging from genetic to epigenetic to cellular pathway to metabolomic). It is especially relevant in the context of modern (ongoing and prospective) studies that generate heterogeneous high-throughput omics datasets. However, there are both theoretical and practical obstacles to the seamless application of BN modeling to such big data, including computational inefficiency of optimal BN structure search algorithms, ambiguity in data discretization, mixing data types, imputation and validation, and, in general, limited scalability in both reconstruction and visualization of BNs. To overcome these and other obstacles, we present BNOmics, an improved algorithm and software toolkit for inferring and analyzing BNs from omics datasets. BNOmics aims at comprehensive systems biology-type data exploration, including both generating new biological hypotheses and testing and validating existing ones. Novel aspects of the algorithm center around increasing scalability and applicability to varying data types (with different explicit and implicit distributional assumptions) within the same analysis framework. An output and visualization interface to widely available graph-rendering software is also included. Three diverse applications are detailed. BNOmics was originally developed in the context of genetic epidemiology data and is being continuously optimized to keep pace with the ever-increasing inflow of available large-scale omics datasets. As such, the software scalability and usability on less than exotic computer hardware are a priority, as well as the applicability of the algorithm and software to heterogeneous datasets containing many data types: single-nucleotide polymorphisms and other genetic/epigenetic/transcriptome variables, metabolite levels, epidemiological variables, endpoints, and phenotypes, etc. PMID:27681505

  19. Data mining with iPlant: a meeting report from the 2013 GARNet workshop, Data mining with iPlant.

    PubMed

    Martin, Lisa; Cook, Charis; Matasci, Naim; Williams, Jason; Bastow, Ruth

    2015-01-01

    High-throughput sequencing technologies have rapidly moved from large international sequencing centres to individual laboratory benchtops. These changes have driven the 'data deluge' of modern biology. Submissions of nucleotide sequences to GenBank, for example, have doubled in size every year since 1982, and individual data sets now frequently reach terabytes in size. While 'big data' present exciting opportunities for scientific discovery, data analysis skills are not part of the typical wet bench biologist's experience. Knowing what to do with data, how to visualize and analyse them, make predictions, and test hypotheses are important barriers to success. Many researchers also lack adequate capacity to store and share these data, creating further bottlenecks to effective collaboration between groups and institutes. The US National Science Foundation-funded iPlant Collaborative was established in 2008 to form part of the data collection and analysis pipeline and help alleviate the bottlenecks associated with the big data challenge in plant science. Leveraging the power of high-performance computing facilities, iPlant provides free-to-use cyberinfrastructure to enable terabytes of data storage, improve analysis, and facilitate collaborations. To help train UK plant science researchers to use the iPlant platform and understand how it can be exploited to further research, GARNet organized a four-day Data mining with iPlant workshop at Warwick University in September 2013. This report provides an overview of the workshop, and highlights the power of the iPlant environment for lowering barriers to using complex bioinformatics resources, furthering discoveries in plant science research and providing a platform for education and outreach programmes. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  20. Lost in Translation (LiT)

    PubMed Central

    Dollery, Colin T

    2014-01-01

    Translational medicine is a roller coaster with occasional brilliant successes and a large majority of failures. Lost in Translation 1 (‘LiT1’), beginning in the 1950s, was a golden era built upon earlier advances in experimental physiology, biochemistry and pharmacology, with a dash of serendipity, that led to the discovery of many new drugs for serious illnesses. LiT2 saw the large-scale industrialization of drug discovery using high-throughput screens and assays based on affinity for the target molecule. The links between drug development and university sciences and medicine weakened, but there were still some brilliant successes. In LiT3, the coverage of translational medicine expanded from molecular biology to drug budgets, with much greater emphasis on safety and official regulation. Compared with R&D expenditure, the number of breakthrough discoveries in LiT3 was disappointing, but monoclonal antibodies for immunity and inflammation brought in a new golden era and kinase inhibitors such as imatinib were breakthroughs in cancer. The pharmaceutical industry is trying to revive the LiT1 approach by using phenotypic assays and closer links with academia. LiT4 faces a data explosion generated by the genome project, GWAS, ENCODE and the ‘omics’ that is in danger of leaving LiT4 in a computerized cloud. Industrial laboratories are filled with masses of automated machinery while the scientists sit in a separate room viewing the results on their computers. Big Data will need Big Thinking in LiT4 but with so many unmet medical needs and so many new opportunities being revealed there are high hopes that the roller coaster will ride high again. PMID:24428732

  1. New Algorithm and Software (BNOmics) for Inferring and Visualizing Bayesian Networks from Heterogeneous Big Biological and Genetic Data.

    PubMed

    Gogoshin, Grigoriy; Boerwinkle, Eric; Rodin, Andrei S

    2017-04-01

    Bayesian network (BN) reconstruction is a prototypical systems biology data analysis approach that has been successfully used to reverse engineer and model networks reflecting different layers of biological organization (ranging from genetic to epigenetic to cellular pathway to metabolomic). It is especially relevant in the context of modern (ongoing and prospective) studies that generate heterogeneous high-throughput omics datasets. However, there are both theoretical and practical obstacles to the seamless application of BN modeling to such big data, including computational inefficiency of optimal BN structure search algorithms, ambiguity in data discretization, mixing data types, imputation and validation, and, in general, limited scalability in both reconstruction and visualization of BNs. To overcome these and other obstacles, we present BNOmics, an improved algorithm and software toolkit for inferring and analyzing BNs from omics datasets. BNOmics aims at comprehensive systems biology-type data exploration, including both generating new biological hypotheses and testing and validating the existing ones. Novel aspects of the algorithm center around increasing scalability and applicability to varying data types (with different explicit and implicit distributional assumptions) within the same analysis framework. An output and visualization interface to widely available graph-rendering software is also included. Three diverse applications are detailed. BNOmics was originally developed in the context of genetic epidemiology data and is being continuously optimized to keep pace with the ever-increasing inflow of available large-scale omics datasets. As such, the software scalability and usability on the less than exotic computer hardware are a priority, as well as the applicability of the algorithm and software to the heterogeneous datasets containing many data types-single-nucleotide polymorphisms and other genetic/epigenetic/transcriptome variables, metabolite levels, epidemiological variables, endpoints, and phenotypes, etc.

  2. Automated Functional Analysis of Astrocytes from Chronic Time-Lapse Calcium Imaging Data

    PubMed Central

    Wang, Yinxue; Shi, Guilai; Miller, David J.; Wang, Yizhi; Wang, Congchao; Broussard, Gerard; Wang, Yue; Tian, Lin; Yu, Guoqiang

    2017-01-01

    Recent discoveries that astrocytes exert proactive regulatory effects on neural information processing and that they are deeply involved in normal brain development and disease pathology have stimulated broad interest in understanding astrocyte functional roles in brain circuits. Measuring astrocyte functional status is now technically feasible, due to recent advances in modern microscopy and ultrasensitive cell-type specific genetically encoded Ca2+ indicators for chronic imaging. However, there is a big gap between the capability of generating large datasets via calcium imaging and the availability of sophisticated analytical tools for decoding the astrocyte function. Current practice is essentially manual, which not only limits analysis throughput but also risks introducing bias and missing important information latent in complex, dynamic big data. Here, we report a suite of computational tools, called Functional AStrocyte Phenotyping (FASP), for automatically quantifying the functional status of astrocytes. Considering the complex nature of Ca2+ signaling in astrocytes and the low signal-to-noise ratio, FASP is designed with data-driven and probabilistic principles, to flexibly account for various patterns and to perform robustly with noisy data. In particular, FASP explicitly models signal propagation, which rules out the applicability of tools designed for other types of data. We demonstrate the effectiveness of FASP using extensive synthetic and real data sets. The findings by FASP were verified by manual inspection. FASP also detected signals that were missed by purely manual analysis but could be confirmed by more careful manual examination under the guidance of automatic analysis. All algorithms and the analysis pipeline are packaged into a plugin for Fiji (ImageJ), with the source code freely available online at https://github.com/VTcbil/FASP. PMID:28769780

  3. Lost in Translation (LiT): IUPHAR Review 6.

    PubMed

    Dollery, Colin T

    2014-05-01

    Translational medicine is a roller coaster with occasional brilliant successes and a large majority of failures. Lost in Translation 1 ('LiT1'), beginning in the 1950s, was a golden era built upon earlier advances in experimental physiology, biochemistry and pharmacology, with a dash of serendipity, that led to the discovery of many new drugs for serious illnesses. LiT2 saw the large-scale industrialization of drug discovery using high-throughput screens and assays based on affinity for the target molecule. The links between drug development and university sciences and medicine weakened, but there were still some brilliant successes. In LiT3, the coverage of translational medicine expanded from molecular biology to drug budgets, with much greater emphasis on safety and official regulation. Compared with R&D expenditure, the number of breakthrough discoveries in LiT3 was disappointing, but monoclonal antibodies for immunity and inflammation brought in a new golden era and kinase inhibitors such as imatinib were breakthroughs in cancer. The pharmaceutical industry is trying to revive the LiT1 approach by using phenotypic assays and closer links with academia. LiT4 faces a data explosion generated by the genome project, GWAS, ENCODE and the 'omics' that is in danger of leaving LiT4 in a computerized cloud. Industrial laboratories are filled with masses of automated machinery while the scientists sit in a separate room viewing the results on their computers. Big Data will need Big Thinking in LiT4 but with so many unmet medical needs and so many new opportunities being revealed there are high hopes that the roller coaster will ride high again. © 2014 The British Pharmacological Society.

  4. Enabling phenotypic big data with PheNorm.

    PubMed

    Yu, Sheng; Ma, Yumeng; Gronsbell, Jessica; Cai, Tianrun; Ananthakrishnan, Ashwin N; Gainer, Vivian S; Churchill, Susanne E; Szolovits, Peter; Murphy, Shawn N; Kohane, Isaac S; Liao, Katherine P; Cai, Tianxi

    2018-01-01

    Electronic health record (EHR)-based phenotyping infers whether a patient has a disease based on the information in his or her EHR. A human-annotated training set with gold-standard disease status labels is usually required to build an algorithm for phenotyping based on a set of predictive features. The time intensiveness of annotation and feature curation severely limits the ability to achieve high-throughput phenotyping. While previous studies have successfully automated feature curation, annotation remains a major bottleneck. In this paper, we present PheNorm, a phenotyping algorithm that does not require expert-labeled samples for training. The most predictive features, such as the number of International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes or mentions of the target phenotype, are normalized to resemble a normal mixture distribution with high area under the receiver operating curve (AUC) for prediction. The transformed features are then denoised and combined into a score for accurate disease classification. We validated the accuracy of PheNorm with 4 phenotypes: coronary artery disease, rheumatoid arthritis, Crohn's disease, and ulcerative colitis. The AUCs of the PheNorm score reached 0.90, 0.94, 0.95, and 0.94 for the 4 phenotypes, respectively, which were comparable to the accuracy of supervised algorithms trained with sample sizes of 100-300, with no statistically significant difference. The accuracy of the PheNorm algorithms is on par with algorithms trained with annotated samples. PheNorm fully automates the generation of accurate phenotyping algorithms and demonstrates the capacity for EHR-driven annotations to scale to the next level - phenotypic big data. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
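
    The normalize-then-mix idea can be illustrated on synthetic counts: log-transform a code-count feature, fit a two-component Gaussian mixture, and read the posterior of the high-count component as an unsupervised disease score. The sketch below assumes scikit-learn and fabricated data; it is a simplified stand-in, not the published PheNorm algorithm.

    ```python
    # Unsupervised scoring from a normalized count feature via a Gaussian mixture.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # counts for 800 patients without and 200 with the phenotype (synthetic)
    counts = np.concatenate([rng.poisson(1, 800), rng.poisson(15, 200)])
    x = np.log1p(counts).reshape(-1, 1)            # normalize the count feature

    gmm = GaussianMixture(n_components=2, random_state=0).fit(x)
    high = np.argmax(gmm.means_.ravel())           # component with the larger mean
    score = gmm.predict_proba(x)[:, high]          # probability of having the phenotype

    print("mean score, unaffected patients:", score[:800].mean().round(3))
    print("mean score, affected patients  :", score[800:].mean().round(3))
    ```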

  5. Automated Functional Analysis of Astrocytes from Chronic Time-Lapse Calcium Imaging Data.

    PubMed

    Wang, Yinxue; Shi, Guilai; Miller, David J; Wang, Yizhi; Wang, Congchao; Broussard, Gerard; Wang, Yue; Tian, Lin; Yu, Guoqiang

    2017-01-01

    Recent discoveries that astrocytes exert proactive regulatory effects on neural information processing and that they are deeply involved in normal brain development and disease pathology have stimulated broad interest in understanding astrocyte functional roles in brain circuits. Measuring astrocyte functional status is now technically feasible, due to recent advances in modern microscopy and ultrasensitive cell-type specific genetically encoded Ca2+ indicators for chronic imaging. However, there is a big gap between the capability of generating large datasets via calcium imaging and the availability of sophisticated analytical tools for decoding the astrocyte function. Current practice is essentially manual, which not only limits analysis throughput but also risks introducing bias and missing important information latent in complex, dynamic big data. Here, we report a suite of computational tools, called Functional AStrocyte Phenotyping (FASP), for automatically quantifying the functional status of astrocytes. Considering the complex nature of Ca2+ signaling in astrocytes and the low signal-to-noise ratio, FASP is designed with data-driven and probabilistic principles, to flexibly account for various patterns and to perform robustly with noisy data. In particular, FASP explicitly models signal propagation, which rules out the applicability of tools designed for other types of data. We demonstrate the effectiveness of FASP using extensive synthetic and real data sets. The findings by FASP were verified by manual inspection. FASP also detected signals that were missed by purely manual analysis but could be confirmed by more careful manual examination under the guidance of automatic analysis. All algorithms and the analysis pipeline are packaged into a plugin for Fiji (ImageJ), with the source code freely available online at https://github.com/VTcbil/FASP.

  6. Spectroscopic imaging system for high-throughput viability assessment of ovarian spheroids or microdissected tumor tissues (MDTs) in a microfluidic chip

    NASA Astrophysics Data System (ADS)

    St-Georges-Robillard, A.; Masse, M.; Kendall-Dupont, J.; Strupler, M.; Patra, B.; Jermyn, M.; Mes-Masson, A.-M.; Leblond, F.; Gervais, T.

    2016-02-01

    There is a growing effort in the biomicrosystems community to develop a personalized treatment response assay for cancer patients using primary cells, patient-derived spheroids, or live tissues on-chip. Recently, our group has developed a technique to cut tumors into 350 μm diameter microtissues and keep them alive on-chip, enabling multiplexed in vitro drug assays on primary tumor tissue. Two-photon microscopy, confocal microscopy and flow cytometry are the current standard to assay tissue chemosensitivity on-chip. While these techniques provide microscopic and molecular information, they are not adapted for high-throughput analysis of microtissues. We present a spectroscopic imaging system that allows rapid quantitative measurements of multiple fluorescent viability markers simultaneously by using a liquid crystal tunable filter to record fluorescence and transmittance spectra. As a proof of concept, 24 spheroids composed of ovarian cancer cell line OV90 were formed in a microfluidic chip, stained with two live cell markers (CellTracker™ Green and Orange), and imaged. Fluorescence images acquired were normalized to the acquisition time and gain of the camera, dark noise was removed, spectral calibration was applied, and spatial uniformity was corrected. Spectral un-mixing was applied to separate each fluorophore's contribution. We have demonstrated that rapid and simultaneous viability measurements on multiple spheroids can be achieved, which will have a significant impact on the prediction of a tumor's response to multiple treatment options. This technique may be applied as well in drug discovery to assess the potential of a drug candidate directly on human primary tissue.
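
    The spectral un-mixing step mentioned above amounts to expressing each measured pixel spectrum as a non-negative combination of known fluorophore reference spectra; a per-pixel non-negative least-squares sketch with synthetic reference spectra is shown below (the Gaussian spectra and abundances are invented).

    ```python
    # Per-pixel linear spectral un-mixing by non-negative least squares.
    import numpy as np
    from scipy.optimize import nnls

    wavelengths = np.linspace(500, 650, 60)                       # nm
    green = np.exp(-((wavelengths - 520) ** 2) / (2 * 15 ** 2))   # green-marker-like spectrum
    orange = np.exp(-((wavelengths - 575) ** 2) / (2 * 18 ** 2))  # orange-marker-like spectrum
    A = np.column_stack([green, orange])                          # reference (endmember) matrix

    true_abundances = np.array([0.3, 0.7])
    measured = A @ true_abundances + 0.01 * np.random.default_rng(0).normal(size=len(wavelengths))

    abundances, residual = nnls(A, measured)
    print("recovered abundances:", abundances.round(3), " residual:", residual.round(4))
    ```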

  7. Digital micromirror devices in Raman trace detection of explosives

    NASA Astrophysics Data System (ADS)

    Glimtoft, Martin; Svanqvist, Mattias; Ågren, Matilda; Nordberg, Markus; Östmark, Henric

    2016-05-01

    Imaging Raman spectroscopy based on tunable filters is an established technique for detecting single explosives particles at stand-off distances. However, large light losses are inherent in the design due to sequential imaging at different wavelengths, leading to effective transmission often well below 1%. The use of digital micromirror devices (DMD) and compressive sensing (CS) in imaging Raman explosives trace detection can improve light throughput and add significant flexibility compared to existing systems. DMDs are based on mature microelectronics technology, are compact and scalable, and can be customized for specific tasks, including new functions not available with current technologies. This paper focuses on how a DMD can be used for CS-based imaging Raman spectroscopy in stand-off explosives trace detection, and evaluates the performance in terms of light throughput, image reconstruction ability, and potential detection limits. This type of setup also offers the possibility of combining imaging Raman with non-spatially resolved fluorescence suppression techniques, such as Kerr gating. The system used consists of a second-harmonic Nd:YAG laser for sample excitation, collection optics, a DMD, a CMOS camera, and a spectrometer with an ICCD camera for signal gating and detection. Initial results for compressive sensing imaging Raman show a stable reconstruction procedure even at low signal levels and in the presence of interfering background signal. The approach is also shown to give increased effective light transmission without sacrificing molecular specificity or area coverage compared to filter-based imaging Raman, while adding flexibility so that the setup can be customized for new functionality.
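
    A toy version of the compressive-sensing reconstruction can be written in a few lines: random mask patterns give measurements y = A·x of a sparse scene, which is then recovered with iterative soft thresholding (ISTA). The sizes, sparsity level, and regularisation weight below are illustrative, and ±1 patterns stand in for the complementary 0/1 mask pairs a DMD would actually display; this is not the authors' reconstruction code.

    ```python
    # Toy compressive-sensing reconstruction with ISTA (iterative soft thresholding).
    import numpy as np

    rng = np.random.default_rng(0)
    n, m, k = 256, 96, 6                      # pixels, measurements, nonzero pixels
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=k, replace=False)] = rng.uniform(1.0, 3.0, size=k)

    A = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)   # measurement patterns
    y = A @ x_true                                           # noiseless measurements

    L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the data term
    lam = 0.02                                 # sparsity weight
    x = np.zeros(n)
    for _ in range(3000):
        x = x + (A.T @ (y - A @ x)) / L                        # gradient step
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft threshold

    print("true support      :", sorted(np.flatnonzero(x_true)))
    print("largest recovered :", sorted(np.argsort(np.abs(x))[-k:]))
    print("relative error    :", round(np.linalg.norm(x - x_true) / np.linalg.norm(x_true), 3))
    ```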

  8. Dwarf Galaxies Swimming in Tidal Tails

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This false-color infrared image from NASA's Spitzer Space Telescope shows little 'dwarf galaxies' forming in the 'tails' of two larger galaxies that are colliding together. The big galaxies are at the center of the picture, while the dwarfs can be seen as red dots in the red streamers, or tidal tails. The two blue dots above the big galaxies are stars in the foreground.

    Galaxy mergers are common occurrences in the universe; for example, our own Milky Way galaxy will eventually smash into the nearby Andromeda galaxy. When two galaxies meet, they tend to rip each other apart, leaving a trail, called a tidal tail, of gas and dust in their wake. It is out of this galactic debris that new dwarf galaxies are born.

    The new Spitzer picture demonstrates that these particular dwarfs are actively forming stars. The red color indicates the presence of dust produced in star-forming regions, including organic molecules called polycyclic aromatic hydrocarbons. These carbon-containing molecules are also found on Earth, in car exhaust and on burnt toast, among other places. Here, the molecules are being heated up by the young stars, and, as a result, shine in infrared light.

    This image was taken by the infrared array camera on Spitzer. It is a 4-color composite of infrared light, showing emissions from wavelengths of 3.6 microns (blue), 4.5 microns (green), 5.8 microns (orange), and 8.0 microns (red). Starlight has been subtracted from the orange and red channels in order to enhance the dust features.

  9. Dynamic Echo Information Guides Flight in the Big Brown Bat

    PubMed Central

    Warnecke, Michaela; Lee, Wu-Jung; Krishnan, Anand; Moss, Cynthia F.

    2016-01-01

    Animals rely on sensory feedback from their environment to guide locomotion. For instance, visually guided animals use patterns of optic flow to control their velocity and to estimate their distance to objects (e.g., Srinivasan et al., 1991, 1996). In this study, we investigated how acoustic information guides locomotion of animals that use hearing as a primary sensory modality to orient and navigate in the dark, where visual information is unavailable. We studied flight and echolocation behaviors of big brown bats as they flew under infrared illumination through a corridor with walls constructed from a series of individual vertical wooden poles. The spacing between poles on opposite walls of the corridor was experimentally manipulated to create dense/sparse and balanced/imbalanced spatial structure. The bats’ flight trajectories and echolocation signals were recorded with high-speed infrared motion-capture cameras and ultrasound microphones, respectively. As bats flew through the corridor, successive biosonar emissions returned cascades of echoes from the walls of the corridor. The bats flew through the center of the corridor when the pole spacing on opposite walls was balanced and closer to the side with wider pole spacing when opposite walls had an imbalanced density. Moreover, bats produced shorter duration echolocation calls when they flew through corridors with smaller spacing between poles, suggesting that clutter density influences features of the bat’s sonar signals. Flight speed and echolocation call rate did not, however, vary with dense and sparse spacing between the poles forming the corridor walls. Overall, these data demonstrate that bats adapt their flight and echolocation behavior dynamically when flying through acoustically complex environments. PMID:27199690

  10. SAAO's new robotic telescope and WiNCam (Wide-field Nasmyth Camera)

    NASA Astrophysics Data System (ADS)

    Worters, Hannah L.; O'Connor, James E.; Carter, David B.; Loubser, Egan; Fourie, Pieter A.; Sickafoose, Amanda; Swanevelder, Pieter

    2016-08-01

    The South African Astronomical Observatory (SAAO) is designing and manufacturing a wide-field camera for use on two of its telescopes. The initial concept was of a prime focus camera for the 74" telescope, an equatorial design made by Grubb Parsons, where it would employ a 61 mm × 61 mm detector to cover a 23 arcmin diameter field of view. However, while in the design phase, SAAO embarked on the process of acquiring a bespoke 1-metre robotic alt-az telescope with a 43 arcmin field of view, which needs a homegrown instrument suite. The prime focus camera design was thus adapted for use on either telescope, increasing the detector size to 92 mm × 92 mm. Since the camera will be mounted on the Nasmyth port of the new telescope, it was dubbed WiNCam (Wide-field Nasmyth Camera). This paper describes both WiNCam and the new telescope. Producing an instrument that can be swapped between two very different telescopes poses some unique challenges. At the Nasmyth port of the alt-az telescope there is ample circumferential space, while on the 74-inch telescope the available envelope is constrained by the optical footprint of the secondary, if further obscuration is to be avoided. This forces the design into a cylindrical volume of 600 mm diameter × 250 mm height. The back focal distance is tightly constrained on the new telescope, shoehorning the shutter, filter unit, guider mechanism, a 10 mm thick window and a tip/tilt mechanism for the detector into a 100 mm depth. The iris shutter and filter wheel planned for prime focus could no longer be accommodated. Instead, a compact shutter with a thickness of less than 20 mm has been designed in-house, using a sliding curtain mechanism to cover an aperture of 125 mm × 125 mm, while the filter wheel has been replaced with two peripheral filter cartridges (six filters each) and a gripper to move a filter into the beam. We intend to use through-vacuum-wall PCB technology across the cryostat vacuum interface, instead of traditional hermetic connector-based wiring. This has advantages in terms of space saving and improved performance. Measures are being taken to minimise the risk of damage during an instrument change. The detector is cooled by a Stirling cooler, which can be disconnected from the cooler unit without risking damage. Each telescope has a dedicated cooler unit into which the coolant hoses of WiNCam will plug. To overcome an inherent drawback of Stirling coolers, an active vibration damper is incorporated. During an instrument change, the autoguider remains on the telescope, and the filter magazines, shutter and detector package are removed as a single unit. The new alt-az telescope, manufactured by APM-Telescopes, is a 1-metre f/8 Ritchey-Chrétien with optics by LOMO. The field flattening optics were designed by Darragh O'Donoghue to have high UV throughput and uniform encircled energy over the 100 mm diameter field. WiNCam will be mounted on one Nasmyth port, with the second port available for SHOC (Sutherland High-speed Optical Camera) and guest instrumentation. The telescope will be located in Sutherland, where an existing dome is being extensively renovated to accommodate it. Commissioning is planned for the second half of 2016.

  11. First Born amplitude for transitions from a circular state to a state of large (l, m)

    NASA Astrophysics Data System (ADS)

    Dewangan, D. P.

    2005-01-01

    The use of cylindrical polar coordinates instead of the conventional spherical polar coordinates enables us to derive compact expressions of the first Born amplitude for some selected sets of transitions from an arbitrary initial circular $|\psi_{n_i,n_i-1,n_i-1}\rangle$ state to a final $|\psi_{n_f,l_f,m_f}\rangle$ state of large $(l_f, m_f)$. The formulae for $|\psi_{n_i,n_i-1,n_i-1}\rangle \rightarrow |\psi_{n_f,n_f-1,n_f-2}\rangle$ and $|\psi_{n_i,n_i-1,n_i-1}\rangle \rightarrow |\psi_{n_f,n_f-1,n_f-3}\rangle$ transitions are expressed in terms of the Jacobi polynomials, which serve as suitable starting points for constructing complete solutions over the bound energy levels of hydrogen-like atoms. The formulae for $|\psi_{n_i,n_i-1,n_i-1}\rangle \rightarrow |\psi_{n_f,n_f-1,-(n_f-2)}\rangle$ and $|\psi_{n_i,n_i-1,n_i-1}\rangle \rightarrow |\psi_{n_f,n_f-1,-(n_f-3)}\rangle$ transitions are in simple algebraic forms and are directly applicable to all possible values of $n_i$ and $n_f$. It emerges that the method can be extended to evaluate the first Born amplitude for many other transitions involving states of large $(l, m)$.

  12. SLAM-based dense surface reconstruction in monocular Minimally Invasive Surgery and its application to Augmented Reality.

    PubMed

    Chen, Long; Tang, Wen; John, Nigel W; Wan, Tao Ruan; Zhang, Jian Jun

    2018-05-01

    While Minimally Invasive Surgery (MIS) offers considerable benefits to patients, it also imposes big challenges on a surgeon's performance due to well-known issues and restrictions associated with the field of view (FOV), hand-eye misalignment and disorientation, as well as the lack of stereoscopic depth perception in monocular endoscopy. Augmented Reality (AR) technology can help to overcome these limitations by augmenting the real scene with annotations, labels, tumour measurements or even a 3D reconstruction of anatomy structures at the target surgical locations. However, previous research attempts to use AR technology in monocular MIS surgical scenes have mainly focused on information overlay without addressing correct spatial calibrations, which could lead to incorrect localization of annotations and labels, and inaccurate depth cues and tumour measurements. In this paper, we present a novel intra-operative dense surface reconstruction framework that is capable of providing geometry information from only monocular MIS videos for geometry-aware AR applications such as site measurements and depth cues. We address a number of compelling issues in augmenting a scene for a monocular MIS environment, such as drifting and inaccurate planar mapping. A state-of-the-art Simultaneous Localization And Mapping (SLAM) algorithm used in robotics has been extended to deal with monocular MIS surgical scenes for reliable endoscopic camera tracking and salient point mapping. A robust global 3D surface reconstruction framework has been developed for building a dense surface using only unorganized sparse point clouds extracted from the SLAM. The 3D surface reconstruction framework employs the Moving Least Squares (MLS) smoothing algorithm and the Poisson surface reconstruction framework for real-time processing of the point cloud data set. Finally, the 3D geometric information of the surgical scene allows better understanding and accurate placement of AR augmentations based on a robust 3D calibration. We demonstrate the clinical relevance of our proposed system through two examples: (a) measurement of the surface; (b) depth cues in monocular endoscopy. The performance and accuracy evaluations of the proposed framework consist of two steps. First, we have created a computer-generated endoscopy simulation video to quantify the accuracy of the camera tracking by comparing the results of the video camera tracking with the recorded ground-truth camera trajectories. The accuracy of the surface reconstruction is assessed by evaluating the Root Mean Square Distance (RMSD) of surface vertices of the reconstructed mesh with that of the ground truth 3D models. An error of 1.24 mm for the camera trajectories has been obtained and the RMSD for surface reconstruction is 2.54 mm, which compare favourably with previous approaches. Second, in vivo laparoscopic videos are used to examine the quality of accurate AR based annotation and measurement, and the creation of depth cues. These results show the potential promise of our geometry-aware AR technology to be used in MIS surgical scenes. The results show that the new framework is robust and accurate in dealing with challenging situations such as the rapid endoscopy camera movements in monocular MIS scenes. Both camera tracking and surface reconstruction based on a sparse point cloud are effective and operate in real-time. This demonstrates the potential of our algorithm for accurate AR localization and depth augmentation with geometric cues and correct surface measurements in MIS with monocular endoscopes. Copyright © 2018 Elsevier B.V. All rights reserved.
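
    To make the RMSD evaluation above concrete, the following is a minimal sketch of computing a vertex-to-surface Root Mean Square Distance with NumPy and SciPy; the array names and the nearest-neighbour lookup via a k-d tree are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from scipy.spatial import cKDTree

        def surface_rmsd(reconstructed_vertices, ground_truth_vertices):
            """Root Mean Square Distance from each reconstructed vertex to the
            nearest ground-truth vertex (a simple stand-in for mesh-to-mesh distance)."""
            tree = cKDTree(ground_truth_vertices)              # spatial index over the ground-truth model
            distances, _ = tree.query(reconstructed_vertices)  # nearest-neighbour distance per vertex
            return float(np.sqrt(np.mean(distances ** 2)))

        # toy usage with random point sets standing in for mesh vertices (units: mm)
        truth = np.random.rand(1000, 3) * 50.0
        recon = truth + np.random.normal(scale=2.5, size=truth.shape)  # ~2.5 mm perturbation
        print(f"RMSD: {surface_rmsd(recon, truth):.2f} mm")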

  13. Combined optical sizing and acoustical characterization of single freely-floating microbubbles

    NASA Astrophysics Data System (ADS)

    Luan, Ying; Renaud, Guillaume; Raymond, Jason L.; Segers, Tim; Lajoinie, Guillaume; Beurskens, Robert; Mastik, Frits; Kokhuis, Tom J. A.; van der Steen, Antonius F. W.; Versluis, Michel; de Jong, Nico

    2016-12-01

    In this study we present a combined optical sizing and acoustical characterization technique for studying the dynamics of single freely-floating ultrasound contrast agent microbubbles exposed to long-burst ultrasound excitations up to the millisecond range. A co-axial flow device was used to position individual microbubbles on a streamline within the confocal region of three ultrasound transducers and a high-resolution microscope objective. Bright-field images of microbubbles passing through the confocal region were captured using a high-speed camera synchronized to the acoustical data acquisition to assess the microbubble response to a 1-MHz ultrasound burst. Nonlinear bubble vibrations were identified at a driving pressure as low as 50 kPa. The results demonstrate good agreement with numerical simulations based on the shell-buckling model proposed by Marmottant et al. [J. Acoust. Soc. Am. 118, 3499-3505 (2005)]. The system demonstrates the potential for high-throughput in vitro characterization of individual microbubbles.
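
    The core of the Marmottant shell-buckling model cited above is an effective surface tension that switches between buckled, elastic and ruptured regimes as the bubble radius changes. A minimal sketch of that piecewise relation is given below; the parameter values are illustrative placeholders, not those fitted in the study.

        import numpy as np

        def marmottant_surface_tension(R, R_buckling, chi, sigma_water=0.072):
            """Effective surface tension sigma(R) of a coated microbubble in the
            Marmottant shell-buckling description:
              buckled  (R <= R_buckling)           : sigma = 0
              elastic  (intermediate radii)        : sigma = chi * (R**2 / R_buckling**2 - 1)
              ruptured (sigma reaches sigma_water) : sigma = sigma_water
            chi is the shell elasticity in N/m; radii are in metres."""
            sigma = chi * (R ** 2 / R_buckling ** 2 - 1.0)
            return float(np.clip(sigma, 0.0, sigma_water))

        # illustrative evaluation around a 2-micron buckling radius
        R0 = 2.0e-6
        for R in np.linspace(0.95 * R0, 1.10 * R0, 5):
            print(f"R = {R * 1e6:.2f} um -> sigma = {marmottant_surface_tension(R, R0, chi=0.5):.3f} N/m")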

  14. Computer-aided target tracking in motion analysis studies

    NASA Astrophysics Data System (ADS)

    Burdick, Dominic C.; Marcuse, M. L.; Mislan, J. D.

    1990-08-01

    Motion analysis studies require the precise tracking of reference objects in sequential scenes. In a typical situation, events of interest are captured at high frame rates using special cameras, and selected objects or targets are tracked on a frame by frame basis to provide necessary data for motion reconstruction. Tracking is usually done using manual methods which are slow and prone to error. A computer based image analysis system has been developed that performs tracking automatically. The objective of this work was to eliminate the bottleneck due to manual methods in high volume tracking applications such as the analysis of crash test films for the automotive industry. The system has proven to be successful in tracking standard fiducial targets and other objects in crash test scenes. Over 95 percent of target positions which could be located using manual methods can be tracked by the system, with a significant improvement in throughput over manual methods. Future work will focus on the tracking of clusters of targets and on tracking deformable objects such as airbags.
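
    The frame-by-frame target tracking described above can be illustrated, in modern terms, with normalized cross-correlation template matching; the sketch below uses OpenCV, and the frame source, template patch and acceptance threshold are assumptions for illustration rather than the system described in the paper.

        import cv2

        def track_target(frames, template, min_score=0.7):
            """Locate a grayscale template patch in each video frame; returns the (x, y)
            of the best match per frame, or None when the correlation peak is too weak."""
            positions = []
            for frame in frames:
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                scores = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
                _, max_val, _, max_loc = cv2.minMaxLoc(scores)      # peak of the correlation surface
                positions.append(max_loc if max_val >= min_score else None)
            return positions

    In a crash-test setting, the template would be a small patch cut around a fiducial marker in a reference frame, and the per-frame positions would then feed the motion reconstruction.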

  15. Application of polarization in high speed, high contrast inspection

    NASA Astrophysics Data System (ADS)

    Novak, Matthew J.

    2017-08-01

    Industrial optical inspection often requires high speed and high throughput of materials. Engineers use a variety of techniques to handle these inspection needs; some examples include line scan cameras and high speed multi-spectral and laser-based systems. High-volume manufacturing presents different challenges for inspection engineers. For example, manufacturers produce some components in quantities of millions per month, per week or even per day. Quality control of so many parts requires creativity to meet the measurement needs. At times, traditional vision systems lack the contrast to provide the data required. In this paper, we show how dynamic polarization imaging captures high contrast images. These images are useful for engineers performing inspection tasks in cases where optical contrast is low. We cover the basic theory of polarization, show how to exploit polarization as a contrast enhancement technique, and present modeling results for a polarization inspection application. Specifically, we explore polarization techniques for inspection of adhesives on glass.

  16. Fast-ion Dα spectrum diagnostic in the EAST

    NASA Astrophysics Data System (ADS)

    Hou, Y. M.; Wu, C. R.; Huang, J.; Heidbrink, W. W.; von Hellermann, M. G.; Xu, Z.; Jin, Z.; Chang, J. F.; Zhu, Y. B.; Gao, W.; Chen, Y. J.; Lyu, B.; Hu, R. J.; Zhang, P. F.; Zhang, L.; Gao, W.; Wu, Z. W.; Yu, Y.; Ye, M. Y.

    2016-11-01

    In toroidal magnetic fusion devices, the fast-ion D-alpha diagnostic (FIDA) is a powerful method for studying fast-ion behavior. Fast-ion characteristics can be inferred from the Doppler-shifted spectrum of Dα light produced by the charge exchange recombination process between fast ions and the probe beam. Since the conceptual design was presented at the last HTPD conference, significant progress has been made in applying FIDA systems on the Experimental Advanced Superconducting Tokamak (EAST). Both co-current and counter-current neutral beam injectors are available, and each can deliver 2-4 MW beam power with 50-80 keV beam energy. Presently, two sets of high throughput spectrometer systems have been installed on EAST, allowing passing and trapped fast-ion characteristics to be captured simultaneously, using a Kaiser HoloSpec transmission grating spectrometer and a Bunkoukeiki FLP-200 volume phase holographic spectrometer coupled with a Princeton Instruments ProEM 1024B eXcelon and an Andor DU-888 iXon3 1024 CCD camera, respectively. This paper presents details of the hardware and the experimental spectra.

  17. Ultra-high-speed variable focus optics for novel applications in advanced imaging

    NASA Astrophysics Data System (ADS)

    Kang, S.; Dotsenko, E.; Amrhein, D.; Theriault, C.; Arnold, C. B.

    2018-02-01

    With the advancement of ultra-fast manufacturing technologies, high speed imaging with high 3D resolution has become increasingly important. Here we show the use of an ultra-high-speed variable focus optical element, the TAG Lens, to enable new ways to acquire 3D information from an object. The TAG Lens uses sound to adjust the index of refraction profile in a liquid and thereby can achieve focal scanning rates greater than 100 kHz. When combined with a high-speed pulsed LED and a high-speed camera, we can exploit this phenomenon to achieve high-resolution imaging through large depths. By combining the image acquisition with digital image processing, we can extract relevant parameters such as tilt and angle information from objects in the image. Due to the high speeds at which images can be collected and processed, we believe this technique can be used as an efficient method of industrial inspection and metrology for high throughput applications.

  18. Monitoring the Environmental Impact of TiO2 Nanoparticles Using a Plant-Based Sensor Network

    PubMed Central

    Lenaghan, Scott C.; Li, Yuanyuan; Zhang, Hao; Burris, Jason N.; Stewart, C. Neal; Parker, Lynne E.; Zhang, Mingjun

    2016-01-01

    The increased manufacturing of nanoparticles for use in cosmetics, foods, and clothing necessitates the need for an effective system to monitor and evaluate the potential environmental impact of these nanoparticles. The goal of this research was to develop a plant-based sensor network for characterizing, monitoring, and understanding the environmental impact of TiO2 nanoparticles. The network consisted of potted Arabidopsis thaliana with a surrounding water supply, which was monitored by cameras attached to a laptop computer running a machine learning algorithm. Using the proposed plant sensor network, we were able to examine the toxicity of TiO2 nanoparticles in two systems: algae and terrestrial plants. Increased terrestrial plant growth was observed upon introduction of the nanoparticles, whereas algal growth decreased significantly. The proposed system can be further automated for high-throughput screening of nanoparticle toxicity in the environment at multiple trophic levels. The proposed plant-based sensor network could be used for more accurate characterization of the environmental impact of nanomaterials. PMID:28458617

  19. Remote detection of buried land-mines and IEDs using LWIR polarimetric imaging.

    PubMed

    Gurton, Kristan P; Felton, Melvin

    2012-09-24

    We report results of an ongoing study designed to assess the potential for enhanced detection of recently buried land-mines and/or improvised explosive devices (IEDs) using passive long-wave infrared (LWIR) polarimetric imaging. Polarimetric results are presented for a series of field tests conducted at various locations and over various soil types. Well-calibrated Stokes images, S0, S1, S2, and the degree-of-linear-polarization (DoLP) are recorded for different line-of-sight (LOS) slant paths at varying distances. Results span a three-year time period in which three different LWIR polarimetric camera systems are used. All three polarimetric imaging platforms used a spinning-achromatic-retarder (SAR) design capable of achieving high polarimetric frame rates and good radiometric throughput without the loss of spatial resolution inherent in other optical designs. Receiver-operating-characteristic (ROC) analysis and a standardized contrast parameter are used to compare detectability between conventional LWIR thermal and polarimetric imagery. Results suggest improved detectability, regardless of geographic location or soil type.
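
    For reference, the degree-of-linear-polarization used above follows directly from the first three Stokes images; a minimal per-pixel sketch (array names assumed) is:

        import numpy as np

        def degree_of_linear_polarization(S0, S1, S2, eps=1e-9):
            """DoLP = sqrt(S1^2 + S2^2) / S0, computed per pixel from Stokes images.
            eps guards against division by zero in dark pixels."""
            return np.sqrt(S1 ** 2 + S2 ** 2) / np.maximum(S0, eps)

        def angle_of_linear_polarization(S1, S2):
            """AoLP = 0.5 * arctan2(S2, S1), in radians; often examined alongside DoLP."""
            return 0.5 * np.arctan2(S2, S1)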

  20. Novel Aspects of the DESI Data Acquisition System

    NASA Astrophysics Data System (ADS)

    Beaufore, Lucas; Honscheid, Klaus; Elliott, Ann; Dark Energy Spectroscopic Instrument Collaboration

    2015-04-01

    The Dark Energy Spectroscopic Instrument (DESI) will measure the effect of dark energy on the expansion of the universe. It will obtain optical spectra for tens of millions of galaxies and quasars, constructing a 3-dimensional map spanning the nearby universe to 10 billion light years. The survey will be conducted on the Mayall 4-meter telescope at Kitt Peak National Observatory starting in 2018. In order to achieve these scientific goals the DESI collaboration is building a high throughput spectrograph capable of observing thousands of spectra simultaneously. In this presentation we discuss the DESI instrument control and data acquisition system that is currently being developed to operate the 5,000 fiber positioners in the focal plane, the 10 spectrographs each with three CCD cameras and every other aspect of the instrument. Special emphasis will be given to novel aspects of the design including the use of inexpensive Linux-based microcontrollers such as the Raspberry Pi to control a number of DESI hardware components.

  1. Clinical diagnostic of pleural effusions using a high-speed viscosity measurement method

    NASA Astrophysics Data System (ADS)

    Hurth, Cedric; Klein, Katherine; van Nimwegen, Lena; Korn, Ronald; Vijayaraghavan, Krishnaswami; Zenhausern, Frederic

    2011-08-01

    We present a novel bio-analytical method to discriminate between transudative and exudative pleural effusions based on high-speed video analysis of a solid glass sphere impacting a liquid. Since the result depends on the solution viscosity, it can ultimately replace the battery of biochemical assays currently used. We present results obtained on a series of 7 pleural effusions from consenting patients, analyzing both the splash observed after the glass impactor hits the liquid surface and the sphere's motion in a configuration reminiscent of the drop-ball viscometer, with added sensitivity and throughput provided by the high-speed camera. The results distinguish between the pleural effusions and show good correlation with the fluid chemistry analysis, accurately differentiating exudates and transudates for clinical purposes. The exudative effusions display a viscosity around 1.39 ± 0.08 cP whereas the transudative effusion was measured at 0.89 ± 0.09 cP, in good agreement with previous reports.
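
    In the drop-ball configuration mentioned above, once the sphere's terminal velocity has been extracted from the high-speed video the viscosity follows from Stokes' law, eta = 2 r^2 (rho_s - rho_f) g / (9 v_t). A short sketch with illustrative numbers (not the study's calibration) is:

        def stokes_viscosity(radius_m, rho_sphere, rho_fluid, terminal_velocity_m_s, g=9.81):
            """Dynamic viscosity (Pa*s) from Stokes' law for a sphere falling at terminal velocity:
            eta = 2 * r**2 * (rho_sphere - rho_fluid) * g / (9 * v_t)."""
            return 2.0 * radius_m ** 2 * (rho_sphere - rho_fluid) * g / (9.0 * terminal_velocity_m_s)

        # illustrative numbers: a 0.1 mm radius glass sphere settling slowly in a watery fluid
        eta = stokes_viscosity(radius_m=1.0e-4, rho_sphere=2500.0, rho_fluid=1000.0,
                               terminal_velocity_m_s=0.012)
        print(f"eta = {eta * 1e3:.2f} mPa*s")  # 1 mPa*s = 1 cP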

  2. Infrared imaging of the polymer 3D-printing process

    NASA Astrophysics Data System (ADS)

    Dinwiddie, Ralph B.; Kunc, Vlastimil; Lindal, John M.; Post, Brian; Smith, Rachel J.; Love, Lonnie; Duty, Chad E.

    2014-05-01

    Both mid-wave and long-wave IR cameras are used to measure various temperature profiles in thermoplastic parts as they are printed. Two significantly different 3D-printers are used in this study. The first is a small scale commercially available Solidoodle 3 printer, which prints parts with layer thicknesses on the order of 125 μm. The second printer used is a "Big Area Additive Manufacturing" (BAAM) 3D-printer developed at Oak Ridge National Laboratory. The BAAM prints parts with a layer thickness of 4.06 mm. Of particular interest is the temperature of the previously deposited layer as the new hot layer is about to be extruded onto it. The two layers are expected to have a stronger bond if the temperature of the substrate layer is above the glass transition temperature. This paper describes the measurement technique and results for a study of temperature decay and substrate layer temperature for ABS thermoplastic with and without the addition of chopped carbon fibers.
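
    One simple way to quantify the temperature decay discussed above is to fit a single-exponential (Newtonian cooling) model to the layer-surface temperatures extracted from the IR frames; the sketch below uses SciPy's curve_fit, with synthetic data standing in for the measurements (the model choice and values are assumptions for illustration).

        import numpy as np
        from scipy.optimize import curve_fit

        def newtonian_cooling(t, T_ambient, delta_T, tau):
            """Single-exponential decay toward ambient: T(t) = T_ambient + delta_T * exp(-t / tau)."""
            return T_ambient + delta_T * np.exp(-t / tau)

        # synthetic stand-in for layer-surface temperatures sampled from an IR image sequence
        t = np.linspace(0.0, 60.0, 61)                                  # seconds since deposition
        T = 25.0 + 180.0 * np.exp(-t / 18.0) + np.random.normal(0.0, 1.0, t.size)

        popt, _ = curve_fit(newtonian_cooling, t, T, p0=(25.0, 150.0, 10.0))
        print("ambient = %.1f C, initial excess = %.1f C, time constant = %.1f s" % tuple(popt))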

  3. The Polarization-Sensitive Bolometers for SPICA and their Potential Use for Ground-Based Application

    NASA Astrophysics Data System (ADS)

    Reveret, Vincent

    2018-01-01

    CEA is leading the development of Safari-POL, an imaging-polarimeter aboard the SPICA space observatory (ESA M5). SPICA will be able to reach unprecedented sensitivities thanks to its cooled telescope and its ultra-sensitive detectors. The detector assembly of Safari-POL holds three arrays that are cooled down to 50 mK and correspond to three spectral bands: 100, 200 and 350 microns. The detectors (silicon bolometers) benefit from the Herschel/PACS legacy and are also a big step forward in terms of sensitivity (improved by two orders of magnitude compared to PACS bolometers) and polarimetry capabilities. Indeed, each pixel is intrinsically sensitive to two polarization components (horizontal and vertical). We will present the Safari-POL concept, the first results of measurements made on the detectors, and future plans for possible ground-based instruments using this technology. We will also present the example of the ArTéMiS camera, installed at APEX, which was developed as a ground-based counterpart of the PACS photometer.

  4. Contributions to the History of Astronomy, Vol. 8 (German Title: Beiträge zur Astronomiegeschichte, Band 8)

    NASA Astrophysics Data System (ADS)

    Dick, Wolfgang R.; Hamel, Jürgen

    The contributions span a time interval of more than 450 years. There are biographical investigations on Georg Joachim Rheticus, C.W.A. von Wahl and K.F. Heym, an investigation of a reprint of a chapter of the principal work of Nicolaus Copernicus, contributions on Christoph Scheiner and the "camera obscura", and, with respect to the history of timekeeping, on the "big Nuremberg clock". 19th century topics are: a contribution on the honorary doctorate of Joseph Fraunhofer and on the construction of a lunar globe by Wilhelmine Witte, while the report on Friedrich Wilhelm Bessel and the cholera epidemic in Königsberg in the year 1831 gives a view into the everyday life of scientists. 20th century topics are: the contributions on Bruno Thüring in Vienna and his relations with National Socialism, as well as on Arthur Beer, Albert Einstein and the Warburg library. The book concludes with short communications, obituaries and book reviews.

  5. Mighty Little Dot

    NASA Image and Video Library

    2014-12-01

    Enceladus (visible in the lower-left corner of the image) is but a speck before enormous Saturn, but even a small moon can generate big waves of excitement throughout the scientific community. Enceladus, only 313 miles (504 kilometers) across, spurts vapor jets from its south pole. The presence of these jets from Enceladus has been the subject of intense study since they were discovered by Cassini. Their presence may point to a sub-surface water reservoir. This view looks toward the unilluminated side of the rings from about 2 degrees below the ringplane. The image was taken with the Cassini spacecraft wide-angle camera on Oct. 20, 2014 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was obtained at a distance of approximately 589,000 miles (948,000 kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 26 degrees. Image scale is 35 miles (57 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18296

  6. Human grasping database for activities of daily living with depth, color and kinematic data streams.

    PubMed

    Saudabayev, Artur; Rysbek, Zhanibek; Khassenova, Raykhan; Varol, Huseyin Atakan

    2018-05-29

    This paper presents a grasping database collected from multiple human subjects for activities of daily living in unstructured environments. The main strength of this database is the use of three different sensing modalities: color images from a head-mounted action camera, distance data from a depth sensor on the dominant arm and upper body kinematic data acquired from an inertial motion capture suit. 3826 grasps were identified in the data collected during 9-hours of experiments. The grasps were grouped according to a hierarchical taxonomy into 35 different grasp types. The database contains information related to each grasp and associated sensor data acquired from the three sensor modalities. We also provide our data annotation software written in Matlab as an open-source tool. The size of the database is 172 GB. We believe this database can be used as a stepping stone to develop big data and machine learning techniques for grasping and manipulation with potential applications in rehabilitation robotics and intelligent automation.

  7. Design of an Optical System for Phase Retrieval based on a Spatial Light Modulator

    NASA Astrophysics Data System (ADS)

    Falldorf, Claas; Agour, Mostafa; von Kopylow, Christoph; Bergmann, Ralf B.

    2010-04-01

    We present an optical configuration for phase retrieval from a sequence of intensity measurements. The setup is based on a 4f-configuration with a phase modulating spatial light modulator (SLM) located in the Fourier domain. The SLM is used to modulate the incoming light with the transfer function of propagation, so that a sequence of propagated representations of the wave field under investigation can be captured across a common sensor plane. The main advantage of this technique is the greatly reduced measurement time, since no mechanical adjustment of the camera sensor is required throughout the measurement process. The treatment focuses on the analysis of the wave field in the sensor domain. From this discussion a set of parameters is derived in order to minimize disturbing effects arising from the discrete nature of the SLM. Finally, the big potential of this approach is demonstrated by means of experimental investigations of wave field sensing.
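
    The "transfer function of propagation" written onto the SLM can be illustrated with the standard angular-spectrum kernel: the field's Fourier transform is multiplied by H(fx, fy) = exp(i * (2*pi/lambda) * z * sqrt(1 - (lambda*fx)^2 - (lambda*fy)^2)). The NumPy sketch below is a generic implementation of that kernel, with grid size and sampling chosen only for illustration.

        import numpy as np

        def angular_spectrum_propagate(field, wavelength, dx, z):
            """Propagate a sampled complex wave field by a distance z using the angular
            spectrum method: multiply its 2D FFT by the propagation transfer function H(fx, fy)."""
            ny, nx = field.shape
            fx = np.fft.fftfreq(nx, d=dx)
            fy = np.fft.fftfreq(ny, d=dx)
            FX, FY = np.meshgrid(fx, fy)
            arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
            kz = (2.0 * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))
            H = np.exp(1j * kz * z) * (arg > 0)          # evanescent components are suppressed
            return np.fft.ifft2(np.fft.fft2(field) * H)

        # usage: propagate a 512x512 field sampled at 8 um pixels by 5 mm at 633 nm
        field = np.ones((512, 512), dtype=complex)
        propagated = angular_spectrum_propagate(field, wavelength=633e-9, dx=8e-6, z=5e-3)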

  8. The Large Synoptic Survey Telescope OCS and TCS models

    NASA Astrophysics Data System (ADS)

    Schumacher, German; Delgado, Francisco

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) is a project envisioned as a system of systems with demanding science, technical, and operational requirements, that must perform as a fully integrated unit. The design and implementation of such a system poses big engineering challenges when performing requirements analysis, detailed interface definitions, operational modes and control strategy studies. The OMG System Modeling Language (SysML) has been selected as the framework for the systems engineering analysis and documentation for the LSST. Models for the overall system architecture and different observatory subsystems have been built describing requirements, structure, interfaces and behavior. In this paper we show the models for the Observatory Control System (OCS) and the Telescope Control System (TCS), and how this methodology has helped in the clarification of the design and requirements. In one common language, the relationships of the OCS, TCS, Camera and Data management subsystems are captured with models of the structure, behavior, requirements and the traceability between them.

  9. IOT for Agriculture: Food Quality and Safety

    NASA Astrophysics Data System (ADS)

    Witjaksono, Gunawan; Abdelkreem Saeed Rabih, Almur; Yahya, Noorhana bt; Alva, Sagir

    2018-03-01

    Food is the main energy source for living beings; as such, food quality and safety have been in high demand throughout human history. The Internet of Things (IOT) is a technology with a vision to connect anything at any time and anywhere. Utilizing IOT in the food supply chain (FSC) is believed to enhance the quality of life by tracing and tracking food conditions and live-sharing the obtained data with consumers or FSC supervisors. Currently, full application of IOT in the FSC is still in the developing stage and there is a big gap for improvement. The purpose of this paper is to explore the possibility of applying IOT to agriculture to trace and track food quality and safety. A mobile application for food freshness investigation was successfully developed, and the results showed that a consumer mobile camera could be used to test the freshness of food. By applying IOT technology, this information could be shared with all the consumers and also the supervisors.

  10. An Efficient and QoS Supported Multichannel MAC Protocol for Vehicular Ad Hoc Networks

    PubMed Central

    Tan, Guozhen; Yu, Chao

    2017-01-01

    Vehicular Ad Hoc Networks (VANETs) employ multiple channels to provide a variety of safety and non-safety (transport efficiency and infotainment) applications, based on the IEEE 802.11p and IEEE 1609.4 protocols. Different types of applications require different levels of Quality-of-Service (QoS) support. Recently, transport efficiency and infotainment applications (e.g., electronic map download and Internet access) have received more and more attention, and this kind of application is expected to become a big market driver in the near future. In this paper, we propose an Efficient and QoS supported Multichannel Medium Access Control (EQM-MAC) protocol for VANETs in a highway environment. The EQM-MAC protocol utilizes the service channel resources for non-safety message transmissions during the whole synchronization interval, and it dynamically adjusts the minimum contention window size for different non-safety services according to the traffic conditions. Theoretical model analysis and extensive simulation results show that the EQM-MAC protocol can support QoS services, while ensuring high saturation throughput and low transmission delay for non-safety applications. PMID:28991217

  11. Bigger data, collaborative tools and the future of predictive drug discovery

    NASA Astrophysics Data System (ADS)

    Ekins, Sean; Clark, Alex M.; Swamidass, S. Joshua; Litterman, Nadia; Williams, Antony J.

    2014-10-01

    Over the past decade we have seen a growth in the provision of chemistry data and cheminformatics tools as either free websites or software as a service commercial offerings. These have transformed how we find molecule-related data and use such tools in our research. There have also been efforts to improve collaboration between researchers either openly or through secure transactions using commercial tools. A major challenge in the future will be how such databases and software approaches handle larger amounts of data as it accumulates from high throughput screening and enables the user to draw insights, enable predictions and move projects forward. We now discuss how information from some drug discovery datasets can be made more accessible and how privacy of data should not overwhelm the desire to share it at an appropriate time with collaborators. We also discuss additional software tools that could be made available and provide our thoughts on the future of predictive drug discovery in this age of big data. We use some examples from our own research on neglected diseases, collaborations, mobile apps and algorithm development to illustrate these ideas.

  12. Metagenomic analysis of bacterial community composition and antibiotic resistance genes in a wastewater treatment plant and its receiving surface water.

    PubMed

    Tang, Junying; Bu, Yuanqing; Zhang, Xu-Xiang; Huang, Kailong; He, Xiwei; Ye, Lin; Shan, Zhengjun; Ren, Hongqiang

    2016-10-01

    The presence of pathogenic bacteria and the dissemination of antibiotic resistance genes (ARGs) may pose big risks to the rivers that receive the effluent from municipal wastewater treatment plants (WWTPs). In this study, we investigated the changes of bacterial community and ARGs along treatment processes of one WWTP, and examined the effects of the effluent discharge on the bacterial community and ARGs in the receiving river. Pyrosequencing was applied to reveal bacterial community composition including potential bacterial pathogen, and Illumina high-throughput sequencing was used for profiling ARGs. The results showed that the WWTP had good removal efficiency on potential pathogenic bacteria (especially Arcobacter butzleri) and ARGs. Moreover, the bacterial communities of downstream and upstream of the river showed no significant difference. However, the increase in the abundance of potential pathogens and ARGs at effluent outfall was observed, indicating that WWTP effluent might contribute to the dissemination of potential pathogenic bacteria and ARGs in the receiving river. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Lessons learned in the analysis of high-dimensional data in vaccinomics

    PubMed Central

    Oberg, Ann L.; McKinney, Brett A.; Schaid, Daniel J.; Pankratz, V. Shane; Kennedy, Richard B.; Poland, Gregory A.

    2015-01-01

    The field of vaccinology is increasingly moving toward the generation, analysis, and modeling of extremely large and complex high-dimensional datasets. We have used data such as these in the development and advancement of the field of vaccinomics to enable prediction of vaccine responses and to develop new vaccine candidates. However, the application of systems biology to what has been termed “big data,” or “high-dimensional data,” is not without significant challenges—chief among them a paucity of gold standard analysis and modeling paradigms with which to interpret the data. In this article, we relate some of the lessons we have learned over the last decade of working with high-dimensional, high-throughput data as applied to the field of vaccinomics. The value of such efforts, however, is ultimately to better understand the immune mechanisms by which protective and non-protective responses to vaccines are generated, and to use this information to support a personalized vaccinology approach in creating better, and safer, vaccines for the public health. PMID:25957070

  14. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    PubMed

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  15. Old wines in new bottles: Repurposing opportunities for Parkinson's disease.

    PubMed

    Kakkar, Ashish Kumar; Singh, Harmanjit; Medhi, Bikash

    2018-07-05

    Parkinson's disease (PD) is a chronic progressive neurological disorder characterized by accumulation of Lewy bodies and profound loss of substantia nigra dopaminergic neurons. PD symptomatology is now recognized to include both cardinal motor as well as clinically significant non-motor symptoms. Despite intensive research, the current understanding of molecular mechanisms underlying neurodegeneration in PD is limited and has hampered the development of novel symptomatic and disease modifying therapies. The currently available treatment options are only partially or transiently effective and fail to restore the lost dopaminergic neurons or retard disease progression. Given the escalating drug development costs, lengthening timelines and declining R&D efficiency, industry and academia are increasingly focusing on ways to repurpose existing molecules as an accelerated route for drug discovery. The field of PD therapeutics is witnessing vigorous repurposing activity supported by big data analytics, computational models, and high-throughput drug screening systems. Here we review the mechanisms, efficacy, and safety of several emerging drugs currently aspiring to be repositioned for PD pharmacotherapy. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Lessons learned in the analysis of high-dimensional data in vaccinomics.

    PubMed

    Oberg, Ann L; McKinney, Brett A; Schaid, Daniel J; Pankratz, V Shane; Kennedy, Richard B; Poland, Gregory A

    2015-09-29

    The field of vaccinology is increasingly moving toward the generation, analysis, and modeling of extremely large and complex high-dimensional datasets. We have used data such as these in the development and advancement of the field of vaccinomics to enable prediction of vaccine responses and to develop new vaccine candidates. However, the application of systems biology to what has been termed "big data," or "high-dimensional data," is not without significant challenges-chief among them a paucity of gold standard analysis and modeling paradigms with which to interpret the data. In this article, we relate some of the lessons we have learned over the last decade of working with high-dimensional, high-throughput data as applied to the field of vaccinomics. The value of such efforts, however, is ultimately to better understand the immune mechanisms by which protective and non-protective responses to vaccines are generated, and to use this information to support a personalized vaccinology approach in creating better, and safer, vaccines for the public health. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Marine Metagenome as A Resource for Novel Enzymes.

    PubMed

    Alma'abadi, Amani D; Gojobori, Takashi; Mineta, Katsuhiko

    2015-10-01

    More than 99% of identified prokaryotes, including many from the marine environment, cannot be cultured in the laboratory. This lack of capability restricts our knowledge of microbial genetics and community ecology. Metagenomics, the culture-independent cloning of environmental DNAs that are isolated directly from an environmental sample, has already provided a wealth of information about the uncultured microbial world. It has also facilitated the discovery of novel biocatalysts by allowing researchers to probe directly into a huge diversity of enzymes within natural microbial communities. Recent advances in these studies have led to a great interest in recruiting microbial enzymes for the development of environmentally-friendly industry. Although the metagenomics approach has many limitations, it is expected to provide not only scientific insights but also economic benefits, especially in industry. This review highlights the importance of metagenomics in mining microbial lipases, as an example, by using high-throughput techniques. In addition, we discuss challenges in metagenomics as an important part of big data bioinformatics analysis. Copyright © 2015 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.

  18. Toward Genome-Based Metabolic Engineering in Bacteria.

    PubMed

    Oesterle, Sabine; Wuethrich, Irene; Panke, Sven

    2017-01-01

    Prokaryotes modified stably on the genome are of great importance for production of fine and commodity chemicals. Traditional methods for genome engineering have long suffered from imprecision and low efficiencies, making construction of suitable high-producer strains laborious. Here, we review the recent advances in discovery and refinement of molecular precision engineering tools for genome-based metabolic engineering in bacteria for chemical production, with focus on the λ-Red recombineering and the clustered regularly interspaced short palindromic repeats/Cas9 nuclease systems. In conjunction, they enable the integration of in vitro-synthesized DNA segments into specified locations on the chromosome and allow for enrichment of rare mutants by elimination of unmodified wild-type cells. Combination with concurrently developing improvements in important accessory technologies such as DNA synthesis, high-throughput screening methods, regulatory element design, and metabolic pathway optimization tools has resulted in novel efficient microbial producer strains and given access to new metabolic products. These new tools have made and will likely continue to make a big impact on the bioengineering strategies that transform the chemical industry. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. TTSA: An Effective Scheduling Approach for Delay Bounded Tasks in Hybrid Clouds.

    PubMed

    Yuan, Haitao; Bi, Jing; Tan, Wei; Zhou, MengChu; Li, Bo Hu; Li, Jianqiang

    2017-11-01

    The economy of scale provided by the cloud attracts a growing number of organizations and industrial companies to deploy their applications in cloud data centers (CDCs) and to provide services to users around the world. The uncertainty of arriving tasks makes it a big challenge for a private CDC to cost-effectively schedule delay-bounded tasks without exceeding their delay bounds. Unlike previous studies, this paper takes into account the cost minimization problem for a private CDC in hybrid clouds, where the energy price of the private CDC and the execution price of public clouds both show temporal diversity. This paper then proposes a temporal task scheduling algorithm (TTSA) to effectively dispatch all arriving tasks to the private CDC and public clouds. In each iteration of TTSA, the cost minimization problem is modeled as a mixed integer linear program and solved by a hybrid simulated-annealing particle-swarm-optimization method. The experimental results demonstrate that, compared with existing methods, the optimal or suboptimal scheduling strategy produced by TTSA can efficiently increase the throughput and reduce the cost of the private CDC while meeting the delay bounds of all the tasks.
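
    The hybrid simulated-annealing particle-swarm optimizer in TTSA is specific to the paper, but the underlying idea of annealing a task-dispatch vector against differing private and public prices can be sketched generically; the cost model, prices and neighbourhood move below are simplified placeholders, not the authors' formulation.

        import math
        import random

        def anneal_dispatch(task_loads, private_price, public_price, private_capacity,
                            iters=5000, T0=1.0, alpha=0.999):
            """Assign each task to the private CDC (0) or a public cloud (1) so that total
            cost is low while the private capacity is respected (overruns are penalised)."""
            def cost(assign):
                private_load = sum(l for l, a in zip(task_loads, assign) if a == 0)
                c = sum(l * (private_price if a == 0 else public_price)
                        for l, a in zip(task_loads, assign))
                return c + 1e3 * max(0.0, private_load - private_capacity)  # soft capacity penalty

            assign = [random.randint(0, 1) for _ in task_loads]
            best, best_cost, T = list(assign), cost(assign), T0
            for _ in range(iters):
                candidate = list(assign)
                candidate[random.randrange(len(assign))] ^= 1     # flip one task's placement
                delta = cost(candidate) - cost(assign)
                if delta < 0 or random.random() < math.exp(-delta / T):
                    assign = candidate
                    if cost(assign) < best_cost:
                        best, best_cost = list(assign), cost(assign)
                T *= alpha                                        # geometric cooling schedule
            return best, best_cost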

  20. Combining clinical and genomics queries using i2b2 – Three methods

    PubMed Central

    Murphy, Shawn N.; Avillach, Paul; Bellazzi, Riccardo; Phillips, Lori; Gabetta, Matteo; Eran, Alal; McDuffie, Michael T.; Kohane, Isaac S.

    2017-01-01

    We are fortunate to be living in an era of twin biomedical data surges: a burgeoning representation of human phenotypes in the medical records of our healthcare systems, and high-throughput sequencing making rapid technological advances. The difficulty representing genomic data and its annotations has almost by itself led to the recognition of a biomedical “Big Data” challenge, and the complexity of healthcare data only compounds the problem to the point that coherent representation of both systems on the same platform seems insuperably difficult. We investigated the capability for complex, integrative genomic and clinical queries to be supported in the Informatics for Integrating Biology and the Bedside (i2b2) translational software package. Three different data integration approaches were developed: The first is based on Sequence Ontology, the second is based on the tranSMART engine, and the third on CouchDB. These novel methods for representing and querying complex genomic and clinical data on the i2b2 platform are available today for advancing precision medicine. PMID:28388645

  1. [Review of Second Generation Sequencing and Its Application in Forensic Genetics].

    PubMed

    Zhang, S H; Bian, Y N; Zhao, Q; Li, C T

    2016-08-01

    The rapid development of second generation sequencing (SGS) within the past few years has increased data throughput and read length while substantially bringing down the sequencing cost. This has enabled new breakthroughs in biology and ushered forensic genetics into a new era. Based on the history of sequencing applications in forensic genetics, this paper reviews the importance of sequencing technologies for genetic marker detection. The application status and potential of SGS in forensic genetics are discussed based on the already explored SGS platforms of Roche, Illumina and Life Technologies. With these platforms, DNA markers (SNP, STR), RNA markers (mRNA, microRNA) and whole mtDNA can be sequenced. However, development and validation of application kits, maturation of analysis software, connection to existing databases and the possible ethical issues arising with big data will be the key factors that determine whether this technology can substitute or supplement the mature PCR-CE technology and be widely used for casework. Copyright© by the Editorial Department of Journal of Forensic Medicine.

  2. Efficient Computation of Anharmonic Force Constants via q-space, with Application to Graphene

    NASA Astrophysics Data System (ADS)

    Kornbluth, Mordechai; Marianetti, Chris

    We present a new approach for extracting anharmonic force constants from a sparse sampling of the anharmonic dynamical tensor. We calculate the derivative of the energy with respect to q-space displacements (phonons) and strain, which guarantees the absence of supercell image errors. Central finite differences provide a well-converged quadratic error tail for each derivative, separating the contribution of each anharmonic order. These derivatives populate the anharmonic dynamical tensor in a sparse mesh that bounds the Brillouin Zone, which ensures comprehensive sampling of q-space while exploiting small-cell calculations for efficient, high-throughput computation. This produces a well-converged and precisely-defined dataset, suitable for big-data approaches. We transform this sparsely-sampled anharmonic dynamical tensor to real-space anharmonic force constants that obey full space-group symmetries by construction. Machine-learning techniques identify the range of real-space interactions. We show the entire process executed for graphene, up to and including the fifth-order anharmonic force constants. This method successfully calculates strain-based phonon renormalization in graphene, even under large strains, which solves a major shortcoming of previous potentials.
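
    The quadratic error tail of a central finite difference can be seen in a one-dimensional toy example: halving the step size reduces the error of the derivative estimate roughly fourfold. The sketch below uses an arbitrary test function purely for illustration.

        import numpy as np

        def central_difference(f, x, h):
            """Second-order central finite difference for f'(x); the error scales as O(h**2)."""
            return (f(x + h) - f(x - h)) / (2.0 * h)

        f, x = np.sin, 0.7
        exact = np.cos(x)
        for h in (1e-1, 5e-2, 2.5e-2):
            err = abs(central_difference(f, x, h) - exact)
            print(f"h = {h:.4f}  error = {err:.3e}")   # error drops ~4x per halving of h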

  3. Collaborative Biomedicine in the Age of Big Data: The Case of Cancer

    PubMed Central

    Butte, Atul J; Schully, Sheri D; Dalton, William S; Khoury, Muin J; Hesse, Bradford W

    2014-01-01

    Biomedicine is undergoing a revolution driven by high throughput and connective computing that is transforming medical research and practice. Using oncology as an example, the speed and capacity of genomic sequencing technologies is advancing the utility of individual genetic profiles for anticipating risk and targeting therapeutics. The goal is to enable an era of “P4” medicine that will become increasingly more predictive, personalized, preemptive, and participative over time. This vision hinges on leveraging potentially innovative and disruptive technologies in medicine to accelerate discovery and to reorient clinical practice for patient-centered care. Based on a panel discussion at the Medicine 2.0 conference in Boston with representatives from the National Cancer Institute, Moffitt Cancer Center, and Stanford University School of Medicine, this paper explores how emerging sociotechnical frameworks, informatics platforms, and health-related policy can be used to encourage data liquidity and innovation. This builds on the Institute of Medicine’s vision for a “rapid learning health care system” to enable an open source, population-based approach to cancer prevention and control. PMID:24711045

  4. Collaborative biomedicine in the age of big data: the case of cancer.

    PubMed

    Shaikh, Abdul R; Butte, Atul J; Schully, Sheri D; Dalton, William S; Khoury, Muin J; Hesse, Bradford W

    2014-04-07

    Biomedicine is undergoing a revolution driven by high throughput and connective computing that is transforming medical research and practice. Using oncology as an example, the speed and capacity of genomic sequencing technologies is advancing the utility of individual genetic profiles for anticipating risk and targeting therapeutics. The goal is to enable an era of "P4" medicine that will become increasingly more predictive, personalized, preemptive, and participative over time. This vision hinges on leveraging potentially innovative and disruptive technologies in medicine to accelerate discovery and to reorient clinical practice for patient-centered care. Based on a panel discussion at the Medicine 2.0 conference in Boston with representatives from the National Cancer Institute, Moffitt Cancer Center, and Stanford University School of Medicine, this paper explores how emerging sociotechnical frameworks, informatics platforms, and health-related policy can be used to encourage data liquidity and innovation. This builds on the Institute of Medicine's vision for a "rapid learning health care system" to enable an open source, population-based approach to cancer prevention and control.

  5. A perspective on 10-years HTS experience at the Walter and Eliza Hall Institute of Medical Research - eighteen million assays and counting.

    PubMed

    Lackovic, Kurt; Lessene, Guillaume; Falk, Hendrik; Leuchowius, Karl-Johan; Baell, Jonathan; Street, Ian

    2014-03-01

    The Walter and Eliza Hall Institute of Medical Research (WEHI) is Australia's longest serving medical research institute. WEHI's High Throughput Screening (HTS) Facility was established in 2003 with $5 million of infrastructure funds invested by WEHI, and the Victorian State Government's Strategic Technology Initiative through Bio21 Australia Ltd. The Facility was Australia's first truly academic HTS facility and was one of only a handful operating in publicly funded institutions worldwide at that time. The objectives were to provide access to enabling HTS technologies, such as assay design, liquid handling automation, compound libraries and expertise to promote translation of basic research in a national setting that has a relatively young biotech sector and does not have a big Pharma research presence. Ten years on and the WEHI HTS Facility has participated in over 92 collaborative projects, generated over 18 million data points, and most importantly, projects that began in the Facility have been commercialized successfully (due to strong ties with Business Development and emphasis on intellectual property management) and now have molecules progressing in clinical trials.

  6. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    PubMed Central

    Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

  7. Effects of Airport Tower Controller Decision Support Tool on Controllers Head-Up Time

    NASA Technical Reports Server (NTRS)

    Hayashi, Miwa; Cruz Lopez, Jose M.

    2013-01-01

    Although aircraft positions and movements can easily be monitored on radar displays at major airports nowadays, it is still important for the air traffic control tower (ATCT) controllers to look outside the window as much as possible to ensure safe traffic management operations. The present paper investigates whether the introduction of NASA's proposed Spot and Runway Departure Advisor (SARDA), a decision support tool for the ATCT controller, would increase or decrease the controllers' head-up time. SARDA provides the controller with departure-release schedule advisories, i.e., when to release each departure aircraft in order to minimize each aircraft's fuel consumption on taxiways and simultaneously maximize the overall runway throughput. The SARDA advisories were presented on electronic flight strips (EFS). To investigate effects on the head-up time, a human-in-the-loop simulation experiment with two retired ATCT controller participants was conducted in a high-fidelity ATCT cab simulator with a 360-degree computer-generated out-the-window view. Each controller participant wore a wearable video camera on the side of their head with the camera facing forward. The video data were later used to calculate their line of sight at each moment and eventually identify their head-up times. Four sessions were run with the SARDA advisories, and four sessions were run without (baseline). Traffic-load levels were varied in each session. The same user interface - the EFS and the radar displays - was used in both the advisory and baseline sessions to make them directly comparable. The paper reports the findings and discusses their implications.

  8. Automated registration of tail bleeding in rats.

    PubMed

    Johansen, Peter B; Henriksen, Lars; Andresen, Per R; Lauritzen, Brian; Jensen, Kåre L; Juhl, Trine N; Tranholm, Mikael

    2008-05-01

    An automated system for registration of tail bleeding in rats using a camera and a user-designed PC-based software program has been developed. The live and processed images are displayed on the screen and are exported together with a text file for later statistical processing of the data, allowing calculation of, e.g., the number of bleeding episodes, bleeding times and bleeding areas. Proof-of-principle was achieved when the camera captured the blood stream after infusion of rat whole blood into saline. Suitability was assessed by recording bleeding profiles in heparin-treated rats, demonstrating that the system was able to capture on/off bleedings and that the data transfer and analysis were conducted successfully. Then, bleeding profiles were visually recorded by two independent observers simultaneously with the automated recordings after tail transection in untreated rats. Linear relationships were found in the number of bleedings, demonstrating, however, a statistically significant difference in the recording of bleeding episodes between observers. Also, the bleeding time was longer for visual compared to automated recording. No correlation was found between blood loss and bleeding time in untreated rats, but in heparinized rats a correlation was suggested. Finally, the blood loss correlated with the automated recording of bleeding area. In conclusion, the automated system has proven suitable for replacing visual recordings of tail bleedings in rats. Inter-observer differences can be eliminated, monotonous repetitive work avoided, and a higher throughput of animals achieved in less time. The automated system will lead to an increased understanding of the nature of bleeding following tail transection in different rodent models.
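
    The kind of image processing described above, turning camera frames into bleeding areas, can be illustrated with a simple colour-threshold sketch in OpenCV; the HSV threshold values and the pixel-to-area calibration are placeholders, not those of the published system.

        import cv2
        import numpy as np

        def bleeding_area_mm2(frame_bgr, mm2_per_pixel,
                              lower=(0, 120, 60), upper=(12, 255, 255)):
            """Estimate the blood-covered area visible in one video frame by thresholding
            red hues in HSV space and counting the remaining pixels."""
            hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
            mask = cv2.inRange(hsv, np.array(lower, np.uint8), np.array(upper, np.uint8))
            mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))  # remove speckle
            return int(cv2.countNonZero(mask)) * mm2_per_pixel

    A bleeding episode could then be defined as a run of consecutive frames whose estimated area stays above a small threshold, which is one way on/off bleeding patterns would fall out of such a per-frame measure.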

  9. KiwiSpec - an advanced spectrograph for high resolution spectroscopy: prototype design and performance

    NASA Astrophysics Data System (ADS)

    Gibson, Steve; Barnes, Stuart I.; Hearnshaw, John; Nield, Kathryn; Cochrane, Dave; Grobler, Deon

    2012-09-01

    A new advanced high resolution spectrograph has been developed by Kiwistar Optics of Industrial Research Ltd., New Zealand. The instrument, KiwiSpec R4-100, is bench-mounted, fibre-fed, compact (0.75m by 1.5m footprint), and is well-suited for small to medium-sized telescopes. The instrument makes use of several advanced concepts in high resolution spectrograph design. The basic design follows the classical white pupil concept in an asymmetric implementation and employs an R4 echelle grating illuminated by a 100mm diameter collimated beam for primary dispersion. A volume phase holographic grating (VPH) based grism is used for cross-dispersion. The design also allows for up to four camera and detector channels to allow for extended wavelength coverage at high efficiency. A single channel prototype of the instrument has been built and successfully tested with a 1m telescope. Targets included various spectrophotometric standard stars and several radial velocity standard stars to measure the instrument's light throughput and radial velocity capabilities. The prototype uses a 725 lines/mm VPH grism, an off-the-shelf camera objective, and a 2k×2k CCD. As such, it covers the wavelength range from 420nm to 660nm and has a resolving power of R ≈ 40,000. Spectrophotometric and precision radial velocity results from the on-sky testing period will be reported, as well as results of laboratory-based measurements. The optical design of KiwiSpec, and the various multi-channel design options, will be presented elsewhere in these proceedings.

  10. CTER-rapid estimation of CTF parameters with error assessment.

    PubMed

    Penczek, Pawel A; Fang, Jia; Li, Xueming; Cheng, Yifan; Loerke, Justus; Spahn, Christian M T

    2014-05-01

    In structural electron microscopy, the accurate estimation of the Contrast Transfer Function (CTF) parameters, particularly defocus and astigmatism, is of utmost importance for both initial evaluation of micrograph quality and for subsequent structure determination. Due to increases in the rate of data collection on modern microscopes equipped with new generation cameras, it is also important that the CTF estimation can be done rapidly and with minimal user intervention. Finally, in order to minimize the necessity for manual screening of the micrographs by a user, it is necessary to provide an assessment of the errors of the fitted parameter values. In this work we introduce CTER, a CTF parameter estimation method distinguished by its computational efficiency. The efficiency of the method makes it suitable for high-throughput EM data collection, and enables the use of a statistical resampling technique, bootstrap, that yields standard deviations of estimated defocus and astigmatism amplitude and angle, thus facilitating the automation of the process of screening out inferior micrograph data. Furthermore, CTER also outputs the spatial frequency limit imposed by reciprocal space aliasing of the discrete form of the CTF and the finite window size. We demonstrate the efficiency and accuracy of CTER using a data set collected on a 300kV Tecnai Polara (FEI) using the K2 Summit DED camera in super-resolution counting mode. Using CTER we obtained a structure of the 80S ribosome whose large subunit had a resolution of 4.03Å without, and 3.85Å with, inclusion of astigmatism parameters. Copyright © 2014 Elsevier B.V. All rights reserved.
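
    As a minimal illustration of the bootstrap error assessment mentioned above, the sketch below resamples a set of per-segment defocus estimates and reports the spread of the resampled means; the input values and sample counts are placeholders, not CTER's internals.

        import numpy as np

        def bootstrap_std(estimates, n_boot=1000, seed=0):
            """Standard deviation of the mean of `estimates` under bootstrap resampling."""
            rng = np.random.default_rng(seed)
            x = np.asarray(estimates, dtype=float)
            means = [rng.choice(x, size=x.size, replace=True).mean() for _ in range(n_boot)]
            return float(np.std(means))

        # per-segment defocus estimates (micrometers) from one hypothetical micrograph
        defocus = [2.31, 2.28, 2.35, 2.30, 2.33, 2.27]
        print("defocus = %.3f +/- %.3f um" % (np.mean(defocus), bootstrap_std(defocus)))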

  11. Vacuum Nuller Testbed Performance, Characterization and Null Control

    NASA Technical Reports Server (NTRS)

    Lyon, R. G.; Clampin, M.; Petrone, P.; Mallik, U.; Madison, T.; Bolcar, M.; Noecker, C.; Kendrick, S.; Helmbrecht, M. A.

    2011-01-01

    The Visible Nulling Coronagraph (VNC) can detect and characterize exoplanets with filled, segmented and sparse aperture telescopes, thereby spanning the choice of future internal coronagraph exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has developed a Vacuum Nuller Testbed (VNT) to advance this approach, and to assess and advance the technologies needed to realize a VNC as a flight instrument. The VNT is an ultra-stable testbed operating at 15 Hz in vacuum. It consists of a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror (DM), a coherent fiber bundle and achromatic phase shifters. The two output channels are imaged with a vacuum photon-counting camera and a conventional camera. Error sensing and feedback to the DM and delay line with control algorithms are implemented in a real-time architecture. The inherent advantage of the VNC is that it is its own interferometer and directly controls its errors by exploiting images from the bright and dark channels simultaneously. Conservation of energy requires that the sum total of the photon counts be conserved independent of the VNC state. Thus the sensing and control bandwidth is limited by the target star's throughput, with the net effect that the higher bandwidth offloads stressing stability tolerances within the telescope. We report our recent progress with the VNT towards achieving an incremental sequence of contrast milestones of 10^8, 10^9 and 10^10, respectively, at inner working angles approaching 2λ/D. Discussed will be the optics, lab results, technologies, and null control. Shown will be evidence that the milestones have been achieved.

  12. Human Visual System-Based Fundus Image Quality Assessment of Portable Fundus Camera Photographs.

    PubMed

    Wang, Shaoze; Jin, Kai; Lu, Haitong; Cheng, Chuming; Ye, Juan; Qian, Dahong

    2016-04-01

    Telemedicine and the medical "big data" era in ophthalmology highlight the use of non-mydriatic ocular fundus photography, which has given rise to indispensable applications of portable fundus cameras. However, in the case of portable fundus photography, non-mydriatic image quality is more vulnerable to distortions, such as uneven illumination, color distortion, blur, and low contrast. Such distortions are called generic quality distortions. This paper proposes an algorithm capable of selecting images of fair generic quality that would be especially useful to assist inexperienced individuals in collecting meaningful and interpretable data with consistency. The algorithm is based on three characteristics of the human visual system--multi-channel sensation, just noticeable blur, and the contrast sensitivity function to detect illumination and color distortion, blur, and low contrast distortion, respectively. A total of 536 retinal images, 280 from proprietary databases and 256 from public databases, were graded independently by one senior and two junior ophthalmologists, such that three partial measures of quality and generic overall quality were classified into two categories. Binary classification was implemented by the support vector machine and the decision tree, and receiver operating characteristic (ROC) curves were obtained and plotted to analyze the performance of the proposed algorithm. The experimental results revealed that the generic overall quality classification achieved a sensitivity of 87.45% at a specificity of 91.66%, with an area under the ROC curve of 0.9452, indicating the value of applying the algorithm, which is based on the human vision system, to assess the image quality of non-mydriatic photography, especially for low-cost ophthalmological telemedicine applications.
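
    A hedged sketch of the final classification step described above (binary quality labels predicted from hand-crafted features and evaluated with an ROC curve), using scikit-learn; the feature extraction itself is omitted and the arrays below are random placeholders.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        # X: one row per fundus image; columns stand in for illumination, blur and contrast features
        rng = np.random.default_rng(0)
        X = rng.normal(size=(536, 3))
        y = rng.integers(0, 2, size=536)          # 1 = acceptable quality, 0 = reject

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = SVC(kernel="rbf", probability=True).fit(X_tr, y_tr)
        scores = clf.predict_proba(X_te)[:, 1]
        print("AUC =", roc_auc_score(y_te, scores))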

  13. Efficient coding and detection of ultra-long IDs for visible light positioning systems.

    PubMed

    Zhang, Hualong; Yang, Chuanchuan

    2018-05-14

    Visible light positioning (VLP) is a promising technique to complement Global Navigation Satellite Systems (GNSS) such as the Global Positioning System (GPS) and the BeiDou Navigation Satellite System (BDS), with the advantages of low cost and high accuracy. The situation becomes even more crucial in indoor environments, where satellite signals are weak or even unavailable. For large-scale application of VLP, there would be a considerable number of light emitting diode (LED) IDs, which creates a need to detect long LED IDs. In particular, to provision indoor localization globally, a convenient way is to program a unique ID into each LED during manufacture. This poses a big challenge for image sensors, such as the CMOS cameras in everybody's hands, since a long ID spans multiple frames. In this paper, we investigate the detection of ultra-long IDs using rolling shutter cameras. By analyzing the pattern of data loss in each frame, we propose a novel coding technique to improve the efficiency of LED ID detection. We studied the performance of the Reed-Solomon (RS) code in this system and designed a new coding method that considers the trade-off between performance and decoding complexity. The coding technique decreases the number of frames needed in data processing, significantly reduces the detection time, and improves the accuracy of detection. Numerical and experimental results show that the detected LED ID can be much longer with the coding technique. Besides, our proposed coding method is proved to achieve a performance close to that of the RS code while the decoding complexity is much lower.
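
    To illustrate the general idea of protecting a long LED ID against per-frame data loss with a Reed-Solomon code (not the authors' specific code design), here is a small sketch using the reedsolo Python package; the ID length and parity count are arbitrary.

        from reedsolo import RSCodec

        led_id = bytes(range(32))        # a 32-byte "ultra-long" LED ID (placeholder)
        rsc = RSCodec(16)                # 16 parity bytes: corrects up to 8 unknown byte errors

        codeword = rsc.encode(led_id)    # transmitted across several camera frames

        corrupted = bytearray(codeword)  # simulate loss of part of one frame
        corrupted[5:10] = b"\x00" * 5

        result = rsc.decode(corrupted)
        # recent reedsolo versions return (message, message + ecc, errata positions)
        recovered = result[0] if isinstance(result, tuple) else result
        assert bytes(recovered) == led_id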

  14. Space Science

    NASA Image and Video Library

    2002-01-01

    Pictured is the chosen artist's rendering of NASA's next generation space telescope, a successor to the Hubble Space Telescope, which was named the James Webb Space Telescope (JWST) in honor of NASA's second administrator, James E. Webb. To further our understanding of the way our present universe formed following the big bang, NASA is developing the JWST to observe the first stars and galaxies in the universe. This grand effort will help to answer the following fundamental questions: how galaxies form and evolve, how stars and planetary systems form and interact, how the universe builds up its present elemental/chemical composition, and what dark matter is. To see into the depths of space, the JWST is currently planned to carry instruments that are sensitive to the infrared wavelengths of the electromagnetic spectrum. The new telescope will carry a near-infrared camera, a multi-object spectrometer, and a mid-infrared camera/spectrometer. The JWST is scheduled for launch in 2010 aboard an expendable launch vehicle. It will take about 3 months for the spacecraft to reach its destination, an orbit of 940,000 miles in space. Marshall Space Flight Center (MSFC) is supporting Goddard Space Flight Center (GSFC) in developing the JWST by creating an ultra-lightweight mirror for the telescope at MSFC's Space Optics Manufacturing Technology Center. GSFC, Greenbelt, Maryland, manages the JWST, and TRW will design and fabricate the observatory's primary mirror and spacecraft. The program has a number of industry, academic, and government partners, as well as the European Space Agency and the Canadian Space Agency. (Image: Courtesy of TRW)

  15. Study of Thermal Anomalies at Cotopaxi Volcano, 2002 to 2005

    NASA Astrophysics Data System (ADS)

    Rivero, D. R.; Beate, B.; Troncoso, L.; Ramón, P.

    2007-05-01

    The Instituto Geofisico of the Escuela Politecnica Nacional (IG-EPN) has maintained continuous monitoring since 1977, allowing a better understanding of the volcano's baseline activity. Preliminary signs, observed since 2001, of a possible reactivation of this volcano after more than a century of repose prompted a comprehensive seismological study and the implementation of new monitoring methods, based mainly upon a general increase in seismic activity (VT and LP); the appearance of new types of seismic signals never observed before (hybrids, "tornillos", big LPs, and tremor); an increase in the number and discharge of fumaroles; and a marked thermal anomaly in the summit region. Seismic activity reached its peak in late 2001 / early 2002 and was correlated with enhanced degassing from the crater, with vapor columns reaching some meters above the crater level and abundant SO2 perceived. In this abstract we show evidence of a magmatic intrusion (Troncoso, 2005) that has disturbed the hydrothermal system present in the cone and is melting the glacier. This has generated concern among the local population and civil defense authorities. Since this stage of activity, Cotopaxi has not yet returned to its baseline level; therefore, the newly implemented technology includes periodic overflights with a FLIR camera, which permits the localization and identification of thermal anomalies. Additionally, a telemetric video camera has been deployed on the northwest rim of the crater to identify degassing changes and their relationship with seismic events. Finally, IG-EPN staff make continuous visits to the crater to observe changes in the ice cap, measure temperatures and verify the presence of magmatic gases.

  16. NASA Telescopes Help Discover Surprisingly Young Galaxy

    NASA Image and Video Library

    2017-12-08

    NASA image release April 12, 2011 Astronomers have uncovered one of the youngest galaxies in the distant universe, with stars that formed 13.5 billion years ago, a mere 200 million years after the Big Bang. The finding addresses questions about when the first galaxies arose, and how the early universe evolved. NASA's Hubble Space Telescope was the first to spot the newfound galaxy. Detailed observations from the W.M. Keck Observatory on Mauna Kea in Hawaii revealed the observed light dates to when the universe was only 950 million years old; the universe formed about 13.7 billion years ago. Infrared data from both Hubble and NASA's Spitzer Space Telescope revealed the galaxy's stars are quite mature, having formed when the universe was just a toddler at 200 million years old. The galaxy's image is being magnified by the gravity of a massive cluster of galaxies (Abell 383) parked in front of it, making it appear 11 times brighter. This phenomenon is called gravitational lensing. Hubble imaged the lensing galaxy Abell 383 with the Wide Field Camera 3 and the Advanced Camera for Surveys in November 2010 through March 2011. Credit: NASA, ESA, J. Richard (Center for Astronomical Research/Observatory of Lyon, France), and J.-P. Kneib (Astrophysical Laboratory of Marseille, France)

  17. Reducing the threat of wildlife-vehicle collisions during peak tourism periods using a Roadside Animal Detection System.

    PubMed

    Grace, Molly K; Smith, Daniel J; Noss, Reed F

    2017-12-01

    Roadside Animal Detection Systems (RADS) aim to reduce the frequency of wildlife-vehicle collisions. Unlike fencing and wildlife passages, RADS do not attempt to keep animals off the road; rather, they attempt to modify driver behavior by detecting animals near the road and warning drivers with flashing signs. A RADS was installed in Big Cypress National Park (Florida, USA) in 2012 in response to an increased number of Florida panther mortalities. To assess driver response, we measured the speed of individual cars on the road when the RADS was active (flashing) and inactive (not flashing) during the tourist season (November-March) and the off-season (April-October), which vary dramatically in traffic volume. We also used track beds and camera traps to assess whether roadside activity of large mammal species varied between seasons. In the tourist season, the activation of the RADS caused a significant reduction in vehicle speed. However, this effect was not observed in the off-season. Track and camera data showed that the tourist season coincided with peak periods of activity for several large mammals of conservation interest. Drivers in the tourist season generally drove faster than those in the off-season, so a reduction in speed in response to the RADS is more beneficial in the tourist season. Because traffic volume and roadside activity of several species of conservation interest both peak during the tourist season, our study indicates that the RADS has the potential to reduce the number of accidents during this period of heightened risk. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Big Opportunities and Big Concerns of Big Data in Education

    ERIC Educational Resources Information Center

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  19. Increased plasma levels of big-endothelin-2 and big-endothelin-3 in patients with end-stage renal disease.

    PubMed

    Miyauchi, Yumi; Sakai, Satoshi; Maeda, Seiji; Shimojo, Nobutake; Watanabe, Shigeyuki; Honma, Satoshi; Kuga, Keisuke; Aonuma, Kazutaka; Miyauchi, Takashi

    2012-10-15

    Big endothelins (pro-endothelin; inactive-precursor) are converted to biologically active endothelins (ETs). Mammals and humans produce three ET family members: ET-1, ET-2 and ET-3, from three different genes. Although ET-1 is produced by vascular endothelial cells, these cells do not produce ET-3, which is produced by neuronal cells and organs such as the thyroid, salivary gland and the kidney. In patients with end-stage renal disease, abnormal vascular endothelial cell function and elevated plasma ET-1 and big ET-1 levels have been reported. It is unknown whether big ET-2 and big ET-3 plasma levels are altered in these patients. The purpose of the present study was to determine whether endogenous ET-1, ET-2, and ET-3 systems including big ETs are altered in patients with end-stage renal disease. We measured plasma levels of ET-1, ET-3 and big ET-1, big ET-2, and big ET-3 in patients on chronic hemodialysis (n=23) and age-matched healthy subjects (n=17). In patients on hemodialysis, plasma levels (measured just before hemodialysis) of both ET-1 and ET-3 and big ET-1, big ET-2, and big ET-3 were markedly elevated, and the increase was higher for big ETs (Big ET-1, 4-fold; big ET-2, 6-fold; big ET-3: 5-fold) than for ETs (ET-1, 1.7-fold; ET-3, 2-fold). In hemodialysis patients, plasma levels of the inactive precursors big ET-1, big ET-2, and big ET-3 levels are markedly increased, yet there is only a moderate increase in plasma levels of the active products, ET-1 and ET-3. This suggests that the activity of endothelin converting enzyme contributing to circulating levels of ET-1 and ET-3 may be decreased in patients on chronic hemodialysis. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. Big Data: Implications for Health System Pharmacy

    PubMed Central

    Stokes, Laura B.; Rogers, Joseph W.; Hertig, John B.; Weber, Robert J.

    2016-01-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services. PMID:27559194

  1. Big Data: Implications for Health System Pharmacy.

    PubMed

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  2. Vacuum-Compatible Wideband White Light and Laser Combiner Source System

    NASA Technical Reports Server (NTRS)

    Azizi, Alineza; Ryan, Daniel J.; Tang, Hong; Demers, Richard T.; Kadogawa, Hiroshi; An, Xin; Sun, George Y.

    2010-01-01

    For the Space Interferometry Mission (SIM) Spectrum Calibration Development Unit (SCDU) testbed, wideband white light is used to simulate starlight. The white light source mount requires extremely stable pointing accuracy (<3.2 microradians). To meet this and other needs, the laser light from a single-mode fiber was combined, through a beam splitter window with a special coating for broadband wavelengths, with light from a multimode fiber. Both lights were coupled into a photonic crystal fiber (PCF). In many optical systems, simulating a point star with a broadband spectrum at microradian stability for white light interferometry is a challenge. In this case, the cameras use the white light interference to balance two optical paths and to maintain close tracking. In order to coarsely align the optical paths, a laser light is sent into the system to allow tracking of fringes, because a narrow-band laser has a much larger range of interference. The design requirements forced the innovators to use a new type of optical fiber, and to take great care in aligning the input sources. The testbed required better than 1% throughput, or enough output power on the lowest spectrum to be detectable by the CCD camera (6 nW at the camera). The system needed to be vacuum-compatible and to have the capability for combining a visible laser light at any time for calibration purposes. The red laser is a commercially produced 635-nm, 5-mW laser diode, and the white light source is a commercially produced tungsten halogen lamp that gives a broad spectrum of about 525 to 800 nm full width at half maximum (FWHM), with about 1.4 mW of power at 630 nm. A custom-made beam splitter window with a special coating for broadband wavelengths is used with the white light input via a 50-µm multi-mode fiber. The large mode area PCF is an LMA-8 made by Crystal Fibre (core diameter of 8.5 µm, mode field diameter of 6 µm, and numerical aperture at 625 nm of 0.083). Any science interferometer that needs a tracking laser fringe to assist in alignment can use this system.

  3. Foliar Temperature Gradients as Drivers of Budburst in Douglas-fir: New Applications of Thermal Infrared Imagery

    NASA Astrophysics Data System (ADS)

    Miller, R.; Lintz, H. E.; Thomas, C. K.; Salino-Hugg, M. J.; Niemeier, J. J.; Kruger, A.

    2014-12-01

    Budburst, the initiation of annual growth in plants, is sensitive to climate and is used to monitor physiological responses to climate change. Accurately forecasting budburst response to these changes demands an understanding of the drivers of budburst. Current research and predictive models focus on population or landscape-level drivers, yet fundamental questions regarding drivers of budburst diversity within an individual tree remain unanswered. We hypothesize that foliar temperature, an important physiological property, may be a dominant driver of differences in the timing of budburst within a single tree. Studying these differences facilitates development of high throughput phenotyping technology used to improve predictive budburst models. We present spatial and temporal variation in foliar temperature as a function of physical drivers culminating in a single-tree budburst model based on foliar temperature. We use a novel remote sensing approach, combined with on-site meteorological measurements, to demonstrate important intra-canopy differences between air and foliar temperature. We mounted a thermal infrared camera within an old-growth canopy at the H.J. Andrews LTER forest and imaged an 8m by 10.6m section of a Douglas-fir crown. Sampling one image per minute, approximately 30,000 thermal infrared images were collected over a one-month period to approximate foliar temperature before, during and after budburst. Using time-lapse photography in the visible spectrum, we documented budburst at fifteen-minute intervals with eight cameras stratified across the thermal infrared camera's field of view. Within the imaged tree's crown, we installed a pyranometer, 2D sonic anemometer and fan-aspirated thermohygrometer and collected 3,000 measurements of net shortwave radiation, wind speed, air temperature and relative humidity. We documented a difference of several days in the timing of budburst across both vertical and horizontal gradients. We also observed clear spatial and temporal foliar temperature gradients. In addition to exploring physical drivers of budburst, this remote sensing approach provides insight into intra-canopy structural complexity and opportunities to advance our understanding of vegetation-atmospheric interactions.

  4. Galaxies Gather at Great Distances

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [Figures removed: Distant Galaxy Cluster Infrared Survey poster; bird's eye view mosaic, with and without cluster markers; close-up panels of clusters at 9.1, 8.7, and 8.6 billion light-years.]

    Astronomers have discovered nearly 300 galaxy clusters and groups, including almost 100 located 8 to 10 billion light-years away, using the space-based Spitzer Space Telescope and the ground-based Mayall 4-meter telescope at Kitt Peak National Observatory in Tucson, Ariz. The new sample represents a six-fold increase in the number of known galaxy clusters and groups at such extreme distances, and will allow astronomers to systematically study massive galaxies two-thirds of the way back to the Big Bang.

    A mosaic portraying a bird's eye view of the field in which the distant clusters were found is shown at upper left. It spans a region of sky 40 times larger than that covered by the full moon as seen from Earth. Thousands of individual images from Spitzer's infrared array camera instrument were stitched together to create this mosaic. The distant clusters are marked with orange dots.

    Close-up images of three of the distant galaxy clusters are shown in the adjoining panels. The clusters appear as a concentration of red dots near the center of each image. These images reveal the galaxies as they were over 8 billion years ago, since that's how long their light took to reach Earth and Spitzer's infrared eyes.

    These pictures are false-color composites, combining ground-based optical images captured by the Mosaic-I camera on the Mayall 4-meter telescope at Kitt Peak, with infrared pictures taken by Spitzer's infrared array camera. Blue and green represent visible light at wavelengths of 0.4 microns and 0.8 microns, respectively, while red indicates infrared light at 4.5 microns.

    Kitt Peak National Observatory is part of the National Optical Astronomy Observatory in Tucson, Ariz.

  5. Opportunity's Path

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This Long Term Planning graphic was created from a mosaic of navigation camera images overlain by a polar coordinate grid with the center point as Opportunity's original landing site. The blue dots represent the rover position at various locations.

    The red dots represent the center points of the target areas for the instruments on the rover mast (the panoramic camera and miniature thermal emission spectrometer). Opportunity visited Stone Mountain on Feb. 5. Stone Mountain was named after the southernmost point of the Appalachian Mountains outside of Atlanta, Ga. On Earth, Stone Mountain is the last big mountain before the Piedmont flatlands, and on Mars, Stone Mountain is at one end of Opportunity Ledge. El Capitan is a target of interest on Mars named after the second highest peak in Texas in Guadalupe Mountains National Park, which is one of the most visited outcrops in the United States by geologists. It has been a training ground for students and professional geologists to understand what the layering means in relation to the formation of Earth, and scientists will study this prominent point of Opportunity Ledge to understand what the layering means on Mars.

    The yellow lines show the midpoint where the panoramic camera has swept and will sweep a 120-degree area from the three waypoints on the tour of the outcrop. Imagine a fan-shaped wedge from left to right of the yellow line.

    The white contour lines are one meter apart, and each drive has been roughly about 2-3 meters in length over the last few sols. The large white blocks are dropouts in the navigation camera data.

    Opportunity is driving along and taking a photographic panorama of the entire outcrop. Scientists will stitch together these images and use the new mosaic as a 'base map' to decide on geology targets of interest for a more detailed study of the outcrop using the instruments on the robotic arm. Once scientists choose their targets of interest, they plan to study the outcrop for roughly five to fifteen sols. This will include El Capitan and probably one to two other areas.

    Blue Dot Dates:
    Sol 7 / Jan 31 = Egress & first soil data collected by instruments on the arm
    Sol 9 / Feb 2 = Second Soil Target
    Sol 12 / Feb 5 = First Rock Target
    Sol 16 / Feb 9 = Alpha Waypoint
    Sol 17 / Feb 10 = Bravo Waypoint
    Sol 19 or 20 / Feb 12 or 13 = Charlie Waypoint

  6. High-Throughput Phenotyping of Plant Height: Comparing Unmanned Aerial Vehicles and Ground LiDAR Estimates.

    PubMed

    Madec, Simon; Baret, Fred; de Solan, Benoît; Thomas, Samuel; Dutartre, Dan; Jezequel, Stéphane; Hemmerlé, Matthieu; Colombeau, Gallian; Comar, Alexis

    2017-01-01

    The capacity of LiDAR and Unmanned Aerial Vehicles (UAVs) to provide plant height estimates as a high-throughput plant phenotyping trait was explored. An experiment on wheat genotypes grown under well-watered and water-stress modalities was conducted. Frequent LiDAR measurements were performed along the growth cycle using a phénomobile unmanned ground vehicle. A UAV equipped with a high resolution RGB camera flew over the experiment several times to retrieve the digital surface model from structure from motion techniques. Both techniques provide a 3D dense point cloud from which the plant height can be estimated. Plant height was first defined as the z-value below which 99.5% of the points of the dense cloud lie. This provides good consistency with manual measurements of plant height (RMSE = 3.5 cm) while minimizing the variability along each microplot. Results show that LiDAR and structure from motion plant height values are always consistent. However, a slight under-estimation is observed for structure from motion techniques, in relation with the coarser spatial resolution of UAV imagery and the limited penetration capacity of structure from motion as compared to LiDAR. Very high heritability values (H2 > 0.90) were found for both techniques when lodging was not present. The dynamics of plant height show that it carries pertinent information regarding the period and magnitude of plant stress. Further, the date when the maximum plant height is reached was found to be very heritable (H2 > 0.88) and a good proxy of the flowering stage. Finally, the capacity of plant height as a proxy for total above-ground biomass and yield is discussed.
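
    A minimal sketch of the height metric described above, i.e., the z-value below which 99.5% of the dense-cloud points lie, measured relative to a crude ground reference; the ground-level estimate and array layout are simplifying assumptions.

        import numpy as np

        def plant_height(points, ground_percentile=1.0, top_percentile=99.5):
            """points: (N, 3) array of x, y, z coordinates; returns height in z units."""
            z = points[:, 2]
            ground = np.percentile(z, ground_percentile)   # crude ground estimate
            top = np.percentile(z, top_percentile)         # 99.5% of points lie below this
            return float(top - ground)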

  7. Real-time quantitative fluorescence measurement of microscale cell culture analog systems

    NASA Astrophysics Data System (ADS)

    Oh, Taek-il; Kim, Donghyun; Tatosian, Daniel; Sung, Jong Hwan; Shuler, Michael

    2007-02-01

    A microscale cell culture analog (μCCA) is a cell-based lab-on-a-chip assay that, as an animal surrogate, is applied to pharmacological studies for toxicology tests. A μCCA typically comprises multiple chambers and the microfluidics that connect them, which represent animal organs and blood flow to mimic animal metabolism more realistically. A μCCA is expected to provide a tool for high-throughput drug discovery. Previously, a portable fluorescence detection system was investigated for a single μCCA device in real-time. In this study, we present a fluorescence-based imaging system that provides quantitative real-time data on the metabolic interactions in μCCAs, with an emphasis on measuring multiple μCCA samples simultaneously for high-throughput screening. The detection system is based on discrete optics components, with a high-power LED and a charge-coupled device (CCD) camera as the light source and detector, for monitoring cellular status in the chambers of each μCCA sample. Multiple samples are handled mechanically on a fully automated motorized linear stage. Each μCCA sample has four chambers, in which the cell lines MES-SA/DX-5 and MES-SA (tumor cells of human uterus) have been cultured. All cell lines have been transfected to express the fusion protein H2B-GFP, which is a human histone protein fused at the amino terminus to EGFP. As a model cytotoxic drug, 10 μM doxorubicin (DOX) was used. Real-time quantitative data of the intensity loss of enhanced green fluorescent protein (EGFP) during cell death of target cells have been collected over periods of several minutes to 40 hours. Design issues and improvements are also discussed.
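
    A sketch of the kind of quantitative read-out described above: background-subtracted mean green-channel (EGFP) intensity inside a chamber region of interest, tracked frame by frame; the ROI convention and background handling are assumptions for illustration.

        import numpy as np

        def roi_mean_intensity(frame, roi):
            """frame: 2-D green-channel image; roi: (row0, row1, col0, col1)."""
            r0, r1, c0, c1 = roi
            return float(frame[r0:r1, c0:c1].mean())

        def intensity_trace(frames, roi, background=0.0):
            """Background-subtracted mean EGFP intensity at each time point."""
            return np.array([roi_mean_intensity(f, roi) - background for f in frames])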

  8. High-Throughput Phenotyping of Plant Height: Comparing Unmanned Aerial Vehicles and Ground LiDAR Estimates

    PubMed Central

    Madec, Simon; Baret, Fred; de Solan, Benoît; Thomas, Samuel; Dutartre, Dan; Jezequel, Stéphane; Hemmerlé, Matthieu; Colombeau, Gallian; Comar, Alexis

    2017-01-01

    The capacity of LiDAR and Unmanned Aerial Vehicles (UAVs) to provide plant height estimates as a high-throughput plant phenotyping trait was explored. An experiment on wheat genotypes grown under well-watered and water-stress modalities was conducted. Frequent LiDAR measurements were performed along the growth cycle using a phénomobile unmanned ground vehicle. A UAV equipped with a high resolution RGB camera flew over the experiment several times to retrieve the digital surface model from structure from motion techniques. Both techniques provide a 3D dense point cloud from which the plant height can be estimated. Plant height was first defined as the z-value below which 99.5% of the points of the dense cloud lie. This provides good consistency with manual measurements of plant height (RMSE = 3.5 cm) while minimizing the variability along each microplot. Results show that LiDAR and structure from motion plant height values are always consistent. However, a slight under-estimation is observed for structure from motion techniques, in relation with the coarser spatial resolution of UAV imagery and the limited penetration capacity of structure from motion as compared to LiDAR. Very high heritability values (H2 > 0.90) were found for both techniques when lodging was not present. The dynamics of plant height show that it carries pertinent information regarding the period and magnitude of plant stress. Further, the date when the maximum plant height is reached was found to be very heritable (H2 > 0.88) and a good proxy of the flowering stage. Finally, the capacity of plant height as a proxy for total above ground biomass and yield is discussed. PMID:29230229

  9. High risk of lead contamination for scavengers in an area with high moose hunting success.

    PubMed

    Legagneux, Pierre; Suffice, Pauline; Messier, Jean-Sébastien; Lelievre, Frédérick; Tremblay, Junior A; Maisonneuve, Charles; Saint-Louis, Richard; Bêty, Joël

    2014-01-01

    Top predators and scavengers are vulnerable to pollutants, particularly those accumulated along the food chain. Lead accumulation can induce severe disorders and alter survival both in mammals (including humans) and in birds. A potential source of lead poisoning in wild animals, and especially in scavengers, results from the consumption of ammunition residues in the tissues of big game killed by hunters. For two consecutive years we quantified the level of lead exposure in individuals of a sentinel scavenger species, the common raven (Corvus corax), captured during the moose (Alces alces) hunting season in eastern Quebec, Canada. The source of the lead contamination was also determined using stable isotope analyses. Finally, we identified the different scavenger species that could potentially be exposed to lead by installing automatic cameras targeting moose gut piles. Blood lead concentration in ravens increased over time, indicating lead accumulation over the moose-hunting season. Using a contamination threshold of 100 µg·L⁻¹, more than 50% of individuals were lead-contaminated during the moose hunting period. Lead concentration was twice as high in one year compared to the other, matching the number of rifle-shot moose in the area. Non-contaminated birds exhibited no ammunition isotope signatures. The isotope signature of the lead detected in contaminated ravens tended towards the signature from lead ammunition. We also found that black bears (Ursus americanus), golden eagles and bald eagles (Aquila chrysaetos and Haliaeetus leucocephalus, two species of conservation concern) scavenged heavily on moose viscera left by hunters. Our unequivocal results agree with other studies and further motivate the use of non-toxic ammunition for big game hunting.

  10. High Risk of Lead Contamination for Scavengers in an Area with High Moose Hunting Success

    PubMed Central

    Legagneux, Pierre; Suffice, Pauline; Messier, Jean-Sébastien; Lelievre, Frédérick; Tremblay, Junior A.; Maisonneuve, Charles; Saint-Louis, Richard; Bêty, Joël

    2014-01-01

    Top predators and scavengers are vulnerable to pollutants, particularly those accumulated along the food chain. Lead accumulation can induce severe disorders and alter survival both in mammals (including humans) and in birds. A potential source of lead poisoning in wild animals, and especially in scavengers, results from the consumption of ammunition residues in the tissues of big game killed by hunters. For two consecutive years we quantified the level of lead exposure in individuals of a sentinel scavenger species, the common raven (Corvus corax), captured during the moose (Alces alces) hunting season in eastern Quebec, Canada. The source of the lead contamination was also determined using stable isotope analyses. Finally, we identified the different scavenger species that could potentially be exposed to lead by installing automatic cameras targeting moose gut piles. Blood lead concentration in ravens increased over time, indicating lead accumulation over the moose-hunting season. Using a contamination threshold of 100 µg.L−1, more than 50% of individuals were lead-contaminated during the moose hunting period. Lead concentration was twice as high in one year compared to the other, matching the number of rifle-shot moose in the area. Non-contaminated birds exhibited no ammunition isotope signatures. The isotope signature of the lead detected in contaminated ravens tended towards the signature from lead ammunition. We also found that black bears (Ursus americanus), golden eagles and bald eagles (Aquila chrysaetos and Haliaeetus leucocephalus, two species of conservation concern) scavenged heavily on moose viscera left by hunters. Our unequivocal results agree with other studies and further motivate the use of non-toxic ammunition for big game hunting. PMID:25389754

  11. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    ERIC Educational Resources Information Center

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  12. Bird habitat relationships along a Great Basin elevational gradient

    Treesearch

    Dean E. Medin; Bruce L. Welch; Warren P. Clary

    2000-01-01

    Bird censuses were taken on 11 study plots along an elevational gradient ranging from 5,250 to 11,400 feet. Each plot represented a different vegetative type or zone: shadscale, shadscale-Wyoming big sagebrush, Wyoming big sagebrush, Wyoming big sagebrush-pinyon/juniper, pinyon/juniper, pinyon/juniper-mountain big sagebrush, mountain big sagebrush, mountain big...

  13. Innovative R.E.A. tools for integrated bathymetric survey

    NASA Astrophysics Data System (ADS)

    Demarte, Maurizio; Ivaldi, Roberta; Sinapi, Luigi; Bruzzone, Gabriele; Caccia, Massimo; Odetti, Angelo; Fontanelli, Giacomo; Masini, Andrea; Simeone, Emilio

    2017-04-01

    The REA (Rapid Environmental Assessment) concept is a methodology for acquiring environmental information, processing it, and returning it in standard paper-chart or standard digital format. The acquired data thus become available for ingestion and use by civil protection emergency organizations or rapid response forces. The use of Remotely Piloted Aircraft Systems (RPAS) carrying miniaturized multispectral or hyperspectral cameras gives the operator the capability to react in a short time, together with the capacity to collect a large amount of heterogeneous data and to deliver a very large number of products. The proposed methodology incorporates data collected by remote and autonomous sensors that survey areas in a cost-effective manner. Hyperspectral sensors are able to map seafloor morphology, seabed structure and the depth of the bottom surface, and to provide an estimate of sediment development. The relevant spectral portions are selected using an appropriate configuration of hyperspectral cameras to maximize the spectral resolution. Data acquired by the hyperspectral camera are geo-referenced synchronously with an Attitude and Heading Reference System (AHRS) sensor. The data can undergo a first on-board processing step on the unmanned vehicle before being transferred through the Ground Control Station (GCS) to a Processing, Exploitation and Dissemination (PED) system. The recent introduction of Data Distribution System (DDS) capabilities in the PED allows a cooperative, distributed approach to modern decision making. Two platforms are used in our project, a Remotely Piloted Aircraft System (RPAS) and an Unmanned Surface Vehicle (USV). The two platforms mutually interact to cover a surveyed area wider than either vehicle could cover alone. The USV, especially designed to work in very shallow water, has a modular structure and an open hardware and software architecture allowing easy installation and integration of the various sensors useful for seabed analysis. The very stable platform located on top of the USV allows the RPAS to take off and land. By exploiting its greater power autonomy and load capability, the USV is used as a mothership for the RPAS: during missions it can recharge the RPAS and act as a communication bridge between the RPAS and its control station. The main advantage of the system is the remote acquisition of high-resolution bathymetric data from the RPAS in areas where the possibility of a systematic, traditional survey is limited or nonexistent. These tools (a USV carrying an RPAS with a hyperspectral camera) constitute an innovative and powerful system that gives emergency response units the right instruments to react quickly. The development of this system could also help resolve the classical conflict between resolution, needed to capture fine-scale variability, and coverage, needed for large environmental phenomena with very high variability over a wide range of spatial and temporal scales, such as the coastal environment.

  14. Seeding considerations in restoring big sagebrush habitat

    Treesearch

    Scott M. Lambert

    2005-01-01

    This paper describes methods of managing or seeding to restore big sagebrush communities for wildlife habitat. The focus is on three big sagebrush subspecies, Wyoming big sagebrush (Artemisia tridentata ssp. wyomingensis), basin big sagebrush (Artemisia tridentata ssp. tridentata), and mountain...

  15. Toward a Literature-Driven Definition of Big Data in Healthcare

    PubMed Central

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with Log(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data. PMID:26137488

  16. Toward a Literature-Driven Definition of Big Data in Healthcare.

    PubMed

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    The aim of this study was to provide a definition of big data in healthcare. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. A total of 196 papers were included. Big data can be defined as datasets with Log(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.
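
    Taking the volume criterion above at face value (and assuming the base-10 logarithm, which is how the threshold is usually read), the check reduces to a one-liner:

        import math

        def is_big_data(n, p):
            """n = number of statistical individuals, p = number of variables."""
            return math.log10(n * p) >= 7

        print(is_big_data(10_000, 50))      # False: log10(5e5) is about 5.7
        print(is_big_data(1_000_000, 20))   # True:  log10(2e7) is about 7.3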

  17. 76 FR 54415 - Proposed Flood Elevation Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-01

    ... following flooding sources: Bear Creek (backwater effects from Cumberland River), Big Renox Creek (backwater effects from Cumberland River), Big Whetstone Creek (backwater effects from Cumberland River), Big Willis... River), Big Renox Creek (backwater effects from Cumberland River), Big Whetstone Creek (backwater...

  18. Rethinking big data: A review on the data quality and usage issues

    NASA Astrophysics Data System (ADS)

    Liu, Jianzheng; Li, Jie; Li, Weifeng; Wu, Jiansheng

    2016-05-01

    The recent explosive publications of big data studies have well documented the rise of big data and its ongoing prevalence. Different types of "big data" have emerged and have greatly enriched spatial information sciences and related fields in terms of breadth and granularity. Studies that were difficult to conduct in the past due to data availability can now be carried out. However, big data brings lots of "big errors" in data quality and data usage, and it cannot be used as a substitute for sound research design and solid theories. We indicated and summarized the problems faced by current big data studies with regard to data collection, processing and analysis: inauthentic data collection, information incompleteness and noise of big data, unrepresentativeness, consistency and reliability, and ethical issues. Cases of empirical studies are provided as evidence for each problem. We propose that big data research should closely follow good scientific practice to provide reliable and scientific "stories", as well as explore and develop techniques and methods to mitigate or rectify those "big errors" brought by big data.

  19. A glossary for big data in population and public health: discussion and commentary on terminology and research methods.

    PubMed

    Fuller, Daniel; Buote, Richard; Stanley, Kevin

    2017-11-01

    The volume and velocity of data are growing rapidly and big data analytics are being applied to these data in many fields. Population and public health researchers may be unfamiliar with the terminology and statistical methods used in big data. This creates a barrier to the application of big data analytics. The purpose of this glossary is to define terms used in big data and big data analytics and to contextualise these terms. We define the five Vs of big data and provide definitions and distinctions for data mining, machine learning and deep learning, among other terms. We provide key distinctions between big data and statistical analysis methods applied to big data. We contextualise the glossary by providing examples where big data analysis methods have been applied to population and public health research problems and provide brief guidance on how to learn big data analysis methods. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  20. Digital Morphometrics: A New Upper Airway Phenotyping Paradigm in OSA.

    PubMed

    Schwab, Richard J; Leinwand, Sarah E; Bearn, Cary B; Maislin, Greg; Rao, Ramya Bhat; Nagaraja, Adithya; Wang, Stephen; Keenan, Brendan T

    2017-08-01

    OSA is associated with changes in pharyngeal anatomy. The goal of this study was to objectively and reproducibly quantify pharyngeal anatomy by using digital morphometrics based on a laser ruler and to assess differences between subjects with OSA and control subjects and associations with the apnea-hypopnea index (AHI). To the best of our knowledge, this study is the first to use digital morphometrics to quantify intraoral risk factors for OSA. Digital photographs were obtained by using an intraoral laser ruler and digital camera in 318 control subjects (mean AHI, 4.2 events/hour) and 542 subjects with OSA (mean AHI, 39.2 events/hour). The digital morphometric paradigm was validated and reproducible over time and camera distances. A larger modified Mallampati score and having a nonvisible airway were associated with a higher AHI, both unadjusted (P < .001) and controlling for age, sex, race, and BMI (P = .015 and P = .018, respectively). Measures of tongue size were larger in subjects with OSA vs control subjects in unadjusted models and controlling for age, sex, and race but nonsignificant controlling for BMI; similar results were observed with AHI severity. Multivariate regression suggests photography-based variables capture independent associations with OSA. Measures of tongue size, airway visibility, and Mallampati scores were associated with increased OSA risk and severity. This study shows that digital morphometrics is an accurate, high-throughput, and noninvasive technique to identify anatomic OSA risk factors. Morphometrics may also provide a more reproducible and standardized measurement of the Mallampati score. Digital morphometrics represent an efficient and cost-effective method of examining intraoral crowding and tongue size when examining large populations, genetics, or screening for OSA. Copyright © 2017 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.

  1. Tools for automating the imaging of zebrafish larvae.

    PubMed

    Pulak, Rock

    2016-03-01

    The VAST BioImager system is a set of tools developed for zebrafish researchers who require the collection of images from a large number of 2-7 dpf zebrafish larvae. The VAST BioImager automates larval handling, positioning and orientation tasks. Color images at about 10 μm resolution are collected from the on-board camera of the system. If images of greater resolution and detail are required, the system is mounted on an upright microscope, such as a confocal or fluorescence microscope, to utilize its capabilities. The system loads a larva, positions it in view of the camera, determines its orientation using pattern recognition analysis, and then positions it more precisely to a user-defined orientation for optimal imaging of any desired tissue or organ system. Multiple images of the same larva can be collected. The specific part of each larva and the desired orientation and position are identified by the researcher, and an experiment defining the settings and series of steps can be saved and repeated for imaging of subsequent larvae. The system captures images, then ejects the larva and loads another, either from a bulk reservoir, from a well of a 96-well plate using the LP Sampler, or as individually targeted larvae from a Petri dish or other container using the VAST Pipettor. Alternative manual protocols for handling larvae for image collection are tedious and time consuming. The VAST BioImager automates these steps to allow greater throughput of assays and screens requiring high-content image collection of zebrafish larvae, such as might be used in drug discovery and toxicology studies. Copyright © 2015 The Author. Published by Elsevier Inc. All rights reserved.

  2. Prediction of optical communication link availability: real-time observation of cloud patterns using a ground-based thermal infrared camera

    NASA Astrophysics Data System (ADS)

    Bertin, Clément; Cros, Sylvain; Saint-Antonin, Laurent; Schmutz, Nicolas

    2015-10-01

    The growing demand for high-speed broadband communications with low-orbit or geostationary satellites is a major challenge. Using an optical link at 1.55 μm is an advantageous solution which can potentially increase satellite throughput by a factor of 10. Nevertheless, cloud cover is an obstacle at this optical frequency. Such communication requires an innovative management system to optimize the optical link availability between a satellite and several Optical Ground Stations (OGS). The Saint-Exupery Technological Research Institute (France) leads the project ALBS (French acronym for BroadBand Satellite Access). This initiative, involving small and medium enterprises, industrial groups and research institutions specialized in the aeronautics and space industries, is currently developing various solutions to increase telecommunication satellite bandwidth. This paper presents the development of a preliminary prediction system for preventing cloud blockage of an optical link between a satellite and a given OGS. An infrared thermal camera continuously observes (night and day) the sky vault. Cloud patterns are observed and classified several times a minute. The impact of the detected clouds on the optical beam (obstruction or not) is determined by retrieving the cloud optical depth at the wavelength of communication. This retrieval is based on realistic cloud modelling with libRadtran. Then, using subsequent images, cloud speed and trajectory are estimated. Cloud blockage over an OGS can then be forecast up to 30 minutes ahead. With this information, a new link between the satellite and another OGS under clear sky can be prepared before the current link breaks due to cloud blockage.
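
    A hedged sketch of one way to estimate cloud motion between two consecutive sky-camera frames (dense optical flow with OpenCV), from which a simple advection forecast could be extrapolated; this is illustrative only and not the project's actual algorithm.

        import cv2

        def mean_cloud_motion(frame_prev, frame_next):
            """frame_*: single-channel 8-bit thermal images; returns mean (dx, dy) in pixels/frame."""
            # positional parameters: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
            flow = cv2.calcOpticalFlowFarneback(frame_prev, frame_next, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            return float(flow[..., 0].mean()), float(flow[..., 1].mean())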

  3. Automated Analysis of a Nematode Population-based Chemosensory Preference Assay

    PubMed Central

    Chai, Cynthia M.; Cronin, Christopher J.; Sternberg, Paul W.

    2017-01-01

    The nematode Caenorhabditis elegans' compact nervous system of only 302 neurons underlies a diverse repertoire of behaviors. To facilitate the dissection of the neural circuits underlying these behaviors, the development of robust and reproducible behavioral assays is necessary. Previous C. elegans behavioral studies have used variations of a "drop test", a "chemotaxis assay", and a "retention assay" to investigate the response of C. elegans to soluble compounds. The method described in this article seeks to combine the complementary strengths of the three aforementioned assays. Briefly, a small circle in the middle of each assay plate is divided into four quadrants, with the control and experimental solutions placed in alternating quadrants. After the addition of the worms, the assay plates are loaded into a behavior chamber where microscope cameras record the worms' encounters with the treated regions. Automated video analysis is then performed and a preference index (PI) value for each video is generated. The video acquisition and automated analysis features of this method minimize the experimenter's involvement and any associated errors. Furthermore, minute amounts of the experimental compound are used per assay, and the behavior chamber's multi-camera setup increases experimental throughput. This method is particularly useful for conducting behavioral screens of genetic mutants and novel chemical compounds. However, this method is not appropriate for studying stimulus gradient navigation due to the close proximity of the control and experimental solution regions. It should also not be used when only a small population of worms is available. While suitable in its current form for assaying responses only to soluble compounds, this method can be easily modified to accommodate multimodal sensory interaction and optogenetic studies. This method can also be adapted to assay the chemosensory responses of other nematode species. PMID:28745641
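
    The abstract does not give the PI formula, so the sketch below assumes a common convention, PI = (N_exp - N_ctrl) / (N_exp + N_ctrl), accumulated over per-frame worm counts in the experimental and control quadrants; the function name and the example numbers are illustrative only.

    """Hedged sketch of a preference-index (PI) computation from per-frame counts;
    the exact formula used by the automated analysis is an assumption here."""

    def preference_index(frame_counts):
        """frame_counts is an iterable of (worms_in_experimental, worms_in_control)
        tuples, one per analyzed video frame."""
        n_exp = n_ctrl = 0
        for exp_count, ctrl_count in frame_counts:
            n_exp += exp_count
            n_ctrl += ctrl_count
        total = n_exp + n_ctrl
        if total == 0:
            raise ValueError("no worm detections in the treated regions")
        return (n_exp - n_ctrl) / total   # +1 = full attraction, -1 = full avoidance

    # Example: four frames from one video -> PI = (42 - 18) / 60 = 0.40
    print(preference_index([(12, 3), (10, 5), (11, 4), (9, 6)]))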

  4. Big Data Analytics in Medicine and Healthcare.

    PubMed

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics of value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. With regard to the big data characteristics, some directions for using suitable and promising open-source distributed data processing software platforms are given.

  5. BigBWA: approaching the Burrows-Wheeler aligner to Big Data technologies.

    PubMed

    Abuín, José M; Pichel, Juan C; Pena, Tomás F; Amigo, Jorge

    2015-12-15

    BigBWA is a new tool that uses the Big Data technology Hadoop to boost the performance of the Burrows-Wheeler aligner (BWA). Important reductions in the execution times were observed when using this tool. In addition, BigBWA is fault tolerant and it does not require any modification of the original BWA source code. BigBWA is available at the project GitHub repository: https://github.com/citiususc/BigBWA. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
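
    BigBWA itself distributes alignment with Hadoop without modifying BWA; the Python sketch below only illustrates the general split-the-reads-and-align-in-parallel idea on a single machine and is not BigBWA's implementation. File names, chunk size and the output layout are assumptions; the `bwa mem ref.fa chunk.fq` invocation is the standard BWA command line and assumes the reference has already been indexed with `bwa index`.

    """Illustrative local sketch of chunked, parallel BWA alignment (not BigBWA)."""
    import subprocess
    from concurrent.futures import ProcessPoolExecutor
    from itertools import islice
    from pathlib import Path

    READS_PER_CHUNK = 100_000          # each FASTQ record spans 4 lines
    REFERENCE = "ref.fa"               # assumed indexed with `bwa index ref.fa`

    def split_fastq(fastq, out_dir):
        """Write consecutive chunks of the FASTQ file and yield their paths."""
        Path(out_dir).mkdir(exist_ok=True)
        with open(fastq) as fh:
            idx = 0
            while True:
                lines = list(islice(fh, READS_PER_CHUNK * 4))
                if not lines:
                    break
                chunk = Path(out_dir) / f"chunk_{idx:04d}.fq"
                chunk.write_text("".join(lines))
                yield str(chunk)
                idx += 1

    def align_chunk(chunk):
        """Run bwa mem on one chunk; the per-chunk SAM files can be merged later."""
        sam = chunk.replace(".fq", ".sam")
        with open(sam, "w") as out:
            subprocess.run(["bwa", "mem", REFERENCE, chunk], stdout=out, check=True)
        return sam

    if __name__ == "__main__":
        chunks = list(split_fastq("reads.fq", "chunks"))
        with ProcessPoolExecutor() as pool:
            sam_files = list(pool.map(align_chunk, chunks))
        print("aligned chunks:", sam_files)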

  6. Big data processing in the cloud - Challenges and platforms

    NASA Astrophysics Data System (ADS)

    Zhelev, Svetoslav; Rozeva, Anna

    2017-12-01

    Choosing the appropriate architecture and technologies for a big data project is a difficult task, which requires extensive knowledge of both the problem domain and the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies used for processing and persisting big data. Clouds provide dynamic resource scaling, which makes them a natural fit for big data applications. Basic cloud computing service models are presented. Two architectures for processing big data are discussed, the Lambda and Kappa architectures. Technologies for big data persistence are presented and analyzed. Stream processing, as the most important and the most difficult aspect to manage, is outlined. The paper highlights the main advantages of the cloud and its potential problems.
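
    As a minimal illustration of the Lambda architecture the paper discusses (not taken from the paper itself), the sketch below keeps an append-only master dataset, lets a batch layer periodically recompute an accurate view, lets a speed layer track events since the last batch run, and merges both views at query time. The class and method names are hypothetical.

    """Minimal in-memory sketch of the Lambda-architecture idea: batch view +
    speed view merged in the serving layer. Purely illustrative, single-process."""
    from collections import Counter

    class LambdaCounter:
        def __init__(self):
            self.master_dataset = []      # immutable, append-only event log
            self.batch_view = Counter()   # recomputed from scratch by the batch layer
            self.speed_view = Counter()   # incremental view of events since last batch

        def ingest(self, event_key):
            self.master_dataset.append(event_key)
            self.speed_view[event_key] += 1          # speed layer: update immediately

        def run_batch_layer(self):
            self.batch_view = Counter(self.master_dataset)  # full recomputation
            self.speed_view.clear()                         # recent events now covered

        def query(self, event_key):
            # serving layer: merge the (possibly stale) batch view with the speed view
            return self.batch_view[event_key] + self.speed_view[event_key]

    counts = LambdaCounter()
    for e in ["click", "click", "view"]:
        counts.ingest(e)
    counts.run_batch_layer()
    counts.ingest("click")                 # arrives after the last batch run
    print(counts.query("click"))           # 3: 2 from the batch view + 1 from the speed view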

  7. Technical challenges for big data in biomedicine and health: data sources, infrastructure, and analytics.

    PubMed

    Peek, N; Holmes, J H; Sun, J

    2014-08-15

    To review technical and methodological challenges for big data research in biomedicine and health. We discuss sources of big datasets, survey infrastructures for big data storage and big data processing, and describe the main challenges that arise when analyzing big data. The life and biomedical sciences are contributing massively to the big data revolution through secondary use of data that were collected during routine care and through new data sources such as social media. Efficient processing of big datasets is typically achieved by distributing computation over a cluster of computers. Data analysts should be aware of pitfalls related to big data, such as bias in routine care data and the risk of false-positive findings in high-dimensional datasets. The major challenge for the near future is to transform the analytical methods used in the biomedical and health domain to fit the distributed storage and processing model required to handle big data, while ensuring confidentiality of the data being analyzed.

  8. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    PubMed

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: big data will give you new insights, allow you to become more efficient, and/or solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the silver bullet it has been touted to be. Here our main concern is the overall impact of big data: the current manifestation of big data is constructing a Maginot Line in science in the 21st century. Big data is no longer simply "lots of data" as a phenomenon; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking in problem definition to address science challenges.

  9. Apollo 15 Lunar eclipse views

    NASA Image and Video Library

    1971-08-01

    S71-58222 (31 July-2 Aug. 1971) --- During the lunar eclipse that occurred during the Apollo 15 lunar landing mission, astronaut Alfred M. Worden, command module pilot, used a 35mm Nikon camera to obtain a series of 15 photographs while the moon was entering and exiting Earth's umbra. Although it might seem that there should be no light on the moon when it is in Earth's shadow, sunlight is scattered into this region by Earth's atmosphere. This task was an attempt to measure by photographic photometry the amount of scattered light reaching the moon. The four views from upper left to lower right were selected to show the moon as it entered Earth's umbra. The first is a four-second exposure which was taken at the moment when the moon had just entered umbra; the second is a 15-second exposure taken two minutes after entry; the third, a 30-second exposure three minutes after entry; and the fourth is a 60-second exposure four minutes after entry. In all cases the light reaching the moon was so bright on the very high speed film (Eastman Kodak type 2485 emulsion) that the halation obscures the lunar image, which should be about one-third as big as the circle of light. The background star field is clearly evident, and this is very important for these studies. The spacecraft was in full sunlight when these photographs were taken, and it was pointed almost directly away from the sun so that the windows and a close-in portion of the camera's line-of-sight were in shadow. The environment around the vehicle at this time appears to be very "clean" with no light scattering particles noticeable.

  10. 1.5 Meter Per Pixel View of Boulders in Ganges Chasma

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Mars Orbiter Camera (MOC) on board the Mars Global Surveyor (MGS) spacecraft was designed to be able to take pictures that 'bridge the gap' between what could be seen by the Mariner 9 and Viking Orbiters from space and what could be seen by landers from the ground. In other words, MOC was designed to be able to see boulders of sizes similar to and larger than those named 'Yogi' at the Mars Pathfinder site and 'Big Joe' at the Viking 1 landing site. To see such boulders, a resolution of at least 1.5 meters (5 feet) per pixel was required.

    With the start of the MGS Mapping Phase of the mission during the second week of March 1999, the MOC team is pleased to report that 'the gap is bridged.' This image shows a field of boulders on the surface of a landslide deposit in Ganges Chasma. Ganges Chasma is one of the valleys in the Valles Marineris canyon system. The image resolution is 1.5 meters per pixel. The boulders shown here range from about 2 meters (7 feet) to about 20 meters (66 feet) in size. The image covers an area 1 kilometer (0.62 miles) across, and illumination is from the upper left.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  11. Observing the Earliest Galaxies: Looking for the Sources of Reionization

    NASA Astrophysics Data System (ADS)

    Illingworth, Garth

    2015-04-01

    Systematic searches for the earliest galaxies in the reionization epoch finally became possible in 2009 when the Hubble Space Telescope was upgraded with a powerful new infrared camera during SM4, the final Shuttle servicing mission to Hubble. The reionization epoch represents the last major phase transition of the universe and was a major event in cosmic history. The intense ultraviolet radiation from young star-forming galaxies is increasingly considered to be the source of the photons that reionized intergalactic hydrogen in the period between the ``dark ages'' (the time before the first stars and galaxies, about 100-200 million years after the Big Bang) and the end of reionization around 800-900 million years. Yet finding and measuring the earliest galaxies in this era of cosmic dawn has proven to be a challenging task, even with Hubble's new infrared camera. I will discuss the deep imaging undertaken by Hubble and the remarkable insights that have accrued from the imaging datasets taken over the last decade on the Hubble Ultra-Deep Field (HUDF, HUDF09/12) and other regions. The HUDF datasets are central to the story and have been assembled into the eXtreme Deep Field (XDF), the deepest image ever made from Hubble data. The XDF, when combined with results from shallower wide-area imaging surveys (e.g., GOODS, CANDELS) and with detections of galaxies from the Frontier Fields, has provided significant insights into the role of galaxies in reionization. Yet many questions remain. The puzzle is far from being fully solved and, while much will be done over the next few years, the solution likely awaits the launch of JWST. NASA/STScI Grant HST-GO-11563.

  12. A mini outburst from the nightside of comet 67P/Churyumov-Gerasimenko observed by the OSIRIS camera on Rosetta

    NASA Astrophysics Data System (ADS)

    Knollenberg, J.; Lin, Z. Y.; Hviid, S. F.; Oklay, N.; Vincent, J.-B.; Bodewits, D.; Mottola, S.; Pajola, M.; Sierks, H.; Barbieri, C.; Lamy, P.; Rodrigo, R.; Koschny, D.; Rickman, H.; A'Hearn, M. F.; Barucci, M. A.; Bertaux, J. L.; Bertini, I.; Cremonese, G.; Davidsson, B.; Da Deppo, V.; Debei, S.; De Cecco, M.; Fornasier, S.; Fulle, M.; Groussin, O.; Gutiérrez, P. J.; Ip, W.-H.; Jorda, L.; Keller, H. U.; Kührt, E.; Kramm, J. R.; Küppers, M.; Lara, L. M.; Lazzarin, M.; Lopez Moreno, J. J.; Marzari, F.; Naletto, G.; Thomas, N.; Güttler, C.; Preusker, F.; Scholten, F.; Tubiana, C.

    2016-12-01

    Context. On 12 March 2015 the OSIRIS WAC camera onboard the ESA Rosetta spacecraft orbiting comet 67P/Churyumov-Gerasimenko observed a small outburst originating from the Imhotep region at the foot of the big lobe of the comet. These measurements are unique since it was the first time that the initial phase of a transient outburst event could be directly observed. Aims: We investigate the evolution of the dust jet in order to derive clues about the outburst source mechanism and the ejected dust particles, in particular the dust mass, the dust-to-gas ratio and the particle size distribution. Methods: Analysis of the images and of the observation geometry using comet shape models, in combination with gasdynamic modeling of the transient dust jet, were the main tools used in this study. Synthetic images were computed for comparison with the observations. Results: Analysis of the geometry revealed that the source region was not illuminated until 1.5 h after the event, implying that true nightside activity was observed. The outburst lasted for less than one hour and the average dust production rate during the initial four minutes was of the order of 1 kg/s. During this time the outburst dust production rate was approximately constant; no sign of an initial explosion could be detected. For dust grains between 0.01 and 1 mm, a power-law size distribution characterized by an index of about 2.6 provides the best fit to the observed radiance profiles. The dust-to-gas ratio of the outburst jet is in the range 0.6-1.8.
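
    A back-of-the-envelope sketch using only the numbers quoted above: a production rate of ~1 kg/s over the initial four minutes gives roughly 240 kg of ejected dust, and a differential size distribution n(a) ∝ a^(-2.6) between 0.01 and 1 mm implies (via dm ∝ a^3 n(a) da) that the largest grains carry most of that mass. The bin edges in the example are arbitrary choices, not values from the paper.

    """Illustrative mass budget from the abstract's quoted numbers (assumptions noted)."""
    import numpy as np

    Q = 2.6                       # power-law index of the size distribution
    A_MIN, A_MAX = 0.01e-3, 1e-3  # grain radius range [m]
    RATE_KG_S = 1.0               # average dust production rate
    DURATION_S = 4 * 60           # initial four minutes of the outburst

    total_mass = RATE_KG_S * DURATION_S           # ~240 kg ejected in the first 4 min

    def mass_fraction(a_lo, a_hi, q=Q, a_min=A_MIN, a_max=A_MAX):
        """Fraction of the total dust mass carried by grains in [a_lo, a_hi],
        for dm ∝ a^(3-q) da (here a^0.4 da, so big grains dominate the mass)."""
        p = 4.0 - q                                # exponent after integrating a^(3-q)
        norm = a_max**p - a_min**p
        return (a_hi**p - a_lo**p) / norm

    edges = np.array([0.01e-3, 0.1e-3, 1e-3])      # 10-100 um and 0.1-1 mm bins
    for lo, hi in zip(edges[:-1], edges[1:]):
        frac = mass_fraction(lo, hi)
        print(f"{lo*1e3:.2f}-{hi*1e3:.2f} mm: {frac:5.1%} of {total_mass:.0f} kg")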

  13. High-efficient Unmanned Aircraft System Operations for Ecosystem Assessment

    NASA Astrophysics Data System (ADS)

    Xu, H.; Zhang, H.

    2016-02-01

    Diverse national and international agencies support the idea that incorporating Unmanned Aircraft Systems (UAS) into ecosystem assessment will improve operational efficiency and accuracy. In this paper, a UAS will be designed to monitor the Gulf of Mexico's coastal ecosystems intelligently and routinely. UAS onboard sensors will capture information that can be used to detect and geo-locate areas affected by invasive grasses. Moreover, the state of the ecosystem will be better assessed by analyzing the collected information. Compared with human-based or satellite-based surveillance, the proposed strategy is more efficient and accurate, and eliminates limitations and risks associated with human factors. State-of-the-art UAS onboard sensors (e.g. a high-resolution electro-optical camera, a night vision camera, a thermal sensor, etc.) will be used for monitoring coastal ecosystems. Once a potential risk to the ecosystem is detected, the onboard GPS data will be used to geo-locate and store the exact coordinates of the affected area. Moreover, the UAS sensors will be used to observe and record the daily evolution of coastal ecosystems. Further, benefiting from the data collected by the UAS, an intelligent big data processing scheme will be created to assess ecosystem evolution effectively. Meanwhile, a cost-efficient intelligent autonomous navigation strategy will be implemented on the UAS, in order to guarantee that the UAS can fly over designated areas and collect significant data in a safe and effective way. Furthermore, the proposed UAS-based ecosystem surveillance and assessment methodologies can be utilized for natural resource conservation. A UAS flying with multiple state-of-the-art sensors will monitor and report the actual state of high-importance natural resources frequently. Using the collected data, the ecosystem conservation strategy can be carried out effectively and intelligently.

  14. Advances in hyperspectral LWIR pushbroom imagers

    NASA Astrophysics Data System (ADS)

    Holma, Hannu; Mattila, Antti-Jussi; Hyvärinen, Timo; Weatherbee, Oliver

    2011-06-01

    Two long-wave infrared (LWIR) hyperspectral imagers have been under extensive development. The first utilizes a microbolometer focal plane array (FPA) and the second is based on a Mercury Cadmium Telluride (MCT) FPA. Both imagers employ a pushbroom imaging spectrograph with a transmission grating and on-axis optics. The main goal has been to develop high-performance instruments with good image quality and compact size for various industrial and remote sensing application requirements. A big challenge in realizing these goals without considerable cooling of the whole instrument is controlling the instrument's own radiation. The challenge is much bigger in a hyperspectral instrument than in a broadband camera, because the optical signal from the target is spread spectrally while the instrument radiation is not dispersed. Without any suppression, the instrument radiation can overwhelm the radiation from the target by as much as 1000 times. The means to handle the instrument radiation in the MCT imager include precise instrument temperature stabilization (but not cooling), efficient optical background suppression and the use of a background-monitoring-on-chip (BMC) method. This approach has made possible the implementation of a high-performance, extremely compact spectral imager in the 7.7 to 12.4 μm spectral range. The imager performance with 84 spectral bands and 384 spatial pixels has been experimentally verified, and an excellent NESR of 14 mW/(m² sr μm) at 10 μm wavelength with a 300 K target has been achieved. This results in an SNR of more than 700. The LWIR imager based on a microbolometer detector array, first introduced in 2009, has been upgraded. The sensitivity of the imager has improved drastically, by a factor of 3, and the SNR by about 15%. It provides a rugged hyperspectral camera for chemical imaging applications in reflection mode in the laboratory and in industry.
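
    The quoted SNR of "more than 700" is consistent with dividing the Planck blackbody spectral radiance of a 300 K target at 10 μm (about 9.9 W/(m² sr μm)) by the quoted NESR of 14 mW/(m² sr μm). The Python sketch below reproduces that arithmetic; assuming a unit-emissivity target is an assumption of this check, not a statement from the abstract.

    """Cross-check: SNR ≈ 300 K blackbody radiance at 10 um / NESR ≈ 700."""
    import math

    H = 6.62607015e-34   # Planck constant [J s]
    C = 2.99792458e8     # speed of light [m/s]
    KB = 1.380649e-23    # Boltzmann constant [J/K]

    def planck_radiance_per_um(wavelength_um, temp_k):
        """Spectral radiance B_lambda in W / (m^2 sr um)."""
        lam = wavelength_um * 1e-6
        b = (2 * H * C**2) / (lam**5 * (math.exp(H * C / (lam * KB * temp_k)) - 1.0))
        return b * 1e-6  # convert from per metre to per micrometre of wavelength

    nesr = 14e-3                                    # W / (m^2 sr um)
    radiance = planck_radiance_per_um(10.0, 300.0)  # ~9.9 W / (m^2 sr um)
    print(f"target radiance: {radiance:.1f} W/(m2 sr um), SNR ~ {radiance / nesr:.0f}")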

  15. High-throughput phenotyping of large wheat breeding nurseries using unmanned aerial system, remote sensing and GIS techniques

    NASA Astrophysics Data System (ADS)

    Haghighattalab, Atena

    Wheat breeders are in a race for genetic gain to secure the future nutritional needs of a growing population. Multiple barriers exist to the acceleration of crop improvement, and emerging technologies are reducing these obstacles. Advances in genotyping technologies have significantly decreased the cost of characterizing the genetic make-up of candidate breeding lines. However, this is just part of the equation. Field-based phenotyping informs a breeder's decision as to which lines move forward in the breeding cycle. This has long been the most expensive and time-consuming, though most critical, aspect of breeding. The grand challenge remains in connecting genetic variants to observed phenotypes, followed by predicting phenotypes based on the genetic composition of lines or cultivars. In this context, the current study was undertaken to investigate the utility of UAS in the assessment of field trials in wheat breeding programs. The major objective was to integrate remotely sensed data with geospatial analysis for high-throughput phenotyping of large wheat breeding nurseries. The initial step was to develop and validate a semi-automated high-throughput phenotyping pipeline using a low-cost UAS and NIR camera, image processing, and radiometric calibration to build orthomosaic imagery and 3D models. The relationship between plot-level data (vegetation indices and height) extracted from UAS imagery and manual measurements was examined and found to show a high correlation. Data derived from UAS imagery performed as well as manual measurements while exponentially increasing the amount of data available. The high-resolution, high-temporal-frequency HTP data extracted from this pipeline offered the opportunity to develop a within-season grain yield prediction model. Due to the variety of genotypes and environmental conditions, breeding trials are inherently spatial in nature and vary non-randomly across the field. This makes geographically weighted regression a good choice of geospatial prediction model. Finally, with the addition of the georeferenced and spatial data integral to HTP and imagery, we were able to reduce the environmental effect in the data and increase the accuracy of UAS plot-level data. The models developed through this research, when combined with genotyping technologies, increase the volume, accuracy, and reliability of phenotypic data to better inform breeder selections. This increased accuracy in evaluating and predicting grain yield will help breeders to rapidly identify and advance the most promising candidate wheat varieties.
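
    The abstract mentions extracting plot-level vegetation indices from the UAS orthomosaic but does not name the index or software, so the sketch below is only illustrative: it computes NDVI = (NIR - Red) / (NIR + Red), a common choice, and averages it inside per-plot boolean masks that are assumed to come from a GIS layer of plot polygons. All names and the synthetic example are assumptions.

    """Hedged sketch of plot-level vegetation-index extraction from co-registered bands."""
    import numpy as np

    def plot_level_ndvi(nir, red, plot_masks):
        """Return the mean NDVI inside each plot mask (boolean array per plot ID)."""
        ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)   # avoid divide-by-zero
        return {plot_id: float(ndvi[mask].mean()) for plot_id, mask in plot_masks.items()}

    # Tiny synthetic example: a 4x4 scene split into two 2x4 plots
    nir = np.array([[0.6]*4]*2 + [[0.4]*4]*2)
    red = np.array([[0.1]*4]*2 + [[0.3]*4]*2)
    masks = {"plot_A": np.array([[True]*4]*2 + [[False]*4]*2),
             "plot_B": np.array([[False]*4]*2 + [[True]*4]*2)}
    print(plot_level_ndvi(nir, red, masks))   # plot_A ~0.71, plot_B ~0.14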

  16. BIG1, a brefeldin A-inhibited guanine nucleotide-exchange protein regulates neurite development via PI3K-AKT and ERK signaling pathways.

    PubMed

    Zhou, C; Li, C; Li, D; Wang, Y; Shao, W; You, Y; Peng, J; Zhang, X; Lu, L; Shen, X

    2013-12-19

    The elongation of neurons is highly dependent on membrane trafficking. Brefeldin A (BFA)-inhibited guanine nucleotide-exchange protein 1 (BIG1) functions in membrane trafficking between the Golgi apparatus and the plasma membrane. BFA, an uncompetitive inhibitor of BIG1, can inhibit neurite outgrowth and polarity development. In this study, we aimed to define the possible role of BIG1 in neurite development and to further investigate the potential mechanism. By immunostaining, we found that BIG1 was extensively colocalized with synaptophysin, a marker for synaptic vesicles, in the soma and partly in neurites. Both the protein and mRNA levels of BIG1 were up-regulated during rat brain development. BIG1 depletion significantly decreased neurite length and inhibited the phosphorylation of phosphatidylinositide 3-kinase (PI3K) and protein kinase B (AKT). Inhibition of BIG1 guanine nucleotide-exchange factor (GEF) activity by BFA or overexpression of dominant-negative BIG1 reduced PI3K and AKT phosphorylation, indicating that the regulatory effect of BIG1 on the PI3K-AKT signaling pathway is dependent on its GEF activity. BIG1 siRNA or BFA treatment also significantly reduced extracellular signal-regulated kinase (ERK) phosphorylation. Overexpression of wild-type BIG1 significantly increased ERK phosphorylation, but dominant-negative BIG1 had no effect on ERK phosphorylation, indicating that the involvement of BIG1 in ERK signaling regulation may not be dependent on its GEF activity. Our results identify a novel function of BIG1 in neurite development. This newly recognized function integrates the role of BIG1 in membrane trafficking with the activation of the PI3K-AKT and ERK signaling pathways, which are critical in neurite development. Copyright © 2013 IBRO. Published by Elsevier Ltd. All rights reserved.

  17. Measuring the Promise of Big Data Syllabi

    ERIC Educational Resources Information Center

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  18. Opportunity and Challenges for Migrating Big Data Analytics in Cloud

    NASA Astrophysics Data System (ADS)

    Amitkumar Manekar, S.; Pradeepini, G., Dr.

    2017-08-01

    Big Data Analytics is a big term nowadays. With ever more demanding and scalable data generation capabilities, data acquisition and storage have become crucial issues. Cloud storage is a widely used platform, and the technology will become crucial to executives handling data powered by analytics. The trend towards "big data as a service" is now discussed everywhere. On one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost. On the other hand, researchers are still working to solve security and other real-time problems of big data migration to cloud-based platforms. This article is specifically focused on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration and the possibility of performing big data analytics on a cloud platform is in demand for a new era of growth. This article also gives information about available technologies and techniques for migrating big data to the cloud.

  19. The 2016 Transit of Mercury Observed from Major Solar Telescopes and Satellites

    NASA Astrophysics Data System (ADS)

    Pasachoff, Jay M.; Schneider, Glenn; Gary, Dale; Chen, Bin; Sterling, Alphonse C.; Reardon, Kevin P.; Dantowitz, Ronald; Kopp, Greg A.

    2016-10-01

    We report observations from the ground and space of the 9 May 2016 transit of Mercury. We build on our explanation of the black-drop effect in transits of Venus, based on spacecraft observations of the 1999 transit of Mercury (Schneider, Pasachoff, and Golub, Icarus 168, 249, 2004). In 2016, we used the 1.6-m New Solar Telescope at the Big Bear Solar Observatory with active optics to observe Mercury's transit at high spatial resolution. We again saw a small black-drop effect as third contact neared, confirming the data that led to our earlier explanation as a confluence of the point-spread function and the extreme solar limb darkening (Pasachoff, Schneider, and Golub, in IAU Colloq. 196, 2004). We again used IBIS on the Dunn Solar Telescope of the Sacramento Peak Observatory, as A. Potter continued, at both telescopes, his observations of Mercury's sodium exosphere, previously made at the 2006 transit of Mercury (Potter, Killen, Reardon, and Bida, Icarus 226, 172, 2013). We imaged the transit with IBIS as well as with two RED Epic IMAX-quality cameras alongside it, one with a narrow passband. We show animations of our high-resolution ground-based observations along with observations from XRT on JAXA's Hinode and from NASA's Solar Dynamics Observatory. Further, we report on the limit of the transit-induced change in the Total Solar Irradiance, continuing our interest from the transit-of-Venus TSI measurements (Schneider, Pasachoff, and Willson, ApJ 641, 565, 2006; Pasachoff, Schneider, and Willson, AAS 2005), using NASA's SORCE/TIM and the Air Force's TCTE/TIM. See http://transitofvenus.info and http://nicmosis.as.arizona.edu. Acknowledgments: We were glad for the collaboration at Big Bear of Claude Plymate and his colleagues on the staff of the Big Bear Solar Observatory. We also appreciate the collaboration on the transit studies of Robert Lucas (Sydney, Australia) and Evan Zucker (San Diego, California). JMP appreciates the sabbatical hospitality of the Division of Geosciences and Planetary Sciences of the California Institute of Technology, and of Prof. Andrew Ingersoll there. The solar observations lead into the 2017 eclipse studies, for which JMP is supported by grants from the NSF AGS and National Geographic CRE.

  20. Optics for Processes, Products and Metrology

    NASA Astrophysics Data System (ADS)

    Mather, George

    1999-04-01

    Optical physics has a variety of applications in industry, including process inspection, coatings development, vision instrumentation, spectroscopy, and many others. Optics has been used extensively in the design of solar energy collection systems and coatings, for example. Also, with the availability of good CCD cameras and fast computers, it has become possible to develop real-time inspection and metrology devices that can accommodate the high throughputs encountered in modern production processes. More recently, developments in moiré interferometry show great promise for applications in the basic metals and electronics industries. The talk will illustrate applications of optics by discussing process inspection techniques for defect detection, part dimensioning, birefringence measurement, and the analysis of optical coatings in the automotive, glass, and optical disc industries. In particular, examples of optical techniques for the quality control of CD-R, MO, and CD-RW discs will be presented. In addition, the application of optical concepts to solar energy collector design and to metrology by moiré techniques will be discussed. Finally, some of the modern techniques and instruments used for qualitative and quantitative material analysis will be presented.
