Sample records for comprehensive large array-data

  1. arrayCGHbase: an analysis platform for comparative genomic hybridization microarrays

    PubMed Central

    Menten, Björn; Pattyn, Filip; De Preter, Katleen; Robbrecht, Piet; Michels, Evi; Buysse, Karen; Mortier, Geert; De Paepe, Anne; van Vooren, Steven; Vermeesch, Joris; Moreau, Yves; De Moor, Bart; Vermeulen, Stefan; Speleman, Frank; Vandesompele, Jo

    2005-01-01

    Background The availability of the human genome sequence, together with the large number of physically accessible oligonucleotides, cDNA, and BAC clones across the entire genome, has triggered and accelerated the use of several platforms for analysis of DNA copy number changes, among them microarray comparative genomic hybridization (arrayCGH). One of the challenges inherent to this technology is the management and analysis of the large number of data points generated in each individual experiment. Results We have developed arrayCGHbase, a comprehensive analysis platform for arrayCGH experiments, consisting of a MIAME (Minimal Information About a Microarray Experiment) supportive MySQL database underlying a data-mining web tool, to store, analyze, interpret, compare, and visualize arrayCGH results in a uniform and user-friendly format. Owing to its flexible design, arrayCGHbase is compatible with all existing and forthcoming arrayCGH platforms. Data can be exported in a multitude of formats, including BED files for mapping copy number information onto the genome in the Ensembl or UCSC genome browser. Conclusion ArrayCGHbase is a web-based and platform-independent arrayCGH data analysis tool that allows users to access the analysis suite through the internet or a local intranet after installation on a private server. ArrayCGHbase is available at . PMID:15910681
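
    The BED export mentioned above is straightforward to picture. A minimal sketch with hypothetical copy-number segments (the actual arrayCGHbase export code is not part of this record):

    ```python
    # Minimal sketch of exporting arrayCGH copy-number segments as a BED track
    # (hypothetical data; arrayCGHbase's own export logic is not shown here).
    segments = [
        # (chromosome, start, end, label, copy-number call)
        ("chr1", 1500000, 2300000, "gain_1q", 3),
        ("chr17", 41000000, 41500000, "loss_17q", 1),
    ]

    with open("cgh_segments.bed", "w") as bed:
        # A track line lets the UCSC/Ensembl browsers label the custom track.
        bed.write('track name="arrayCGH" description="Copy number calls"\n')
        for chrom, start, end, name, cn in segments:
            # BED uses 0-based, half-open coordinates: chrom, chromStart,
            # chromEnd, name, score; the call is stored in the score column.
            bed.write(f"{chrom}\t{start}\t{end}\t{name}\t{cn}\n")
    ```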

  2. Modeling needs for very large systems.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stein, Joshua S.

    2010-10-01

    Most system performance models assume a point measurement for irradiance and that, except for the impact of shading from nearby obstacles, incident irradiance is uniform across the array. Module temperature is also assumed to be uniform across the array. For small arrays and hourly-averaged simulations, this may be a reasonable assumption. Stein is conducting research to characterize variability in large systems and to develop models that can better accommodate large-system factors. In large, multi-MW arrays, passing clouds may block sunlight from one portion of the array while leaving another portion unaffected. Figure 22 shows that two irradiance measurements at opposite ends of a multi-MW PV plant appear to have similar irradiance (left), but in fact the irradiance is not always the same (right). Module temperature may also vary across the array, with modules on the edges being cooler because they have greater wind exposure. Large arrays will also have long wire runs and will be subject to the associated losses. Soiling patterns may also vary, with modules closer to the source of soiling, such as an agricultural field, receiving a greater dust load. One of the primary concerns associated with this effort is how to work with integrators to gain access to better and more comprehensive data for model development and validation.
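
    The point-sensor versus plant-average distinction is easy to illustrate. A toy sketch with synthetic data (not Stein's models; the cloud statistics and the 5-minute transit window are assumptions):

    ```python
    import numpy as np

    # Toy contrast between a point irradiance sensor and the plant-average of
    # a large array. Synthetic data; not the SNL models described above.
    rng = np.random.default_rng(0)
    n = 3600                                        # one hour at 1 s samples
    clear_sky = 900.0                               # W/m^2, held constant for simplicity
    shaded = rng.random(n) < 0.02                   # brief cloud shadows at the sensor
    point = clear_sky * np.where(shaded, 0.4, 1.0)  # 60% reduction when shaded

    # Crude spatial averaging: a cloud takes minutes to cross a multi-MW
    # plant, so plant-level irradiance is a smoothed version of the point signal.
    window = 300                                    # ~5 min transit, assumed
    plant = np.convolve(point, np.ones(window) / window, mode="same")
    print(f"point std = {point.std():.0f} W/m^2, plant std = {plant.std():.0f} W/m^2")
    ```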

  3. Transverse vorticity measurements using an array of four hot-wire probes

    NASA Technical Reports Server (NTRS)

    Foss, J. F.; Klewicki, C. L.; Disimile, P. J.

    1986-01-01

    A comprehensive description of the technique used to obtain a time series of the quasi-instantaneous transverse vorticity from a four wire array of probes is presented. The algorithmic structure which supports the technique is described in detail and demonstration data, from a large plane shear layer, are presented to provide a specific utilization of the technique. Sensitivity calculations are provided which allow one contribution to the inherent uncertainty of the technique to be evaluated.
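
    A schematic of how such an array can yield a transverse-vorticity time series, assuming the common combination of a y-separated pair of u-wires with an x-array for v, plus Taylor's hypothesis for the streamwise derivative (a reconstruction for illustration, not necessarily the paper's exact algorithm):

    ```python
    import numpy as np

    # Schematic estimate of transverse vorticity omega_z = dv/dx - du/dy from
    # a four-wire array: two u-wires separated in y plus an x-array giving v.
    # Taylor's hypothesis (d/dx ~ -(1/Uc) d/dt) supplies the x-derivative.
    # Synthetic signals; the paper's actual processing is more elaborate.
    dt, dy, Uc = 1e-4, 2e-3, 10.0   # sample time [s], wire spacing [m], convection speed [m/s]
    t = np.arange(0, 0.1, dt)
    u_top = 10.0 + 0.5 * np.sin(2 * np.pi * 100 * t)
    u_bot = 10.0 + 0.5 * np.sin(2 * np.pi * 100 * t + 0.3)
    v = 0.4 * np.sin(2 * np.pi * 100 * t + 1.0)

    dv_dt = np.gradient(v, dt)
    omega_z = -dv_dt / Uc - (u_top - u_bot) / dy   # quasi-instantaneous series
    print(omega_z[:5])
    ```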

  4. A Comprehensive Approach to Fusion for Microsensor Networks: Distributed and Hierarchical Inference, Communication, and Adaption

    DTIC Science & Technology

    2000-08-01

    Fragmentary excerpt from the DTIC report documentation page for "Data Fusion in Large Arrays of Microsensors (SensorWeb): A Comprehensive Approach to..." (report dated 31-Jan-2006). The recoverable fragments list project publications, including an invited lecture at LATIN 2006 (Latin American Theoretical Informatics, Valdivia, Chile, March 2006), a keynote talk by Sergio Verdu, and A. P. George, W. B. Powell, and S. R. Kulkarni, "The Statistics of Hierarchical Aggregation for...," Transactions on Wireless Communications, February 2006.

  5. New Developments in NOAA's Comprehensive Large Array-Data Stewardship System

    NASA Astrophysics Data System (ADS)

    Ritchey, N. A.; Morris, J. S.; Carter, D. J.

    2012-12-01

    The Comprehensive Large Array-data Stewardship System (CLASS) is part of the NOAA strategic goal of Climate Adaptation and Mitigation, which focuses on building and sustaining the key observational assets and data archives critical to maintaining the global climate record. Since 2002, CLASS has been NOAA's enterprise solution for ingesting, storing and providing access to a host of near real-time remote sensing streams such as the Polar and Geostationary Operational Environmental Satellites (POES and GOES) and the Defense Meteorological Satellite Program (DMSP). Since October 2011, CLASS has also been the dedicated Archive Data Segment (ADS) of the Suomi National Polar-orbiting Partnership (S-NPP). As the ADS, CLASS receives raw and processed S-NPP records for archival and distribution to the broad user community. Moving beyond remote sensing and model data, NOAA has endorsed a plan to migrate all archive holdings from NOAA's National Data Centers into CLASS while retiring the disparate legacy data storage systems residing at the National Climatic Data Center (NCDC), National Geophysical Data Center (NGDC) and the National Oceanographic Data Center (NODC). In parallel with this data migration, CLASS is evolving to a service-oriented architecture that uses cloud technologies for dissemination, along with clearly defined interfaces that allow better collaboration with partners. This evolution will require implementation of standard access protocols and metadata, which will lead to cost-effective data and information preservation.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cutler, Dylan; Frank, Stephen; Slovensky, Michelle

    Rich, well-organized building performance and energy consumption data enable a host of analytic capabilities for building owners and operators, from basic energy benchmarking to detailed fault detection and system optimization. Unfortunately, data integration for building control systems is challenging and costly in any setting. Large portfolios of buildings (campuses, cities, and corporate portfolios) experience these integration challenges most acutely. These large portfolios often have a wide array of control systems, including multiple vendors and nonstandard communication protocols. They typically have complex information technology (IT) networks and cybersecurity requirements and may integrate distributed energy resources into their infrastructure. Although the challenges are significant, the integration of control system data has the potential to provide proportionally greater value for these organizations through portfolio-scale analytics, comprehensive demand management, and asset performance visibility. As a large research campus, the National Renewable Energy Laboratory (NREL) experiences significant data integration challenges. To meet them, NREL has developed an architecture for effective data collection, integration, and analysis, providing a comprehensive view of data integration based on functional layers. The architecture is being evaluated on the NREL campus through deployment of three pilot implementations.

  7. Identifying tagging SNPs for African specific genetic variation from the African Diaspora Genome

    PubMed Central

    Johnston, Henry Richard; Hu, Yi-Juan; Gao, Jingjing; O’Connor, Timothy D.; Abecasis, Gonçalo R.; Wojcik, Genevieve L; Gignoux, Christopher R.; Gourraud, Pierre-Antoine; Lizee, Antoine; Hansen, Mark; Genuario, Rob; Bullis, Dave; Lawley, Cindy; Kenny, Eimear E.; Bustamante, Carlos; Beaty, Terri H.; Mathias, Rasika A.; Barnes, Kathleen C.; Qin, Zhaohui S.; Preethi Boorgula, Meher; Campbell, Monica; Chavan, Sameer; Ford, Jean G.; Foster, Cassandra; Gao, Li; Hansel, Nadia N.; Horowitz, Edward; Huang, Lili; Ortiz, Romina; Potee, Joseph; Rafaels, Nicholas; Ruczinski, Ingo; Scott, Alan F.; Taub, Margaret A.; Vergara, Candelaria; Levin, Albert M.; Padhukasahasram, Badri; Williams, L. Keoki; Dunston, Georgia M.; Faruque, Mezbah U.; Gietzen, Kimberly; Deshpande, Aniket; Grus, Wendy E.; Locke, Devin P.; Foreman, Marilyn G.; Avila, Pedro C.; Grammer, Leslie; Kim, Kwang-Youn A.; Kumar, Rajesh; Schleimer, Robert; De La Vega, Francisco M.; Shringarpure, Suyash S.; Musharoff, Shaila; Burchard, Esteban G.; Eng, Celeste; Hernandez, Ryan D.; Pino-Yanes, Maria; Torgerson, Dara G.; Szpiech, Zachary A.; Torres, Raul; Nicolae, Dan L.; Ober, Carole; Olopade, Christopher O; Olopade, Olufunmilayo; Oluwole, Oluwafemi; Arinola, Ganiyu; Song, Wei; Correa, Adolfo; Musani, Solomon; Wilson, James G.; Lange, Leslie A.; Akey, Joshua; Bamshad, Michael; Chong, Jessica; Fu, Wenqing; Nickerson, Deborah; Reiner, Alexander; Hartert, Tina; Ware, Lorraine B.; Bleecker, Eugene; Meyers, Deborah; Ortega, Victor E.; Maul, Pissamai; Maul, Trevor; Watson, Harold; Ilma Araujo, Maria; Riccio Oliveira, Ricardo; Caraballo, Luis; Marrugo, Javier; Martinez, Beatriz; Meza, Catherine; Ayestas, Gerardo; Francisco Herrera-Paz, Edwin; Landaverde-Torres, Pamela; Erazo, Said Omar Leiva; Martinez, Rosella; Mayorga, Alvaro; Mayorga, Luis F.; Mejia-Mejia, Delmy-Aracely; Ramos, Hector; Saenz, Allan; Varela, Gloria; Marina Vasquez, Olga; Ferguson, Trevor; Knight-Madden, Jennifer; Samms-Vaughan, Maureen; Wilks, Rainford J.; Adegnika, Akim; Ateba-Ngoa, Ulysse; Yazdanbakhsh, Maria

    2017-01-01

    A primary goal of The Consortium on Asthma among African-ancestry Populations in the Americas (CAAPA) is to develop an ‘African Diaspora Power Chip’ (ADPC), a genotyping array consisting of tagging SNPs, useful in comprehensively identifying African-specific genetic variation. This array is designed based on the novel variation identified in 642 CAAPA samples of African ancestry with high-coverage whole genome sequence data (~30× depth). This novel variation extends the pattern of variation catalogued in the 1000 Genomes and Exome Sequencing Projects to a spectrum of populations representing the wide range of West African genomic diversity. These individuals from CAAPA also comprise a large swath of the African Diaspora population and incorporate historical genetic diversity covering nearly the entire Atlantic coast of the Americas. Here we show the results of designing and producing such a microchip array. This novel array covers African-specific variation far better than other commercially available arrays, and will enable better GWAS analyses for researchers with individuals of African descent in their study populations. A recent study cataloging variation in continental African populations suggests this type of African-specific genotyping array is both necessary and valuable for facilitating large-scale GWAS in populations of African ancestry. PMID:28429804

  8. High-throughput genotyping of hop (Humulus lupulus L.) utilising diversity arrays technology (DArT)

    USDA-ARS?s Scientific Manuscript database

    Implementation of molecular methods in hop breeding is dependent on the availability of sizeable numbers of polymorphic markers and a comprehensive understanding of genetic variation. Diversity Arrays Technology (DArT) is a high-throughput cost-effective method for the discovery of large numbers of...

  9. Characterization of High-power Quasi-cw Laser Diode Arrays

    NASA Technical Reports Server (NTRS)

    Stephen, Mark A.; Vasilyev, Aleksey; Troupaki, Elisavet; Allan, Graham R.; Kashem, Nasir B.

    2005-01-01

    NASA's requirements for high-reliability, high-performance satellite laser instruments have driven the investigation of many critical components, specifically 808 nm laser diode array (LDA) pump devices. Performance and comprehensive characterization data for quasi-CW, high-power laser diode arrays are presented.

  10. The data array, a tool to interface the user to a large data base

    NASA Technical Reports Server (NTRS)

    Foster, G. H.

    1974-01-01

    Aspects of the processing of spacecraft data are considered. Use of the data array in a large address space as an intermediate form in data processing for a large scientific data base is advocated. Techniques for efficient indexing in data arrays are reviewed, and the data array method for mapping an arbitrary structure onto a linear address space is shown. A compromise between the two forms is given. The impact of the data array on the user interface is considered, along with implementation.
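
    The mapping of a multidimensional structure onto a linear address space that the abstract refers to is, at its core, a strides computation. A minimal sketch (row-major layout; illustrative only, not the 1974 implementation):

    ```python
    # Minimal sketch of mapping an N-dimensional index onto a linear address
    # space (row-major layout; illustrative, not the paper's implementation).
    def linear_address(index, shape, base=0):
        stride = 1
        strides = []
        for extent in reversed(shape):   # innermost dimension varies fastest
            strides.append(stride)
            stride *= extent
        strides.reverse()
        return base + sum(i * s for i, s in zip(index, strides))

    # Element [2, 0, 3] of a 4 x 5 x 6 array lands at offset 2*30 + 0*6 + 3 = 63.
    assert linear_address((2, 0, 3), (4, 5, 6)) == 63
    ```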

  11. The Cold Gas History of the Universe as seen by the ngVLA

    NASA Astrophysics Data System (ADS)

    Riechers, Dominik A.; Carilli, Chris Luke; Casey, Caitlin; da Cunha, Elisabete; Hodge, Jacqueline; Ivison, Rob; Murphy, Eric J.; Narayanan, Desika; Sargent, Mark T.; Scoville, Nicholas; Walter, Fabian

    2017-01-01

    The Next Generation Very Large Array (ngVLA) will fundamentally advance our understanding of the formation processes that lead to the assembly of galaxies throughout cosmic history. The combination of large bandwidth with unprecedented sensitivity to the critical low-level CO lines over virtually the entire redshift range will open up the opportunity to conduct large-scale, deep cold molecular gas surveys, mapping the fuel for star formation in galaxies over substantial cosmic volumes. Informed by the first efforts with the Karl G. Jansky Very Large Array (COLDz survey) and the Atacama Large Millimeter/submillimeter Array (ASPECS survey), here we present initial predictions and possible survey strategies for such "molecular deep field" observations with the ngVLA. These investigations will provide a detailed measurement of the volume density of molecular gas in galaxies as a function of redshift, the "cold gas history of the universe". This will crucially complement studies of the neutral gas, star formation, and stellar mass histories with large low-frequency arrays, the Large UV/Optical/Infrared Surveyor, and the Origins Space Telescope, providing the means to obtain a comprehensive picture of galaxy evolution through cosmic time.

  12. Array coding for large data memories

    NASA Technical Reports Server (NTRS)

    Tranter, W. H.

    1982-01-01

    It is pointed out that an array code is a convenient method for storing large quantities of data. In a typical application, the array consists of N data words having M symbols in each word. The probability of undetected error is considered, taking into account the three symbol error probabilities of interest, and a formula for determining the probability of undetected error is derived. Attention is given to the possibility of reading data into the array using a digital communication system with symbol error probability p. Two different schemes are found to be of interest. The analysis of array coding shows that the probability of undetected error is very small even for relatively large arrays.
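
    The undetected-error question can be made concrete. A sketch assuming, for illustration, a binary array (product) code with one parity check per row and per column, a construction the abstract does not spell out:

    ```python
    import numpy as np

    # Monte Carlo estimate of the probability of undetected error for a simple
    # binary array (product) code with one parity check per row and column.
    # Illustrative construction; the abstract does not specify the actual code.
    def p_undetected(N=8, M=8, p=0.05, trials=100_000, seed=1):
        rng = np.random.default_rng(seed)
        undetected = 0
        for _ in range(trials):
            errors = rng.random((N, M)) < p        # i.i.d. symbol errors
            if errors.any():
                # An error pattern escapes detection only if every row parity
                # and every column parity still check, i.e. all counts are even.
                if (errors.sum(axis=1) % 2 == 0).all() and (errors.sum(axis=0) % 2 == 0).all():
                    undetected += 1
        return undetected / trials

    # Undetected patterns require errors arranged in complete rectangles, so
    # they stay rare even at high symbol error rates, echoing the abstract.
    print(p_undetected())
    ```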

  13. Hanford Site Raptor Nest Monitoring Report for Calendar Year 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nugent, John J.; Lindsey, Cole T.; Wilde, Justin W.

    2014-02-13

    The U.S. Department of Energy, Richland Operations Office (DOE-RL) conducts ecological monitoring on the Hanford Site to collect and track data needed to ensure compliance with an array of environmental laws, regulations, and policies governing DOE activities. Ecological monitoring data provide baseline information about the plants, animals, and habitat under DOE-RL stewardship at Hanford that is required for decision-making under the National Environmental Policy Act (NEPA) and the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). The Hanford Site Comprehensive Land Use Plan (CLUP; DOE/EIS-0222-F), which is the Environmental Impact Statement for Hanford Site activities, helps ensure that DOE-RL, its contractors, and other entities conducting activities on the Hanford Site are in compliance with NEPA. The Hanford Site supports a large and diverse community of raptorial birds (Fitzner et al. 1981), with 26 species of raptors observed on the Hanford Site.

  14. BeadArray Expression Analysis Using Bioconductor

    PubMed Central

    Ritchie, Matthew E.; Dunning, Mark J.; Smith, Mike L.; Shi, Wei; Lynch, Andy G.

    2011-01-01

    Illumina whole-genome expression BeadArrays are a popular choice in gene profiling studies. Aside from the vendor-provided software tools for analyzing BeadArray expression data (GenomeStudio/BeadStudio), there exists a comprehensive set of open-source analysis tools in the Bioconductor project, many of which have been tailored to exploit the unique properties of this platform. In this article, we explore a number of these software packages and demonstrate how to perform a complete analysis of BeadArray data in various formats. The key steps of importing data, performing quality assessments, preprocessing, and annotation in the common setting of assessing differential expression in designed experiments will be covered. PMID:22144879

  15. Dust devil signatures in infrasound records of the International Monitoring System

    NASA Astrophysics Data System (ADS)

    Lorenz, Ralph D.; Christie, Douglas

    2015-03-01

    We explore whether dust devils have a recognizable signature in infrasound array records, since several Comprehensive Nuclear-Test-Ban Treaty verification stations that make continuous microbarometer measurements are located in desert areas that see dust devils. The passage of dust devils (and other boundary layer vortices, whether dust laden or not) causes a local temporary drop in pressure: the high-pass time-domain filtering in microbarometers results in a "heartbeat" signature, which we observe at the Warramunga station in Australia. We also observe a ~50 min pseudoperiodicity in the occurrence of these signatures and some higher-frequency infrasound. Dust devils do not significantly degrade the treaty verification capability. The pipe arrays used for spatial averaging in infrasound monitoring degrade the detection efficiency of small devils, but the long observation time may allow a useful census of large vortices; thus, the high-sensitivity infrasonic array data from the monitoring network can be useful in studying columnar vortices in the lower atmosphere.
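
    The "heartbeat" signature can be reproduced qualitatively: a smooth local pressure drop, run through a high-pass filter standing in for the microbarometer response, becomes a biphasic pulse. A toy sketch (synthetic numbers, not Warramunga data; corner frequency assumed):

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    # Toy model of the "heartbeat": a dust devil's passage is a smooth local
    # pressure dip; high-pass filtering turns it into a biphasic pulse.
    # Synthetic numbers, not station data.
    fs = 20.0                                        # sample rate [Hz], assumed
    t = np.arange(-60, 60, 1 / fs)
    pressure = -30.0 * np.exp(-(t / 10.0) ** 2)      # ~30 Pa Gaussian dip, assumed

    b, a = butter(2, 0.02, btype="highpass", fs=fs)  # 0.02 Hz corner, assumed
    heartbeat = filtfilt(b, a, pressure)
    print(f"min {heartbeat.min():.1f} Pa, max {heartbeat.max():.1f} Pa")
    ```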

  16. Big Data Challenges for Large Radio Arrays

    NASA Technical Reports Server (NTRS)

    Jones, Dayton L.; Wagstaff, Kiri; Thompson, David; D'Addario, Larry; Navarro, Robert; Mattmann, Chris; Majid, Walid; Lazio, Joseph; Preston, Robert; Rebbapragada, Umaa

    2012-01-01

    Future large radio astronomy arrays, particularly the Square Kilometre Array (SKA), will be able to generate data at rates far higher than can be analyzed or stored affordably with current practices. This is, by definition, a "big data" problem, and requires an end-to-end solution if future radio arrays are to reach their full scientific potential. Similar data processing, transport, storage, and management challenges face next-generation facilities in many other fields.

  17. High speed, very large (8 megabyte) first in/first out buffer memory (FIFO)

    DOEpatents

    Baumbaugh, Alan E.; Knickerbocker, Kelly L.

    1989-01-01

    A fast FIFO (First In First Out) memory buffer capable of storing data at rates of 100 megabytes per second. The invention includes a data packer which concatenates small bit data words into large bit data words, a memory array having individual data storage addresses adapted to store the large bit data words, a data unpacker into which large bit data words from the array can be read and reconstructed into small bit data words, and a controller to control and keep track of the individual data storage addresses in the memory array into which data from the packer is being written and data to the unpacker is being read.
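
    The packer/unpacker idea is simple to sketch in software, here packing 8-bit words into 32-bit memory words (illustrative only; the patent describes a hardware implementation):

    ```python
    # Sketch of the "data packer" idea: concatenate small words into large
    # words before the memory array, then unpack on the way out.
    def pack(words, in_bits=8, out_bits=32):
        """Pack in_bits words, lowest bits first, into out_bits memory words."""
        per_word = out_bits // in_bits
        packed = []
        for i in range(0, len(words), per_word):
            word = 0
            for j, w in enumerate(words[i:i + per_word]):
                word |= (w & ((1 << in_bits) - 1)) << (j * in_bits)
            packed.append(word)
        return packed

    def unpack(packed, in_bits=8, out_bits=32):
        mask = (1 << in_bits) - 1
        return [(w >> (j * in_bits)) & mask
                for w in packed for j in range(out_bits // in_bits)]

    data = [0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77, 0x88]
    assert unpack(pack(data)) == data   # round-trip reconstructs the small words
    ```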

  18. Computer-Aided Modeling and Analysis of Power Processing Systems (CAMAPPS), phase 1

    NASA Technical Reports Server (NTRS)

    Kim, S.; Lee, J.; Cho, B. H.; Lee, F. C.

    1986-01-01

    The large-signal behaviors of a regulator depend largely on the type of power circuit topology and control. Thus, for maximum flexibility, it is best to develop models for each functional block as independent modules. A regulator can then be configured by collecting appropriate pre-defined modules for each functional block. In order to complete the component model generation for a comprehensive spacecraft power system, the following modules were developed: solar array switching unit and control; shunt regulators; and battery discharger. The capability of each module is demonstrated using a simplified Direct Energy Transfer (DET) system. Large-signal behaviors of solar array power systems were analyzed. Stability of the solar array system operating points with a nonlinear load is analyzed. The state-plane analysis illustrates trajectories of the system operating point under various conditions. Stability and transient responses of the system operating near the solar array's maximum power point are also analyzed. The solar array system mode of operation is described using the DET spacecraft power system. The DET system is simulated for various operating conditions. Transfer of the software program CAMAPPS (Computer Aided Modeling and Analysis of Power Processing Systems) to NASA/GSFC (Goddard Space Flight Center) was accomplished.

  19. A comprehensive sensitivity analysis of microarray breast cancer classification under feature variability

    PubMed Central

    2009-01-01

    Background Large discrepancies in signature composition and outcome concordance have been observed between different microarray breast cancer expression profiling studies. This is often ascribed to differences in array platform as well as biological variability. We conjecture that other reasons for the observed discrepancies are the measurement error associated with each feature and the choice of preprocessing method. Microarray data are known to be subject to technical variation and the confidence intervals around individual point estimates of expression levels can be wide. Furthermore, the estimated expression values also vary depending on the selected preprocessing scheme. In microarray breast cancer classification studies, however, these two forms of feature variability are almost always ignored and hence their exact role is unclear. Results We have performed a comprehensive sensitivity analysis of microarray breast cancer classification under the two types of feature variability mentioned above. We used data from six state-of-the-art preprocessing methods, using a compendium consisting of eight different datasets, involving 1131 hybridizations, containing data from both one- and two-color array technology. For a wide range of classifiers, we performed a joint study on performance, concordance and stability. In the stability analysis we explicitly tested classifiers for their noise tolerance by using perturbed expression profiles that are based on uncertainty information directly related to the preprocessing methods. Our results indicate that signature composition is strongly influenced by feature variability, even if the array platform and the stratification of patient samples are identical. In addition, we show that there is often a high level of discordance between individual class assignments for signatures constructed on data coming from different preprocessing schemes, even if the actual signature composition is identical. Conclusion Feature variability can have a strong impact on breast cancer signature composition, as well as the classification of individual patient samples. We therefore strongly recommend that feature variability is considered in analyzing data from microarray breast cancer expression profiling experiments. PMID:19941644
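
    The stability test described above amounts to perturbing the expression matrix within its uncertainty and re-deriving the signature. A schematic sketch with synthetic data (the study used uncertainties derived from the preprocessing methods, not a flat noise level):

    ```python
    import numpy as np

    # Schematic stability test: perturb expression values within an assumed
    # uncertainty and measure how much a top-k gene signature changes.
    rng = np.random.default_rng(42)
    n_genes, n_samples = 2000, 60
    X = rng.normal(size=(n_genes, n_samples))      # log-expression matrix
    y = np.repeat([0, 1], n_samples // 2)          # two outcome classes
    sigma = 0.3 * np.ones_like(X)                  # per-feature uncertainty, assumed

    def signature(X, y, k=50):
        # Rank genes by a two-sample t-like statistic.
        a, b = X[:, y == 0], X[:, y == 1]
        t = (a.mean(1) - b.mean(1)) / np.sqrt(a.var(1) / a.shape[1] + b.var(1) / b.shape[1])
        return set(np.argsort(-np.abs(t))[:k])

    base = signature(X, y)
    perturbed = signature(X + rng.normal(scale=sigma), y)
    print(f"signature overlap: {len(base & perturbed)} of 50 genes")
    ```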

  20. Big data challenges for large radio arrays

    NASA Astrophysics Data System (ADS)

    Jones, D. L.; Wagstaff, K.; Thompson, D. R.; D'Addario, L.; Navarro, R.; Mattmann, C.; Majid, W.; Lazio, J.; Preston, J.; Rebbapragada, U.

    2012-03-01

    Future large radio astronomy arrays, particularly the Square Kilometre Array (SKA), will be able to generate data at rates far higher than can be analyzed or stored affordably with current practices. This is, by definition, a "big data" problem, and requires an end-to-end solution if future radio arrays are to reach their full scientific potential. Similar data processing, transport, storage, and management challenges face next-generation facilities in many other fields. The Jet Propulsion Laboratory is developing technologies to address big data issues, with an emphasis in three areas: 1) lower-power digital processing architectures to make high-volume data generation operationally affordable, 2) data-adaptive machine learning algorithms for real-time analysis (or "data triage") of large data volumes, and 3) scalable data archive systems that allow efficient data mining and remote user code to run locally where the data are stored.

  1. Predictors of Sustainability of Social Programs

    ERIC Educational Resources Information Center

    Savaya, Riki; Spiro, Shimon E.

    2012-01-01

    This article presents the findings of a large scale study that tested a comprehensive model of predictors of three manifestations of sustainability: continuation, institutionalization, and duration. Based on the literature the predictors were arrayed in four groups: variables pertaining to the project, the auspice organization, the community, and…

  2. Numerical studies and metric development for validation of magnetohydrodynamic models on the HIT-SI experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, C.; Victor, B.

    We present application of three scalar metrics derived from the Biorthogonal Decomposition (BD) technique to evaluate the level of agreement between macroscopic plasma dynamics in different data sets. BD decomposes large data sets, as produced by distributed diagnostic arrays, into principal mode structures without assumptions on spatial or temporal structure. These metrics have been applied to validation of the Hall-MHD model using experimental data from the Helicity Injected Torus with Steady Inductive helicity injection experiment. Each metric provides a measure of correlation between mode structures extracted from experimental data and simulations for an array of 192 surface-mounted magnetic probes. Numerical validation studies have been performed using the NIMROD code, where the injectors are modeled as boundary conditions on the flux conserver, and the PSI-TET code, where the entire plasma volume is treated. Initial results from a comprehensive validation study of high performance operation with different injector frequencies are presented, illustrating application of the BD method. Using a simplified (constant, uniform density and temperature) Hall-MHD model, simulation results agree with experimental observation for two of the three defined metrics when the injectors are driven with a frequency of 14.5 kHz.

  3. Calibrating a tensor magnetic gradiometer using spin data

    USGS Publications Warehouse

    Bracken, Robert E.; Smith, David V.; Brown, Philip J.

    2005-01-01

    Scalar magnetic data are often acquired to discern characteristics of geologic source materials and buried objects. It is evident that a great deal can be done with scalar data, but there are significant advantages to direct measurement of the magnetic gradient tensor in applications with nearby sources, such as unexploded ordnance (UXO). To explore these advantages, we adapted a prototype tensor magnetic gradiometer system (TMGS) and successfully implemented a data-reduction procedure. One of several critical reduction issues is the precise determination of a large group of calibration coefficients for the sensors and sensor array. To resolve these coefficients, we devised a spin calibration method, after similar methods of calibrating space-based magnetometers (Snare, 2001). The spin calibration procedure consists of three parts: (1) collecting data by slowly revolving the sensor array in the Earth's magnetic field, (2) deriving a comprehensive set of coefficients from the spin data, and (3) applying the coefficients to the survey data. To show that the TMGS functions as a tensor gradiometer, we conducted an experimental survey that verified that the reduction procedure was effective (Bracken and Brown, in press). Therefore, because it was an integral part of the reduction, it can be concluded that the spin calibration was correctly formulated with acceptably small errors.

  4. Creation of a Human Secretome: A Novel Composite Library of Human Secreted Proteins: Validation Using Ovarian Cancer Gene Expression Data and a Virtual Secretome Array.

    PubMed

    Vathipadiekal, Vinod; Wang, Victoria; Wei, Wei; Waldron, Levi; Drapkin, Ronny; Gillette, Michael; Skates, Steven; Birrer, Michael

    2015-11-01

    To generate a comprehensive "Secretome" of proteins potentially found in the blood and derive a virtual Affymetrix array. To validate the utility of this database for the discovery of novel serum-based biomarkers using ovarian cancer transcriptomic data. The secretome was constructed by aggregating the data from databases of known secreted proteins, transmembrane or membrane proteins, signal peptides, G-protein coupled receptors, or proteins existing in the extracellular region, and the virtual array was generated by mapping them to Affymetrix probeset identifiers. Whole-genome microarray data from ovarian cancer, normal ovarian surface epithelium, and fallopian tube epithelium were used to identify transcripts upregulated in ovarian cancer. We established the secretome from eight public databases and a virtual array consisting of 16,521 Affymetrix U133 Plus 2.0 probesets. Using ovarian cancer transcriptomic data, we identified candidate blood-based biomarkers for ovarian cancer and performed bioinformatic validation by demonstrating rediscovery of known biomarkers including CA125 and HE4. Two novel top biomarkers (FGF18 and GPR172A) were validated in serum samples from an independent patient cohort. We present the secretome, comprising the most comprehensive resource available for protein products that are potentially found in the blood. The associated virtual array can be used to translate gene-expression data into cancer biomarker discovery. A list of blood-based biomarkers for ovarian cancer detection is reported and includes CA125 and HE4. FGF18 and GPR172A were identified and validated by ELISA as being differentially expressed in the serum of ovarian cancer patients compared with controls. ©2015 American Association for Cancer Research.

  5. Electrical Components for Marine Renewable Energy Arrays: A Techno-Economic Review

    DOE PAGES

    Collin, Adam J.; Nambiar, Anup J.; Bould, David; ...

    2017-11-27

    This paper presents a review of the main electrical components that are expected to be present in marine renewable energy arrays. The review is put in context by appraising the current needs of the industry and identifying the key components required in both device and array-scale developments. For each component, electrical, mechanical and cost considerations are discussed, with quantitative data collected during the review made freely available for use by the community via an open access online repository. This data collection updates previous research and addresses gaps specific to emerging offshore technologies, such as marine and floating wind, and provides a comprehensive resource for the techno-economic assessment of offshore energy arrays.

  6. Electrical Components for Marine Renewable Energy Arrays: A Techno-Economic Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collin, Adam J.; Nambiar, Anup J.; Bould, David

    This paper presents a review of the main electrical components that are expected to be present in marine renewable energy arrays. The review is put in context by appraising the current needs of the industry and identifying the key components required in both device and array-scale developments. For each component, electrical, mechanical and cost considerations are discussed, with quantitative data collected during the review made freely available for use by the community via an open access online repository. This data collection updates previous research and addresses gaps specific to emerging offshore technologies, such as marine and floating wind, and provides a comprehensive resource for the techno-economic assessment of offshore energy arrays.

  7. Reliability apportionment approach for spacecraft solar array using fuzzy reasoning Petri net and fuzzy comprehensive evaluation

    NASA Astrophysics Data System (ADS)

    Wu, Jianing; Yan, Shaoze; Xie, Liyang; Gao, Peng

    2012-07-01

    The reliability apportionment of a spacecraft solar array is of significant importance to spacecraft designers in the early stage of design. However, existing methods are difficult to apply to the reliability apportionment problem because of data insufficiency and the uncertainty of the relations among the components of the mechanical system. This paper proposes a new method that combines fuzzy comprehensive evaluation with a fuzzy reasoning Petri net (FRPN) to accomplish the reliability apportionment of the solar array. The proposed method extends previous fuzzy methods and focuses on the characteristics of the subsystems and the intrinsic associations among the components. The analysis results show that the synchronization mechanism should be apportioned the highest reliability, and the solar panels and hinges the lowest, before design and manufacturing. The method is of practical significance for reliability apportionment of solar arrays when the design information has not yet been clearly identified, particularly in the early stage of design.
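
    The fuzzy comprehensive evaluation step can be sketched compactly: a weight vector over evaluation factors is combined with a membership matrix over reliability grades. All numbers below are hypothetical, and the FRPN reasoning over component relations is not shown:

    ```python
    import numpy as np

    # Minimal sketch of fuzzy comprehensive evaluation: B = W . R, where W
    # weights the evaluation factors and each row of R gives a component's
    # membership degrees over reliability grades. Numbers are hypothetical.
    W = np.array([0.3, 0.4, 0.3])           # factor weights (sum to 1)
    R = np.array([                          # memberships over (high, med, low)
        [0.6, 0.3, 0.1],
        [0.5, 0.3, 0.2],
        [0.2, 0.4, 0.4],
    ])
    B = W @ R                               # weighted-average operator M(*, +)
    grades = np.array([0.99, 0.95, 0.90])   # hypothetical reliability per grade
    print(B, "apportioned reliability:", float(B @ grades))
    ```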

  8. Toward a comprehensive and systematic methylome signature in colorectal cancers.

    PubMed

    Ashktorab, Hassan; Rahi, Hamed; Wansley, Daniel; Varma, Sudhir; Shokrani, Babak; Lee, Edward; Daremipouran, Mohammad; Laiyemo, Adeyinka; Goel, Ajay; Carethers, John M; Brim, Hassan

    2013-08-01

    CpG Island Methylator Phenotype (CIMP) is one of the underlying mechanisms in colorectal cancer (CRC). This study aimed to define a methylome signature in CRC through a methylation microarray analysis and a compilation of promising CIMP markers from the literature. Illumina HumanMethylation27 (IHM27) array data were generated and analyzed based on statistical differences in methylation data (1st approach) or based on overall differences in methylation percentages using the lower 95% CI (2nd approach). Pyrosequencing was performed for the validation of nine genes. A meta-analysis was used to identify CIMP and non-CIMP markers that were hypermethylated in CRC but had not yet made it onto the CIMP genes' list. Our 1st approach for array data analysis demonstrated the limitations in selecting genes for further validation, highlighting the need for the 2nd bioinformatics approach to adequately select genes with differential aberrant methylation. A more comprehensive list, which included non-CIMP genes such as APC, EVL, CD109, PTEN, TWIST1, DCC, PTPRD, SFRP1, ICAM5, RASSF1A, EYA4, 30ST2, LAMA1, KCNQ5, ADHEF1, and TFPI2, was established. Array data are useful to categorize and cluster colonic lesions based on their global methylation profiles; however, their usefulness in identifying robust methylation markers is limited and relies on the data analysis method. We have identified 16 non-CIMP-panel genes for which we provide a rationale for inclusion in a more comprehensive characterization of CIMP+ CRCs. The identification of a definitive list of methylome-specific genes in CRC will contribute to better clinical management of CRC patients.

  9. Benchmark Modeling of the Near-Field and Far-Field Wave Effects of Wave Energy Arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhinefrank, Kenneth E; Haller, Merrick C; Ozkan-Haller, H Tuba

    2013-01-26

    This project is an industry-led partnership between Columbia Power Technologies and Oregon State University that will perform benchmark laboratory experiments and numerical modeling of the near-field and far-field impacts of wave scattering from an array of wave energy devices. These benchmark experimental observations will help to fill a gaping hole in our present knowledge of the near-field effects of multiple, floating wave energy converters and are a critical requirement for estimating the potential far-field environmental effects of wave energy arrays. The experiments will be performed at the Hinsdale Wave Research Laboratory (Oregon State University) and will utilize an array of newly developed buoys that are realistic, lab-scale floating power converters. The array of buoys will be subjected to realistic, directional wave forcing (1:33 scale) that will approximate the expected conditions (waves and water depths) to be found off the Central Oregon Coast. Experimental observations will include comprehensive in-situ wave and current measurements as well as a suite of novel optical measurements. These new optical capabilities will include imaging of the 3D wave scattering using a binocular stereo camera system, as well as 3D device motion tracking using a newly acquired LED system. These observing systems will capture the 3D motion history of individual buoys as well as resolve the 3D scattered wave field, thus resolving the constructive and destructive wave interference patterns produced by the array at high resolution. These data, combined with the device motion tracking, will provide necessary information for array design in order to balance array performance with the mitigation of far-field impacts. As a benchmark data set, these data will be an important resource for testing of models for wave/buoy interactions, buoy performance, and far-field effects on wave and current patterns due to the presence of arrays. Under the proposed project we will initiate high-resolution (fine scale, very near-field) fluid/structure interaction simulations of buoy motions, as well as array-scale, phase-resolving wave scattering simulations. These modeling efforts will utilize state-of-the-art research-quality models, which have not yet been brought to bear on this complex large-array wave/structure interaction problem.

  10. Web-based NGS data analysis using miRMaster: a large-scale meta-analysis of human miRNAs.

    PubMed

    Fehlmann, Tobias; Backes, Christina; Kahraman, Mustafa; Haas, Jan; Ludwig, Nicole; Posch, Andreas E; Würstle, Maximilian L; Hübenthal, Matthias; Franke, Andre; Meder, Benjamin; Meese, Eckart; Keller, Andreas

    2017-09-06

    The analysis of small RNA NGS data together with the discovery of new small RNAs is among the foremost challenges in life science. For the analysis of raw high-throughput sequencing data we implemented the fast, accurate and comprehensive web-based tool miRMaster. Our toolbox provides a wide range of modules for quantification of miRNAs and other non-coding RNAs, discovering new miRNAs, isomiRs, mutations, exogenous RNAs and motifs. Use-cases comprising hundreds of samples are processed in less than 5 h with an accuracy of 99.4%. An integrative analysis of small RNAs from 1836 data sets (20 billion reads) indicated that context-specific miRNAs (e.g. miRNAs present only in one or a few different tissues / cell types) still remain to be discovered, while broadly expressed miRNAs appear to be largely known. In total, our analysis of known and novel miRNAs indicated nearly 22 000 candidate precursors with one or two mature forms. Based on these, we designed a custom microarray comprising 11 872 potential mature miRNAs to assess the quality of our prediction. MiRMaster is a convenient tool for the comprehensive and fast analysis of miRNA NGS data. In addition, our predicted miRNA candidates, provided as a custom array, will allow researchers to perform in-depth validation of candidates of interest to them. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Two neutron correlations in photo-fission

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dale, D. S.; Kosinov, O.; Forest, T.

    2016-01-01

    A large body of experimental work has established the strong kinematical correlation between fission fragments and fission neutrons. Here, we report on the progress of investigations of the potential for strong two neutron correlations arising from the nearly back-to-back nature of the two fission fragments that emit these neutrons in the photo-fission process. In initial measurements, a pulsed electron linear accelerator was used to generate bremsstrahlung photons that impinged upon an actinide target, and the energy and opening angle distributions of coincident neutrons were measured using a large acceptance neutron detector array. A planned comprehensive set of measurements of two neutron correlations in the photo-fission of actinides is expected to shed light on several fundamental aspects of the fission process including the multiplicity distributions associated with the light and heavy fission fragments, the nuclear temperatures of the fission fragments, and the mass distribution of the fission fragments as a function of energy released. In addition to these measurements providing important nuclear data, the unique kinematics of fission and the resulting two neutron correlations have the potential to be the basis for a new tool to detect fissionable materials. A key technical challenge of this program arises from the need to perform coincidence measurements with a low duty factor, pulsed electron accelerator. This has motivated the construction of a large acceptance neutron detector array, and the development of data analysis techniques to directly measure uncorrelated two neutron backgrounds.

  12. A model of binding on DNA microarrays: understanding the combined effect of probe synthesis failure, cross-hybridization, DNA fragmentation and other experimental details of affymetrix arrays

    PubMed Central

    2012-01-01

    Background DNA microarrays are used both for research and for diagnostics. In research, Affymetrix arrays are commonly used for genome-wide association studies, resequencing, and for gene expression analysis. These arrays provide large amounts of data. These data are analyzed using statistical methods that quite often discard a large portion of the information. Most of the information that is lost comes from probes that systematically fail across chips and from batch effects. The aim of this study was to develop a comprehensive model for hybridization that predicts probe intensities for Affymetrix arrays and that could provide a basis for improved microarray analysis and probe development. The first part of the model calculates probe binding affinities to all the possible targets in the hybridization solution using the Langmuir isotherm. In the second part of the model we integrate details that are specific to each experiment and contribute to the differences between hybridization in solution and on the microarray. These details include fragmentation, wash stringency, temperature, salt concentration, and scanner settings. Furthermore, the model fits probe synthesis efficiency and target concentration parameters directly to the data. All the parameters used in the model have a well-established physical origin. Results For the 302 chips that were analyzed, the mean correlation between expected and observed probe intensities was 0.701, with a range of 0.55 to 0.88. All available chips were included in the analysis regardless of the data quality. Our results show that batch effects arise from differences in probe synthesis, scanner settings, wash strength, and target fragmentation. We also show that probe synthesis efficiencies for different nucleotides are not uniform. Conclusions To date this is the most complete model for binding on microarrays. This is the first model that includes both probe synthesis efficiency and hybridization kinetics/cross-hybridization. These two factors are sequence dependent and have a large impact on probe intensity. The results presented here provide novel insight into the effect of probe synthesis errors on Affymetrix microarrays; furthermore, the algorithms developed in this work provide useful tools for analyzing the effects of cross-hybridization, probe synthesis efficiency, fragmentation, wash stringency, temperature, and salt concentration on microarray intensities. PMID:23270536
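
    The Langmuir building block of the first part of the model is compact enough to show directly; parameter values here are illustrative, not fitted values from the paper:

    ```python
    import numpy as np

    # Langmuir-isotherm building block of the binding model: fractional probe
    # occupancy (and hence fluorescence intensity) as a function of target
    # concentration c and binding constant K. Parameter values are illustrative.
    def probe_intensity(c, K, I_max=10_000.0, background=50.0):
        theta = K * c / (1.0 + K * c)    # Langmuir occupancy at equilibrium
        return background + I_max * theta

    c = np.logspace(-12, -6, 7)          # target concentration [M]
    print(probe_intensity(c, K=1e9))     # saturates once K*c >> 1
    ```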

  13. Scan blindness in infinite phased arrays of printed dipoles

    NASA Technical Reports Server (NTRS)

    Pozar, D. M.; Schaubert, D. H.

    1984-01-01

    A comprehensive study of infinite phased arrays of printed dipole antennas is presented, with emphasis on the scan blindness phenomenon. A rigorous and efficient moment method procedure is used to calculate the array impedance versus scan angle. Data are presented for the input reflection coefficient for various element spacings and substrate parameters. A simple theory, based on coupling from Floquet modes to surface wave modes on the substrate, is shown to predict the occurrence of scan blindness. Measurements from a waveguide simulator of a blindness condition confirm the theory.
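
    The surface-wave coupling condition behind scan blindness can be stated directly: blindness occurs at the scan angle where a Floquet mode's transverse wavenumber matches the surface-wave propagation constant, k0*sin(theta) + 2*pi*m/d = ±beta_sw. A sketch for the m = -1 branch (illustrative numbers, not the paper's substrate data):

    ```python
    import numpy as np

    # Predict the blind spot from the Floquet/surface-wave coupling condition
    # k0*sin(theta) - 2*pi/d = -beta_sw (the m = -1 branch). Illustrative
    # numbers; the paper computes beta_sw for its actual substrates.
    beta_over_k0 = 1.2       # normalized TM0 surface-wave constant, assumed
    d_over_lam = 0.5         # element spacing in free-space wavelengths

    sin_theta = 1.0 / d_over_lam - beta_over_k0   # sin(theta) = lam/d - beta/k0
    if abs(sin_theta) <= 1:
        print(f"scan blindness near {np.degrees(np.arcsin(sin_theta)):.1f} deg")
    else:
        print("no blindness in visible space for this spacing")
    ```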

  14. A comprehensive biosensor integrated with a ZnO nanorod FET array for selective detection of glucose, cholesterol and urea.

    PubMed

    Ahmad, Rafiq; Tripathy, Nirmalya; Park, Jin-Ho; Hahn, Yoon-Bong

    2015-08-04

    We report a novel straightforward approach for simultaneous and highly-selective detection of multi-analytes (i.e. glucose, cholesterol and urea) using an integrated field-effect transistor (i-FET) array biosensor without any interference in each sensor response. Compared to analytically-measured data, performance of the ZnO nanorod based i-FET array biosensor is found to be highly reliable for rapid detection of multi-analytes in mice blood, and serum and blood samples of diabetic dogs.

  15. High performance large infrared and visible astronomy arrays for low background applications: instruments performance data and future developments at Raytheon

    NASA Astrophysics Data System (ADS)

    Beuville, Eric; Acton, David; Corrales, Elizabeth; Drab, John; Levy, Alan; Merrill, Michael; Peralta, Richard; Ritchie, William

    2007-09-01

    Raytheon Vision Systems (RVS) has developed a family of high performance large format infrared detector arrays for astronomy and civil space applications. RVS offers unique off-the-shelf solutions to the astronomy community. This paper describes mega-pixel arrays, based on multiple detector materials, developed for astronomy and low-background applications. New focal plane arrays under development at RVS for the astronomy community will also be presented. Large Sensor Chip Assemblies (SCAs) using various detector materials like Si:PIN, HgCdTe, InSb, and Si:As IBC, covering a detection range from the visible to the long-wavelength infrared (LWIR), have been demonstrated with an excellent quantum efficiency and very good uniformity. These focal plane arrays have been assembled using state-of-the-art low noise, low power, readout integrated circuits (ROIC) designed at RVS. Raytheon packaging capabilities address reliability, precision alignment and flatness requirements for both ground-based and space applications. Multiple SCAs can be packaged into even larger focal planes. The VISTA telescope, for example, contains sixteen 2k × 2k infrared focal plane arrays. RVS astronomical arrays are being deployed world-wide in ground-based and space-based applications. A summary of performance data for each of these array types from instruments in operation will be presented (VIRGO Array for large format SWIR, the ORION and VISTA Arrays, NEWFIRM and other solutions for MWIR spectral ranges).

  16. PTGBase: an integrated database to study tandem duplicated genes in plants.

    PubMed

    Yu, Jingyin; Ke, Tao; Tehrim, Sadia; Sun, Fengming; Liao, Boshou; Hua, Wei

    2015-01-01

    Tandem duplication is a wide-spread phenomenon in plant genomes and plays significant roles in evolution and adaptation to changing environments. Tandem duplicated genes related to certain functions will lead to the expansion of gene families and bring increase of gene dosage in the form of gene cluster arrays. Many tandem duplication events have been studied in plant genomes; yet, there is a surprising shortage of efforts to systematically present the integration of large amounts of information about publicly deposited tandem duplicated gene data across the plant kingdom. To address this shortcoming, we developed the first plant tandem duplicated genes database, PTGBase. It delivers the most comprehensive resource available to date, spanning 39 plant genomes, including model species and newly sequenced species alike. Across these genomes, 54 130 tandem duplicated gene clusters (129 652 genes) are presented in the database. Each tandem array, as well as its member genes, is characterized in complete detail. Tandem duplicated genes in PTGBase can be explored through browsing or searching by identifiers or keywords of functional annotation and sequence similarity. Users can download tandem duplicated gene arrays easily to any scale, up to the complete annotation data set for an entire plant genome. PTGBase will be updated regularly with newly sequenced plant species as they become available. © The Author(s) 2015. Published by Oxford University Press.

  17. A user-friendly workflow for analysis of Illumina gene expression bead array data available at the arrayanalysis.org portal.

    PubMed

    Eijssen, Lars M T; Goelela, Varshna S; Kelder, Thomas; Adriaens, Michiel E; Evelo, Chris T; Radonjic, Marijana

    2015-06-30

    Illumina whole-genome expression bead arrays are a widely used platform for transcriptomics. Most of the tools available for the analysis of the resulting data are not easily applicable by less experienced users. ArrayAnalysis.org provides researchers with an easy-to-use and comprehensive interface to the functionality of R and Bioconductor packages for microarray data analysis. As a modular open source project, it allows developers to contribute modules that provide support for additional types of data or extend workflows. To enable data analysis of Illumina bead arrays for a broad user community, we have developed a module for ArrayAnalysis.org that provides a free and user-friendly web interface for quality control and pre-processing for these arrays. This module can be used together with existing modules for statistical and pathway analysis to provide a full workflow for Illumina gene expression data analysis. The module accepts data exported from Illumina's GenomeStudio, and provides the user with quality control plots and normalized data. The outputs are directly linked to the existing statistics module of ArrayAnalysis.org, but can also be downloaded for further downstream analysis in third-party tools. The Illumina bead arrays analysis module is available at http://www.arrayanalysis.org . A user guide, a tutorial demonstrating the analysis of an example dataset, and R scripts are available. The module can be used as a starting point for statistical evaluation and pathway analysis provided on the website or to generate processed input data for a broad range of applications in life sciences research.

  18. Comparative Performance and Model Agreement of Three Common Photovoltaic Array Configurations.

    PubMed

    Boyd, Matthew T

    2018-02-01

    Three grid-connected monocrystalline silicon arrays on the National Institute of Standards and Technology (NIST) campus in Gaithersburg, MD have been instrumented and monitored for 1 yr, with only minimal gaps in the data sets. These arrays range from 73 kW to 271 kW, and all use the same module, but have different tilts, orientations, and configurations. One array is installed facing east and west over a parking lot, one in an open field, and one on a flat roof. Various measured relationships and calculated standard metrics have been used to compare the relative performance of these arrays in their different configurations. Comprehensive performance models have also been created in the modeling software PVsyst for each array, and their predictions using measured on-site weather data are compared to the arrays' measured outputs. The comparisons show that all three arrays typically have monthly performance ratios (PRs) above 0.75, but differ significantly in their relative output, strongly correlating with their operating temperature and, to a lesser extent, their orientation. The model predictions are within 5% of the monthly delivered energy values except during the winter months, when there was intermittent snow on the arrays, and during maintenance and other outages.
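
    The monthly performance ratio used above has a standard (IEC 61724-style) definition: delivered energy normalized by rated power, divided by in-plane insolation normalized by 1 kW/m2. A minimal sketch with made-up numbers, not NIST measurements:

    ```python
    # Monthly performance ratio, IEC 61724-style:
    # PR = (E_AC / P_STC) / (H_POA / G_STC), with G_STC = 1 kW/m^2.
    # The example numbers are made up for illustration.
    def performance_ratio(e_ac_kwh, p_stc_kw, h_poa_kwh_per_m2, g_stc=1.0):
        final_yield = e_ac_kwh / p_stc_kw           # kWh per kW installed
        reference_yield = h_poa_kwh_per_m2 / g_stc  # equivalent full-sun hours
        return final_yield / reference_yield

    # e.g. a 271 kW array delivering 33,000 kWh in a month with 150 kWh/m^2 in plane
    print(f"PR = {performance_ratio(33_000, 271, 150):.2f}")
    ```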

  19. Comprehensive comparison of three commercial human whole-exome capture platforms.

    PubMed

    Asan; Xu, Yu; Jiang, Hui; Tyler-Smith, Chris; Xue, Yali; Jiang, Tao; Wang, Jiawei; Wu, Mingzhi; Liu, Xiao; Tian, Geng; Wang, Jun; Wang, Jian; Yang, Huanming; Zhang, Xiuqing

    2011-09-28

    Exome sequencing, which allows the global analysis of protein coding sequences in the human genome, has become an effective and affordable approach to detecting causative genetic mutations in diseases. Currently, there are several commercial human exome capture platforms; however, their relative performances have not been characterized sufficiently to know which is best for a particular study. We comprehensively compared three platforms: NimbleGen's Sequence Capture Array and SeqCap EZ, and Agilent's SureSelect. We assessed their performance in a variety of ways, including the number of genes covered and capture efficacy. Differences that may affect the choice of platform were that Agilent SureSelect covered approximately 1,100 more genes, while NimbleGen provided better flanking-sequence capture. Although all three platforms achieved similar capture specificity of targeted regions, the NimbleGen platforms showed better uniformity of coverage and greater genotype sensitivity at 30- to 100-fold sequencing depth. All three platforms showed similar power in exome SNP calling, including medically relevant SNPs. Compared with genotyping and whole-genome sequencing data, the three platforms achieved a similar accuracy of genotype assignment and SNP detection. Importantly, all three platforms showed similar levels of reproducibility, GC bias and reference allele bias. We demonstrate key differences between the three platforms, particularly the advantages of solution-based capture over array-based capture and the importance of a large gene target set.

  20. HAGRID/ VANDLE spectroscopy of Rb decays

    NASA Astrophysics Data System (ADS)

    King, Thomas; Grzywacz, Robert; Taylor, Steven; Paulauskas, Stanley; Smith, Karl; Vandle Collaboration

    2017-09-01

    Many neutron-rich isotopes that contribute to both decay heat production and r-process nucleosynthesis have substantial beta-delayed neutron branching ratios. Beta-delayed neutron emission is a relatively complicated mechanism that can leave the daughter in a gamma-emitting excited state. A comprehensive understanding of their energy output and decay strength, S_beta, therefore requires the detection of both neutrons and gamma rays in coincidence. A series of measurements of delayed neutron precursors was performed at the On-Line Test Facility (OLTF) at Oak Ridge National Laboratory using chemically selective ion sources and an enhanced VANDLE array. The main goal of this experiment was to revisit the decays of IAEA-marked priority precursors, including bromine, rubidium, cesium, and iodine, that are required to model the global properties of the fission of 238U. The unique data set, with neutron and gamma-ray coincidences, benefited from the addition to the VANDLE array of a high-efficiency gamma-ray array consisting of 16 LaBr3 crystals (HAGRiD) and a set of large-volume NaI detectors. Characterization of and preliminary results from the new gamma-ray array for the decays of 94Rb and 97Rb will be presented. This work was supported by the National Nuclear Security Administration under the Stewardship Science Academic Alliances program through DOE Award No. DE-NA0002132 and the Office of Nuclear Physics, U.S. Department of Energy under Award No. DE-FG02-96ER40983.

  1. Imaging 2015 Mw 7.8 Gorkha Earthquake and Its Aftershock Sequence Combining Multiple Calibrated Global Seismic Arrays

    NASA Astrophysics Data System (ADS)

    LI, B.; Ghosh, A.

    2016-12-01

    The 2015 Mw 7.8 Gorkha earthquake provides a good opportunity to study the tectonics and earthquake hazards of the Himalayas, one of the most seismically active plate boundaries. Details of the seismicity patterns and associated structures in the Himalayas are poorly understood, mainly due to limited instrumentation. Here, we apply a back-projection method to study the mainshock rupture and the following aftershock sequence using four large-aperture global seismic arrays. All the arrays show eastward rupture propagation of about 130 km and reveal a similar evolution of seismic energy radiation, with a strong high-frequency energy burst about 50 km north of Kathmandu. Each single array, however, is typically limited by a large azimuthal gap, low resolution, and artifacts due to unmodeled velocity structure. Therefore, we use a self-consistent empirical calibration method to combine the four arrays to image the Gorkha event. This greatly improves the resolution, tracks the rupture better, and reveals details that cannot be resolved by any individual array. In addition, we use the same arrays at teleseismic distances and apply a back-projection technique to detect and locate the aftershocks immediately following the Gorkha earthquake. We detect about 2.5 times the aftershocks recorded by the Advanced National Seismic System comprehensive earthquake catalog during the 19 days following the mainshock. The aftershocks detected by the arrays show a generally east-west trend, with the majority located in the eastern part of the rupture patch and surrounding the rupture zone of the largest aftershock, the Mw 7.3 event. The overall spatiotemporal aftershock pattern agrees well with the global catalog, while our catalog resolves more detail. The improved aftershock catalog enables us to better study aftershock dynamics and stress evolution in this region. Moreover, rapid and better imaging of the aftershock distribution may aid rapid response and hazard assessment after large destructive earthquakes. Existing global seismic arrays, when properly calibrated and used in combination, provide a high-resolution image of the rupture of large earthquakes and of the spatiotemporal distribution of aftershocks.
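
    As a rough illustration of the back-projection technique named above (not the authors' calibrated implementation, which adds empirical travel-time corrections and multi-array combination), the core operation shifts array waveforms by the travel times predicted for each trial source location and stacks them; coherent energy marks the radiator. A minimal sketch:

    ```python
    import numpy as np

    def back_project(traces, t, travel_time, grid, win=(0.0, 10.0)):
        """Beam power at each trial source from one seismic array.

        traces: (n_sta, n_samp) waveform envelopes on common time axis t (s)
        travel_time: callable (station_index, grid_point) -> predicted time (s)
        grid: list of candidate source locations; win: stacking window (s)
        """
        power = np.zeros(len(grid))
        for k, g in enumerate(grid):
            stack = np.zeros_like(t)
            for i, tr in enumerate(traces):
                # align each trace on its predicted arrival for this grid point
                stack += np.interp(t, t - travel_time(i, g), tr,
                                   left=0.0, right=0.0)
            m = (t >= win[0]) & (t < win[1])
            power[k] = np.sum(stack[m] ** 2)
        return power
    ```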

  2. From reading numbers to seeing ratios: a benefit of icons for risk comprehension.

    PubMed

    Tubau, Elisabet; Rodríguez-Ferreiro, Javier; Barberia, Itxaso; Colomé, Àngels

    2018-06-21

    Promoting a better understanding of statistical data is becoming increasingly important for improving risk comprehension and decision-making. In this regard, previous studies on Bayesian problem solving have shown that iconic representations help infer frequencies in sets and subsets. Nevertheless, the mechanisms by which icons enhance performance remain unclear. Here, we tested the hypothesis that the benefit offered by icon arrays lies in a better alignment between presented and requested relationships, which should facilitate the comprehension of the requested ratio beyond the represented quantities. To this end, we analyzed individual risk estimates based on data presented either in standard verbal presentations (percentages and natural frequency formats) or as icon arrays. Compared to the other formats, icons led to estimates that were more accurate, and importantly, promoted the use of equivalent expressions for the requested probability. Furthermore, whereas the accuracy of the estimates based on verbal formats depended on their alignment with the text, all the estimates based on icons were equally accurate. Therefore, these results support the proposal that icons enhance the comprehension of the ratio and its mapping onto the requested probability and point to relational misalignment as potential interference for text-based Bayesian reasoning. The present findings also argue against an intrinsic difficulty with understanding single-event probabilities.
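
    For readers unfamiliar with the task format, the "requested ratio" in a typical Bayesian problem is a subset proportion that an icon array depicts directly, whereas percentage formats force the reader to reconstruct it. A worked example with hypothetical numbers (not taken from the study):

    ```python
    # Natural-frequency rendering of a standard screening problem.
    population = 1000
    sick = 10                  # 1% prevalence
    true_pos = 9               # 90% sensitivity: 9 of the 10 sick test positive
    false_pos = 99             # ~10% false-positive rate among the 990 healthy

    positives = true_pos + false_pos
    ppv = true_pos / positives  # the requested ratio: sick among positives
    print(f"{true_pos} of {positives} positives are sick -> P = {ppv:.2f}")
    ```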

  3. Somatic mutations affect key pathways in lung adenocarcinoma

    PubMed Central

    Ding, Li; Getz, Gad; Wheeler, David A.; Mardis, Elaine R.; McLellan, Michael D.; Cibulskis, Kristian; Sougnez, Carrie; Greulich, Heidi; Muzny, Donna M.; Morgan, Margaret B.; Fulton, Lucinda; Fulton, Robert S.; Zhang, Qunyuan; Wendl, Michael C.; Lawrence, Michael S.; Larson, David E.; Chen, Ken; Dooling, David J.; Sabo, Aniko; Hawes, Alicia C.; Shen, Hua; Jhangiani, Shalini N.; Lewis, Lora R.; Hall, Otis; Zhu, Yiming; Mathew, Tittu; Ren, Yanru; Yao, Jiqiang; Scherer, Steven E.; Clerc, Kerstin; Metcalf, Ginger A.; Ng, Brian; Milosavljevic, Aleksandar; Gonzalez-Garay, Manuel L.; Osborne, John R.; Meyer, Rick; Shi, Xiaoqi; Tang, Yuzhu; Koboldt, Daniel C.; Lin, Ling; Abbott, Rachel; Miner, Tracie L.; Pohl, Craig; Fewell, Ginger; Haipek, Carrie; Schmidt, Heather; Dunford-Shore, Brian H.; Kraja, Aldi; Crosby, Seth D.; Sawyer, Christopher S.; Vickery, Tammi; Sander, Sacha; Robinson, Jody; Winckler, Wendy; Baldwin, Jennifer; Chirieac, Lucian R.; Dutt, Amit; Fennell, Tim; Hanna, Megan; Johnson, Bruce E.; Onofrio, Robert C.; Thomas, Roman K.; Tonon, Giovanni; Weir, Barbara A.; Zhao, Xiaojun; Ziaugra, Liuda; Zody, Michael C.; Giordano, Thomas; Orringer, Mark B.; Roth, Jack A.; Spitz, Margaret R.; Wistuba, Ignacio I.; Ozenberger, Bradley; Good, Peter J.; Chang, Andrew C.; Beer, David G.; Watson, Mark A.; Ladanyi, Marc; Broderick, Stephen; Yoshizawa, Akihiko; Travis, William D.; Pao, William; Province, Michael A.; Weinstock, George M.; Varmus, Harold E.; Gabriel, Stacey B.; Lander, Eric S.; Gibbs, Richard A.; Meyerson, Matthew; Wilson, Richard K.

    2009-01-01

    Determining the genetic basis of cancer requires comprehensive analyses of large collections of histopathologically well-classified primary tumours. Here we report the results of a collaborative study to discover somatic mutations in 188 human lung adenocarcinomas. DNA sequencing of 623 genes with known or potential relationships to cancer revealed more than 1,000 somatic mutations across the samples. Our analysis identified 26 genes that are mutated at significantly high frequencies and thus are probably involved in carcinogenesis. The frequently mutated genes include tyrosine kinases, among them the EGFR homologue ERBB4; multiple ephrin receptor genes, notably EPHA3; vascular endothelial growth factor receptor KDR; and NTRK genes. These data provide evidence of somatic mutations in primary lung adenocarcinoma for several tumour suppressor genes involved in other cancers—including NF1, APC, RB1 and ATM—and for sequence changes in PTPRD as well as the frequently deleted gene LRP1B. The observed mutational profiles correlate with clinical features, smoking status and DNA repair defects. These results are reinforced by data integration including single nucleotide polymorphism array and gene expression array. Our findings shed further light on several important signalling pathways involved in lung adenocarcinoma, and suggest new molecular targets for treatment. PMID:18948947

  4. Sweetwater, Texas Large N Experiment

    NASA Astrophysics Data System (ADS)

    Sumy, D. F.; Woodward, R.; Barklage, M.; Hollis, D.; Spriggs, N.; Gridley, J. M.; Parker, T.

    2015-12-01

    From 7 March to 30 April 2014, NodalSeismic, Nanometrics, and IRIS PASSCAL conducted a collaborative, spatially dense seismic survey with several thousand nodal short-period geophones complemented by a backbone array of broadband sensors near Sweetwater, Texas. This pilot project demonstrates the efficacy of industry and academic partnerships, and leveraged a larger, commercial 3D survey to collect passive-source seismic recordings to image the subsurface. This innovative deployment of a large-N mixed-mode array allows industry to explore array geometries and investigate the value of broadband recordings, while affording academics a dense wavefield-imaging capability and an operational model for high-volume instrument deployment. The broadband array consists of 25 continuously recording stations from IRIS PASSCAL and Nanometrics, with an array design that maximized recording of horizontally traveling seismic energy for surface-wave analysis over the primary target area, with sufficient offset for imaging objectives at depth. In addition, 2639 Fairfield Nodal ZLand nodes from NodalSeismic were deployed in three sub-arrays: the outlier, backbone, and active-source arrays. The backbone array consisted of 292 nodes covering the entire survey area, while the outlier array consisted of 25 continuously recording nodes distributed ~3 km outside the survey perimeter. Both the backbone and outlier arrays provide valuable constraints for the passive-source portion of the analysis. This project serves as a learning platform for developing best practices in the support of large-N arrays with joint industry and academic expertise. Here we investigate lessons learned from a facility perspective, and present examples of data from the various sensors and array geometries. We will explore first-order results from local and teleseismic earthquakes, and show visualizations of the data across the array. Data are archived at the IRIS DMC under network codes XB and 1B.

  5. Modeling and Flight Data Analysis of Spacecraft Dynamics with a Large Solar Array Paddle

    NASA Technical Reports Server (NTRS)

    Iwata, Takanori; Maeda, Ken; Hoshino, Hiroki

    2007-01-01

    The Advanced Land Observing Satellite (ALOS) was launched on January 24, 2006, and has been operated successfully since then. The satellite's attitude dynamics are characterized by three large flexible structures, four large moving components, and stringent attitude/pointing stability requirements; in particular, it carries one of the largest solar array paddles. This paper presents flight data analyses and modeling of the spacecraft attitude motion induced by the large solar array paddle. The on-orbit attitude dynamics were first characterized and summarized. Three characteristic motions associated with the solar array paddle were identified and assessed: thermally induced motion, pitch excitation by the paddle drive, and roll excitation. The thermally induced motion and the pitch excitation by the paddle drive were modeled and simulated to verify the mechanics of the motions. The control law updates implemented to mitigate the attitude vibrations are also reported.

  6. Early-branching Gut Fungi Possess a Large and Comprehensive Array of Biomass-Degrading Enzymes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solomon, Kevin V.; Haitjema, Charles; Henske, John K.

    The fungal kingdom is the source of almost all industrial enzymes in use for lignocellulose bioprocessing. Its more primitive members, however, remain relatively unexploited. We developed a systems-level approach that integrates RNA-Seq, proteomics, phenotype, and biochemical studies of relatively unexplored early-branching free-living fungi. Anaerobic gut fungi isolated from herbivores produce a large array of biomass-degrading enzymes that synergistically degrade crude, unpretreated plant biomass and are competitive with optimized commercial preparations from Aspergillus and Trichoderma. Compared to these model platforms, gut fungal enzymes are unbiased in substrate preference due to a wealth of xylan-degrading enzymes. These enzymes are universally catabolite-repressed and are further regulated by a rich landscape of noncoding regulatory RNAs. Furthermore, we identified several promising sequence-divergent enzyme candidates for lignocellulosic bioprocessing.

  7. Uncertainties associated with parameter estimation in atmospheric infrasound arrays.

    PubMed

    Szuberla, Curt A L; Olson, John V

    2004-01-01

    This study describes a method for determining the statistical confidence in estimates of direction-of-arrival and trace velocity stemming from signals present in atmospheric infrasound data. It is assumed that the signal source is far enough removed from the infrasound sensor array that a plane-wave approximation holds, and that multipath and multiple source effects are not present. Propagation path and medium inhomogeneities are assumed not to be known at the time of signal detection, but the ensemble of time delays of signal arrivals between array sensor pairs is estimable and corrupted by uncorrelated Gaussian noise. The method results in a set of practical uncertainties that lend themselves to a geometric interpretation. Although quite general, this method is intended for use by analysts interpreting data from atmospheric acoustic arrays, or those interested in designing and deploying them. The method is applied to infrasound arrays typical of those deployed as a part of the International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty Organization.
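
    A minimal sketch of the plane-wave estimation that such analyses rest on (the paper's contribution is the uncertainty analysis layered on top, omitted here): pairwise delays are linear in the horizontal slowness vector, so direction-of-arrival and trace velocity follow from least squares.

    ```python
    import numpy as np

    def slowness_from_delays(xy, pairs_tau, sigma=None):
        """Least-squares horizontal slowness from pairwise arrival delays.

        xy: (n, 2) sensor coordinates (m). pairs_tau: dict mapping sensor
        pairs (i, j) to delays, with tau_ij ~ s . (xy[j] - xy[i]) under the
        plane-wave assumption.
        """
        D = np.array([xy[j] - xy[i] for (i, j) in pairs_tau])
        tau = np.array([pairs_tau[k] for k in pairs_tau])
        s, *_ = np.linalg.lstsq(D, tau, rcond=None)
        trace_velocity = 1.0 / np.linalg.norm(s)           # m/s
        back_azimuth = np.degrees(np.arctan2(s[0], s[1]))  # convention varies
        if sigma is not None:
            # With uncorrelated Gaussian delay noise of std sigma,
            # cov(s) = sigma^2 (D^T D)^-1 -- the seed of confidence regions.
            return s, trace_velocity, back_azimuth, sigma**2 * np.linalg.inv(D.T @ D)
        return s, trace_velocity, back_azimuth
    ```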

  8. Kids & Media @ the New Millennium: A Kaiser Family Foundation Report. A Comprehensive National Analysis of Children's Media Use. Executive Summary.

    ERIC Educational Resources Information Center

    Roberts, Donald F.

    A study examined media use patterns among a large, nationally representative sample of children ages 2-18 and explored how children choose and interact with the whole array of media available to them, including television, movies, computers, music, video games, radio, magazines, books, and newspapers. The goal was to provide a solid base…

  9. Theoretical Model of Electrode Polarization and AC Electroosmotic Fluid Flow in Planar Electrode Arrays.

    PubMed

    Scott, Matthew; Kaler, Karan V. I. S.; Paul, Reginald

    2001-06-15

    Strong frequency-dependent fluid flow has been observed near the surface of microelectrode arrays. Modeling this phenomenon has proven to be difficult, with existing theories unable to account for the qualitative trend observed in the frequency spectra of this flow. Using recent electrode polarization results, a more comprehensive model of the double layer on the electrode surface is used to obtain good theoretical agreement with experimental data. Copyright 2001 Academic Press.

  10. Environmental Testing and Thermal Analysis of the NPS Solar Cell Array Tester (NPS-SCAT) CubeSat

    DTIC Science & Technology

    2011-06-01

    Fragments captured from the report (acronym list and body text): BCR Battery Charge Regulator; C&DH Command and Data Handling; CAD Computer Aided Design; CDR Critical Design Review; CFT Comprehensive Functional Test; CPT Comprehensive Performance Test; CoM Center of Mass; COTS Commercial Off-the-Shelf; CTB Cargo Transfer Bag; EDU Engineering Design Unit; EPS … Section 2.C, Environmental Testing: "Environmental testing is an important element of the design and testing of a satellite. …"

  11. The Role of NOAA's National Data Centers in the Earth and Space Science Infrastructure

    NASA Astrophysics Data System (ADS)

    Fox, C. G.

    2008-12-01

    NOAA's National Data Centers (NNDC) provide access to long-term archives of environmental data from NOAA and other sources. The NNDCs face significant challenges in the volume and complexity of modern data sets. Data-volume challenges are being addressed with more capable archive systems such as the Comprehensive Large Array-Data Stewardship System (CLASS). Assuring data quality and stewardship is in many ways more difficult. In the past, scientists at the Data Centers could provide reasonable stewardship of data sets in their areas of expertise. As staff levels have decreased and data complexity has increased, the Data Centers depend on their data providers and user communities for high-quality metadata and for feedback on data problems and improvements. This relationship requires strong partnerships between the NNDCs and academic, commercial, and international partners, as well as advanced data-management and access tools that conform to established international standards when available. The NNDCs are looking to geospatial databases, interactive mapping, web services, and other application program interface approaches to help preserve NNDC data and information and to make them easily available to the scientific community.

  12. Anisotropy and corotation of galactic cosmic rays.

    PubMed

    Amenomori, M; Ayabe, S; Bi, X J; Chen, D; Cui, S W; Danzengluobu; Ding, L K; Ding, X H; Feng, C F; Feng, Zhaoyang; Feng, Z Y; Gao, X Y; Geng, Q X; Guo, H W; He, H H; He, M; Hibino, K; Hotta, N; Hu, Haibing; Hu, H B; Huang, J; Huang, Q; Jia, H Y; Kajino, F; Kasahara, K; Katayose, Y; Kato, C; Kawata, K; Labaciren; Le, G M; Li, A F; Li, J Y; Lou, Y-Q; Lu, H; Lu, S L; Meng, X R; Mizutani, K; Mu, J; Munakata, K; Nagai, A; Nanjo, H; Nishizawa, M; Ohnishi, M; Ohta, I; Onuma, H; Ouchi, T; Ozawa, S; Ren, J R; Saito, T; Saito, T Y; Sakata, M; Sako, T K; Sasaki, T; Shibata, M; Shiomi, A; Shirai, T; Sugimoto, H; Takita, M; Tan, Y H; Tateyama, N; Torii, S; Tsuchiya, H; Udo, S; Wang, B; Wang, H; Wang, X; Wang, Y G; Wu, H R; Xue, L; Yamamoto, Y; Yan, C T; Yang, X C; Yasue, S; Ye, Z H; Yu, G C; Yuan, A F; Yuda, T; Zhang, H M; Zhang, J L; Zhang, N J; Zhang, X Y; Zhang, Y; Zhang, Yi; Zhaxisangzhu; Zhou, X X

    2006-10-20

    The intensity of Galactic cosmic rays is nearly isotropic because of the influence of magnetic fields in the Milky Way. Here, we present a two-dimensional high-precision anisotropy measurement for energies from a few to several hundred teraelectronvolts (TeV), using the large data sample of the Tibet Air Shower Arrays. Besides revealing finer details of the known anisotropies, a new component of Galactic cosmic-ray anisotropy in sidereal time is uncovered around the direction of the Cygnus region. For cosmic-ray energies above a few hundred TeV, all components of the anisotropy fade away, showing a corotation of Galactic cosmic rays with the local Galactic magnetic environment. These results have broad implications for a comprehensive understanding of cosmic rays, supernovae, magnetic fields, and heliospheric and Galactic dynamic environments.

  13. Botulinum neurotoxin A complex recognizes host carbohydrates through its hemagglutinin component.

    PubMed

    Yao, Guorui; Lee, Kwangkook; Gu, Shenyan; Lam, Kwok-Ho; Jin, Rongsheng

    2014-02-12

    Botulinum neurotoxins (BoNTs) are potent bacterial toxins. The high oral toxicity of BoNTs is largely attributed to the progenitor toxin complex (PTC), which is assembled from BoNT and nontoxic neurotoxin-associated proteins (NAPs) that are produced together with BoNT in bacteria. Here, we performed ex vivo studies to examine binding of the highly homogeneous recombinant NAPs to mouse small intestine. We also carried out the first comprehensive glycan array screening with the hemagglutinin (HA) component of NAPs. Our data confirmed that intestinal binding of the PTC is partly mediated by the HA moiety through multivalent interactions between HA and host carbohydrates. The specific HA-carbohydrate recognition could be inhibited by receptor-mimicking saccharides.

  14. A comprehensive transcript index of the human genome generated using microarrays and computational approaches

    PubMed Central

    Schadt, Eric E; Edwards, Stephen W; GuhaThakurta, Debraj; Holder, Dan; Ying, Lisa; Svetnik, Vladimir; Leonardson, Amy; Hart, Kyle W; Russell, Archie; Li, Guoya; Cavet, Guy; Castle, John; McDonagh, Paul; Kan, Zhengyan; Chen, Ronghua; Kasarskis, Andrew; Margarint, Mihai; Caceres, Ramon M; Johnson, Jason M; Armour, Christopher D; Garrett-Engele, Philip W; Tsinoremas, Nicholas F; Shoemaker, Daniel D

    2004-01-01

    Background Computational and microarray-based experimental approaches were used to generate a comprehensive transcript index for the human genome. Oligonucleotide probes designed from approximately 50,000 known and predicted transcript sequences from the human genome were used to survey transcription from a diverse set of 60 tissues and cell lines using ink-jet microarrays. Further, expression activity over at least six conditions was more generally assessed using genomic tiling arrays consisting of probes tiled through a repeat-masked version of the genomic sequence making up chromosomes 20 and 22. Results The combination of microarray data with extensive genome annotations resulted in a set of 28,456 experimentally supported transcripts. This set of high-confidence transcripts represents the first experimentally driven annotation of the human genome. In addition, the results from genomic tiling suggest that a large amount of transcription exists outside of annotated regions of the genome and serves as an example of how this activity could be measured on a genome-wide scale. Conclusions These data represent one of the most comprehensive assessments of transcriptional activity in the human genome and provide an atlas of human gene expression over a unique set of gene predictions. Before the annotation of the human genome is considered complete, however, the previously unannotated transcriptional activity throughout the genome must be fully characterized. PMID:15461792

  15. Hierarchical imaging: a new concept for targeted imaging of large volumes from cells to tissues.

    PubMed

    Wacker, Irene; Spomer, Waldemar; Hofmann, Andreas; Thaler, Marlene; Hillmer, Stefan; Gengenbach, Ulrich; Schröder, Rasmus R

    2016-12-12

    Imaging large volumes such as entire cells or small model organisms at nanoscale resolution has so far seemed an unrealistic, rather tedious task. Now, technical advances have led to several electron microscopy (EM) large-volume imaging techniques. One is array tomography, where ribbons of ultrathin serial sections are deposited on solid substrates like silicon wafers or glass coverslips. To ensure reliable retrieval of multiple ribbons from the boat of a diamond knife, we introduce a substrate holder with 7 axes of translation or rotation specifically designed for that purpose. With this device we are able to deposit hundreds of sections in an ordered way in an area of 22 × 22 mm, the size of a coverslip. Imaging such arrays in a standard wide-field fluorescence microscope produces reconstructions with 200 nm lateral resolution and 100 nm (the section thickness) resolution in z. By hierarchical imaging cascades in the scanning electron microscope (SEM), using a new software platform, we can address volumes from single cells to complete organs. In our first example, a cell population isolated from zebrafish spleen, we characterize different cell types according to their organelle inventory by segmenting 3D reconstructions of complete cells imaged with nanoscale resolution. In addition, by screening large numbers of cells at decreased resolution we can define the percentages at which different cell types are present in our preparation. With the second example, the root tip of cress, we illustrate how combining information from intermediate-resolution data with high-resolution data from selected regions of interest can drastically reduce the amount of data that has to be recorded. By imaging only the interesting parts of a sample, considerably less data need to be stored, handled, and eventually analysed. Our custom-designed substrate holder allows reproducible generation of section libraries, which can then be imaged in a hierarchical way. We demonstrate that EM volume data at different levels of resolution can yield comprehensive information, including statistics, morphology, and organization of cells and tissue. We predict that hierarchical imaging will be a first step in tackling the big-data issue inevitably connected with volume EM.

  16. VizieR Online Data Catalog: 8 Fermi GRB afterglows follow-up (Singer+, 2015)

    NASA Astrophysics Data System (ADS)

    Singer, L. P.; Kasliwal, M. M.; Cenko, S. B.; Perley, D. A.; Anderson, G. E.; Anupama, G. C.; Arcavi, I.; Bhalerao, V.; Bue, B. D.; Cao, Y.; Connaughton, V.; Corsi, A.; Cucchiara, A.; Fender, R. P.; Fox, D. B.; Gehrels, N.; Goldstein, A.; Gorosabel, J.; Horesh, A.; Hurley, K.; Johansson, J.; Kann, D. A.; Kouveliotou, C.; Huang, K.; Kulkarni, S. R.; Masci, F.; Nugent, P.; Rau, A.; Rebbapragada, U. D.; Staley, T. D.; Svinkin, D.; Thone, C. C.; de Ugarte Postigo, A.; Urata, Y.; Weinstein, A.

    2015-10-01

    In this work, we present the GBM-iPTF (intermediate Palomar Transient Factory) afterglows from the first 13 months of this project. Follow-up observations include R-band photometry from the P48, multicolor photometry from the P60, spectroscopy (acquired with the P200, Keck, Gemini, APO, Magellan, Very Large Telescope (VLT), and GTC), and radio observations with the Very Large Array (VLA), the Combined Array for Research in Millimeter-wave Astronomy (CARMA), the Australia Telescope Compact Array (ATCA), and the Arcminute Microkelvin Imager (AMI). (3 data files).

  17. R classes and methods for SNP array data.

    PubMed

    Scharpf, Robert B; Ruczinski, Ingo

    2010-01-01

    The Bioconductor project is an "open source and open development software project for the analysis and comprehension of genomic data" (1), primarily based on the R programming language. Infrastructure packages, such as Biobase, are maintained by Bioconductor core developers and serve several key roles to the broader community of Bioconductor software developers and users. In particular, Biobase introduces an S4 class, the eSet, for high-dimensional assay data. Encapsulating the assay data as well as meta-data on the samples, features, and experiment in the eSet class definition ensures propagation of the relevant sample and feature meta-data throughout an analysis. Extending the eSet class promotes code reuse through inheritance as well as interoperability with other R packages and is less error-prone. Recently proposed class definitions for high-throughput SNP arrays extend the eSet class. This chapter highlights the advantages of adopting and extending Biobase class definitions through a working example of one implementation of classes for the analysis of high-throughput SNP arrays.

  18. Massively parallel processor computer

    NASA Technical Reports Server (NTRS)

    Fung, L. W. (Inventor)

    1983-01-01

    An apparatus for processing multidimensional data with strong spatial characteristics, such as raw image data, characterized by a large number of parallel data streams in an ordered array, is described. It comprises a large number (e.g., 16,384 in a 128 × 128 array) of parallel processing elements operating simultaneously and independently on single bit slices of a corresponding array of incoming data streams under control of a single set of instructions. Each of the processing elements comprises a bidirectional data bus in communication with a register for storing single bit slices together with a random access memory unit and associated circuitry, including a binary counter/shift register device, for performing logical and arithmetical computations on the bit slices, and an I/O unit for interfacing the bidirectional data bus with the data stream source. The massively parallel processor architecture enables very high speed processing of large amounts of ordered parallel data, including spatial translation by shifting or sliding of bits vertically or horizontally to neighboring processing elements.
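
    The spatial translation described in the last sentence is easy to mimic with array operations; the sketch below uses NumPy as a stand-in for the lockstep hardware (conceptual illustration only, not the patented design).

    ```python
    import numpy as np

    # One bit plane held by a 128 x 128 grid of processing elements, all
    # executing the same instruction. np.roll stands in for the hardware
    # shift of bits to neighboring PEs.
    bits = np.random.randint(0, 2, size=(128, 128), dtype=np.uint8)

    shift_north = np.roll(bits, -1, axis=0)  # each PE takes its southern neighbor's bit
    shift_east = np.roll(bits, 1, axis=1)    # each PE takes its western neighbor's bit

    # Bit-serial arithmetic combines such planes slice by slice in lockstep:
    half_sum = bits ^ shift_north            # XOR gives the sum bit
    carry = bits & shift_north               # AND gives the carry bit
    ```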

  19. Validation of a common data model for active safety surveillance research

    PubMed Central

    Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E

    2011-01-01

    Objective Systematic analysis of observational medical databases for active safety surveillance is hindered by the variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to support their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes for large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before beginning to use the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion There was acceptable representation of the data from 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893

  20. A Comparative Study of a 1/4-Scale Gulfstream G550 Aircraft Nose Gear Model

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Neuhart, Dan H.; Zawodny, Nikolas S.; Liu, Fei; Yardibi, Tarik; Cattafesta, Louis; Van de Ven, Thomas

    2009-01-01

    A series of fluid dynamic and aeroacoustic wind tunnel experiments was performed at the University of Florida Aeroacoustic Flow Facility and the NASA Langley Basic Aerodynamic Research Tunnel facility on a high-fidelity 1/4-scale model of a Gulfstream G550 aircraft nose gear. The primary objectives of this study were to obtain a comprehensive aeroacoustic dataset for a nose landing gear and to provide a clearer understanding of landing gear contributions to the overall airframe noise of commercial aircraft in landing configuration. Data measurement and analysis consist of mean and fluctuating model surface pressures, noise source localization maps using a large-aperture microphone directional array, and far-field noise level spectra determined with a linear array of free-field microphones. A total of 24 test runs was performed, consisting of four model assembly configurations, each of which was subjected to three test-section speeds in two different test-section orientations. The model assembly configurations vary in complexity from a fully-dressed to a partially-dressed geometry. The two model orientations provide flyover and sideline views from the perspective of a phased acoustic array for noise source localization via beamforming. Results show that the torque-arm section of the model exhibits the highest rms pressures for all model configurations, which is also evidenced in the sideline-view noise source maps for the partially-dressed model geometries. Analysis of acoustic spectra from the linear-array microphones shows a slight decrease in sound pressure levels at mid to high frequencies for the partially-dressed, cavity-open model configuration. In addition, far-field sound pressure level spectra scale approximately with the 6th power of velocity and do not exhibit traditional Strouhal-number scaling behavior.
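
    The quoted 6th-power velocity scaling implies a definite level shift between test speeds, a quick consistency check on measured spectra; the speeds below are hypothetical, not the tunnel conditions of the study:

    ```python
    import numpy as np

    # Expected SPL shift between speeds U1 and U2 under p^2 ~ U^6 scaling:
    # delta = 10*log10((U2/U1)**6) = 60*log10(U2/U1).
    U1, U2 = 50.0, 60.0  # m/s, illustrative test-section speeds
    delta_spl_db = 60.0 * np.log10(U2 / U1)
    print(f"Expected SPL increase: {delta_spl_db:.1f} dB")  # ~4.8 dB
    ```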

  1. Storage and computationally efficient permutations of factorized covariance and square-root information arrays

    NASA Technical Reports Server (NTRS)

    Muellerschoen, R. J.

    1988-01-01

    A unified method to permute vector-stored Upper-triangular Diagonal (UD) factorized covariance arrays and vector-stored upper-triangular Square Root Information (SRI) arrays is presented. The method involves cyclic permutation of the rows and columns of the arrays and retriangularization with fast (slow) Givens rotations (reflections). Minimal computation is performed, and only a one-dimensional scratch array is required. To make the method efficient for large arrays on a virtual-memory machine, computations are arranged so as to avoid expensive paging faults. This method is potentially important for processing large volumes of radio metric data in the Deep Space Network.
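
    A compact way to see the retriangularization step: after the columns of an upper-triangular square-root information array are permuted, an orthogonal sweep restores triangularity without changing the information matrix. The sketch below substitutes a dense QR factorization for the paper's fast Givens rotations and its vector-stored layout:

    ```python
    import numpy as np

    def permute_sri(R, new_order):
        """Reorder the states of a square-root information array.

        R: upper-triangular SRI matrix; new_order: permutation of indices.
        QR stands in for the Givens sweep; both restore triangularity while
        preserving the information matrix R^T R (up to the reordering).
        """
        Rp = R[:, new_order]           # permute columns (state reordering)
        Q, R_new = np.linalg.qr(Rp)    # retriangularize
        return R_new

    # The permuted array carries the same information:
    R = np.triu(np.random.rand(4, 4) + np.eye(4))
    order = [1, 2, 3, 0]               # one cyclic permutation, as in the paper
    R2 = permute_sri(R, order)
    P = np.eye(4)[:, order]
    assert np.allclose(R2.T @ R2, P.T @ (R.T @ R) @ P)
    ```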

  2. High-resolution dynamic pressure sensor array based on piezo-phototronic effect tuned photoluminescence imaging.

    PubMed

    Peng, Mingzeng; Li, Zhou; Liu, Caihong; Zheng, Qiang; Shi, Xieqing; Song, Ming; Zhang, Yang; Du, Shiyu; Zhai, Junyi; Wang, Zhong Lin

    2015-03-24

    A high-resolution dynamic tactile/pressure display is indispensable to the comprehensive perception of force/mechanical stimulation in applications such as electronic skin, biomechanical imaging/analysis, and personalized signatures. Here, we present a dynamic pressure sensor array based on pressure/strain-tuned photoluminescence imaging, without the need for electricity. Each sensor is a nanopillar consisting of InGaN/GaN multiple quantum wells. Its photoluminescence intensity can be modulated dramatically and linearly by small strains (0-0.15%) owing to the piezo-phototronic effect. The sensor array has a high pixel density of 6350 dpi and exceptionally small standard deviation of photoluminescence. High-quality tactile/pressure sensing distributions can be recorded in real time by parallel photoluminescence imaging without any cross-talk. The sensor array can be inexpensively fabricated over large areas by semiconductor product lines. The proposed dynamic all-optical pressure imaging, with excellent resolution, high sensitivity, good uniformity, and ultrafast response time, offers a suitable approach for smart sensing and micro/nano-opto-electromechanical systems.

  3. Optimal Chunking of Large Multidimensional Arrays for Data Warehousing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Otoo, Ekow J; Otoo, Ekow J.; Rotem, Doron

    2008-02-15

    Very large multidimensional arrays are commonly used in data-intensive scientific computations as well as in online analytical processing applications, referred to as MOLAP. Such arrays are stored on disk by partitioning the large global array into fixed-size sub-arrays, called chunks or tiles, that form the units of data transfer between disk and memory. Typical queries retrieve sub-arrays in a manner that accesses every chunk overlapping the query result. An important metric of storage efficiency is the expected number of chunks retrieved over all such queries. The question that immediately arises is "what shapes of array chunks give the minimum expected number of chunks over a query workload?" The problem of optimal chunking was first introduced by Sarawagi and Stonebraker, who gave an approximate solution. In this paper we develop exact mathematical models of the problem and provide exact solutions using steepest-descent and geometric-programming methods. Experimental results, using synthetic and real-life workloads, show that our solutions are consistently within 2.0 percent of the true number of chunks retrieved for any number of dimensions. In contrast, the approximate solution of Sarawagi and Stonebraker can deviate considerably from the true result as the number of dimensions increases and may lead to suboptimal chunk shapes.
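
    The objective function at issue can be written down directly: under uniform query placement, a query of edge q_i meets on average (q_i - 1)/c_i + 1 chunks of edge c_i along each axis. A brute-force sketch of the optimization (the paper instead derives exact closed-form solutions via steepest descent and geometric programming):

    ```python
    from itertools import product
    import numpy as np

    def expected_chunks(q, c):
        # Expected chunks overlapped by a q-shaped query, uniform placement:
        # each axis contributes (q_i - 1)/c_i + 1 on average.
        return float(np.prod([(qi - 1) / ci + 1 for qi, ci in zip(q, c)]))

    def best_chunk_shape(query_shape, chunk_budget, max_edge=256):
        # Exhaustive search over integer chunk shapes within the cell budget
        # (exponential in dimensions -- adequate only for a small 2-D demo).
        best = None
        for c in product(range(1, max_edge + 1), repeat=len(query_shape)):
            if np.prod(c) <= chunk_budget:
                e = expected_chunks(query_shape, c)
                if best is None or e < best[0]:
                    best = (e, c)
        return best

    print(best_chunk_shape((100, 20), chunk_budget=4096))
    ```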

  4. Large-region acoustic source mapping using a movable array and sparse covariance fitting.

    PubMed

    Zhao, Shengkui; Tuna, Cagdas; Nguyen, Thi Ngoc Tho; Jones, Douglas L

    2017-01-01

    Large-region acoustic source mapping is important for city-scale noise monitoring. Approaches using a single-position measurement scheme to scan large regions using small arrays cannot provide clean acoustic source maps, while deploying large arrays spanning the entire region of interest is prohibitively expensive. A multiple-position measurement scheme is applied to scan large regions at multiple spatial positions using a movable array of small size. Based on the multiple-position measurement scheme, a sparse-constrained multiple-position vectorized covariance matrix fitting approach is presented. In the proposed approach, the overall sample covariance matrix of the incoherent virtual array is first estimated using the multiple-position array data and then vectorized using the Khatri-Rao (KR) product. A linear model is then constructed for fitting the vectorized covariance matrix and a sparse-constrained reconstruction algorithm is proposed for recovering source powers from the model. The user parameter settings are discussed. The proposed approach is tested on a 30 m × 40 m region and a 60 m × 40 m region using simulated and measured data. Much cleaner acoustic source maps and lower sound pressure level errors are obtained compared to the beamforming approaches and the previous sparse approach [Zhao, Tuna, Nguyen, and Jones, Proc. IEEE Intl. Conf. on Acoustics, Speech and Signal Processing (ICASSP) (2016)].
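
    The vectorized covariance model mentioned above has a compact linear-algebra form: for a steering matrix A and source powers p, vec(R) = (conj(A) ⊙ A) p, where ⊙ is the Khatri-Rao (columnwise Kronecker) product. A sketch with nonnegative least squares standing in for the paper's sparse-constrained reconstruction algorithm:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def fit_source_powers(R_hat, A):
        """Recover nonnegative source powers from a sample covariance.

        R_hat: (m, m) sample covariance of the (virtual) array
        A: (m, k) steering matrix, one column per candidate source location
        """
        # Khatri-Rao product: vec(a a^H) = conj(a) kron a, column by column.
        KR = np.vstack([np.kron(np.conj(A[:, k]), A[:, k])
                        for k in range(A.shape[1])]).T
        y = R_hat.flatten(order="F")        # vec() stacks columns
        # Split into real/imaginary parts for the real-valued NNLS solver.
        M = np.vstack([KR.real, KR.imag])
        b = np.concatenate([y.real, y.imag])
        p, _ = nnls(M, b)
        return p
    ```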

  5. IRIS Arrays: Observing Wavefields at Multiple Scales and Frequencies

    NASA Astrophysics Data System (ADS)

    Sumy, D. F.; Woodward, R.; Frassetto, A.

    2014-12-01

    The Incorporated Research Institutions for Seismology (IRIS) provides instruments for creating and operating seismic arrays at a wide range of scales. As an example, for over thirty years the IRIS PASSCAL program has provided instruments to individual Principal Investigators to deploy arrays of all shapes and sizes on every continent. These arrays have ranged from just a few sensors to hundreds or even thousands of sensors, covering areas with dimensions of meters to thousands of kilometers. IRIS also operates arrays directly, such as the USArray Transportable Array (TA) as part of the EarthScope program. Since 2004, the TA has rolled across North America, at any given time spanning a swath of approximately 800 km by 2,500 km, and thus far sampling 2% of the Earth's surface. This achievement includes all of the lower-48 U.S., southernmost Canada, and now parts of Alaska. IRIS has also facilitated specialized arrays in polar environments and on the seafloor. In all cases, the data from these arrays are freely available to the scientific community. As the community of scientists who use IRIS facilities and data look to the future they have identified a clear need for new array capabilities. In particular, as part of its Wavefields Initiative, IRIS is exploring new technologies that can enable large, dense array deployments to record unaliased wavefields at a wide range of frequencies. Large-scale arrays might utilize multiple sensor technologies to best achieve observing objectives and optimize equipment and logistical costs. Improvements in packaging and power systems can provide equipment with reduced size, weight, and power that will reduce logistical constraints for large experiments, and can make a critical difference for deployments in harsh environments or other situations where rapid deployment is required. We will review the range of existing IRIS array capabilities with an overview of previous and current deployments and examples of data and results. We will review existing IRIS projects that explore new array capabilities and highlight future directions for IRIS instrumentation facilities.

  6. Standard, Random, and Optimum Array conversions from Two-Pole resistance data

    DOE PAGES

    Rucker, D. F.; Glaser, Danney R.

    2014-09-01

    We present an array evaluation of standard and nonstandard arrays over a hydrogeological target. We develop the arrays by linearly combining data from the pole-pole (or 2-pole) array. The first test shows that reconstructed resistances for the standard Schlumberger and dipole-dipole arrays are equivalent or superior to the measured arrays in terms of noise, especially at large geometric factors. The inverse models for the standard arrays also confirm what others have presented in terms of target resolvability, namely that the dipole-dipole array has the highest resolution. In the second test, we reconstruct random electrode combinations from the 2-pole data, segregated into inner, outer, and overlapping dipoles. The resistance data and inverse models from these randomized arrays show those with inner dipoles to be superior in terms of noise and resolution, and that overlapping dipoles can cause model instability and low resolution. Finally, we use the 2-pole data to create an optimized array that maximizes the model resolution matrix for a given electrode geometry. The optimized array produces the highest resolution and target detail. The tests thus demonstrate that high-quality data and high model resolution can be achieved by acquiring field data with the pole-pole array.
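
    The linear combination used to build four-electrode arrays from 2-pole data is the standard superposition identity for current sources; a minimal sketch, with r_pole holding the measured two-pole transfer resistances:

    ```python
    def four_electrode_resistance(r_pole, a, b, m, n):
        """Reconstruct a four-electrode transfer resistance from pole-pole data.

        r_pole[(c, p)]: measured two-pole resistance with current electrode c
        and potential electrode p. By superposition of the current sources:
        R(AB, MN) = R(A,M) - R(A,N) - R(B,M) + R(B,N).
        """
        return r_pole[(a, m)] - r_pole[(a, n)] - r_pole[(b, m)] + r_pole[(b, n)]
    ```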

  7. Reduction of solar vector magnetograph data using a microMSP array processor

    NASA Technical Reports Server (NTRS)

    Kineke, Jack

    1990-01-01

    The processing of raw data obtained by the solar vector magnetograph at NASA-Marshall requires extensive arithmetic operations on large arrays of real numbers. The objectives of this summer faculty fellowship study are to: (1) learn the programming language of the MicroMSP Array Processor and adapt some existing data reduction routines to exploit its capabilities; and (2) identify other applications and/or existing programs that lend themselves to array processor utilization and that can be developed by undergraduate student programmers under the provisions of project JOVE.

  8. Efficient Memory Access with NumPy Global Arrays using Local Memory Access

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daily, Jeffrey A.; Berghofer, Dan C.

    This paper discusses work on Global Arrays of data on distributed multi-computer systems and on improving their performance. The tasks were completed at Pacific Northwest National Laboratory in the Science Undergraduate Laboratory Internship program in the summer of 2013, for the Data Intensive Computing Group in the Fundamental and Computational Sciences Directorate, on the Global Arrays Toolkit developed by that group. The toolkit is an interface that lets programmers more easily create arrays of data on networks of computers. This is useful because scientific computation is often done on amounts of data so large that no individual computer can hold all of it. The data are held in array form and are best processed on supercomputers, which often consist of a network of individual computers computing in parallel. One major challenge for this sort of programming is that operating on arrays spread across multiple computers is very complex, so an interface is needed to make these arrays appear to reside on a single computer; this is what Global Arrays provides. The work described here makes operations on that data more efficient by requiring less copying, which saves considerable time because copying data between many different computers is expensive. The approach: when the operands of a binary operation reside on the same computer, they are accessed without being copied; when they reside on separate computers, only one operand is copied. Although this entails more data-access operations, the reduction in copying saves time overall.
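
    The copy-avoidance idea is independent of the toolkit: an operation applied through a view modifies the underlying storage in place, while a fetched copy does not. A conceptual NumPy illustration (this is not the Global Arrays API, just the local-versus-remote access pattern in miniature):

    ```python
    import numpy as np

    # A stand-in for the globally distributed array.
    global_like = np.arange(1_000_000, dtype=np.float64)

    local = global_like[250_000:500_000]   # view: no copy, like local access
    local += 1.0                           # updates the backing array directly

    remote = global_like[750_000:].copy()  # copy, like fetching a remote block
    remote += 1.0                          # the backing array is unchanged
    ```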

  9. Weak-signal Phase Calibration Strategies for Large DSN Arrays

    NASA Technical Reports Server (NTRS)

    Jones, Dayton L.

    2005-01-01

    The NASA Deep Space Network (DSN) is studying arrays of large numbers of small, mass-produced radio antennas as a cost-effective way to increase downlink sensitivity and data rates for future missions. An important issue for the operation of large arrays is the accuracy with which signals from hundreds of small antennas can be combined. This is particularly true at Ka band (32 GHz) where atmospheric phase variations can be large and rapidly changing. A number of algorithms exist to correct the phases of signals from individual antennas in the case where a spacecraft signal provides a useful signal-to-noise ratio (SNR) on time scales shorter than the atmospheric coherence time. However, for very weak spacecraft signals it will be necessary to rely on background natural radio sources to maintain array phasing. Very weak signals could result from a spacecraft emergency or by design, such as direct-to-Earth data transmissions from distant planetary atmospheric or surface probes using only low gain antennas. This paper considers the parameter space where external real-time phase calibration will be necessary, and what this requires in terms of array configuration and signal processing. The inherent limitations of this technique are also discussed.
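
    A sketch of the kind of reference-based phase alignment involved (illustrative only; operational calibration must accumulate enough SNR on the background source within the atmospheric coherence time, and the reference-antenna scheme here is one of several possible algorithms):

    ```python
    import numpy as np

    def phase_align(signals, ref=0):
        """Phase-only calibration of array channels against one reference.

        signals: (n_ant, n_samp) complex baseband streams containing a common
        calibrator signal. Each channel is rotated by the conjugate of its
        cross-correlation phase with the reference channel, then summed.
        """
        aligned = []
        for s in signals:
            xcorr = np.vdot(signals[ref], s)            # sum conj(ref) * s
            aligned.append(s * np.exp(-1j * np.angle(xcorr)))
        return np.sum(aligned, axis=0)                   # coherent array beam
    ```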

  10. Simultaneous electrical recording of cardiac electrophysiology and contraction on chip

    DOE PAGES

    Qian, Fang; Huang, Chao; Lin, Yi-Dong; ...

    2017-04-18

    Prevailing commercialized cardiac platforms for in vitro drug development utilize planar microelectrode arrays to map action potentials, or impedance sensing to record contraction in real time, but cannot record both functions on the same chip with high spatial resolution. We report a novel cardiac platform that can record cardiac tissue adhesion, electrophysiology, and contractility on the same chip. The platform integrates two independent yet interpenetrating sensor arrays: a microelectrode array for field-potential readouts and an interdigitated electrode array for impedance readouts. Together, these arrays provide real-time, non-invasive data acquisition of both cardiac electrophysiology and contractility under physiological conditions and under drug stimuli. Furthermore, we cultured human induced pluripotent stem cell-derived cardiomyocytes as a model system and used them to validate the platform with an excitation–contraction decoupling chemical. Preliminary data using the platform to investigate the effect of the drug norepinephrine are combined with computational efforts. Finally, this platform provides a quantitative and predictive assay system that can potentially be used for comprehensive assessment of cardiac toxicity earlier in the drug discovery process.

  11. Real-time PCR array as a universal platform for the detection of genetically modified crops and its application in identifying unapproved genetically modified crops in Japan.

    PubMed

    Mano, Junichi; Shigemitsu, Natsuki; Futo, Satoshi; Akiyama, Hiroshi; Teshima, Reiko; Hino, Akihiro; Furui, Satoshi; Kitta, Kazumi

    2009-01-14

    We developed a novel type of real-time polymerase chain reaction (PCR) array with TaqMan chemistry as a platform for the comprehensive and semiquantitative detection of genetically modified (GM) crops. Thirty primer-probe sets for the specific detection of GM lines, recombinant DNA (r-DNA) segments, endogenous reference genes, and donor organisms were synthesized, and a 96-well PCR plate was prepared with a different primer-probe set in each well as the real-time PCR array. The specificity and sensitivity of the array were evaluated. A comparative analysis of these data against publicly available information on GM crops approved in Japan allowed us to assess the possibility of contamination by unapproved GM crops. Furthermore, we designed a Microsoft Excel spreadsheet application, Unapproved GMO Checker version 2.01, which processes all the data of real-time PCR arrays to make that assessment easy. The spreadsheet is available free of charge at http://cse.naro.affrc.go.jp/jmano/index.html .

  13. Photon and neutron interrogation techniques for chemical explosives detection in air cargo: A critical review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Runkle, Robert C.; White, Timothy A.; Miller, Erin A.

    Scanning cargo transported via aircraft ("air cargo") for explosive threats is a problem that, at present, lacks a comprehensive technical solution. While explosives detection in the baggage-scanning domain has a rich history that sheds light on potential solutions for air cargo, baggage scanning differs in several ways, so one cannot simply adopt the present array of technologies. Some contemporary solutions, like trace analysis, are not readily applied to cargo due to sampling challenges, while the larger geometry of air cargo makes others less effective. This review article examines an array of interrogation techniques using photons and neutrons as incident particles. We first present a summary of the signatures and observables explosives provide and review how they have been exploited in baggage scanning. Following this is a description of the challenges posed by the air cargo application space. After considering interrogation sources, we describe methods focused on transmission imaging, sub-surface examination, and elemental characterization. It is our goal to shed light on the technical promise of each method while largely deferring questions of footprint, safety, and conduct of operations. Our overarching intent is that a comprehensive understanding of potential techniques will foster development of a comprehensive solution.

  14. The ASTRI/CTA mini-array software system

    NASA Astrophysics Data System (ADS)

    Tosti, Gino; Schwarz, Joseph; Antonelli, Lucio Angelo; Trifoglio, Massimo; Catalano, Osvaldo; Maccarone, Maria Concetta; Leto, Giuseppe; Gianotti, Fulvio; Canestrari, Rodolfo; Giro, Enrico; Fiorini, Mauro; La Palombara, Nicola; Pareschi, Giovanni; Stringhetti, Luca; Vercellone, Stefano; Conforti, Vito; Tanci, Claudio; Bruno, Pietro; Grillo, Alessandro; Testa, Vincenzo; di Paola, Andrea; Gallozzi, Stefano

    2014-07-01

    ASTRI (Astrofisica con Specchi a Tecnologia Replicante Italiana) is a Flagship Project financed by the Italian Ministry of Education, University and Research, and led by INAF, the Italian National Institute of Astrophysics. The main goals of the ASTRI project are the realization of an end-to-end prototype of a Small Size Telescope (SST) for the Cherenkov Telescope Array (CTA) in a dual-mirror configuration (SST-2M) and, subsequently, of a mini-array comprising seven SST-2M telescopes. The mini-array will be placed at the final CTA Southern Site and will be part of the CTA seed array, around which the whole CTA observatory will be developed. The Mini-Array Software System (MASS) will provide a comprehensive set of tools to prepare an observing proposal, to perform the observations specified therein (monitoring and controlling all the hardware components of each telescope), to analyze the acquired data online, and to store/retrieve all the data products to/from the archive. Here we present the main features of the MASS and its first version, to be tested on the ASTRI SST-2M prototype that will be installed at the INAF observing station located at Serra La Nave on Mount Etna in Sicily.

  15. Design and Use of Microphone Directional Arrays for Aeroacoustic Measurements

    NASA Technical Reports Server (NTRS)

    Humphreys, William M., Jr.; Brooks, Thomas F.; Hunter, William W., Jr.; Meadows, Kristine R.

    1998-01-01

    An overview of the development of two microphone directional arrays for aeroacoustic testing is presented. These arrays were specifically developed to measure airframe noise in the NASA Langley Quiet Flow Facility. A large-aperture directional array using 35 flush-mounted microphones was constructed to obtain high-resolution noise localization maps around airframe models. This array possesses a maximum diagonal aperture of 34 inches. A unique logarithmic-spiral layout was chosen for the targeted frequency range of 2-30 kHz. Complementing the large array is a small-aperture directional array, constructed to obtain spectra and directivity information from regions on the model. This array, possessing 33 microphones with a maximum diagonal aperture of 7.76 inches, is easily moved about the model in elevation and azimuth. Custom microphone-shading algorithms have been developed to provide a frequency- and position-invariant sensing area from 10-40 kHz, with an overall targeted frequency range for the array of 5-60 kHz. Both arrays were employed in acoustic measurements of a 6%-scale airframe model consisting of a main-element NACA 632-215 wing section with a 30%-chord half-span flap. Representative data obtained from these measurements are presented, along with details of the array calibration and data post-processing procedures.
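
    For intuition on the layout choice: a logarithmic spiral places microphones at smoothly growing radii, avoiding the redundant spacings of regular grids and suppressing grating lobes over a wide band. The parameters below are purely illustrative, not the Langley design:

    ```python
    import numpy as np

    # Logarithmic spiral r = r0 * exp(b * theta); 35 sensors, as in the
    # large-aperture array, but with hypothetical geometry constants.
    n_mics, r0, b = 35, 0.02, 0.25
    theta = np.linspace(0.0, 4.0 * np.pi, n_mics)
    r = r0 * np.exp(b * theta)
    positions = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
    print(positions.shape)   # (35, 2) microphone coordinates in meters
    ```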

  16. Ticks and Tick-Borne Pathogens of the Caribbean: Current Understanding and Future Directions for More Comprehensive Surveillance.

    PubMed

    Gondard, Mathilde; Cabezas-Cruz, Alejandro; Charles, Roxanne A; Vayssier-Taussat, Muriel; Albina, Emmanuel; Moutailler, Sara

    2017-01-01

    Ticks are obligate hematophagous arthropods of significant importance to human and veterinary medicine. They transmit a vast array of pathogens, including bacteria, viruses, protozoa, and helminths. Most epidemiological data on ticks and tick-borne pathogens (TBPs) in the West Indies are limited to common livestock pathogens such as Ehrlichia ruminantium, Babesia spp. (i.e., B. bovis and B. bigemina), and Anaplasma marginale, and less information is available on companion animal pathogens. Of note, human tick-borne diseases (TBDs) remain almost completely uncharacterized in the West Indies. Information on TBP presence in wildlife is also missing. Herein, we provide a comprehensive review of the ticks and TBPs affecting human and animal health in the Caribbean, and introduce the challenges associated with understanding TBD epidemiology and implementing successful TBD management in this region. In particular, we stress the need for innovative and versatile surveillance tools using high-throughput pathogen detection (e.g., high-throughput real-time microfluidic PCR). The use of such tools in large epidemiological surveys will likely improve TBD prevention and control programs in the Caribbean.

  17. Simultaneous fingerprint, quantitative analysis and anti-oxidative based screening of components in Rhizoma Smilacis Glabrae using liquid chromatography coupled with Charged Aerosol and Coulometric array Detection.

    PubMed

    Yang, Guang; Zhao, Xin; Wen, Jun; Zhou, Tingting; Fan, Guorong

    2017-04-01

    An analytical approach including fingerprinting, quantitative analysis, and rapid screening of anti-oxidative components was established and successfully applied to the comprehensive quality control of Rhizoma Smilacis Glabrae (RSG), a well-known Traditional Chinese Medicine with the homology of medicine and food. Thirteen components were tentatively identified based on their retention behavior, UV absorption, and MS fragmentation patterns. Chemometric analysis based on coulometric array data was performed to evaluate the similarity and variation between fifteen batches. Eight discriminating components were quantified using single-compound calibration. The unit responses of those components in coulometric array detection were calculated and compared with those of several compounds reported to possess antioxidant activity, and four of them were tentatively identified as main contributors to the total anti-oxidative activity. The main advantage of the proposed approach is that it realizes simultaneous fingerprinting, quantitative analysis, and screening of anti-oxidative components, providing comprehensive information for quality assessment of RSG. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Novel SSR Markers from BAC-End Sequences, DArT Arrays and a Comprehensive Genetic Map with 1,291 Marker Loci for Chickpea (Cicer arietinum L.)

    PubMed Central

    Nayak, Spurthi N.; Varghese, Nicy; Shah, Trushar M.; Penmetsa, R. Varma; Thirunavukkarasu, Nepolean; Gudipati, Srivani; Gaur, Pooran M.; Kulwal, Pawan L.; Upadhyaya, Hari D.; KaviKishor, Polavarapu B.; Winter, Peter; Kahl, Günter; Town, Christopher D.; Kilian, Andrzej; Cook, Douglas R.; Varshney, Rajeev K.

    2011-01-01

    Chickpea (Cicer arietinum L.) is the third most important cool season food legume, cultivated in arid and semi-arid regions of the world. The goal of this study was to develop novel molecular markers such as microsatellite or simple sequence repeat (SSR) markers from bacterial artificial chromosome (BAC)-end sequences (BESs) and diversity arrays technology (DArT) markers, and to construct a high-density genetic map based on the recombinant inbred line (RIL) population ICC 4958 (C. arietinum)×PI 489777 (C. reticulatum). A BAC library comprising 55,680 clones was constructed and 46,270 BESs were generated. Mining of these BESs provided 6,845 SSRs, and primer pairs were designed for 1,344 SSRs. In parallel, DArT arrays with ca. 15,000 clones were developed, and 5,397 clones were found polymorphic among 94 genotypes tested. Screening of the newly developed BES-SSR markers and DArT arrays on the parental genotypes of the RIL mapping population showed polymorphism with 253 BES-SSR markers and 675 DArT markers. Segregation data obtained for these polymorphic markers, together with data for 494 markers compiled from published reports or collaborators, were used for constructing the genetic map. As a result, a comprehensive genetic map comprising 1,291 markers on eight linkage groups (LGs) spanning a total distance of 845.56 cM was developed (http://cmap.icrisat.ac.in/cmap/sm/cp/thudi/). The number of markers per linkage group ranged from 68 (LG 8) to 218 (LG 3) with an average inter-marker distance of 0.65 cM. While the developed resource of molecular markers will be useful for genetic diversity, genetic mapping and molecular breeding applications, the comprehensive genetic map with integrated BES-SSR markers will facilitate its anchoring to the physical map (under construction) to accelerate map-based cloning of genes in chickpea and comparative genome evolution studies in legumes. PMID:22102885

  19. Data Container Study for Handling array-based data using Hive, Spark, MongoDB, SciDB and Rasdaman

    NASA Astrophysics Data System (ADS)

    Xu, M.; Hu, F.; Yang, J.; Yu, M.; Yang, C. P.

    2017-12-01

    Geoscience communities have come up with various big data storage solutions, such as Rasdaman and Hive, to address the grand challenges of massive Earth observation data management and processing. To examine the readiness of current solutions in supporting big Earth observation data, we propose to investigate and compare five popular data container solutions: Rasdaman, Hive, Spark, SciDB and MongoDB. Using different types of spatial and non-spatial queries, datasets stored in common scientific data formats (e.g., NetCDF and HDF), and two applications (i.e., dust storm simulation data mining and MERRA data analytics), we systematically compare and evaluate the features and performance of these five data containers in terms of data discovery and access. The computing resources (e.g., CPU, memory, hard drive, network) consumed while performing various queries and operations are monitored and recorded for the performance evaluation. The initial results show that 1) the popular data container clusters are able to handle large volumes of data, but their performance varies in different situations; meanwhile, there is a trade-off among data preprocessing, disk savings, query-time savings, and resource consumption; 2) ClimateSpark, MongoDB and SciDB perform the best among all the containers in all the query tests, and Hive performs the worst; 3) the studied data containers can be applied to other array-based datasets, such as high-resolution remote sensing data and model simulation data; and 4) the Rasdaman clustering configuration is more complex than the others. A comprehensive report will detail the experimental results and compare their pros and cons regarding system performance, ease of use, accessibility, scalability, compatibility, and flexibility.
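    The kinds of queries benchmarked above can be sketched outside any of the five containers with plain Python tooling; the snippet below runs one spatial subset query and one aggregation query against a NetCDF file using xarray. The file name ("merra_sample.nc") and variable name ("T2M") are hypothetical and assume an ascending lat/lon grid.

```python
# Illustrative spatial and non-spatial array queries over a NetCDF file.
import xarray as xr

ds = xr.open_dataset("merra_sample.nc")   # hypothetical file; opened lazily

# Spatial query: subset a lat/lon bounding box for one time step.
box = ds["T2M"].sel(lat=slice(20, 50), lon=slice(-130, -60)).isel(time=0)

# Non-spatial (aggregation) query: area-mean time series over the same box.
series = ds["T2M"].sel(lat=slice(20, 50), lon=slice(-130, -60)).mean(dim=["lat", "lon"])

print(box.shape, float(series.isel(time=0)))
```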

  20. Interactions between large space power systems and low-Earth-orbit plasmas

    NASA Technical Reports Server (NTRS)

    Stevens, N. J.

    1985-01-01

    There is a growing tendency to plan space missions that will incorporate very large space power systems. These space power systems must function in the space plasma environment, which can impose operational limitations. As the power output increases, the operating voltage also must increase and this voltage, exposed at solar array interconnects, interacts with the local plasma. The implications of such interactions are considered. The available laboratory data for biased array segment tests are reviewed to demonstrate the basic interactions considered. A data set for a floating high voltage array test was used to generate approximate relationships for positive and negative current collection from plasmas. These relationships were applied to a hypothetical 100 kW power system operating in a 400 km, near equatorial orbit. It was found that discharges from the negative regions of the array are the most probable limiting factor in array operation.

  1. Mapping of transcription factor binding regions in mammalian cells by ChIP: Comparison of array- and sequencing-based technologies

    PubMed Central

    Euskirchen, Ghia M.; Rozowsky, Joel S.; Wei, Chia-Lin; Lee, Wah Heng; Zhang, Zhengdong D.; Hartman, Stephen; Emanuelsson, Olof; Stolc, Viktor; Weissman, Sherman; Gerstein, Mark B.; Ruan, Yijun; Snyder, Michael

    2007-01-01

    Recent progress in mapping transcription factor (TF) binding regions can largely be credited to chromatin immunoprecipitation (ChIP) technologies. We compared strategies for mapping TF binding regions in mammalian cells using two different ChIP schemes: ChIP with DNA microarray analysis (ChIP-chip) and ChIP with DNA sequencing (ChIP-PET). We first investigated parameters central to obtaining robust ChIP-chip data sets by analyzing STAT1 targets in the ENCODE regions of the human genome, and then compared ChIP-chip to ChIP-PET. We devised methods for scoring and comparing results among various tiling arrays and examined parameters such as DNA microarray format, oligonucleotide length, hybridization conditions, and the use of competitor Cot-1 DNA. The best performance was achieved with high-density oligonucleotide arrays, oligonucleotides ≥50 bases (b), the presence of competitor Cot-1 DNA and hybridizations conducted in microfluidics stations. When target identification was evaluated as a function of array number, 80%–86% of targets were identified with three or more arrays. Comparison of ChIP-chip with ChIP-PET revealed strong agreement for the highest ranked targets with less overlap for the low ranked targets. With advantages and disadvantages unique to each approach, we found that ChIP-chip and ChIP-PET are frequently complementary in their relative abilities to detect STAT1 targets for the lower ranked targets; each method detected validated targets that were missed by the other method. The most comprehensive list of STAT1 binding regions is obtained by merging results from ChIP-chip and ChIP-sequencing. Overall, this study provides information for robust identification, scoring, and validation of TF targets using ChIP-based technologies. PMID:17568005

  2. Low dark current InGaAs detector arrays for night vision and astronomy

    NASA Astrophysics Data System (ADS)

    MacDougal, Michael; Geske, Jon; Wang, Chad; Liao, Shirong; Getty, Jonathan; Holmes, Alan

    2009-05-01

    Aerius Photonics has developed large InGaAs arrays (1K x 1K and greater) with low dark currents for use in night vision applications in the SWIR regime. Aerius will present results of experiments to reduce the dark current density of their InGaAs detector arrays. By varying device designs and passivations, Aerius has achieved a dark current density below 1.0 nA/cm² at 280 K on small-pixel detector arrays. Data are shown for both test structures and focal plane arrays. In addition, data from cryogenically cooled InGaAs arrays will be shown for astronomy applications.

  3. Jet-Surface Interaction Test: Flow Measurements Results

    NASA Technical Reports Server (NTRS)

    Brown, Cliff; Wernet, Mark

    2014-01-01

    Modern aircraft design often puts the engine exhaust in close proximity to the airframe surfaces. Aircraft noise prediction tools must continue to develop in order to meet the challenges these aircraft present. The Jet-Surface Interaction Tests have been conducted to provide a comprehensive, high-quality set of experimental data suitable for the development and validation of these exhaust noise prediction methods. Flow measurements were acquired using streamwise and cross-stream particle image velocimetry (PIV), and fluctuating surface pressure data were acquired using flush-mounted pressure transducers near the surface trailing edge. These data, combined with previously reported far-field and phased array noise measurements, represent the first step toward the experimental database. These flow data are particularly applicable to the development of noise prediction methods which rely on computational fluid dynamics to uncover the flow physics. A representative sample of the large flow data set acquired is presented here to show how a surface near a jet affects the turbulent kinetic energy in the plume, the spatial relationship between the jet plume and surface needed to generate surface trailing-edge noise, and differences between heated and unheated jet flows with respect to surfaces.

  4. Large Scale Density Estimation of Blue and Fin Whales: Utilizing Sparse Array Data to Develop and Implement a New Method for Estimating Blue and Fin Whale Density

    DTIC Science & Technology

    2015-09-30

    Len Thomas & Danielle Harris, Centre... [fragmentary DTIC excerpt] ...to develop and implement a new method for estimating blue and fin whale density that is effective over large spatial scales and is designed to cope

  5. Cryogenic and radiation-hard asic for interfacing large format NIR/SWIR detector arrays

    NASA Astrophysics Data System (ADS)

    Gao, Peng; Dupont, Benoit; Dierickx, Bart; Müller, Eric; Verbruggen, Geert; Gielis, Stijn; Valvekens, Ramses

    2017-11-01

    For scientific and earth observation space missions, weight and power consumption are usually critical factors. In order to obtain better vehicle integration, efficiency, and controllability for large format NIR/SWIR detector arrays, a prototype ASIC was designed. It performs multiple detector array interfacing, power regulation, and data acquisition operations inside the cryogenic chamber. Both operation commands and imaging data are communicated via the SpaceWire interface, which significantly reduces the number of wires going in and out of the cryogenic chamber. The ASIC prototype is realized in 0.18 μm CMOS technology and is designed for radiation hardness.

  6. Progressing Deployment of Solar Photovoltaic Installations in the United States

    NASA Astrophysics Data System (ADS)

    Kwan, Calvin Lee

    2011-07-01

    This dissertation evaluates the likelihood of solar PV playing a larger role in national and state level renewable energy portfolios. I examine the feasibility of large-scale solar PV arrays on college campuses, the financials associated with large-scale solar PV arrays and, finally, the influence of environmental, economic, social and political variables on the distribution of residential solar PV arrays in the United States. Chapter two investigates the challenges and feasibility of college campuses adopting a net-zero energy policy. Using energy consumption data, local solar insolation data and projected campus growth, I present a method to identify the minimum-sized solar PV array required for the City College campus of the Los Angeles Community College District to achieve net-zero energy status. I document how current energy demand can be reduced using strategic demand-side management, with the remaining energy demand being met using a solar PV array. Chapter three focuses on the financial feasibility of large-scale solar PV arrays, using the proposed City College campus array as an example. I document that even after demand-side energy management initiatives and financial incentives, large-scale solar PV arrays continue to have payback periods greater than 25 years. I find that traditional financial evaluation methods are not suitable for environmental projects such as solar PV installations, as externalities are not taken into account, and I therefore call for the development of alternative financial valuation methods. Chapter four investigates the influence of environmental, social, economic and political variables on the distribution of residential solar PV arrays across the United States using ZIP-code-level data from the 2000 US Census. Using data from the National Renewable Energy Laboratory's Open PV project, I document where residential solar PV arrays are currently located. A zero-inflated negative binomial model was run to evaluate the influence of selected variables. Using the same model, predicted residential solar PV shares were generated and illustrated using GIS software. The results of this model indicate that solar insolation, state energy deregulation and cost of electricity are statistically significant factors positively correlated with the adoption of residential solar PV arrays. With this information, policymakers at the town and city level can establish effective solar PV promoting policies and regulations for their respective locations.

  7. 2013 certified IMS infrasound stations: IS37 (Bardufoss, Norway) and IS58 (Midway, USA)

    NASA Astrophysics Data System (ADS)

    Haralabus, Georgios; Marty, Julien; Kramer, Alfred; Mialle, Pierrick; Robertson, James

    2014-05-01

    The infrasound component of the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) includes 60 infrasound stations, of which 47 are currently certified. The latest two additions to this infrasound network, namely IS58 on Sand Island, Midway Atoll, United States of America (USA), and IS37 in Bardufoss, Norway, are presented here. Both stations were certified in 2013. IS58 is a four-element infrasound array arranged in a triangular geometry with a central component; the triangle bases vary from 1.1 to 1.8 km. The micropressure sensors deployed at each element were Chaparral 50A microbarometers. Signals from IS58 were processed by the International Data Centre (IDC), and detections were built that are associated not only with microbaroms but also with the activity of the Kliuchevskoi volcano on the Kamchatka Peninsula in Russia. These initial results indicate good detection capability of the IS58 station under low wind conditions. In Norway, the topography allowed for an array with more elements, so IS37 was built with 10 elements at an average spacing of 1 km. This design allows the formation of several triangles with baselines of 1 to 2 km, as well as a triangular sub-array with a spacing of approximately 360 m. The sensors utilized at the IS37 elements were MB2005 microbarometers. Initial data analysis by the IDC identified distant microbarom sources with strong azimuth and frequency content variability, as well as strong detections from local sources, namely the Finnfjord ferro-alloy plant in Norway and the Kiruna iron mine in Sweden.

  8. On-Line Data Reconstruction in Redundant Disk Arrays.

    DTIC Science & Technology

    1994-05-01

    [Fragmentary DTIC excerpt: motivating applications include file servers that support a large number of clients with differing work schedules, and automated teller networks in banking systems. Table 2.2, "Default array parameters", lists: head scheduling: FIFO; user data layout: sequential in the address space of the array; disk spindles: synchronized. Section 2.3.3, "Default workload", notes that the dissertation reports on many performance evaluations.]

  9. Redundant disk arrays: Reliable, parallel secondary storage. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Gibson, Garth Alan

    1990-01-01

    During the past decade, advances in processor and memory technology have given rise to increases in computational performance that far outstrip increases in the performance of secondary storage technology. Coupled with emerging small-disk technology, disk arrays provide the cost, volume, and capacity of current disk subsystems but, by leveraging parallelism, many times their performance. Unfortunately, arrays of small disks may have much higher failure rates than the single large disks they replace. Redundant arrays of inexpensive disks (RAID) use simple redundancy schemes to provide high data reliability. The data encoding, performance, and reliability of redundant disk arrays are investigated. Organizing redundant data into a disk array is treated as a coding problem. Among the alternatives examined, codes as simple as parity are shown to effectively correct single, self-identifying disk failures.
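    A minimal sketch of the parity idea referenced above, assuming byte-oriented blocks and a self-identifying failure (i.e., the array reports which disk died); the stripe sizes and data are invented:

```python
# Single-failure parity: the parity block is the XOR of all data blocks,
# so any one lost, self-identified block is the XOR of the survivors.
import numpy as np

def make_parity(stripes: np.ndarray) -> np.ndarray:
    """stripes: (n_disks, block_size) uint8 array -> XOR parity block."""
    parity = np.zeros(stripes.shape[1], dtype=np.uint8)
    for blk in stripes:
        parity ^= blk
    return parity

rng = np.random.default_rng(0)
data = rng.integers(0, 256, size=(4, 512), dtype=np.uint8)  # 4 data disks
parity = make_parity(data)

lost = 2                                            # disk 2 fails
survivors = np.vstack([np.delete(data, lost, axis=0), parity])
rebuilt = make_parity(survivors)                    # XOR of survivors + parity
assert np.array_equal(rebuilt, data[lost])          # lost block recovered
```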

  10. Integrated Geophysical Monitoring Program to Study Flood Performance and Incidental CO2 Storage Associated with a CO2 EOR Project in the Bell Creek Oil Field

    NASA Astrophysics Data System (ADS)

    Burnison, S. A.; Ditty, P.; Gorecki, C. D.; Hamling, J. A.; Steadman, E. N.; Harju, J. A.

    2013-12-01

    The Plains CO2 Reduction (PCOR) Partnership, led by the Energy & Environmental Research Center, is working with Denbury Onshore LLC to determine the effect of a large-scale injection of carbon dioxide (CO2) into a deep clastic reservoir for the purpose of simultaneous CO2 enhanced oil recovery (EOR) and to study incidental CO2 storage at the Bell Creek oil field located in southeastern Montana. This project will reduce CO2 emissions by more than 1 million tons a year while simultaneously recovering an anticipated 30 million barrels of incremental oil. The Bell Creek project provides a unique opportunity to use and evaluate a comprehensive suite of technologies for monitoring, verification, and accounting (MVA) of CO2 on a large scale. The plan incorporates multiple geophysical technologies in the presence of complementary and sometimes overlapping data to create a comprehensive data set that will facilitate evaluation and comparison. The MVA plan has been divided into shallow and deep subsurface monitoring. The deep subsurface monitoring plan includes 4-D surface seismic, time-lapse 3-D vertical seismic profile (VSP) surveys incorporating a permanent borehole array, and baseline and subsequent carbon-oxygen logging and other well-based measurements. The goal is to track the movement of CO2 in the reservoir, evaluate the recovery/storage efficiency of the CO2 EOR program, identify fluid migration pathways, and determine the ultimate fate of injected CO2. CO2 injection at Bell Creek began in late May 2013. Prior to injection, a monitoring and characterization well near the field center was drilled and outfitted with a distributed temperature-monitoring system and three down-hole pressure gauges to provide continuous real-time data on the reservoir and overlying strata. The monitoring well allows on-demand access for time-lapse well-based measurements and borehole seismic instrumentation. A 50-level permanent borehole array of 3-component geophones was installed in a second monitoring well. A pre-injection series of carbon-oxygen logging across the reservoir was acquired in 35 wells. The baseline 3-D surface seismic survey was acquired in September 2012. A 3-D VSP incorporating two wells and 2 square miles of overlapping seismic coverage in the middle of the field was acquired in May 2013. Initial iterations of geologic modeling and reservoir simulation of the field have been completed. Currently, passive seismic monitoring with the permanent borehole array is being conducted during injection. Interpretation results from the baseline surface 3-D survey and preliminary results from the baseline 3-D VSP are being evaluated and integrated into the reservoir model. The PCOR Partnership's philosophy is to combine site characterization, modeling, and monitoring strategies into an iterative process to produce descriptive integrated results. The comprehensive effort at Bell Creek will allow a comparison of the effectiveness of several complementary geophysical and well-based methods in meeting the goals of the deep subsurface monitoring effort.

  11. Solar array flight dynamic experiment

    NASA Technical Reports Server (NTRS)

    Schock, R. W.

    1986-01-01

    The purpose of the Solar Array Flight Dynamic Experiment (SAFDE) is to demonstrate the feasibility of on-orbit measurement and ground processing of large space structures' dynamic characteristics. Test definition or verification provides the dynamic characteristic accuracy required for control systems use. An illumination/measurement system was developed to fly on space shuttle flight STS-41D. The system was designed to dynamically evaluate a large solar array called the Solar Array Flight Experiment (SAFE) that had been scheduled for this flight. The SAFDE system consisted of a set of laser diode illuminators, retroreflective targets, an intelligent star tracker receiver and the associated equipment to power, condition, and record the results. In six tests on STS-41D, data was successfully acquired from 18 retroreflector targets and ground processed, post flight, to define the solar array's dynamic characteristics. The flight experiment proved the viability of on-orbit test definition of large space structures' dynamic characteristics. Future large space structures controllability should be greatly enhanced by this capability.

  12. Solar array flight dynamic experiment

    NASA Technical Reports Server (NTRS)

    Schock, Richard W.

    1986-01-01

    The purpose of the Solar Array Flight Dynamic Experiment (SAFDE) is to demonstrate the feasibility of on-orbit measurement and ground processing of large space structures' dynamic characteristics. Test definition or verification provides the dynamic characteristic accuracy required for control systems use. An illumination/measurement system was developed to fly on Space Shuttle flight STS-41D. The system was designed to dynamically evaluate a large solar array called the Solar Array Flight Experiment (SAFE) that had been scheduled for this flight. The SAFDE system consisted of a set of laser diode illuminators, retroreflective targets, an intelligent star tracker receiver and the associated equipment to power, condition, and record the results. In six tests on STS-41D, data was successfully acquired from 18 retroreflector targets and ground processed, post flight, to define the solar array's dynamic characteristics. The flight experiment proved the viability of on-orbit test definition of large space structures' dynamic characteristics. Future large space structures controllability should be greatly enhanced by this capability.

  13. Solar array flight dynamic experiment

    NASA Technical Reports Server (NTRS)

    Schock, Richard W.

    1987-01-01

    The purpose of the Solar Array Flight Dynamic Experiment (SAFDE) is to demonstrate the feasibility of on-orbit measurement and ground processing of large space structures' dynamic characteristics. Test definition or verification provides the dynamic characteristic accuracy required for control systems use. An illumination/measurement system was developed to fly on space shuttle flight STS-41D. The system was designed to dynamically evaluate a large solar array called the Solar Array Flight Experiment (SAFE) that had been scheduled for this flight. The SAFDE system consisted of a set of laser diode illuminators, retroreflective targets, an intelligent star tracker receiver and the associated equipment to power, condition, and record the results. In six tests on STS-41D, data was successfully acquired from 18 retroreflector targets and ground processed, post flight, to define the solar array's dynamic characteristic. The flight experiment proved the viability of on-orbit test definition of large space structures dynamic characteristics. Future large space structures controllability should be greatly enhanced by this capability.

  14. Centralized operations and maintenance planning at the Atacama Large Millimeter/submillimeter Array (ALMA)

    NASA Astrophysics Data System (ADS)

    Lopez, Bernhard; Whyborn, Nicholas D.; Guniat, Serge; Hernandez, Octavio; Gairing, Stefan

    2016-07-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) is a joint project between astronomical organizations in Europe, North America, and East Asia, in collaboration with the Republic of Chile. ALMA consists of 54 twelve-meter antennas and 12 seven-meter antennas operating as an aperture synthesis array in the (sub)millimeter wavelength range. Since the inauguration of the observatory in March 2013, there has been a continuous effort to establish solid operations processes for the effective and efficient management of technical and administrative tasks on site. A key aspect has been centralized maintenance and operations planning: input is collected from science stakeholders, the computerized maintenance management system (CMMS), and the technical teams spread around the world; this information is then analyzed and consolidated based on the established maintenance strategy, the observatory long-term plan, and the short-term priority definitions. This paper presents the high-level process that has been developed for the planning and scheduling of planned and unplanned maintenance tasks, and for site operations like the telescope array reconfiguration campaigns. We focus on the centralized planning approach by presenting its genesis and its current implementation for observatory operations, including related planning products, and we explore the necessary next steps in order to fully achieve a comprehensive centralized planning approach for ALMA in steady-state operations.

  15. Investigation of interpolation techniques for the reconstruction of the first dimension of comprehensive two-dimensional liquid chromatography-diode array detector data.

    PubMed

    Allen, Robert C; Rutan, Sarah C

    2011-10-31

    Simulated and experimental data were used to measure the effectiveness of common interpolation techniques during chromatographic alignment of comprehensive two-dimensional liquid chromatography-diode array detector (LC×LC-DAD) data. Interpolation was used to generate a sufficient number of data points in the sampled first chromatographic dimension to allow for alignment of retention times from different injections. Five different interpolation methods, linear interpolation followed by cross correlation, piecewise cubic Hermite interpolating polynomial, cubic spline, Fourier zero-filling, and Gaussian fitting, were investigated. The fully aligned chromatograms, in both the first and second chromatographic dimensions, were analyzed by parallel factor analysis to determine the relative area for each peak in each injection. A calibration curve was generated for the simulated data set. The standard error of prediction and percent relative standard deviation were calculated for the simulated peak for each technique. The Gaussian fitting interpolation technique resulted in the lowest standard error of prediction and average relative standard deviation for the simulated data. However, upon applying the interpolation techniques to the experimental data, most of the interpolation methods were not found to produce statistically different relative peak areas from each other. While most of the techniques were not statistically different, the performance was improved relative to the PARAFAC results obtained when analyzing the unaligned data. Copyright © 2011 Elsevier B.V. All rights reserved.
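    For readers unfamiliar with the candidate methods, the sketch below applies three of them (cubic spline, piecewise cubic Hermite, and Gaussian fitting) to an undersampled synthetic peak using SciPy; the peak parameters and sampling interval are invented and do not come from the paper, and the cross-correlation and Fourier zero-filling variants are omitted.

```python
# Interpolating a coarsely sampled first-dimension chromatographic peak.
import numpy as np
from scipy.interpolate import CubicSpline, PchipInterpolator
from scipy.optimize import curve_fit

gauss = lambda t, a, mu, sigma: a * np.exp(-0.5 * ((t - mu) / sigma) ** 2)

t_sampled = np.arange(0.0, 10.1, 1.0)          # coarse 1st-dimension sampling
y_sampled = gauss(t_sampled, 1.0, 4.7, 1.2)    # "measured" peak profile
t_fine = np.linspace(0.0, 10.0, 201)           # dense grid used for alignment

y_spline = CubicSpline(t_sampled, y_sampled)(t_fine)
y_pchip = PchipInterpolator(t_sampled, y_sampled)(t_fine)
popt, _ = curve_fit(gauss, t_sampled, y_sampled, p0=[1.0, 5.0, 1.0])
y_gauss = gauss(t_fine, *popt)                 # Gaussian-fit "interpolation"

for name, y in [("spline", y_spline), ("pchip", y_pchip), ("gauss", y_gauss)]:
    print(name, "apex at t =", round(t_fine[np.argmax(y)], 2))
```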

  16. Large-area high-power VCSEL pump arrays optimized for high-energy lasers

    NASA Astrophysics Data System (ADS)

    Wang, Chad; Geske, Jonathan; Garrett, Henry; Cardellino, Terri; Talantov, Fedor; Berdin, Glen; Millenheft, David; Renner, Daniel; Klemer, Daniel

    2012-06-01

    Practical, large-area, high-power diode pumps for one micron (Nd, Yb) as well as eye-safer wavelengths (Er, Tm, Ho) are critical to the success of any high energy diode pumped solid state laser. Diode efficiency, brightness, availability and cost will determine how realizable a fielded high energy diode pumped solid state laser will be. 2-D Vertical-Cavity Surface-Emitting Laser (VCSEL) arrays are uniquely positioned to meet these requirements because of their unique properties, such as low divergence circular output beams, reduced wavelength drift with temperature, scalability to large 2-D arrays through low-cost and high-volume semiconductor photolithographic processes, high reliability, no catastrophic optical damage failure, and radiation and vacuum operation tolerance. Data will be presented on the status of FLIR-EOC's VCSEL pump arrays. Analysis of the key aspects of electrical, thermal and mechanical design that are critical to the design of a VCSEL pump array to achieve high power efficient array performance will be presented.

  17. Terabyte IDE RAID-5 Disk Arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David A. Sanders et al.

    2003-09-30

    High energy physics experiments are currently recording large amounts of data and in a few years will be recording prodigious quantities of data. New methods must be developed to handle this data and make analysis at universities possible. We examine some techniques that exploit recent developments in commodity hardware. We report on tests of redundant arrays of integrated drive electronics (IDE) disk drives for use in offline high energy physics data analysis. IDE redundant array of inexpensive disks (RAID) prices now are less than the cost per terabyte of million-dollar tape robots! The arrays can be scaled to sizes affordable to institutions without robots and used when fast random access at low cost is important.

  18. Developmental transcriptional profiling reveals key insights into Triticeae reproductive development.

    PubMed

    Tran, Frances; Penniket, Carolyn; Patel, Rohan V; Provart, Nicholas J; Laroche, André; Rowland, Owen; Robert, Laurian S

    2013-06-01

    Despite their importance, there remains a paucity of large-scale gene-expression-based studies of reproductive development in species belonging to the Triticeae. As a first step to address this deficiency, a gene expression atlas of triticale reproductive development was generated using the 55K Affymetrix GeneChip® wheat genome array. The global transcriptional profiles of the anther/pollen, ovary and stigma were analyzed at concurrent developmental stages, and co-expressed as well as preferentially expressed genes were identified. Data analysis revealed both novel and conserved regulatory factors underlying Triticeae floral development and function. This comprehensive resource rests upon detailed gene annotations, and the expression profiles are readily accessible via a web browser. © 2013 Her Majesty the Queen in Right of Canada as represented by the Minister of Agriculture and Agri-Food Canada.

  19. Discharge transient coupling in large space power systems

    NASA Technical Reports Server (NTRS)

    Stevens, N. John; Stillwell, R. P.

    1990-01-01

    Experiments have shown that plasma environments can induce discharges in solar arrays. These plasmas simulate the environments found in low earth orbits where current plans call for operation of very large power systems. The discharges could be large enough to couple into the power system and possibly disrupt operations. Here, the general concepts of the discharge mechanism and the techniques of coupling are discussed. Data from both ground and flight experiments are reviewed to obtain an expected basis for the interactions. These concepts were applied to the Space Station solar array and distribution system as an example of the large space power system. The effect of discharges was found to be a function of the discharge site. For most sites in the array discharges would not seriously impact performance. One location at the negative end of the array was identified as a position where discharges could couple to charge stored in system capacitors. This latter case could impact performance.

  20. SEURAT: visual analytics for the integrated analysis of microarray data.

    PubMed

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data.

  1. Scalable Earth-observation Analytics for Geoscientists: Spacetime Extensions to the Array Database SciDB

    NASA Astrophysics Data System (ADS)

    Appel, Marius; Lahn, Florian; Pebesma, Edzer; Buytaert, Wouter; Moulds, Simon

    2016-04-01

    Today's amount of freely available data requires scientists to spend large parts of their work on data management. This is especially true in environmental sciences when working with large remote sensing datasets, such as those obtained from earth-observation satellites like the Sentinel fleet. Many frameworks like SpatialHadoop or Apache Spark address the scalability but target programmers rather than data analysts, and are not dedicated to imagery or array data. In this work, we use the open-source data management and analytics system SciDB to bring large earth-observation datasets closer to analysts. Its underlying data representation as multidimensional arrays fits naturally to earth-observation datasets, distributes storage and computational load over multiple instances by multidimensional chunking, and also enables efficient time-series based analyses, which is usually difficult with file- or tile-based approaches. Existing interfaces to R and Python furthermore allow for scalable analytics with relatively little learning effort. However, interfacing SciDB and file-based earth-observation datasets that come as tiled temporal snapshots requires a lot of manual bookkeeping during ingestion, and SciDB natively only supports loading data from CSV-like and custom binary formatted files, which currently limits its practical use in earth-observation analytics. To make it easier to work with large multi-temporal datasets in SciDB, we developed software tools that enrich SciDB with earth observation metadata and allow working with commonly used file formats: (i) the SciDB extension library scidb4geo simplifies working with spatiotemporal arrays by adding relevant metadata to the database, and (ii) the Geospatial Data Abstraction Library (GDAL) driver implementation scidb4gdal allows ingesting and exporting remote sensing imagery from and to a large number of file formats. Using added metadata on temporal resolution and coverage, the GDAL driver supports time-based ingestion of imagery into existing multi-temporal SciDB arrays. While our SciDB plugin works directly in the database, the GDAL driver has been specifically developed using a minimum amount of external dependencies (i.e., CURL). Source code for both tools is available from GitHub [1]. We present these tools in a case study that demonstrates the ingestion of multi-temporal tiled earth-observation data into SciDB, followed by a time-series analysis using R and SciDBR. Through the exclusive use of open-source software, our approach supports reproducibility in scalable large-scale earth-observation analytics. In the future, these tools can be used in an automated way to let scientists work only on ready-to-use SciDB arrays, significantly reducing the data management workload for domain scientists. [1] https://github.com/mappl/scidb4geo and https://github.com/mappl/scidb4gdal
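    As a rough illustration of the file-side half of this workflow, the snippet below reads one hypothetical tile into a NumPy array with the GDAL Python bindings; the authors' scidb4gdal driver plugs into this same GDAL machinery to move such rasters into SciDB arrays, but the driver-specific connection parameters are not shown here.

```python
# Reading a tiled earth-observation scene as an array via GDAL (sketch).
from osgeo import gdal

src = gdal.Open("scene_tile_2015-06-01.tif")   # hypothetical input tile
if src is None:
    raise FileNotFoundError("tile not found")  # gdal.Open returns None on failure

band = src.GetRasterBand(1)
arr = band.ReadAsArray()                       # 2-D NumPy array of pixel values
print(arr.shape, arr.dtype, src.GetGeoTransform())
```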

  2. Preliminary Results from the Flight of the Solar Array Module Plasma Interactions Experiment (SAMPIE)

    NASA Technical Reports Server (NTRS)

    Ferguson, Dale C.; Hillard, G. Barry

    1994-01-01

    SAMPIE, the Solar Array Module Plasma Interactions Experiment, flew in the Space Shuttle Columbia payload bay as part of the OAST-2 mission on STS-62 in March 1994. SAMPIE biased samples of solar arrays and space power materials to varying potentials with respect to the surrounding space plasma, and recorded the plasma currents collected and the arcs that occurred, along with a set of plasma diagnostics data. A large set of high-quality data was obtained on the behavior of solar arrays and space power materials in the space environment. This paper is the first report on the data SAMPIE telemetered to the ground during the mission. It will be seen that the flight data promise to help determine arcing thresholds, snapover potentials and floating potentials for arrays and spacecraft in LEO.

  3. Development of a Wind Plant Large-Eddy Simulation with Measurement-Driven Atmospheric Inflow: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quon, Eliot; Churchfield, Matthew; Cheung, Lawrence

    This paper details the development of an aeroelastic wind plant model with large-eddy simulation (LES). The chosen LES solver is the Simulator for Wind Farm Applications (SOWFA) based on the OpenFOAM framework, coupled to NREL's comprehensive aeroelastic analysis tool, FAST. An atmospheric boundary layer (ABL) precursor simulation was constructed based on assessments of meteorological tower, lidar, and radar data over a 3-hour window. This precursor was tuned to the specific atmospheric conditions that occurred both prior to and during the measurement campaign, enabling capture of a night-to-day transition in the turbulent ABL. In the absence of height-varying temperature measurements, spatially averaged radar data were sufficient to characterize the atmospheric stability of the wind plant in terms of the shear profile, and near-ground temperature sensors provided a reasonable estimate of the ground heating rate describing the morning transition. A full aeroelastic simulation was then performed for a subset of turbines within the wind plant, driven by the precursor. Analysis of two turbines within the array, one directly waked by the other, demonstrated good agreement with measured time-averaged loads.

  4. Development of a Wind Plant Large-Eddy Simulation with Measurement-Driven Atmospheric Inflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quon, Eliot W.; Churchfield, Matthew J.; Cheung, Lawrence

    This paper details the development of an aeroelastic wind plant model with large-eddy simulation (LES). The chosen LES solver is the Simulator for Wind Farm Applications (SOWFA) based on the OpenFOAM framework, coupled to NREL's comprehensive aeroelastic analysis tool, FAST. An atmospheric boundary layer (ABL) precursor simulation was constructed based on assessments of meteorological tower, lidar, and radar data over a 3-hour window. This precursor was tuned to the specific atmospheric conditions that occurred both prior to and during the measurement campaign, enabling capture of a night-to-day transition in the turbulent ABL. In the absence of height-varying temperature measurements, spatially averaged radar data were sufficient to characterize the atmospheric stability of the wind plant in terms of the shear profile, and near-ground temperature sensors provided a reasonable estimate of the ground heating rate describing the morning transition. A full aeroelastic simulation was then performed for a subset of turbines within the wind plant, driven by the precursor. Analysis of two turbines within the array, one directly waked by the other, demonstrated good agreement with measured time-averaged loads.

  5. Beamforming using subspace estimation from a diagonally averaged sample covariance.

    PubMed

    Quijano, Jorge E; Zurk, Lisa M

    2017-08-01

    The potential benefit of a large-aperture sonar array for high resolution target localization is often challenged by the lack of sufficient data required for adaptive beamforming. This paper introduces a Toeplitz-constrained estimator of the clairvoyant signal covariance matrix corresponding to multiple far-field targets embedded in background isotropic noise. The estimator is obtained by averaging along subdiagonals of the sample covariance matrix, followed by covariance extrapolation using the method of maximum entropy. The sample covariance is computed from limited data snapshots, a situation commonly encountered with large-aperture arrays in environments characterized by short periods of local stationarity. Eigenvectors computed from the Toeplitz-constrained covariance are used to construct signal-subspace projector matrices, which are shown to reduce background noise and improve detection of closely spaced targets when applied to subspace beamforming. Monte Carlo simulations corresponding to increasing array aperture suggest convergence of the proposed projector to the clairvoyant signal projector, thereby outperforming the classic projector obtained from the sample eigenvectors. Beamforming performance of the proposed method is analyzed using simulated data, as well as experimental data from the Shallow Water Array Performance experiment.
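    A minimal sketch of the diagonal-averaging step described above, under invented array and snapshot parameters: average each subdiagonal of the sample covariance to impose Toeplitz structure (the maximum-entropy extrapolation step is omitted), then form a signal-subspace projector from the leading eigenvectors.

```python
# Toeplitz-constrained covariance via subdiagonal averaging (illustrative).
import numpy as np

def toeplitz_average(R: np.ndarray) -> np.ndarray:
    """Average each subdiagonal of a square matrix (Toeplitz constraint)."""
    n = R.shape[0]
    T = np.zeros_like(R)
    for k in range(-(n - 1), n):
        m = np.mean(np.diagonal(R, offset=k))
        idx = np.arange(max(0, -k), min(n, n - k))
        T[idx, idx + k] = m                       # fill the k-th diagonal
    return T

rng = np.random.default_rng(1)
n, snapshots = 32, 8                              # aperture >> snapshot count
angles = np.deg2rad([-10.0, 12.0])                # two far-field plane waves
a = np.exp(1j * np.pi * np.outer(np.arange(n), np.sin(angles)))
sig = rng.standard_normal((2, snapshots)) + 1j * rng.standard_normal((2, snapshots))
noise = rng.standard_normal((n, snapshots)) + 1j * rng.standard_normal((n, snapshots))
x = a @ sig + 0.1 * noise
R = x @ x.conj().T / snapshots                    # rank-deficient sample covariance

T = toeplitz_average(R)                           # Hermitian Toeplitz estimate
w, v = np.linalg.eigh(T)
Us = v[:, -2:]                                    # leading eigenvectors (2 sources)
P = Us @ Us.conj().T                              # signal-subspace projector
print(np.linalg.norm(P @ a - a) / np.linalg.norm(a))  # small if subspace captured
```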

  6. Large-format InGaAs focal plane arrays for SWIR imaging

    NASA Astrophysics Data System (ADS)

    Hood, Andrew D.; MacDougal, Michael H.; Manzo, Juan; Follman, David; Geske, Jonathan C.

    2012-06-01

    FLIR Electro Optical Components will present our latest developments in large InGaAs focal plane arrays, which are used for low light level imaging in the short wavelength infrared (SWIR) regime. FLIR will present imaging from their latest small pitch (15 μm) focal plane arrays in VGA and High Definition (HD) formats. FLIR will present characterization of the FPA including dark current measurements as well as the use of correlated double sampling to reduce read noise. FLIR will show imagery as well as FPA-level characterization data.

  7. Development of associations and kinetic models for microbiological data to be used in comprehensive food safety prediction software.

    PubMed

    Halder, Amit; Black, D Glenn; Davidson, P Michael; Datta, Ashim

    2010-08-01

    The objective of this study was to use an existing database of food products and their associated processes, link it with a list of the foodborne pathogenic microorganisms associated with those products, and finally identify growth and inactivation kinetic parameters associated with those pathogens. The database was to be used as part of the development of comprehensive software which could predict food safety and quality for any food product. The main issues in building such a predictive system included the selection of predictive models, the association of different food types with pathogens (as determined from outbreak histories), and variability in data from different experiments. More than 1000 data sets from published literature were analyzed and grouped according to microorganisms and food types. The final grouping of data consisted of the 8 most prevalent pathogens for 14 different food groups, covering all of the foods (>7000) listed in the USDA Natl. Nutrient Database. Data for each group were analyzed in terms of first-order inactivation, first-order growth, and sigmoidal growth models, and their kinetic responses for growth and inactivation as a function of temperature were reported. Means and 95% confidence intervals were calculated for the prediction equations. The primary advantage in obtaining group-specific kinetic data is the ability to extend microbiological growth and death simulation to a large array of product and process possibilities, while still being reasonably accurate. Such simulation capability could provide vital "what-if" scenarios for industry, Extension, and academia in food safety.
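    A minimal sketch of the two simplest model forms named above, first-order kinetics and sigmoidal (logistic) growth, with illustrative parameter values rather than fitted values from the database:

```python
# First-order inactivation and logistic growth model forms (illustrative).
import numpy as np

def first_order(N0, k, t):
    """First-order kinetics: N(t) = N0 * exp(-k t) (inactivation for k > 0)."""
    return N0 * np.exp(-k * np.asarray(t, dtype=float))

def logistic_growth(N0, Nmax, mu, t):
    """Sigmoidal growth from N0 toward carrying capacity Nmax at max rate mu."""
    t = np.asarray(t, dtype=float)
    return Nmax / (1 + (Nmax / N0 - 1) * np.exp(-mu * t))

t = np.linspace(0, 10, 6)                        # hours
print(first_order(1e6, 0.9, t).round(0))         # CFU/g declining over time
print(logistic_growth(1e2, 1e9, 1.5, t).round(0))  # CFU/g rising to plateau
```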

  8. Exploring the performance of large-N radio astronomical arrays

    NASA Astrophysics Data System (ADS)

    Lonsdale, Colin J.; Doeleman, Sheperd S.; Cappallo, Roger J.; Hewitt, Jacqueline N.; Whitney, Alan R.

    2000-07-01

    New radio telescope arrays are currently being contemplated which may be built using hundreds, or even thousands, of relatively small antennas. These include the One Hectare Telescope of the SETI Institute and UC Berkeley, the LOFAR telescope planned for the New Mexico desert surrounding the VLA, and possibly the ambitious international Square Kilometer Array (SKA) project. Recent and continuing advances in signal transmission and processing technology make it realistic to consider full cross-correlation of signals from such a large number of antennas, permitting the synthesis of an aperture with much greater fidelity than in the past. In principle, many advantages in instrumental performance are gained by this 'large-N' approach to the design, most of which require the development of new algorithms. Because new instruments of this type are expected to outstrip the performance of current instruments by wide margins, much of their scientific productivity is likely to come from the study of objects which are currently unknown. For this reason, instrumental flexibility is of special importance in design studies. A research effort has begun at Haystack Observatory to explore large-N performance benefits, and to determine what array design properties and data reduction algorithms are required to achieve them. The approach to these problems, involving a sophisticated data simulator, algorithm development, and exploration of array configuration parameter space, will be described, and progress to date will be summarized.
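    A toy illustration of the large-N cost driver mentioned above: full cross-correlation forms a visibility for every antenna pair, so both compute and output grow as N(N-1)/2. The signal content below is random placeholder data, not a sky model.

```python
# Pairwise correlation of N antenna voltage streams (illustrative scaling).
import numpy as np

n_ant, n_samples = 64, 4096
rng = np.random.default_rng(2)
v = rng.standard_normal((n_ant, n_samples)) + 1j * rng.standard_normal((n_ant, n_samples))

vis = v @ v.conj().T / n_samples           # N x N correlation matrix in one step
i, j = np.triu_indices(n_ant, k=1)         # indices of the unique baselines
baselines = vis[i, j]
print(len(baselines), "baselines for", n_ant, "antennas")  # 2016 for N = 64
```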

  9. The Kepler DB: a database management system for arrays, sparse arrays, and binary data

    NASA Astrophysics Data System (ADS)

    McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Middour, Christopher; Klaus, Todd C.; Wohler, Bill

    2010-07-01

    The Kepler Science Operations Center stores pixel values on approximately six million pixels collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database management system (Kepler DB) was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database where data are represented as one-dimensional arrays, sparse arrays or binary large objects. We will discuss Kepler DB's APIs, implementation, usage and deployment at the Kepler Science Operations Center.

  10. The Kepler DB, a Database Management System for Arrays, Sparse Arrays and Binary Data

    NASA Technical Reports Server (NTRS)

    McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Middour, Christopher; Klaus, Todd C.; Wohler, Bill

    2010-01-01

    The Kepler Science Operations Center stores pixel values on approximately six million pixels collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database (Kepler DB) management system was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database where data are represented as one-dimensional arrays, sparse arrays or binary large objects. We will discuss Kepler DB's APIs, implementation, usage and deployment at the Kepler Science Operations Center.

  11. High-channel-count, high-density microelectrode array for closed-loop investigation of neuronal networks.

    PubMed

    Tsai, David; John, Esha; Chari, Tarun; Yuste, Rafael; Shepard, Kenneth

    2015-01-01

    We present a system for large-scale electrophysiological recording and stimulation of neural tissue with a planar topology. The recording system has 65,536 electrodes arranged in a 256 × 256 grid with 25.5 μm pitch, covering an area of approximately 42.6 mm². The recording chain has 8.66 μV rms input-referred noise over a 100 Hz to 10 kHz bandwidth while providing up to 66 dB of voltage gain. When recording from all electrodes in the array, it is capable of 10 kHz sampling per electrode. All electrodes can also perform patterned electrical microstimulation. The system produces ~1 GB/s of data when recording from the full array. To handle, store, and perform nearly real-time analyses of this large data stream, we developed a framework based around Xilinx FPGAs, Intel x86 CPUs and NVIDIA streaming multiprocessors to interface with the electrode array.
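    A quick consistency check of the quoted data rate; the 16-bit sample width below is our assumption, as the abstract does not state it.

```python
# Back-of-envelope check of the ~1 GB/s full-array recording rate.
electrodes = 256 * 256            # 65,536-channel array
rate_hz = 10_000                  # 10 kHz sampling per electrode
bytes_per_sample = 2              # assumed 16-bit ADC words (not stated above)

gbps = electrodes * rate_hz * bytes_per_sample / 1e9
print(f"{gbps:.2f} GB/s")         # ~1.31 GB/s, consistent with the quoted ~1 GB/s
```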

  12. Genomic Characterization of DArT Markers Based on High-Density Linkage Analysis and Physical Mapping to the Eucalyptus Genome

    PubMed Central

    Petroli, César D.; Sansaloni, Carolina P.; Carling, Jason; Steane, Dorothy A.; Vaillancourt, René E.; Myburg, Alexander A.; da Silva, Orzenil Bonfim; Pappas, Georgios Joannis; Kilian, Andrzej; Grattapaglia, Dario

    2012-01-01

    Diversity Arrays Technology (DArT) provides a robust, high throughput, cost-effective method to query thousands of sequence polymorphisms in a single assay. Despite the extensive use of this genotyping platform for numerous plant species, little is known regarding the sequence attributes and genome-wide distribution of DArT markers. We investigated the genomic properties of the 7,680 DArT marker probes of a Eucalyptus array, by sequencing them, constructing a high density linkage map and carrying out detailed physical mapping analyses to the Eucalyptus grandis reference genome. A consensus linkage map with 2,274 DArT markers anchored to 210 microsatellites and a framework map, with improved support for ordering, displayed extensive collinearity with the genome sequence. Only 1.4 Mbp of the 75 Mbp of still unplaced scaffold sequence was captured by 45 linkage mapped but physically unaligned markers to the 11 main Eucalyptus pseudochromosomes, providing compelling evidence for the quality and completeness of the current Eucalyptus genome assembly. A highly significant correspondence was found between the locations of DArT markers and predicted gene models, while most of the 89 DArT probes unaligned to the genome correspond to sequences likely absent in E. grandis, consistent with the pan-genomic feature of this multi-Eucalyptus species DArT array. These comprehensive linkage-to-physical mapping analyses provide novel data regarding the genomic attributes of DArT markers in plant genomes in general and for Eucalyptus in particular. DArT markers preferentially target the gene space and display a largely homogeneous distribution across the genome, thereby providing superb coverage for mapping and genome-wide applications in breeding and diversity studies. Data reported on these ubiquitous properties of DArT markers will be particularly valuable to researchers working on less-studied crop species who already count on DArT genotyping arrays but for which no reference genome is yet available to allow such detailed characterization. PMID:22984541

  13. DAnTE: a statistical tool for quantitative analysis of –omics data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep

    2008-05-03

    DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.

  14. Comparative testing of radiographic testing, ultrasonic testing and phased array advanced ultrasonic testing non destructive testing techniques in accordance with the AWS D1.5 bridge welding code.

    DOT National Transportation Integrated Search

    2014-02-01

    A comprehensive body of non-destructive testing data was collected from steel bridge welds under real-world conditions in a fabricator's shop. Three different non-destructive testing (NDT) techniques were used on each weld inspection, these being R...

  15. A systems biology approach using transcriptomic data reveals genes and pathways in porcine skeletal muscle affected by dietary lysine

    USDA-ARS?s Scientific Manuscript database

    Meeting the increasing market demands for pork products requires improvement of the feed efficiency of growing pigs. The use of Affymetrix Porcine Gene 1.0 ST array containing 19,211 genes in this study provides a comprehensive gene expression profile of skeletal muscle of finishing pigs in response...

  16. Solar array electrical performance assessment for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Smith, Bryan K.; Brisco, Holly

    1993-01-01

    Electrical power for Space Station Freedom will be generated by large Photovoltaic arrays with a beginning of life power requirement of 30.8 kW per array. The solar arrays will operate in a Low Earth Orbit (LEO) over a design life of fifteen years. This paper provides an analysis of the predicted solar array electrical performance over the design life and presents a summary of supporting analysis and test data for the assigned model parameters and performance loss factors. Each model parameter and loss factor is assessed based upon program requirements, component analysis, and test data to date. A description of the LMSC performance model, future test plans, and predicted performance ranges are also given.

  17. Solar array electrical performance assessment for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Smith, Bryan K.; Brisco, Holly

    1993-01-01

    Electrical power for Space Station Freedom will be generated by large photovoltaic arrays with a beginning of life power requirement of 30.8 kW per array. The solar arrays will operate in a Low Earth Orbit (LEO) over a design life of fifteen years. This paper provides an analysis of the predicted solar array electrical performance over the design life and presents a summary of supporting analysis and test data for the assigned model parameters and performance loss factors. Each model parameter and loss factor is assessed based upon program requirements, component analysis, and test data to date. A description of the LMSC performance model, future test plans, and predicted performance ranges are also given.

  18. The development and test of ultra-large-format multi-anode microchannel array detector systems

    NASA Technical Reports Server (NTRS)

    Timothy, J. G.

    1984-01-01

    The specific tasks that were accomplished with each of the key elements of the multi-anode microchannel array detector system are described. The modes of operation of position-sensitive electronic readout systems for use with high-gain microchannel plates are described and their performance characteristics compared and contrasted. Multi-anode microchannel array detector systems with formats as large as 256 x 1024 pixels are currently under evaluation. Preliminary performance data for sealed ultraviolet and visible-light detector tubes show that the detector systems have unique characteristics which make them complementary to photoconductive array detectors, such as CCDs, and superior to alternative pulse-counting detector systems employing high-gain MCPs.

  19. Comprehensive performance comparison of high-resolution array platforms for genome-wide Copy Number Variation (CNV) analysis in humans.

    PubMed

    Haraksingh, Rajini R; Abyzov, Alexej; Urban, Alexander Eckehart

    2017-04-24

High-resolution microarray technology is routinely used in basic research and clinical practice to efficiently detect copy number variants (CNVs) across the entire human genome. A new generation of arrays combining high probe densities with optimized designs will comprise essential tools for genome analysis in the coming years. We systematically compared the genome-wide CNV detection power of all 17 available array designs from the Affymetrix, Agilent, and Illumina platforms by hybridizing the well-characterized genome of 1000 Genomes Project subject NA12878 to all arrays, and performing data analysis using both manufacturer-recommended and platform-independent software. We benchmarked the resulting CNV call sets from each array using a gold standard set of CNVs for this genome derived from 1000 Genomes Project whole genome sequencing data. The arrays tested comprise both SNP and aCGH platforms with varying designs and contain between ~0.5 and ~4.6 million probes. Across the arrays, CNV detection varied widely in the number of CNV calls (4-489), CNV size range (~40 bp to ~8 Mbp), and percentage of non-validated CNVs (0-86%). We discovered strikingly strong effects of specific array design principles on performance. For example, some SNP array designs with the largest numbers of probes and extensive exonic coverage produced a considerable number of CNV calls that could not be validated, compared to designs with probe numbers that are sometimes an order of magnitude smaller. This effect was only partially ameliorated using different analysis software and optimizing data analysis parameters. High-resolution microarrays will continue to be used as reliable, cost- and time-efficient tools for CNV analysis. However, different applications tolerate different limitations in CNV detection. Our study quantified how these arrays differ in total number and size range of detected CNVs as well as sensitivity, and determined how each array balances these attributes. This analysis will inform appropriate array selection for future CNV studies, and allow better assessment of the CNV-analytical power of both published and ongoing array-based genomics studies. Furthermore, our findings emphasize the importance of concurrent use of multiple analysis algorithms and independent experimental validation in array-based CNV detection studies.
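
    The benchmarking step described above can be pictured with a small sketch: score each CNV call against a gold-standard set by reciprocal overlap. The 50% threshold and the intervals below are illustrative assumptions, not the paper's exact procedure.

      def reciprocal_overlap(a, b):
          """Fraction of overlap relative to the longer of the two intervals."""
          start, end = max(a[0], b[0]), min(a[1], b[1])
          if end <= start:
              return 0.0
          return (end - start) / max(a[1] - a[0], b[1] - b[0])

      def validation_rate(calls, gold, threshold=0.5):
          """Percent of calls matching any gold-standard CNV at >= threshold."""
          validated = sum(
              any(reciprocal_overlap(c, g) >= threshold for g in gold) for c in calls
          )
          return 100.0 * validated / len(calls)

      calls = [(100_000, 140_000), (500_000, 508_000), (2_000_000, 2_004_000)]
      gold = [(98_000, 141_000), (1_999_000, 2_005_000)]
      print(f"validated: {validation_rate(calls, gold):.0f}%")  # 2 of 3 calls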

  20. SEDPAK—A comprehensive operational system and data-processing package in APPLESOFT BASIC for a settling tube, sediment analyzer

    NASA Astrophysics Data System (ADS)

    Goldbery, R.; Tehori, O.

SEDPAK provides a comprehensive software package for operation of a settling tube and sand analyzer (2-0.063 mm) and includes data-processing programs for statistical and graphic output of results. The programs are menu-driven and written in APPLESOFT BASIC, conforming with APPLE 3.3 DOS. Data storage and retrieval from disc is an important feature of SEDPAK. Additional features of SEDPAK include condensation of raw settling data via standard size-calibration curves to yield statistical grain-size parameters, plots of grain-size frequency distributions and cumulative log/probability curves. The program also has a module for processing of grain-size frequency data from sieved samples. An additional feature of SEDPAK is the option for automatic data processing and graphic output of a sequential or nonsequential array of samples on one side of a disc.

1. Review of biased solar array - plasma interaction studies

    NASA Technical Reports Server (NTRS)

    Stevens, N. J.

    1981-01-01

    The Solar Electric Propulsion System (SEPS) is proposed for a variety of space missions. Power for operating SEPS is obtained from large solar array wings capable of generating tens of kilowatts of power. To minimize resistive losses in the solar array bus lines, the array is designed to operate at voltages up to 400 volts. This use of high voltage can increase interactions between the biased solar cell interconnects and plasma environments. With thrusters operating, the system ground is maintained at space plasma potential which exposes large areas of the arrays at the operating voltages. This can increase interactions with both the natural and enhanced charged particle environments. Available data on interactions between biased solar array surfaces and plasma environments are summarized. The apparent relationship between collection phenomena and solar cell size and effects of array size on interactions are discussed. The impact of these interactions on SEPS performance is presented.

  2. Preliminary results from the flight of the Solar Array Module Plasma Interactions Experiment (SAMPIE)

    NASA Technical Reports Server (NTRS)

    Ferguson, Dale C.; Hillard, G. Barry

    1994-01-01

    SAMPIE, the Solar Array Module Plasma Interactions Experiment, flew in the Space Shuttle Columbia payload bay as part of the Office of Aeronautics and Space Technology-2 (OAST-2) mission on STS-62, March, 1994. SAMPIE biased samples of solar arrays and space power materials to varying potentials with respect to the surrounding space plasma, and recorded the plasma currents collected and the arcs which occurred, along with a set of plasma diagnostics data. A large set of high quality data was obtained on the behavior of solar arrays and space power materials in the space environment. This paper is the first report on the data SAMPIE telemetered to the ground during the mission. It will be seen that the flight data promise to help determine arcing thresholds, snapover potentials, and floating potentials for arrays and spacecraft in LEO.

  3. Seismicity in Oklahoma Before Prague

    NASA Astrophysics Data System (ADS)

    Delorey, A. A.; Johnson, P. A.

    2017-12-01

The 2011 M5.7 Prague earthquake was the first large anthropogenically induced earthquake in Oklahoma. Since then, three more M5+ earthquakes followed it near Fairview, Pawnee, and Cushing. Oklahoma induced seismicity has garnered considerable attention from both the media and the scientific community, but little is known about seismicity in Oklahoma prior to the Prague earthquake due to a lack of instrumentation. We ask the question, "Was there any indication in the geophysical record prior to the Prague earthquake that bigger earthquakes were becoming more likely?" Fortunately, stations from EarthScope's Transportable Array were in Oklahoma during 2010 and 2011, providing a sparse but still useful data set. Using our microseismicity detector called Interstation Seismic Coherence, we were able to catalog over 3000 earthquakes with a magnitude of completeness around 2.0 in northeastern Oklahoma over 17 months between June 2010 and the Prague earthquake in November 2011. During this period of time there are fewer than 200 earthquakes in the ANSS Comprehensive Catalog and 900 in the catalog produced by the Array Network Facility at UCSD using Transportable Array stations. The M>5 earthquakes occurred in a region where stress conditions and seismicity rates were evolving much faster than they do in many natural systems, presenting an opportunity to study the time dependence of upper crustal behavior. A clustering analysis shows that earthquakes occurring in northeastern Oklahoma during 2010-2011 are highly correlated with the magnitude of solid earth tides. Although some aftershocks and clusters were recorded following the Prague earthquake using temporary arrays, regional seismicity is not well recorded again until later in 2013. Of note, after 2013, we no longer observe tidal correlation, suggesting the ensemble of fault criticality has evolved. One explanation for this change in earthquake behavior is a change in poroelastic conditions.
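
    The tidal-correlation finding suggests a simple sanity check one could run: compare the tide value at event times against the overall tide distribution. The sketch below uses a synthetic sinusoidal tide and a toy catalog deliberately biased toward high tide; the authors' actual method is not specified in the abstract.

      import numpy as np

      rng = np.random.default_rng(0)
      t = np.arange(0, 3600 * 24 * 30, 60.0)              # one month, 60 s samples
      tide = np.sin(2 * np.pi * t / 44714.0)              # ~12.42 h lunar period

      event_times = rng.choice(t[tide > 0.5], size=100)   # toy catalog biased to high tide
      event_tide = np.interp(event_times, t, tide)

      # If events were independent of the tide, the mean tide value at event times
      # should be near the overall mean; a large offset suggests tidal modulation.
      print(f"mean tide at events: {event_tide.mean():+.2f}  overall: {tide.mean():+.2f}")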

  4. A comprehensive analysis of the performance characteristics of the Mount Laguna solar photovoltaic installation

    NASA Technical Reports Server (NTRS)

    Shumka, A.; Sollock, S. G.

    1981-01-01

    This paper represents the first comprehensive survey of the Mount Laguna Photovoltaic Installation. The novel techniques used for performing the field tests have been effective in locating and characterizing defective modules. A comparative analysis on the two types of modules used in the array indicates that they have significantly different failure rates, different distributions in degradational space and very different failure modes. A life cycle model is presented to explain a multimodal distribution observed for one module type. A statistical model is constructed and it is shown to be in good agreement with the field data.

  5. Statistical analysis of kinetic energy entrainment in a model wind turbine array boundary layer

    NASA Astrophysics Data System (ADS)

    Cal, Raul Bayoan; Hamilton, Nicholas; Kang, Hyung-Suk; Meneveau, Charles

    2012-11-01

    For large wind farms, kinetic energy must be entrained from the flow above the wind turbines to replenish wakes and enable power extraction in the array. Various statistical features of turbulence causing vertical entrainment of mean-flow kinetic energy are studied using hot-wire velocimetry data taken in a model wind farm in a scaled wind tunnel experiment. Conditional statistics and spectral decompositions are employed to characterize the most relevant turbulent flow structures and determine their length-scales. Sweep and ejection events are shown to be the largest contributors to the vertical kinetic energy flux, although their relative contribution depends upon the location in the wake. Sweeps are shown to be dominant in the region above the wind turbine array. A spectral analysis of the data shows that large scales of the flow, about the size of the rotor diameter in length or larger, dominate the vertical entrainment. The flow is more incoherent below the array, causing decreased vertical fluxes there. The results show that improving the rate of vertical kinetic energy entrainment into wind turbine arrays is a standing challenge and would require modifying the large-scale structures of the flow. This work was funded in part by the National Science Foundation (CBET-0730922, CBET-1133800 and CBET-0953053).
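
    The sweep/ejection decomposition mentioned above is classical quadrant analysis; a minimal numpy version on synthetic velocity fluctuations (a stand-in for the hot-wire data) might look like this, with sweeps in Q4 (u'>0, w'<0) and ejections in Q2 (u'<0, w'>0).

      import numpy as np

      rng = np.random.default_rng(1)
      u = rng.normal(size=100_000)                        # streamwise fluctuation u'
      w = -0.4 * u + rng.normal(scale=0.9, size=u.size)   # correlated vertical w'

      uw = u * w
      sweep = uw[(u > 0) & (w < 0)].sum() / uw.size       # Q4 contribution
      ejection = uw[(u < 0) & (w > 0)].sum() / uw.size    # Q2 contribution
      total = uw.mean()
      print(f"total u'w' = {total:+.3f}  sweeps = {sweep:+.3f}  ejections = {ejection:+.3f}")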

  6. Monitoring the Earth's Atmosphere with the Global IMS Infrasound Network

    NASA Astrophysics Data System (ADS)

    Brachet, Nicolas; Brown, David; Mialle, Pierrick; Le Bras, Ronan; Coyne, John; Given, Jeffrey

    2010-05-01

The Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is tasked with monitoring compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which bans nuclear weapon explosions underground, in the oceans, and in the atmosphere. The verification regime includes a globally distributed network of seismic, hydroacoustic, infrasound and radionuclide stations which collect and transmit data to the International Data Centre (IDC) in Vienna, Austria shortly after the data are recorded at each station. The infrasound network defined in the Protocol of the CTBT comprises 60 infrasound array stations. Each array is built according to the same technical specifications and is typically composed of 4 to 9 sensors with an aperture of 1 to 3 km. At the end of 2000 only one infrasound station was transmitting data to the IDC. Since then, 41 additional stations have been installed and 70% of the infrasound network is currently certified and contributing data to the IDC. This constitutes the first global infrasound network ever built with such a large and uniform distribution of stations. Infrasound data at the IDC are processed at the station level using the Progressive Multi-Channel Correlation (PMCC) method for the detection and measurement of infrasound signals. The algorithm calculates the signal correlation between sensors at an infrasound array. If the signal is sufficiently correlated and consistent over an extended period of time and frequency range, a detection is created. Groups of detections are then categorized according to their propagation and waveform features, and a phase name is assigned for infrasound, seismic or noise detections. The categorization complements the PMCC algorithm to avoid overwhelming the IDC automatic association algorithm with false-alarm infrasound events. Currently, 80 to 90% of the detections are identified as noise by the system. Although the noise detections are not used to build events in the context of CTBT monitoring, they represent valuable data for other civil applications like monitoring of natural hazards (volcanic activity, storm tracking) and climate change. Non-noise detections are used in network processing at the IDC along with seismic and hydroacoustic technologies. The arrival phases detected on the three waveform technologies may be combined and used for locating events in an automatically generated bulletin of events. This automatic event bulletin is routinely reviewed by analysts during the interactive review process. However, the fusion of infrasound data with the other waveform technologies has only recently (in early 2010) become part of the IDC operational system, after a software development and testing period that began in 2004. The build-up of the IMS infrasound network, the recent developments of the IDC infrasound software, and the progress accomplished during the last decade in the domain of real-time atmospheric modelling have allowed better understanding of infrasound signals and identification of a growing data set of ground-truth sources. These sources may be natural or man-made. Some of the detected signals are emitted by local or regional phenomena recorded by a single IMS infrasound station: man-made cultural activity, wind farms, aircraft, artillery exercises, ocean surf, thunderstorms, rumbling volcanoes, iceberg calving, aurora, avalanches. Other signals may be recorded by several IMS infrasound stations at larger distances: ocean swell, sonic booms, and mountain-associated waves.
Only a small fraction of events meet the event definition criteria considering the Treaty verification mission of the Organization. Candidate event types for the IDC Reviewed Event Bulletin include atmospheric or surface explosions, meteor explosions, rocket launches, signals from large earthquakes and explosive volcanic eruptions.
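
    The correlation stage of PMCC described above can be caricatured in a few lines: measure the peak pairwise cross-correlation across array channels in sliding windows and flag windows where all pairs agree. This is a simplified illustrative stand-in, not the operational implementation.

      import numpy as np

      def window_consistency(channels, win=256):
          """Mean peak normalized cross-correlation over all sensor pairs per window."""
          n_ch, n = channels.shape
          scores = []
          for start in range(0, n - win + 1, win):
              seg = channels[:, start:start + win]
              seg = seg - seg.mean(axis=1, keepdims=True)
              seg /= seg.std(axis=1, keepdims=True) + 1e-12
              peaks = []
              for i in range(n_ch):
                  for j in range(i + 1, n_ch):
                      cc = np.correlate(seg[i], seg[j], mode="full") / win
                      peaks.append(cc.max())
              scores.append(np.mean(peaks))
          return np.array(scores)

      rng = np.random.default_rng(2)
      noise = rng.normal(size=(4, 2048))
      signal = np.sin(2 * np.pi * 5 * np.arange(2048) / 512.0)
      data = noise + signal                    # same coherent signal on all sensors
      print(window_consistency(data))          # high values indicate a detection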

  7. Fast disk array for image storage

    NASA Astrophysics Data System (ADS)

    Feng, Dan; Zhu, Zhichun; Jin, Hai; Zhang, Jiangling

    1997-01-01

A fast disk array is designed for large-volume continuous image storage. It includes a high-speed data path architecture and the technology of data striping and organization on the disk array. The high-speed data path, which is constructed from two dual-port RAMs and some control circuitry, is configured to transfer data between a host system and a plurality of disk drives. The bandwidth can exceed 100 MB/s if the data path is based on PCI (peripheral component interconnect). The organization of data stored on the disk array is similar to RAID 4. Data are striped across a plurality of disks, and each striping unit is equal to a track. I/O instructions are performed in parallel on the disk drives. An independent disk is used to store the parity information in the fast disk array architecture. By placing the parity generation circuit directly on the SCSI (or SCSI 2) bus, the parity information can be generated on the fly, with little effect on the parallel writing of data to the other disks. The fast disk array architecture designed in this paper can meet the demands of image storage.
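
    The dedicated parity disk in a RAID 4 style layout works by XOR across the striping units, which is what allows parity to be generated on the fly and any single lost stripe to be rebuilt; a toy byte-level sketch (the stripe contents are invented):

      def xor_parity(stripes):
          """XOR equal-length stripes byte by byte to produce the parity stripe."""
          parity = bytearray(len(stripes[0]))
          for stripe in stripes:
              for i, byte in enumerate(stripe):
                  parity[i] ^= byte
          return bytes(parity)

      stripes = [b"track-0.", b"track-1.", b"track-2."]   # one striping unit per disk
      parity = xor_parity(stripes)

      # Simulate losing disk 1 and rebuilding it from the survivors plus parity
      rebuilt = xor_parity([stripes[0], stripes[2], parity])
      assert rebuilt == stripes[1]
      print("stripe recovered:", rebuilt)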

  8. Assessing copy number from exome sequencing and exome array CGH based on CNV spectrum in a large clinical cohort.

    PubMed

    Retterer, Kyle; Scuffins, Julie; Schmidt, Daniel; Lewis, Rachel; Pineda-Alvarez, Daniel; Stafford, Amanda; Schmidt, Lindsay; Warren, Stephanie; Gibellini, Federica; Kondakova, Anastasia; Blair, Amanda; Bale, Sherri; Matyakhina, Ludmila; Meck, Jeanne; Aradhya, Swaroop; Haverfield, Eden

    2015-08-01

Detection of copy-number variation (CNV) is important for investigating many genetic disorders. Testing a large clinical cohort by array comparative genomic hybridization provides a deep perspective on the spectrum of pathogenic CNV. In this context, we describe a bioinformatics approach to extract CNV information from whole-exome sequencing and demonstrate its utility in clinical testing. Exon-focused arrays and whole-genome chromosomal microarray analysis were used to test 14,228 and 14,000 individuals, respectively. Based on these results, we developed an algorithm to detect deletions/duplications in whole-exome sequencing data and a novel whole-exome array. In the exon array cohort, we observed a positive detection rate of 2.4% (25 duplications, 318 deletions), of which 39% involved one or two exons. Chromosomal microarray analysis identified 3,345 CNVs affecting single genes (18%). We demonstrate that our whole-exome sequencing algorithm resolves CNVs of three or more exons. These results demonstrate the clinical utility of single-exon resolution in CNV assays. Our whole-exome sequencing algorithm approaches this resolution but is complemented by a whole-exome array to unambiguously identify intragenic CNVs and single-exon changes. These data illustrate the next advancements in CNV analysis through whole-exome sequencing and whole-exome array. Genet Med 17(8), 623-629.
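
    The paper's algorithm itself is not given in the abstract; a common read-depth approach consistent with the quoted three-exon resolution is to threshold per-exon log2 coverage ratios and keep runs of at least three exons. The thresholds and depths below are assumptions for illustration.

      import numpy as np

      def call_cnvs(sample_depth, reference_depth, threshold=0.58, min_exons=3):
          """Flag runs of min_exons+ consecutive exons beyond a log2-ratio threshold."""
          log2r = np.log2((sample_depth + 1) / (reference_depth + 1))
          state = np.where(log2r >= threshold, 1, np.where(log2r <= -threshold, -1, 0))
          calls, start = [], None
          for i, s in enumerate(np.append(state, 0)):     # sentinel closes final run
              if s != 0 and start is None:
                  start, kind = i, s
              elif start is not None and s != kind:
                  if i - start >= min_exons:
                      calls.append(("DUP" if kind > 0 else "DEL", start, i - 1))
                  start = None if s == 0 else i
                  kind = s
          return calls

      ref = np.full(12, 100.0)
      samp = ref.copy()
      samp[4:8] = 45.0                    # heterozygous deletion spanning 4 exons
      print(call_cnvs(samp, ref))         # [('DEL', 4, 7)]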

  9. Sensor Fusion Techniques for Phased-Array Eddy Current and Phased-Array Ultrasound Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arrowood, Lloyd F.

Sensor (or Data) fusion is the process of integrating multiple data sources to produce more consistent, accurate and comprehensive information than is provided by a single data source. Sensor fusion may also be used to combine multiple signals from a single modality to improve the performance of a particular inspection technique. Industrial nondestructive testing may utilize multiple sensors to acquire inspection data depending upon the object under inspection and the anticipated types of defects that can be identified. Sensor fusion can be performed at various levels of signal abstraction with each having its strengths and weaknesses. A multimodal data fusion strategy first proposed by Heideklang and Shokouhi that combines spatially scattered detection locations to improve detection performance of surface-breaking and near-surface cracks in ferromagnetic metals is shown using a surface inspection example and is then extended for volumetric inspections. Utilizing data acquired with an Olympus Omniscan MX2 from both phased array eddy current and ultrasound probes on test phantoms, single and multilevel fusion techniques are employed to integrate signals from the two modalities. Preliminary results demonstrate how confidence in defect identification and interpretation benefit from sensor fusion techniques. Lastly, techniques for integrating data into radiographic and volumetric imagery from computed tomography are described and results are presented.

  10. Very low-depth sequencing in a founder population identifies a cardioprotective APOC3 signal missed by genome-wide imputation.

    PubMed

    Gilly, Arthur; Ritchie, Graham Rs; Southam, Lorraine; Farmaki, Aliki-Eleni; Tsafantakis, Emmanouil; Dedoussis, George; Zeggini, Eleftheria

    2016-06-01

Cohort-wide very low-depth whole-genome sequencing (WGS) can comprehensively capture low-frequency sequence variation for the cost of a dense genome-wide genotyping array. Here, we analyse 1x sequence data across the APOC3 gene in a founder population from the island of Crete in Greece (n = 1239) and find significant evidence for association with blood triglyceride levels with the previously reported R19X cardioprotective null mutation (β = -1.09, σ = 0.163, P = 8.2 × 10^-11) and a second loss-of-function mutation, rs138326449 (β = -1.17, σ = 0.188, P = 1.14 × 10^-9). The signal cannot be recapitulated by imputing genome-wide genotype data on a large reference panel of 5122 individuals including 249 with 4x WGS data from the same population. Gene-level meta-analysis with other studies reporting burden signals at APOC3 provides robust evidence for a replicable cardioprotective rare variant aggregation (P = 3.2 × 10^-31, n = 13,480). © The Author 2016. Published by Oxford University Press.

  11. Very low-depth sequencing in a founder population identifies a cardioprotective APOC3 signal missed by genome-wide imputation

    PubMed Central

    Gilly, Arthur; Ritchie, Graham Rs; Southam, Lorraine; Farmaki, Aliki-Eleni; Tsafantakis, Emmanouil; Dedoussis, George; Zeggini, Eleftheria

    2016-01-01

Cohort-wide very low-depth whole-genome sequencing (WGS) can comprehensively capture low-frequency sequence variation for the cost of a dense genome-wide genotyping array. Here, we analyse 1x sequence data across the APOC3 gene in a founder population from the island of Crete in Greece (n = 1239) and find significant evidence for association with blood triglyceride levels with the previously reported R19X cardioprotective null mutation (β = −1.09, σ = 0.163, P = 8.2 × 10^-11) and a second loss-of-function mutation, rs138326449 (β = −1.17, σ = 0.188, P = 1.14 × 10^-9). The signal cannot be recapitulated by imputing genome-wide genotype data on a large reference panel of 5122 individuals including 249 with 4x WGS data from the same population. Gene-level meta-analysis with other studies reporting burden signals at APOC3 provides robust evidence for a replicable cardioprotective rare variant aggregation (P = 3.2 × 10^-31, n = 13,480). PMID:27146844

  12. Plasma Interactions with High Voltage Solar Arrays for a Direct Drive Hall Effect Thruster System

    NASA Technical Reports Server (NTRS)

    Schneider, T.; Horvater, M. A.; Vaughn, J.; Carruth, M. R.; Jongeward, G. A.; Mikellides, I. G.

    2003-01-01

The Environmental Effects Group of NASA's Marshall Space Flight Center (MSFC) is conducting research into the effects of plasma interaction with high voltage solar arrays. These high voltage solar arrays are being developed for a direct drive Hall Effect Thruster propulsion system. A direct drive system configuration will reduce power system mass by eliminating a conventional power-processing unit. The Environmental Effects Group has configured two large vacuum chambers to test different high-voltage array concepts in a plasma environment. Three types of solar arrays have so far been tested: an International Space Station (ISS) planar array, a Tecstar planar array, and a Tecstar solar concentrator array. The plasma environment was generated using a hollow cathode plasma source, which yielded densities between 10^6 and 10^7 per cubic centimeter and electron temperatures of 0.5-1 eV. Each array was positioned in this plasma and biased in the -500 to +500 volt range. The current collection was monitored continuously. In addition, the characteristics of arcing, snapover, and other features were recorded. Analysis of the array performance indicates a time dependence associated with the current collection as well as a tendency for "conditioning" over a large number of runs. Mitigation strategies to reduce parasitic current collection, as well as arcing, include changing cover-glass geometry and layout as well as shielding the solar cell edges. High voltage performance data for each of the solar array types tested will be presented. In addition, data will be provided to indicate the effectiveness of the mitigation techniques.

13. Hydrogen Epoch of Reionization Array (HERA) Calibrated FFT Correlator Simulation

    NASA Astrophysics Data System (ADS)

    Salazar, Jeffrey David; Parsons, Aaron

    2018-01-01

The Hydrogen Epoch of Reionization Array (HERA) project is an astronomical radio interferometer array with a redundant baseline configuration. Interferometer arrays are widely used in radio astronomy because they have a variety of advantages over single antenna systems. For example, they produce images (visibilities) closely matching those of a large antenna (such as the Arecibo observatory), while both the hardware and maintenance costs are significantly lower. However, this method has some complications, one being the computational cost of correlating data from all of the antennas. A correlator is an electronic device that cross-correlates the data between the individual antennas; these are what radio astronomers call visibilities. HERA, being in its early stages, utilizes a traditional correlator system, whose cost scales as N^2, where N is the number of antennas in the array. The purpose of a redundant baseline configuration is to enable a more efficient Fast Fourier Transform (FFT) correlator; FFT correlators scale as N log2 N. The data acquired from this sort of setup, however, inherit geometric delay and uncalibrated antenna gains. This particular project simulates the process of calibrating signals from astronomical sources. Each signal "received" by an antenna in the simulation is given a random antenna gain and geometric delay. The "linsolve" Python module was used to solve for the unknown variables in the simulation (complex gains and delays), which then gave a value for the true visibilities. This first version of the simulation only mimics a one-dimensional redundant telescope array detecting a small number of sources located in the volume above the antenna plane. Future versions, using GPUs, will handle a two-dimensional redundant array of telescopes detecting a large number of sources in the volume above the array.
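
    The calibration step above (solving for per-antenna terms from redundant measurements with linsolve) is, at heart, a linear least-squares problem. The sketch below illustrates a phase-only version with plain numpy for a toy 1-D array; all quantities are invented, and the exact-residual check, not the solver choice, is the point.

      import numpy as np

      rng = np.random.default_rng(3)
      n_ant = 5
      true_phi = rng.normal(scale=0.3, size=n_ant)        # unknown per-antenna phases

      # 1-D redundant array: baselines with the same separation k share one true
      # visibility phase psi_k. Unknowns x = [phi_0..phi_4, psi_1, psi_2].
      baselines = [(i, i + k) for k in (1, 2) for i in range(n_ant - k)]
      true_psi = {1: 0.7, 2: -0.4}
      meas = np.array([true_phi[i] - true_phi[j] + true_psi[j - i]
                       for i, j in baselines])

      A = np.zeros((len(baselines) + 1, n_ant + 2))
      for row, (i, j) in enumerate(baselines):
          A[row, i], A[row, j] = 1.0, -1.0
          A[row, n_ant + (j - i) - 1] = 1.0
      A[-1, 0] = 1.0                 # pin phi_0 = 0 against the overall-phase freedom

      sol, *_ = np.linalg.lstsq(A, np.append(meas, 0.0), rcond=None)

      # The remaining phase-gradient degeneracy is left to lstsq's minimum-norm
      # choice here; a real pipeline constrains it explicitly. The fit is exact:
      print("max residual:", np.abs(A[:-1] @ sol - meas).max())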

  14. Hanford Site Anuran Monitoring Report for Calendar Year 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilde, Justin W.; Johnson, Scott J.; Lindsey, Cole T.

    2014-02-13

The U.S. Department of Energy, Richland Operations Office (DOE-RL) conducts ecological monitoring on the Hanford Site to collect and track data needed to ensure compliance with an array of environmental laws, regulations, and policies governing DOE activities. Ecological monitoring data provide baseline information about the plants, animals, and habitat under DOE-RL stewardship at Hanford required for decision-making under the National Environmental Policy Act (NEPA) and Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). The Hanford Site Comprehensive Land Use Plan (CLUP, DOE/EIS-0222-F), which is the Environmental Impact Statement for Hanford Site activities, helps ensure that DOE-RL, its contractors, and other entities conducting activities on the Hanford Site are in compliance with NEPA.

  15. Gamma-Ray Background Variability in Mobile Detectors

    NASA Astrophysics Data System (ADS)

    Aucott, Timothy John

Gamma-ray background radiation significantly reduces detection sensitivity when searching for radioactive sources in the field, such as in wide-area searches for homeland security applications. Mobile detector systems in particular must contend with a variable background that is not necessarily known or even measurable a priori. This work presents measurements of the spatial and temporal variability of the background, with the goal of merging gamma-ray detection, spectroscopy, and imaging with contextual information--a "nuclear street view" of the ubiquitous background radiation. The gamma-ray background originates from a variety of sources, both natural and anthropogenic. The dominant sources in the field are the primordial isotopes potassium-40, uranium-238, and thorium-232, as well as their decay daughters. In addition to the natural background, many artificially created isotopes are used for industrial or medical purposes, and contamination from fission products can be found in many environments. Regardless of origin, these backgrounds reduce detection sensitivity by adding both statistical and systematic uncertainty. In particular, large detector arrays will be limited by the systematic uncertainty in the background and will suffer from a high rate of false alarms. The goal of this work is to provide a comprehensive characterization of the gamma-ray background and its variability in order to improve detection sensitivity and evaluate the performance of mobile detectors in the field. Large quantities of data are measured in order to study performance at very low false alarm rates. Two different approaches, spectroscopy and imaging, are compared in a controlled study in the presence of this measured background. Furthermore, there is additional information that can be gained by correlating the gamma-ray data with contextual data streams (such as cameras and global positioning systems) in order to reduce the variability in the background. This is accomplished by making many hours of background measurements with a truck-mounted system, which utilizes high-purity germanium detectors for spectroscopy and sodium iodide detectors for coded aperture imaging. This system also utilizes various peripheral sensors, such as panoramic cameras, laser ranging systems, global positioning systems, and a weather station to provide context for the gamma-ray data. About three hundred hours of data were taken in the San Francisco Bay Area, covering a wide variety of environments that might be encountered in operational scenarios. These measurements were used in a source injection study to evaluate the sensitivity of different algorithms (imaging and spectroscopy) and hardware (sodium iodide and high-purity germanium detectors). These measurements confirm that background distributions in large, mobile detector systems are dominated by systematic, not statistical, variations, and both spectroscopy and imaging were found to substantially reduce this variability. Spectroscopy performed better than the coded aperture for the given scintillator array (one square meter of sodium iodide) for a variety of sources and geometries. By modeling the statistical and systematic uncertainties of the background, the data can be sampled to simulate the performance of a detector array of arbitrary size and resolution. With a larger array or lower-resolution detectors, however, imaging was better able to compensate for background variability.

  16. Minfi: a flexible and comprehensive Bioconductor package for the analysis of Infinium DNA methylation microarrays

    PubMed Central

    Aryee, Martin J.; Jaffe, Andrew E.; Corrada-Bravo, Hector; Ladd-Acosta, Christine; Feinberg, Andrew P.; Hansen, Kasper D.; Irizarry, Rafael A.

    2014-01-01

    Motivation: The recently released Infinium HumanMethylation450 array (the ‘450k’ array) provides a high-throughput assay to quantify DNA methylation (DNAm) at ∼450 000 loci across a range of genomic features. Although less comprehensive than high-throughput sequencing-based techniques, this product is more cost-effective and promises to be the most widely used DNAm high-throughput measurement technology over the next several years. Results: Here we describe a suite of computational tools that incorporate state-of-the-art statistical techniques for the analysis of DNAm data. The software is structured to easily adapt to future versions of the technology. We include methods for preprocessing, quality assessment and detection of differentially methylated regions from the kilobase to the megabase scale. We show how our software provides a powerful and flexible development platform for future methods. We also illustrate how our methods empower the technology to make discoveries previously thought to be possible only with sequencing-based methods. Availability and implementation: http://bioconductor.org/packages/release/bioc/html/minfi.html. Contact: khansen@jhsph.edu; rafa@jimmy.harvard.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24478339

  17. SAR processing on the MPP

    NASA Technical Reports Server (NTRS)

    Batcher, K. E.; Eddey, E. E.; Faiss, R. O.; Gilmore, P. A.

    1981-01-01

The processing of synthetic aperture radar (SAR) signals using the massively parallel processor (MPP) is discussed. The fast Fourier transform convolution procedures employed in the algorithms are described. The MPP architecture comprises an array unit (ARU) which processes arrays of data; an array control unit which controls the operation of the ARU and performs scalar arithmetic; a program and data management unit which controls the flow of data; and a unique staging memory (SM) which buffers and permutes data. The ARU contains a 128 by 128 array of bit-serial processing elements (PE). Two-by-four subarrays of PEs are packaged in a custom VLSI HCMOS chip. The staging memory is a large multidimensional-access memory which buffers and permutes data flowing within the system. Efficient SAR processing is achieved via ARU communication paths and SM data manipulation. Real-time processing capability can be realized via a multiple-ARU, multiple-SM configuration.
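
    The MPP algorithms above rely on FFT-based ("fast") convolution; this tiny numpy sketch shows the underlying identity, independent of any parallel hardware: convolution in the time domain equals multiplication in the frequency domain, given adequate zero padding.

      import numpy as np

      def fft_convolve(x, h):
          n = len(x) + len(h) - 1                 # pad to avoid circular wrap-around
          X, H = np.fft.rfft(x, n), np.fft.rfft(h, n)
          return np.fft.irfft(X * H, n)

      x = np.array([1.0, 2.0, 3.0, 4.0])
      h = np.array([0.5, 0.5])                    # simple averaging kernel
      print(np.allclose(fft_convolve(x, h), np.convolve(x, h)))   # True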

  18. Big data analytics workflow management for eScience

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks, and (iv) do not provide newer or more scalable storage models to better support the data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support, to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens or hundreds of data analytics operators is the real challenge in many practical scientific use cases. This talk will specifically address the main needs, requirements and challenges regarding data analytics workflow management applied to large scientific datasets. Three real use cases concerning analytics workflows for sea situational awareness, fire danger prevention, climate change and biodiversity will be discussed in detail.
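
    The array primitives listed above (sub-setting, aggregation, concatenation) map naturally onto n-dimensional array operations; this toy numpy sketch is only an analogy, not Ophidia's actual operator implementation.

      import numpy as np

      cube = np.arange(2 * 3 * 4, dtype=float).reshape(2, 3, 4)   # time x lat x lon

      subset = cube[:, 1:3, ::2]            # "slicing and dicing"
      time_max = cube.max(axis=0)           # aggregation over the time axis
      stacked = np.concatenate([cube, cube + 100.0], axis=0)      # array concatenation
      print(subset.shape, time_max.shape, stacked.shape)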

  19. The effect of operations on the ground noise footprints associated with a large multibladed, nonbanging helicopter

    NASA Technical Reports Server (NTRS)

    Hilton, D. A.; Henderson, H. R.; Maglieri, D. J.; Bigler, W. B., II

    1978-01-01

In order to expand the data base of helicopter external noise characteristics, a flyover noise measurement program was conducted utilizing the NASA Civil Helicopter Research Aircraft. The remotely operated multiple array acoustics range (ROMAAR) and a 2560-m linear microphone array were utilized for the purpose of documenting the noise characteristics of the test helicopter during flyby and landing operations. By utilizing both the ROMAAR concept and the linear array, the data necessary to plot the ground noise footprints and noise radiation patterns were obtained. Examples of the measured noise signature of the test helicopter, the ground noise footprint or contours, and the directivity patterns measured during level flyby and landing operations of a large, multibladed, nonbanging helicopter, the CH-53, are presented.

  20. Neural data science: accelerating the experiment-analysis-theory cycle in large-scale neuroscience.

    PubMed

    Paninski, L; Cunningham, J P

    2018-06-01

Modern large-scale multineuronal recording methodologies, including multielectrode arrays, calcium imaging, and optogenetic techniques, produce single-neuron resolution data of a magnitude and precision that were the realm of science fiction twenty years ago. The major bottlenecks in systems and circuit neuroscience no longer lie in simply collecting data from large neural populations, but rather in understanding these data: developing novel scientific questions, with corresponding analysis techniques and experimental designs to fully harness these new capabilities and meaningfully interrogate these questions. Advances in methods for signal processing, network analysis, dimensionality reduction, and optimal control--developed in lockstep with advances in experimental neurotechnology--promise major breakthroughs in multiple fundamental neuroscience problems. These trends are clear in a broad array of subfields of modern neuroscience; this review focuses on recent advances in methods for analyzing neural time-series data with single-neuronal precision. Copyright © 2018 Elsevier Ltd. All rights reserved.
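
    One of the dimensionality-reduction methods alluded to above, sketched with plain numpy: PCA via SVD on a (neurons x time) matrix of synthetic firing rates, keeping the components that capture most of the shared variability. The latent signals and mixing are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(4)
      t = np.linspace(0, 10, 1000)
      latent = np.vstack([np.sin(t), np.cos(0.5 * t)])    # 2 shared latent signals
      mixing = rng.normal(size=(50, 2))                   # 50 recorded neurons
      rates = mixing @ latent + 0.1 * rng.normal(size=(50, t.size))

      centered = rates - rates.mean(axis=1, keepdims=True)
      U, S, Vt = np.linalg.svd(centered, full_matrices=False)
      explained = S**2 / np.sum(S**2)
      print(f"variance explained by 2 PCs: {explained[:2].sum():.1%}")
      projection = Vt[:2]            # low-dimensional population trajectories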

  1. SEURAT: Visual analytics for the integrated analysis of microarray data

    PubMed Central

    2010-01-01

Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. Results We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data. PMID:20525257

  2. Collection and Analysis of Ground Truth Infrasound Data in Kazakhstan and Russia

    DTIC Science & Technology

    2006-05-01

Infrasound signals generated by large mining explosions at the Ekibastuz coal mines in northern Kazakhstan have been detected by a 4-element infrasound array ... (380 km) and Kokchetav (distance = 74 km). Detection of infrasound signals at these distance ranges at mid-latitude (50 degrees N) suggests the...

  3. DALiuGE: A graph execution framework for harnessing the astronomical data deluge

    NASA Astrophysics Data System (ADS)

    Wu, C.; Tobar, R.; Vinsen, K.; Wicenec, A.; Pallot, D.; Lao, B.; Wang, R.; An, T.; Boulton, M.; Cooper, I.; Dodson, R.; Dolensky, M.; Mei, Y.; Wang, F.

    2017-07-01

The Data Activated Liu Graph Engine (DALiuGE) is an execution framework for processing large astronomical datasets at a scale required by the Square Kilometre Array Phase 1 (SKA1). It includes an interface for expressing complex data reduction pipelines consisting of both datasets and algorithmic components and an implementation run-time to execute such pipelines on distributed resources. By mapping the logical view of a pipeline to its physical realisation, DALiuGE separates the concerns of multiple stakeholders, allowing them to collectively optimise large-scale data processing solutions in a coherent manner. The execution in DALiuGE is data-activated, where each individual data item autonomously triggers the processing on itself. Such decentralisation also makes the execution framework very scalable and flexible, supporting pipeline sizes ranging from less than ten tasks running on a laptop to tens of millions of concurrent tasks on the second fastest supercomputer in the world. DALiuGE has been used in production for reducing interferometry datasets from the Karl G. Jansky Very Large Array and the Mingantu Ultrawide Spectral Radioheliograph, and is being developed as the execution framework prototype for the Science Data Processor (SDP) consortium of the Square Kilometre Array (SKA) telescope. This paper presents a technical overview of DALiuGE and discusses case studies from the CHILES and MUSER projects that use DALiuGE to execute production pipelines. In a companion paper, we provide in-depth analysis of DALiuGE's scalability to very large numbers of tasks on two supercomputing facilities.

  4. Los Alamos National Laboratory Economic Analysis Capability Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boero, Riccardo; Edwards, Brian Keith; Pasqualini, Donatella

Los Alamos National Laboratory has developed two types of models to compute the economic impact of infrastructure disruptions. FastEcon is a fast-running model that estimates first-order economic impacts of large-scale events such as hurricanes and floods and can be used to identify the amount of economic activity that occurs in a specific area. LANL’s Computable General Equilibrium (CGE) model estimates more comprehensive static and dynamic economic impacts of a broader array of events and captures the interactions between sectors and industries when estimating economic impacts.

  5. Velocity Model Using the Large-N Seismic Array from the Source Physics Experiment (SPE)

    NASA Astrophysics Data System (ADS)

    Chen, T.; Snelson, C. M.

    2016-12-01

    The Source Physics Experiment (SPE) is a multi-institutional, multi-disciplinary project that consists of a series of chemical explosions conducted at the Nevada National Security Site (NNSS). The goal of SPE is to understand the complicated effect of geological structures on seismic wave propagation and source energy partitioning, develop and validate physics-based modeling, and ultimately better monitor low-yield nuclear explosions. A Large-N seismic array was deployed at the SPE site to image the full 3D wavefield from the most recent SPE-5 explosion on April 26, 2016. The Large-N seismic array consists of 996 geophones (half three-component and half vertical-component sensors), and operated for one month, recording the SPE-5 shot, ambient noise, and additional controlled-sources (a large hammer). This study uses Large-N array recordings of the SPE-5 chemical explosion to develop high resolution images of local geologic structures. We analyze different phases of recorded seismic data and construct a velocity model based on arrival times. The results of this study will be incorporated into the large modeling and simulation efforts as ground-truth further validating the models.

  6. Theory of tailorable optical response of two-dimensional arrays of plasmonic nanoparticles at dielectric interfaces.

    PubMed

    Sikdar, Debabrata; Kornyshev, Alexei A

    2016-09-22

Two-dimensional arrays of plasmonic nanoparticles at interfaces are promising candidates for novel optical metamaterials. Such systems materialise from 'top-down' patterning or 'bottom-up' self-assembly of nanoparticles at liquid/liquid or liquid/solid interfaces. Here, we present a comprehensive analysis of an extended effective quasi-static four-layer-stack model for the description of plasmon-resonance-enhanced optical responses of such systems. We investigate in detail the effects of the size of nanoparticles, average interparticle separation, dielectric constants of the media constituting the interface, and the nanoparticle position relative to the interface. Interesting interplays of these different factors are explored first for normally incident light. For off-normal incidence, the strong effects of the polarisation of light are found at large incident angles, which makes it possible to dynamically tune the reflectance spectra. All the predictions of the theory are tested against full-wave simulations, proving this simplistic model to be adequate within the quasi-static limit. The model takes seconds to calculate the system's optical response and makes it easy to unravel the effect of each system parameter. This helps rapid rationalization of experimental data and understanding of the optical signals from these novel 'metamaterials', optimised for light reflection or harvesting.

  7. Theory of tailorable optical response of two-dimensional arrays of plasmonic nanoparticles at dielectric interfaces

    PubMed Central

    Sikdar, Debabrata; Kornyshev, Alexei A.

    2016-01-01

Two-dimensional arrays of plasmonic nanoparticles at interfaces are promising candidates for novel optical metamaterials. Such systems materialise from ‘top–down’ patterning or ‘bottom–up’ self-assembly of nanoparticles at liquid/liquid or liquid/solid interfaces. Here, we present a comprehensive analysis of an extended effective quasi-static four-layer-stack model for the description of plasmon-resonance-enhanced optical responses of such systems. We investigate in detail the effects of the size of nanoparticles, average interparticle separation, dielectric constants of the media constituting the interface, and the nanoparticle position relative to the interface. Interesting interplays of these different factors are explored first for normally incident light. For off-normal incidence, the strong effects of the polarisation of light are found at large incident angles, which makes it possible to dynamically tune the reflectance spectra. All the predictions of the theory are tested against full-wave simulations, proving this simplistic model to be adequate within the quasi-static limit. The model takes seconds to calculate the system’s optical response and makes it easy to unravel the effect of each system parameter. This helps rapid rationalization of experimental data and understanding of the optical signals from these novel ‘metamaterials’, optimised for light reflection or harvesting. PMID:27652788
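
    The four-layer-stack picture can be related to standard thin-film optics. The sketch below is an ordinary normal-incidence transfer-matrix reflectance calculation for a four-layer stack with placeholder indices and thicknesses, not the authors' quasi-static effective-medium model; the nanoparticle monolayer is crudely represented by an assumed complex effective index.

      import numpy as np

      def reflectance(n_list, d_list, wavelength):
          """n_list: indices incl. semi-infinite end media; d_list: inner thicknesses."""
          k0 = 2 * np.pi / wavelength
          M = np.eye(2, dtype=complex)
          for n, d in zip(n_list[1:-1], d_list):
              delta = k0 * n * d
              layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                                [1j * n * np.sin(delta), np.cos(delta)]])
              M = M @ layer
          n0, ns = n_list[0], n_list[-1]
          num = n0 * (M[0, 0] + M[0, 1] * ns) - (M[1, 0] + M[1, 1] * ns)
          den = n0 * (M[0, 0] + M[0, 1] * ns) + (M[1, 0] + M[1, 1] * ns)
          return abs(num / den) ** 2

      # water / nanoparticle monolayer (assumed effective index) / thin film / glass
      n_layers = [1.33, 1.8 + 0.3j, 1.5, 1.52]
      print(f"R = {reflectance(n_layers, d_list=[20e-9, 50e-9], wavelength=550e-9):.3f}")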

  8. Searches for correlation between UHECR events and high-energy gamma-ray Fermi-LAT data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Álvarez, Ezequiel; Cuoco, Alessandro; Mirabal, Nestor

The astrophysical sources responsible for ultra high-energy cosmic rays (UHECRs) continue to be one of the most intriguing mysteries in astrophysics. We present a comprehensive search for correlations between high-energy (≳ 1 GeV) gamma-ray events from the Fermi Large Area Telescope (LAT) and UHECRs (≳ 60 EeV) detected by the Telescope Array and the Pierre Auger Observatory. We perform two separate searches. First, we conduct a standard cross-correlation analysis between the arrival directions of 148 UHECRs and 360 gamma-ray sources in the Second Catalog of Hard Fermi-LAT sources (2FHL). Second, we search for a possible correlation between UHECR directions and unresolved Fermi-LAT gamma-ray emission. For the latter, we use three different methods: a stacking technique with both a model-dependent and model-independent background estimate, and a cross-correlation function analysis. We also test for statistically significant excesses in gamma rays from signal regions centered on Cen A and the Telescope Array hotspot. No significant correlation is found in any of the analyses performed, except a weak (≲ 2σ) hint of signal with the correlation function method on scales ~ 1°. Upper limits on the flux of possible power-law gamma-ray sources of UHECRs are derived.

  9. Searches for correlation between UHECR events and high-energy gamma-ray Fermi-LAT data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Álvarez, Ezequiel; Cuoco, Alessandro; Mirabal, Nestor

The astrophysical sources responsible for ultra high-energy cosmic rays (UHECRs) continue to be one of the most intriguing mysteries in astrophysics. Here, we present a comprehensive search for correlations between high-energy (≳ 1 GeV) gamma-ray events from the Fermi Large Area Telescope (LAT) and UHECRs (≳ 60 EeV) detected by the Telescope Array and the Pierre Auger Observatory. We perform two separate searches. First, we conduct a standard cross-correlation analysis between the arrival directions of 148 UHECRs and 360 gamma-ray sources in the Second Catalog of Hard Fermi-LAT sources (2FHL). Second, we search for a possible correlation between UHECR directions and unresolved Fermi-LAT gamma-ray emission. For the latter, we use three different methods: a stacking technique with both a model-dependent and model-independent background estimate, and a cross-correlation function analysis. We also test for statistically significant excesses in gamma rays from signal regions centered on Cen A and the Telescope Array hotspot. No significant correlation was found in any of the analyses performed, except a weak (≲ 2σ) hint of signal with the correlation function method on scales ~ 1°. Upper limits on the flux of possible power-law gamma-ray sources of UHECRs are derived.

  10. Searches for correlation between UHECR events and high-energy gamma-ray Fermi-LAT data

    DOE PAGES

    Álvarez, Ezequiel; Cuoco, Alessandro; Mirabal, Nestor; ...

    2016-12-13

The astrophysical sources responsible for ultra high-energy cosmic rays (UHECRs) continue to be one of the most intriguing mysteries in astrophysics. Here, we present a comprehensive search for correlations between high-energy (≳ 1 GeV) gamma-ray events from the Fermi Large Area Telescope (LAT) and UHECRs (≳ 60 EeV) detected by the Telescope Array and the Pierre Auger Observatory. We perform two separate searches. First, we conduct a standard cross-correlation analysis between the arrival directions of 148 UHECRs and 360 gamma-ray sources in the Second Catalog of Hard Fermi-LAT sources (2FHL). Second, we search for a possible correlation between UHECR directions and unresolved Fermi-LAT gamma-ray emission. For the latter, we use three different methods: a stacking technique with both a model-dependent and model-independent background estimate, and a cross-correlation function analysis. We also test for statistically significant excesses in gamma rays from signal regions centered on Cen A and the Telescope Array hotspot. No significant correlation was found in any of the analyses performed, except a weak (≲ 2σ) hint of signal with the correlation function method on scales ~ 1°. Upper limits on the flux of possible power-law gamma-ray sources of UHECRs are derived.
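
    A schematic version of the angular cross-correlation analysis described above: count UHECR/source pairs within a given separation and compare with the isotropic expectation. The catalogs here are random stand-ins (only the catalog sizes, 148 and 360, are taken from the abstract), so the sketch demonstrates the statistic rather than the result.

      import numpy as np

      def ang_sep(ra1, dec1, ra2, dec2):
          """Angular separation (radians) between points given in radians."""
          return np.arccos(np.clip(
              np.sin(dec1) * np.sin(dec2)
              + np.cos(dec1) * np.cos(dec2) * np.cos(ra1 - ra2), -1.0, 1.0))

      def pairs_within(ra_a, dec_a, ra_b, dec_b, theta):
          sep = ang_sep(ra_a[:, None], dec_a[:, None], ra_b[None, :], dec_b[None, :])
          return int((sep <= theta).sum())

      rng = np.random.default_rng(5)
      n_cr, n_src = 148, 360
      ra_cr, ra_src = rng.uniform(0, 2 * np.pi, n_cr), rng.uniform(0, 2 * np.pi, n_src)
      dec_cr = np.arcsin(rng.uniform(-1, 1, n_cr))    # uniform on the sphere
      dec_src = np.arcsin(rng.uniform(-1, 1, n_src))

      theta = np.radians(1.0)
      observed = pairs_within(ra_cr, dec_cr, ra_src, dec_src, theta)
      expected = n_cr * n_src * (1 - np.cos(theta)) / 2   # isotropic expectation
      print(f"pairs within 1 deg: observed={observed}, expected={expected:.1f}")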

  11. DETECTION OF FAST RADIO TRANSIENTS WITH MULTIPLE STATIONS: A CASE STUDY USING THE VERY LONG BASELINE ARRAY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, David R.; Wagstaff, Kiri L.; Majid, Walid A.

    2011-07-10

Recent investigations reveal an important new class of transient radio phenomena that occur on submillisecond timescales. Often, transient surveys' data volumes are too large to archive exhaustively. Instead, an online automatic system must excise impulsive interference and detect candidate events in real time. This work presents a case study using data from multiple geographically distributed stations to perform simultaneous interference excision and transient detection. We present several algorithms that incorporate dedispersed data from multiple sites, and report experiments with a commensal real-time transient detection system on the Very Long Baseline Array. We test the system using observations of pulsar B0329+54. The multiple-station algorithms enhanced sensitivity for detection of individual pulses. These strategies could improve detection performance for a future generation of geographically distributed arrays such as the Australian Square Kilometre Array Pathfinder and the Square Kilometre Array.
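
    One hedged sketch of the multiple-station idea above: local interference rarely repeats across geographically separated antennas, so requiring a candidate pulse at two or more stations within a small time window suppresses false alarms. The window and event times below are invented, and real systems would also compensate for geometric delays.

      def coincident_events(station_events, window=0.005, min_stations=2):
          """station_events: list of per-station candidate times (seconds)."""
          tagged = sorted(
              (t, s) for s, times in enumerate(station_events) for t in times)
          confirmed = []
          for k, (t, s) in enumerate(tagged):
              stations = {s}
              for t2, s2 in tagged[k + 1:]:
                  if t2 - t > window:
                      break
                  stations.add(s2)
              if len(stations) >= min_stations:
                  confirmed.append(t)
          return confirmed

      # Station 0 sees local RFI at 3.2 s; a real pulse appears at all stations ~7 s
      events = [[3.2, 7.001], [7.002], [6.999, 12.5]]
      print(coincident_events(events))   # only times near 7 s survive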

  12. Reliability of high-power QCW arrays

    NASA Astrophysics Data System (ADS)

    Feeler, Ryan; Junghans, Jeremy; Remley, Jennifer; Schnurbusch, Don; Stephens, Ed

    2010-02-01

Northrop Grumman Cutting Edge Optronics has developed a family of arrays for high-power QCW operation. These arrays are built using CTE-matched heat sinks and hard solder in order to maximize the reliability of the devices. A summary of a recent life test is presented in order to quantify the reliability of QCW arrays and associated laser gain modules. A statistical analysis of the raw lifetime data is presented in order to quantify the data in such a way that is useful for laser system designers. The life tests demonstrate the high level of reliability of these arrays in a number of operating regimes. For single-bar arrays, an MTTF of 19.8 billion shots is predicted. For four-bar samples, an MTTF of 14.6 billion shots is predicted. In addition, data representing a large pump source are analyzed and shown to have an expected lifetime of 13.5 billion shots. This corresponds to an expected operational lifetime of greater than ten thousand hours at repetition rates less than 370 Hz.
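
    The quoted operational lifetime follows directly from the shot count and repetition rate; a one-line check:

      mttf_shots = 13.5e9
      rep_rate_hz = 370.0
      hours = mttf_shots / rep_rate_hz / 3600.0
      print(f"{hours:,.0f} hours")   # ~10,100 hours, i.e. >10,000 h as stated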

  13. Implications from Meteoric and Volcanic Infrasound Measured in the Netherlands

    NASA Astrophysics Data System (ADS)

    Evers, L.

    2003-12-01

Infrasound observations started in the Netherlands in 1986. Since then, several array configurations and instruments have been developed, tested and made operational. Currently, three infrasound arrays continuously measure infrasound with in-house developed microbarometers. The array apertures vary from 30 to 1500 meters and the number of instruments from 6 to 16 microbarometers. The inter-array distance ranges from 50 up to 150 km. This dense network of infrasound arrays is used to distinguish between earthquakes and sources in the atmosphere. Sonic booms, for example, can be experienced in the same manner as small (gas induced) earthquakes. Furthermore, Comprehensive Nuclear-Test-Ban Treaty (CTBT) related research is done. Meteors are one of the few natural impulsive sources generating energy in the kT TNT equivalent range. Therefore, the study of meteors is essential to the CTBT, where infrasound is applied as a monitoring technique. Studies of meteors in the Netherlands have shown the capability of infrasound to trace a meteor through the stratosphere. The propagation of infrasound is to first order dependent on the wind and temperature structure of the atmosphere. The meteor's path could be reconstructed by using ECMWF atmospheric models for wind and temperature. The results were compared to visual observations, confirming the location, direction and reported origin time. The accuracy of the localization mainly depends on the applied atmospheric model and array resolution. Successfully applying infrasound depends on an array configuration based on the (frequency-dependent) spatial coherence of the signals of interest. The array aperture and inter-element distance play a decisive role in detecting low signal-to-noise ratios. This is shown by results from studies on volcanic infrasound from Mt. Etna (Italy) detected in the Netherlands. Sub-array processing on the 16-element array revealed an increased detectability of infrasound for small-aperture (800 m) arrays compared to large-aperture (1500 m) arrays.

  14. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-01-01

    The results of a comprehensive investigation of interconnect fatigue that has led to the definition of useful reliability-design and life-prediction algorithms are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To remedy this shortcoming the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.
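    For reference, the classical strain-cycle curve invoked here is commonly written in Coffin-Manson-Basquin form. The symbols below are the textbook ones, not necessarily the paper's parameterization:

```latex
% \Delta\varepsilon: strain range; N_f: cycles to failure; E: elastic
% modulus; \sigma_f', b, \varepsilon_f', c: fitted material constants.
\[
  \frac{\Delta\varepsilon}{2}
    = \frac{\sigma_f'}{E}\,(2N_f)^{b} + \varepsilon_f'\,(2N_f)^{c}
\]
```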

  15. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-03-01

    The results of a comprehensive investigation of interconnect fatigue that has led to the definition of useful reliability-design and life-prediction algorithms are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To remedy this shortcoming the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.

  16. Antennas for the array-based Deep Space Network: current status and future designs

    NASA Technical Reports Server (NTRS)

    Imbriale, William A.; Gama, Eric

    2005-01-01

    Development of very large arrays of small antennas has been proposed as a way to increase the downlink capability of the NASA Deep Space Network (DSN) by two or three orders of magnitude, thereby enabling greatly increased science data from currently configured missions or enabling new mission concepts. The current concept is for an array of 400 x 12-m antennas at each of three longitudes. The DSN array will utilize radio astronomy sources for phase calibration and will have wide bandwidth correlation processing for this purpose. NASA has undertaken a technology program to prove the performance and cost of a very large DSN array. Central to that program is a 3-element interferometer to be completed in 2005. This paper describes the current status of the low-cost 6-meter breadboard antenna to be used as part of the interferometer and the RF design of the 12-meter antenna.

  17. Factors affecting the performance of large-aperture microphone arrays.

    PubMed

    Silverman, Harvey F; Patterson, William R; Sachar, Joshua

    2002-05-01

    Large arrays of microphones have been proposed and studied as a possible means of acquiring data in offices, conference rooms, and auditoria without requiring close-talking microphones. When such an array essentially surrounds all possible sources, it is said to have a large aperture. Large-aperture arrays have attractive properties of spatial resolution and signal-to-noise enhancement. This paper presents a careful comparison of theoretical and measured performance for an array of 256 microphones using simple delay-and-sum beamforming. This is the largest currently functional, all digital-signal-processing array that we know of. The array is wall-mounted in the moderately adverse environment of a general-purpose laboratory (8 m x 8 m x 3 m). The room has a T60 reverberation time of 550 ms. Reverberation effects in this room severely impact the array's performance. However, the width of the main lobe remains comparable to that of a simplified prediction. Broadband spatial resolution shows a single central peak with 10 dB gain about 0.4 m in diameter at the -3 dB level. Away from that peak, the response is approximately flat over most of the room. Optimal weighting for signal-to-noise enhancement degrades the spatial resolution minimally. Experimentally, we verify that signal-to-noise gain is less than proportional to the square root of the number of microphones, probably due to the partial correlation of the noise between channels, to variation of signal intensity with polar angle about the source, and to imperfect correlation of the signal over the array caused by reverberations. We show measurements of the relative importance of each effect in our environment.
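    The square-root scaling referred to above is the ideal uncorrelated-noise bound for delay-and-sum beamforming, illustrated by this synthetic sketch (the microphone count matches the paper; the data do not):

```python
# Delay-and-sum over N channels: an aligned signal adds coherently while
# independent noise averages down, so output noise power falls as 1/N and
# the amplitude SNR gain is at best sqrt(N).
import numpy as np

rng = np.random.default_rng(0)
n_mics, n_samp = 256, 4096
signal = np.sin(2 * np.pi * 0.01 * np.arange(n_samp))
channels = signal + rng.standard_normal((n_mics, n_samp))  # unit noise
beam = channels.mean(axis=0)            # zero delays: source at broadside
residual = beam - signal                # noise left after beamforming
gain_db = 10 * np.log10(1.0 / residual.var())
print(f"noise-power gain ~ {gain_db:.1f} dB")  # ~10*log10(256) ~ 24 dB
```

    Reverberation correlates the noise between channels and decorrelates the signal across the array, which is why the measured gain falls short of this bound.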

  18. Factors affecting the performance of large-aperture microphone arrays

    NASA Astrophysics Data System (ADS)

    Silverman, Harvey F.; Patterson, William R.; Sachar, Joshua

    2002-05-01

    Large arrays of microphones have been proposed and studied as a possible means of acquiring data in offices, conference rooms, and auditoria without requiring close-talking microphones. When such an array essentially surrounds all possible sources, it is said to have a large aperture. Large-aperture arrays have attractive properties of spatial resolution and signal-to-noise enhancement. This paper presents a careful comparison of theoretical and measured performance for an array of 256 microphones using simple delay-and-sum beamforming. This is the largest currently functional, all digital-signal-processing array that we know of. The array is wall-mounted in the moderately adverse environment of a general-purpose laboratory (8 m×8 m×3 m). The room has a T60 reverberation time of 550 ms. Reverberation effects in this room severely impact the array's performance. However, the width of the main lobe remains comparable to that of a simplified prediction. Broadband spatial resolution shows a single central peak with 10 dB gain about 0.4 m in diameter at the -3 dB level. Away from that peak, the response is approximately flat over most of the room. Optimal weighting for signal-to-noise enhancement degrades the spatial resolution minimally. Experimentally, we verify that signal-to-noise gain is less than proportional to the square root of the number of microphones, probably due to the partial correlation of the noise between channels, to variation of signal intensity with polar angle about the source, and to imperfect correlation of the signal over the array caused by reverberations. We show measurements of the relative importance of each effect in our environment.

  19. Multi-terabyte EIDE disk arrays running Linux RAID5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanders, D.A.; Cremaldi, L.M.; Eschenburg, V.

    2004-11-01

    High-energy physics experiments are currently recording large amounts of data and in a few years will be recording prodigious quantities of data. New methods must be developed to handle this data and make analysis at universities possible. Grid Computing is one method; however, the data must be cached at the various Grid nodes. We examine some storage techniques that exploit recent developments in commodity hardware. Disk arrays using RAID level 5 (RAID-5) include both parity and striping. The striping improves access speed. The parity protects data in the event of a single disk failure, but not in the case of multiple disk failures. We report on tests of dual-processor Linux Software RAID-5 arrays and Hardware RAID-5 arrays using a 12-disk 3ware controller, in conjunction with 250 and 300 GB disks, for use in offline high-energy physics data analysis. The price of IDE disks is now less than $1/GB. These RAID-5 disk arrays can be scaled to sizes affordable to small institutions and used when fast random access at low cost is important.
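    The parity scheme described above reduces to a block-wise XOR; a minimal sketch (not the Linux md or 3ware implementation):

```python
# RAID-5 idea in miniature: parity is the XOR of the data blocks, so any
# single lost block can be rebuilt by XOR-ing the survivors with parity.
def xor_blocks(blocks):
    out = bytearray(len(blocks[0]))
    for blk in blocks:
        for i, b in enumerate(blk):
            out[i] ^= b
    return bytes(out)

data = [b"AAAA", b"BBBB", b"CCCC"]   # blocks on three data disks
parity = xor_blocks(data)            # stored on a fourth disk
rebuilt = xor_blocks([data[0], data[2], parity])  # disk 1 has failed
assert rebuilt == data[1]            # single-failure recovery works
```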

  20. Large plasmids of Escherichia coli and Salmonella encode highly diverse arrays of accessory genes on common replicon families.

    PubMed

    Williams, Laura E; Wireman, Joy; Hilliard, Valda C; Summers, Anne O

    2013-01-01

    Plasmids are important in evolution and adaptation of host bacteria, yet we lack a comprehensive picture of their own natural variation. We used replicon typing and RFLP analysis to assess diversity and distribution of plasmids in the ECOR, SARA, SARB and SARC reference collections of Escherichia coli and Salmonella. Plasmids, especially large (≥30 kb) plasmids, are abundant in these collections. Host species and genotype clearly impact plasmid prevalence; plasmids are more abundant in ECOR than SAR, but, within ECOR, subgroup B2 strains have the fewest large plasmids. The majority of large plasmids have unique RFLP patterns, suggesting high variation, even within dominant replicon families IncF and IncI1. We found only four conserved plasmid types within ECOR, none of which are widely distributed. Within SAR, conserved plasmid types are primarily serovar-specific, including a pSLT-like plasmid in 13 Typhimurium strains. Conservation of pSLT contrasts with variability of other plasmids, suggesting evolution of serovar-specific virulence plasmids is distinct from that of most enterobacterial plasmids. We sequenced a conserved serovar Heidelberg plasmid but did not detect virulence or antibiotic resistance genes. Our data illustrate the high degree of natural variation in large plasmids of E. coli and Salmonella, even among plasmids sharing backbone genes. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. ImNet: a fiber optic network with multistar topology for high-speed data transmission

    NASA Astrophysics Data System (ADS)

    Vossebuerger, F.; Keizers, Andreas; Soederman, N.; Meyer-Ebrecht, Dietrich

    1993-10-01

    ImNet is a fiber-optic local area network, which has been developed for high-speed image communication in Picture Archiving and Communication Systems (PACS). A comprehensive analysis of image communication requirements in hospitals led to the conclusion that there is a need for networks which are optimized for the transmission of large data files. ImNet is optimized for this application, in contrast to current LANs. ImNet consists of two elements: a link module and a switch module. The point-to-point link module can span up to 4 km using fiber-optic cable. For short distances up to 100 m a cheaper module using shielded twisted-pair cable is available. The link module works bi-directionally and handles all protocols up to OSI level 3. The data rate per link is up to 140 Mbit/s (clock rate 175 MHz). The switch module consists of the control unit and the cross-point switch array. The array has up to fourteen interfaces for link modules. Up to fourteen data transfers, each with a maximal transfer rate of 400 Mbit/s, can be handled at the same time. Thereby the maximal throughput of a switch module is 5.6 Gbit/s. Out of these modules a multi-star network can be built, i.e., an arbitrary tree structure of stars. This topology allows multiple transmissions at the same time as long as they do not require identical links. Therefore the overall throughput of ImNet can be a multiple of the data rate per link.
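    The switch-module throughput quoted above is just the product of concurrent transfers and per-transfer rate, using the abstract's own figures:

```python
# Aggregate throughput of one switch module: 14 concurrent transfers at
# 400 Mbit/s each (figures from the abstract).
links, rate_mbit_s = 14, 400
print(links * rate_mbit_s / 1000, "Gbit/s")  # -> 5.6 Gbit/s
```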

  2. Effective data compaction algorithm for vector scan EB writing system

    NASA Astrophysics Data System (ADS)

    Ueki, Shinichi; Ashida, Isao; Kawahira, Hiroichi

    2001-01-01

    We have developed a new mask data compaction algorithm dedicated to vector scan electron beam (EB) writing systems for the 0.13 μm device generation. Large mask data size has become a significant problem in mask data processing, for which data compaction is an important technique. In our new mask data compaction, 'array' representation and 'cell' representation are used. The mask data format for the EB writing system with vector scan supports these representations. The array representation has a pitch and a number of repetitions in both the X and Y directions. The cell representation has a definition of a figure group and its reference. The new data compaction method has the following three steps: (1) search for arrays of figures by selecting array pitches so that a large number of figures is included; (2) find identical arrays that have the same repetition pitch and number of figures; (3) search for cells of figures, where the figures in each cell share an identical positional relationship. With this new method, the mask data of a 4M-DRAM block gate layer with peripheral circuits, 202 Mbytes without compaction, was compacted to 6.7 Mbytes in 20 minutes on a 500 MHz PC.
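    A toy version of step (1) conveys the idea; the real algorithm searches pitches in both X and Y, while this hedged sketch handles one axis with a fixed pitch:

```python
# Collapse figures placed at a constant pitch into array records of the
# form (origin, pitch, count); six placements become two records here.
def find_arrays(xs, pitch):
    xs = sorted(xs)
    runs, start, count = [], xs[0], 1
    for prev, cur in zip(xs, xs[1:]):
        if cur - prev == pitch:
            count += 1
        else:
            runs.append((start, pitch, count))
            start, count = cur, 1
    runs.append((start, pitch, count))
    return runs

print(find_arrays([0, 10, 20, 30, 55, 65], 10))
# -> [(0, 10, 4), (55, 10, 2)]
```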

  3. A 34K SNP genotyping array for Populus trichocarpa: design, application to the study of natural populations and transferability to other Populus species

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geraldes, Armando; Hannemann, Jan; Grassa, Chris

    2013-01-01

    Genetic mapping of quantitative traits requires genotypic data for large numbers of markers in many individuals. Despite the declining costs of genotyping by sequencing, for most studies, the use of large SNP genotyping arrays still offers the most cost-effective solution for large-scale targeted genotyping. Here we report on the design and performance of a SNP genotyping array for Populus trichocarpa (black cottonwood). This genotyping array was designed with SNPs pre-ascertained in 34 wild accessions covering most of the species range. Due to the rapid decay of linkage disequilibrium in P. trichocarpa we adopted a candidate gene approach to the array design that resulted in the selection of 34,131 SNPs, the majority of which are located in, or within 2 kb of, 3,543 candidate genes. A subset of the SNPs (539) was selected based on patterns of variation among the SNP discovery accessions. We show that more than 95% of the loci produce high quality genotypes and that the genotyping error rate for these is likely below 2%, indicating that high-quality data are generated with this array. We demonstrate that even among small numbers of samples (n=10) from local populations over 84% of loci are polymorphic. We also tested the applicability of the array to other species in the genus and found that due to ascertainment bias the number of polymorphic loci decreases rapidly with genetic distance, with the largest numbers detected in other species in section Tacamahaca (P. balsamifera and P. angustifolia). Finally, we provide evidence for the utility of the array for intraspecific studies of genetic differentiation and for species assignment and the detection of natural hybrids.

  4. Oligonucleotide arrays vs. metaphase-comparative genomic hybridisation and BAC arrays for single-cell analysis: first applications to preimplantation genetic diagnosis for Robertsonian translocation carriers.

    PubMed

    Ramos, Laia; del Rey, Javier; Daina, Gemma; García-Aragonés, Manel; Armengol, Lluís; Fernandez-Encinas, Alba; Parriego, Mònica; Boada, Montserrat; Martinez-Passarell, Olga; Martorell, Maria Rosa; Casagran, Oriol; Benet, Jordi; Navarro, Joaquima

    2014-01-01

    Comprehensive chromosome analysis techniques such as metaphase-Comparative Genomic Hybridisation (CGH) and array-CGH are available for single-cell analysis. However, while metaphase-CGH and BAC array-CGH have been widely used for Preimplantation Genetic Diagnosis, oligonucleotide array-CGH has not been used in an extensive way. A comparison between oligonucleotide array-CGH and metaphase-CGH has been performed analysing 15 single fibroblasts from aneuploid cell-lines and 18 single blastomeres from human cleavage-stage embryos. Afterwards, oligonucleotide array-CGH and BAC array-CGH were also compared analysing 16 single blastomeres from human cleavage-stage embryos. All three comprehensive analysis techniques provided broadly similar cytogenetic profiles; however, non-identical profiles appeared when extensive aneuploidies were present in a cell. Both array techniques provided an optimised analysis procedure and a higher resolution than metaphase-CGH. Moreover, oligonucleotide array-CGH was able to define extra segmental imbalances in 14.7% of the blastomeres and it better determined the specific unbalanced chromosome regions due to a higher resolution of the technique (≈ 20 kb). Applicability of oligonucleotide array-CGH for Preimplantation Genetic Diagnosis has been demonstrated in two cases of Robertsonian translocation carriers 45,XY,der(13;14)(q10;q10). Transfer of euploid embryos was performed in both cases and pregnancy was achieved by one of the couples. This is the first time that an oligonucleotide array-CGH approach has been successfully applied to Preimplantation Genetic Diagnosis for balanced chromosome rearrangement carriers.

  5. Oligonucleotide Arrays vs. Metaphase-Comparative Genomic Hybridisation and BAC Arrays for Single-Cell Analysis: First Applications to Preimplantation Genetic Diagnosis for Robertsonian Translocation Carriers

    PubMed Central

    Ramos, Laia; del Rey, Javier; Daina, Gemma; García-Aragonés, Manel; Armengol, Lluís; Fernandez-Encinas, Alba; Parriego, Mònica; Boada, Montserrat; Martinez-Passarell, Olga; Martorell, Maria Rosa; Casagran, Oriol; Benet, Jordi; Navarro, Joaquima

    2014-01-01

    Comprehensive chromosome analysis techniques such as metaphase-Comparative Genomic Hybridisation (CGH) and array-CGH are available for single-cell analysis. However, while metaphase-CGH and BAC array-CGH have been widely used for Preimplantation Genetic Diagnosis, oligonucleotide array-CGH has not been used in an extensive way. A comparison between oligonucleotide array-CGH and metaphase-CGH has been performed analysing 15 single fibroblasts from aneuploid cell-lines and 18 single blastomeres from human cleavage-stage embryos. Afterwards, oligonucleotide array-CGH and BAC array-CGH were also compared analysing 16 single blastomeres from human cleavage-stage embryos. All three comprehensive analysis techniques provided broadly similar cytogenetic profiles; however, non-identical profiles appeared when extensive aneuploidies were present in a cell. Both array techniques provided an optimised analysis procedure and a higher resolution than metaphase-CGH. Moreover, oligonucleotide array-CGH was able to define extra segmental imbalances in 14.7% of the blastomeres and it better determined the specific unbalanced chromosome regions due to a higher resolution of the technique (≈20 kb). Applicability of oligonucleotide array-CGH for Preimplantation Genetic Diagnosis has been demonstrated in two cases of Robertsonian translocation carriers 45,XY,der(13;14)(q10;q10). Transfer of euploid embryos was performed in both cases and pregnancy was achieved by one of the couples. This is the first time that an oligonucleotide array-CGH approach has been successfully applied to Preimplantation Genetic Diagnosis for balanced chromosome rearrangement carriers. PMID:25415307

  6. A micro-machined source transducer for a parametric array in air.

    PubMed

    Lee, Haksue; Kang, Daesil; Moon, Wonkyu

    2009-04-01

    Parametric array applications in air, such as highly directional parametric loudspeaker systems, usually rely on large radiators to generate the high-intensity primary beams required for nonlinear interactions. However, a conventional transducer, as a primary wave projector, requires a great deal of electrical power because its electroacoustic efficiency is very low due to the large characteristic mechanical impedance in air. The feasibility of a micro-machined ultrasonic transducer as an efficient finite-amplitude wave projector was studied. A piezoelectric micro-machined ultrasonic transducer array consisting of lead zirconate titanate uni-morph elements was designed and fabricated for this purpose. Theoretical and experimental evaluations showed that a micro-machined ultrasonic transducer array can be used as an efficient source transducer for a parametric array in air. The beam patterns and propagation curves of the difference frequency wave and the primary wave generated by the micro-machined ultrasonic transducer array were measured. Although the theoretical results were based on ideal parametric array models, the theoretical data explained the experimental results reasonably well. These experiments demonstrated the potential of a micro-machined primary wave projector.

  7. Hydrostar Thermal and Structural Deformation Analyses of Antenna Array Concept

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; Hope, Drew J.

    1998-01-01

    The proposed Hydrostar mission used a large orbiting antenna array to demonstrate synthetic aperture technology in space while obtaining global soil moisture data. In order to produce accurate data, the array was required to remain as close as possible to its perfectly aligned placement while undergoing the mechanical and thermal stresses induced by orbital changes. Thermal and structural analyses for a design concept of this antenna array were performed. The thermal analysis included orbital radiation calculations, as well as parametric studies of orbit altitude, material properties and coating types. The thermal results included predicted thermal distributions over the array for several cases. The structural analysis provided thermally-driven deflections based on these cases, as well as based on a 1-g inertial load. In order to minimize the deflections of the array in orbit, the use of XN70, a carbon-reinforced polycyanate composite, was recommended.

  8. Highly scalable parallel processing of extracellular recordings of Multielectrode Arrays.

    PubMed

    Gehring, Tiago V; Vasilaki, Eleni; Giugliano, Michele

    2015-01-01

    Technological advances in Multielectrode Arrays (MEAs), used for multisite, parallel electrophysiological recordings, lead to an ever-increasing amount of raw data being generated. Arrays with hundreds up to a few thousand electrodes are slowly seeing widespread use, and the expectation is that more sophisticated arrays will become available in the near future. In order to process the large data volumes resulting from MEA recordings there is a pressing need for new software tools able to process many data channels in parallel. Here we present a new tool for processing MEA data recordings that makes use of new programming paradigms and recent technology developments to unleash the power of modern highly parallel hardware, such as multi-core CPUs with vector instruction sets or GPGPUs. Our tool builds on and complements existing MEA data analysis packages. It shows high scalability and can be used to speed up some performance-critical pre-processing steps such as data filtering and spike detection, helping to make the analysis of larger data sets tractable.
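    Spike detection of the kind being accelerated is, at its simplest, a per-channel threshold crossing, which vectorizes naturally across channels. A hedged sketch on synthetic data (the tool's actual API is not shown):

```python
# Per-channel negative-threshold spike detection, vectorised over channels.
# The MAD-based threshold (median|x| / 0.6745) is a common convention.
import numpy as np

rng = np.random.default_rng(1)
data = rng.standard_normal((256, 100_000))           # channels x samples
thresh = 5 * np.median(np.abs(data), axis=1, keepdims=True) / 0.6745
onsets = (data[:, 1:] < -thresh) & (data[:, :-1] >= -thresh)
chan_idx, sample_idx = np.nonzero(onsets)            # spike coordinates
```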

  9. Manycore Performance-Portability: Kokkos Multidimensional Array Library

    DOE PAGES

    Edwards, H. Carter; Sunderland, Daniel; Porter, Vicki; ...

    2012-01-01

    Large, complex scientific and engineering application codes have a significant investment in computational kernels to implement their mathematical models. Porting these computational kernels to the collection of modern manycore accelerator devices is a major challenge in that these devices have diverse programming models, application programming interfaces (APIs), and performance requirements. The Kokkos Array programming model provides a library-based approach to implement computational kernels that are performance-portable to CPU-multicore and GPGPU accelerator devices. This programming model is based upon three fundamental concepts: (1) manycore compute devices each with its own memory space, (2) data parallel kernels and (3) multidimensional arrays. Kernel execution performance is, especially for NVIDIA® devices, extremely dependent on data access patterns. The optimal data access pattern can be different for different manycore devices, potentially leading to different implementations of computational kernels specialized for different devices. The Kokkos Array programming model supports performance-portable kernels by (1) separating data access patterns from computational kernels through a multidimensional array API and (2) introducing device-specific data access mappings when a kernel is compiled. An implementation of Kokkos Array is available through Trilinos [Trilinos website, http://trilinos.sandia.gov/, August 2011].
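    The layout dependence that Kokkos abstracts away can be felt even outside C++; as a loose analogy (not Kokkos itself), reducing along the contiguous axis of a NumPy array is typically faster than striding across memory for the same mathematical operation:

```python
# Same reduction, two memory layouts: row-major (C) vs column-major (F).
# Timings depend on NumPy's reduction strategy, but the layout usually
# shows up clearly in the measurement.
import numpy as np
import timeit

a_c = np.zeros((4096, 4096), order="C")   # row-major layout
a_f = np.asfortranarray(a_c)              # column-major copy
t_c = timeit.timeit(lambda: a_c.sum(axis=1), number=10)  # contiguous rows
t_f = timeit.timeit(lambda: a_f.sum(axis=1), number=10)  # strided rows
print(f"C order: {t_c:.3f}s  F order: {t_f:.3f}s")
```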

  10. ArraySolver: an algorithm for colour-coded graphical display and Wilcoxon signed-rank statistics for comparing microarray gene expression data.

    PubMed

    Khan, Haseeb Ahmad

    2004-01-01

    The massive surge in the production of microarray data poses a great challenge for proper analysis and interpretation. In recent years numerous computational tools have been developed to extract meaningful interpretation of microarray gene expression data. However, a convenient tool for two-group comparison of microarray data is still lacking and users have to rely on commercial statistical packages that might be costly and require special skills, in addition to extra time and effort for transferring data from one platform to another. Various statistical methods, including the t-test, analysis of variance, Pearson test and Mann-Whitney U test, have been reported for comparing microarray data, whereas the utilization of the Wilcoxon signed-rank test, which is an appropriate test for two-group comparison of gene expression data, has largely been neglected in microarray studies. The aim of this investigation was to build an integrated tool, ArraySolver, for colour-coded graphical display and comparison of gene expression data using the Wilcoxon signed-rank test. The results of software validation showed similar outputs with ArraySolver and SPSS for large datasets, whereas the former program appeared to be more accurate for 25 or fewer pairs (n ≤ 25), suggesting its potential application in analysing molecular signatures that usually contain small numbers of genes. The main advantages of ArraySolver are easy data selection, convenient report format, accurate statistics and the familiar Excel platform.
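    The underlying test is a paired signed-rank comparison; a minimal sketch using SciPy rather than ArraySolver's Excel implementation (the expression values are invented):

```python
# Paired Wilcoxon signed-rank test on per-gene expression values measured
# under two conditions; a small p-value indicates a systematic shift.
from scipy.stats import wilcoxon

control   = [2.1, 3.4, 1.8, 4.0, 2.9, 3.3]
treatment = [2.8, 3.9, 2.5, 4.6, 3.5, 4.1]
stat, p = wilcoxon(control, treatment)
print(stat, p)
```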

  11. ArraySolver: An Algorithm for Colour-Coded Graphical Display and Wilcoxon Signed-Rank Statistics for Comparing Microarray Gene Expression Data

    PubMed Central

    2004-01-01

    The massive surge in the production of microarray data poses a great challenge for proper analysis and interpretation. In recent years numerous computational tools have been developed to extract meaningful interpretation of microarray gene expression data. However, a convenient tool for two-group comparison of microarray data is still lacking and users have to rely on commercial statistical packages that might be costly and require special skills, in addition to extra time and effort for transferring data from one platform to another. Various statistical methods, including the t-test, analysis of variance, Pearson test and Mann–Whitney U test, have been reported for comparing microarray data, whereas the utilization of the Wilcoxon signed-rank test, which is an appropriate test for two-group comparison of gene expression data, has largely been neglected in microarray studies. The aim of this investigation was to build an integrated tool, ArraySolver, for colour-coded graphical display and comparison of gene expression data using the Wilcoxon signed-rank test. The results of software validation showed similar outputs with ArraySolver and SPSS for large datasets, whereas the former program appeared to be more accurate for 25 or fewer pairs (n ≤ 25), suggesting its potential application in analysing molecular signatures that usually contain small numbers of genes. The main advantages of ArraySolver are easy data selection, convenient report format, accurate statistics and the familiar Excel platform. PMID:18629036

  12. Experimental Study of Arcing on High-voltage Solar Arrays

    NASA Technical Reports Server (NTRS)

    Vayner, Boris; Galofaro, Joel; Ferguson, Dale

    2005-01-01

    The main obstacle to the implementation of a high-voltage solar array in space is arcing on the conductor-dielectric junctions exposed to the surrounding plasma. One obvious solution to this problem would be the installation of fully encapsulated solar arrays with no exposed conductors at all. However, there are many technological difficulties that must be overcome before the employment of fully encapsulated arrays turns into reality. An alternative solution, raising the arc threshold by modifications of conventionally designed solar arrays, looks more appealing, at least in the near future. A comprehensive study of the arc inception mechanism [1-4] suggests that such modifications can be made in the following directions: i) insulating the conductor-dielectric junction from the plasma environment (wrapthrough interconnects); ii) changing the coverglass geometry (overhang); iii) increasing the coverglass thickness; iv) outgassing areas of conductor-dielectric junctions. The operation of a high-voltage array in LEO also produces a parasitic power drain on the electrical system. Moreover, the current collected from space plasma by solar arrays determines the spacecraft floating potential, which is very important for the design of the spacecraft and its scientific apparatus. In order to verify the validity of the suggested modifications and to measure current collection, five different solar array samples have been tested in a large vacuum chamber. Each sample (36 silicon-based cells) consists of three strings containing 12 cells connected in series. Thus, arc rate and current collection can be measured on every string independently, or on a whole sample when the strings are connected in parallel. The heater installed in the chamber provides the possibility to test samples at temperatures as high as 80 C, simulating the LEO operational temperature. The experimental setup is described below.

  13. Experimental Study of Arcing on High-Voltage Solar Arrays

    NASA Technical Reports Server (NTRS)

    Vayner, Boris; Galofaro, Joel; Ferguson, Dale

    2003-01-01

    The main obstacle to the implementation of a high-voltage solar array in space is arcing on the conductor-dielectric junctions exposed to the surrounding plasma. One obvious solution to this problem would be the installation of fully encapsulated solar arrays with no exposed conductors at all. However, there are many technological difficulties that must be overcome before the employment of fully encapsulated arrays turns into reality. An alternative solution, raising the arc threshold by modifications of conventionally designed solar arrays, looks more appealing, at least in the near future. A comprehensive study of the arc inception mechanism suggests that such modifications can be made in the following directions: 1) insulating the conductor-dielectric junction from the plasma environment (wrapthrough interconnects); 2) changing the coverglass geometry (overhang); 3) increasing the coverglass thickness; 4) outgassing areas of conductor-dielectric junctions. The operation of a high-voltage array in LEO also produces a parasitic power drain on the electrical system. Moreover, the current collected from space plasma by solar arrays determines the spacecraft floating potential, which is very important for the design of the spacecraft and its scientific apparatus. In order to verify the validity of the suggested modifications and to measure current collection, five different solar array samples have been tested in a large vacuum chamber. Each sample (36 silicon-based cells) consists of three strings containing 12 cells connected in series. Thus, arc rate and current collection can be measured on every string independently, or on a whole sample when the strings are connected in parallel. The heater installed in the chamber provides the possibility to test samples at temperatures as high as 80 C, simulating the LEO operational temperature. The experimental setup is described below.

  14. The ASTRI mini-array software system (MASS) implementation: a proposal for the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Tanci, Claudio; Tosti, Gino; Conforti, Vito; Schwarz, Joseph; Antolini, Elisa; Antonelli, L. A.; Bulgarelli, Andrea; Bigongiari, Ciro; Bruno, Pietro; Canestrari, Rodolfo; Capalbi, Milvia; Cascone, Enrico; Catalano, Osvaldo; Di Paola, Andrea; Di Pierro, Federico; Fioretti, Valentina; Gallozzi, Stefano; Gardiol, Daniele; Gianotti, Fulvio; Giro, Enrico; Grillo, Alessandro; La Palombara, Nicola; Leto, Giuseppe; Lombardi, Saverio; Maccarone, Maria C.; Pareschi, Giovanni; Russo, Federico; Sangiorgi, Pierluca; Scuderi, Salvo; Stringhetti, Luca; Testa, Vincenzo; Trifoglio, Massimo; Vercellone, Stefano; Zoli, Andrea

    2016-08-01

    The ASTRI mini-array, composed of nine small-size dual mirror (SST-2M) telescopes, has been proposed to be installed at the southern site of the Cherenkov Telescope Array (CTA), as a set of preproduction units of the CTA observatory. The ASTRI mini-array is a collaborative and international effort carried out by Italy, Brazil and South Africa and led by the Italian National Institute of Astrophysics, INAF. We present the main features of the current implementation of the Mini-Array Software System (MASS), now in use for the activities of the ASTRI SST-2M telescope prototype located at the INAF observing station on Mt. Etna, Italy, and the characteristics that make it a prototype for the CTA control software system. CTA Data Management (CTADATA) and CTA Array Control and Data Acquisition (CTA-ACTL) requirements and guidelines as well as the ASTRI use cases were considered in the MASS design; most of its features are derived from the Atacama Large Millimeter/sub-millimeter Array Control software. The MASS will provide a set of tools to manage all onsite operations of the ASTRI mini-array in order to perform the observations specified in the short-term schedule (including monitoring and controlling all the hardware components of each telescope and calibration device), to analyze the acquired data online and to store/retrieve all the data products to/from the onsite repository.

  15. The spatial coherence structure of infrasonic waves: analysis of data from International Monitoring System arrays

    NASA Astrophysics Data System (ADS)

    Green, David N.

    2015-04-01

    The spatial coherence structure of 30 infrasound array detections, with source-to-receiver ranges of 25-6500 km, has been measured within the 0.25-1 Hz passband. The data were recorded at International Monitoring System (IMS) microbarograph arrays with apertures of between 1 and 4 km. Such array detections are of interest for Comprehensive Nuclear-Test-Ban Treaty monitoring. The majority of array detections (e.g. 80 per cent of recordings in the third-octave passband centred on 0.63 Hz) exhibit spatial coherence loss anisotropy that is consistent with previous lower frequency atmospheric acoustic studies; coherence loss is more rapid perpendicular to the acoustic propagation direction than parallel to it. The thirty array detections display significant interdetection variation in the magnitude of spatial coherence loss. The measurements can be explained by the simultaneous arrival of wave fronts at the recording array with angular beamwidths of between 0.4 and 7° and velocity bandwidths of between 2 and 40 m s⁻¹. There is a statistically significant positive correlation between source-to-receiver range and the magnitude of coherence loss. Acoustic multipathing generated by interactions with fine-scale wind and temperature gradients along stratospheric propagation paths is qualitatively consistent with the observations. In addition, the study indicates that to isolate coherence loss generated by propagation effects, analysis of signals exhibiting high signal-to-noise ratios (SNR) is required (SNR² > 11 in this study). The rapid temporal variations in infrasonic noise observed in recordings at IMS arrays indicate that correcting measured coherence values for the effect of noise, using pre-signal estimates of noise power, is ineffective.
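    The basic quantity being measured, magnitude-squared coherence between two array elements in a passband, can be sketched with SciPy on synthetic data (the IMS recordings themselves are not reproduced here):

```python
# Magnitude-squared coherence between two sensors sharing a common signal
# plus independent noise; real propagation effects lower this further.
import numpy as np
from scipy.signal import coherence

fs, n = 20.0, 2**14
rng = np.random.default_rng(2)
common = rng.standard_normal(n)            # coherent infrasonic component
x = common + 0.3 * rng.standard_normal(n)  # element 1
y = common + 0.3 * rng.standard_normal(n)  # element 2, independent noise
f, Cxy = coherence(x, y, fs=fs, nperseg=1024)
band = (f >= 0.25) & (f <= 1.0)
print(Cxy[band].mean())                    # near 1 for this synthetic pair
```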

  16. Solving the corner-turning problem for large interferometers

    NASA Astrophysics Data System (ADS)

    Lutomirski, Andrew; Tegmark, Max; Sanchez, Nevada J.; Stein, Leo C.; Urry, W. Lynn; Zaldarriaga, Matias

    2011-01-01

    The so-called corner-turning problem is a major bottleneck for radio telescopes with large numbers of antennas. The problem is essentially that of rapidly transposing a matrix that is too large to store on one single device; in radio interferometry, it occurs because data from each antenna need to be routed to an array of processors each of which will handle a limited portion of the data (say, a frequency range) but requires input from each antenna. We present a low-cost solution allowing the correlator to transpose its data in real time, without contending for bandwidth, via a butterfly network requiring neither additional RAM nor expensive general-purpose switching hardware. We discuss possible implementations of this using FPGA, CMOS, analog logic and optical technology, and conclude that the corner-turner cost can be small even for upcoming massive radio arrays.
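    The data movement itself is easy to picture in miniature; the sketch below shows the in-memory transpose that the butterfly network performs across hardware (the array sizes are invented):

```python
# Corner turn in miniature: data arrive ordered by antenna, but each of
# n_nodes correlator nodes needs every antenna for its own channel range.
import numpy as np

n_ant, n_chan, n_nodes = 64, 1024, 8
by_antenna = np.arange(n_ant * n_chan).reshape(n_ant, n_chan)
by_channel = by_antenna.T                        # the transpose itself
node_slices = np.array_split(by_channel, n_nodes, axis=0)
# node_slices[k]: all antennas' data for node k's slice of channels.
```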

  17. Hanford Reach Fall Chinook Redd Monitoring Report for Calendar Year 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindsey, Cole T.; Nugent, John J.

    2014-02-10

    The U.S. Department of Energy, Richland Operations Office (DOE-RL) conducts ecological monitoring on the Hanford Site to collect and track data needed to ensure compliance with an array of environmental laws, regulations, and policies governing DOE activities. Ecological monitoring data provide baseline information about the plants, animals, and habitat under DOE-RL stewardship at Hanford required for decision-making under the National Environmental Policy Act (NEPA) and Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). The Hanford Site Comprehensive Land Use Plan (CLUP, DOE/EIS-0222-F), which is the Environmental Impact Statement for Hanford Site activities, helps ensure that DOE-RL, its contractors, and other entities conducting activities on the Hanford Site are in compliance with NEPA.

  18. Hanford Site Black-Tailed Jackrabbit Monitoring Report for Fiscal Year 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindsey, Cole T.; Nugent, John J.; Wilde, Justin W.

    2014-02-13

    The U.S. Department of Energy, Richland Operations Office (DOE-RL) conducts ecological monitoring on the Hanford Site to collect and track data needed to ensure compliance with an array of environmental laws, regulations, and policies governing DOE activities. Ecological monitoring data provide baseline information about the plants, animals, and habitat under DOE-RL stewardship at Hanford required for decision-making under the National Environmental Policy Act (NEPA) and Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). The Hanford Site Comprehensive Land Use Plan (CLUP, DOE/EIS-0222-F), which is the Environmental Impact Statement for Hanford Site activities, helps ensure that DOE-RL, its contractors, and other entities conducting activities on the Hanford Site are in compliance with NEPA.

  19. A large-eddy simulation study of wake propagation and power production in an array of tidal-current turbines.

    PubMed

    Churchfield, Matthew J; Li, Ye; Moriarty, Patrick J

    2013-02-28

    This paper presents our initial work in performing large-eddy simulations of tidal turbine array flows. First, a horizontally periodic precursor simulation is performed to create turbulent flow data. Then those data are used as inflow into a tidal turbine array two rows deep and infinitely wide. The turbines are modelled using rotating actuator lines, and the finite-volume method is used to solve the governing equations. In studying the wakes created by the turbines, we observed that the vertical shear of the inflow combined with wake rotation causes lateral wake asymmetry. Also, various turbine configurations are simulated, and the total power production relative to isolated turbines is examined. We found that staggering consecutive rows of turbines in the simulated configurations allows the greatest efficiency using the least downstream row spacing. Counter-rotating consecutive downstream turbines in a non-staggered array shows a small benefit. This work has identified areas for improvement. For example, using a larger precursor domain would better capture elongated turbulent structures, and including salinity and temperature equations would account for density stratification and its effect on turbulence. Additionally, the wall shear stress modelling could be improved, and more array configurations could be examined.

  20. Developing infrared array controller with software real time operating system

    NASA Astrophysics Data System (ADS)

    Sako, Shigeyuki; Miyata, Takashi; Nakamura, Tomohiko; Motohara, Kentaro; Uchimoto, Yuka Katsuno; Onaka, Takashi; Kataza, Hirokazu

    2008-07-01

    Real-time capabilities are required for a controller of a large-format array to reduce the dead time caused by readout and data transfer. Real-time processing has been achieved with dedicated processors including DSP, CPLD, and FPGA devices. However, the dedicated processors have problems with memory resources, inflexibility, and high cost. Meanwhile, a recent PC has sufficient CPU and memory resources to control the infrared array and to process a large amount of frame data in real time. In this study, we have developed an infrared array controller with a software real-time operating system (RTOS) instead of the dedicated processors. A Linux PC equipped with an RTAI extension and a dual-core CPU is used as the main computer, and one of the CPU cores is allocated to real-time processing. A digital I/O board with DMA functions is used as the I/O interface. The signal-processing cores are integrated into the OS kernel as a real-time driver module, which is composed of two virtual devices: the clock-processor and frame-processor tasks. The array controller with the RTOS realizes complicated operations easily, flexibly, and at low cost.

  1. En route noise levels from propfan test assessment airplane

    NASA Technical Reports Server (NTRS)

    Garber, Donald P.; Willshire, William L., Jr.

    1994-01-01

    The en route noise test was designed to characterize propagation of propfan noise from cruise altitudes to the ground. In-flight measurements of propfan source levels and directional patterns were made by a chase plane flying in formation with the propfan test assessment (PTA) airplane. Ground noise measurements were taken during repeated flights over a distributed microphone array. The microphone array on the ground was used to provide ensemble-averaged estimates of mean flyover noise levels, establish confidence limits for those means, and measure propagation-induced noise variability. Even for identical nominal cruise conditions, peak sound levels for individual overflights varied substantially about the average, particularly when overflights were performed on different days. Large day-to-day variations in peak level measurements appeared to be caused by large day-to-day differences in propagation conditions and tended to obscure small variations arising from operating conditions. A parametric evaluation of the sensitivity of the noise prediction method to weather-measurement and source-level uncertainties was also performed. In general, predictions showed good agreement with measurements. However, the method was unable to predict short-term variability of ensemble-averaged data within individual overflights. Although variations in absorption appear to be the dominant factor in variations of peak sound levels recorded on the ground, accurate predictions of those levels require that a complete description of operational conditions be taken into account. The comprehensive and integrated methods presented in this paper have adequately predicted ground-measured sound levels. On average, peak sound levels were predicted within 3 dB for each of the three different cruise conditions.

  2. Wire-number effects on high-power annular z-pinches and some characteristics at high wire number

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SANFORD,THOMAS W. L.

    2000-05-23

    Characteristics of annular wire-array z-pinches as a function of wire number and at high wire number are reviewed. The data, taken primarily using aluminum wires on Saturn, are comprehensive. The experiments have provided important insights into the features of wire-array dynamics critical for high x-ray power generation, and have initiated a renaissance in z-pinches when high numbers of wires are used. In this regime, for example, radiation environments characteristic of those encountered during the early pulses required for indirect-drive ICF ignition on the NIF have been produced in hohlraums driven by x-rays from a z-pinch, and are commented on here.

  3. Integrating Low-Cost MEMS Accelerometer Mini-Arrays (MAMA) in Earthquake Early Warning Systems

    NASA Astrophysics Data System (ADS)

    Nof, R. N.; Chung, A. I.; Rademacher, H.; Allen, R. M.

    2016-12-01

    Current operational Earthquake Early Warning Systems (EEWS) acquire data with networks of single seismic stations, and compute source parameters assuming earthquakes to be point sources. For large events, the point-source assumption leads to an underestimation of magnitude, and the use of single stations leads to large uncertainties in the locations of events outside the network. We propose the use of mini-arrays to improve EEWS. Mini-arrays have the potential to: (a) estimate reliable hypocentral locations by beam-forming (FK-analysis) techniques; (b) characterize the rupture dimensions and account for finite-source effects, leading to more reliable estimates for large magnitudes. Previously, the high price of multiple seismometers had made creating arrays cost-prohibitive. However, we propose setting up mini-arrays of a new seismometer based on a low-cost (<$150), high-performance MEMS accelerometer around conventional seismic stations. The expected benefits of such an approach include decreasing alert times, improving real-time shaking predictions and mitigating false alarms. We use low-resolution 14-bit Quake Catcher Network (QCN) data collected during the Rapid Aftershock Mobilization Program (RAMP) in Christchurch, NZ following the M7.1 Darfield earthquake in September 2010. As the QCN network was so dense, we were able to use a small sub-array of up to ten sensors spread over an area of up to 1.7 x 2.2 km² to demonstrate our approach and to solve for the back azimuth (BAZ) of two events (Mw4.7 and Mw5.1) with less than ±10° error. We will also present the new 24-bit device details, benchmarks, and real-time measurements.
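    A plane-wave fit of the kind used for BAZ estimation can be sketched in a few lines; the station coordinates and arrival-time picks below are invented for illustration:

```python
# Fit relative arrival times t = X @ s for the horizontal slowness vector
# s (east, north); the back azimuth points opposite the propagation
# direction of the incoming plane wave.
import numpy as np

xy = np.array([[0, 0], [400, 50], [150, 600], [650, 700]], float)  # m
t = np.array([0.000, 0.048, 0.081, 0.132])        # relative picks (s)
s, *_ = np.linalg.lstsq(xy - xy[0], t - t[0], rcond=None)
baz = (np.degrees(np.arctan2(s[0], s[1])) + 180.0) % 360.0
print(f"BAZ ~ {baz:.1f} deg, |s| ~ {np.linalg.norm(s)*1e3:.2f} s/km")
```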

  4. Genomic analysis using high density SNP based oligonucleotide arrays and MLPA provides a comprehensive analysis of INI1/SMARCB1 in malignant rhabdoid tumors

    PubMed Central

    Jackson, Eric M.; Sievert, Angela J.; Gai, Xiaowu; Hakonarson, Hakon; Judkins, Alexander R; Tooke, Laura; Perin, Juan Carlos; Xie, Hongbo; Shaikh, Tamim H.; Biegel, Jaclyn A.

    2009-01-01

    Translational Relevance Previous reports suggested that abnormalities of INI1 could be detected in 70–75% of malignant rhabdoid tumors. The mechanism of inactivation in the other 25% remained unclear. The goal of this study was to perform a high-resolution genomic analysis of a large series of rhabdoid tumors with the expectation of identifying additional loci related to the initiation or progression of these malignancies. We also developed a comprehensive set of assays, including a new MLPA assay, to interrogate the INI1 locus in 22q11.2. Intragenic deletions could be detected using the Illumina 550K Beadchip, whereas single exon deletions could be detected using MLPA. The current study demonstrates that with a multi-platform approach, alterations at the INI1 locus can be detected in almost all cases. Thus, appropriate molecular genetic testing can be used as an aid in the diagnosis and for treatment planning for most patients. Purpose A high-resolution genomic profiling and comprehensive targeted analysis of INI1/SMARCB1 of a large series of pediatric rhabdoid tumors was performed. The aim was to identify regions of copy number change and loss of heterozygosity that might pinpoint additional loci involved in the development or progression of rhabdoid tumors, and define the spectrum of genomic alterations of INI1 in this malignancy. Experimental Design A multi-platform approach, utilizing Illumina single nucleotide polymorphism (SNP) based oligonucleotide arrays, multiplex ligation dependent probe amplification (MLPA), fluorescence in situ hybridization (FISH), and coding sequence analysis was used to characterize genome wide copy number changes, loss of heterozygosity, and genomic alterations of INI1/SMARCB1 in a series of pediatric rhabdoid tumors. Results The bi-allelic alterations of INI1 that led to inactivation were elucidated in 50 of 51 tumors. INI1 inactivation was demonstrated by a variety of mechanisms, including deletions, mutations, and loss of heterozygosity. The results from the array studies highlighted the complexity of rearrangements of chromosome 22, compared to the low frequency of alterations involving the other chromosomes. Conclusions The results from the genome wide SNP-array analysis suggest that INI1 is the primary tumor suppressor gene involved in the development of rhabdoid tumors with no second locus identified. In addition, we did not identify hot spots for the breakpoints in sporadic tumors with deletions of chromosome 22q11.2. By employing a multimodality approach, the wide spectrum of alterations of INI1 can be identified in the majority of patients, which increases the clinical utility of molecular diagnostic testing. PMID:19276269

  5. Atmospheric effects on microphone array analysis of aircraft vortex sound

    DOT National Transportation Integrated Search

    2006-05-08

    This paper provides the basis of a comprehensive analysis of vortex sound propagation through the atmosphere in order to assess real atmospheric effects on acoustic array processing. Such effects may impact vortex localization accuracy and detect...

  6. The effect of the geomagnetic field on cosmic ray energy estimates and large scale anisotropy searches on data from the Pierre Auger Observatory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abreu, P.; /Lisbon, IST; Aglietta, M.

    2011-11-01

    We present a comprehensive study of the influence of the geomagnetic field on the energy estimation of extensive air showers with a zenith angle smaller than 60°, detected at the Pierre Auger Observatory. The geomagnetic field induces an azimuthal modulation of the estimated energy of cosmic rays up to the ~2% level at large zenith angles. We present a method to account for this modulation of the reconstructed energy. We analyse the effect of the modulation on large scale anisotropy searches in the arrival direction distributions of cosmic rays. At a given energy, the geomagnetic effect is shown to induce a pseudo-dipolar pattern at the percent level in the declination distribution that needs to be accounted for. In this work, we have identified and quantified a systematic uncertainty affecting the energy determination of cosmic rays detected by the surface detector array of the Pierre Auger Observatory. This systematic uncertainty, induced by the influence of the geomagnetic field on the shower development, has a strength which depends on both the zenith and the azimuthal angles. Consequently, we have shown that it induces distortions of the estimated cosmic ray event rate at a given energy at the percent level in both the azimuthal and the declination distributions, the latter of which mimics an almost dipolar pattern. We have also shown that the induced distortions are already at the level of the statistical uncertainties for a number of events N ≈ 32 000 (we note that the full Auger surface detector array collects about 6500 events per year with energies above 3 EeV). Accounting for these effects is thus essential for the correct interpretation of large scale anisotropy measurements that explicitly exploit the declination distribution.

  7. ArrayBridge: Interweaving declarative array processing with high-performance computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xing, Haoyuan; Floratos, Sofoklis; Blanas, Spyros

    Scientists are increasingly turning to datacenter-scale computers to produce and analyze massive arrays. Despite decades of database research that extols the virtues of declarative query processing, scientists still write, debug and parallelize imperative HPC kernels even for the most mundane queries. This impedance mismatch has been partly attributed to the cumbersome data loading process; in response, the database community has proposed in situ mechanisms to access data in scientific file formats. Scientists, however, desire more than a passive access method that reads arrays from files. This paper describes ArrayBridge, a bi-directional array view mechanism for scientific file formats, that aims to make declarative array manipulations interoperable with imperative file-centric analyses. Our prototype implementation of ArrayBridge uses HDF5 as the underlying array storage library and seamlessly integrates into the SciDB open-source array database system. In addition to fast querying over external array objects, ArrayBridge produces arrays in the HDF5 file format just as easily as it can read from it. ArrayBridge also supports time travel queries from imperative kernels through the unmodified HDF5 API, and automatically deduplicates between array versions for space efficiency. Our extensive performance evaluation in NERSC, a large-scale scientific computing facility, shows that ArrayBridge exhibits performance and I/O scalability statistically indistinguishable from the native SciDB storage engine.
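    For orientation, the kind of HDF5 round trip ArrayBridge mediates looks like this in plain h5py (ArrayBridge's own interfaces are not reproduced here):

```python
# Write a chunked HDF5 dataset, then read back only a slice; chunked
# storage is what makes partial, in-situ reads of large arrays cheap.
import numpy as np
import h5py

a = np.arange(12.0).reshape(3, 4)
with h5py.File("demo.h5", "w") as f:
    f.create_dataset("array", data=a, chunks=True)
with h5py.File("demo.h5", "r") as f:
    sub = f["array"][1:, :2]     # only the needed blocks are read
print(sub)
```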

  8. Highly-Ordered 3D Vertical Resistive Switching Memory Arrays with Ultralow Power Consumption and Ultrahigh Density.

    PubMed

    Al-Haddad, Ahmed; Wang, Chengliang; Qi, Haoyuan; Grote, Fabian; Wen, Liaoyong; Bernhard, Jörg; Vellacheri, Ranjith; Tarish, Samar; Nabi, Ghulam; Kaiser, Ute; Lei, Yong

    2016-09-07

    Resistive switching random access memories (RRAM) have attracted great scientific and industrial attention for next-generation data storage because of their advantages of nonvolatile properties, high density, low power consumption, fast writing/erasing speed, good endurance, and a simple and small operation system. Here, by using a template-assisted technique, we demonstrate a three-dimensional highly ordered vertical RRAM device array with a density as high as that of the nanopores of the template (10⁸-10⁹ cm⁻²), which can also be fabricated over a large area. The high crystallinity of the materials, the large contact area and the intimate semiconductor/electrode interface (3 nm interfacial layer) make ultralow-voltage operation (millivolt magnitude) and ultralow power consumption (picowatts) possible. Our procedure for fabricating the nanodevice arrays over a large area can be used for producing many other materials, and such three-dimensional electronic device arrays, with the capability to adjust the device densities, can be extended to other applications of next-generation nanodevice technology.

  9. The design and application of large area intensive lens array focal spots measurement system

    NASA Astrophysics Data System (ADS)

    Chen, Bingzhen; Yao, Shun; Yang, Guanghui; Dai, Mingchong; Wang, Zhiyong

    2014-12-01

    Concentrating photovoltaic (CPV) modules are getting thinner and using smaller cells nowadays. Correspondingly, large-area intensive lens arrays with smaller unit dimensions and shorter focal lengths are wanted. However, the size and power center of lens array focal spots usually differ from the design values and are hard to measure, especially over large areas, because the machining errors and material deformation of the lens array are hard to simulate in the optical design process. The alignment error between solar cells and focal spots in the module assembly process is therefore hard to control, and the efficiency of a CPV module with a thinner body and smaller cells falls far below expectations. In this paper, a design for an automatic measurement system for large-area lens array focal spots is presented, along with results from its prototype application. In this system, a four-channel parallel light path and its corresponding image capture and processing modules are designed. These modules simulate focal spots under sunlight, capture the spot images with charge-coupled devices, and process them with a grey-level algorithm, exporting the important focal spot information such as spot size and location. A motion control module based on a grating scale signal and an interval measurement method are also employed in order to obtain test results with high speed and high precision on lens arrays as large as 1 m × 0.8 m. The repeatability of the prototype measurement is ±10 μm at a rate of 90 spots/min. Compared to the original module assembled using coordinates from the optical design, modules assembled using data exported from the prototype are 18% higher in output power, reaching a conversion efficiency of over 31%. This system and its design can be used for focal spot measurement of plano-convex lens arrays and Fresnel lens arrays, as well as other large-area lens array applications with small focal spots.
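
    The grey-level processing step can be sketched as an intensity-weighted centroid plus an RMS spot-size estimate; the threshold fraction and the synthetic CCD frame below are assumptions for illustration, not the paper's exact algorithm.

```python
# Sketch: locate a focal spot's power centre and size from a CCD frame.
import numpy as np

def spot_centroid_and_size(img, thresh_frac=0.1):
    """Intensity-weighted centroid and RMS radius of a single focal spot."""
    img = img.astype(float)
    img = np.where(img > thresh_frac * img.max(), img, 0.0)  # suppress background
    total = img.sum()
    y, x = np.indices(img.shape)
    cx, cy = (x * img).sum() / total, (y * img).sum() / total
    r2 = ((x - cx) ** 2 + (y - cy) ** 2) * img               # second moment
    return (cx, cy), np.sqrt(r2.sum() / total)

# Synthetic Gaussian spot standing in for a captured frame.
yy, xx = np.mgrid[0:200, 0:200]
frame = np.exp(-((xx - 120.3) ** 2 + (yy - 80.7) ** 2) / (2 * 5.0 ** 2))
print(spot_centroid_and_size(frame))   # centre close to (120.3, 80.7)
```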

  10. Combined array CGH plus SNP genome analyses in a single assay for optimized clinical testing

    PubMed Central

    Wiszniewska, Joanna; Bi, Weimin; Shaw, Chad; Stankiewicz, Pawel; Kang, Sung-Hae L; Pursley, Amber N; Lalani, Seema; Hixson, Patricia; Gambin, Tomasz; Tsai, Chun-hui; Bock, Hans-Georg; Descartes, Maria; Probst, Frank J; Scaglia, Fernando; Beaudet, Arthur L; Lupski, James R; Eng, Christine; Wai Cheung, Sau; Bacino, Carlos; Patel, Ankita

    2014-01-01

    In clinical diagnostics, both array comparative genomic hybridization (array CGH) and single nucleotide polymorphism (SNP) genotyping have proven to be powerful genomic technologies utilized for the evaluation of developmental delay, multiple congenital anomalies, and neuropsychiatric disorders. Differences in the ability to resolve genomic changes between these arrays may constitute an implementation challenge for clinicians: which platform (SNP vs array CGH) might best detect the underlying genetic cause for the disease in the patient? While only SNP arrays enable the detection of copy number neutral regions of absence of heterozygosity (AOH), they have limited ability to detect single-exon copy number variants (CNVs) due to the distribution of SNPs across the genome. To provide comprehensive clinical testing for both CNVs and copy-neutral AOH, we enhanced our custom-designed high-resolution oligonucleotide array that has exon-targeted coverage of 1860 genes with 60 000 SNP probes, referred to as Chromosomal Microarray Analysis – Comprehensive (CMA-COMP). Of the 3240 cases evaluated by this array, clinically significant CNVs were detected in 445 cases including 21 cases with exonic events. In addition, 162 cases (5.0%) showed at least one AOH region >10 Mb. We demonstrate that even though this array has a lower density of SNP probes than other commercially available SNP arrays, it reliably detected AOH events >10 Mb as well as exonic CNVs beyond the detection limitations of SNP genotyping. Thus, combining SNP probes and exon-targeted array CGH into one platform provides clinically useful genetic screening in an efficient manner. PMID:23695279

  11. ActiveSeismoPick3D - automatic first arrival determination for large active seismic arrays

    NASA Astrophysics Data System (ADS)

    Paffrath, Marcel; Küperkoch, Ludger; Wehling-Benatelli, Sebastian; Friederich, Wolfgang

    2016-04-01

    We developed a tool for automatic determination of first arrivals in active seismic data based on an approach that utilises higher-order statistics (HOS) and the Akaike information criterion (AIC), commonly used in seismology but not in active seismics. Automatic picking is highly desirable in active seismics as the volume of data provided by large seismic arrays rapidly exceeds what an analyst can evaluate in a reasonable amount of time. To bring the functionality of automatic phase picking into the context of active data, the software package ActiveSeismoPick3D was developed in Python. It uses a modified algorithm for the determination of first arrivals which searches for the HOS maximum in unfiltered data. Additionally, it offers tools for manual quality control and postprocessing, e.g. various visualisation and repicking functionalities. For flexibility, the tool also includes methods for the preparation of geometry information of large seismic arrays and improved interfaces to the Fast Marching Tomography Package (FMTOMO), which can be used for the prediction of travel times and inversion for subsurface properties. Output files are generated in the VTK format, allowing 3D visualization of e.g. the inversion results. As a test case, a data set consisting of 9216 traces from 64 shots was gathered, recorded at 144 receivers deployed in a regular 2D array of 100 × 100 m. ActiveSeismoPick3D automatically checks the determined first arrivals against a dynamic signal-to-noise ratio threshold. From the data, a 3D model of the subsurface was generated using the export functionality of the package and FMTOMO.
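
    The AIC component of such a picker is compact enough to sketch; this minimal version (a common formulation computed on the raw trace, with a synthetic onset) illustrates the principle rather than the package's modified HOS/AIC algorithm.

```python
# Sketch of an AIC-based first-arrival pick: the AIC curve of a trace is
# minimized where the statistics change from noise to signal.
import numpy as np

def aic_pick(trace):
    """Return the sample index minimizing the Akaike information criterion."""
    x = np.asarray(trace, dtype=float)
    n = len(x)
    aic = np.full(n, np.inf)
    for k in range(1, n - 1):
        v1, v2 = np.var(x[:k]), np.var(x[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return int(np.argmin(aic))

rng = np.random.default_rng(1)
trace = rng.normal(0.0, 0.1, 500)                       # pre-arrival noise
trace[200:] += np.sin(np.linspace(0, 60, 300)) * np.exp(-np.linspace(0, 4, 300))
print(aic_pick(trace))                                  # near 200, the onset
```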

  12. Time-resolved metabolomics reveals metabolic modulation in rice foliage

    PubMed Central

    Sato, Shigeru; Arita, Masanori; Soga, Tomoyoshi; Nishioka, Takaaki; Tomita, Masaru

    2008-01-01

    Background To elucidate the interaction of dynamics among modules that constitute biological systems, comprehensive datasets obtained from "omics" technologies have been used. In recent plant metabolomics approaches, the reconstruction of metabolic correlation networks has been attempted using statistical techniques. However, the results were unsatisfactory and effective data-mining techniques that apply appropriate comprehensive datasets are needed. Results Using capillary electrophoresis mass spectrometry (CE-MS) and capillary electrophoresis diode-array detection (CE-DAD), we analyzed the dynamic changes in the level of 56 basic metabolites in plant foliage (Oryza sativa L. ssp. japonica) at hourly intervals over a 24-hr period. Unsupervised clustering of comprehensive metabolic profiles using Kohonen's self-organizing map (SOM) allowed classification of the biochemical pathways activated by the light and dark cycle. The carbon and nitrogen (C/N) metabolism in both periods was also visualized as a phenotypic linkage map that connects network modules on the basis of traditional metabolic pathways rather than pairwise correlations among metabolites. The regulatory networks of C/N assimilation/dissimilation at each time point were consistent with previous works on plant metabolism. In response to environmental stress, glutathione and spermidine fluctuated synchronously with their regulatory targets. Adenine nucleosides and nicotinamide coenzymes were regulated by phosphorylation and dephosphorylation. We also demonstrated that SOM analysis was applicable to the estimation of unidentifiable metabolites in metabolome analysis. Hierarchical clustering of a correlation coefficient matrix could help identify the bottleneck enzymes that regulate metabolic networks. Conclusion Our results showed that our SOM analysis with appropriate metabolic time-courses effectively revealed the synchronous dynamics among metabolic modules and elucidated the underlying biochemical functions. The application of discrimination of unidentified metabolites and the identification of bottleneck enzymatic steps even to non-targeted comprehensive analysis promise to facilitate an understanding of large-scale interactions among components in biological systems. PMID:18564421
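
    The SOM classification step can be illustrated with the third-party minisom package (not the implementation used in the paper); the random matrix below stands in for the 56 measured hourly metabolite profiles.

```python
# Sketch: cluster metabolite time courses on a small self-organizing map,
# so metabolites mapping to the same node share a diurnal pattern.
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(2)
profiles = rng.random((56, 24))            # 56 metabolites x 24 hourly samples
profiles = (profiles - profiles.mean(axis=1, keepdims=True)) \
    / profiles.std(axis=1, keepdims=True)  # per-metabolite standardization

som = MiniSom(4, 4, input_len=24, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(profiles, num_iteration=2000)

clusters = {}
for i, p in enumerate(profiles):
    clusters.setdefault(som.winner(p), []).append(i)
print(clusters)                            # node -> metabolite indices
```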

  13. At-TAX: a whole genome tiling array resource for developmental expression analysis and transcript identification in Arabidopsis thaliana

    PubMed Central

    Laubinger, Sascha; Zeller, Georg; Henz, Stefan R; Sachsenberg, Timo; Widmer, Christian K; Naouar, Naïra; Vuylsteke, Marnik; Schölkopf, Bernhard; Rätsch, Gunnar; Weigel, Detlef

    2008-01-01

    Gene expression maps for model organisms, including Arabidopsis thaliana, have typically been created using gene-centric expression arrays. Here, we describe a comprehensive expression atlas, Arabidopsis thaliana Tiling Array Express (At-TAX), which is based on whole-genome tiling arrays. We demonstrate that tiling arrays are accurate tools for gene expression analysis and identified more than 1,000 unannotated transcribed regions. Visualizations of gene expression estimates, transcribed regions, and tiling probe measurements are accessible online at the At-TAX homepage. PMID:18613972

  14. A light writable microfluidic "flash memory": optically addressed actuator array with latched operation for microfluidic applications.

    PubMed

    Hua, Zhishan; Pal, Rohit; Srivannavit, Onnop; Burns, Mark A; Gulari, Erdogan

    2008-03-01

    This paper presents a novel optically addressed microactuator array (microfluidic "flash memory") with latched operation. Analogous to the address-data bus mediated memory address protocol in electronics, the microactuator array consists of individual phase-change based actuators addressed by localized heating through focused light patterns (address bus), which can be provided by a modified projector or high power laser pointer. A common pressure manifold (data bus) for the entire array is used to generate large deflections of the phase change actuators in the molten phase. The use of phase change material as the working media enables latched operation of the actuator array. After the initial light "writing" during which the phase is temporarily changed to molten, the actuated status is self-maintained by the solid phase of the actuator without power and pressure inputs. The microfluidic flash memory can be re-configured by a new light illumination pattern and common pressure signal. The proposed approach can achieve actuation of arbitrary units in a large-scale array without the need for complex external equipment such as solenoid valves and electrical modules, which leads to significantly simplified system implementation and compact system size. The proposed work therefore provides a flexible, energy-efficient, and low cost multiplexing solution for microfluidic applications based on physical displacements. As an example, the use of the latched microactuator array as "normally closed" or "normally open" microvalves is demonstrated. The phase-change wax is fully encapsulated and thus immune from contamination issues in fluidic environments.

  15. Detection and validation of single feature polymorphisms using RNA expression data from a rice genome array

    USDA-ARS?s Scientific Manuscript database

    A large number of genetic variations have been identified in rice. Such variations must in many cases control phenotypic differences in abiotic stress tolerance and other traits. A single feature polymorphism (SFP) is an oligonucleotide array-based polymorphism which can be used for identification o...

  16. A self-testing dynamic RAM chip

    NASA Astrophysics Data System (ADS)

    You, Y.; Hayes, J. P.

    1985-02-01

    A novel approach to making very large dynamic RAM chips self-testing is presented. It is based on two main concepts: on-chip generation of regular test sequences with very high fault coverage, and concurrent testing of storage-cell arrays to reduce overall testing time. The failure modes of a typical 64 K RAM employing one-transistor cells are analyzed to identify their test requirements. A comprehensive test generation algorithm that can be implemented with minimal modification to a standard cell layout is derived. The self-checking peripheral circuits necessary to implement this testing algorithm are described, and the self-testing RAM is briefly evaluated.
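
    The abstract does not reproduce the test generation algorithm itself; as a software illustration of the kind of regular march sequence used for RAM testing, here is the classical MATS+ test run against a simulated cell array with one stuck-at-0 fault injected.

```python
# MATS+ march test, { (w0); up(r0,w1); down(r1,w0) }, in software miniature.
def mats_plus(mem):
    """Run MATS+ over all addresses; return the set of faulty addresses."""
    n, faults = len(mem.cells), set()
    for a in range(n):                 # ascending write 0
        mem.write(a, 0)
    for a in range(n):                 # ascending: read 0, write 1
        if mem.read(a) != 0:
            faults.add(a)
        mem.write(a, 1)
    for a in reversed(range(n)):       # descending: read 1, write 0
        if mem.read(a) != 1:
            faults.add(a)
        mem.write(a, 0)
    return faults

class FaultyRAM:
    """Bit array with one stuck-at-0 cell, standing in for a DRAM under test."""
    def __init__(self, size, stuck_at_zero):
        self.cells, self.bad = [0] * size, stuck_at_zero
    def write(self, addr, bit):
        self.cells[addr] = 0 if addr == self.bad else bit
    def read(self, addr):
        return self.cells[addr]

print(mats_plus(FaultyRAM(64, stuck_at_zero=17)))   # reports {17}
```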

  17. Testing the Reviewed Event Bulletin of the International Data Centre Using Waveform Cross Correlation: Repeat Events at Aitik Copper Mine, Sweden

    NASA Astrophysics Data System (ADS)

    Kitov, I. O.; Rozhkov, N.; Bobrov, D.; Rozhkov, M.; Yedlin, M. J.

    2016-12-01

    The quality of the Reviewed Event Bulletin (REB) issued by the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is crucial for the Member States as well as for the seismological community. One of the most efficient methods to test REB quality is to use repeat events having very accurate absolute locations. Hundreds of quarry blasts detonated at the Aitik copper mine (the central point of active mining: 67.08N, 20.95E) were recorded by several seismic arrays of the International Monitoring System (IMS), found by IDC automatic processing, and then confirmed by analysts as REB events. The size of the quarry is approximately 1 km, and one can consider the uncertainty in the absolute coordinates of the studied events to be less than 0.5 km as measured from the central point. In the REB, the corresponding epicenters are almost uniformly scattered over the territory 67.0N to 67.3N and 20.7E to 21.5E. These REB locations are based on the measured arrival times as well as azimuth and slowness estimates at several IMS stations, with the main input from ARCES, NOA, FINES, and HFS. The larger scatter of REB locations is caused by measurement uncertainty and the velocity model. Seismological methods based on waveform cross correlation allow very accurate relative location of repeat events. Here we test the level of similarity between signals from these events. It was found that the IMS primary array station ARCES demonstrates the highest similarity as expressed by the cross correlation coefficient (CC) and the signal-to-noise ratio (SNR) calculated on the CC traces. The small-aperture array FINES is the second best, and the large-aperture array NOA demonstrates mediocre performance, likely due to its size and the loss of coherency of the high-frequency and relatively low-velocity signals from the mine. During the last five years, station ARCES has been upgraded from a vertical array to a 3-C one. This transformation has improved the performance of the CC technique as applied to the Aitik mine events. We have also applied a Principal Component Analysis to estimate the level of variability in the signals as well as to build the best waveform template for effective detection and identification of all blasts conducted at the Aitik mine.
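
    The similarity measure itself, a normalized sliding cross-correlation of a master-event template against a trace, is standard and can be sketched directly; the waveforms below are synthetic stand-ins for a master event and continuous array data.

```python
# Sketch: sliding normalized cross-correlation (Pearson CC at each lag).
import numpy as np

def sliding_cc(trace, template):
    """Normalized cross-correlation coefficient at every lag."""
    nt = len(template)
    t = (template - template.mean()) / (template.std() * nt)
    cc = np.empty(len(trace) - nt + 1)
    for k in range(len(cc)):
        w = trace[k:k + nt]
        cc[k] = np.dot(t, (w - w.mean()) / w.std())
    return cc

rng = np.random.default_rng(3)
tmpl = np.sin(np.linspace(0, 20, 200)) * np.exp(-np.linspace(0, 3, 200))
data = rng.normal(0.0, 0.3, 2000)
data[700:900] += tmpl                      # buried repeat event
cc = sliding_cc(data, tmpl)
print(cc.argmax(), cc.max())               # lag near 700, CC close to 1
```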

  18. Digital processing of radiographic images

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.

    1973-01-01

    Some techniques, and the accompanying software documentation, for the digital enhancement of radiographs are presented. Both image handling and image processing operations are considered. The image handling operations dealt with are: (1) conversion of the data format from packed to unpacked and vice versa; (2) automatic extraction of image data arrays; (3) transposition and 90 deg rotations of large data arrays; (4) translation of data arrays for registration; and (5) reduction of the dimensions of data arrays by integral factors. Both the frequency and the spatial domain approaches are presented for the design and implementation of the image processing operations. It is shown that spatial domain recursive implementation of filters is much faster than nonrecursive implementations using fast Fourier transforms (FFT) for the cases of interest in this work. The recursive implementation of a class of matched filters for enhancing image signal-to-noise ratio is described. Test patterns are used to illustrate the filtering operations. The application of the techniques to radiographic images of metallic structures is demonstrated through several examples.
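
    The recursive-versus-nonrecursive contrast can be shown in a few lines; the single-pole smoothing filter below is an illustrative choice, not the paper's matched filter, and scipy stands in for the original implementation.

```python
# Sketch: the same low-pass response realized recursively (IIR) and
# nonrecursively (FFT convolution with the truncated impulse response).
import numpy as np
from scipy.signal import lfilter, fftconvolve

row = np.random.default_rng(4).normal(size=4096)   # one image scan line

# Recursive single-pole low-pass: y[n] = a*x[n] + (1-a)*y[n-1]
a = 0.1
rec = lfilter([a], [1.0, -(1.0 - a)], row)

# Nonrecursive counterpart: impulse response h[n] = a*(1-a)^n, truncated.
k = a * (1.0 - a) ** np.arange(256)
nonrec = fftconvolve(row, k, mode="full")[: len(row)]

print(np.max(np.abs(rec - nonrec)))   # tiny: both realize the same response
```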

  19. Modeling change from large-scale high-dimensional spatio-temporal array data

    NASA Astrophysics Data System (ADS)

    Lu, Meng; Pebesma, Edzer

    2014-05-01

    The massive data that come from Earth observation satellites and other sensors provide significant information for modeling global change. At the same time, the high dimensionality of the data has brought challenges in data acquisition, management, effective querying and processing. In addition, the output of Earth system modeling tends to be data intensive and needs methodologies for storage, validation, analysis and visualization, e.g. as maps. An important proportion of Earth system observations and simulated data can be represented as multi-dimensional array data, which have received increasing attention in big data management and spatio-temporal analysis. Case studies will be developed in natural sciences such as climate change, hydrological modeling and sediment dynamics, for which addressing big data problems is necessary. Multi-dimensional array-based database management and analytics systems such as Rasdaman, SciDB, and R will be applied to these cases. From these studies we hope to learn the strengths and weaknesses of these systems, how they might work together, and how the semantics of array operations differ, through addressing the problems associated with big data. Research questions include: • How can we reduce dimensions spatially and temporally, or thematically? • How can we extend existing GIS functions to work on multidimensional arrays? • How can we combine data sets of different dimensionality or different resolutions? • Can map algebra be extended to an intelligible array algebra? • What are effective semantics for array programming of dynamic data driven applications? • In which sense are space and time special, as dimensions, compared to other properties? • How can we make the analysis of multi-spectral, multi-temporal and multi-sensor earth observation data easy?
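
    The first of these research questions, reducing arrays spatially, temporally or thematically, can be sketched with the xarray package (a convenient Python array tool, not one of the systems named above); the data cube is synthetic.

```python
# Sketch: temporal reduction and spatial aggregation of a labelled cube.
import numpy as np
import xarray as xr

cube = xr.DataArray(
    np.random.rand(12, 180, 360),
    dims=("time", "lat", "lon"),
    name="ndvi",                                    # hypothetical variable
)

annual_mean = cube.mean(dim="time")                 # temporal reduction
coarser = cube.coarsen(lat=2, lon=2).mean()         # 2x2 spatial aggregation
print(annual_mean.shape, coarser.shape)             # (180, 360) (12, 90, 180)
```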

  20. A Large Array of Small Antennas to Support Future NASA Missions

    NASA Astrophysics Data System (ADS)

    Jones, D. L.; Weinreb, S.; Preston, R. A.

    2001-01-01

    A team of engineers and scientists at JPL is currently working on the design of an array of small radio antennas with a total collecting area up to twenty times that of the largest existing (70 m) DSN antennas. An array of this size would provide obvious advantages for high data rate telemetry reception and for spacecraft navigation. Among these advantages are an order-of-magnitude increase in sensitivity for telemetry downlink, flexible sub-arraying to track multiple spacecraft simultaneously, increased reliability through the use of large numbers of identical array elements, very accurate real-time angular spacecraft tracking, and a dramatic reduction in cost per unit area. NASA missions in many disciplines, including planetary science, would benefit from this increased DSN capability. The science return from planned missions could be increased, and opportunities for less expensive or completely new kinds of missions would be created. The DSN array would also be an immensely valuable instrument for radio astronomy. Indeed, it would be by far the most sensitive radio telescope in the world. Additional information is contained in the original extended abstract.

  1. Python Winding Itself Around Datacubes: How to Access Massive Multi-Dimensional Arrays in a Pythonic Way

    NASA Astrophysics Data System (ADS)

    Merticariu, Vlad; Misev, Dimitar; Baumann, Peter

    2017-04-01

    While Python has developed into the lingua franca of data science, there is often a paradigm break when accessing specialized tools. In particular, for one of the core data categories in science and engineering, massive multi-dimensional arrays, out-of-memory solutions typically employ their own, different models. We discuss this situation using the example of the scalable open-source array engine rasdaman ("raster data manager"), which offers access to and processing of petascale multi-dimensional arrays through an SQL-style array query language, rasql. Such queries are executed in the server on a storage engine utilizing adaptive array partitioning, and on a processing engine implementing a "tile streaming" paradigm to allow processing of arrays massively larger than server RAM. The rasdaman QL has acted as a blueprint for forthcoming ISO Array SQL and for the Open Geospatial Consortium (OGC) geo-analytics language, Web Coverage Processing Service, adopted in 2008. Not surprisingly, rasdaman is the OGC and INSPIRE Reference Implementation for their "Big Earth Data" standards suite. Recently, rasdaman has been augmented with a Python interface which allows users to interact transparently with the database (credit goes to Siddharth Shukla's Master's thesis at Jacobs University). Programmers do not need to know the rasdaman query language, as the operators are silently transformed, through lazy evaluation, into queries. Arrays delivered are likewise automatically transformed into their Python representation. In the talk, the rasdaman concept will be illustrated with the help of large-scale, real-life examples of operational satellite image and weather data services, and sample Python code.

  2. Development and Calibration of a Field-Deployable Microphone Phased Array for Propulsion and Airframe Noise Flyover Measurements

    NASA Technical Reports Server (NTRS)

    Humphreys, William M., Jr.; Lockard, David P.; Khorrami, Mehdi R.; Culliton, William G.; McSwain, Robert G.; Ravetta, Patricio A.; Johns, Zachary

    2016-01-01

    A new aeroacoustic measurement capability has been developed consisting of a large channel-count, field-deployable microphone phased array suitable for airframe noise flyover measurements for a range of aircraft types and scales. The array incorporates up to 185 hardened, weather-resistant sensors suitable for outdoor use. A custom 4-mA current loop receiver circuit with temperature compensation was developed to power the sensors over extended cable lengths with minimal degradation of the signal-to-noise ratio and frequency response. Extensive laboratory calibrations and environmental testing of the sensors were conducted to verify the design's performance specifications. A compact data system combining sensor power, signal conditioning, and digitization was assembled for use with the array. Complementing the data system is a robust analysis system capable of near real-time presentation of beamformed and deconvolved contour plots and integrated spectra obtained from array data acquired during flyover passes. Additional instrumentation systems needed to process the array data were also assembled, including a commercial weather station and a video monitoring/recording system. A detailed mock-up of the instrumentation suite (phased array, weather station, and data processor) was exercised in the NASA Langley Acoustic Development Laboratory to vet the system's performance. The first deployment of the system occurred at Finnegan Airfield at Fort A.P. Hill, where the array was utilized to measure the vehicle noise from a number of sUAS (small Unmanned Aerial System) aircraft. A unique in-situ calibration method for the array microphones using a hovering aerial sound source was attempted for the first time during the deployment.
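
    The delay-and-sum beamforming underlying such contour plots can be sketched for a line array and a tonal source; the geometry, sample rate and source position below are illustrative assumptions, not the system's parameters.

```python
# Sketch: conventional delay-and-sum beamforming for a 32-element line array.
import numpy as np

c, fs = 343.0, 25_600.0                       # speed of sound (m/s), sample rate (Hz)
mics = np.linspace(-2.0, 2.0, 32)             # microphone x positions (m)
src_x, height = 0.5, 30.0                     # source overhead at (0.5 m, 30 m)

t = np.arange(0, 0.2, 1 / fs)
tone = np.sin(2 * np.pi * 2000.0 * t)         # 2 kHz source signal
d_src = np.hypot(mics - src_x, height)
channels = np.array([np.interp(t - d / c, t, tone, left=0.0) for d in d_src])

def steered_power(x_focus):
    """Average power of the delay-and-sum output focused at (x_focus, height)."""
    d = np.hypot(mics - x_focus, height)
    out = np.zeros_like(t)
    for ch, tau in zip(channels, d / c):      # advance each channel by its delay
        out += np.interp(t + tau, t, ch, left=0.0, right=0.0)
    return np.mean((out / len(mics)) ** 2)

scan = np.linspace(-2.0, 3.0, 51)
print(scan[int(np.argmax([steered_power(x) for x in scan]))])   # peaks near 0.5
```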

  3. Synthesis of porous NiO/CeO2 hybrid nanoflake arrays as a platform for electrochemical biosensing

    NASA Astrophysics Data System (ADS)

    Cui, Jiewu; Luo, Jinbao; Peng, Bangguo; Zhang, Xinyi; Zhang, Yong; Wang, Yan; Qin, Yongqiang; Zheng, Hongmei; Shu, Xia; Wu, Yucheng

    2015-12-01

    Porous NiO/CeO2 hybrid nanoflake arrays fabricated by a facile hydrothermal method were employed as substrates for electrochemical biosensors. The resulting NiO/CeO2 hybrid nanoflake arrays with a large specific surface area and good biocompatibility presented an excellent platform for electrochemical biosensing.

  4. Workplace Drug Testing and Worker Drug Use

    PubMed Central

    Carpenter, Christopher S

    2007-01-01

    Objective To examine the nature and extent of the association between workplace drug testing and worker drug use. Data Sources Repeated cross-sections from the 2000 to 2001 National Household Surveys on Drug Abuse (NHSDA) and the 2002 National Survey on Drug Use and Health (NSDUH). Study Design Multivariate logistic regression models of the likelihood of marijuana use are estimated as a function of several different workplace drug policies, including drug testing. Specific questions about penalty severity and the likelihood of detection are used to further evaluate the nature of the association. Principal Findings Individuals whose employers perform drug tests are significantly less likely to report past month marijuana use, even after controlling for a wide array of worker and job characteristics. However, large negative associations are also found for variables indicating whether a firm has drug education, an employee assistance program, or a simple written policy about substance use. Accounting for these other workplace characteristics reduces—but does not eliminate—the testing differential. Frequent testing and severe penalties reduce the likelihood that workers use marijuana. Conclusions Previous studies have interpreted the large negative correlation between workplace drug testing and employee substance use as representing a causal deterrent effect of drug testing. Our results using more comprehensive data suggest that these estimates have been slightly overstated due to omitted variables bias. The overall pattern of results remains largely consistent with the hypothesis that workplace drug testing deters worker drug use. PMID:17362218

  5. RAID Disk Arrays for High Bandwidth Applications

    NASA Technical Reports Server (NTRS)

    Moren, Bill

    1996-01-01

    High bandwidth applications require large amounts of data transferred to/from storage devices at extremely high data rates. Further, these applications are often 'real time', in which access to the storage device must take place on the schedule of the data source, not the storage. A good example is a satellite downlink: the volume of data is quite large and the data rates quite high (dozens of MB/sec). Further, a telemetry downlink must take place while the satellite is overhead. A storage technology which is ideally suited to these types of applications is redundant arrays of independent discs (RAID). RAID storage technology, while offering differing methodologies for a variety of applications, supports the performance and redundancy required in real-time applications. Of the various RAID levels, RAID-3 is the only one which provides high data transfer rates under all operating conditions, including after a drive failure.
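
    RAID-3's dedicated-parity scheme is simple enough to sketch: data are striped across drives with one parity drive holding their XOR, so any single failed drive can be rebuilt from the survivors. Drive count and block size below are illustrative.

```python
# Sketch: XOR parity and single-drive reconstruction, RAID-3 style.
import numpy as np

rng = np.random.default_rng(5)
drives = [rng.integers(0, 256, 64, dtype=np.uint8) for _ in range(4)]  # data drives
parity = drives[0] ^ drives[1] ^ drives[2] ^ drives[3]                 # parity drive

failed = 2                                   # pretend drive 2 died mid-stream
survivors = [d for i, d in enumerate(drives) if i != failed] + [parity]
rebuilt = survivors[0].copy()
for d in survivors[1:]:
    rebuilt ^= d                             # XOR of survivors restores the data

assert np.array_equal(rebuilt, drives[failed])
print("drive", failed, "reconstructed")
```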

  6. Blending of phased array data

    NASA Astrophysics Data System (ADS)

    Duijster, Arno; van Groenestijn, Gert-Jan; van Neer, Paul; Blacquière, Gerrit; Volker, Arno

    2018-04-01

    The use of phased arrays is growing in the non-destructive testing industry and the trend is towards large 2D arrays, but due to limitations it is currently not possible to record the signals from all elements, resulting in aliased data. In the past, we have presented a data interpolation scheme "beyond spatial aliasing" to overcome this aliasing. In this paper, we present a different approach: blending and deblending of data. On the hardware side, groups of receivers are blended (grouped) in only a few transmit/recording channels. This allows for transmission and recording with all elements, in a shorter acquisition time and with fewer channels. On the data processing side, this blended data is deblended (separated) by transforming it to a different domain and applying an iterative filtering and thresholding. Two different filtering methods are compared: f-k filtering and wavefield extrapolation filtering. The deblending and filtering methods are demonstrated on simulated experimental data. The wavefield extrapolation filtering proves to outperform f-k filtering. The wavefield extrapolation method can deal with groups of up to 24 receivers, in a phased array of 48 × 48 elements.
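
    The iterative filter-and-threshold loop described can be sketched in a toy f-k version: repeatedly transform the residual, keep only the strongest (coherent) components, and accumulate them into the signal estimate. The synthetic dipping event and the threshold schedule are assumptions, not the paper's scheme.

```python
# Toy f-k deblending: iterative hard thresholding of the residual's 2D FFT.
import numpy as np

rng = np.random.default_rng(6)
nt, nx = 256, 48
clean = np.zeros((nt, nx))
for x in range(nx):
    clean[40 + x, x] = 1.0                          # one dipping linear event
blended = clean + rng.normal(0.0, 0.05, (nt, nx))   # incoherent noise stand-in

est = np.zeros_like(blended)
fk0 = np.abs(np.fft.fft2(blended)).max()            # initial threshold reference
for frac in np.linspace(0.8, 0.4, 6):               # shrinking hard threshold
    fk = np.fft.fft2(blended - est)                 # transform the residual
    fk[np.abs(fk) < frac * fk0] = 0.0               # keep only coherent energy
    est += np.real(np.fft.ifft2(fk))

print(np.corrcoef(est.ravel(), clean.ravel())[0, 1])  # similarity to the event
```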

  7. Array data extractor (ADE): a LabVIEW program to extract and merge gene array data.

    PubMed

    Kurtenbach, Stefan; Kurtenbach, Sarah; Zoidl, Georg

    2013-12-01

    Large data sets from gene expression array studies are publicly available, offering information highly valuable for research across many disciplines ranging from fundamental to clinical research. Highly advanced bioinformatics tools have been made available to researchers, but a demand persists for user-friendly software allowing researchers to quickly extract expression information for multiple genes from multiple studies. Here, we present a user-friendly LabVIEW program to automatically extract gene expression data for a list of genes from multiple normalized microarray datasets. Functionality was tested for 288 class A G protein-coupled receptors (GPCRs) and expression data from 12 studies comparing normal and diseased human hearts. Results confirmed known regulation of a beta 1 adrenergic receptor and further indicate novel research targets. Although existing software allows for complex data analyses, the LabVIEW-based program presented here, "Array Data Extractor (ADE)", provides users with a tool to retrieve meaningful information from multiple normalized gene expression datasets in a fast and easy way. Further, the graphical programming language used in LabVIEW allows changes to be applied to the program without the need for advanced programming knowledge.
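
    The extraction-and-merge task that ADE automates can be sketched with pandas rather than LabVIEW; the in-memory tables stand in for normalized microarray datasets, and the gene list and column names are illustrative.

```python
# Sketch: pull a gene list out of several normalized tables and merge them.
import pandas as pd

genes = ["ADRB1", "GPR22", "OPN1SW"]                     # genes of interest

study1 = pd.DataFrame({"ADRB1": [2.1], "GPR22": [0.7], "ACTB": [9.9]},
                      index=["log2_expression"]).T       # gene-indexed table
study2 = pd.DataFrame({"ADRB1": [1.4], "OPN1SW": [0.2]},
                      index=["log2_expression"]).T

merged = None
for name, df in {"study1": study1, "study2": study2}.items():
    hits = df.loc[df.index.intersection(genes)].rename(
        columns={"log2_expression": name})
    merged = hits if merged is None else merged.join(hits, how="outer")

print(merged)       # one row per requested gene, one column per study
```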

  8. Anomalous Arms

    NASA Image and Video Library

    2007-12-18

    This composite image is of spiral galaxy M106 (NGC 4258); optical data from the Digitized Sky Survey is yellow, radio data from the Very Large Array is purple, X-ray data from Chandra is blue, and infrared data from the Spitzer Space Telescope is red.

  9. Seismic Strong Motion Array Project (SSMAP) to Record Future Large Earthquakes in the Nicoya Peninsula area, Costa Rica

    NASA Astrophysics Data System (ADS)

    Simila, G.; McNally, K.; Quintero, R.; Segura, J.

    2006-12-01

    The Seismic Strong Motion Array Project (SSMAP) for the Nicoya Peninsula in northwestern Costa Rica is composed of 10-13 sites including Geotech A900/A800 accelerographs (three-component), Ref-Teks (three-component velocity), and Kinemetrics EpiSensors. The main objectives of the array are to: 1) record and locate strong subduction zone mainshocks [and foreshocks, "early aftershocks", and preshocks] in the Nicoya Peninsula, at the entrance of the Nicoya Gulf, and in the Papagayo Gulf regions of Costa Rica, and 2) record and locate any moderate to strong upper plate earthquakes triggered by a large subduction zone earthquake in the above regions. Our digital accelerograph array has been deployed as part of our ongoing research on large earthquakes in conjunction with the Earthquake and Volcano Observatory (OVSICORI) at the Universidad Nacional in Costa Rica. The country-wide seismographic network has been operating continuously since the 1980s, with the first earthquake bulletin published more than 20 years ago, in 1984. The recording of seismicity and strong motion data for large earthquakes along the Middle America Trench (MAT) has been a major research priority over these years, and this network spans nearly half the time of a "repeat cycle" (50 years) for large (Ms 7.5-7.7) earthquakes beneath the Nicoya Peninsula, with the last event in 1950. Our long-time collaborators include the seismology group at OVSICORI, with coordination for this project by Dr. Ronnie Quintero and Mr. Juan Segura. Numerous international investigators are also studying this region with GPS and seismic stations (US, Japan, Germany, Switzerland, etc.). There are also various strong motion instruments operated by local engineers for building purposes, mainly concentrated in the population centers of the Central Valley. The major goal of our project is to contribute unique scientific information pertaining to a large subduction zone earthquake and its related seismic activity when the next large earthquake occurs in Nicoya. A centralized database will be created within the main seismic network files at OVSICORI, with local personnel working in teams responsible for collecting data within 3 days following a large mainshock.

  10. The Australian SKA Pathfinder: project update and initial operations

    NASA Astrophysics Data System (ADS)

    Schinckel, Antony E. T.; Bock, Douglas C.-J.

    2016-08-01

    The Australian Square Kilometre Array Pathfinder (ASKAP) will be the fastest dedicated cm-wave survey telescope, and will consist of 36 12-meter 3-axis antennas, each with a large chequerboard phased array feed (PAF) receiver operating between 0.7 and 1.8 GHz, and digital beamforming prior to correlation. The large raw data rates involved (~100 Tb/sec), and the need to do pipeline processing, have led to the antenna incorporating a third axis to fix the parallactic angle with respect to the entire optical system (blockages and phased array feed). It also results in innovative technical solutions to the data transport and processing issues. ASKAP is located at the Murchison Radio-astronomy Observatory (MRO), a new observatory developed for the Square Kilometre Array (SKA), 315 kilometres north-east of Geraldton, Western Australia. The MRO also hosts the SKA low frequency pathfinder instrument, the Murchison Widefield Array, and will host the initial low frequency instrument of the SKA, SKA1-Low. Commissioning of ASKAP using six antennas equipped with first-generation PAFs is now complete and installation of second-generation PAFs and digital systems is underway. In this paper we review technical progress and commissioning to date, and refer the reader to relevant technical and scientific publications.

  11. Analytical Model for Mean Flow and Fluxes of Momentum and Energy in Very Large Wind Farms

    NASA Astrophysics Data System (ADS)

    Markfort, Corey D.; Zhang, Wei; Porté-Agel, Fernando

    2018-01-01

    As wind-turbine arrays continue to be installed and the array size continues to grow, there is an increasing need to represent very large wind-turbine arrays in numerical weather prediction models, for wind-farm optimization, and for environmental assessment. We propose a simple analytical model for boundary-layer flow in fully-developed wind-turbine arrays, based on the concept of sparsely-obstructed shear flows. In describing the vertical distribution of the mean wind speed and shear stress within wind farms, our model estimates the mean kinetic energy harvested from the atmospheric boundary layer, and determines the partitioning between the wind power captured by the wind turbines and that absorbed by the underlying land or water. A length scale based on the turbine geometry, spacing, and performance characteristics is able to estimate the asymptotic limit for the fully-developed flow through wind-turbine arrays, and thereby determine whether the wind-farm flow is fully developed for very large turbine arrays. Our model is validated using data collected in controlled wind-tunnel experiments, and its usefulness for the prediction of wind-farm performance and the optimization of turbine-array spacing is described. Our model may also be useful for assessing the extent to which the extraction of wind power affects the land-atmosphere coupling or air-water exchange of momentum, with implications for the transport of heat, moisture, trace gases such as carbon dioxide, methane, and nitrous oxide, and ecologically important oxygen.

  12. Global Analysis of Perovskite Photophysics Reveals Importance of Geminate Pathways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manger, Lydia H.; Rowley, Matthew B.; Fu, Yongping

    Hybrid organic-inorganic perovskites demonstrate desirable photophysical behaviors and promising applications from efficient photovoltaics to lasing, but the fundamental nature of excited state species is still under debate. We also collected time-resolved photoluminescence of single-crystal nanoplates of methylammonium lead iodide perovskite (MAPbI3), with excitation over a range of fluences and repetition rates, to provide a more complete photophysical picture. A fundamentally different way of simulating the photophysics is developed that relies on unnormalized decays, global analysis over a large array of conditions, and inclusion of steady-state behavior; these details are critical to capturing observed behaviors. These additional constraints require inclusion of spatially-correlated pairs, along with free carriers and traps, demonstrating the importance of our comprehensive analysis. Modeling geminate and non-geminate pathways shows geminate processes are dominant at high carrier densities and early times. This combination of data and simulation provides a detailed picture of perovskite photophysics across multiple excitation regimes that was not previously available.
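
    The rate-equation style of modeling described here can be sketched generically; the two-population model below (free carriers with bimolecular recombination, plus trapping and detrapping) and all rate constants are assumptions for illustration, not the authors' scheme.

```python
# Sketch: a generic carrier-trap rate-equation model integrated over a decay.
import numpy as np
from scipy.integrate import solve_ivp

def rates(t, y, k_bi=1e-10, k_trap=1e7, k_detrap=1e5):
    """y = (free carriers n [cm^-3], trapped carriers n_t [cm^-3])."""
    n, n_t = y
    return [-k_bi * n * n - k_trap * n + k_detrap * n_t,
            k_trap * n - k_detrap * n_t]

sol = solve_ivp(rates, (0.0, 1e-6), [1e16, 0.0], method="LSODA",
                t_eval=np.logspace(-10, -6, 200))
pl = 1e-10 * sol.y[0] ** 2          # PL proportional to the bimolecular rate
print(pl[:5])                       # simulated, unnormalized decay samples
```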

  14. Precision molding of advanced glass optics: innovative production technology for lens arrays and free form optics

    NASA Astrophysics Data System (ADS)

    Pongs, Guido; Bresseler, Bernd; Bergs, Thomas; Menke, Gert

    2012-10-01

    Today, isothermal precision molding of imaging glass optics has become a widely applied and integrated production technology in the optical industry. Especially in consumer electronics (e.g. digital cameras, mobile phones, Blu-ray), many optical systems contain rotationally symmetric aspherical lenses produced by precision glass molding. However, due to higher demands on the complexity and miniaturization of optical elements, the established process chain for precision glass molding is no longer sufficient. Wafer-based molding processes for glass optics manufacturing are becoming more and more interesting for mobile phone applications. Cylindrical lens arrays can also be used in high-power laser systems, and the use of asymmetric free-form optics allows an increase in the efficiency of optical laser systems. Aixtooling is working on different aspects of mold manufacturing technologies and molding processes for highly complex optical components. In terms of array molding technologies, Aixtooling has developed a manufacturing technology for the ultra-precision machining of carbide molds together with European partners. The development covers the machining of multi-lens arrays as well as cylindrical lens arrays. The biggest challenge is the molding of complex free-form optics having no axis of symmetry. Comprehensive CAD/CAM data management along the entire process chain is essential to reach high accuracies on the molded lenses. Within a nationally funded project, Aixtooling is working on a consistent data handling procedure in the process chain for the precision molding of free-form optics.

  15. Strain Library Imaging Protocol for high-throughput, automated single-cell microscopy of large bacterial collections arrayed on multiwell plates.

    PubMed

    Shi, Handuo; Colavin, Alexandre; Lee, Timothy K; Huang, Kerwyn Casey

    2017-02-01

    Single-cell microscopy is a powerful tool for studying gene functions using strain libraries, but it suffers from throughput limitations. Here we describe the Strain Library Imaging Protocol (SLIP), which is a high-throughput, automated microscopy workflow for large strain collections that requires minimal user involvement. SLIP involves transferring arrayed bacterial cultures from multiwell plates onto large agar pads using inexpensive replicator pins and automatically imaging the resulting single cells. The acquired images are subsequently reviewed and analyzed by custom MATLAB scripts that segment single-cell contours and extract quantitative metrics. SLIP yields rich data sets on cell morphology and gene expression that illustrate the function of certain genes and the connections among strains in a library. For a library arrayed on 96-well plates, image acquisition can be completed within 4 min per plate.
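
    The per-image analysis step (segment single-cell contours, extract quantitative metrics) can be sketched with scikit-image in place of the authors' MATLAB scripts; the synthetic frame stands in for a microscopy image.

```python
# Sketch: threshold, label, and measure cell-like regions in one frame.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

rng = np.random.default_rng(7)
frame = rng.normal(0.1, 0.02, (256, 256))     # background with camera noise
frame[100:110, 50:90] += 0.5                  # a bright rod-shaped "cell"
frame[200:208, 150:170] += 0.5                # a second cell

mask = frame > threshold_otsu(frame)          # global intensity threshold
for region in regionprops(label(mask)):       # per-cell shape metrics
    print(f"cell: area={region.area} px, "
          f"length={region.major_axis_length:.1f}, "
          f"width={region.minor_axis_length:.1f}")
```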

  16. Microarray R-based analysis of complex lysate experiments with MIRACLE

    PubMed Central

    List, Markus; Block, Ines; Pedersen, Marlene Lemvig; Christiansen, Helle; Schmidt, Steffen; Thomassen, Mads; Tan, Qihua; Baumbach, Jan; Mollenhauer, Jan

    2014-01-01

    Motivation: Reverse-phase protein arrays (RPPAs) allow sensitive quantification of relative protein abundance in thousands of samples in parallel. Typical challenges involved in this technology are antibody selection, sample preparation and optimization of staining conditions. The issue of combining effective sample management and data analysis, however, has been widely neglected. Results: This motivated us to develop MIRACLE, a comprehensive and user-friendly web application bridging the gap between spotting and array analysis by conveniently keeping track of sample information. Data processing includes correction of staining bias, estimation of protein concentration from response curves, normalization for total protein amount per sample and statistical evaluation. Established analysis methods have been integrated with MIRACLE, offering experimental scientists an end-to-end solution for sample management and for carrying out data analysis. In addition, experienced users have the possibility to export data to R for more complex analyses. MIRACLE thus has the potential to further spread utilization of RPPAs as an emerging technology for high-throughput protein analysis. Availability: Project URL: http://www.nanocan.org/miracle/ Contact: mlist@health.sdu.dk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25161257

  17. Microarray R-based analysis of complex lysate experiments with MIRACLE.

    PubMed

    List, Markus; Block, Ines; Pedersen, Marlene Lemvig; Christiansen, Helle; Schmidt, Steffen; Thomassen, Mads; Tan, Qihua; Baumbach, Jan; Mollenhauer, Jan

    2014-09-01

    Reverse-phase protein arrays (RPPAs) allow sensitive quantification of relative protein abundance in thousands of samples in parallel. Typical challenges involved in this technology are antibody selection, sample preparation and optimization of staining conditions. The issue of combining effective sample management and data analysis, however, has been widely neglected. This motivated us to develop MIRACLE, a comprehensive and user-friendly web application bridging the gap between spotting and array analysis by conveniently keeping track of sample information. Data processing includes correction of staining bias, estimation of protein concentration from response curves, normalization for total protein amount per sample and statistical evaluation. Established analysis methods have been integrated with MIRACLE, offering experimental scientists an end-to-end solution for sample management and for carrying out data analysis. In addition, experienced users have the possibility to export data to R for more complex analyses. MIRACLE thus has the potential to further spread utilization of RPPAs as an emerging technology for high-throughput protein analysis. Project URL: http://www.nanocan.org/miracle/. Supplementary data are available at Bioinformatics online.

  18. Breadboard linear array scan imager using LSI solid-state technology

    NASA Technical Reports Server (NTRS)

    Tracy, R. A.; Brennan, J. A.; Frankel, D. G.; Noll, R. E.

    1976-01-01

    The performance of large scale integration photodiode arrays in a linear array scan (pushbroom) breadboard was evaluated for application to multispectral remote sensing of the earth's resources. The technical approach, implementation, and test results of the program are described. Several self scanned linear array visible photodetector focal plane arrays were fabricated and evaluated in an optical bench configuration. A 1728-detector array operating in four bands (0.5 - 1.1 micrometer) was evaluated for noise, spectral response, dynamic range, crosstalk, MTF, noise equivalent irradiance, linearity, and image quality. Other results include image artifact data, temporal characteristics, radiometric accuracy, calibration experience, chip alignment, and array fabrication experience. Special studies and experimentation were included in long array fabrication and real-time image processing for low-cost ground stations, including the use of computer image processing. High quality images were produced and all objectives of the program were attained.

  19. A Compact VLSI System for Bio-Inspired Visual Motion Estimation.

    PubMed

    Shi, Cong; Luo, Gang

    2018-04-01

    This paper proposes a bio-inspired visual motion estimation algorithm based on motion energy, along with its compact very-large-scale integration (VLSI) architecture using low-cost embedded systems. The algorithm mimics motion perception functions of retina, V1, and MT neurons in a primate visual system. It involves operations of ternary edge extraction, spatiotemporal filtering, motion energy extraction, and velocity integration. Moreover, we propose the concept of confidence map to indicate the reliability of estimation results on each probing location. Our algorithm involves only additions and multiplications during runtime, which is suitable for low-cost hardware implementation. The proposed VLSI architecture employs multiple (frame, pixel, and operation) levels of pipeline and massively parallel processing arrays to boost the system performance. The array unit circuits are optimized to minimize hardware resource consumption. We have prototyped the proposed architecture on a low-cost field-programmable gate array platform (Zynq 7020) running at 53-MHz clock frequency. It achieved 30-frame/s real-time performance for velocity estimation on 160 × 120 probing locations. A comprehensive evaluation experiment showed that the estimated velocity by our prototype has relatively small errors (average endpoint error < 0.5 pixel and angular error < 10°) for most motion cases.

  20. Characterization testing of MEASAT GaAs/Ge solar cell assemblies

    NASA Technical Reports Server (NTRS)

    Brown, Mike R.; Garcia, Curtis A.; Goodelle, George S.; Powe, Joseph S.; Schwartz, Joel A.

    1996-01-01

    The first commercial communications satellite with gallium-arsenide on germanium (GaAs/Ge) solar arrays is scheduled for launch in December 1995. The spacecraft, named MEASAT, was built by Hughes Space and Communications Company. The solar cell assemblies consisted of large area GaAs/Ge cells supplied by Spectrolab Inc. with infrared reflecting (IRR) coverglass supplied by Pilkington Space Technology. A comprehensive characterization program was performed on the GaAs/Ge solar cell assemblies used on the MEASAT array. This program served two functions; first to establish the database needed to accurately predict on-orbit performance under a variety of conditions; and second, to demonstrate the ability of the solar cell assemblies to withstand all mission environments while still providing the required power at end-of-life. Characterization testing included measurement of electrical performance parameters as a function of radiation exposure, temperature, and angle of incident light; reverse bias stability; optical and thermal properties; mechanical strength tests, panel fabrication, humidity and thermal cycling environmental tests. The results provided a complete database enabling the design of the MEASAT solar array, and demonstrated that the GaAs/Ge cells meet the spacecraft requirements at end-of-life.

  1. Characterization testing of MEASAT GaAs/Ge solar cell assemblies

    NASA Technical Reports Server (NTRS)

    Brown, Mike R.; Garcia, Curtis A.; Goodelle, George S.; Powe, Joseph S.; Schwartz, Joel A.

    1995-01-01

    The first commercial communications satellite with gallium-arsenide on germanium (GaAs/Ge) solar arrays is scheduled for launch in December 1995. The spacecraft, named MEASAT, was built by Hughes Space and Telecommunications Company for Binariang Satellite Systems of Malaysia. The solar cell assemblies consisted of large area GaAs/Ge cells supplied by Spectrolab Inc. with infrared reflecting (IRR) coverglass supplied by Pilkington Space Technology. A comprehensive characterization program was performed on the GaAs/Ge solar cell assemblies used on the MEASAT array. This program served two functions: first, to establish the database needed to accurately predict on-orbit performance under a variety of conditions; and second, to demonstrate the ability of the solar cell assemblies to withstand all mission environments while still providing the required power at end-of-life. Characterization testing included measurement of electrical performance parameters as a function of radiation exposure, temperature, and angle of incident light; reverse bias stability; optical and thermal properties; mechanical strength tests; panel fabrication; and humidity and thermal cycling environmental tests. The results provided a complete database enabling the design of the MEASAT solar array, and demonstrated that the GaAs/Ge cells meet the spacecraft requirements at end-of-life.

  2. Characterization of Human Cancer Cell Lines by Reverse-phase Protein Arrays | Office of Cancer Genomics

    Cancer.gov

    Cancer cell lines are major model systems for mechanistic investigation and drug development. However, protein expression data linked to high-quality DNA, RNA, and drug-screening data have not been available across a large number of cancer cell lines. Using reverse-phase protein arrays, we measured expression levels of ∼230 key cancer-related proteins in >650 independent cell lines, many of which have publicly available genomic, transcriptomic, and drug-screening data.

  3. Systolic Processor Array For Recognition Of Spectra

    NASA Technical Reports Server (NTRS)

    Chow, Edward T.; Peterson, John C.

    1995-01-01

    Spectral signatures of materials are detected and identified quickly. The Spectral Analysis Systolic Processor Array (SPA2) is relatively inexpensive and satisfies the need to analyze the large, complex volume of multispectral data generated by imaging spectrometers to extract desired information; the computational performance needed to do this in real time exceeds that of current supercomputers. It locates highly similar segments or contiguous subsegments in two different spectra at a time. It compares sampled spectra from instruments with a database of spectral signatures of known materials, and computes and reports scores that express degrees of similarity between the sampled and database spectra.
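
    The scoring that the systolic array parallelizes can be shown serially: correlate a sampled spectrum against a library of signatures and report a similarity score per entry. The library spectra below are synthetic stand-ins, not a real signature database.

```python
# Sketch: cosine-similarity scores between a sampled spectrum and a library.
import numpy as np

def similarity(a, b):
    """Cosine similarity between two mean-removed spectra."""
    a, b = a - a.mean(), b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

bands = np.linspace(0.4, 2.5, 64)                       # wavelength channels (um)
library = {
    "mineral-like": np.exp(-((bands - 2.2) / 0.1) ** 2),
    "vegetation-like": 1.0 / (1.0 + np.exp(-(bands - 0.7) * 30.0)),
}
sampled = library["vegetation-like"] + np.random.default_rng(8).normal(0, 0.05, 64)
scores = {name: similarity(sampled, ref) for name, ref in library.items()}
print(max(scores, key=scores.get), scores)              # best match reported
```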

  4. An Array of Optical Receivers for Deep-Space Communications

    NASA Technical Reports Server (NTRS)

    Vilnrotter, Chi-Wung; Srinivasan, Meera; Andrews, Kenneth

    2007-01-01

    An array of small optical receivers is proposed as an alternative to a single large optical receiver for high-data-rate communications in NASA's Deep Space Network (DSN). Because the telescope for a single receiver capable of satisfying DSN requirements must be greater than 10 m in diameter, the design, building, and testing of the telescope would be very difficult and expensive. The proposed array would utilize commercially available telescopes of 1-m or smaller diameter and, therefore, could be developed and verified with considerably less difficulty and expense. The essential difference between a single-aperture optical-communications receiver and an optical-array receiver is that a single-aperture receiver focuses all of the light energy it collects onto the surface of an optical detector, whereas an array receiver focuses portions of the total collected energy onto separate detectors, optically detects each fractional energy component, then combines the electrical signals from the array of detector outputs to form the observable, or "decision statistic," used to decode the transmitted data. A conceptual block diagram identifying the key components of the optical-array receiver suitable for deep-space telemetry reception is shown in the figure. The most conspicuous feature of the receiver is the large number of small- to medium-size telescopes, with individual apertures and the number of telescopes selected to make up the desired total collecting area. This array of telescopes is envisioned to be fully computer-controlled via the user interface and prediction-driven to achieve rough pointing and tracking of the desired spacecraft. Fine-pointing and tracking functions then take over to keep each telescope pointed toward the source, despite imperfect pointing predictions, telescope-drive errors, and vibration caused by wind.

  5. JPL Big Data Technologies for Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Jones, Dayton L.; D'Addario, L. R.; De Jong, E. M.; Mattmann, C. A.; Rebbapragada, U. D.; Thompson, D. R.; Wagstaff, K.

    2014-04-01

    During the past three years the Jet Propulsion Laboratory has been working on several technologies to deal with big data challenges facing next-generation radio arrays, among other applications. This program has focused on the following four areas: 1) We are investigating high-level ASIC architectures that reduce power consumption for cross-correlation of data from large interferometer arrays by one to two orders of magnitude. The cost of operations for the Square Kilometre Array (SKA), which may be dominated by the cost of power for data processing, is a serious concern. A large improvement in correlator power efficiency could have a major positive impact. 2) Data-adaptive algorithms (machine learning) for real-time detection and classification of fast transient signals in high volume data streams are being developed and demonstrated. Studies of the dynamic universe, particularly searches for fast (<< 1 second) transient events, require that data be analyzed rapidly and with robust RFI rejection. JPL, in collaboration with the International Center for Radio Astronomy Research in Australia, has developed a fast transient search system for eventual deployment on ASKAP. In addition, a real-time transient detection experiment is now running continuously and commensally on NRAO's Very Long Baseline Array. 3) Scalable frameworks for data archiving, mining, and distribution are being applied to radio astronomy. A set of powerful open-source Object Oriented Data Technology (OODT) tools is now available through Apache. OODT was developed at JPL for Earth science data archives, but it is proving to be useful for radio astronomy, planetary science, health care, Earth climate, and other large-scale archives. 4) We are creating automated, event-driven data visualization tools that can be used to extract information from a wide range of complex data sets. Visualization of complex data can be improved through algorithms that detect events or features of interest and autonomously generate images or video to display those features. This work has been carried out at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

  6. A Versatile Multichannel Digital Signal Processing Module for Microcalorimeter Arrays

    NASA Astrophysics Data System (ADS)

    Tan, H.; Collins, J. W.; Walby, M.; Hennig, W.; Warburton, W. K.; Grudberg, P.

    2012-06-01

    Different techniques have been developed for reading out microcalorimeter sensor arrays: individual outputs for small arrays, and time-division, frequency-division or code-division multiplexing for large arrays. Typically, raw waveform data are first read out from the arrays using one of these techniques and then stored on computer hard drives for offline optimum filtering, leading not only to requirements for large storage space but also to limitations on the achievable count rate. A read-out module capable of processing microcalorimeter signals in real time is therefore highly desirable. We have developed multichannel digital signal processing electronics that are capable of on-board, real-time processing of microcalorimeter sensor signals from multiplexed or individual pixel arrays. It is a 3U PXI module consisting of a standardized core processor board and a set of daughter boards. Each daughter board is designed to interface a specific type of microcalorimeter array to the core processor. The combination of the standardized core plus this set of easily designed and modified daughter boards results in a versatile data acquisition module that not only can easily expand to future detector systems, but is also low cost. In this paper, we first present the core processor/daughter board architecture, and then report the performance of an 8-channel daughter board, which digitizes individual pixel outputs at 1 MSPS with 16-bit precision. We also introduce a time-division multiplexing daughter board, which takes in time-division multiplexed signals through fiber-optic cables and processes the digital signals to generate energy spectra in real time.
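
    The on-board processing replaces offline optimum filtering of stored waveforms. A minimal sketch of that idea, assuming a synthetic pulse shape and white noise (for which the optimum filter reduces to a least-squares template fit); the real module's filter and pulse model will differ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative microcalorimeter pulse: fast rise, exponential decay, white noise.
n = 1024
t = np.arange(n)
template = np.exp(-t / 200.0) - np.exp(-t / 20.0)
template /= template.max()

true_energy = 5.9                        # arbitrary units (e.g. keV)
waveform = true_energy * template + rng.normal(0.0, 0.05, n)

# For white noise, optimum filtering reduces to a least-squares fit of the
# template amplitude; done on-board, only the energy needs to be stored.
energy_estimate = waveform @ template / (template @ template)
print(round(float(energy_estimate), 3))  # close to true_energy
```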

  7. SNPchiMp v.3: integrating and standardizing single nucleotide polymorphism data for livestock species.

    PubMed

    Nicolazzi, Ezequiel L; Caprera, Andrea; Nazzicari, Nelson; Cozzi, Paolo; Strozzi, Francesco; Lawley, Cindy; Pirani, Ali; Soans, Chandrasen; Brew, Fiona; Jorjani, Hossein; Evans, Gary; Simpson, Barry; Tosser-Klopp, Gwenola; Brauning, Rudiger; Williams, John L; Stella, Alessandra

    2015-04-10

    In recent years, the use of genomic information in livestock species for genetic improvement, association studies and many other fields has become routine. In order to accommodate different market requirements in terms of genotyping cost, manufacturers of single nucleotide polymorphism (SNP) arrays, private companies and international consortia have developed a large number of arrays with different content and different SNP density. The number of currently available SNP arrays differs among species, ranging from one for goats to more than ten for cattle, and the number of arrays available is increasing rapidly. However, there has been limited or no effort to standardize and integrate array-specific (e.g. SNP IDs, allele coding) and species-specific (i.e. past and current assemblies) SNP information. Here we present SNPchiMp v.3, a solution to these issues for the six major livestock species (cow, pig, horse, sheep, goat and chicken). Original data were collected directly from SNP array producers and specific international genome consortia, and stored in a MySQL database. The database was then linked to an open-access web tool and to public databases. SNPchiMp v.3 ensures fast access to the database (retrieving within/across SNP array data) and the possibility of annotating SNP array data in a user-friendly fashion. This platform allows easy integration and standardization, and it is aimed at both industry and research. It also enables users to easily link the information available from the array producer with data in public databases, without the need for additional bioinformatics tools or pipelines. In recognition of the open-access use of Ensembl resources, SNPchiMp v.3 was officially credited as an Ensembl E!mpowered tool. Available at http://bioinformatics.tecnoparco.org/SNPchimp.
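
    A minimal sketch of the kind of cross-array lookup such a database enables, using Python's built-in sqlite3 in place of MySQL; the table schema, array names and SNP IDs below are hypothetical, not SNPchiMp's actual schema.

```python
import sqlite3

# Hypothetical stand-in for a cross-array SNP lookup.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE snp (
    snp_id TEXT, array_name TEXT, chromosome TEXT, position INTEGER)""")
con.executemany("INSERT INTO snp VALUES (?, ?, ?, ?)", [
    ("snp_0001", "BovineSNP50", "5", 98985971),
    ("snp_0001", "BovineHD",    "5", 98985971),
    ("snp_0002", "BovineSNP50", "7", 1234567),
])

# Retrieve SNPs shared by two arrays, with their assembly coordinates.
rows = con.execute("""
    SELECT snp_id, chromosome, position FROM snp
    WHERE array_name IN ('BovineSNP50', 'BovineHD')
    GROUP BY snp_id HAVING COUNT(DISTINCT array_name) = 2
""").fetchall()
print(rows)    # only snp_0001 appears on both arrays
```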

  8. Analysis of host response to bacterial infection using error model based gene expression microarray experiments

    PubMed Central

    Stekel, Dov J.; Sarti, Donatella; Trevino, Victor; Zhang, Lihong; Salmon, Mike; Buckley, Chris D.; Stevens, Mark; Pallen, Mark J.; Penn, Charles; Falciani, Francesco

    2005-01-01

    A key step in the analysis of microarray data is the selection of genes that are differentially expressed. Ideally, such experiments should be properly replicated in order to infer both technical and biological variability, and the data should be subjected to rigorous hypothesis tests to identify the differentially expressed genes. However, in microarray experiments involving the analysis of very large numbers of biological samples, replication is not always practical. Therefore, there is a need for a method to select differentially expressed genes in a rational way from insufficiently replicated data. In this paper, we describe a simple method that uses bootstrapping to generate an error model from a replicated pilot study that can be used to identify differentially expressed genes in subsequent large-scale studies on the same platform, but in which there may be no replicated arrays. The method builds a stratified error model that includes array-to-array variability, feature-to-feature variability and the dependence of error on signal intensity. We apply this model to the characterization of the host response in a model of bacterial infection of human intestinal epithelial cells. We demonstrate the effectiveness of error model based microarray experiments and propose this as a general strategy for a microarray-based screening of large collections of biological samples. PMID:15800204
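
    The stratified-bootstrap idea can be sketched numerically. The following is a minimal illustration with synthetic pilot data and an invented intensity-dependent noise model; it is not the authors' exact error model or thresholds.

```python
import numpy as np

rng = np.random.default_rng(2)

# Pilot study: 4 replicate arrays x 1000 genes of log-intensities, with
# noise that grows at low intensity (a common microarray error structure).
genes, reps = 1000, 4
mean_intensity = rng.uniform(4, 14, genes)
noise_sd = 0.1 + 1.5 / mean_intensity
pilot = mean_intensity[:, None] + rng.normal(0, noise_sd[:, None], (genes, reps))

# Stratify genes by intensity and bootstrap within-stratum replicate
# differences to estimate the null spread of log-ratios per stratum.
strata = np.digitize(mean_intensity, np.quantile(mean_intensity, [0.25, 0.5, 0.75]))
null_sd = np.empty(4)
for s in range(4):
    diffs = np.diff(pilot[strata == s], axis=1).ravel()
    boot = rng.choice(diffs, size=2000, replace=True)
    null_sd[s] = boot.std()

# Later, an unreplicated two-condition comparison is scored against the
# error model: flag genes whose log-ratio exceeds 3 null standard deviations.
log_ratio = rng.normal(0, np.sqrt(2) * noise_sd)   # unreplicated study (null here)
flagged = np.abs(log_ratio) > 3 * null_sd[strata]
print(flagged.sum(), "genes flagged")
```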

  9. The Evolving Polarized Jet of Black Hole Candidate Swift J1745-26

    NASA Technical Reports Server (NTRS)

    Curran, P. A.; Coriat, M.; Miller-Jones, J. C. A.; Armstrong, R. P.; Edwards, P. G.; Sivakoff, G. R.; Woudt, P.; Altamirano, D.; Belloni, T. M.; Corbel, S.; hide

    2013-01-01

    Swift J1745-26 is an X-ray binary towards the Galactic Centre that was detected when it went into outburst in September 2012. This source is thought to be one of a growing number of sources that display "failed outbursts", in which the self-absorbed radio jets of the transient source are never fully quenched and the thermal emission from the geometrically-thin inner accretion disk never fully dominates the X-ray flux. We present multifrequency data from the Very Large Array, Australia Telescope Compact Array and Karoo Array Telescope (KAT-7) radio arrays, spanning the entire period of the outburst. Our rich data set exposes radio emission that displays a high level of large-scale variability compared to the X-ray emission, and deviations from the standard radio-X-ray correlation that are indicative of an unstable jet and confirm the outburst's transition from the canonical hard state to an intermediate state. We also observe steepening of the spectral index and an increase of the linear polarization to a large fraction (approximately 50%) of the total flux, as well as a rotation of the electric vector position angle. These are consistent with a transformation from a self-absorbed compact jet to optically-thin ejecta - the first time such a discrete ejection has been observed in a failed outburst - and may imply a complex magnetic field geometry.

  10. Triple seismic source, double research ship, single ambitious goal: integrated imaging of young oceanic crust in the Panama Basin

    NASA Astrophysics Data System (ADS)

    Wilson, Dean; Peirce, Christine; Hobbs, Richard; Gregory, Emma

    2016-04-01

    Understanding geothermal heat and mass fluxes through the seafloor is fundamental to the study of the Earth's energy budget. Using geophysical, geological and physical oceanography data, we are exploring the interaction between young oceanic crust and the ocean in the Panama Basin. We acquired a unique geophysical dataset that will allow us to build a comprehensive model of young oceanic crust from the Costa Rica Ridge axis to ODP borehole 504B. Data were collected over two 35 × 35 km 3D grid areas, one each at the ridge axis and the borehole, and along three 330 km long 2D profiles orientated in the spreading direction, connecting the two grids. In addition to the 4.5 km long multichannel streamer and 75 ocean-bottom seismographs (OBS), we also deployed 12 magnetotelluric (MT) stations and collected underway swath bathymetry, gravity and magnetic data. For the long 2D profiles we used two research vessels operating synchronously. The RRS James Cook towed a high-frequency GI-gun array (120 Hz) to image the sediments, and a medium-frequency Bolt-gun array (50 Hz) for shallow-to-mid-crustal imaging. The R/V Sonne followed the Cook 9 km astern and towed a third seismic source: a low-frequency, large-volume G-gun array (30 Hz) for whole-crustal and upper-mantle imaging at large offsets. Two bespoke vertical hydrophone arrays recorded real far-field signatures that have enabled us to develop inverse source filters and match filters. Here we present the seismic reflection image, forward and inverse velocity-depth models and a density model along the primary 330 km north-south profile, from the ridge axis to 6 Ma crust. By incorporating wide-angle streamer data from our two-ship, synthetic-aperture acquisition together with traditional wide-angle OBS data, we are able to constrain the structure of the upper oceanic crust. The results show a long-wavelength trend of increasing seismic velocity and density with age, and a correlation between velocity structure and basement roughness. Increased basement roughness leads to a non-uniform distribution of sediments, which we hypothesise influences the pattern of hydrothermal circulation and ultimately the secondary alteration of the upper crust. A combination of the complementary wide-angle and normal-incidence datasets and their individual models acts as a starting point for joint inversion of seismic, gravity and MT data. The joint inversion produces a fully integrated model, enabling us to better understand how the oceanic crust evolves as a result of hydrothermal fluid circulation and cooling, as it ages from zero age at the ridge axis to 6 Ma at borehole 504B. Ultimately, this model can be used to undertake full waveform inversion to produce a high-resolution velocity model of the oceanic crust in the Panama Basin. This research is part of a major, interdisciplinary NERC-funded research collaboration entitled: Oceanographic and Seismic Characterisation of heat dissipation and alteration by hydrothermal fluids at an Axial Ridge (OSCAR).

  11. A model for the distributed storage and processing of large arrays

    NASA Technical Reports Server (NTRS)

    Mehrotra, P.; Pratt, T. W.

    1983-01-01

    A conceptual model for parallel computations on large arrays is developed. The model provides a set of language concepts appropriate for processing arrays which are generally too large to fit in the primary memories of a multiprocessor system. The semantic model is used to represent arrays on a concurrent architecture in such a way that the performance realities inherent in the distributed storage and processing can be adequately represented. An implementation of the large array concept as an Ada package is also described.
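
    A rough illustration of the underlying idea, that an array too large for primary memory is processed chunk by chunk so only a bounded working set is resident at a time; a disk-backed NumPy memmap stands in here for distributed storage (the paper's actual model targets a multiprocessor and an Ada package).

```python
import numpy as np
import tempfile, os

# A disk-resident array processed in fixed-size chunks, so that only one
# chunk occupies primary memory at any moment.
path = os.path.join(tempfile.mkdtemp(), "big.dat")
n, chunk = 10_000_000, 1_000_000

big = np.memmap(path, dtype=np.float64, mode="w+", shape=(n,))
big[:] = 1.0                      # initialise on disk
big.flush()

total = 0.0
view = np.memmap(path, dtype=np.float64, mode="r", shape=(n,))
for start in range(0, n, chunk):  # each iteration touches one chunk only
    total += view[start:start + chunk].sum()
print(total)                      # 10000000.0
```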

  12. Representation-Independent Iteration of Sparse Data Arrays

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A method is defined for iterating over massively large arrays containing sparse data in a way that is independent of how the array contents are laid out in memory. The key idea is the decoupling of iteration over the sparse set of array elements from their internal memory representation, which makes the approach backward compatible with existing schemes for representing sparse arrays as well as with new ones. A functional interface is defined for implementing sparse arrays in any modern programming language, with a particular focus on the Chapel programming language. Examples are provided that show the translation of a loop that computes a matrix-vector product into this representation for both the distributed and non-distributed cases. This work is directly applicable to NASA and its High Productivity Computing Systems (HPCS) program, in which JPL and our current program are engaged. The goal of this program is to create powerful, scalable, and economically viable high-powered computer systems suitable for use in national security and industry by 2010. This is important to NASA for its computationally intensive requirements for analyzing and understanding the volumes of science data from returned missions.
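
    A minimal sketch of this interface idea in Python (the paper itself targets Chapel): iteration over stored elements is the only contract, so a matrix-vector product written once works for any representation. Class and method names here are invented.

```python
from abc import ABC, abstractmethod
from typing import Iterator, Tuple

class SparseArray(ABC):
    """Functional interface: expose stored elements, hide memory layout."""
    @abstractmethod
    def items(self) -> Iterator[Tuple[Tuple[int, int], float]]:
        """Yield ((row, col), value) for every stored (non-zero) element."""

class DictOfKeys(SparseArray):
    def __init__(self, data):
        self.data = dict(data)
    def items(self):
        yield from self.data.items()

class CoordinateList(SparseArray):
    def __init__(self, rows, cols, vals):
        self.rows, self.cols, self.vals = rows, cols, vals
    def items(self):
        yield from zip(zip(self.rows, self.cols), self.vals)

def matvec(a: SparseArray, x, n_rows):
    """Matrix-vector product written once, valid for any representation."""
    y = [0.0] * n_rows
    for (i, j), v in a.items():
        y[i] += v * x[j]
    return y

m1 = DictOfKeys({(0, 0): 2.0, (1, 2): -1.0})
m2 = CoordinateList([0, 1], [0, 2], [2.0, -1.0])
x = [1.0, 0.0, 3.0]
print(matvec(m1, x, 2), matvec(m2, x, 2))   # identical: [2.0, -3.0] twice
```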

  13. An Evaluation of North Korea’s Nuclear Test by Belbasi Nuclear Tests Monitoring Center-KOERI

    NASA Astrophysics Data System (ADS)

    Necmioglu, O.; Meral Ozel, N.; Semin, K.

    2009-12-01

    Bogazici University's Kandilli Observatory and Earthquake Research Institute (KOERI) has been acting as the Turkish National Data Center (NDC) and has been responsible for the operation of the International Monitoring System (IMS) Primary Seismic Station (PS-43) under the Belbasi Nuclear Tests Monitoring Center, for the verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), since February 2000. The NDC is responsible for operating two arrays which are part of the IMS, as well as for transmitting data from these stations to the International Data Centre (IDC) in Vienna. The Belbasi array was established in 1951 as a four-element (Benioff 1051) seismic array, part of the United States Atomic Energy Detection System (USAEDS). The Turkish General Staff (TGS) and the U.S. Air Force Technical Applications Center (AFTAC) jointly operated this short-period array under the Defense and Economic Cooperation Agreement (DECA). The station was upgraded and several seismometers were added to the array between 1951 and 1994, and the station code was later changed from BSRS (Belbasi Seismic Research Station) to BRTR-PS43. PS-43 is composed of two sub-arrays (Ankara and Keskin): a medium-period array with a ~40 km radius located in Ankara and a short-period array with a ~3 km radius located in Keskin. Each array has a broadband element located at the middle of its circular geometry. Short-period instruments are installed at a depth of 30 meters, while medium-period and broadband instruments are installed at a depth of 60 meters. On 25 May 2009, the Democratic People's Republic of Korea (DPRK) claimed that it had conducted a nuclear test. The corresponding seismic event was recorded by the IMS, and the IDC released a first automatic estimate of the time (00:54:43 GMT), location (41.2896°N, 129.0480°E) and magnitude (4.52 mb) of the event in less than two hours (USGS: 00:54:43 GMT; 41.306°N, 129.029°E; 4.7 mb). During our preliminary analysis of the 25 May 2009 DPRK event, we saw a very clear P arrival at 01:05:47 (GMT) at the BRTR SP array. The f-k analysis performed with the Geotool software, installed at the NDC facilities in 2008 and in full use since then, also indicated that the arrival belongs to the DPRK event. When comparing our f-k results (calculated at 1-2 Hz) with the IDC Reviewed Event Bulletin (REB), however, we noticed that our calculation, and therefore the corresponding residuals (calculated with reference to the REB residuals), were considerably better. The reasons for this discrepancy have been explored, and for the first time a comprehensive seismological analysis of a nuclear test has been conducted in Turkey. The CTBT has an important role in the implementation of the non-proliferation of nuclear weapons and is a key element in the pursuit of nuclear disarmament. In this study, we present the technical and scientific aspects of the 25 May 2009 DPRK event analysis, together with our involvement in CTBT(O) affairs, which we believe brings new dimensions to Turkey, especially in the area of geophysics.

  14. Near- and far-field infrasound monitoring in the Mediterranean area

    NASA Astrophysics Data System (ADS)

    Campus, Paola; Marchetti, Emanuele; Le Pichon, Alexis; Wallenstein, Nicolau; Ripepe, Maurizio; Kallel, Mohamed; Mialle, Pierrick

    2013-04-01

    The Mediterranean area is characterized by a number of very interesting sources of infrasound signals and offers a promising playground for developing a deeper understanding of such sources and of the associated propagation models. The progress in the construction and certification of infrasound arrays belonging to the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) in the vicinity of this area has been complemented, in the last decade, by the construction of infrasound arrays established by several European research groups. The University of Florence (UniFi) plays a crucial role in the detection of infrasound signals in the Mediterranean area, having operated for several years two infrasound arrays on the Stromboli and Etna volcanoes and, more recently, three infrasound arrays in the Alpine area of NW Italy and one infrasound array on the Apennines (Mount Amiata), designed and established in the framework of the ARISE Project. The IMS infrasound arrays IS42 (Graciosa, Azores, Portugal) and IS48 (Kesra, Tunisia) have recorded, since the time of their certification, a number of far-field events which can be correlated with near-field records of the infrasound arrays belonging to UniFi. An analysis of the results and potential of near- and far-field infrasound source detection realized by the IS42, IS48 and UniFi arrays in the Mediterranean area, with special focus on volcanic events, is presented. The combined results deriving from the analysis of data recorded by the UniFi arrays and by the IS42 and IS48 arrays, in collaboration with the Department of Analyse et Surveillance (CEA/DASE), will generate a synergy that will contribute to the progress of the ARISE Project.

  15. A hierarchy of ocean biogeochemical comprehensiveness for Earth System Modeling

    NASA Astrophysics Data System (ADS)

    Dunne, J. P.

    2016-12-01

    As Earth System Models mature towards more quantitative explanations of ocean carbon cycle interactions and are applied to an increasingly diverse array of living marine resource communities, the draw towards biogeochemical and ecological comprehensiveness intensifies. However, this draw to comprehensiveness must be balanced against the added cost of handling additional tracers. One way that GFDL has addressed this constraint is by developing a series of biogeochemical modules based on the 30-tracer TOPAZ formulation used in GFDL's CMIP5 contribution: simplifying the biogeochemistry down to the 6-tracer BLING and 3-tracer mini-BLING formulations, and, in the other direction, improving on ecosystem comprehensiveness with the 33-tracer COBALT formulation. We discuss the comparative advantages and disadvantages along this continuum of complexity in terms of both biogeochemical and ecological fidelity and applicability. We also discuss a related approach of separating out other modules for ideal age, 14C, CFCs, SF6, Argon and other tracer suites, allowing us to run an array of experimental designs to suit different needs.

  16. Agile Datacube Analytics (not just) for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Merticariu, Vlad; Baumann, Peter

    2017-04-01

    Metadata are considered small, smart, and queryable; data, on the other hand, are known as big, clumsy, and hard to analyze. Consequently, gridded data - such as images, image time series, and climate datacubes - are managed separately from the metadata, and with different, restricted retrieval capabilities. One reason for this silo approach is that databases, while good at tables, XML hierarchies, RDF graphs, etc., traditionally do not support multi-dimensional arrays well. This gap is being closed by Array Databases, which extend the SQL paradigm of "any query, anytime" to NoSQL arrays. They introduce semantically rich modelling combined with declarative, high-level query languages on n-D arrays. On the server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. This way, they offer new vistas in flexibility, scalability, performance, and data integration. In this respect, the forthcoming ISO SQL extension MDA ("Multi-dimensional Arrays") will be a game changer in Big Data Analytics. We introduce concepts and opportunities through the example of rasdaman ("raster data manager"), which pioneered the field of Array Databases and forms the blueprint for ISO SQL/MDA and further Big Data standards, such as OGC WCPS for querying spatio-temporal Earth datacubes. With operational installations exceeding 140 TB, queries have been split across more than one thousand cloud nodes, using CPUs as well as GPUs. Installations can easily be mashed up securely, enabling large-scale location-transparent query processing in federations. Federation queries have been demonstrated live at EGU 2016, spanning Europe and Australia, in the context of the intercontinental EarthServer initiative, visualized through NASA WorldWind.
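
    To make the "any query, anytime" idea concrete, the following sketch pairs a schematic datacube query (illustrative syntax, not verbatim WCPS or SQL/MDA) with a NumPy emulation of what the server would evaluate.

```python
import numpy as np

# Schematic declarative datacube query (syntax illustrative only):
# average a variable over a space/time subset of a t/y/x cube.
query = """
  SELECT avg_cells(c[t(0:11), y(100:199), x(100:199)])
  FROM climate_cube AS c
"""

# What the server evaluates, emulated on an in-memory t/y/x cube.
cube = np.random.default_rng(3).normal(15.0, 2.0, size=(12, 400, 400))
result = cube[0:12, 100:200, 100:200].mean()   # 12 x 100 x 100 subset
print(query.strip(), "->", round(float(result), 2))
```

    The point of the declarative form is that the server, not the client, chooses how to partition, parallelize, and distribute the evaluation.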

  17. Agile Datacube Analytics (not just) for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2016-12-01

    Metadata are considered small, smart, and queryable; data, on the other hand, are known as big, clumsy, and hard to analyze. Consequently, gridded data - such as images, image time series, and climate datacubes - are managed separately from the metadata, and with different, restricted retrieval capabilities. One reason for this silo approach is that databases, while good at tables, XML hierarchies, RDF graphs, etc., traditionally do not support multi-dimensional arrays well. This gap is being closed by Array Databases, which extend the SQL paradigm of "any query, anytime" to NoSQL arrays. They introduce semantically rich modelling combined with declarative, high-level query languages on n-D arrays. On the server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. This way, they offer new vistas in flexibility, scalability, performance, and data integration. In this respect, the forthcoming ISO SQL extension MDA ("Multi-dimensional Arrays") will be a game changer in Big Data Analytics. We introduce concepts and opportunities through the example of rasdaman ("raster data manager"), which pioneered the field of Array Databases and forms the blueprint for ISO SQL/MDA and further Big Data standards, such as OGC WCPS for querying spatio-temporal Earth datacubes. With operational installations exceeding 140 TB, queries have been split across more than one thousand cloud nodes, using CPUs as well as GPUs. Installations can easily be mashed up securely, enabling large-scale location-transparent query processing in federations. Federation queries have been demonstrated live at EGU 2016, spanning Europe and Australia, in the context of the intercontinental EarthServer initiative, visualized through NASA WorldWind.

  18. Solar array flight experiment

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Emerging satellite designs require increasing amounts of electrical power to operate spacecraft instruments and to provide environments suitable for human habitation. In the past, electrical power was generated by covering rigid honeycomb panels with solar cells. This technology results in unacceptable weight and volume penalties when large amounts of power are required. To fill the need for large-area, lightweight solar arrays, a fabrication technique was developed in which solar cells are attached to a copper printed circuit laminated to a plastic sheet. The result is a flexible solar array with one-tenth the stowed volume and one-third the weight of comparably sized rigid arrays. An automated welding process developed to attach the cells to the printed circuit guarantees repeatable welds that are more tolerant of severe environments than conventional soldered connections. To demonstrate the flight readiness of this technology, the Solar Array Flight Experiment (SAFE) was developed and flown on the space shuttle Discovery in September 1984. The tests showed the modes and frequencies of the array to be very close to preflight predictions. Structural damping, however, was higher than anticipated. Electrical performance of the active solar panel was also tested. The flight performance and postflight data evaluation are described.

  19. Concepts and Development of Bio-Inspired Distributed Embedded Wired/Wireless Sensor Array Architectures for Acoustic Wave Sensing in Integrated Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Ghoshal, Anindya; Prosser, William H.; Kirikera, Goutham; Schulz, Mark J.; Hughes, Derke J.; Orisamolu, Wally

    2003-01-01

    This paper discusses the modeling of acoustic emissions in plate structures and their sensing by embedded or surface bonded piezoelectric sensor arrays. Three different modeling efforts for acoustic emission (AE) wave generation and propagation are discussed briefly along with their advantages and disadvantages. Continuous sensors placed at right angles on a plate are being discussed as a new approach to measure and locate the source of acoustic waves. Evolutionary novel signal processing algorithms and bio-inspired distributed sensor array systems are used on large structures and integrated aerospace vehicles for AE source localization and preliminary results are presented. These systems allow for a great reduction in the amount of data that needs to be processed and also reduce the chances of false alarms from ambient noises. It is envisioned that these biomimetic sensor arrays and signal processing techniques will be useful for both wireless and wired sensor arrays for real time health monitoring of large integrated aerospace vehicles and earth fixed civil structures. The sensor array architectures can also be used with other types of sensors and for other applications.

  20. SU-F-T-458: Tracking Trends of TG-142 Parameters Via Analysis of Data Recorded by 2D Chamber Array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexandrian, A; Kabat, C; Defoor, D

    Purpose: With the increasing QA demands on medical physicists in clinical radiation oncology, the need for an effective method of tracking clinical data has become paramount. A tool was produced which scans through data automatically recorded by a 2D chamber array and extracts relevant information recommended by TG-142. Using this extracted information, a timely and comprehensive analysis of QA parameters can easily be performed, enabling efficient monthly checks on multiple linear accelerators simultaneously. Methods: A PTW STARCHECK chamber array was used to record several months of beam outputs from two Varian 2100 series linear accelerators and a Varian Novalis Tx. In conjunction with the chamber array, a beam quality phantom was used simultaneously to determine beam quality. A minimalist GUI was created in MatLab that allows a user to set the file path of the data for each modality to be analyzed. These file paths are recorded to a MatLab structure and subsequently accessed by a script written in Python (version 3.5.1), which extracts the values required to perform monthly checks as outlined by the recommendations of TG-142. The script incorporates calculations to determine whether the values recorded by the chamber array fall within an acceptable threshold. Results: Values obtained by the script are written to a spreadsheet where results can be easily viewed, annotated with a "pass" or "fail", and saved for further analysis. In addition to creating a new scheme for reviewing monthly checks, this application succinctly stores data for follow-up analysis. Conclusion: By utilizing this tool, parameters recommended by TG-142 for multiple linear accelerators can be rapidly obtained and analyzed for the evaluation of monthly checks.
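
    A minimal sketch of the pass/fail logic such a script applies, assuming a hypothetical CSV layout for the extracted outputs and the common +/-2% monthly output-constancy tolerance; the actual PTW file format and the full TG-142 parameter set are not reproduced here.

```python
import csv, io

# Hypothetical monthly-output records extracted from 2D-array files;
# output_ratio is measured output relative to baseline (1.0 = no change).
readings = io.StringIO("""date,linac,energy,output_ratio
2016-01-05,Varian2100_A,6MV,1.012
2016-01-05,Varian2100_A,18MV,0.974
2016-01-05,NovalisTx,6MV,1.003
""")

TOLERANCE = 0.02   # +/-2% monthly output constancy (TG-142 style)
for rec in csv.DictReader(readings):
    deviation = abs(float(rec["output_ratio"]) - 1.0)
    result = "pass" if deviation <= TOLERANCE else "fail"
    print(rec["linac"], rec["energy"], result)   # 18MV line fails at -2.6%
```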

  1. An Electronic-Nose Sensor Node Based on a Polymer-Coated Surface Acoustic Wave Array for Wireless Sensor Network Applications

    PubMed Central

    Tang, Kea-Tiong; Li, Cheng-Han; Chiu, Shih-Wen

    2011-01-01

    This study developed an electronic-nose sensor node based on a polymer-coated surface acoustic wave (SAW) sensor array. The sensor node comprised an SAW sensor array, a frequency readout circuit, and an Octopus II wireless module. The sensor array was fabricated on a large-K² 128° YX LiNbO3 sensing substrate. On the surface of this substrate, an interdigital transducer (IDT) was produced with a Cr/Au film as its metallic structure. A mixed-mode frequency readout application specific integrated circuit (ASIC) was fabricated using a TSMC 0.18 μm process. The ASIC output was connected to a wireless module to transmit sensor data to a base station for data storage and analysis. This sensor node is applicable for wireless sensor network (WSN) applications. PMID:22163865

  2. An electronic-nose sensor node based on a polymer-coated surface acoustic wave array for wireless sensor network applications.

    PubMed

    Tang, Kea-Tiong; Li, Cheng-Han; Chiu, Shih-Wen

    2011-01-01

    This study developed an electronic-nose sensor node based on a polymer-coated surface acoustic wave (SAW) sensor array. The sensor node comprised an SAW sensor array, a frequency readout circuit, and an Octopus II wireless module. The sensor array was fabricated on a large-K² 128° YX LiNbO3 sensing substrate. On the surface of this substrate, an interdigital transducer (IDT) was produced with a Cr/Au film as its metallic structure. A mixed-mode frequency readout application specific integrated circuit (ASIC) was fabricated using a TSMC 0.18 μm process. The ASIC output was connected to a wireless module to transmit sensor data to a base station for data storage and analysis. This sensor node is applicable for wireless sensor network (WSN) applications.

  3. Seismo-acoustic Signals Recorded at KSIAR, the Infrasound Array Installed at PS31

    NASA Astrophysics Data System (ADS)

    Kim, T. S.; Che, I. Y.; Jeon, J. S.; Chi, H. C.; Kang, I. B.

    2014-12-01

    One of the International Monitoring System (IMS)'s primary seismic stations, PS31, called the Korea Seismic Research Station (KSRS), was installed near Wonju, Korea in the 1970s. It has been operated by the US Air Force Technical Applications Center (AFTAC) for more than 40 years. KSRS is composed of 26 seismic sensors: 19 short-period, 6 long-period and 1 broadband seismometer. The 19 short-period sensors form an array with a 10-km aperture, while the 6 long-period sensors form a relatively long-period array with a 40-km aperture. After KSRS was certified as an IMS station in 2006 by the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), the Korea Institute of Geoscience and Mineral Resources (KIGAM), which is the Korean National Data Center, began to take over responsibility for the operation and maintenance of KSRS from AFTAC. In April 2014, KIGAM installed an infrasound array, KSIAR, at four existing short-period seismic sites of KSRS: KS05, KS06, KS07 and KS16. The collocated KSIAR changed KSRS from a seismic array into a seismo-acoustic array. The aperture of KSIAR is 3.3 km; KSIAR also has a 100-m small-aperture infrasound array at KS07. The infrasound data from KSIAR, except those from site KS06, are transmitted in real time to KIGAM via VPN over an internet line. An initial analysis of seismo-acoustic signals originating at local and regional distance ranges has been performed since May 2014. The analysis, which utilizes an array process called Progressive Multi-Channel Correlation (PMCC), detected seismo-acoustic signals caused by various sources, including small explosions related to the construction of local tunnels and roads. Some of them were not found in KIGAM's automatic bulletin. The seismo-acoustic signals recorded by KSIAR supply useful information for discriminating local and regional man-made events from natural events.

  4. Multi-Band Miniaturized Patch Antennas for a Compact, Shielded Microwave Breast Imaging Array.

    PubMed

    Aguilar, Suzette M; Al-Joumayly, Mudar A; Burfeindt, Matthew J; Behdad, Nader; Hagness, Susan C

    2013-12-18

    We present a comprehensive study of a class of multi-band miniaturized patch antennas designed for use in a 3D enclosed sensor array for microwave breast imaging. Miniaturization and multi-band operation are achieved by loading the antenna with non-radiating slots at strategic locations along the patch. This results in symmetric radiation patterns and similar radiation characteristics at all frequencies of operation. Prototypes were fabricated and tested in a biocompatible immersion medium. Excellent agreement was obtained between simulations and measurements. The trade-off between miniaturization and radiation efficiency within this class of patch antennas is explored via a numerical analysis of the effects of the location and number of slots, as well as the thickness and permittivity of the dielectric substrate, on the resonant frequencies and gain. Additionally, we compare 3D quantitative microwave breast imaging performance achieved with two different enclosed arrays of slot-loaded miniaturized patch antennas. Simulated array measurements were obtained for a 3D anatomically realistic numerical breast phantom. The reconstructed breast images generated from miniaturized patch array data suggest that, for the realistic noise power levels assumed in this study, the variations in gain observed across this class of multi-band patch antennas do not significantly impact the overall image quality. We conclude that these miniaturized antennas are promising candidates as compact array elements for shielded, multi-frequency microwave breast imaging systems.

  5. Indium antimonide large-format detector arrays

    NASA Astrophysics Data System (ADS)

    Davis, Mike; Greiner, Mark

    2011-06-01

    Large-format infrared imaging sensors are required to achieve simultaneously high resolution and wide field-of-view image data. Infrared sensors are generally required to be cooled from room temperature to cryogenic temperatures in less than 10 min, thousands of times during their lifetime. The challenge is to remove the mechanical stress that arises from materials with different coefficients of expansion over a very wide temperature range and, at the same time, provide high-sensitivity, high-resolution image data. These challenges are met by developing a hybrid in which the indium antimonide detector elements (pixels) are unconnected islands that essentially float on a silicon substrate and form a near-perfect match to the silicon read-out circuit. Since the pixels are unconnected and isolated from each other, the array is reticulated. This paper shows that the front-side-illuminated, reticulated-element indium antimonide focal planes developed at L-3 Cincinnati Electronics are robust, approach the background-limited sensitivity limit, and provide the resolution expected of a reticulated pixel array.

  6. Improved microseismic event locations through large-N arrays and wave-equation imaging and inversion

    NASA Astrophysics Data System (ADS)

    Witten, B.; Shragge, J. C.

    2016-12-01

    The recent increased focus on small-scale seismicity (Mw < 4) has come about primarily for two reasons. First, there is an increase in induced seismicity related to injection operations, primarily wastewater disposal, hydraulic fracturing for oil and gas recovery, and geothermal energy production. While the seismicity associated with injection is sometimes felt, it is more often weak. Some weak events are detected on current sparse arrays; however, accurate location of the events often requires a larger number of (multi-component) sensors. This leads to the second reason for an increased focus on small-magnitude seismicity: a greater number of seismometers are being deployed in large-N arrays. The greater number of sensors decreases the detection threshold and therefore significantly increases the number of weak events found. Overall, these two factors bring new challenges and opportunities. Many standard seismological location and inversion techniques are geared toward large, easily identifiable events recorded on a sparse number of stations. However, with large-N arrays we can detect small events by utilizing multi-trace processing techniques, and increased processing power equips us with tools that employ more complete physics for simultaneously locating events and inverting for P- and S-wave velocity structure. We present a method that uses large-N arrays and wave-equation-based imaging and inversion to jointly locate earthquakes and estimate the elastic velocities of the earth. The technique requires no picking and is thus suitable for weak events. We validate the methodology through synthetic and field data examples.
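
    The pick-free principle can be illustrated with delay-and-stack back-projection, a much simpler relative of the wave-equation imaging described above; this synthetic sketch (invented geometry, velocity, and noise levels) is not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(4)

# Delay-and-stack: stack squared amplitudes along travel-time moveouts;
# the grid node where the stack peaks is the event location. No picking.
v, fs = 3.0, 100.0                        # km/s, samples/s
stations = rng.uniform(0, 10, size=(20, 2))
source = np.array([4.0, 6.0])

n = 2000
traces = rng.normal(0, 0.1, size=(20, n))
for k, s in enumerate(stations):          # synthetic arrivals
    i0 = int(np.linalg.norm(s - source) / v * fs)
    traces[k, i0:i0 + 20] += 1.0

env = traces ** 2
xg, yg = np.meshgrid(np.linspace(0, 10, 51), np.linspace(0, 10, 51))
best, best_xy = -1.0, None
for x, y in zip(xg.ravel(), yg.ravel()):
    shifts = (np.hypot(stations[:, 0] - x, stations[:, 1] - y) / v * fs).astype(int)
    stack = sum(env[k, shifts[k]:shifts[k] + 20].sum() for k in range(20))
    if stack > best:
        best, best_xy = stack, (x, y)
print(best_xy)                            # near the true source (4.0, 6.0)
```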

  7. Performance evaluation of redundant disk array support for transaction recovery

    NASA Technical Reports Server (NTRS)

    Mourad, Antoine N.; Fuchs, W. Kent; Saab, Daniel G.

    1991-01-01

    Redundant disk arrays provide a way of achieving rapid recovery from media failures with a relatively low storage cost for large scale data systems requiring high availability. Here, we propose a method for using redundant disk arrays to support rapid recovery from system crashes and transaction aborts in addition to their role in providing media failure recovery. A twin page scheme is used to store the parity information in the array so that the time for transaction commit processing is not degraded. Using an analytical model, we show that the proposed method achieves a significant increase in the throughput of database systems using redundant disk arrays by reducing the number of recovery operations needed to maintain the consistency of the database.
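
    The media-failure recovery that a redundant disk array provides rests on stripe parity. A minimal sketch of that mechanism (plain XOR parity; the twin-page bookkeeping described above is omitted):

```python
from functools import reduce

def xor_blocks(blocks):
    """Byte-wise XOR of equal-length blocks."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

# Parity across a stripe of data blocks lets the array rebuild any
# single failed disk's block from the survivors plus the parity.
stripe = [b"block-0", b"block-1", b"block-2"]
parity = xor_blocks(stripe)

# Disk 1 fails: recover its block by XOR-ing parity with the survivors.
recovered = xor_blocks([parity, stripe[0], stripe[2]])
assert recovered == stripe[1]
print(recovered)
```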

  8. The value of residential photovoltaic systems: A comprehensive assessment

    NASA Technical Reports Server (NTRS)

    Borden, C. S.

    1983-01-01

    Utility-interactive photovoltaic (PV) arrays on residential rooftops appear to be a potentially attractive, large-scale application of PV technology. Results of a comprehensive assessment of the value (i.e., break-even cost) of utility-grid-connected residential photovoltaic power systems under a variety of technological and economic assumptions are presented. A wide range of allowable PV system costs is calculated for small (4.34 kW peak ac) residential PV systems in various locales across the United States. Primary factors in this variation are differences in local weather conditions, utility-specific electric generation capacity, fuel types, customer-load profiles that affect purchase and sell-back rates, and non-uniform state tax considerations. Additional results from this analysis are: locations having the highest insolation values are not necessarily the most economically attractive sites; residential PV systems connected in parallel to the utility show high percentages of energy sold back to the grid; and owner financial and tax assumptions cause large variations in break-even costs. Significant cost reduction and aggressive resolution of potential institutional impediments (e.g., liability, standards, metering, and technical integration) are required for a residential PV market to become a major electric-grid-connected energy-generation source.

  9. The value of residential photovoltaic systems: A comprehensive assessment

    NASA Astrophysics Data System (ADS)

    Borden, C. S.

    1983-09-01

    Utility-interactive photovoltaic (PV) arrays on residential rooftops appear to be a potentially attractive, large-scale application of PV technology. Results of a comprehensive assessment of the value (i.e., break-even cost) of utility-grid-connected residential photovoltaic power systems under a variety of technological and economic assumptions are presented. A wide range of allowable PV system costs is calculated for small (4.34 kW peak ac) residential PV systems in various locales across the United States. Primary factors in this variation are differences in local weather conditions, utility-specific electric generation capacity, fuel types, customer-load profiles that affect purchase and sell-back rates, and non-uniform state tax considerations. Additional results from this analysis are: locations having the highest insolation values are not necessarily the most economically attractive sites; residential PV systems connected in parallel to the utility show high percentages of energy sold back to the grid; and owner financial and tax assumptions cause large variations in break-even costs. Significant cost reduction and aggressive resolution of potential institutional impediments (e.g., liability, standards, metering, and technical integration) are required for a residential PV market to become a major electric-grid-connected energy-generation source.

  10. Recent results from the Telescope Array Experiment

    NASA Astrophysics Data System (ADS)

    Abbasi, Rasha; Telescope Array Collaboration

    2016-03-01

    The Telescope Array (TA) is the largest ultrahigh energy cosmic-ray detector in the northern hemisphere. TA is a hybrid detector comprising three air fluorescence stations and a large surface array consisting of 507 scintillator counters. Each of the three fluorescence stations, located at the periphery of the ground array, views 108 degrees in azimuth and up to 30 degrees in elevation. The surface detectors are arranged in a square grid of 1.2 km spacing, covering over 700 square kilometers. TA has collected more than seven years of data. In this talk, we will present some of the main results on cosmic-ray composition and the energy spectrum obtained by TA and its low-energy extension (TALE). Finally, we will present our results from the search for arrival direction anisotropy, including the observed large excess of events at the highest energies seen in the region of the northern sky centered on Ursa Major. Based on the current results, the ``hot spot'' in particular, TA is pursuing the expansion of the surface array to four times its current size.

  11. Phased array feed design technology for Large Aperture Microwave Radiometer (LAMR) Earth observations

    NASA Technical Reports Server (NTRS)

    Schuman, H. K.

    1992-01-01

    An assessment of the potential and limitations of phased array antennas in space-based geophysical precision radiometry is described. Mathematical models exhibiting the dependence of system and scene temperatures and system sensitivity on phased array antenna parameters and components such as phase shifters and low noise amplifiers (LNA) are developed. Emphasis is given to minimum noise temperature designs wherein the LNA's are located at the array level, one per element or subarray. Two types of combiners are considered: array lenses (space feeds) and corporate networks. The result of a survey of suitable components and devices is described. The data obtained from that survey are used in conjunction with the mathematical models to yield an assessment of effective array antenna noise temperature for representative geostationary and low Earth orbit systems. Practical methods of calibrating a space-based, phased array radiometer are briefly addressed as well.
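
    The preference for placing the LNAs at the element or subarray level follows from the standard cascade (Friis) noise formula, in which each stage's noise temperature is divided by the gain of all preceding stages:

```latex
T_{\mathrm{sys}} \;=\; T_1 + \frac{T_2}{G_1} + \frac{T_3}{G_1 G_2} + \cdots
```

    With a high-gain LNA as the first stage at each element (gain G1), the noise contributions of the combiner, whether array lens or corporate network, and of all downstream receiver stages are suppressed by G1, which is why minimum-noise designs put one LNA per element or subarray.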

  12. Triboelectric Charging at the Nanostructured Solid/Liquid Interface for Area-Scalable Wave Energy Conversion and Its Use in Corrosion Protection.

    PubMed

    Zhao, Xue Jiao; Zhu, Guang; Fan, You Jun; Li, Hua Yang; Wang, Zhong Lin

    2015-07-28

    We report a flexible and area-scalable energy-harvesting technique for converting kinetic wave energy. Triboelectrification as a result of direct interaction between a dynamic wave and a large-area nanostructured solid surface produces an induced current among an array of electrodes. An integration method ensures that the induced current between any pair of electrodes can be constructively added up, which enables significant enhancement in output power and realizes area-scalable integration of electrode arrays. Internal and external factors that affect the electric output are comprehensively discussed. The produced electricity not only drives small electronics but also achieves effective impressed current cathodic protection. This type of thin-film-based device is a potentially practical solution of on-site sustained power supply at either coastal or off-shore sites wherever a dynamic wave is available. Potential applications include corrosion protection, pollution degradation, water desalination, and wireless sensing for marine surveillance.

  13. Vocabulary Acquisition: Implications for Reading Comprehension

    ERIC Educational Resources Information Center

    Wagner, Richard K., Ed.; Muse, Andrea E., Ed.; Tannenbaum, Kendra R., Ed.

    2006-01-01

    Understanding a text requires more than the ability to read individual words: it depends greatly on vocabulary knowledge. This important book brings together leading literacy scholars to synthesize cutting-edge research on vocabulary development and its connections to reading comprehension. The volume also reviews an array of approaches to…

  14. Automated Determination of Magnitude and Source Length of Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Wang, D.; Kawakatsu, H.; Zhuang, J.; Mori, J. J.; Maeda, T.; Tsuruoka, H.; Zhao, X.

    2017-12-01

    Rapid determination of earthquake magnitude is important for estimating shaking damage and tsunami hazards. However, due to the complexity of the source process, accurately estimating the magnitude of great earthquakes within minutes of the origin time is still a challenge. Mw is an accurate estimate for large earthquakes, but calculating Mw requires the whole wave train, including P, S, and surface phases, which takes tens of minutes to reach stations at tele-seismic distances. To speed up the calculation, methods using the W phase and body waves have been developed for rapidly estimating earthquake size. Besides these methods, which involve Green's functions and inversions, there are other approaches that use empirically derived relations to estimate earthquake magnitudes, usually for large earthquakes. Their simple implementation and straightforward calculation have made these approaches widely applied at many institutions, such as the Pacific Tsunami Warning Center, the Japan Meteorological Agency, and the USGS. Here we developed an approach originating from Hara [2007], estimating magnitude by considering P-wave displacement and source duration. We instead introduced a back-projection technique [Wang et al., 2016] to estimate source duration using array data from a high-sensitivity seismograph network (Hi-net). The introduction of back-projection improves the method in two ways. First, the source duration can be accurately determined by the seismic array. Second, the results can be calculated more rapidly, and data from more distant stations are not required. We propose to develop an automated system for determining fast and reliable source information for large shallow seismic events, based on real-time data from a dense regional array and global data, for earthquakes that occur at distances of roughly 30°-85° from the array center. This system can offer fast and robust estimates of the magnitudes and rupture extents of large earthquakes in 6 to 13 min (plus the source duration time), depending on the epicentral distance. It may be a promising aid for disaster mitigation right after a damaging earthquake, especially when dealing with tsunami evacuation and emergency rescue.
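
    The magnitude step can be sketched as a regression on peak P displacement, source duration, and distance, in the spirit of Hara [2007]. The coefficients below are placeholders only, not the published regression values, which must be substituted before any real use.

```python
import math

def hara_style_magnitude(p_disp_m, duration_s, distance_deg,
                         a=0.8, b=0.7, c=0.8, d=6.5):
    """Hara (2007)-style magnitude from maximum P displacement (m), source
    duration (s) and epicentral distance (deg). The coefficients a-d here
    are placeholders, NOT the published regression values."""
    return (a * math.log10(p_disp_m) + b * math.log10(duration_s)
            + c * math.log10(distance_deg) + d)

# Illustrative call: 1 mm peak P displacement, 120 s duration, 60 degrees.
print(round(hara_style_magnitude(1e-3, 120.0, 60.0), 2))
```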

  15. Automated Determination of Magnitude and Source Extent of Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Wang, Dun

    2017-04-01

    Rapid determination of earthquake magnitude is important for estimating shaking damage and tsunami hazards. However, due to the complexity of the source process, accurately estimating the magnitude of great earthquakes within minutes of the origin time is still a challenge. Mw is an accurate estimate for large earthquakes, but calculating Mw requires the whole wave train, including P, S, and surface phases, which takes tens of minutes to reach stations at tele-seismic distances. To speed up the calculation, methods using the W phase and body waves have been developed for rapidly estimating earthquake size. Besides these methods, which involve Green's functions and inversions, there are other approaches that use empirically derived relations to estimate earthquake magnitudes, usually for large earthquakes. Their simple implementation and straightforward calculation have made these approaches widely applied at many institutions, such as the Pacific Tsunami Warning Center, the Japan Meteorological Agency, and the USGS. Here we developed an approach originating from Hara [2007], estimating magnitude by considering P-wave displacement and source duration. We instead introduced a back-projection technique [Wang et al., 2016] to estimate source duration using array data from a high-sensitivity seismograph network (Hi-net). The introduction of back-projection improves the method in two ways. First, the source duration can be accurately determined by the seismic array. Second, the results can be calculated more rapidly, and data from more distant stations are not required. We propose to develop an automated system for determining fast and reliable source information for large shallow seismic events, based on real-time data from a dense regional array and global data, for earthquakes that occur at distances of roughly 30°-85° from the array center. This system can offer fast and robust estimates of the magnitudes and rupture extents of large earthquakes in 6 to 13 min (plus the source duration time), depending on the epicentral distance. It may be a promising aid for disaster mitigation right after a damaging earthquake, especially when dealing with tsunami evacuation and emergency rescue.

  16. Array data extractor (ADE): a LabVIEW program to extract and merge gene array data

    PubMed Central

    2013-01-01

    Background Large data sets from gene expression array studies are publicly available, offering information highly valuable for research across many disciplines ranging from fundamental to clinical research. Highly advanced bioinformatics tools have been made available to researchers, but a demand for user-friendly software allowing researchers to quickly extract expression information for multiple genes from multiple studies persists. Findings Here, we present a user-friendly LabVIEW program to automatically extract gene expression data for a list of genes from multiple normalized microarray datasets. Functionality was tested for 288 class A G protein-coupled receptors (GPCRs) and expression data from 12 studies comparing normal and diseased human hearts. Results confirmed the known regulation of a beta-1 adrenergic receptor and further indicated novel research targets. Conclusions Although existing software allows for complex data analyses, the LabVIEW-based program presented here, "Array Data Extractor (ADE)", provides users with a tool to retrieve meaningful information from multiple normalized gene expression datasets in a fast and easy way. Further, the graphical programming language used in LabVIEW allows applying changes to the program without the need for advanced programming knowledge. PMID:24289243
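
    A minimal sketch of the extract-and-merge step in Python/pandas (ADE itself is LabVIEW); the gene symbols, table layout, and column names here are invented for illustration.

```python
import pandas as pd

# Each normalized study is a table of genes x summary values; pull a gene
# list from every study and merge on the gene symbol.
genes_of_interest = ["ADRB1", "ADRB2", "GPR22"]

study1 = pd.DataFrame({"gene": ["ADRB1", "GPR22", "TTN"],
                       "mean_log2": [2.1, -0.4, 0.0]})
study2 = pd.DataFrame({"gene": ["ADRB1", "ADRB2"],
                       "mean_log2": [1.8, 0.3]})

merged = (
    study1[study1.gene.isin(genes_of_interest)]
    .merge(study2[study2.gene.isin(genes_of_interest)],
           on="gene", how="outer", suffixes=("_study1", "_study2"))
)
print(merged)   # one row per gene, one expression column per study
```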

  17. Array Processing in the Cloud: the rasdaman Approach

    NASA Astrophysics Data System (ADS)

    Merticariu, Vlad; Dumitru, Alex

    2015-04-01

    The multi-dimensional array data model is gaining more and more attention when dealing with Big Data challenges in a variety of domains such as climate simulations, geographic information systems, medical imaging or astronomical observations. Solutions provided by classical Big Data tools such as key-value stores and MapReduce, as well as traditional relational databases, have proved to be limited in domains associated with multi-dimensional data. This problem has been addressed by the field of array databases, in which systems provide database services for raster data without imposing limitations on the number of dimensions that a dataset can have. Examples of datasets commonly handled by array databases include 1-D sensor data, 2-D satellite imagery, 3-D x/y/t image time series as well as x/y/z geophysical voxel data, and 4-D x/y/z/t weather data; this can grow as large as simulations of the whole universe when it comes to astrophysics. rasdaman is a well established array database, which implements many optimizations for dealing with large data volumes and operation complexity. Among those, the latest is intra-query parallelization support: a network of machines collaborates in answering a single array database query by dividing it into independent sub-queries sent to different servers. This enables massive processing speed-ups, which promise solutions to research challenges on multi-Petabyte data cubes. Several correlated factors influence the speedup that intra-query parallelisation brings: the number of servers, the capabilities of each server, the quality of the network, the availability of the data to the server that needs it in order to compute the result, and many more. In the effort to adapt the engine to cloud processing patterns, two main components have been identified: one that handles communication and gathers information about the arrays sitting on every server, and a processing unit responsible for dividing work among available nodes and executing operations on local data. The federation daemon collects and stores statistics from the other network nodes and provides real-time updates about local changes. Information exchanged includes available datasets, CPU load and memory usage per host. The processing component is represented by the rasdaman server. Using information from the federation daemon, it breaks queries into subqueries to be executed on peer nodes, ships them, and assembles the intermediate results. Thus, we define a rasdaman network node as a pair of a federation daemon and a rasdaman server. Any node can receive a query and will subsequently act as that query's dispatcher, so all peers are at the same level and there is no single point of failure. Should a node become inaccessible, the peers will recognize this and will no longer consider it for distribution. Conversely, a peer can join the network at any time. To assess the feasibility of our approach, we deployed a rasdaman network of 1001 nodes in the Amazon Elastic Cloud environment and observed that this feature can greatly increase the performance and scalability of the system, offering a large throughput of processed data.
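
    A toy emulation of the intra-query parallelisation pattern, assuming a divisible aggregate (a mean): a dispatcher splits one query into sub-queries over array partitions, "nodes" (here, local processes) evaluate them, and the partial results are assembled. This mimics the pattern only, not rasdaman's implementation.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def sub_query(tile):
    """Each node evaluates the aggregate on its local partition."""
    return tile.sum(), tile.size

def dispatch_mean(array, n_nodes=4):
    """Dispatcher: split the query, ship sub-queries, assemble partials."""
    tiles = np.array_split(array, n_nodes)
    with ProcessPoolExecutor(max_workers=n_nodes) as pool:
        partials = list(pool.map(sub_query, tiles))
    total, count = map(sum, zip(*partials))
    return total / count

if __name__ == "__main__":
    data = np.arange(1_000_000, dtype=np.float64)
    print(dispatch_mean(data))   # 499999.5
```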

  18. Large vestibular aqueduct syndrome: Impedance changes over time with different cochlear implant electrode arrays.

    PubMed

    Powell, Harry R F; Birman, Catherine S

    2015-01-01

    The aim of this study was to assess whether large vestibular aqueduct syndrome (LVAS), with its increase in perilymphatic pressure, affects impedance changes over time with different types of Cochlear(®) implant electrode arrays (Contour, Straight, and CI 422), and to report speech perception outcomes for these cochlear implant recipients. Retrospective case review of impedance levels and categories of auditory performance. Impedance data were collected at switch-on and at 1, 3, 6, 12, and 24 months after cochlear implantation, and compared with control (non-LVAS cochlear implant recipient) data for each array type. Forty-seven patients with exclusive LVAS and no other vestibulocochlear abnormalities or other identifiable cause of deafness were eligible for inclusion in the study. In LVAS patients, there was a significant difference in impedance between the three types of device (P < 0.0001). Time since switch-on was associated with a decrease in impedance for all three devices (P < 0.0001). The mean impedance reduced between switch-on and 1 month and remained relatively constant thereafter. Sound variation with softening of sounds was seen in four CI 422 (Straight Research Array) recipients due to ongoing fluctuations in electrode compliance. For all three array types, there was no significant difference in mean impedance between the LVAS patients and controls over the first 12 months. In keeping with previous studies, cochlear implant recipients with LVAS hear very well through the cochlear implant.

  19. Engineering sciences area and module performance and failure analysis area

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.; Runkle, L. D.

    1982-01-01

    Photovoltaic-array/power-conditioner interface studies are updated. An experiment conducted to evaluate different operating-point strategies, such as constant voltage and pilot cells, and to determine array energy losses when the array is operated off the maximum power point is described. Initial results over a test period of three and a half weeks showed a 2% energy loss when the array was operated at a fixed voltage. Degraded-array studies conducted at NE RES using a range of simulated common types of degraded I-V curves are reviewed. The instrumentation installed at the JPL field-test site to obtain irradiance data is described. Experiments using an optical filter to adjust the spectral irradiance of the large-area pulsed solar simulator (LAPSS) to AM1.5 are described. Residential-array research activity is reviewed. Voltage-isolation test results are described. Experiments performed on one type of module to determine the relationship between leakage current and temperature are reviewed. An encapsulated-cell testing approach is explained. The test program, data-reduction methods, and initial results of long-duration module testing are described.

  20. Performance of a scintillation detector array operated with LHAASO-KM2A electronics

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Guo, Yiqing; Cai, Hui; Chang, Jinfan; Chen, Tianlu; Danzengluobu; Feng, Youliang; Gao, Qi; Gou, Quanbu; Guo, Yingying; Hou, Chao; Hu, Hongbo; Labaciren; Liu, Cheng; Li, Haijin; Liu, Jia; Liu, Maoyuan; Qiao, Bingqiang; Qian, Xiangli; Sheng, Xiangdong; Tian, Zhen; Wang, Qun; Xue, Liang; Yao, Yuhua; Zhang, Shaoru; Zhang, Xueyao; Zhang, Yi

    2018-04-01

    A scintillation detector array composed of 115 detectors and covering an area of about 20000 m2 was installed at the end of 2016 at the Yangbajing international cosmic ray observatory and has been taking data since then. The array is equipped with electronics from the Large High Altitude Air Shower Observatory Square Kilometer Complex Array (LHAASO-KM2A) and, in turn, currently serves as the largest debugging and testing platform for the LHAASO-KM2A. Furthermore, the array was used to study the performance of a wide field-of-view air Cherenkov telescope by providing accurate information on the shower core, direction, and energy. This work deals mainly with the scintillation detector array. The experimental setup and the offline calibration are described in detail. A thorough comparison between the data and Monte Carlo (MC) simulations is then presented, and good agreement is obtained. With the even-odd method, the resolutions of the shower direction and core position are measured. Finally, successful observations of the expected Moon and Sun shadows of cosmic rays (CRs) verify the measured angular resolution.
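
    The even-odd method mentioned above estimates resolution by reconstructing each shower twice with two interleaved halves of the array. A minimal Python sketch of the idea follows; the Gaussian smearing stands in for real sub-array reconstructions, and the divide-by-two conversion from even-odd opening angle to full-array resolution is the commonly used approximation.

    ```python
    # Sketch of the even-odd angular resolution estimate: reconstruct each
    # shower with the even-numbered and odd-numbered detectors separately,
    # then infer the resolution from the opening-angle distribution.
    import numpy as np

    rng = np.random.default_rng(0)
    n_showers = 1000
    true_dir = np.tile([0.0, 0.0, 1.0], (n_showers, 1))  # vertical showers

    def smear(dirs, sigma_rad):
        # crude stand-in for a half-array reconstruction error
        v = dirs + sigma_rad * rng.standard_normal(dirs.shape)
        return v / np.linalg.norm(v, axis=1, keepdims=True)

    even = smear(true_dir, 0.01)  # even-numbered detectors only
    odd = smear(true_dir, 0.01)   # odd-numbered detectors only

    cosang = np.clip(np.sum(even * odd, axis=1), -1.0, 1.0)
    opening = np.degrees(np.arccos(cosang))
    # each half-array error is ~sqrt(2) of the full-array error, so the
    # 68th percentile of the opening angle divided by 2 approximates the
    # full-array angular resolution
    print("angular resolution ~", np.percentile(opening, 68) / 2, "deg")
    ```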

  1. BEAT: Bioinformatics Exon Array Tool to store, analyze and visualize Affymetrix GeneChip Human Exon Array data from disease experiments

    PubMed Central

    2012-01-01

    Background It is known from recent studies that more than 90% of human multi-exon genes are subject to Alternative Splicing (AS), a key molecular mechanism by which multiple transcripts may be generated from a single gene. It is widely recognized that a breakdown in AS mechanisms plays an important role in cellular differentiation and pathologies. Polymerase chain reactions, microarrays, and sequencing technologies have been applied to the study of transcript diversity arising from alternative expression. The latest generation of Affymetrix GeneChip Human Exon 1.0 ST Arrays offers a more detailed view of the gene expression profile, providing information on AS patterns. The exon array technology, with more than five million data points, can detect approximately one million exons and allows analyses at both the gene and exon level. In this paper we describe BEAT, an integrated, user-friendly bioinformatics framework to store, analyze and visualize exon array datasets. It combines a data-warehouse approach with rigorous statistical methods for assessing the AS of genes involved in diseases. Meta statistics are proposed as a novel approach to explore the analysis results. BEAT is available at http://beat.ba.itb.cnr.it. Results BEAT is a web tool that allows uploading and analyzing exon array datasets using standard statistical methods and an easy-to-use graphical web front-end. BEAT has been tested on a dataset with 173 samples and tuned using new datasets of exon array experiments from 28 colorectal cancer and 26 renal cell cancer samples produced at the Medical Genetics Unit of IRCCS Casa Sollievo della Sofferenza. To highlight all possible AS events, alternative names, accession IDs, Gene Ontology terms, and biochemical pathway annotations are integrated with exon- and gene-level expression plots. The user can customize the results by choosing custom thresholds for the statistical parameters and exploiting the available clinical data of the samples for a multivariate AS analysis. Conclusions Although exon array chips are widely used for transcriptomics studies, there is a lack of analysis tools offering advanced statistical features and requiring no programming knowledge. BEAT provides a user-friendly platform for a comprehensive study of AS events in human diseases, displaying the analysis results with easily interpretable, interactive tables and graphics. PMID:22536968

  2. Comprehensive data resources and analytical tools for pathological association of aminoacyl tRNA synthetases with cancer

    PubMed Central

    Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon

    2015-01-01

    Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze the aminoacylation of tRNAs during protein synthesis. Despite their housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, data resources and analytical tools for examining the disease associations of ARS/AIMPs have been lacking. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARS/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation, and phosphorylation data of ARS/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploring disease associations of ARS/AIMPs, identifying disease-associated ARS/AIMP interactors, and reconstructing ARS-dependent disease-perturbed network models. IDA therefore provides both comprehensive data resources and analytical tools for understanding the potential roles of ARS/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651

  3. Comprehensive GMO detection using real-time PCR array: single-laboratory validation.

    PubMed

    Mano, Junichi; Harada, Mioko; Takabatake, Reona; Furui, Satoshi; Kitta, Kazumi; Nakamura, Kosuke; Akiyama, Hiroshi; Teshima, Reiko; Noritake, Hiromichi; Hatano, Shuko; Futo, Satoshi; Minegishi, Yasutaka; Iizuka, Tayoshi

    2012-01-01

    We have developed a real-time PCR array method to comprehensively detect genetically modified (GM) organisms. In this method, genomic DNA extracted from an agricultural product is analyzed using various qualitative real-time PCR assays on a 96-well PCR plate, targeting individual GM events, recombinant DNA (r-DNA) segments, taxon-specific DNAs, and the donor organisms of the respective r-DNAs. In this article, we report the single-laboratory validation of both the DNA extraction methods and the component PCR assays constituting the real-time PCR array. We selected DNA extraction methods for specified plant matrices, i.e., maize flour, soybean flour, and ground canola seeds, and then evaluated the DNA quantity, DNA fragmentation, and PCR inhibition of the resulting DNA extracts. For the component PCR assays, we evaluated the specificity and LOD. All DNA extraction methods and component PCR assays satisfied the criteria set on the basis of previous reports.

  4. Large-N correlator systems for low frequency radio astronomy

    NASA Astrophysics Data System (ADS)

    Foster, Griffin

    Low frequency radio astronomy has entered a second golden age driven by the development of a new class of large-N interferometric arrays. The Low Frequency Array (LOFAR) and a number of redshifted HI Epoch of Reionization (EoR) arrays are currently undergoing commissioning and observing regularly. Future arrays of unprecedented sensitivity and resolution at low frequencies, such as the Square Kilometre Array (SKA) and the Hydrogen Epoch of Reionization Array (HERA), are in development. The combination of advances in specialized field programmable gate array (FPGA) hardware for signal processing, computing and graphics processing unit (GPU) resources, and new imaging and calibration algorithms has opened up the oft-underused radio band below 300 MHz. These interferometric arrays require efficient implementations of digital signal processing (DSP) hardware to compute the baseline correlations. FPGA technology provides an optimal platform on which to develop new correlators. The significant growth in data rates from these systems requires automated software to reduce the correlations in real time before storing the data products to disk. Low-frequency, widefield observations introduce a number of unique calibration and imaging challenges. The efficient implementation of FX correlators using FPGA hardware is presented. Two correlators have been developed, one for the 32-element BEST-2 array at Medicina Observatory and the other for the 96-element LOFAR station at Chilbolton Observatory. In addition, calibration and imaging software has been developed for each system which makes use of the radio interferometry measurement equation (RIME) to derive calibrations. A process for generating sky maps from widefield LOFAR station observations is presented. Shapelets, a method of modelling extended structures such as resolved sources and beam patterns, have been adapted for radio astronomy use to further improve system calibration. Scaling of computing technology allows for the development of larger correlator systems, which in turn allows for improvements in sensitivity and resolution; this requires new calibration techniques that account for a broad range of systematic effects.
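
    The FX architecture named above channelises first (F) and cross-multiplies second (X). A compact numpy sketch of the arithmetic follows, with antenna count, channel count, and random "voltages" as placeholders; FPGA implementations pipeline the same two stages in fixed point.

    ```python
    # F stage: FFT each antenna stream into spectra.
    # X stage: multiply each spectrum by the conjugate of every other
    # antenna's spectrum and integrate, yielding per-baseline visibilities.
    import numpy as np

    n_ant, n_chan, n_spectra = 4, 64, 100
    rng = np.random.default_rng(1)
    voltages = rng.standard_normal((n_ant, n_chan * n_spectra))

    spectra = np.fft.rfft(voltages.reshape(n_ant, n_spectra, n_chan), axis=2)
    vis = np.einsum("itc,jtc->ijc", spectra, spectra.conj()) / n_spectra

    print(vis.shape)  # (n_ant, n_ant, n_chan // 2 + 1) complex visibilities
    ```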

  5. Best Practices and Joint Calling of the HumanExome BeadChip: The CHARGE Consortium

    PubMed Central

    Grove, Megan L.; Yu, Bing; Cochran, Barbara J.; Haritunians, Talin; Bis, Joshua C.; Taylor, Kent D.; Hansen, Mark; Borecki, Ingrid B.; Cupples, L. Adrienne; Fornage, Myriam; Gudnason, Vilmundur; Harris, Tamara B.; Kathiresan, Sekar; Kraaij, Robert; Launer, Lenore J.; Levy, Daniel; Liu, Yongmei; Mosley, Thomas; Peloso, Gina M.; Psaty, Bruce M.; Rich, Stephen S.; Rivadeneira, Fernando; Siscovick, David S.; Smith, Albert V.; Uitterlinden, Andre; van Duijn, Cornelia M.; Wilson, James G.; O’Donnell, Christopher J.; Rotter, Jerome I.; Boerwinkle, Eric

    2013-01-01

    Genotyping arrays are a cost-effective approach when typing previously identified genetic polymorphisms in large numbers of samples. One limitation of genotyping arrays with rare variants (e.g., minor allele frequency [MAF] <0.01) is the difficulty that automated clustering algorithms have in accurately detecting and assigning genotype calls. Combining intensity data from large numbers of samples may increase the ability to accurately call the genotypes of rare variants. Approximately 62,000 ethnically diverse samples from eleven Cohorts for Heart and Aging Research in Genomic Epidemiology (CHARGE) Consortium cohorts were genotyped with the Illumina HumanExome BeadChip across seven genotyping centers. The raw data files for the samples were assembled into a single project for joint calling. To assess the quality of the joint calling, the concordance of genotypes in a subset of individuals having both exome chip and exome sequence data was analyzed. After exclusion of low-performing SNPs on the exome chip and of SNPs not overlapping with the sequence data, genotypes of 185,119 variants (11,356 of them monomorphic) were compared in 530 individuals who had whole exome sequence data. A total of 98,113,070 pairs of genotypes were tested: 99.77% were concordant, 0.14% had missing data, and 0.09% were discordant. We report that joint calling makes it possible to accurately genotype rare variation using array technology when large sample sizes are available and best practices are followed. The cluster file from this experiment is available at www.chargeconsortium.com/main/exomechip. PMID:23874508
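
    The concordance figures above come from pairwise comparison of call matrices. A minimal illustration follows, using synthetic 0/1/2 dosage calls with -1 for missing; the 530-individual count follows the abstract, while the variant count and error rates are placeholders.

    ```python
    # Compare chip genotypes against sequence genotypes for the same
    # individuals and tally concordant, missing, and discordant pairs.
    import numpy as np

    rng = np.random.default_rng(2)
    chip = rng.integers(0, 3, size=(530, 1000))    # individuals x variants
    seq = chip.copy()
    seq[rng.random(seq.shape) < 0.0014] = -1       # simulated missingness
    seq[rng.random(seq.shape) < 0.0009] = 0        # simulated discordance

    both = (chip >= 0) & (seq >= 0)
    n = chip.size
    print("concordant:", ((chip == seq) & both).sum() / n)
    print("missing:   ", (~both).sum() / n)
    print("discordant:", (both & (chip != seq)).sum() / n)
    ```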

  6. Modelling spatiotemporal change using multidimensional arrays

    NASA Astrophysics Data System (ADS)

    Lu, Meng; Appel, Marius; Pebesma, Edzer

    2017-04-01

    The large variety of remote sensors, model simulations, and in-situ records provides great opportunities to model environmental change. The massive amount of high-dimensional data calls for methods to integrate data from various sources and to analyse spatiotemporal and thematic information jointly. An array is a collection of elements ordered and indexed in arbitrary dimensions, which naturally represents spatiotemporal phenomena identified by their geographic locations and recording times. In addition, array regridding (e.g., resampling, down-/up-scaling), dimension reduction, and spatiotemporal statistical algorithms are readily applicable to arrays. However, the role of arrays in big geoscientific data analysis has not been systematically studied: How can arrays discretise continuous spatiotemporal phenomena? How can arrays facilitate the extraction of multidimensional information? How can arrays provide a clean, scalable and reproducible change-modelling process that is communicable between mathematicians, computer scientists, Earth system scientists and stakeholders? This study focuses on detecting spatiotemporal change using satellite image time series. Current change detection methods using satellite image time series commonly analyse data in separate steps: 1) forming a vegetation index, 2) conducting time series analysis on each pixel, and 3) post-processing and mapping time series analysis results; this approach does not consider spatiotemporal correlations and ignores much of the spectral information. Multidimensional information can be better extracted by jointly considering spatial, spectral, and temporal information. To approach this goal, we use principal component analysis to extract multispectral information and spatial autoregressive models to account for spatial correlation in residual-based time series structural change modelling. We also discuss the potential of multivariate non-parametric time series structural change methods, hierarchical modelling, and extreme event detection methods to model spatiotemporal change. We show how array operations can facilitate expressing these methods, and how the open-source array data management and analytics software SciDB and R can be used to scale the process and make it easily reproducible.
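
    The joint spectral-temporal idea can be sketched directly on an array. Below, a random x/y/band/time cube stands in for a real image stack; the spectral bands are collapsed to a first principal component, leaving one time series per pixel for structural-change testing. The spatial autoregressive step is omitted, and all dimensions are placeholders.

    ```python
    # PCA across spectral bands of an x/y/band/time cube: flatten pixels
    # and time steps into observations, bands into features, and keep the
    # leading component.
    import numpy as np

    nx, ny, n_band, n_t = 50, 50, 6, 24
    cube = np.random.default_rng(3).standard_normal((nx, ny, n_band, n_t))

    obs = cube.transpose(0, 1, 3, 2).reshape(-1, n_band)
    obs -= obs.mean(axis=0)
    _, _, vt = np.linalg.svd(obs, full_matrices=False)
    pc1 = (obs @ vt[0]).reshape(nx, ny, n_t)

    print(pc1.shape)  # one PC1 time series per pixel
    ```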

  7. The Matthew Effect and widely prescribed pharmaceuticals lacking environmental monitoring: case study of an exposure-assessment vulnerability.

    PubMed

    Daughton, Christian G

    2014-01-01

    Assessing ambient exposure to chemical stressors often begins with time-consuming and costly monitoring studies to establish environmental occurrence. Both human and ecological toxicology are currently challenged by the unknowns surrounding low-dose exposure/effects, compounded by the reality that exposure undoubtedly involves mixtures of multiple stressors whose identities and levels can vary over time. Long absent from the assessment process, however, is consideration of whether the full scope of the identities of the stressors is sufficiently known. The Matthew Effect (a psychosocial phenomenon sometimes informally called the "bandwagon effect" or "iceberg effect," among others) may adversely bias or corrupt the exposure assessment process. The Matthew Effect is evidenced by decisions that base the selection of stressors to target in environmental monitoring surveys on whether they have been identified in prior studies, rather than considering the possibility that additional, but previously unreported, stressors might also play important roles in an exposure scenario. The possibility that the Matthew Effect might influence the scope of environmental stressor research is explored for the first time in a comprehensive case study that examines the preponderance of "absence of data" (in contrast to positive data and "data of absence") for the environmental occurrence of a very large class of potential chemical stressors associated with ubiquitous consumer use - active pharmaceutical ingredients (APIs). Comprehensive examination of the published data for an array of several hundred of the most frequently used drugs, asking whether their APIs are environmental contaminants, provides a prototype example to catalyze discussion among the many disciplines involved with assessing risk. The findings could help guide the selection of those APIs that might merit targeting for environmental monitoring (based on the absence of data for environmental occurrence) as well as the prescribing of those medications that might have minimal environmental impact (based on data of absence for environmental occurrence). © 2013. Published by Elsevier B.V. All rights reserved.

  8. Workplace drug testing and worker drug use.

    PubMed

    Carpenter, Christopher S

    2007-04-01

    To examine the nature and extent of the association between workplace drug testing and worker drug use. Repeated cross-sections from the 2000 to 2001 National Household Surveys on Drug Abuse (NHSDA) and the 2002 National Survey on Drug Use and Health (NSDUH). Multivariate logistic regression models of the likelihood of marijuana use are estimated as a function of several different workplace drug policies, including drug testing. Specific questions about penalty severity and the likelihood of detection are used to further evaluate the nature of the association. Individuals whose employers perform drug tests are significantly less likely to report past-month marijuana use, even after controlling for a wide array of worker and job characteristics. However, large negative associations are also found for variables indicating whether a firm has drug education, an employee assistance program, or a simple written policy about substance use. Accounting for these other workplace characteristics reduces, but does not eliminate, the testing differential. Frequent testing and severe penalties reduce the likelihood that workers use marijuana. Previous studies have interpreted the large negative correlation between workplace drug testing and employee substance use as representing a causal deterrent effect of drug testing. Our results, using more comprehensive data, suggest that these estimates have been slightly overstated due to omitted-variables bias. The overall pattern of results remains largely consistent with the hypothesis that workplace drug testing deters worker drug use.
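
    The modelling strategy described above can be illustrated with a small hedged sketch: a logistic regression of a past-month use indicator on policy indicators over synthetic data. The covariates and effect sizes below are placeholders, not the NHSDA/NSDUH variables.

    ```python
    # Logistic regression of a binary use outcome on workplace policy
    # indicators, using statsmodels; data are simulated for illustration.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 5000
    drug_test = rng.integers(0, 2, n)   # employer performs drug tests
    education = rng.integers(0, 2, n)   # employer offers drug education
    written = rng.integers(0, 2, n)     # written substance-use policy

    logit_p = -1.5 - 0.4 * drug_test - 0.2 * education - 0.1 * written
    use = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(float)

    X = sm.add_constant(np.column_stack([drug_test, education, written]))
    fit = sm.Logit(use, X).fit(disp=0)
    print(fit.params)  # negative coefficients = lower odds of reported use
    ```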

  9. Antenna and Electronics Cost Tradeoffs For Large Arrays

    NASA Technical Reports Server (NTRS)

    D'Addario, Larry R.

    2007-01-01

    This viewgraph presentation describes the cost tradeoffs for large antenna arrays. The contents include: 1) Cost modeling for large arrays; 2) Antenna mechanical cost over a wide range of sizes; and 3) Cost of per-antenna electronics.

  10. Copy number variants analysis in a cohort of isolated and syndromic developmental delay/intellectual disability reveals novel genomic disorders, position effects and candidate disease genes.

    PubMed

    Di Gregorio, E; Riberi, E; Belligni, E F; Biamino, E; Spielmann, M; Ala, U; Calcia, A; Bagnasco, I; Carli, D; Gai, G; Giordano, M; Guala, A; Keller, R; Mandrile, G; Arduino, C; Maffè, A; Naretto, V G; Sirchia, F; Sorasio, L; Ungari, S; Zonta, A; Zacchetti, G; Talarico, F; Pappi, P; Cavalieri, S; Giorgio, E; Mancini, C; Ferrero, M; Brussino, A; Savin, E; Gandione, M; Pelle, A; Giachino, D F; De Marchi, M; Restagno, G; Provero, P; Cirillo Silengo, M; Grosso, E; Buxbaum, J D; Pasini, B; De Rubeis, S; Brusco, A; Ferrero, G B

    2017-10-01

    Array-comparative genomic hybridization (array-CGH) is a widely used technique to detect copy number variants (CNVs) associated with developmental delay/intellectual disability (DD/ID). Identification of genomic disorders in DD/ID. We performed a comprehensive array-CGH investigation of 1,015 consecutive cases with DD/ID and combined literature mining, genetic evidence, evolutionary constraint scores, and functional information in order to assess the pathogenicity of the CNVs. We identified non-benign CNVs in 29% of patients. Amongst the pathogenic variants (11%), detected with a yield consistent with the literature, we found rare genomic disorders and CNVs spanning known disease genes. We further identified and discussed 51 cases with likely pathogenic CNVs spanning novel candidate genes, including genes encoding synaptic components and/or proteins involved in corticogenesis. Additionally, we identified two deletions spanning potential Topological Associated Domain (TAD) boundaries probably affecting the regulatory landscape. We show how phenotypic and genetic analyses of array-CGH data allow unraveling complex cases, identifying rare disease genes, and revealing unexpected position effects. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  11. Brute-force mapmaking with compact interferometers: a MITEoR northern sky map from 128 to 175 MHz

    NASA Astrophysics Data System (ADS)

    Zheng, H.; Tegmark, M.; Dillon, J. S.; Liu, A.; Neben, A. R.; Tribiano, S. M.; Bradley, R. F.; Buza, V.; Ewall-Wice, A.; Gharibyan, H.; Hickish, J.; Kunz, E.; Losh, J.; Lutomirski, A.; Morgan, E.; Narayanan, S.; Perko, A.; Rosner, D.; Sanchez, N.; Schutz, K.; Valdez, M.; Villasenor, J.; Yang, H.; Zarb Adami, K.; Zelko, I.; Zheng, K.

    2017-03-01

    We present a new method for interferometric imaging that is ideal for the large fields of view and compact arrays common in 21 cm cosmology. We first demonstrate the method with simulations for two very different low-frequency interferometers, the Murchison Widefield Array and the MIT Epoch of Reionization (MITEoR) experiment. We then apply the method to the MITEoR data set collected in 2013 July to obtain the first northern sky map from 128 to 175 MHz at ∼2° resolution and find an overall spectral index of -2.73 ± 0.11. The success of this imaging method bodes well for upcoming compact redundant low-frequency arrays such as the Hydrogen Epoch of Reionization Array. Both the MITEoR interferometric data and the 150 MHz sky map are available at http://space.mit.edu/home/tegmark/omniscope.html.
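
    "Brute-force" mapmaking of this kind treats imaging as a linear inverse problem: data d = Ax + n for sky pixels x, with maximum-likelihood solution (A^T N^-1 A)^-1 A^T N^-1 d. A toy numpy sketch follows; the random A and small dimensions are illustrative, whereas real arrays use far larger, structured response matrices.

    ```python
    # Maximum-likelihood map estimate from simulated visibilities.
    import numpy as np

    rng = np.random.default_rng(5)
    n_vis, n_pix = 200, 50
    A = rng.standard_normal((n_vis, n_pix))        # pixel -> visibility response
    x_true = rng.standard_normal(n_pix)
    noise_var = 0.1 * np.ones(n_vis)
    d = A @ x_true + rng.standard_normal(n_vis) * np.sqrt(noise_var)

    Ninv = np.diag(1.0 / noise_var)
    x_hat = np.linalg.solve(A.T @ Ninv @ A, A.T @ Ninv @ d)
    print(np.corrcoef(x_true, x_hat)[0, 1])        # near 1 when well conditioned
    ```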

  12. A comparison of earthquake backprojection imaging methods for dense local arrays

    NASA Astrophysics Data System (ADS)

    Beskardes, G. D.; Hole, J. A.; Wang, K.; Michaelides, M.; Wu, Q.; Chapman, M. C.; Davenport, K. K.; Brown, L. D.; Quiros, D. A.

    2018-03-01

    Backprojection imaging has recently become a practical method for local earthquake detection and location due to the deployment of densely sampled, continuously recorded, local seismograph arrays. While backprojection sometimes utilizes the full seismic waveform, the waveforms are often pre-processed and simplified to overcome imaging challenges. Real-data issues include aliased station spacing, inadequate array aperture, inaccurate velocity models, low signal-to-noise ratio, large noise bursts, and varying waveform polarity. We compare the performance of backprojection with four previously used data pre-processing methods: raw waveform, envelope, short-term averaging/long-term averaging, and kurtosis. Our primary goal is to detect and locate events smaller than noise by stacking prior to detection to improve the signal-to-noise ratio. The objective is to identify an optimized strategy for automated imaging that is robust in the presence of real-data issues, has the lowest signal-to-noise thresholds for detection and location, has the best spatial resolution of the source images, preserves magnitude, and considers computational cost. Imaging method performance is assessed using a real aftershock data set recorded by the dense AIDA array following the 2011 Virginia earthquake. Our comparisons show that raw-waveform backprojection provides the best spatial resolution, preserves magnitude, and boosts signal to detect events smaller than noise, but is most sensitive to velocity error, polarity error, and noise bursts. The other methods avoid polarity error and reduce sensitivity to velocity error, but sacrifice spatial resolution and cannot effectively reduce noise by stacking. Of these, only kurtosis is insensitive to large noise bursts while being as efficient as the raw-waveform method at lowering the detection threshold; however, it does not preserve magnitude information. For automatic detection and location of events in a large data set, we therefore recommend backprojecting kurtosis waveforms, followed by a second pass on the detected events using noise-filtered raw waveforms to achieve the best of all criteria.
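
    The shift-and-stack core of backprojection, with a running kurtosis as the characteristic function (one of the pre-processing choices compared above), can be sketched as follows. Everything here is synthetic, SciPy is assumed available, and the origin time is assumed known to keep the grid search two-dimensional; real searches also scan over time.

    ```python
    # Backprojection sketch: transform traces to running kurtosis, then
    # stack along predicted moveout for each trial epicentre on a grid.
    import numpy as np
    from numpy.lib.stride_tricks import sliding_window_view
    from scipy.stats import kurtosis

    rng = np.random.default_rng(6)
    n_sta, n_samp, dt, v = 20, 2000, 0.01, 3.0   # stations, samples, s, km/s
    sta_xy = rng.uniform(0, 10, (n_sta, 2))      # station coordinates (km)
    src, t0 = np.array([5.0, 5.0]), 5.0          # true epicentre, origin time

    traces = rng.standard_normal((n_sta, n_samp))
    for i, ta in enumerate(t0 + np.linalg.norm(sta_xy - src, axis=1) / v):
        traces[i, int(ta / dt):int(ta / dt) + 20] += 5.0   # impulsive arrivals

    win = 50
    char = kurtosis(sliding_window_view(traces, win, axis=1), axis=2)

    def stack(gx, gy):
        # sum the characteristic function at each station's predicted arrival
        t = t0 + np.linalg.norm(sta_xy - [gx, gy], axis=1) / v
        idx = np.clip((t / dt).astype(int) - win + 1, 0, char.shape[1] - 1)
        return char[np.arange(n_sta), idx].sum()

    xs = np.linspace(0, 10, 21)
    best = max(((x, y) for x in xs for y in xs), key=lambda g: stack(*g))
    print("best trial epicentre:", best)
    ```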

  13. Reconfigurable Antennas for High Data Rate Multi-beam Communication Systems

    NASA Technical Reports Server (NTRS)

    Bernhard, Jennifer T.; Michielssen, Eric

    2005-01-01

    High-speed (2-100 Mb/sec) wireless data communication - whether land- or satellite-based - faces a major challenge: high error rates caused by interference and unpredictable environments. A planar antenna system that can be reconfigured to respond to changing conditions has the potential to dramatically improve data throughput and system reliability. Moreover, new planar antenna designs that reduce array size, weight, and cost can have a significant impact on terrestrial and satellite communication system performance. This research developed new individually-reconfigurable planar antenna array elements that can be adjusted to provide multiple beams while providing increased scan angles and higher aperture efficiency than traditional diffraction-limited arrays. These new elements are microstrip spiral antennas with specialized tuning mechanisms that provide adjustable radiation patterns. We anticipate that these new elements can be used in both large and small arrays for inter-satellite communication as well as tracking of multiple mobile surface-based units. Our work has developed both theoretical descriptions as well as experimental prototypes of the antennas in both single element and array embodiments. The technical summary of the results of this work is divided into six sections: A. Cavity model for analysis and design of pattern reconfigurable antennas; B. Performance of antenna in array configurations for broadside and endfire operation; C. Performance of antenna in array configurations for beam scanning operation; D. Simulation of antennas in infinite phased arrays; E. Demonstration of antenna with commercially-available RF MEMS switches; F. Design of antenna MEMS switch combinations for direct simultaneous fabrication.

  14. Performance of the image statistics decoder in conjunction with the Goldstone-VLA array

    NASA Technical Reports Server (NTRS)

    Wang, H. C.; Pitt, G. H., III

    1989-01-01

    During Voyager's Neptune encounter, the National Radio Astronomy Observatory's Very Large Array (VLA) will be arrayed with Goldstone antennas to receive the transmitted telemetry data from the spacecraft. The telemetry signal from the VLA will drop out periodically, resulting in a periodic drop in the received signal-to-noise ratio (SNR). The Image Statistics Decoder (ISD), which assumes a correlation between pixels, can improve the bit error rate (BER) for images during these dropout periods. Simulation results have shown that the ISD, in conjunction with the Goldstone-VLA array, can provide a 3-dB gain for uncompressed images at a BER of 5.0 × 10^-3.

  15. Mineral resources of the Cabinet Mountains Wilderness, Lincoln and Sanders Counties, Montana

    USGS Publications Warehouse

    Lindsey, David A.; Wells, J.D.; Van Loenen, R. E.; Banister, D.P.; Welded, R.D.; Zilka, N.T.; Schmauch, S.W.

    1978-01-01

    This report describes the differential array of seismometers recently installed at the Hollister, California, Municipal Airport. Such an array of relatively closely spaced seismometers has already been installed in El Centro and provided useful information for both engineering and seismological applications from the 1979 Imperial Valley earthquake. Differential ground motions, principally due to horizontally propagating surface waves, are important in determining the stresses in such extended structures as large mat foundations for nuclear power stations, dams, bridges and pipelines. Further, analyses of the records of the 1979 Imperial Valley earthquake from the differential array have demonstrated the utility of short-baseline array data in tracking the progress of the rupture wave front of an earthquake.

  16. Development of a Microphone Phased Array Capability for the Langley 14- by 22-Foot Subsonic Tunnel

    NASA Technical Reports Server (NTRS)

    Humphreys, William M.; Brooks, Thomas F.; Bahr, Christopher J.; Spalt, Taylor B.; Bartram, Scott M.; Culliton, William G.; Becker, Lawrence E.

    2014-01-01

    A new aeroacoustic measurement capability has been developed for use in open-jet testing in the NASA Langley 14- by 22-Foot Subsonic Tunnel (14x22 tunnel). A suite of instruments has been developed to characterize noise source strengths, locations, and directivity for both semi-span and full-span test articles in the facility. The primary instrument of the suite is a fully traversable microphone phased array for identification of noise source locations and strengths on models. The array can be mounted in the ceiling or on either side of the facility test section to accommodate various test article configurations. Complementing the phased array is an ensemble of streamwise traversing microphones that can be placed around the test section at defined locations to conduct noise source directivity studies along both flyover and sideline axes. A customized data acquisition system has been developed for the instrumentation suite that allows for command and control of all aspects of the array and microphone hardware, and is coupled with a comprehensive data reduction system to generate information in near real time. This information includes such items as time histories and spectral data for individual microphones and groups of microphones, contour presentations of noise source locations and strengths, and hemispherical directivity data. The data acquisition system integrates with the 14x22 tunnel data system to allow real time capture of facility parameters during acquisition of microphone data. The design of the phased array system has been vetted via a theoretical performance analysis based on conventional monopole beamforming and DAMAS deconvolution. The performance analysis provides the ability to compute figures of merit for the array as well as characterize factors such as beamwidths, sidelobe levels, and source discrimination for the types of noise sources anticipated in the 14x22 tunnel. The full paper will summarize in detail the design of the instrumentation suite, the construction of the hardware system, and the results of the performance analysis. Although the instrumentation suite is designed to characterize noise for a variety of test articles in the 14x22 tunnel, this paper will concentrate on description of the instruments for two specific test campaigns in the facility, namely a full-span NASA Hybrid Wing Body (HWB) model entry and a semi-span Gulfstream aircraft model entry, tested in the facility in the winter of 2012 and spring of 2013, respectively.
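
    The conventional beamforming underlying the performance analysis can be sketched at a single frequency: build a cross-spectral matrix from snapshots, then scan steering vectors over a grid and map the steered power. The geometry, frequency, and synthetic source below are placeholders, and the DAMAS deconvolution stage is omitted.

    ```python
    # Frequency-domain delay-and-sum beamforming over a planar mic array.
    import numpy as np

    rng = np.random.default_rng(7)
    c_snd, f = 343.0, 2000.0                     # sound speed (m/s), frequency (Hz)
    k = 2 * np.pi * f / c_snd
    mics = np.c_[rng.uniform(-0.5, 0.5, (32, 2)), np.zeros(32)]  # z = 0 plane
    src = np.array([0.2, -0.1, 1.0])             # true source, 1 m off the array

    def steer(p):
        r = np.linalg.norm(mics - p, axis=1)
        v = np.exp(-1j * k * r) / r              # monopole steering vector
        return v / np.linalg.norm(v)             # unit-norm weights

    sig = rng.standard_normal(100)               # narrowband source snapshots
    snaps = np.exp(-1j * k * np.linalg.norm(mics - src, axis=1))[:, None] * sig
    snaps = snaps + 0.1 * (rng.standard_normal((32, 100))
                           + 1j * rng.standard_normal((32, 100)))
    csm = snaps @ snaps.conj().T / snaps.shape[1]   # cross-spectral matrix

    xs = np.linspace(-0.5, 0.5, 41)
    power = np.array([[np.real(steer([x, y, 1.0]).conj() @ csm @ steer([x, y, 1.0]))
                       for x in xs] for y in xs])
    iy, ix = np.unravel_index(power.argmax(), power.shape)
    print("beamforming peak at x,y =", xs[ix], xs[iy])
    ```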

  17. Global characterization of copy number variants in epilepsy patients from whole genome sequencing

    PubMed Central

    Meloche, Caroline; Andrade, Danielle M.; Lafreniere, Ron G.; Gravel, Micheline; Spiegelman, Dan; Dionne-Laporte, Alexandre; Boelman, Cyrus; Hamdan, Fadi F.; Michaud, Jacques L.; Rouleau, Guy; Minassian, Berge A.; Bourque, Guillaume; Cossette, Patrick

    2018-01-01

    Epilepsy will affect nearly 3% of people at some point during their lifetime. Previous copy number variants (CNVs) studies of epilepsy have used array-based technology and were restricted to the detection of large or exonic events. In contrast, whole-genome sequencing (WGS) has the potential to more comprehensively profile CNVs but existing analytic methods suffer from limited accuracy. We show that this is in part due to the non-uniformity of read coverage, even after intra-sample normalization. To improve on this, we developed PopSV, an algorithm that uses multiple samples to control for technical variation and enables the robust detection of CNVs. Using WGS and PopSV, we performed a comprehensive characterization of CNVs in 198 individuals affected with epilepsy and 301 controls. For both large and small variants, we found an enrichment of rare exonic events in epilepsy patients, especially in genes with predicted loss-of-function intolerance. Notably, this genome-wide survey also revealed an enrichment of rare non-coding CNVs near previously known epilepsy genes. This enrichment was strongest for non-coding CNVs located within 100 Kbp of an epilepsy gene and in regions associated with changes in the gene expression, such as expression QTLs or DNase I hypersensitive sites. Finally, we report on 21 potentially damaging events that could be associated with known or new candidate epilepsy genes. Our results suggest that comprehensive sequence-based profiling of CNVs could help explain a larger fraction of epilepsy cases. PMID:29649218
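
    The population-based normalization that PopSV builds on can be illustrated with a toy z-score screen: each bin of a test sample's coverage is compared against the same bin across a reference panel. The coverage values below are simulated placeholders, not the PopSV algorithm itself.

    ```python
    # Flag coverage bins whose z-score against a reference panel is extreme.
    import numpy as np

    rng = np.random.default_rng(8)
    n_ref, n_bins = 300, 10000
    ref = rng.normal(100, 10, (n_ref, n_bins))   # reference coverage matrix

    test = rng.normal(100, 10, n_bins)
    test[2000:2010] *= 1.5                        # simulated duplication

    z = (test - ref.mean(axis=0)) / ref.std(axis=0)
    print("candidate CNV bins:", np.where(np.abs(z) > 5)[0])
    ```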

  18. Ultralow-Background Large-Format Bolometer Arrays

    NASA Technical Reports Server (NTRS)

    Benford, Dominic; Chervenak, Jay; Irwin, Kent; Moseley, S. Harvey; Oegerle, William (Technical Monitor)

    2002-01-01

    In the coming decade, work will commence in earnest on large cryogenic far-infrared telescopes and interferometers. All such observatories - for example, SAFIR, SPIRIT, and SPECS - require large-format, two-dimensional arrays of close-packed detectors capable of reaching the fundamental limits imposed by the very low photon backgrounds present in deep space. In the near term, bolometer array architectures that permit 1000 pixels - perhaps sufficient for the next generation of space-based instruments - can be arrayed efficiently. Demonstrating the necessary performance, with Noise Equivalent Powers (NEPs) of order 10^-20 W/Hz^(1/2), will be a hurdle in the coming years. Superconducting bolometer arrays are a promising technology for providing both the performance and the array size necessary. We discuss the requirements for future detector arrays in the far-infrared and submillimeter, describe the parameters of superconducting bolometer arrays able to meet these requirements, and detail the present and near-future technology of superconducting bolometer arrays. Of particular note is the coming development of large-format planar arrays with absorber-coupled and antenna-coupled bolometers.

  19. Accurate prediction of secondary metabolite gene clusters in filamentous fungi.

    PubMed

    Andersen, Mikael R; Nielsen, Jakob B; Klitgaard, Andreas; Petersen, Lene M; Zachariasen, Mia; Hansen, Tilde J; Blicher, Lene H; Gotfredsen, Charlotte H; Larsen, Thomas O; Nielsen, Kristian F; Mortensen, Uffe H

    2013-01-02

    Biosynthetic pathways of secondary metabolites from fungi are currently subject to an intense effort to elucidate the genetic basis for these compounds due to their large potential within pharmaceutics and synthetic biochemistry. The preferred method is methodical gene deletions to identify supporting enzymes for key synthases one cluster at a time. In this study, we design and apply a DNA expression array for Aspergillus nidulans in combination with legacy data to form a comprehensive gene expression compendium. We apply a guilt-by-association-based analysis to predict the extent of the biosynthetic clusters for the 58 synthases active in our set of experimental conditions. A comparison with legacy data shows the method to be accurate in 13 of 16 known clusters and nearly accurate for the remaining 3 clusters. Furthermore, we apply a data clustering approach, which identifies cross-chemistry between physically separate gene clusters (superclusters), and validate this both with legacy data and experimentally by prediction and verification of a supercluster consisting of the synthase AN1242 and the prenyltransferase AN11080, as well as identification of the product compound nidulanin A. We have used A. nidulans for our method development and validation due to the wealth of available biochemical data, but the method can be applied to any fungus with a sequenced and assembled genome, thus supporting further secondary metabolite pathway elucidation in the fungal kingdom.

  20. Integrative Genomics Viewer (IGV) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    The Integrative Genomics Viewer (IGV) is a high-performance visualization tool for interactive exploration of large, integrated genomic datasets. It supports a wide variety of data types, including array-based and next-generation sequence data, and genomic annotations.

  1. Synthesis of a large communications aperture using small antennas

    NASA Technical Reports Server (NTRS)

    Resch, George M.; Cwik, T. W.; Jamnejad, V.; Logan, R. T.; Miller, R. B.; Rogstad, Dave H.

    1994-01-01

    In this report we compare the cost of an array of small antennas to that of a single large antenna assuming both the array and single large antenna have equal performance and availability. The single large antenna is taken to be one of the 70-m antennas of the Deep Space Network. The cost of the array is estimated as a function of the array element diameter for three different values of system noise temperature corresponding to three different packaging schemes for the first amplifier. Array elements are taken to be fully steerable paraboloids and their cost estimates were obtained from commercial vendors. Array loss mechanisms and calibration problems are discussed. For array elements in the range 3 - 35 m there is no minimum in the cost versus diameter curve for the three system temperatures that were studied.

  2. Jllumina - A comprehensive Java-based API for statistical Illumina Infinium HumanMethylation450 and Infinium MethylationEPIC BeadChip data processing.

    PubMed

    Almeida, Diogo; Skov, Ida; Lund, Jesper; Mohammadnejad, Afsaneh; Silva, Artur; Vandin, Fabio; Tan, Qihua; Baumbach, Jan; Röttger, Richard

    2016-10-01

    Measuring differential DNA methylation is nowadays the most common approach to linking epigenetic modifications to diseases (in so-called epigenome-wide association studies, EWAS). Owing to their low cost, efficiency, and easy handling, the Illumina HumanMethylation450 BeadChip and its successor, the Infinium MethylationEPIC BeadChip, are by far the most popular techniques for conducting EWAS in large patient cohorts. Despite the popularity of this chip technology, raw data processing and statistical analysis of the array data remain far from trivial and still lack dedicated software libraries enabling high-quality and statistically sound downstream analyses. As of yet, only R-based solutions are freely available for low-level processing of the Illumina chip data. The lack of alternative libraries poses a hurdle for the development of new bioinformatic tools, in particular when it comes to web services or applications where run time and memory consumption matter, or where EWAS data analysis is an integrative part of a bigger framework or data analysis pipeline. We have therefore developed and implemented Jllumina, an open-source Java library for raw data manipulation of Illumina Infinium HumanMethylation450 and Infinium MethylationEPIC BeadChip data, supporting the developer with Java functions that cover reading and preprocessing the raw data, through statistical assessment and permutation tests, to the identification of differentially methylated loci. Jllumina is fully parallelizable and publicly available at http://dimmer.compbio.sdu.dk/download.html.
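
    Jllumina itself is a Java library; as a language-neutral illustration of the core preprocessing quantity involved, the sketch below computes Illumina-style beta values from methylated (M) and unmethylated (U) probe intensities, beta = M / (M + U + 100). The intensities are synthetic, and the constant-100 offset is the conventional stabiliser for low-signal probes.

    ```python
    # Beta values from methylated/unmethylated intensities (synthetic data).
    import numpy as np

    rng = np.random.default_rng(9)
    meth = rng.gamma(5.0, 300.0, size=(10, 10000))     # samples x probes
    unmeth = rng.gamma(5.0, 300.0, size=(10, 10000))

    beta = meth / (meth + unmeth + 100.0)  # offset stabilises low-signal probes
    print(beta.min(), beta.max())          # bounded within [0, 1)
    ```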

  3. Jllumina - A comprehensive Java-based API for statistical Illumina Infinium HumanMethylation450 and MethylationEPIC data processing.

    PubMed

    Almeida, Diogo; Skov, Ida; Lund, Jesper; Mohammadnejad, Afsaneh; Silva, Artur; Vandin, Fabio; Tan, Qihua; Baumbach, Jan; Röttger, Richard

    2016-12-18

    Measuring differential DNA methylation is nowadays the most common approach to linking epigenetic modifications to diseases (in so-called epigenome-wide association studies, EWAS). Owing to their low cost, efficiency, and easy handling, the Illumina HumanMethylation450 BeadChip and its successor, the Infinium MethylationEPIC BeadChip, are by far the most popular techniques for conducting EWAS in large patient cohorts. Despite the popularity of this chip technology, raw data processing and statistical analysis of the array data remain far from trivial and still lack dedicated software libraries enabling high-quality and statistically sound downstream analyses. As of yet, only R-based solutions are freely available for low-level processing of the Illumina chip data. The lack of alternative libraries poses a hurdle for the development of new bioinformatic tools, in particular when it comes to web services or applications where run time and memory consumption matter, or where EWAS data analysis is an integrative part of a bigger framework or data analysis pipeline. We have therefore developed and implemented Jllumina, an open-source Java library for raw data manipulation of Illumina Infinium HumanMethylation450 and Infinium MethylationEPIC BeadChip data, supporting the developer with Java functions that cover reading and preprocessing the raw data, through statistical assessment and permutation tests, to the identification of differentially methylated loci. Jllumina is fully parallelizable and publicly available at http://dimmer.compbio.sdu.dk/download.html.

  4. The Long Wavelength Array (LWA): A Large HF/VHF Array for Solar Physics, Ionospheric Science, and Solar Radar

    DTIC Science & Technology

    2010-09-01

    The LWA adds an extra dimension to both IPS and other observations. The polarization of the CME synchrotron emission observed by [3] will be of great... REFERENCES: 1. Kassim et al., The 74 MHz System on the Very Large Array, The Astrophysical Journal Supplement Series, Vol. 172. Namir E. Kassim, Naval Research Laboratory.

  5. The compression–error trade-off for large gridded data sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silver, Jeremy D.; Zender, Charles S.

    The netCDF-4 format is widely used for large gridded scientific data sets and includes several compression methods: lossy linear scaling and the lossless deflate and shuffle algorithms. Many multidimensional geoscientific data sets exhibit considerable variation over one or several spatial dimensions (e.g., vertically) with less variation in the remaining dimensions (e.g., horizontally). On such data sets, linear scaling with a single pair of scale and offset parameters often entails considerable loss of precision. We introduce an alternative compression method called "layer-packing" that simultaneously exploits lossy linear scaling and lossless compression. Layer-packing stores arrays (instead of a scalar pair) of scale and offset parameters. An implementation of this method is compared with lossless compression, storing data at fixed relative precision (bit-grooming), and scalar linear packing in terms of compression ratio, accuracy and speed. When viewed as a trade-off between compression and error, layer-packing yields similar results to bit-grooming (storing between 3 and 4 significant figures). Bit-grooming and layer-packing offer significantly better control of precision than scalar linear packing. The relative performance, in terms of compression and errors, of bit-groomed and layer-packed data was strongly predicted by the entropy of the exponent array, while lossless compression was well predicted by the entropy of the original data array. Layer-packed data files must be "unpacked" to be readily usable. These compression and precision characteristics make layer-packing a competitive archive format for many scientific data sets.
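
    The layer-packing idea reduces to storing one scale/offset pair per layer rather than per variable. A numpy sketch follows; the 16-bit target and the synthetic cube with strongly layer-dependent magnitudes are illustrative.

    ```python
    # Per-layer linear packing into uint16 and the resulting round-trip error.
    import numpy as np

    rng = np.random.default_rng(10)
    layer_mag = np.logspace(-3, 3, 30)[:, None, None]   # range varies by layer
    data = rng.standard_normal((30, 180, 360)) * layer_mag

    lo = data.min(axis=(1, 2), keepdims=True)           # per-layer offsets
    hi = data.max(axis=(1, 2), keepdims=True)
    scale = (hi - lo) / (2**16 - 1)                     # per-layer scales

    packed = np.round((data - lo) / scale).astype(np.uint16)
    unpacked = packed * scale + lo                      # the "unpack" step

    rel = np.abs(unpacked - data).max(axis=(1, 2)) / (hi - lo).squeeze()
    print("worst per-layer relative error:", rel.max())  # ~1 / 2**17
    ```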

  6. The compression–error trade-off for large gridded data sets

    DOE PAGES

    Silver, Jeremy D.; Zender, Charles S.

    2017-01-27

    The netCDF-4 format is widely used for large gridded scientific data sets and includes several compression methods: lossy linear scaling and the lossless deflate and shuffle algorithms. Many multidimensional geoscientific data sets exhibit considerable variation over one or several spatial dimensions (e.g., vertically) with less variation in the remaining dimensions (e.g., horizontally). On such data sets, linear scaling with a single pair of scale and offset parameters often entails considerable loss of precision. We introduce an alternative compression method called "layer-packing" that simultaneously exploits lossy linear scaling and lossless compression. Layer-packing stores arrays (instead of a scalar pair) of scale and offset parameters. An implementation of this method is compared with lossless compression, storing data at fixed relative precision (bit-grooming), and scalar linear packing in terms of compression ratio, accuracy and speed. When viewed as a trade-off between compression and error, layer-packing yields similar results to bit-grooming (storing between 3 and 4 significant figures). Bit-grooming and layer-packing offer significantly better control of precision than scalar linear packing. The relative performance, in terms of compression and errors, of bit-groomed and layer-packed data was strongly predicted by the entropy of the exponent array, while lossless compression was well predicted by the entropy of the original data array. Layer-packed data files must be "unpacked" to be readily usable. These compression and precision characteristics make layer-packing a competitive archive format for many scientific data sets.

  7. Practical Considerations for Optimizing Position Sensitivity in Arrays of Position-sensitive TES's

    NASA Technical Reports Server (NTRS)

    Smith, Stephen J.; Bandler, Simon R.; Figueroa-Feliciano, Enectali; Iyomoto, Naoko; Kelley, Richard L.; Kilbourne, Caroline A.; Porter, Frederick S.; Sadleir, John E.

    2007-01-01

    We are developing Position-Sensitive Transition-Edge Sensors (PoST's) for future X-ray astronomy missions such as NASA's Constellation-X. The PoST consists of one or more Transition-Edge Sensors (TES's) thermally connected to a large X-ray absorber, which, through heat diffusion, gives rise to position dependence. The development of PoST's is motivated by the desire to achieve the largest focal-plane coverage with the fewest readout channels. In order to develop a practical array, consisting of an inner pixellated core with an outer array of large-absorber PoST's, we must be able to simultaneously read out all (~1800) channels in the array. This is achievable using time-division multiplexing (TDM), but it does set stringent slew-rate requirements on the array. Typically, we must damp the pulses to reduce the slew rate of the input signal to the TDM. This is achieved by applying a low-pass analog filter with large inductance to the signal. This attenuates the high-frequency components of the signal, essential for position discrimination in PoST's, relative to the white noise of the readout chain, and degrades the position sensitivity. Using numerically simulated data, we investigate the position-sensing ability of typical PoST designs under such high-inductance conditions. We investigate signal-processing techniques for optimal determination of the event position and discuss the practical considerations for real-time implementation.

  8. Modeling and optimization of atomic layer deposition processes on vertically aligned carbon nanotubes.

    PubMed

    Yazdani, Nuri; Chawla, Vipin; Edwards, Eve; Wood, Vanessa; Park, Hyung Gyu; Utke, Ivo

    2014-01-01

    Many energy conversion and storage devices exploit structured ceramics with large interfacial surface areas. Vertically aligned carbon nanotube (VACNT) arrays have emerged as possible scaffolds to support large surface area ceramic layers. However, obtaining conformal and uniform coatings of ceramics on structures with high aspect ratio morphologies is non-trivial, even with atomic layer deposition (ALD). Here we implement a diffusion model to investigate the effect of the ALD parameters on coating kinetics and use it to develop a guideline for achieving conformal and uniform thickness coatings throughout the depth of ultra-high aspect ratio structures. We validate the model predictions with experimental data from ALD coatings of VACNT arrays. However, the approach can be applied to predict film conformality as a function of depth for any porous topology, including nanopores and nanowire arrays.
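
    The diffusion-model idea can be sketched in one dimension: precursor enters at the pore mouth, diffuses downward, and is consumed by unreacted surface sites, so coverage builds from the top down. All coefficients below are illustrative stand-ins, not fitted ALD parameters.

    ```python
    # Explicit finite-difference sketch of precursor diffusion-reaction in a
    # unit-length pore during one exposure pulse.
    import numpy as np

    nz = 100                                  # depth grid points
    dz, dt = 1.0 / nz, 5e-5                   # spacing, time step (D*dt/dz**2 = 0.25)
    D, kr = 0.5, 50.0                         # diffusivity, surface reaction rate
    c = np.zeros(nz)                          # precursor concentration vs depth
    theta = np.zeros(nz)                      # fractional surface coverage

    for _ in range(20000):                    # one exposure pulse
        c[0] = 1.0                            # fixed concentration at pore mouth
        lap = np.zeros(nz)
        lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dz**2
        react = kr * c * (1.0 - theta)        # sticking on unreacted sites only
        c = c + dt * (D * lap - react)
        c[-1] = c[-2]                         # no-flux closed pore bottom
        theta = theta + dt * react

    print("coverage mouth/middle/bottom:",
          theta[0].round(3), theta[nz // 2].round(3), theta[-1].round(3))
    ```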

  9. Between Counting and Multiplication: Low-Attaining Students' Spatial Structuring, Enumeration and Errors in Concretely-Presented 3D Array Tasks

    ERIC Educational Resources Information Center

    Finesilver, Carla

    2017-01-01

    The move from additive to multiplicative thinking requires significant change in children's comprehension and manipulation of numerical relationships, involves various conceptual components, and can be a slow, multistage process for some. Unit arrays are a key visuospatial representation for supporting learning, but most research focuses on 2D…

  10. Evaluating the consistency of gene sets used in the analysis of bacterial gene expression data.

    PubMed

    Tintle, Nathan L; Sitarik, Alexandra; Boerema, Benjamin; Young, Kylie; Best, Aaron A; Dejongh, Matthew

    2012-08-08

    Statistical analyses of whole genome expression data require functional information about genes in order to yield meaningful biological conclusions. The Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) are common sources of functionally grouped gene sets. For bacteria, the SEED and MicrobesOnline provide alternative, complementary sources of gene sets. To date, no comprehensive evaluation of the data obtained from these resources has been performed. We define a series of gene set consistency metrics directly related to the most common classes of statistical analyses for gene expression data, and then perform a comprehensive analysis of 3581 Affymetrix® gene expression arrays across 17 diverse bacteria. We find that gene sets obtained from GO and KEGG demonstrate lower consistency than those obtained from the SEED and MicrobesOnline, regardless of gene set size. Despite the widespread use of GO and KEGG gene sets in bacterial gene expression data analysis, the SEED and MicrobesOnline provide more consistent sets for a wide variety of statistical analyses. Increased use of the SEED and MicrobesOnline gene sets in the analysis of bacterial gene expression data may improve statistical power and utility of expression data.

  11. Seismic & Infrasound Integrated Array "Apatity": techniques, data processing, first results of observations.

    NASA Astrophysics Data System (ADS)

    Vinogradov, Y.; Baryshnikov, A.

    2003-04-01

    Since September 2001, three infrasound membrane-type sensors (K-304 AM) have been installed on the territory of the seismic array "Apatity" near Lake Imandra. The seismic array comprises 11 short-period sensors (Geotech S-500) disposed on a small and a large circle (0.4 and 1 km diameter); the infrasound sensors are located on the small circle near the seismographs. All data are digitized at the array site and transmitted in real time to a processing center at the Kola Regional Seismological Centre (KRSC) in Apatity. The combined complex is called the Seismic & Infrasound Integrated Array (SISIA) "Apatity". To support temporary storage of the transmitted data in a disk loop and access to the data, the "NEWNORAC" program was created. This program replaced the "NORAC" system developed by the Norwegian institute NORSAR, which was previously in use at KRSC. A program package EL (event locator) for display and processing of the data has been modified. It now includes the following: quick access to the data stored in the disk loop (last two weeks); data conversion from disk-loop format to CSS 3.0 format; data filtering using bandpass, highpass, lowpass, adaptive or rejector filters; calculation of spectra and sonograms (spectral diagrams); seismic event location with plotting on a map; calculation of the backazimuth and apparent velocity of an acoustic wave from similar parts of wave recordings; and loading and processing of CSS 3.0 seismic and acoustic data from the KRSC archive. To store the acoustic data permanently, the program BARCSS was made; it rewrites the data from the disk loop to the KRSC archive in CSS 3.0 format. For comparison of the acoustic noise level with wind, we use data from the meteorological station in Kandalaksha city, with a sampling interval of 3 hours. During the period from October 2001 to October 2002, more than 745 seismic events were registered, mostly connected with the mining activity of the large mining enterprises on the Kola Peninsula. Most of the events caused by ground explosions were registered by the infrasound part of SISIA "Apatity"; their sources were at distances from 38 to 220 km. The results of observations during the first year enabled us to estimate the frequency range and main directions of arrival of acoustic waves, and the noise level at the place of observation. In accordance with these results and the local relief, a 4-ray wind-noise-reducing pipe array will be installed at all three sensors in May 2003 to improve detectability during windy conditions. Schemes of the SISIA "Apatity" array, data transmission and processing, and samples of detected signals are shown in the presentation.

  12. Failure mode analysis of silicon-based intracortical microelectrode arrays in non-human primates

    PubMed Central

    Barrese, James C; Rao, Naveen; Paroo, Kaivon; Triebwasser, Corey; Vargas-Irwin, Carlos; Franquemont, Lachlan; Donoghue, John P

    2016-01-01

    Objective Brain–computer interfaces (BCIs) using chronically implanted intracortical microelectrode arrays (MEAs) have the potential to restore lost function to people with disabilities if they work reliably for years. Current sensors fail to provide reliably useful signals over extended periods of time for reasons that are not clear. This study reports a comprehensive retrospective analysis from a large set of implants of a single type of intracortical MEA in a single species, with a common set of measures in order to evaluate failure modes. Approach Since 1996, 78 silicon MEAs were implanted in 27 monkeys (Macaca mulatta). We used two approaches to find reasons for sensor failure. First, we classified the time course leading up to complete recording failure as acute (abrupt) or chronic (progressive). Second, we evaluated the quality of electrode recordings over time based on signal features and electrode impedance. Failure modes were divided into four categories: biological, material, mechanical, and unknown. Main results Recording duration ranged from 0 to 2104 days (5.75 years), with a mean of 387 days and a median of 182 days (n = 78). Sixty-two arrays failed completely with a mean time to failure of 332 days (median = 133 days) while nine array experiments were electively terminated for experimental reasons (mean = 486 days). Seven remained active at the close of this study (mean = 753 days). Most failures (56%) occurred within a year of implantation, with acute mechanical failures the most common class (48%), largely because of connector issues (83%). Among grossly observable biological failures (24%), a progressive meningeal reaction that separated the array from the parenchyma was most prevalent (14.5%). In the absence of acute interruptions, electrode recordings showed a slow progressive decline in spike amplitude, noise amplitude, and number of viable channels that predicts complete signal loss by about eight years. Impedance measurements showed systematic early increases, which did not appear to affect recording quality, followed by a slow decline over years. The combination of slowly falling impedance and signal quality in these arrays indicates that insulating material failure is the most significant factor. Significance This is the first long-term failure mode analysis of an emerging BCI technology in a large series of non-human primates. The classification system introduced here may be used to standardize how neuroprosthetic failure modes are evaluated. The results demonstrate the potential for these arrays to record for many years, but achieving reliable sensors will require replacing connectors with implantable wireless systems, controlling the meningeal reaction, and improving insulation materials. These results will focus future research in order to create clinical neuroprosthetic sensors, as well as valuable research tools, that are able to safely provide reliable neural signals for over a decade. PMID:24216311

  13. Failure mode analysis of silicon-based intracortical microelectrode arrays in non-human primates

    NASA Astrophysics Data System (ADS)

    Barrese, James C.; Rao, Naveen; Paroo, Kaivon; Triebwasser, Corey; Vargas-Irwin, Carlos; Franquemont, Lachlan; Donoghue, John P.

    2013-12-01

    Objective. Brain-computer interfaces (BCIs) using chronically implanted intracortical microelectrode arrays (MEAs) have the potential to restore lost function to people with disabilities if they work reliably for years. Current sensors fail to provide reliably useful signals over extended periods of time for reasons that are not clear. This study reports a comprehensive retrospective analysis from a large set of implants of a single type of intracortical MEA in a single species, with a common set of measures in order to evaluate failure modes. Approach. Since 1996, 78 silicon MEAs were implanted in 27 monkeys (Macaca mulatta). We used two approaches to find reasons for sensor failure. First, we classified the time course leading up to complete recording failure as acute (abrupt) or chronic (progressive). Second, we evaluated the quality of electrode recordings over time based on signal features and electrode impedance. Failure modes were divided into four categories: biological, material, mechanical, and unknown. Main results. Recording duration ranged from 0 to 2104 days (5.75 years), with a mean of 387 days and a median of 182 days (n = 78). Sixty-two arrays failed completely with a mean time to failure of 332 days (median = 133 days) while nine array experiments were electively terminated for experimental reasons (mean = 486 days). Seven remained active at the close of this study (mean = 753 days). Most failures (56%) occurred within a year of implantation, with acute mechanical failures the most common class (48%), largely because of connector issues (83%). Among grossly observable biological failures (24%), a progressive meningeal reaction that separated the array from the parenchyma was most prevalent (14.5%). In the absence of acute interruptions, electrode recordings showed a slow progressive decline in spike amplitude, noise amplitude, and number of viable channels that predicts complete signal loss by about eight years. Impedance measurements showed systematic early increases, which did not appear to affect recording quality, followed by a slow decline over years. The combination of slowly falling impedance and signal quality in these arrays indicates that insulating material failure is the most significant factor. Significance. This is the first long-term failure mode analysis of an emerging BCI technology in a large series of non-human primates. The classification system introduced here may be used to standardize how neuroprosthetic failure modes are evaluated. The results demonstrate the potential for these arrays to record for many years, but achieving reliable sensors will require replacing connectors with implantable wireless systems, controlling the meningeal reaction, and improving insulation materials. These results will focus future research in order to create clinical neuroprosthetic sensors, as well as valuable research tools, that are able to safely provide reliable neural signals for over a decade.

  14. VizieR Online Data Catalog: List of Telescope Array events with E > 57EeV (Abbasi+, 2014)

    NASA Astrophysics Data System (ADS)

    Abbasi, R. U.; Abe, M.; Abu-Zayyad, T.; Allen, M.; Anderson, R.; Azuma, R.; Barcikowski, E.; Belz, J. W.; Bergman, D. R.; Blake, S. A.; Cady, R.; Chae, M. J.; Cheon, B. G.; Chiba, J.; Chikawa, M.; Cho, W. R.; Fujii, T.; Fukushima, M.; Goto, T.; Hanlon, W.; Hayashi, Y.; Hayashida, N.; Hibino, K.; Honda, K.; Ikeda, D.; Inoue, N.; Ishii, T.; Ishimori, R.; Ito, H.; Ivanov, D.; Jui, C. C. H.; Kadota, K.; Kakimoto, F.; Kalashev, O.; Kasahara, K.; Kawai, H.; Kawakami, S.; Kawana, S.; Kawata, K.; Kido, E.; Kim, H. B.; Kim, J. H.; Kitamura, S.; Kitamura, Y.; Kuzmin, V.; Kwon, Y. J.; Lan, J.; Lim, S. I.; Lundquist, J. P.; Machida, K.; Martens, K.; Matsuda, T.; Matsuyama, T.; Matthews, J. N.; Minamino, M.; Mukai, K.; Myers, I.; Nagasawa, K.; Nagataki, S.; Nakamura, T.; Nonaka, T.; Nozato, A.; Ogio, S.; Ogura, J.; Ohnishi, M.; Ohoka, H.; Oki, K.; Okuda, T.; Ono, M.; Oshima, A.; Ozawa, S.; Park, I. H.; Pshirkov, M. S.; Rodriguez, D. C.; Rubtsov, G.; Ryu, D.; Sagawa, H.; Sakurai, N.; Sampson, A. L.; Scott, L. M.; Shah, P. D.; Shibata, F.; Shibata, T.; Shimodaira, H.; Shin, B. K.; Smith, J. D.; Sokolsky, P.; Springer, R. W.; Stokes, B. T.; Stratton, S. R.; Stroman, T. A.; Suzawa, T.; Takamura, M.; Takeda, M.; Takeishi, R.; Taketa, A.; Takita, M.; Tameda, Y.; Tanaka, H.; Tanaka, K.; Tanaka, M.; Thomas, S. B.; Thomson, G. B.; Tinyakov, P.; Tkachev, I.; Tokuno, H.; Tomida, T.; Troitsky, S.; Tsunesada, Y.; Tsutsumi, K.; Uchihori, Y.; Udo, S.; Urban, F.; Vasiloff, G.; Wong, T.; Yamane, R.; Yamaoka, H.; Yamazaki, K.; Yang, J.; Yashiro, K.; Yoneda, Y.; Yoshida, S.; Yoshii, H.; Zollinger, R.; Zundel, Z.

    2017-03-01

    The TA is the largest cosmic-ray detector in the northern hemisphere. It consists of a scintillator SD array (Abu-Zayyad et al. 2012NIMPA.689...87A) and three fluorescence detector (FD) stations (Tokuno et al. 2012NIMPA.676...54T). The observatory has been in full operation in Millard County, Utah, USA (39.30°N, 112.91°W; about 1400 m above sea level) since 2008. The TA SD array consists of 507 plastic scintillation detectors, each 3 m2 in area, located on a 1.2 km square grid. The array has an area of ~700 km2. The TA SD array observes cosmic-ray-induced extensive air showers with E > ~1 EeV regardless of weather conditions, with a duty cycle near 100% and a wide field of view (FoV). These capabilities ensure a very stable and large geometrical exposure for the northern sky survey in comparison with FD observations, which have a duty cycle of ~10%. In this analysis, we used SD data recorded between 2008 May 11 and 2013 May 4. (1 data file).

  15. Optically addressed ultra-wideband phased antenna array

    NASA Astrophysics Data System (ADS)

    Bai, Jian

    Demands for high data rate and multifunctional apertures from both civilian and military users have motivated development of ultra-wideband (UWB) electrically steered phased arrays. Meanwhile, the need for large contiguous frequency is pushing operation of radio systems into the millimeter-wave (mm-wave) range. Therefore, modern radio systems require UWB performance from VHF to mm-wave. However, traditional electronic systems suffer many challenges that make achieving these requirements difficult. Several examples include: voltage-controlled oscillators (VCOs) cannot provide a tunable range of several octaves, distribution of wideband local oscillator signals undergoes high loss and dispersion through RF transmission lines, and antennas have very limited bandwidth or bulky sizes. Recently, RF photonics technology has drawn considerable attention because of its advantages over traditional systems, with the capability of offering extreme power efficiency, information capacity, frequency agility, and spatial beam diversity. A hybrid RF photonic communication system utilizing optical links and an RF transducer at the antenna potentially provides ultra-wideband data transmission, i.e., over 100 GHz. A successful implementation of such an optically addressed phased array requires addressing several key challenges. Photonic generation of an RF source with over a seven-octave bandwidth has been demonstrated in the last few years. However, one challenge that still remains is how to convey phased optical signals to downconversion modules and antennas. Therefore, a feed network with phase-sweeping capability and low excess phase noise needs to be developed. Another key challenge is to develop an ultra-wideband array antenna. Modern frontends require antennas to be compact, planar, and low-profile in addition to possessing broad bandwidth, conforming to stringent space, weight, cost, and power constraints. To address these issues, I will study broadband and miniaturization techniques for both single and array antennas. In addition, a prototype transmitting phased-array system is developed and demonstrates large bandwidth as well as beam-steering capability. The architecture of this system can be further developed into a large-scale array at higher frequencies such as mm-wave. This solution serves as a candidate for UWB multifunctional frontends.

  16. Report of the ultraviolet and visible sensors panel

    NASA Technical Reports Server (NTRS)

    Timothy, J. Gethyn; Blouke, M.; Bredthauer, R.; Kimble, R.; Lee, T.-H.; Lesser, M.; Siegmund, O.; Weckler, G.

    1991-01-01

    In order to meet the science objectives of the Astrotech 21 mission set, the Ultraviolet (UV) and Visible Sensors Panel made a number of recommendations. In the UV wavelength range of 0.01 to 0.3 µm, the focus is on the need for large-format, high-quantum-efficiency, radiation-hard 'solar-blind' detectors. Options recommended for support include Si and non-Si charge coupled devices (CCDs) as well as photocathodes with improved microchannel plate readouts. For the 0.3 to 0.9 µm range, it was felt that Si CCDs offer the best option for high quantum efficiencies at these wavelengths. In the 0.9 to 2.5 µm range, the panel recommended support for the investigation of monolithic arrays. Finally, the panel noted that the implementation of very large arrays will require new data transmission, data recording, and data handling technologies.

  17. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives.

    PubMed

    Zhao, Min; Wang, Qingguo; Wang, Quan; Jia, Peilin; Zhao, Zhongming

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) and genotyping arrays were the standard technologies to detect large regions subject to copy number changes in genomes until, most recently, high-resolution sequence data could be analyzed by next-generation sequencing (NGS). During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled development of numerous computational methods and tools for CNV detection. In this article, we review the recent advances in computational methods pertaining to CNV detection using whole genome and whole exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development.
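
    Most read-depth CNV callers covered by such reviews share a common first step: bin the coverage into windows, normalize robustly, and flag windows whose depth departs from the diploid baseline. A minimal sketch of that shared step (not any particular tool's implementation):

```python
import numpy as np

def read_depth_zscores(read_counts, window=1000):
    """Aggregate per-base read counts into windows and compute robust z-scores.

    read_counts : 1-D array of per-base coverage for one chromosome
    Returns per-window z-scores; large |z| suggests a copy-number change.
    """
    n = len(read_counts) // window * window
    depth = read_counts[:n].reshape(-1, window).sum(axis=1)
    # median/MAD are robust to the CNV windows themselves
    med = np.median(depth)
    mad = np.median(np.abs(depth - med)) * 1.4826 + 1e-12
    return (depth - med) / mad

# windows with z < -3 hint at deletions, z > 3 at duplications (diploid baseline)
```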

  18. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives

    PubMed Central

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) and genotyping arrays were the standard technologies to detect large regions subject to copy number changes in genomes until, most recently, high-resolution sequence data could be analyzed by next-generation sequencing (NGS). During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled development of numerous computational methods and tools for CNV detection. In this article, we review the recent advances in computational methods pertaining to CNV detection using whole genome and whole exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development. PMID:24564169

  19. Comprehensive Survey on Improved Focality and Penetration Depth of Transcranial Magnetic Stimulation Employing Multi-Coil Arrays.

    PubMed

    Wei, Xile; Li, Yao; Lu, Meili; Wang, Jiang; Yi, Guosheng

    2017-11-14

    Multi-coil arrays applied in transcranial magnetic stimulation (TMS) are proposed to accurately stimulate brain tissues and modulate neural activities by an induced electric field (EF). Composed of numerous independently driven coils, a multi-coil array has alternative energizing strategies to evoke EFs targeting different cerebral regions. To improve the locating resolution and the stimulating focality, we need to fully understand the variation properties of induced EFs and the quantitative control method of the spatial arrangement of activating coils, both of which unfortunately are still unclear. In this paper, a comprehensive analysis of EF properties was performed based on multi-coil arrays. Four types of planar multi-coil arrays were used to study the relationship between the spatial distribution of EFs and the structure of stimulus coils. By changing coil-driven strategies in a basic 16-coil array, we find that an EF induced by compactly distributed coils decays faster than that induced by dispersedly distributed coils, but the former has an advantage over the latter in terms of the activated brain volume. Simulation results also indicate that the attenuation rate of an EF induced by the 36-coil dense array is 3 times and 1.5 times greater than those induced by the 9-coil array and the 16-coil array, respectively. The EF evoked by the 36-coil dispersed array has the slowest decay rate. This result demonstrates that larger multi-coil arrays, compared to smaller ones, activate deeper brain tissues at the expense of decreased focality. A further study on activating a specific field of a prescribed shape and size was conducted based on EF variation. Accurate target location was achieved with a 64-coil array 18 mm in diameter. A comparison between the figure-8 coil, the planar array, and the cap-formed array was made and demonstrates an improvement of multi-coil configurations in the penetration depth and the focality. These findings suggest that there is a tradeoff between attenuation rate and focality in the application of multi-coil arrays. Coil-energizing strategies and array dimensions should be based on an adequate evaluation of these two important demands and the topological structure of target tissues.

  20. Layout and cabling considerations for a large communications antenna array

    NASA Technical Reports Server (NTRS)

    Logan, R. T., Jr.

    1993-01-01

    Layout considerations for a large deep space communications antenna array are discussed. A fractal geometry for the antenna layout is described that provides optimal packing of antenna elements, efficient cable routing, and logical division of the array into identical sub-arrays.

  1. High-power laser diodes at various wavelengths

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emanuel, M.A.

    High power laser diodes at various wavelengths are described. First, performance and reliability of an optimized large transverse mode diode structure at 808 and 941 nm are presented. Next, data are presented on a 9.5 kW peak power array at 900 nm having a narrow emission bandwidth suitable for pumping Yb:S-FAP laser materials. Finally, results on a fiber-coupled laser diode array at ~730 nm are presented.

  2. Search for Long Period Solar Normal Modes in Ambient Seismic Noise

    NASA Astrophysics Data System (ADS)

    Caton, R.; Pavlis, G. L.

    2016-12-01

    We search for evidence of solar free oscillations (normal modes) in long period seismic data through multitaper spectral analysis of array stacks. This analysis is similar to that of Thomson & Vernon (2015), who used data from the quietest single stations of the global seismic network. Our approach is to use stacks of large arrays of noisier stations to reduce noise. Arrays have the added advantage of permitting the use of nonparametric statistics (jackknife errors) to provide objective error estimates. We used data from the Transportable Array, the broadband borehole array at Pinyon Flat, and the 3D broadband array in Homestake Mine in Lead, SD. The Homestake Mine array has 15 STS-2 sensors deployed in the mine that are extremely quiet at long periods due to stable temperatures and stable piers anchored to hard rock. The length of time series used ranged from 50 days to 85 days. We processed the data by low-pass filtering with a corner frequency of 10 mHz, followed by an autoregressive prewhitening filter and a median stack. We elected to use the median instead of the mean in order to get a more robust stack. We then used G. Prieto's mtspec library to compute multitaper spectrum estimates on the data. We produce delete-one jackknife error estimates of the uncertainty at each frequency by computing median stacks of all data with one station removed. The results from the TA data show tentative evidence for several lines between 290 μHz and 400 μHz, including a recurring line near 379 μHz. This 379 μHz line is near the Earth mode 0T2 and the solar mode 5g5, suggesting that 5g5 could be coupling into the Earth mode. Current results suggest more statistically significant lines may be present in Pinyon Flat data, but additional processing of the data is underway to confirm this observation.
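
    The stacking and error-estimation recipe described here is straightforward to express in code. A minimal sketch, assuming the traces have already been filtered and prewhitened, with the multitaper step delegated to any callable spectral estimator (the authors used G. Prieto's mtspec):

```python
import numpy as np

def median_stack(traces):
    """Median across stations at each sample; a robust alternative to the mean."""
    return np.median(traces, axis=0)

def jackknife_spectra(traces, spectrum_fn):
    """Delete-one jackknife over stations.

    traces      : (n_sta, n_samp) pre-filtered, prewhitened records
    spectrum_fn : callable mapping a 1-D trace to a power spectrum
    Returns the full-stack spectrum and per-frequency jackknife std. dev.
    """
    n = traces.shape[0]
    full = spectrum_fn(median_stack(traces))
    reps = np.array([spectrum_fn(median_stack(np.delete(traces, i, axis=0)))
                     for i in range(n)])
    var = (n - 1) / n * np.sum((reps - reps.mean(axis=0)) ** 2, axis=0)
    return full, np.sqrt(var)
```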

  3. The Use of Signal Dimensionality for Automatic QC of Seismic Array Data

    NASA Astrophysics Data System (ADS)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.; Draganov, D.; Maceira, M.; Gomez, M.

    2014-12-01

    A significant problem in seismic array analysis is the inclusion of bad sensor channels in the beam-forming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. We examine the signal dimension in a similar way to the method used to address node traffic anomalies in large computer systems. We explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for identifying bad array elements. We show preliminary results applied to arrays in Kazakhstan (Makanchi) and Argentina (Malargue).
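
    As an illustration of the idea, here is a sketch of subspace-based channel QC, assuming a window of array data arranged as a channels x samples matrix: the leading principal components span the coherent signal, and channels that fit that low-dimensional subspace poorly are flagged. This is an indicative reconstruction, not the authors' code.

```python
import numpy as np

def flag_bad_channels(data, n_components=3, threshold=3.0):
    """Flag channels inconsistent with the low-dimensional array signal.

    data : (n_channels, n_samples) window of array recordings
    """
    x = data - data.mean(axis=1, keepdims=True)
    # SVD of the channel matrix; leading components span the coherent signal
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    recon = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
    # relative residual of each channel outside the signal subspace
    resid = np.linalg.norm(x - recon, axis=1) / np.linalg.norm(x, axis=1)
    med = np.median(resid)
    mad = 1.4826 * np.median(np.abs(resid - med)) + 1e-12
    return np.where((resid - med) / mad > threshold)[0]   # suspect channel indices
```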

  4. High resolution beamforming on large aperture vertical line arrays: Processing synthetic data

    NASA Astrophysics Data System (ADS)

    Tran, Jean-Marie Q.; Hodgkiss, William S.

    1990-09-01

    This technical memorandum studies the beamforming of large aperture line arrays deployed vertically in the water column. The work concentrates on the use of high resolution techniques. Two processing strategies are envisioned: (1) full aperture coherent processing, which in theory offers the best processing gain; and (2) subaperture processing, which consists of extracting subapertures from the array and recombining the angular spectra estimated from these subarrays. The conventional beamformer, the minimum variance distortionless response (MVDR) processor, the multiple signal classification (MUSIC) algorithm and the minimum norm method are used in this study. To validate the various processing techniques, the ATLAS normal mode program is used to generate synthetic data which constitute a realistic signal environment. A deep-water, range-independent sound velocity profile environment, characteristic of the North-East Pacific, is being studied for two different 128-sensor arrays: a very long one cut for 30 Hz and operating at 20 Hz; and a shorter one cut for 107 Hz and operating at 100 Hz. The simulated sound source is 5 m deep. The full aperture and subaperture processing are being implemented with curved and plane wavefront replica vectors. The beamforming results are examined and compared to the ray-theory results produced by the generic sonar model.
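
    Of the estimators studied, the MVDR processor is the most compact to illustrate. A minimal sketch of the Capon angular spectrum, with diagonal loading added for numerical stability (an assumption for the sketch, not necessarily part of the original processing):

```python
import numpy as np

def mvdr_spectrum(R, steering_vectors, loading=1e-3):
    """MVDR (Capon) angular spectrum: P(theta) = 1 / (a^H R^-1 a).

    R                : (n, n) sample covariance of the array snapshots
    steering_vectors : (n_angles, n) replica vectors (plane or curved wavefront)
    loading          : diagonal loading fraction for a well-conditioned inverse
    """
    n = R.shape[0]
    Rinv = np.linalg.inv(R + loading * np.trace(R) / n * np.eye(n))
    return np.array([1.0 / np.real(a.conj() @ Rinv @ a)
                     for a in steering_vectors])

# the conventional beamformer, for comparison, is P(theta) = a^H R a
```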

  5. Teenage Pregnancy and Primary Prevention: New Approaches to an Old Problem.

    ERIC Educational Resources Information Center

    Pate, David J., Jr.; Knight, Susan

    This document describes the Parents Too Soon (PTS) program, a project which integrated a comprehensive array of services for teenagers in an effort to help prevent premature and unwanted pregnancies. Four components of the PTS program are listed: (1) comprehensive family planning medical services including provision of contraceptives; (2) social…

  6. Terahertz emission and spectroscopy on InN epilayer and nanostructure

    NASA Astrophysics Data System (ADS)

    Ahn, H.; Pan, C.-L.; Gwo, S.

    2009-02-01

    We report a comprehensive study on THz emission and spectroscopy of indium nitride (InN) films and nanorod arrays grown by the plasma-assisted molecular beam epitaxy technique. For the enhancement of THz emission from InN, we demonstrated two methods: first, using nanorod arrays, which have a large surface area for optical absorption and THz emission; and second, using nonpolar InN film, in which the electric field lies along the sample surface. We propose that a "screened" photo-Dember effect due to the narrow surface electron accumulation layer of InN is responsible for the nanorod-size-dependent enhancement from InN nanorods. The primary THz radiation mechanism of nonpolar InN is found to be the acceleration of photoexcited carriers under the polarization-induced in-plane electric field. THz time-domain spectroscopy has been used to investigate the THz conductivity and dielectric response of InN nanorod arrays and epitaxial film. The complex THz conductivity of the InN film is well fitted by the Drude model, while the negative imaginary conductivity of the InN nanorods can be described by using a non-Drude model, which includes preferential backward scattering due to defects in InN nanorods, or a Coulombic restoring force from charged defects.
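
    The Drude fit used for the film's conductivity is a one-line model. A sketch of evaluating it over the THz band (the parameter values below are illustrative, not the fitted values from this study):

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def drude_conductivity(omega, omega_p, tau):
    """Complex Drude conductivity: sigma(w) = eps0 * wp^2 * tau / (1 - i*w*tau)."""
    return EPS0 * omega_p**2 * tau / (1.0 - 1j * omega * tau)

# illustrative parameters: plasma frequency 2*pi*300 THz, scattering time 30 fs
w = 2 * np.pi * np.linspace(0.2e12, 2.5e12, 200)      # angular frequency grid
sigma = drude_conductivity(w, 2 * np.pi * 300e12, 30e-15)
# Drude predicts Re(sigma) > 0 and Im(sigma) > 0; the nanorods' negative
# imaginary part is what motivates the non-Drude model in the abstract
```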

  7. A biomedical sensor system for real-time monitoring of astronauts' physiological parameters during extra-vehicular activities.

    PubMed

    Fei, Ding-Yu; Zhao, Xiaoming; Boanca, Cosmin; Hughes, Esther; Bai, Ou; Merrell, Ronald; Rafiq, Azhar

    2010-07-01

    To design and test an embedded biomedical sensor system that can monitor astronauts' comprehensive physiological parameters and provide real-time data display during extra-vehicular activities (EVA) in space exploration. An embedded system was developed with an array of biomedical sensors that can be integrated into the spacesuit. Wired communications were tested for physiological data acquisition and data transmission to a computer mounted on the spacesuit during task performances simulating EVA sessions. The sensor integration, data collection and communication, and real-time data monitoring were successfully validated in the NASA field tests. The developed system may work as an embedded system for monitoring health status during long-term space missions. Copyright 2010 Elsevier Ltd. All rights reserved.

  8. Small Arrays for Seismic Intruder Detections: A Simulation Based Experiment

    NASA Astrophysics Data System (ADS)

    Pitarka, A.

    2014-12-01

    Seismic sensors such as geophones and fiber optics have been increasingly recognized as promising technologies for intelligence surveillance, including intruder detection and perimeter defense systems. Geophone arrays have the capability to provide cost effective intruder detection in protecting assets with large perimeters. A seismic intruder detection system uses one or multiple arrays of geophones designed to record seismic signals from footsteps and ground vehicles. Using a series of real-time signal processing algorithms, the system detects, classifies, and monitors the intruder's movement. We have carried out numerical experiments to demonstrate the capability of a seismic array to detect moving targets that generate seismic signals. The seismic source is modeled as a vertical force acting on the ground that generates continuous impulsive seismic signals with different predominant frequencies. Frequency-wavenumber analysis of the synthetic array data was used to demonstrate the array's capability to accurately determine the intruder's movement direction. The performance of the array was also analyzed in detecting two or more objects moving at the same time. One of the drawbacks of using a single-array system is its inefficiency at detecting seismic signals deflected by large underground objects. We will show simulation results of the effect of an underground concrete block in shielding the seismic signal coming from an intruder. Based on simulations, we found that multiple small arrays can greatly improve the system's detection capability in the presence of underground structures. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  9. Implementing Access to Data Distributed on Many Processors

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A reference architecture is defined for an object-oriented implementation of domains, arrays, and distributions written in the programming language Chapel. This technology primarily addresses domains that contain arrays that have regular index sets with the low-level implementation details being beyond the scope of this discussion. What is defined is a complete set of object-oriented operators that allows one to perform data distributions for domain arrays involving regular arithmetic index sets. What is unique is that these operators allow for the arbitrary regions of the arrays to be fragmented and distributed across multiple processors with a single point of access giving the programmer the illusion that all the elements are collocated on a single processor. Today's massively parallel High Productivity Computing Systems (HPCS) are characterized by a modular structure, with a large number of processing and memory units connected by a high-speed network. Locality of access as well as load balancing are primary concerns in these systems that are typically used for high-performance scientific computation. Data distributions address these issues by providing a range of methods for spreading large data sets across the components of a system. Over the past two decades, many languages, systems, tools, and libraries have been developed for the support of distributions. Since the performance of data parallel applications is directly influenced by the distribution strategy, users often resort to low-level programming models that allow fine-tuning of the distribution aspects affecting performance, but, at the same time, are tedious and error-prone. This technology presents a reusable design of a data-distribution framework for data parallel high-performance applications. Distributions are a means to express locality in systems composed of large numbers of processor and memory components connected by a network. Since distributions have a great effect on the performance of applications, it is important that the distribution strategy is flexible, so its behavior can change depending on the needs of the application. At the same time, high productivity concerns require that the user be shielded from error-prone, tedious details such as communication and synchronization.
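
    Although the reference architecture is expressed in Chapel, the core index arithmetic of a block distribution is language-independent. A sketch in Python with hypothetical helper names, showing how a global index maps to an owning processor and a local offset so the programmer sees a single logical array over fragmented storage:

```python
def block_owner(i, n, p):
    """Owner of global index i for an n-element array block-distributed
    over p processors (hypothetical helper, mirroring what a block
    distribution computes internally)."""
    block = -(-n // p)              # ceil(n / p) elements per processor
    return i // block

def to_local(i, n, p):
    """Translate a global index into (owner, local offset)."""
    block = -(-n // p)
    return i // block, i % block

# a single point of access over fragmented storage then looks like:
#   owner, off = to_local(i, n, p)
#   value = fragments[owner][off]
```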

  10. Indexing data cubes for content-based searches in radio astronomy

    NASA Astrophysics Data System (ADS)

    Araya, M.; Candia, G.; Gregorio, R.; Mendoza, M.; Solar, M.

    2016-01-01

    Methods for observing space have changed profoundly in the past few decades. The methods needed to detect and record astronomical objects have shifted from conventional observations in the optical range to more sophisticated methods which permit the detection of not only the shape of an object but also the velocity and frequency of emissions in the millimeter-scale wavelength range and the chemical substances from which they originate. The consolidation of radio astronomy through a range of global-scale projects such as the Very Long Baseline Array (VLBA) and the Atacama Large Millimeter/submillimeter Array (ALMA) reinforces the need to develop better methods of data processing that can automatically detect regions of interest (ROIs) within data cubes (position-position-velocity), index them and facilitate subsequent searches via methods based on queries using spatial coordinates and/or velocity ranges. In this article, we present the development of an automatic system for indexing ROIs in data cubes that is capable of automatically detecting and recording ROIs while reducing the necessary storage space. The system is able to process data cubes containing megabytes of data in fractions of a second without human supervision, thus allowing it to be incorporated into a production line for displaying objects in a virtual observatory. We conducted a set of comprehensive experiments to illustrate how our system works. As a result, an index of 3% of the input size was stored in a spatial database, representing a compression ratio of 33:1 over an input of 20.875 GB and yielding an index of approximately 773 MB. On the other hand, a single query can be evaluated over our system in a fraction of a second, showing that the indexing step absorbs the computational cost involved in data cube processing. The system forms part of the Chilean Virtual Observatory (ChiVO), an initiative which belongs to the International Virtual Observatory Alliance (IVOA) that seeks to provide the capability of content-based searches on data cubes to the astronomical community.
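
    The abstract does not spell out the detection algorithm, so the following is only a plausible sketch of the detect-and-index core: threshold the cube robustly, label connected components, and store their bounding boxes rather than the voxels themselves, which is where the large compression ratio comes from.

```python
import numpy as np
from scipy import ndimage

def index_rois(cube, sigma=3.0):
    """Detect ROIs in a (velocity, dec, ra) data cube and return bounding boxes.

    Thresholds at median + sigma*MAD, labels connected components, and
    returns one (slice_vel, slice_dec, slice_ra) box per region.
    """
    med = np.median(cube)
    mad = np.median(np.abs(cube - med)) * 1.4826
    mask = cube > med + sigma * mad
    labels, n_regions = ndimage.label(mask)
    return ndimage.find_objects(labels)    # list of 3-D slice tuples

# each slice tuple is tiny compared to the cube: store it, not the voxels
```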

  11. Conceptual design of the early implementation of the NEutron Detector Array (NEDA) with AGATA

    NASA Astrophysics Data System (ADS)

    Hüyük, Tayfun; Di Nitto, Antonio; Jaworski, Grzegorz; Gadea, Andrés; Javier Valiente-Dobón, José; Nyberg, Johan; Palacz, Marcin; Söderström, Pär-Anders; Jose Aliaga-Varea, Ramon; de Angelis, Giacomo; Ataç, Ayşe; Collado, Javier; Domingo-Pardo, Cesar; Egea, Francisco Javier; Erduran, Nizamettin; Ertürk, Sefa; de France, Gilles; Gadea, Rafael; González, Vicente; Herrero-Bosch, Vicente; Kaşkaş, Ayşe; Modamio, Victor; Moszynski, Marek; Sanchis, Enrique; Triossi, Andrea; Wadsworth, Robert

    2016-03-01

    The NEutron Detector Array (NEDA) project aims at the construction of a new high-efficiency compact neutron detector array to be coupled with large γ-ray arrays such as AGATA. The application of NEDA ranges from its use as a selective neutron multiplicity filter for fusion-evaporation reactions to a large solid angle neutron tagging device. In the present work, possible configurations for NEDA coupled with the Neutron Wall for the early implementation with AGATA have been simulated, using Monte Carlo techniques, in order to evaluate their performance figures. The goal of this early NEDA implementation is to improve, with respect to previous instruments, efficiency and capability to select multiplicity for fusion-evaporation reaction channels in which 1, 2 or 3 neutrons are emitted. Each NEDA detector unit has the shape of a regular hexagonal prism with a volume of about 3.23 l and is filled with the EJ301 liquid scintillator, which presents good neutron-γ discrimination properties. The simulations have been performed using a fusion-evaporation event generator that has been validated with a set of experimental data obtained in the 58Ni + 56Fe reaction measured with the Neutron Wall detector array.

  12. Simultaneous data communication and position sensing with an impact ionization engineered avalanche photodiode array for free space optical communication

    NASA Astrophysics Data System (ADS)

    Ferraro, Mike S.; Mahon, Rita; Rabinovich, William S.; Murphy, James L.; Dexter, James L.; Clark, William R.; Waters, William D.; Vaccaro, Kenneth; Krejca, Brian D.

    2017-02-01

    Photodetectors in free space optical communication systems perform two functions: reception of data communication signals and position sensing for pointing, tracking, and stabilization. Traditionally, the optical receive path in an FSO system is split into separate paths for data detection and position sensing. The need for separate paths is a consequence of conflicting performance criteria between position sensitive detectors (PSDs) and data detectors. Combining the functionality of both detector types requires that the combinational sensor not only have the bandwidth to support high data rate communication but also the active area and spatial discrimination to accommodate position sensing. In this paper we present a large-area, concentric five-element impact-ionization-engineered avalanche photodiode array rated for bandwidths beyond 1 GHz with a measured carrier ionization ratio of less than 0.1 at moderate APD gains. The integration of this array as a combinational sensor in an FSO system is discussed along with the development of a pointing and stabilization algorithm.

  13. Large Scale Analysis of Geospatial Data with Dask and XArray

    NASA Astrophysics Data System (ADS)

    Zender, C. S.; Hamman, J.; Abernathey, R.; Evans, K. J.; Rocklin, M.; Zender, C. S.; Rocklin, M.

    2017-12-01

    The analysis of geospatial data with high level languages has accelerated innovation and the impact of existing data resources. However, as datasets grow beyond single-machine memory, data structures within these high level languages can become a bottleneck. New libraries like Dask and XArray resolve some of these scalability issues, providing interactive workflows that are both familiar to high-level-language researchers while also scaling out to much larger datasets. This broadens the access of researchers to larger datasets on high performance computers and, through interactive development, reduces time-to-insight when compared to traditional parallel programming techniques (MPI). This talk describes Dask, a distributed dynamic task scheduler; Dask.array, a multi-dimensional array that copies the popular NumPy interface; and XArray, a library that wraps NumPy/Dask.array with labeled and indexed axes, implementing the CF conventions. We discuss both the basic design of these libraries and how they change interactive analysis of geospatial data, and also recent benefits and challenges of distributed computing on clusters of machines.
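
    A short example of the pattern the talk describes, using the real Dask and XArray APIs: a chunked array wrapped with labeled dimensions, reduced lazily, and only materialized on compute().

```python
import dask.array as da
import xarray as xr

# a 3-D field far larger than memory, split into manageable chunks
data = da.random.random((10_000, 720, 1440), chunks=(500, 720, 1440))
field = xr.DataArray(data, dims=("time", "lat", "lon"), name="t2m")

# label-aware, lazy computation; nothing runs until .compute()
clim = field.mean(dim="time")
result = clim.compute()        # executed in parallel by Dask's scheduler
```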

  14. GOES-R User Data Types and Structure

    NASA Astrophysics Data System (ADS)

    Royle, A. W.

    2012-12-01

    GOES-R meteorological data is provided to the operational and science user community through four main distribution mechanisms. The GOES-R Ground Segment (GS) generates a set of Level 1b (L1b) data from each of the six primary satellite instruments and formats the data into a direct broadcast stream known as GOES Rebroadcast (GRB). Terrestrially, cloud and moisture imagery data is provided to forecasters at the National Weather Service (NWS) through a direct interface to the Advanced Weather Interactive Processing System (AWIPS). A secondary pathway for the user community to receive data terrestrially is via NOAA's Environmental Satellite Processing and Distribution System (ESPDS) Product Distribution and Access (PDA) system. The ESPDS PDA will service the NWS and other meteorological users through a data portal, which provides both a subscription service and an ad hoc query capability. Finally, GOES-R data is made available to NOAA's Comprehensive Large Array-Data Stewardship System (CLASS) for long-term archive. CLASS data includes the L1b and L2+ products sent to PDA, along with the Level 0 data used to create these products, and other data used for product generation and processing. This session will provide a summary description of the data types and formats associated with each of the four primary distribution pathways for user data from GOES-R. It will discuss the resources that are being developed by GOES-R to document the data structures and formats. It will also provide a brief introduction to the types of metadata associated with each of the primary data flows.

  15. Array Databases: Agile Analytics (not just) for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Misev, D.

    2015-12-01

    Gridded data, such as images, image timeseries, and climate datacubes, today are managed separately from the metadata, and with different, restricted retrieval capabilities. While databases are good at metadata modelled in tables, XML hierarchies, or RDF graphs, they traditionally do not support multi-dimensional arrays. This gap is being closed by Array Databases, pioneered by the scalable rasdaman ("raster data manager") array engine. Its declarative query language, rasql, extends SQL with array operators which are optimized and parallelized on server side. Installations can easily be mashed up securely, thereby enabling large-scale location-transparent query processing in federations. Domain experts value the integration with their commonly used tools, leading to a quick learning curve. Earth, Space, and Life sciences, but also Social sciences as well as business, have massive amounts of data and complex analysis challenges that are answered by rasdaman. As of today, rasdaman is mature and in operational use on hundreds of Terabytes of timeseries datacubes, with transparent query distribution across more than 1,000 nodes. Additionally, its concepts have shaped international Big Data standards in the field, including the forthcoming array extension to ISO SQL, many of which are meanwhile supported by both open-source and commercial systems. In the geo field, rasdaman is the reference implementation for the Open Geospatial Consortium (OGC) Big Data standard, WCS, now also under adoption by ISO. Further, rasdaman is in the final stage of OSGeo incubation. In this contribution we present array queries a la rasdaman, describe the architecture and novel optimization and parallelization techniques introduced in 2015, and put this in context of the intercontinental EarthServer initiative which utilizes rasdaman for enabling agile analytics on Petascale datacubes.

  16. Large area projection liquid-crystal video display system with inherent grid pattern optically removed

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang (Inventor)

    1992-01-01

    A relatively small and low-cost system is provided for projecting a large and bright television image onto a screen. A miniature liquid crystal array is driven by video circuitry to produce a pattern of transparencies in the array corresponding to a television image. Light is directed against the rear surface of the array to illuminate it, while a projection lens lies in front of the array to project the image of the array onto a large screen. Grid lines in the liquid crystal array are eliminated by a spatial filter which comprises a negative of the Fourier transform of the grid.
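
    The optical spatial filter has a direct digital analogue that makes the idea concrete: the periodic grid concentrates its energy at harmonics of 1/period in the Fourier plane, and notching those frequencies removes the grid while leaving the image largely intact. A sketch, assuming a known grid period in pixels:

```python
import numpy as np

def remove_grid(image, period):
    """Suppress a periodic grid by notching its harmonics in the 2-D spectrum,
    a digital analogue of the optical Fourier-plane filter described above.

    image  : 2-D float array containing a grid of known pixel period
    period : grid period in pixels (assumed equal along both axes)
    """
    F = np.fft.fftshift(np.fft.fft2(image))
    ny, nx = image.shape
    cy, cx = ny // 2, nx // 2
    for k in range(1, period // 2 + 1):          # harmonics of the grid
        for y, x in [(cy, cx + k * nx // period), (cy, cx - k * nx // period),
                     (cy + k * ny // period, cx), (cy - k * ny // period, cx)]:
            if 0 <= y < ny and 0 <= x < nx:
                # small notch centered on each harmonic
                F[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2] = 0.0
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))
```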

  17. Development of High-Fill-Factor Large-Aperture Micromirrors for Agile Optical Phased Arrays

    DTIC Science & Technology

    2010-02-28

    Final Project Report. A high-fill-factor (HFF) micromirror array (MMA) has been proposed, fabricated and tested. Optical-phased-array (OPA) beam steering based on the HFF MMA has also been demonstrated, with mirrors electrically tuned to multiple ... Background: high-fill-factor (HFF) micromirror arrays (MMAs) can form optical phased arrays (OPAs) for laser beam steering.

  18. Disturbance characteristics of half-selected cells in a cross-point resistive switching memory array

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Li, Haitong; Chen, Hong-Yu; Chen, Bing; Liu, Rui; Huang, Peng; Zhang, Feifei; Jiang, Zizhen; Ye, Hongfei; Gao, Bin; Liu, Lifeng; Liu, Xiaoyan; Kang, Jinfeng; Wong, H.-S. Philip; Yu, Shimeng

    2016-05-01

    Disturbance characteristics of cross-point resistive random access memory (RRAM) arrays are comprehensively studied in this paper. An analytical model is developed to quantify the number of pulses (#Pulse) the cell can bear before disturbance occurs under various sub-switching voltage stresses based on physical understanding. An evaluation methodology is proposed to assess the disturb behavior of half-selected (HS) cells in cross-point RRAM arrays by combining the analytical model and SPICE simulation. The characteristics of cross-point RRAM arrays such as energy consumption, reliable operating cycles and total error bits are evaluated by the methodology. A possible solution to mitigate disturbance is proposed.

  19. An automated mapping satellite system ( Mapsat).

    USGS Publications Warehouse

    Colvocoresses, A.P.

    1982-01-01

    The favorable environment of space permits a satellite to orbit the Earth with very high stability as long as no local perturbing forces are involved. Solid-state linear-array sensors have no moving parts and create no perturbing force on the satellite. Digital data from highly stabilized stereo linear arrays are amenable to simplified processing to produce both planimetric imagery and elevation data. A satellite imaging system, called Mapsat, including this concept has been proposed to produce data from which automated mapping in near real time can be accomplished. Image maps as large as 1:50 000 scale with contours as close as a 20-m interval may be produced from Mapsat data. -from Author

  20. Visualizing the deep end of sound: plotting multi-parameter results from infrasound data analysis

    NASA Astrophysics Data System (ADS)

    Perttu, A. B.; Taisne, B.

    2016-12-01

    Infrasound is sound below the threshold of human hearing: approximately 20 Hz. The field of infrasound research, like other waveform-based fields, relies on several standard processing methods and data visualizations, including waveform plots and spectrograms. The installation of the International Monitoring System (IMS) global network of infrasound arrays contributed to the resurgence of infrasound research. Array processing is an important method used in infrasound research; however, it produces data sets with a large number of parameters and requires innovative plotting techniques. The goal in designing new figures is to present easily comprehensible, information-rich plots by careful selection of data density and plotting methods.
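
    One common realization of such an information-rich figure is a scatter of detection backazimuth against time, with apparent velocity mapped to color and amplitude to marker size, so a single panel carries four parameters. A sketch with synthetic detections (the variables here are hypothetical, not the authors' data):

```python
import numpy as np
import matplotlib.pyplot as plt

# hypothetical array-processing detections: time, backazimuth, velocity, amplitude
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 24, 200))          # hours
baz = rng.uniform(0, 360, 200)                # degrees
vel = rng.uniform(0.30, 0.45, 200)            # km/s, acoustic range
amp = rng.lognormal(0.0, 1.0, 200)

fig, ax = plt.subplots(figsize=(8, 4))
sc = ax.scatter(t, baz, c=vel, s=10 * amp, cmap="viridis", alpha=0.7)
ax.set(xlabel="time (h)", ylabel="backazimuth (deg)", ylim=(0, 360))
fig.colorbar(sc, label="apparent velocity (km/s)")
plt.show()
```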

  1. Static Corrections to Improve Seismic Monitoring of the North Korean Nuclear Test Site with Regional Arrays

    NASA Astrophysics Data System (ADS)

    Wilkins, N.; Wookey, J. M.; Selby, N. D.

    2017-12-01

    Seismology is an important part of the International Monitoring System (IMS) installed to detect, identify, and locate nuclear detonations in breach of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) prior to and after its entry into force. Seismic arrays in particular provide not only a means of detecting and locating underground nuclear explosions, but also a means of discriminating them from naturally occurring earthquakes of similar magnitude. One potential discriminant is the amplitude ratio of high frequency (> 2 Hz) P waves to S waves (P/S) measured at regional distances (3-17°). Accurate measurement of such discriminants, and the ability to detect low-magnitude seismicity from a suspicious event, rely on high signal-to-noise ratio (SNR) data. A correction to the slowness vector of the incident seismic wavefield, and static corrections applied to the waveforms recorded at each receiver within the array, can be shown to improve the SNR. We apply codes we have developed to calculate slowness-azimuth station corrections (SASCs) and static corrections to the arrival time and amplitude of the seismic waveform to seismic arrays regional to the DPRK nuclear test site at Punggye-ri, North Korea. We use the F-statistic to demonstrate the SNR improvement to data from the nuclear tests and other seismic events in the vicinity of the test site. We also make new measurements of P/S with the corrected waveforms and compare these with existing measurements.
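
    A sketch of the F-statistic used here to quantify beam SNR, in one standard array-processing form (beam power measured against the power of the residuals between each channel and the beam); the exact normalization used by the authors is not stated in the abstract.

```python
import numpy as np

def beam_fstat(x):
    """F-statistic of an array beam (one standard form).

    x : (n_sta, n_samp) time-aligned (slowness-corrected) channel data
    F ~ 1 for incoherent noise; large F indicates a coherent arrival.
    """
    n = x.shape[0]
    beam = x.mean(axis=0)
    num = n * (n - 1) * np.sum(beam ** 2)       # coherent beam power
    den = np.sum((x - beam) ** 2)               # residual (incoherent) power
    return num / den
```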

  2. Open and scalable analytics of large Earth observation datasets: From scenes to multidimensional arrays using SciDB and GDAL

    NASA Astrophysics Data System (ADS)

    Appel, Marius; Lahn, Florian; Buytaert, Wouter; Pebesma, Edzer

    2018-04-01

    Earth observation (EO) datasets are commonly provided as collections of scenes, where individual scenes represent a temporal snapshot and cover a particular region on the Earth's surface. Using these data in complex spatiotemporal modeling becomes difficult as soon as data volumes exceed a certain capacity or analyses include many scenes, which may spatially overlap and may have been recorded at different dates. In order to facilitate analytics on large EO datasets, we combine and extend the geospatial data abstraction library (GDAL) and the array-based data management and analytics system SciDB. We present an approach to automatically convert collections of scenes to multidimensional arrays and use SciDB to scale computationally intensive analytics. We evaluate the approach in three case studies: national-scale land use change monitoring with Landsat imagery; global empirical orthogonal function analysis of daily precipitation; and combining historical climate model projections with satellite-based observations. Results indicate that the approach can be used to represent various EO datasets and that analyses in SciDB scale well with available computational resources. To simplify analyses of higher-dimensional datasets such as climate model output, however, a generalization of the GDAL data model might be needed. All parts of this work have been implemented as open-source software and we discuss how this may facilitate open and reproducible EO analyses.
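
    The scene-to-array step on the GDAL side reduces to reading each scene together with its georeferencing, after which the pixels can be shifted into the target array. A minimal sketch using GDAL's Python bindings (the SciDB loading side is omitted):

```python
from osgeo import gdal

def scene_to_array(path):
    """Read one EO scene into a NumPy array plus the geotransform needed to
    place it inside a larger spatiotemporal target array."""
    ds = gdal.Open(path)
    arr = ds.ReadAsArray()        # (bands, rows, cols), or (rows, cols) if single-band
    gt = ds.GetGeoTransform()     # origin and pixel size in map coordinates
    return arr, gt

# each scene is then written into the target cube at the offset implied by
# gt and the scene's acquisition date
```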

  3. Computer Drawing Method for Operating Characteristic Curve of PV Power Plant Array Unit

    NASA Astrophysics Data System (ADS)

    Tan, Jianbin

    2018-02-01

    For the engineering design of large-scale grid-connected photovoltaic power stations, and for the many simulation and analysis systems being developed for them, the operating characteristic curves of photovoltaic array units must be drawn accurately by computer; to this end, a segmented non-linear interpolation algorithm is proposed. Taking module performance parameters as the main design basis, the computer derives five characteristic performance points of a PV module. Combined with the series and parallel connection of the PV array, computer drawing of the performance curve of the PV array unit can then be realized. At the same time, the specific data can be passed to PV development software, improving the practical application of PV array units.
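
    A sketch of the general idea under stated assumptions: pass a monotone, shape-preserving piecewise-cubic curve through five characteristic (V, I) points of a module and scale it for series/parallel connection. The point values below are illustrative, and the authors' exact segmentation scheme is not specified in the abstract.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# illustrative module data: (V, I) anchor points from short-circuit to open-circuit
V_pts = np.array([0.0, 25.0, 30.0, 33.0, 36.0])     # V
I_pts = np.array([8.5, 8.3, 7.9, 5.5, 0.0])         # A
iv_module = PchipInterpolator(V_pts, I_pts)          # monotone, shape-preserving

def array_unit_curve(v, n_series, n_parallel):
    """I-V curve of an array unit of identical modules:
    voltages add in series, currents add in parallel."""
    return n_parallel * iv_module(np.asarray(v) / n_series)

v = np.linspace(0.0, 36.0 * 20, 500)                 # 20 modules in series
i = array_unit_curve(v, n_series=20, n_parallel=5)
p = v * i                                            # power curve for plotting
```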

  4. A Comparison of Earthquake Back-Projection Imaging Methods for Dense Local Arrays, and Application to the 2011 Virginia Aftershock Sequence

    NASA Astrophysics Data System (ADS)

    Beskardes, G. D.; Hole, J. A.; Wang, K.; Wu, Q.; Chapman, M. C.; Davenport, K. K.; Michaelides, M.; Brown, L. D.; Quiros, D. A.

    2016-12-01

    Back-projection imaging has recently become a practical method for local earthquake detection and location due to the deployment of densely sampled, continuously recorded, local seismograph arrays. Back-projection is scalable to earthquakes with a wide range of magnitudes from very tiny to very large. Local dense arrays provide the opportunity to capture very tiny events for a range of applications, such as tectonic microseismicity, source scaling studies, wastewater injection-induced seismicity, hydraulic fracturing, CO2 injection monitoring, volcano studies, and mining safety. While back-projection sometimes utilizes the full seismic waveform, the waveforms are often pre-processed to overcome imaging issues. We compare the performance of back-projection using four previously used data pre-processing methods: full waveform, envelope, short-term averaging / long-term averaging (STA/LTA), and kurtosis. The goal is to identify an optimized strategy for an entirely automated imaging process that is robust in the presence of real-data issues, has the lowest signal-to-noise thresholds for detection and for location, has the best spatial resolution of the energy imaged at the source, preserves magnitude information, and considers computational cost. Real data issues include aliased station spacing, low signal-to-noise ratio (to <1), large noise bursts and spatially varying waveform polarity. For evaluation, the four imaging methods were applied to the aftershock sequence of the 2011 Virginia earthquake as recorded by the AIDA array with 200-400 m station spacing. These data include earthquake magnitudes from -2 to 3 with highly variable signal to noise, spatially aliased noise, and large noise bursts: realistic issues in many environments. Each of the four back-projection methods has advantages and disadvantages, and a combined multi-pass method achieves the best of all criteria. Preliminary imaging results from the 2011 Virginia dataset will be presented.
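
    Two of the pre-processing choices compared, STA/LTA and stacking, combine naturally in a few lines. A sketch of a trailing STA/LTA characteristic function followed by a delay-and-stack for one candidate source location (an indicative reconstruction, not the authors' pipeline):

```python
import numpy as np

def sta_lta(trace, nsta, nlta):
    """Trailing STA/LTA characteristic function of one trace."""
    e = np.abs(trace.astype(float))
    sta = np.convolve(e, np.ones(nsta) / nsta, mode="full")[:len(e)]
    lta = np.convolve(e, np.ones(nlta) / nlta, mode="full")[:len(e)]
    return sta / (lta + 1e-12)

def back_project(cf, delays):
    """Stack characteristic functions shifted by predicted travel times for
    one candidate source location; a grid search over locations maximizes this.

    cf     : (n_sta, n_samp) characteristic functions
    delays : (n_sta,) predicted travel times in integer samples
    """
    delays = np.asarray(delays)
    n_samp = cf.shape[1] - delays.max()
    return sum(cf[i, d:d + n_samp] for i, d in enumerate(delays))
```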

  5. Conclusive evidence for hexasomic inheritance in chrysanthemum based on analysis of a 183 k SNP array.

    PubMed

    van Geest, Geert; Voorrips, Roeland E; Esselink, Danny; Post, Aike; Visser, Richard Gf; Arens, Paul

    2017-08-07

    Cultivated chrysanthemum is an outcrossing hexaploid (2n = 6× = 54) with a disputed mode of inheritance. In this paper, we present a single nucleotide polymorphism (SNP) selection pipeline that was used to design an Affymetrix Axiom array with 183 k SNPs from RNA sequencing data (1). With this array, we genotyped four bi-parental populations (with sizes of 405, 53, 76 and 37 offspring plants respectively), and a cultivar panel of 63 genotypes. Further, we present a method for dosage scoring in hexaploids from signal intensities of the array based on mixture models (2) and validation of selection steps in the SNP selection pipeline (3). The resulting genotypic data is used to draw conclusions on the mode of inheritance in chrysanthemum (4), and to make an inference on allelic expression bias (5). With use of the mixture model approach, we successfully called the dosage of 73,936 out of 183,130 SNPs (40.4%) that segregated in any of the bi-parental populations. To investigate the mode of inheritance, we analysed markers that segregated in the large bi-parental population (n = 405). Analysis of segregation of duplex x nulliplex SNPs resulted in evidence for genome-wide hexasomic inheritance. This evidence was substantiated by the absence of strong linkage between markers in repulsion, which indicated absence of full disomic inheritance. We present the success rate of SNP discovery out of RNA sequencing data as affected by different selection steps, among which SNP coverage over genotypes and use of different types of sequence read mapping software. Genomic dosage highly correlated with relative allele coverage from the RNA sequencing data, indicating that most alleles are expressed according to their genomic dosage. The large population, genotyped with a very large number of markers, is a unique framework for extensive genetic analyses in hexaploid chrysanthemum. As starting point, we show conclusive evidence for genome-wide hexasomic inheritance.
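
    The duplex x nulliplex test works because, under full hexasomic inheritance, the gamete dosage of a duplex parent follows a hypergeometric draw of 3 of the 6 homologues, predicting a 1:3:1 dosage segregation in the offspring of such a cross. A short check of that expectation:

```python
from scipy.stats import hypergeom

# duplex parent: 2 of 6 homologues carry the alternative allele,
# and each gamete receives 3 of the 6
dist = hypergeom(M=6, n=2, N=3)
for k in range(3):
    print(k, dist.pmf(k))
# -> P(0) = 0.2, P(1) = 0.6, P(2) = 0.2, i.e. offspring of a duplex x nulliplex
# cross segregate 1:3:1 (nulliplex:simplex:duplex). Disomic pairing would
# predict different ratios (e.g., no nulliplex offspring if both copies sit
# on one homologous pair), which the observed segregation can exclude.
```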

  6. Climatic, ecological, and socioeconomic factors associated with West Nile virus incidence in Atlanta, Georgia, U.S.A.

    Treesearch

    Graeme Lockaby; Navideh Noori; Wayde Morse; Wayne Zipperer; Latif Kalin; Robin Governo; Rajesh Sawant; Matthew Ricker

    2016-01-01

    The integrated effects of the many risk factors associated with West Nile virus (WNV) incidence are complex and not well understood. We studied an array of risk factors in and around Atlanta, GA, that have been shown to be linked with WNV in other locations. This array was comprehensive and included climate and meteorological metrics, vegetation...

  7. Using redundancy of round-trip ultrasound signal for non-continuous arrays: Application to gap and blockage compensation.

    PubMed

    Robert, Jean-Luc; Erkamp, Ramon; Korukonda, Sanghamithra; Vignon, François; Radulescu, Emil

    2015-11-01

    In ultrasound imaging, an array of elements is used to image a medium. If part of the array is blocked by an obstacle, or if the array is made from several sub-arrays separated by a gap, grating lobes appear and the image is degraded. The grating lobes are caused by missing spatial frequencies, corresponding to the blocked or non-existing elements. However, in an active imaging system, where elements are used both for transmitting and receiving, the round trip signal is redundant: different pairs of transmit and receive elements carry similar information. It is shown here that, if the gaps are smaller than the active sub-apertures, this redundancy can be used to compensate for the missing signals and recover full resolution. Three algorithms are proposed: one is based on a synthetic aperture method, a second one uses dual-apodization beamforming, and the third one is a radio frequency (RF) data based deconvolution. The algorithms are evaluated on simulated and experimental data sets. An application could be imaging through ribs with a large aperture.
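
    The redundancy being exploited is visible in the round-trip co-array: the effective aperture is the convolution of the transmit and receive apertures, so transmit/receive element pairs spanning a gap still populate the missing spatial frequencies, provided the gap is smaller than the active sub-apertures. A sketch:

```python
import numpy as np

# 64-element aperture with an 8-element gap (blocked, or between sub-arrays)
aperture = np.ones(64)
aperture[28:36] = 0.0

# round-trip effective aperture = convolution of transmit and receive apertures
effective = np.convolve(aperture, aperture)

# the one-way aperture has a hole, but every round-trip spatial-frequency lag
# up to the full span is still covered by some transmit/receive pair:
print((effective > 0).all())          # True -> redundancy can fill the gap
```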

  8. Two Siblings with Alternate Unbalanced Recombinants Derived from a Large Cryptic Maternal Pericentric Inversion of Chromosome 20

    PubMed Central

    DeScipio, Cheryl; Morrissette, Jennifer J.D.; Conlin, Laura K.; Clark, Dinah; Kaur, Maninder; Coplan, James; Riethman, Harold; Spinner, Nancy B.; Krantz, Ian D.

    2009-01-01

    Two brothers, with dissimilar clinical features, were each found to have different abnormalities of chromosome 20 by subtelomere fluorescence in situ hybridization (FISH). The proband had deletion of 20p subtelomere and duplication of 20q subtelomere, while his brother was found to have a duplication of 20p subtelomere and deletion of 20q subtelomere. Parental cytogenetic studies were initially thought to be normal, both by G-banding and by subtelomere FISH analysis. Since chromosome 20 is a metacentric chromosome and an inversion was suspected, we used anchored FISH to assist in identifying a possible inversion. This approach employed concomitant hybridization of a FISH probe to the short (p) arm of chromosome 20 with the 20q subtelomere probe. We identified a cytogenetically non-visible, mosaic pericentric inversion of one of the maternal chromosome 20 homologues, providing a mechanistic explanation for the chromosomal abnormalities present in these brothers. Array comparative genomic hybridization (CGH) with both a custom-made BAC and cosmid-based subtelomere specific array (TEL array) and a commercially-available SNP-based array confirmed and further characterized these rearrangements, identifying this as the largest pericentric inversion of chromosome 20 described to date. TEL array data indicate that the 20p breakpoint is defined by BAC RP11-978M13, ~900 kb from the pter; SNP array data reveal this breakpoint to occur within BAC RP11-978M13. The 20q breakpoint is defined by BAC RP11-93B14, ~1.7 Mb from the qter, by TEL array; SNP array data refine this breakpoint to within a gap between BACs on the TEL array (i.e. between RP11-93B14 and proximal BAC RP11-765G16). PMID:20101690

  9. Two siblings with alternate unbalanced recombinants derived from a large cryptic maternal pericentric inversion of chromosome 20.

    PubMed

    Descipio, Cheryl; Morrissette, Jennifer D; Conlin, Laura K; Clark, Dinah; Kaur, Maninder; Coplan, James; Riethman, Harold; Spinner, Nancy B; Krantz, Ian D

    2010-02-01

    Two brothers, with dissimilar clinical features, were each found to have different abnormalities of chromosome 20 by subtelomere fluorescence in situ hybridization (FISH). The proband had deletion of 20p subtelomere and duplication of 20q subtelomere, while his brother was found to have a duplication of 20p subtelomere and deletion of 20q subtelomere. Parental cytogenetic studies were initially thought to be normal, both by G-banding and by subtelomere FISH analysis. Since chromosome 20 is a metacentric chromosome and an inversion was suspected, we used anchored FISH to assist in identifying a possible inversion. This approach employed concomitant hybridization of a FISH probe to the short (p) arm of chromosome 20 with the 20q subtelomere probe. We identified a cytogenetically non-visible, mosaic pericentric inversion of one of the maternal chromosome 20 homologs, providing a mechanistic explanation for the chromosomal abnormalities present in these brothers. Array comparative genomic hybridization (CGH) with both a custom-made BAC and cosmid-based subtelomere specific array (TEL array) and a commercially available SNP-based array confirmed and further characterized these rearrangements, identifying this as the largest pericentric inversion of chromosome 20 described to date. TEL array data indicate that the 20p breakpoint is defined by BAC RP11-978M13, approximately 900 kb from the pter; SNP array data reveal this breakpoint to occur within BAC RP11-978M13. The 20q breakpoint is defined by BAC RP11-93B14, approximately 1.7 Mb from the qter, by TEL array; SNP array data refine this breakpoint to within a gap between BACs on the TEL array (i.e., between RP11-93B14 and proximal BAC RP11-765G16). Copyright 2010 Wiley-Liss, Inc.

  10. Reprogrammable logic in memristive crossbar for in-memory computing

    NASA Astrophysics Data System (ADS)

    Cheng, Long; Zhang, Mei-Yun; Li, Yi; Zhou, Ya-Xiong; Wang, Zhuo-Rui; Hu, Si-Yu; Long, Shi-Bing; Liu, Ming; Miao, Xiang-Shui

    2017-12-01

    Memristive stateful logic has emerged as a promising next-generation in-memory computing paradigm to address the escalating performance pressures on the traditional von Neumann architecture. Here, we present a nonvolatile reprogrammable logic method that can process data between different rows and columns in a memristive crossbar array based on material implication (IMP) logic. Arbitrary Boolean logic can be executed with a reprogrammable cell containing four memristors in a crossbar array. In the fabricated Ti/HfO2/W memristive array, some fundamental functions, such as universal NAND logic and data transfer, were experimentally implemented. Moreover, using eight memristors in a 2 × 4 array, a one-bit full adder was designed and verified by simulation to demonstrate the feasibility of our method for complex computing tasks. In addition, some critical logic-related performance issues, such as the flexibility of data processing, the cascading problem, and the bit error rate, are discussed. Such a method could be a step forward in developing IMP-based memristive nonvolatile logic for large-scale in-memory computing architectures.
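
    The IMP primitive itself is easy to state in ordinary code. The sketch below is a purely behavioral Python model (it ignores device physics, array parasitics, and the paper's specific cell design) showing the standard construction of NAND from one clear operation followed by two IMP steps, the kind of composition an IMP-based reprogrammable cell relies on.

        # Behavioral sketch of memristive IMP (material implication) logic.
        # States are ideal Booleans; real devices store them as resistance levels.

        def imp(p, q):
            """q <- p IMP q, i.e. (NOT p) OR q."""
            return (not p) or q

        def nand(p, q):
            """NAND built from one FALSE (clear) operation and two IMP steps."""
            s = False          # clear the work memristor
            s = imp(p, s)      # s = NOT p
            s = imp(q, s)      # s = (NOT q) OR (NOT p) = p NAND q
            return s

        # Exhaustive check of the truth table.
        for p in (False, True):
            for q in (False, True):
                assert nand(p, q) == (not (p and q))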

  11. Glycan array data management at Consortium for Functional Glycomics.

    PubMed

    Venkataraman, Maha; Sasisekharan, Ram; Raman, Rahul

    2015-01-01

    Glycomics, the study of structure-function relationships of complex glycans, has reshaped post-genomics biology. Glycans mediate fundamental biological functions via their specific interactions with a variety of proteins. Recognizing the importance of glycomics, large-scale research initiatives such as the Consortium for Functional Glycomics (CFG) were established to address the challenges of the field. Over the past decade, the CFG has generated novel reagents and technologies for glycomics analyses, which in turn have led to the generation of diverse datasets. These datasets have contributed to understanding glycan diversity and structure-function relationships at the molecular (glycan-protein interactions), cellular (gene expression and glycan analysis), and whole-organism (mouse phenotyping) levels. Among these analyses and datasets, the screening of glycan-protein interactions on glycan array platforms has gained particular prominence and has contributed to a cross-disciplinary appreciation of the importance of glycomics in areas such as immunology, infectious diseases, and cancer biomarkers. This manuscript outlines methodologies for capturing data from glycan array experiments and the online tools implemented at the CFG to access and visualize glycan array data.

  12. An Arrayed Genome-Scale Lentiviral-Enabled Short Hairpin RNA Screen Identifies Lethal and Rescuer Gene Candidates

    PubMed Central

    Bhinder, Bhavneet; Antczak, Christophe; Ramirez, Christina N.; Shum, David; Liu-Sullivan, Nancy; Radu, Constantin; Frattini, Mark G.

    2013-01-01

    RNA interference technology is becoming an integral tool for target discovery and validation. With perhaps the exception of only a few studies published using arrayed short hairpin RNA (shRNA) libraries, most reports have relied on pooled siRNA or shRNA libraries, or on arrayed siRNA libraries. For this purpose, we have developed a workflow and performed an arrayed genome-scale shRNA lethality screen against the TRC1 library in HeLa cells. The resulting targets would be a valuable resource of candidates toward a better understanding of cellular homeostasis. Using a high-stringency hit nomination method encompassing criteria of at least three active hairpins per gene and filtering for potential off-target effects (OTEs), referred to as the Bhinder-Djaballah analysis method, we identified 1,252 lethal and 6 rescuer gene candidates, knockdown of which resulted in severe cell death or enhanced growth, respectively. Cross-referencing individual hairpins with the TRC1 validated clone database, 239 of the 1,252 candidates were deemed independently validated with at least three validated clones. Through our systematic OTE analysis, we identified 31 microRNAs (miRNAs) in lethal and 2 in rescuer genes, all having a seed heptamer mimic in the corresponding shRNA hairpins and thus the likely cause of the OTEs observed in our screen, perhaps unraveling a previously unknown and plausible essentiality of these miRNAs in cellular viability. Taken together, we report on a methodology for performing large-scale arrayed shRNA screens, a comprehensive analysis method to nominate high-confidence hits, and a performance assessment of the TRC1 library highlighting the intracellular inefficiencies of shRNA processing in general. PMID:23198867
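
    For readers unfamiliar with this style of hit nomination, the toy sketch below illustrates the two filters named in the abstract (at least three active hairpins per gene, and exclusion of hairpins whose seed heptamer mimics a miRNA). The column names, threshold, and data are hypothetical and are not the published Bhinder-Djaballah parameters.

        import pandas as pd

        # Hypothetical hairpin-level screen results (one row per shRNA clone);
        # the column names and threshold are illustrative, not the published
        # Bhinder-Djaballah parameters.
        df = pd.DataFrame({
            "gene":       ["A", "A", "A", "A", "B", "B", "B"],
            "hairpin":    ["A1", "A2", "A3", "A4", "B1", "B2", "B3"],
            "zscore":     [-3.1, -2.7, -2.9, -0.2, -3.5, -0.1, -0.3],
            "seed_mimic": [False, False, False, False, True, False, False],
        })
        ACTIVE_Z = -2.0

        clean = df[~df["seed_mimic"]]                 # drop potential OTE hairpins
        active = clean[clean["zscore"] <= ACTIVE_Z]   # hairpins scored as active
        hits = active.groupby("gene")["hairpin"].count()
        lethal_candidates = hits[hits >= 3].index.tolist()
        print(lethal_candidates)                      # -> ['A']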

  13. Numerical study of read scheme in one-selector one-resistor crossbar array

    NASA Astrophysics Data System (ADS)

    Kim, Sungho; Kim, Hee-Dong; Choi, Sung-Jin

    2015-12-01

    A comprehensive numerical circuit analysis of read schemes for a one-selector one-resistor (1S1R) crossbar array is carried out. Three schemes (the ground, V/2, and V/3 schemes) are compared with each other in terms of sensing margin and power consumption. Without the aid of a complex analytical approach or SPICE-based simulation, a simple numerical iteration method is developed to simulate the complete set of current flows and node voltages within a crossbar array. Understanding such phenomena is essential for correctly evaluating the electrical specifications of selectors intended to suppress the intrinsic drawbacks of crossbar arrays, such as sneak current paths and series line resistance. This method provides a quantitative tool for the accurate analysis of crossbar arrays and offers guidelines for developing an optimal read scheme, array configuration, and selector device specifications.
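
    The flavor of such a read-scheme comparison can be conveyed with a much cruder model than the paper's iterative solver. The sketch below assumes ideal voltage drivers, zero line resistance, and linear resistors standing in for real nonlinear 1S1R cells, and simply tallies the selected-column current and the static array power for the ground, V/2, and V/3 bias arrangements.

        import numpy as np

        # Simplified read-scheme comparison for a resistive crossbar
        # (ideal drivers, no line resistance, linear cells).
        rng = np.random.default_rng(0)
        m, n = 32, 32                    # wordlines x bitlines
        Vr = 1.0                         # read voltage
        R_LRS, R_HRS = 1e4, 1e6          # low/high resistance states
        R = np.where(rng.random((m, n)) < 0.5, R_LRS, R_HRS)
        sel = (0, 0)                     # cell being read

        def read(scheme):
            v_row = {"ground": 0.0, "V/2": Vr / 2, "V/3": Vr / 3}[scheme]
            v_col = {"ground": 0.0, "V/2": Vr / 2, "V/3": 2 * Vr / 3}[scheme]
            rows = np.full(m, v_row); rows[sel[0]] = Vr   # bias the selected row
            cols = np.full(n, v_col); cols[sel[1]] = 0.0  # sense column at ground
            I = (rows[:, None] - cols[None, :]) / R       # per-cell currents
            i_sense = I[:, sel[1]].sum()                  # current into sense column
            power = (I ** 2 * R).sum()                    # static array dissipation
            return i_sense, power

        for scheme in ("ground", "V/2", "V/3"):
            i, p = read(scheme)
            print(f"{scheme:6s} sense current = {i:.3e} A, array power = {p:.3e} W")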

  14. A University-Assisted, Place-Based Model for Enhancing Students' Peer, Family, and Community Ecologies

    ERIC Educational Resources Information Center

    Lawson, Michael A.; Alameda-Lawson, Tania; Richards, K. Andrew R.

    2016-01-01

    Community schools have recently (re)emerged in the United States as a vital, comprehensive strategy for addressing poverty-related barriers to children's school learning. However, not all low-income school communities are endowed with the resources needed to launch a comprehensive array of school-based/linked services and programs. In this…

  15. Brief Report: The Negev Hospital-University-Based (HUB) Autism Database

    ERIC Educational Resources Information Center

    Meiri, Gal; Dinstein, Ilan; Michaelowski, Analya; Flusser, Hagit; Ilan, Michal; Faroy, Michal; Bar-Sinai, Asif; Manelis, Liora; Stolowicz, Dana; Yosef, Lili Lea; Davidovitch, Nadav; Golan, Hava; Arbelle, Shosh; Menashe, Idan

    2017-01-01

    Elucidating the heterogeneous etiologies of autism will require investment in comprehensive longitudinal data acquisition from large community based cohorts. With this in mind, we have established a hospital-university-based (HUB) database of autism which incorporates prospective and retrospective data from a large and ethnically diverse…

  16. Communicating Treatment Risk Reduction to People With Low Numeracy Skills: A Cross-Cultural Comparison

    PubMed Central

    2009-01-01

    Objectives. We sought to address denominator neglect (i.e. the focus on the number of treated and nontreated patients who died, without sufficiently considering the overall numbers of patients) in estimates of treatment risk reduction, and analyzed whether icon arrays aid comprehension. Methods. We performed a survey of probabilistic, national samples in the United States and Germany in July and August of 2008. Participants received scenarios involving equally effective treatments but differing in the overall number of treated and nontreated patients. In some conditions, the number who received a treatment equaled the number who did not; in others the number was smaller or larger. Some participants received icon arrays. Results. Participants—particularly those with low numeracy skills—showed denominator neglect in treatment risk reduction perceptions. Icon arrays were an effective method for eliminating denominator neglect. We found cross-cultural differences that are important in light of the countries' different medical systems. Conclusions. Problems understanding numerical information often reside not in the mind but in the problem's representation. These findings suggest suitable ways to communicate quantitative medical data. PMID:19833983
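
    The arithmetic behind denominator neglect is compact enough to state directly; the numbers below are illustrative only and are not the survey's actual scenarios.

        # Illustrative numbers only (not the survey's actual scenarios).
        died_treated, n_treated = 40, 400        # 10% of treated patients died
        died_untreated, n_untreated = 80, 800    # 10% of untreated patients died

        # Numerator-only comparison (the denominator-neglect reading):
        naive = 1 - died_treated / died_untreated                                # 0.50
        # Correct comparison of risks:
        true = 1 - (died_treated / n_treated) / (died_untreated / n_untreated)   # 0.0

        print(f"naive reduction: {naive:.0%}, true reduction: {true:.0%}")

    Attending only to the numerators (40 vs. 80 deaths) suggests a 50% risk reduction where there is none; icon arrays make the denominators visible and remove the illusion.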

  17. Refinement of light-responsive transcript lists using rice oligonucleotide arrays: evaluation of gene-redundancy.

    PubMed

    Jung, Ki-Hong; Dardick, Christopher; Bartley, Laura E; Cao, Peijian; Phetsom, Jirapa; Canlas, Patrick; Seo, Young-Su; Shultz, Michael; Ouyang, Shu; Yuan, Qiaoping; Frank, Bryan C; Ly, Eugene; Zheng, Li; Jia, Yi; Hsia, An-Ping; An, Kyungsook; Chou, Hui-Hsien; Rocke, David; Lee, Geun Cheol; Schnable, Patrick S; An, Gynheung; Buell, C Robin; Ronald, Pamela C

    2008-10-06

    Studies of gene function are often hampered by gene-redundancy, especially in organisms with large genomes such as rice (Oryza sativa). We present an approach for using transcriptomics data to focus functional studies and address redundancy. To this end, we have constructed and validated an inexpensive and publicly available rice oligonucleotide near-whole genome array, called the rice NSF45K array. We generated expression profiles for light- vs. dark-grown rice leaf tissue and validated the biological significance of the data by analyzing sources of variation and confirming expression trends with reverse transcription polymerase chain reaction. We examined trends in the data by evaluating enrichment of gene ontology terms at multiple false discovery rate thresholds. To compare data generated with the NSF45K array with published results, we developed publicly available, web-based tools (www.ricearray.org). The Oligo and EST Anatomy Viewer enables visualization of EST-based expression profiling data for all genes on the array. The Rice Multi-platform Microarray Search Tool facilitates comparison of gene expression profiles across multiple rice microarray platforms. Finally, we incorporated gene expression and biochemical pathway data to reduce the number of candidate gene products putatively participating in the eight steps of the photorespiration pathway from 52 to 10, based on expression levels of putatively functionally redundant genes. We confirmed the efficacy of this method to cope with redundancy by correctly predicting participation in photorespiration of a gene with five paralogs. Applying these methods will accelerate rice functional genomics.

  18. Pacific Array of, by and for Global Deep Earth Research

    NASA Astrophysics Data System (ADS)

    Kawakatsu, H.

    2016-12-01

    Recent advances in ocean bottom geophysical observations, together with advances in analysis methodology, have enabled us to resolve the regional 1-D structure of the entire lithosphere-asthenosphere system (LAS), from the surface to a depth of ~200 km, including (azimuthal) seismic anisotropy, with deployments of ~10-15 BBOBSs and OBEMs each for a year or so (Takeo et al., 2013, 2016; Baba et al., 2013; Lin et al., 2016). Thus the in-situ characterization of the physical properties of the entire oceanic LAS, without the a priori assumption for the shallow-most structure often made in global studies, has become possible. We are now entering a new stage in which a large-scale array experiment in the ocean (e.g., Pacific Array: http://gachon.eri.u-tokyo.ac.jp/hitosi/PArray/) has become approachable: deploying 10-15 BBOBSs as an array unit for 1-2 years, and repeating such deployments in a leap-frog way or concurrently (an array of arrays) for a decade or so, would enable us to cover a large portion of the Pacific basin. Such array observations, not only by giving regional constraints on the 1-D structure (including seismic anisotropy) but also by supplying waveform data for global-scale waveform tomography (e.g., Fichtner et al. 2010; French et al. 2013; Zhu & Tromp 2013), would drastically increase our knowledge of how plate tectonics works beneath ocean basins, as well as of the large-scale picture of the Earth's interior. For such an array of arrays to be realized, international collaboration is essential. If three or four countries collaborate, it may be achieved within a 10-year time frame, which makes this concept attractive. It is also essential that the global seismology, geodynamics, and deep earth (GSGD) communities work closely with the ocean science community for Pacific Array to be realized, as they would benefit most from it. While unit array deployments may have their own scientific goals, it is important that they are planned to fit within a larger international Pacific Array structure. The GSGD community should take the lead in providing such an umbrella, as well as in stimulating collaborations between different disciplines.

  19. Tumor Touch Imprints as Source for Whole Genome Analysis of Neuroblastoma Tumors

    PubMed Central

    Brunner, Clemens; Brunner-Herglotz, Bettina; Ziegler, Andrea; Frech, Christian; Amann, Gabriele; Ladenstein, Ruth; Ambros, Inge M.; Ambros, Peter F.

    2016-01-01

    Introduction: Tumor touch imprints (TTIs) are routinely used for the molecular diagnosis of neuroblastomas by interphase fluorescence in-situ hybridization (I-FISH). However, in order to facilitate a comprehensive, up-to-date molecular diagnosis of neuroblastomas and to identify new markers to refine risk and therapy stratification methods, whole genome approaches are needed. We examined the applicability of an ultra-high density SNP array platform that identifies copy number changes of varying sizes, down to a few exons, for the detection of genomic changes in tumor DNA extracted from TTIs. Material and Methods: DNAs were extracted from TTIs of 46 neuroblastomas and 4 other pediatric tumors. The DNAs were analyzed on the CytoScan HD SNP array platform to evaluate numerical and structural genomic aberrations. The quality of the data obtained from TTIs was compared to that from randomly chosen fresh or fresh-frozen solid tumors (n = 212), and I-FISH validation was performed. Results: SNP array profiles were obtained from 48 (out of 50) TTI DNAs, of which 47 showed genomic aberrations. The high marker density allowed for single-gene analysis, e.g. loss of nine exons in the ATRX gene, and the visualization of chromothripsis. Data quality was comparable to fresh or fresh-frozen tumor SNP profiles. SNP array results were confirmed by I-FISH. Conclusion: TTIs are an excellent source for SNP array processing, with the advantage of simple handling, distribution and storage of tumor tissue on glass slides. The minimal amount of tumor tissue needed to analyze whole genomes makes TTIs an economic surrogate source in the molecular diagnostic work-up of tumor samples. PMID:27560999

  20. Three gangliogliomas: results of GTG-banding, SKY, genome-wide high resolution SNP-array, gene expression and review of the literature.

    PubMed

    Xu, Li-Xin; Holland, Heidrun; Kirsten, Holger; Ahnert, Peter; Krupp, Wolfgang; Bauer, Manfred; Schober, Ralf; Mueller, Wolf; Fritzsch, Dominik; Meixensberger, Jürgen; Koschny, Ronald

    2015-04-01

    According to the World Health Organization, gangliogliomas are classified as well-differentiated and slowly growing neuroepithelial tumors, composed of neoplastic mature ganglion and glial cells. Ganglioglioma is the most frequent tumor entity observed in patients with long-term epilepsy. Comprehensive cytogenetic and molecular cytogenetic data, including high-resolution genomic profiling (single nucleotide polymorphism (SNP) arrays), of gangliogliomas are scarce but necessary for a better oncological understanding of this tumor entity. For a detailed characterization at the single-cell and cell-population levels, we analyzed genomic alterations of three gangliogliomas using trypsin-Giemsa banding (GTG-banding) and spectral karyotyping (SKY) in combination with SNP-array and gene expression array experiments. By GTG and SKY, we could confirm frequently detected chromosomal aberrations (losses within chromosomes 10, 13 and 22; gains within chromosomes 5, 7, 8 and 12), and identify previously unknown genetic aberrations such as the unbalanced non-reciprocal translocation t(1;18)(q21;q21). Interestingly, we report on only the second ganglioglioma detected so far with a ring chromosome 1. Analyses of SNP-array data from two of the tumors and the respective germline DNA (peripheral blood) identified few small gains and losses and a number of copy-neutral regions with loss of heterozygosity (LOH) in germline and tumor tissue. In comparison to germline DNA, tumor tissues did not show substantial regions with significant loss or gain or with newly developed LOH. Gene expression analyses of tumor-specific genes revealed similarities in the profiles of the analyzed samples regarding different relevant pathways. Taken together, we describe overlapping but also distinct and novel genetic aberrations of three gangliogliomas. © 2014 Japanese Society of Neuropathology.

  1. Wire array Z-pinch insights for enhanced x-ray production

    NASA Astrophysics Data System (ADS)

    Sanford, T. W. L.; Mock, R. C.; Spielman, R. B.; Haines, M. G.; Chittenden, J. P.; Whitney, K. G.; Apruzese, J. P.; Peterson, D. L.; Greenly, J. B.; Sinars, D. B.; Reisman, D. B.; Mosher, D.

    1999-05-01

    Comparisons of measured total radiated x-ray power from annular wire-array z-pinches with a variety of models, as a function of wire number, array mass, and load radius, are reviewed. The data, which are comprehensive, have provided important insights into the features of wire-array dynamics that are critical for high x-ray power generation. Collectively, the comparisons of the data with the model calculations suggest that a number of underlying dynamical mechanisms involving cylindrical asymmetries and plasma instabilities contribute to the measured characteristics. For example, under the general assumption that the measured risetime of the total-radiated-power pulse is related to the thickness of the plasma shell formed on axis, the Heuristic Model [IEEE Trans. Plasma Sci. 26, 1275 (1998)] agrees with the measured risetime under a number of specific assumptions about the way the breakdown of the wires, the wire-plasma expansion, and the Rayleigh-Taylor instability in the r-z plane develop. Likewise, in the high wire-number regime (where the wires are calculated to form a plasma shell prior to significant radial motion of the shell), the comparisons show that the variation in the power of the radiation generated as a function of load mass and array radius can be simulated by the two-dimensional Eulerian-radiation-magnetohydrodynamics code (E-RMHC) [Phys. Plasmas 3, 368 (1996)], using a single random-density perturbation that seeds the Rayleigh-Taylor instability in the r-z plane. For a given pulse-power generator, the comparisons suggest that (1) the smallest interwire gaps compatible with practical load construction and (2) the minimum implosion time consistent with the optimum required energy coupling of the generator to the load should produce the highest total-radiated-power levels.

  2. Contextual Compression of Large-Scale Wind Turbine Array Simulations: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny M; Brunhart-Lupo, Nicholas J; Potter, Kristin C

    Data sizes are becoming a critical issue, particularly for HPC applications. We have developed a user-driven lossy wavelet-based storage model to facilitate the analysis and visualization of large-scale wind turbine array simulations. The model stores data as heterogeneous blocks of wavelet coefficients, providing high-fidelity access to user-defined data regions believed the most salient, while providing lower-fidelity access to less salient regions on a block-by-block basis. In practice, by retaining the wavelet coefficients as a function of feature saliency, we have seen data reductions in excess of 94 percent, while retaining lossless information in the turbine-wake regions most critical to analysis and providing enough (low-fidelity) contextual information in the upper atmosphere to track incoming coherent turbulent structures. Our contextual wavelet compression approach has allowed us to deliver interactive visual analysis while giving the user control over where data loss, and thus reduction in accuracy, occurs in the analysis. We argue this reduced but contextualized representation is a valid approach and encourages contextual data management.
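
    As a toy illustration of saliency-dependent wavelet truncation (not the authors' block-based storage model), the following sketch keeps all detail coefficients inside a user-defined salient interval of a 1-D signal and zeroes the rest; it assumes the PyWavelets package is available.

        import numpy as np
        import pywt

        # Synthetic 1-D stand-in for a flow field: smooth background plus a
        # sharp wake-like feature near x = 0.6.
        x = np.linspace(0.0, 1.0, 4096)
        signal = np.sin(2 * np.pi * 4 * x) + np.exp(-((x - 0.6) / 0.01) ** 2)

        salient = (0.55, 0.65)        # user-defined region deemed most salient

        coeffs = pywt.wavedec(signal, "db4", level=5)
        new_coeffs, kept, total = [coeffs[0]], 0, 0   # keep the coarse approximation
        for c in coeffs[1:]:
            pos = (np.arange(len(c)) + 0.5) / len(c)  # coefficient positions in [0, 1)
            mask = (pos >= salient[0]) & (pos <= salient[1])
            kept += int(mask.sum()); total += len(c)
            new_coeffs.append(np.where(mask, c, 0.0)) # zero non-salient details
        recon = pywt.waverec(new_coeffs, "db4")

        print(f"detail coefficients kept: {kept}/{total} "
              f"({1 - kept / total:.0%} discarded)")

    The reconstruction stays sharp inside the salient interval and degrades gracefully elsewhere, which is the essence of the contextual-compression trade-off; the paper's model additionally grades fidelity per block rather than zeroing coefficients outright.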

  3. Contextual Compression of Large-Scale Wind Turbine Array Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny M; Brunhart-Lupo, Nicholas J; Potter, Kristin C

    Data sizes are becoming a critical issue, particularly for HPC applications. We have developed a user-driven lossy wavelet-based storage model to facilitate the analysis and visualization of large-scale wind turbine array simulations. The model stores data as heterogeneous blocks of wavelet coefficients, providing high-fidelity access to user-defined data regions believed the most salient, while providing lower-fidelity access to less salient regions on a block-by-block basis. In practice, by retaining the wavelet coefficients as a function of feature saliency, we have seen data reductions in excess of 94 percent, while retaining lossless information in the turbine-wake regions most critical to analysis and providing enough (low-fidelity) contextual information in the upper atmosphere to track incoming coherent turbulent structures. Our contextual wavelet compression approach has allowed us to deliver interactive visual analysis while giving the user control over where data loss, and thus reduction in accuracy, occurs in the analysis. We argue this reduced but contextualized representation is a valid approach and encourages contextual data management.

  4. Nanoparticle Analysis by Online Comprehensive Two-Dimensional Liquid Chromatography combining Hydrodynamic Chromatography and Size-Exclusion Chromatography with Intermediate Sample Transformation

    PubMed Central

    2017-01-01

    Polymeric nanoparticles have become indispensable in modern society, with a wide array of applications ranging from waterborne coatings to drug-carrier-delivery systems. While a large range of techniques exists to determine a multitude of properties of these particles, relating the physicochemical properties of a particle to the chemical structure of its constituent polymers is still challenging. A novel, highly orthogonal separation system based on comprehensive two-dimensional liquid chromatography (LC × LC) has been developed. The system combines hydrodynamic chromatography (HDC) in the first dimension to separate the particles based on their size, with ultrahigh-performance size-exclusion chromatography (SEC) in the second dimension to separate the constituent polymer molecules according to their hydrodynamic radius for each of 80 to 100 separated fractions. A chip-based mixer is incorporated to transform the sample by dissolving the separated nanoparticles from the first dimension online in tetrahydrofuran. The polymer bands are then focused using stationary-phase-assisted modulation to enhance sensitivity, and the water from the first-dimension eluent is largely eliminated to allow interaction-free SEC. Using the developed system, the combined two-dimensional distribution of the particle size and the molecular size of a mixture of various polystyrene (PS) and polyacrylate (PACR) nanoparticles has been obtained within 60 min. PMID:28745485

  5. Large Phased Array Radar Using Networked Small Parabolic Reflectors

    NASA Technical Reports Server (NTRS)

    Amoozegar, Farid

    2006-01-01

    Multifunction phased array systems with radar, telecom, and imaging applications have already been established for flat-plate phased arrays of dipoles or waveguides. In this paper the design trades and candidate options for combining the radar and telecom functions of the Deep Space Network (DSN) into a single large transmit array of small parabolic reflectors are discussed. In particular, the effect of combining the radar and telecom functions on the sizes of individual antenna apertures and the corresponding spacing between the antenna elements of the array is analyzed. A heterogeneous architecture for the DSN large transmit array is proposed to meet the radar and telecom requirements while respecting budget, scheduling, and strategic planning constraints.

  6. Scaled-Up Production and Transport Applications of Graphitic Carbon Nanomaterials

    NASA Astrophysics Data System (ADS)

    Saviers, Kimberly R.

    Graphitic carbon nanomaterials enhance the performance of engineered systems for energy harvesting and storage. However, commercial availability remains largely cost-prohibitive due to technical barriers to mass production. This thesis examines both the scaled-up production and energy transport applications of graphitic materials. Cost-driven production of graphitic petals is developed, carbon nanotube array thermal interface materials enhance waste heat energy harvesting, and microsupercapacitors are visually examined using a new electroreflectance measurement method. Graphitic materials have previously been synthesized using batch-style processing methods with small sample sizes, limiting their commercial viability. In order to increase production throughput, a roll-to-roll radio-frequency plasma chemical vapor deposition method is employed to continuously deposit graphitic petals on carbon fiber tow. In consideration of a full production framework, efficient and informative characterization methods in the form of electrical resistance and electrochemical capacitance are highlighted. To co-optimize the functional characteristics of the material, the processing conditions are comprehensively varied using a data-driven predictive design of experiments method. Repeatable and reliable production of graphitic materials will enable a host of creative graphene-based devices to emerge into the marketplace. Two such applications are discussed in the remaining chapters. Waste heat is most efficiently harvested at high temperatures, such as vehicle exhaust systems near 600°C. However, the resistance to heat flux at the interfaces between the harvesting device and its surroundings is detrimental to the system-level performance. To study the performance of thermal interface materials up to 700°C, a reference bar measurement method was designed. Design considerations are discussed and compared to past implementations, particularly regarding radiation heat flux and thermal expansion at these elevated temperatures. The microscale roughness of the contacting measurement surface is fully characterized, as it fundamentally affects the resulting thermal interface resistance. This comprehensive method for determining thermal interface resistance at high temperatures includes the physical equipment, data acquisition system, and data analysis method. Thermomechanical evaluation of carbon nanotube arrays up to 700°C has shown that the arrays provide mechanical flexibility to accommodate thermal expansion in a thermomechanically mismatched interface. To demonstrate the application of the arrays for improving energy generation, they were evaluated in conjunction with a thermoelectric module. The system-level efficiency increases significantly when a carbon nanotube array is applied to the hot side of the thermoelectric module. Additional materials characterization suggests the presence of a strong thermal connection between the carbon nanotubes and their catalyst layers, due to covalent bonding between them. In another application of harvesting waste heat, the carbon nanotube arrays increase the performance of a thermo-magnetically actuated shuttle device for solar photovoltaic cells due to decreased thermal interface resistance. Vertically-oriented graphitic petals have previously enhanced supercapacitor power density. Here, a spatiotemporal characterization method is developed and utilized to study ageing phenomena in microsupercapacitor electrodes.
The electroreflectance method captures images of charge accumulation in the electrodes at varying states during each charge-discharge cycle. The method was exploited by imaging both an ideal device and a device with defects over an extended period of more than four million cycles. The charge accumulation patterns over the ageing period relate to the physical transport behavior. During a single discharge cycle, one may visually observe the electrons drifting out of the electrode. Overall, the investigations herein determine the following. Continuous production of graphitic petals is possible and is optimized by considering the effect of plasma conditions on the resulting functional performance of the material. Thermal interface resistance may be measured at high temperatures in order to understand the viability of interface materials for energy harvesting applications. Carbon nanotube array thermal interface materials lead to increased energy generation from thermoelectric modules. Spatial electroreflectance measurements of microsupercapacitors lead to the observation of decreased physical wetting between the electrode and electrolyte, impacting device performance. Looking forward, creative application of graphitic carbon nanomaterials, coupled with cost-driven production capability, will launch them into the commercial marketplace.

  7. A Comprehensive Infrastructure for Big Data in Cancer Research: Accelerating Cancer Research and Precision Medicine

    PubMed Central

    Hinkson, Izumi V.; Davidsen, Tanja M.; Klemm, Juli D.; Chandramouliswaran, Ishwar; Kerlavage, Anthony R.; Kibbe, Warren A.

    2017-01-01

    Advancements in next-generation sequencing and other -omics technologies are accelerating the detailed molecular characterization of individual patient tumors, and driving the evolution of precision medicine. Cancer is no longer considered a single disease, but rather, a diverse array of diseases wherein each patient has a unique collection of germline variants and somatic mutations. Molecular profiling of patient-derived samples has led to a data explosion that could help us understand the contributions of environment and germline to risk, therapeutic response, and outcome. To maximize the value of these data, an interdisciplinary approach is paramount. The National Cancer Institute (NCI) has initiated multiple projects to characterize tumor samples using multi-omic approaches. These projects harness the expertise of clinicians, biologists, computer scientists, and software engineers to investigate cancer biology and therapeutic response in multidisciplinary teams. Petabytes of cancer genomic, transcriptomic, epigenomic, proteomic, and imaging data have been generated by these projects. To address the data analysis challenges associated with these large datasets, the NCI has sponsored the development of the Genomic Data Commons (GDC) and three Cloud Resources. The GDC ensures data and metadata quality, ingests and harmonizes genomic data, and securely redistributes the data. During its pilot phase, the Cloud Resources tested multiple cloud-based approaches for enhancing data access, collaboration, computational scalability, resource democratization, and reproducibility. These NCI-led efforts are continuously being refined to better support open data practices and precision oncology, and to serve as building blocks of the NCI Cancer Research Data Commons. PMID:28983483

  8. Observatories Combine to Crack Open the Crab Nebula

    NASA Image and Video Library

    2017-12-08

    Astronomers have produced a highly detailed image of the Crab Nebula by combining data from telescopes spanning nearly the entire breadth of the electromagnetic spectrum, from radio waves seen by the Karl G. Jansky Very Large Array (VLA) to the powerful X-ray glow seen by the orbiting Chandra X-ray Observatory. In between that range of wavelengths are the Hubble Space Telescope's crisp visible-light view and the infrared perspective of the Spitzer Space Telescope. This composite image of the Crab Nebula, a supernova remnant, was assembled by combining data from five telescopes spanning nearly the entire breadth of the electromagnetic spectrum: the Very Large Array, the Spitzer Space Telescope, the Hubble Space Telescope, the XMM-Newton Observatory, and the Chandra X-ray Observatory. Credits: NASA, ESA, NRAO/AUI/NSF and G. Dubner (University of Buenos Aires)

  9. Diffraction pattern simulation of cellulose fibrils using distributed and quantized pair distances

    DOE PAGES

    Zhang, Yan; Inouye, Hideyo; Crowley, Michael; ...

    2016-10-14

    Intensity simulation of X-ray scattering from large twisted cellulose molecular fibrils is important in understanding the impact of chemical or physical treatments on structural properties such as twisting or coiling. This paper describes a highly efficient method for the simulation of X-ray diffraction patterns from complex fibrils using atom-type-specific pair-distance quantization. Pair distances are sorted into arrays which are labelled by atom type. Histograms of pair distances in each array are computed and binned and the resulting population distributions are used to represent the whole pair-distance data set. These quantized pair-distance arrays are used with a modified and vectorized Debye formula to simulate diffraction patterns. This approach utilizes fewer pair distances in each iteration, and atomic scattering factors are moved outside the iteration since the arrays are labelled by atom type. As a result, this algorithm significantly reduces the computation time while maintaining the accuracy of diffraction pattern simulation, making possible the simulation of diffraction patterns from large twisted fibrils in a relatively short period of time, as is required for model testing and refinement.
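
    A minimal sketch of the quantized-pair-distance Debye summation is given below. It uses constant per-type scattering factors and a random point set as stand-ins for real q-dependent factors and cellulose fibril coordinates, so it illustrates only the binning-by-atom-type idea, not the authors' full vectorized implementation.

        import numpy as np

        # Toy Debye-formula simulation with atom-type-binned pair distances.
        rng = np.random.default_rng(1)
        n_atoms = 2000
        pos = rng.uniform(0.0, 50.0, size=(n_atoms, 3))   # coordinates (angstroms)
        types = rng.choice(["C", "O"], size=n_atoms)
        f = {"C": 6.0, "O": 8.0}                          # crude constant factors

        q = np.linspace(0.05, 2.0, 200)                   # scattering vector magnitudes
        bins = np.arange(0.0, 90.0, 0.05)                 # pair-distance quantization grid
        centers = 0.5 * (bins[:-1] + bins[1:])

        intensity = np.zeros_like(q)
        labels = list(np.unique(types))
        for i, a in enumerate(labels):
            for b in labels[i:]:
                pa, pb = pos[types == a], pos[types == b]
                d = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=-1)
                d = d[d > 0].ravel()                      # drop self-pairs (r = 0)
                hist, _ = np.histogram(d, bins=bins)      # quantized pair distances
                sinc = np.sinc(q[:, None] * centers[None, :] / np.pi)  # sin(qr)/(qr)
                term = f[a] * f[b] * (hist * sinc).sum(axis=1)
                # same-type blocks already count ordered pairs; cross-type
                # blocks are counted once above, hence the factor of 2
                intensity += term if a == b else 2.0 * term
        intensity += sum(f[t] ** 2 for t in types)        # self-scattering terms
        print(intensity[:5])

    Because the scattering factors are pulled outside the distance loop, the cost per q value scales with the number of histogram bins rather than the number of atom pairs, which is the source of the speedup the abstract describes.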

  10. Measurement Of Trailing Edge Noise using Directional Array and Coherent Output Power Methods

    NASA Technical Reports Server (NTRS)

    Hutcheson, Florence V.; Brooks, Thomas F.

    2002-01-01

    The use of a directional array of microphones for the measurement of trailing edge (TE) noise is described. The capabilities of this method are evaluated via measurements of TE noise from a NACA 63-215 airfoil model and from a cylindrical rod. This TE noise measurement approach is compared to one based on the cross-spectral analysis of output signals from a pair of microphones (the COP method). Advantages and limitations of both methods are examined. It is shown that the microphone array accurately measures TE noise and captures its two-dimensional characteristic over a large frequency range for any TE configuration, as long as noise contamination from extraneous sources is within bounds. The COP method is shown to also accurately measure TE noise, but over a more limited frequency range that narrows with increased TE thickness. Finally, the applicability and generality of an airfoil self-noise prediction method were evaluated via comparison to the experimental data obtained using the COP and array measurement methods. The predicted and experimental results are shown to agree over large frequency ranges.
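
    The cross-spectral side of the comparison can be sketched with standard spectral estimators. The snippet below uses synthetic microphone signals and one common definition of the coherent output power, |G12|^2 / G11; the paper's exact estimator and processing parameters are not given here, so this is only an illustration of the technique's principle.

        import numpy as np
        from scipy.signal import csd, welch

        # Synthetic two-microphone setup: a shared trailing-edge-like source
        # plus independent noise at each microphone (illustrative only).
        fs = 51200
        rng = np.random.default_rng(2)
        source = rng.standard_normal(4 * fs)
        mic1 = source + 0.8 * rng.standard_normal(source.size)
        mic2 = -source + 0.8 * rng.standard_normal(source.size)  # opposite phase

        f, G12 = csd(mic1, mic2, fs=fs, nperseg=4096)    # cross-spectral density
        _, G11 = welch(mic1, fs=fs, nperseg=4096)        # autospectrum of mic 1

        # One common form of the coherent output power: the part of the
        # spectrum coherent between the two microphones.
        cop = np.abs(G12) ** 2 / G11
        print(cop[:5])

    Uncorrelated extraneous noise averages out of the cross-spectrum, which is why the two-microphone estimate isolates the shared TE source; the directional array achieves a similar rejection spatially, by beamforming.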

  11. The AlpArray Seismic Network: current status and next steps

    NASA Astrophysics Data System (ADS)

    Hetényi, György; Molinari, Irene; Clinton, John; Kissling, Edi

    2016-04-01

    The AlpArray initiative (http://www.alparray.ethz.ch) is a large-scale European collaboration to study the entire Alpine orogen at high resolution and in 3D with a large variety of geoscientific methods. The core element of the initiative is an extensive and dense broadband seismological network, the AlpArray Seismic Network (AASN), which complements the permanent seismological stations to ensure homogeneous coverage of the greater Alpine area. The roughly 260 temporary stations of the AlpArray Seismic Network are operated as a joint effort by a number of institutions from Austria, Bosnia-Herzegovina, Croatia, Czech Republic, France, Germany, Hungary, Italy, Slovakia and Switzerland. The first stations were installed in Spring 2015, and the full AASN is planned to be operational by early Summer 2016. In this poster we present the current status of the deployment, the effort undertaken by the contributing groups, station performance, typical noise levels, best practices in installation as well as in data management, frequently encountered challenges, and planned next steps, including the deployment of ocean bottom seismometers in the Ligurian Sea.

  12. Diffraction pattern simulation of cellulose fibrils using distributed and quantized pair distances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yan; Inouye, Hideyo; Crowley, Michael

    Intensity simulation of X-ray scattering from large twisted cellulose molecular fibrils is important in understanding the impact of chemical or physical treatments on structural properties such as twisting or coiling. This paper describes a highly efficient method for the simulation of X-ray diffraction patterns from complex fibrils using atom-type-specific pair-distance quantization. Pair distances are sorted into arrays which are labelled by atom type. Histograms of pair distances in each array are computed and binned and the resulting population distributions are used to represent the whole pair-distance data set. These quantized pair-distance arrays are used with a modified and vectorized Debye formula to simulate diffraction patterns. This approach utilizes fewer pair distances in each iteration, and atomic scattering factors are moved outside the iteration since the arrays are labelled by atom type. This algorithm significantly reduces the computation time while maintaining the accuracy of diffraction pattern simulation, making possible the simulation of diffraction patterns from large twisted fibrils in a relatively short period of time, as is required for model testing and refinement.

  14. A 34K SNP genotyping array for Populus trichocarpa: design, application to the study of natural populations and transferability to other Populus species.

    PubMed

    Geraldes, A; Difazio, S P; Slavov, G T; Ranjan, P; Muchero, W; Hannemann, J; Gunter, L E; Wymore, A M; Grassa, C J; Farzaneh, N; Porth, I; McKown, A D; Skyba, O; Li, E; Fujita, M; Klápště, J; Martin, J; Schackwitz, W; Pennacchio, C; Rokhsar, D; Friedmann, M C; Wasteneys, G O; Guy, R D; El-Kassaby, Y A; Mansfield, S D; Cronk, Q C B; Ehlting, J; Douglas, C J; Tuskan, G A

    2013-03-01

    Genetic mapping of quantitative traits requires genotypic data for large numbers of markers in many individuals. For such studies, the use of large single nucleotide polymorphism (SNP) genotyping arrays still offers the most cost-effective solution. Herein we report on the design and performance of a SNP genotyping array for Populus trichocarpa (black cottonwood). This genotyping array was designed with SNPs pre-ascertained in 34 wild accessions covering most of the species latitudinal range. We adopted a candidate gene approach to the array design that resulted in the selection of 34 131 SNPs, the majority of which are located in, or within 2 kb of, 3543 candidate genes. A subset of the SNPs on the array (539) was selected based on patterns of variation among the SNP discovery accessions. We show that more than 95% of the loci produce high quality genotypes and that the genotyping error rate for these is likely below 2%. We demonstrate that even among small numbers of samples (n = 10) from local populations over 84% of loci are polymorphic. We also tested the applicability of the array to other species in the genus and found that the number of polymorphic loci decreases rapidly with genetic distance, with the largest numbers detected in other species in section Tacamahaca. Finally, we provide evidence for the utility of the array to address evolutionary questions such as intraspecific studies of genetic differentiation, species assignment and the detection of natural hybrids. © 2013 Blackwell Publishing Ltd.

  15. Large-pitch steerable synthetic transmit aperture imaging (LPSSTA)

    NASA Astrophysics Data System (ADS)

    Li, Ying; Kolios, Michael C.; Xu, Yuan

    2016-04-01

    A linear ultrasound array system usually has a larger pitch and is less costly than a phased array system, but loses the ability to fully steer the ultrasound beam. In this paper, we propose a system whose hardware is similar to a large-pitch linear array system, but whose ability to steer the beam is similar to that of a phased array system. The motivation is to reduce the total number of measurement channels M (the product of the number of transmissions, nT, and the number of receive channels in each transmission, nR), while maintaining reasonable image quality. We combined adjacent elements (with proper delays introduced) into groups that were used in both the transmit and receive processes of synthetic transmit aperture (STA) imaging. After the M channels of RF data were acquired, a pseudo-inversion was applied to estimate the equivalent signals of traditional STA and reconstruct an STA image. Even with a similar M, different choices of nT and nR will produce different image quality. The images produced with M = N²/15 in the selected regions of interest (ROI) were demonstrated to be comparable with those of a full phased array, where N is the number of array elements. The disadvantage of the proposed system is that its field of view in one delay configuration is smaller than that of a standard full phased array. However, by adjusting the delay for each element within each group, the beam can be steered to cover the same field of view as the standard fully-filled phased array. The LPSSTA system might be useful for 3D ultrasound imaging.
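
    The pseudo-inversion step can be sketched generically: if the grouped measurements are modeled as a linear mixing of the equivalent element-wise STA signals, a minimum-norm estimate follows from the Moore-Penrose pseudo-inverse. The mixing matrix and array shapes below are hypothetical placeholders, not the paper's actual delay-and-sum geometry.

        import numpy as np

        # Generic sketch of the pseudo-inversion step. The grouped channel
        # data y are modeled as y = A @ x, where x holds the equivalent
        # element-wise STA signals; A is a hypothetical stand-in for the
        # real group-combination matrix, which the abstract does not specify.
        rng = np.random.default_rng(3)
        N = 64                          # array elements
        M = N * N // 15                 # reduced measurement count, M = N^2/15
        x_true = rng.standard_normal(N * N)     # full STA data (one time sample)
        A = rng.standard_normal((M, N * N)) / N
        y = A @ x_true                  # the M acquired channels

        x_est = np.linalg.pinv(A) @ y   # minimum-norm least-squares estimate
        print(np.linalg.norm(A @ x_est - y))  # fits the measurements (residual ~ 0)

    Since M < N², the system is underdetermined and the estimate is not unique; the physical delay structure of the groups is what makes the reconstruction usable in the selected regions of interest.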

  16. Preliminary Results from an Hydroacoustic Experiment in the Indian Ocean

    NASA Astrophysics Data System (ADS)

    Royer, J.; Dziak, R. P.; Delatre, M.; Brachet, C.; Haxel, J. H.; Matsumoto, H.; Goslin, J.; Brandon, V.; Bohnenstiehl, D. R.; Guinet, C.; Samaran, F.

    2008-12-01

    We report initial results from a 14-month hydroacoustic experiment in the Indian Ocean conducted by CNRS/University of Brest and NOAA/Oregon State University. The objective was to monitor the low-level seismic activity associated with the three contrasting spreading ridges and deforming zones in the Indian Ocean. Three autonomous hydrophones, moored in the SOFAR channel, were deployed in October 2006 and recovered early in 2008 by R/V Marion Dufresne, in the Madagascar Basin and northeast and southwest of Amsterdam Island, complementing the two permanent hydroacoustic stations of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) located near Diego Garcia Island and off Cape Leeuwin. Our temporary network detected more than 2000 events. Inside the array, we located 592 events (compared to 49 in the NEIC earthquake catalog) with location errors of less than 5 km and time errors of less than 2 s. The hydrophone array detected on average 5 to 40 times more events per month than land-based networks. First-order observations indicate that hydroacoustic seismicity along the Southeast Indian Ridge (SEIR) occurs predominantly along the transform faults. The Southwest Indian Ridge exhibits some periodicity in earthquake activity between adjacent ridge segments. Two large tectonic/volcanic earthquake swarms were observed along the Central Indian Ridge (near the triple junction) in September and December 2007. Moreover, many off-ridge-axis events are also observed both south and north of the SEIR axis. Improved localization using the CTBTO records will help refine these preliminary results and further investigate extended volcanic sequences along the SEIR east of 80°E and other events outside of the temporary array. The records also display numerous vocalizations of baleen whales in the 20-40 Hz band. The calls are attributed to fin whales, Antarctic blue whales, and pygmy blue whales of the Madagascar and Australian types. Their vocal activity is found to be highly seasonal, occurring mainly from April to October, with subspecies variations. This array thus provides a unique data set to improve our understanding of the seismic activity in this region and to establish the occurrence and migration patterns of critically endangered whale species.

  17. Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing

    NASA Astrophysics Data System (ADS)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.

    2013-12-01

    Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals; instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time series. We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis). Here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and we show how to leverage this effect to identify bad array elements through a jackknifing process that isolates the anomalous channels, so that an automated analysis system might discard them prior to FK analysis and beamforming on events of interest.
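
    A bare-bones version of the dimension-plus-jackknife idea might look like the following, where the effective dimension is taken as the number of singular values needed to capture a fixed fraction of the total signal power; the threshold and synthetic data are illustrative only, not the authors' QC criteria.

        import numpy as np

        def effective_dim(X, energy=0.95):
            """Singular values needed to capture `energy` of the total power."""
            s = np.linalg.svd(X, compute_uv=False)
            frac = np.cumsum(s ** 2) / np.sum(s ** 2)
            return int(np.searchsorted(frac, energy)) + 1

        # Synthetic array: nine coherent channels plus one malfunctioning one.
        rng = np.random.default_rng(4)
        common = rng.standard_normal(5000)
        X = np.vstack([common + 0.1 * rng.standard_normal(5000) for _ in range(10)])
        X[7] = rng.standard_normal(5000)          # dead or noisy element

        base = effective_dim(X)
        for ch in range(X.shape[0]):
            d = effective_dim(np.delete(X, ch, axis=0))   # jackknife: leave one out
            if d < base:
                print(f"channel {ch} inflates the subspace dimension (suspect)")

    A coherent wavefront keeps the array-wide dimension low; a malfunctioning channel adds an extra independent component, and the leave-one-out pass pinpoints which channel is responsible.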

  18. Overview of the 2009 and 2011 Sayarim Infrasound Calibration Experiments

    NASA Astrophysics Data System (ADS)

    Fee, D.; Waxler, R.; Drob, D.; Gitterman, Y.; Given, J.

    2012-04-01

    The establishment of the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has stimulated infrasound research and development. However, as the network comes closer to completion, there exists a lack of large, well-constrained sources with which to test the network and its capabilities. Significant uncertainties also exist in long-range acoustic propagation due to a dynamic, difficult-to-characterize atmosphere, particularly the thermosphere. In 2009 and 2011, three large-scale infrasound calibration experiments were performed in Europe, the Middle East, Africa, and Asia. The goal of the calibration experiments was to test the IMS infrasound network and validate atmospheric and propagation models with large, well-constrained infrasound sources. This presentation provides an overview of the calibration experiments, including deployment, atmospheric conditions during the experiments, explosion characterization, infrasonic signal detection and identification, and a discussion of the results and implications. Each calibration experiment consisted of a single surface detonation of explosives, with nominal weights of 82, 10.24, and 102.08 tons on 26 August 2009, 24 January 2011, and 26 January 2011, respectively. These explosions were designed and conducted by the Geophysical Institute of Israel at the Sayarim Military Range, Israel, and produced significant infrasound detected by numerous permanent and temporary infrasound arrays in the region. The 2009 experiment was performed in the summer to take advantage of the westerly stratospheric winds. Infrasonic arrivals were detected by both IMS and temporary arrays deployed to the north and west of the source, including clear stratospheric arrivals and thermospheric arrivals with low celerities. The 2011 experiment was performed during the winter, when strong easterly stratospheric winds dominated, in addition to a strong tropospheric jet (the jet stream). These wind jets allowed detection out to 6500 km, in addition to multiple tropospheric, stratospheric, and thermospheric arrivals at arrays deployed to the east. These experiments represented a considerable and successful collaboration between the CTBTO and numerous other groups, and they will provide a rich ground-truth dataset for detailed infrasound studies in the future.

  19. The GermOnline cross-species systems browser provides comprehensive information on genes and gene products relevant for sexual reproduction.

    PubMed

    Gattiker, Alexandre; Niederhauser-Wiederkehr, Christa; Moore, James; Hermida, Leandro; Primig, Michael

    2007-01-01

    We report a novel release of the GermOnline knowledgebase covering genes relevant for the cell cycle, gametogenesis and fertility. GermOnline was extended into a cross-species systems browser including information on DNA sequence annotation, gene expression and the function of gene products. The database covers eight model organisms and Homo sapiens, for which complete genome annotation data are available. The database is now built around a sophisticated genome browser (Ensembl), our own microarray information management and annotation system (MIMAS) used to extensively describe experimental data obtained with high-density oligonucleotide microarrays (GeneChips) and a comprehensive system for online editing of database entries (MediaWiki). The RNA data include results from classical microarrays as well as tiling arrays that yield information on RNA expression levels, transcript start sites and lengths as well as exon composition. Members of the research community are solicited to help GermOnline curators keep database entries on genes and gene products complete and accurate. The database is accessible at http://www.germonline.org/.

  20. Digital Mammography with a Mosaic of CCD-Arrays

    NASA Technical Reports Server (NTRS)

    Jalink, Antony, Jr. (Inventor); McAdoo, James A. (Inventor)

    1996-01-01

    The present invention relates generally to a mammography device and method and more particularly to a novel digital mammography device and method to detect microcalcifications of precancerous tissue. A digital mammography device uses a mosaic of electronic digital imaging arrays to scan an x-ray image. The mosaic of arrays is repositioned several times to expose different portions of the image, until the entire image is scanned. The data generated by the arrays during each exposure is stored in a computer. After the final exposure, the computer combines data of the several partial images to produce a composite of the original x-ray image. An aperture plate is used to reduce scatter and the overall exposure of the patient to x-rays. The novelty of this invention is that it provides a digital mammography device with large field coverage, high spatial resolution, scatter rejection, excellent contrast characteristics and lesion detectability under clinical conditions. This device also shields the patient from excessive radiation, can detect extremely small calcifications and allows manipulation and storage of the image.

  1. Striped tertiary storage arrays

    NASA Technical Reports Server (NTRS)

    Drapeau, Ann L.

    1993-01-01

    Data striping is a technique for increasing the throughput and reducing the response time of large accesses to a storage system. In striped magnetic or optical disk arrays, a single file is striped, or interleaved, across several disks; in a striped tape system, files are interleaved across tape cartridges. Because a striped file can be accessed by several disk drives or tape recorders in parallel, the sustained bandwidth to the file is greater than in non-striped systems, where accesses to the file are restricted to a single device. It is argued that applying striping to tertiary storage systems will provide needed performance and reliability benefits. The performance benefits of striping for applications using large tertiary storage systems are discussed. Commonly available tape drives and libraries are introduced, and their performance limitations are discussed, with particular focus on the long latency of tape accesses. This section also describes an event-driven tertiary storage array simulator that is being used to understand the best ways of configuring these storage arrays. The reliability problems of magnetic tape devices are discussed, and plans are described for modeling the overall reliability of striped tertiary storage arrays to identify the amount of error correction required. Finally, work being done by other members of the Sequoia group to address access latency, to optimize tertiary storage arrays that perform mostly writes, and to apply compression is discussed.
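
    The core mapping in a striped array is a round-robin interleave of file blocks across devices, as in the sketch below; the stripe width and block size are arbitrary illustrative choices, not parameters from the report.

        # Round-robin striping of file blocks across tape drives.
        STRIPE_WIDTH = 4               # number of tape drives
        BLOCK = 256 * 1024             # interleave unit in bytes

        def stripe_map(file_size):
            """Map each file block to (device, block position on that device)."""
            n_blocks = (file_size + BLOCK - 1) // BLOCK
            return [(b, b % STRIPE_WIDTH, b // STRIPE_WIDTH) for b in range(n_blocks)]

        for blk, dev, local in stripe_map(5 * BLOCK):
            print(f"file block {blk} -> drive {dev}, device block {local}")

    Blocks 0-3 land on four different drives, so a large sequential read can proceed on all drives concurrently; that parallelism is the source of the bandwidth gain, while the long seek and load latencies of tape are what the simulator described above is used to study.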

  2. Pulsatile flow and mass transport over an array of cylinders: gas transfer in a cardiac-driven artificial lung.

    PubMed

    Chan, Kit Yan; Fujioka, Hideki; Bartlett, Robert H; Hirschl, Ronald B; Grotberg, James B

    2006-02-01

    The pulsatile flow and gas transport of a Newtonian passive fluid across an array of cylindrical microfibers are numerically investigated. The work relates to an implantable artificial lung in which the blood flow is driven by the right heart. The fibers are modeled as either square or staggered arrays. The pulsatile flow inputs considered in this study are a steady flow with a sinusoidal perturbation and a cardiac flow. The aims of this study are twofold: identifying favorable array geometry/spacing and system conditions that enhance gas transport; and providing pressure-drop data that indicate the degree of flow resistance, or the demand on the right heart, in driving the flow through the fiber bundle. The results show that pulsatile flow improves gas transfer to the fluid compared to steady flow. The degree of enhancement is found to be significant when the oscillation frequency is large, when the void fraction of the fiber bundle is decreased, and when the Reynolds number is increased; the use of a cardiac flow input can also improve gas transfer. In terms of array geometry, the staggered array gives both better gas transfer per fiber (for relatively large void fractions) and a smaller pressure drop (for all cases). For most cases shown, an increase in gas transfer is accompanied by a higher pressure drop required to power the flow through the device.

  3. Standard Transistor Array (STAR). Volume 1: Placement technique

    NASA Technical Reports Server (NTRS)

    Cox, G. W.; Caroll, B. D.

    1979-01-01

    A large scale integration (LSI) technology, the standard transistor array uses a prefabricated understructure of transistors and a comprehensive library of digital logic cells to allow efficient fabrication of semicustom digital LSI circuits. The cell placement technique for this technology involves formation of a one-dimensional cell layout and "folding" of the one-dimensional placement onto the chip. It was found that, by use of various folding methods, high quality chip layouts can be achieved. Methods developed to measure the "goodness" of the generated placements include efficient means for estimating channel usage requirements and for via counting. The placement and rating techniques were incorporated into a placement program (CAPSTAR). By means of repetitive use of the folding methods and simple placement improvement strategies, this program provides near-optimum placements in a reasonable amount of time. The program was tested on several typical LSI circuits to provide performance comparisons both with respect to input parameters and with respect to the performance of other placement techniques. The results of this testing indicate that near-optimum placements can be achieved without incurring severe time penalties.
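
    The folding step lends itself to a compact illustration. The sketch below folds a one-dimensional cell ordering onto a rows x cols chip grid in serpentine order, which is one plausible reading of the folding described above; the actual CAPSTAR program evaluated several folding methods, and all names here are illustrative.

        def fold_placement(cells, rows, cols):
            """Fold a 1-D cell ordering onto a 2-D grid, boustrophedon style:
            every other row is reversed so cells adjacent in the 1-D layout
            remain adjacent on the chip across row boundaries."""
            assert len(cells) <= rows * cols, "grid too small for the cell list"
            grid = []
            for r in range(rows):
                row = cells[r * cols:(r + 1) * cols]
                if r % 2 == 1:          # serpentine: reverse odd rows
                    row = row[::-1]
                grid.append(row)
            return grid

        placement_1d = [f"cell{i}" for i in range(12)]
        for row in fold_placement(placement_1d, rows=3, cols=4):
            print(row)

    The quality measures mentioned above (channel usage, via counts) would then be evaluated on grids like this one to compare alternative foldings.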

  4. Automated array-based genomic profiling in chronic lymphocytic leukemia: Development of a clinical tool and discovery of recurrent genomic alterations

    PubMed Central

    Schwaenen, Carsten; Nessling, Michelle; Wessendorf, Swen; Salvi, Tatjana; Wrobel, Gunnar; Radlwimmer, Bernhard; Kestler, Hans A.; Haslinger, Christian; Stilgenbauer, Stephan; Döhner, Hartmut; Bentz, Martin; Lichter, Peter

    2004-01-01

    B cell chronic lymphocytic leukemia (B-CLL) is characterized by a highly variable clinical course. Recurrent chromosomal imbalances provide significant prognostic markers. Risk-adapted therapy based on genomic alterations has become an option that is currently being tested in clinical trials. To supply a robust tool for such large scale studies, we developed a comprehensive DNA microarray dedicated to the automated analysis of recurrent genomic imbalances in B-CLL by array-based comparative genomic hybridization (matrix–CGH). Validation of this chip in a series of 106 B-CLL cases revealed a high specificity and sensitivity that fulfils the criteria for application in clinical oncology. This chip is immediately applicable within clinical B-CLL treatment trials that evaluate whether B-CLL cases with distinct chromosomal abnormalities should be treated with chemotherapy of different intensities and/or stem cell transplantation. Through the control set of DNA fragments equally distributed over the genome, recurrent genomic imbalances were discovered: trisomy of chromosome 19 and gain of the MYCN oncogene correlating with an elevation of MYCN mRNA expression. PMID:14730057

  5. Xray: N-dimensional, labeled arrays for analyzing physical datasets in Python

    NASA Astrophysics Data System (ADS)

    Hoyer, S.

    2015-12-01

    Efficient analysis of geophysical datasets requires tools that both preserve and utilize metadata, and that transparently scale to process large datasets. Xray is such a tool, in the form of an open source Python library for analyzing the labeled, multi-dimensional array (tensor) datasets that are ubiquitous in the Earth sciences. Xray's approach pairs Python data structures based on the data model of the netCDF file format with the proven design and user interface of pandas, the popular Python data analysis library for labeled tabular data. On top of the NumPy array, xray adds labeled dimensions (e.g., "time") and coordinate values (e.g., "2015-04-10"), which it uses to enable a host of operations powered by these labels: selection, aggregation, alignment, broadcasting, split-apply-combine, interoperability with pandas, and serialization to netCDF/HDF5. Many of these operations are enabled by xray's tight integration with pandas. Finally, to allow for easy parallelism and to enable its labeled data operations to scale to datasets that do not fit into memory, xray integrates with the parallel processing library dask.
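
    A minimal sketch of the labeled-selection workflow described above, using the xray API of the time (the project has since been renamed xarray); the data and coordinate values are illustrative.

        import numpy as np
        import pandas as pd
        import xray  # the library described above; published today as "xarray"

        # A small 2-D field labeled with a named time dimension and coordinates.
        data = xray.DataArray(
            np.random.rand(3, 4),
            dims=("time", "space"),
            coords={"time": pd.date_range("2015-04-10", periods=3)},
        )

        # Label-based selection and aggregation: no integer index bookkeeping.
        day = data.sel(time="2015-04-10")     # select by coordinate value
        spatial_mean = data.mean(dim="space")  # reduce over a named dimension

        # Round-trip through the netCDF data model.
        data.to_dataset(name="example_field").to_netcdf("example.nc")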

  6. Advanced technology optical telescopes IV; Proceedings of the Meeting, Tucson, AZ, Feb. 12-16, 1990. Parts 1 & 2

    NASA Technical Reports Server (NTRS)

    Barr, Lawrence D. (Editor)

    1990-01-01

    The present conference on the current status of large, advanced-technology optical telescope development and construction projects discusses topics on such factors as their novel optical system designs, the use of phased arrays, seeing and site performance factors, mirror fabrication and testing, pointing and tracking techniques, mirror thermal control, structural design strategies, mirror supports and coatings, and the control of segmented mirrors. Attention is given to the proposed implementation of the VLT Interferometer, the first diffraction-limited astronomical images with adaptive optics, a fiber-optic telescope using a large cross-section image-transmitting bundle, the design of wide-field arrays, Hartmann test data reductions, liquid mirrors, inertial drives for telescope pointing, temperature control of large honeycomb mirrors, evaporative coatings for very large telescope mirrors, and the W. M. Keck telescope's primary mirror active control system software.

  7. Building Educational Programs for the Australian Square Kilometre Array Pathfinder

    NASA Astrophysics Data System (ADS)

    Hollow, R.; Hobbs, G.

    2010-08-01

    The Australian Square Kilometre Array Pathfinder (ASKAP) will be an array of 36 antennas in Western Australia, each 12-m in diameter, and is due for operation in 2013. With a large instantaneous field-of-view ASKAP will survey the whole sky faster than existing radio telescopes, producing massive data sets. Government funding for ASKAP was contingent on it being available for education purposes, providing an exciting opportunity to develop innovative education projects for schools and citizen science. Building on the PULSE@Parkes program we plan to have a range of activities and resources, providing scope for student investigations. Challenges and educational opportunities are discussed.

  8. ASIC Readout Circuit Architecture for Large Geiger Photodiode Arrays

    NASA Technical Reports Server (NTRS)

    Vasile, Stefan; Lipson, Jerold

    2012-01-01

    The objective of this work was to develop a new class of readout integrated circuit (ROIC) arrays to be operated with Geiger avalanche photodiode (GPD) arrays, by integrating multiple functions at the pixel level (smart-pixel or active-pixel technology) in 250-nm CMOS (complementary metal oxide semiconductor) processes. In order to pack a maximum of functions within a minimum pixel size, the ROIC array is a full-custom application-specific integrated circuit (ASIC) design using a mixed-signal CMOS process with compact primitive layout cells. The ROIC array was processed to allow assembly, in bump-bonding technology, with photon-counting infrared detector arrays into 3-D imaging cameras (LADAR). The ROIC architecture was designed to work with either common-anode Si GPD arrays or common-cathode InGaAs GPD arrays. The current ROIC pixel design is hardwired, prior to processing, for one of the two GPD array configurations, with provision for soft reconfiguration to either array (to be implemented in the next ROIC array generation). The ROIC pixel architecture implements the Geiger avalanche quenching, bias, reset, and time-to-digital conversion (TDC) functions in a fully digital design, and uses time-domain over-sampling (vernier) to allow high temporal resolution at low clock rates, increased data yield, and improved utilization of the laser beam.
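
    The vernier idea mentioned above can be made concrete with a toy model: two clocks whose periods differ only slightly resolve a time interval far more finely than either period alone. This is only a sketch of the principle, not the chip's design; the periods and the function are assumptions.

        def vernier_tdc(interval_ns, t_slow=1.000, t_fast=0.950):
            """Toy vernier TDC: quantize `interval_ns` using two clocks whose
            period difference (here 50 ps) sets the effective resolution."""
            resolution = t_slow - t_fast
            coarse = int(interval_ns // t_slow)        # whole slow-clock periods
            residue = interval_ns - coarse * t_slow    # remainder within one period
            n = 0
            while n * resolution < residue:            # fast clock gains `resolution`
                n += 1                                 # on the slow clock each cycle
            return coarse * t_slow + n * resolution

        print(vernier_tdc(7.37))   # ~7.40 ns, quantized in 50 ps steps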

  9. Feasibility study of a synthesis procedure for array feeds to improve radiation performance of large distorted reflector antennas

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Takamizawa, K.; Werntz, P.; Lapean, J.; Barts, R.; Shen, B.

    1992-01-01

    Virginia Tech has several activities which support the NASA Langley effort in the area of large-aperture radiometric antenna systems. This semi-annual report covers the following activities: a feasibility study of a synthesis procedure for array feeds to improve the radiation performance of large distorted reflector antennas, and the design of array feeds for large reflector antennas.

  10. Development of self-compressing BLSOM for comprehensive analysis of big sequence data.

    PubMed

    Kikuchi, Akihito; Ikemura, Toshimichi; Abe, Takashi

    2015-01-01

    With the remarkable increase in genomic sequence data from various organisms, novel tools are needed for comprehensive analyses of the available big sequence data. We previously developed a Batch-Learning Self-Organizing Map (BLSOM), which can cluster genomic fragment sequences according to phylotype based solely on oligonucleotide composition, and have applied it to genome and metagenomic studies. BLSOM is suitable for high-performance parallel computing and can analyze big data simultaneously, but a large-scale BLSOM needs large computational resources. We have developed the Self-Compressing BLSOM (SC-BLSOM) to reduce computation time, which allows us to carry out comprehensive analysis of big sequence data without the use of high-performance supercomputers. The strategy of SC-BLSOM is to construct BLSOMs hierarchically according to data class, such as phylotype. A first-layer BLSOM was constructed for each of the divided input data pieces representing a data subclass, such as a phylotype division, resulting in compression of the number of data pieces. The second-layer BLSOM was then constructed from the full set of weight vectors obtained in the first-layer BLSOMs. We compared SC-BLSOM with the conventional BLSOM by analyzing bacterial genome sequences. SC-BLSOM could be constructed faster than BLSOM and clustered the sequences according to phylotype with high accuracy, showing the method's suitability for efficient knowledge discovery from big sequence data.
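
    A minimal sketch of the two-layer strategy, under stated assumptions: the third-party MiniSom package stands in for BLSOM (the real method is batch-learning and parallelized, and is not what MiniSom implements), and the data, map sizes, and iteration counts are all illustrative.

        import numpy as np
        from minisom import MiniSom  # stand-in for BLSOM in this sketch

        def train_som(data, x=10, y=10, iters=1000):
            som = MiniSom(x, y, input_len=data.shape[1], sigma=1.0, learning_rate=0.5)
            som.train_random(data, iters)
            return som

        # First layer: one map per data subclass (e.g., per phylotype division).
        subclasses = [np.random.rand(500, 64) for _ in range(8)]  # oligonucleotide
        first_layer = [train_som(d) for d in subclasses]          # composition vectors

        # Second layer: train on the pooled weight vectors of the first-layer maps,
        # compressing 8 x 500 input vectors down to 8 x 100 representatives.
        weights = np.vstack([som.get_weights().reshape(-1, 64) for som in first_layer])
        second_layer = train_som(weights, x=20, y=20, iters=2000)

    The compression comes from the first layer: the second map never sees the raw sequences, only the much smaller set of learned weight vectors.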

  11. [Recent advances in metabonomics].

    PubMed

    Xu, Guo-Wang; Lu, Xin; Yang, Sheng-Li

    2007-12-01

    Metabonomics (or metabolomics) aims at the comprehensive and quantitative analysis of the wide arrays of metabolites in biological samples. Metabonomics has been labeled as one of the new "-omics", joining genomics, transcriptomics, and proteomics as a science employed toward the understanding of global systems biology. It has been widely applied in many research areas, including drug toxicology, biomarker discovery, functional genomics, and molecular pathology. The comprehensive analysis of the metabonome is particularly challenging due to the diverse chemical natures of metabolites. Metabonomics investigations require special approaches for sample preparation, data-rich analytical chemical measurements, and information mining. The outputs from a metabonomics study allow sample classification, biomarker discovery, and interpretation of the basis for the classification. This review focuses on recent advances in the various technical platforms of metabonomics and on its applications in drug discovery and development, disease biomarker identification, and plant- and microbe-related fields.

  12. True time-delay photonic beamforming with fine steerability and frequency-agility for spaceborne phased-arrays: a proof-of-concept demonstration

    NASA Astrophysics Data System (ADS)

    Paul, Dilip K.; Razdan, Rajender; Goldman, Alfred M.

    1996-10-01

    The feasibility of photonics for beam forming and steering of large phased-array antennas onboard communications satellite/avionics systems is addressed in this paper. Specifically, a proof-of-concept demonstration of a phased-array antenna feed network using fiber-optic true time-delay (TTD) elements is reported for SATCOM phased-array antennas operating at C-band. Results of the photonic hardware design and performance analysis, including the measured radiation patterns of the antenna array fed by the photonic BFN, are presented. Excellent agreement between the analysis and the measured data has been observed. In addition to being lightweight and compact, several unique characteristics achieved by the fiber-optic TTD architecture, such as RF carrier frequency agility and continuous steerability of the radiated beam, are clear evidence of its superiority over other competing photonic architectures.
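
    A sketch of the arithmetic behind true time delay: each element of a steered linear array needs a delay proportional to its position, and in a fiber-optic TTD feed that delay maps to a fiber length. The element count, spacing, and fiber index below are assumptions for illustration, not the demonstrated hardware's values.

        import numpy as np

        C = 3e8          # free-space speed of light, m/s
        N_FIBER = 1.468  # assumed refractive index of the fiber core

        def ttd_delays(n_elems, spacing_m, steer_deg):
            """Per-element true time delays (s) for a linear array steered to
            steer_deg. Unlike phase shifters, these delays hold at any RF
            carrier frequency, which is the frequency-agility property above."""
            n = np.arange(n_elems)
            return n * spacing_m * np.sin(np.radians(steer_deg)) / C

        delays = ttd_delays(n_elems=8, spacing_m=0.025, steer_deg=30)
        fiber_lengths = delays * C / N_FIBER     # equivalent fiber length, m
        print(delays * 1e12)                     # delays in ps
        print(fiber_lengths * 1e3)               # lengths in mm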

  13. Tracking and Navigation of Future NASA Spacecraft with the Square Kilometer Array

    NASA Astrophysics Data System (ADS)

    Resch, G. M.; Jones, D. L.; Connally, M. J.; Weinreb, S.; Preston, R. A.

    2001-12-01

    The international radio astronomy community is currently working on the design of an array of small radio antennas with a total collecting area of one square kilometer - more than a hundred times that of the largest existing (100-m) steerable antennas. An array of this size would provide obvious advantages for high data rate telemetry reception and for spacecraft navigation. Among these advantages are a two-orders-of-magnitude increase in sensitivity for telemetry downlink, flexible sub-arraying to track multiple spacecraft simultaneously, increased reliability through the use of large numbers of identical array elements, very accurate real-time angular spacecraft tracking, and a dramatic reduction in cost per unit area. NASA missions in many disciplines, including planetary science, would benefit from this increased ground-based tracking capability. The science return from planned missions could be increased, and opportunities for less expensive or completely new kinds of missions would be created.
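
    The two-orders-of-magnitude sensitivity gain quoted above follows directly from the ratio of collecting areas; a quick check of the arithmetic, using the 100-m dish mentioned in the text as the comparison point:

        import math

        ska_area = 1e6                  # one square kilometer in m^2
        dish_area = math.pi * 50**2     # 100-m diameter steerable antenna
        print(f"{ska_area / dish_area:.0f}x")  # ~127x, about two orders of magnitude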

  14. Eye movements during listening reveal spontaneous grammatical processing.

    PubMed

    Huette, Stephanie; Winter, Bodo; Matlock, Teenie; Ardell, David H; Spivey, Michael

    2014-01-01

    Recent research using eye-tracking typically relies on constrained visual contexts in goal-oriented tasks, with participants viewing a small array of objects on a computer screen and performing some overt decision or identification. Eye-tracking paradigms that use pictures as a measure of word or sentence comprehension are sometimes criticized as ecologically invalid, because pictures and explicit tasks are not always present during language comprehension. This study compared the comprehension of sentences with two different grammatical forms: the past progressive (e.g., was walking), which emphasizes the ongoing nature of actions, and the simple past (e.g., walked), which emphasizes the end-state of an action. The results showed that the distribution and timing of eye movements mirror the underlying conceptual structure of this linguistic difference in the absence of any visual stimuli or task constraint: fixations were shorter and saccades were more dispersed across the screen when listening to the past progressive stories, as if participants were thinking about more dynamic events. Thus, the eye movement data suggest that visual inputs or an explicit task are unnecessary to elicit analog representations of features such as movement, which could be a key perceptual component of grammatical comprehension.

  15. Identifying the impact of G-quadruplexes on Affymetrix 3' arrays using cloud computing.

    PubMed

    Memon, Farhat N; Owen, Anne M; Sanchez-Graillet, Olivia; Upton, Graham J G; Harrison, Andrew P

    2010-01-15

    A tetramer quadruplex structure is formed by four parallel strands of DNA/RNA containing runs of guanine. These quadruplexes are able to form because guanine can Hoogsteen hydrogen bond to other guanines, and a tetrad of guanines can form a stable arrangement. Recently we discovered that probes on Affymetrix GeneChips that contain runs of guanine do not measure gene expression reliably. We associate this finding with the likelihood that quadruplexes are forming on the surface of GeneChips. In order to cope with the rapidly expanding size of GeneChip array datasets in the public domain, we are exploring the use of cloud computing to replicate our experiments on 3' arrays to look at the effect of the location of G-spots (runs of guanines). Cloud computing is a recently introduced high-performance solution that takes advantage of the computational infrastructure of large organisations such as Amazon and Google. We expect that cloud computing will become widely adopted because it enables bioinformaticians to avoid capital expenditure on expensive computing resources and to pay a cloud computing provider only for what is used. Moreover, as well as being financially efficient, cloud computing is an ecologically friendly technology; it enables efficient data-sharing, and we expect it to be faster for development purposes. Here we propose the advantageous use of cloud computing to perform a large data-mining analysis of public domain 3' arrays.

  16. The AlpArray Seismic Network: A Large-Scale European Experiment to Image the Alpine Orogen

    NASA Astrophysics Data System (ADS)

    Hetényi, György; Molinari, Irene; Clinton, John; Bokelmann, Götz; Bondár, István; Crawford, Wayne C.; Dessa, Jean-Xavier; Doubre, Cécile; Friederich, Wolfgang; Fuchs, Florian; Giardini, Domenico; Gráczer, Zoltán; Handy, Mark R.; Herak, Marijan; Jia, Yan; Kissling, Edi; Kopp, Heidrun; Korn, Michael; Margheriti, Lucia; Meier, Thomas; Mucciarelli, Marco; Paul, Anne; Pesaresi, Damiano; Piromallo, Claudia; Plenefisch, Thomas; Plomerová, Jaroslava; Ritter, Joachim; Rümpker, Georg; Šipka, Vesna; Spallarossa, Daniele; Thomas, Christine; Tilmann, Frederik; Wassermann, Joachim; Weber, Michael; Wéber, Zoltán; Wesztergom, Viktor; Živčić, Mladen

    2018-04-01

    The AlpArray programme is a multinational, European consortium to advance our understanding of orogenesis and its relationship to mantle dynamics, plate reorganizations, surface processes and seismic hazard in the Alps-Apennines-Carpathians-Dinarides orogenic system. The AlpArray Seismic Network has been deployed with contributions from 36 institutions from 11 countries to map physical properties of the lithosphere and asthenosphere in 3D and thus to obtain new, high-resolution geophysical images of structures from the surface down to the base of the mantle transition zone. With over 600 broadband stations operated for 2 years, this seismic experiment is one of the largest simultaneously operated seismological networks in the academic domain, employing hexagonal coverage with station spacing at less than 52 km. This dense and regularly spaced experiment is made possible by the coordinated coeval deployment of temporary stations from numerous national pools, including ocean-bottom seismometers, which were funded by different national agencies. They combine with permanent networks, which also required the cooperation of many different operators. Together these stations ultimately fill coverage gaps. Following a short overview of previous large-scale seismological experiments in the Alpine region, we here present the goals, construction, deployment, characteristics and data management of the AlpArray Seismic Network, which will provide data that is expected to be unprecedented in quality to image the complex Alpine mountains at depth.

  17. rfpipe: Radio interferometric transient search pipeline

    NASA Astrophysics Data System (ADS)

    Law, Casey J.

    2017-10-01

    rfpipe supports Python-based analysis of radio interferometric data (especially from the Very Large Array) and searches for fast radio transients. It extends the rtpipe library (ascl:1706.002) with new approaches to parallelization, acceleration, and more portable data products. rfpipe can run in standalone mode or be deployed in a cluster environment.

  18. VizieR Online Data Catalog: Molecular clouds in the dwarf galaxy NGC6822 (Schruba+, 2017)

    NASA Astrophysics Data System (ADS)

    Schruba, A.; Leroy, A. K.; Kruijssen, J. M. D.; Bigiel, F.; Bolatto, A. D.; de Blok, W. J. G.; Tacconi, L.; van Dishoeck, E. F.; Walter, F.

    2017-09-01

    We observed five fields in NGC 6822 with the Atacama Large Millimeter/submillimeter Array (ALMA) in Cycle 1 using the 1.3mm Band 6 receivers (project code: 2013.1.00351.S; PI: A. Schruba) on 2014 March 23-25. (1 data file).

  19. Test Review: Naglieri, J. A., Goldstein, S. (2013), "Comprehensive Executive Function Inventory." North Tonawanda, NY: Multi-Health Systems

    ERIC Educational Resources Information Center

    Fenwick, Melanie; McCrimmon, Adam W.

    2015-01-01

    This article provides a description and review of the "Comprehensive Executive Function Inventory" (CEFI; Naglieri & Goldstein, 2013), published by Multi-Health Systems Inc. (MHS). It is a rating scale developed to measure a wide array of Executive Function (EF) abilities in individuals aged 5 through 18 years. Completed by a parent,…

  20. A Dual-Layer Transducer Array for 3-D Rectilinear Imaging

    PubMed Central

    Yen, Jesse T.; Seo, Chi Hyung; Awad, Samer I.; Jeong, Jong S.

    2010-01-01

    2-D arrays for 3-D rectilinear imaging require very large element counts (16,000–65,000). The difficulties in fabricating and interconnecting 2-D arrays with a large number of elements (>5,000) have limited the development of suitable transducers for 3-D rectilinear imaging. In this paper, we propose an alternative solution to this problem by using a dual-layer transducer array design. This design consists of two perpendicular 1-D arrays for clinical 3-D imaging of targets near the transducer. These targets include the breast, carotid artery, and musculoskeletal system. This transducer design reduces the fabrication complexity and the channel count, making 3-D rectilinear imaging more realizable. With this design, an effective N × N 2-D array can be developed using only N transmitters and N receivers. This benefit becomes very significant when N grows beyond 128, for example. To demonstrate feasibility, we constructed a 4 × 4 cm prototype dual-layer array. The transmit array uses diced PZT-5H elements, and the receive array is a single sheet of undiced P[VDF-TrFE] copolymer. The receive elements are defined by the copper traces on the flexible interconnect circuit. The measured −6 dB fractional bandwidth was 80% with a center frequency of 4.8 MHz. At 5 MHz, the nearest-neighbor crosstalk of the PZT array and the PVDF array was −30.4 ± 3.1 dB and −28.8 ± 3.7 dB, respectively. This dual-layer transducer was interfaced with an Ultrasonix Sonix RP system, and a synthetic aperture 3-D data set was acquired. We then performed off-line 3-D beamforming to obtain volumes of nylon wire targets. The theoretical lateral beamwidth was 0.52 mm, compared to measured beamwidths of 0.65 mm and 0.67 mm in azimuth and elevation, respectively. 3-D images of an 8 mm diameter anechoic cyst phantom were also acquired. PMID:19213647
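
    The channel-count saving behind the dual-layer design is simple arithmetic; the sketch below tabulates it for a few array sizes under the N-transmitter/N-receiver scheme described above (the sizes are illustrative).

        # Fully addressed N x N 2-D array vs. dual-layer design
        # (N transmit rows + N receive columns), per the abstract.
        for n in (128, 160, 256):
            full = n * n        # individually wired 2-D array elements
            dual = n + n        # dual-layer channel count
            print(f"N={n}: {full} elements vs {dual} channels")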

  1. Listening to sounds from an exploding meteor and oceanic waves

    NASA Astrophysics Data System (ADS)

    Evers, L. G.; Haak, H. W.

    Low frequency sound (infrasound) measurements have been selected within the Comprehensive Nuclear-Test-Ban Treaty (CTBT) as a technique to detect and identify possible nuclear explosions. The Seismology Division of the Royal Netherlands Meteorological Institute (KNMI) has operated an experimental infrasound array of 16 micro-barometers since 1999. Here we show the rare detection and identification of an exploding meteor above Northern Germany on November 8th, 1999 with data from the Deelen Infrasound Array (DIA). At the same time, sound was radiated from the Atlantic Ocean, south of Iceland, due to the atmospheric coupling of standing ocean waves, called microbaroms. Although the two signals differed by only 0.04 Hz in dominant frequency, DIA proved able to discriminate between these physically different sources of infrasound through its unique layout and instruments. The explosive power of the meteor, 1.5 kT TNT equivalent, is in the range of nuclear explosions and is therefore relevant to the CTBT.

  2. Large Ka-Band Slot Array for Digital Beam-Forming Applications

    NASA Technical Reports Server (NTRS)

    Rengarajan, Sembiam; Zawadzki, Mark S.; Hodges, Richard E.

    2011-01-01

    This work describes the development of a large Ka-band slot array for the Glacier and Land Ice Surface Topography Interferometer (GLISTIN), a proposed spaceborne interferometric synthetic aperture radar for topographic mapping of ice sheets and glaciers. GLISTIN will collect ice topography measurement data over a wide swath with sub-seasonal repeat intervals using a Ka-band digitally beamformed antenna. For technology demonstration purposes, a receive array of size 1x1 m, consisting of 160x160 radiating elements, was developed. The array is divided into 16 sticks, each stick consisting of 160x10 radiating elements, whose outputs are combined to produce 16 digital beams. A transmit array stick was also developed. The antenna arrays were designed using Elliott's design equations with an infinite-array mutual-coupling model. A Floquet wave model was used to account for external coupling between radiating slots. Because of the use of a uniform amplitude and phase distribution, the infinite-array model yielded identical values for all radiating elements except for alternating offsets, and identical coupling elements except for alternating positive and negative tilts. Waveguide-fed slot arrays are finding many uses in radar, remote sensing, and communications applications because of their desirable properties, such as low mass, low volume, and ease of design, manufacture, and deployability. Although waveguide-fed slot arrays have been designed, built, and tested in the past, this work represents several advances to the state of the art. The use of the infinite-array model for the radiating slots yielded a simple design process for radiating and coupling slots. A method of moments solution to the integral equations for alternating-offset radiating slots in an infinite-array environment was developed and validated using the commercial finite element code HFSS. For analysis purposes, a method of moments code was developed for an infinite array of subarrays. Overall, the 1x1 m array was found to be successful in meeting the objectives of the GLISTIN demonstration antenna, especially with respect to the 0.042° relative beam alignment between sticks (1/10 of the beamwidth of each stick).
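
    A quick consistency check of the quoted alignment figure, as a sketch: the operating frequency assumed below (35.75 GHz) is not stated in the abstract, only that the radar is Ka-band.

        import math

        freq_hz = 35.75e9            # assumed Ka-band frequency (not given above)
        aperture_m = 1.0             # 1 m stick length, per the text
        wavelength = 3e8 / freq_hz
        beamwidth_deg = math.degrees(wavelength / aperture_m)  # lambda/D estimate
        print(beamwidth_deg)         # ~0.48 deg beamwidth per stick
        print(beamwidth_deg / 10)    # ~0.048 deg, the order of the 0.042 deg figure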

  3. Signal Attenuation Curve for Different Surface Detector Arrays

    NASA Astrophysics Data System (ADS)

    Vicha, J.; Travnicek, P.; Nosek, D.; Ebr, J.

    2014-06-01

    Modern cosmic ray experiments consisting of large arrays of particle detectors measure the signals of the electromagnetic or muon components, or a combination of the two. A correction for the amount of atmosphere traversed is applied to the surface detector signal before its conversion to the shower energy. Either a Monte Carlo based approach assuming a certain composition of primaries, or an indirect estimation using real data and assuming isotropy of arrival directions, can be used. Toy surface arrays with different sensitivities to the electromagnetic and muon components are assumed in MC simulations to study the effects imposed on attenuation curves by a varying composition or a possible high-energy anisotropy. The possible sensitivity of the attenuation curve to the mass composition is also tested for different array types, focusing on a future apparatus that can separate muon and electromagnetic component signals.

  4. FATES: a flexible analysis toolkit for the exploration of single-particle mass spectrometer data

    NASA Astrophysics Data System (ADS)

    Sultana, Camille M.; Cornwell, Gavin C.; Rodriguez, Paul; Prather, Kimberly A.

    2017-04-01

    Single-particle mass spectrometer (SPMS) analysis of aerosols has become increasingly popular since its invention in the 1990s. Today many iterations of commercial and lab-built SPMSs are in use worldwide. However, supporting analysis toolkits for these powerful instruments are outdated, have limited functionality, or are versions that are not available to the scientific community at large. In an effort to advance this field and allow better communication and collaboration between scientists, we have developed FATES (Flexible Analysis Toolkit for the Exploration of SPMS data), a MATLAB toolkit easily extensible to an array of SPMS designs and data formats. FATES was developed to minimize the computational demands of working with large data sets while still allowing easy maintenance, modification, and utilization by novice programmers. FATES permits scientists to explore, without constraint, complex SPMS data with simple scripts in a language popular for scientific numerical analysis. In addition FATES contains an array of data visualization graphic user interfaces (GUIs) which can aid both novice and expert users in calibration of raw data; exploration of the dependence of mass spectral characteristics on size, time, and peak intensity; and investigations of clustered data sets.

  5. Indium Hybridization of Large Format TES Bolometer Arrays to Readout Multiplexers for Far-Infrared Astronomy

    NASA Technical Reports Server (NTRS)

    Miller, Timothy M.; Costen, Nick; Allen, Christine

    2007-01-01

    This conference poster reviews the indium hybridization of large-format TES bolometer arrays. We are developing a key technology to enable the next generation of detectors: the hybridization of large-format arrays, using indium-bonded detector arrays containing 32x40 elements that conform to the NIST multiplexer readout architecture with its 1135-micron pitch. We have fabricated and hybridized mechanical models with the detector chips bonded after being fully back-etched. The mechanical support consists of 30-micron walls between elements, and electrical continuity has been demonstrated for each element. The goal is to hybridize a fully functional array of TES detectors to the NIST readout.

  6. The Ophidia framework: toward cloud-based data analytics for climate change

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni

    2015-04-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model, and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large scale data analytics on CMIP5 data in NetCDF format, compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases, from the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects, will be presented during the talk. In the former case (EUBrazilCC), the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter (OFIDIA), the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among other tasks, parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. Moreover, a cloud-based release, tested with OpenNebula, is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.

  7. Multiple detector focal plane array ultraviolet spectrometer for the AMPS laboratory

    NASA Technical Reports Server (NTRS)

    Feldman, P. D.

    1975-01-01

    The possibility of meeting the requirements of the AMPS spectroscopic instrumentation by using a multi-element focal plane detector array in a conventional spectrograph mount was examined. The requirements of the detector array were determined from the optical design of the spectrometer, which in turn depends on the desired levels of resolution and sensitivity. The choice of available detectors and their associated electronics and controls was surveyed, bearing in mind that the data collection rate from this system is so great that on-board processing and reduction of data are absolutely essential. Finally, parallel developments in instrumentation for imaging in astronomy were examined, both in the ultraviolet (for the Large Space Telescope as well as other rocket and satellite programs) and in the visible, to determine what progress in that area can have direct bearing on atmospheric spectroscopy.

  8. Costs of large truck- and bus-involved crashes.

    DOT National Transportation Integrated Search

    2000-12-01

    This study provides comprehensive, economically sophisticated estimates of the costs of highway crashes involving large trucks and buses by severity. Based on the latest data available, the estimated cost of police-reported crashes involving trucks w...

  9. Bridging the gap between strategic and management forest inventories

    Treesearch

    Ronald E. McRoberts

    2009-01-01

    Strategic forest inventory programs collect information for a large number of variables on a relatively sparse array of field plots. Data from these inventories are used to produce estimates for large areas such as states and provinces, regions, or countries. The purpose of management forest inventories is to guide management decisions for small areas such as stands....

  10. User Data on the Social Web: Authorship, Agency, and Appropriation

    ERIC Educational Resources Information Center

    Reyman, Jessica

    2013-01-01

    Social web services catalog users' activities across the Internet, aggregating, analyzing, and selling a vast array of user data to be used largely for consumer profiling and target marketing. This article interrogates the tacit agreements and terms-of-use policies that govern who owns user data, how it circulates, and how it can be used. Relying…

  11. A Decade of Ocean Acoustic Measurements from R/P FLIP

    NASA Astrophysics Data System (ADS)

    D'Spain, G. L.

    2002-12-01

    Studies of the properties of low frequency acoustic fields in the ocean continue to benefit from the use of manned, stable offshore platforms such as R/P FLIP. A major benefit is the at-sea stability required for deployment of extremely large aperture line arrays, line arrays composed of both acoustic motion and acoustic pressure sensors, and arrays that provide measurements in all 3 spatial dimensions. In addition, FLIP provides a high-profile (25 m) observation post with 360 deg coverage for simultaneous visual observations of marine mammals. A few examples of the scientific results achieved over the past decade with ocean acoustic data collected on FLIP are presented. These results include the normal mode decomposition of earthquake T phases to study their generation and water/land coupling characteristics using a 3000 m vertical aperture hydrophone array, simultaneous vertical and horizontal directional information on the underwater sound field from line arrays of hydrophones and geophones, the strange nighttime chorusing behavior of fish measured with 3-D array aperture, the mirage effect caused by bathymetry changes in inversions for source location in shallow water, and the diving behavior of blue whales determined from 1-D recordings of their vocalizations. Presently, FLIP serves as the central data recording platform in ocean acoustic studies using AUVs.

  12. Comprehensive Astronaut Immune Assessment Following a Short-Duration Space Flight

    NASA Technical Reports Server (NTRS)

    Crucian, Brian; Stowe, Raymond; Yetman, Deborah; Pierson, Duane; Sams, Clarence

    2006-01-01

    Immune system dysregulation has been demonstrated to occur during spaceflight and has the potential to cause serious health risks to crewmembers participating in exploration class missions. As a part of an ongoing NASA flight experiment assessing viral immunity (DSO-500), a generalized immune assessment was performed on 3 crewmembers who participated in the recent STS-114 Space Shuttle mission. The following assays were performed: (1) comprehensive immunophenotype analysis; (2) T cell function/intracellular cytokine profiles; (3) secreted Th1/Th2 cytokine profiles via cytometric bead array. Immunophenotype analysis included a leukocyte differential, lymphocyte subsets, T cell subsets, cytotoxic/effector CD8+ T cells, memory/naive T cell subsets and constitutively activated T cells. Study timepoints were L-180, L-65, L-10, R+0, R+3 and R+14. Detailed data are presented in the poster text. As expected from a limited number of human subjects, data tended to vary with respect to most parameters. Specific post-flight alterations were as follows (subject number in parentheses): granulocytosis (2/3), reduced NK cells (3/3), elevated CD4/CD8 ratio (3/3), general CD8+ phenotype shift to a less differentiated phenotype (3/3), elevated levels of memory CD4+ T cells (3/3), loss of L-selectin on T cell subsets (3/3), increased levels of activated T cells (2/3), and reduced IL-2 producing T cell subsets (3/3); levels of IFNg producing T cells were unchanged. CD8+ T cell expression of the CD69 activation marker following whole blood stimulation with SEA+SEB was dramatically reduced postflight (3/3), whereas other T cell function assessments were largely unchanged. Cytometric bead array assessment of secreted T cell cytokines was performed following whole blood stimulation with either CD3/CD28 antibodies or PMA+ionomycin for 48 hours. Specific cytokines assessed were IFNg, TNFa, IL-2, IL-4, IL-5, and IL-10. Following CD3/CD28 stimulation, all three crewmembers had a mission-associated reduction in the levels of secreted IFNg. One crewmember had a post-flight inversion in the IFNg/IL-10 ratio, which trended back to baseline by R+14. Detailed cytokine data are presented in the poster text. This testing regimen was designed to correlate immunophenotype changes (thought to correspond to specific in-vivo immune responses or pathogenesis) against altered leukocyte function and cytokine profiles. In-flight studies are required to determine if post-flight alterations are reflective of the in-flight condition, or are a response to landing and readaptation.

  13. Autonomous Sensors for Large Scale Data Collection

    NASA Astrophysics Data System (ADS)

    Noto, J.; Kerr, R.; Riccobono, J.; Kapali, S.; Migliozzi, M. A.; Goenka, C.

    2017-12-01

    Presented here is a novel implementation of a "Doppler imager" which remotely measures winds and temperatures of the neutral background atmosphere at ionospheric altitudes of 87-300 km and possibly above. Incorporating recent optical manufacturing developments, modern network awareness, and machine learning techniques for intelligent self-monitoring and data classification, this system achieves cost savings in manufacturing, deployment, and lifetime operation. Deployed in both ground- and space-based modalities, this cost-disruptive technology will allow computer models of ionospheric variability and other space weather models to operate with higher precision. Other sensors can be folded into the data collection and analysis architecture easily, creating autonomous virtual observatories. A prototype version of this sensor has recently been deployed in Trivandrum, India for the Indian Government. This Doppler imager is capable of operation even within the restricted CubeSat environment. The CubeSat bus offers a very challenging environment, even for small instruments: the tight SWaP (size, weight, and power) budget and the challenging thermal environment demand the development of a new generation of instruments, and the Doppler imager presented is well suited to this environment. Concurrent with this CubeSat development is the development and construction of ground-based arrays of inexpensive sensors using the proposed technology. This instrument could be flown inexpensively on one or more CubeSats to provide valuable data to space weather forecasters and ionospheric scientists. Arrays of magnetometers have been deployed for the last 20 years [Alabi, 2005]. Other examples of ground-based arrays include an array of white-light all-sky imagers (THEMIS) deployed across Canada [Donovan et al., 2006], ocean sensors on buoys [McPhaden et al., 2010], and arrays of seismic sensors [Schweitzer et al., 2002]. A comparable array of Doppler imagers can be constructed and deployed on the ground to complement the CubeSat data.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fahim, Farah; Deptuch, Grzegorz; Shenai, Alpana

    The Vertically Integrated Photon Imaging Chip - Large (VIPIC-L) is a large-area, small-pixel (65μm), 3D-integrated, photon-counting ASIC with zero-suppressed or full-frame dead-time-less data readout. It features a data throughput of 14.4 Gbps per chip with a full-frame readout speed of 56 kframes/s in the imaging mode. VIPIC-L contains a 192 x 192 pixel array; the total size of the chip is 1.248 cm x 1.248 cm with only a 5 μm periphery, and it contains about 120M transistors. A 1.3M-pixel camera module will be developed by arranging a 6 x 6 array of 3D VIPIC-L's bonded to a large-area silicon sensor on the analog side and to a readout board on the digital side. The readout board hosts a bank of FPGAs, one per VIPIC-L, to allow processing of up to 0.7 Tbps of raw data produced by the camera.

  15. Fast and Accurate Simulation Technique for Large Irregular Arrays

    NASA Astrophysics Data System (ADS)

    Bui-Van, Ha; Abraham, Jens; Arts, Michel; Gueuning, Quentin; Raucy, Christopher; Gonzalez-Ovejero, David; de Lera Acedo, Eloy; Craeye, Christophe

    2018-04-01

    A fast full-wave simulation technique is presented for the analysis of large irregular planar arrays of identical 3-D metallic antennas. The solution method relies on the Macro Basis Functions (MBF) approach and an interpolatory technique to compute the interactions between MBFs. The Harmonic-polynomial (HARP) model is established for the near-field interactions in a modified system of coordinates. For extremely large arrays made of complex antennas, two approaches assuming a limited radius of influence for mutual coupling are considered: one is based on a sparse-matrix LU decomposition and the other one on a tessellation of the array in the form of overlapping sub-arrays. The computation of all embedded element patterns is sped up with the help of the non-uniform FFT algorithm. Extensive validations are shown for arrays of log-periodic antennas envisaged for the low-frequency SKA (Square Kilometer Array) radio-telescope. The analysis of SKA stations with such a large number of elements has not been treated yet in the literature. Validations include comparison with results obtained with commercial software and with experiments. The proposed method is particularly well suited to array synthesis, in which several orders of magnitude can be saved in terms of computation time.
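
    The reduction at the heart of the MBF approach is compact linear algebra: project the full method-of-moments system onto a small set of macro basis functions, solve the reduced system, and expand back. The dense NumPy sketch below shows only this algebra, with illustrative sizes and a synthetic well-conditioned matrix; the actual solver additionally uses the HARP interpolation and the sparsity/tessellation schemes described above.

        import numpy as np

        def mbf_solve(Z, Q, v):
            """Solve the MoM system Z a = v in a reduced macro-basis space.

            Z : (N, N) full impedance matrix
            Q : (N, M) macro basis functions, M << N
            v : (N,)   excitation vector
            Returns the expanded current coefficients a ~= Q b.
            """
            Zr = Q.conj().T @ Z @ Q        # reduced M x M system
            vr = Q.conj().T @ v
            b = np.linalg.solve(Zr, vr)
            return Q @ b

        # Illustrative sizes: 2000 unknowns compressed onto 60 MBFs.
        rng = np.random.default_rng(0)
        N, M = 2000, 60
        Z = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
             + N * np.eye(N))              # synthetic, diagonally dominant matrix
        Q, _ = np.linalg.qr(rng.standard_normal((N, M)))  # orthonormal MBF block
        v = rng.standard_normal(N)
        a = mbf_solve(Z, Q, v)

    The payoff is that the dense solve is O(M^3) instead of O(N^3); the interpolatory near-field model above exists precisely to build Zr without ever assembling the full Z.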

  16. Social Withdrawal Among Individuals Receiving Psychiatric Care: Derivation of a Scale Using Routine Clinical Assessment Data to Support Screening and Outcome Measurement.

    PubMed

    Rios, Sebastian; Perlman, Christopher M

    2017-04-24

    Social withdrawal is a symptom experienced by individuals with an array of mental health conditions, particularly those with schizophrenia and mood disorders. Assessments of social withdrawal are often lengthy and may not be routinely integrated within the comprehensive clinical assessment of the individual. This study utilized item response and classical test theory methods to derive a Social Withdrawal Scale (SWS) using items embedded within a routine clinical assessment, the RAI-Mental Health (RAI-MH). Using data from 60,571 inpatients in Ontario, Canada, a common factor analysis identified seven items from the RAI-MH that measure social withdrawal. A graded response model found that six items had acceptable discrimination parameters: lack of motivation, reduced interaction, decreased energy, flat affect, anhedonia, and loss of interest. Summing these items, the SWS was found to have strong internal consistency (Cronbach's alpha = 0.82) and showed a medium to large effect size (d = 0.77) from admission to discharge. Fewer individuals with high SWS scores participated in social activity or reported having a confidant compared to those with lower scores. Since the RAI-MH is available across clinical subgroups in several jurisdictions, the SWS is a useful tool for screening, clinical decision support, and evaluation.

  17. The NSF Earthscope USArray Instrumentation Network

    NASA Astrophysics Data System (ADS)

    Davis, G. A.; Vernon, F.

    2012-12-01

    Since 2004, the Transportable Array component of the USArray Instrumentation Network has collected high resolution seismic data in near real-time from over 400 geographically distributed seismic stations. The deployed footprint of the array has steadily migrated across the continental United States, starting on the west coast and gradually moving eastward. As the network footprint shifts, stations from various regional seismic networks have been incorporated into the dataset. In 2009, an infrasound and barometric sensor component was added to existing core stations and to all new deployments. The ongoing success of the project can be attributed to a number of factors, including reliable communications to each site, on-site data buffering, largely homogeneous data logging hardware, and a common phase-locked time reference between all stations. Continuous data quality is ensured by thorough human and automated review of data from the primary sensors and over 24 state-of-health parameters from each station. The staff at the Array Network Facility have developed a number of tools to visualize data and troubleshoot problematic stations remotely. In the event of an emergency or maintenance on the server hardware, data acquisition can be shifted to alternate data centers through the use of virtualization technologies.

  18. Detection and Mapping of the September 2017 Mexico Earthquakes Using DAS Fiber-Optic Infrastructure Arrays

    NASA Astrophysics Data System (ADS)

    Karrenbach, M. H.; Cole, S.; Williams, J. J.; Biondi, B. C.; McMurtry, T.; Martin, E. R.; Yuan, S.

    2017-12-01

    Fiber-optic distributed acoustic sensing (DAS) uses conventional telecom fibers for a wide variety of monitoring purposes. Fiber-optic arrays can be located along pipelines for leak detection; along borders and perimeters to detect and locate intruders, or along railways and roadways to monitor traffic and identify and manage incidents. DAS can also be used to monitor oil and gas reservoirs and to detect earthquakes. Because thousands of such arrays are deployed worldwide and acquiring data continuously, they can be a valuable source of data for earthquake detection and location, and could potentially provide important information to earthquake early-warning systems. In this presentation, we show that DAS arrays in Mexico and the United States detected the M8.1 and M7.2 Mexico earthquakes in September 2017. At Stanford University, we have deployed a 2.4 km fiber-optic DAS array in a figure-eight pattern, with 600 channels spaced 4 meters apart. Data have been recorded continuously since September 2016. Over 800 earthquakes from across California have been detected and catalogued. Distant teleseismic events have also been recorded, including the two Mexican earthquakes. In Mexico, fiber-optic arrays attached to pipelines also detected these two events. Because of the length of these arrays and their proximity to the event locations, we can not only detect the earthquakes but also make location estimates, potentially in near real time. In this presentation, we review the data recorded for these two events recorded at Stanford and in Mexico. We compare the waveforms recorded by the DAS arrays to those recorded by traditional earthquake sensor networks. Using the wide coverage provided by the pipeline arrays, we estimate the event locations. Such fiber-optic DAS networks can potentially play a role in earthquake early-warning systems, allowing actions to be taken to minimize the impact of an earthquake on critical infrastructure components. While many such fiber-optic networks are already in place, new arrays can be created on demand, using existing fiber-optic telecom cables, for specific monitoring situations such as recording aftershocks of a large earthquake or monitoring induced seismicity.

  19. Integrated residential photovoltaic array development

    NASA Technical Reports Server (NTRS)

    Royal, G. C., III

    1981-01-01

    Sixteen conceptual designs of residential photovoltaic arrays are described. Each design concept was evaluated by an industry advisory panel using a comprehensive set of technical, economic and institutional criteria. Key electrical and mechanical concerns that affect further array subsystem development are also discussed. Three integrated array design concepts were selected by the advisory panel for further optimization and development. From these concepts a single one will be selected for detailed analysis and prototype fabrication. The three concepts selected are: (1) an array of frameless panels/modules sealed in a T-shaped, zipper-locking neoprene gasket grid pressure-fitted into an extruded aluminum channel grid fastened across the rafters; (2) an array of frameless modules pressure-fitted in a series of zipper-locking EPDM rubber extrusions adhesively bonded to the roof, with series string voltage developed using a set of integral tongue connectors and positioning blocks; and (3) an array of frameless modules sealed by a silicone adhesive in a prefabricated grid of rigid tape and sheet metal attached to the roof.

  20. Lens and Camera Arrays for Sky Surveys and Space Surveillance

    NASA Astrophysics Data System (ADS)

    Ackermann, M.; Cox, D.; McGraw, J.; Zimmer, P.

    2016-09-01

    In recent years, a number of sky survey projects have chosen to use arrays of commercial cameras coupled with commercial photographic lenses to enable low-cost, wide-area observation. Projects such as SuperWASP, FAVOR, RAPTOR, Lotis, PANOPTES, and DragonFly rely on multiple cameras with commercial lenses to image wide areas of the sky each night. The sensors are usually commercial astronomical charge coupled devices (CCDs) or digital single-lens reflex (DSLR) cameras, while the lenses are large-aperture, high-end consumer items intended for general photography. While much of this equipment is very capable and relatively inexpensive, this approach comes with a number of significant limitations that reduce the sensitivity and overall utility of the image data. The most frequently encountered limitations include lens vignetting, narrow spectral bandpass, and a relatively large point spread function. Understanding these limits helps to assess the utility of the data, and to identify areas where advanced optical designs could significantly improve survey performance.

  1. Simultaneous Solar Maximum Mission and Very Large Array (VLA) observations of solar active regions

    NASA Technical Reports Server (NTRS)

    Lang, K. R.

    1985-01-01

    Simultaneous observations of solar active regions with the Solar Maximum Mission (SMM) satellite and the Very Large Array (VLA) have been obtained and analyzed. The combined results enhance the scientific return far beyond that expected from using either SMM or the VLA alone. A total of two weeks of simultaneous SMM/VLA data were obtained. The multiple-wavelength VLA observations were used to determine the temperature and magnetic structure at different heights within coronal loops. These data are compared with simultaneous SMM observations. Several papers on the subject are in progress. They include VLA observations of compact, transient sources in the transition region; simultaneous SMM/VLA observations of the coronal loops in one active region and the evolution of another; and sampling of the coronal plasma using thermal cyclotron lines (magnetic field - VLA) and soft X-ray spectral lines (electron density and electron temperature - SMM).

  2. Environmentally-induced voltage limitations in large space power systems

    NASA Technical Reports Server (NTRS)

    Stevens, N. J.

    1984-01-01

    Large power systems proposed for future space missions imply higher operating voltage requirements which, in turn, will interact with the space plasma environment. The effects of these interactions can only be inferred because of the limited data base of ground simulations, small test samples, and two space flight experiments. This report evaluates floating potentials for a 100 kW power system operating at 300, 500, 750, and 1000 volts in relation to this data base. Of primary concern is the possibility of discharging to space. The implications of such discharges were studied at the 500 volt operational setting. It was found that discharging can shut down the power system if the discharge current exceeds the array short circuit current. Otherwise, a power oscillation can result that ranges from 2 to 20 percent, depending upon the solar array area involved in the discharge. Means of reducing the effect are discussed.

  3. Existing Instrumentation and Scientific Drivers for a Subduction Zone Observatory in Latin America

    NASA Astrophysics Data System (ADS)

    Frassetto, A.; Woodward, R.; Detrick, R. S.

    2015-12-01

    The subduction zones along the western shore of the Americas provide numerous societally relevant scientific questions that have yet to be fully explored and would make an excellent target for a comprehensive, integrated Subduction Zone Observatory (SZO). Further, recent discussions in Latin America indicate that there are a large number of existing stations that could serve as a backbone for an SZO. Such preexisting geophysical infrastructure commonly plays a vital role in new science initiatives, from small PI-led experiments to the establishment of the USArray Transportable Array, Reference Network, Cascadia Amphibious Array, and the redeployment of EarthScope Transportable Array stations to Alaska. Creating an SZO along the western coast of the Americas could strongly leverage the portfolio of existing seismic and geodetic stations across regions of interest. In this presentation, we will discuss the concept and experience of leveraging existing infrastructure in major new observational programs, outline the state of geophysical networks in the Americas (emphasizing current seismic networks but also looking back on historical temporary deployments), and provide an overview of potential scientific targets in the Americas that encompass a sampling of recently produced research results and datasets. Additionally, we will reflect on strategies for establishing meaningful collaborations across Latin America, an aspect that will be critical to the international partnerships, and associated capacity building, needed for a successful SZO initiative.

  4. Generalized assorted pixel camera: postcapture control of resolution, dynamic range, and spectrum.

    PubMed

    Yasuma, Fumihito; Mitsunaga, Tomoo; Iso, Daisuke; Nayar, Shree K

    2010-09-01

    We propose the concept of a generalized assorted pixel (GAP) camera, which enables the user to capture a single image of a scene and, after the fact, control the tradeoff between spatial resolution, dynamic range and spectral detail. The GAP camera uses a complex array (or mosaic) of color filters. A major problem with using such an array is that the captured image is severely under-sampled for at least some of the filter types. This leads to reconstructed images with strong aliasing. We make four contributions in this paper: 1) We present a comprehensive optimization method to arrive at the spatial and spectral layout of the color filter array of a GAP camera. 2) We develop a novel algorithm for reconstructing the under-sampled channels of the image while minimizing aliasing artifacts. 3) We demonstrate how the user can capture a single image and then control the tradeoff of spatial resolution to generate a variety of images, including monochrome, high dynamic range (HDR) monochrome, RGB, HDR RGB, and multispectral images. 4) Finally, the performance of our GAP camera has been verified using extensive simulations that use multispectral images of real world scenes. A large database of these multispectral images has been made available at http://www1.cs.columbia.edu/CAVE/projects/gap_camera/ for use by the research community.

  5. Exploring Seismic Noise with the USArray Transportable Array

    NASA Astrophysics Data System (ADS)

    Woodward, R.; Busby, R. W.; Simpson, D. W.

    2009-12-01

    The large number of seismic stations that comprise the EarthScope USArray Transportable Array (TA) seismic network provides an unparalleled opportunity for studying how seismic noise evolves with time over a large portion of the North American continent. Power spectra are computed automatically for every TA station, for every hour of every station-day, by the Quality Analysis Control Kit (QUACK) system at the IRIS Data Management Center. The power spectra utilize hour-long data segments, with 50% overlap between segments, providing spectral values in the band between 20 Hz and 172 s. Thus, at any in-band frequency one can construct a continuous two-year time history of seismic noise for every TA station. When the time variation of the power spectral values across the array is rendered as individual movie frames, one can examine the evolution of seismic noise across the full spatio-temporal extent of the TA. Overall, the background noise levels (especially at periods below 10 s) are remarkably uniform across the entire array. Numerous expected features are present, including diurnal and annual variations, enhanced noise levels at coastal stations, transients related to large storms, and episodes when the observations of background noise are dominated by earthquake energy. Upgrades to the TA station instrumentation will provide the capability to measure additional physical factors relevant to seismic noise. All TA stations deployed after August 2009 include MEMS barometers that can measure atmospheric pressure from DC to approximately 0.1 Hz. In addition, several stations have been temporarily equipped with infrasound sensors. Previous research has highlighted the direct effect of atmospheric pressure fluctuations on very long period vertical seismometers. The relationship to noise observed on horizontal seismometers is more complex; however, with a large number of uniform installations it may be possible to make further progress. We will present analyses of the spatio-temporal evolution of noise observed at the TA stations and present preliminary results from the barometers and infrasound sensors that have been deployed with TA stations so far. We will discuss opportunities for augmenting TA stations with additional sensors that may further elucidate seismic noise processes.
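
    As a rough illustration of the kind of computation described (hour-long segments with 50% overlap), the hedged sketch below estimates per-hour power spectral densities from one station-day of data. The sampling rate, window lengths, and the synthetic trace are assumptions, not details taken from the record or the QUACK system.

        # Hedged sketch: hourly PSD estimates from one station-day of data,
        # using hour-long segments with 50% overlap.
        import numpy as np
        from scipy.signal import welch

        fs = 40.0                               # sample rate in Hz (assumed)
        day = np.random.randn(int(fs * 86400))  # stand-in for one station-day trace

        seg = int(fs * 3600)                    # hour-long segments
        hop = seg // 2                          # 50% overlap between segments
        psds = []
        for start in range(0, len(day) - seg + 1, hop):
            hour = day[start:start + seg]
            # Welch PSD of this hour; the sub-window length sets the resolved band
            f, pxx = welch(hour, fs=fs, nperseg=seg // 16)
            psds.append(pxx)

        psd_db = 10 * np.log10(np.array(psds))  # hour-by-hour power in dB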

  6. Transferring genomics to the clinic: distinguishing Burkitt and diffuse large B cell lymphomas.

    PubMed

    Sha, Chulin; Barrans, Sharon; Care, Matthew A; Cunningham, David; Tooze, Reuben M; Jack, Andrew; Westhead, David R

    2015-01-01

    Classifiers based on molecular criteria such as gene expression signatures have been developed to distinguish Burkitt lymphoma and diffuse large B cell lymphoma, and they help to explore the intermediate cases where traditional diagnosis is difficult. Transfer of these research classifiers into a clinical setting is challenging because there are competing classifiers in the literature based on different methodologies and gene sets with no clear best choice; classifiers based on one expression measurement platform may not transfer effectively to another; and classifiers developed using fresh frozen samples may not work effectively with the commonly used and more convenient formalin-fixed paraffin-embedded samples used in routine diagnosis. Here we thoroughly compared two published high-profile classifiers developed on data from different Affymetrix array platforms and fresh-frozen tissue, examining their transferability and concordance. Based on this analysis, a new Burkitt and diffuse large B cell lymphoma classifier (BDC) was developed and employed on Illumina DASL data from our own paraffin-embedded samples, allowing comparison with the diagnosis made in a central haematopathology laboratory and evaluation of clinical relevance. We show that both previous classifiers can be recapitulated using much smaller gene sets than originally employed, and that the classification result is closely dependent on the Burkitt lymphoma criteria applied in the training set. The BDC classification on our data exhibits high agreement (~95%) with the original diagnosis. A simple outcome comparison in the patients presenting intermediate features on conventional criteria suggests that the cases classified as Burkitt lymphoma by BDC have a worse response to standard diffuse large B cell lymphoma treatment than those classified as diffuse large B cell lymphoma. In this study, we comprehensively investigate two previous Burkitt lymphoma molecular classifiers, and implement a new gene expression classifier, BDC, that works effectively on paraffin-embedded samples and provides useful information for treatment decisions. The classifier is available as a free software package under the GNU public licence within the R statistical software environment through the link http://www.bioinformatics.leeds.ac.uk/labpages/softwares/ or on github https://github.com/Sharlene/BDC.
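
    The BDC classifier itself is distributed as an R package at the links above; purely to illustrate the general shape of a gene-expression classifier of this kind, the hedged Python sketch below cross-validates a logistic-regression model on a small expression matrix. The sample counts, gene counts, labels, and data are invented placeholders, not the published gene sets or method.

        # Illustrative sketch only: a minimal expression-based two-class
        # classifier with scikit-learn (not the published BDC implementation).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        n_samples, n_genes = 120, 28                 # placeholder sizes
        X = rng.normal(size=(n_samples, n_genes))    # expression matrix (samples x genes)
        y = rng.integers(0, 2, size=n_samples)       # 0 = DLBCL, 1 = BL (placeholder labels)

        # Standardizing each gene is one common step when moving between platforms
        model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
        scores = cross_val_score(model, X, y, cv=5)
        print("cross-validated accuracy:", scores.mean())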

  7. Reading and writing difficulties do not always occur as the researcher expects.

    PubMed

    Niemi, P; Poskiparta, E; Vauras, M; Mäki, H

    1998-09-01

    Making a prognosis about reading and learning difficulties is a tricky business, even if a large array of relevant variables is taken into account. The present article discusses such an endeavour, on the basis of a longitudinal four-year study which started with an orthodox intervention on linguistic awareness. However, after initial success, new groups of reading, writing, and math disabled children were identified over the course of the years. Membership of these groups could not always be predicted on the basis of extensive cognitive diagnostics performed during the preschool years. Rather, the pupil's adaptive behaviour while coping with the demands of school work emerges as an important prognostic factor. This was particularly evident in an interaction combining math and reading comprehension in grade 3.

  8. Technology developments toward 30-year-life of photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1984-01-01

    As part of the United States National Photovoltaics Program, the Jet Propulsion Laboratory's Flat-Plate Solar Array Project (FSA) has maintained a comprehensive reliability and engineering sciences activity aimed at understanding the reliability attributes of terrestrial flat-plate photovoltaic arrays and at deriving the analysis and design tools necessary to achieve module designs with a 30-year useful life. The considerable progress to date stemming from the ongoing reliability research is discussed, and the major areas requiring continued research are highlighted. The result is an overview of the total array reliability problem and of available means of achieving high reliability at minimum cost.

  9. Controllability of the Coulomb charging energy in close-packed nanoparticle arrays.

    PubMed

    Duan, Chao; Wang, Ying; Sun, Jinling; Guan, Changrong; Grunder, Sergio; Mayor, Marcel; Peng, Lianmao; Liao, Jianhui

    2013-11-07

    We studied the electronic transport properties of metal nanoparticle arrays, focusing in particular on the Coulomb charging energy. Through comparison, we confirmed that it is more reasonable to estimate the Coulomb charging energy using the activation energy obtained from the temperature-dependent zero-voltage conductance. On this basis, we systematically and comprehensively investigated the parameters that could be used to tune the Coulomb charging energy in nanoparticle arrays. We found that four parameters can significantly tune the Coulomb charging energy: the particle core size, the inter-particle distance, the nearest-neighbor number, and the dielectric constant of the ligand molecules.
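
    Estimating a charging energy from temperature-dependent zero-voltage conductance amounts to an Arrhenius fit, G(T) ∝ exp(-Ea / kB T). The hedged sketch below extracts the activation energy Ea from synthetic conductance data; the temperatures, prefactor, and Ea value are placeholders, not measurements from the record.

        # Hedged sketch: activation energy from temperature-dependent
        # zero-voltage conductance via an Arrhenius fit, G ∝ exp(-Ea / kB T).
        import numpy as np

        kB = 8.617e-5                            # Boltzmann constant in eV/K
        T = np.linspace(80, 300, 12)             # measurement temperatures in K (assumed)
        Ea_true = 0.05                           # placeholder activation energy, eV
        G = 1e-6 * np.exp(-Ea_true / (kB * T))   # synthetic conductance data

        # ln G is linear in 1/T with slope -Ea/kB
        slope, intercept = np.polyfit(1.0 / T, np.log(G), 1)
        Ea = -slope * kB
        print(f"fitted activation energy: {Ea * 1e3:.1f} meV")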

  10. A Spatiotemporal Indexing Approach for Efficient Processing of Big Array-Based Climate Data with MapReduce

    NASA Technical Reports Server (NTRS)

    Li, Zhenlong; Hu, Fei; Schnase, John L.; Duffy, Daniel Q.; Lee, Tsengdar; Bowen, Michael K.; Yang, Chaowei

    2016-01-01

    Climate observations and model simulations are producing vast amounts of array-based spatiotemporal data. Efficient processing of these data is essential for assessing global challenges such as climate change, natural disasters, and diseases. This is challenging not only because of the large data volume, but also because of the intrinsically high-dimensional nature of geoscience data. To tackle this challenge, we propose a spatiotemporal indexing approach to efficiently manage and process big climate data with MapReduce in a highly scalable environment. Using this approach, big climate data are directly stored in a Hadoop Distributed File System in their original, native file format. A spatiotemporal index is built to bridge the logical array-based data model and the physical data layout, which enables fast data retrieval when performing spatiotemporal queries. Based on the index, a data-partitioning algorithm is applied to enable MapReduce to achieve high data locality as well as to balance the workload. The proposed indexing approach is evaluated using the National Aeronautics and Space Administration (NASA) Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. The experimental results show that the index can significantly accelerate querying and processing (a 10× speedup compared to the baseline test using the same computing cluster), while keeping the index-to-data ratio small (0.0328). The applicability of the indexing approach is demonstrated by a climate anomaly detection application deployed on a NASA Hadoop cluster. This approach can also support efficient processing of general array-based spatiotemporal data in various geoscience domains without special configuration on a Hadoop cluster.
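
    As a toy illustration of the indexing idea, assuming nothing about the actual MERRA file layout: the hedged sketch below builds a small in-memory index from (variable, time step, spatial tile) keys to byte offsets, so a query can seek directly to the relevant array block instead of scanning whole files. All names, sizes, and the file layout are invented for illustration.

        # Hedged sketch: a toy spatiotemporal index mapping logical array
        # coordinates to physical byte offsets (file layout is invented).
        from collections import defaultdict

        BLOCK_BYTES = 4 * 90 * 144          # one float32 tile of 90 x 144 grid cells

        index = defaultdict(dict)           # index[variable][(t, tile)] = (path, offset)
        for t in range(24):                 # 24 hourly time steps (placeholder)
            for tile in range(4):           # 4 spatial tiles per step (placeholder)
                offset = (t * 4 + tile) * BLOCK_BYTES
                index["T2M"][(t, tile)] = ("merra_day001.bin", offset)

        def lookup(variable, t, tile):
            """Return (file, offset, nbytes) for one spatiotemporal block."""
            path, offset = index[variable][(t, tile)]
            return path, offset, BLOCK_BYTES

        print(lookup("T2M", t=12, tile=2))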

  11. A Census of Southern Pulsars at 185 MHz

    NASA Astrophysics Data System (ADS)

    Xue, Mengyao; Bhat, N. D. R.; Tremblay, S. E.; Ord, S. M.; Sobey, C.; Swainston, N. A.; Kaplan, D. L.; Johnston, Simon; Meyers, B. W.; McSweeney, S. J.

    2017-12-01

    The Murchison Widefield Array, and its recently developed Voltage Capture System, facilitates extending the low-frequency range of pulsar observations at high time and frequency resolution in the Southern Hemisphere, providing further information about pulsars and the ISM. We present the results of an initial time-resolved census of known pulsars using the Murchison Widefield Array. To significantly reduce the processing load, we incoherently sum the detected powers from the 128 Murchison Widefield Array tiles, which yields ~10% of the attainable sensitivity of the coherent sum. This preserves the large field-of-view (~450 deg² at 185 MHz), allowing multiple pulsars to be observed simultaneously. We developed a WIde-field Pulsar Pipeline that processes the data from each observation and automatically folds every known pulsar located within the beam. We have detected 50 pulsars to date, 6 of which are millisecond pulsars. This is consistent with our expectation, given the telescope sensitivity and the sky coverage of the processed data (~17,000 deg²). For 10 pulsars, we present the lowest-frequency detections published. For a subset of the pulsars, we present multi-frequency pulse profiles by combining our data with published profiles from other telescopes. Since the Murchison Widefield Array is a low-frequency precursor to the Square Kilometre Array, we use our census results to forecast that a survey using the low-frequency component of the Square Kilometre Array Phase 1 can potentially detect around 9,400 pulsars.
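
    To make the quoted sensitivity figure concrete: incoherent addition of detected powers from N antennas improves signal-to-noise as √N, whereas coherent (voltage) addition improves it as N, so the standard scaling gives

        \[
          \frac{(\mathrm{S/N})_{\mathrm{incoherent}}}{(\mathrm{S/N})_{\mathrm{coherent}}}
            = \frac{\sqrt{N}}{N}
            = \frac{1}{\sqrt{N}}
            = \frac{1}{\sqrt{128}} \approx 0.088,
        \]

    consistent with the roughly 10% of coherent-sum sensitivity stated in the abstract.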

  12. Working with and Visualizing Big Data Efficiently with Python for the DARPA XDATA Program

    DTIC Science & Technology

    2017-08-01

    Fragments of the report describe NumPy-style semantics that allow the same function to be used with scalar inputs, input arrays of the same shape, or, in some cases, input arrays of differing dimensionality (broadcasting); element-wise math operations on values; split-apply-combine, similar to group-by operations in databases; join, which combines two datasets using common columns; and continued work on Numba to increase SIMD performance with support for fast-math flags and improved support for AVX, Intel's wide vector extensions.
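
    Since this snippet survives only in fragments, the sketch below is a hedged reconstruction of the operations it names, not code from the report: NumPy broadcasting, pandas split-apply-combine, and a join on common columns, all on invented data.

        # Hedged sketch of the operations named in the report fragment:
        # broadcasting, split-apply-combine (group-by), and join.
        import numpy as np
        import pandas as pd

        # Broadcasting: one function body serves scalars and arrays alike
        def rescale(x, lo=0.0, hi=10.0):
            return (x - lo) / (hi - lo)

        print(rescale(5.0), rescale(np.array([2.0, 5.0, 8.0])))

        # Split-apply-combine: like a database GROUP BY
        df = pd.DataFrame({"group": ["a", "a", "b"], "value": [1.0, 3.0, 5.0]})
        print(df.groupby("group")["value"].mean())

        # Join: combine two datasets using common columns
        labels = pd.DataFrame({"group": ["a", "b"], "label": ["low", "high"]})
        print(df.merge(labels, on="group"))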

  13. Lorenzo Lotto's painting materials: an integrated diagnostic approach

    NASA Astrophysics Data System (ADS)

    Amadori, Maria Letizia; Poldi, Gianluca; Barcelli, Sara; Baraldi, Pietro; Berzioli, Michela; Casoli, Antonella; Marras, Susanna; Pojana, Giulio; Villa, Giovanni C. F.

    2016-07-01

    This paper presents the results of a comprehensive diagnostic investigation carried out on five paintings (three wood panels and two paintings on canvas) by Lorenzo Lotto, one of the most significant artists of the Italian Renaissance in the first half of the 16th century. The paintings considered belong to the 1508-1522 period, corresponding to the most significant years of Lotto's evolution. A wide array of non-invasive (reflectance spectrometry and X-ray fluorescence) and micro-invasive analytical techniques (optical microscopy, scanning electron microscopy with energy dispersive spectroscopy, micro-FTIR spectroscopy, micro-Raman spectroscopy, gas chromatography coupled with mass spectrometry, and high performance liquid chromatography coupled with photodiode array detection and mass spectrometry) was applied in order to provide a large set of significant data while limiting sampling as much as possible. This study proved that Lotto's palette was typical of Venetian practice of the period, but some significant peculiarities emerged: the use of two kinds of red lakes, and the addition of calcium carbonate and colourless powdered glass, the latter frequently found in pictorial and ground layers. Moreover, the integrated investigation showed that Lotto's technique was sometimes characterized by the use of coloured priming and multi-layer sequences with complex mixtures. Chromatographic analyses identified azelaic, palmitic, and stearic acids in all specimens, generally indicating the presence of drying oils. The extension of additional non-invasive examination to about 50 paintings by the same author, spanning from 1505 to around 1556, helped to verify the evolution in the use of some pigments, such as the yellows, where Pb-Sb yellow was used alongside Pb-Sn yellow.

  14. Development of advanced micromirror arrays by flip-chip assembly

    NASA Astrophysics Data System (ADS)

    Michalicek, M. Adrian; Bright, Victor M.

    2001-10-01

    This paper presents the design, commercial prefabrication, modeling, and testing of advanced micromirror arrays fabricated using a novel, simple, and inexpensive flip-chip assembly technique. Several polar piston arrays and rectangular cantilever arrays were fabricated using flip-chip assembly, by which the upper layers of the array are fabricated on a separate chip and then transferred to a receiving module containing the lower layers. Typical polar piston arrays boast 98.3% active surface area, highly planarized surfaces, low address potentials compatible with CMOS electronics, highly standardized actuation between devices, and complex segmentation of mirror surfaces, which allows for custom aberration configurations. Typical cantilever arrays boast large angles of rotation as well as an average surface planarity of only 1.779 nm RMS roughness across 100 μm mirrors. Continuous torsion devices offer stable operation through as much as six degrees of rotation, while binary operation devices offer stable activated positions with as much as 20 degrees of rotation. All arrays offer features typical of costly fabrication services, such as five structural layers and planarized mirror surfaces, yet are prefabricated in the less costly MUMPs process. Models are developed for all devices and compared with empirical data.

  15. A Generic and Efficient E-field Parallel Imaging Correlator for Next-Generation Radio Telescopes

    NASA Astrophysics Data System (ADS)

    Thyagarajan, Nithyanandan; Beardsley, Adam P.; Bowman, Judd D.; Morales, Miguel F.

    2017-05-01

    Modern radio telescopes are favouring densely packed array layouts with large numbers of antennas (N_A ≳ 1000). Since the complexity of traditional correlators scales as O(N_A^2), there will be a steep cost for realizing the full imaging potential of these powerful instruments. Through our generic and efficient E-field Parallel Imaging Correlator (EPIC), we present the first software demonstration of a generalized direct imaging algorithm, namely the Modular Optimal Frequency Fourier imager. Not only does it bring down the cost for dense layouts to O(N_A log_2 N_A), but it can also image from irregular layouts and heterogeneous arrays of antennas. EPIC is highly modular, parallelizable, implemented in object-oriented Python, and publicly available. We have verified the images produced to be equivalent to those from traditional techniques to within a precision set by gridding coarseness. We have also validated our implementation on data observed with the Long Wavelength Array (LWA1). We provide a detailed framework for imaging with heterogeneous arrays and show that EPIC robustly estimates the input sky model for such arrays. Antenna layouts with dense filling factors consisting of a large number of antennas, such as the LWA, the Square Kilometre Array, the Hydrogen Epoch of Reionization Array, and the Canadian Hydrogen Intensity Mapping Experiment, will gain significant computational advantage by deploying an optimized version of EPIC. The algorithm is a strong candidate for instruments targeting transient searches of fast radio bursts as well as planetary and exoplanetary phenomena, due to the availability of high-speed calibrated time-domain images and low output bandwidth relative to visibility-based systems.
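
    A back-of-envelope comparison, ignoring constant factors, shows why the scaling change matters for dense layouts: for N_A = 1000 antennas,

        \[
          \frac{N_A^2}{N_A \log_2 N_A}
            = \frac{N_A}{\log_2 N_A}
            = \frac{1000}{\log_2 1000}
            \approx \frac{1000}{10}
            = 100,
        \]

    i.e., roughly a hundredfold reduction in per-accumulation compute for a thousand-antenna array.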

  16. Assembly, characterization, and operation of large-scale TES detector arrays for ACTPol

    NASA Astrophysics Data System (ADS)

    Pappas, Christine Goodwin

    2016-01-01

    The Polarization-sensitive Receiver for the Atacama Cosmology Telescope (ACTPol) is designed to measure the Cosmic Microwave Background (CMB) temperature and polarization anisotropies on small angular scales. Measurements of the CMB temperature and polarization anisotropies have produced arguably the most important cosmological data to date, establishing the LambdaCDM model and providing the best constraints on most of its parameters. To detect the very small fluctuations in the CMB signal across the sky, ACTPol uses feedhorn-coupled Transition-Edge Sensor (TES) detectors. A TES is a superconducting thin film operated in the transition region between the superconducting and normal states, where it functions as a highly sensitive resistive thermometer. In this thesis, aspects of the assembly, characterization, and in-field operation of the ACTPol TES detector arrays are discussed. First, a novel microfabrication process for producing high-density superconducting aluminum/polyimide flexible circuitry (flex) designed to connect large-scale detector arrays to the first stage of readout is presented. The flex is used in parts of the third ACTPol array and is currently being produced for use in the AdvACT detector arrays, which will begin to replace the ACTPol arrays in 2016. Next, we describe methods and results for the in-lab and on-telescope characterization of the detectors in the third ACTPol array. Finally, we describe the ACTPol TES R(T,I) transition shapes and how they affect the detector calibration and operation. Methods for measuring the exact detector calibration and re-biasing functions, taking into account the R(T,I) transition shape, are presented.

  17. NbN A/D Conversion of IR Focal Plane Sensor Signal at 10 K

    NASA Technical Reports Server (NTRS)

    Eaton, L.; Durand, D.; Sandell, R.; Spargo, J.; Krabach, T.

    1994-01-01

    We are implementing a 12-bit SFQ counting ADC with parallel-to-serial readout using our established 10 K NbN capability. This circuit provides a key element of the analog signal processor (ASP) used in large infrared focal plane arrays. The circuit processes the signal data stream from a Si:As BIB detector array. A 10 megasample-per-second (MSPS) pixel data stream flows from the chip at a 120 megabit-per-second rate in a format that is compatible with other superconductive time-dependent processor (TDP) circuits being developed. We will discuss our planned ASP demonstration, the circuit design, and test results.

  18. Electronic switching spherical array antenna

    NASA Technical Reports Server (NTRS)

    Stockton, R.

    1978-01-01

    This work was conducted to demonstrate the performance levels attainable with an ESSA (Electronic Switching Spherical Array) antenna by designing and testing an engineering model. The antenna was designed to satisfy general spacecraft environmental requirements and built to provide electronically commandable beam pointing capability throughout a hemisphere. Constant gain and beam shape throughout large volumetric coverage regions are the principal characteristics. The model is intended to be a prototype of a standard communications and data handling antenna for user scientific spacecraft with the Tracking and Data Relay Satellite System (TDRSS). Some additional testing was conducted to determine the feasibility of an integrated TDRSS and GPS (Global Positioning System) antenna system.

  19. Magnetic thin-film insulator with ultra-low spin wave damping for coherent nanomagnonics

    NASA Astrophysics Data System (ADS)

    Yu, Haiming; Kelly, O. D'allivy; Cros, V.; Bernard, R.; Bortolotti, P.; Anane, A.; Brandl, F.; Huber, R.; Stasinopoulos, I.; Grundler, D.

    2014-10-01

    Wave control in the solid state has opened new avenues in modern information technology. Surface-acoustic-wave-based devices are found as mass-market products in hundreds of millions of cellular phones. Spin waves (magnons) would offer a boost in today's data handling and security implementations, i.e., image processing and speech recognition. However, nanomagnonic devices realized so far suffer from the relatively short damping length in metallic ferromagnets, typically amounting to a few tens of micrometers. Here we demonstrate that nm-thick YIG films overcome the damping chasm. Using a conventional coplanar waveguide we excite a large series of short-wavelength spin waves (SWs). From the data we estimate a macroscopic damping length of about 600 micrometers. The intrinsic damping parameter suggests an even larger record value of about 1 mm, allowing for magnonics-based nanotechnology with ultra-low damping. In addition, SWs at large wave vector are found to exhibit the non-reciprocal properties relevant for new concepts in nanoscale SW-based logic. We expect our results to provide the basis for coherent data processing with SWs at GHz rates and in large arrays of magnetic cells, thereby boosting the envisioned image processing and speech recognition.

  20. Magnetic thin-film insulator with ultra-low spin wave damping for coherent nanomagnonics

    PubMed Central

    Yu, Haiming; Kelly, O. d'Allivy; Cros, V.; Bernard, R.; Bortolotti, P.; Anane, A.; Brandl, F.; Huber, R.; Stasinopoulos, I.; Grundler, D.

    2014-01-01

    Wave control in the solid state has opened new avenues in modern information technology. Surface-acoustic-wave-based devices are found as mass-market products in hundreds of millions of cellular phones. Spin waves (magnons) would offer a boost in today's data handling and security implementations, i.e., image processing and speech recognition. However, nanomagnonic devices realized so far suffer from the relatively short damping length in metallic ferromagnets, typically amounting to a few tens of micrometers. Here we demonstrate that nm-thick YIG films overcome the damping chasm. Using a conventional coplanar waveguide we excite a large series of short-wavelength spin waves (SWs). From the data we estimate a macroscopic damping length of about 600 micrometers. The intrinsic damping parameter suggests an even larger record value of about 1 mm, allowing for magnonics-based nanotechnology with ultra-low damping. In addition, SWs at large wave vector are found to exhibit the non-reciprocal properties relevant for new concepts in nanoscale SW-based logic. We expect our results to provide the basis for coherent data processing with SWs at GHz rates and in large arrays of magnetic cells, thereby boosting the envisioned image processing and speech recognition. PMID:25355200

  1. Unified Access Architecture for Large-Scale Scientific Datasets

    NASA Astrophysics Data System (ADS)

    Karna, Risav

    2014-05-01

    Data-intensive sciences have to deploy diverse large-scale database technologies for data analytics, as scientists are now dealing with much larger data volumes than ever before. While array databases have bridged many gaps between the needs of data-intensive research fields and DBMS technologies (Zhang 2011), invocation of the other big data tools accompanying these databases is still manual and separate from the database management interface. We identify this as an architectural challenge that will increasingly complicate the user's workflow owing to the growing number of useful but isolated and niche database tools. Such use of data analysis tools in effect leaves the burden on the user's end to synchronize the results from other data manipulation and analysis tools with the database management system. To this end, we propose a unified access interface for using big data tools within a large-scale scientific array database, using the database queries themselves to embed foreign routines belonging to the big data tools. Such an invocation of foreign data manipulation routines inside a database query can be made possible through a user-defined function (UDF). UDFs that allow such levels of freedom as to call modules from another language, and to interface back and forth between the query body and the side-loaded functions, would be needed for this purpose. For the purpose of this research we attempt coupling of four widely used tools, Hadoop (hadoop1), Matlab (matlab1), R (r1), and ScaLAPACK (scalapack1), with the UDF feature of rasdaman (Baumann 98), an array-based data manager, to investigate this concept. The native array data model used by an array-based data manager provides compact data storage and high-performance operations on ordered data such as spatial data, temporal data, and matrix-based data for linear algebra operations (scidbusr1). Performance issues arising from the coupling of tools with different paradigms, niche functionalities, separate processes, and differing output data formats were anticipated and considered during the design of the unified architecture. The research focuses on the feasibility of the designed coupling mechanism and the evaluation of the efficiency and benefits of our proposed unified access architecture. Zhang 2011: Zhang, Ying and Kersten, Martin and Ivanova, Milena and Nes, Niels, SciQL: Bridging the Gap Between Science and Relational DBMS, Proceedings of the 15th Symposium on International Database Engineering Applications, 2011. Baumann 98: Baumann, P., Dehmel, A., Furtado, P., Ritsch, R., Widmann, N., "The Multidimensional Database System RasDaMan", SIGMOD 1998, Proceedings ACM SIGMOD International Conference on Management of Data, June 2-4, 1998, Seattle, Washington, 1998. hadoop1: hadoop.apache.org, "Hadoop", http://hadoop.apache.org/, [Online; accessed 12-Jan-2014]. scalapack1: netlib.org/scalapack, "ScaLAPACK", http://www.netlib.org/scalapack, [Online; accessed 12-Jan-2014]. r1: r-project.org, "R", http://www.r-project.org/, [Online; accessed 12-Jan-2014]. matlab1: mathworks.com, "Matlab Documentation", http://www.mathworks.de/de/help/matlab/, [Online; accessed 12-Jan-2014]. scidbusr1: scidb.org, "SciDB User's Guide", http://scidb.org/HTMLmanual/13.6/scidb_ug, [Online; accessed 01-Dec-2013].
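
    The core idea of embedding foreign routines in database queries through UDFs can be shown in miniature with SQLite's Python bindings; this is only an analogy for the rasdaman UDF mechanism described above, and the function and table names are invented for illustration.

        # Hedged analogy: registering a foreign (Python) routine as a UDF so a
        # query can invoke it inline, as the proposed architecture does in rasdaman.
        import math
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE samples (id INTEGER, value REAL)")
        conn.executemany("INSERT INTO samples VALUES (?, ?)",
                         [(1, 4.0), (2, 9.0), (3, 16.0)])

        # The side-loaded routine; any Python (or wrapped R/Matlab) code could go here
        def log_root(x):
            return math.log(math.sqrt(x))

        conn.create_function("log_root", 1, log_root)   # expose it to the query language

        # The query body calls out to the foreign routine row by row
        for row in conn.execute("SELECT id, log_root(value) FROM samples"):
            print(row)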

  2. ChiLin: a comprehensive ChIP-seq and DNase-seq quality control and analysis pipeline.

    PubMed

    Qin, Qian; Mei, Shenglin; Wu, Qiu; Sun, Hanfei; Li, Lewyn; Taing, Len; Chen, Sujun; Li, Fugen; Liu, Tao; Zang, Chongzhi; Xu, Han; Chen, Yiwen; Meyer, Clifford A; Zhang, Yong; Brown, Myles; Long, Henry W; Liu, X Shirley

    2016-10-03

    Transcription factor binding, histone modification, and chromatin accessibility studies are important approaches to understanding the biology of gene regulation. ChIP-seq and DNase-seq have become the standard techniques for studying protein-DNA interactions and chromatin accessibility respectively, and comprehensive quality control (QC) and analysis tools are critical to extracting the most value from these assay types. Although many analysis and QC tools have been reported, few combine ChIP-seq and DNase-seq data analysis and quality control in a unified framework with a comprehensive and unbiased reference of data quality metrics. ChiLin is a computational pipeline that automates the quality control and data analyses of ChIP-seq and DNase-seq data. It is developed using a flexible and modular software framework that can be easily extended and modified. ChiLin is ideal for batch processing of many datasets and is well suited for large collaborative projects involving ChIP-seq and DNase-seq from different designs. ChiLin generates comprehensive quality control reports that include comparisons with historical data derived from over 23,677 public ChIP-seq and DNase-seq samples (11,265 datasets) from eight literature-based classified categories. To the best of our knowledge, this atlas represents the most comprehensive ChIP-seq and DNase-seq related quality metric resource currently available. These historical metrics provide useful heuristic quality references for experiments across all commonly used assay types. Using representative datasets, we demonstrate the versatility of the pipeline by applying it to different assay types of ChIP-seq data. The pipeline software is available open source at https://github.com/cfce/chilin . ChiLin is a scalable and powerful tool to process large batches of ChIP-seq and DNase-seq datasets. The analysis output and quality metrics have been structured into user-friendly directories and reports. We have successfully compiled 23,677 profiles into a comprehensive quality atlas with fine classification for users.

  3. A new technique to transfer metallic nanoscale patterns to small and non-planar surfaces: Application to a fiber optic device for surface enhanced Raman scattering detection

    NASA Astrophysics Data System (ADS)

    Smythe, Elizabeth Jennings

    This thesis focuses on the development of a bidirectional fiber optic probe for the detection of surface enhanced Raman scattering (SERS). One facet of this fiber-based probe featured an array of coupled optical antennas, which we designed to enhance the Raman signal of nearby analytes. When this array interacted with an analyte, it generated SERS signals specific to the chemical composition of the sample; some of these SERS signals coupled back into the fiber. We used the other facet of the probe to input light into the fiber and collect the SERS signals that coupled into the probe. In this dissertation, the development of the probe is broken into three sections: (i) characterization of antenna arrays, (ii) fabrication of the probe, and (iii) device measurements. In the first section we present a comprehensive study of metallic antenna arrays. We carried out this study to determine the effects of antenna geometry, spacing, and composition on the surface plasmon resonance (SPR) of a coupled antenna array; the wavelength range and strength of the SPR are functions of the shape and interactions of the antennas. The SPR of the array ultimately amplified the Raman signal of analytes and produced a measurable SERS signal, thus determination of the optimal array geometries for SERS generation was an important first step in the development of the SERS fiber probe. We then introduce a new technique developed to fabricate the SERS fiber probes. This technique involves transferring antenna arrays (created by standard lithographic methods) from a large silicon substrate to a fiber facet. We developed this fabrication technique to bypass many of the limitations presented by previously developed methods for patterning unconventional substrates (i.e. small and/or non-planar substrates), such as focused ion-beam milling and soft lithography. In the third section of this thesis, we present SERS measurements taken with the fiber probe. We constructed a measurement system to couple light into the probe and filter out background noise; this allowed simultaneous detection of multiple chemicals. Antenna array enhancement factor (EF) calculations are shown; these allowed us to determine that the probe efficiently collected SERS signals.

  4. Tracking fin whale calls offshore the Galicia Margin, North East Atlantic Ocean.

    PubMed

    Gaspà Rebull, Oriol; Díaz Cusí, Jordi; Ruiz Fernández, Mario; Gallart Muset, Josep

    2006-10-01

    Data recorded during a temporary deployment of ocean bottom seismometers (OBSs) are used in this study to monitor the presence of fin whales around the array. In the summer of 2003, ten OBSs were placed 250 km from the NW coast of Iberia in the Galicia Margin, NE Atlantic Ocean for a period of one month. The recorded data set provided a large variety of signals, including fin whale vocalizations identified by their specific acoustic signature. The use of a dense array of seafloor receivers allowed investigation into the locations and tracks of the signal-generating whales using a seismological hypocentral location code. Individual pulses of different sequences have been chosen to study such tracks. Problems related to the correct identification of pulses, discrimination between direct and multiple arrivals, and the presence of more than one individual have been considered prior to location. Fin calls were concentrated in the last two weeks of the deployment and the locations were spread around the area covered by the array. These results illustrate that, besides its classical seismological aim, deployment of semipermanent seafloor seismic arrays can also provide valuable data for marine mammal behavior studies.

  5. Pulsar Timing Array Based Search for Supermassive Black Hole Binaries in the Square Kilometer Array Era

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Mohanty, Soumya D.

    2017-04-01

    The advent of next generation radio telescope facilities, such as the Square Kilometer Array (SKA), will usher in an era where a pulsar timing array (PTA) based search for gravitational waves (GWs) will be able to use hundreds of well timed millisecond pulsars rather than the few dozens in existing PTAs. A realistic assessment of the performance of such an extremely large PTA must take into account the data analysis challenge posed by an exponential increase in the parameter space volume due to the large number of so-called pulsar phase parameters. We address this problem and present such an assessment for isolated supermassive black hole binary (SMBHB) searches using a SKA era PTA containing 10^3 pulsars. We find that an all-sky search will be able to confidently detect nonevolving sources with a redshifted chirp mass of 10^10 M_⊙ out to a redshift of about 28 (corresponding to a rest-frame chirp mass of 3.4×10^8 M_⊙). We discuss the important implications that the large distance reach of a SKA era PTA has on GW observations from optically identified SMBHB candidates. If no SMBHB detections occur, a highly unlikely scenario in the light of our results, the sky-averaged upper limit on strain amplitude will be improved by about 3 orders of magnitude over existing limits.

  6. Pulsar Timing Array Based Search for Supermassive Black Hole Binaries in the Square Kilometer Array Era.

    PubMed

    Wang, Yan; Mohanty, Soumya D

    2017-04-14

    The advent of next generation radio telescope facilities, such as the Square Kilometer Array (SKA), will usher in an era where a pulsar timing array (PTA) based search for gravitational waves (GWs) will be able to use hundreds of well timed millisecond pulsars rather than the few dozens in existing PTAs. A realistic assessment of the performance of such an extremely large PTA must take into account the data analysis challenge posed by an exponential increase in the parameter space volume due to the large number of so-called pulsar phase parameters. We address this problem and present such an assessment for isolated supermassive black hole binary (SMBHB) searches using a SKA era PTA containing 10^{3} pulsars. We find that an all-sky search will be able to confidently detect nonevolving sources with a redshifted chirp mass of 10^{10}  M_{⊙} out to a redshift of about 28 (corresponding to a rest-frame chirp mass of 3.4×10^{8}  M_{⊙}). We discuss the important implications that the large distance reach of a SKA era PTA has on GW observations from optically identified SMBHB candidates. If no SMBHB detections occur, a highly unlikely scenario in the light of our results, the sky-averaged upper limit on strain amplitude will be improved by about 3 orders of magnitude over existing limits.

  7. Far infrared through millimeter backshort-under-grid arrays

    NASA Astrophysics Data System (ADS)

    Allen, Christine A.; Abrahams, John; Benford, Dominic J.; Chervenak, James A.; Chuss, David T.; Staguhn, Johannes G.; Miller, Timothy M.; Moseley, S. Harvey; Wollack, Edward J.

    2006-06-01

    We are developing a large-format, versatile, bolometer array for a wide range of infrared through millimeter astronomical applications. The array design consists of three key components - superconducting transition edge sensor bolometer arrays, quarter-wave reflective backshort grids, and Superconducting Quantum Interference Device (SQUID) multiplexer readouts. The detector array is a filled, square grid of bolometers with superconducting sensors. The backshort arrays are fabricated separately and are positioned in the etch cavities behind the detector grid. The grids have unique three-dimensional interlocking features micromachined into the walls for positioning and mechanical stability. The ultimate goal of the program is to produce large-format arrays with background-limited sensitivity, suitable for a wide range of wavelengths and applications. Large-format (kilopixel) arrays will be directly indium bump bonded to a SQUID multiplexer circuit. We have produced and tested 8×8 arrays of 1 mm detectors to demonstrate proof of concept. 8×16 arrays of 2 mm detectors are being produced for a new Goddard Space Flight Center instrument. We have also produced models of a kilopixel detector grid and dummy multiplexer chip for bump bonding development. We present detector design overview, several unique fabrication highlights, and assembly technologies.

  8. Evaluation of Microelectrode Array Data using Bayesian Modeling as an Approach to Screening and Prioritization for Neurotoxicity Testing*

    EPA Science Inventory

    The need to assess large numbers of chemicals for their potential toxicities has resulted in increased emphasis on medium- and high-throughput in vitro screening approaches. For such approaches to be useful, efficient and reliable data analysis and hit detection methods are also ...

  9. An Efficient Algorithm for TUCKALS3 on Data with Large Numbers of Observation Units.

    ERIC Educational Resources Information Center

    Kiers, Henk A. L.; And Others

    1992-01-01

    A modification of the TUCKALS3 algorithm is proposed that handles three-way arrays of order I x J x K for any I. The reduced work space needed for storing data and increased execution speed make the modified algorithm very suitable for use on personal computers. (SLD)

  10. Eosinophils from Physiology to Disease: A Comprehensive Review

    PubMed Central

    Yacoub, Mona-Rita; Ripa, Marco; Mannina, Daniele; Cariddi, Adriana; Saporiti, Nicoletta; Ciceri, Fabio; Castagna, Antonella; Dagna, Lorenzo

    2018-01-01

    Despite being the second least represented granulocyte subpopulation in the circulating blood, eosinophils are receiving a growing interest from the scientific community, due to their complex pathophysiological role in a broad range of local and systemic inflammatory diseases as well as in cancer and thrombosis. Eosinophils are crucial for the control of parasitic infections, but increasing evidence suggests that they are also involved in vital defensive tasks against bacterial and viral pathogens including HIV. On the other side of the coin, eosinophil potential to provide a strong defensive response against invading microbes through the release of a large array of compounds can prove toxic to the host tissues and dysregulate haemostasis. Increasing knowledge of eosinophil biological behaviour is leading to major changes in established paradigms for the classification and diagnosis of several allergic and autoimmune diseases and has paved the way to a “golden age” of eosinophil-targeted agents. In this review, we provide a comprehensive update on the pathophysiological role of eosinophils in host defence, inflammation, and cancer and discuss potential clinical implications in light of recent therapeutic advances. PMID:29619379

  11. Magnetic Fields in the Galaxy

    NASA Astrophysics Data System (ADS)

    Mayo, Elizabeth A.

    2009-01-01

    Interstellar magnetic fields are believed to play a crucial role in the star-formation process; therefore, a comprehensive study of magnetic fields is necessary for understanding the origins of stars. These projects use observational data obtained from the Very Large Array (VLA) in Socorro, NM. The data reveal interstellar magnetic field strengths via the Zeeman effect in radio frequency spectral lines. This information provides an estimate of the magnetic energy in star-forming interstellar clouds in the Galaxy, and comparisons can be made between these energies and the energies of self-gravitation and internal motions. From these comparisons, a better understanding of the role of magnetic fields in the origins of stars will emerge. NGC 6334 A is a compact HII region at the center of what is believed to be a large, rotating molecular torus (Kramer et al. 1997). This is a continuing study based on initial measurements of the HI and OH Zeeman effect (Sarma et al. 2000). The current study includes OH observations performed with the VLA at a higher spatial resolution than previously published data, allowing a better analysis of the spatial variations of the magnetic field. A new model of the region is also developed based on OH opacity studies, dust continuum maps, radio spectral lines, and infrared (IR) maps. The VLA has been used to study the Zeeman effect in the 21 cm HI line seen in absorption against radio sources in the Cygnus-X region. These sources, mostly galactic nebulae or HII regions, are bright and compact in this region of the spectrum. HI absorption lines are strong against these regions, and the VLA is capable of detecting the weak Zeeman effect within them. Support for this work was provided by the NSF PAARE program to South Carolina State University under award AST-0750814.

  12. Solar cell array design handbook, volume 1

    NASA Technical Reports Server (NTRS)

    Rauschenbach, H. S.

    1976-01-01

    Twelve chapters discuss the following: historical developments, the environment and its effects, solar cells, solar cell filters and covers, solar cell and other electrical interconnections, blocking and shunt diodes, substrates and deployment mechanisms, material properties, design synthesis and optimization, design analysis, procurement, production and cost aspects, evaluation and test, orbital performance, and illustrative design examples. A comprehensive index permits rapid locating of desired topics. The handbook consists of two volumes: Volume 1 is of an expository nature, while Volume 2 contains detailed design data in an appendix-like fashion. Volume 2 includes solar cell performance data, applicable unit conversion factors and physical constants, and mechanical, electrical, thermal, optical, magnetic, and outgassing material properties. Extensive references are provided.

  13. Corrections for the geometric distortion of the tube detectors on SANS instruments at ORNL

    DOE PAGES

    He, Lilin; Do, Changwoo; Qian, Shuo; ...

    2014-11-25

    Small-angle neutron scattering instruments at the Oak Ridge National Laboratory's High Flux Isotope Reactor were upgraded from the large, single-volume crossed-wire area detectors originally installed to staggered arrays of linear position-sensitive detectors (LPSDs). The specific geometry of the LPSD array requires that traditionally employed approaches to data reduction be modified. Here, two methods for correcting the geometric distortion produced by the LPSD array are presented and compared. The first method applies a correction derived from a detector sensitivity measurement performed using the same configuration in which the samples are measured. In the second method, a solid angle correction is derived that can be applied to data collected in any instrument configuration during the data reduction process, in conjunction with a detector sensitivity measurement collected at a sufficiently long camera length where the geometric distortions are negligible. Both methods produce consistent results and yield a maximum deviation of corrected data from isotropic scattering samples of less than 5% for scattering angles up to a maximum of 35°. The results are broadly applicable to any SANS instrument employing LPSD array detectors, which will be increasingly common as instruments having higher incident flux are constructed at neutron scattering facilities around the world.

  14. Parametrically Optimized Carbon Nanotube-Coated Cold Cathode Spindt Arrays

    PubMed Central

    Yuan, Xuesong; Cole, Matthew T.; Zhang, Yu; Wu, Jianqiang; Milne, William I.; Yan, Yang

    2017-01-01

    Here, we investigate, through parametrically optimized macroscale simulations, the field electron emission from arrays of carbon nanotube (CNT)-coated Spindts towards the development of an emerging class of novel vacuum electron devices. The present study builds on empirical data gleaned from our recent experimental findings on the room temperature electron emission from large area CNT electron sources. We determine the field emission current of the present microstructures directly using particle in cell (PIC) software and present a new CNT cold cathode array variant which has been geometrically optimized to provide maximal emission current density, with current densities of up to 11.5 A/cm2 at low operational electric fields of 5.0 V/μm. PMID:28336845

  15. Design automation techniques for custom LSI arrays

    NASA Technical Reports Server (NTRS)

    Feller, A.

    1975-01-01

    The standard cell design automation technique is described as an approach for generating random logic PMOS, CMOS or CMOS/SOS custom large scale integration arrays with low initial nonrecurring costs and quick turnaround time or design cycle. The system is composed of predesigned circuit functions or cells and computer programs capable of automatic placement and interconnection of the cells in accordance with an input data net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of support design automation and simulation programs are described, including programs for verifying correctness of the logic on the arrays, performing dc and dynamic analysis of MOS devices, and generating test sequences.

  16. Superconducting Bolometer Array Architectures

    NASA Technical Reports Server (NTRS)

    Benford, Dominic; Chervenak, Jay; Irwin, Kent; Moseley, S. Harvey; Shafer, Rick; Staguhn, Johannes; Wollack, Ed; Oegerle, William (Technical Monitor)

    2002-01-01

    The next generation of far-infrared and submillimeter instruments require large arrays of detectors containing thousands of elements. These arrays will necessarily be multiplexed, and superconducting bolometer arrays are the most promising present prospect for these detectors. We discuss our current research into superconducting bolometer array technologies, which has recently resulted in the first multiplexed detections of submillimeter light and the first multiplexed astronomical observations. Prototype arrays containing 512 pixels are in production using the Pop-Up Detector (PUD) architecture, which can be extended easily to 1000-pixel arrays. Planar arrays of close-packed bolometers are being developed for the GBT (Green Bank Telescope) and for future space missions. For certain applications, such as a slewed far-infrared sky survey, feedhorn coupling of a large sparsely-filled array of bolometers is desirable, and is being developed using photolithographic feedhorn arrays. Individual detectors have achieved a Noise Equivalent Power (NEP) of ~10^-17 W/√Hz at 300 mK, but several orders of magnitude improvement are required and can be reached with existing technology. The testing of such ultralow-background detectors will prove difficult, as it requires optical loading of below 1 fW. Antenna-coupled bolometer designs have advantages for large-format array designs at low powers due to their mode selectivity.

  17. Detecting Spatial Patterns in Biological Array Experiments

    PubMed Central

    ROOT, DAVID E.; KELLEY, BRIAN P.; STOCKWELL, BRENT R.

    2005-01-01

    Chemical genetic screening and DNA and protein microarrays are among a number of increasingly important and widely used biological research tools that involve large numbers of parallel experiments arranged in a spatial array. It is often difficult to ensure that uniform experimental conditions are present throughout the entire array, and as a result, one often observes systematic spatially correlated errors, especially when array experiments are performed using robots. Here, the authors apply techniques based on the discrete Fourier transform to identify and quantify spatially correlated errors superimposed on a spatially random background. They demonstrate that these techniques are effective in identifying common spatially systematic errors in high-throughput 384-well microplate assay data. In addition, the authors employ a statistical test to allow for automatic detection of such errors. Software tools for using this approach are provided. PMID:14567791
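
    In the spirit of the approach described, the hedged sketch below applies a 2-D discrete Fourier transform to a 16×24 (384-well) plate of measurements and flags strong off-center spectral peaks, which indicate spatially periodic systematic error superimposed on a random background. The threshold heuristic and the synthetic alternating-row artifact are assumptions, not the authors' published procedure.

        # Hedged sketch: detect spatially periodic error on a 384-well plate
        # (16 x 24 grid) with the 2-D discrete Fourier transform.
        import numpy as np

        rng = np.random.default_rng(1)
        plate = rng.normal(size=(16, 24))           # spatially random background
        # Inject an alternating-row artifact, e.g. from a misbehaving dispenser
        plate += 0.8 * np.cos(np.arange(16) * np.pi)[:, None]

        spec = np.abs(np.fft.fft2(plate - plate.mean())) ** 2
        spec[0, 0] = 0.0                            # ignore the mean (DC) term

        # Flag Fourier components that dominate the plate's total variation
        threshold = 10 * spec.mean()                # assumed heuristic cutoff
        peaks = np.argwhere(spec > threshold)
        for ky, kx in peaks:
            print(f"systematic pattern at spatial frequency (ky={ky}, kx={kx})")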

  18. Measurement of Trailing Edge Noise Using Directional Array and Coherent Output Power Methods

    NASA Technical Reports Server (NTRS)

    Hutcheson, Florence V.; Brooks, Thomas F.

    2002-01-01

    The use of a directional (or phased) array of microphones for the measurement of trailing edge (TE) noise is described and tested. The capabilities of this method are evaluated via measurements of TE noise from a NACA 63-215 airfoil model and from a cylindrical rod. This TE noise measurement approach is compared to one that is based on the cross-spectral analysis of output signals from a pair of microphones placed on opposite sides of an airframe model (COP method). Advantages and limitations of both methods are examined. It is shown that the microphone array can accurately measure TE noise and capture its two-dimensional characteristic over a large frequency range for any TE configuration, as long as noise contamination from extraneous sources is within bounds. The COP method is shown to also accurately measure TE noise, but over a more limited frequency range that narrows with increased TE thickness. Finally, the applicability and generality of an airfoil self-noise prediction method were evaluated via comparison to the experimental data obtained using the COP and array measurement methods. The predicted and experimental results are shown to agree over large frequency ranges.

  19. A High-Speed Large-Range Tip-Tilt-Piston Micromirror Array

    DOE PAGES

    Hopkins, Jonathan B.; Panas, Robert M.; Song, Yuanping; ...

    2016-12-01

    This work introduces the design of a high fill-factor (>99%) micromirror array (MMA) that consists of 1 mm² hexagonal mirrors, which are expected to each independently achieve continuous, closed-loop control of three degrees of freedom (DOFs)—tip, tilt, and piston—over large ranges (>±10° rotation and >±30 μm translation) at high speeds (~45 kHz for a 1° amplitude of rotational oscillation). The flexure topology of this array is designed using the Freedom, Actuation, and Constraint Topologies (FACT) synthesis approach, which utilizes geometric shapes to help designers rapidly consider every flexure topology that best achieves a desired set of DOFs driven by decoupled actuators. The geometry of this array's comb-drive actuators is optimized in conjunction with the geometry of the system's flexures using a novel approach. The analytical models underlying this approach are verified using finite element analysis (FEA) and validated using experimental data. The capabilities of this new mirror array will enable, or significantly improve, the performance of a variety of high-impact optical technologies such as advanced optical switches, spatial-light modulators, displays, and laser steering or scanning devices.

  20. A High-Speed Large-Range Tip-Tilt-Piston Micromirror Array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hopkins, Jonathan B.; Panas, Robert M.; Song, Yuanping

    This work introduces the design of a high fill-factor (>99%) micromirror array (MMA) that consists of 1 mm² hexagonal mirrors, which are expected to each independently achieve continuous, closed-loop control of three degrees of freedom (DOFs)—tip, tilt, and piston—over large ranges (>±10° rotation and >±30 μm translation) at high speeds (~45 kHz for a 1° amplitude of rotational oscillation). The flexure topology of this array is designed using the Freedom, Actuation, and Constraint Topologies (FACT) synthesis approach, which utilizes geometric shapes to help designers rapidly consider every flexure topology that best achieves a desired set of DOFs driven by decoupled actuators. The geometry of this array's comb-drive actuators is optimized in conjunction with the geometry of the system's flexures using a novel approach. The analytical models underlying this approach are verified using finite element analysis (FEA) and validated using experimental data. The capabilities of this new mirror array will enable, or significantly improve, the performance of a variety of high-impact optical technologies such as advanced optical switches, spatial-light modulators, displays, and laser steering or scanning devices.

  1. Acoustic measurements from a rotor blade-vortex interaction noise experiment in the German-Dutch Wind Tunnel (DNW)

    NASA Technical Reports Server (NTRS)

    Martin, Ruth M.; Splettstoesser, W. R.; Elliott, J. W.; Schultz, K.-J.

    1988-01-01

    Acoustic data are presented from a 40 percent scale model of the 4-bladed BO-105 helicopter main rotor, measured in the large European aeroacoustic wind tunnel, the DNW. Rotor blade-vortex interaction (BVI) noise data in the low speed flight range were acquired using a traversing in-flow microphone array. The experimental apparatus, testing procedures, calibration results, and experimental objectives are fully described. A large representative set of averaged acoustic signals is presented.

  2. Development and evaluation of the first high-throughput SNP array for common carp (Cyprinus carpio)

    PubMed Central

    2014-01-01

    Background A large number of single nucleotide polymorphisms (SNPs) have been identified in common carp (Cyprinus carpio) but, as yet, no high-throughput genotyping platform is available for this species. C. carpio is an important aquaculture species that accounts for nearly 14% of freshwater aquaculture production worldwide. We have developed an array for C. carpio with 250,000 SNPs and evaluated its performance using samples from various strains of C. carpio. Results The SNPs used on the array were selected from two resources: the transcribed sequences from RNA-seq data of four strains of C. carpio, and the genome re-sequencing data of five strains of C. carpio. The 250,000 SNPs on the resulting array are distributed evenly across the reference C. carpio genome with an average spacing of 6.6 kb. To evaluate the SNP array, 1,072 C. carpio samples were collected and tested. Of the 250,000 SNPs on the array, 185,150 (74.06%) were found to be polymorphic sites. Genotyping accuracy was checked using genotyping data from a group of full-siblings and their parents, and over 99.8% of the qualified SNPs were found to be reliable. Analysis of the linkage disequilibrium on all samples and on three domestic C. carpio strains revealed that the latter had longer haplotype blocks. We also evaluated our SNP array on 80 samples from eight species related to C. carpio, yielding from 53,526 to 71,984 polymorphic SNPs per species. An identity-by-state analysis divided all the samples into three clusters; most of the C. carpio strains formed the largest cluster. Conclusions The Carp SNP array described here is the first high-throughput genotyping platform for C. carpio. Our evaluation of this array indicates that it will be valuable for farmed carp and for genetic and population biology studies in C. carpio and related species. PMID:24762296

  3. Development and evaluation of the first high-throughput SNP array for common carp (Cyprinus carpio).

    PubMed

    Xu, Jian; Zhao, Zixia; Zhang, Xiaofeng; Zheng, Xianhu; Li, Jiongtang; Jiang, Yanliang; Kuang, Youyi; Zhang, Yan; Feng, Jianxin; Li, Chuangju; Yu, Juhua; Li, Qiang; Zhu, Yuanyuan; Liu, Yuanyuan; Xu, Peng; Sun, Xiaowen

    2014-04-24

    A large number of single nucleotide polymorphisms (SNPs) have been identified in common carp (Cyprinus carpio) but, as yet, no high-throughput genotyping platform is available for this species. C. carpio is an important aquaculture species that accounts for nearly 14% of freshwater aquaculture production worldwide. We have developed an array for C. carpio with 250,000 SNPs and evaluated its performance using samples from various strains of C. carpio. The SNPs used on the array were selected from two resources: the transcribed sequences from RNA-seq data of four strains of C. carpio, and the genome re-sequencing data of five strains of C. carpio. The 250,000 SNPs on the resulting array are distributed evenly across the reference C. carpio genome with an average spacing of 6.6 kb. To evaluate the SNP array, 1,072 C. carpio samples were collected and tested. Of the 250,000 SNPs on the array, 185,150 (74.06%) were found to be polymorphic sites. Genotyping accuracy was checked using genotyping data from a group of full-siblings and their parents, and over 99.8% of the qualified SNPs were found to be reliable. Analysis of the linkage disequilibrium on all samples and on three domestic C. carpio strains revealed that the latter had longer haplotype blocks. We also evaluated our SNP array on 80 samples from eight species related to C. carpio, obtaining from 53,526 to 71,984 polymorphic SNPs. An identity by state analysis divided all the samples into three clusters; most of the C. carpio strains formed the largest cluster. The Carp SNP array described here is the first high-throughput genotyping platform for C. carpio. Our evaluation of this array indicates that it will be valuable for farmed carp and for genetic and population biology studies in C. carpio and related species.

  4. Solar array study for solar electric propulsion spacecraft for the Encke rendezvous mission

    NASA Technical Reports Server (NTRS)

    Sequeira, E. A.; Patterson, R. E.

    1974-01-01

    The work performed on the design, analysis, and performance of a 20 kW rollup solar array capable of meeting the design requirements of a solar electric spacecraft for the 1980 Encke rendezvous mission is described. To meet the high power requirements of the proposed electric propulsion mission, solar arrays on the order of 186.6 sq m were defined. Because of the large weights involved with arrays of this size, consideration of array configurations is limited to lightweight, large-area concepts with maximum power-to-weight ratios. Items covered include solar array requirements and constraints, array concept selection and rationale, structural and electrical design considerations, and reliability considerations.

  5. Generalized Phenomenological Cyclic Stress-Strain-Strength Characterization of Granular Media.

    DTIC Science & Technology

    1984-09-02

    ...could be fitted to a comprehensive data set. Unfortunately, such equipment is not available at present, and most researchers still rely on the...notably, Lade and Duncan (1975), using a comprehensive series of test data obtained from a true triaxial device (Lade, 1973), have suggested that failure...2. Shear Strain: low (prior to failure), indeterminate (at failure); 3. Deformation: small, large; 4. Void Ratio (e): any e, e critical; 5. Grain...

  6. An array processing system for lunar geochemical and geophysical data

    NASA Technical Reports Server (NTRS)

    Eliason, E. M.; Soderblom, L. A.

    1977-01-01

    A computerized array processing system has been developed to reduce, analyze, display, and correlate a large number of orbital and earth-based geochemical, geophysical, and geological measurements of the moon on a global scale. The system supports the activities of a consortium of about 30 lunar scientists involved in data synthesis studies. The system was modeled after standard digital image-processing techniques but differs in that processing is performed with floating point precision rather than integer precision. Because of flexibility in floating-point image processing, a series of techniques that are impossible or cumbersome in conventional integer processing were developed to perform optimum interpolation and smoothing of data. Recently color maps of about 25 lunar geophysical and geochemical variables have been generated.
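
    One advantage the record attributes to floating-point processing is that missing samples can be carried as NaN and simply drop out of smoothing windows, which is awkward in integer pipelines. A minimal Python sketch of that idea (illustrative only; the consortium's actual optimum interpolation scheme is not reproduced here):

```python
# Gap-aware boxcar smoothing: missing samples are NaN and drop out of
# each window's mean. Illustrative only, not the consortium's scheme.
import numpy as np

def nan_boxcar(img, k=3):
    """k x k mean filter over a float image, ignoring NaN samples."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, constant_values=np.nan)
    out = np.empty(img.shape, float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.nanmean(padded[i:i + k, j:j + k])
    return out
```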

  7. MASS SPECTROMETRY-BASED METABOLOMICS

    PubMed Central

    Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.

    2007-01-01

    This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and mass spectrometry specifically has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475

  8. Enhanced identification and biological validation of differential gene expression via Illumina whole-genome expression arrays through the use of the model-based background correction methodology

    PubMed Central

    Ding, Liang-Hao; Xie, Yang; Park, Seongmi; Xiao, Guanghua; Story, Michael D.

    2008-01-01

    Despite the tremendous growth of microarray usage in scientific studies, there is a lack of standards for background correction methodologies, especially in single-color microarray platforms. Traditional background subtraction methods often generate negative signals and thus cause large amounts of data loss. Hence, some researchers prefer to avoid background corrections, which typically result in the underestimation of differential expression. Here, by utilizing nonspecific negative control features integrated into Illumina whole genome expression arrays, we have developed a method of model-based background correction for BeadArrays (MBCB). We compared the MBCB with a method adapted from the Affymetrix robust multi-array analysis algorithm and with no background subtraction, using a mouse acute myeloid leukemia (AML) dataset. We demonstrated that differential expression ratios obtained by using the MBCB had the best correlation with quantitative RT–PCR. MBCB also achieved better sensitivity in detecting differentially expressed genes with biological significance. For example, we demonstrated that the differential regulation of Tnfr2, Ikk and NF-kappaB, the death receptor pathway, in the AML samples, could only be detected by using data after MBCB implementation. We conclude that MBCB is a robust background correction method that will lead to more precise determination of gene expression and better biological interpretation of Illumina BeadArray data. PMID:18450815
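
    A minimal sketch of the kind of correction MBCB performs, assuming the common normal (background) plus exponential (signal) convolution model with the background moments taken from the negative-control beads; the authors' actual parameter estimators differ in detail, and `alpha` (the mean of the exponential signal component) is assumed given here rather than fitted.

```python
# Simplified normal-exponential background correction in the spirit of
# MBCB (not the authors' exact estimator). Background ~ N(mu, sigma^2),
# estimated from negative controls; signal ~ Exp(1/alpha). The
# posterior-mean corrected intensity is strictly positive, avoiding the
# data loss caused by plain background subtraction.
import numpy as np
from scipy.stats import norm

def normexp_correct(x, neg_controls, alpha):
    mu = neg_controls.mean()
    sigma = neg_controls.std(ddof=1)
    mu_sx = x - mu - sigma**2 / alpha
    # E[signal | observed] under the normal + exponential model
    return mu_sx + sigma * norm.pdf(mu_sx / sigma) / norm.cdf(mu_sx / sigma)
```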

  9. Optical Communications With A Geiger Mode APD Array

    DTIC Science & Technology

    2016-02-09

    ...spurious fires from numerous sources, including crosstalk from other detectors in the same array. Additionally, after a successful detection, the...be combined into arrays with large numbers of detectors, allowing for scaling of dynamic range with relatively little overhead on space and power...overall higher rate of dark counts than a single detector, this is more than compensated for by the extra detectors. A sufficiently large APD array could...

  10. Deep 3 GHz number counts from a P(D) fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Vernstrom, T.; Scott, Douglas; Wall, J. V.; Condon, J. J.; Cotton, W. D.; Fomalont, E. B.; Kellermann, K. I.; Miller, N.; Perley, R. A.

    2014-05-01

    Radio source counts constrain galaxy populations and evolution, as well as the global star formation history. However, there is considerable disagreement among the published 1.4-GHz source counts below 100 μJy. Here, we present a statistical method for estimating the μJy and even sub-μJy source count using new deep wide-band 3-GHz data in the Lockman Hole from the Karl G. Jansky Very Large Array. We analysed the confusion amplitude distribution P(D), which provides a fresh approach in the form of a more robust model, with a comprehensive error analysis. We tested this method on a large-scale simulation, incorporating clustering and finite source sizes. We discuss in detail our statistical methods for fitting using Markov chain Monte Carlo, handling correlations, and systematic errors from the use of wide-band radio interferometric data. We demonstrated that the source count can be constrained down to 50 nJy, a factor of 20 below the rms confusion. We found the differential source count near 10 μJy to have a slope of -1.7, decreasing to about -1.4 at fainter flux densities. At 3 GHz, the rms confusion in an 8-arcsec full width at half-maximum beam is ~1.2 μJy beam⁻¹, and the radio background temperature is ~14 mK. Our counts are broadly consistent with published evolutionary models. With these results, we were also able to constrain the peak of the Euclidean normalized differential source count of any possible new radio populations that would contribute to the cosmic radio background down to 50 nJy.
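
    To illustrate the forward model behind a P(D) analysis, here is a toy Monte Carlo in Python: each pixel receives a Poisson number of faint sources drawn from an assumed power-law differential count, plus Gaussian noise, and the histogram of the resulting pixel values is the P(D) distribution compared with the map. All parameter values are illustrative placeholders, not the paper's fitted values.

```python
# Toy Monte Carlo P(D): pixels receive a Poisson number of faint sources
# whose fluxes follow an assumed power law dN/dS ~ S**-gamma, plus
# Gaussian instrument noise; the histogram of pixel values is P(D).
import numpy as np

rng = np.random.default_rng(1)
gamma, s_min, s_max = 1.7, 50e-9, 1e-4     # slope and flux limits (Jy)
mean_per_beam, sigma_n = 5.0, 1.2e-6       # sources/beam, noise (Jy/beam)

npix = 200_000
n_src = rng.poisson(mean_per_beam, npix)   # source count in each beam
a = 1.0 - gamma                            # invert truncated power-law CDF
u = rng.random(n_src.sum())
flux = (s_min**a + u * (s_max**a - s_min**a)) ** (1.0 / a)

pix = np.zeros(npix)
np.add.at(pix, np.repeat(np.arange(npix), n_src), flux)
pix += rng.normal(0.0, sigma_n, npix)      # add instrument noise
pd_hist, edges = np.histogram(pix, bins=200)   # the P(D) distribution
```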

  11. Importance of large-scale bathymetry features on 2011 Tohoku tsunami waveforms through comparison of simulations with the spatially dense ALBACORE OBS array data

    NASA Astrophysics Data System (ADS)

    Kohler, M. D.; Lynett, P. J.; Legg, M. R.; Weeraratne, D. S.

    2012-12-01

    In March 2011, a deployment of ocean bottom seismometers (OBSs) off the coast of Southern California recorded the tsunami resulting from the Mw=9.0 Tohoku, Japan earthquake with very high spatial resolution. The ALBACORE (Asthenosphere and Lithosphere Broadband Architecture from the California Offshore Region Experiment) OBS array spanned a region that was 150 km north-south by 400 km east-west, extending into deep open ocean west of the Patton escarpment. In that array, 22 stations with a spacing of 75 km had differential pressure gauges (DPGs) that recorded water pressure waveform data continuously at 50 samples/second. The DPG tsunami records across the entire array show multiple large-amplitude, coherent phases arriving one hour to more than 36 hours after the initial tsunami phase. To determine the source of the large-amplitude coherent phases, gravity ocean wave propagation calculations were carried out for the Pacific Ocean. Simulated pressure waveforms were compared with data for the ALBACORE stations, as well as for the NOAA DART buoys. The linear, non-dispersive shallow-water simulations include bottom frictional effects, and use the USGS NEIC Tohoku slip model and ETOPO2 (two-minute spatial resolution) bathymetry. The predicted travel times of the initial arrivals are found to be less than 1% different from the observed travel times in the southern California ALBACORE DPG data. In order to gauge the effects of large-scale features in Pacific Ocean bathymetry, several large-scale features were individually removed, and simulations were carried out for the modified bathymetry. The removed features include the Emperor Seamount chain, Hawaiian Islands, Oceania, French Polynesia, and the South American coastline. The results show that the removal of these features has an effect on the arrival time of the phases that depends on the feature proximity to the direct path, but their removal does not have a significant effect on the frequency content or phase amplitudes of the waves. The direct paths recorded in Southern California indicate that the tsunami wave did not interfere with distant above-water features such as the Aleutians, but was diffracted around Point Conception in the California coastline and around southern California islands. It is more likely that the scattered phases are the result of wave reflections off the western Japan coastline, or interactions with local structures such as the central-southern California coastline, plateaus beneath the Channel Islands, and the Patton Escarpment.
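
    The quoted sub-1% travel-time agreement can be sanity-checked with the linear, non-dispersive shallow-water phase speed c = sqrt(g h) used by the simulations. A small Python sketch with placeholder depths (not ETOPO2 bathymetry) along a trans-Pacific path:

```python
# Arrival-time estimate from the shallow-water speed c = sqrt(g*h),
# integrated along a path with varying depth. Depths are placeholders.
import numpy as np

g = 9.81
path_km = np.array([0, 1000, 3000, 6000, 8500])     # cumulative distance
depth_m = np.array([4000, 5500, 5000, 4500, 1000])  # depth at each node

seg_len = np.diff(path_km) * 1e3                    # segment lengths (m)
seg_h = 0.5 * (depth_m[:-1] + depth_m[1:])          # mean segment depth
t = np.sum(seg_len / np.sqrt(g * seg_h))
print(f"predicted travel time: {t / 3600:.1f} h")   # ~12 h trans-Pacific
```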

  12. Ten Steps to Conducting a Large, Multi-Site, Longitudinal Investigation of Language and Reading in Young Children

    PubMed Central

    Farquharson, Kelly; Murphy, Kimberly A.

    2016-01-01

    Purpose: This paper describes methodological procedures involving execution of a large-scale, multi-site longitudinal study of language and reading comprehension in young children. Researchers in the Language and Reading Research Consortium (LARRC) developed and implemented these procedures to ensure data integrity across multiple sites, schools, and grades. Specifically, major features of our approach, as well as lessons learned, are summarized in 10 steps essential for successful completion of a large-scale longitudinal investigation in early grades. Method: Over 5 years, children in preschool through third grade were administered a battery of 35 higher- and lower-level language, listening, and reading comprehension measures (RCM). Data were collected from children, their teachers, and their parents/guardians at four sites across the United States. Substantial and rigorous effort was aimed toward maintaining consistency in processes and data management across sites for children, assessors, and staff. Conclusion: With appropriate planning, flexibility, and communication strategies in place, LARRC developed and executed a successful multi-site longitudinal research study that will meet its goal of investigating the contribution and role of language skills in the development of children's listening and reading comprehension. Through dissemination of our design strategies and lessons learned, research teams embarking on similar endeavors can be better equipped to anticipate the challenges. PMID:27064308

  13. [Contrast of Z-Pinch X-Ray Yield Measure Technique].

    PubMed

    Li, Mo; Wang, Liang-ping; Sheng, Liang; Lu, Yi

    2015-03-01

    Resistive bolometers and scintillator detection systems are the two main Z-pinch X-ray yield measurement techniques, and they are based on different diagnostic principles. Contrasting the results from the two methods can help increase the precision of X-ray yield measurement. Experiments with different load materials and shapes were carried out on the "QiangGuang-I" facility. For Al wire arrays, X-ray yields measured by the two techniques were largely consistent. However, for insulating-coated W wire arrays, X-ray yields taken from the bolometer changed with load parameters while data from the scintillator detection system hardly changed. Simulation and analysis lead to the following conclusions: (1) The scintillator detection system is much more sensitive to low-energy X-ray photons and its spectral response is wider than that of the resistive bolometer; thus, results from the former method are always larger than those from the latter. (2) The responses of the two systems are both flat to Al plasma radiation; thus, their results are consistent for Al wire array loads. (3) Radiation from planar W wire arrays is mainly composed of sub-keV soft X-rays. X-ray yields measured by the bolometer are expected to be accurate because the nickel foil can absorb almost all of the soft X-rays. (4) By contrast, using planar W wire arrays, data from the scintillator detection system hardly change with load parameters. A possible explanation is that as the distance between wires increases, the plasma temperature at stagnation falls and the spectrum moves toward the soft X-ray region. The scintillator is much more sensitive to soft X-rays below 200 eV; thus, although the total X-ray yield decreases with a large-diameter load, the signal from the scintillator detection system is almost the same. (5) Both techniques are affected by electron beams produced by the loads.

  14. MO-F-CAMPUS-J-03: Development of a Human Brain PET for On-Line Proton Beam-Range Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shao, Yiping

    Purpose: To develop a prototype PET for verifying proton beam-range before each fractionated therapy that will enable on-line re-planning proton therapy. Methods: The latest “edge-less” silicon photomultiplier (SiPM) arrays and customized ASIC readout electronics were used to develop PET detectors with depth-of-interaction (DOI) measurement capability. Each detector consists of one LYSO array with each end coupled to a SiPM array. Multiple detectors can be seamlessly tiled together to form a large detector panel. Detectors with 1.5×1.5 and 2.0×2.0 mm crystals at 20 or 30 mm lengths were studied. Readout of individual SiPMs or signal multiplexing was used to transfer 3D interaction position-coded analog signals through flexible-print-circuit cables or a PCB board to dedicated ASIC front-end electronics, which output digital timing pulses that encode the interaction information. These digital pulses can be transferred through standard LVDS cables to, and decoded by, an FPGA-based data acquisition system for coincidence-event acquisition and data transfer. The modular detector and scalable electronics/data acquisition will enable flexible PET system configuration for different imaging geometries. Results: Initial detector performance measurements show excellent crystal identification even with 30 mm long crystals, ∼18% energy and 2.8 ns timing resolutions, and around 2–3 mm DOI resolution. A small prototype PET scanner with one detector ring has been built and evaluated, validating the technology and design. A large-size detector panel has been fabricated by scaling up from modular detectors. Different designs of resistor- and capacitor-based signal multiplexing boards were tested and selected based on optimal crystal identification and timing performance. Stackable readout electronics boards and FPGA-based data acquisition boards were developed and tested. A brain PET is under construction. Conclusion: Technology for a large-size DOI detector based on SiPM arrays and advanced readout has been developed. PET imaging performance and initial phantom studies of on-line proton beam-range measurement will be conducted and reported. NIH grant R21CA187717; Cancer Prevention and Research Institute of Texas grant RP120326.

  15. Arrays vs. single telescopes

    NASA Astrophysics Data System (ADS)

    Johnson, H. L.

    The question of the relative efficiencies of telescope arrays versus an equivalent mirror-area very large telescope is re-examined and summarized. Four separate investigations by Bowen, Johnson and Richards, Code, and Disney all came to the same conclusion: that an array of telescopes is superior, both scientifically and economically, to a single very large telescope. The costs of recently completed telescopes are compared. The costs of arrays of telescopes are shown to be significantly lower than that of a single, very large telescope, with the further advantage that because existing, proven, designs can be used, no engineering 'break-throughs' are needed.

  16. How to Get Data from NOAA Environmental Satellites: An Overview of Operations, Products, Access and Archive

    NASA Astrophysics Data System (ADS)

    Donoho, N.; Graumann, A.; McNamara, D. P.

    2015-12-01

    In this presentation we will highlight access and availability of NOAA satellite data for near real time (NRT) and retrospective product users. The presentation includes an overview of the current fleet of NOAA satellites and methods of data distribution and access to hundreds of imagery and products offered by the Environmental Satellite Processing Center (ESPC) and the Comprehensive Large Array-data Stewardship System (CLASS). In particular, emphasis on the various levels of services for current and past observations will be presented. The National Environmental Satellite, Data, and Information Service (NESDIS) is dedicated to providing timely access to global environmental data from satellites and other sources. In special cases, users are authorized direct access to NESDIS data distribution systems for environmental satellite data and products. Other means of access include publicly available distribution services such as the Global Telecommunication System (GTS), NOAA satellite direct broadcast services and various NOAA websites and ftp servers, including CLASS. CLASS is NOAA's information technology system designed to support long-term, secure preservation and standards-based access to environmental data collections and information. The National Centers for Environmental Information (NCEI) is responsible for the ingest, quality control, stewardship, archival and access to data and science information. This work will also show the latest technology improvements, enterprise approach and future plans for distribution of exponentially increasing data volumes from future NOAA missions. A primer on access to NOAA operational satellite products and services is available at http://www.ospo.noaa.gov/Organization/About/access.html. Access to post-operational satellite data and assorted products is available at http://www.class.noaa.gov

  17. Dual-color Proteomic Profiling of Complex Samples with a Microarray of 810 Cancer-related Antibodies*

    PubMed Central

    Schröder, Christoph; Jacob, Anette; Tonack, Sarah; Radon, Tomasz P.; Sill, Martin; Zucknick, Manuela; Rüffer, Sven; Costello, Eithne; Neoptolemos, John P.; Crnogorac-Jurcevic, Tatjana; Bauer, Andrea; Fellenberg, Kurt; Hoheisel, Jörg D.

    2010-01-01

    Antibody microarrays have the potential to enable comprehensive proteomic analysis of small amounts of sample material. Here, protocols are presented for the production, quality assessment, and reproducible application of antibody microarrays in a two-color mode with an array of 1,800 features, representing 810 antibodies that were directed at 741 cancer-related proteins. In addition to measures of array quality, we implemented indicators for the accuracy and significance of dual-color detection. Dual-color measurements outperform a single-color approach concerning assay reproducibility and discriminative power. In the analysis of serum samples, depletion of high-abundance proteins did not improve technical assay quality. On the contrary, depletion introduced a strong bias in protein representation. In an initial study, we demonstrated the applicability of the protocols to proteins derived from urine samples. We identified differences between urine samples from pancreatic cancer patients and healthy subjects and between sexes. This study demonstrates that biomedically relevant data can be produced. As demonstrated by the thorough quality analysis, the dual-color antibody array approach proved to be competitive with other proteomic techniques and comparable in performance to transcriptional microarray analyses. PMID:20164060

  18. High channel count microphone array accurately and precisely localizes ultrasonic signals from freely-moving mice.

    PubMed

    Warren, Megan R; Sangiamo, Daniel T; Neunuebel, Joshua P

    2018-03-01

    An integral component in the assessment of vocal behavior in groups of freely interacting animals is the ability to determine which animal is producing each vocal signal. This process is facilitated by using microphone arrays with multiple channels. Here, we made important refinements to a state-of-the-art microphone array based system used to localize vocal signals produced by freely interacting laboratory mice. Key changes to the system included increasing the number of microphones as well as refining the methodology for localizing and assigning vocal signals to individual mice. We systematically demonstrate that the improvements in the methodology for localizing mouse vocal signals led to an increase in the number of signals detected as well as the number of signals accurately assigned to an animal. These changes facilitated the acquisition of larger and more comprehensive data sets that better represent the vocal activity within an experiment. Furthermore, this system will allow more thorough analyses of the role that vocal signals play in social communication. We expect that such advances will broaden our understanding of social communication deficits in mouse models of neurological disorders.
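
    The core operation in microphone-array localization of this kind is time-difference-of-arrival estimation between channel pairs. A minimal Python sketch using the standard GCC-PHAT estimator; the authors' full pipeline (signal detection, channel selection, and assignment to mice) is not reproduced, and the function name is ours:

```python
# Time-difference-of-arrival between two microphone channels via the
# standard GCC-PHAT estimator; a building block for array localization,
# not the authors' full pipeline. `fs` is the sampling rate in Hz.
import numpy as np

def gcc_phat(sig, ref, fs):
    """Return the delay (seconds) of `sig` relative to `ref`."""
    n = len(sig) + len(ref)
    R = np.fft.rfft(sig, n=n) * np.conj(np.fft.rfft(ref, n=n))
    cc = np.fft.irfft(R / (np.abs(R) + 1e-15), n=n)  # PHAT weighting
    max_shift = n // 2
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    return (np.argmax(np.abs(cc)) - max_shift) / fs
```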

  19. The data acquisition system for the ANTARES neutrino telescope

    NASA Astrophysics Data System (ADS)

    Aguilar, J. A.; Albert, A.; Ameli, F.; Anghinolfi, M.; Anton, G.; Anvar, S.; Aslanides, E.; Aubert, J.-J.; Barbarito, E.; Basa, S.; Battaglieri, M.; Becherini, Y.; Bellotti, R.; Beltramelli, J.; Bertin, V.; Bigi, A.; Billault, M.; Blaes, R.; de Botton, N.; Bouwhuis, M. C.; Bradbury, S. M.; Bruijn, R.; Brunner, J.; Burgio, G. F.; Busto, J.; Cafagna, F.; Caillat, L.; Calzas, A.; Capone, A.; Caponetto, L.; Carmona, E.; Carr, J.; Cartwright, S. L.; Castel, D.; Castorina, E.; Cavasinni, V.; Cecchini, S.; Ceres, A.; Charvis, P.; Chauchot, P.; Chiarusi, T.; Circella, M.; Colnard, C.; Compère, C.; Coniglione, R.; Cottini, N.; Coyle, P.; Cuneo, S.; Cussatlegras, A.-S.; Damy, G.; van Dantzig, R.; de Marzo, C.; Dekeyser, I.; Delagnes, E.; Denans, D.; Deschamps, A.; Dessages-Ardellier, F.; Destelle, J.-J.; Dinkespieler, B.; Distefano, C.; Donzaud, C.; Drogou, J.-F.; Druillole, F.; Durand, D.; Ernenwein, J.-P.; Escoffier, S.; Falchini, E.; Favard, S.; Feinstein, F.; Ferry, S.; Festy, D.; Fiorello, C.; Flaminio, V.; Galeotti, S.; Gallone, J.-M.; Giacomelli, G.; Girard, N.; Gojak, C.; Goret, Ph.; Graf, K.; Hallewell, G.; Harakeh, M. N.; Hartmann, B.; Heijboer, A.; Heine, E.; Hello, Y.; Hernández-Rey, J. J.; Hößl, J.; Hoffman, C.; Hogenbirk, J.; Hubbard, J. R.; Jaquet, M.; Jaspers, M.; de Jong, M.; Jouvenot, F.; Kalantar-Nayestanaki, N.; Kappes, A.; Karg, T.; Karkar, S.; Katz, U.; Keller, P.; Kok, H.; Kooijman, P.; Kopper, C.; Korolkova, E. V.; Kouchner, A.; Kretschmer, W.; Kruijer, A.; Kuch, S.; Kudryavstev, V. A.; Lachartre, D.; Lafoux, H.; Lagier, P.; Lahmann, R.; Lamanna, G.; Lamare, P.; Languillat, J. C.; Laschinsky, H.; Le Guen, Y.; Le Provost, H.; Le van Suu, A.; Legou, T.; Lim, G.; Lo Nigro, L.; Lo Presti, D.; Loehner, H.; Loucatos, S.; Louis, F.; Lucarelli, F.; Lyashuk, V.; Marcelin, M.; Margiotta, A.; Masullo, R.; Mazéas, F.; Mazure, A.; McMillan, J. E.; Megna, R.; Melissas, M.; Migneco, E.; Milovanovic, A.; Mongelli, M.; Montaruli, T.; Morganti, M.; Moscoso, L.; Musumeci, M.; Naumann, C.; Naumann-Godo, M.; Niess, V.; Olivetto, C.; Ostasch, R.; Palanque-Delabrouille, N.; Payre, P.; Peek, H.; Petta, C.; Piattelli, P.; Pineau, J.-P.; Poinsignon, J.; Popa, V.; Pradier, T.; Racca, C.; Randazzo, N.; van Randwijk, J.; Real, D.; van Rens, B.; Réthoré, F.; Rewiersma, P.; Riccobene, G.; Rigaud, V.; Ripani, M.; Roca, V.; Roda, C.; Rolin, J. F.; Romita, M.; Rose, H. J.; Rostovtsev, A.; Roux, J.; Ruppi, M.; Russo, G. V.; Salesa, F.; Salomon, K.; Sapienza, P.; Schmitt, F.; Schuller, J.-P.; Shanidze, R.; Sokalski, I.; Spona, T.; Spurio, M.; van der Steenhoven, G.; Stolarczyk, T.; Streeb, K.; Stubert, D.; Sulak, L.; Taiuti, M.; Tamburini, C.; Tao, C.; Terreni, G.; Thompson, L. F.; Valdy, P.; Valente, V.; Vallage, B.; Venekamp, G.; Verlaat, B.; Vernin, P.; de Vita, R.; de Vries, G.; van Wijk, R.; de Witt Huberts, P.; Wobbe, G.; de Wolf, E.; Yao, A.-F.; Zaborov, D.; Zaccone, H.; Zornoza, J. D.; Zúñiga, J.

    2007-01-01

    The ANTARES neutrino telescope is being constructed in the Mediterranean Sea. It consists of a large three-dimensional array of photo-multiplier tubes. The data acquisition system of the detector takes care of the digitisation of the photo-multiplier tube signals, data transport, data filtering, and data storage. The detector is operated using a control program interfaced with all elements. The design and the implementation of the data acquisition system are described.

  20. Study of Ultra-High Energy Cosmic Ray composition using Telescope Array's Middle Drum detector and surface array in hybrid mode

    NASA Astrophysics Data System (ADS)

    Abbasi, R. U.; Abe, M.; Abu-Zayyad, T.; Allen, M.; Anderson, R.; Azuma, R.; Barcikowski, E.; Belz, J. W.; Bergman, D. R.; Blake, S. A.; Cady, R.; Chae, M. J.; Cheon, B. G.; Chiba, J.; Chikawa, M.; Cho, W. R.; Fujii, T.; Fukushima, M.; Goto, T.; Hanlon, W.; Hayashi, Y.; Hayashida, N.; Hibino, K.; Honda, K.; Ikeda, D.; Inoue, N.; Ishii, T.; Ishimori, R.; Ito, H.; Ivanov, D.; Jui, C. C. H.; Kadota, K.; Kakimoto, F.; Kalashev, O.; Kasahara, K.; Kawai, H.; Kawakami, S.; Kawana, S.; Kawata, K.; Kido, E.; Kim, H. B.; Kim, J. H.; Kim, J. H.; Kitamura, S.; Kitamura, Y.; Kuzmin, V.; Kwon, Y. J.; Lan, J.; Lim, S. I.; Lundquist, J. P.; Machida, K.; Martens, K.; Matsuda, T.; Matsuyama, T.; Matthews, J. N.; Minamino, M.; Mukai, Y.; Myers, I.; Nagasawa, K.; Nagataki, S.; Nakamura, T.; Nonaka, T.; Nozato, A.; Ogio, S.; Ogura, J.; Ohnishi, M.; Ohoka, H.; Oki, K.; Okuda, T.; Ono, M.; Oshima, A.; Ozawa, S.; Park, I. H.; Pshirkov, M. S.; Rodriguez, D. C.; Rubtsov, G.; Ryu, D.; Sagawa, H.; Sakurai, N.; Sampson, A. L.; Scott, L. M.; Shah, P. D.; Shibata, F.; Shibata, T.; Shimodaira, H.; Shin, B. K.; Shin, H. S.; Smith, J. D.; Sokolsky, P.; Springer, R. W.; Stokes, B. T.; Stratton, S. R.; Stroman, T.; Suzawa, T.; Takamura, M.; Takeda, M.; Takeishi, R.; Taketa, A.; Takita, M.; Tameda, Y.; Tanaka, H.; Tanaka, K.; Tanaka, M.; Thomas, S. B.; Thomson, G. B.; Tinyakov, P.; Tkachev, I.; Tokuno, H.; Tomida, T.; Troitsky, S.; Tsunesada, Y.; Tsutsumi, K.; Uchihori, Y.; Udo, S.; Urban, F.; Vasiloff, G.; Wong, T.; Yamane, R.; Yamaoka, H.; Yamazaki, K.; Yang, J.; Yashiro, K.; Yoneda, Y.; Yoshida, S.; Yoshii, H.; Zollinger, R.; Zundel, Z.

    2015-04-01

    Previous measurements of the composition of Ultra-High Energy Cosmic Rays (UHECRs) made by the High Resolution Fly's Eye (HiRes) and Pierre Auger Observatory (PAO) are seemingly contradictory, but utilize different detection methods, as HiRes was a stereo detector and PAO is a hybrid detector. The five year Telescope Array (TA) Middle Drum hybrid composition measurement is similar in some, but not all, respects in methodology to PAO, and good agreement is evident between data and a light, largely protonic, composition when comparing the measurements to predictions obtained with the QGSJetII-03 and QGSJet-01c models. These models are also in agreement with previous HiRes stereo measurements, confirming the equivalence of the stereo and hybrid methods. The data is incompatible with a pure iron composition, for all models examined, over the available range of energies. The elongation rate and mean values of Xmax are in good agreement with Pierre Auger Observatory data. This analysis is presented using two methods: data cuts using simple geometrical variables and a new pattern recognition technique.

  1. Whole-exome sequencing for RH genotyping and alloimmunization risk in children with sickle cell anemia

    PubMed Central

    Flanagan, Jonathan M.; Vege, Sunitha; Luban, Naomi L. C.; Brown, R. Clark; Ware, Russell E.; Westhoff, Connie M.

    2017-01-01

    RH genes are highly polymorphic and encode the most complex of the 35 human blood group systems. This genetic diversity contributes to Rh alloimmunization in patients with sickle cell anemia (SCA) and is not avoided by serologic Rh-matched red cell transfusions. Standard serologic testing does not distinguish variant Rh antigens. Single nucleotide polymorphism (SNP)–based DNA arrays detect many RHD and RHCE variants, but the number of alleles tested is limited. We explored a next-generation sequencing (NGS) approach using whole-exome sequencing (WES) in 27 Rh alloimmunized and 27 matched non-alloimmunized patients with SCA who received chronic red cell transfusions and were enrolled in a multicenter study. We demonstrate that WES provides a comprehensive RH genotype, identifies SNPs not interrogated by DNA array, and accurately determines RHD zygosity. Among this multicenter cohort, we demonstrate an association between an altered RH genotype and Rh alloimmunization: 52% of Rh immunized vs 19% of non-immunized patients expressed variant Rh without co-expression of the conventional protein. Our findings suggest that RH allele variation in patients with SCA is clinically relevant, and NGS technology can offer a comprehensive alternative to targeted SNP-based testing. This is particularly relevant as NGS data becomes more widely available and could provide the means for reducing Rh alloimmunization in children with SCA. PMID:29296782

  2. Profiling of components of rhizoma et radix polygoni cuspidati by high-performance liquid chromatography with ultraviolet diode-array detector and ion trap/time-of-flight mass spectrometric detection.

    PubMed

    Fu, Jinfeng; Wang, Min; Guo, Huimin; Tian, Yuan; Zhang, Zunjian; Song, Rui

    2015-01-01

    Rhizoma et Radix Polygoni Cuspidati (Huzhang in Chinese, HZ) is a traditional medicinal plant in China. Many of the components of HZ have been shown to be bioactive, yet comprehensive chemical profiling of HZ has been difficult owing to the absence of an efficient separation system and sensitive detection methods. Our objective was to develop a simple and effective method to characterize the components in HZ and provide useful information for subsequent metabolic studies. The components in HZ aqueous extract were characterized by using high-performance liquid chromatography with UV diode-array detection (HPLC-DAD) and ion trap/time-of-flight mass spectrometric detection (HPLC-IT/TOF). Stilbenes, anthraquinones, gallates and tannins, naphthalenes, and some other compounds were identified and confirmed by diagnostic fragment ions with accurate mass measurements, characteristic fragmentation pathways, and relevant published literature. Among the 238 constituents detected in HZ, a total of 74 were identified unambiguously or tentatively, including 29 compounds reported for the first time in HZ. The identification and structure elucidation of these chemicals provide essential data for quality control and further in vivo metabolic studies of HZ. Key words: Polygonum cuspidatum, HPLC-DAD, HPLC-IT/TOF, qualitative analysis.

  3. Performance comparison of SNP detection tools with illumina exome sequencing data—an assessment using both family pedigree information and sample-matched SNP array data

    PubMed Central

    Yi, Ming; Zhao, Yongmei; Jia, Li; He, Mei; Kebebew, Electron; Stephens, Robert M.

    2014-01-01

    To apply exome-seq-derived variants in the clinical setting, there is an urgent need to identify the best variant caller(s) from a large collection of available options. We have used an Illumina exome-seq dataset as a benchmark, with two validation scenarios—family pedigree information and SNP array data for the same samples, permitting global high-throughput cross-validation, to evaluate the quality of SNP calls derived from several popular variant discovery tools from both the open-source and commercial communities using a set of designated quality metrics. To the best of our knowledge, this is the first large-scale performance comparison of exome-seq variant discovery tools using high-throughput validation with both Mendelian inheritance checking and SNP array data, which allows us to gain insights into the accuracy of SNP calling through such high-throughput validation in an unprecedented way, whereas the previously reported comparison studies have only assessed concordance of these tools without directly assessing the quality of the derived SNPs. More importantly, the main purpose of our study was to establish a reusable procedure that applies high-throughput validation to compare the quality of SNP discovery tools with a focus on exome-seq, which can be used to compare any forthcoming tool(s) of interest. PMID:24831545
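
    The pedigree-based half of such a validation reduces to a Mendelian-consistency test at each SNP. A minimal sketch, with genotypes coded as alternate-allele counts (0/1/2) and de novo mutations ignored:

```python
# Trio-based Mendelian consistency check; genotypes are alternate-allele
# counts (0, 1, 2). De novo mutations and genotyping errors both show up
# as inconsistencies, which is exactly what the validation counts.
def mendelian_consistent(father, mother, child):
    transmit = {0: {0}, 1: {0, 1}, 2: {1}}   # alleles a parent can pass on
    return any(a + b == child
               for a in transmit[father]
               for b in transmit[mother])

assert mendelian_consistent(1, 2, 2)         # 0/1 x 1/1 can give 1/1
assert not mendelian_consistent(1, 2, 0)     # ...but never 0/0
```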

  4. On DESTINY Science Instrument Electrical and Electronics Subsystem Framework

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Benford, Dominic J.; Lauer, Tod R.

    2009-01-01

    Future space missions are going to require large focal planes with many sensing arrays and hundreds of millions of pixels, all read out at high data rates. This will place unique demands on the electrical and electronics (EE) subsystem design, and it will be critically important to have high technology readiness level (TRL) EE concepts ready to support such missions. One such mission is the Joint Dark Energy Mission (JDEM), charged with making precise measurements of the expansion rate of the universe to reveal vital clues about the nature of dark energy - a hypothetical form of energy that permeates all of space and tends to increase the rate of the expansion. One of three JDEM concept studies - the Dark Energy Space Telescope (DESTINY) - was conducted in 2008 at NASA's Goddard Space Flight Center (GSFC) in Greenbelt, Maryland. This paper presents the EE subsystem framework which evolved from the DESTINY science instrument study. It describes the main challenges and implementation concepts related to the design of an EE subsystem featuring multiple focal planes populated with dozens of large arrays and millions of pixels. The focal planes are passively cooled to cryogenic temperatures (below 140 K). The sensor mosaic is controlled by a large number of Readout Integrated Circuits and Application Specific Integrated Circuits - the ROICs/ASICs - in near proximity to their sensor focal planes. The ASICs, in turn, are serviced by a set of "warm" EE subsystem boxes performing Field Programmable Gate Array (FPGA) based digital signal processing (DSP) computations of complex algorithms, such as the sampling-up-the-ramp (SUTR) algorithm, over large volumes of fast data streams. The SUTR boxes are supported by the Instrument Control/Command and Data Handling box (ICDH Primary and Backup boxes) for lossless data compression, command and low-volume telemetry handling, power conversion, and communications with the spacecraft. The paper outlines how the JDEM DESTINY concept instrument EE subsystem can be built now, with a design that is generally applicable to a wide variety of missions using large focal planes with large mosaics of sensors.
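
    The SUTR computation assigned to the FPGA/DSP boxes is, at its core, a per-pixel least-squares slope over the non-destructive reads. A minimal equal-weighted Python sketch (flight algorithms additionally handle saturation, cosmic-ray jumps, and correlated read noise):

```python
# Per-pixel least-squares slope over non-destructive reads ("sampling
# up the ramp"), equal-weighted.
import numpy as np

def sutr_slope(ramp, dt):
    """ramp: (n_reads, n_pix) counts; dt: seconds between reads.
    Returns per-pixel flux estimates in counts/s."""
    t = np.arange(ramp.shape[0]) * dt
    t_c = t - t.mean()
    y_c = ramp - ramp.mean(axis=0)
    return (t_c[:, None] * y_c).sum(axis=0) / (t_c ** 2).sum()
```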

  5. Techniques for extracting single-trial activity patterns from large-scale neural recordings

    PubMed Central

    Churchland, Mark M; Yu, Byron M; Sahani, Maneesh; Shenoy, Krishna V

    2008-01-01

    Summary Large, chronically-implanted arrays of microelectrodes are an increasingly common tool for recording from primate cortex, and can provide extracellular recordings from many (order of 100) neurons. While the desire for cortically-based motor prostheses has helped drive their development, such arrays also offer great potential to advance basic neuroscience research. Here we discuss the utility of array recording for the study of neural dynamics. Neural activity often has dynamics beyond that driven directly by the stimulus. While governed by those dynamics, neural responses may nevertheless unfold differently for nominally identical trials, rendering many traditional analysis methods ineffective. We review recent studies – some employing simultaneous recording, some not – indicating that such variability is indeed present both during movement generation, and during the preceding premotor computations. In such cases, large-scale simultaneous recordings have the potential to provide an unprecedented view of neural dynamics at the level of single trials. However, this enterprise will depend not only on techniques for simultaneous recording, but also on the use and further development of analysis techniques that can appropriately reduce the dimensionality of the data, and allow visualization of single-trial neural behavior. PMID:18093826

  6. Estimation of Solvation Quantities from Experimental Thermodynamic Data: Development of the Comprehensive CompSol Databank for Pure and Mixed Solutes

    NASA Astrophysics Data System (ADS)

    Moine, Edouard; Privat, Romain; Sirjean, Baptiste; Jaubert, Jean-Noël

    2017-09-01

    The Gibbs energy of solvation measures the affinity of a solute for its solvent and is thus a key property for the selection of an appropriate solvent for a chemical synthesis or a separation process. More fundamentally, Gibbs energies of solvation are choice data for developing and benchmarking molecular models predicting solvation effects. The Comprehensive Solvation—CompSol—database was developed with the ambition to propose very large sets of new experimental solvation chemical-potential, solvation entropy, and solvation enthalpy data of pure and mixed components, covering extended temperature ranges. For mixed compounds, the solvation quantities were generated in infinite-dilution conditions by combining experimental values of pure-component and binary-mixture thermodynamic properties. Three types of binary-mixture properties were considered: partition coefficients, activity coefficients at infinite dilution, and Henry's-law constants. A rigorous methodology was implemented with the aim to select data at appropriate conditions of temperature, pressure, and concentration for the estimation of solvation data. Finally, our comprehensive CompSol database contains 21 671 data associated with 1969 pure species and 70 062 data associated with 14 102 binary mixtures (including 760 solvation data related to the ionic-liquid class of solvents). On the basis of the very large amount of experimental data contained in the CompSol database, it is finally discussed how solvation energies are influenced by hydrogen-bonding association effects.
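
    As one concrete example of the kind of combination described, a Henry's-law constant can be converted to a solvation Gibbs energy; the mole-fraction convention below is a common one and is an assumption here, not necessarily the exact working definition adopted for CompSol.

```latex
% For the dissolution equilibrium of a gas at standard pressure p° into
% an ideal-dilute solution (mole-fraction standard state), Henry's law
% p = k_H x gives the solvation Gibbs energy directly:
\Delta_{\mathrm{solv}}G^{\circ}
  = -RT\,\ln K
  = -RT\,\ln\frac{p^{\circ}}{k_{H}}
  = RT\,\ln\frac{k_{H}}{p^{\circ}},
\qquad p^{\circ} = 1\ \text{bar}
```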

  7. The HI/OH/Recombination line survey of the inner Milky Way (THOR). Survey overview and data release 1

    NASA Astrophysics Data System (ADS)

    Beuther, H.; Bihr, S.; Rugel, M.; Johnston, K.; Wang, Y.; Walter, F.; Brunthaler, A.; Walsh, A. J.; Ott, J.; Stil, J.; Henning, Th.; Schierhuber, T.; Kainulainen, J.; Heyer, M.; Goldsmith, P. F.; Anderson, L. D.; Longmore, S. N.; Klessen, R. S.; Glover, S. C. O.; Urquhart, J. S.; Plume, R.; Ragan, S. E.; Schneider, N.; McClure-Griffiths, N. M.; Menten, K. M.; Smith, R.; Roy, N.; Shanahan, R.; Nguyen-Luong, Q.; Bigiel, F.

    2016-10-01

    Context. The past decade has witnessed a large number of Galactic plane surveys at angular resolutions below 20''. However, no comparable high-resolution survey exists at long radio wavelengths around 21 cm in line and continuum emission. Aims: We remedy this situation by studying the northern Galactic plane at 20'' resolution in emission of atomic, molecular, and ionized gas. Methods: Employing the Karl G. Jansky Very Large Array (VLA) in the C-array configuration and a large program, we observe the HI 21 cm line, four OH lines, nineteen Hnα radio recombination lines as well as the continuum emission from 1 to 2 GHz in full polarization over a large part of the first Galactic quadrant. Results: Covering Galactic longitudes from 14.5 to 67.4 deg and latitudes between ± 1.25 deg, we image all of these lines and the continuum at 20'' resolution. These data allow us to study the various components of the interstellar medium (ISM): from the atomic phase, traced by the HI line, to the molecular phase, observed by the OH transitions, to the ionized medium, revealed by the cm continuum and the Hnα radio recombination lines. Furthermore, the polarized continuum emission enables magnetic field studies. In this overview paper, we discuss the survey outline and present the first data release as well as early results from the different datasets. We now release the first half of the survey; the second half will follow later after the ongoing data processing has been completed. The data in fits format (continuum images and line data cubes) can be accessed through the project web-page. Conclusions: The HI/OH/Recombination line survey of the Milky Way (THOR) opens a new window to the different parts of the ISM. It enables detailed studies of molecular cloud formation, conversion of atomic to molecular gas, and feedback from Hii regions as well as the magnetic field in the Milky Way. It is highly complementary to other surveys of our Galaxy, and comparing the different datasets will allow us to address many open questions. Based on observations carried out with the Karl Jansky Very Large Array (VLA). http://www.mpia.de/thor

  8. Next-Generation Microshutter Arrays for Large-Format Imaging and Spectroscopy

    NASA Technical Reports Server (NTRS)

    Moseley, Samuel; Kutyrev, Alexander; Brown, Ari; Li, Mary

    2012-01-01

    A next-generation microshutter array, LArge Microshutter Array (LAMA), was developed as a multi-object field selector. LAMA consists of small-scaled microshutter arrays that can be combined to form large-scale microshutter array mosaics. Microshutter actuation is accomplished via electrostatic attraction between the shutter and a counter electrode, and 2D addressing can be accomplished by applying an electrostatic potential between a row of shutters and a column, orthogonal to the row, of counter electrodes. Microelectromechanical system (MEMS) technology is used to fabricate the microshutter arrays. The main feature of the microshutter device is to use a set of standard surface micromachining processes for device fabrication. Electrostatic actuation is used to eliminate the need for macromechanical magnet actuating components. A simplified electrostatic actuation with no macro components (e.g. moving magnets) required for actuation and latching of the shutters will make the microshutter arrays robust and less prone to mechanical failure. Smaller-size individual arrays will help to increase the yield and thus reduce the cost and improve robustness of the fabrication process. Reducing the size of the individual shutter array to about one square inch and building the large-scale mosaics by tiling these smaller-size arrays would further help to reduce the cost of the device due to the higher yield of smaller devices. The LAMA development is based on prior experience acquired while developing microshutter arrays for the James Webb Space Telescope (JWST), but it will have different features. The LAMA modular design permits large-format mosaicking to cover a field of view at least 50 times larger than JWST MSA. The LAMA electrostatic, instead of magnetic, actuation enables operation cycles at least 100 times faster and a mass significantly smaller compared to JWST MSA. Also, standard surface micromachining technology will simplify the fabrication process, increasing yield and reducing cost.

  9. Cognitive correlates of pragmatic language comprehension in adult traumatic brain injury: A systematic review and meta-analyses.

    PubMed

    Rowley, Dane A; Rogish, Miles; Alexander, Timothy; Riggs, Kevin J

    2017-01-01

    Effective pragmatic comprehension of language is critical for successful communication and interaction, but this ability is routinely impaired following Traumatic Brain Injury (TBI) (1,2). Individual studies have investigated the cognitive domains associated with impaired pragmatic comprehension, but there remains little understanding of the relative importance of these domains in contributing to pragmatic comprehension impairment following TBI. This paper presents a systematic meta-analytic review of the observed correlations between pragmatic comprehension and cognitive processes following TBI. Five meta-analyses were computed, which quantified the relationship between pragmatic comprehension and five key cognitive constructs (declarative memory; working memory; attention; executive functions; social cognition). Significant moderate-to-strong correlations were found between all cognitive measures and pragmatic comprehension, where declarative memory was the strongest correlate. Thus, our findings indicate that pragmatic comprehension in TBI is associated with an array of domain general cognitive processes, and as such deficits in these cognitive domains may underlie pragmatic comprehension difficulties following TBI. The clinical implications of these findings are discussed.
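
    Meta-analyses of correlations such as these are conventionally pooled on Fisher's z scale with n-3 weights; a minimal fixed-effect Python sketch with invented example numbers (the paper's data are not reproduced):

```python
# Fixed-effect pooling of correlations on Fisher's z scale with the
# usual n-3 inverse-variance weights; numbers are invented examples.
import numpy as np

def pooled_r(rs, ns):
    z = np.arctanh(np.asarray(rs, float))    # Fisher r-to-z transform
    w = np.asarray(ns, float) - 3.0          # inverse-variance weights
    return np.tanh((w * z).sum() / w.sum())  # back-transform pooled z

print(pooled_r([0.45, 0.60, 0.52], [30, 55, 41]))  # ~0.54
```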

  10. Performance measurements of the first RAID prototype

    NASA Technical Reports Server (NTRS)

    Chervenak, Ann L.

    1990-01-01

    The performance of RAID the First, a prototype Redundant Array of Inexpensive Disks (RAID), is examined. A hierarchy of bottlenecks was discovered in the system that limits overall performance. The most serious is memory system contention on the Sun 4/280 host CPU, which limits array bandwidth to 2.3 MBytes/sec. The array performs more successfully on small random operations, achieving nearly 300 I/Os per second before the Sun 4/280 becomes CPU limited. Other bottlenecks in the system are the VME backplane, bandwidth on the disk controller, and overheads associated with the SCSI protocol. All are examined in detail. The main conclusion is that to achieve the potential bandwidth of arrays, more powerful CPUs alone will not suffice. Just as important are adequate host memory bandwidth and support for high bandwidth on disk controllers. Current disk controllers are more often designed to achieve large numbers of small random operations rather than high bandwidth. Operating systems also need to change to support high bandwidth from disk arrays. In particular, they should transfer data in larger blocks and should support asynchronous I/O to improve sequential write performance.
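
    The "hierarchy of bottlenecks" finding amounts to: sustained array bandwidth is capped by the slowest stage in the I/O path. A trivial Python illustration, in which every figure except the 2.3 MByte/sec host-memory limit quoted above is a placeholder:

```python
# Sustained array bandwidth is bounded by the slowest I/O-path stage.
# Only the 2.3 MB/s host-memory figure comes from the record; the other
# stage numbers are placeholders for illustration.
stages_mb_s = {
    "host memory system (Sun 4/280)": 2.3,
    "VME backplane": 8.0,
    "disk controller": 4.0,
    "aggregate disks": 20.0,
}
limit = min(stages_mb_s, key=stages_mb_s.get)
print(f"sustained bandwidth ~ {stages_mb_s[limit]} MB/s, set by {limit}")
```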

  11. Deployment dynamics and control of large-scale flexible solar array system with deployable mast

    NASA Astrophysics Data System (ADS)

    Li, Hai-Quan; Liu, Xiao-Feng; Guo, Shao-Jing; Cai, Guo-Ping

    2016-10-01

    In this paper, the deployment dynamics and control of a large-scale flexible solar array system with a deployable mast are investigated. The adopted solar array system is introduced first, including the system configuration, the deployable mast, and solar arrays with several mechanisms. The dynamic equation of the solar array system is then established by the Jourdain velocity variation principle, and a method for dynamics with topology changes is introduced. In addition, a PD controller with disturbance estimation is designed to eliminate the drift of the spacecraft main body. Finally, the validity of the dynamic model is verified through a comparison with ADAMS software, and the deployment process and dynamic behavior of the system are studied in detail. Simulation results indicate that the proposed model is effective for describing the deployment dynamics of large-scale flexible solar arrays and the proposed controller is practical for eliminating the drift of the spacecraft main body.

  12. Porous microwells for geometry-selective, large-scale microparticle arrays

    NASA Astrophysics Data System (ADS)

    Kim, Jae Jung; Bong, Ki Wan; Reátegui, Eduardo; Irimia, Daniel; Doyle, Patrick S.

    2017-01-01

    Large-scale microparticle arrays (LSMAs) are key for material science and bioengineering applications. However, previous approaches suffer from trade-offs between scalability, precision, specificity and versatility. Here, we present a porous microwell-based approach to create large-scale microparticle arrays with complex motifs. Microparticles are guided to and pushed into microwells by fluid flow through small open pores at the bottom of the porous well arrays. A scaling theory allows for the rational design of LSMAs to sort and array particles on the basis of their size, shape, or modulus. Sequential particle assembly allows for proximal and nested particle arrangements, as well as particle recollection and pattern transfer. We demonstrate the capabilities of the approach by means of three applications: high-throughput single-cell arrays; microenvironment fabrication for neutrophil chemotaxis; and complex, covert tags by the transfer of an upconversion nanocrystal-laden LSMA.

  13. A solar house in Provence: Impressions and reflections after five years

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Randall, D.G.

    1989-01-01

    This house was fully described in SunWorld, Volume 7, Number 3 (Fall 1983). It was not a research project so comprehensive readings of conditions, analyses and heat balances have not been attempted. However, the author has accumulated much data from which some approximations can be made. Being essentially a home for him and his wife, some of this report is subjective. Developments which are described are: domestic hot water system, main collector array, solar porch fans, control systems, pool heating and space heating make-up. The author also discusses active versus passive, humidity, frost, economics, and maintenance.

  14. Transparent Fingerprint Sensor System for Large Flat Panel Display.

    PubMed

    Seo, Wonkuk; Pi, Jae-Eun; Cho, Sung Haeung; Kang, Seung-Youl; Ahn, Seong-Deok; Hwang, Chi-Sun; Jeon, Ho-Sik; Kim, Jong-Uk; Lee, Myunghee

    2018-01-19

    In this paper, we introduce a transparent fingerprint sensing system using a thin film transistor (TFT) sensor panel, based on a self-capacitive sensing scheme. An amorphous indium gallium zinc oxide (a-IGZO) TFT sensor array and an associated custom Read-Out IC (ROIC) are implemented for the system. The sensor panel has a 200 × 200 pixel array and each pixel is as small as 50 μm × 50 μm. The ROIC uses only eight analog front-end (AFE) amplifier stages along with a successive approximation analog-to-digital converter (SAR ADC). To get the fingerprint image data from the sensor array, the ROIC senses the capacitance formed across the cover glass between a human finger and the electrode of each pixel of the sensor array. Three methods are reviewed for estimating the self-capacitance. The measurement results demonstrate that the transparent fingerprint sensor system is able to differentiate a human finger's ridges and valleys through the fingerprint sensor array.
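
    As a generic illustration of self-capacitance sensing (the record reviews three estimation methods but does not name them, so this charge-time sketch is not tied to any of them): charging the pixel capacitance C through a resistance R toward V_dd crosses a threshold V_th at t = RC ln(V_dd/(V_dd - V_th)), so a measured crossing time yields C. All component values below are invented.

```python
# Generic charge-time estimate of a pixel's self-capacitance; component
# values are invented, and this is not tied to the paper's methods.
import math

def cap_from_charge_time(t_cross, r_ohm, v_dd, v_th):
    return t_cross / (r_ohm * math.log(v_dd / (v_dd - v_th)))

c = cap_from_charge_time(t_cross=2.2e-6, r_ohm=1e6, v_dd=3.3, v_th=2.0)
print(f"{c * 1e12:.2f} pF")   # a finger shifts C by ~0.1-1 pF vs. a valley
```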

  15. Primary gamma ray selection in a hybrid timing/imaging Cherenkov array

    NASA Astrophysics Data System (ADS)

    Postnikov, E. B.; Grinyuk, A. A.; Kuzmichev, L. A.; Sveshnikova, L. G.

    2017-06-01

    This work is a methodological study of hybrid reconstruction techniques for hybrid imaging/timing Cherenkov observations. This type of hybrid array is to be realized at the gamma-observatory TAIGA, intended for very high energy gamma-ray astronomy (>30 TeV). It aims at combining the cost-effective timing-array technique with imaging telescopes. Hybrid operation of both of these techniques can lead to a relatively cheap way of developing a large-area array. The joint approach to gamma event selection was investigated on both types of simulated data: the image parameters from the telescopes, and the shower parameters reconstructed from the timing array. The optimal set of imaging parameters and shower parameters to be combined is revealed. The cosmic ray background suppression factor depending on distance and energy is calculated. The optimal selection technique leads to cosmic ray background suppression of about 2 orders of magnitude at distances up to 450 m for energies greater than 50 TeV.

  16. Design of an Improved Heater Array to Measure Microscale Wall Heat Transfer

    NASA Technical Reports Server (NTRS)

    Kim, Jungho; Chng, Choon Ping; Kalkur, T. S.

    1996-01-01

    An improved array of microscale heaters is being developed to measure the heat transfer coefficient at many points underneath individual bubbles during boiling as a function of space and time. This heater array enables the local heat transfer from a surface during the bubble growth and departure process to be measured with very high temporal and spatial resolution, and should allow better understanding of the boiling heat transfer mechanisms by pin-pointing when and where in the bubble departure cycle large amounts of wall heat transfer occur. Such information can provide much needed data regarding the important heat transfer mechanisms during the bubble departure cycle, and can serve as benchmarks to validate many of the analytical and numerical models used to simulate boiling. The improvements to the heater array include using a silicon-on-quartz substrate to reduce thermal cross-talk between the heaters, decreased space between the heaters, increased pad sizes on the heaters, and progressive heater sizes. Some results using the present heater array are discussed.

  17. Transparent Fingerprint Sensor System for Large Flat Panel Display

    PubMed Central

    Seo, Wonkuk; Pi, Jae-Eun; Cho, Sung Haeung; Kang, Seung-Youl; Ahn, Seong-Deok; Hwang, Chi-Sun; Jeon, Ho-Sik; Kim, Jong-Uk

    2018-01-01

    In this paper, we introduce a transparent fingerprint sensing system using a thin film transistor (TFT) sensor panel, based on a self-capacitive sensing scheme. An amorphous indium gallium zinc oxide (a-IGZO) TFT sensor array and an associated custom Read-Out IC (ROIC) are implemented for the system. The sensor panel has a 200 × 200 pixel array, and each pixel is as small as 50 μm × 50 μm. The ROIC uses only eight analog front-end (AFE) amplifier stages along with a successive approximation analog-to-digital converter (SAR ADC). To obtain the fingerprint image data from the sensor array, the ROIC senses the capacitance formed across the cover glass between a human finger and the electrode of each pixel of the sensor array. Three methods for estimating the self-capacitance are reviewed. The measurement results demonstrate that the transparent fingerprint sensor system can differentiate the ridges and valleys of a human finger through the fingerprint sensor array. PMID:29351218

  18. Target Tracking Using SePDAF under Ambiguous Angles for Distributed Array Radar.

    PubMed

    Long, Teng; Zhang, Honggang; Zeng, Tao; Chen, Xinliang; Liu, Quanhua; Zheng, Le

    2016-09-09

    Distributed array radar can improve radar detection capability and measurement accuracy. However, it suffers from cyclic ambiguity in its angle estimates because the large sparse array undersamples the spatial field, violating the spatial Nyquist criterion. Consequently, state-estimation accuracy and track-validity probability degrade when the ambiguous angles are used directly for target tracking. This paper proposes a second probability data association filter (SePDAF)-based tracking method for distributed array radar. First, the target motion model and radar measurement model are built. Second, the fused estimate from the individual radars is fed to an extended Kalman filter (EKF) for the first filtering stage. Third, taking this result as prior knowledge and associating it with the array-processed ambiguous angles, the SePDAF performs the second filtering stage, yielding an accurate and stable trajectory at relatively low computational complexity. Moreover, the azimuth filtering accuracy improves dramatically, and the position filtering accuracy also improves. Finally, simulations illustrate the effectiveness of the proposed method.
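
    The first filtering stage is a standard extended Kalman filter; a minimal predict/update step for a 2-D constant-velocity target observed in range and azimuth is sketched below. The noise covariances and measurement geometry are assumed values, and the SePDAF association stage over ambiguous angles is not reproduced here.

    ```python
    import numpy as np

    # Minimal EKF predict/update for a 2-D constant-velocity target
    # observed in range and azimuth, i.e. the kind of "first filtering"
    # the paper feeds with fused radar estimates. Covariances are assumed.

    dt = 1.0
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], float)          # state: [x, y, vx, vy]
    Q = 0.01 * np.eye(4)                          # process noise (assumed)
    R = np.diag([25.0, (0.5*np.pi/180)**2])       # range var, azimuth var (assumed)

    def h(x):
        """Measurement model: range and azimuth from a radar at the origin."""
        return np.array([np.hypot(x[0], x[1]), np.arctan2(x[1], x[0])])

    def H_jac(x):
        """Jacobian of h at state x."""
        rng2 = x[0]**2 + x[1]**2
        rng = np.sqrt(rng2)
        return np.array([[x[0]/rng,  x[1]/rng,  0, 0],
                         [-x[1]/rng2, x[0]/rng2, 0, 0]])

    def ekf_step(x, P, z):
        x = F @ x                                  # predict
        P = F @ P @ F.T + Q
        H = H_jac(x)                               # update
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        innov = z - h(x)
        innov[1] = (innov[1] + np.pi) % (2*np.pi) - np.pi  # wrap angle residual
        return x + K @ innov, (np.eye(4) - K @ H) @ P

    x, P = np.array([1000.0, 1000.0, 10.0, 0.0]), 100.0 * np.eye(4)
    x, P = ekf_step(x, P, np.array([1450.0, np.deg2rad(45.5)]))
    print(x)
    ```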

  19. Next Generation Astronomical Data Processing using Big Data Technologies from the Apache Software Foundation

    NASA Astrophysics Data System (ADS)

    Mattmann, Chris

    2014-04-01

    In this era of exascale instruments for astronomy, we must develop next-generation capabilities for the unprecedented data volume and velocity that will arrive from these ground-based sensors and observatories. Integrating scientific algorithms stewarded by scientific groups unobtrusively and rapidly; intelligently selecting data movement technologies; making use of cloud computing for storage and processing; and automatically extracting text, metadata, and science from any type of file are all needed capabilities in this exciting time. Our group at NASA JPL has promoted the use of open source data management technologies available from the Apache Software Foundation (ASF) in pursuit of constructing next-generation data management and processing systems for astronomical instruments, including the Expanded Very Large Array (EVLA) in Socorro, NM, and the Atacama Large Millimetre/Submillimetre Array (ALMA), as well as for the KAT-7 project led by SKA South Africa as a precursor to the full MeerKAT telescope. In addition, we are currently funded by the National Science Foundation in the US to work with MIT Haystack Observatory and the University of Cambridge in the UK to construct a Radio Array of Portable Interferometric Devices (RAPID) that will undoubtedly draw from the rich technology advances underway. NASA JPL is investing in a strategic initiative for Big Data that is pulling in these capabilities and technologies for astronomical instruments and also for Earth science remote sensing. In this talk I will describe the collaborative efforts underway and point to solutions in open source from the Apache Software Foundation that can be deployed and used today and that are already bringing our teams and projects benefits. I will describe how others can take advantage of our experience and point towards future applications and contributions of these tools.

  20. New Hybridized Surface Wave Approach for Geotechnical Modeling of Shear Wave Velocity at Strong Motion Recording Stations

    NASA Astrophysics Data System (ADS)

    Kayen, R.; Carkin, B.; Minasian, D.

    2006-12-01

    Strong motion recording (SMR) networks often have few or no shear-wave velocity measurements at stations where characterization of site amplification and site-period effects is needed. Using the active Spectral Analysis of Surface Waves (SASW) method and the passive H/V microtremor method, we have investigated nearly two hundred SMR sites in California, Alaska, Japan, Australia, China, and Taiwan. We are conducting these studies, in part, to develop a new hybridized method of site characterization that utilizes a parallel array of harmonic-wave sources for active-source SASW and a single long-period seismometer for passive-source microtremor measurement. Surface wave methods excel in their ability to non-invasively and rapidly characterize the variation of ground stiffness with depth below the surface. These methods are lightweight, inexpensive to deploy, and time-efficient, and they have been shown to produce accurate and deep soil stiffness profiles. By placing and wiring shakers in a large parallel circuit, either side-by-side on the ground or in a trailer-mounted array, a strong in-phase harmonic wave can be produced. The effect of arraying many sources in parallel is to increase the amplitude of the waves received at distant seismometers at low frequencies, extending the longest wavelengths of the captured dispersion curve. The USGS profiling system uses this concept by arraying between two and eight electro-mechanical harmonic-wave shakers. With large parallel arrays of vibrators, a dynamic force in excess of 1000 lb can be produced to vibrate the ground and generate surface waves. We adjust the harmonic wave through a swept-sine procedure to profile surface-wave dispersion down to a frequency of 1 Hz and out to surface-wave wavelengths of 200-1000 meters, depending on the site stiffness. The parallel-array SASW procedure is augmented with H/V microtremor data collected with the active source turned off. Passive microtremor data reveal the natural resonance characteristics of the ground by capturing persistent natural vibrations. These microtremors result from the interaction of surface waves arriving from distant sources with the stiffness structure of the site under investigation. As such, the resonance effects are effective in constraining the layer thicknesses of the SASW shear-wave velocity structure and aid in determining the depth of the deepest layer. Together, the hybridized SASW and H/V procedure provides a complete data set for modeling the geotechnical aspects of ground amplification of earthquake motions. Data from these investigations are available at http://walrus.wr.usgs.gov/geotech.
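
    The core of the dispersion measurement is the cross-spectrum phase between receiver pairs: for spacing d and phase lag Δφ(f), the phase velocity is V(f) = 2πf·d/Δφ(f). The sketch below demonstrates this on a synthetic swept-sine record; the spacing, sample rate, and medium velocity are assumptions, not field values.

    ```python
    import numpy as np
    from scipy.signal import csd

    # Core SASW dispersion calculation: the phase velocity at frequency f
    # follows from the cross-spectrum phase lag between two receivers a
    # distance d apart: V(f) = 2*pi*f*d / dphi(f). Synthetic, noise-free
    # records; spacing, sweep, and medium speed are assumptions.

    fs = 200.0                  # sample rate, Hz
    d = 20.0                    # receiver spacing, m (assumed)
    v_true = 400.0              # toy non-dispersive medium, m/s
    t = np.arange(0, 60, 1/fs)

    sweep = lambda tt: np.sin(2*np.pi*(1.0 + 0.5*tt)*tt)  # upward swept sine
    r1 = sweep(t)               # near receiver
    r2 = sweep(t - d/v_true)    # far receiver, delayed by the travel time

    f, Pxy = csd(r1, r2, fs=fs, nperseg=4096)
    band = (f > 5) & (f < 40)
    dphi = np.abs(np.unwrap(np.angle(Pxy[band])))   # phase lag magnitude
    v_phase = 2*np.pi*f[band]*d / dphi
    print(v_phase[:5])          # clusters near 400 m/s
    ```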

  1. Smart Energy Cryo-refrigerator Technology for the next generation Very Large Array

    NASA Astrophysics Data System (ADS)

    Spagna, Stefano

    2018-01-01

    We describe a “smart energy” cryocooler technology architecture for the next-generation Very Large Array that makes use of multiple variable-frequency cold heads driven by a single variable-speed, air-cooled compressor. Preliminary experiments indicate that the compressor's variable flow control, advanced diagnostics, and the cryo-refrigerator's low vibration provide a uniquely energy-efficient capability for the very large number of antennas that will be employed in this array.

  2. Investigation and Characterization of Acoustic Emissions of Tornadoes Using Arrays of Infrasound Sensors

    NASA Astrophysics Data System (ADS)

    Frazier, W. G.; Talmadge, C. L.; Waxler, R.; Knupp, K. R.; Goudeau, B.; Hetzer, C. H.

    2017-12-01

    Working in coordination with the NOAA Vortex Southeast (Vortex SE) research program, nine infrasound sensor arrays were deployed at fixed sites across North Alabama, South-central Tennessee, and Northwest Georgia during March and April of 2017 to investigate the emission and characterization of infrasonic acoustic energy from tornadoes and related phenomena. Each array consisted of seven broadband acoustic sensors with calibrated frequency response from 0.02 Hz to 200 Hz. The arrays were configured in a pattern such that accurate bearings to acoustic sources could be obtained over a broad range of frequencies (nominally 1 Hz to 100 Hz). Data were collected synchronously at a rate of 1000 samples per second. On 22 April 2017, a line of strong storms passed directly through the monitored area, producing at least three verified tornadoes, two rated EF0 and one EF1. Subsequent processing of the data from several of the arrays revealed acoustic emissions from the tornadic storms at frequencies ranging from below 1 Hz to greater than 10 Hz. Accurate bearings to the storms have been calculated from distances greater than 60 km. Preliminary analysis has revealed that continuous emissions occurred prior to the estimated touchdown times, while the storms were on the ground, and for short periods after the tornadoes lifted; however, the strongest emissions appeared to occur while the storms were on the ground. One of the storms passed near two arrays simultaneously, and an accurate track of the storm as it moved has therefore been obtained using only the infrasound measurements. Initial results from the analysis of the infrasound data will be presented. Under Vortex SE, meteorological data were collected with a large suite of sensors. Correlations between the infrasound data and the meteorological data will be investigated and discussed.
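
    Bearings like those reported are typically obtained by fitting a plane wave to inter-sensor time delays. A minimal least-squares version is sketched below, with an assumed five-element geometry and a synthetic arrival; the deployed arrays used seven sensors and cross-correlation-derived delays.

    ```python
    import numpy as np

    # Plane-wave bearing estimation for a small infrasound array: given
    # delays of each sensor relative to a reference, least-squares fit a
    # horizontal slowness vector s from delay_i = r_i . s. Geometry and
    # the synthetic arrival below are assumptions.

    coords = np.array([[0, 0], [100, 0], [0, 100],    # sensor x,y offsets, m
                       [-100, 0], [0, -100]], float)

    def bearing_from_delays(coords, delays):
        """Solve delays = coords @ s for slowness s; return back azimuth (deg)."""
        s, *_ = np.linalg.lstsq(coords, delays, rcond=None)
        # the source lies opposite to the propagation direction s
        az = np.degrees(np.arctan2(-s[0], -s[1])) % 360.0
        return az, 1.0 / np.linalg.norm(s)            # bearing, apparent speed

    # Synthetic plane wave from azimuth 60 deg (clockwise from north) at 340 m/s:
    az_true = np.deg2rad(60.0)
    s_true = -np.array([np.sin(az_true), np.cos(az_true)]) / 340.0
    delays = coords @ s_true
    print(bearing_from_delays(coords, delays))        # ~ (60.0, 340.0)
    ```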

  3. Innovative research in the design and operation of large telescopes for space: Aspects of giant telescopes in space

    NASA Technical Reports Server (NTRS)

    Angel, J. R. P.

    1985-01-01

    The capability and understanding of how to finish the reflector surfaces needed for large space telescopes is discussed. The technology for making very light glass substrates for mirrors is described. Other areas of development are in wide field imaging design for very fast primaries, in data analysis and retrieval methods for astronomical images, and in methods for making large area closely packed mosaics of solid state array detectors.

  4. Study of atmospheric gravity waves and infrasonic sources using the USArray Transportable Array pressure data

    NASA Astrophysics Data System (ADS)

    Hedlin, Michael; de Groot-Hedlin, Catherine; Hoffmann, Lars; Alexander, M. Joan; Stephan, Claudia

    2016-04-01

    The upgrade of the USArray Transportable Array (TA) with microbarometers and infrasound microphones has created an opportunity for a broad range of new studies of atmospheric sources and the large- and small-scale atmospheric structure through which signals from these events propagate. These studies are akin to early studies of seismic events and the Earth's interior structure that were made possible by the first seismic networks. In one early study with the new dataset, we use the method of de Groot-Hedlin and Hedlin (2015) to recast the TA as a massive collection of three-element arrays to detect and locate large infrasonic events. Over 2,000 events were detected in 2013. The events cluster in highly active regions on land and offshore. Stratospherically ducted signals from some of these events have been recorded more than 2,000 km from the source and clearly show dispersion due to propagation through atmospheric gravity waves. Modeling of these signals has been used to test statistical models of atmospheric gravity waves. The network is also useful for making direct observations of gravity waves. We are currently studying TA and satellite observations of gravity waves from individual events to better understand how the waves near ground level relate to those observed aloft. We are also studying the long-term statistics of these waves from the beginning of 2010 through 2014. Early work using data bandpass-filtered at periods of 1-6 hr shows that both the TA and satellite data reveal highly active source regions, such as near the Great Lakes. Reference: de Groot-Hedlin, C. and Hedlin, M., 2015. A method for detecting and locating geophysical events using clusters of arrays, Geophysical Journal International, 203, 960-971, doi:10.1093/gji/ggv345.
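
    The 1-6 hr bandpass mentioned above can be reproduced with a zero-phase Butterworth filter. The sketch below isolates a synthetic 3-hour gravity-wave oscillation from a diurnal tide; the one-minute sampling and the filter design are assumptions, not the authors' exact processing.

    ```python
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    # Zero-phase Butterworth bandpass for periods of 1-6 hours, the band
    # used to isolate gravity-wave pressure fluctuations. One-minute
    # sampling and the filter order are assumptions.

    fs = 1.0 / 60.0                            # one pressure sample per minute, Hz
    low, high = 1.0/(6*3600), 1.0/(1*3600)     # pass-band edges in Hz
    sos = butter(4, [low/(fs/2), high/(fs/2)], btype="band", output="sos")

    t = np.arange(0, 5*86400, 60.0)            # five days, in seconds
    pressure = (np.sin(2*np.pi*t/(3*3600))     # 3-hour gravity wave: kept
                + np.sin(2*np.pi*t/(24*3600))  # diurnal tide: rejected
                + 0.1*np.random.randn(t.size)) # measurement noise
    gw = sosfiltfilt(sos, pressure)
    print(gw.std())                            # ~0.7, dominated by the 3-h wave
    ```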

  5. Effects of Wave Energy Converter (WEC) Arrays on Wave, Current, and Sediment Circulation

    NASA Astrophysics Data System (ADS)

    Ruehl, K.; Roberts, J. D.; Jones, C.; Magalen, J.; James, S. C.

    2012-12-01

    The characterization of the physical environment, and the commensurate alteration of that environment by Wave Energy Conversion (WEC) devices or arrays of devices, must be understood to make informed device-performance predictions, specifications of hydrodynamic loads, and environmental evaluations of ecosystem responses (e.g., changes to circulation patterns, sediment dynamics, and water quality). Hydrodynamic and sediment issues associated with the performance of wave-energy devices will arise primarily nearshore, where WEC infrastructure (e.g., anchors, piles) is exposed to large forces from surface-wave action and currents. Wave-energy devices will be subject to additional corrosion, fouling, and wear of moving parts caused by suspended sediments in the water column. The alteration of circulation and sediment transport patterns may also alter local ecosystems through changes in benthic habitat, circulation patterns, or other environmental parameters. Sandia National Laboratories is developing tools and performing studies to quantitatively characterize the environments where WEC devices may be installed and to assess potential effects on hydrodynamics and local sediment transport. The primary tools are wave, hydrodynamic, and sediment transport models. To ensure confidence in the resulting evaluation of system-wide effects, the models are appropriately constrained and validated with measured data where available. An extension of the US EPA's EFDC code, SNL-EFDC, provides a suitable platform for modeling the necessary hydrodynamics; it has been modified to directly incorporate output from a SWAN wave model of the region. Model development and results are presented. In this work, the model is exercised for Monterey Bay near Santa Cruz, where a WEC array could be deployed. Santa Cruz is located on the northern coast of Monterey Bay in Central California, USA. This site was selected for preliminary research due to the readily available historical hydrodynamic data (currents and wave heights, periods, and directions), sediment characterization data, and nearshore bathymetric data. In addition, the region has been under evaluation for future ocean energy projects. The modeling framework of SWAN and SNL-EFDC, combined with field validation datasets, allows a robust quantitative description of the nearshore environment within which the MHK devices will be evaluated. This quantitative description can be directly incorporated into environmental impact assessments to eliminate guesswork related to the effects of the presence of large-scale arrays. These results can be used to design more efficient arrays while minimizing impacts on the nearshore environment. Further investigations into fine-scale scour near the structures will help determine whether these large-scale results show that there is, in fact, deposition adjacent to the arrays, which could have design implications for anchorage and cabling systems.

  6. TROPIX Power System Architecture

    NASA Technical Reports Server (NTRS)

    Manner, David B.; Hickman, J. Mark

    1995-01-01

    This document contains results obtained in the course of a power system definition study of the TROPIX power management and distribution (PMAD) system. Requirements derived from the PMAD's interaction with other spacecraft systems are discussed first. Since the design depends on the performance of the photovoltaics, there is a comprehensive discussion of appropriate models for cells and arrays. A trade study of the array operating voltage and its effect on array bus mass is also presented. A system architecture is developed that uses a combination of high-efficiency switching power converters and analog regulators. Mass and volume estimates are presented for all subsystems.
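
    The voltage trade reduces to a simple scaling: for fixed delivered power and a fixed allowed resistive-loss fraction, the required conductor cross-section, and hence bus mass, falls as 1/V². A back-of-envelope sketch with assumed copper conductors and illustrative numbers, not TROPIX values:

    ```python
    # Bus-mass vs. voltage scaling: for fixed power P and loss fraction f,
    # conductor cross-section (and mass) scales as 1/V^2. All numbers are
    # illustrative assumptions, not TROPIX design values.

    RHO_CU = 1.7e-8      # copper resistivity, ohm*m
    DENS_CU = 8960.0     # copper density, kg/m^3

    def bus_mass(power_w, volts, length_m, loss_frac=0.02):
        """Conductor mass for a two-wire bus at a given operating voltage."""
        current = power_w / volts
        r_allowed = loss_frac * volts / current     # from P_loss = f*P, P = V*I
        area = RHO_CU * (2 * length_m) / r_allowed  # two-wire round trip
        return DENS_CU * area * (2 * length_m)

    for v in (28.0, 120.0, 300.0):
        print(f"{v:5.0f} V bus: {bus_mass(5000.0, v, 10.0):7.2f} kg")
    # Mass drops ~18x from 28 V to 120 V and ~115x from 28 V to 300 V,
    # matching the (V2/V1)^2 scaling.
    ```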

  7. KUTE-BASE: storing, downloading and exporting MIAME-compliant microarray experiments in minutes rather than hours.

    PubMed

    Draghici, Sorin; Tarca, Adi L; Yu, Longfei; Ethier, Stephen; Romero, Roberto

    2008-03-01

    The BioArray Software Environment (BASE) is a very popular MIAME-compliant, web-based microarray data repository. However, in BASE, as in most other microarray data repositories, experiment annotation and raw data uploading can be very time-consuming, especially for large microarray experiments. We developed KUTE (Karmanos Universal daTabase for microarray Experiments) as a plug-in for BASE 2.0 that addresses these issues. KUTE provides an automatic experiment annotation feature and a completely redesigned data workflow that dramatically reduce the human-computer interaction time. For instance, in BASE 2.0 a typical Affymetrix experiment involving 100 arrays required 4 h 30 min of user interaction time for experiment annotation and 45 min for data upload/download. In contrast, for the same experiment, KUTE required only 28 min of user interaction time for experiment annotation and 3.3 min for data upload/download. http://vortex.cs.wayne.edu/kute/index.html.

  8. Divide and Recombine for Large Complex Data

    DTIC Science & Technology

    2017-12-01

    … Empirical Methods in Natural Language Processing, October 2014 … low-latency data processing systems. Declarative Languages for Interactive Visualization: The Reactive Vega Stack. Another thread of XDATA research … for array processing operations embedded in the R programming language. Vector virtual machines work well for long vectors. One of the most …

  9. Final analysis of cost, value, and risk.

    DOT National Transportation Integrated Search

    2009-03-05

    USDOT understands that access to emergency services provided by 9-1-1 in today's world of evolving technology will ultimately occur within a broader array of interconnected networks comprehensively supporting emergency services, from public ac...

  10. Quick Fabrication of Large-area Organic Semiconductor Single Crystal Arrays with a Rapid Annealing Self-Solution-Shearing Method

    PubMed Central

    Li, Yunze; Ji, Deyang; Liu, Jie; Yao, Yifan; Fu, Xiaolong; Zhu, Weigang; Xu, Chunhui; Dong, Huanli; Li, Jingze; Hu, Wenping

    2015-01-01

    In this paper, we developed a new method to produce large-area single crystal arrays using the organic semiconductor 9,10-bis(phenylethynyl)anthracene (BPEA). The method is easy to operate, efficient, low-cost, and independent of the substrate for large-area array fabrication. Organic field-effect transistors based on these single crystal arrays exhibit superior performance, with an average mobility extracted from the saturation region of 0.2 cm² V⁻¹ s⁻¹ (the highest being 0.47 cm² V⁻¹ s⁻¹) and an on/off ratio exceeding 10⁵. In addition, our single crystal arrays also show very high photoswitching performance, with an on/off current ratio of up to 4.1 × 10⁵, which is one of the highest values reported for organic materials. It is believed that this method provides a new way to fabricate single crystal arrays and has potential for application to large-area organic electronics. PMID:26282460
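
    Mobility figures like these are conventionally extracted from the saturation region of the transfer curve, where I_D = (W/2L)·μ·C_i·(V_G − V_T)², so μ follows from the slope of √I_D versus V_G. A sketch on a synthetic transfer curve, with assumed device geometry rather than the paper's devices:

    ```python
    import numpy as np

    # Standard saturation-region mobility extraction: fit the slope of
    # sqrt(I_D) versus V_G and use mu_sat = (2L / (W*C_i)) * slope**2.
    # Geometry, dielectric capacitance, and the transfer curve are
    # assumed values, not the devices in the paper.

    L, W = 20e-6, 1000e-6      # channel length and width, m (assumed)
    C_i = 1.15e-4              # gate capacitance per area, F/m^2 (~300 nm SiO2)

    vg = np.linspace(-40.0, -15.0, 26)    # gate sweep for a p-type device, V
    vt = -10.0                            # threshold voltage, V (assumed)
    mu_true = 0.2e-4                      # 0.2 cm^2/Vs expressed in m^2/Vs
    i_d = 0.5*(W/L)*mu_true*C_i*(vg - vt)**2   # ideal saturation current

    slope = np.polyfit(vg, np.sqrt(i_d), 1)[0]
    mu_sat = 2*L/(W*C_i) * slope**2
    print(f"extracted mobility: {mu_sat*1e4:.2f} cm^2/Vs")  # 0.20
    ```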

  11. Fabrication of plasmonic cavity arrays for SERS analysis

    NASA Astrophysics Data System (ADS)

    Li, Ning; Feng, Lei; Teng, Fei; Lu, Nan

    2017-05-01

    The plasmonic cavity arrays are ideal substrates for surface-enhanced Raman scattering analysis because they can provide hot spots with a large volume for analyte molecules. The large area increases the probability of placing more analyte molecules on hot spots and leads to high reproducibility. Therefore, developing a simple method for creating cavity arrays is important. Herein, we demonstrate how to fabricate V- and W-shaped cavity arrays by a simple method based on self-assembly. Briefly, the V- and W-shaped cavity arrays are fabricated by KOH etching of silicon (Si) slides patterned with a nanohole array and a nanoring array, respectively. The nanohole array is generated by reactive ion etching of a Si slide assembled with a monolayer of polystyrene (PS) spheres. The nanoring array is generated by reactive ion etching of a Si slide covered with a monolayer of octadecyltrichlorosilane before self-assembly of the PS spheres. Both plasmonic V and W cavity arrays provide a large hot area, which increases the probability that analyte molecules deposit on the hot spots. Taking 4-mercaptopyridine as the analyte probe, the enhancement factor reaches 2.99 × 10⁵ and 9.97 × 10⁵ for the plasmonic V cavity and W cavity arrays, respectively. The relative standard deviations of the plasmonic V and W cavity arrays are 6.5% and 10.2%, respectively, according to spectra collected at 20 random spots.

  12. Fabrication of plasmonic cavity arrays for SERS analysis.

    PubMed

    Li, Ning; Feng, Lei; Teng, Fei; Lu, Nan

    2017-05-05

    The plasmonic cavity arrays are ideal substrates for surface-enhanced Raman scattering analysis because they can provide hot spots with a large volume for analyte molecules. The large area increases the probability of placing more analyte molecules on hot spots and leads to high reproducibility. Therefore, developing a simple method for creating cavity arrays is important. Herein, we demonstrate how to fabricate V- and W-shaped cavity arrays by a simple method based on self-assembly. Briefly, the V- and W-shaped cavity arrays are fabricated by KOH etching of silicon (Si) slides patterned with a nanohole array and a nanoring array, respectively. The nanohole array is generated by reactive ion etching of a Si slide assembled with a monolayer of polystyrene (PS) spheres. The nanoring array is generated by reactive ion etching of a Si slide covered with a monolayer of octadecyltrichlorosilane before self-assembly of the PS spheres. Both plasmonic V and W cavity arrays provide a large hot area, which increases the probability that analyte molecules deposit on the hot spots. Taking 4-mercaptopyridine as the analyte probe, the enhancement factor reaches 2.99 × 10⁵ and 9.97 × 10⁵ for the plasmonic V cavity and W cavity arrays, respectively. The relative standard deviations of the plasmonic V and W cavity arrays are 6.5% and 10.2%, respectively, according to spectra collected at 20 random spots.
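
    Enhancement factors of this kind are conventionally computed as the per-molecule intensity ratio between the SERS substrate and a bulk reference, EF = (I_SERS/N_SERS)/(I_ref/N_ref). A sketch with placeholder numbers, not the paper's measurements:

    ```python
    # Standard SERS enhancement-factor definition:
    #   EF = (I_SERS / N_SERS) / (I_ref / N_ref),
    # the per-molecule Raman intensity on the substrate relative to a bulk
    # reference. All inputs below are made-up placeholders.

    def enhancement_factor(i_sers, n_sers, i_ref, n_ref):
        """Per-molecule SERS intensity ratio versus normal Raman."""
        return (i_sers / n_sers) / (i_ref / n_ref)

    # e.g. 1e4 counts from 1e6 adsorbed molecules vs 3e3 counts from 3e10
    # molecules in the reference probe volume:
    print(f"EF = {enhancement_factor(1e4, 1e6, 3e3, 3e10):.2e}")  # 1.00e+05
    ```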

  13. AgBase: supporting functional modeling in agricultural organisms

    PubMed Central

    McCarthy, Fiona M.; Gresham, Cathy R.; Buza, Teresia J.; Chouvarine, Philippe; Pillai, Lakshmi R.; Kumar, Ranjit; Ozkan, Seval; Wang, Hui; Manda, Prashanti; Arick, Tony; Bridges, Susan M.; Burgess, Shane C.

    2011-01-01

    AgBase (http://www.agbase.msstate.edu/) provides resources to facilitate modeling of functional genomics data and structural and functional annotation of agriculturally important animal, plant, microbe and parasite genomes. The website has been redesigned to improve accessibility and ease of use, including improved search capabilities. Expanded capabilities include new dedicated pages for horse, cat, dog, cotton, rice and soybean. We currently provide 590 240 Gene Ontology (GO) annotations to 105 454 gene products in 64 different species, including GO annotations linked to transcripts represented on agricultural microarrays. For many of these arrays, this provides the only functional annotation available. GO annotations are available for download, and we provide comprehensive, species-specific GO annotation files for 18 different organisms. The tools available at AgBase have been expanded, and several existing tools have been improved based upon user feedback. One of seven new tools available at AgBase, GOModeler, supports hypothesis testing from functional genomics data. We host several associated databases and provide genome browsers for three agricultural pathogens. Moreover, we provide comprehensive training resources (including worked examples and tutorials) via links to Educational Resources at the AgBase website. PMID:21075795
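
    GO annotation downloads of this kind generally follow the tab-separated GAF conventions used across GO resources. A hypothetical reader that tallies annotations per gene product is sketched below; the file name, and the assumption that AgBase's files parse as standard GAF, are mine:

    ```python
    from collections import Counter

    # Hypothetical reader for a GO annotation file in the standard GAF 2.x
    # tab-separated layout (column 2 = gene product ID, column 5 = GO term).
    # AgBase's actual download format may differ in detail.

    def count_go_annotations(path):
        """Count GO annotations per gene product in a GAF file."""
        counts = Counter()
        with open(path) as fh:
            for line in fh:
                if line.startswith("!"):          # GAF comment/header lines
                    continue
                cols = line.rstrip("\n").split("\t")
                if len(cols) > 4 and cols[4].startswith("GO:"):
                    counts[cols[1]] += 1
        return counts

    # counts = count_go_annotations("gallus_gallus.gaf")   # hypothetical file
    # print(counts.most_common(5))
    ```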

  14. Polar Experiment Network for Geospace Upper-atmosphere Investigations (PENGUIn): A Vision for Global Polar Studies and Education

    NASA Astrophysics Data System (ADS)

    Weatherwax, A. T.; Lanzerotti, L. J.; Rosenberg, T. J.; Detrick, D. L.; Clauer, C. R.; Ridley, A.; Mende, S. B.; Frey, H. U.; Ostgaard, N.; Sterling, R. W.; Inan, U. S.; Engebretson, M. J.; Petit, N.; Labelle, J.; Lynch, K.; Lessard, M.; Maclennan, C. G.; Doolittle, J. H.; Fukunishi, H.

    2003-12-01

    The several decades since the advent of space flight have witnessed the ever-growing importance and relevance of the Earth's space environment for understanding the functioning of Earth within the solar system and for understanding the effects of the Sun's influence on technological systems deployed on Earth and in space. Achieving a comprehensive understanding of Earth's geospace environment requires knowledge of the ionosphere and magnetosphere in both polar regions. Outlined in this talk is a broad, multi-national plan to investigate in depth, from Antarctica and nominally conjugate regions in the Arctic, the electrodynamic system that comprises the space environment of Planet Earth. Specifics include (a) the phased development of a new and comprehensive upper-atmosphere geophysical measurement program based upon distributed instruments operating in extreme polar environments; (b) real-time data collection via satellites; (c) a methodology to build synergistic data sets from a global distribution of southern- and northern-hemisphere instrument arrays; and (d) integration with all levels of education, including high school, undergraduate, graduate, and post-doctoral.

  15. Large Format Arrays for Far Infrared and Millimeter Astronomy

    NASA Technical Reports Server (NTRS)

    Moseley, Harvey

    2004-01-01

    Some of the most compelling questions in modern astronomy are best addressed with submillimeter and millimeter observations. The question of the role of inflation in the early evolution of the universe is best addressed with large sensitive arrays of millimeter polarimeters. The study of the first generations of galaxies requires sensitive submillimeter imaging, which can help us to understand the history of energy release and nucleosynthesis in the universe. Our ability to address these questions is increasing rapidly, driven by dramatic advances in the sensitivity and size of available detector arrays. While the MIPS instrument on the SIRTF mission will revolutionize far infrared astronomy with its 1024-element array of photoconductors, thermal detectors remain the dominant technology for submillimeter and millimeter imaging and polarimetry. The last decade has seen the deployment of increasingly large arrays of bolometers, ranging from the 48-element arrays deployed on the KAO in the late 1980s to the SHARC and SCUBA arrays in the 1990s. The past years have seen the deployment of a new generation of larger detector arrays in SHARC II (384 channels) and Bolocam (144 channels). These detectors are in operation and are beginning to make significant impacts on the field. Arrays of sensitive submillimeter bolometers on the SPIRE instrument on Herschel will allow the first large-area surveys of the sky, providing important insight into the evolution of galaxies. The next generation of detectors, led by SCUBA II, will increase the focal-plane scale of these instruments by an order of magnitude. Two major missions are being planned by NASA for which further development of long-wavelength detectors is essential. The SAFIR mission, a 10-m class telescope with large arrays of background-limited detectors, will extend our reach into the epoch of initial galaxy formation. A major goal of modern cosmology is to test the inflationary paradigm in the early evolution of the universe. To this end, a mission is planned to detect the imprint of inflation on the CMB by precision measurement of its polarization. This work requires very large arrays of sensitive detectors that can provide unprecedented control of a wide range of systematic errors, given the small amplitude of the signal of interest. We will describe the current state of large-format detector arrays, the performance requirements set by the new missions, and the different approaches being developed in the community to meet these requirements. We are confident that within a decade these developments will lead to dramatic advances in our understanding of the evolution of the universe.

  16. Inverse current source density method in two dimensions: inferring neural activation from multielectrode recordings.

    PubMed

    Łęski, Szymon; Pettersen, Klas H; Tunstall, Beth; Einevoll, Gaute T; Gigg, John; Wójcik, Daniel K

    2011-12-01

    The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multi-electrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets.
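
    The essence of the inverse approach is a forward matrix F mapping a CSD distribution parameterized on the grid to potentials at the electrodes, after which the estimate is simply F⁻¹φ. The sketch below uses a crude point-source forward model and an assumed conductivity; the paper's iCSD 2D employs more careful source models and boundary handling.

    ```python
    import numpy as np

    # Toy inverse-CSD round trip on an 8 x 8 electrode grid: build a
    # forward matrix F (potential at electrode i per unit source at grid
    # node j), simulate potentials from a Gaussian source blob, then
    # invert. The point-source model and conductivity are simplifying
    # assumptions; iCSD 2D itself uses more careful source models.

    n, pitch, sigma = 8, 0.2e-3, 0.3   # grid size, spacing (m), conductivity (S/m)
    xx, yy = np.meshgrid(np.arange(n), np.arange(n))
    pos = np.column_stack([xx.ravel(), yy.ravel()]) * pitch

    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=2)
    F = 1.0 / (4*np.pi*sigma*(d + 0.5*pitch))   # offset keeps self-terms finite

    true_csd = np.exp(-np.sum((pos - 0.7e-3)**2, axis=1) / (2*(0.3e-3)**2))
    phi = F @ true_csd                          # simulated electrode potentials
    csd_hat = np.linalg.solve(F, phi)           # inverse CSD estimate
    print(np.allclose(csd_hat, true_csd))       # True for this noise-free case
    ```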

  17. Deformations and Rotational Ground Motions Inferred from Downhole Vertical Array Observations

    NASA Astrophysics Data System (ADS)

    Graizer, V.

    2017-12-01

    Only a few direct, reliable measurements of the rotational component of strong earthquake ground motion have been obtained so far. In the meantime, high-quality data recorded at downhole vertical arrays during a number of earthquakes provide an opportunity to calculate deformations from the differences in ground motions recorded simultaneously at different depths. More than twenty high-resolution strong-motion downhole vertical arrays were installed in California with the primary goal of studying the site response of different geologic structures to strong motion. Deformation, or simple shear strain with rate γ, is the combination of pure shear strain with rate γ/2 and rotation with rate α = γ/2. Deformations and rotations were inferred from downhole array records of the Mw 6.0 Parkfield 2004, the Mw 7.2 Sierra El Mayor (Mexico) 2010, and the Mw 6.5 Ferndale area (Northern California) 2010 earthquakes, and from two smaller earthquakes in California. The highest rotation amplitude, 0.60E-03 rad, was observed at the Eureka array, corresponding to a ground velocity of 35 cm/s, and the highest rotation rate associated with the S-wave, 0.55E-02 rad/s, was observed at a close epicentral distance of 4.3 km from the ML 4.2 event in Southern California at the La Cienega array. The large-magnitude Sierra El Mayor earthquake produced long-duration rotational motions of up to 1.5E-04 rad and 2.05E-03 rad/s, associated with shear and surface waves, at the El Centro array at a closest fault distance of 33.4 km. Rotational motions of such levels, especially tilting, can have a significant effect on structures. High-dynamic-range, well-synchronized, and properly oriented instrumentation is necessary for reliable calculation of rotations from vertical array data. Data from the dense Treasure Island array near San Francisco demonstrate a consistent change in the shape of rotational motion with depth and material. In the frequency range of 1-15 Hz, the Fourier amplitude spectrum of vertical ground velocity is similar to the scaled tilt spectrum. Amplitudes of rotation at a site depend upon the size of the base and usually decrease with depth; they are also amplified by soft material. Earthquake data used in this study were downloaded from the Center for Engineering Strong Motion Data at http://www.strongmotioncenter.org/.
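
    The inference from a downhole pair is direct: the difference between horizontal velocities recorded at two depths approximates the shear-strain rate, γ̇ ≈ Δv/Δz, and the rotation (tilt) rate is α̇ = γ̇/2. A sketch on synthetic records, with assumed depths and amplitudes:

    ```python
    import numpy as np

    # Shear strain and rotation inferred from a downhole pair: horizontal
    # velocities at two depths give a shear-strain rate gamma_dot ~ dv/dz,
    # and a rotation (tilt) rate alpha_dot = gamma_dot / 2. Depths,
    # amplitudes, and the synthetic records are assumptions.

    fs = 200.0                      # samples per second
    t = np.arange(0.0, 10.0, 1/fs)
    z1, z2 = 0.0, 30.0              # surface sensor and 30 m downhole sensor

    v_surf = 0.35*np.sin(2*np.pi*1.5*t)          # 35 cm/s peak at the surface
    v_deep = 0.20*np.sin(2*np.pi*1.5*t - 0.6)    # smaller and delayed at depth

    gamma_dot = (v_deep - v_surf) / (z2 - z1)    # shear-strain rate, 1/s
    alpha_dot = gamma_dot / 2.0                  # rotation rate, rad/s
    print(f"peak rotation rate: {np.max(np.abs(alpha_dot)):.1e} rad/s")  # ~3.6e-03
    ```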

  18. Earthquake recording at the Stanford DAS Array with fibers in existing telecomm conduits

    NASA Astrophysics Data System (ADS)

    Biondi, B. C.; Martin, E. R.; Yuan, S.; Cole, S.; Karrenbach, M. H.

    2017-12-01

    The Stanford Distributed Acoustic Sensing Array (SDASA-1) has been continuously recording seismic data since September 2016 on 2.5 km of single-mode optical fiber in existing telecommunications conduits under Stanford's campus. The array is figure-eight shaped and roughly 600 m along its widest side, with a channel spacing of roughly 8 m. This array is easy to maintain and is nonintrusive, making it well suited to urban environments, but it sacrifices some cable-to-ground coupling compared to more traditional seismometers. We have been testing its utility for earthquake recording, active seismic, and ambient noise interferometry. This talk will focus on earthquake observations. We will show comparisons between the strain rates measured throughout the DAS array and the particle velocities measured at the nearby Jasper Ridge Seismic Station (JRSC). In some of these events, we will point out directionality features specific to DAS that can require slight modifications in data processing. We also compare the repeatability of DAS and JRSC recordings of blasts from a nearby quarry. Using existing earthquake databases, we have created a small catalog of DAS earthquake observations by pulling records of over 700 Northern California events spanning Sep. 2016 to Jul. 2017 from both the DAS data and JRSC. On these events we have tested common array methods for earthquake detection and location, including beamforming and STA/LTA analysis in time and frequency. We have analyzed these events to approximate thresholds on the distances and magnitudes that are clearly detectable by the DAS array. Further analysis should be done on detectability with methods tailored to small events (for example, template matching). In creating this catalog, we have developed open-source software, available for free download, that can manage large sets of continuous seismic data files (both existing files and files as they stream in). This software can both interface with existing earthquake networks and efficiently extract earthquake recordings from many continuous recordings saved on the user's machines.
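
    A minimal version of the STA/LTA detection mentioned above flags samples where the short-term mean amplitude jumps relative to the long-term background. Window lengths and the trigger threshold are assumed values, and centered (non-causal) windows are used for brevity:

    ```python
    import numpy as np

    # Minimal STA/LTA detector of the kind applied to DAS channels: flag
    # samples where the short-term mean amplitude jumps relative to the
    # long-term background. Windows are centered (non-causal) for brevity;
    # lengths and the threshold are assumed values.

    def sta_lta(trace, fs, sta_s=1.0, lta_s=30.0):
        """Ratio of short-term to long-term mean absolute amplitude."""
        env = np.abs(trace)
        sta = np.convolve(env, np.ones(int(sta_s*fs))/int(sta_s*fs), mode="same")
        lta = np.convolve(env, np.ones(int(lta_s*fs))/int(lta_s*fs), mode="same")
        return sta / np.maximum(lta, 1e-12)

    fs = 50.0
    t = np.arange(0.0, 120.0, 1/fs)
    trace = 0.01*np.random.randn(t.size)                      # background noise
    trace[3000:3100] += 0.2*np.sin(2*np.pi*5.0*t[3000:3100])  # 2 s "event" at t=60 s

    ratio = sta_lta(trace, fs)
    hits = np.where(ratio > 4.0)[0]
    print(f"first trigger at t = {hits[0]/fs:.1f} s" if hits.size else "no trigger")
    ```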

  19. Optical phased array configuration for an extremely large telescope.

    PubMed

    Meinel, Aden Baker; Meinel, Marjorie Pettit

    2004-01-20

    Extremely large telescopes are currently under consideration by several groups in several countries. Extrapolation of current technology up to 30 m indicates a cost of over $1 billion. Innovative concepts are being explored to find significant cost reductions. We explore the concept of an Optical Phased Array (OPA) telescope. Each element of the OPA is a separate Cassegrain telescope. Collimated beams from the array are sent via an associated set of delay lines to a central beam combiner. This array of small telescope elements offers the possibility of starting with a low-cost array of a few rings of elements, then adding structure and additional Cassegrain elements until the desired telescope diameter is attained. We address the salient features of such an extremely large telescope and its cost elements relative to more conventional options.

  20. The Very Large Array Data Processing Pipeline

    NASA Astrophysics Data System (ADS)

    Kent, Brian R.; Masters, Joseph S.; Chandler, Claire J.; Davis, Lindsey E.; Kern, Jeffrey S.; Ott, Juergen; Schinzel, Frank K.; Medlin, Drew; Muders, Dirk; Williams, Stewart; Geers, Vincent C.; Momjian, Emmanuel; Butler, Bryan J.; Nakazato, Takeshi; Sugimoto, Kanako

    2018-01-01

    We present the VLA Pipeline, software that is part of the larger pipeline processing framework used for the Karl G. Jansky Very Large Array (VLA) and the Atacama Large Millimeter/submillimeter Array (ALMA) for both interferometric and single-dish observations. Through a collection of base code jointly used by the VLA and ALMA, the pipeline builds a hierarchy of classes to execute individual atomic pipeline tasks within the Common Astronomy Software Applications (CASA) package. Each pipeline task contains heuristics designed by the team to actively decide the best processing path and execution parameters for calibration and imaging. The pipeline code is developed and written in Python and uses a "context" structure for tracking the heuristic decisions and processing results. The pipeline "weblog" acts as the user interface for verifying the quality assurance of each calibration and imaging stage. The majority of VLA scheduling blocks above 1 GHz are now processed with the standard continuum recipe of the pipeline and offer a calibrated measurement set as a basic data product to observatory users. In addition, the pipeline is used for processing data from the VLA Sky Survey (VLASS), a seven-year community-driven endeavor started in September 2017 to survey the entire sky down to a declination of -40 degrees at S-band (2-4 GHz). This 5500-hour next-generation large radio survey will explore the time and spectral domains, relying on pipeline processing to generate calibrated measurement sets, polarimetry, and imaging data products that are available to the astronomical community with no proprietary period. Here we present an overview of the pipeline design philosophy, heuristics, and the calibration and imaging results produced by the pipeline. Future development will include the testing of spectral-line recipes and low signal-to-noise heuristics, and serving as a testing platform for science-ready data products. The pipeline is developed as part of the CASA software package by an international consortium of scientists and software developers based at the National Radio Astronomy Observatory (NRAO), the European Southern Observatory (ESO), and the National Astronomical Observatory of Japan (NAOJ).
