Sample records for large scale integrated

  1. SSI/MSI/LSI/VLSI/ULSI.

    ERIC Educational Resources Information Center

    Alexander, George

    1984-01-01

    Discusses small-scale integrated (SSI), medium-scale integrated (MSI), large-scale integrated (LSI), very large-scale integrated (VLSI), and ultra large-scale integrated (ULSI) chips. The development and properties of these chips, uses of gallium arsenide, Josephson devices (two superconducting strips sandwiching a thin insulator), and future…

  2. Mems: Platform for Large-Scale Integrated Vacuum Electronic Circuits

    DTIC Science & Technology

    2017-03-20

    The objective of the LIVEC advanced study project was to develop a platform for large-scale integrated vacuum electronic circuits (LIVEC). Final Report: MEMS Platform for Large-Scale Integrated Vacuum Electronic Circuits, 1 Jul 2014 to 30 Jun 2015, reported 20 Mar 2017. Contract No. W911NF-14-C-0093; COR: Dr. James Harvey, U.S. ARO, RTP, NC 27709-2211; phone 702-696-2533. Distribution Unlimited.

  3. MAINTAINING DATA QUALITY IN THE PERFORMANCE OF A LARGE SCALE INTEGRATED MONITORING EFFORT

    EPA Science Inventory

    Macauley, John M. and Linda C. Harwell. In press. Maintaining Data Quality in the Performance of a Large Scale Integrated Monitoring Effort (Abstract). To be presented at EMAP Symposium 2004: Integrated Monitoring and Assessment for Effective Water Quality Management, 3-7 May 200...

  4. 76 FR 14688 - In the Matter of Certain Large Scale Integrated Circuit Semiconductor Chips and Products...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-17

    ... Integrated Circuit Semiconductor Chips and Products Containing the Same; Notice of a Commission Determination... certain large scale integrated circuit semiconductor chips and products containing same by reason of... existence of a domestic industry. The Commission's notice of investigation named several respondents...

  5. Very Large Scale Integration (VLSI).

    ERIC Educational Resources Information Center

    Yeaman, Andrew R. J.

    Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…

  6. 75 FR 24742 - In the Matter of Certain Large Scale Integrated Circuit Semiconductor Chips and Products...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    ... Integrated Circuit Semiconductor Chips and Products Containing Same; Notice of Investigation AGENCY: U.S... of certain large scale integrated circuit semiconductor chips and products containing same by reason... alleges that an industry in the United States exists as required by subsection (a)(2) of section 337. The...

  7. Topological Properties of Some Integrated Circuits for Very Large Scale Integration Chip Designs

    NASA Astrophysics Data System (ADS)

    Swanson, S.; Lanzerotti, M.; Vernizzi, G.; Kujawski, J.; Weatherwax, A.

    2015-03-01

    This talk presents topological properties of integrated circuits for Very Large Scale Integration chip designs. These circuits can be implemented in very large scale integrated circuits, such as those in high performance microprocessors. Prior work considered basic combinational logic functions and produced a mathematical framework based on algebraic topology for integrated circuits composed of logic gates. Prior work also produced an historically-equivalent interpretation of Mr. E. F. Rent's work for today's complex circuitry in modern high performance microprocessors, where a heuristic linear relationship was observed between the number of connections and number of logic gates. This talk will examine topological properties and connectivity of more complex functionally-equivalent integrated circuits. The views expressed in this article are those of the author and do not reflect the official policy or position of the United States Air Force, Department of Defense or the U.S. Government.

  8. Optical correlator using very-large-scale integrated circuit/ferroelectric-liquid-crystal electrically addressed spatial light modulators

    NASA Technical Reports Server (NTRS)

    Turner, Richard M.; Jared, David A.; Sharp, Gary D.; Johnson, Kristina M.

    1993-01-01

    The use of 2-kHz 64 x 64 very-large-scale integrated circuit/ferroelectric-liquid-crystal electrically addressed spatial light modulators as the input and filter planes of a VanderLugt-type optical correlator is discussed. Liquid-crystal layer thickness variations that are present in the devices are analyzed, and the effects on correlator performance are investigated through computer simulations. Experimental results from the very-large-scale-integrated/ferroelectric-liquid-crystal optical-correlator system are presented and are consistent with the level of performance predicted by the simulations.

  9. Studies of Sub-Synchronous Oscillations in Large-Scale Wind Farm Integrated System

    NASA Astrophysics Data System (ADS)

    Yue, Liu; Hang, Mend

    2018-01-01

    With the rapid development of large-scale wind farms and their grid-connected operation, series-compensated AC transmission is gradually becoming the main way to deliver wind power and to improve its availability and grid stability. However, the integration of wind farms changes the sub-synchronous oscillation (SSO) damping characteristics of the synchronous generator system. Addressing the SSO problems caused by the integration of large-scale wind farms, this paper focuses on doubly fed induction generator (DFIG) based wind farms and summarizes the SSO mechanisms in large-scale, series-compensated wind power integrated systems, which can be classified into three types: sub-synchronous control interaction (SSCI), sub-synchronous torsional interaction (SSTI), and sub-synchronous resonance (SSR). SSO modelling and analysis methods are then categorized and compared by their areas of applicability. Furthermore, the paper summarizes the suppression measures used in actual SSO projects for different control objectives. Finally, research prospects in this field are explored.

  10. Real-time adaptive ramp metering : phase I, MILOS proof of concept (multi-objective, integrated, large-scale, optimized system).

    DOT National Transportation Integrated Search

    2006-12-01

    Over the last several years, researchers at the University of Arizona's ATLAS Center have developed an adaptive ramp metering system referred to as MILOS (Multi-Objective, Integrated, Large-Scale, Optimized System). The goal of this project is ...

  11. Development and Applications of a Modular Parallel Process for Large Scale Fluid/Structures Problems

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    A modular process that can efficiently solve large scale multidisciplinary problems using massively parallel supercomputers is presented. The process integrates disciplines with diverse physical characteristics by retaining the efficiency of individual disciplines. Computational domain independence of individual disciplines is maintained using a meta-programming approach. The process integrates disciplines without affecting the combined performance. Results are demonstrated for large scale aerospace problems on several supercomputers. The scalability and portability of the approach are demonstrated on several parallel computers.

  12. Development and Applications of a Modular Parallel Process for Large Scale Fluid/Structures Problems

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Byun, Chansup; Kwak, Dochan (Technical Monitor)

    2001-01-01

    A modular process that can efficiently solve large scale multidisciplinary problems using massively parallel supercomputers is presented. The process integrates disciplines with diverse physical characteristics by retaining the efficiency of individual disciplines. Computational domain independence of individual disciplines is maintained using a meta-programming approach. The process integrates disciplines without affecting the combined performance. Results are demonstrated for large scale aerospace problems on several supercomputers. The scalability and portability of the approach are demonstrated on several parallel computers.

  13. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  14. Data integration in the era of omics: current and future challenges

    PubMed Central

    2014-01-01

    Integrating heterogeneous and large omics data constitutes not only a conceptual challenge but also a practical hurdle in the daily analysis of omics data. With the rise of novel omics technologies and through large-scale consortia projects, biological systems are being investigated at an unprecedented scale, generating heterogeneous and often large data sets. These data sets encourage researchers to develop novel data integration methodologies. In this introduction we review the definition of, and characterize current efforts on, data integration in the life sciences. We used a web survey to assess current research projects on data integration and to tap into the views, needs and challenges as currently perceived by parts of the research community. PMID:25032990

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trujillo, Angelina Michelle

    Strategy, Planning, Acquiring: very large-scale computing platforms come and go, and planning for immensely scalable machines often precedes actual procurement by three years; procurement can take another year or more. Integration: after acquisition, machines must be integrated into the computing environments at LANL and connected to scalable storage via large-scale storage networking, assuring correct and secure operations. Management and Utilization: ongoing operations, maintenance, and troubleshooting of the hardware and systems software at massive scale are required.

  16. The Emergence of Dominant Design(s) in Large Scale Cyber-Infrastructure Systems

    ERIC Educational Resources Information Center

    Diamanti, Eirini Ilana

    2012-01-01

    Cyber-infrastructure systems are integrated large-scale IT systems designed with the goal of transforming scientific practice by enabling multi-disciplinary, cross-institutional collaboration. Their large scale and socio-technical complexity make design decisions for their underlying architecture practically irreversible. Drawing on three…

  17. Dissociable effects of local inhibitory and excitatory theta-burst stimulation on large-scale brain dynamics

    PubMed Central

    Sale, Martin V.; Lord, Anton; Zalesky, Andrew; Breakspear, Michael; Mattingley, Jason B.

    2015-01-01

    Normal brain function depends on a dynamic balance between local specialization and large-scale integration. It remains unclear, however, how local changes in functionally specialized areas can influence integrated activity across larger brain networks. By combining transcranial magnetic stimulation with resting-state functional magnetic resonance imaging, we tested for changes in large-scale integration following the application of excitatory or inhibitory stimulation on the human motor cortex. After local inhibitory stimulation, regions encompassing the sensorimotor module concurrently increased their internal integration and decreased their communication with other modules of the brain. There were no such changes in modular dynamics following excitatory stimulation of the same area of motor cortex nor were there changes in the configuration and interactions between core brain hubs after excitatory or inhibitory stimulation of the same area. These results suggest the existence of selective mechanisms that integrate local changes in neural activity, while preserving ongoing communication between brain hubs. PMID:25717162

  18. Effect of Integrating Hydrologic Scaling Concepts on Students Learning and Decision Making Experiences

    ERIC Educational Resources Information Center

    Najm, Majdi R. Abou; Mohtar, Rabi H.; Cherkauer, Keith A.; French, Brian F.

    2010-01-01

    Proper understanding of scaling and large-scale hydrologic processes is often not explicitly incorporated in the teaching curriculum. This makes it difficult for students to connect the effect of small scale processes and properties (like soil texture and structure, aggregation, shrinkage, and cracking) on large scale hydrologic responses (like…

  19. Plans for Embedding ICTs into Teaching and Learning through a Large-Scale Secondary Education Reform in the Country of Georgia

    ERIC Educational Resources Information Center

    Richardson, Jayson W.; Sales, Gregory; Sentocnik, Sonja

    2015-01-01

    Integrating ICTs into international development projects is common. However, focusing on how ICTs support leading, teaching, and learning is often overlooked. This article describes a team's approach to technology integration into the design of a large-scale, five year, teacher and leader professional development project in the country of Georgia.…

  20. Microfluidic large-scale integration: the evolution of design rules for biological automation.

    PubMed

    Melin, Jessica; Quake, Stephen R

    2007-01-01

    Microfluidic large-scale integration (mLSI) refers to the development of microfluidic chips with thousands of integrated micromechanical valves and control components. This technology is utilized in many areas of biology and chemistry and is a candidate to replace today's conventional automation paradigm, which consists of fluid-handling robots. We review the basic development of mLSI and then discuss design principles of mLSI to assess the capabilities and limitations of the current state of the art and to facilitate the application of mLSI to areas of biology. Many design and practical issues, including economies of scale, parallelization strategies, multiplexing, and multistep biochemical processing, are discussed. Several microfluidic components used as building blocks to create effective, complex, and highly integrated microfluidic networks are also highlighted.

  1. Using Hybrid Techniques for Generating Watershed-scale Flood Models in an Integrated Modeling Framework

    NASA Astrophysics Data System (ADS)

    Saksena, S.; Merwade, V.; Singhofen, P.

    2017-12-01

    There is an increasing global trend towards developing large scale flood models that account for spatial heterogeneity at watershed scales to drive the future flood risk planning. Integrated surface water-groundwater modeling procedures can elucidate all the hydrologic processes taking part during a flood event to provide accurate flood outputs. Even though the advantages of using integrated modeling are widely acknowledged, the complexity of integrated process representation, computation time and number of input parameters required have deterred its application to flood inundation mapping, especially for large watersheds. This study presents a faster approach for creating watershed scale flood models using a hybrid design that breaks down the watershed into multiple regions of variable spatial resolution by prioritizing higher order streams. The methodology involves creating a hybrid model for the Upper Wabash River Basin in Indiana using Interconnected Channel and Pond Routing (ICPR) and comparing the performance with a fully-integrated 2D hydrodynamic model. The hybrid approach involves simplification procedures such as 1D channel-2D floodplain coupling; hydrologic basin (HUC-12) integration with 2D groundwater for rainfall-runoff routing; and varying spatial resolution of 2D overland flow based on stream order. The results for a 50-year return period storm event show that hybrid model (NSE=0.87) performance is similar to the 2D integrated model (NSE=0.88) but the computational time is reduced to half. The results suggest that significant computational efficiency can be obtained while maintaining model accuracy for large-scale flood models by using hybrid approaches for model creation.
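
The NSE values quoted above are Nash-Sutcliffe efficiencies. As a minimal sketch of how the metric is computed (the hydrograph values below are invented for illustration, not taken from the study):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.
    NSE = 1 means a perfect fit; NSE <= 0 means the model predicts no
    better than the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

# Hypothetical observed vs. simulated discharge (m^3/s), illustration only
obs = [10.0, 35.0, 80.0, 60.0, 30.0, 15.0]
sim = [12.0, 30.0, 75.0, 65.0, 28.0, 14.0]
print(round(nash_sutcliffe(obs, sim), 3))
```

On this scale, the hybrid model's 0.87 versus the fully integrated model's 0.88 is a negligible accuracy loss for a halving of computation time.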

  2. Report on phase 1 of the Microprocessor Seminar. [and associated large scale integration

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Proceedings of a seminar on microprocessors and associated large scale integrated (LSI) circuits are presented. The potential for commonality of device requirements, candidate processes and mechanisms for qualifying candidate LSI technologies for high reliability applications, and specifications for testing and testability were among the topics discussed. Various programs and tentative plans of the participating organizations in the development of high reliability LSI circuits are given.

  3. Do large-scale assessments measure students' ability to integrate scientific knowledge?

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Sun

    2010-03-01

    Large-scale assessments are used as means to diagnose the current status of student achievement in science and to compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously scored open-ended items are used pervasively in large-scale assessments such as the Trends in International Mathematics and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. This study collected responses of 8400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously scored open-ended items can be used to determine whether students have normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric is redesigned to capture subtle nuances of students' open-ended responses do open-ended items become a valid and reliable tool for assessing students' knowledge integration ability.
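
The Rasch Partial Credit Model named in the abstract assigns each response category a probability driven by cumulative step difficulties. A minimal sketch follows; the ability and step-difficulty values are hypothetical, chosen only to illustrate the formula, and are not taken from the study:

```python
import math

def pcm_probabilities(theta, deltas):
    """Category probabilities under the Rasch Partial Credit Model.

    theta: person ability; deltas: step difficulties delta_1..delta_m.
    Returns P(X = k) for k = 0..m, where the numerator for category k
    is exp(sum_{j<=k} (theta - delta_j)) and category 0 has numerator 1.
    """
    numerators = [1.0]
    cumulative = 0.0
    for delta in deltas:
        cumulative += theta - delta
        numerators.append(math.exp(cumulative))
    total = sum(numerators)
    return [n / total for n in numerators]

# Hypothetical 3-category item (two steps) for illustration
probs = pcm_probabilities(theta=0.5, deltas=[-0.5, 1.0])
print([round(p, 3) for p in probs])
```

The probabilities sum to one by construction; an easy first step (delta below theta) pulls mass into the middle category.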

  4. Networks and landscapes: a framework for setting goals and evaluating performance at the large landscape scale

    Treesearch

    R Patrick Bixler; Shawn Johnson; Kirk Emerson; Tina Nabatchi; Melly Reuling; Charles Curtin; Michele Romolini; Morgan Grove

    2016-01-01

    The objective of large landscape conservation is to mitigate complex ecological problems through interventions at multiple and overlapping scales. Implementation requires coordination among a diverse network of individuals and organizations to integrate local-scale conservation activities with broad-scale goals. This requires an understanding of the governance options...

  5. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    NASA Astrophysics Data System (ADS)

    Dednam, W.; Botha, A. E.

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship on the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, it is often the case that such simulations would require excessively large simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique called finite size scaling, may provide a better method to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solvents, solvated by aqueous co-solvent solutions. In particular we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell. In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to those of the more costly radial distribution function method.
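
The "traditional" route mentioned in this abstract computes a Kirkwood-Buff integral as a running integral over the radial distribution function, G = 4π ∫ (g(r) − 1) r² dr. A minimal numerical sketch, using a model g(r) invented for illustration (not taken from the paper) so the result can be checked against the analytic value:

```python
import math

def kirkwood_buff_integral(g, r_max=20.0, n=4000):
    """Traditional running-integral estimate of a Kirkwood-Buff integral:
    G = 4*pi * integral_0^r_max (g(r) - 1) * r^2 dr, via the trapezoidal rule.
    In practice g(r) comes from a large NVT/NpT simulation; here it is
    an analytic model radial distribution function."""
    dr = r_max / n
    total = 0.0
    for i in range(n + 1):
        r = i * dr
        weight = 0.5 if i in (0, n) else 1.0  # trapezoid endpoint weights
        total += weight * (g(r) - 1.0) * r * r
    return 4.0 * math.pi * total * dr

def g_model(r):
    # Model RDF with an exponentially decaying excess correlation;
    # analytically, G = 4*pi * 0.5 * integral e^{-r} r^2 dr = 4*pi.
    return 1.0 + 0.5 * math.exp(-r)

print(round(kirkwood_buff_integral(g_model), 3))  # close to 4*pi ~ 12.566
```

The finite-size-scaling alternative described in the abstract would instead estimate G from particle-number fluctuations in open sub-volumes, avoiding the long-range tail of g(r) entirely.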

  6. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale, computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  7. In-situ device integration of large-area patterned organic nanowire arrays for high-performance optical sensors

    PubMed Central

    Wu, Yiming; Zhang, Xiujuan; Pan, Huanhuan; Deng, Wei; Zhang, Xiaohong; Zhang, Xiwei; Jie, Jiansheng

    2013-01-01

    Single-crystalline organic nanowires (NWs) are important building blocks for future low-cost and efficient nano-optoelectronic devices due to their extraordinary properties. However, it remains a critical challenge to achieve large-scale organic NW array assembly and device integration. Herein, we demonstrate a feasible one-step method for large-area patterned growth of cross-aligned single-crystalline organic NW arrays and their in-situ device integration for optical image sensors. The integrated image sensor circuitry contained a 10 × 10 pixel array in an area of 1.3 × 1.3 mm², showing high spatial resolution, excellent stability and reproducibility. More importantly, 100% of the pixels successfully operated at a high response speed and relatively small pixel-to-pixel variation. The high yield and high spatial resolution of the operational pixels, along with the high integration level of the device, clearly demonstrate the great potential of the one-step organic NW array growth and device construction approach for large-scale optoelectronic device integration. PMID:24287887

  8. Robopedia: Leveraging Sensorpedia for Web-Enabled Robot Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Resseguie, David R

    There is a growing interest in building Internet-scale sensor networks that integrate sensors from around the world into a single unified system. In contrast, robotics application development has primarily focused on building specialized systems. These specialized systems take scalability and reliability into consideration but generally neglect the key components required to build a large-scale system. Integrating robotic applications with Internet-scale sensor networks will unify specialized robotics applications and provide answers to large-scale implementation concerns. We focus on utilizing Internet-scale sensor network technology to construct a framework for unifying robotic systems. Our framework web-enables a surveillance robot's sensor observations and provides a web interface to the robot's actuators. This lets robots integrate seamlessly into web applications. In addition, the framework eliminates most prerequisite robotics knowledge, allowing for the creation of general web-based robotics applications, and provides mechanisms to create applications that can interface with any robot. Frameworks such as this one are key to solving large-scale mobile robotics implementation problems. We provide an overview of previous Internet-scale sensor networks; Sensorpedia (an ad-hoc Internet-scale sensor network); our framework for integrating robots with Sensorpedia; two applications which illustrate our framework's ability to support general web-based robotic control; and experimental results that illustrate our framework's scalability, feasibility, and resource requirements.

  9. St. Louis Initiative for Integrated Care Excellence (SLI(2)CE): integrated-collaborative care on a large scale model.

    PubMed

    Brawer, Peter A; Martielli, Richard; Pye, Patrice L; Manwaring, Jamie; Tierney, Anna

    2010-06-01

    The primary care health setting is in crisis. Increasing demand for services, with dwindling numbers of providers, has resulted in decreased access and decreased satisfaction for both patients and providers. Moreover, the overwhelming majority of primary care visits are for behavioral and mental health concerns rather than issues of a purely medical etiology. Integrated-collaborative models of health care delivery offer possible solutions to this crisis. The purpose of this article is to review the existing data available after 2 years of the St. Louis Initiative for Integrated Care Excellence; an example of integrated-collaborative care on a large scale model within a regional Veterans Affairs Health Care System. There is clear evidence that the SLI(2)CE initiative rather dramatically increased access to health care, and modified primary care practitioners' willingness to address mental health issues within the primary care setting. In addition, data suggests strong fidelity to a model of integrated-collaborative care which has been successful in the past. Integrated-collaborative care offers unique advantages to the traditional view and practice of medical care. Through careful implementation and practice, success is possible on a large scale model. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  10. Adaptive Fault-Tolerant Control of Uncertain Nonlinear Large-Scale Systems With Unknown Dead Zone.

    PubMed

    Chen, Mou; Tao, Gang

    2016-08-01

    In this paper, an adaptive neural fault-tolerant control scheme is proposed and analyzed for a class of uncertain nonlinear large-scale systems with unknown dead zone and external disturbances. To tackle the unknown nonlinear interaction functions in the large-scale system, the radial basis function neural network (RBFNN) is employed to approximate them. To further handle the unknown approximation errors and the effects of the unknown dead zone and external disturbances, integrated as the compounded disturbances, the corresponding disturbance observers are developed for their estimations. Based on the outputs of the RBFNN and the disturbance observer, the adaptive neural fault-tolerant control scheme is designed for uncertain nonlinear large-scale systems by using a decentralized backstepping technique. The closed-loop stability of the adaptive control system is rigorously proved via Lyapunov analysis and the satisfactory tracking performance is achieved under the integrated effects of unknown dead zone, actuator fault, and unknown external disturbances. Simulation results of a mass-spring-damper system are given to illustrate the effectiveness of the proposed adaptive neural fault-tolerant control scheme for uncertain nonlinear large-scale systems.
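
The RBFNN approximation step described in this abstract can be sketched in miniature: Gaussian basis functions with trained output weights approximate an unknown nonlinearity. The sketch below uses a plain offline least-mean-squares fit of a hypothetical smooth function, not the paper's Lyapunov-based online adaptation law, and all parameter values are illustrative assumptions:

```python
import math

def rbf_features(x, centers, width=1.0):
    """Gaussian radial basis functions phi_i(x) = exp(-((x - c_i)/width)^2)."""
    return [math.exp(-((x - c) / width) ** 2) for c in centers]

def fit_rbf_weights(xs, ys, centers, width=1.0, lr=0.1, epochs=2000):
    """Gradient-descent fit of output weights w so that w . phi(x)
    approximates the unknown function sampled by (xs, ys)."""
    w = [0.0] * len(centers)
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            phi = rbf_features(x, centers, width)
            err = sum(wi * p for wi, p in zip(w, phi)) - y
            w = [wi - lr * err * p for wi, p in zip(w, phi)]
    return w

# Approximate a hypothetical nonlinearity f(x) = sin(x) on [-3, 3]
centers = [-3, -2, -1, 0, 1, 2, 3]
xs = [i * 0.25 - 3.0 for i in range(25)]
ys = [math.sin(x) for x in xs]
w = fit_rbf_weights(xs, ys, centers)
approx = sum(wi * p for wi, p in zip(w, rbf_features(1.0, centers)))
print(round(approx, 2), round(math.sin(1.0), 2))
```

In the adaptive-control setting, the same weighted-basis structure is updated online, and a disturbance observer (as in the abstract) absorbs the residual approximation error.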

  11. Musical expertise is related to altered functional connectivity during audiovisual integration

    PubMed Central

    Paraskevopoulos, Evangelos; Kraneburg, Anja; Herholz, Sibylle Cornelia; Bamidis, Panagiotis D.; Pantev, Christo

    2015-01-01

    The present study investigated the cortical large-scale functional network underpinning audiovisual integration via magnetoencephalographic recordings. The reorganization of this network related to long-term musical training was investigated by comparing musicians to nonmusicians. Connectivity was calculated on the basis of the estimated mutual information of the sources’ activity, and the corresponding networks were statistically compared. Nonmusicians’ results indicated that the cortical network associated with audiovisual integration supports visuospatial processing and attentional shifting, whereas a sparser network, related to spatial awareness supports the identification of audiovisual incongruences. In contrast, musicians’ results showed enhanced connectivity in regions related to the identification of auditory pattern violations. Hence, nonmusicians rely on the processing of visual clues for the integration of audiovisual information, whereas musicians rely mostly on the corresponding auditory information. The large-scale cortical network underpinning multisensory integration is reorganized due to expertise in a cognitive domain that largely involves audiovisual integration, indicating long-term training-related neuroplasticity. PMID:26371305

  12. Reducing the two-loop large-scale structure power spectrum to low-dimensional, radial integrals

    DOE PAGES

    Schmittfull, Marcel; Vlah, Zvonimir

    2016-11-28

    Modeling the large-scale structure of the universe on nonlinear scales has the potential to substantially increase the science return of upcoming surveys by increasing the number of modes available for model comparisons. One way to achieve this is to model nonlinear scales perturbatively. Unfortunately, this involves high-dimensional loop integrals that are cumbersome to evaluate. Here, trying to simplify this, we show how two-loop (next-to-next-to-leading order) corrections to the density power spectrum can be reduced to low-dimensional, radial integrals. Many of those can be evaluated with a one-dimensional fast Fourier transform, which is significantly faster than the five-dimensional Monte-Carlo integrals that are needed otherwise. The general idea of this fast Fourier transform perturbation theory method is to switch between Fourier and position space to avoid convolutions and integrate over orientations, leaving only radial integrals. This reformulation is independent of the underlying shape of the initial linear density power spectrum and should easily accommodate features such as those from baryonic acoustic oscillations. We also discuss how to account for halo bias and redshift space distortions.
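    The computational gain comes from replacing convolution integrals with pointwise products in the conjugate space. A one-dimensional toy analogue (not the paper's actual two-loop kernels) illustrates the switch:

```python
import numpy as np

# 1D analogue of a convolution-type loop integral,
#   (P*P)(k) = ∫ P(q) P(k-q) dq,
# evaluated by FFT, pointwise multiplication, and an inverse FFT,
# instead of one quadrature per output k.
n, dk = 256, 0.05
k = np.arange(n) * dk
P = k * np.exp(-k)          # toy linear power spectrum shape

# Direct quadrature: O(n^2)
direct = np.array([np.sum(P[:i + 1] * P[i::-1]) * dk for i in range(n)])

# FFT route: O(n log n); zero-padding avoids circular wrap-around
Pz = np.concatenate([P, np.zeros(n)])
fft_based = np.fft.irfft(np.fft.rfft(Pz) ** 2, 2 * n)[:n] * dk

print(np.allclose(direct, fft_based, atol=1e-10))  # True
```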

  13. An innovative large scale integration of silicon nanowire-based field effect transistors

    NASA Astrophysics Data System (ADS)

    Legallais, M.; Nguyen, T. T. T.; Mouis, M.; Salem, B.; Robin, E.; Chenevier, P.; Ternon, C.

    2018-05-01

    Since the early 2000s, silicon nanowire field effect transistors are emerging as ultrasensitive biosensors while offering label-free, portable and rapid detection. Nevertheless, their large scale production remains an ongoing challenge due to time consuming, complex and costly technology. In order to bypass these issues, we report here on the first integration of silicon nanowire networks, called nanonet, into long channel field effect transistors using standard microelectronic process. A special attention is paid to the silicidation of the contacts which involved a large number of SiNWs. The electrical characteristics of these FETs constituted by randomly oriented silicon nanowires are also studied. Compatible integration on the back-end of CMOS readout and promising electrical performances open new opportunities for sensing applications.

  14. Evolving from bioinformatics in-the-small to bioinformatics in-the-large.

    PubMed

    Parker, D Stott; Gorlick, Michael M; Lee, Christopher J

    2003-01-01

    We argue the significance of a fundamental shift in bioinformatics, from in-the-small to in-the-large. Adopting a large-scale perspective is a way to manage the problems endemic to the world of the small: constellations of incompatible tools for which the effort required to assemble an integrated system exceeds the perceived benefit of the integration. Where bioinformatics in-the-small is about data and tools, bioinformatics in-the-large is about metadata and dependencies. Dependencies represent the complexities of large-scale integration, including the requirements and assumptions governing the composition of tools. The popular make utility is a very effective system for defining and maintaining simple dependencies, and it offers a number of insights about the essence of bioinformatics in-the-large. Keeping an in-the-large perspective has been very useful to us in large bioinformatics projects. We give two fairly different examples, and extract lessons from them showing how it has helped. These examples both suggest the benefit of explicitly defining and managing knowledge flows and knowledge maps (which represent metadata regarding types, flows, and dependencies), and also suggest approaches for developing bioinformatics database systems. Generally, we argue that large-scale engineering principles can be successfully adapted from disciplines such as software engineering and data management, and that having an in-the-large perspective will be a key advantage in the next phase of bioinformatics development.
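    The essence of make that the authors point to (rebuild a target only when a prerequisite is newer, and let rebuilds propagate through the dependency graph) can be sketched in a few lines; the pipeline file names and timestamps below are hypothetical illustrations.

```python
# Minimal make-style rebuild logic over a toy bioinformatics pipeline.
deps = {
    "alignment.bam": ["reads.fq", "genome.fa"],
    "variants.vcf": ["alignment.bam"],
    "report.html": ["variants.vcf"],
}
mtime = {"reads.fq": 3, "genome.fa": 1, "alignment.bam": 2,
         "variants.vcf": 4, "report.html": 5}   # fake modification times
clock = 10                                       # monotonically increasing "now"

def build(target, rebuilt):
    global clock
    prereqs = deps.get(target, [])
    for d in prereqs:          # depth-first: bring prerequisites up to date first
        build(d, rebuilt)
    if prereqs and any(mtime[d] > mtime[target] for d in prereqs):
        mtime[target] = clock  # "rebuild": stamp with a fresh time
        clock += 1
        rebuilt.append(target)

order = []
build("report.html", order)
print(order)  # reads.fq is newer than alignment.bam, so all three rebuild in order
```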

  15. Miniaturized integration of a fluorescence microscope

    PubMed Central

    Ghosh, Kunal K.; Burns, Laurie D.; Cocker, Eric D.; Nimmerjahn, Axel; Ziv, Yaniv; Gamal, Abbas El; Schnitzer, Mark J.

    2013-01-01

    The light microscope is traditionally an instrument of substantial size and expense. Its miniaturized integration would enable many new applications based on mass-producible, tiny microscopes. Key prospective usages include brain imaging in behaving animals towards relating cellular dynamics to animal behavior. Here we introduce a miniature (1.9 g) integrated fluorescence microscope made from mass-producible parts, including semiconductor light source and sensor. This device enables high-speed cellular-level imaging across ∼0.5 mm2 areas in active mice. This capability allowed concurrent tracking of Ca2+ spiking in >200 Purkinje neurons across nine cerebellar microzones. During mouse locomotion, individual microzones exhibited large-scale, synchronized Ca2+ spiking. This is a mesoscopic neural dynamic missed by prior techniques for studying the brain at other length scales. Overall, the integrated microscope is a potentially transformative technology that permits distribution to many animals and enables diverse usages, such as portable diagnostics or microscope arrays for large-scale screens. PMID:21909102

  16. Miniaturized integration of a fluorescence microscope.

    PubMed

    Ghosh, Kunal K; Burns, Laurie D; Cocker, Eric D; Nimmerjahn, Axel; Ziv, Yaniv; Gamal, Abbas El; Schnitzer, Mark J

    2011-09-11

    The light microscope is traditionally an instrument of substantial size and expense. Its miniaturized integration would enable many new applications based on mass-producible, tiny microscopes. Key prospective usages include brain imaging in behaving animals for relating cellular dynamics to animal behavior. Here we introduce a miniature (1.9 g) integrated fluorescence microscope made from mass-producible parts, including a semiconductor light source and sensor. This device enables high-speed cellular imaging across ∼0.5 mm2 areas in active mice. This capability allowed concurrent tracking of Ca2+ spiking in >200 Purkinje neurons across nine cerebellar microzones. During mouse locomotion, individual microzones exhibited large-scale, synchronized Ca2+ spiking. This is a mesoscopic neural dynamic missed by prior techniques for studying the brain at other length scales. Overall, the integrated microscope is a potentially transformative technology that permits distribution to many animals and enables diverse usages, such as portable diagnostics or microscope arrays for large-scale screens.

  17. On the performance of exponential integrators for problems in magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Einkemmer, Lukas; Tokman, Mayya; Loffeld, John

    2017-02-01

    Exponential integrators have been introduced as an efficient alternative to explicit and implicit methods for integrating large stiff systems of differential equations. Over the past decades these methods have been studied theoretically and their performance was evaluated using a range of test problems. While the results of these investigations showed that exponential integrators can provide significant computational savings, the research on validating this hypothesis for large scale systems and understanding what classes of problems can particularly benefit from the use of the new techniques is in its initial stages. Resistive magnetohydrodynamic (MHD) modeling is widely used in studying large scale behavior of laboratory and astrophysical plasmas. In many problems numerical solution of MHD equations is a challenging task due to the temporal stiffness of this system in the parameter regimes of interest. In this paper we evaluate the performance of exponential integrators on large MHD problems and compare them to a state-of-the-art implicit time integrator. Both the variable and constant time step exponential methods of EPIRK-type are used to simulate magnetic reconnection and the Kelvin-Helmholtz instability in plasma. Performance of these methods, which are part of the EPIC software package, is compared to the variable time step, variable order BDF scheme included in the CVODE library (part of SUNDIALS). We study performance of the methods on parallel architectures and with respect to magnitudes of important parameters such as Reynolds, Lundquist, and Prandtl numbers. We find that the exponential integrators provide superior or equal performance in most circumstances and conclude that further development of exponential methods for MHD problems is warranted and can lead to significant computational advantages for large scale stiff systems of differential equations such as MHD.
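    The advantage on stiff systems is easiest to see on a linear test problem, where one exponential step is exact while explicit Euler is stability-limited by the fastest mode. This is a toy sketch, not the paper's EPIRK/CVODE comparison:

```python
import numpy as np

# y' = A y with well-separated time scales; for diagonal A the matrix
# exponential propagator exp(h*A) is just the elementwise exponential.
A = np.diag([-1.0, -1000.0])                  # slow and fast (stiff) modes
y0 = np.array([1.0, 1.0])
h, T = 0.01, 1.0                              # h is far above Euler's 2/1000 limit

E = np.diag(np.exp(h * np.diag(A)))           # exponential-integrator propagator
y_exp, y_eul = y0.copy(), y0.copy()
for _ in range(int(T / h)):
    y_exp = E @ y_exp                         # exact for this linear problem
    y_eul = y_eul + h * (A @ y_eul)           # explicit Euler: blows up

exact = np.diag(np.exp(T * np.diag(A))) @ y0
print(np.allclose(y_exp, exact), np.abs(y_eul).max() > 1e10)  # True True
```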

  18. An integrated network of Arabidopsis growth regulators and its use for gene prioritization.

    PubMed

    Sabaghian, Ehsan; Drebert, Zuzanna; Inzé, Dirk; Saeys, Yvan

    2015-12-01

    Elucidating the molecular mechanisms that govern plant growth has been an important topic in plant research, and current advances in large-scale data generation call for computational tools that efficiently combine these different data sources to generate novel hypotheses. In this work, we present a novel, integrated network that combines multiple large-scale data sources to characterize growth regulatory genes in Arabidopsis, one of the main plant model organisms. The contributions of this work are twofold: first, we characterized a set of carefully selected growth regulators with respect to their connectivity patterns in the integrated network, and, subsequently, we explored to what extent these connectivity patterns can be used to suggest new growth regulators. Using a large-scale comparative study, we designed new supervised machine learning methods to prioritize growth regulators. Our results show that these methods significantly improve current state-of-the-art prioritization techniques, and are able to suggest meaningful new growth regulators. In addition, the integrated network is made available to the scientific community, providing a rich data source that will be useful for many biological processes, not necessarily restricted to plant growth.
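    A guilt-by-association sketch conveys the basic idea of network-based prioritization: candidates are ranked by their summed edge weight to known growth regulators. The gene names, edge weights, and scoring rule here are hypothetical illustrations; the paper uses supervised machine learning over richer connectivity features.

```python
import numpy as np

genes = ["g1", "g2", "g3", "g4", "g5"]
known = {"g1", "g2"}                      # curated growth regulators
W = np.array([[0.0, 0.9, 0.8, 0.1, 0.0],  # symmetric integrated-network weights
              [0.9, 0.0, 0.7, 0.0, 0.1],
              [0.8, 0.7, 0.0, 0.2, 0.0],
              [0.1, 0.0, 0.2, 0.0, 0.5],
              [0.0, 0.1, 0.0, 0.5, 0.0]])

known_idx = [genes.index(g) for g in known]
# Score each candidate by total connectivity to the known regulators
scores = {g: W[i, known_idx].sum() for i, g in enumerate(genes) if g not in known}
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked[0])  # g3: most strongly connected to the known regulators
```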

  19. Ten-channel InP-based large-scale photonic integrated transmitter fabricated by SAG technology

    NASA Astrophysics Data System (ADS)

    Zhang, Can; Zhu, Hongliang; Liang, Song; Cui, Xiao; Wang, Huitao; Zhao, Lingjuan; Wang, Wei

    2014-12-01

    A 10-channel InP-based large-scale photonic integrated transmitter was fabricated by selective area growth (SAG) technology combined with butt-joint regrowth (BJR) technology. The SAG technology was utilized to fabricate the electroabsorption modulated distributed feedback (DFB) laser (EML) arrays simultaneously. Coplanar electrodes for the electroabsorption modulators (EAMs) were designed for flip-chip bonding packaging. The lasing wavelength of each DFB laser can be tuned by an integrated micro-heater, requiring only one electrode pad, to match the ITU grid. The average output power of each channel is 250 μW at an injection current of 200 mA. The static extinction ratios of the EAMs for the 10 channels tested range from 15 to 27 dB at a reverse bias of 6 V, and the 3 dB bandwidths of the chip are around 14 GHz for each channel. The novel design and simple fabrication process show enormous potential for reducing the cost of large-scale photonic integrated circuit (LS-PIC) transmitters with high chip yields.

  20. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    PubMed

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales.

  1. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions

    PubMed Central

    Hahn, Beth A.; Dreitz, Victoria J.; George, T. Luke

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer’s sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer’s sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical design and analyses ensures reliable knowledge about bird populations that is relevant and integral to bird conservation at multiple scales. PMID:29065128

  2. Large-scale flow experiments for managing river systems

    USGS Publications Warehouse

    Konrad, Christopher P.; Olden, Julian D.; Lytle, David A.; Melis, Theodore S.; Schmidt, John C.; Bray, Erin N.; Freeman, Mary C.; Gido, Keith B.; Hemphill, Nina P.; Kennard, Mark J.; McMullen, Laura E.; Mims, Meryl C.; Pyron, Mark; Robinson, Christopher T.; Williams, John G.

    2011-01-01

    Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems.

  3. Integration and segregation of large-scale brain networks during short-term task automatization

    PubMed Central

    Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F.; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes

    2016-01-01

    The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes. PMID:27808095

  4. Operating Reserves and Wind Power Integration: An International Comparison; Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Milligan, M.; Donohoo, P.; Lew, D.

    2010-10-01

    This paper provides a high-level international comparison of methods and key results from both operating practice and integration analysis, based on an informal International Energy Agency Task 25: Large-scale Wind Integration.

  5. Technology and testing.

    PubMed

    Quellmalz, Edys S; Pellegrino, James W

    2009-01-02

    Large-scale testing of educational outcomes benefits already from technological applications that address logistics such as development, administration, and scoring of tests, as well as reporting of results. Innovative applications of technology also provide rich, authentic tasks that challenge the sorts of integrated knowledge, critical thinking, and problem solving seldom well addressed in paper-based tests. Such tasks can be used on both large-scale and classroom-based assessments. Balanced assessment systems can be developed that integrate curriculum-embedded, benchmark, and summative assessments across classroom, district, state, national, and international levels. We discuss here the potential of technology to launch a new era of integrated, learning-centered assessment systems.

  6. Concepts for on-board satellite image registration. Volume 3: Impact of VLSI/VHSIC on satellite on-board signal processing

    NASA Technical Reports Server (NTRS)

    Aanstoos, J. V.; Snyder, W. E.

    1981-01-01

    Anticipated major advances in integrated circuit technology in the near future are described, as well as their impact on satellite onboard signal processing systems. Dramatic improvements in chip density, speed, power consumption, and system reliability are expected from very large scale integration. These improvements will enable more intelligence to be placed on remote sensing platforms in space, meeting the goals of NASA's information adaptive system concept, a major component of the NASA End-to-End Data System program. A forecast of VLSI technological advances is presented, including a description of the Defense Department's very high speed integrated circuit program, a seven-year research and development effort.

  7. Integrated fringe projection 3D scanning system for large-scale metrology based on laser tracker

    NASA Astrophysics Data System (ADS)

    Du, Hui; Chen, Xiaobo; Zhou, Dan; Guo, Gen; Xi, Juntong

    2017-10-01

    Large-scale components exist widely in the advanced manufacturing industry, and 3D profilometry plays a pivotal role in their quality control. This paper proposes a flexible, robust large-scale 3D scanning system that integrates a robot with a binocular structured-light scanner and a laser tracker. The measurement principle and construction of the integrated system are introduced, and a mathematical model is established for global data fusion. Subsequently, a flexible and robust method is introduced for establishing the end coordinate system. Based on this method, a virtual robot body is constructed for hand-eye calibration, and the transformation matrix between the end coordinate system and the world coordinate system is solved. A validation experiment was conducted to verify the proposed algorithms. First, the hand-eye transformation matrix was solved; then a car body rear was measured 16 times to verify the global data fusion algorithm, and the 3D shape of the rear was reconstructed successfully.
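    Fusing scanner data into a global frame reduces to chaining homogeneous transforms: a point measured by the scanner is mapped through the hand-eye matrix and the tracked robot pose into world coordinates. All numeric values below are hypothetical illustrations.

```python
import numpy as np

def transform(R, t):
    # Build a 4x4 homogeneous transform from rotation R and translation t
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
T_world_end = transform(Rz, np.array([1.0, 0.0, 0.5]))            # robot pose from the laser tracker
T_end_scanner = transform(np.eye(3), np.array([0.0, 0.1, 0.0]))   # hand-eye calibration result

p_scanner = np.array([0.2, 0.0, 0.0, 1.0])   # homogeneous point measured by the scanner
p_world = T_world_end @ T_end_scanner @ p_scanner
print(p_world[:3])  # [0.9 0.2 0.5]
```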

  8. XLinkDB 2.0: integrated, large-scale structural analysis of protein crosslinking data

    PubMed Central

    Schweppe, Devin K.; Zheng, Chunxiang; Chavez, Juan D.; Navare, Arti T.; Wu, Xia; Eng, Jimmy K.; Bruce, James E.

    2016-01-01

    Motivation: Large-scale chemical cross-linking with mass spectrometry (XL-MS) analyses are quickly becoming a powerful means for high-throughput determination of protein structural information and protein–protein interactions. Recent studies have garnered thousands of cross-linked interactions, yet the field lacks an effective tool to compile experimental data or access the network and structural knowledge for these large-scale analyses. We present XLinkDB 2.0, which integrates tools for network analysis, Protein Data Bank queries, modeling of predicted protein structures and modeling of docked protein structures. The novel, integrated approach of XLinkDB 2.0 enables the holistic analysis of XL-MS protein interaction data without limitation to the cross-linker or analytical system used for the analysis. Availability and implementation: XLinkDB 2.0, including documentation and help, is available at http://xlinkdb.gs.washington.edu/. Contact: jimbruce@uw.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153666

  9. Small-scale monitoring - can it be integrated with large-scale programs?

    Treesearch

    C. M. Downes; J. Bart; B. T. Collins; B. Craig; B. Dale; E. H. Dunn; C. M. Francis; S. Woodley; P. Zorn

    2005-01-01

    There are dozens of programs and methodologies for monitoring and inventory of bird populations, differing in geographic scope, species focus, field methods and purpose. However, most of the emphasis has been placed on large-scale monitoring programs. People interested in assessing bird numbers and long-term trends in small geographic areas such as a local birding area...

  10. Mapping the integrated Sachs-Wolfe effect

    NASA Astrophysics Data System (ADS)

    Manzotti, A.; Dodelson, S.

    2014-12-01

    On large scales, the anisotropies in the cosmic microwave background (CMB) reflect not only the primordial density field but also the energy gained when photons traverse the decaying gravitational potentials of large-scale structure, the so-called integrated Sachs-Wolfe (ISW) effect. Decomposing the anisotropy signal into a primordial piece and an ISW component, the main secondary effect on large scales, is more urgent than ever as cosmologists strive to understand the Universe on those scales. We present a likelihood technique for extracting the ISW signal by combining measurements of the CMB, the distribution of galaxies, and maps of gravitational lensing. We test this technique with simulated data, showing that we can successfully reconstruct the ISW map using all the data sets together. We then present the ISW map obtained from a combination of real data: the NRAO VLA Sky Survey (NVSS) galaxy survey, and temperature anisotropy and lensing maps made by the Planck satellite. This map shows that, with the data sets used and assuming linear physics, there is no evidence from the reconstructed ISW signal in the Cold Spot region for an entirely ISW origin of this large-scale anomaly in the CMB. However, a large-scale structure origin from low-redshift voids outside the NVSS redshift range is still possible. Finally, we show that future surveys, thanks to better large-scale lensing reconstruction, will be able to improve the reconstruction signal-to-noise, which currently comes mainly from galaxy surveys.
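    At its core, such a likelihood combination weights each noisy tracer of the common signal by its inverse noise variance, which is the maximum-likelihood estimate under Gaussian noise. A toy sketch with synthetic maps follows; real tracers also differ in their response to the signal, which the full likelihood models.

```python
import numpy as np

rng = np.random.default_rng(1)
signal = rng.standard_normal(10000)                   # hidden common signal

sigma = np.array([1.0, 0.5, 2.0])                     # per-tracer noise levels
tracers = signal + sigma[:, None] * rng.standard_normal((3, 10000))

w = 1.0 / sigma**2                                    # inverse-variance weights
estimate = (w[:, None] * tracers).sum(axis=0) / w.sum()

best_single = tracers[np.argmin(sigma)]
err_comb = np.mean((estimate - signal) ** 2)
err_single = np.mean((best_single - signal) ** 2)
print(err_comb < err_single)  # True: combining tracers beats the best one alone
```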

  11. Assuring Quality in Large-Scale Online Course Development

    ERIC Educational Resources Information Center

    Parscal, Tina; Riemer, Deborah

    2010-01-01

    Student demand for online education requires colleges and universities to rapidly expand the number of courses and programs offered online while maintaining high quality. This paper outlines two universities respective processes to assure quality in large-scale online programs that integrate instructional design, eBook custom publishing, Quality…

  12. Taking Stock: Existing Resources for Assessing a New Vision of Science Learning

    ERIC Educational Resources Information Center

    Alonzo, Alicia C.; Ke, Li

    2016-01-01

    A new vision of science learning described in the "Next Generation Science Standards"--particularly the science and engineering practices and their integration with content--pose significant challenges for large-scale assessment. This article explores what might be learned from advances in large-scale science assessment and…

  13. Regional climate model sensitivity to domain size

    NASA Astrophysics Data System (ADS)

    Leduc, Martin; Laprise, René

    2009-05-01

    Regional climate models are increasingly used to add small-scale features that are not present in their lateral boundary conditions (LBC). It is well known that the limited area over which a model is integrated must be large enough to allow the full development of small-scale features. On the other hand, integrations on very large domains have shown important departures from the driving data, unless large-scale nudging is applied. The issue of domain size is studied here by using the “perfect model” approach. This method consists first of generating a high-resolution climate simulation, nicknamed the big brother (BB), over a large domain of integration. The next step is to degrade this dataset with a low-pass filter emulating the usual coarse-resolution LBC. The filtered nesting data (FBB) are then used to drive a set of four simulations (LBs, for little brothers) with the same model, but on progressively smaller domain sizes. The LB statistics for a climate sample of four winter months are compared with BB over a common region. The time-average (stationary) and transient-eddy standard deviation patterns of the LB atmospheric fields generally improve in terms of spatial correlation with the reference (BB) as the domain gets smaller. Extracting the small-scale features with a spectral filter reveals important underestimations of the transient-eddy variability in the vicinity of the inflow boundary, which can penalize the use of small domains (less than 100 × 100 grid points). The permanent “spatial spin-up” corresponds to the characteristic distance that the large-scale flow needs to travel before developing small-scale features. The spin-up distance tends to grow in size at higher levels in the atmosphere.
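    The degradation step of the big-brother protocol amounts to a spectral low-pass filter. A minimal sketch on a synthetic 2D field, with an assumed cutoff wavenumber, is:

```python
import numpy as np

# Low-pass filtering a 2D field in spectral space removes the small
# scales, emulating coarse-resolution driving data that the nested
# model must regenerate inside its domain.
rng = np.random.default_rng(2)
n = 64
field = rng.standard_normal((n, n))           # synthetic high-resolution field

F = np.fft.fft2(field)
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
mask = np.sqrt(kx**2 + ky**2) <= 0.1          # keep only the large scales
filtered = np.fft.ifft2(F * mask).real

# Most of the variance (small-scale energy) is removed by the filter
print(filtered.var() < 0.2 * field.var())  # True
```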

  14. Quantifying the Impacts of Large Scale Integration of Renewables in Indian Power Sector

    NASA Astrophysics Data System (ADS)

    Kumar, P.; Mishra, T.; Banerjee, R.

    2017-12-01

    India's power sector is responsible for nearly 37 percent of India's greenhouse gas emissions. For a fast-emerging economy like India, whose population and energy consumption are poised to rise rapidly in the coming decades, renewable energy can play a vital role in decarbonizing the power sector. In this context, India has targeted a 33-35 percent reduction in emission intensity (with respect to 2005 levels), along with large-scale renewable energy targets (100 GW solar, 60 GW wind, and 10 GW biomass energy by 2022), in the INDCs submitted under the Paris Agreement. But large-scale integration of renewable energy is a complex process that faces a number of problems, such as capital intensiveness, matching intermittent generation to load with limited storage capacity, and reliability. This study attempts to assess the technical feasibility of integrating renewables into the Indian electricity mix by 2022 and to analyze the implications for power sector operations. It uses TIMES, a bottom-up energy optimization model with unit commitment and dispatch features. We model coal- and gas-fired units discretely, with region-wise representation of wind and solar resources. The dispatch features are used for operational analysis of power plant units under ramp-rate and minimum-generation constraints. The study analyzes India's electricity sector transition for the year 2022 with three scenarios: a base case (no renewable additions), an INDC scenario (100 GW solar, 60 GW wind, 10 GW biomass), and a low-RE scenario (50 GW solar, 30 GW wind), created to analyze the implications of large-scale integration of variable renewable energy. The results provide insights into the trade-offs involved in achieving mitigation targets and the associated investment decisions. The study also examines the operational reliability and flexibility requirements of the system for integrating renewables.
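    The ramp-rate and minimum-generation constraints central to such dispatch analyses can be sketched for a single thermal unit over one time step; all numbers are hypothetical.

```python
# One dispatch step: a thermal unit follows net load (demand minus
# renewables) subject to ramp-rate and min/max generation limits.
def dispatch(prev, net_load, p_min, p_max, ramp):
    target = min(max(net_load, p_min), p_max)   # clamp to the unit's output range
    lo, hi = prev - ramp, prev + ramp           # what the unit can reach this step
    return min(max(target, lo), hi)

prev = 400.0       # MW currently generated
net_load = 150.0   # MW after a surge in solar output
out = dispatch(prev, net_load, p_min=200.0, p_max=500.0, ramp=50.0)
print(out)  # 350.0: the ramp limit binds before the 200 MW minimum does
```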

  15. Oak Ridge Bio-surveillance Toolkit (ORBiT): Integrating Big-Data Analytics with Visual Analysis for Public Health Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramanathan, Arvind; Pullum, Laura L; Steed, Chad A

    In this position paper, we describe the design and implementation of the Oak Ridge Bio-surveillance Toolkit (ORBiT): a collection of novel statistical and machine learning tools implemented for (1) integrating heterogeneous traditional (e.g. emergency room visits, prescription sales data, etc.) and non-traditional (social media such as Twitter and Instagram) data sources, (2) analyzing large-scale datasets and (3) presenting the results from the analytics as a visual interface for the end-user to interact and provide feedback. We present examples of how ORBiT can be used to summarize extremely large-scale datasets effectively and how user interactions can translate into the data analytics process for bio-surveillance. We also present a strategy to estimate parameters relevant to disease spread models from near real time data feeds and show how these estimates can be integrated with disease spread models for large-scale populations. We conclude with a perspective on how integrating data and visual analytics could lead to better forecasting and prediction of disease spread as well as improved awareness of disease susceptible regions.

  16. A Methodology for Integrated, Multiregional Life Cycle Assessment Scenarios under Large-Scale Technological Change.

    PubMed

    Gibon, Thomas; Wood, Richard; Arvesen, Anders; Bergesen, Joseph D; Suh, Sangwon; Hertwich, Edgar G

    2015-09-15

    Climate change mitigation demands large-scale technological change on a global level and, if successfully implemented, will significantly affect how products and services are produced and consumed. In order to anticipate the life cycle environmental impacts of products under climate mitigation scenarios, we present the modeling framework of an integrated hybrid life cycle assessment model covering nine world regions. Life cycle assessment databases and multiregional input-output tables are adapted using forecasted changes in technology and resources up to 2050 under a 2 °C scenario. We call the result of this modeling "technology hybridized environmental-economic model with integrated scenarios" (THEMIS). As a case study, we apply THEMIS in an integrated environmental assessment of concentrating solar power. Life-cycle greenhouse gas emissions for this plant range from 33 to 95 g CO2 eq./kWh across different world regions in 2010, falling to 30-87 g CO2 eq./kWh in 2050. Using regional life cycle data yields insightful results. More generally, these results also highlight the need for systematic life cycle frameworks that capture the actual consequences and feedback effects of large-scale policies in the long term.
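
    The core life cycle calculation behind such integrated models can be sketched as an environmentally extended Leontief system; the two-sector technology matrix, emission intensities, and demand vector below are invented, not THEMIS data.

```python
import numpy as np

# Leontief-style life cycle calculation: total emissions = f (I - A)^-1 y,
# where A is the technology (input-output) matrix, y the final demand,
# and f the direct emission intensities. All numbers are invented.

A = np.array([[0.1, 0.2],      # electricity and materials sectors
              [0.3, 0.1]])
f = np.array([0.9, 0.4])       # kg CO2-eq per unit output of each sector
y = np.array([1.0, 0.0])       # functional unit: 1 unit of electricity

x = np.linalg.solve(np.eye(2) - A, y)   # total sector outputs, direct + indirect
total = f @ x
print(f"life cycle emissions: {total:.3f} kg CO2-eq")
```

    A multiregional, scenario-based model like THEMIS extends this same structure to many regions and to technology matrices that change over time toward 2050.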

  17. Colloidal core-seeded semiconductor nanorods as fluorescent labels for in-vitro diagnostics (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Chan, YinThai

    2016-03-01

    Colloidal semiconductor nanocrystals are ideal fluorophores for clinical diagnostics, therapeutics, and highly sensitive biochip applications due to their high photostability, size-tunable color of emission and flexible surface chemistry. The relatively recent development of core-seeded semiconductor nanorods showed that the presence of a rod-like shell can confer even more advantageous physicochemical properties than their spherical counterparts, such as large multi-photon absorption cross-sections and facet-specific chemistry that can be exploited to deposit secondary nanoparticles. It may be envisaged that these highly fluorescent nanorods can be integrated with large scale integrated (LSI) microfluidic systems that allow miniaturization and integration of multiple biochemical processes in a single device at the nanoliter scale, resulting in a highly sensitive and automated detection platform. In this talk, I will describe a LSI microfluidic device that integrates RNA extraction, reverse transcription to cDNA, amplification and target pull-down to detect the histidine decarboxylase (HDC) gene directly from human white blood cell samples. When anisotropic colloidal semiconductor nanorods (NRs) were used as the fluorescent readout, the detection limit was found to be 0.4 ng of total RNA, which was much lower than that obtained using spherical quantum dots (QDs) or organic dyes. This was attributed to the large action cross-section of NRs and their high probability of target capture in a pull-down detection scheme. The combination of large scale integrated microfluidics with highly fluorescent semiconductor NRs may find widespread utility in point-of-care devices and multi-target diagnostics.

  18. Two-machine flow shop scheduling integrated with preventive maintenance planning

    NASA Astrophysics Data System (ADS)

    Wang, Shijin; Liu, Ming

    2016-02-01

    This paper investigates an integrated optimisation problem of production scheduling and preventive maintenance (PM) in a two-machine flow shop, with the time to failure of each machine subject to a Weibull probability distribution. The objective is to find the optimal job sequence and the optimal PM decisions before each job such that the expected makespan is minimised. To investigate the value of an integrated scheduling solution, computational experiments on small-scale problems with different configurations are conducted with a total enumeration method, and the results are compared with those of scheduling without maintenance but with machine degradation, and with individual job scheduling combined with independent PM planning. Then, for large-scale problems, four genetic algorithm (GA)-based heuristics are proposed. The numerical results for several large problem sizes and different configurations indicate the potential benefits of the integrated scheduling solution, and also show that the proposed GA-based heuristics are efficient for the integrated problem.
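
    The total-enumeration baseline can be sketched for the deterministic two-machine flow shop core, ignoring the Weibull failures and PM decisions that the paper layers on top; the job processing times below are invented.

```python
from itertools import permutations

# Deterministic two-machine flow shop makespan by total enumeration.
# Processing times are illustrative; the paper's model adds stochastic
# failures and PM decisions on top of this core recursion.

jobs = {"J1": (3, 6), "J2": (5, 2), "J3": (1, 2), "J4": (6, 6)}  # (machine 1, machine 2)

def makespan(seq):
    c1 = c2 = 0
    for j in seq:
        p1, p2 = jobs[j]
        c1 += p1                  # machine 1 processes jobs back to back
        c2 = max(c2, c1) + p2     # machine 2 waits for the job and for itself
    return c2

best = min(permutations(jobs), key=makespan)
print(best, makespan(best))
```

    For the deterministic two-machine case, Johnson's rule gives an optimal sequence directly (here J3, J1, J4, J2), which the enumeration confirms; it is the stochastic failures and PM decisions that force enumeration or GA heuristics in the paper.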

  19. Space transportation booster engine thrust chamber technology, large scale injector

    NASA Technical Reports Server (NTRS)

    Schneider, J. A.

    1993-01-01

    The objective of the Large Scale Injector (LSI) program was to deliver a 21 inch diameter, 600,000 lbf thrust class injector to NASA/MSFC for hot fire testing. The hot fire test program would demonstrate the feasibility and integrity of the full scale injector, including combustion stability, chamber wall compatibility (thermal management), and injector performance. The 21 inch diameter injector was delivered in September of 1991.

  20. Carbon nanotube circuit integration up to sub-20 nm channel lengths.

    PubMed

    Shulaker, Max Marcel; Van Rethy, Jelle; Wu, Tony F; Liyanage, Luckshitha Suriyasena; Wei, Hai; Li, Zuanyi; Pop, Eric; Gielen, Georges; Wong, H-S Philip; Mitra, Subhasish

    2014-04-22

    Carbon nanotube (CNT) field-effect transistors (CNFETs) are a promising emerging technology projected to achieve over an order of magnitude improvement in energy-delay product, a metric of performance and energy efficiency, compared to silicon-based circuits. However, due to substantial imperfections inherent with CNTs, the promise of CNFETs has yet to be fully realized. Techniques to overcome these imperfections have yielded promising results, but thus far only at large technology nodes (1 μm device size). Here we demonstrate the first very large scale integration (VLSI)-compatible approach to realizing CNFET digital circuits at highly scaled technology nodes, with devices ranging from 90 nm to sub-20 nm channel lengths. We demonstrate inverters functioning at 1 MHz and a fully integrated CNFET infrared light sensor and interface circuit at 32 nm channel length. This demonstrates the feasibility of realizing more complex CNFET circuits at highly scaled technology nodes.

  1. Large-scale data integration framework provides a comprehensive view on glioblastoma multiforme.

    PubMed

    Ovaska, Kristian; Laakso, Marko; Haapa-Paananen, Saija; Louhimo, Riku; Chen, Ping; Aittomäki, Viljami; Valo, Erkka; Núñez-Fontarnau, Javier; Rantanen, Ville; Karinen, Sirkku; Nousiainen, Kari; Lahesmaa-Korpinen, Anna-Maria; Miettinen, Minna; Saarinen, Lilli; Kohonen, Pekka; Wu, Jianmin; Westermarck, Jukka; Hautaniemi, Sampsa

    2010-09-07

    Coordinated efforts to collect large-scale data sets provide a basis for systems level understanding of complex diseases. In order to translate these fragmented and heterogeneous data sets into knowledge and medical benefits, advanced computational methods for data analysis, integration and visualization are needed. We introduce a novel data integration framework, Anduril, for translating fragmented large-scale data into testable predictions. The Anduril framework allows rapid integration of heterogeneous data with state-of-the-art computational methods and existing knowledge in bio-databases. Anduril automatically generates thorough summary reports and a website that shows the most relevant features of each gene at a glance, allows sorting of data based on different parameters, and provides direct links to more detailed data on genes, transcripts or genomic regions. Anduril is open-source; all methods and documentation are freely available. We have integrated multidimensional molecular and clinical data from 338 subjects having glioblastoma multiforme, one of the deadliest and most poorly understood cancers, using Anduril. The central objective of our approach is to identify genetic loci and genes that have significant survival effect. Our results suggest several novel genetic alterations linked to glioblastoma multiforme progression and, more specifically, reveal Moesin as a novel glioblastoma multiforme-associated gene that has a strong survival effect and whose depletion in vitro significantly inhibited cell proliferation. All analysis results are available as a comprehensive website. Our results demonstrate that integrated analysis and visualization of multidimensional and heterogeneous data by Anduril enables drawing conclusions on functional consequences of large-scale molecular data. Many of the identified genetic loci and genes having significant survival effect have not been reported earlier in the context of glioblastoma multiforme. 
Thus, in addition to generally applicable novel methodology, our results provide several glioblastoma multiforme candidate genes for further studies. Anduril is available at http://csbi.ltdk.helsinki.fi/anduril/ and the glioblastoma multiforme analysis results are available at http://csbi.ltdk.helsinki.fi/anduril/tcga-gbm/

  2. Van der Waals epitaxial growth and optoelectronics of large-scale WSe2/SnS2 vertical bilayer p-n junctions.

    PubMed

    Yang, Tiefeng; Zheng, Biyuan; Wang, Zhen; Xu, Tao; Pan, Chen; Zou, Juan; Zhang, Xuehong; Qi, Zhaoyang; Liu, Hongjun; Feng, Yexin; Hu, Weida; Miao, Feng; Sun, Litao; Duan, Xiangfeng; Pan, Anlian

    2017-12-04

    High-quality two-dimensional atomic layered p-n heterostructures are essential for high-performance integrated optoelectronics. The studies to date have been largely limited to exfoliated and restacked flakes, and the controlled growth of such heterostructures remains a significant challenge. Here we report the direct van der Waals epitaxial growth of large-scale WSe2/SnS2 vertical bilayer p-n junctions on SiO2/Si substrates, with the lateral sizes reaching up to millimeter scale. Multi-electrode field-effect transistors have been integrated on a single heterostructure bilayer. Electrical transport measurements indicate that the field-effect transistors of the junction show an ultra-low off-state leakage current of 10^-14 A and a highest on-off ratio of up to 10^7. Optoelectronic characterizations show prominent photoresponse, with a fast response time of 500 μs, faster than all the directly grown vertical 2D heterostructures. The direct growth of high-quality van der Waals junctions marks an important step toward high-performance integrated optoelectronic devices and systems.

  3. Multi-format all-optical processing based on a large-scale, hybridly integrated photonic circuit.

    PubMed

    Bougioukos, M; Kouloumentas, Ch; Spyropoulou, M; Giannoulis, G; Kalavrouziotis, D; Maziotis, A; Bakopoulos, P; Harmon, R; Rogers, D; Harrison, J; Poustie, A; Maxwell, G; Avramopoulos, H

    2011-06-06

    We investigate through numerical studies and experiments the performance of a large scale, silica-on-silicon photonic integrated circuit for multi-format regeneration and wavelength-conversion. The circuit encompasses a monolithically integrated array of four SOAs inside two parallel Mach-Zehnder structures, four delay interferometers and a large number of silica waveguides and couplers. Exploiting phase-incoherent techniques, the circuit is capable of processing OOK signals at variable bit rates, DPSK signals at 22 or 44 Gb/s and DQPSK signals at 44 Gbaud. Simulation studies reveal the wavelength-conversion potential of the circuit with enhanced regenerative capabilities for OOK and DPSK modulation formats and acceptable quality degradation for DQPSK format. Regeneration of 22 Gb/s OOK signals with amplified spontaneous emission (ASE) noise and DPSK data signals degraded with amplitude, phase and ASE noise is experimentally validated demonstrating a power penalty improvement up to 1.5 dB.

  4. The Plant Phenology Ontology: A New Informatics Resource for Large-Scale Integration of Plant Phenology Data.

    PubMed

    Stucky, Brian J; Guralnick, Rob; Deck, John; Denny, Ellen G; Bolmgren, Kjell; Walls, Ramona

    2018-01-01

    Plant phenology - the timing of plant life-cycle events, such as flowering or leafing out - plays a fundamental role in the functioning of terrestrial ecosystems, including human agricultural systems. Because plant phenology is often linked with climatic variables, there is widespread interest in developing a deeper understanding of global plant phenology patterns and trends. Although phenology data from around the world are currently available, truly global analyses of plant phenology have so far been difficult because the organizations producing large-scale phenology data are using non-standardized terminologies and metrics during data collection and data processing. To address this problem, we have developed the Plant Phenology Ontology (PPO). The PPO provides the standardized vocabulary and semantic framework that is needed for large-scale integration of heterogeneous plant phenology data. Here, we describe the PPO, and we also report preliminary results of using the PPO and a new data processing pipeline to build a large dataset of phenology information from North America and Europe.
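
    The kind of data integration the PPO enables can be sketched as vocabulary normalization; the source terms, network names, and trait labels below are invented, and the real ontology defines its classes formally in OWL rather than as a Python dictionary.

```python
# Vocabulary normalization sketch: map heterogeneous source terms onto
# a shared trait vocabulary so records from different networks become
# comparable. Terms, networks, and trait names are invented examples.

PPO_MAP = {
    "first bloom":  "flowers present",
    "open flowers": "flowers present",
    "leaf out":     "true leaves present",
    "budburst":     "breaking leaf buds present",
}

records = [
    {"network": "NET-A", "term": "first bloom",  "doy": 94},
    {"network": "NET-B", "term": "open flowers", "doy": 101},
]

# After normalization, both records refer to the same standardized trait
# and can be analyzed together despite different source terminologies.
normalized = [{**r, "ppo_trait": PPO_MAP[r["term"]]} for r in records]
for r in normalized:
    print(r["network"], r["ppo_trait"], r["doy"])
```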

  5. Robust decentralized hybrid adaptive output feedback fuzzy control for a class of large-scale MIMO nonlinear systems and its application to AHS.

    PubMed

    Huang, Yi-Shao; Liu, Wel-Ping; Wu, Min; Wang, Zheng-Wu

    2014-09-01

    This paper presents a novel observer-based decentralized hybrid adaptive fuzzy control scheme for a class of large-scale continuous-time multiple-input multiple-output (MIMO) uncertain nonlinear systems whose state variables are unmeasurable. The scheme integrates fuzzy logic systems, state observers, and strictly positive real conditions to deal with three issues in the control of a large-scale MIMO uncertain nonlinear system: algorithm design, controller singularity, and transient response. Then, the design of the hybrid adaptive fuzzy controller is extended to address a general large-scale uncertain nonlinear system. It is shown that the resultant closed-loop large-scale system remains asymptotically stable and the tracking error converges to zero. The advantages of our scheme are demonstrated through simulations. Copyright © 2014. Published by Elsevier Ltd.

  6. Integral criteria for large-scale multiple fingerprint solutions

    NASA Astrophysics Data System (ADS)

    Ushmaev, Oleg S.; Novikov, Sergey O.

    2004-08-01

    We propose the definition and analysis of the optimal integral similarity score criterion for large-scale multimodal civil ID systems. First, the general properties of score distributions for genuine and impostor matches for different systems and input devices are investigated. The empirical statistics were taken from real biometric tests. We then carry out the analysis of simultaneous score distributions for a number of combined biometric tests, primarily for multiple fingerprint solutions. The explicit and approximate relations for the optimal integral score, which provides the least value of the FRR while the FAR is predefined, have been obtained. The results of a real multiple fingerprint test show good correspondence with the theoretical results over a wide range of False Acceptance and False Rejection Rates.
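
    The criterion described, the score threshold that minimizes FRR at a predefined FAR, can be sketched with empirical score samples; the Gaussian score distributions below are synthetic stand-ins for the real biometric test statistics.

```python
import numpy as np

# Pick the decision threshold that meets a predefined FAR, then report
# the resulting FRR. Scores are synthetic Gaussian stand-ins for the
# empirical genuine/impostor distributions discussed in the paper.

rng = np.random.default_rng(0)
genuine  = rng.normal(5.0, 1.0, 100_000)    # genuine matches score higher
impostor = rng.normal(0.0, 1.0, 100_000)

def threshold_for_far(impostor_scores, far_target):
    # FAR = fraction of impostor scores accepted (score >= threshold),
    # so the threshold is the (1 - FAR) quantile of the impostor scores.
    return np.quantile(impostor_scores, 1.0 - far_target)

t = threshold_for_far(impostor, far_target=0.001)
frr = float(np.mean(genuine < t))           # genuine matches rejected
print(f"threshold={t:.3f}  FRR={frr:.4f}")
```

    For a combined (integral) score over multiple fingers, the same procedure applies to the joint score distributions, which is where the paper's explicit and approximate relations come in.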

  7. Integrating land and resource management plans and applied large-scale research on two national forests

    Treesearch

    Callie Jo Schweitzer; Stacy Clark; Glen Gaines; Paul Finke; Kurt Gottschalk; David Loftis

    2008-01-01

    Researchers working out of the Southern and Northern Research Stations have partnered with two National Forests to conduct two large-scale studies designed to assess the effectiveness of silvicultural techniques used to restore and maintain upland oak (Quercus spp.)-dominated ecosystems in the Cumberland Plateau Region of the southeastern United...

  8. Hierarchical hybrid control of manipulators: Artificial intelligence in large scale integrated circuits

    NASA Technical Reports Server (NTRS)

    Greene, P. H.

    1972-01-01

    Both in practical engineering and in control of muscular systems, low level subsystems automatically provide crude approximations to the proper response. Through low level tuning of these approximations, the proper response variant can emerge from standardized high level commands. Such systems are expressly suited to emerging large scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape responses of low level digital or analog microcircuits. A mathematical theory that reveals significant informational units in this style of control and software for realizing such information structures are formulated.

  9. Development of analog watch with minute repeater

    NASA Astrophysics Data System (ADS)

    Okigami, Tomio; Aoyama, Shigeru; Osa, Takashi; Igarashi, Kiyotaka; Ikegami, Tomomi

    A large-scale-integration complementary metal-oxide-semiconductor chip was developed for an electronic minute repeater. It is equipped with a synthetic struck-sound circuit that generates the natural struck sound required for the minute repeater. This circuit consists of an envelope curve drawing circuit, frequency mixer, polyphonic mixer, and booster circuit built using analog circuit technology. The chip is a single-chip microcomputer with motor drivers and input ports in addition to the synthetic struck-sound circuit, making it possible to build an electronic minute-repeater system at very low cost in comparison with the conventional type.

  10. The prospect of modern thermomechanics in structural integrity calculations of large-scale pressure vessels

    NASA Astrophysics Data System (ADS)

    Fekete, Tamás

    2018-05-01

    Structural integrity calculations play a crucial role in designing large-scale pressure vessels. Used in the electric power generation industry, these kinds of vessels undergo extensive safety analyses and certification procedures before being deemed feasible for future long-term operation. The calculations are nowadays directed and supported by international standards and guides based on state-of-the-art results of applied research and technical development. However, their ability to predict a vessel's behavior under accidental circumstances after long-term operation is largely limited by the strong dependence of the analysis methodology on empirical models that are correlated to the behavior of structural materials and their changes during material aging. Recently, a new scientific-engineering paradigm, structural integrity, has been developing; it is essentially a synergistic collaboration among a number of scientific and engineering disciplines, modeling, experiments, and numerics. Although the application of the structural integrity paradigm has contributed greatly to improving the accuracy of safety evaluations of large-scale pressure vessels, the predictive power of the analysis methodology has not yet improved significantly. This is because existing structural integrity calculation methodologies are based on the widespread and commonly accepted 'traditional' engineering thermal stress approach, which is essentially based on the weakly coupled model of thermomechanics and fracture mechanics. Recently, research has been initiated at MTA EK with the aim of reviewing and evaluating current methodologies and models applied in structural integrity calculations, including their scope of validity.
The research intends to come to a better understanding of the physical problems that are inherently present in the pool of structural integrity problems of reactor pressure vessels, and to ultimately find a theoretical framework that could serve as a well-grounded theoretical foundation for a new modeling framework of structural integrity. This paper presents the first findings of the research project.

  11. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    NASA Astrophysics Data System (ADS)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

    Agricultural production utilizes regional resources (e.g. river water and groundwater) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate change and increasing demand due to population growth and economic development will strongly affect the availability of water resources for agricultural production. While many studies have assessed the impacts of climate change on agriculture, few dynamically account for changes in both water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. Irrigation management in response to subseasonal variability in weather and crop response also varies by region and by crop. To deal with such variations, we used the Markov chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimation. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consists of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model is based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoff simulated by the land surface sub-model are input to the river routing sub-model of the H08 model. The portion of regional water resources available for agriculture, simulated by the H08 model, is input as irrigation water to the land surface sub-model. The timing and amount of irrigation water are simulated at a daily step. The integrated model reproduced the observed streamflow in an individual watershed. Additionally, the model accurately reproduced the trends and interannual variations of crop yields.
To demonstrate the usefulness of the integrated model, we compared two types of impact assessment of climate change on crop productivity in a watershed. The first was carried out with the large-scale crop model alone; the second with the integrated model coupling the large-scale crop model and the H08 model. The former projected that changes in temperature and precipitation due to future climate change would increase water stress in crops. The latter, however, projected that the increasing amount of agricultural water resources in the watershed would supply a sufficient amount of water for irrigation and consequently reduce water stress. The integrated model demonstrates the importance of taking the water circulation in the watershed into account when predicting regional crop production.
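
    The Markov chain Monte Carlo calibration step can be sketched with a minimal Metropolis sampler; the one-parameter linear "crop model" and synthetic yield data below stand in for the study's actual simulators.

```python
import math, random

# Minimal Metropolis MCMC sketch: calibrate one region-specific crop
# parameter (theta) against observed yields. The linear "crop model"
# and synthetic data stand in for the study's actual sub-models.

random.seed(1)
theta_true, sigma = 2.5, 0.5
xs = [float(i) for i in range(20)]                     # e.g. a climate predictor
ys = [theta_true * x + random.gauss(0, sigma) for x in xs]

def log_likelihood(theta):
    return sum(-0.5 * ((y - theta * x) / sigma) ** 2 for x, y in zip(xs, ys))

def metropolis(n_iter=5000, step=0.05):
    theta, ll = 1.0, log_likelihood(1.0)
    samples = []
    for _ in range(n_iter):
        cand = theta + random.gauss(0, step)           # random-walk proposal
        ll_cand = log_likelihood(cand)
        if math.log(random.random()) < ll_cand - ll:   # Metropolis accept/reject
            theta, ll = cand, ll_cand
        samples.append(theta)
    return samples

post = metropolis()[1000:]                             # discard burn-in
print(sum(post) / len(post))                           # posterior mean near theta_true
```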

  12. Renewable Fuels-to-Grid Integration | Energy Systems Integration Facility |

    Science.gov Websites

    hydrogen, other than electrolysis. NREL helped evaluate a large-scale polymer electrolyte

  13. Skin Friction Reduction Through Large-Scale Forcing

    NASA Astrophysics Data System (ADS)

    Bhatt, Shibani; Artham, Sravan; Gnanamanickam, Ebenezer

    2017-11-01

    Flow structures in a turbulent boundary layer larger than an integral length scale (δ), referred to as large-scales, interact with the finer scales in a non-linear manner. By targeting these large-scales and exploiting this non-linear interaction, wall shear stress (WSS) reduction of over 10% has been achieved. The plane wall jet (PWJ), a boundary layer which has highly energetic large-scales that become turbulent independently of the near-wall finer scales, is the chosen model flow field. Its unique configuration allows for the independent control of the large-scales through acoustic forcing. Perturbation wavelengths from about 1 δ to 14 δ were considered, with a reduction in WSS for all wavelengths considered. This reduction, over a large subset of the wavelengths, scales with both inner and outer variables, indicating a mixed scaling of the underlying physics, while also showing dependence on the PWJ global properties. A triple decomposition of the velocity fields shows an increase in coherence due to forcing, with a clear organization of the small-scale turbulence with respect to the introduced large-scale. The maximum reduction in WSS occurs when the introduced large-scale acts in a manner so as to reduce the turbulent activity in the very-near-wall region. This material is based upon work supported by the Air Force Office of Scientific Research under Award Number FA9550-16-1-0194, monitored by Dr. Douglas Smith.
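
    The triple decomposition mentioned above can be sketched for a periodically forced signal: phase-averaging over forcing cycles separates the coherent large-scale part from the fine-scale residual. The synthetic signal (mean plus sine plus noise) is invented for illustration.

```python
import numpy as np

# Triple decomposition u(t) = U + u_phase(t) + u'(t) for periodic forcing:
# phase-average over forcing cycles to extract the coherent part, leaving
# the incoherent fine-scale residual. The signal here is synthetic.

rng = np.random.default_rng(3)
n_per, n_cyc = 100, 200                    # samples per cycle, number of cycles
t = np.arange(n_per * n_cyc)
u = 5.0 + 0.8 * np.sin(2 * np.pi * t / n_per) + 0.3 * rng.normal(size=t.size)

U = u.mean()                                           # time mean
u_phase = (u - U).reshape(n_cyc, n_per).mean(axis=0)   # coherent part, per phase
u_turb = u - U - np.tile(u_phase, n_cyc)               # incoherent residual
print(round(float(U), 2), round(float(u_phase.max()), 2), round(float(u_turb.std()), 2))
```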

  14. The Role of Free Stream Turbulence on the Aerodynamic Performance of a Wind Turbine Blade

    NASA Astrophysics Data System (ADS)

    Maldonado, Victor; Thormann, Adrien; Meneveau, Charles; Castillo, Luciano

    2014-11-01

    Effects of free stream turbulence with large integral scale on the aerodynamic performance of an S809 airfoil-based wind turbine blade at low Reynolds number are studied using wind tunnel experiments. A constant chord (2-D) S809 airfoil wind turbine blade model with an operating Reynolds number of 208,000 based on chord length was tested for a range of angles of attack representative of fully attached and stalled flow as encountered in typical wind turbine operation. The smooth-surface blade was subjected to a quasi-laminar free stream with very low free-stream turbulence as well as to elevated free-stream turbulence generated by an active grid. This turbulence contained large-scale eddies with levels of free-stream turbulence intensity of up to 6.14% and an integral length scale of about 60% of chord-length. The pressure distribution was acquired using static pressure taps and the lift was subsequently computed by numerical integration. The wake velocity deficit was measured utilizing hot-wire anemometry to compute the drag coefficient also via integration. In addition, the mean flow was quantified using 2-D particle image velocimetry (PIV) over the suction surface of the blade. Results indicate that turbulence, even with very large-scale eddies comparable in size to the chord-length, significantly improves the aerodynamic performance of the blade by increasing the lift coefficient and overall lift-to-drag ratio, L/D for all angles tested except zero degrees.

  15. Globus | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Globus software services provide secure cancer research data transfer, synchronization, and sharing in distributed environments at large scale. These services can be integrated into applications and research data gateways, leveraging Globus identity management, single sign-on, search, and authorization capabilities. Globus Genomics integrates Globus with the Galaxy genomics workflow engine and Amazon Web Services to enable cancer genomics analysis that can elastically scale compute resources with demand.

  16. Laser Scanning Holographic Lithography for Flexible 3D Fabrication of Multi-Scale Integrated Nano-structures and Optical Biosensors

    PubMed Central

    Yuan, Liang (Leon); Herman, Peter R.

    2016-01-01

    Three-dimensional (3D) periodic nanostructures underpin a promising research direction on the frontiers of nanoscience and technology to generate advanced materials for exploiting novel photonic crystal (PC) and nanofluidic functionalities. However, formation of uniform and defect-free 3D periodic structures over large areas that can further integrate into multifunctional devices has remained a major challenge. Here, we introduce a laser scanning holographic method for 3D exposure in thick photoresist that combines the unique advantages of large area 3D holographic interference lithography (HIL) with the flexible patterning of laser direct writing to form both micro- and nano-structures in a single exposure step. Phase mask interference patterns accumulated over multiple overlapping scans are shown to stitch seamlessly and form uniform 3D nanostructure with beam size scaled to small 200 μm diameter. In this way, laser scanning is presented as a facile means to embed 3D PC structure within microfluidic channels for integration into an optofluidic lab-on-chip, demonstrating a new laser HIL writing approach for creating multi-scale integrated microsystems. PMID:26922872

  17. Solutions of large-scale electromagnetics problems involving dielectric objects with the parallel multilevel fast multipole algorithm.

    PubMed

    Ergül, Özgür

    2011-11-01

    Fast and accurate solutions of large-scale electromagnetics problems involving homogeneous dielectric objects are considered. Problems are formulated with the electric and magnetic current combined-field integral equation and discretized with the Rao-Wilton-Glisson functions. Solutions are performed iteratively by using the multilevel fast multipole algorithm (MLFMA). For the solution of large-scale problems discretized with millions of unknowns, MLFMA is parallelized on distributed-memory architectures using a rigorous technique, namely, the hierarchical partitioning strategy. Efficiency and accuracy of the developed implementation are demonstrated on very large problems involving as many as 100 million unknowns.

  18. OpenMP parallelization of a gridded SWAT (SWATG)

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Hou, Jinliang; Cao, Yongpan; Gu, Juan; Huang, Chunlin

    2017-12-01

    Large-scale, long-term, and high-spatial-resolution simulation is a common issue in environmental modeling. A gridded Hydrologic Response Unit (HRU)-based Soil and Water Assessment Tool (SWATG) that integrates a grid modeling scheme with different spatial representations also presents such problems. The computational cost limits applications of very-high-resolution, large-scale watershed modeling. The OpenMP (Open Multi-Processing) parallel application interface is integrated with SWATG (called SWATGP) to accelerate grid modeling at the HRU level. Such a parallel implementation takes better advantage of the computational power of a shared-memory computer system. We conducted two experiments at multiple temporal and spatial scales of hydrological modeling using SWATG and SWATGP on a high-end server. At 500-m resolution, SWATGP was found to be up to nine times faster than SWATG in modeling a roughly 2000 km2 watershed on one CPU with a 15-thread configuration. The study results demonstrate that parallel models save considerable time relative to traditional sequential simulation runs. Parallel computation of environmental models is beneficial for model applications, especially at large spatial and temporal scales and at high resolutions. The proposed SWATGP model is thus a promising tool for large-scale and high-resolution water resources research and management, in addition to offering data fusion and model coupling ability.
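
    The HRU-level decomposition that SWATGP exploits can be sketched in Python: each hydrologic response unit is an independent column computation, so units can be mapped across workers. SWATGP itself uses OpenMP; this analogue uses a thread pool, and the toy SCS curve-number runoff function and HRU data are invented. (Python threads only speed up kernels that release the GIL; the point here is the decomposition.)

```python
from concurrent.futures import ThreadPoolExecutor

# HRU-level parallelism sketch: each hydrologic response unit is an
# independent computation, so units can be distributed across workers.
# The toy SCS curve-number runoff function and HRU inputs are invented.

def simulate_hru(hru):
    precip, cn = hru                          # precipitation (mm), curve number
    s = 25400.0 / cn - 254.0                  # potential retention (mm)
    return max(0.0, precip - 0.2 * s) ** 2 / (precip + 0.8 * s)

hrus = [(60.0, 75.0), (80.0, 82.0), (45.0, 70.0), (100.0, 90.0)]

with ThreadPoolExecutor(max_workers=4) as pool:
    runoff = list(pool.map(simulate_hru, hrus))   # one task per HRU
print([round(q, 2) for q in runoff])
```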

  19. Lagrangian velocity and acceleration correlations of large inertial particles in a closed turbulent flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Machicoane, Nathanaël; Volk, Romain

    We investigate the response of large inertial particles to turbulent fluctuations in an inhomogeneous and anisotropic flow. We conduct a Lagrangian study using particles both heavier and lighter than the surrounding fluid, with diameters comparable to the flow integral scale. Both velocity and acceleration correlation functions are analyzed to compute the Lagrangian integral time and the acceleration time scale of such particles. Knowing how size and density affect these time scales is crucial for understanding particle dynamics and may permit stochastic-process modeling using two-time models (for instance, Sawford's). Because particles are tracked over long times in the quasi-totality of a closed flow, the mean flow influences their behaviour and also biases the velocity time statistics, in particular the velocity correlation functions. By using a method that allows for the computation of turbulent velocity trajectories, we obtain unbiased Lagrangian integral times. This is particularly useful for accessing the scale separation for such particles and for comparing it to the case of fluid particles in a similar configuration.
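The Lagrangian integral time discussed above is the time integral of the normalized velocity autocorrelation function. A minimal sketch of that computation on a synthetic Ornstein-Uhlenbeck velocity signal (the simplest one-time stochastic model; the timescale and all numerical parameters are illustrative assumptions, not values from the experiment):

```python
import numpy as np

rng = np.random.default_rng(0)
tau, dt, n = 0.5, 0.01, 100_000  # assumed correlation time, time step, record length

# Ornstein-Uhlenbeck velocity signal: exact autocorrelation is exp(-t/tau)
v = np.empty(n)
v[0] = 0.0
for i in range(1, n):
    v[i] = v[i - 1] * (1 - dt / tau) + np.sqrt(2 * dt / tau) * rng.standard_normal()
v -= v.mean()

# normalized autocorrelation up to a maximum lag of ~6 correlation times
max_lag = int(6 * tau / dt)
rho = np.array([np.mean(v[:n - k] * v[k:]) for k in range(max_lag)]) / np.var(v)

# Lagrangian integral time: integrate rho up to its first near-zero crossing
cut = int(np.argmax(rho < 0.05)) or max_lag
T_L = rho[:cut].sum() * dt
print(T_L)  # should recover a value close to tau = 0.5
```

Truncating the integral at the first near-zero crossing is one common way to tame the noisy tail of a sample autocorrelation; the unbiasing method of the paper itself is more involved.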

  20. Energy Systems Integration Facility Overview

    ScienceCinema

    Arvizu, Dan; Chistensen, Dana; Hannegan, Bryan; Garret, Bobi; Kroposki, Ben; Symko-Davies, Martha; Post, David; Hammond, Steve; Kutscher, Chuck; Wipke, Keith

    2018-01-16

    The U.S. Department of Energy's Energy Systems Integration Facility (ESIF), located at the National Renewable Energy Laboratory, is the right tool at the right time: a first-of-its-kind facility that addresses the challenges of large-scale integration of clean energy technologies into the energy systems that power the nation.

  1. Improving predictions of large scale soil carbon dynamics: Integration of fine-scale hydrological and biogeochemical processes, scaling, and benchmarking

    NASA Astrophysics Data System (ADS)

    Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.

    2015-12-01

    Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. 
Finally, we contend that creating believable soil carbon predictions requires a robust, transparent, and community-available benchmarking framework. I will present an ILAMB evaluation of several of the above-mentioned approaches in ACME, and attempt to motivate community adoption of this evaluation approach.

  2. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    NASA Astrophysics Data System (ADS)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation, these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. The method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.

  3. A Proposal for Six Sigma Integration for Large-Scale Production of Penicillin G and Subsequent Conversion to 6-APA.

    PubMed

    Nandi, Anirban; Pan, Sharadwata; Potumarthi, Ravichandra; Danquah, Michael K; Sarethy, Indira P

    2014-01-01

    Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.

  4. A Proposal for Six Sigma Integration for Large-Scale Production of Penicillin G and Subsequent Conversion to 6-APA

    PubMed Central

    Nandi, Anirban; Danquah, Michael K.

    2014-01-01

    Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA. PMID:25057428

  5. Multiscale solvers and systematic upscaling in computational physics

    NASA Astrophysics Data System (ADS)

    Brandt, A.

    2005-07-01

    Multiscale algorithms can overcome the scale-born bottlenecks that plague most computations in physics. These algorithms employ separate processing at each scale of the physical space, combined with interscale iterative interactions, in ways which use finer scales very sparingly. Having been developed first and well known as multigrid solvers for partial differential equations, highly efficient multiscale techniques have more recently been developed for many other types of computational tasks, including: inverse PDE problems; highly indefinite (e.g., standing wave) equations; Dirac equations in disordered gauge fields; fast computation and updating of large determinants (as needed in QCD); fast integral transforms; integral equations; astrophysics; molecular dynamics of macromolecules and fluids; many-atom electronic structures; global and discrete-state optimization; practical graph problems; image segmentation and recognition; tomography (medical imaging); fast Monte-Carlo sampling in statistical physics; and general, systematic methods of upscaling (accurate numerical derivation of large-scale equations from microscopic laws).
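The core multigrid idea mentioned above, cheap smoothing on the fine scale combined with a correction computed on a coarser scale, can be made concrete with a textbook two-grid cycle for the 1D Poisson problem -u'' = f (our own illustrative sketch, not code from the cited work):

```python
import numpy as np

def smooth(u, f, h, iters=3, omega=2/3):
    """Weighted-Jacobi smoothing for -u'' = f with u = 0 at the boundaries."""
    for _ in range(iters):
        u[1:-1] += omega * 0.5 * (u[:-2] + u[2:] - 2*u[1:-1] + h*h*f[1:-1])
    return u

def two_grid_cycle(u, f, h):
    """One cycle: smooth, solve the residual equation on a coarser grid, smooth."""
    u = smooth(u, f, h)
    r = np.zeros_like(u)                      # fine-grid residual r = f - A u
    r[1:-1] = f[1:-1] + (u[:-2] - 2*u[1:-1] + u[2:]) / h**2
    nc = (len(u) + 1) // 2                    # coarse grid keeps every other point
    rc = np.zeros(nc)
    rc[1:-1] = 0.25 * (r[1:-2:2] + 2*r[2:-1:2] + r[3::2])  # full-weighting restriction
    hc = 2 * h                                # solve the coarse problem exactly
    A = (2*np.eye(nc-2) - np.eye(nc-2, k=1) - np.eye(nc-2, k=-1)) / hc**2
    ec = np.zeros(nc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])
    e = np.zeros_like(u)                      # linear interpolation back to fine grid
    e[::2] = ec
    e[1::2] = 0.5 * (ec[:-1] + ec[1:])
    return smooth(u + e, f, h)

n, h = 65, 1/64
x = np.linspace(0, 1, n)
f = np.pi**2 * np.sin(np.pi * x)              # exact solution: u = sin(pi x)
u = np.zeros(n)
for _ in range(10):
    u = two_grid_cycle(u, f, h)
print(np.max(np.abs(u - np.sin(np.pi * x))))  # small: only discretization error remains
```

Recursing on the coarse solve instead of solving it directly turns this two-grid cycle into the full multigrid V-cycle, whose cost is linear in the number of unknowns.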

  6. Moon-based Earth Observation for Large Scale Geoscience Phenomena

    NASA Astrophysics Data System (ADS)

    Guo, Huadong; Liu, Guang; Ding, Yixing

    2016-07-01

    The capability of Earth observation for large-scale natural phenomena needs to be improved, and new observing platforms are expected. In recent years we have studied the concept of the Moon as an Earth observation platform. Compared with man-made satellite platforms, Moon-based Earth observation can obtain multi-spherical, full-band, active and passive information, and offers the following advantages: a large observation range, variable view angles, long-term continuous observation, and an extra-long life cycle, with the characteristics of longevity, consistency, integrity, stability and uniqueness. Moon-based Earth observation is suitable for monitoring large-scale geoscience phenomena, including large-scale atmosphere change, large-scale ocean change, large-scale land-surface dynamic change, and solid-Earth dynamic change. For the purpose of establishing a Moon-based Earth observation platform, we plan to study the following five aspects: mechanisms and models of Moon-based observation of macroscopic Earth-science phenomena; sensor parameter optimization and methods of Moon-based Earth observation; site selection and the environment of Moon-based Earth observation; the Moon-based Earth observation platform; and the fundamental scientific framework of Moon-based Earth observation.

  7. Data assimilation in optimizing and integrating soil and water quality water model predictions at different scales

    USDA-ARS?s Scientific Manuscript database

    Relevant data about subsurface water flow and solute transport at relatively large scales that are of interest to the public are inherently laborious and in most cases simply impossible to obtain. Upscaling in which fine-scale models and data are used to predict changes at the coarser scales is the...

  8. Integrating Cloud-Computing-Specific Model into Aircraft Design

    NASA Astrophysics Data System (ADS)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud computing is becoming increasingly relevant, as it will enable the companies spreading this technology to open the door to Web 3.0. The new categories of services it introduces will slowly replace many types of computational resources currently in use. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services are provided. This paper integrates a cloud-computing-specific model into aircraft design. The work has achieved good results in sharing licenses of large-scale, expensive software such as CFD (Computational Fluid Dynamics) tools, UG, and CATIA.

  9. Atomic-order thermal nitridation of group IV semiconductors for ultra-large-scale integration

    NASA Astrophysics Data System (ADS)

    Murota, Junichi; Le Thanh, Vinh

    2015-03-01

    One of the main requirements for ultra-large-scale integration (ULSI) is atomic-order control of process technology. Our concept of atomically controlled processing for group IV semiconductors is based on atomic-order surface reaction control in Si-based CVD epitaxial growth. In atomic-order surface nitridation by NH3 of a few-nm-thick Ge layer on about 4-nm-thick Si0.5Ge0.5/Si(100), it is found that N atoms diffuse through the nm-thick Ge layer into the Si0.5Ge0.5/Si(100) substrate and form Si nitride, even at 500 °C. After subsequent H2 heat treatment, although the amount of N in the Ge layer is reduced drastically, the reduction of the Si nitride is slight. This suggests that N diffusion in the Ge layer is suppressed by the formation of Si nitride and that a Ge/atomic-order N layer/Si1-xGex/Si(100) heterostructure is formed. These results demonstrate the capability of CVD technology for atomically controlled nitridation of group IV semiconductors for ultra-large-scale integration. Invited talk at the 7th International Workshop on Advanced Materials Science and Nanotechnology IWAMSN2014, 2-6 November, 2014, Ha Long, Vietnam.

  10. Materials Integration and Doping of Carbon Nanotube-based Logic Circuits

    NASA Astrophysics Data System (ADS)

    Geier, Michael

    Over the last 20 years, extensive research into the structure and properties of single-walled carbon nanotubes (SWCNTs) has elucidated many of the exceptional qualities possessed by SWCNTs, including record-setting tensile strength, excellent chemical stability, distinctive optoelectronic features, and outstanding electronic transport characteristics. In order to exploit these remarkable qualities, many application-specific hurdles must be overcome before the material can be implemented in commercial products. For electronic applications, recent advances in sorting SWCNTs by electronic type have enabled significant progress towards SWCNT-based integrated circuits. Despite these advances, demonstrations of SWCNT-based devices with suitable characteristics for large-scale integrated circuits have been limited. The processing methodologies, materials integration, and mechanistic understanding of electronic properties developed in this dissertation have enabled unprecedented scales of SWCNT-based transistor fabrication and integrated circuit demonstrations. Innovative materials selection and processing methods are at the core of this work, and these advances have led to transistors with the transport properties required for modern circuit integration. First, extensive collaborations with other research groups allowed for the exploration of SWCNT thin-film transistors (TFTs) using a wide variety of materials and processing methods, such as new dielectric materials, hybrid semiconductor materials systems, and solution-based printing of SWCNT TFTs. These materials were integrated into circuit demonstrations such as NOR and NAND logic gates, voltage-controlled ring oscillators, and D-flip-flops using both rigid and flexible substrates. 
This dissertation explores strategies for implementing complementary SWCNT-based circuits, which were developed by using local metal gate structures that achieve enhancement-mode p-type and n-type SWCNT TFTs with widely separated and symmetric threshold voltages. Additionally, a novel n-type doping procedure for SWCNT TFTs was also developed utilizing a solution-processed organometallic small molecule to demonstrate the first network top-gated n-type SWCNT TFTs. Lastly, new doping and encapsulation layers were incorporated to stabilize both p-type and n-type SWCNT TFT electronic properties, which enabled the fabrication of large-scale memory circuits. Employing these materials and processing advances has addressed many application specific barriers to commercialization. For instance, the first thin-film SWCNT complementary metal-oxide-semi-conductor (CMOS) logic devices are demonstrated with sub-nanowatt static power consumption and full rail-to-rail voltage transfer characteristics. With the introduction of a new n-type Rh-based molecular dopant, the first SWCNT TFTs are fabricated in top-gate geometries over large areas with high yield. Then by utilizing robust encapsulation methods, stable and uniform electronic performance of both p-type and n-type SWCNT TFTs has been achieved. Based on these complementary SWCNT TFTs, it is possible to simulate, design, and fabricate arrays of low-power static random access memory (SRAM) circuits, achieving large-scale integration for the first time based on solution-processed semiconductors. Together, this work provides a direct pathway for solution processable, large scale, power-efficient advanced integrated logic circuits and systems.

  11. Bringing Abstract Academic Integrity and Ethical Concepts into Real-Life Situations

    ERIC Educational Resources Information Center

    Kwong, Theresa; Wong, Eva; Yue, Kevin

    2017-01-01

    This paper reports the learning analytics on the initial stages of a large-scale, government-funded project which inducts university students in Hong Kong into consideration of academic integrity and ethics through mobile Augmented Reality (AR) learning trails--Trails of Integrity and Ethics (TIEs)--accessed on smart devices. The trails immerse…

  12. Microfabrication techniques for integrated sensors and microsystems.

    PubMed

    Wise, K D; Najafi, K

    1991-11-29

    Integrated sensors and actuators are rapidly evolving to provide an important link between very large scale integrated circuits and nonelectronic monitoring and control applications ranging from biomedicine to automated manufacturing. As they continue to expand, entire microsystems merging electrical, mechanical, thermal, optical, magnetic, and perhaps chemical components should be possible on a common substrate.

  13. Chip-integrated optical power limiter based on an all-passive micro-ring resonator

    NASA Astrophysics Data System (ADS)

    Yan, Siqi; Dong, Jianji; Zheng, Aoling; Zhang, Xinliang

    2014-10-01

    Recent progress in silicon nanophotonics has dramatically advanced the possible realization of large-scale on-chip optical interconnect integration. Adopting photons as information carriers can break the performance bottlenecks of electronic integrated circuits, such as serious thermal losses and poor processing rates. However, in integrated photonic circuits, few reported works impose an upper limit on optical power and thereby protect optical devices from damage caused by high power. In this study, we experimentally demonstrate a feasible integrated scheme based on a single all-passive micro-ring resonator to realize optical power limiting, analogous in function to a current-limiting circuit in electronics. We also analyze the performance of the optical power limiter at various signal bit rates. The results show that the proposed device can limit the signal power effectively at bit rates up to 20 Gbit/s without deteriorating the signal. Meanwhile, this ultra-compact silicon device is fully compatible with electronic technology (typically complementary metal-oxide-semiconductor technology), which may pave the way toward very large scale integrated photonic circuits for all-optical information processors and artificial intelligence systems.

  14. Improving the integration of recreation management with management of other natural resources by applying concepts of scale from ecology.

    PubMed

    Morse, Wayde C; Hall, Troy E; Kruger, Linda E

    2009-03-01

    In this article, we examine how issues of scale affect the integration of recreation management with the management of other natural resources on public lands. We present two theories used to address scale issues in ecology and explore how they can improve the two most widely applied recreation-planning frameworks. The theory of patch dynamics and hierarchy theory are applied to the recreation opportunity spectrum (ROS) and the limits of acceptable change (LAC) recreation-planning frameworks. These frameworks have been widely adopted internationally, and improving their ability to integrate with other aspects of natural resource management has significant social and conservation implications. We propose that incorporating ecologic criteria and scale concepts into these recreation-planning frameworks will improve the foundation for integrated land management by resolving issues of incongruent boundaries, mismatched scales, and multiple-scale analysis. Specifically, we argue that whereas the spatially explicit process of the ROS facilitates integrated decision making, its lack of ecologic criteria, broad extent, and large patch size decrease its usefulness for integration at finer scales. The LAC provides explicit considerations for weighing competing values, but measurement of recreation disturbances within an LAC analysis is often done at too fine a grain and at too narrow an extent for integration with other recreation and resource concerns. We suggest that planners should perform analysis at multiple scales when making management decisions that involve trade-offs among competing values. The United States Forest Service is used as an example to discuss how resource-management agencies can improve this integration.

  15. The Use of CASES-97 Observations to Assess and Parameterize the Impact of Land-Surface Heterogeneity on Area-Average Surface Heat Fluxes for Large-Scale Coupled Atmosphere-Hydrology Models

    NASA Technical Reports Server (NTRS)

    Chen, Fei; Yates, David; LeMone, Margaret

    2001-01-01

    To understand the effects of land-surface heterogeneity and the interactions between the land surface and the planetary boundary layer at different scales, we develop a multiscale data set. This data set, based on the Cooperative Atmosphere-Surface Exchange Study (CASES-97) observations, includes atmospheric, surface, and sub-surface observations obtained from a dense observation network covering a large region on the order of 100 km. We use this data set to drive three land-surface models (LSMs) to generate multi-scale (1-, 5-, and 10-km resolution) gridded surface heat flux maps for the CASES area. Upon validating these flux maps with measurements from surface stations and aircraft, we utilize them to investigate several approaches for estimating the area-integrated surface heat flux for the CASES-97 domain of 71 x 74 km, which is crucial for land-surface model development/validation and for area water and energy budget studies. This research is aimed at understanding the relative contributions of random turbulence versus organized mesoscale circulations to the area-integrated surface flux at the 100-km scale, and at identifying the most important effective parameters for characterizing subgrid-scale variability in large-scale atmosphere-hydrology models.

  16. A Review of Large-Scale Fracture Experiments Relevant to Pressure Vessel Integrity Under Pressurized Thermal Shock Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pugh, C.E.

    2001-01-29

    Numerous large-scale fracture experiments have been performed over the past thirty years to advance fracture mechanics methodologies applicable to thick-wall pressure vessels. This report first identifies major factors important to nuclear reactor pressure vessel (RPV) integrity under pressurized thermal shock (PTS) conditions. It then covers 20 key experiments that have contributed to identifying fracture behavior of RPVs and to validating applicable assessment methodologies. The experiments are categorized according to four types of specimens: (1) cylindrical specimens, (2) pressurized vessels, (3) large plate specimens, and (4) thick beam specimens. These experiments were performed in laboratories in six different countries. This report serves as a summary of those experiments and provides a guide to references for detailed information.

  17. Construction of large scale switch matrix by interconnecting integrated optical switch chips with EDFAs

    NASA Astrophysics Data System (ADS)

    Liao, Mingle; Wu, Baojian; Hou, Jianhong; Qiu, Kun

    2018-03-01

    Large-scale optical switches are essential components in optical communication networks. We aim to build a large-scale optical switch matrix by interconnecting silicon-based optical switch chips in a 3-stage CLOS structure, where EDFAs compensate for the insertion loss of the chips. The optical signal-to-noise ratio (OSNR) performance of the resulting large-scale optical switch matrix is investigated for TE-mode light, and the experimental results agree with the theoretical analysis. We build a 64x64 switch matrix from 16x16 optical switch chips; the OSNR and receiver sensitivity can be improved by 0.6 dB and 0.2 dB, respectively, by optimizing the gain configuration of the EDFAs.

  18. Output Control Technologies for a Large-scale PV System Considering Impacts on a Power Grid

    NASA Astrophysics Data System (ADS)

    Kuwayama, Akira

    The mega-solar demonstration project named “Verification of Grid Stabilization with Large-scale PV Power Generation systems” was completed in March 2011 at Wakkanai, the northernmost city of Japan. The major objectives of this project were to evaluate the adverse impacts of large-scale PV power generation systems connected to the power grid and to develop output control technologies with an integrated battery storage system. This paper describes the outline and results of the project. The results show the effectiveness of the battery storage system and of the proposed output control methods for a large-scale PV system in ensuring stable operation of power grids. NEDO, the New Energy and Industrial Technology Development Organization of Japan, conducted this project, and HEPCO, Hokkaido Electric Power Co., Inc., managed the overall project.
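The kind of output control described, battery storage absorbing short-term PV fluctuations so that the combined feed-in to the grid is smooth, can be sketched in a few lines (a toy illustration with made-up numbers and a simple moving-average target, not the project's actual control law):

```python
import numpy as np

rng = np.random.default_rng(1)

# toy 1-minute PV output profile (MW): slow solar ramp plus cloud-induced noise
t = np.arange(600)
pv = (40 * np.sin(np.pi * t / 600) + rng.normal(0, 3, t.size)).clip(min=0)

# target grid feed-in: centered moving average of the raw PV output
w = 30
target = np.convolve(pv, np.ones(w) / w, mode="same")

# battery power: positive = discharging to the grid, negative = charging
battery = target - pv
grid = pv + battery   # the fed-in power equals the smoothed target by construction

print(grid.std() < pv.std())  # smoothed feed-in fluctuates less than raw PV
```

A real controller would additionally enforce battery state-of-charge and power-rating limits, which this sketch ignores.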

  19. Analysis of Three-Dimensional, Nonlinear Development of Wave-Like Structure in a Compressible Round Jet

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.; Mankbadi, Reda R.

    2002-01-01

    An analysis of the nonlinear development of the large-scale structures or instability waves in compressible round jets was conducted using the integral energy method. The equations of motion were decomposed into two sets of equations; one set governing the mean flow motion and the other set governing the large-scale structure motion. The equations in each set were then combined to derive kinetic energy equations that were integrated in the radial direction across the jet after the boundary-layer approximations were applied. Following the application of further assumptions regarding the radial shape of the mean flow and the large structures, equations were derived that govern the nonlinear, streamwise development of the large structures. Using numerically generated mean flows, calculations show the energy exchanges and the effects of the initial amplitude on the coherent structure development in the jet.

  20. Vertical and lateral heterogeneous integration

    NASA Astrophysics Data System (ADS)

    Geske, Jon; Okuno, Yae L.; Bowers, John E.; Jayaraman, Vijay

    2001-09-01

    A technique for achieving large-scale monolithic integration of lattice-mismatched materials in the vertical direction and the lateral integration of dissimilar lattice-matched structures has been developed. The technique uses a single nonplanar direct-wafer-bond step to transform vertically integrated epitaxial structures into lateral epitaxial variation across the surface of a wafer. Nonplanar wafer bonding is demonstrated by integrating four different unstrained multi-quantum-well active regions lattice matched to InP on a GaAs wafer surface. Microscopy is used to verify the quality of the bonded interface, and photoluminescence is used to verify that the bonding process does not degrade the optical quality of the laterally integrated wells. The authors propose this technique as a means to achieve greater levels of wafer-scale integration in optical, electrical, and micromechanical devices.

  1. The Parallel System for Integrating Impact Models and Sectors (pSIMS)

    NASA Technical Reports Server (NTRS)

    Elliott, Joshua; Kelly, David; Chryssanthacopoulos, James; Glotter, Michael; Jhunjhnuwala, Kanika; Best, Neil; Wilde, Michael; Foster, Ian

    2014-01-01

    We present a framework for massively parallel climate impact simulations: the parallel System for Integrating Impact Models and Sectors (pSIMS). This framework comprises a) tools for ingesting and converting large amounts of data to a versatile datatype based on a common geospatial grid; b) tools for translating this datatype into custom formats for site-based models; c) a scalable parallel framework for performing large ensemble simulations, using any one of a number of different impact models, on clusters, supercomputers, distributed grids, or clouds; d) tools and data standards for reformatting outputs to common datatypes for analysis and visualization; and e) methodologies for aggregating these datatypes to arbitrary spatial scales such as administrative and environmental demarcations. By automating many time-consuming and error-prone aspects of large-scale climate impacts studies, pSIMS accelerates computational research, encourages model intercomparison, and enhances reproducibility of simulation results. We present the pSIMS design and use example assessments to demonstrate its multi-model, multi-scale, and multi-sector versatility.
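Step (e), aggregating gridded output to arbitrary spatial units, is conceptually a weighted group-by over a region-label map. A minimal sketch (the grid values, region labels, and area weights are hypothetical, not pSIMS's actual data model or API):

```python
import numpy as np

# hypothetical 4x4 grid of simulated yields and a map of region labels
yields = np.array([[2.1, 2.3, 3.0, 3.2],
                   [2.0, 2.4, 3.1, 3.3],
                   [1.8, 1.9, 2.8, 2.9],
                   [1.7, 1.8, 2.7, 2.8]])
regions = np.array([[0, 0, 1, 1],
                    [0, 0, 1, 1],
                    [2, 2, 3, 3],
                    [2, 2, 3, 3]])
area = np.ones_like(yields)   # per-cell area weights (uniform here)

# area-weighted mean yield per region
agg = {int(r): round(float(np.average(yields[regions == r],
                                      weights=area[regions == r])), 2)
       for r in np.unique(regions)}
print(agg)  # {0: 2.2, 1: 3.15, 2: 1.8, 3: 2.8}
```

The same pattern scales to real administrative or watershed boundaries by rasterizing the boundary polygons into the label map.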

  2. Leaky Integrate and Fire Neuron by Charge-Discharge Dynamics in Floating-Body MOSFET.

    PubMed

    Dutta, Sangya; Kumar, Vinay; Shukla, Aditya; Mohapatra, Nihar R; Ganguly, Udayan

    2017-08-15

    Neurobiology-inspired Spiking Neural Networks (SNNs) enable efficient learning and recognition tasks. To achieve a large-scale network akin to biology, a power- and area-efficient electronic neuron is essential. Earlier, we demonstrated an LIF neuron by physics simulation of a novel 4-terminal impact-ionization-based n+/p/n+ device with an extended gate (gated-INPN). Excellent improvement in area and power compared to conventional analog circuit implementations was observed. In this paper, we propose and experimentally demonstrate a compact, conventional 3-terminal partially depleted (PD) SOI-MOSFET (100 nm gate length) to replace the 4-terminal gated-INPN device. The impact ionization (II) induced floating-body effect in the SOI-MOSFET is used to capture LIF neuron behavior and demonstrate the dependence of spiking frequency on input. MHz operation enables attractive hardware acceleration compared to biology. Overall, conventional PD-SOI CMOS technology enables very-large-scale integration (VLSI), which is essential for biology-scale (~10^11 neuron) large neural networks.
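The leaky integrate-and-fire behavior demonstrated here, a membrane potential that integrates its input, leaks toward rest, and emits a spike and resets on crossing a threshold, can be sketched in a few lines (dimensionless toy parameters of the standard LIF model, not the device's measured characteristics):

```python
def lif_spike_count(i_in, t_total=1000.0, dt=0.1, tau=10.0, v_th=1.0, v_reset=0.0):
    """Count spikes of a leaky integrate-and-fire neuron driven by constant input i_in."""
    v, spikes = v_reset, 0
    for _ in range(int(t_total / dt)):
        v += dt * (-v + i_in) / tau   # leak toward rest (0) while integrating the input
        if v >= v_th:                 # threshold crossing: emit a spike and reset
            spikes += 1
            v = v_reset
    return spikes

# spiking frequency increases with input drive, as reported for the device
low, high = lif_spike_count(1.5), lif_spike_count(3.0)
print(low, high)
```

For constant drive the model is solvable in closed form, with interspike interval tau * ln(i_in / (i_in - v_th)), which the simulated counts approach as dt shrinks.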

  3. Botswana water and surface energy balance research program. Part 2: Large scale moisture and passive microwaves

    NASA Technical Reports Server (NTRS)

    Vandegriend, A. A.; Owe, M.; Chang, A. T. C.

    1992-01-01

    The Botswana water and surface energy balance research program was developed to study and evaluate the integrated use of multispectral satellite remote sensing for monitoring the hydrological status of the Earth's surface. The research program consisted of two major, mutually related components: a surface energy balance modeling component, built around an extensive field campaign; and a passive microwave research component, which consisted of a retrospective study of large-scale moisture conditions and Nimbus scanning multichannel microwave radiometer signatures. The integrated approach of both components is explained in general, and activities performed within the passive microwave research component are summarized. The microwave theory is discussed, taking into account soil dielectric constant, emissivity, soil roughness effects, vegetation effects, optical depth, single-scattering albedo, and wavelength effects. The study site is described. The soil moisture data and their processing are considered. The relation between observed large-scale soil moisture and normalized brightness temperatures is discussed. Vegetation characteristics and inverse modeling of soil emissivity are considered.

  4. Forest Ecosystem Analysis Using a GIS

    Treesearch

    S.G. McNulty; W.T. Swank

    1996-01-01

    Forest ecosystem studies have expanded spatially in recent years to address large scale environmental issues. We are using a geographic information system (GIS) to understand and integrate forest processes at landscape to regional spatial scales. This paper presents three diverse research studies using a GIS. First, we used a GIS to develop a landscape scale model to...

  5. GIGGLE: a search engine for large-scale integrated genome analysis.

    PubMed

    Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R

    2018-02-01

    GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation.
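The core idea, ranking interval files by how strongly they overlap a set of query features, can be shown with a brute-force toy. GIGGLE itself uses a unified temporal index over sorted intervals and scores statistical enrichment; the quadratic scan and file names below are invented purely for illustration:

```python
def overlaps(a, b):
    """Half-open genomic intervals (start, end): overlap test."""
    return a[0] < b[1] and b[0] < a[1]

def rank_files(queries, files):
    """Rank indexed interval files by how many (query, interval)
    pairs overlap; a stand-in for GIGGLE's enrichment ranking."""
    scores = {name: sum(overlaps(q, iv) for q in queries for iv in ivs)
              for name, ivs in files.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])

ranked = rank_files(
    [(5, 10)],
    {"enhancers.bed": [(0, 6), (20, 30)],
     "tf_peaks.bed": [(8, 9), (9, 12), (50, 60)]})
```

Replacing the inner scan with a shared index over billions of sorted intervals is what gives GIGGLE its reported speed advantage.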

  6. Impact of Utility-Scale Distributed Wind on Transmission-Level System Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brancucci Martinez-Anido, C.; Hodge, B. M.

    2014-09-01

    This report presents a new renewable integration study that aims to assess the potential for adding distributed wind to the current power system with minimal or no upgrades to the distribution or transmission electricity systems. It investigates the impacts of integrating large amounts of utility-scale distributed wind power on bulk system operations by performing a case study on the power system of the Independent System Operator-New England (ISO-NE).

  7. GIGGLE: a search engine for large-scale integrated genome analysis

    PubMed Central

    Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R

    2018-01-01

    GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation. PMID:29309061

  8. Recent developments in microfluidic large scale integration.

    PubMed

    Araci, Ismail Emre; Brisk, Philip

    2014-02-01

    In 2002, Thorsen et al. integrated thousands of micromechanical valves on a single microfluidic chip and demonstrated that the control of fluidic networks can be simplified through multiplexors [1]. This enabled the realization of highly parallel and automated fluidic processes with a substantial sample-economy advantage. Moreover, the fabrication of these devices by multilayer soft lithography was easy and reliable, which contributed to the power of the technology: microfluidic large-scale integration (mLSI). Since then, mLSI has found use in a wide variety of applications in biology and chemistry, and efforts to improve the technology have been ongoing. These efforts mostly focus on novel materials, components, micromechanical valve actuation methods, and chip architectures for mLSI. In this review, these technological advances are discussed and recent examples of mLSI applications are summarized. Copyright © 2013 Elsevier Ltd. All rights reserved.
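The multiplexor simplification can be sketched as addressing logic: a binary-tree multiplexor controls n flow channels with only 2·ceil(log2(n)) control lines, arranged in complementary pairs. The model below is a schematic illustration (the pair encoding is an assumption of the sketch, not taken from the review):

```python
from math import ceil, log2

def control_lines(channel, n_channels):
    """Which control lines to pressurize so that only `channel` stays open.
    Lines come in complementary pairs, one pair per address bit;
    pressurizing pair member v at bit b closes every flow channel
    whose bit b equals v."""
    bits = max(1, ceil(log2(n_channels)))
    # at each bit, close all channels whose bit differs from `channel`
    return [(b, 1 - ((channel >> b) & 1)) for b in range(bits)]

lines = control_lines(5, 8)   # open channel 0b101 of 8
```

Every channel other than the selected one differs in at least one address bit, so the pressurized lines close all of them; this is why thousands of valves need only tens of control inputs.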

  9. Integrating market processes into utility resource planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahn, E.P.

    1992-11-01

    Integrated resource planning has resulted in an abundance of alternatives for meeting existing and new demand for electricity services: (1) utility demand-side management (DSM) programs, (2) DSM bidding, (3) competitive bidding for private power supplies, (4) utility re-powering, and (5) new utility construction. Each alternative relies on a different degree of planning for implementation and, therefore, each alternative relies on markets to a greater or lesser degree. This paper shows how the interaction of planning processes and market forces results in resource allocations among the alternatives. The discussion focuses on three phenomena that are driving forces behind the 'unanticipated consequences' of contemporary integrated resource planning efforts. These forces are: (1) large-scale DSM efforts, (2) customer bypass, and (3) large-scale independent power projects. 22 refs., 3 figs., 2 tabs.

  10. Centimeter-Scale 2D van der Waals Vertical Heterostructures Integrated on Deformable Substrates Enabled by Gold Sacrificial Layer-Assisted Growth.

    PubMed

    Islam, Md Ashraful; Kim, Jung Han; Schropp, Anthony; Kalita, Hirokjyoti; Choudhary, Nitin; Weitzman, Dylan; Khondaker, Saiful I; Oh, Kyu Hwan; Roy, Tania; Chung, Hee-Suk; Jung, Yeonwoong

    2017-10-11

    Two-dimensional (2D) transition metal dichalcogenides (TMDs) such as molybdenum or tungsten disulfides (MoS2 or WS2) exhibit extremely large in-plane strain limits and unusual optical/electrical properties, offering unprecedented opportunities for flexible electronics/optoelectronics in new form factors. For them to be technologically viable building blocks for such emerging technologies, it is critical to grow and integrate them onto flexible or arbitrarily shaped substrates on a large wafer scale compatible with prevailing microelectronics processes. However, conventional approaches to assembling them on such unconventional substrates, via mechanical exfoliation or co-evaporation chemical growth, have been limited to small-area transfers of 2D TMD layers with uncontrolled spatial homogeneity. Moreover, additional processes involving prolonged exposure to strong chemical etchants have been required to separate as-grown 2D layers, which is detrimental to their material properties. Herein, we report a viable strategy to universally combine the centimeter-scale growth of various 2D TMD layers and their direct assembly on mechanically deformable substrates. By exploiting the water-assisted debonding of gold (Au) interfaced with silicon dioxide (SiO2), we demonstrate the direct growth, transfer, and integration of 2D TMD layers and heterostructures, such as 2D MoS2 and 2D MoS2/WS2 vertical stacks, on centimeter-scale plastic and metal foil substrates. We identify the dual function of the Au layer as a growth substrate as well as a sacrificial layer that facilitates 2D layer transfer. Furthermore, we demonstrate the versatility of this integration approach by fabricating centimeter-scale 2D MoS2/single-walled carbon nanotube (SWNT) vertical heterojunctions, which exhibit current rectification and photoresponse. This study opens a pathway to exploring large-scale 2D TMD van der Waals layers as device building blocks for emerging mechanically deformable electronics/optoelectronics.

  11. Pioneering University/Industry Venture Explores VLSI Frontiers.

    ERIC Educational Resources Information Center

    Davis, Dwight B.

    1983-01-01

    Discusses industry-sponsored programs in semiconductor research, focusing on Stanford University's Center for Integrated Systems (CIS). CIS, while pursuing research in semiconductor very-large-scale integration, is merging the fields of computer science, information science, and physical science. Issues related to these university/industry…

  12. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    NASA Technical Reports Server (NTRS)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope-and-interface approach and applying it to modern three-dimensional computer-aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle in the form of a massive, fully detailed CAD assembly, thereby adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope-and-interface method reduces both the amount and the complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes that benefit from this approach through reduced development and design cycle time include: creation of analysis models for the aerodynamic discipline; vehicle-to-ground interface development; and documentation development for the vehicle assembly.

  13. Cost of Community Integrated Prevention Campaign for Malaria, HIV, and Diarrhea in Rural Kenya

    PubMed Central

    2011-01-01

    Background Delivery of community-based prevention services for HIV, malaria, and diarrhea is a major priority and challenge in rural Africa. Integrated delivery campaigns may offer a mechanism to achieve high coverage and efficiency. Methods We quantified the resources and costs needed to implement a large-scale integrated prevention campaign in Lurambi Division, Western Province, Kenya, that reached 47,133 individuals (83% of eligible adults) in 7 days. The campaign provided HIV testing, condoms, and prevention education materials; a long-lasting insecticide-treated bed net; and a water filter. Data were obtained primarily from logistical and expenditure records maintained by implementing partners. We estimated the projected cost of a Scaled-Up Replication (SUR), assuming reliance on local managers, potential efficiencies of scale, and other adjustments. Results The cost per person served was $41.66 for the initial campaign and was projected at $31.98 for the SUR. The SUR cost included 67% for commodities (mainly water filters and bed nets) and 20% for personnel. The SUR projected unit cost per person served, by disease, was $6.27 for malaria (nets and training), $15.80 for diarrhea (filters and training), and $9.91 for HIV (test kits, counseling, condoms, and CD4 testing at each site). Conclusions A large-scale, rapidly implemented, integrated health campaign provided services to over 80% of a rural Kenyan population at relatively low cost. Scaling up this design may provide similar services to larger populations at a lower cost per person. PMID:22189090

  14. Techniques for Large-Scale Bacterial Genome Manipulation and Characterization of the Mutants with Respect to In Silico Metabolic Reconstructions.

    PubMed

    diCenzo, George C; Finan, Turlough M

    2018-01-01

    The rate at which all genes within a bacterial genome can be identified far exceeds the ability to characterize these genes. To assist in associating genes with cellular functions, a large-scale bacterial genome deletion approach can be employed to rapidly screen tens to thousands of genes for desired phenotypes. Here, we provide a detailed protocol for the generation of deletions of large segments of bacterial genomes that relies on the activity of a site-specific recombinase. In this procedure, two recombinase recognition target sequences are introduced into known positions of a bacterial genome through single cross-over plasmid integration. Subsequent expression of the site-specific recombinase mediates recombination between the two target sequences, resulting in the excision of the intervening region and its loss from the genome. We further illustrate how this deletion system can be readily adapted to function as a large-scale in vivo cloning procedure, in which the region excised from the genome is captured as a replicative plasmid. We next provide a procedure for the metabolic analysis of bacterial large-scale genome deletion mutants using the Biolog Phenotype MicroArray™ system. Finally, a pipeline is described, and a sample Matlab script is provided, for the integration of the obtained data with a draft metabolic reconstruction for the refinement of the reactions and gene-protein-reaction relationships in a metabolic reconstruction.
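The final integration step, reconciling phenotype data with gene-protein-reaction (GPR) relationships, hinges on evaluating Boolean GPR rules against a deletion. The protocol's own pipeline uses a Matlab script; the toy Python evaluator below is an assumed stand-in (gene names and the `eval`-based approach are illustrative only):

```python
def reaction_active(gpr_rule, deleted_genes):
    """Evaluate a gene-protein-reaction (GPR) rule against a deletion set:
    'and' joins subunits of a required complex, 'or' joins isozymes.
    Gene names must be valid Python identifiers for this toy evaluator."""
    tokens = gpr_rule.replace("(", " ").replace(")", " ").split()
    genes = {t for t in tokens if t not in ("and", "or")}
    env = {g: (g not in deleted_genes) for g in genes}
    return eval(gpr_rule, {"__builtins__": {}}, env)

# Reaction needs the subA/subB complex, or the isozyme as a backup
rule = "(subA and subB) or isozymeC"
```

If a large-scale deletion removes one complex subunit but the isozyme remains, the reaction is still predicted active; observing the opposite on a Phenotype MicroArray plate flags the GPR for refinement.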

  15. A Needs Analysis for Technology Integration Plan: Challenges and Needs of Teachers

    ERIC Educational Resources Information Center

    Vatanartiran, Sinem; Karadeniz, Sirin

    2015-01-01

    Lack of technology leadership and technology integration plans are important obstacles for using technology effectively in schools. We carried out a large-scale study to be able to design a technology integration plan for one of the pilot provinces that Fatih Project was initiated. The purpose of this research is to examine the perceived…

  16. Data Integration: Charting a Path Forward to 2035

    DTIC Science & Technology

    2011-02-14

    New York, NY: Gotham Books, 2004. Seligman, Len. Mitre Corporation, e-mail interview, 6 Dec 2010. Singer, P.W. Wired for War: The Robotics... articles.aspx (accessed 4 Dec 2010). Ultra-Large-Scale Systems: The Software Challenge of the Future. Study lead Linda Northrup. Pittsburgh, PA: Carnegie Mellon Software...

  17. Breaking barriers through collaboration: the example of the Cell Migration Consortium.

    PubMed

    Horwitz, Alan Rick; Watson, Nikki; Parsons, J Thomas

    2002-10-15

    Understanding complex, integrated biological processes, such as cell migration, requires interdisciplinary approaches. The Cell Migration Consortium, funded by a Large-Scale Collaborative Project Award from the National Institute of General Medical Sciences, develops and disseminates new technologies, data, reagents, and shared information to a wide audience. The development and operation of this Consortium may provide useful insights for those who plan similarly large-scale, interdisciplinary approaches.

  18. A Programmable and Configurable Mixed-Mode FPAA SoC

    DTIC Science & Technology

    2016-03-17

    Shah, Sahil; Kim, Sihwan; Adil, Farhan; Hasler, Jennifer; George, Suma; Collins, Michelle; Richard... Abstract: This paper presents a Floating-Gate (FG) based, System-on-Chip (SoC) large-scale Field-Programmable Analog Array (FPAA) IC that integrates divergent concepts... Keywords: Floating-Gate, SoC, Command Word Classification

  19. Hierarchical Engine for Large-scale Infrastructure Co-Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-04-24

    HELICS is designed to support very-large-scale (100,000+ federates) co-simulations with off-the-shelf power-system, communication, market, and end-use tools. Other key features include cross-platform operating system support, the integration of both event-driven (e.g., packetized communication) and time-series (e.g., power flow) simulations, and the ability to co-iterate among federates to ensure physical model convergence at each time step.
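Co-iteration among federates amounts to a fixed-point loop on the exchanged boundary values within one time step. The sketch below is a generic illustration of that idea, not HELICS API code; the toy "grid" and "load" federates and their linear responses are invented:

```python
def co_iterate(fed_a, fed_b, x0, tol=1e-9, max_iter=100):
    """Fixed-point co-iteration between two coupled simulators within
    one time step: A maps boundary input x -> y, B maps y -> x.
    Iterate until the exchanged value stops changing."""
    x = x0
    for n in range(1, max_iter + 1):
        x_new = fed_b(fed_a(x))
        if abs(x_new - x) < tol:
            return x_new, n
        x = x_new
    raise RuntimeError("co-iteration did not converge")

# Toy federates: a grid whose voltage droops with load power,
# and a constant-impedance-like load reacting to that voltage
grid = lambda p: 1.0 - 0.05 * p
load = lambda v: 2.0 * v
power, iters = co_iterate(grid, load, 1.0)
```

Convergence is guaranteed here because the composed map is a contraction (slope 0.1); real co-simulations may need relaxation or Newton-like acceleration when the coupling is stiffer.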

  20. Sensitivity of the two-dimensional shearless mixing layer to the initial turbulent kinetic energy and integral length scale

    NASA Astrophysics Data System (ADS)

    Fathali, M.; Deshiri, M. Khoshnami

    2016-04-01

    The shearless mixing layer is generated from the interaction of two homogeneous isotropic turbulence (HIT) fields with different integral scales ℓ1 and ℓ2 and different turbulent kinetic energies E1 and E2. In this study, the sensitivity of temporal evolutions of two-dimensional, incompressible shearless mixing layers to the parametric variations of ℓ1/ℓ2 and E1/E2 is investigated. The sensitivity methodology is based on the nonintrusive approach, using direct numerical simulation and generalized polynomial chaos expansion. The analysis is carried out at Re_ℓ1 = 90 for the high-energy HIT region and different integral length scale ratios 1/4 ≤ ℓ1/ℓ2 ≤ 4 and turbulent kinetic energy ratios 1 ≤ E1/E2 ≤ 30. It is found that the most influential parameter on the variability of the mixing layer evolution is the turbulent kinetic energy, while variations of the integral length scale show a negligible influence on the flow field variability. A significant level of anisotropy and intermittency is observed in both large and small scales. In particular, it is found that large scales have higher levels of intermittency and sensitivity to the variations of ℓ1/ℓ2 and E1/E2 compared to the small scales. Reconstructed response surfaces of the flow field intermittency and the turbulent penetration depth show monotonic dependence on ℓ1/ℓ2 and E1/E2. The mixing layer growth rate and the mixing efficiency both show sensitive dependence on the initial condition parameters. However, the probability density function of these quantities shows relatively small solution variations in response to the variations of the initial condition parameters.
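The nonintrusive generalized polynomial chaos (gPC) approach can be sketched for two uniform inputs: sample the parameters, fit a Legendre expansion by least squares, and read variance contributions off the squared coefficients. This is a minimal sketch with an assumed toy response, far simpler than the paper's DNS-driven study:

```python
import numpy as np

def gpc_variance_shares(f, deg=3, n_samp=400, seed=0):
    """Nonintrusive gPC for two inputs uniform on [-1, 1]: least-squares
    Legendre fit, then split output variance between the inputs."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, (n_samp, 2))
    y = f(x[:, 0], x[:, 1])
    idx = [(i, j) for i in range(deg + 1) for j in range(deg + 1) if i + j <= deg]
    basis = np.polynomial.legendre.Legendre.basis
    P = np.column_stack([basis(i)(x[:, 0]) * basis(j)(x[:, 1]) for i, j in idx])
    c, *_ = np.linalg.lstsq(P, y, rcond=None)
    # E[P_i(U)^2] = 1/(2i+1) for U ~ Uniform(-1, 1); (0, 0) is the mean
    var = {ij: c[k] ** 2 / ((2 * ij[0] + 1) * (2 * ij[1] + 1))
           for k, ij in enumerate(idx) if ij != (0, 0)}
    total = sum(var.values())
    s1 = sum(v for (i, j), v in var.items() if j == 0) / total  # input 1 only
    s2 = sum(v for (i, j), v in var.items() if i == 0) / total  # input 2 only
    return s1, s2

# Toy response dominated by the first parameter, mimicking the finding
# that the energy ratio drives variability far more than the length-scale ratio
s1, s2 = gpc_variance_shares(lambda e1, e2: e1 + 0.1 * e2)
```

The returned shares are first-order Sobol-type indices; for this additive toy response they sum to one because there is no interaction term.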

  1. Evaluation of advanced microelectronics for inclusion in MIL-STD-975

    NASA Technical Reports Server (NTRS)

    Scott, W. Richard

    1991-01-01

    The approach taken by NASA and JPL (Jet Propulsion Laboratory) in the development of a MIL-STD-975 section containing advanced technology such as Large Scale Integration and Very Large Scale Integration (LSI/VLSI) microelectronic devices is described. The parts listed in this section are recommended as satisfactory for NASA flight applications, in the absence of alternate qualified devices, based on satisfactory results of a vendor capability audit, the availability of sufficient characterization and reliability data from manufacturers and users, and negotiated detailed procurement specifications. The criteria used in the selection and evaluation of the vendors and candidate parts, the preparation of procurement specifications, and the status of this activity are discussed.

  2. Large-Scale Document Automation: The Systems Integration Issue.

    ERIC Educational Resources Information Center

    Kalthoff, Robert J.

    1985-01-01

    Reviews current technologies for electronic imaging and its recording and transmission, including digital recording, optical data disks, automated image-delivery micrographics, high-density-magnetic recording, and new developments in telecommunications and computers. The role of the document automation systems integrator, who will bring these…

  3. Integrating Carbon Nanotubes For Atomic Force Microscopy Imaging Applications

    NASA Technical Reports Server (NTRS)

    Ye, Qi; Cassell, Alan M.; Liu, Hongbing; Han, Jie; Meyyappan, Meyya

    2004-01-01

    Carbon nanotube (CNT) related nanostructures possess remarkable electrical, mechanical, and thermal properties. To produce these nanostructures for real-world applications, large-scale controlled growth of carbon nanotubes is crucial for the integration and fabrication of nanodevices and nanosensors. We have taken the approach of integrating nanopatterning and nanomaterials synthesis with traditional silicon microfabrication techniques. This integration requires a catalyst or nanomaterial protection scheme. In this paper, we report our recent work on fabricating wafer-scale carbon nanotube AFM cantilever probe tips. We address the design and fabrication considerations in detail and present preliminary scanning probe test results. This work may serve as an example of rational design, fabrication, and integration of nanomaterials for advanced nanodevice and nanosensor applications.

  4. The revolution in data gathering systems

    NASA Technical Reports Server (NTRS)

    Cambra, J. M.; Trover, W. F.

    1975-01-01

    Data acquisition systems used in NASA's wind tunnels from the 1950's through the present time are summarized as a baseline for assessing the impact of minicomputers and microcomputers on data acquisition and data processing. Emphasis is placed on the cyclic evolution in computer technology which transformed the central computer system, and finally the distributed computer system. Other developments discussed include: medium scale integration, large scale integration, combining the functions of data acquisition and control, and micro and minicomputers.

  5. The Plant Genome Integrative Explorer Resource: PlantGenIE.org.

    PubMed

    Sundell, David; Mannapperuma, Chanaka; Netotea, Sergiu; Delhomme, Nicolas; Lin, Yao-Cheng; Sjödin, Andreas; Van de Peer, Yves; Jansson, Stefan; Hvidsten, Torgeir R; Street, Nathaniel R

    2015-12-01

    Accessing and exploring large-scale genomics data sets remains a significant challenge to researchers without specialist bioinformatics training. We present the integrated PlantGenIE.org platform for exploration of Populus, conifer and Arabidopsis genomics data, which includes expression networks and associated visualization tools. Standard features of a model organism database are provided, including genome browsers, gene list annotation, BLAST homology searches and gene information pages. Community annotation updating is supported via integration of WebApollo. We have produced an RNA-sequencing (RNA-Seq) expression atlas for Populus tremula and have integrated these data within the expression tools. An updated version of the ComPlEx resource for performing comparative plant expression analyses of gene coexpression network conservation between species has also been integrated. The PlantGenIE.org platform provides intuitive access to large-scale and genome-wide genomics data from model forest tree species, facilitating both community contributions to annotation improvement and tools supporting use of the included data resources to inform biological insight. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.

  6. The OLI Radiometric Scale Realization Round Robin Measurement Campaign

    NASA Technical Reports Server (NTRS)

    Cutlip, Hansford; Cole, Jerold; Johnson, B. Carol; Maxwell, Stephen; Markham, Brian; Ong, Lawrence; Hom, Milton; Biggar, Stuart

    2011-01-01

    A round-robin radiometric scale realization was performed at the Ball Aerospace Radiometric Calibration Laboratory in January/February 2011 in support of the Operational Land Imager (OLI) Program. Participants included Ball Aerospace, NIST, NASA Goddard Space Flight Center, and the University of Arizona. The eight-day campaign included multiple observations of three integrating sphere sources by nine radiometers. The objective of the campaign was to validate the radiance calibration uncertainty ascribed to the integrating sphere used to calibrate the OLI instrument. The instrument-level calibration source uncertainty was validated by quantifying: (1) the long-term stability of the NIST-calibrated radiance artifact, (2) the responsivity scale of the Ball Aerospace transfer radiometer, and (3) the operational characteristics of the large integrating sphere.

  7. Large-scale quantum photonic circuits in silicon

    NASA Astrophysics Data System (ADS)

    Harris, Nicholas C.; Bunandar, Darius; Pant, Mihir; Steinbrecher, Greg R.; Mower, Jacob; Prabhu, Mihika; Baehr-Jones, Tom; Hochberg, Michael; Englund, Dirk

    2016-08-01

    Quantum information science offers inherently more powerful methods for communication, computation, and precision measurement that take advantage of quantum superposition and entanglement. In recent years, theoretical and experimental advances in quantum computing and simulation with photons have spurred great interest in developing large photonic entangled states that challenge today's classical computers. As experiments have increased in complexity, there has been an increasing need to transition bulk optics experiments to integrated photonics platforms to control more spatial modes with higher fidelity and phase stability. The silicon-on-insulator (SOI) nanophotonics platform offers new possibilities for quantum optics, including the integration of bright, nonclassical light sources, based on the large third-order nonlinearity (χ(3)) of silicon, alongside quantum state manipulation circuits with thousands of optical elements, all on a single phase-stable chip. How large do these photonic systems need to be? Recent theoretical work on Boson Sampling suggests that even the problem of sampling the output distribution of roughly 30 identical photons, having passed through an interferometer of hundreds of modes, becomes challenging for classical computers. While experiments of this size are still challenging, the SOI platform has the required component density to enable low-loss and programmable interferometers for manipulating hundreds of spatial modes. Here, we discuss the SOI nanophotonics platform for quantum photonic circuits with hundreds-to-thousands of optical elements and the associated challenges. We compare SOI to competing technologies in terms of requirements for quantum optical systems. We review recent results on large-scale quantum state evolution circuits and strategies for realizing high-fidelity heralded gates with imperfect, practical systems. Next, we review recent results on silicon photonics-based photon-pair sources and device architectures, and we discuss a path towards large-scale source integration. Finally, we review monolithic integration strategies for single-photon detectors and their essential role in on-chip feed-forward operations.
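The classical-hardness claim behind Boson Sampling rests on the matrix permanent: photon output probabilities are given by permanents of submatrices of the interferometer's unitary, and the permanent is #P-hard. A compact (still exponential-time) evaluator via Ryser's formula makes the scaling concrete; this is a standard algorithm, included here only as an illustration:

```python
from itertools import combinations

def permanent(m):
    """Matrix permanent via Ryser's formula, O(2^n * n^2) for an n x n
    matrix; no known polynomial-time algorithm exists."""
    n = len(m)
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in m:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total
```

Even with Ryser's formula, the cost doubles with every added photon, which is why interferometers handling on the order of 30 photons already strain classical simulation.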

  8. Workflow management in large distributed systems

    NASA Astrophysics Data System (ADS)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a near-real-time monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems. All the monitoring information gathered for these subsystems is essential for developing the required higher-level services (the components that provide decision support and some degree of automated decisions) and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services, including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resources to running jobs, and automated management of remote services among a large set of grid facilities.

  9. Large Scale Integrated Circuits for Military Applications.

    DTIC Science & Technology

    1977-05-01

    The economic incentive for narrowing this gap is examined. Two categories of cost are analyzed: the direct life cycle cost of the integrated circuit... The dependence of these costs on the physical characteristics of the integrated circuits is discussed. The economic and physical characteristics of...

  10. Book Review: Large-Scale Ecosystem Restoration: Five Case Studies from the United States

    EPA Science Inventory

    Broad-scale ecosystem restoration efforts involve a very complex set of ecological and societal components, and the success of any ecosystem restoration project rests on an integrated approach to implementation. Editors Mary Doyle and Cynthia Drew have successfully synthesized ma...

  11. Large scale integration of graphene transistors for potential applications in the back end of the line

    NASA Astrophysics Data System (ADS)

    Smith, A. D.; Vaziri, S.; Rodriguez, S.; Östling, M.; Lemme, M. C.

    2015-06-01

    A chip-to-wafer-scale, CMOS-compatible method of graphene device fabrication has been established, which can be integrated into the back end of the line (BEOL) of conventional semiconductor process flows. In this paper, we present experimental results for graphene field-effect transistors (GFETs) fabricated using this wafer-scalable method. The carrier mobilities in these transistors reach up to several hundred cm2 V-1 s-1. Further, these devices exhibit current saturation regions similar to graphene devices fabricated using mechanical exfoliation. The overall performance of the GFETs cannot yet compete with record values reported for devices based on mechanically exfoliated material. Nevertheless, this large-scale approach is an important step towards reliability and variability studies, as well as optimization of device aspects such as electrical contacts and dielectric interfaces, with statistically relevant numbers of devices. It is also an important milestone towards introducing graphene into wafer-scale process lines.

  12. Scale interaction and arrangement in a turbulent boundary layer perturbed by a wall-mounted cylindrical element

    NASA Astrophysics Data System (ADS)

    Tang, Zhanqi; Jiang, Nan

    2018-05-01

    This study reports modifications of scale interaction and arrangement in a turbulent boundary layer perturbed by a wall-mounted circular cylinder. Hot-wire measurements were taken at multiple streamwise and wall-normal locations downstream of the cylindrical element. The streamwise fluctuating signals were decomposed into large-, small-, and dissipative-scale signatures by corresponding cutoff filters. The scale interaction under the cylindrical perturbation was elaborated by comparing the small- and dissipative-scale amplitude/frequency modulation effects downstream of the cylinder element with the results observed in the unperturbed case. It was found that the large-scale fluctuations exert a stronger amplitude modulation on both the small and dissipative scales in the near-wall region. At wall-normal positions of the cylinder height, the small-scale amplitude modulation coefficients are redistributed by the cylinder wake. A similar observation was noted in small-scale frequency modulation; however, the dissipative-scale frequency modulation seems to be independent of the cylindrical perturbation. The phase-relationship observation indicated that the cylindrical perturbation shortens the time shifts between both the small- and dissipative-scale variations (amplitude and frequency) and the large-scale fluctuations. The integral-time-scale dependence of the phase relationship between the small/dissipative scales and large scales is also discussed. Furthermore, the discrepancy between small- and dissipative-scale time shifts relative to the large-scale motions was examined, which indicates that the small-scale amplitude/frequency leads the dissipative scales.
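An amplitude-modulation coefficient of the kind discussed above is commonly computed as the correlation between the large-scale component of the signal and the low-pass-filtered envelope of its small-scale component. The sketch below uses a simple spectral cutoff and a rectified envelope; the cutoff fraction and the synthetic test signal are assumptions of the example, not values from the paper:

```python
import numpy as np

def am_coefficient(u, cutoff_frac=0.01):
    """Amplitude-modulation coefficient of a fluctuating signal:
    correlate its large-scale (low-pass) part with the low-pass
    filtered envelope of its small-scale (high-pass) part."""
    n = len(u)
    spec = np.fft.rfft(u - u.mean())
    k_cut = max(1, int(cutoff_frac * len(spec)))
    low, high = spec.copy(), spec.copy()
    low[k_cut:] = 0.0                      # large scales
    high[:k_cut] = 0.0                     # small scales
    u_large = np.fft.irfft(low, n)
    u_small = np.fft.irfft(high, n)
    env = np.abs(u_small)                  # rectified envelope
    env_spec = np.fft.rfft(env - env.mean())
    env_spec[k_cut:] = 0.0                 # keep envelope's large scales
    env_large = np.fft.irfft(env_spec, n)
    return float(np.corrcoef(u_large, env_large)[0, 1])

# Synthetic signal: small scales amplified where the large scale is high
t = np.arange(4096) / 4096.0
u = (np.sin(2 * np.pi * 2 * t)
     + (1 + 0.5 * np.sin(2 * np.pi * 2 * t)) * np.sin(2 * np.pi * 50 * t))
r = am_coefficient(u)
```

A positive coefficient indicates that small-scale activity is amplified where the large-scale signal is high, which is the modulation effect the study quantifies; introducing a time shift between signal and envelope recovers the phase relationships discussed above.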

  13. Tethys – A Python Package for Spatial and Temporal Downscaling of Global Water Withdrawals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xinya; Vernon, Chris R.; Hejazi, Mohamad I.

    Downscaling of water withdrawals from regional/national to local scale is a fundamental step, and a common problem, when integrating large-scale economic and integrated assessment models with high-resolution, sectorally detailed models. Tethys, an open-access software package written in Python, implements statistical downscaling algorithms to spatially and temporally downscale water withdrawal data to a finer scale. The spatial resolution is downscaled from region/basin scale to grid (0.5 geographic degree) scale, and the temporal resolution from year to month. Tethys is used to produce monthly global gridded water withdrawal products based on estimates from the Global Change Assessment Model (GCAM).
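
    Tethys's actual algorithms are more elaborate, but the core proportional-allocation idea behind this kind of downscaling can be sketched as follows (the cell weights and monthly profile below are hypothetical; Tethys derives such shares from population, sectoral, and climate data):

```python
import numpy as np

def downscale(annual_region_total, cell_weights, monthly_profile):
    """Spread a regional annual withdrawal over grid cells in
    proportion to cell_weights, then over months by a profile.
    Returns result[i, m] = withdrawal of cell i in month m."""
    w = np.asarray(cell_weights, dtype=float)
    w = w / w.sum()                       # spatial shares sum to 1
    p = np.asarray(monthly_profile, dtype=float)
    p = p / p.sum()                       # temporal shares sum to 1
    return annual_region_total * np.outer(w, p)

# hypothetical example: 3 cells weighted by population, 12 months
cells = [1200, 300, 500]                         # population per cell
profile = [1, 1, 1, 1, 2, 3, 3, 3, 2, 1, 1, 1]   # summer-peaking demand
grid = downscale(10.0, cells, profile)
print(grid.shape)            # (3, 12)
print(round(grid.sum(), 6))  # 10.0 -- the regional total is conserved
```

Conservation of the regional total under disaggregation is the key invariant any downscaling scheme of this type must preserve.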

  14. Tethys – A Python Package for Spatial and Temporal Downscaling of Global Water Withdrawals

    DOE PAGES

    Li, Xinya; Vernon, Chris R.; Hejazi, Mohamad I.; ...

    2018-02-09

    Downscaling of water withdrawals from regional/national to local scale is a fundamental step, and a common problem, when integrating large-scale economic and integrated assessment models with high-resolution, sectorally detailed models. Tethys, an open-access software package written in Python, implements statistical downscaling algorithms to spatially and temporally downscale water withdrawal data to a finer scale. The spatial resolution is downscaled from region/basin scale to grid (0.5 geographic degree) scale, and the temporal resolution from year to month. Tethys is used to produce monthly global gridded water withdrawal products based on estimates from the Global Change Assessment Model (GCAM).

  15. The application of liquid air energy storage for large scale long duration solutions to grid balancing

    NASA Astrophysics Data System (ADS)

    Brett, Gareth; Barnett, Matthew

    2014-12-01

    Liquid Air Energy Storage (LAES) provides large-scale, long-duration energy storage at the point of demand in the 5 MW/20 MWh to 100 MW/1,000 MWh range. LAES combines mature components from the industrial gas and electricity industries, assembled in a novel process, and is one of the few storage technologies that can be delivered at large scale with no geographical constraints. The system uses no exotic materials or scarce resources, and all major components have a proven lifetime of 25+ years. The system can also integrate low-grade waste heat to increase power output. Founded in 2005, Highview Power Storage is a UK-based developer of LAES. The company has taken the concept from academic analysis through laboratory testing, and in 2011 commissioned the world's first fully integrated system at pilot-plant scale (300 kW/2.5 MWh), hosted at SSE's (Scottish & Southern Energy) 80 MW biomass plant in Greater London and partly funded by a Department of Energy and Climate Change (DECC) grant. Highview is now working with commercial customers to deploy multi-MW commercial reference plants in the UK and abroad.
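
    The quoted power/energy ratings imply discharge durations, which is a quick sanity check worth making explicit for storage specifications of this form:

```python
def duration_h(power_mw, energy_mwh):
    """Discharge duration in hours at rated power."""
    return energy_mwh / power_mw

print(duration_h(5, 20))      # 4.0 h at the small end of the range
print(duration_h(100, 1000))  # 10.0 h at the large end
print(duration_h(0.3, 2.5))   # ~8.3 h for the 300 kW/2.5 MWh pilot
```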

  16. Synopsis of emergent approaches

    Treesearch

    Malcolm North; Brandon Collins; John Keane; Jonathan W. Long; Carl Skinner; Bill Zielinski

    2014-01-01

    This synopsis presents three integrated themes that emerged from synthesizing information about biological resources. These themes become particularly important when managing forests to promote resilience at large landscape scales and long timeframes. This synopsis summarizes ideas in the longer chapter 1.2, “Integrative Approaches: Promoting Socioecological Resilience...

  17. Integration of MnO2 thin film and carbon nanotubes to three-dimensional carbon microelectrodes for electrochemical microcapacitors

    NASA Astrophysics Data System (ADS)

    Jiang, Shulan; Shi, Tielin; Liu, Dan; Long, Hu; Xi, Shuang; Wu, Fengshun; Li, Xiaoping; Xia, Qi; Tang, Zirong

    2014-09-01

    Large-scale three-dimensional (3D) hybrid microelectrodes have been fabricated through a modified carbon microelectromechanical systems (carbon-MEMS) process and an electrochemical deposition method. Greatly improved electrochemical performance is shown for the 3D photoresist-derived carbon microelectrodes with the integration of carbon nanotubes (CNTs) and manganese dioxide (MnO2). Electrochemical measurements of the microelectrodes indicate that the specific geometric capacitance reaches up to 238 mF cm-2 at a current density of 0.5 mA cm-2. The capacitance loss is less than 18.2% of the original value after 6000 charge-discharge cycles. This study shows that stacking MnO2 film and integrating CNTs onto the 3D glassy carbon microelectrodes has great potential for on-chip microcapacitors as energy storage devices, and the presented approach is promising for large-scale, low-cost manufacturing.

  18. Multirate Particle-in-Cell Time Integration Techniques of Vlasov-Maxwell Equations for Collisionless Kinetic Plasma Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Guangye; Chacon, Luis; Knoll, Dana Alan

    2015-07-31

    A multi-rate PIC formulation was developed that employs large timesteps for slow field evolution, and small (adaptive) timesteps for particle orbit integrations. The implementation is based on a JFNK solver with nonlinear elimination and moment preconditioning. The approach is free of numerical instabilities (ω_pe Δt ≫ 1 and Δx ≫ λ_D), and requires many fewer degrees of freedom (vs. explicit PIC) for comparable accuracy in challenging problems. Significant gains (vs. conventional explicit PIC) may be possible for large-scale simulations. The paper is organized as follows: Vlasov-Maxwell particle-in-cell (PIC) methods for plasmas; explicit, semi-implicit, and implicit time integrations; implicit PIC formulation (Jacobian-Free Newton-Krylov (JFNK) with nonlinear elimination allows different treatments of disparate scales, and discrete conservation properties (energy, charge, canonical momentum, etc.)); some numerical examples; and a summary.

  19. Multidimensional quantum entanglement with large-scale integrated optics.

    PubMed

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong; Santagati, Raffaele; Skrzypczyk, Paul; Salavrakos, Alexia; Tura, Jordi; Augusiak, Remigiusz; Mančinska, Laura; Bacco, Davide; Bonneau, Damien; Silverstone, Joshua W; Gong, Qihuang; Acín, Antonio; Rottwitt, Karsten; Oxenløwe, Leif K; O'Brien, Jeremy L; Laing, Anthony; Thompson, Mark G

    2018-04-20

    The ability to control multidimensional quantum systems is central to the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control, and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimensions up to 15 × 15 on a large-scale silicon photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality, and controllability of our multidimensional technology, and further exploit these abilities to demonstrate previously unexplored quantum applications, such as quantum randomness expansion and self-testing on multidimensional states. Our work provides an experimental platform for the development of multidimensional quantum technologies. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  20. Efficient data management in a large-scale epidemiology research project.

    PubMed

    Meyer, Jens; Ostrzinski, Stefan; Fredrich, Daniel; Havemann, Christoph; Krafczyk, Janina; Hoffmann, Wolfgang

    2012-09-01

    This article describes the concept of a "Central Data Management" (CDM) and its implementation within the large-scale population-based medical research project "Personalized Medicine". The CDM can be summarized as a conjunction of data capturing, data integration, data storage, data refinement, and data transfer. A wide spectrum of reliable "Extract Transform Load" (ETL) software for automatic integration of data, as well as "electronic Case Report Forms" (eCRFs), was developed in order to integrate decentralized and heterogeneously captured data. Due to the high sensitivity of the captured data, high system resource availability, data privacy, data security, and quality assurance are of utmost importance. A complex data model was developed and implemented using an Oracle database in high-availability cluster mode in order to integrate different types of participant-related data. Intelligent data capturing and storage mechanisms improve the quality of the data. Data privacy is ensured by a multi-layered role/rights system for access control and by de-identification of identifying data. A well-defined backup process prevents data loss. Over a period of one and a half years, the CDM has captured a wide variety of data, approximately 5 terabytes in magnitude, without any critical incidents of system breakdown or loss of data. The aim of this article is to demonstrate one possible way of establishing a Central Data Management in large-scale medical and epidemiological studies. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
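
    The abstract does not detail the de-identification machinery, but one standard ingredient is splitting identifying fields from research data under a keyed pseudonym, so that direct identifiers never enter the research database. A minimal sketch (the field names and keying scheme here are hypothetical, not the project's actual design):

```python
import hashlib
import hmac

SECRET = b"site-specific-key"  # in practice, held by a separate trust centre

def pseudonym(participant_id: str) -> str:
    """Keyed hash of the participant ID; without the key, the
    pseudonym cannot be linked back to the identifier."""
    digest = hmac.new(SECRET, participant_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

record = {"id": "P-000123", "name": "Jane Doe", "hba1c": 5.9}
identifying = {k: record[k] for k in ("id", "name")}  # stays with the trust centre
research = {"pid": pseudonym(record["id"]), "hba1c": record["hba1c"]}
print(sorted(research))  # ['hba1c', 'pid'] -- no direct identifiers remain
```

Re-identification, where legitimately needed, then requires cooperation of the key holder, which is one way a multi-layered access-control scheme can be enforced technically.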

  1. Distributed Storage Inverter and Legacy Generator Integration Plus Renewables Solution for Microgrids

    DTIC Science & Technology

    2015-07-01

    Acronyms (fragment): reactive power (kVAR); kilowatts (kW); lithium-ion (Li-Ion); lithium-titanate oxide (nLTO); natural gas (NG); performance objectives (PO); photovoltaic (PV); power ...cloud-covered) periods. The demonstration features a large (relative to the overall system power requirements) photovoltaic solar array, whose inverter ...microgrid with less expensive power storage instead of large-scale energy storage, and that the renewable energy with small-scale power storage can ...

  2. The Cognitive and Attitudinal Effects of Arts Integrated Instruction on 4th and 5th Grade Elementary School Students

    ERIC Educational Resources Information Center

    Chand O'Neal, Ivonne

    2017-01-01

    Integration of art into school curricula has the potential to increase multiple abilities in student populations. However, the process through which cognition and behavior are affected is not well understood. The current study is among the first large-scale, longitudinal examinations of arts integration on improving student creativity and…

  3. China’s new-age small farms and their vertical integration: agribusiness or co-ops?

    PubMed

    Huang, Philip C C

    2011-01-01

    The future of Chinese agriculture lies not with large mechanized farms but with small capital-labor dual intensifying family farms for livestock-poultry-fish raising and vegetable-fruit cultivation. Chinese food consumption patterns have been changing from the old 8:1:1 pattern of 8 parts grain, 1 part meat, and 1 part vegetables to a 4:3:3 pattern, with a corresponding transformation in agricultural structure. Small family-farming is better suited for the new-age agriculture, including organic farming, than large-scale mechanized farming, because of the intensive, incremental, and variegated hand labor involved, not readily open to economies of scale, though compatible with economies of scope. It is also better suited to the realities of severe population pressure on land. But it requires vertical integration from cultivation to processing to marketing, albeit without horizontal integration for farming. It is against such a background that co-ops have arisen spontaneously for integrating small farms with processing and marketing. The Chinese government, however, has been supporting aggressively capitalistic agribusinesses as the preferred mode of vertical integration. At present, Chinese agriculture is poised at a crossroads, with the future organizational mode for vertical integration as yet uncertain.

  4. ReSeqTools: an integrated toolkit for large-scale next-generation sequencing based resequencing analysis.

    PubMed

    He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z

    2013-12-04

    Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.

  5. One-step assembly and targeted integration of multigene constructs assisted by the I-SceI meganuclease in Saccharomyces cerevisiae

    PubMed Central

    Kuijpers, Niels GA; Chroumpi, Soultana; Vos, Tim; Solis-Escalante, Daniel; Bosman, Lizanne; Pronk, Jack T; Daran, Jean-Marc; Daran-Lapujade, Pascale

    2013-01-01

    In vivo assembly of overlapping fragments by homologous recombination in Saccharomyces cerevisiae is a powerful method to engineer large DNA constructs. Whereas most in vivo assembly methods reported to date result in circular vectors, stable integrated constructs are often preferred for metabolic engineering as they are required for large-scale industrial application. The present study explores the potential of combining in vivo assembly of large, multigene expression constructs with their targeted chromosomal integration in S. cerevisiae. Combined assembly and targeted integration of a ten-fragment 22-kb construct to a single chromosomal locus was successfully achieved in a single transformation process, but with low efficiency (5% of the analyzed transformants contained the correctly assembled construct). The meganuclease I-SceI was therefore used to introduce a double-strand break at the targeted chromosomal locus, thus to facilitate integration of the assembled construct. I-SceI-assisted integration dramatically increased the efficiency of assembly and integration of the same construct to 95%. This study paves the way for the fast, efficient, and stable integration of large DNA constructs in S. cerevisiae chromosomes. PMID:24028550

  6. LARGE SCALE DISASTER ANALYSIS AND MANAGEMENT: SYSTEM LEVEL STUDY ON AN INTEGRATED MODEL

    EPA Science Inventory

    The increasing intensity and scale of human activity across the globe leading to severe depletion and deterioration of the Earth's natural resources has meant that sustainability has emerged as a new paradigm of analysis and management. Sustainability, conceptually defined by the...

  7. Flexible, High-Speed CdSe Nanocrystal Integrated Circuits.

    PubMed

    Stinner, F Scott; Lai, Yuming; Straus, Daniel B; Diroll, Benjamin T; Kim, David K; Murray, Christopher B; Kagan, Cherie R

    2015-10-14

    We report large-area, flexible, high-speed analog and digital colloidal CdSe nanocrystal integrated circuits operating at low voltages. Using photolithography and a newly developed process to fabricate vertical interconnect access holes, we scale down device dimensions, reducing parasitic capacitances and increasing the frequency of circuit operation, and scale up device fabrication over 4-inch flexible substrates. We demonstrate amplifiers with ∼7 kHz bandwidth, ring oscillators with <10 μs stage delays, and NAND and NOR logic gates.

  8. An integrated approach to reconstructing genome-scale transcriptional regulatory networks

    DOE PAGES

    Imam, Saheed; Noguera, Daniel R.; Donohue, Timothy J.; ...

    2015-02-27

    Transcriptional regulatory networks (TRNs) program cells to dynamically alter their gene expression in response to changing internal or environmental conditions. In this study, we develop a novel workflow for generating large-scale TRN models that integrates comparative genomics data, global gene expression analyses, and intrinsic properties of transcription factors (TFs). An assessment of this workflow using benchmark datasets for the well-studied γ-proteobacterium Escherichia coli showed that it outperforms expression-based inference approaches, having a significantly larger area under the precision-recall curve. Further analysis indicated that this integrated workflow captures different aspects of the E. coli TRN than expression-based approaches, potentially making them highly complementary. We leveraged this new workflow and observations to build a large-scale TRN model for the α-proteobacterium Rhodobacter sphaeroides that comprises 120 gene clusters, 1211 genes (including 93 TFs), 1858 predicted protein-DNA interactions and 76 DNA binding motifs. We found that ~67% of the predicted gene clusters in this TRN are enriched for functions ranging from photosynthesis or central carbon metabolism to environmental stress responses. We also found that members of many of the predicted gene clusters were consistent with prior knowledge in R. sphaeroides and/or other bacteria. Experimental validation of predictions from this R. sphaeroides TRN model showed that high precision and recall were also obtained for TFs involved in photosynthesis (PpsR), carbon metabolism (RSP_0489) and iron homeostasis (RSP_3341). In addition, this integrative approach enabled generation of TRNs with increased information content relative to R. sphaeroides TRN models built via other approaches. We also show how this approach can be used to simultaneously produce TRN models for each related organism used in the comparative genomics analysis. Our results highlight the advantages of integrating comparative genomics of closely related organisms with gene expression data to assemble large-scale TRN models with high-quality predictions.
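
    The benchmark comparison in this abstract rests on precision and recall of predicted TF-gene interactions against a gold standard. A toy sketch of that scoring over edge sets (the edges below are illustrative, not from the paper's datasets):

```python
def precision_recall(predicted, gold):
    """Precision and recall of predicted TF->gene edges against a
    gold-standard edge set."""
    tp = len(predicted & gold)                       # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall

gold = {("crp", "lacZ"), ("fnr", "narG"), ("arcA", "sdhC")}
pred = {("crp", "lacZ"), ("fnr", "narG"), ("crp", "araB")}
p, r = precision_recall(pred, gold)
print(round(p, 3), round(r, 3))  # 0.667 0.667
```

Sweeping a confidence threshold over ranked predictions and recomputing these two numbers at each cut traces out the precision-recall curve whose area the workflow comparison uses.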

  9. Architectural Visualization of C/C++ Source Code for Program Comprehension

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panas, T; Epperly, T W; Quinlan, D

    2006-09-01

    Structural and behavioral visualization of large-scale legacy systems to aid program comprehension is still a major challenge. The challenge is even greater when applications are implemented in flexible and expressive languages such as C and C++. In this paper, we consider visualization of static and dynamic aspects of large-scale scientific C/C++ applications. For our investigation, we reuse and integrate specialized analysis and visualization tools. Furthermore, we present a novel layout algorithm that permits a compressive architectural view of a large-scale software system. Our layout is unique in that it allows traditional program visualizations, i.e., graph structures, to be seen in relation to the application's file structure.

  10. Scalable ion-photon quantum interface based on integrated diffractive mirrors

    NASA Astrophysics Data System (ADS)

    Ghadimi, Moji; Blūms, Valdis; Norton, Benjamin G.; Fisher, Paul M.; Connell, Steven C.; Amini, Jason M.; Volin, Curtis; Hayden, Harley; Pai, Chien-Shing; Kielpinski, David; Lobino, Mirko; Streed, Erik W.

    2017-12-01

    Quantum networking links quantum processors through remote entanglement for distributed quantum information processing and secure long-range communication. Trapped ions are a leading quantum information processing platform, having demonstrated universal small-scale processors and roadmaps for large-scale implementation. Overall rates of ion-photon entanglement generation, essential for remote trapped ion entanglement, are limited by coupling efficiency into single mode fibers and scaling to many ions. Here, we show a microfabricated trap with integrated diffractive mirrors that couples 4.1(6)% of the fluorescence from a 174Yb+ ion into a single mode fiber, nearly triple the demonstrated bulk optics efficiency. The integrated optic collects 5.8(8)% of the π transition fluorescence, images the ion with sub-wavelength resolution, and couples 71(5)% of the collected light into the fiber. Our technology is suitable for entangling multiple ions in parallel and overcomes mode quality limitations of existing integrated optical interconnects.
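
    The quoted efficiencies are mutually consistent: the overall fiber-coupled fraction is the product of the collection fraction and the fiber-coupling fraction.

```python
collection = 0.058  # fraction of pi-transition fluorescence collected: 5.8(8)%
fiber = 0.71        # fraction of collected light coupled into the fiber: 71(5)%
overall = collection * fiber
print(round(overall * 100, 1))  # 4.1 -- matching the quoted 4.1(6)%
```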

  11. Airframe-Jet Engine Integration Noise

    NASA Technical Reports Server (NTRS)

    Tam, Christopher; Antcliff, Richard R. (Technical Monitor)

    2003-01-01

    It has been found experimentally that the noise radiated by a jet mounted under the wing of an aircraft exceeds that of the same jet in a stand-alone environment. The increase in noise is referred to as jet engine airframe integration noise. The objectives of the present investigation are: (1) to obtain a better understanding of the physical mechanisms responsible for jet engine airframe integration noise, or installation noise; and (2) to develop a prediction model for jet engine airframe integration noise. It is known that jet mixing noise consists of two principal components: the noise from the large turbulence structures of the jet flow, and the noise from the fine-scale turbulence. In this investigation, only the effect of jet engine airframe interaction on the fine-scale turbulence noise of a jet is studied. The fine-scale turbulence noise is the dominant noise component in the sideline direction. Thus we limit our consideration primarily to the sideline.

  12. Strategies for Large Scale Implementation of a Multiscale, Multiprocess Integrated Hydrologic Model

    NASA Astrophysics Data System (ADS)

    Kumar, M.; Duffy, C.

    2006-05-01

    Distributed models simulate hydrologic state variables in space and time while taking into account heterogeneities in terrain, surface and subsurface properties, and meteorological forcings. The computational cost and complexity associated with these models increase with their tendency to accurately simulate a large number of interacting physical processes at fine spatio-temporal resolution in a large basin. A hydrologic model run on a coarse spatial discretization of the watershed with a limited number of physical processes needs less computation, but this negatively affects the accuracy of model results and restricts the physical realization of the problem. It is therefore imperative to have an integrated modeling strategy (a) which can be universally applied at various scales in order to study the tradeoffs between computational complexity (determined by spatio-temporal resolution), accuracy, and predictive uncertainty in relation to various approximations of physical processes; (b) which can be applied at adaptively different spatial scales in the same domain by taking into account the local heterogeneity of topography and hydrogeologic variables; and (c) which is flexible enough to incorporate different numbers and approximations of process equations depending on model purpose and computational constraints. An efficient implementation of this strategy becomes all the more important for the Great Salt Lake river basin, which is relatively large (~89,000 sq. km) and complex in terms of hydrologic and geomorphic conditions. The types and time scales of hydrologic processes that dominate in different parts of the basin also differ: part of the snowmelt runoff generated in the Uinta Mountains infiltrates and contributes as base flow to the Great Salt Lake over a time scale of decades to centuries. The adaptive strategy helps capture the steep topographic and climatic gradient along the Wasatch Front.
    Here we present the aforesaid modeling strategy along with an associated hydrologic modeling framework which facilitates a seamless, computationally efficient, and accurate integration of the process model with the data model. The flexibility of this framework enables multiscale, multiresolution, adaptive refinement/de-refinement and nested modeling simulations with least computational burden. However, performing these simulations, and the related calibration of these models, over a large basin at high spatio-temporal resolution is computationally intensive and requires increasing computing power. With the advent of parallel processing architectures, high computing performance can be achieved by parallelizing existing serial integrated-hydrologic-model code: the same model simulation runs on a network of many processors, reducing the time needed to obtain a solution. The paper also discusses the implementation of the integrated model on parallel processors, including the mapping of the problem onto a multi-processor environment, methods to incorporate coupling between hydrologic processes using interprocessor communication models, model data structures, and parallel numerical algorithms to obtain high performance.

  13. Computer Technology, Large-Scale Social Integration, and the Local Community.

    ERIC Educational Resources Information Center

    Calhoun, Craig

    1986-01-01

    A conceptual framework is proposed for studying variations in kind and extent of social integration and relatedness, such as those new communication technology may foster. Emphasis is on the contrast between direct and indirect social relationships. The framework is illustrated by consideration of potential social impacts of widespread…

  14. Fast numerical methods for simulating large-scale integrate-and-fire neuronal networks.

    PubMed

    Rangan, Aaditya V; Cai, David

    2007-02-01

    We discuss numerical methods for simulating large-scale, integrate-and-fire (I&F) neuronal networks. Important elements in our numerical methods are (i) a neurophysiologically inspired integrating factor which casts the solution as a numerically tractable integral equation, and allows us to obtain stable and accurate individual neuronal trajectories (i.e., voltage and conductance time-courses) even when the I&F neuronal equations are stiff, such as in strongly fluctuating, high-conductance states; (ii) an iterated process of spike-spike corrections within groups of strongly coupled neurons to account for spike-spike interactions within a single large numerical time-step; and (iii) a clustering procedure of firing events in the network to take advantage of localized architectures, such as spatial scales of strong local interactions, which are often present in large-scale computational models, for example, those of the primary visual cortex. (We note that the spike-spike corrections in our methods are more involved than the correction of a single neuron's spike time via polynomial interpolation, as in the modified Runge-Kutta methods commonly used in simulations of I&F neuronal networks.) Our methods can evolve networks with relatively strong local interactions in an asymptotically optimal way such that each neuron fires approximately once in [Formula: see text] operations, where N is the number of neurons in the system. We note that quantifications used in computational modeling are often statistical, since measurements in a real experiment to characterize physiological systems are typically statistical, such as firing rates, interspike interval distributions, and spike-triggered voltage distributions. We emphasize that it takes much less computational effort to resolve statistical properties of certain I&F neuronal networks than to fully resolve the trajectories of each and every neuron in the system.
For networks operating in realistic dynamical regimes, such as strongly fluctuating, high-conductance states, our methods are designed to achieve statistical accuracy when very large time-steps are used. Moreover, our methods can also achieve trajectory-wise accuracy when small time-steps are used.
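
    The integrating-factor idea in (i) can be illustrated on a single conductance-based I&F equation, dV/dt = -gL(V - EL) - gE(V - EE): freezing the conductance over a step, the linear-in-V dynamics are integrated exactly, so the update stays stable even when the equation is stiff. A generic exponential-Euler sketch (the parameter values are illustrative, and this is not the authors' full network scheme):

```python
import math

def exp_euler_step(V, gE, dt, gL=0.05, EL=-70.0, EE=0.0):
    """One exponential-Euler step for dV/dt = -gL(V-EL) - gE(V-EE).
    With gE frozen over the step, the linear-in-V part is exact."""
    g_tot = gL + gE
    V_inf = (gL * EL + gE * EE) / g_tot   # steady state for frozen gE
    return V_inf + (V - V_inf) * math.exp(-g_tot * dt)

# stable even with a coarse step and a strong conductance (stiff case,
# where forward Euler with g_tot*dt > 2 would blow up)
V = -70.0
for _ in range(100):
    V = exp_euler_step(V, gE=1.0, dt=5.0)  # gE >> gL, large dt
print(round(V, 2))  # relaxes toward V_inf ~ -3.33 without instability
```

Forward Euler applied to the same parameters multiplies the error by (1 - g_tot*dt) = -4.25 each step and diverges, which is the stiffness problem the integrating factor removes.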

  15. Wave models for turbulent free shear flows

    NASA Technical Reports Server (NTRS)

    Liou, W. W.; Morris, P. J.

    1991-01-01

    New predictive closure models for turbulent free shear flows are presented. They are based on an instability-wave description of the dominant large-scale structures in these flows using a quasi-linear theory. Three models were developed to study the structural dynamics of turbulent motions of different scales in free shear flows. The local characteristics of the large-scale motions are described using linear theory; their amplitude is determined from an energy integral analysis. The models were applied to the study of an incompressible free mixing layer. In all cases, predictions are made for the development of the mean flow field. In the last model, predictions of the time-dependent motion of the large-scale structure of the mixing region are made. The predictions show good agreement with experimental observations.

  16. Large-Scale Overlays and Trends: Visually Mining, Panning and Zooming the Observable Universe.

    PubMed

    Luciani, Timothy Basil; Cherinka, Brian; Oliphant, Daniel; Myers, Sean; Wood-Vasey, W Michael; Labrinidis, Alexandros; Marai, G Elisabeta

    2014-07-01

    We introduce a web-based computing infrastructure to assist the visual integration, mining, and interactive navigation of large-scale astronomy observations. Following an analysis of the application domain, we design a client-server architecture to fetch distributed image data and to partition local data into a spatial index structure that allows prefix-matching of spatial objects. In conjunction with hardware-accelerated pixel-based overlays and an online cross-registration pipeline, this approach allows the fetching, displaying, panning, and zooming of gigabit panoramas of the sky in real time. To further facilitate the integration and mining of spatial and non-spatial data, we introduce interactive trend images: compact visual representations for identifying outlier objects and for studying trends within large collections of spatial objects of a given class. In a demonstration, images from three sky surveys (SDSS, FIRST, and simulated LSST results) are cross-registered and integrated as overlays, allowing cross-spectrum analysis of astronomy observations. Trend images are interactively generated from catalog data and used to visually mine astronomy observations of similar type. The front end of the infrastructure uses the web technologies WebGL and HTML5 to enable cross-platform, web-based functionality. Our approach attains interactive rendering framerates; its power and flexibility enable it to serve the needs of the astronomy community. Evaluation on three case studies, as well as feedback from domain experts, emphasizes the benefits of this visual approach to observational astronomy, and its potential benefits to large-scale geospatial visualization in general.
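
    A spatial index that "allows prefix-matching of spatial objects" can be built by interleaving quadtree cell choices into a string key, so that objects in the same cell share a key prefix and a region query becomes a prefix scan over sorted keys. A minimal quadkey-style sketch (the coordinate ranges and depth are illustrative assumptions, not the paper's index):

```python
def quadkey(x, y, depth=8, xmin=0.0, xmax=360.0, ymin=-90.0, ymax=90.0):
    """Interleave quadtree cell choices into a string key; objects in
    the same cell at depth d share a length-d key prefix."""
    key = []
    for _ in range(depth):
        xm, ym = (xmin + xmax) / 2, (ymin + ymax) / 2
        q = (2 if y >= ym else 0) + (1 if x >= xm else 0)  # quadrant 0-3
        key.append(str(q))
        xmin, xmax = (xm, xmax) if x >= xm else (xmin, xm)
        ymin, ymax = (ym, ymax) if y >= ym else (ymin, ym)
    return "".join(key)

a = quadkey(150.1, -33.9)  # two nearby positions share a prefix...
b = quadkey(150.2, -33.8)
c = quadkey(20.0, 45.0)    # ...a distant one diverges immediately
print(a[:4] == b[:4], a[:4] == c[:4])  # True False
```

Stored in a sorted structure (or a database index on the key column), all objects inside a quadtree cell are then retrievable as one contiguous key range.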

  17. Integration and validation testing for PhEDEx, DBS and DAS with the PhEDEx LifeCycle agent

    NASA Astrophysics Data System (ADS)

    Boeser, C.; Chwalek, T.; Giffels, M.; Kuznetsov, V.; Wildish, T.

    2014-06-01

    The ever-increasing amount of data handled by the CMS dataflow and workflow management tools poses new challenges for cross-validation among different systems within CMS experiment at LHC. To approach this problem we developed an integration test suite based on the LifeCycle agent, a tool originally conceived for stress-testing new releases of PhEDEx, the CMS data-placement tool. The LifeCycle agent provides a framework for customising the test workflow in arbitrary ways, and can scale to levels of activity well beyond those seen in normal running. This means we can run realistic performance tests at scales not likely to be seen by the experiment for some years, or with custom topologies to examine particular situations that may cause concern some time in the future. The LifeCycle agent has recently been enhanced to become a general purpose integration and validation testing tool for major CMS services. It allows cross-system integration tests of all three components to be performed in controlled environments, without interfering with production services. In this paper we discuss the design and implementation of the LifeCycle agent. We describe how it is used for small-scale debugging and validation tests, and how we extend that to large-scale tests of whole groups of sub-systems. We show how the LifeCycle agent can emulate the action of operators, physicists, or software agents external to the system under test, and how it can be scaled to large and complex systems.

  18. Single-chip microprocessor that communicates directly using light

    NASA Astrophysics Data System (ADS)

    Sun, Chen; Wade, Mark T.; Lee, Yunsup; Orcutt, Jason S.; Alloatti, Luca; Georgas, Michael S.; Waterman, Andrew S.; Shainline, Jeffrey M.; Avizienis, Rimas R.; Lin, Sen; Moss, Benjamin R.; Kumar, Rajesh; Pavanello, Fabio; Atabaki, Amir H.; Cook, Henry M.; Ou, Albert J.; Leu, Jonathan C.; Chen, Yu-Hsin; Asanović, Krste; Ram, Rajeev J.; Popović, Miloš A.; Stojanović, Vladimir M.

    2015-12-01

    Data transport across short electrical wires is limited by both bandwidth and power density, which creates a performance bottleneck for semiconductor microchips in modern computer systems, from mobile phones to large-scale data centres. These limitations can be overcome by using optical communications based on chip-scale electronic-photonic systems enabled by silicon-based nanophotonic devices. However, combining electronics and photonics on the same chip has proved challenging, owing to microchip manufacturing conflicts between electronics and photonics. Consequently, current electronic-photonic chips are limited to niche manufacturing processes and include only a few optical devices alongside simple circuits. Here we report an electronic-photonic system on a single chip integrating over 70 million transistors and 850 photonic components that work together to provide logic, memory, and interconnect functions. This system is a realization of a microprocessor that uses on-chip photonic devices to directly communicate with other chips using light. To integrate electronics and photonics at the scale of a microprocessor chip, we adopt a ‘zero-change’ approach to the integration of photonics. Instead of developing a custom process to enable the fabrication of photonics, which would complicate or eliminate the possibility of integration with state-of-the-art transistors at large scale and at high yield, we design optical devices using a standard microelectronics foundry process that is used for modern microprocessors. This demonstration could represent the beginning of an era of chip-scale electronic-photonic systems with the potential to transform computing system architectures, enabling more powerful computers, from network infrastructure to data centres and supercomputers.

  19. Single-chip microprocessor that communicates directly using light.

    PubMed

    Sun, Chen; Wade, Mark T; Lee, Yunsup; Orcutt, Jason S; Alloatti, Luca; Georgas, Michael S; Waterman, Andrew S; Shainline, Jeffrey M; Avizienis, Rimas R; Lin, Sen; Moss, Benjamin R; Kumar, Rajesh; Pavanello, Fabio; Atabaki, Amir H; Cook, Henry M; Ou, Albert J; Leu, Jonathan C; Chen, Yu-Hsin; Asanović, Krste; Ram, Rajeev J; Popović, Miloš A; Stojanović, Vladimir M

    2015-12-24

    Data transport across short electrical wires is limited by both bandwidth and power density, which creates a performance bottleneck for semiconductor microchips in modern computer systems, from mobile phones to large-scale data centres. These limitations can be overcome by using optical communications based on chip-scale electronic-photonic systems enabled by silicon-based nanophotonic devices. However, combining electronics and photonics on the same chip has proved challenging, owing to microchip manufacturing conflicts between electronics and photonics. Consequently, current electronic-photonic chips are limited to niche manufacturing processes and include only a few optical devices alongside simple circuits. Here we report an electronic-photonic system on a single chip integrating over 70 million transistors and 850 photonic components that work together to provide logic, memory, and interconnect functions. This system is a realization of a microprocessor that uses on-chip photonic devices to directly communicate with other chips using light. To integrate electronics and photonics at the scale of a microprocessor chip, we adopt a 'zero-change' approach to the integration of photonics. Instead of developing a custom process to enable the fabrication of photonics, which would complicate or eliminate the possibility of integration with state-of-the-art transistors at large scale and at high yield, we design optical devices using a standard microelectronics foundry process that is used for modern microprocessors. This demonstration could represent the beginning of an era of chip-scale electronic-photonic systems with the potential to transform computing system architectures, enabling more powerful computers, from network infrastructure to data centres and supercomputers.

  20. A refined regional modeling approach for the Corn Belt - Experiences and recommendations for large-scale integrated modeling

    NASA Astrophysics Data System (ADS)

    Panagopoulos, Yiannis; Gassman, Philip W.; Jha, Manoj K.; Kling, Catherine L.; Campbell, Todd; Srinivasan, Raghavan; White, Michael; Arnold, Jeffrey G.

    2015-05-01

    Nonpoint source pollution from agriculture is the main source of nitrogen and phosphorus in the stream systems of the Corn Belt region in the Midwestern US. This region comprises two large river basins, the intensely row-cropped Upper Mississippi River Basin (UMRB) and Ohio-Tennessee River Basin (OTRB), which are considered the key contributing areas for the Northern Gulf of Mexico hypoxic zone according to the US Environmental Protection Agency. Thus, in this area it is of utmost importance to ensure that intensive agriculture for food, feed and biofuel production can coexist with a healthy water environment. To address these objectives within a river basin management context, an integrated modeling system has been constructed with the hydrologic Soil and Water Assessment Tool (SWAT) model, capable of estimating river basin responses to alternative cropping and/or management strategies. To improve modeling performance compared to previous studies and provide a spatially detailed basis for scenario development, this SWAT Corn Belt application incorporates a greatly refined subwatershed structure based on 12-digit hydrologic units or 'subwatersheds' as defined by the US Geological Survey. The model setup, calibration and validation are time-demanding and challenging tasks for these large systems, given the intensive data requirements at this scale and the need to ensure the reliability of flow and pollutant load predictions at multiple locations. Thus, the objectives of this study are both to describe this large-scale modeling approach comprehensively, providing estimates of pollution and crop production in the region, and to present the strengths and weaknesses of integrated modeling at such a large scale, along with how it can be improved on the basis of the current modeling structure and results.
The predictions were based on a semi-automatic hydrologic calibration approach for large-scale and spatially detailed modeling studies, with the use of the Sequential Uncertainty Fitting algorithm (SUFI-2) and the SWAT-CUP interface, followed by a manual water quality calibration on a monthly basis. The refined modeling approach developed in this study led to successful predictions across most parts of the Corn Belt region and can be used for testing pollution mitigation measures and agricultural economic scenarios, providing useful information to policy makers and recommendations on similar efforts at the regional scale.
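
    The SUFI-2 step mentioned above works by sampling parameter sets and iteratively narrowing the ranges around the best-performing simulations. A toy one-parameter version of that loop (SWAT-CUP itself calibrates many parameters with Latin hypercube sampling; the model and objective below are made up for illustration):

```python
import random

def calibrate(model, observed, lo, hi, rounds=5, samples=200, keep=0.1):
    """Shrink a parameter range around the best simulations, in the
    spirit of SUFI-2 (toy 1-D sketch, not SWAT-CUP)."""
    random.seed(0)                       # deterministic for the example
    for _ in range(rounds):
        trials = sorted(
            (abs(model(p) - observed), p)
            for p in (random.uniform(lo, hi) for _ in range(samples))
        )
        best = [p for _, p in trials[: int(samples * keep)]]
        lo, hi = min(best), max(best)    # keep the range spanned by the best 10%
    return (lo + hi) / 2

# Toy "watershed model": the response is quadratic in the parameter,
# and the true parameter value is 0.42.
estimate = calibrate(lambda p: p * p, observed=0.42 ** 2, lo=0.0, hi=1.0)
print(abs(estimate - 0.42) < 0.02)  # True: the range converges on 0.42
```

    The repeated shrink-and-resample structure is what makes such calibration tractable at the scale of thousands of subwatersheds, at the cost of many model runs per round.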

  1. Selection and Manufacturing of Membrane Materials for Solar Sails

    NASA Technical Reports Server (NTRS)

    Bryant, Robert G.; Seaman, Shane T.; Wilkie, W. Keats; Miyaucchi, Masahiko; Working, Dennis C.

    2013-01-01

    Commercial metallized polyimide or polyester films and hand-assembly techniques are acceptable for small solar sail technology demonstrations, although scaling this approach to large sail areas is impractical. Opportunities now exist to use new polymeric materials specifically designed for solar sailing applications, and take advantage of integrated sail manufacturing to enable large-scale solar sail construction. This approach has, in part, been demonstrated on the JAXA IKAROS solar sail demonstrator, and NASA Langley Research Center is now developing capabilities to produce ultrathin membranes for solar sails by integrating resin synthesis with film forming and sail manufacturing processes. This paper will discuss the selection and development of polymer material systems for space, and these new processes for producing ultrathin high-performance solar sail membrane films.

  2. Impurity engineering of Czochralski silicon used for ultra large-scaled-integrated circuits

    NASA Astrophysics Data System (ADS)

    Yang, Deren; Chen, Jiahe; Ma, Xiangyang; Que, Duanlin

    2009-01-01

    Impurities in Czochralski silicon (Cz-Si) used for ultra-large-scale-integrated (ULSI) circuits have long been believed to degrade device performance. In this paper, we review recent progress from our investigation of internal gettering in Cz-Si wafers doped with nitrogen, germanium and/or a high content of carbon. It has been suggested that those impurities enhance oxygen precipitation, creating both denser bulk microdefects and a denuded zone of the desired width, which benefits the internal gettering of metal contamination. Based on the experimental facts, a potential mechanism for the effect of impurity doping on the internal gettering structure is interpreted, and a new concept of 'impurity engineering' for Cz-Si used in ULSI is proposed.

  3. TESTING METHODS FOR DETECTION OF CRYPTOSPORIDIUM SPP. IN WATER SAMPLES

    EPA Science Inventory

    A large waterborne outbreak of cryptosporidiosis in Milwaukee, Wisconsin, U.S.A. in 1993 prompted a search for ways to prevent large-scale waterborne outbreaks of protozoan parasitoses. Methods for detecting Cryptosporidium parvum play an integral role in strategies that lead to...

  4. Moditored unsaturated soil transport processes as a support for large scale soil and water management

    NASA Astrophysics Data System (ADS)

    Vanclooster, Marnik

    2010-05-01

    The current societal demand for sustainable soil and water management is very large. The drivers of global and climate change exert many pressures on soil and water ecosystems, endangering appropriate ecosystem functioning. Unsaturated soil transport processes play a key role in soil-water system functioning, as they control the fluxes of water and nutrients from the soil to plants (the pedo-biosphere link), the infiltration flux of precipitated water to groundwater and the evaporative flux, and hence the feedback from the soil to the climate system. Yet, unsaturated soil transport processes are difficult to quantify, since they are affected by huge variability of the governing properties at different space-time scales and by the intrinsic non-linearity of the transport processes. The incompatibility between the scale at which processes can reasonably be characterized, the scale at which they can correctly be described theoretically, and the scale at which the soil and water system needs to be managed calls for further development of scaling procedures in unsaturated zone science. It also calls for a better integration of theoretical and modelling approaches to elucidate transport processes at the appropriate scales, compatible with the objective of sustainable soil and water management. Moditoring science, i.e. the interdisciplinary research domain where modelling and monitoring science are linked, is currently evolving significantly in the area of unsaturated zone hydrology. In this presentation, a review of current moditoring strategies and techniques will be given and illustrated with applications to large-scale soil and water management problems. This will also help identify research needs in the interdisciplinary domain of modelling and monitoring, and improve the integration of unsaturated zone science in solving soil and water management issues. A focus will be placed on examples of large-scale soil and water management problems in Europe.

  5. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.

    2013-12-01

    The 336,000 km2 Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of floods and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. 
And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and multiple jurisdictions. The SaskRB has therefore been developed as a large scale observatory, now a Regional Hydroclimate Project of the World Climate Research Programme's GEWEX project, and is available to contribute to the emerging North American Water Program. State-of-the-art hydro-ecological experimental sites have been developed for the key biomes, and a river and lake biogeochemical research facility, focussed on impacts of nutrients and exotic chemicals. Data are integrated at SaskRB scale to support the development of improved large scale climate and hydrological modelling products, the development of decision-support systems for local, provincial and basin-scale management, and the development of related social science research, engaging stakeholders in the research and exploring their values and priorities for water security. The observatory provides multiple scales of observation and modelling required to develop: a) new climate, hydrological and ecological science and modelling tools to address environmental change in key environments, and their integrated effects and feedbacks at large catchment scale, b) new tools needed to support river basin management under uncertainty, including anthropogenic controls on land and water management and c) the place-based focus for the development of new transdisciplinary science.

  6. ESRI applications of GIS technology: Mineral resource development

    NASA Technical Reports Server (NTRS)

    Derrenbacher, W.

    1981-01-01

    The application of geographic information systems technology to large scale regional assessment related to mineral resource development, identifying candidate sites for related industry, and evaluating sites for waste disposal is discussed. Efforts to develop data bases were conducted at scales ranging from 1:3,000,000 to 1:25,000. In several instances, broad screening was conducted for large areas at a very general scale with more detailed studies subsequently undertaken in promising areas windowed out of the generalized data base. Increasingly, the systems which are developed are structured as the spatial framework for the long-term collection, storage, referencing, and retrieval of vast amounts of data about large regions. Typically, the reconnaissance data base for a large region is structured at 1:250,000 scale, data bases for smaller areas being structured at 1:25,000, 1:50,000 or 1:63,360. An integrated data base for the coterminous US was implemented at a scale of 1:3,000,000 for two separate efforts.

  7. Precise excision and self-integration of a composite transposon as a model for spontaneous large-scale chromosome inversion/deletion of the Staphylococcus haemolyticus clinical strain JCSC1435.

    PubMed

    Watanabe, Shinya; Ito, Teruyo; Morimoto, Yuh; Takeuchi, Fumihiko; Hiramatsu, Keiichi

    2007-04-01

    Large-scale chromosomal inversions (455 to 535 kbp) or deletions (266 to 320 kbp) were found to accompany spontaneous loss of beta-lactam resistance during drug-free passage of the multiresistant Staphylococcus haemolyticus clinical strain JCSC1435. Identification and sequencing of the rearranged chromosomal loci revealed that ISSha1 of S. haemolyticus is responsible for the chromosome rearrangements.

  8. Coordination in Large Scale Software Development

    DTIC Science & Technology

    1990-01-01

    toward achieving common and explicitly recognized goals" (Blau and Scott, 1962) and "the integration or linking together of different parts of an...require a strong degree of integration of its components. Much software is built of thousands of modules that must mesh with each other perfectly for the...coordination between subgroups producing software modules could lead to failure in integrating the modules themselves. Informal communication. Both

  9. MHD Modeling of the Solar Wind with Turbulence Transport and Heating

    NASA Technical Reports Server (NTRS)

    Goldstein, M. L.; Usmanov, A. V.; Matthaeus, W. H.; Breech, B.

    2009-01-01

    We have developed a magnetohydrodynamic model that describes the global axisymmetric steady-state structure of the solar wind near solar minimum, taking into account the transport of small-scale turbulence and the associated heating. The Reynolds-averaged mass, momentum, induction, and energy equations for the large-scale solar wind flow are solved simultaneously with the turbulence transport equations in the region from 0.3 to 100 AU. The large-scale equations include subgrid-scale terms due to turbulence, and the turbulence (small-scale) equations describe the effects of transport and (phenomenologically) dissipation of the MHD turbulence based on a few statistical parameters (turbulence energy, normalized cross-helicity, and correlation scale). The coupled set of equations is integrated numerically for a source dipole field on the Sun by a time-relaxation method in the corotating frame of reference. We present results on the plasma, magnetic field, and turbulence distributions throughout the heliosphere, and on the role of turbulence in the large-scale structure and temperature distribution of the solar wind.

  10. Going the distance: spatial scale of athletic experience affects the accuracy of path integration.

    PubMed

    Smith, Alastair D; Howard, Christina J; Alcock, Niall; Cater, Kirsten

    2010-09-01

    Evidence suggests that athletically trained individuals are more accurate than untrained individuals in updating their spatial position through idiothetic cues. We assessed whether training at different spatial scales affects the accuracy of path integration. Groups of rugby players (large-scale training) and martial artists (small-scale training) participated in a triangle-completion task: they were led (blindfolded) along two sides of a right-angled triangle and were required to complete the hypotenuse by returning to the origin. The groups did not differ in their assessment of the distance to the origin, but rugby players were more accurate than martial artists in assessing the correct angle to turn (heading), and landed significantly closer to the origin. These data support evidence that distance and heading components can be dissociated. Furthermore, they suggest that the spatial scale at which an individual is trained may affect the accuracy of one component of path integration but not the other.
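
    The correct response in the triangle-completion task follows from elementary trigonometry, which is what makes the distance and heading components separately measurable. A sketch (the leg lengths are arbitrary example values, not the study's parameters):

```python
import math

def homing_response(leg1, leg2):
    """Correct homing response after walking leg1, turning 90 degrees,
    and walking leg2: the distance back to the origin and the further
    turn (in degrees) needed to face it."""
    distance = math.hypot(leg1, leg2)                  # hypotenuse length
    turn = 180 - math.degrees(math.atan2(leg1, leg2))  # 180 deg minus the corner angle
    return distance, turn

d, t = homing_response(3.0, 4.0)
print(d)            # 5.0
print(round(t, 2))  # 143.13
```

    Errors in `d` reflect the distance component of path integration and errors in `t` the heading component; the study found that these dissociate across the two training groups.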

  11. Built-In Data-Flow Integration Testing in Large-Scale Component-Based Systems

    NASA Astrophysics Data System (ADS)

    Piel, Éric; Gonzalez-Sanchez, Alberto; Gross, Hans-Gerhard

    Modern large-scale component-based applications and service ecosystems are built following a number of different component models and architectural styles, such as the data-flow architectural style. In this style, each building block receives data from a previous one in the flow and sends output data to other components. This organisation expresses information flows adequately, and also favours decoupling between the components, leading to easier maintenance and quicker evolution of the system. Integration testing is a major means to ensure the quality of large systems. Their size and complexity, together with the fact that they are developed and maintained by several stakeholders, make Built-In Testing (BIT) an attractive approach to manage their integration testing. However, so far no technique has been proposed that combines BIT and data-flow integration testing. We have introduced the notion of a virtual component in order to realize such a combination: it makes it possible to define the behaviour of several components assembled to process a flow of data, using BIT. Test cases are defined so that they are simple to write and flexible to adapt. We present two implementations of our proposed virtual component integration testing technique, and we extend our previous proposal to detect and handle errors in user-supplied definitions. The evaluation of the virtual component testing approach suggests that more issues can be detected in systems with data-flows than through other integration testing approaches.
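
    The virtual-component idea can be sketched as a wrapper that assembles several data-flow components and exposes one built-in test for the whole flow. This is an illustrative reconstruction, not the authors' framework:

```python
class VirtualComponent:
    """Wraps an assembly of data-flow components and tests the flow
    end-to-end (illustrative sketch of the virtual-component idea)."""

    def __init__(self, *components):
        self.components = components      # callables, chained in order

    def process(self, data):
        for component in self.components:
            data = component(data)        # data flows through the chain
        return data

    def built_in_test(self, probe, expected):
        """Built-in test: feed a probe input through the assembled flow
        and check the end-to-end output."""
        return self.process(probe) == expected

# Two toy components in a flow: parse a record, then scale the values.
parse = lambda raw: [int(x) for x in raw.split(",")]
scale = lambda xs: [2 * x for x in xs]

vc = VirtualComponent(parse, scale)
print(vc.built_in_test("1,2,3", [2, 4, 6]))  # True
```

    The test exercises the composed flow rather than each component in isolation, which is the point of combining BIT with data-flow integration testing.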

  12. Probing the largest cosmological scales with the correlation between the cosmic microwave background and peculiar velocities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fosalba, Pablo; Dore, Olivier

    2007-11-15

    Cross correlation between the cosmic microwave background (CMB) and large-scale structure is a powerful probe of dark energy and gravity on the largest physical scales. We introduce a novel estimator, the CMB-velocity correlation, that has most of its power on large scales and that, at low redshift, delivers up to a factor of 2 higher signal-to-noise ratio than the recently detected CMB-dark matter density correlation expected from the integrated Sachs-Wolfe effect. We propose to use a combination of peculiar velocities measured from type Ia supernovae and kinetic Sunyaev-Zeldovich cluster surveys to reveal this signal, and forecast the dark energy constraints that can be achieved with future surveys. We stress that low redshift peculiar velocity measurements should be exploited with complementary deeper large-scale structure surveys for precision cosmology.

  13. The large-scale organization of metabolic networks

    NASA Astrophysics Data System (ADS)

    Jeong, H.; Tombor, B.; Albert, R.; Oltvai, Z. N.; Barabási, A.-L.

    2000-10-01

    In a cell or microorganism, the processes that generate mass, energy, information transfer and cell-fate specification are seamlessly integrated through a complex network of cellular constituents and reactions. However, despite the key role of these networks in sustaining cellular functions, their large-scale structure is essentially unknown. Here we present a systematic comparative mathematical analysis of the metabolic networks of 43 organisms representing all three domains of life. We show that, despite significant variation in their individual constituents and pathways, these metabolic networks have the same topological scaling properties and show striking similarities to the inherent organization of complex non-biological systems. This may indicate that metabolic organization is not only identical for all living organisms, but also complies with the design principles of robust and error-tolerant scale-free networks, and may represent a common blueprint for the large-scale organization of interactions among all cellular constituents.
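
    The scale-free topology reported here is the signature of degree-proportional ("preferential") attachment. A short sketch of that growth mechanism, using a generic Barabási-Albert-style model rather than the metabolic data themselves:

```python
import random
from collections import Counter

def preferential_attachment(n, m=2, seed=1):
    """Grow a network in which each new node attaches to m existing
    nodes with probability proportional to their degree (generic
    scale-free growth model, illustrative only)."""
    random.seed(seed)
    targets = [0, 1]              # node ids repeated by degree (an "urn")
    edges = [(0, 1)]
    for new in range(2, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(random.choice(targets))   # degree-proportional pick
        for old in chosen:
            edges.append((new, old))
            targets += [new, old]                # both endpoints gain degree
    return edges

edges = preferential_attachment(1000)
degree = Counter(v for e in edges for v in e)
# Hubs emerge: the best-connected node far exceeds the mean degree.
print(max(degree.values()) > 5 * (sum(degree.values()) / len(degree)))  # True
```

    The resulting heavy-tailed degree distribution, with a few highly connected hubs, is the robust, error-tolerant organization the abstract describes.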

  14. A fully reconfigurable photonic integrated signal processor

    NASA Astrophysics Data System (ADS)

    Liu, Weilin; Li, Ming; Guzzon, Robert S.; Norberg, Erik J.; Parker, John S.; Lu, Mingzhi; Coldren, Larry A.; Yao, Jianping

    2016-03-01

    Photonic signal processing has been considered a solution to overcome the inherent electronic speed limitations. Over the past few years, an impressive range of photonic integrated signal processors have been proposed, but they usually offer limited reconfigurability, a feature highly needed for the implementation of large-scale general-purpose photonic signal processors. Here, we report and experimentally demonstrate a fully reconfigurable photonic integrated signal processor based on an InP-InGaAsP material system. The proposed photonic signal processor is capable of performing reconfigurable signal processing functions including temporal integration, temporal differentiation and Hilbert transformation. The reconfigurability is achieved by controlling the injection currents to the active components of the signal processor. Our demonstration suggests great potential for chip-scale fully programmable all-optical signal processing.

  15. Detecting and monitoring large-scale drought effects on forests: toward an integrated approach

    Treesearch

    Steve Norman; Frank H. Koch; William W. Hargrove

    2016-01-01

    Although drought is recognized as an important and overarching driver of ecosystem change, its occurrence and effects have been difficult to describe over large geographic areas (Hogg and others 2008, Panu and Sharma 2002).

  16. Establishment of a National Wind Energy Center at University of Houston

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Su Su

    The DOE-supported project objectives are to: establish a national wind energy center (NWEC) at the University of Houston and conduct research to address critical science and engineering issues for the development of future large MW-scale wind energy production systems, especially offshore wind turbines. The goals of the project are to: (1) establish a sound scientific/technical knowledge base of solutions to critical science and engineering issues for developing future MW-scale large wind energy production systems, (2) develop a state-of-the-art wind rotor blade research facility at the University of Houston, and (3) introduce, through multi-disciplinary research, technology innovations in advanced wind-turbine materials, processing/manufacturing technology, design and simulation, and testing and reliability assessment methods related to future wind turbine systems for cost-effective production of offshore wind energy. To achieve the goals of the project, the following technical tasks were planned and executed during the period from April 15, 2010 to October 31, 2014 at the University of Houston: (1) basic research on large offshore wind turbine systems; (2) applied research on innovative wind turbine rotors for large offshore wind energy systems; (3) integration of offshore wind-turbine design, advanced materials and manufacturing technologies; (4) integrity and reliability of large offshore wind turbine blades and scaled model testing; (5) education and training of graduate and undergraduate students and post-doctoral researchers; (6) development of a national offshore wind turbine blade research facility. The research program addresses both basic science and engineering of current and future large wind turbine systems, especially offshore wind turbines, for MW-scale power generation. 
The results of the research advance current understanding of many important scientific issues and provide technical information for developing future large wind turbines with advanced designs, composite materials, integrated manufacturing, and structural reliability and integrity. The educational program has trained many graduate and undergraduate students and post-doctoral researchers in the critical science and engineering of wind energy production systems through graduate-level courses and research, and through participation in various projects in the center's large multi-disciplinary research. These students and researchers are now employed by the wind industry, national labs and universities to support the US and international wind energy industry. The national offshore wind turbine blade research facility developed in the project has been used to support the technical and training tasks planned in the program, and it is a national asset available for use by domestic and international researchers in the wind energy arena.

  17. Perspectives on integrated modeling of transport processes in semiconductor crystal growth

    NASA Technical Reports Server (NTRS)

    Brown, Robert A.

    1992-01-01

    The wide range of length and time scales involved in industrial scale solidification processes is demonstrated here by considering the Czochralski process for the growth of large diameter silicon crystals that become the substrate material for modern microelectronic devices. The scales range in time from microseconds to thousands of seconds and in space from microns to meters. The physics and chemistry needed to model processes on these different length scales are reviewed.

  18. Linking crop yield anomalies to large-scale atmospheric circulation in Europe.

    PubMed

    Ceglar, Andrej; Turco, Marco; Toreti, Andrea; Doblas-Reyes, Francisco J

    2017-06-15

    Understanding the effects of climate variability and extremes on crop growth and development represents a necessary step to assess the resilience of agricultural systems to changing climate conditions. This study investigates the links between the large-scale atmospheric circulation and crop yields in Europe, providing the basis to develop seasonal crop yield forecasting and thus enabling a more effective and dynamic adaptation to climate variability and change. Four dominant modes of large-scale atmospheric variability have been used: North Atlantic Oscillation, Eastern Atlantic, Scandinavian and Eastern Atlantic-Western Russia patterns. Large-scale atmospheric circulation explains on average 43% of inter-annual winter wheat yield variability, ranging between 20% and 70% across countries. As for grain maize, the average explained variability is 38%, ranging between 20% and 58%. Spatially, the skill of the developed statistical models strongly depends on the large-scale atmospheric variability impact on weather at the regional level, especially during the most sensitive growth stages of flowering and grain filling. Our results also suggest that preceding atmospheric conditions might provide an important source of predictability especially for maize yields in south-eastern Europe. Since the seasonal predictability of large-scale atmospheric patterns is generally higher than that of surface weather variables (e.g. precipitation) in Europe, seasonal crop yield prediction could benefit from the integration of derived statistical models exploiting the dynamical seasonal forecast of large-scale atmospheric circulation.
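
    In their simplest form, such statistical models regress yield anomalies on circulation indices and report the explained variance. A single-predictor sketch (the index and anomaly values below are made up for illustration, not the study's data):

```python
def explained_variance(index, anomalies):
    """R^2 of a simple linear regression of yield anomalies on one
    circulation index (for a single predictor, R^2 = r^2)."""
    n = len(index)
    mx = sum(index) / n
    my = sum(anomalies) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(index, anomalies))
    sxx = sum((x - mx) ** 2 for x in index)
    syy = sum((y - my) ** 2 for y in anomalies)
    return sxy * sxy / (sxx * syy)

# Hypothetical NAO-like index and winter wheat yield anomalies.
nao = [1.2, -0.5, 0.3, -1.1, 0.8, -0.2]
anom = [0.9, -0.4, 0.1, -0.8, 0.7, 0.0]
r2 = explained_variance(nao, anom)
print(0.2 < r2 < 1.0)  # True; these toy series are strongly correlated
```

    The study's 20-70% figures correspond to this R^2, computed per country with several circulation indices as predictors.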

  19. Neural networks supporting audiovisual integration for speech: A large-scale lesion study.

    PubMed

    Hickok, Gregory; Rogalsky, Corianne; Matchin, William; Basilakos, Alexandra; Cai, Julia; Pillay, Sara; Ferrill, Michelle; Mickelsen, Soren; Anderson, Steven W; Love, Tracy; Binder, Jeffrey; Fridriksson, Julius

    2018-06-01

    Auditory and visual speech information are often strongly integrated, resulting in perceptual enhancements for audiovisual (AV) speech over audio alone and sometimes yielding compelling illusory fusion percepts when AV cues are mismatched (the McGurk-MacDonald effect). Previous research has identified three candidate regions thought to be critical for AV speech integration: the posterior superior temporal sulcus (STS), early auditory cortex, and the posterior inferior frontal gyrus. We assess the causal involvement of these regions (and others) in the first large-scale (N = 100) lesion-based study of AV speech integration. Two primary findings emerged. First, behavioral performance and lesion maps for AV enhancement and illusory fusion measures indicate that classic metrics of AV speech integration are not necessarily measuring the same process. Second, lesions involving superior temporal auditory, lateral occipital visual, and multisensory zones in the STS are the most disruptive to AV speech integration. Further, when AV speech integration fails, the nature of the failure (auditory vs. visual capture) can be predicted from the location of the lesions. These findings show that AV speech processing is supported by unimodal auditory and visual cortices as well as multimodal regions such as the STS at their boundary. Motor related frontal regions do not appear to play a role in AV speech integration. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Integrating History of Mathematics into the Classroom: Was Aristotle Wrong?

    ERIC Educational Resources Information Center

    Panasuk, Regina M.; Horton, Leslie Bolinger

    2013-01-01

    This article describes part of a large-scale study that helped to gain understanding of high school mathematics teachers' perceptions related to the integration of the history of mathematics into instruction. There is an obvious lack of correspondence between general perception about possible benefits of students learning the history of…

  1. Integrating social, economic, and ecological values across large landscapes

    Treesearch

    Jessica E. Halofsky; Megan K. Creutzburg; Miles A. Hemstrom

    2014-01-01

    The Integrated Landscape Assessment Project (ILAP) was a multiyear effort to produce information, maps, and models to help land managers, policymakers, and others conduct mid- to broad-scale (e.g., watersheds to states and larger areas) prioritization of land management actions, perform landscape assessments, and estimate cumulative effects of management actions for...

  2. The Water Turbine: An Integrative STEM Education Context

    ERIC Educational Resources Information Center

    Grubbs, Michael E.; Deck, Anita

    2015-01-01

    Water turbines have long been used to make work easier for humans while minimizing energy consumption. They are not only used in small- and large-scale operations, but also provide a great context for Integrative STEM education. Students can begin to understand the technological processes available by designing, building, and testing different…

  3. Transforming Power Systems Through Global Collaboration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-06-01

    Ambitious and integrated policy and regulatory frameworks are crucial to achieve power system transformation. The 21st Century Power Partnership -- a multilateral initiative of the Clean Energy Ministerial -- serves as a platform for public-private collaboration to advance integrated solutions for the large-scale deployment of renewable energy in combination with energy efficiency and grid modernization.

  4. Integration of Technology, Curriculum, and Professional Development for Advancing Middle School Mathematics: Three Large-Scale Studies

    ERIC Educational Resources Information Center

    Roschelle, Jeremy; Shechtman, Nicole; Tatar, Deborah; Hegedus, Stephen; Hopkins, Bill; Empson, Susan; Knudsen, Jennifer; Gallagher, Lawrence P.

    2010-01-01

    The authors present three studies (two randomized controlled experiments and one embedded quasi-experiment) designed to evaluate the impact of replacement units targeting student learning of advanced middle school mathematics. The studies evaluated the SimCalc approach, which integrates an interactive representational technology, paper curriculum,…

  5. Large-scale block adjustment without use of ground control points based on the compensation of geometric calibration for ZY-3 images

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Wang, Mi; Xu, Wen; Li, Deren; Gong, Jianya; Pi, Yingdong

    2017-12-01

    The potential of large-scale block adjustment (BA) without ground control points (GCPs) has long been a concern among photogrammetric researchers, as such a capability would be of great value for global mapping. However, significant problems with the accuracy and efficiency of this method remain to be solved. In this study, we analyzed the effects of geometric errors on BA, and then developed a step-wise BA method to conduct integrated processing of large-scale ZY-3 satellite images without GCPs. We first pre-processed the BA data, by adopting a geometric calibration (GC) method based on the viewing-angle model to compensate for systematic errors, such that the BA input images were of good initial geometric quality. The second step was integrated BA without GCPs, in which a series of technical methods were used to solve bottleneck problems and ensure accuracy and efficiency. The BA model, based on virtual control points (VCPs), was constructed to address the rank deficiency problem caused by the lack of absolute constraints. We then developed a parallel matching strategy to improve the efficiency of tie point (TP) matching, and adopted a three-array data structure based on sparsity to relieve the storage and calculation burden of the high-order modified equation. Finally, we used the conjugate gradient method to improve the speed of solving the high-order equations. To evaluate the feasibility of the presented large-scale BA method, we conducted three experiments on real data collected by the ZY-3 satellite. The experimental results indicate that the presented method can effectively improve the geometric accuracies of ZY-3 satellite images. This study demonstrates the feasibility of large-scale mapping without GCPs.
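
    The last step above, solving the high-order equations with the conjugate gradient method, can be illustrated on a toy problem. The sketch below is textbook conjugate gradient for a symmetric positive-definite system; the 2x2 matrix is a stand-in, not actual BA normal equations.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A."""
    x = np.zeros_like(b)
    r = b - A @ x              # residual
    p = r.copy()               # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Small SPD system standing in for the BA normal equations.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(x)  # close to the exact solution [1/11, 7/11]
```

    In the BA setting the matrix is huge but sparse, which is why the abstract pairs this solver with a sparsity-aware three-array storage scheme: conjugate gradient only ever needs matrix-vector products, never an explicit factorization.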

  6. A unified large/small-scale dynamo in helical turbulence

    NASA Astrophysics Data System (ADS)

    Bhat, Pallavi; Subramanian, Kandaswamy; Brandenburg, Axel

    2016-09-01

    We use high resolution direct numerical simulations (DNS) to show that helical turbulence can generate significant large-scale fields even in the presence of strong small-scale dynamo action. During the kinematic stage, the unified large/small-scale dynamo grows fields with a shape-invariant eigenfunction, with most power peaked at small scales or large k, as in Subramanian & Brandenburg. Nevertheless, the large-scale field can be clearly detected as an excess power at small k in the negatively polarized component of the energy spectrum for a forcing with positively polarized waves. Its strength B̄, relative to the total rms field B_rms, decreases with increasing magnetic Reynolds number, Re_M. However, as the Lorentz force becomes important, the field generated by the unified dynamo orders itself by saturating on successively larger scales. The magnetic integral scale for the positively polarized waves, characterizing the small-scale field, increases significantly from the kinematic stage to saturation. This implies that the small-scale field becomes as coherent as possible for a given forcing scale, which averts the Re_M-dependent quenching of B̄/B_rms. These results are obtained for 1024^3 DNS with magnetic Prandtl numbers of Pr_M = 0.1 and 10. For Pr_M = 0.1, B̄/B_rms grows from about 0.04 to about 0.4 at saturation, aided in the final stages by helicity dissipation. For Pr_M = 10, B̄/B_rms grows from much less than 0.01 to values of the order of 0.2. Our results confirm that there is a unified large/small-scale dynamo in helical turbulence.

  7. Gene Expression Browser: Large-Scale and Cross-Experiment Microarray Data Management, Search & Visualization

    USDA-ARS?s Scientific Manuscript database

    The amount of microarray gene expression data in public repositories has been increasing exponentially for the last couple of decades. High-throughput microarray data integration and analysis has become a critical step in exploring the large amount of expression data for biological discovery. Howeve...

  8. META II: Formal Co-Verification of Correctness of Large-Scale Cyber-Physical Systems during Design. Volume 1

    DTIC Science & Technology

    2011-08-01

    design space is large. His research contributions are to the field of Decision-based Design, specifically in linking consumer preferences and...Integrating Consumer Preferences into Engineering Design, to be published in 2012. He received his PhD from Northwestern University in Mechanical

  9. RAID-2: Design and implementation of a large scale disk array controller

    NASA Technical Reports Server (NTRS)

    Katz, R. H.; Chen, P. M.; Drapeau, A. L.; Lee, E. K.; Lutz, K.; Miller, E. L.; Seshan, S.; Patterson, D. A.

    1992-01-01

    We describe the implementation of a large scale disk array controller and subsystem incorporating over 100 high performance 3.5 inch disk drives. It is designed to provide 40 MB/s sustained performance and 40 GB capacity in three 19 inch racks. The array controller forms an integral part of a file server that attaches to a Gb/s local area network. The controller implements a high bandwidth interconnect between an interleaved memory, an XOR calculation engine, the network interface (HIPPI), and the disk interfaces (SCSI). The system is now functionally operational, and we are tuning its performance. We review the design decisions, history, and lessons learned from this three year university implementation effort to construct a truly large scale system assembly.
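
    The XOR calculation engine mentioned above computes the parity that lets a RAID array survive a single disk failure. A minimal sketch, with byte strings standing in for disk blocks (no claim is made about the actual RAID-2 controller implementation):

```python
def parity(blocks):
    """XOR a list of equal-length data blocks into one parity block."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

# Three data blocks and their parity block.
data = [b"\x01\x02", b"\x10\x20", b"\xff\x00"]
p = parity(data)

# A lost block is the XOR of the surviving blocks and the parity.
lost = data[1]
recovered = parity([data[0], data[2], p])
assert recovered == lost
```

    Because XOR is its own inverse, the same routine serves both to generate parity on writes and to reconstruct a missing block on reads.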

  10. Computation of Large-Scale Structure Jet Noise Sources With Weak Nonlinear Effects Using Linear Euler

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.; Hixon, Ray; Mankbadi, Reda R.

    2003-01-01

    An approximate technique is presented for the prediction of the large-scale turbulent structure sound source in a supersonic jet. A linearized Euler equations code is used to solve for the flow disturbances within and near a jet with a given mean flow. Assuming a normal mode composition for the wave-like disturbances, the linear radial profiles are used in an integration of the Navier-Stokes equations. This results in a set of ordinary differential equations representing the weakly nonlinear self-interactions of the modes along with their interaction with the mean flow. Solutions are then used to correct the amplitude of the disturbances that represent the source of large-scale turbulent structure sound in the jet.

  11. Implementation of a large-scale hospital information infrastructure for multi-unit health-care services.

    PubMed

    Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul

    2008-01-01

    With the increase in demand for high quality medical services, the need for an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multiple units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate by the integrated hospital information system is about 1.8 TByte and the total quantity of data produced so far is about 60 TByte. Large scale information exchange and sharing will be particularly useful for telemedicine applications.

  12. An Overview of NASA Efforts on Zero Boiloff Storage of Cryogenic Propellants

    NASA Technical Reports Server (NTRS)

    Hastings, Leon J.; Plachta, D. W.; Salerno, L.; Kittel, P.; Haynes, Davy (Technical Monitor)

    2001-01-01

    Future mission planning within NASA has increasingly motivated consideration of cryogenic propellant storage durations on the order of years as opposed to a few weeks or months. Furthermore, the advancement of cryocooler and passive insulation technologies in recent years has substantially improved the prospects for zero boiloff storage of cryogenics. Accordingly, a cooperative effort by NASA's Ames Research Center (ARC), Glenn Research Center (GRC), and Marshall Space Flight Center (MSFC) has been implemented to develop and demonstrate "zero boiloff" concepts for in-space storage of cryogenic propellants, particularly liquid hydrogen and oxygen. ARC is leading the development of flight-type cryocoolers, GRC the subsystem development and small scale testing, and MSFC the large scale and integrated system level testing. Thermal and fluid modeling involves a combined effort by the three Centers. Recent accomplishments include: 1) development of "zero boiloff" analytical modeling techniques for sizing the storage tankage, passive insulation, cryocooler, power source mass, and radiators; 2) an early subscale demonstration with liquid hydrogen; 3) procurement of a flight-type 10 watt, 95 K pulse tube cryocooler for liquid oxygen storage; and 4) assembly of a large-scale test article for an early demonstration of the integrated operation of passive insulation, destratification/pressure control, and cryocooler (commercial unit) subsystems to achieve zero boiloff storage of liquid hydrogen. Near term plans include the large-scale integrated system demonstration testing this summer, subsystem testing of the flight-type pulse-tube cryocooler with liquid nitrogen (oxygen simulant), and continued development of a flight-type liquid hydrogen pulse tube cryocooler.
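
    The arithmetic behind "zero boiloff" sizing is straightforward: unless the cryocooler lift matches the parasitic heat leak, propellant boils off at roughly Q/h_vap. A back-of-envelope sketch with assumed numbers (the 10 W heat load is illustrative; ~446 kJ/kg is the latent heat of vaporization of liquid hydrogen):

```python
# Assumed parasitic heat load into the tank, W (illustrative).
Q_leak_W = 10.0
# Latent heat of vaporization of liquid hydrogen, J/kg (~446 kJ/kg).
h_vap_J_per_kg = 446e3

# Mass vented per day if the heat leak is not intercepted by a cryocooler.
boiloff_kg_per_day = Q_leak_W / h_vap_J_per_kg * 86400
print(round(boiloff_kg_per_day, 3))  # about 1.9 kg/day
```

    Over a multi-year storage duration even such a modest rate is ruinous, which is why the program pairs passive insulation (to shrink Q) with a cryocooler sized to remove the remainder.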

  13. Generating and controlling homogeneous air turbulence using random jet arrays

    NASA Astrophysics Data System (ADS)

    Carter, Douglas; Petersen, Alec; Amili, Omid; Coletti, Filippo

    2016-12-01

    The use of random jet arrays, already employed in water tank facilities to generate zero-mean-flow homogeneous turbulence, is extended to air as a working fluid. A novel facility is introduced that uses two facing arrays of individually controlled jets (256 in total) to force steady homogeneous turbulence with negligible mean flow, shear, and strain. Quasi-synthetic jet pumps are created by expanding pressurized air through small straight nozzles and are actuated by fast-response low-voltage solenoid valves. Velocity fields, two-point correlations, energy spectra, and second-order structure functions are obtained from 2D PIV and are used to characterize the turbulence from the integral to the Kolmogorov scales. Several metrics are defined to quantify how well zero-mean-flow homogeneous turbulence is approximated for a wide range of forcing and geometric parameters. With increasing jet firing time duration, both the velocity fluctuations and the integral length scales are augmented and therefore the Reynolds number is increased. We reach a Taylor-microscale Reynolds number of 470, a large-scale Reynolds number of 74,000, and an integral-to-Kolmogorov length scale ratio of 680. The volume of the present homogeneous turbulence, the largest reported to date in a zero-mean-flow facility, is much larger than the integral length scale, allowing for the natural development of the energy cascade. The turbulence is found to be anisotropic irrespective of the distance between the jet arrays. Fine grids placed in front of the jets are effective at modulating the turbulence, reducing both velocity fluctuations and integral scales. Varying the jet-to-jet spacing within each array has no effect on the integral length scale, suggesting that this is dictated by the length scale of the jets.
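
    The integral length scale used above is commonly estimated by integrating the longitudinal autocorrelation of the velocity up to its first zero crossing. A minimal sketch on a synthetic one-dimensional signal (sample count, spacing, and imposed correlation length are assumptions for illustration, not facility data):

```python
import numpy as np

rng = np.random.default_rng(1)
n, dx = 4000, 1e-3           # number of samples and spacing, m
L_true = 0.02                # imposed correlation length, m

# Build a correlated "velocity" signal by smoothing white noise with an
# exponential kernel.
kernel_x = np.arange(-5 * L_true, 5 * L_true, dx)
kernel = np.exp(-np.abs(kernel_x) / L_true)
u = np.convolve(rng.standard_normal(n), kernel, mode="same")
u -= u.mean()

# Autocorrelation f(r), normalized so f(0) = 1, then the integral scale
# L = sum of f(r) dr up to the first zero crossing (a common convention).
acf = np.correlate(u, u, mode="full")[n - 1:]
f = acf / acf[0]
zero = np.argmax(f <= 0) if np.any(f <= 0) else len(f)
L_int = f[:zero].sum() * dx
print(L_int > 0)
```

    Comparing the facility volume with L_int obtained this way gives the kind of "volume much larger than the integral length scale" statement made in the abstract.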

  14. Development of Nanomechanical Sensors for Breast Cancer Biomarkers

    DTIC Science & Technology

    2008-06-01

    semiconductor industry in developing large scale integrated circuits at very low cost can lead to similar breakthroughs in array sensors for biomolecules of...insulated from the serum or buffer. The entire device is mounted onto a semiconductor chip carrier, for easy integration with electronics. Figure 3...Keithley 2400 source meter. The ac modulation and the dc bias are added by a noninverting summing circuit, which is integrated with the preamplifier

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cutler, Dylan; Frank, Stephen; Slovensky, Michelle

    Rich, well-organized building performance and energy consumption data enable a host of analytic capabilities for building owners and operators, from basic energy benchmarking to detailed fault detection and system optimization. Unfortunately, data integration for building control systems is challenging and costly in any setting. Large portfolios of buildings--campuses, cities, and corporate portfolios--experience these integration challenges most acutely. These large portfolios often have a wide array of control systems, including multiple vendors and nonstandard communication protocols. They typically have complex information technology (IT) networks and cybersecurity requirements and may integrate distributed energy resources into their infrastructure. Although the challenges are significant, the integration of control system data has the potential to provide proportionally greater value for these organizations through portfolio-scale analytics, comprehensive demand management, and asset performance visibility. As a large research campus, the National Renewable Energy Laboratory (NREL) experiences significant data integration challenges. To meet them, NREL has developed an architecture for effective data collection, integration, and analysis, providing a comprehensive view of data integration based on functional layers. The architecture is being evaluated on the NREL campus through deployment of three pilot implementations.

  16. PATIKA: an integrated visual environment for collaborative construction and analysis of cellular pathways.

    PubMed

    Demir, E; Babur, O; Dogrusoz, U; Gursoy, A; Nisanci, G; Cetin-Atalay, R; Ozturk, M

    2002-07-01

    Availability of the sequences of entire genomes shifts scientific attention toward large-scale identification of genome function. In the near future, data produced about cellular processes at the molecular level will accumulate at an accelerating rate as a result of proteomics studies. In this regard, it is essential to develop tools for storing, integrating, accessing, and analyzing this data effectively. We define an ontology for a comprehensive representation of cellular events. The ontology presented here enables integration of fragmented or incomplete pathway information and supports manipulation and incorporation of the stored data, as well as multiple levels of abstraction. Based on this ontology, we present the architecture of an integrated environment named Patika (Pathway Analysis Tool for Integration and Knowledge Acquisition). Patika is composed of a server-side, scalable, object-oriented database and client-side editors that provide an integrated, multi-user environment for visualizing and manipulating networks of cellular events. This tool features automated pathway layout, functional computation support, advanced querying and a user-friendly graphical interface. We expect that Patika will be a valuable tool for rapid knowledge acquisition, interpretation of microarray-generated large-scale data, disease gene identification, and drug development. A prototype of Patika is available upon request from the authors.

  17. Abnormalities in Structural Covariance of Cortical Gyrification in Parkinson's Disease.

    PubMed

    Xu, Jinping; Zhang, Jiuquan; Zhang, Jinlei; Wang, Yue; Zhang, Yanling; Wang, Jian; Li, Guanglin; Hu, Qingmao; Zhang, Yuanchao

    2017-01-01

    Although abnormal cortical morphology and connectivity between brain regions (structural covariance) have been reported in Parkinson's disease (PD), the topological organizations of large-scale structural brain networks are still poorly understood. In this study, we investigated large-scale structural brain networks in a sample of 37 PD patients and 34 healthy controls (HC) by assessing the structural covariance of cortical gyrification with the local gyrification index (lGI). We demonstrated prominent small-world properties of the structural brain networks for both groups. Compared with the HC group, PD patients showed significantly increased integrated characteristic path length and integrated clustering coefficient, as well as decreased integrated global efficiency in structural brain networks. Distinct distributions of hub regions were identified between the two groups, showing more hub regions in the frontal cortex in PD patients. Moreover, the modular analyses revealed significantly decreased integrated regional efficiency in the lateral Fronto-Insula-Temporal module, and increased integrated regional efficiency in the Parieto-Temporal module in the PD group as compared to the HC group. In summary, our study demonstrated altered topological properties of structural networks at a global, regional and modular level in PD patients. These findings suggest that the structural networks of PD patients have a suboptimal topological organization, resulting in less effective integration of information between brain regions.
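
    Two of the network metrics named above, characteristic path length and clustering coefficient, can be computed directly from an adjacency structure. A toy example on a five-node undirected graph (purely illustrative; the study's analyses operate on cortical-region covariance graphs):

```python
from collections import deque

# A small undirected graph: a triangle (0-1-2) with a tail (2-3-4).
edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4)]
nodes = range(5)
adj = {v: set() for v in nodes}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

def bfs_lengths(src):
    """Shortest-path lengths from src via breadth-first search."""
    dist = {src: 0}
    q = deque([src])
    while q:
        v = q.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    return dist

# Characteristic path length: mean shortest path over all ordered pairs.
pairs = total = 0
for v in nodes:
    for w, d in bfs_lengths(v).items():
        if w != v:
            pairs += 1
            total += d
char_path_length = total / pairs

def clustering(v):
    """Fraction of a node's neighbour pairs that are themselves linked."""
    nbrs = list(adj[v])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2 * links / (k * (k - 1))

mean_clustering = sum(clustering(v) for v in nodes) / len(adj)
print(char_path_length, mean_clustering)
```

    Small-world networks keep clustering high while the characteristic path length stays close to that of a random graph; the group differences reported above are shifts in exactly these two quantities.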

  18. Diagnosing scaling behavior of groundwater with a fully-integrated, high resolution hydrologic model simulated over the continental US (Invited)

    NASA Astrophysics Data System (ADS)

    Maxwell, R. M.; Condon, L. E.; Kollet, S. J.

    2013-12-01

    Groundwater is an important component of the hydrologic cycle, yet its importance is often overlooked. Aquifers are a critical water resource, particularly for irrigation, and also participate in moderating the land-energy balance over the so-called critical zone of 2-10 m in water table depth. Yet, the scaling behavior of groundwater is not well known. Here, we present the results of a fully-integrated hydrologic model run over a 6.3M km2 domain that covers much of North America, focused on the continental United States. This model encompasses both the Mississippi and Colorado River watersheds in their entirety at 1 km resolution and is constructed using the fully-integrated groundwater-vadose zone-surface water-land surface model, ParFlow. Results from this work are compared to observations (both of surface water flow and groundwater depths) and approaches are presented for observing these integrated systems. Furthermore, results are used to understand the scaling behavior of groundwater over the continent at high resolution. Implications for understanding dominant hydrological processes at large scales will be discussed.

  19. Evaluation of an index of biotic integrity approach to assess fish assemblage condition in Western USA streams and rivers at varying spatial scales

    EPA Science Inventory

    Consistent assessments of biological condition are needed across multiple ecoregions to provide a greater understanding of the spatial extent of environmental degradation. However, consistent assessments at large geographic scales are often hampered by lack of uniformity in data ...

  20. The watershed-scale optimized and rearranged landscape design (WORLD) model and local biomass processing depots for sustainable biofuel production: Integrated life cycle assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eranki, Pragnya L.; Manowitz, David H.; Bals, Bryan D.

    An array of feedstocks is being evaluated as potential raw material for cellulosic biofuel production. Thorough assessments are required in regional landscape settings before these feedstocks can be cultivated and sustainable management practices can be implemented. On the processing side, a potential solution to the logistical challenges of large biorefineries is provided by a network of distributed processing facilities called local biomass processing depots. A large-scale cellulosic ethanol industry is likely to emerge soon in the United States. We have the opportunity to influence the sustainability of this emerging industry. The watershed-scale optimized and rearranged landscape design (WORLD) model estimates land allocations for different cellulosic feedstocks at biorefinery scale without displacing current animal nutrition requirements. This model also incorporates a network of the aforementioned depots. An integrated life cycle assessment is then conducted over the unified system of optimized feedstock production, processing, and associated transport operations to evaluate net energy yields (NEYs) and environmental impacts.

  1. Wafer level reliability for high-performance VLSI design

    NASA Technical Reports Server (NTRS)

    Root, Bryan J.; Seefeldt, James D.

    1987-01-01

    As very large scale integration architecture requires higher package density, reliability of these devices has approached a critical level. Previous processing techniques allowed a large window for varying reliability. However, as scaling and higher current densities push reliability to its limit, tighter control and instant feedback become critical. Several test structures developed to monitor reliability at the wafer level are described. For example, a test structure was developed to monitor metal integrity in seconds as opposed to weeks or months for conventional testing. Another structure monitors mobile ion contamination at critical steps in the process. Thus the reliability jeopardy can be assessed during fabrication, preventing defective devices from ever being placed in the field. Most importantly, the reliability can be assessed on each wafer as opposed to an occasional sample.

  2. Organic printed photonics: From microring lasers to integrated circuits

    PubMed Central

    Zhang, Chuang; Zou, Chang-Ling; Zhao, Yan; Dong, Chun-Hua; Wei, Cong; Wang, Hanlin; Liu, Yunqi; Guo, Guang-Can; Yao, Jiannian; Zhao, Yong Sheng

    2015-01-01

    A photonic integrated circuit (PIC) is the optical analogy of an electronic loop in which photons are signal carriers with high transport speed and parallel processing capability. Besides the most frequently demonstrated silicon-based circuits, PICs require a variety of materials for light generation, processing, modulation, and detection. With their diversity and flexibility, organic molecular materials provide an alternative platform for photonics; however, the versatile fabrication of organic integrated circuits with the desired photonic performance remains a big challenge. The rapid development of flexible electronics has shown that a solution printing technique has considerable potential for the large-scale fabrication and integration of microsized/nanosized devices. We propose the idea of soft photonics and demonstrate the function-directed fabrication of high-quality organic photonic devices and circuits. We prepared size-tunable and reproducible polymer microring resonators on a wafer-scale transparent and flexible chip using a solution printing technique. The printed optical resonator showed a quality (Q) factor higher than 4 × 10^5, which is comparable to that of silicon-based resonators. The high material compatibility of this printed photonic chip enabled us to realize low-threshold microlasers by doping organic functional molecules into a typical photonic device. On an identical chip, this construction strategy allowed us to design a complex assembly of one-dimensional waveguide and resonator components for light signal filtering and optical storage toward the large-scale on-chip integration of microscopic photonic units. Thus, we have developed a scheme for soft photonic integration that may motivate further studies on organic photonic materials and devices. PMID:26601256

  3. Organic printed photonics: From microring lasers to integrated circuits.

    PubMed

    Zhang, Chuang; Zou, Chang-Ling; Zhao, Yan; Dong, Chun-Hua; Wei, Cong; Wang, Hanlin; Liu, Yunqi; Guo, Guang-Can; Yao, Jiannian; Zhao, Yong Sheng

    2015-09-01

    A photonic integrated circuit (PIC) is the optical analogy of an electronic loop in which photons are signal carriers with high transport speed and parallel processing capability. Besides the most frequently demonstrated silicon-based circuits, PICs require a variety of materials for light generation, processing, modulation, and detection. With their diversity and flexibility, organic molecular materials provide an alternative platform for photonics; however, the versatile fabrication of organic integrated circuits with the desired photonic performance remains a big challenge. The rapid development of flexible electronics has shown that a solution printing technique has considerable potential for the large-scale fabrication and integration of microsized/nanosized devices. We propose the idea of soft photonics and demonstrate the function-directed fabrication of high-quality organic photonic devices and circuits. We prepared size-tunable and reproducible polymer microring resonators on a wafer-scale transparent and flexible chip using a solution printing technique. The printed optical resonator showed a quality (Q) factor higher than 4 × 10^5, which is comparable to that of silicon-based resonators. The high material compatibility of this printed photonic chip enabled us to realize low-threshold microlasers by doping organic functional molecules into a typical photonic device. On an identical chip, this construction strategy allowed us to design a complex assembly of one-dimensional waveguide and resonator components for light signal filtering and optical storage toward the large-scale on-chip integration of microscopic photonic units. Thus, we have developed a scheme for soft photonic integration that may motivate further studies on organic photonic materials and devices.

  4. An integrated approach for automated cover-type mapping of large inaccessible areas in Alaska

    USGS Publications Warehouse

    Fleming, Michael D.

    1988-01-01

    The lack of any detailed cover type maps in the state necessitated that a rapid and accurate approach be employed to develop maps for 329 million acres of Alaska within a seven-year period. This goal has been addressed by using an integrated approach to computer-aided analysis which combines efficient use of field data with the only consistent statewide spatial data sets available: Landsat multispectral scanner data, digital elevation data derived from 1:250 000-scale maps, and 1:60 000-scale color-infrared aerial photographs.

  5. Chromatin Landscapes of Retroviral and Transposon Integration Profiles

    PubMed Central

    Badhai, Jitendra; Rust, Alistair G.; Rad, Roland; Hilkens, John; Berns, Anton; van Lohuizen, Maarten; Wessels, Lodewyk F. A.; de Ridder, Jeroen

    2014-01-01

    The ability of retroviruses and transposons to insert their genetic material into host DNA makes them widely used tools in molecular biology, cancer research and gene therapy. However, these systems have biases that may strongly affect research outcomes. To address this issue, we generated very large datasets of unselected integrations in the mouse genome for the Sleeping Beauty (SB) and piggyBac (PB) transposons, and the Mouse Mammary Tumor Virus (MMTV). We analyzed (epi)genomic features to generate bias maps at both local and genome-wide scales. MMTV showed a remarkably uniform distribution of integrations across the genome. More distinct preferences were observed for the two transposons, with PB showing remarkable resemblance to bias profiles of the Murine Leukemia Virus. Furthermore, we present a model where target site selection is directed at multiple scales. At a large scale, target site selection is similar across systems, and defined by domain-oriented features, namely expression of proximal genes, proximity to CpG islands and to genic features, chromatin compaction and replication timing. Notable differences between the systems are mainly observed at smaller scales, and are directed by a diverse range of features. To study the effect of these biases on integration sites occupied under selective pressure, we turned to insertional mutagenesis (IM) screens. In IM screens, putative cancer genes are identified by finding frequently targeted genomic regions, or Common Integration Sites (CISs). Within three recently completed IM screens, we identified 7%–33% putative false positive CISs, which are likely not the result of the oncogenic selection process. Moreover, results indicate that PB, compared to SB, is more suited to tag oncogenes. PMID:24721906
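
    A common integration site is, at its simplest, a genomic window holding more insertions than chance would predict. The counting step can be sketched as below; window size, threshold, and positions are invented for illustration, and real CIS calling must also correct for the background biases this paper maps.

```python
from collections import Counter

def call_cis(positions, window=10_000, min_hits=3):
    """Return start coordinates of windows with >= min_hits insertions."""
    bins = Counter(p // window for p in positions)
    return sorted(b * window for b, n in bins.items() if n >= min_hits)

# Hypothetical insertion coordinates on one chromosome.
insertions = [1_200, 3_500, 9_800, 55_000, 120_000, 121_500, 124_000, 300_000]
print(call_cis(insertions))  # [0, 120000]
```

    Without a bias-aware background model, windows favored by a vector's intrinsic target-site preferences would be flagged this way too, which is how putative false positive CISs of the kind quantified above can arise.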

  6. Evaluation of an index of biotic integrity approach used to assess biological condition in western U.S. streams and rivers at varying spatial scales

    USGS Publications Warehouse

    Meador, M.R.; Whittier, T.R.; Goldstein, R.M.; Hughes, R.M.; Peck, D.V.

    2008-01-01

Consistent assessments of biological condition are needed across multiple ecoregions to provide a greater understanding of the spatial extent of environmental degradation. However, consistent assessments at large geographic scales are often hampered by lack of uniformity in data collection, analyses, and interpretation. The index of biotic integrity (IBI) has been widely used in eastern and central North America, where fish assemblages are complex and largely composed of native species, but IBI development has been hindered in the western United States because of relatively low fish species richness and greater relative abundance of alien fishes. Approaches to developing IBIs rarely provide a consistent means of assessing biological condition across multiple ecoregions. We conducted an evaluation of IBIs recently proposed for three ecoregions of the western United States using an independent data set covering a large geographic scale. We standardized the regional IBIs and developed biological condition criteria, assessed the responsiveness of IBIs to basin-level land uses, and assessed their precision and concordance with basin-scale IBIs. Standardized IBI scores from 318 sites in the western United States comprising mountain, plains, and xeric ecoregions were significantly related to combined urban and agricultural land uses. Standard deviations and coefficients of variation revealed relatively low variation in IBI scores based on multiple sampling reaches at sites. A relatively high degree of corroboration with independent, locally developed IBIs indicates that the regional IBIs are robust across large geographic scales, providing precise and accurate assessments of biological condition for western U.S. streams. © Copyright by the American Fisheries Society 2008.

  7. Wafer Scale Integration of CMOS Chips for Biomedical Applications via Self-Aligned Masking.

    PubMed

    Uddin, Ashfaque; Milaninia, Kaveh; Chen, Chin-Hsuan; Theogarajan, Luke

    2011-12-01

This paper presents a novel technique for the integration of small CMOS chips into a large-area substrate. A key component of the technique is CMOS chip-based self-aligned masking, which allows for the fabrication of sockets in wafers that are at most 5 µm larger than the chip on each side. The chip and the large-area substrate are bonded onto a carrier such that the top surfaces of the two components are flush. The unique features of this technique enable the integration of macroscale components, such as leads and microfluidics. Furthermore, the integration process allows for MEMS micromachining after CMOS die-wafer integration. To demonstrate the capabilities of the proposed technology, a low-power integrated potentiostat chip for biosensing, implemented in the AMI 0.5 µm CMOS technology, was integrated into a silicon substrate. The horizontal gap and the vertical displacement between the chip and the large-area substrate measured after the integration were 4 µm and 0.5 µm, respectively. In all, 104 interconnects were patterned with high-precision alignment. Electrical measurements showed that the functionality of the chip is not affected by the integration process.

  8. Soil organic carbon across scales.

    PubMed

    O'Rourke, Sharon M; Angers, Denis A; Holden, Nicholas M; McBratney, Alex B

    2015-10-01

Mechanistic understanding of scale effects is important for interpreting the processes that control the global carbon cycle. Greater attention should be given to scale in soil organic carbon (SOC) science so that we can devise better policy to protect/enhance existing SOC stocks and ensure sustainable use of soils. Global issues such as climate change require consideration of SOC stock changes at the global and biosphere scale, but human interaction occurs at the landscape scale, with consequences at the pedon, aggregate and particle scales. This review evaluates our understanding of SOC across all these scales in the context of the processes involved in SOC cycling at each scale and with emphasis on stabilizing SOC. Current synergy between science and policy is explored at each scale to determine how well each is represented in the management of SOC. We also outline how SOC might be integrated into a framework of soil security. We conclude that SOC processes at the biosphere to biome scales are not well understood. Instead, SOC has come to be viewed as a large-scale pool subject to carbon flux. Better understanding exists for SOC processes operating at the scales of the pedon, aggregate and particle. At the landscape scale, the influence of large- and small-scale processes has the greatest interaction and is exposed to the greatest modification through agricultural management. Policy implemented at regional or national scale tends to focus at the landscape scale without due consideration of the larger-scale factors controlling SOC or the impacts of policy for SOC at the smaller SOC scales. What is required is a framework that can be integrated across a continuum of scales to optimize SOC management. © 2015 John Wiley & Sons Ltd.

  9. Spin diffusion from an inhomogeneous quench in an integrable system.

    PubMed

    Ljubotina, Marko; Žnidarič, Marko; Prosen, Tomaž

    2017-07-13

    Generalized hydrodynamics predicts universal ballistic transport in integrable lattice systems when prepared in generic inhomogeneous initial states. However, the ballistic contribution to transport can vanish in systems with additional discrete symmetries. Here we perform large scale numerical simulations of spin dynamics in the anisotropic Heisenberg XXZ spin 1/2 chain starting from an inhomogeneous mixed initial state which is symmetric with respect to a combination of spin reversal and spatial reflection. In the isotropic and easy-axis regimes we find non-ballistic spin transport which we analyse in detail in terms of scaling exponents of the transported magnetization and scaling profiles of the spin density. While in the easy-axis regime we find accurate evidence of normal diffusion, the spin transport in the isotropic case is clearly super-diffusive, with the scaling exponent very close to 2/3, but with universal scaling dynamics which obeys the diffusion equation in nonlinearly scaled time.
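The superdiffusive exponent quoted above is typically extracted from the growth of the transported magnetization with time. A minimal sketch of that extraction, using synthetic data generated with the expected t**(2/3) law plus small noise rather than actual simulation output:

```python
import numpy as np

# Synthetic transported magnetization: superdiffusive growth ~ t**(2/3).
# The data here are illustrative; real values would come from the spin
# dynamics simulations described in the abstract.
rng = np.random.default_rng(1)
t = np.logspace(1, 4, 40)                  # times, arbitrary units
alpha_true = 2.0 / 3.0
dS = t**alpha_true * (1.0 + 0.01 * rng.standard_normal(t.size))

# Recover the scaling exponent from the slope of a log-log fit.
alpha_fit, _ = np.polyfit(np.log(t), np.log(dS), 1)
print(round(alpha_fit, 3))
```

The fitted slope should sit very close to 2/3, mirroring the exponent reported for the isotropic point.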

  10. Hypersonic research engine/aerothermodynamic integration model, experimental results. Volume 1: Mach 6 component integration

    NASA Technical Reports Server (NTRS)

    Andrews, E. H., Jr.; Mackley, E. A.

    1976-01-01

The NASA Hypersonic Research Engine (HRE) Project was initiated for the purpose of advancing the technology of airbreathing propulsion for hypersonic flight. The project encompassed a large component (inlet, combustor, and nozzle) and structures development program. A full-scale (18 in. diameter cowl, 87 in. long) HRE concept, designated the Aerothermodynamic Integration Model (AIM), was tested at Mach numbers of 5, 6, and 7. Computer program results for Mach 6 component integration tests are presented.

  11. Tau Ranging Revisited

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1989-01-01

Report reviews the history of tau ranging and advocates the use of advanced electronic circuitry to revive this composite-code-uplink spacecraft-ranging technique. Very-large-scale integration gives new life to this abandoned distance-measuring technique.

  12. A 14 × 14 μm2 footprint polarization-encoded quantum controlled-NOT gate based on hybrid waveguide

    PubMed Central

    Wang, S. M.; Cheng, Q. Q.; Gong, Y. X.; Xu, P.; Sun, C.; Li, L.; Li, T.; Zhu, S. N.

    2016-01-01

Photonic quantum information processing systems are widely used in communication, metrology and lithography. The recent emphasis on miniaturized photonic platforms is motivated by the urgent need for large-scale information processing and computing. Although integrated quantum logic gates and quantum algorithms based on path encoding have been successfully demonstrated, the technology for handling the other commonly used encoding, polarization-encoded qubits, has yet to be fully developed. Here, we show the implementation of a polarization-dependent beam splitter in a hybrid waveguide system. With precise design, the polarization-encoded controlled-NOT gate can be implemented using only a single such polarization-dependent beam splitter, reducing the overall device footprint to 14 × 14 μm2. The experimental demonstration of this highly integrated controlled-NOT gate sets the stage for developing large-scale quantum information processing systems. Our hybrid design also establishes new capabilities in controlling polarization modes in integrated photonic circuits. PMID:27142992

  13. Large-scale network integration in the human brain tracks temporal fluctuations in memory encoding performance.

    PubMed

    Keerativittayayut, Ruedeerat; Aoki, Ryuta; Sarabi, Mitra Taghizadeh; Jimura, Koji; Nakahara, Kiyoshi

    2018-06-18

Although activation/deactivation of specific brain regions has been shown to be predictive of successful memory encoding, the relationship between time-varying large-scale brain networks and fluctuations of memory encoding performance remains unclear. Here we investigated time-varying functional connectivity patterns across the human brain in periods of 30-40 s, which have recently been implicated in various cognitive functions. During functional magnetic resonance imaging, participants performed a memory encoding task, and their performance was assessed with a subsequent surprise memory test. A graph analysis of functional connectivity patterns revealed that increased integration of the subcortical, default-mode, salience, and visual subnetworks with other subnetworks is a hallmark of successful memory encoding. Moreover, multivariate analysis using the graph metrics of integration reliably classified brain network states into periods of high versus low memory encoding performance. Our findings suggest that a diverse set of brain systems dynamically interact to support successful memory encoding. © 2018, Keerativittayayut et al.
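The graph metrics of integration mentioned above can be illustrated with the participation coefficient, a standard measure of how evenly a node's connections spread across subnetworks (the study's exact metrics may differ). A toy sketch with a hypothetical 4-node connectivity matrix and module labels:

```python
import numpy as np

# Participation coefficient: P_i = 1 - sum_s (k_is / k_i)**2, where k_is is
# node i's connection strength into module s and k_i its total strength.
# The weights and module assignment below are purely illustrative.
W = np.array([[0.0, 1.0, 1.0, 0.2],
              [1.0, 0.0, 1.0, 0.0],
              [1.0, 1.0, 0.0, 1.0],
              [0.2, 0.0, 1.0, 0.0]])
modules = np.array([0, 0, 0, 1])          # hypothetical subnetwork labels

k = W.sum(axis=1)                          # total strength per node
P = np.ones(len(W))
for s in np.unique(modules):
    ks = W[:, modules == s].sum(axis=1)    # strength into module s
    P -= (ks / k) ** 2
print(np.round(P, 3))
```

Nodes whose links stay within one module score 0; nodes bridging modules score higher, which is the sense in which "integration" is quantified.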

  14. Integrative approaches for large-scale transcriptome-wide association studies

    PubMed Central

    Gusev, Alexander; Ko, Arthur; Shi, Huwenbo; Bhatia, Gaurav; Chung, Wonil; Penninx, Brenda W J H; Jansen, Rick; de Geus, Eco JC; Boomsma, Dorret I; Wright, Fred A; Sullivan, Patrick F; Nikkola, Elina; Alvarez, Marcus; Civelek, Mete; Lusis, Aldons J.; Lehtimäki, Terho; Raitoharju, Emma; Kähönen, Mika; Seppälä, Ilkka; Raitakari, Olli T.; Kuusisto, Johanna; Laakso, Markku; Price, Alkes L.; Pajukanta, Päivi; Pasaniuc, Bogdan

    2016-01-01

Many genetic variants influence complex traits by modulating gene expression, thus altering the abundance levels of one or multiple proteins. Here, we introduce a powerful strategy that integrates gene expression measurements with summary association statistics from large-scale genome-wide association studies (GWAS) to identify genes whose cis-regulated expression is associated with complex traits. We leverage expression imputation to perform a transcriptome-wide association scan (TWAS) to identify significant expression-trait associations. We applied our approaches to expression data from blood and adipose tissue measured in ~3,000 individuals overall. We imputed gene expression into GWAS data from over 900,000 phenotype measurements to identify 69 novel genes significantly associated with obesity-related traits (BMI, lipids, and height). Many of the novel genes are associated with relevant phenotypes in the Hybrid Mouse Diversity Panel. Our results showcase the power of integrating genotype, gene expression and phenotype to gain insights into the genetic basis of complex traits. PMID:26854917
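The TWAS idea, imputing cis-regulated expression from genotypes with precomputed weights and then testing the imputed expression against the trait, can be sketched on simulated data; everything below is illustrative, not taken from the study:

```python
import numpy as np

# Simulated cohort: n individuals, p cis-SNPs near one gene.
rng = np.random.default_rng(3)
n, p = 5000, 20
geno = rng.binomial(2, 0.3, size=(n, p)).astype(float)
weights = rng.standard_normal(p) * 0.2     # hypothetical cis-eQTL weights

# Impute expression as a weighted sum of genotypes, then simulate a trait
# that depends on the gene's expression plus noise.
expr_imputed = geno @ weights
trait = 0.3 * expr_imputed + rng.standard_normal(n)

# Expression-trait association: correlation and an approximate z-score.
r = np.corrcoef(expr_imputed, trait)[0, 1]
z = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)
print(f"r = {r:.2f}, z = {z:.1f}")
```

In the real method the weights come from an expression reference panel and the test is run from GWAS summary statistics, but the core imputation step is this weighted sum.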

  15. Vertical resonant tunneling transistors with molecular quantum dots for large-scale integration.

    PubMed

    Hayakawa, Ryoma; Chikyow, Toyohiro; Wakayama, Yutaka

    2017-08-10

Quantum molecular devices have the potential for the construction of new data-processing architectures that cannot be achieved using current complementary metal-oxide-semiconductor (CMOS) technology. The relevant basic quantum transport properties have been examined by specific methods such as scanning probe and break-junction techniques. However, these methodologies are not compatible with current CMOS applications, and the development of practical molecular devices remains a persistent challenge. Here, we demonstrate a new vertical resonant tunneling transistor for large-scale integration. The transistor channel comprises a MOS structure with C60 molecules as quantum dots, and the structure behaves like a double tunnel junction. Notably, the transistors enabled the observation of stepwise drain currents, which originated from resonant tunneling via the discrete molecular orbitals. Applying side-gate voltages produced depletion layers in the Si substrates, achieving effective modulation of the drain currents and clear peak shifts in the differential conductance curves. Our device configuration thus provides a promising means of integrating molecular functions into future CMOS applications.

  16. Large-scale complementary macroelectronics using hybrid integration of carbon nanotubes and IGZO thin-film transistors.

    PubMed

    Chen, Haitian; Cao, Yu; Zhang, Jialu; Zhou, Chongwu

    2014-06-13

    Carbon nanotubes and metal oxide semiconductors have emerged as important materials for p-type and n-type thin-film transistors, respectively; however, realizing sophisticated macroelectronics operating in complementary mode has been challenging due to the difficulty in making n-type carbon nanotube transistors and p-type metal oxide transistors. Here we report a hybrid integration of p-type carbon nanotube and n-type indium-gallium-zinc-oxide thin-film transistors to achieve large-scale (>1,000 transistors for 501-stage ring oscillators) complementary macroelectronic circuits on both rigid and flexible substrates. This approach of hybrid integration allows us to combine the strength of p-type carbon nanotube and n-type indium-gallium-zinc-oxide thin-film transistors, and offers high device yield and low device variation. Based on this approach, we report the successful demonstration of various logic gates (inverter, NAND and NOR gates), ring oscillators (from 51 stages to 501 stages) and dynamic logic circuits (dynamic inverter, NAND and NOR gates).

  17. A 14 × 14 μm(2) footprint polarization-encoded quantum controlled-NOT gate based on hybrid waveguide.

    PubMed

    Wang, S M; Cheng, Q Q; Gong, Y X; Xu, P; Sun, C; Li, L; Li, T; Zhu, S N

    2016-05-04

Photonic quantum information processing systems are widely used in communication, metrology and lithography. The recent emphasis on miniaturized photonic platforms is motivated by the urgent need for large-scale information processing and computing. Although integrated quantum logic gates and quantum algorithms based on path encoding have been successfully demonstrated, the technology for handling the other commonly used encoding, polarization-encoded qubits, has yet to be fully developed. Here, we show the implementation of a polarization-dependent beam splitter in a hybrid waveguide system. With precise design, the polarization-encoded controlled-NOT gate can be implemented using only a single such polarization-dependent beam splitter, reducing the overall device footprint to 14 × 14 μm(2). The experimental demonstration of this highly integrated controlled-NOT gate sets the stage for developing large-scale quantum information processing systems. Our hybrid design also establishes new capabilities in controlling polarization modes in integrated photonic circuits.

  18. ELT-scale Adaptive Optics real-time control with the Intel Xeon Phi Many Integrated Core Architecture

    NASA Astrophysics Data System (ADS)

    Jenkins, David R.; Basden, Alastair; Myers, Richard M.

    2018-05-01

We propose a solution to the increased computational demands of Extremely Large Telescope (ELT) scale adaptive optics (AO) real-time control with the Intel Xeon Phi Knights Landing (KNL) Many Integrated Core (MIC) Architecture. The computational demands of an AO real-time controller (RTC) scale with the fourth power of telescope diameter, so the next-generation ELTs require orders of magnitude more processing power for the RTC pipeline than existing systems. The Xeon Phi contains a large number (≥64) of low-power x86 CPU cores and high-bandwidth memory integrated into a single socketed server CPU package. The increased parallelism and memory bandwidth are crucial to providing the performance for reconstructing wavefronts with the required precision for ELT-scale AO. Here, we demonstrate that the Xeon Phi KNL is capable of performing ELT-scale single-conjugate AO real-time control computation at over 1.0 kHz with less than 20 μs RMS jitter. We have also shown that with a wavefront sensor camera attached, the KNL can process the real-time control loop at up to 966 Hz, the maximum frame rate of the camera, with jitter remaining below 20 μs RMS. Future studies will involve exploring the use of a cluster of Xeon Phis for the real-time control of the MCAO and MOAO regimes of AO. We find that the Xeon Phi is highly suitable for ELT AO real-time control.
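The dominant computation in a typical AO RTC pipeline is a large matrix-vector multiply mapping wavefront-sensor slopes to actuator commands, which is what drives the fourth-power scaling with telescope diameter. A rough sketch of measuring per-frame latency and jitter for such a multiply (sizes are illustrative and far below ELT scale, and this is plain NumPy, not the Xeon Phi implementation):

```python
import time
import numpy as np

# Illustrative reconstruction step: commands = control matrix @ slopes.
n_slopes, n_actuators = 2000, 1200        # hypothetical, sub-ELT sizes
rng = np.random.default_rng(2)
cmat = rng.standard_normal((n_actuators, n_slopes)).astype(np.float32)
slopes = rng.standard_normal(n_slopes).astype(np.float32)

# Time repeated "frames" to estimate mean latency and RMS jitter.
latencies = []
for _ in range(50):
    t0 = time.perf_counter()
    commands = cmat @ slopes              # actuator commands for this frame
    latencies.append(time.perf_counter() - t0)

lat = np.array(latencies[10:])            # discard warm-up frames
print(f"mean {lat.mean()*1e6:.0f} us, rms jitter {lat.std()*1e6:.0f} us")
```

The RMS of the per-frame latencies is the jitter figure quoted in the abstract; at ELT scale the matrix is orders of magnitude larger, hence the need for the KNL's many cores and high-bandwidth memory.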

  19. Calculating Soil Wetness, Evapotranspiration and Carbon Cycle Processes Over Large Grid Areas Using a New Scaling Technique

    NASA Technical Reports Server (NTRS)

    Sellers, Piers

    2012-01-01

Soil wetness typically shows great spatial variability over the length scales of general circulation model (GCM) grid areas (approx 100 km), and the functions relating evapotranspiration and photosynthetic rate to local-scale (approx 1 m) soil wetness are highly non-linear. Soil respiration is also highly dependent on very small-scale variations in soil wetness. We therefore expect significant inaccuracies whenever we insert a single grid-area-average soil wetness value into a function to calculate any of these rates for the grid area. For the particular case of evapotranspiration, this method - use of a grid-averaged soil wetness value - can also provoke severe oscillations in the evapotranspiration rate and soil wetness under some conditions. A method is presented whereby the probability distribution function (pdf) for soil wetness within a grid area is represented by binning, and numerical integration of the binned pdf is performed to provide a spatially-integrated wetness stress term for the whole grid area, which then permits calculation of grid-area fluxes in a single operation. The method is very accurate when 10 or more bins are used, can deal realistically with spatially variable precipitation, conserves moisture exactly and allows for precise modification of the soil wetness pdf after every time step. The method could also be applied to other ecological problems where small-scale processes must be area-integrated, or upscaled, to estimate fluxes over large areas, for example in treatments of the terrestrial carbon budget or trace gas generation.
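The binned-pdf approach can be sketched as follows; the stress function and wetness distribution here are hypothetical, chosen only to show that integrating a nonlinear function over the binned pdf recovers its true area-average, which applying the function to the grid-mean wetness does not in general:

```python
import numpy as np

# Hypothetical nonlinear stress function mapping local soil wetness w in
# [0, 1] to a rate multiplier; illustrative only.
def stress(w):
    return w**2 / (0.25 + w**2)

# Local-scale soil wetness values within one grid cell (hypothetical sample).
rng = np.random.default_rng(0)
w_local = rng.beta(2.0, 5.0, size=100_000)

# Naive estimate: apply the nonlinear function to the grid-mean wetness.
naive = stress(w_local.mean())

# Binned-pdf estimate: bin the wetness pdf (10 bins, as the paper suggests),
# then integrate stress(w) weighted by the bin probabilities.
counts, edges = np.histogram(w_local, bins=10, range=(0.0, 1.0))
probs = counts / counts.sum()
centers = 0.5 * (edges[:-1] + edges[1:])
binned = np.sum(probs * stress(centers))

# Exact area-average for comparison.
exact = stress(w_local).mean()
print(naive, binned, exact)
```

The same pattern, area-integrating any small-scale nonlinear process over a binned pdf, carries over to the carbon-budget and trace-gas applications mentioned at the end of the abstract.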

  20. The study of integration about measurable image and 4D production

    NASA Astrophysics Data System (ADS)

    Zhang, Chunsen; Hu, Pingbo; Niu, Weiyun

    2008-12-01

In this paper, we create geospatial data for three-dimensional (3D) modeling by combining digital photogrammetry and digital close-range photogrammetry. For the large-scale geographical background, we establish a three-dimensional landscape model combining DEM and DOM, based on digital photogrammetry, which uses aerial image data to make "4D" production (DOM: Digital Orthophoto Map, DEM: Digital Elevation Model, DLG: Digital Line Graphic and DRG: Digital Raster Graphic). For buildings and other artificial features of interest to users, we achieve three-dimensional reconstruction of real features using digital close-range photogrammetry, through the following steps: data collection with non-metric cameras, camera calibration, feature extraction, image matching, and other steps. Finally, we combine the three-dimensional background with locally measured real images of these large geographic data and realize the integration of measurable real imagery and 4D production. The article discusses the whole workflow and technology, achieving three-dimensional reconstruction and the integration of the large-scale three-dimensional landscape and the metric building.

  1. Dynamic subfilter-scale stress model for large-eddy simulations

    NASA Astrophysics Data System (ADS)

    Rouhi, A.; Piomelli, U.; Geurts, B. J.

    2016-08-01

We present a modification of the integral length-scale approximation (ILSA) model originally proposed by Piomelli et al. [Piomelli et al., J. Fluid Mech. 766, 499 (2015), 10.1017/jfm.2015.29] and apply it to plane channel flow and a backward-facing step. In the ILSA models the length scale is expressed in terms of the integral length scale of turbulence and is determined by the flow characteristics, decoupled from the simulation grid. In the original formulation the model coefficient was constant, determined by requiring a desired global contribution of the unresolved subfilter scales (SFSs) to the dissipation rate, known as SFS activity; its value was found by a set of coarse-grid calculations. Here we develop two modifications. We define a measure of SFS activity (based on turbulent stresses), which adds to the robustness of the model, particularly at high Reynolds numbers, and removes the need for the prior coarse-grid calculations: The model coefficient can be computed dynamically and adapt to large-scale unsteadiness. Furthermore, the desired level of SFS activity is now enforced locally (and not integrated over the entire volume, as in the original model), providing better control over model activity and also improving the near-wall behavior of the model. Application of the local ILSA to channel flow and a backward-facing step and comparison with the original ILSA and with the dynamic model of Germano et al. [Germano et al., Phys. Fluids A 3, 1760 (1991), 10.1063/1.857955] show better control over the model contribution in the local ILSA, while the positive properties of the original formulation (including its higher accuracy compared to the dynamic model on coarse grids) are maintained. The backward-facing step also highlights the advantage of the decoupling of the model length scale from the mesh.

  2. A Landscape Model (LEEMATH) to Evaluate Effects of Management Impacts on Timber and Wildlife Habitat

    Treesearch

    Harbin Li; David L. Gartner; Pu Mou; Carl C. Trettin

    2000-01-01

Managing forest resources for sustainability requires the successful integration of economic and ecological goals. To attain such integration, land managers need decision support tools that incorporate science, land-use strategies, and policy options to assess resource sustainability at large scales. Landscape Evaluation of Effects of Management Activities on Timber...

  3. A Smart Partnership: Integrating Educational Technology for Underserved Children in India

    ERIC Educational Resources Information Center

    Charania, Amina; Davis, Niki

    2016-01-01

    This paper explores the evolution of a large multi-stakeholder partnership that has grown since 2011 to scale deep engagement with learning through technology and decrease the digital divide for thousands of underserved school children in India. Using as its basis a case study of an initiative called integrated approach to technology in education…

  4. 75 FR 51843 - In the Matter of Certain Large Scale Integrated Circuit Semiconductor Chips and Products...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-23

    ... Integrated Circuit Semiconductor Chips and Products Containing the Same; Notice of Commission Decision Not To... semiconductor chips and products containing same by reason of infringement of certain claims of U.S. Patent Nos. 5,933,364 and 6,834,336. The complaint further alleges the existence of a domestic industry. The...

  5. Non-linear coherent mode interactions and the control of shear layers

    NASA Technical Reports Server (NTRS)

    Nikitopoulos, D. E.; Liu, J. T. C.

    1990-01-01

    A nonlinear integral formulation, based on local linear stability considerations, is used to study the collective interactions between discrete wave-modes associated with large-scale structures and the mean flow in a developing shear layer. Aspects of shear layer control are examined in light of the sensitivity of these interactions to the initial frequency parameter, modal energy contents and modal phases. Manipulation of the large-scale structure is argued to be an effective means of controlling the flow, including the small-scale turbulence dominated region far downstream. Cases of fundamental, 1st and 2nd subharmonic forcing are discussed in conjunction with relevant experiments.

  6. Using relational databases for improved sequence similarity searching and large-scale genomic analyses.

    PubMed

    Mackey, Aaron J; Pearson, William R

    2004-10-01

Relational databases are designed to integrate diverse types of information and manage large sets of search results, greatly simplifying genome-scale analyses. Relational databases are essential for the management and analysis of large-scale sequence analyses, and can also be used to improve the statistical significance of similarity searches by focusing on subsets of sequence libraries most likely to contain homologs. This unit describes using relational databases to improve the efficiency of sequence similarity searching and to demonstrate various large-scale genomic analyses of homology-related data. It describes the installation and use of a simple protein sequence database, seqdb_demo, which is used as a basis for the other protocols. These include basic use of the database to generate a novel sequence library subset, extending and using seqdb_demo for the storage of sequence similarity search results, and making use of various kinds of stored search results to address aspects of comparative genomic analysis.
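The core pattern, storing annotated sequences in a relational table and querying out a focused library subset, can be sketched with SQLite; the schema below is hypothetical and much simpler than the unit's seqdb_demo database:

```python
import sqlite3

# Minimal hypothetical schema: accession, taxon, description, sequence.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE protein (
    acc TEXT PRIMARY KEY, taxon TEXT, descr TEXT, seq TEXT)""")
db.executemany(
    "INSERT INTO protein VALUES (?, ?, ?, ?)",
    [("P1", "E. coli", "kinase", "MKV..."),
     ("P2", "H. sapiens", "kinase", "MAD..."),
     ("P3", "H. sapiens", "phosphatase", "MSE...")])

# Library subset: human kinases only, emitted as FASTA for a focused search.
rows = db.execute(
    "SELECT acc, seq FROM protein WHERE taxon = ? AND descr = ?",
    ("H. sapiens", "kinase")).fetchall()
fasta = "".join(f">{acc}\n{seq}\n" for acc, seq in rows)
print(fasta)
```

Searching against such a restricted subset is what the unit means by improving statistical significance: the library contains only sequences likely to include homologs.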

  7. Survey of organizational research climates in three research intensive, doctoral granting universities.

    PubMed

    Wells, James A; Thrush, Carol R; Martinson, Brian C; May, Terry A; Stickler, Michelle; Callahan, Eileen C; Klomparens, Karen L

    2014-12-01

    The Survey of Organizational Research Climate (SOuRCe) is a new instrument that assesses dimensions of research integrity climate, including ethical leadership, socialization and communication processes, and policies, procedures, structures, and processes to address risks to research integrity. We present a descriptive analysis to characterize differences on the SOuRCe scales across departments, fields of study, and status categories (faculty, postdoctoral scholars, and graduate students) for 11,455 respondents from three research-intensive universities. Among the seven SOuRCe scales, variance explained by status and fields of study ranged from 7.6% (Advisor-Advisee Relations) to 16.2% (Integrity Norms). Department accounted for greater than 50% of the variance explained for each of the SOuRCe scales, ranging from 52.6% (Regulatory Quality) to 80.3% (Integrity Inhibitors). It is feasible to implement this instrument in large university settings across a broad range of fields, department types, and individual roles within academic units. Published baseline results provide initial data for institutions using the SOuRCe who wish to compare their own research integrity climates. © The Author(s) 2014.
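The "variance explained" by a grouping factor such as department can be illustrated with eta-squared, the between-group share of total variance; the scores below are simulated, not SOuRCe data:

```python
import numpy as np

# Simulated climate-scale scores for hypothetical departments.
rng = np.random.default_rng(4)
n_groups, n_per = 30, 40
group_means = rng.normal(3.5, 0.5, n_groups)   # department-level means
scores = (np.repeat(group_means, n_per)
          + rng.normal(0.0, 0.5, n_groups * n_per))
labels = np.repeat(np.arange(n_groups), n_per)

# Eta-squared: between-group sum of squares over total sum of squares.
grand = scores.mean()
ss_total = ((scores - grand) ** 2).sum()
ss_between = sum(
    len(g) * (g.mean() - grand) ** 2
    for g in (scores[labels == k] for k in range(n_groups)))
eta_sq = ss_between / ss_total
print(f"eta^2 = {eta_sq:.2f}")
```

By construction (equal between-group and within-group variance) the result lands near 0.5, echoing the abstract's finding that department accounts for more than half of the explained variance.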

  8. Guided Growth of Horizontal ZnSe Nanowires and their Integration into High-Performance Blue-UV Photodetectors.

    PubMed

    Oksenberg, Eitan; Popovitz-Biro, Ronit; Rechav, Katya; Joselevich, Ernesto

    2015-07-15

Perfectly aligned horizontal ZnSe nanowires are obtained by guided growth, and easily integrated into high-performance blue-UV photodetectors. Their crystal phase and crystallographic orientation are controlled by the epitaxial relations with six different sapphire planes. Guided growth paves the way for the large-scale integration of nanowires into optoelectronic devices. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. On the statistical mechanics of the 2D stochastic Euler equation

    NASA Astrophysics Data System (ADS)

    Bouchet, Freddy; Laurie, Jason; Zaboronski, Oleg

    2011-12-01

The dynamics of vortices and large-scale structures is qualitatively very different in two-dimensional flows compared to their three-dimensional counterparts, due to the presence of multiple integrals of motion. These are believed to be responsible for a variety of phenomena observed in Euler flow, such as the formation of large-scale coherent structures, the existence of meta-stable states and random abrupt changes in the topology of the flow. In this paper we study the stochastic dynamics of a finite-dimensional approximation of the 2D Euler flow based on the Lie algebra su(N), which preserves all integrals of motion. In particular, we exploit the rich algebraic structure responsible for the existence of Euler's conservation laws to calculate the invariant measures, explore their properties and study the approach to equilibrium. Unexpectedly, we find deep connections between equilibrium measures of finite-dimensional su(N) truncations of the stochastic Euler equations and random matrix models. Our work can be regarded as a preparation for addressing the questions of large-scale structures, meta-stability and the dynamics of random transitions between different flow topologies in stochastic 2D Euler flows.

  10. Aerodynamic Design of a Dual-Flow Mach 7 Hypersonic Inlet System for a Turbine-Based Combined-Cycle Hypersonic Propulsion System

    NASA Technical Reports Server (NTRS)

    Sanders, Bobby W.; Weir, Lois J.

    2008-01-01

    A new hypersonic inlet for a turbine-based combined-cycle (TBCC) engine has been designed. This split-flow inlet is designed to provide flow to an over-under propulsion system with turbofan and dual-mode scramjet engines for flight from takeoff to Mach 7. It utilizes a variable-geometry ramp, high-speed cowl lip rotation, and a rotating low-speed cowl that serves as a splitter to divide the flow between the low-speed turbofan and the high-speed scramjet and to isolate the turbofan at high Mach numbers. The low-speed inlet was designed for Mach 4, the maximum mode transition Mach number. Integration of the Mach 4 inlet into the Mach 7 inlet imposed significant constraints on the low-speed inlet design, including a large amount of internal compression. The inlet design was used to develop mechanical designs for two inlet mode transition test models: small-scale (IMX) and large-scale (LIMX) research models. The large-scale model is designed to facilitate multi-phase testing including inlet mode transition and inlet performance assessment, controls development, and integrated systems testing with turbofan and scramjet engines.

  11. Large-scale behaviour of local and entanglement entropy of the free Fermi gas at any temperature

    NASA Astrophysics Data System (ADS)

    Leschke, Hajo; Sobolev, Alexander V.; Spitzer, Wolfgang

    2016-07-01

The leading asymptotic large-scale behaviour of the spatially bipartite entanglement entropy (EE) of the free Fermi gas infinitely extended in multidimensional Euclidean space at zero absolute temperature, T = 0, is by now well understood. Here, we present and discuss the first rigorous results for the corresponding EE of thermal equilibrium states at T > 0. The leading large-scale term of this thermal EE turns out to be twice the first-order finite-size correction to the infinite-volume thermal entropy (density). Not surprisingly, this correction is just the thermal entropy on the interface of the bipartition. However, it is given by a rather complicated integral derived from a semiclassical trace formula for a certain operator on the underlying one-particle Hilbert space. But in the zero-temperature limit T → 0, the leading large-scale term of the thermal EE considerably simplifies and displays a ln(1/T) singularity, which one may identify with the known logarithmic enhancement at T = 0 of the so-called area-law scaling.

  12. FIELD DEMONSTRATION OF EMERGING PIPE WALL INTEGRITY ASSESSMENT TECHNOLOGIES FOR LARGE CAST IRON WATER MAINS

    EPA Science Inventory

    The U.S. Environmental Protection Agency sponsored a large-scale field demonstration of innovative leak detection/location and condition assessment technologies on a 76-year old, 2,500-ft long, cement-lined, 24-in. cast iron water main in Louisville, KY from July through Septembe...

  13. Field Demonstration of Emerging Pipe Wall Integrity Assessment Technologies for Large Cast Iron Water Mains - Paper

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) sponsored a large-scale field demonstration of innovative leak detection/location and condition assessment technologies on a 76-year old, 2,000-ft long, cement-lined, 24-in. cast-iron water main in Louisville, KY from July through Se...

  14. A Navy Shore Activity Manpower Planning System for Civilians. Technical Report No. 24.

    ERIC Educational Resources Information Center

    Niehaus, R. J.; Sholtz, D.

    This report describes the U.S. Navy Shore Activity Manpower Planning System (SAMPS) advanced development research project. This effort is aimed at large-scale feasibility tests of manpower models for large Naval installations. These local planning systems are integrated with Navy-wide information systems on a data-communications network accessible…

  15. Integrating complexity into data-driven multi-hazard supply chain network strategies

    USGS Publications Warehouse

    Long, Suzanna K.; Shoberg, Thomas G.; Ramachandran, Varun; Corns, Steven M.; Carlo, Hector J.

    2013-01-01

    Major strategies in the wake of a large-scale disaster have focused on short-term emergency response solutions. Few consider medium-to-long-term restoration strategies that reconnect urban areas to the national supply chain networks (SCN) and their supporting infrastructure. To re-establish this connectivity, the relationships within the SCN must be defined and formulated as a model of a complex adaptive system (CAS). A CAS model is a representation of a system that consists of large numbers of inter-connections, demonstrates non-linear behaviors and emergent properties, and responds to stimulus from its environment. CAS modeling is an effective method of managing complexities associated with SCN restoration after large-scale disasters. In order to populate the data space, large data sets are required. Currently, access to these data is hampered by proprietary restrictions. The aim of this paper is to identify the data required to build a SCN restoration model, look at the inherent problems associated with these data, and understand the complexity that arises due to integration of these data.

  16. Development of the US3D Code for Advanced Compressible and Reacting Flow Simulations

    NASA Technical Reports Server (NTRS)

    Candler, Graham V.; Johnson, Heath B.; Nompelis, Ioannis; Subbareddy, Pramod K.; Drayna, Travis W.; Gidzak, Vladimyr; Barnhardt, Michael D.

    2015-01-01

    Aerothermodynamics and hypersonic flows involve complex multi-disciplinary physics, including finite-rate gas-phase kinetics, finite-rate internal energy relaxation, gas-surface interactions with finite-rate oxidation and sublimation, transition to turbulence, large-scale unsteadiness, shock-boundary layer interactions, fluid-structure interactions, and thermal protection system ablation and thermal response. Many of the flows have a large range of length and time scales, requiring large computational grids, implicit time integration, and large solution run times. The University of Minnesota NASA US3D code was designed for the simulation of these complex, highly-coupled flows. It has many of the features of the well-established DPLR code, but uses unstructured grids and has many advanced numerical capabilities and physical models for multi-physics problems. The main capabilities of the code are described, the physical modeling approaches are discussed, the different types of numerical flux functions and time integration approaches are outlined, and the parallelization strategy is described. Comparisons between US3D and the NASA DPLR code are presented, and several advanced simulations illustrate some of the novel features of the code.
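
    As a generic illustration of why implicit time integration matters for stiff reacting-flow problems like these (a sketch only; none of this is US3D code, and the scalar equation is a stand-in for a stiff chemical source term):

```python
# Hypothetical illustration: backward (implicit) Euler for a stiff ODE,
# solving x_{n+1} = x_n + dt * f(x_{n+1}) at each step with Newton's method.
def backward_euler(f, dfdx, x0, dt, steps):
    x = float(x0)
    for _ in range(steps):
        x_new = x
        for _ in range(50):                        # Newton iterations
            r = x_new - x - dt * f(x_new)          # residual of implicit update
            x_new -= r / (1.0 - dt * dfdx(x_new))  # Newton correction
            if abs(r) < 1e-12:
                break
        x = x_new
    return x

# Stiff decay dx/dt = -1000 x, integrated with dt = 0.1 (100x the 1/k scale):
x_end = backward_euler(lambda x: -1000.0 * x, lambda x: -1000.0, 1.0, 0.1, 10)
```

    With the same dt, explicit (forward) Euler on this equation diverges, since the per-step amplification factor |1 - 1000 · 0.1| is far greater than 1; the implicit update stays stable at any step size, which is what makes large time steps usable for stiff chemistry.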

  17. A large-scale clinical validation of an integrated monitoring system in the emergency department.

    PubMed

    Clifton, David A; Wong, David; Clifton, Lei; Wilson, Sarah; Way, Rob; Pullinger, Richard; Tarassenko, Lionel

    2013-07-01

    We consider an integrated patient monitoring system, combining electronic patient records with high-rate acquisition of patient physiological data. There remain many challenges in increasing the robustness of "e-health" applications to a level at which they are clinically useful, particularly in the use of automated algorithms used to detect and cope with artifact in data contained within the electronic patient record, and in analyzing and communicating the resultant data for reporting to clinicians. There is a consequential "plague of pilots," in which engineering prototype systems do not enter into clinical use. This paper describes an approach in which, for the first time, the Emergency Department (ED) of a major research hospital has adopted such systems for use during a large clinical trial. We describe the disadvantages of existing evaluation metrics when applied to such large trials, and propose a solution suitable for large-scale validation. We demonstrate that machine learning technologies embedded within healthcare information systems can provide clinical benefit, with the potential to improve patient outcomes in the busy environment of a major ED and other high-dependence areas of patient care.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques Hugo

    Traditional engineering methods do not make provision for the integration of human considerations, while traditional human factors methods do not scale well to the complexity of large-scale nuclear power plant projects. Although the need for up-to-date human factors engineering processes and tools is recognised widely in industry, so far no formal guidance has been developed. This article proposes such a framework.

  19. Examination of Cross-Scale Coupling During Auroral Events using RENU2 and ISINGLASS Sounding Rocket Data.

    NASA Astrophysics Data System (ADS)

    Kenward, D. R.; Lessard, M.; Lynch, K. A.; Hysell, D. L.; Hampton, D. L.; Michell, R.; Samara, M.; Varney, R. H.; Oksavik, K.; Clausen, L. B. N.; Hecht, J. H.; Clemmons, J. H.; Fritz, B.

    2017-12-01

    The RENU2 sounding rocket (launched from Andoya rocket range on December 13th, 2015) observed Poleward Moving Auroral Forms within the dayside cusp. The ISINGLASS rockets (launched from Poker Flat rocket range on February 22, 2017 and March 2, 2017) both observed aurora during a substorm event. Despite observing very different events, both campaigns witnessed a high degree of small scale structuring within the larger auroral boundary, including Alfvenic signatures. These observations suggest a method of coupling large-scale energy input to fine scale structures within aurorae. During RENU2, small (sub-km) scale drivers persist for long (10s of minutes) time scales and result in large scale ionospheric (thermal electron) and thermospheric response (neutral upwelling). ISINGLASS observations show small scale drivers, but with short (minute) time scales, with ionospheric response characterized by the flight's thermal electron instrument (ERPA). The comparison of the two flights provides an excellent opportunity to examine ionospheric and thermospheric response to small scale drivers over different integration times.

  20. Building a multi-scaled geospatial temporal ecology database from disparate data sources: fostering open science and data reuse.

    PubMed

    Soranno, Patricia A; Bissell, Edward G; Cheruvelil, Kendra S; Christel, Samuel T; Collins, Sarah M; Fergus, C Emi; Filstrup, Christopher T; Lapierre, Jean-Francois; Lottig, Noah R; Oliver, Samantha K; Scott, Caren E; Smith, Nicole J; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A; Gries, Corinna; Henry, Emily N; Skaff, Nick K; Stanley, Emily H; Stow, Craig A; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km²). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated database reproducible and extensible, allowing users to ask new research questions with the existing database or through the addition of new data. The largest challenge of this task was the heterogeneity of the data, formats, and metadata. Many steps of data integration need manual input from experts in diverse fields, requiring close collaboration.
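
    The provenance-documentation step described above can be sketched in a few lines (field and source names here are invented for illustration; this is not the actual LAGOS schema):

```python
# Hypothetical sketch: integrating lake records from disparate sources while
# documenting data provenance, so every value remains traceable to its origin.
def integrate(datasets):
    """Merge per-source lake records, tagging each value with its source."""
    merged = {}
    for source, records in datasets.items():
        for lake_id, fields in records.items():
            entry = merged.setdefault(lake_id, {})
            for name, value in fields.items():
                # keep the value plus its provenance so integration is auditable
                entry.setdefault(name, []).append(
                    {"value": value, "source": source})
    return merged

db = integrate({
    "state_agency": {"L1": {"tp_ugL": 30.0}},
    "university":   {"L1": {"tp_ugL": 28.5, "secchi_m": 2.1}},
})
```

    Keeping conflicting measurements side by side, rather than silently overwriting them, is one simple way to preserve the heterogeneity the authors identify as their largest challenge.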

  1. Building a multi-scaled geospatial temporal ecology database from disparate data sources: Fostering open science through data reuse

    USGS Publications Warehouse

    Soranno, Patricia A.; Bissell, E.G.; Cheruvelil, Kendra S.; Christel, Samuel T.; Collins, Sarah M.; Fergus, C. Emi; Filstrup, Christopher T.; Lapierre, Jean-Francois; Lotting, Noah R.; Oliver, Samantha K.; Scott, Caren E.; Smith, Nicole J.; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A.; Gries, Corinna; Henry, Emily N.; Skaff, Nick K.; Stanley, Emily H.; Stow, Craig A.; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E.

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km²). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated database reproducible and extensible, allowing users to ask new research questions with the existing database or through the addition of new data. The largest challenge of this task was the heterogeneity of the data, formats, and metadata. Many steps of data integration need manual input from experts in diverse fields, requiring close collaboration.

  2. Large Scale Processes and Extreme Floods in Brazil

    NASA Astrophysics Data System (ADS)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In such a reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activities. We investigate for individual sites the exceedance probability in which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs. large-scale).
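
    The dimensionality-reduction step can be sketched as follows. This is illustrative only: the paper uses *supervised* kernel PCA, whereas the plain (unsupervised) variant is shown here, applied to synthetic stand-ins for vertically integrated moisture-flux fields (one row per flood event):

```python
import numpy as np

# Kernel PCA with an RBF kernel, implemented directly with NumPy.
def kernel_pca(X, n_components=2, gamma=None):
    if gamma is None:
        gamma = 1.0 / X.shape[1]                 # common default bandwidth
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    n = len(K)
    J = np.ones((n, n)) / n
    Kc = K - J @ K - K @ J + J @ K @ J           # double-centre the kernel
    w, v = np.linalg.eigh(Kc)                    # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:n_components]     # take the leading components
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))  # event coordinates

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 100))   # 40 events x 100 grid cells (synthetic)
Y = kernel_pca(X)                # low-dimensional space for clustering
```

    The low-dimensional coordinates `Y` would then be clustered (e.g. with k-means) to separate regional-recycling from teleconnected-moisture flood events, as the abstract describes.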

  3. A large-scale circuit mechanism for hierarchical dynamical processing in the primate cortex

    PubMed Central

    Chaudhuri, Rishidev; Knoblauch, Kenneth; Gariel, Marie-Alice; Kennedy, Henry; Wang, Xiao-Jing

    2015-01-01

    We developed a large-scale dynamical model of the macaque neocortex, which is based on recently acquired directed- and weighted-connectivity data from tract-tracing experiments, and which incorporates heterogeneity across areas. A hierarchy of timescales naturally emerges from this system: sensory areas show brief, transient responses to input (appropriate for sensory processing), whereas association areas integrate inputs over time and exhibit persistent activity (suitable for decision-making and working memory). The model displays multiple temporal hierarchies, as evidenced by contrasting responses to visual versus somatosensory stimulation. Moreover, slower prefrontal and temporal areas have a disproportionate impact on global brain dynamics. These findings establish a circuit mechanism for “temporal receptive windows” that are progressively enlarged along the cortical hierarchy, suggest an extension of time integration in decision-making from local to large circuits, and should prompt a re-evaluation of the analysis of functional connectivity (measured by fMRI or EEG/MEG) by taking into account inter-areal heterogeneity. PMID:26439530
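
    A toy version of the paper's core mechanism (parameters invented; the actual model uses the weighted macaque connectome) shows how stronger local recurrent excitation lengthens an area's effective timescale:

```python
import numpy as np

# Linear rate unit: tau * dx/dt = -x + w * x, after an initial pulse x0 = 1.
# The effective decay time is tau / (1 - w), so recurrence w near 1 gives
# slow, persistent activity; weak w gives a brief, transient response.
def decay_trace(w, tau=0.02, dt=0.001, steps=300, x0=1.0):
    x, out = x0, []
    for _ in range(steps):
        x += dt / tau * (-x + w * x)   # no external input after the pulse
        out.append(x)
    return np.array(out)

sensory = decay_trace(w=0.4)    # weak recurrence: fast, transient (sensory-like)
assoc   = decay_trace(w=0.95)   # strong recurrence: slow, persistent (association-like)
```

    With heterogeneous `w` across many coupled areas, a hierarchy of timescales emerges from identical local dynamics, which is the circuit mechanism the abstract describes.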

  4. Random cascade model in the limit of infinite integral scale as the exponential of a nonstationary 1/f noise: Application to volatility fluctuations in stock markets

    NASA Astrophysics Data System (ADS)

    Muzy, Jean-François; Baïle, Rachel; Bacry, Emmanuel

    2013-04-01

    In this paper we propose a new model for volatility fluctuations in financial time series. This model relies on a nonstationary Gaussian process that exhibits aging behavior. It turns out that its properties, over any finite time interval, are very close to those of continuous cascade models. These latter models are indeed well known to reproduce faithfully the main stylized facts of financial time series. However, they involve a large-scale parameter (the so-called “integral scale”, where the cascade is initiated) that is hard to interpret in finance. Moreover, the empirical value of the integral scale is in general strongly correlated with the overall length of the sample. This feature is precisely predicted by our model, which, as illustrated by various examples from daily stock index data, quantitatively reproduces the empirical observations.

  5. Chemical Processing of Electrons and Holes.

    ERIC Educational Resources Information Center

    Anderson, Timothy J.

    1990-01-01

    Presents a synopsis of four lectures given in an elective senior-level electronic material processing course to introduce solid state electronics. Provides comparisons of a large scale chemical processing plant and an integrated circuit. (YP)

  6. Environmental Response Laboratory Network

    EPA Pesticide Factsheets

    The ERLN is a national network of laboratories that can be ramped up as needed to support large-scale environmental responses. It integrates capabilities of existing public- and private-sector labs, providing consistent capacity and quality data.

  7. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1989-01-01

    Deep Space Network advanced systems, very large scale integration architecture for decoders, radar interface and control units, microwave time delays, microwave antenna holography, and a radio frequency interference survey are among the topics discussed.

  8. Large-scale 3D inversion of marine controlled source electromagnetic data using the integral equation method

    NASA Astrophysics Data System (ADS)

    Zhdanov, M. S.; Cuma, M.; Black, N.; Wilson, G. A.

    2009-12-01

    The marine controlled source electromagnetic (MCSEM) method has become widely used in offshore oil and gas exploration. Interpretation of MCSEM data is still a very challenging problem, especially if one would like to take into account the realistic 3D structure of the subsurface. The inversion of MCSEM data is complicated by the fact that the EM response of a hydrocarbon-bearing reservoir is very weak in comparison with the background EM fields generated by an electric dipole transmitter in complex geoelectrical structures formed by a conductive sea-water layer and the terranes beneath it. In this paper, we present a review of the recent developments in the area of large-scale 3D EM forward modeling and inversion. Our approach is based on using a new integral form of Maxwell’s equations allowing for an inhomogeneous background conductivity, which results in a numerically effective integral representation for 3D EM field. This representation provides an efficient tool for the solution of 3D EM inverse problems. To obtain a robust inverse model of the conductivity distribution, we apply regularization based on a focusing stabilizing functional which allows for the recovery of models with both smooth and sharp geoelectrical boundaries. The method is implemented in a fully parallel computer code, which makes it possible to run large-scale 3D inversions on grids with millions of inversion cells. This new technique can be effectively used for active EM detection and monitoring of the subsurface targets.
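
    As a generic stand-in for the inversion step (a sketch only: the paper's scheme uses integral-equation forward modelling and a focusing stabilizing functional, whereas plain Tikhonov regularization on a linearized problem is shown here as the simplest illustration of regularized inversion):

```python
import numpy as np

def regularized_inverse(A, d, alpha):
    """Minimize ||A m - d||^2 + alpha ||m||^2 for the model vector m."""
    n = A.shape[1]
    # Normal equations of the regularized least-squares problem.
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ d)

rng = np.random.default_rng(1)
A = rng.normal(size=(60, 20))            # toy sensitivity (Jacobian) matrix
m_true = np.zeros(20); m_true[5] = 1.0   # localized "reservoir" anomaly
d = A @ m_true + 0.01 * rng.normal(size=60)   # weak signal + noise
m_est = regularized_inverse(A, d, alpha=0.1)
```

    A focusing stabilizer replaces the `||m||^2` penalty with one that favors compact, sharp-boundary models, which is why the paper can recover both smooth and sharp geoelectrical boundaries.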

  9. Fractionally Integrated Flux model and Scaling Laws in Weather and Climate

    NASA Astrophysics Data System (ADS)

    Schertzer, Daniel; Lovejoy, Shaun

    2013-04-01

    The Fractionally Integrated Flux (FIF) model has been extensively used to model intermittent observables, like the velocity field, by defining them with the help of a fractional integration of a conservative (i.e. strictly scale-invariant) flux, such as the turbulent energy flux. It indeed corresponds to a well-defined modelling approach that yields the observed scaling laws. Generalised Scale Invariance (GSI) enables FIF to deal with anisotropic fractional integrations and has been rather successful in defining and modelling a unique regime of scaling anisotropic turbulence up to planetary scales. This turbulence has an effective dimension of 23/9 = 2.55... instead of the classically hypothesised 2D and 3D turbulent regimes, respectively for large and small spatial scales. It therefore theoretically eliminates an implausible "dimension transition" between these two regimes and the resulting requirement of a turbulent energy "mesoscale gap", whose empirical evidence has been brought more and more into question. More recently, GSI-FIF was used to analyse climate, therefore at much larger time scales. Indeed, the 23/9-dimensional regime necessarily breaks up at the outer spatial scales. The corresponding transition range, which can be called "macroweather", seems to have many interesting properties; e.g., it rather corresponds to a fractional differentiation in time with a roughly flat frequency spectrum. Furthermore, this transition yields the possibility of having, at much larger time scales, scaling space-time climate fluctuations with a much stronger scaling anisotropy between time and space. Lovejoy, S. and D. Schertzer (2013). The Weather and Climate: Emergent Laws and Multifractal Cascades. Cambridge Press (in press). Schertzer, D. et al. (1997). Fractals 5(3): 427-471. Schertzer, D. and S. Lovejoy (2011). International Journal of Bifurcation and Chaos 21(12): 3417-3456.

  10. Implementation of the Large-Scale Operations Management Test in the State of Washington.

    DTIC Science & Technology

    1982-12-01

    During FY 79, the U.S. Army Engineer Waterways Experiment Station (WES), Vicksburg, Miss., completed the first phase of its 3-year Large-Scale Operations Management Test (LSOMT). The LSOMT was designed to develop an operational plan to identify methodologies that can be implemented by the U.S. Army Engineer District, Seattle (NPS), to prevent the exotic aquatic macrophyte Eurasian watermilfoil (Myriophyllum spicatum L.) from reaching problem-level proportions in water bodies in the state of Washington. The WES developed specific plans as integral elements

  11. EuroPhenome and EMPReSS: online mouse phenotyping resource

    PubMed Central

    Mallon, Ann-Marie; Hancock, John M.

    2008-01-01

    EuroPhenome (http://www.europhenome.org) and EMPReSS (http://empress.har.mrc.ac.uk/) form an integrated resource to provide access to data and procedures for mouse phenotyping. EMPReSS describes 96 Standard Operating Procedures for mouse phenotyping. EuroPhenome contains data resulting from carrying out EMPReSS protocols on four inbred laboratory mouse strains. As well as web interfaces, both resources support web services to enable integration with other mouse phenotyping and functional genetics resources, and are committed to initiatives to improve integration of mouse phenotype databases. EuroPhenome will be the repository for a recently initiated effort to carry out large-scale phenotyping on a large number of knockout mouse lines (EUMODIC). PMID:17905814

  12. EuroPhenome and EMPReSS: online mouse phenotyping resource.

    PubMed

    Mallon, Ann-Marie; Blake, Andrew; Hancock, John M

    2008-01-01

    EuroPhenome (http://www.europhenome.org) and EMPReSS (http://empress.har.mrc.ac.uk/) form an integrated resource to provide access to data and procedures for mouse phenotyping. EMPReSS describes 96 Standard Operating Procedures for mouse phenotyping. EuroPhenome contains data resulting from carrying out EMPReSS protocols on four inbred laboratory mouse strains. As well as web interfaces, both resources support web services to enable integration with other mouse phenotyping and functional genetics resources, and are committed to initiatives to improve integration of mouse phenotype databases. EuroPhenome will be the repository for a recently initiated effort to carry out large-scale phenotyping on a large number of knockout mouse lines (EUMODIC).

  13. ORNL Pre-test Analyses of A Large-scale Experiment in STYLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Paul T; Yin, Shengjun; Klasky, Hilda B

    Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and life-time management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and performed to evaluate several experiment designs taking into account the capabilities of the test facility while satisfying the test objectives. Then these advanced fracture mechanics models will be utilized to simulate the crack growth in the large scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those results generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite-element solutions and to also assess the level of confidence that can be placed in the best-estimate finite-element solutions.

  14. Analysis of Radar and Optical Space Borne Data for Large Scale Topographical Mapping

    NASA Astrophysics Data System (ADS)

    Tampubolon, W.; Reinhardt, W.

    2015-03-01

    Normally, in order to provide high-resolution 3-dimensional (3D) geospatial data, large scale topographical mapping needs input from conventional airborne campaigns, which in Indonesia are bureaucratically complicated, especially during legal administration procedures, i.e. security clearance from the military/defense ministry. This often causes additional time delays besides technical constraints such as weather and limited aircraft availability for airborne campaigns. Geospatial data quality is, of course, an important issue for many applications. The increasing demand for geospatial data nowadays consequently requires high resolution datasets as well as a sufficient level of accuracy. Therefore an integration of different technologies is required in many cases to gain the expected result, especially in the context of disaster preparedness and emergency response. Another important issue in this context is the fast delivery of relevant data, which is expressed by the term "Rapid Mapping". In this paper we present first results of an on-going research to integrate different data sources like space borne radar and optical platforms. Initially the orthorectification of Very High Resolution Satellite (VHRS) imagery, i.e. SPOT-6, has been done as a continuous process to the DEM generation using TerraSAR-X/TanDEM-X data. The role of Ground Control Points (GCPs) from GNSS surveys is mandatory in order to fulfil geometrical accuracy. In addition, this research aims at providing a suitable processing algorithm of space borne data for large scale topographical mapping as described in section 3.2. Recently, radar space borne data has been used for medium scale topographical mapping, e.g. for the 1:50.000 map scale in Indonesian territories. The goal of this on-going research is to increase the accuracy of remote sensing data by different activities, e.g. the integration of different data sources (optical and radar) or the usage of the GCPs in both the optical and the radar satellite data processing. Finally, these results will be used in the future as a reference for further geospatial data acquisitions to support topographical mapping in even larger scales, up to the 1:10.000 map scale.

  15. Avionic Data Bus Integration Technology

    DTIC Science & Technology

    1991-12-01

    Addresses the hardware-software interaction between a digital data bus and an avionic system, including Very Large Scale Integration (VLSI) ICs and multiversion programming. In 1984, the Sperry Corporation developed a fault-tolerant system which employed multiversion programming, voting, and monitoring for error… N-version programming: the independent coding of a number, N, of redundant computer programs that…

  16. Global Economic Integration and Local Community Resilience: Road Paving and Rural Demographic Change in the Southwestern Amazon

    ERIC Educational Resources Information Center

    Perz, Stephen G.; Cabrera, Liliana; Carvalho, Lucas Araujo; Castillo, Jorge; Barnes, Grenville

    2010-01-01

    Recent years have witnessed an expansion in international investment in large-scale infrastructure projects with the goal of achieving global economic integration. We focus on one such project, the Inter-Oceanic Highway in the "MAP" region, a trinational frontier where Bolivia, Brazil, and Peru meet in the southwestern Amazon. We adopt a…

  17. The Emerging Role of the Data Base Manager. Report No. R-1253-PR.

    ERIC Educational Resources Information Center

    Sawtelle, Thomas K.

    The Air Force Logistics Command (AFLC) is revising and enhancing its data-processing capabilities with the development of a large-scale, multi-site, on-line, integrated data base information system known as the Advanced Logistics System (ALS). A data integrity program is to be built around a Data Base Manager (DBM), an individual or a group of…

  18. Defense Acquisitions Acronyms and Terms

    DTIC Science & Technology

    2012-12-01

    Computer-Aided Design CADD Computer-Aided Design and Drafting CAE Component Acquisition Executive; Computer-Aided Engineering CAIV Cost As an...Radiation to Ordnance HFE Human Factors Engineering HHA Health Hazard Assessment HNA Host-Nation Approval HNS Host-Nation Support HOL High -Order...Engineering Change Proposal VHSIC Very High Speed Integrated Circuit VLSI Very Large Scale Integration VOC Volatile Organic Compound W WAN Wide

  19. eScience for molecular-scale simulations and the eMinerals project.

    PubMed

    Salje, E K H; Artacho, E; Austen, K F; Bruin, R P; Calleja, M; Chappell, H F; Chiang, G-T; Dove, M T; Frame, I; Goodwin, A L; Kleese van Dam, K; Marmier, A; Parker, S C; Pruneda, J M; Todorov, I T; Trachenko, K; Tyer, R P; Walker, A M; White, T O H

    2009-03-13

We review the work carried out within the eMinerals project to develop eScience solutions that facilitate a new generation of molecular-scale simulation work. Technological developments include integration of compute and data systems, development of collaborative frameworks, and new researcher-friendly tools for grid job submission, XML data representation, information delivery, metadata harvesting and metadata management. A number of diverse science applications illustrate how these tools are being used for large parameter-sweep studies, an emerging type of study for which the integration of computing, data and collaboration is essential.

  20. Atypical language laterality is associated with large-scale disruption of network integration in children with intractable focal epilepsy.

    PubMed

    Ibrahim, George M; Morgan, Benjamin R; Doesburg, Sam M; Taylor, Margot J; Pang, Elizabeth W; Donner, Elizabeth; Go, Cristina Y; Rutka, James T; Snead, O Carter

    2015-04-01

Epilepsy is associated with disruption of integration in distributed networks, together with altered localization for functions such as expressive language. The relation between atypical network connectivity and altered localization is unknown. In the current study we tested whether atypical expressive language laterality was associated with the alteration of large-scale network integration in children with medically-intractable localization-related epilepsy (LRE). Twenty-three right-handed children (age range 8-17) with medically-intractable LRE performed a verb generation task in fMRI. Language network activation was identified and the laterality index (LI) was calculated within the pars triangularis and pars opercularis. Resting-state data from the same cohort were subjected to independent component analysis. Dual regression was used to identify associations between resting-state integration and LI values. Higher positive values of the LI, indicating typical language localization, were associated with stronger functional integration of various networks including the default mode network (DMN). The normally symmetric resting-state networks showed a pattern of lateralized connectivity mirroring that of language function. The association between atypical language localization and network integration implies a widespread disruption of neural network development. These findings may inform the interpretation of localization studies by providing novel insights into reorganization of neural networks in epilepsy. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Data Portal for the Library of Integrated Network-based Cellular Signatures (LINCS) program: integrated access to diverse large-scale cellular perturbation response data

    PubMed Central

    Koleti, Amar; Terryn, Raymond; Stathias, Vasileios; Chung, Caty; Cooper, Daniel J; Turner, John P; Vidović, Dušica; Forlin, Michele; Kelley, Tanya T; D’Urso, Alessandro; Allen, Bryce K; Torre, Denis; Jagodnik, Kathleen M; Wang, Lily; Jenkins, Sherry L; Mader, Christopher; Niu, Wen; Fazel, Mehdi; Mahi, Naim; Pilarczyk, Marcin; Clark, Nicholas; Shamsaei, Behrouz; Meller, Jarek; Vasiliauskas, Juozas; Reichard, John; Medvedovic, Mario; Ma’ayan, Avi; Pillai, Ajay

    2018-01-01

The Library of Integrated Network-based Cellular Signatures (LINCS) program is a national consortium funded by the NIH to generate a diverse and extensive reference library of cell-based perturbation-response signatures, along with novel data analytics tools to improve our understanding of human diseases at the systems level. In contrast to other large-scale data generation efforts, LINCS Data and Signature Generation Centers (DSGCs) employ a wide range of assay technologies cataloging diverse cellular responses. Integration of, and unified access to, LINCS data has therefore been particularly challenging. The Big Data to Knowledge (BD2K) LINCS Data Coordination and Integration Center (DCIC) has developed data standards specifications, data processing pipelines, and a suite of end-user software tools to integrate and annotate LINCS-generated data, to make LINCS signatures searchable and usable for different types of users. Here, we describe the LINCS Data Portal (LDP) (http://lincsportal.ccs.miami.edu/), a unified web interface to access datasets generated by the LINCS DSGCs, and its underlying database, LINCS Data Registry (LDR). LINCS data served on the LDP contains extensive metadata and curated annotations. We highlight the features of the LDP user interface, which is designed to enable search, browsing, exploration, download and analysis of LINCS data and related curated content. PMID:29140462

  2. Layer-by-layer assembly of two-dimensional materials into wafer-scale heterostructures

    NASA Astrophysics Data System (ADS)

    Kang, Kibum; Lee, Kan-Heng; Han, Yimo; Gao, Hui; Xie, Saien; Muller, David A.; Park, Jiwoong

    2017-10-01

    High-performance semiconductor films with vertical compositions that are designed to atomic-scale precision provide the foundation for modern integrated circuitry and novel materials discovery. One approach to realizing such films is sequential layer-by-layer assembly, whereby atomically thin two-dimensional building blocks are vertically stacked, and held together by van der Waals interactions. With this approach, graphene and transition-metal dichalcogenides--which represent one- and three-atom-thick two-dimensional building blocks, respectively--have been used to realize previously inaccessible heterostructures with interesting physical properties. However, no large-scale assembly method exists at present that maintains the intrinsic properties of these two-dimensional building blocks while producing pristine interlayer interfaces, thus limiting the layer-by-layer assembly method to small-scale proof-of-concept demonstrations. Here we report the generation of wafer-scale semiconductor films with a very high level of spatial uniformity and pristine interfaces. The vertical composition and properties of these films are designed at the atomic scale using layer-by-layer assembly of two-dimensional building blocks under vacuum. We fabricate several large-scale, high-quality heterostructure films and devices, including superlattice films with vertical compositions designed layer-by-layer, batch-fabricated tunnel device arrays with resistances that can be tuned over four orders of magnitude, band-engineered heterostructure tunnel diodes, and millimetre-scale ultrathin membranes and windows. The stacked films are detachable, suspendable and compatible with water or plastic surfaces, which will enable their integration with advanced optical and mechanical systems.

  3. Layer-by-layer assembly of two-dimensional materials into wafer-scale heterostructures.

    PubMed

    Kang, Kibum; Lee, Kan-Heng; Han, Yimo; Gao, Hui; Xie, Saien; Muller, David A; Park, Jiwoong

    2017-10-12

    High-performance semiconductor films with vertical compositions that are designed to atomic-scale precision provide the foundation for modern integrated circuitry and novel materials discovery. One approach to realizing such films is sequential layer-by-layer assembly, whereby atomically thin two-dimensional building blocks are vertically stacked, and held together by van der Waals interactions. With this approach, graphene and transition-metal dichalcogenides-which represent one- and three-atom-thick two-dimensional building blocks, respectively-have been used to realize previously inaccessible heterostructures with interesting physical properties. However, no large-scale assembly method exists at present that maintains the intrinsic properties of these two-dimensional building blocks while producing pristine interlayer interfaces, thus limiting the layer-by-layer assembly method to small-scale proof-of-concept demonstrations. Here we report the generation of wafer-scale semiconductor films with a very high level of spatial uniformity and pristine interfaces. The vertical composition and properties of these films are designed at the atomic scale using layer-by-layer assembly of two-dimensional building blocks under vacuum. We fabricate several large-scale, high-quality heterostructure films and devices, including superlattice films with vertical compositions designed layer-by-layer, batch-fabricated tunnel device arrays with resistances that can be tuned over four orders of magnitude, band-engineered heterostructure tunnel diodes, and millimetre-scale ultrathin membranes and windows. The stacked films are detachable, suspendable and compatible with water or plastic surfaces, which will enable their integration with advanced optical and mechanical systems.

  4. Large Eddy Simulation of a Turbulent Jet

    NASA Technical Reports Server (NTRS)

    Webb, A. T.; Mansour, Nagi N.

    2001-01-01

Here we present the results of a Large Eddy Simulation of a non-buoyant jet issuing from a circular orifice in a wall, and developing in neutral surroundings. The effects of the subgrid scales on the large eddies have been modeled with the dynamic large eddy simulation model applied to the fully 3D domain in spherical coordinates. The simulation captures the unsteady motions of the large scales within the jet as well as the laminar motions in the entrainment region surrounding the jet. The computed time-averaged statistics (mean velocity, concentration, and turbulence parameters) compare well with laboratory data without invoking an empirical entrainment coefficient as employed by line integral models. The use of the large eddy simulation technique allows examination of unsteady and inhomogeneous features such as the evolution of eddies and the details of the entrainment process.

  5. A Discrete Constraint for Entropy Conservation and Sound Waves in Cloud-Resolving Modeling

    NASA Technical Reports Server (NTRS)

    Zeng, Xi-Ping; Tao, Wei-Kuo; Simpson, Joanne

    2003-01-01

Ideal cloud-resolving models accumulate little error. When their domain is large enough to accommodate synoptic large-scale circulations, they can be used to simulate the interaction between convective clouds and those circulations. This paper sets up a framework for such models, using moist entropy as a prognostic variable and employing conservative numerical schemes. The models accumulate no errors in thermodynamic variables when they comply with a discrete constraint on entropy conservation and sound waves. In other words, the discrete constraint is related to the correct representation of the large-scale convergence and advection of moist entropy. Since air density is involved in entropy conservation and sound waves, the challenge is how to compute sound waves efficiently under the constraint. To address this challenge, a compensation method is introduced on the basis of a reference isothermal atmosphere whose governing equations are solved analytically. Stability analysis and numerical experiments show that the method allows the models to integrate efficiently with a large time step.
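
    The conservation property at the heart of the abstract can be illustrated with the simplest conservative scheme: 1-D flux-form advection on a periodic grid. This is a toy sketch of the general idea (not the paper's moist-entropy scheme): because every flux leaves one cell and enters its neighbour, the domain integral of the transported scalar is preserved to round-off no matter how many steps are taken.

```python
import numpy as np

# 1-D periodic advection of a scalar q at constant speed u > 0, written in
# flux form: dq/dt = -(F[i+1/2] - F[i-1/2]) / dx, with upwind fluxes.
# The flux differences telescope over the periodic domain, so sum(q) is
# conserved exactly in exact arithmetic; only round-off remains.

def step_flux_form(q, u, dx, dt):
    F = u * q                              # upwind flux (u > 0)
    return q - dt / dx * (F - np.roll(F, 1))

n = 200
dx, u, dt = 1.0 / n, 1.0, 0.002            # CFL = u*dt/dx = 0.4, stable
x = np.arange(n) * dx
q = np.exp(-((x - 0.5) / 0.1) ** 2)        # initial blob

total0 = q.sum() * dx
for _ in range(5000):                      # 10 advection periods
    q = step_flux_form(q, u, dx, dt)

print(abs(q.sum() * dx - total0))          # round-off level: no accumulative error
```

The same telescoping argument is what a discretely entropy-conserving cloud model relies on, just applied to moist entropy and coupled to the acoustic terms.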

  6. WarpIV: In situ visualization and analysis of ion accelerator simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubel, Oliver; Loring, Burlen; Vay, Jean-Luc

    The generation of short pulses of ion beams through the interaction of an intense laser with a plasma sheath offers the possibility of compact and cheaper ion sources for many applications--from fast ignition and radiography of dense targets to hadron therapy and injection into conventional accelerators. To enable the efficient analysis of large-scale, high-fidelity particle accelerator simulations using the Warp simulation suite, the authors introduce the Warp In situ Visualization Toolkit (WarpIV). WarpIV integrates state-of-the-art in situ visualization and analysis using VisIt with Warp, supports management and control of complex in situ visualization and analysis workflows, and implements integrated analytics to facilitate query- and feature-based data analytics and efficient large-scale data analysis. WarpIV enables for the first time distributed parallel, in situ visualization of the full simulation data using high-performance compute resources as the data is being generated by Warp. The authors describe the application of WarpIV to study and compare large 2D and 3D ion accelerator simulations, demonstrating significant differences in the acceleration process in 2D and 3D simulations. WarpIV is available to the public via https://bitbucket.org/berkeleylab/warpiv. Supplemental material at https://extras.computer.org/extra/mcg2016030022s1.pdf provides more details regarding the memory profiling and optimization and the Yee grid recentering optimization results discussed in the main article.

  7. WarpIV: In situ visualization and analysis of ion accelerator simulations

    DOE PAGES

    Rubel, Oliver; Loring, Burlen; Vay, Jean-Luc; ...

    2016-05-09

    The generation of short pulses of ion beams through the interaction of an intense laser with a plasma sheath offers the possibility of compact and cheaper ion sources for many applications--from fast ignition and radiography of dense targets to hadron therapy and injection into conventional accelerators. To enable the efficient analysis of large-scale, high-fidelity particle accelerator simulations using the Warp simulation suite, the authors introduce the Warp In situ Visualization Toolkit (WarpIV). WarpIV integrates state-of-the-art in situ visualization and analysis using VisIt with Warp, supports management and control of complex in situ visualization and analysis workflows, and implements integrated analytics to facilitate query- and feature-based data analytics and efficient large-scale data analysis. WarpIV enables for the first time distributed parallel, in situ visualization of the full simulation data using high-performance compute resources as the data is being generated by Warp. The authors describe the application of WarpIV to study and compare large 2D and 3D ion accelerator simulations, demonstrating significant differences in the acceleration process in 2D and 3D simulations. WarpIV is available to the public via https://bitbucket.org/berkeleylab/warpiv. Supplemental material at https://extras.computer.org/extra/mcg2016030022s1.pdf provides more details regarding the memory profiling and optimization and the Yee grid recentering optimization results discussed in the main article.

  8. Cultivating an Ethic of Environmental Sustainability: Integrating Insights from Aristotelian Virtue Ethics and Pragmatist Cognitive Development Theory

    ERIC Educational Resources Information Center

    York, Travis; Becker, Christian

    2012-01-01

    Despite increased attention for environmental sustainability programming, large-scale adoption of pro-environmental behaviors has been slow and largely short-term. This article analyzes the crucial role of ethics in this respect. The authors utilize an interdisciplinary approach drawing on virtue ethics and cognitive development theory to…

  9. Mobility Data Analytics Center.

    DOT National Transportation Integrated Search

    2016-01-01

    Mobility Data Analytics Center aims at building a centralized data engine to efficiently manipulate large-scale data for smart decision making. Integrating and learning from the massive data are the key to the data engine. The ultimate goal of underst...

  10. Fan-out Estimation in Spin-based Quantum Computer Scale-up.

    PubMed

    Nguyen, Thien; Hill, Charles D; Hollenberg, Lloyd C L; James, Matthew R

    2017-10-17

    Solid-state spin-based qubits offer good prospects for scaling based on their long coherence times and nexus to large-scale electronic scale-up technologies. However, high-threshold quantum error correction requires a two-dimensional qubit array operating in parallel, posing significant challenges in fabrication and control. While architectures incorporating distributed quantum control meet this challenge head-on, most designs rely on individual control and readout of all qubits with high gate densities. We analysed the fan-out routing overhead of a dedicated control line architecture, basing the analysis on a generalised solid-state spin qubit platform parameterised to encompass Coulomb-confined (e.g. donor-based spin qubits) or electrostatically confined (e.g. quantum-dot-based spin qubits) implementations. The spatial scalability under this model is estimated using standard electronic routing methods and present-day fabrication constraints. Based on reasonable assumptions for qubit control and readout, we estimate that 10^2-10^5 physical qubits, depending on the quantum interconnect implementation, can be integrated and fanned out independently. Assuming relatively long control-free interconnects, the scalability can be extended. Ultimately, universal quantum computation may necessitate a much higher number of integrated qubits, indicating that higher-dimensional electronics fabrication and/or multiplexed distributed control and readout schemes may be the preferred strategy for large-scale implementation.
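
    The flavour of such a fan-out count can be conveyed with a back-of-envelope model (our own toy, not the authors' analysis): qubits on an N x N grid, one dedicated control line per qubit, every line escaping through perimeter wiring tracks replicated over a fixed number of routing layers.

```python
# Toy fan-out estimate (assumed model, not the paper's): an N x N qubit
# array with pitch qubit_pitch_nm; each qubit needs one dedicated control
# line routed to the array edge; lines can only cross the boundary where a
# wiring track exists, so edge capacity is
#   4 * N * (qubit_pitch / wire_pitch) * layers   tracks.

def max_qubits(qubit_pitch_nm: float, wire_pitch_nm: float, layers: int) -> int:
    """Largest N*N array whose N**2 escape lines fit the edge capacity.

    N**2 <= 4 * N * (qubit_pitch / wire_pitch) * layers
    gives N <= 4 * layers * qubit_pitch / wire_pitch.
    """
    n = int(4 * layers * qubit_pitch_nm / wire_pitch_nm)
    return n * n

# e.g. 100 nm qubit pitch, 50 nm wiring pitch, 4 routing layers:
print(max_qubits(100, 50, 4))   # -> 1024, i.e. order 10^3 qubits
```

Varying the wiring pitch and layer count moves this toy answer across several orders of magnitude, which is consistent in spirit with the 10^2-10^5 range the authors quote for different interconnect implementations.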

  11. Aquatic ecosystem protection and restoration: Advances in methods for assessment and evaluation

    USGS Publications Warehouse

    Bain, M.B.; Harig, A.L.; Loucks, D.P.; Goforth, R.R.; Mills, K.E.

    2000-01-01

    Many methods and criteria are available to assess aquatic ecosystems, and this review focuses on a set that demonstrates advancements from community analyses to methods spanning large spatial and temporal scales. Basic methods have been extended by incorporating taxa sensitivity to different forms of stress, adding measures linked to system function, synthesizing multiple faunal groups, integrating biological and physical attributes, spanning large spatial scales, and enabling simulations through time. These tools can be customized to meet the needs of a particular assessment and ecosystem. Two case studies are presented to show how new methods were applied at the ecosystem scale for achieving practical management goals. One case used an assessment of biotic structure to demonstrate how enhanced river flows can improve habitat conditions and restore a diverse fish fauna reflective of a healthy riverine ecosystem. In the second case, multitaxonomic integrity indicators were successful in distinguishing lake ecosystems that were disturbed, healthy, and in the process of restoration. Most methods strive to address the concept of biological integrity, and assessment effectiveness often can be impeded by the lack of more specific ecosystem management objectives. Scientific and policy explorations are needed to define new ways for designating a healthy system so as to allow specification of precise quality criteria that will promote further development of ecosystem analysis tools.

  12. Watershed Dynamics, with focus on connectivity index and management of water related impacts on road infrastructure

    NASA Astrophysics Data System (ADS)

    Kalantari, Z.

    2015-12-01

    In Sweden, spatially explicit approaches have been applied in various disciplines, such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale, whereas just a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. This study was built on a conceptual framework for looking at the SedInConnect model, topography, land use, soil data and other PCDs and climate change in an integrated way, to pave the way for more integrated policy making. The aim of the present study was to develop a methodology for predicting the spatial probability of flooding on a general large scale. This framework can provide a region with an effective tool to inform a broad range of watershed planning activities within a region. Regional planners, decision-makers and others can utilize this tool to identify the most vulnerable points in a watershed and along roads to plan for interventions and actions to mitigate impacts of high flows and other extreme weather events on road construction. The application of the model over a large scale can give a realistic spatial characterization of sediment connectivity for the optimal management of debris flow to road structures. The ability of the model to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to extract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.

  13. Gravity at the horizon: on relativistic effects, CMB-LSS correlations and ultra-large scales in Horndeski's theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renk, Janina; Zumalacárregui, Miguel; Montanari, Francesco, E-mail: renk@thphys.uni-heidelberg.de, E-mail: miguel.zumalacarregui@nordita.org, E-mail: francesco.montanari@helsinki.fi

    2016-07-01

    We address the impact of consistent modifications of gravity on the largest observable scales, focusing on relativistic effects in galaxy number counts and the cross-correlation between the matter large scale structure (LSS) distribution and the cosmic microwave background (CMB). Our analysis applies to a very broad class of general scalar-tensor theories encoded in the Horndeski Lagrangian and is fully consistent on linear scales, retaining the full dynamics of the scalar field and not assuming quasi-static evolution. As particular examples we consider self-accelerating Covariant Galileons, Brans-Dicke theory and parameterizations based on the effective field theory of dark energy, using the hi_class code to address the impact of these models on relativistic corrections to LSS observables. We find that especially effects which involve integrals along the line of sight (lensing convergence, time delay and the integrated Sachs-Wolfe effect, ISW) can be considerably modified, and even lead to O(1000%) deviations from General Relativity in the case of the ISW effect for Galileon models, for which standard probes such as the growth function only vary by O(10%). These effects become dominant when correlating galaxy number counts at different redshifts and can lead to ∼ 50% deviations in the total signal that might be observable by future LSS surveys. Because of their integrated nature, these deep-redshift cross-correlations are sensitive to modifications of gravity even when probing eras much before dark energy domination. We further isolate the ISW effect using the cross-correlation between LSS and CMB temperature anisotropies and use current data to further constrain Horndeski models. Forthcoming large-volume galaxy surveys using multiple tracers will search for all these effects, opening a new window to probe gravity and cosmic acceleration at the largest scales available in our universe.
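
    For orientation, the ISW contribution discussed above is, in linear theory, the line-of-sight integral of the time variation of the metric potentials (the standard expression, not specific to hi_class or to any particular model in the paper):

```latex
\left(\frac{\Delta T}{T}\right)_{\mathrm{ISW}}(\hat{n})
  \;=\; \int_{\eta_{*}}^{\eta_{0}} \mathrm{d}\eta \,
        e^{-\tau(\eta)}\,
        \bigl[\dot{\Phi} + \dot{\Psi}\bigr]\bigl(\eta,\,(\eta_{0}-\eta)\,\hat{n}\bigr)
```

    Here dots denote conformal-time derivatives, τ is the optical depth, and η runs from recombination (η_*) to today (η_0). In General Relativity with negligible anisotropic stress, Φ = Ψ and both are nearly constant during matter domination, so modified-gravity models that make the potentials evolve, as the Galileon examples do, feed directly into this integral.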

  14. A fast low-power optical memory based on coupled micro-ring lasers

    NASA Astrophysics Data System (ADS)

    Hill, Martin T.; Dorren, Harmen J. S.; de Vries, Tjibbe; Leijtens, Xaveer J. M.; den Besten, Jan Hendrik; Smalbrugge, Barry; Oei, Yok-Siang; Binsma, Hans; Khoe, Giok-Djan; Smit, Meint K.

    2004-11-01

    The increasing speed of fibre-optic-based telecommunications has focused attention on high-speed optical processing of digital information. Complex optical processing requires a high-density, high-speed, low-power optical memory that can be integrated with planar semiconductor technology for buffering of decisions and telecommunication data. Recently, ring lasers with extremely small size and low operating power have been made, and we demonstrate here a memory element constructed by interconnecting these microscopic lasers. Our device occupies an area of 18 × 40 µm² on an InP/InGaAsP photonic integrated circuit, and switches within 20 ps with 5.5 fJ optical switching energy. Simulations show that the element has the potential for much smaller dimensions and switching times. Large numbers of such memory elements can be densely integrated and interconnected on a photonic integrated circuit: fast digital optical information processing systems employing large-scale integration should now be viable.
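
    The winner-take-all bistability that lets a pair of coupled lasers store a bit can be caricatured with two-mode rate equations sharing a saturable gain (illustrative only; the model and all parameter values here are our assumptions, not the device physics from the paper):

```python
import numpy as np

# Two modes with intensities I[0], I[1] compete for the same gain. With
# cross-saturation stronger than self-saturation (a > 1), "mode 0 lasing"
# and "mode 1 lasing" are both stable: the lasing mode suppresses the
# other. An injected pulse into the dark mode flips the stored bit.

def relax(I, g=2.0, a=2.0, dt=0.01, steps=20000):
    I = np.asarray(I, dtype=float).copy()
    for _ in range(steps):
        sat = 1.0 + I + a * I[::-1]        # self- plus cross-saturation
        I += dt * (g / sat - 1.0) * I      # saturable gain minus unit loss
    return I

state = relax([0.1, 1e-4])   # seed mode 0: it wins the gain competition
print(state)                 # mode 0 near g - 1 = 1, mode 1 near zero

state[1] += 2.0              # optical "set" pulse injected into mode 1
state = relax(state)
print(state)                 # bit flipped: mode 1 lasing, mode 0 quenched
```

The design point is the same as in the paper's coupled micro-rings: the memory state is which mode lases, and switching only requires enough injected light to tip the competition.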

  15. An Integrated Assessment of Location-Dependent Scaling for Microalgae Biofuel Production Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Andre M.; Abodeely, Jared; Skaggs, Richard

    Successful development of a large-scale microalgae-based biofuels industry requires comprehensive analysis and understanding of the feedstock supply chain—from facility siting/design through processing/upgrading of the feedstock to a fuel product. The evolution from pilot-scale production facilities to energy-scale operations presents many multi-disciplinary challenges, including a sustainable supply of water and nutrients, operational and infrastructure logistics, and economic competitiveness with petroleum-based fuels. These challenges are addressed in part by applying the Integrated Assessment Framework (IAF)—an integrated multi-scale modeling, analysis, and data management suite—to address key issues in developing and operating an open-pond facility by analyzing how variability and uncertainty in space and time affect algal feedstock production rates, and determining the site-specific “optimum” facility scale to minimize capital and operational expenses. This approach explicitly and systematically assesses the interdependence of biofuel production potential, associated resource requirements, and production system design trade-offs. The IAF was applied to a set of sites previously identified as having the potential to cumulatively produce 5 billion gallons/year in the southeastern U.S., and results indicate costs can be reduced by selecting the most effective processing technology pathway and scaling downstream processing capabilities to fit site-specific growing conditions, available resources, and algal strains.

  16. Structured approaches to large-scale systems: Variational integrators for interconnected Lagrange-Dirac systems and structured model reduction on Lie groups

    NASA Astrophysics Data System (ADS)

    Parks, Helen Frances

    This dissertation presents two projects related to the structured integration of large-scale mechanical systems. Structured integration uses the considerable differential geometric structure inherent in mechanical motion to inform the design of numerical integration schemes. This process improves the qualitative properties of simulations and becomes especially valuable as a measure of accuracy over long time simulations in which traditional Gronwall accuracy estimates lose their meaning. Often, structured integration schemes replicate continuous symmetries and their associated conservation laws at the discrete level. Such is the case for variational integrators, which discretely replicate the process of deriving equations of motion from variational principles. This results in the conservation of momenta associated with symmetries in the discrete system and conservation of a symplectic form when applicable. In the case of Lagrange-Dirac systems, variational integrators preserve a discrete analogue of the Dirac structure preserved in the continuous flow. In the first project of this thesis, we extend Dirac variational integrators to accommodate interconnected systems. We hope this work will find use in the fields of control, where a controlled system can be thought of as a "plant" system joined to its controller, and in the modeling of very large systems, where modular modeling may prove easier than monolithically modeling the entire system. The second project of the thesis considers a different approach to large systems. Given a detailed model of the full system, can we reduce it to a more computationally efficient model without losing essential geometric structures in the system? Asked without reference to structure, this is the essential question of the field of model reduction. The answer there has been a resounding yes, with snapshot-based Proper Orthogonal Decomposition (POD) rising as one of the most successful methods.
Our project builds on previous work to extend POD to structured settings. In particular, we consider systems evolving on Lie groups and make use of canonical coordinates in the reduction process. We see considerable improvement in the accuracy of the reduced model over the usual structure-agnostic POD approach.

  17. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    NASA Astrophysics Data System (ADS)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global changes, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability or provide future scenarios of water resources. To better understand hydrological changes, it is crucial to determine how and to what extent trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP), in order to gain additional insights into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) definition of those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant according to time-scale (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step.
This approach basically consisted of: (1) decomposing both signals (SLP field and precipitation or streamflow) using discrete wavelet multiresolution analysis and synthesis; (2) generating one statistical downscaling model per time-scale; and (3) summing up all scale-dependent models in order to obtain a final reconstruction of the predictand. The results obtained revealed a significant improvement of the reconstructions for both precipitation and streamflow when using the multiresolution ESD model instead of basic ESD; in addition, the scale-dependent spatial patterns associated with the model matched quite well those obtained from scale-dependent composite analysis. In particular, the multiresolution ESD model handled very well the significant changes in variance through time observed in either precipitation or streamflow. For instance, the post-1980 period, which had been characterized by particularly high amplitudes in interannual-to-interdecadal variability associated with flood and extremely low-flow/drought periods (e.g., winter 2001, summer 2003), could not be reconstructed without integrating wavelet multiresolution analysis into the model. Further investigations would be required to address the issue of the stationarity of the large-scale/local-scale relationships and to test the capability of the multiresolution ESD model for interannual-to-interdecadal forecasting. In terms of methodological approach, further investigations may concern a fully comprehensive sensitivity analysis of the modeling to the parameters of the multiresolution approach (different families of scaling and wavelet functions used, number of coefficients/degree of smoothness, etc.).
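    The three-step scheme above can be sketched in a few lines of code. The sketch below is a simplified stand-in, not the authors' implementation: it uses a moving-average additive decomposition in place of a true discrete wavelet multiresolution analysis, and a per-scale least-squares fit in place of the paper's downscaling models; the function names and the choice of three levels are illustrative assumptions.

```python
import numpy as np

def multires_decompose(x, levels=3):
    """Additive multiresolution: split x into one detail component per
    time-scale plus a final smooth, such that the parts sum back to x.
    (A simple moving-average stand-in for a discrete wavelet MRA.)"""
    comps, smooth = [], x.astype(float)
    for j in range(1, levels + 1):
        w = 2 ** j  # widening window = coarser time-scale
        kernel = np.ones(w) / w
        smoother = np.convolve(smooth, kernel, mode="same")
        comps.append(smooth - smoother)  # detail at scale j
        smooth = smoother
    comps.append(smooth)  # residual trend
    return comps

def multires_esd(predictor, predictand, levels=3):
    """Fit one least-squares model per time-scale and sum the
    scale-wise reconstructions (steps 1-3 of the abstract)."""
    px = multires_decompose(predictor, levels)
    py = multires_decompose(predictand, levels)
    recon = np.zeros_like(predictand, dtype=float)
    for cx, cy in zip(px, py):
        a, b = np.polyfit(cx, cy, 1)  # one model per scale
        recon += a * cx + b
    return recon
```

    Because both the decomposition and the per-scale fits are linear, the scale-wise reconstructions sum to a single predictand estimate, mirroring step (3).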

  18. Large scale structure formation of the normal branch in the DGP brane world model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Yong-Seon

    2008-06-15

    In this paper, we study the large scale structure formation of the normal branch in the DGP model (Dvali, Gabadadze, and Porrati brane world model) by applying the scaling method developed by Sawicki, Song, and Hu for solving the coupled on-brane and off-brane perturbed equations of motion. There is a detectable departure of the perturbed gravitational potential from the cold dark matter model with vacuum energy, even at the minimal deviation of the effective equation of state w{sub eff} below -1. The modified perturbed gravitational potential weakens the integrated Sachs-Wolfe effect, which is strengthened in the self-accelerating branch DGP model. Additionally, we discuss the validity of the scaling solution in the de Sitter limit at late times.

  19. A multidisciplinary approach to the development of low-cost high-performance lightwave networks

    NASA Technical Reports Server (NTRS)

    Maitan, Jacek; Harwit, Alex

    1991-01-01

    Our research focuses on high-speed distributed systems. We anticipate that our results will allow the fabrication of low-cost networks employing multi-gigabit-per-second data links for space and military applications. The recent development of high-speed low-cost photonic components and new generations of microprocessors creates an opportunity to develop advanced large-scale distributed information systems. These systems currently involve hundreds of thousands of nodes and are made up of components and communications links that may fail during operation. In order to realize these systems, research is needed into technologies that foster adaptability and scalability. Self-organizing mechanisms are needed to integrate a working fabric of large-scale distributed systems. The challenge is to fuse theory, technology, and development methodologies to construct a cost-effective, efficient, large-scale system.

  20. Challenges and Opportunities: One Stop Processing of Automatic Large-Scale Base Map Production Using Airborne LIDAR Data Within GIS Environment. Case Study: Makassar City, Indonesia

    NASA Astrophysics Data System (ADS)

    Widyaningrum, E.; Gorte, B. G. H.

    2017-05-01

    LiDAR data acquisition is recognized as one of the fastest solutions for providing base data for large-scale topographical base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme for accelerating large-scale topographic base map provision by the Geospatial Information Agency in Indonesia. As a progressive advanced technology, a Geographic Information System (GIS) opens possibilities for automatic processing and analysis of geospatial data. Considering further needs for spatial data sharing and integration, one-stop processing of LiDAR data in a GIS environment is considered a powerful and efficient approach for base map provision. The quality of the automated topographic base map is assessed and analysed based on its completeness, correctness, quality, and the confusion matrix.

  1. A Fine-Grained Pipelined Implementation for Large-Scale Matrix Inversion on FPGA

    NASA Astrophysics Data System (ADS)

    Zhou, Jie; Dou, Yong; Zhao, Jianxun; Xia, Fei; Lei, Yuanwu; Tang, Yuxing

    Large-scale matrix inversion plays an important role in many applications; however, to the best of our knowledge, there has been no FPGA-based implementation. In this paper, we explore the possibility of accelerating large-scale matrix inversion on FPGA. To exploit the computational potential of FPGA, we introduce a fine-grained parallel algorithm for matrix inversion. A scalable linear array of processing elements (PEs), which is the core component of the FPGA accelerator, is proposed to implement this algorithm. A total of 12 PEs can be integrated into an Altera StratixII EP2S130F1020C5 FPGA on our self-designed board. Experimental results show that a speedup factor of 2.6 and a maximum power-performance of 41 can be achieved compared to a Pentium Dual CPU with double SSE threads.
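    The abstract does not spell out the inversion algorithm; one common choice that maps naturally onto a linear array of PEs is Gauss-Jordan elimination, since the row updates within each column step are independent of one another. A plain NumPy sketch of that sequential reference algorithm (an assumption for illustration, not the paper's FPGA design):

```python
import numpy as np

def gauss_jordan_inverse(a):
    """Invert a square matrix by Gauss-Jordan elimination with
    partial pivoting. Within each column step, the per-row
    eliminations are independent, which is the parallelism a
    linear array of PEs could exploit."""
    n = a.shape[0]
    aug = np.hstack([a.astype(float), np.eye(n)])  # [A | I]
    for col in range(n):
        pivot = col + np.argmax(np.abs(aug[col:, col]))  # partial pivoting
        aug[[col, pivot]] = aug[[pivot, col]]            # swap rows
        aug[col] /= aug[col, col]                        # normalize pivot row
        for row in range(n):
            if row != col:
                aug[row] -= aug[row, col] * aug[col]     # eliminate column entry
    return aug[:, n:]  # right half is now A^{-1}
```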

  2. Energy Spectra of Higher Reynolds Number Turbulence by the DNS with up to 12288^3 Grid Points

    NASA Astrophysics Data System (ADS)

    Ishihara, Takashi; Kaneda, Yukio; Morishita, Koji; Yokokawa, Mitsuo; Uno, Atsuya

    2014-11-01

    Large-scale direct numerical simulations (DNS) of forced incompressible turbulence in a periodic box with up to 12288^3 grid points have been performed using the K computer. The maximum Taylor-microscale Reynolds number Rλ and the maximum Reynolds number Re based on the integral length scale are over 2000 and 10^5, respectively. Our previous DNS with Rλ up to 1100 showed that the energy spectrum has a slope steeper than -5/3 (the Kolmogorov scaling law) by a factor of 0.1 in the wavenumber range kη < 0.03, where η is the Kolmogorov length scale. Our present DNS at higher resolutions show that the energy spectra at different Reynolds numbers (Rλ > 1000) are well normalized not by the integral length scale but by the Kolmogorov length scale in the wavenumber range of the steeper slope. This result indicates that the steeper slope is not an inherent characteristic of the inertial subrange but is affected by viscosity.
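    As a schematic illustration of the quantities involved, the sketch below computes the Kolmogorov length scale η = (ν³/ε)^(1/4) and a one-dimensional energy spectrum via FFT. A real DNS analysis would use the shell-averaged 3D spectrum of the full velocity field, so this is only a toy analogue; the signal and parameter values are assumptions.

```python
import numpy as np

def kolmogorov_scale(nu, eps):
    """Kolmogorov length scale eta = (nu^3 / eps)^(1/4),
    from kinematic viscosity nu and dissipation rate eps."""
    return (nu ** 3 / eps) ** 0.25

def energy_spectrum(u, dx):
    """1D energy spectrum E(k) of a periodic velocity signal,
    a toy stand-in for the shell-averaged 3D DNS spectrum."""
    n = u.size
    uhat = np.fft.rfft(u) / n               # normalized Fourier coefficients
    k = 2 * np.pi * np.fft.rfftfreq(n, d=dx)  # angular wavenumbers
    return k, np.abs(uhat) ** 2
```

    Normalizing the wavenumber axis as kη (rather than by the integral scale) is then a single rescaling of k by the output of `kolmogorov_scale`.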

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shahidehpour, Mohammad

    Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision-making capability that can handle very large scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy in power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling integration services, and adopting new transmission scheduling practices. Such practices can be examined by the decision tool developed by this project. This project developed a very efficient decision tool called Wind INtegration Simulator (WINS) and applied WINS to facilitate wind energy integration studies. WINS focused on augmenting the existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS also had the capability of simulating energy storage facilities so that feasibility studies of integrated wind energy system applications can be performed for systems with high wind energy penetrations. The development of WINS represents a major expansion of a very efficient decision tool called POwer Market Simulator (POMS), which was developed by IIT and has been used extensively for power system studies for decades.
Specifically, WINS provides the following advantages: (1) An integrated framework is included in WINS for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connections, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) An existing shortcoming of traditional decision tools for wind integration is the limited availability of user interfaces, i.e., decision results are often text-based demonstrations. WINS includes a powerful visualization tool and user interface capability for transmission analyses, planning, and assessment, which will be of great interest to power market participants, power system planners and operators, and state and federal regulatory entities; and (3) WINS can handle extended transmission models for wind integration studies. WINS models include limitations on transmission flow as well as bus voltage for analyzing power system states. Existing decision tools often consider transmission flow constraints (dc power flow) alone, which could result in the over-utilization of existing resources when analyzing wind integration. WINS can be used to assist power market participants including transmission companies, independent system operators, power system operators in vertically integrated utilities, wind energy developers, and regulatory agencies to analyze the economics, security, and reliability of various options for wind integration, including transmission upgrades and the planning of new transmission facilities. WINS can also be used by industry for the offline training of reliability and operation personnel in analyzing wind integration uncertainties, identifying critical spots in power system operation, analyzing power system vulnerabilities, and providing credible decisions for examining operation and planning options for wind integration.
Research in this project on wind integration included (1) Development of WINS; (2) Transmission Congestion Analysis in the Eastern Interconnection; (3) Analysis of 2030 Large-Scale Wind Energy Integration in the Eastern Interconnection; (4) Large-scale Analysis of 2018 Wind Energy Integration in the Eastern U.S. Interconnection. The research resulted in 33 papers, 9 presentations, 9 PhD degrees, 4 MS degrees, and 7 awards. The education activities in this project on wind energy included (1) Wind Energy Training Facility Development; (2) Wind Energy Course Development.
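    The "dc power flow" approximation that the abstract contrasts with WINS's extended transmission models can be sketched directly: solve the linear system B'θ = P for bus angles (with the slack bus grounded), then recover each branch flow as b_ij(θ_i - θ_j). The bus numbering, susceptances, and injections below are illustrative assumptions, not data from the project.

```python
import numpy as np

def dc_power_flow(b_branch, from_bus, to_bus, injections, slack=0):
    """Lossless DC power-flow model: assemble the bus susceptance
    matrix, solve B' theta = P with the slack angle fixed at zero,
    and return per-branch flows b_ij * (theta_i - theta_j)."""
    n = injections.size
    B = np.zeros((n, n))
    for b, i, j in zip(b_branch, from_bus, to_bus):
        B[i, i] += b; B[j, j] += b
        B[i, j] -= b; B[j, i] -= b
    keep = [i for i in range(n) if i != slack]  # ground the slack bus
    theta = np.zeros(n)
    theta[keep] = np.linalg.solve(B[np.ix_(keep, keep)], injections[keep])
    return np.array([b * (theta[i] - theta[j])
                     for b, i, j in zip(b_branch, from_bus, to_bus)])
```

    Checking each flow against a line's thermal limit is then a simple elementwise comparison, which is exactly the constraint set the abstract says is insufficient on its own.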

  4. VLSI (Very Large Scale Integrated) Design of a 16 Bit Very Fast Pipelined Carry Look Ahead Adder.

    DTIC Science & Technology

    1983-09-01

    the ability for systems engineers to custom design digital integrated circuits. Until recently, the design of integrated circuits has been...traditionally carried out by a select group of logic designers working in semiconductor laboratories. Systems engineers had to "make do" or "fit in" the...products of these labs to realize their designs. The systems engineers had little participation in the actual design of the chip. The MEAD and CONWAY design

  5. KA-SB: from data integration to large scale reasoning

    PubMed Central

    Roldán-García, María del Mar; Navas-Delgado, Ismael; Kerzazi, Amine; Chniber, Othmane; Molina-Castro, Joaquín; Aldana-Montes, José F

    2009-01-01

    Background The analysis of information in the biological domain is usually focused on the analysis of data from single on-line data sources. Unfortunately, studying a biological process requires having access to disperse, heterogeneous, autonomous data sources. In this context, an analysis of the information is not possible without the integration of such data. Methods KA-SB is a querying and analysis system for final users based on combining a data integration solution with a reasoner. Thus, the tool has been created with a process divided into two steps: 1) KOMF, the Khaos Ontology-based Mediator Framework, is used to retrieve information from heterogeneous and distributed databases; 2) the integrated information is crystallized in a (persistent and high performance) reasoner (DBOWL). This information could be further analyzed later (by means of querying and reasoning). Results In this paper we present a novel system that combines the use of a mediation system with the reasoning capabilities of a large scale reasoner to provide a way of finding new knowledge and of analyzing the integrated information from different databases, which is retrieved as a set of ontology instances. This tool uses a graphical query interface to build user queries easily, which shows a graphical representation of the ontology and allows users to build queries by clicking on the ontology concepts. Conclusion These kinds of systems (based on KOMF) will provide users with very large amounts of information (interpreted as ontology instances once retrieved), which cannot be managed using traditional main memory-based reasoners. We propose a process for creating persistent and scalable knowledge bases from sets of OWL instances obtained by integrating heterogeneous data sources with KOMF. This process has been applied to develop a demo tool, which uses the BioPAX Level 3 ontology as the integration schema, and integrates the UNIPROT, KEGG, CHEBI, BRENDA and SABIORK databases. PMID:19796402

  6. Integrated Mid-Continent Carbon Capture, Sequestration & Enhanced Oil Recovery Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brian McPherson

    2010-08-31

    A consortium of research partners led by the Southwest Regional Partnership on Carbon Sequestration and industry partners, including CAP CO2 LLC, Blue Source LLC, Coffeyville Resources, Nitrogen Fertilizers LLC, Ash Grove Cement Company, Kansas Ethanol LLC, Headwaters Clean Carbon Services, Black & Veatch, and Schlumberger Carbon Services, conducted a feasibility study of a large-scale CCS commercialization project that included large-scale CO{sub 2} sources. The overall objective of this project, entitled the 'Integrated Mid-Continent Carbon Capture, Sequestration and Enhanced Oil Recovery Project', was to design an integrated system of US mid-continent industrial CO{sub 2} sources with CO{sub 2} capture and geologic sequestration in deep saline formations and in oil field reservoirs with concomitant EOR. Findings of this project suggest that deep saline sequestration in the mid-continent region is not feasible without major financial incentives, such as tax credits or otherwise, that do not exist at this time. However, results of the analysis suggest that enhanced oil recovery with carbon sequestration is indeed feasible and practical for specific types of geologic settings in the Midwestern U.S.

  7. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.

  8. Information Power Grid Posters

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2003-01-01

    This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provide seamless and uniform access to the geographically dispersed computational, data storage, networking, instrument, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of a large-scale computing node into a distributed environment, remote access to high-data-rate instruments, and an exploratory grid environment.

  9. Integrated Energy Solutions | NREL

    Science.gov Websites


  10. Integrating Residential Photovoltaics With Power Lines

    NASA Technical Reports Server (NTRS)

    Borden, C. S.

    1985-01-01

    Report finds that rooftop solar-cell arrays feeding excess power to the electric-utility grid for a fee are a potentially attractive large-scale application of photovoltaic technology. Presents an assessment of the breakeven costs of these arrays under a variety of technological and economic assumptions.

  11. A Large-scale Distributed Indexed Learning Framework for Data that Cannot Fit into Memory

    DTIC Science & Technology

    2015-03-27

    learn a classifier. Integrating three learning techniques (online, semi-supervised, and active learning) together with selective sampling, with minimum communication between the server and the clients, solved this problem.

  12. Techniques for control of long-term reliability of complex integrated circuits. I - Reliability assurance by test vehicle qualification.

    NASA Technical Reports Server (NTRS)

    Van Vonno, N. W.

    1972-01-01

    Development of an alternate approach to the conventional methods of reliability assurance for large-scale integrated circuits. The product treated is a large-scale T squared L array designed for space applications. The concept used is that of qualification of product by evaluation of the basic processing used in fabricating the product, providing an insight into its potential reliability. Test vehicles are described which enable evaluation of device characteristics, surface condition, and various parameters of the two-level metallization system used. Evaluation of these test vehicles is performed on a lot qualification basis, with the lot consisting of one wafer. Assembled test vehicles are evaluated by high temperature stress at 300 C for short time durations. Stressing at these temperatures provides a rapid method of evaluation and permits a go/no go decision to be made on the wafer lot in a timely fashion.

  13. Circuit engineering principles for construction of bipolar large-scale integrated circuit storage devices and very large-scale main memory

    NASA Astrophysics Data System (ADS)

    Neklyudov, A. A.; Savenkov, V. N.; Sergeyez, A. G.

    1984-06-01

    Memories are improved by increasing speed or the memory volume on a single chip. The most effective means for increasing speeds in bipolar memories are current control circuits with the lowest extraction times for a specific power consumption (1/4 pJ/bit). The control current circuitry involves multistage current switches and circuits accelerating transient processes in storage elements and links. Circuit principles for the design of bipolar memories with maximum speeds for an assigned minimum of circuit topology are analyzed. Two main classes of storage with current control are considered: the ECL type and super-integrated injection type storage with data capacities of N = 1/4 and N 4/16, respectively. The circuits reduce logic voltage differentials and the volumes of lexical and discharge buses and control circuit buses. The limiting speed is determined by the antiinterference requirements of the memory in storage and extraction modes.

  14. Advances in multi-scale modeling of solidification and casting processes

    NASA Astrophysics Data System (ADS)

    Liu, Baicheng; Xu, Qingyan; Jing, Tao; Shen, Houfa; Han, Zhiqiang

    2011-04-01

    The development of the aviation, energy and automobile industries requires advanced integrated product/process R&D systems which can optimize the product and the process design as well. Integrated computational materials engineering (ICME) is a promising approach to fulfill this requirement and make product and process development efficient, economic, and environmentally friendly. Advances in multi-scale modeling of solidification and casting processes, including mathematical models as well as engineering applications, are presented in the paper. Dendrite morphologies of magnesium and aluminum alloys during solidification, obtained using phase field and cellular automaton methods, mathematical models of segregation in large steel ingots, and microstructure models of unidirectionally solidified turbine blade castings are studied and discussed. In addition, some engineering case studies, including microstructure simulation of aluminum castings for the automobile industry, segregation of large steel ingots for the energy industry, and microstructure simulation of unidirectionally solidified turbine blade castings for the aviation industry, are discussed.

  15. System design and integration of the large-scale advanced prop-fan

    NASA Technical Reports Server (NTRS)

    Huth, B. P.

    1986-01-01

    In recent years, considerable attention has been directed toward improving aircraft fuel consumption. Studies have shown that blades with thin airfoils and aerodynamic sweep extend the inherent efficiency advantage that turboprop propulsion systems have demonstrated to the higher speeds of today's aircraft. Hamilton Standard has designed a 9-foot diameter single-rotation Prop-Fan. It will test the hardware on a static test stand, in low speed and high speed wind tunnels, and on a research aircraft. The major objective of this testing is to establish the structural integrity of large scale Prop-Fans of advanced construction, in addition to the evaluation of aerodynamic performance and the aeroacoustic design. The coordination efforts performed to ensure smooth operation and assembly of the Prop-Fan are summarized. A summary of the loads used to size the system components, the methodology used to establish material allowables, and a review of the key analytical results are given.

  16. A class of hybrid finite element methods for electromagnetics: A review

    NASA Technical Reports Server (NTRS)

    Volakis, J. L.; Chatterjee, A.; Gong, J.

    1993-01-01

    Integral equation methods have generally been the workhorse for antenna and scattering computations. In the case of antennas, they continue to be the prominent computational approach, but for scattering applications the requirement for large-scale computations has turned researchers' attention to near neighbor methods such as the finite element method, which has low O(N) storage requirements and is readily adaptable in modeling complex geometrical features and material inhomogeneities. In this paper, we review three hybrid finite element methods for simulating composite scatterers, conformal microstrip antennas, and finite periodic arrays. Specifically, we discuss the finite element method and its application to electromagnetic problems when combined with the boundary integral, absorbing boundary conditions, and artificial absorbers for terminating the mesh. Particular attention is given to large-scale simulations, methods, and solvers for achieving low memory requirements and code performance on parallel computing architectures.

  17. Assessment of fire effects based on Forest Inventory and Analysis data and a long-term fire mapping data set

    Treesearch

    John D. Shaw; Sara A. Goeking; James Menlove; Charles E. Werstak

    2017-01-01

    Integration of Forest Inventory and Analysis (FIA) plot data with Monitoring Trends in Burn Severity (MTBS) data can provide new information about fire effects on forests. This integration allowed broad-scale assessment of the cover types burned in large fires, the relationship between prefire stand conditions and fire severity, and postfire stand conditions. Of the 42...

  18. Managing harvest and habitat as integrated components

    USGS Publications Warehouse

    Osnas, Erik; Runge, Michael C.; Mattsson, Brady J.; Austin, Jane E.; Boomer, G. S.; Clark, R. G.; Devers, P.; Eadie, J. M.; Lonsdorf, E. V.; Tavernia, Brian G.

    2014-01-01

    In 2007, several important initiatives in the North American waterfowl management community called for an integrated approach to habitat and harvest management. The essence of the call for integration is that harvest and habitat management affect the same resources, yet exist as separate endeavours with very different regulatory contexts. A common modelling framework could help these management streams to better understand their mutual effects. Particularly, how does successful habitat management increase harvest potential? Also, how do regional habitat programmes and large-scale harvest strategies affect continental population sizes (a metric used to express habitat goals)? In the ensuing five years, several projects took on different aspects of these challenges. While all of these projects are still ongoing, and are not yet sufficiently developed to produce guidance for management decisions, they have been influential in expanding the dialogue and producing some important emerging lessons. The first lesson has been that one of the more difficult aspects of integration is not the integration across decision contexts, but the integration across spatial and temporal scales. Habitat management occurs at local and regional scales. Harvest management decisions are made at a continental scale. How do these actions, taken at different scales, combine to influence waterfowl population dynamics at all scales? The second lesson has been that consideration of the interface of habitat and harvest management can generate important insights into the objectives underlying the decision context. Often the objectives are very complex and trade off against one another. The third lesson follows from the second: if an understanding of the fundamental objectives is paramount, there is no escaping the need for a better understanding of human dimensions, specifically the desires of hunters and non-hunters and the role they play in conservation.
In the end, the compelling question is how to better understand, guide and justify decisions about conservation investments in waterfowl management. Future efforts to integrate harvest and habitat management will include completion of the species-specific case studies, initiation of policy discussions around how to integrate the decision contexts and governing institutions, and possible consideration of a new level of integration: integration of harvest and habitat management decisions across waterfowl stocks.

  19. A large-scale perspective on stress-induced alterations in resting-state networks

    NASA Astrophysics Data System (ADS)

    Maron-Katz, Adi; Vaisvaser, Sharon; Lin, Tamar; Hendler, Talma; Shamir, Ron

    2016-02-01

    Stress is known to induce large-scale neural modulations. However, its neural effect once the stressor is removed, and how it relates to subjective experience, are not fully understood. Here we used a statistically sound data-driven approach to investigate alterations in large-scale resting-state functional connectivity (rsFC) induced by acute social stress. We compared rsfMRI profiles of 57 healthy male subjects before and after stress induction. Using a parcellation-based univariate statistical analysis, we identified a large-scale rsFC change involving 490 parcel-pairs. Aiming to characterize this change, we employed statistical enrichment analysis, identifying anatomical structures that were significantly interconnected by these pairs. This analysis revealed strengthening of thalamo-cortical connectivity and weakening of cross-hemispheral parieto-temporal connectivity. These alterations were further found to be associated with changes in subjective stress reports. Integrating report-based information on stress sustainment 20 minutes post-induction revealed a single significant rsFC change, between the right amygdala and the precuneus, which inversely correlated with the level of subjective recovery. Our study demonstrates the value of enrichment analysis for exploring large-scale network reorganization patterns, and provides new insight into stress-induced neural modulations and their relation to subjective experience.
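    A standard way to implement the enrichment analysis described above is a hypergeometric test on the overlap between the set of significantly altered parcel-pairs and the set of pairs touching a given anatomical structure. The paper's exact statistic may differ, so the sketch below is a generic stand-in with illustrative parameter names.

```python
from math import comb

def enrichment_pvalue(n_total, n_category, n_selected, n_overlap):
    """Hypergeometric upper-tail p-value: the probability of drawing
    at least n_overlap pairs from a category of size n_category when
    n_selected pairs are chosen at random from n_total candidates.
    A small p-value flags the category (e.g. an anatomical structure)
    as enriched among the altered parcel-pairs."""
    p = 0.0
    for k in range(n_overlap, min(n_category, n_selected) + 1):
        p += (comb(n_category, k)
              * comb(n_total - n_category, n_selected - k)) / comb(n_total, n_selected)
    return p
```

    In practice such p-values are computed for every structure and corrected for multiple comparisons before any structure is reported as enriched.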

  20. Large-Scale medical image analytics: Recent methodologies, applications and Future directions.

    PubMed

    Zhang, Shaoting; Metaxas, Dimitris

    2016-10-01

    Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. In particular, we advocate that the scale of image retrieval systems should be significantly increased, to a point at which interactive systems can be effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real-time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity, and enable novel methods of analysis at much larger scales in an efficient, integrated fashion. Copyright © 2016. Published by Elsevier B.V.

  1. Manufacturing Process Developments for Regeneratively-Cooled Channel Wall Rocket Nozzles

    NASA Technical Reports Server (NTRS)

    Gradl, Paul; Brandsmeier, Will

    2016-01-01

    Regeneratively cooled channel wall nozzles incorporate a series of integral coolant channels to contain the coolant, maintain adequate wall temperatures, and expand hot gas, providing engine thrust and specific impulse. NASA has been evaluating manufacturing techniques targeting large scale channel wall nozzles to support the affordability of current and future liquid rocket engine nozzles and thrust chamber assemblies. The development of these large scale manufacturing techniques focuses on liner formation, channel slotting with advanced abrasive water-jet milling techniques, and closeout of the coolant channels to replace or augment other cost reduction techniques being evaluated for nozzles. NASA is developing a series of channel closeout techniques, including large scale additive manufacturing laser deposition and explosively bonded closeouts. A series of subscale nozzles was completed evaluating these processes. Fabrication of mechanical test and metallography samples, in addition to subscale hardware, has focused on Inconel 625, 300 series stainless, and aluminum alloys, as well as other candidate materials. Evaluations of these techniques are demonstrating the potential for significant cost reductions for large scale nozzles and chambers. Hot fire testing using these techniques is planned for the future.

  2. Large Scale Flood Risk Analysis using a New Hyper-resolution Population Dataset

    NASA Astrophysics Data System (ADS)

    Smith, A.; Neal, J. C.; Bates, P. D.; Quinn, N.; Wing, O.

    2017-12-01

    Here we present the first national-scale flood risk analyses using high-resolution population data from the Facebook Connectivity Lab and data from a hyper-resolution flood hazard model. In recent years, the field of large-scale hydraulic modelling has been transformed by new remotely sensed datasets, improved process representation, highly efficient flow algorithms, and increases in computational power. These developments have allowed flood risk analysis to be undertaken in previously unmodelled territories and at continental to global scales. Flood risk analyses are typically conducted via the integration of modelled water depths with an exposure dataset. Over large scales and in data-poor areas, these exposure data typically take the form of a gridded population dataset, estimating population density using remotely sensed data and/or locally available census data. The local nature of flooding dictates that, for robust flood risk analysis, both hazard and exposure data should sufficiently resolve local-scale features. Global flood frameworks now enable flood hazard data to be produced at 90 m resolution, resulting in a mismatch with available population datasets, which are typically more coarsely resolved. Moreover, these exposure data are typically focused on urban areas and struggle to represent rural populations. In this study, we integrate a new population dataset with a global flood hazard model. The population dataset, produced by the Connectivity Lab at Facebook, provides gridded population data at 5 m resolution, an increase of multiple orders of magnitude over previous countrywide datasets. Flood risk analyses undertaken over a number of developing countries are presented, along with a comparison against analyses undertaken using pre-existing population datasets.
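    The core calculation described above, overlaying a modelled flood-depth grid with a co-registered gridded population dataset, can be sketched as follows. This is a minimal illustration with synthetic grids and an arbitrary depth threshold, not the study's actual data or code:

```python
import numpy as np

# Synthetic stand-ins for the two rasters: a modelled water-depth grid and a
# population grid resampled onto the same cells. Values are illustrative only.
rng = np.random.default_rng(42)
depth = rng.exponential(scale=0.2, size=(100, 100))   # water depth, metres
population = rng.poisson(lam=3.0, size=(100, 100))    # persons per cell

DEPTH_THRESHOLD = 0.3  # metres; cells wetted above this count as flooded

flooded = depth > DEPTH_THRESHOLD          # boolean hazard mask
exposed = int(population[flooded].sum())   # people in flooded cells
total = int(population.sum())
print(f"{exposed} of {total} people exposed")
```

    In a real analysis both grids would come from raster files reprojected onto a common grid; the resolution mismatch discussed in the abstract is exactly what makes that co-registration step hard.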

  3. Integrating Green and Blue Water Management Tools for Land and Water Resources Planning

    NASA Astrophysics Data System (ADS)

    Jewitt, G. P. W.

    2009-04-01

    The role of land use and land use change in the hydrological cycle is well known. However, the impacts of large-scale land use change are poorly considered in water resources planning unless they require direct abstraction of water resources and the associated development of infrastructure (e.g., irrigation schemes). Large-scale deforestation for the supply of raw materials, expansion of plantation forestry, increasing areas under food production, and major plans for the cultivation of biofuels in many developing countries are likely to result in extensive land use change. Given the spatial extent and temporal longevity of these proposed developments, major impacts on water resources are inevitable, and it is imperative that managers and planners consider the consequences for downstream ecosystems and users. However, many popular tools, such as the virtual water approach, provide only coarse, order-of-magnitude estimates, with little consideration of, and limited usefulness for, land use planning. In this paper, a framework for considering the impacts of large-scale land use change on water resources at a range of temporal and spatial scales is presented. Drawing on experiences from South Africa, where the establishment of exotic commercial forest plantations is permitted only once a water use license has been granted, the framework adopts the "green water" concept to identify potential high-impact areas of land use change and provides for integration with traditional "blue water" resources planning tools for more detailed planning. 
Appropriate tools, ranging from simple spreadsheet solutions to more sophisticated remote sensing and hydrological models, are described, and the application of the framework to the water resources impacts of establishing large-scale Tectona grandis, sugar cane, and Jatropha curcas plantations is illustrated through examples in Mozambique and South Africa. Keywords: land use change, water resources, green water, blue water, biofuels, developing countries

  4. Neurodevelopmental alterations of large-scale structural networks in children with new-onset epilepsy

    PubMed Central

    Bonilha, Leonardo; Tabesh, Ali; Dabbs, Kevin; Hsu, David A.; Stafstrom, Carl E.; Hermann, Bruce P.; Lin, Jack J.

    2014-01-01

    Recent neuroimaging and behavioral studies have revealed that children with new-onset epilepsy already exhibit brain structural abnormalities and cognitive impairment. How the organization of large-scale brain structural networks is altered near the time of seizure onset, and whether network changes are related to cognitive performance, remains unclear. Recent studies also suggest that regional brain volume covariance reflects synchronized brain developmental changes. Here, we test the hypothesis that epilepsy during early life is associated with abnormalities in brain network organization and cognition. We used graph theory to study structural brain networks based on regional volume covariance in 39 children with new-onset seizures and 28 healthy controls. Children with new-onset epilepsy showed a suboptimal topological structural organization with enhanced network segregation and reduced global integration compared to controls. At the regional level, structural reorganization was evident, with nodes redistributed from posterior to more anterior head regions. The epileptic brain network was more vulnerable to targeted, but not random, attacks. Finally, a subgroup of children with epilepsy, namely those with lower IQ and poorer executive function, had a reduced balance between network segregation and integration. Taken together, the findings suggest that the neurodevelopmental impact of new-onset childhood epilepsies alters large-scale brain networks, resulting in greater vulnerability to network failure and cognitive impairment. PMID:24453089
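    The graph-theoretic notions of segregation and integration used above can be illustrated on a toy network. This is a hedged sketch: the study derives edges from thresholded regional volume covariance, whereas the small graph below is invented purely to show the two measures:

```python
from collections import deque

# Toy undirected network: two triangles ("segregated" clusters) joined
# by a single bridge edge C-D.
edges = [("A", "B"), ("B", "C"), ("A", "C"),
         ("C", "D"), ("D", "E"), ("E", "F"), ("D", "F")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def clustering(node):
    """Fraction of a node's neighbour pairs that are themselves linked
    (a local segregation measure)."""
    nbrs = list(adj[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2.0 * links / (k * (k - 1))

def hop_counts(src):
    """BFS hop distances from src (basis of the integration measure)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

nodes = sorted(adj)
avg_clust = sum(clustering(n) for n in nodes) / len(nodes)
avg_path = (sum(d for n in nodes for d in hop_counts(n).values())
            / (len(nodes) * (len(nodes) - 1)))
print(round(avg_clust, 3), round(avg_path, 3))
```

    "Enhanced segregation with reduced integration" corresponds to high average clustering together with a long characteristic path length; removing the bridge node D (a targeted attack) disconnects this toy graph, while removing a random peripheral node does not.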

  5. Development and Verification of a Novel Robot-Integrated Fringe Projection 3D Scanning System for Large-Scale Metrology.

    PubMed

    Du, Hui; Chen, Xiaobo; Xi, Juntong; Yu, Chengyi; Zhao, Bao

    2017-12-12

    Large-scale surfaces are prevalent in advanced manufacturing industries, and 3D profilometry of these surfaces plays a pivotal role in quality control. This paper proposes a novel and flexible large-scale 3D scanning system assembled by combining a robot, a binocular structured-light scanner, and a laser tracker. The measurement principle and system construction of the integrated system are introduced, and a mathematical model is established for the global data fusion. Subsequently, a robust method is introduced for establishing the end coordinate system. For the hand-eye calibration, the calibration ball is observed by the scanner and the laser tracker simultaneously; from these data, the hand-eye relationship is solved, and an algorithm is built to obtain the transformation matrix between the end coordinate system and the world coordinate system. A validation experiment is designed to verify the proposed algorithms. First, a hand-eye calibration experiment is implemented and the transformation matrix is computed. Then a car body rear is measured 22 times to verify the global data fusion algorithm, and the 3D shape of the rear is reconstructed successfully. To evaluate the precision of the proposed method, a metric tool is built and the results are presented.
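    The global data fusion step described above amounts to chaining homogeneous transformations: a point in the scanner frame is mapped into the world frame through the hand-eye matrix and the tracker-measured robot pose. A minimal sketch with invented matrices (in the real system, world_T_end comes from the laser tracker and end_T_scanner from the calibration-ball procedure):

```python
import numpy as np

def transform(angle_deg, t):
    """Homogeneous 4x4 transform: rotation about z by angle_deg, translation t."""
    a = np.radians(angle_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(a), -np.sin(a), 0.0],
                 [np.sin(a),  np.cos(a), 0.0],
                 [0.0,        0.0,       1.0]]
    T[:3, 3] = t
    return T

# Illustrative poses, not calibration results.
world_T_end   = transform(30.0, [1.0, 0.5, 2.0])   # robot end pose (tracker)
end_T_scanner = transform(-10.0, [0.1, 0.0, 0.2])  # hand-eye result
p_scanner = np.array([0.05, 0.02, 0.8, 1.0])       # point seen by the scanner

# Chain the transforms to express the point in world coordinates.
p_world = world_T_end @ end_T_scanner @ p_scanner
print(p_world[:3])
```

    Repeating this for every robot pose places all 22 scans of the car body rear in one common world frame, which is what makes the fused reconstruction possible.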

  6. The potential of text mining in data integration and network biology for plant research: a case study on Arabidopsis.

    PubMed

    Van Landeghem, Sofie; De Bodt, Stefanie; Drebert, Zuzanna J; Inzé, Dirk; Van de Peer, Yves

    2013-03-01

    Despite the availability of various data repositories for plant research, a wealth of information currently remains hidden within the biomolecular literature. Text mining provides the necessary means to retrieve these data through automated processing of texts. However, only recently has advanced text mining methodology been implemented with sufficient computational power to process texts at a large scale. In this study, we assess the potential of large-scale text mining for plant biology research in general and for network biology in particular using a state-of-the-art text mining system applied to all PubMed abstracts and PubMed Central full texts. We present extensive evaluation of the textual data for Arabidopsis thaliana, assessing the overall accuracy of this new resource for usage in plant network analyses. Furthermore, we combine text mining information with both protein-protein and regulatory interactions from experimental databases. Clusters of tightly connected genes are delineated from the resulting network, illustrating how such an integrative approach is essential to grasp the current knowledge available for Arabidopsis and to uncover gene information through guilt by association. All large-scale data sets, as well as the manually curated textual data, are made publicly available, hereby stimulating the application of text mining data in future plant biology studies.
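    The "guilt by association" step mentioned above can be sketched in a few lines: an unannotated gene is scored by the functional annotations of its direct partners in the integrated network. Gene names and annotation terms below are entirely hypothetical:

```python
from collections import Counter

# Hypothetical integrated network: geneX's interaction partners, plus the
# known annotations of those partners (all names invented for illustration).
network = {"geneX": {"geneA", "geneB", "geneC"}}
annotations = {
    "geneA": {"stress response"},
    "geneB": {"stress response", "signalling"},
    "geneC": {"cell wall"},
}

# Score candidate functions for geneX by how many partners carry each term.
scores = Counter(term
                 for partner in network["geneX"]
                 for term in annotations.get(partner, ()))
print(scores.most_common(1)[0])
```

    Real pipelines weight edges by interaction confidence and correct for annotation frequency, but the principle, transferring labels across network neighbourhoods, is the same.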

  7. Integrating concept ontology and multitask learning to achieve more effective classifier training for multilevel image annotation.

    PubMed

    Fan, Jianping; Gao, Yuli; Luo, Hangzai

    2008-03-01

    In this paper, we have developed a new scheme for achieving multilevel annotations of large-scale images automatically. To achieve more sufficient representation of various visual properties of the images, both the global visual features and the local visual features are extracted for image content representation. To tackle the problem of huge intraconcept visual diversity, multiple types of kernels are integrated to characterize the diverse visual similarity relationships between the images more precisely, and a multiple kernel learning algorithm is developed for SVM image classifier training. To address the problem of huge interconcept visual similarity, a novel multitask learning algorithm is developed to learn the correlated classifiers for the sibling image concepts under the same parent concept and enhance their discrimination and adaptation power significantly. To tackle the problem of huge intraconcept visual diversity for the image concepts at the higher levels of the concept ontology, a novel hierarchical boosting algorithm is developed to learn their ensemble classifiers hierarchically. In order to assist users on selecting more effective hypotheses for image classifier training, we have developed a novel hyperbolic framework for large-scale image visualization and interactive hypotheses assessment. Our experiments on large-scale image collections have also obtained very positive results.
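    The multiple-kernel idea above can be illustrated by forming a convex combination of per-feature kernel matrices. The weights here are fixed for illustration; a multiple kernel learning algorithm would learn them jointly with the SVM, and the toy feature vectors are invented:

```python
import numpy as np

# Toy 1-D "global" and "local" visual features for three images.
X_global = np.array([[0.0], [1.0], [2.0]])
X_local  = np.array([[1.0], [0.5], [0.0]])

def rbf_kernel(X, gamma):
    """RBF kernel matrix K[i, j] = exp(-gamma * (x_i - x_j)^2)."""
    d2 = (X - X.T) ** 2
    return np.exp(-gamma * d2)

# Composite kernel: convex combination of the per-feature kernels.
# (0.6 / 0.4 are illustrative weights, not learned values.)
K = 0.6 * rbf_kernel(X_global, 0.5) + 0.4 * rbf_kernel(X_local, 2.0)
print(K)
```

    Any convex combination of valid kernels is itself a valid (symmetric, positive semi-definite) kernel, which is what lets the combined similarity be plugged directly into SVM training.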

  8. HELICITY CONSERVATION IN NONLINEAR MEAN-FIELD SOLAR DYNAMO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pipin, V. V.; Sokoloff, D. D.; Zhang, H.

    It is believed that magnetic helicity conservation is an important constraint on large-scale astrophysical dynamos. In this paper, we study a mean-field solar dynamo model that employs two different formulations of magnetic helicity conservation. In the first approach, the evolution of the averaged small-scale magnetic helicity is largely determined by the local induction effects due to the large-scale magnetic field, turbulent motions, and the turbulent diffusive loss of helicity. In this case, the dynamo model shows that the typical strength of the large-scale magnetic field generated by the dynamo is much smaller than the equipartition value for the magnetic Reynolds number 10^6. This is the so-called catastrophic quenching (CQ) phenomenon. In the literature, this is considered to be typical for various kinds of solar dynamo models, including the distributed-type and the Babcock-Leighton-type dynamos. The problem can be resolved by the second formulation, which is derived from the integral conservation of the total magnetic helicity. In this case, the dynamo model shows that magnetic helicity propagates with the dynamo wave from the bottom of the convection zone to the surface. This prevents CQ because of the local balance between the large-scale and small-scale magnetic helicities. Thus, the solar dynamo can operate in a wide range of magnetic Reynolds numbers up to 10^6.
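    The contrast between the two formulations can be written out in standard mean-field notation. This is a hedged sketch using textbook symbols, not necessarily the authors' exact equations:

```latex
% Notation (illustrative): \overline{B} mean magnetic field,
% \overline{\mathcal{E}} mean electromotive force, h_s, h_l small- and
% large-scale magnetic helicity densities, R_m magnetic Reynolds number,
% \tau a turbulent diffusive time scale.
%
% Local formulation, which drives catastrophic quenching as R_m \to \infty:
\frac{\partial h_s}{\partial t}
  = -2\,\overline{\mathcal{E}}\cdot\overline{B} - \frac{h_s}{R_m\,\tau}
%
% Integral formulation: the total magnetic helicity changes only through
% boundary fluxes, permitting a local balance between h_s and h_l:
\frac{\mathrm{d}}{\mathrm{d}t}\int_V \left(h_s + h_l\right)\,\mathrm{d}V
  = -\oint_{\partial V} \mathbf{F}_h\cdot\mathrm{d}\mathbf{S}
```

    In the local form, the resistive sink term scales as 1/R_m, so the steady small-scale helicity (and with it the mean field) collapses at solar values of R_m; the integral form removes this bottleneck.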

  9. Fast Updating National Geo-Spatial Databases with High Resolution Imagery: China's Methodology and Experience

    NASA Astrophysics Data System (ADS)

    Chen, J.; Wang, D.; Zhao, R. L.; Zhang, H.; Liao, A.; Jiu, J.

    2014-04-01

    Geospatial databases are an irreplaceable national treasure of immense importance. Their up-to-dateness, i.e., their consistency with respect to the real world, plays a critical role in their value and applications. The continuous updating of map databases at the 1:50,000 scale is a massive and difficult task for large countries covering several million square kilometers. This paper presents the research and technological development supporting national map updating at the 1:50,000 scale in China, including the development of updating models and methods, production tools and systems for large-scale and rapid updating, and the design and implementation of the continuous updating workflow. Many data sources had to be used and integrated to form a high-accuracy, quality-checked product, which in turn required up-to-date techniques for image matching, semantic integration, generalization, database management, and conflict resolution. Specific software tools and packages were designed and developed to support large-scale updating production with high-resolution imagery and large-scale data generalization, covering map generalization, GIS-supported change interpretation from imagery, DEM interpolation, image matching-based orthophoto generation, and data control at different levels. A national 1:50,000 database updating strategy and its production workflow were designed, including a full-coverage updating pattern characterized by all-element topographic data modeling, change detection in all related areas, and whole-process data quality control; a series of technical production specifications; and a network of updating production units in different geographic locations across the country.

  10. Temporal integration and 1/f power scaling in a circuit model of cerebellar interneurons.

    PubMed

    Maex, Reinoud; Gutkin, Boris

    2017-07-01

    Inhibitory interneurons interconnected via electrical and chemical (GABA-A receptor) synapses form extensive circuits in several brain regions. They are thought to be involved in timing and synchronization through fast feedforward control of principal neurons. Theoretical studies have shown, however, that whereas self-inhibition does indeed reduce response duration, lateral inhibition, in contrast, may generate slow response components through a process of gradual disinhibition. Here we simulated a circuit of interneurons (stellate and basket cells) of the molecular layer of the cerebellar cortex and observed circuit time constants that could rise, depending on parameter values, to >1 s. The integration time scaled both with the strength of inhibition, vanishing completely when inhibition was blocked, and with the average connection distance, which determined the balance between lateral and self-inhibition. Electrical synapses could further enhance the integration time by limiting heterogeneity among the interneurons and by introducing a slow capacitive current. The model can explain several observations, such as the slow time course of OFF-beam inhibition, the phase lag of interneurons during vestibular rotation, and the phase lead of Purkinje cells. Interestingly, the interneuron spike trains displayed power that scaled approximately as 1/f at low frequencies. In conclusion, stellate and basket cells in cerebellar cortex, and interneuron circuits in general, may not only provide fast inhibition to principal cells but also act as temporal integrators that build a very short-term memory. NEW & NOTEWORTHY The most common function attributed to inhibitory interneurons is feedforward control of principal neurons. In many brain regions, however, the interneurons are densely interconnected via both chemical and electrical synapses, but the function of this coupling is largely unknown. 
Based on large-scale simulations of an interneuron circuit of cerebellar cortex, we propose that this coupling enhances the integration time constant, and hence the memory trace, of the circuit. Copyright © 2017 the American Physiological Society.

  11. Segmentation of Object Outlines into Parts: A Large-Scale Integrative Study

    ERIC Educational Resources Information Center

    De Winter, Joeri; Wagemans, Johan

    2006-01-01

    In this study, a large number of observers (N=201) were asked to segment a collection of outlines derived from line drawings of everyday objects (N=88). This data set was then used as a benchmark to evaluate current models of object segmentation. All of the previously proposed rules of segmentation were found supported in our results. For example,…

  12. Asynchronous Two-Level Checkpointing Scheme for Large-Scale Adjoints in the Spectral-Element Solver Nek5000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schanen, Michel; Marin, Oana; Zhang, Hong

    Adjoints are an important computational tool for large-scale sensitivity evaluation, uncertainty quantification, and derivative-based optimization. An essential component of their performance is the storage/recomputation balance, in which efficient checkpointing methods play a key role. We introduce a novel asynchronous two-level adjoint checkpointing scheme for multistep numerical time discretizations targeted at large-scale numerical simulations. The checkpointing scheme combines bandwidth-limited disk checkpointing and binomial memory checkpointing. Based on assumptions about the target petascale systems, which we later demonstrate to be realistic on the IBM Blue Gene/Q system Mira, we create a model of the expected performance of our checkpointing approach and validate it using the highly scalable Navier-Stokes spectral-element solver Nek5000 on small to moderate subsystems of the Mira supercomputer. In turn, this allows us to predict optimal algorithmic choices when using all of Mira. We also demonstrate that two-level checkpointing is significantly superior to single-level checkpointing when adjoining a large number of time integration steps. To our knowledge, this is the first time two-level checkpointing has been designed, implemented, tuned, and demonstrated on fluid dynamics codes at a large scale of 50k+ cores.
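    The storage/recomputation balance that the scheme optimizes can be seen in a toy single-level, uniform-interval checkpointing sketch (the paper's method additionally nests binomial memory checkpointing inside the disk segments; the step function and sizes below are made up):

```python
# Adjoint over N forward steps with only every C-th state stored
# ("disk" checkpoints); the reverse sweep recomputes each segment's
# intermediate states from its checkpoint. Values are illustrative.
N, C = 12, 4

def step(x):
    """Stand-in for one forward time step of the solver."""
    return x * 1.1

# Forward sweep: keep only every C-th state.
x = 1.0
checkpoints = {0: x}
for n in range(N):
    x = step(x)
    if (n + 1) % C == 0:
        checkpoints[n + 1] = x

recomputations = 0
adjoint_order = []
# Reverse sweep: restart each segment from its checkpoint, recompute its
# states, then adjoin the segment's steps in reverse order.
for seg_start in range(N - C, -1, -C):
    x = checkpoints[seg_start]
    states = [x]
    for n in range(seg_start, seg_start + C):
        x = step(x)
        states.append(x)
        recomputations += 1
    for n in range(seg_start + C - 1, seg_start - 1, -1):
        adjoint_order.append(n)   # adjoint of step n uses states[n - seg_start]

print(recomputations, adjoint_order[0], adjoint_order[-1])
```

    Storage drops from N states to N/C checkpoints plus one segment's worth of recomputed states, at the cost of roughly one extra forward sweep; binomial schedules make this trade-off provably optimal for a given memory budget.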

  13. Hierarchical Address Event Routing for Reconfigurable Large-Scale Neuromorphic Systems.

    PubMed

    Park, Jongkil; Yu, Theodore; Joshi, Siddharth; Maier, Christoph; Cauwenberghs, Gert

    2017-10-01

    We present a hierarchical address-event routing (HiAER) architecture for scalable communication of neural and synaptic spike events between neuromorphic processors, implemented with five Xilinx Spartan-6 field-programmable gate arrays and four custom analog neuromorphic integrated circuits serving 262k neurons and 262M synapses. The architecture extends the single-bus address-event representation protocol to a hierarchy of multiple nested buses, routing events across increasing scales of spatial distance. The HiAER protocol provides individually programmable axonal delay, in addition to strength, for each synapse, lending itself to biologically plausible neural network architectures, and scales across a range of hierarchies suitable for multichip and multiboard systems in reconfigurable large-scale neuromorphic systems. We show approximately linear scaling of net global synaptic event throughput with the number of routing nodes in the network, at 3.6×10^7 synaptic events per second per 16k-neuron node in the hierarchy.
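    The nested-bus idea can be sketched in a few lines: an event climbs the router tree only as far as the lowest common ancestor of its source and target, so local traffic never loads the upper buses. This is a toy model with a hypothetical fanout, not the HiAER hardware:

```python
# Toy hierarchical address-event routing: neuron addresses are leaves of a
# router tree with `fanout` children per node; an event's cost is the number
# of bus traversals to the lowest common ancestor and back down.
def bus_hops(src: int, dst: int, fanout: int = 4) -> int:
    """Bus traversals for an event from neuron src to neuron dst."""
    hops = 0
    while src != dst:          # climb until both sit in the same subtree
        src //= fanout
        dst //= fanout
        hops += 2              # one hop up plus the matching hop down
    return hops

print(bus_hops(0, 1), bus_hops(0, 63))
```

    Because most spike traffic in biologically plausible networks is local, the expected hop count stays small even as the hierarchy grows, which is what yields the roughly linear throughput scaling reported above.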

  14. Low frequency steady-state brain responses modulate large scale functional networks in a frequency-specific means.

    PubMed

    Wang, Yi-Feng; Long, Zhiliang; Cui, Qian; Liu, Feng; Jing, Xiu-Juan; Chen, Heng; Guo, Xiao-Nan; Yan, Jin H; Chen, Hua-Fu

    2016-01-01

    Neural oscillations are essential for brain function. Research has suggested that the frequency of neural oscillations is lower for more integrative and remote communications. In this vein, some resting-state studies have suggested that large-scale networks function in the very low frequency range (<1 Hz). However, it is difficult to determine the frequency characteristics of brain networks because neither resting-state studies nor conventional frequency-tagging approaches can simultaneously capture multiple large-scale networks in controllable cognitive activities. In this preliminary study, we aimed to examine whether large-scale networks can be modulated by task-induced low-frequency steady-state brain responses (lfSSBRs) in a frequency-specific pattern. In a revised attention network test, lfSSBRs were evoked in the triple-network system and the sensory-motor system, indicating that large-scale networks can be modulated in a frequency-tagging manner. Furthermore, inter- and intranetwork synchronization, as well as coherence, increased at the fundamental frequency and the first harmonic but not at other frequency bands, indicating frequency-specific modulation of information communication. However, there was no difference among attention conditions, indicating that lfSSBRs modulate the general attention state much more strongly than they distinguish between attention conditions. This study provides insight into the advantages and mechanism of lfSSBRs. More importantly, it paves a new way to investigate frequency-specific large-scale brain activity. © 2015 Wiley Periodicals, Inc.

  15. Large-Scale Production of Fuel and Feed from Marine Microalgae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huntley, Mark

    2015-09-30

    In summary, this Consortium has demonstrated a fully integrated process for the production of biofuels and high-value nutritional bioproducts at pre-commercial scale. We have achieved unprecedented yields of algal oil, and converted the oil to viable fuels. We have demonstrated the potential value of the residual product as a viable feed ingredient for many important animals in the global food supply.

  16. A fiber-optic ice detection system for large-scale wind turbine blades

    NASA Astrophysics Data System (ADS)

    Kim, Dae-gil; Sampath, Umesh; Kim, Hyunjin; Song, Minho

    2017-09-01

    Icing causes substantial problems for the integrity of large-scale wind turbines. In this work, a fiber-optic sensor system for detecting icing with an arrayed waveguide grating is presented. The sensor system detects Fresnel reflections from the ends of the fibers. The transition in Fresnel reflection due to icing gives characteristic intensity variations that distinguish ice, water, and air on the wind turbine blades. The experimental results show that, with the proposed sensor system, the formation of icing conditions and the thickness of ice were identified successfully in real time.
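    The physical basis of the detection is that the normal-incidence Fresnel reflectance at the fiber end face, R = ((n1 - n2) / (n1 + n2))^2, differs for the three media. A quick sketch using typical textbook refractive indices (the sensor's actual indices and geometry will differ):

```python
# Normal-incidence Fresnel reflectance at a silica fiber end face for the
# three media the sensor must distinguish. Indices are typical values.
n_fiber = 1.45
media = {"air": 1.00, "water": 1.33, "ice": 1.31}

reflectance = {name: ((n_fiber - n) / (n_fiber + n)) ** 2
               for name, n in media.items()}
for name, R in reflectance.items():
    print(f"{name}: R = {R:.4f}")
```

    The air-glass interface reflects roughly an order of magnitude more light than the water- or ice-covered face, and ice reflects slightly more than water, which is the intensity contrast the system exploits.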

  17. Imaging detectors and electronics—a view of the future

    NASA Astrophysics Data System (ADS)

    Spieler, Helmuth

    2004-09-01

    Imaging sensors and readout electronics have made tremendous strides in the past two decades. The application of modern semiconductor fabrication techniques and the introduction of customized monolithic integrated circuits have made large-scale imaging systems routine in high-energy physics. This technology is now finding its way into other areas, such as space missions, synchrotron light sources, and medical imaging. I review current developments and discuss the promise and limits of new technologies. Several detector systems are described as examples of future trends. The discussion emphasizes semiconductor detector systems, but I also include recent developments for large-scale superconducting detector arrays.

  18. Fabrication of the HIAD Large-Scale Demonstration Assembly and Upcoming Mission Applications

    NASA Technical Reports Server (NTRS)

    Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; Dinonno, J. M.; Cheatwood, F M.

    2017-01-01

    Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests, culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3-6 m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that they will use a HIAD (10-12 m) as part of their Sensible, Modular, Autonomous Return Technology (SMART) for their upcoming Vulcan rocket. ULA expects that SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15 m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12 m Mars Human-Scale Pathfinder HIAD conceptual design, constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61 m cross-section) and six subscale tori (0.25 m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner. 
In late 2016, the three tori required for the large-scale demonstration assembly were fabricated and then integrated in early 2017. The design includes provisions to add the remaining four tori necessary to complete the assembly of the 12 m Human-Scale Pathfinder HIAD in the event future project funding becomes available. This presentation will discuss the HIAD large-scale demonstration assembly design and fabrication performed in the last year, including the precursor tori development and the partial-stack fabrication. Potential near-term and future 10-15 m HIAD applications will also be discussed.

  20. Thermochemical Processes | Bioenergy | NREL

    Science.gov Websites

    [Website image montage: model catalysts over images of wood chips, liquid gasoline, a gas tanker truck, pipes, and hoses.] Integration, Scale-Up, and Piloting.

  1. Statewide mesoscopic simulation for Wyoming.

    DOT National Transportation Integrated Search

    2013-10-01

    This study developed a mesoscopic simulator capable of representing both city-level and statewide roadway networks. The key feature of such models is the integration of (i) a traffic flow model which is efficient enough to scale to larg...

  2. Iterative algorithms for tridiagonal matrices on a WSI-multiprocessor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gajski, D.D.; Sameh, A.H.; Wisniewski, J.A.

    1982-01-01

    With the rapid advances in semiconductor technology, the construction of Wafer Scale Integration (WSI) multiprocessors consisting of a large number of processors is now feasible. We illustrate the implementation of some basic linear algebra algorithms on such multiprocessors.

  3. A simple scaling approach to produce climate scenarios of local precipitation extremes for the Netherlands

    NASA Astrophysics Data System (ADS)

    Lenderink, Geert; Attema, Jisk

    2015-08-01

    Scenarios of future changes in small-scale precipitation extremes for the Netherlands are presented. These scenarios are based on a new approach whereby changes in precipitation extremes are set proportional to the change in water vapor amount near the surface, as measured by the 2 m dew point temperature. This simple scaling framework allows the integration of information derived from (i) observations, (ii) a new, unprecedentedly large 16-member ensemble of simulations with the regional climate model RACMO2 driven by EC-Earth, and (iii) short-term integrations with the non-hydrostatic model Harmonie. Scaling constants are based on subjective weighting (expert judgement) of the three information sources, taking into account previously published work. In all scenarios, local precipitation extremes increase with warming, yet with broad uncertainty ranges expressing incomplete knowledge of how convective clouds and the atmospheric mesoscale circulation will react to climate change.
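    The proportional-scaling idea above reduces to a one-line relation, P_future = P_observed * (1 + a)^ΔTd, where a is a per-degree scaling constant and ΔTd the projected dew point change. The values below are illustrative: a ≈ 0.07/K corresponds to Clausius-Clapeyron scaling, and rates up to roughly twice that have been reported for sub-daily convective extremes; neither number is taken from this paper:

```python
# Minimal sketch of dew-point scaling of precipitation extremes.
# rate_per_K is the assumed fractional increase per degree of 2 m
# dew point change (illustrative values, see lead-in).
def scale_extreme(p_obs_mm, d_dewpoint_K, rate_per_K=0.07):
    """Scaled future extreme given an observed extreme and a dew point change."""
    return p_obs_mm * (1.0 + rate_per_K) ** d_dewpoint_K

print(round(scale_extreme(30.0, 2.0), 2))         # Clausius-Clapeyron-like rate
print(round(scale_extreme(30.0, 2.0, 0.14), 2))   # super-CC rate
```

    Running the same observed extreme through a low and a high scaling constant is precisely how the broad uncertainty ranges in the scenarios arise.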

  4. Ecological Consequences of Clonal Integration in Plants

    PubMed Central

    Liu, Fenghong; Liu, Jian; Dong, Ming

    2016-01-01

    Clonal plants are widespread throughout the plant kingdom and dominate in diverse habitats. Spatiotemporal heterogeneity of environment is pervasive at multiple scales, even at scales relevant to individual plants. Clonal integration refers to resource translocation and information communication among the ramets of clonal plants. Due to clonal integration, clonal plant species possess a series of peculiar attributes: plasticity in response to local and non-local conditions, labor division with organ specialization for acquiring locally abundant resources, foraging behavior by selective placement of ramets in resource-rich microhabitats, and avoidance of intraclonal competition. Clonal integration has very profound ecological consequences for clonal plants. It allows them to efficiently cope with environmental heterogeneity, by alleviating local resource shortages, buffering environmental stresses and disturbances, influencing competitive ability, increasing invasiveness, and altering species composition and invasibility at the community level. In this paper, we present a comprehensive review of research on the ecological consequences of plant clonal integration based on a large body of literature. We also attempt to propose perspectives for future research. PMID:27446093

  5. Toward a first-principles integrated simulation of tokamak edge plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, C S; Klasky, Scott A; Cummings, Julian

    2008-01-01

    Performance of ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first-principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to deliver such a code integration framework by (1) building new kinetic codes XGC0 and XGC1, which can simulate the edge pedestal buildup; (2) using and improving the existing MHD codes ELITE, M3D-OMP, M3D-MPP and NIMROD, for study of large-scale edge instabilities called Edge Localized Modes (ELMs); and (3) integrating the codes into a framework using cutting-edge computer science technology. Collaborative effort among physics, computer science, and applied mathematics within CPES has created the first working version of the End-to-end Framework for Fusion Integrated Simulation (EFFIS), which can be used to study the pedestal-ELM cycles.

  6. An O(N) and parallel approach to integral problems by a kernel-independent fast multipole method: Application to polarization and magnetization of interacting particles

    NASA Astrophysics Data System (ADS)

    Jiang, Xikai; Li, Jiyuan; Zhao, Xujun; Qin, Jian; Karpeev, Dmitry; Hernandez-Ortiz, Juan; de Pablo, Juan J.; Heinonen, Olle

    2016-08-01

    Large classes of materials systems in physics and engineering are governed by magnetic and electrostatic interactions. Continuum or mesoscale descriptions of such systems can be cast in terms of integral equations, whose direct computational evaluation requires O(N2) operations, where N is the number of unknowns. Such a scaling, which arises from the many-body nature of the relevant Green's function, has precluded wide-spread adoption of integral methods for solution of large-scale scientific and engineering problems. In this work, a parallel computational approach is presented that relies on using scalable open source libraries and utilizes a kernel-independent Fast Multipole Method (FMM) to evaluate the integrals in O(N) operations, with O(N) memory cost, thereby substantially improving the scalability and efficiency of computational integral methods. We demonstrate the accuracy, efficiency, and scalability of our approach in the context of two examples. In the first, we solve a boundary value problem for a ferroelectric/ferromagnetic volume in free space. In the second, we solve an electrostatic problem involving polarizable dielectric bodies in an unbounded dielectric medium. The results from these test cases show that our proposed parallel approach, which is built on a kernel-independent FMM, can enable highly efficient and accurate simulations and allow for considerable flexibility in a broad range of applications.
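
    For contrast with the O(N) method described above, the baseline the paper improves upon is direct pairwise summation, which touches all N(N-1) pairs. The sketch below is that direct O(N^2) evaluation for a toy Coulomb-like kernel (an illustration of the scaling problem, not the paper's kernel-independent FMM):

```python
import math

def direct_potentials(points, charges):
    """phi_i = sum over j != i of q_j / |r_i - r_j|, by direct summation.
    Cost grows as O(N^2) in the number of points; an FMM approximates
    the same sums in O(N) by clustering far-field contributions."""
    n = len(points)
    phi = [0.0] * n
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = math.dist(points[i], points[j])
            phi[i] += charges[j] / r
    return phi

phi = direct_potentials([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
                        [1.0, 1.0, 1.0])
```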

  7. An O(N) and parallel approach to integral problems by a kernel-independent fast multipole method: Application to polarization and magnetization of interacting particles

    DOE PAGES

    Jiang, Xikai; Li, Jiyuan; Zhao, Xujun; ...

    2016-08-10

    Large classes of materials systems in physics and engineering are governed by magnetic and electrostatic interactions. Continuum or mesoscale descriptions of such systems can be cast in terms of integral equations, whose direct computational evaluation requires O(N^2) operations, where N is the number of unknowns. Such a scaling, which arises from the many-body nature of the relevant Green's function, has precluded wide-spread adoption of integral methods for solution of large-scale scientific and engineering problems. In this work, a parallel computational approach is presented that relies on using scalable open source libraries and utilizes a kernel-independent Fast Multipole Method (FMM) to evaluate the integrals in O(N) operations, with O(N) memory cost, thereby substantially improving the scalability and efficiency of computational integral methods. We demonstrate the accuracy, efficiency, and scalability of our approach in the context of two examples. In the first, we solve a boundary value problem for a ferroelectric/ferromagnetic volume in free space. In the second, we solve an electrostatic problem involving polarizable dielectric bodies in an unbounded dielectric medium. The results from these test cases show that our proposed parallel approach, which is built on a kernel-independent FMM, can enable highly efficient and accurate simulations and allow for considerable flexibility in a broad range of applications.

  8. Simulating cyanobacterial phenotypes by integrating flux balance analysis, kinetics, and a light distribution function

    DOE PAGES

    He, Lian; Wu, Stephen G.; Wan, Ni; ...

    2015-12-24

    Genome-scale models (GSMs) are widely used to predict cyanobacterial phenotypes in photobioreactors (PBRs). However, stoichiometric GSMs mainly focus on fluxomes that result in maximal yields. Cyanobacterial metabolism is controlled by both intracellular enzymes and photobioreactor conditions. To connect intracellular and extracellular information and achieve a better understanding of PBR productivities, this study integrates a genome-scale metabolic model of Synechocystis 6803 with growth kinetics, cell movements, and a light distribution function. The hybrid platform not only maps flux dynamics in cells of sub-populations but also predicts overall production titer and rate in PBRs. Analysis of the integrated GSM demonstrates several results. First, cyanobacteria are capable of reaching high biomass concentration (>20 g/L in 21 days) in PBRs without light and CO2 mass transfer limitations. Second, the fluxome in a single cyanobacterium may show stochastic changes due to random cell movements in PBRs. Third, insufficient light due to cell self-shading can activate the oxidative pentose phosphate pathway in subpopulation cells. Fourth, the model indicates that removal of the glycogen synthesis pathway may not improve cyanobacterial bio-production in large-size PBRs, because glycogen can support cell growth in the dark zones. Based on experimental data, the integrated GSM estimates that Synechocystis 6803 in shake flask conditions has a photosynthesis efficiency of ~2.7%. Conclusions: The multiple-scale integrated GSM, which examines both intracellular and extracellular domains, can be used to predict production yield/rate/titer in large-size PBRs. More importantly, genetic engineering strategies predicted by a traditional GSM may work well only in optimal growth conditions. In contrast, the integrated GSM may reveal mutant physiologies in diverse bioreactor conditions, leading to the design of robust strains with high chances of success in industrial settings.
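
    The role of the light distribution function can be illustrated with the simplest common choice, Beer-Lambert attenuation (an assumption for illustration; the paper's actual function and coefficients may differ):

```python
import math

def light_at_depth(i0, depth_m, biomass_g_l, k_water=0.2, k_biomass=150.0):
    """Beer-Lambert attenuation I(z) = I0 * exp(-(k_w + k_x * X) * z).
    k_water (1/m) and k_biomass (an illustrative biomass-specific
    coefficient) are assumed values; dense cultures self-shade,
    creating the dark zones where glycogen can support growth."""
    return i0 * math.exp(-(k_water + k_biomass * biomass_g_l) * depth_m)

surface = light_at_depth(2000.0, 0.0, 20.0)    # full incident light
deep = light_at_depth(2000.0, 0.01, 20.0)      # 1 cm into a 20 g/L culture
```

With these assumed coefficients, light at 20 g/L is essentially extinguished within a centimetre, which is why sub-populations deeper in the reactor see a qualitatively different environment.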

  9. Evaluation of integration methods for hybrid simulation of complex structural systems through collapse

    NASA Astrophysics Data System (ADS)

    Del Carpio R., Maikol; Hashemi, M. Javad; Mosqueda, Gilberto

    2017-10-01

    This study examines the performance of integration methods for hybrid simulation of large and complex structural systems in the context of structural collapse due to seismic excitations. The target application is not necessarily real-time testing, but rather models that involve large-scale physical sub-structures and highly nonlinear numerical models. Four case studies are presented and discussed. In the first case study, the accuracy of integration schemes, including two widely used methods, namely a modified version of the implicit Newmark method with a fixed number of iterations (iterative) and the operator-splitting method (non-iterative), is examined through pure numerical simulations. The second case study presents the results of 10 hybrid simulations repeated with the two aforementioned integration methods considering various time steps and fixed numbers of iterations for the iterative integration method. The physical sub-structure in these tests consists of a single-degree-of-freedom (SDOF) cantilever column with replaceable steel coupons that provides repeatable highly nonlinear behavior, including fracture-type strength and stiffness degradation. In case study three, the implicit Newmark method with a fixed number of iterations is applied for hybrid simulations of a 1:2 scale steel moment frame that includes a relatively complex nonlinear numerical substructure. Lastly, a more complex numerical substructure is considered by constructing a nonlinear computational model of a moment frame coupled to a hybrid model of a 1:2 scale steel gravity frame. The last two case studies are conducted on the same prototype structure, and the selection of time steps and fixed numbers of iterations is closely examined in pre-test simulations. The generated unbalanced forces are used as an index to track the equilibrium error and to predict the accuracy and stability of the simulations.
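
    For orientation, a minimal sketch of the implicit Newmark scheme (average acceleration, beta = 1/4, gamma = 1/2) for a free, linear SDOF oscillator follows. The hybrid-testing variants examined in the study add fixed-iteration equilibrium solves for the nonlinear physical substructure; the linear case below needs no iteration, and all values are invented for the example.

```python
import math

def newmark_free_vibration(m, c, k, u0, v0, dt, n_steps, beta=0.25, gamma=0.5):
    """Implicit Newmark time stepping for m*a + c*v + k*u = 0.
    With beta = 1/4, gamma = 1/2 (average acceleration) the scheme is
    unconditionally stable and non-dissipative for linear systems."""
    u, v = u0, v0
    a = (-c * v - k * u) / m                  # initial acceleration from equilibrium
    for _ in range(n_steps):
        u_pred = u + dt * v + (0.5 - beta) * dt * dt * a
        v_pred = v + (1.0 - gamma) * dt * a
        # solve dynamic equilibrium at the new step for the new acceleration
        a_new = -(c * v_pred + k * u_pred) / (m + gamma * dt * c + beta * dt * dt * k)
        u = u_pred + beta * dt * dt * a_new
        v = v_pred + gamma * dt * a_new
        a = a_new
    return u, v

# Undamped oscillator with a 1 s natural period, stepped over one period:
# the displacement should return (almost exactly) to its initial value.
u, v = newmark_free_vibration(m=1.0, c=0.0, k=(2.0 * math.pi) ** 2,
                              u0=1.0, v0=0.0, dt=0.001, n_steps=1000)
```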

  10. Design of a decentralized reusable research database architecture to support data acquisition in large research projects.

    PubMed

    Iavindrasana, Jimison; Depeursinge, Adrien; Ruch, Patrick; Spahni, Stéphane; Geissbuhler, Antoine; Müller, Henning

    2007-01-01

    The diagnostic and therapeutic processes, as well as the development of new treatments, are hindered by the fragmentation of the information which underlies them. In a multi-institutional research study database, the clinical information system (CIS) contains the primary data input. A substantial share of the budget of large-scale clinical studies is often spent on data creation and maintenance. The objective of this work is to design a decentralized, scalable, reusable database architecture with lower maintenance costs for managing and integrating the distributed heterogeneous data required as the basis for a large-scale research project. Technical and legal aspects are taken into account based on various use case scenarios. The architecture contains 4 layers: data storage and access are decentralized at their production source, a connector acts as a proxy between the CIS and the external world, an information mediator serves as a data access point, and the client side. The proposed design will be implemented inside six clinical centers participating in the @neurIST project as part of a larger system on data integration and reuse for aneurysm treatment.

  11. An examination of the psychometric properties of the community integration questionnaire (CIQ) in spinal cord injury.

    PubMed

    Kratz, Anna L; Chadd, Edmund; Jensen, Mark P; Kehn, Matthew; Kroll, Thilo

    2015-07-01

    To examine the psychometric properties of the Community Integration Questionnaire (CIQ) in large samples of individuals with spinal cord injury (SCI). Longitudinal 12-month survey study. Nation-wide, community dwelling. Adults with SCI: 627 at Time 1, 494 at Time 2. Not applicable. The CIQ is a 15-item measure developed to measure three domains of community integration in individuals with traumatic brain injury: home integration, social integration, and productive activity. SCI consumer input suggested the need for two additional items assessing socializing at home and internet/email activity. Exploratory factor analyses at Time 1 indicated three factors. Time 2 confirmatory factor analysis did not show a good fit of the 3-factor model. CIQ scores were normally distributed and only the Productive subscale demonstrated problems with high (25%) ceiling effects. Internal reliability was acceptable for the Total and Home scales, but low for the Social and Productive activity scales. Validity of the CIQ is suggested by significant differences by sex, age, and wheelchair use. The factor structure of the CIQ was not stable over time. The CIQ may be most useful for assessing home integration, as this is the subscale with the most scale stability and internal reliability. The CIQ may be improved for use in SCI by including items that reflect higher levels of productive functioning, integration across the life span, and home- and internet-based social functioning.

  12. An integrated probe design for measuring food quality in a microwave environment

    NASA Astrophysics Data System (ADS)

    O'Farrell, M.; Sheridan, C.; Lewis, E.; Zhao, W. Z.; Sun, T.; Grattan, K. T. V.

    2007-07-01

    The work presented describes the development of a novel integrated optical sensor system for the simultaneous and online measurement of the colour and temperature of food as it cooks in large-scale microwave and hybrid oven systems. The integrated probe contains two different sensor concepts: one to monitor temperature, based on Fibre Bragg Grating (FBG) technology, and a second to assess meat quality, based on reflection spectroscopy in the visible wavelength range. The combination of the two sensors into a single probe requires careful configuration of the sensor approaches in the creation of an integrated probe design.

  13. The XMM Large Scale Structure Survey

    NASA Astrophysics Data System (ADS)

    Pierre, Marguerite

    2005-10-01

    We propose to complete, by an additional 5 deg2, the XMM-LSS Survey region overlying the Spitzer/SWIRE field. This field already has CFHTLS and Integral coverage, and will encompass about 10 deg2. The resulting multi-wavelength medium-depth survey, which complements XMM and Chandra deep surveys, will provide a unique view of large-scale structure over a wide range of redshift, and will show active galaxies in the full range of environments. The complete coverage by optical and IR surveys provides high-quality photometric redshifts, so that cosmological results can quickly be extracted. In the spirit of a Legacy survey, we will make the raw X-ray data immediately public. Multi-band catalogues and images will also be made available on short time scales.

  14. A unifying framework for systems modeling, control systems design, and system operation

    NASA Technical Reports Server (NTRS)

    Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.

    2005-01-01

    Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition (whether functional, physical, or discipline-based) that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System-of-systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.

  15. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  16. Wafer-Scale Integration of Graphene-based Electronic, Optoelectronic and Electroacoustic Devices

    PubMed Central

    Tian, He; Yang, Yi; Xie, Dan; Cui, Ya-Long; Mi, Wen-Tian; Zhang, Yuegang; Ren, Tian-Ling

    2014-01-01

    By virtue of their superior properties, graphene-based devices have enormous potential to supplement or replace conventional silicon-based devices in various applications. However, the functionality of graphene devices is still limited due to the high cost, low efficiency, and low quality of graphene growth and patterning techniques. We proposed a simple one-step laser scribing fabrication method to integrate wafer-scale high-performance graphene-based in-plane transistors, photodetectors, and loudspeakers. The in-plane graphene transistors have a large on/off ratio up to 5.34, and the graphene photodetector arrays achieve a photoresponsivity as high as 0.32 A/W. The graphene loudspeakers realize wide-band sound generation from 1 to 50 kHz. These results demonstrate that laser-scribed graphene could be used for wafer-scale integration of a variety of graphene-based electronic, optoelectronic and electroacoustic devices. PMID:24398542

  17. Integrated Primary Care Readiness and Behaviors Scale: Development and validation in behavioral health professionals.

    PubMed

    Blaney, Cerissa L; Redding, Colleen A; Paiva, Andrea L; Rossi, Joseph S; Prochaska, James O; Blissmer, Bryan; Burditt, Caitlin T; Nash, Justin M; Bayley, Keri Dotson

    2018-03-01

    Although integrated primary care (IPC) is growing, several barriers remain. Better understanding of behavioral health professionals' (BHPs') readiness for and engagement in IPC behaviors could improve IPC research and training. This study developed measures of IPC behaviors and stage of change. The sample included 319 licensed, practicing BHPs with a range of interests and experience with IPC. Sequential measurement development procedures, with split-half cross-validation were conducted. Exploratory principal components analyses (N = 152) and confirmatory factor analyses (N = 167) yielded a 12-item scale with 2 factors: consultation/practice management (CPM) and intervention/knowledge (IK). A higher-order Integrated Primary Care Behavior Scale (IPCBS) model showed good fit to the data, and excellent internal consistencies. The multivariate analysis of variance (MANOVA) on the IPCBS demonstrated significant large-sized differences across stage and behavior groups. The IPCBS demonstrated good psychometric properties and external validation, advancing research, education, and training for IPC practice. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  18. Climate warming, marine protected areas and the ocean-scale integrity of coral reef ecosystems.

    PubMed

    Graham, Nicholas A J; McClanahan, Tim R; MacNeil, M Aaron; Wilson, Shaun K; Polunin, Nicholas V C; Jennings, Simon; Chabanet, Pascale; Clark, Susan; Spalding, Mark D; Letourneur, Yves; Bigot, Lionel; Galzin, René; Ohman, Marcus C; Garpe, Kajsa C; Edwards, Alasdair J; Sheppard, Charles R C

    2008-08-27

    Coral reefs have emerged as one of the ecosystems most vulnerable to climate variation and change. While the contribution of a warming climate to the loss of live coral cover has been well documented across large spatial and temporal scales, the associated effects on fish have not. Here, we respond to recent and repeated calls to assess the importance of local management in conserving coral reefs in the context of global climate change. Such information is important, as coral reef fish assemblages are the most species-dense vertebrate communities on earth, contributing critical ecosystem functions and providing crucial ecosystem services to human societies in tropical countries. Our assessment of the impacts of the 1998 mass bleaching event on coral cover, reef structural complexity, and reef-associated fishes spans 7 countries, 66 sites and 26 degrees of latitude in the Indian Ocean. Using Bayesian meta-analysis we show that changes in the size structure, diversity and trophic composition of the reef fish community have followed coral declines. Although the ocean-scale integrity of these coral reef ecosystems has been lost, it is positive to see the effects are spatially variable at multiple scales, with impacts and vulnerability affected by geography but not management regime. Existing no-take marine protected areas still support high biomass of fish; however, they had no positive effect on the ecosystem response to large-scale disturbance. This suggests a need for future conservation and management efforts to identify and protect regional refugia, which should be integrated into existing management frameworks and combined with policies to improve system-wide resilience to climate variation and change.

  19. Fault-Tolerant Sequencer Using FPGA-Based Logic Designs for Space Applications

    DTIC Science & Technology

    2013-12-01

    Prototype Board SBU single-bit upset SDK software development kit SDRAM synchronous dynamic random-access memory SEB single-event burnout ... current VHDL VHSIC hardware description language VHSIC very-high-speed integrated circuits VLSI very-large-scale integration VQFP very... transient pulse, called a single-event transient (SET), or even cause permanent damage to the device in the form of a burnout or gate rupture. The SEE

  20. Digital Systems Validation Handbook. Volume 2. Chapter 18. Avionic Data Bus Integration Technology

    DTIC Science & Technology

    1993-11-01

    interaction between a digital data bus and an avionic system. Very Large Scale Integration (VLSI) ICs and multiversion software, which make up digital... 1984, the Sperry Corporation developed a fault-tolerant system which employed multiversion programming, voting, and monitoring for error detection and... formulate all the significant behavior of a system. MULTIVERSION PROGRAMMING. N-version programming. N-VERSION PROGRAMMING. The independent coding of a

  1. Topics in programmable automation. [for materials handling, inspection, and assembly

    NASA Technical Reports Server (NTRS)

    Rosen, C. A.

    1975-01-01

    Topics explored in the development of integrated programmable automation systems include: numerically controlled and computer controlled machining; machine intelligence and the emulation of human-like capabilities; large scale semiconductor integration technology applications; and sensor technology for asynchronous local computation without burdening the executive minicomputer which controls the whole system. The role and development of training aids, and the potential application of these aids to augmented teleoperator systems are discussed.

  2. Calendar of Selected Aeronautical and Space Meetings (Calendrier des Manifestations Aeronautiques et Spatiales (Selection)).

    DTIC Science & Technology

    1983-01-01

    Atmospheric Physics and the Terrestrial Environment 71 09 - Information, Documentation and Computer Science 74 10 - General (multidisciplinary) themes and... March Louisiana (US) Fiber Communication Optical Communications IEEE Fibre Optics Electro-Optics 02-08 7-9 March Baden-Baden VDE-IEEE Specialists... Conference on Very Large Electronic Systems VDE (GE) Scale Integrated Circuits Solid State Devices IEEE Integrated Circuits Engineering Design Fabrication

  3. Integrated Database And Knowledge Base For Genomic Prospective Cohort Study In Tohoku Medical Megabank Toward Personalized Prevention And Medicine.

    PubMed

    Ogishima, Soichi; Takai, Takako; Shimokawa, Kazuro; Nagaie, Satoshi; Tanaka, Hiroshi; Nakaya, Jun

    2015-01-01

    The Tohoku Medical Megabank project is a national project for revitalization of the disaster area in the Tohoku region struck by the Great East Japan Earthquake, and has conducted a large-scale prospective genome-cohort study. Along with the prospective genome-cohort study, we have developed an integrated database and knowledge base which will be a key resource for realizing personalized prevention and medicine.

  4. Visual influence on path integration in darkness indicates a multimodal representation of large-scale space

    PubMed Central

    Tcheang, Lili; Bülthoff, Heinrich H.; Burgess, Neil

    2011-01-01

    Our ability to return to the start of a route recently performed in darkness is thought to reflect path integration of motion-related information. Here we provide evidence that motion-related interoceptive representations (proprioceptive, vestibular, and motor efference copy) combine with visual representations to form a single multimodal representation guiding navigation. We used immersive virtual reality to decouple visual input from motion-related interoception by manipulating the rotation or translation gain of the visual projection. First, participants walked an outbound path with both visual and interoceptive input, and returned to the start in darkness, demonstrating the influences of both visual and interoceptive information in a virtual reality environment. Next, participants adapted to visual rotation gains in the virtual environment, and then performed the path integration task entirely in darkness. Our findings were accurately predicted by a quantitative model in which visual and interoceptive inputs combine into a single multimodal representation guiding navigation, and are incompatible with a model of separate visual and interoceptive influences on action (in which path integration in darkness must rely solely on interoceptive representations). Overall, our findings suggest that a combined multimodal representation guides large-scale navigation, consistent with a role for visual imagery or a cognitive map. PMID:21199934
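
    The single-multimodal-representation account above is commonly formalized as reliability-weighted (maximum-likelihood) cue combination. The sketch below is that generic textbook model, not the authors' fitted model, and the numbers are invented:

```python
def combine_cues(mu_a, var_a, mu_b, var_b):
    """Maximum-likelihood combination of two independent Gaussian cues
    (e.g. visual and interoceptive heading estimates): each cue is
    weighted by its inverse variance, and the combined estimate is
    more reliable (lower variance) than either cue alone."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    mu = w_a * mu_a + (1.0 - w_a) * mu_b
    var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    return mu, var

# Equally reliable cues: the combined estimate splits the difference
mu, var = combine_cues(10.0, 1.0, 20.0, 1.0)
```

Under this model, adapting the visual gain shifts the combined estimate even when the task is later performed in darkness, which is the signature the study tests for.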

  5. Integration and Analysis of Neighbor Discovery and Link Quality Estimation in Wireless Sensor Networks

    PubMed Central

    Radi, Marjan; Dezfouli, Behnam; Abu Bakar, Kamalrulnizam; Abd Razak, Shukor

    2014-01-01

    Network connectivity and link quality information are the fundamental requirements of wireless sensor network protocols to perform their desired functionality. Most of the existing discovery protocols have only focused on the neighbor discovery problem, while only a few of them provide integrated neighbor search and link estimation. As these protocols require careful parameter adjustment before network deployment, they cannot provide scalable and accurate network initialization in large-scale dense wireless sensor networks with random topology. Furthermore, the performance of these protocols has not been thoroughly evaluated yet. In this paper, we perform a comprehensive simulation study on the efficiency of employing adaptive protocols compared to the existing nonadaptive protocols for initializing sensor networks with random topology. In this regard, we propose adaptive network initialization protocols which integrate the initial neighbor discovery with the link quality estimation process to initialize large-scale dense wireless sensor networks without requiring any parameter adjustment before network deployment. To the best of our knowledge, this work is the first attempt to provide a detailed simulation study on the performance of integrated neighbor discovery and link quality estimation protocols for initializing sensor networks. This study can help system designers to determine the most appropriate approach for different applications. PMID:24678277
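
    Link quality estimators of the kind integrated with neighbor discovery here often smooth per-window packet reception ratios with an exponentially weighted moving average, as in several well-known WSN estimators. The window size and smoothing constant below are illustrative assumptions, not this paper's parameters:

```python
def update_link_quality(prev_estimate, received, expected, alpha=0.9):
    """EWMA update of a link's packet reception ratio (PRR) from one
    beacon window: new = alpha * old + (1 - alpha) * window_prr."""
    window_prr = received / expected if expected > 0 else 0.0
    return alpha * prev_estimate + (1.0 - alpha) * window_prr

# A link that suddenly dies: the estimate decays smoothly rather than
# flapping, so routing reacts without overreacting to single losses.
est = 1.0
for _ in range(10):
    est = update_link_quality(est, received=0, expected=10)
```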

  6. Integrated situational awareness for cyber attack detection, analysis, and mitigation

    NASA Astrophysics Data System (ADS)

    Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng

    2012-06-01

    Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.

  7. Breaking of scale invariance in the time dependence of correlation functions in isotropic and homogeneous turbulence

    NASA Astrophysics Data System (ADS)

    Tarpin, Malo; Canet, Léonie; Wschebor, Nicolás

    2018-05-01

    In this paper, we present theoretical results on the statistical properties of stationary, homogeneous, and isotropic turbulence in incompressible flows in three dimensions. Within the framework of the non-perturbative renormalization group, we derive a closed renormalization flow equation for a generic n-point correlation (and response) function for large wave-numbers with respect to the inverse integral scale. The closure is obtained from a controlled expansion and relies on extended symmetries of the Navier-Stokes field theory. It yields the exact leading behavior of the flow equation at large wave-numbers |p_i| and for arbitrary time differences t_i in the stationary state. Furthermore, we obtain the form of the general solution of the corresponding fixed point equation, which yields the analytical form of the leading wave-number and time dependence of n-point correlation functions, for large wave-numbers and both for small t_i and in the limit t_i → ∞. At small t_i, the leading contribution at large wave-numbers is logarithmically equivalent to -α(εL)^(2/3) |Σ_i t_i p_i|^2, where α is a non-universal constant, L is the integral scale, and ε is the mean energy injection rate. For the 2-point function, the (tp)^2 dependence is known to originate from the sweeping effect. The derived formula embodies the generalization of the sweeping effect to n-point correlation functions. At large wave-numbers and large t_i, we show that the t_i^2 dependence in the leading-order contribution crosses over to a |t_i| dependence. The expression of the correlation functions in this regime was not derived before, even for the 2-point function. Both predictions can be tested in direct numerical simulations and in experiments.
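
    Written out in LaTeX, the small-time leading behavior quoted in this abstract (with α a non-universal constant, L the integral scale, and ε the mean injection rate) reads:

```latex
\ln C_n(\{t_i\}, \{\vec p_i\}) \simeq -\alpha\,(\varepsilon L)^{2/3}\,
\Big|\sum_i t_i\,\vec p_i\Big|^{2},
\qquad |\vec p_i|\,L \gg 1, \quad t_i \to 0 .
```

For n = 2 this reduces to the familiar (tp)^2 Gaussian-in-time decorrelation produced by sweeping.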

  8. Organic field effect transistor with ultra high amplification

    NASA Astrophysics Data System (ADS)

    Torricelli, Fabrizio

    2016-09-01

    High-gain transistors are essential for the large-scale circuit integration, high-sensitivity sensors and signal amplification in sensing systems. Unfortunately, organic field-effect transistors show limited gain, usually of the order of tens, because of the large contact resistance and channel-length modulation. Here we show organic transistors fabricated on plastic foils enabling unipolar amplifiers with ultra-gain. The proposed approach is general and opens up new opportunities for ultra-large signal amplification in organic circuits and sensors.

  9. An integrated assessment of location-dependent scaling for microalgae biofuel production facilities

    DOE PAGES

    Coleman, André M.; Abodeely, Jared M.; Skaggs, Richard L.; ...

    2014-06-19

Successful development of a large-scale microalgae-based biofuels industry requires comprehensive analysis and understanding of the feedstock supply chain—from facility siting and design through processing and upgrading of the feedstock to a fuel product. The evolution from pilot-scale production facilities to energy-scale operations presents many multi-disciplinary challenges, including a sustainable supply of water and nutrients, operational and infrastructure logistics, and economic competitiveness with petroleum-based fuels. These challenges are partially addressed by applying the Integrated Assessment Framework (IAF) – an integrated multi-scale modeling, analysis, and data management suite – to address key issues in developing and operating an open-pond microalgae production facility. This is done by analyzing how variability and uncertainty over space and through time affect feedstock production rates, and determining the site-specific “optimum” facility scale to minimize capital and operational expenses. This approach explicitly and systematically assesses the interdependence of biofuel production potential, associated resource requirements, and production system design trade-offs. To provide a baseline analysis, the IAF was applied in this paper to a set of sites in the southeastern U.S. with the potential to cumulatively produce 5 billion gallons per year. Finally, the results indicate costs can be reduced by scaling downstream processing capabilities to fit site-specific growing conditions, available and economically viable resources, and specific microalgal strains.

  10. The Sun in a Drawer

    ERIC Educational Resources Information Center

    Anderson, Bruce

    1975-01-01

Widespread use of solar energy has been held back by the large-scale integration required and by the conservatism of responsible institutions. Rising fuel costs and government promotion may initiate expansion in commercial, environmental, and government establishments, homes, and schools. Difficulties encountered include financing, tax incentives, design,…

  11. Memory reduction through higher level language hardware

    NASA Technical Reports Server (NTRS)

    Kerner, H.; Gellman, L.

    1972-01-01

    Application of large scale integration in computers to reduce size and manufacturing costs and to produce improvements in logic function is discussed. Use of FORTRAN 4 as computer language for this purpose is described. Effectiveness of method in storing information is illustrated.

  12. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1985-01-01

    Developments in programs managed by the Jet Propulsion Laboratory's Office of Telecommunications and Data acquisition are discussed. Space communications, radio antennas, the Deep Space Network, antenna design, Project SETI, seismology, coding, very large scale integration, downlinking, and demodulation are among the topics covered.

  13. Analyzing CMOS/SOS fabrication for LSI arrays

    NASA Technical Reports Server (NTRS)

    Ipri, A. C.

    1978-01-01

    Report discusses set of design rules that have been developed as result of work with test arrays. Set of optimum dimensions is given that would maximize process output and would correspondingly minimize costs in fabrication of large-scale integration (LSI) arrays.

  14. Aqueous Two-Phase Systems at Large Scale: Challenges and Opportunities.

    PubMed

    Torres-Acosta, Mario A; Mayolo-Deloisa, Karla; González-Valdez, José; Rito-Palomares, Marco

    2018-06-07

    Aqueous two-phase systems (ATPS) have proved to be an efficient and integrative operation to enhance recovery of industrially relevant bioproducts. After ATPS discovery, a variety of works have been published regarding their scaling from 10 to 1000 L. Although ATPS have achieved high recovery and purity yields, there is still a gap between their bench-scale use and potential industrial applications. In this context, this review paper critically analyzes ATPS scale-up strategies to enhance the potential industrial adoption. In particular, large-scale operation considerations, different phase separation procedures, the available optimization techniques (univariate, response surface methodology, and genetic algorithms) to maximize recovery and purity and economic modeling to predict large-scale costs, are discussed. ATPS intensification to increase the amount of sample to process at each system, developing recycling strategies and creating highly efficient predictive models, are still areas of great significance that can be further exploited with the use of high-throughput techniques. Moreover, the development of novel ATPS can maximize their specificity increasing the possibilities for the future industry adoption of ATPS. This review work attempts to present the areas of opportunity to increase ATPS attractiveness at industrial levels. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. A continental-scale hydrology and water quality model for Europe: Calibration and uncertainty of a high-resolution large-scale SWAT model

    NASA Astrophysics Data System (ADS)

    Abbaspour, K. C.; Rouholahnejad, E.; Vaghefi, S.; Srinivasan, R.; Yang, H.; Kløve, B.

    2015-05-01

A combination of driving forces is increasing pressure on local, national, and regional water supplies needed for irrigation, energy production, industrial uses, domestic purposes, and the environment. In many parts of Europe groundwater quantity, and in particular quality, have come under severe degradation and water levels have decreased, resulting in negative environmental impacts. Rapid improvements in the economy of the eastern European block of countries and uncertainties with regard to freshwater availability create challenges for water managers. At the same time, climate change adds a new level of uncertainty with regard to freshwater supplies. In this research we build and calibrate an integrated hydrological model of Europe using the Soil and Water Assessment Tool (SWAT) program. Different components of water resources are simulated and crop yield and water quality are considered at the Hydrological Response Unit (HRU) level. The water resources are quantified at subbasin level with monthly time intervals. Leaching of nitrate into groundwater is also simulated at a finer spatial level (HRU). The use of large-scale, high-resolution water resources models enables consistent and comprehensive examination of integrated system behavior through physically-based, data-driven simulation. In this article we discuss issues with data availability, calibration of large-scale distributed models, and outline procedures for model calibration and uncertainty analysis. The calibrated model and results provide information support to the European Water Framework Directive and lay the basis for further assessment of the impact of climate change on water availability and quality. The approach and methods developed are general and can be applied to any large region around the world.

  16. Quantifying streamflow change caused by forest disturbance at a large spatial scale: A single watershed study

    NASA Astrophysics Data System (ADS)

    Wei, Xiaohua; Zhang, Mingfang

    2010-12-01

Climatic variability and forest disturbance are commonly recognized as two major drivers influencing streamflow change in large-scale forested watersheds. The greatest challenge in evaluating quantitative hydrological effects of forest disturbance is the removal of climatic effect on hydrology. In this paper, a method was designed to quantify respective contributions of large-scale forest disturbance and climatic variability on streamflow using the Willow River watershed (2860 km2) located in the central part of British Columbia, Canada. Long-term (>50 years) data on hydrology, climate, and timber harvesting history represented by equivalent clear-cutting area (ECA) were available to discern climatic and forestry influences on streamflow in three steps. First, effective precipitation, an integrated climatic index, was generated by subtracting evapotranspiration from precipitation. Second, modified double mass curves were developed by plotting accumulated annual streamflow against accumulated annual effective precipitation, which presented a much clearer picture of the cumulative effects of forest disturbance on streamflow following removal of climatic influence. The average annual streamflow changes that were attributed to forest disturbances and climatic variability were then estimated to be +58.7 and -72.4 mm, respectively. The positive (increasing) and negative (decreasing) values in streamflow change indicated opposite change directions, which suggest an offsetting effect between forest disturbance and climatic variability in the study watershed. Finally, a multivariate Autoregressive Integrated Moving Average (ARIMA) model was generated to establish quantitative relationships between accumulated annual streamflow deviation attributed to forest disturbances and annual ECA. The model was then used to project streamflow change under various timber harvesting scenarios.
The methodology can be effectively applied to any large-scale single watershed where long-term data (>50 years) are available.
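The first two steps of this method can be sketched compactly. The following is a minimal illustration (all data hypothetical, and a regression through the origin stands in for whatever line-fitting the authors used) of effective precipitation and the modified double-mass-curve deviation that isolates the disturbance signal:

```python
# Sketch of the modified double-mass-curve attribution described above:
# accumulated annual streamflow is plotted against accumulated annual
# effective precipitation (precipitation minus evapotranspiration), and
# departures from the line fitted over an undisturbed calibration period
# are read as the forest-disturbance signal. All data are hypothetical.

def effective_precipitation(precip, evapotrans):
    """Annual effective precipitation: P - ET (same units, e.g. mm)."""
    return [p - et for p, et in zip(precip, evapotrans)]

def accumulate(series):
    """Running (cumulative) sum of an annual series."""
    total, out = 0.0, []
    for v in series:
        total += v
        out.append(total)
    return out

def streamflow_deviation(flow, eff_precip, calib_years):
    """Departure of accumulated flow from the regression-through-the-origin
    line fitted over the first `calib_years` (pre-disturbance) years.
    Positive values suggest disturbance-driven streamflow increases."""
    acc_q, acc_p = accumulate(flow), accumulate(eff_precip)
    num = sum(acc_q[i] * acc_p[i] for i in range(calib_years))
    den = sum(acc_p[i] ** 2 for i in range(calib_years))
    slope = num / den
    return [q - slope * p for q, p in zip(acc_q, acc_p)]
```

In the paper's third step, the resulting deviation series (not raw streamflow) is what the ARIMA model relates to annual ECA.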

  17. Large scale air pollution estimation method combining land use regression and chemical transport modeling in a geostatistical framework.

    PubMed

    Akita, Yasuyuki; Baldasano, Jose M; Beelen, Rob; Cirach, Marta; de Hoogh, Kees; Hoek, Gerard; Nieuwenhuijsen, Mark; Serre, Marc L; de Nazelle, Audrey

    2014-04-15

    In recognition that intraurban exposure gradients may be as large as between-city variations, recent air pollution epidemiologic studies have become increasingly interested in capturing within-city exposure gradients. In addition, because of the rapidly accumulating health data, recent studies also need to handle large study populations distributed over large geographic domains. Even though several modeling approaches have been introduced, a consistent modeling framework capturing within-city exposure variability and applicable to large geographic domains is still missing. To address these needs, we proposed a modeling framework based on the Bayesian Maximum Entropy method that integrates monitoring data and outputs from existing air quality models based on Land Use Regression (LUR) and Chemical Transport Models (CTM). The framework was applied to estimate the yearly average NO2 concentrations over the region of Catalunya in Spain. By jointly accounting for the global scale variability in the concentration from the output of CTM and the intraurban scale variability through LUR model output, the proposed framework outperformed more conventional approaches.

  18. A new large-scale manufacturing platform for complex biopharmaceuticals.

    PubMed

    Vogel, Jens H; Nguyen, Huong; Giovannini, Roberto; Ignowski, Jolene; Garger, Steve; Salgotra, Anil; Tom, Jennifer

    2012-12-01

    Complex biopharmaceuticals, such as recombinant blood coagulation factors, are addressing critical medical needs and represent a growing multibillion-dollar market. For commercial manufacturing of such, sometimes inherently unstable, molecules it is important to minimize product residence time in non-ideal milieu in order to obtain acceptable yields and consistently high product quality. Continuous perfusion cell culture allows minimization of residence time in the bioreactor, but also brings unique challenges in product recovery, which requires innovative solutions. In order to maximize yield, process efficiency, facility and equipment utilization, we have developed, scaled-up and successfully implemented a new integrated manufacturing platform in commercial scale. This platform consists of a (semi-)continuous cell separation process based on a disposable flow path and integrated with the upstream perfusion operation, followed by membrane chromatography on large-scale adsorber capsules in rapid cycling mode. Implementation of the platform at commercial scale for a new product candidate led to a yield improvement of 40% compared to the conventional process technology, while product quality has been shown to be more consistently high. Over 1,000,000 L of cell culture harvest have been processed with 100% success rate to date, demonstrating the robustness of the new platform process in GMP manufacturing. While membrane chromatography is well established for polishing in flow-through mode, this is its first commercial-scale application for bind/elute chromatography in the biopharmaceutical industry and demonstrates its potential in particular for manufacturing of potent, low-dose biopharmaceuticals. Copyright © 2012 Wiley Periodicals, Inc.

  19. Radiation-Hardened Wafer Scale Integration

    DTIC Science & Technology

    1989-10-25

unlimited. LEXINGTON MASSACHUSETTS EXECUTIVE SUMMARY A focal plane processor (FPP) for a large array of LWIR photodetectors on a space platform must...It seems certain that large, scanning LWIR arrays will once again be of interest in the future, though their specifications will differ from those... nonuniformity and defects in the ZMR material, but films of good quality produced by this technique are now available commercially from Kopin Corporation. Such

  20. Quantum chaos inside black holes

    NASA Astrophysics Data System (ADS)

    Addazi, Andrea

    2017-06-01

    We show how semiclassical black holes can be reinterpreted as an effective geometry, composed of a large ensemble of horizonless naked singularities (eventually smoothed at the Planck scale). We call these new items frizzy-balls, which can be rigorously defined by Euclidean path integral approach. This leads to interesting implications about information paradoxes. We demonstrate that infalling information will chaotically propagate inside this system before going to the full quantum gravity regime (Planck scale).

  1. Teaching Scales in the Climate System: An example of interdisciplinary teaching and learning

    NASA Astrophysics Data System (ADS)

    Baehr, Johanna; Behrens, Jörn; Brüggemann, Michael; Frisius, Thomas; Glessmer, Mirjam S.; Hartmann, Jens; Hense, Inga; Kaleschke, Lars; Kutzbach, Lars; Rödder, Simone; Scheffran, Jürgen

    2016-04-01

Climate change is commonly regarded as one of the 21st century's grand challenges, one that needs to be addressed by conducting integrated research combining the natural and social sciences. To meet this need, how best to train future climate researchers should be reconsidered. Here, we present our experience from a team-taught semester-long course with students of the international master program "Integrated Climate System Sciences" (ICSS) at the University of Hamburg, Germany. Ten lecturers with different backgrounds in the physical, mathematical, biogeochemical and social sciences, accompanied by a researcher trained in didactics, prepared and regularly participated in a course consisting of weekly classes. The foundation of the course was the use of the concept of 'scales' - climate varying on different temporal and spatial scales - by developing a joint definition of 'scales in the climate system' that is applicable in the natural sciences and in the social sciences. By applying this interdisciplinary definition of 'scales' to phenomena from all components of the climate system and the socio-economic dimensions, we aimed for an integrated description of the climate system. Following the concept of research-driven teaching and learning and using a variety of teaching techniques, the students designed their own scale diagram to illustrate climate-related phenomena in different disciplines. The highlight of the course was the presentation of individually developed scale diagrams by every student with all lecturers present. Based on the course as first taught, we are currently re-designing the course concept so that it can be taught by a similarly large group of lecturers, but with alternating presence in class. With further refinement, and with documentation of the teaching material currently ongoing, we will continue to use the concept of 'scales' as a vehicle for teaching an integrated view of the climate system.

  2. InfoSymbiotics/DDDAS - The power of Dynamic Data Driven Applications Systems for New Capabilities in Environmental -, Geo-, and Space- Sciences

    NASA Astrophysics Data System (ADS)

    Darema, F.

    2016-12-01

InfoSymbiotics/DDDAS embodies the power of Dynamic Data Driven Applications Systems (DDDAS), a concept whereby an executing application model is dynamically integrated, in a feed-back loop, with the real-time data-acquisition and control components, as well as other data sources of the application system. Advanced capabilities can be created through such new computational approaches in modeling and simulations, and in instrumentation methods, and include: enhancing the accuracy of the application model; speeding up the computation to allow faster and more comprehensive models of a system, and creating decision support systems with the accuracy of full-scale simulations. In addition, the notion of controlling instrumentation processes by the executing application results in more efficient management of application data and addresses challenges of how to architect and dynamically manage large sets of heterogeneous sensors and controllers, an advance over the static and ad-hoc ways of today - with DDDAS these sets of resources can be managed adaptively and in optimized ways. Large-Scale-Dynamic-Data encompasses the next wave of Big Data, namely dynamic data arising from ubiquitous sensing and control in engineered, natural, and societal systems, through multitudes of heterogeneous sensors and controllers instrumenting these systems, and where opportunities and challenges at these "large-scales" relate not only to data size but also to the heterogeneity in data, data collection modalities, fidelities, and timescales, ranging from real-time data to archival data. In tandem with this important dimension of dynamic data, there is an extended view of Big Computing, which includes the collective computing by networked assemblies of multitudes of sensors and controllers, ranging from the high-end to the real-time, seamlessly integrated and unified, and comprising Large-Scale-Big-Computing.
InfoSymbiotics/DDDAS engenders transformative impact in many application domains, ranging from the nano-scale to the terra-scale and to the extra-terra-scale. The talk will address opportunities for new capabilities together with corresponding research challenges, with illustrative examples from several application areas including environmental sciences, geosciences, and space sciences.

  3. Improving National Water Modeling: An Intercomparison of two High-Resolution, Continental Scale Models, CONUS-ParFlow and the National Water Model

    NASA Astrophysics Data System (ADS)

    Tijerina, D.; Gochis, D.; Condon, L. E.; Maxwell, R. M.

    2017-12-01

Development of integrated hydrology modeling systems that couple atmospheric, land surface, and subsurface flow is a growing trend in hydrologic modeling. Using an integrated modeling framework, subsurface hydrologic processes, such as lateral flow and soil moisture redistribution, are represented in a single cohesive framework with surface processes like overland flow and evapotranspiration. There is a need for these more intricate models in comprehensive hydrologic forecasting and water management over large spatial areas, specifically the Continental US (CONUS). Currently, two high-resolution, coupled hydrologic modeling applications have been developed for this domain: CONUS-ParFlow, built using the integrated hydrologic model ParFlow, and the National Water Model, which uses the NCAR Weather Research and Forecasting hydrological extension package (WRF-Hydro). Both ParFlow and WRF-Hydro include land surface models and overland flow, and take advantage of parallelization and high-performance computing (HPC) capabilities; however, they have different approaches to coupling overland and subsurface flow and to groundwater-surface water interactions. Accurately representing large domains remains a challenge considering the difficult task of representing complex hydrologic processes, computational expense, and extensive data needs; both models have accomplished this, but they differ in approach and remain difficult to validate. Further exploration of effective methodology to accurately represent large-scale hydrology with integrated models is needed to advance this growing field. Here we compare the outputs of CONUS-ParFlow and the National Water Model to each other and with observations to study the performance of hyper-resolution models over large domains. Models were compared over a range of scales for major watersheds within the CONUS, with a specific focus on the Mississippi, Ohio, and Colorado River basins.
We use a novel set of approaches and analyses for this comparison to better understand differences in process representation and bias. This intercomparison is a step toward a better understanding of how much water we have and of the interactions between the surface and subsurface. Our goal is to advance our understanding and simulation of the hydrologic system and ultimately improve hydrologic forecasts.

  4. Micron-scale lens array having diffracting structures

    DOEpatents

    Goldberg, Kenneth A

    2013-10-29

    A novel micron-scale lens, a microlens, is engineered to concentrate light efficiently onto an area of interest, such as a small, light-sensitive detector element in an integrated electronic device. Existing microlens designs imitate the form of large-scale lenses and are less effective at small sizes. The microlenses described herein have been designed to accommodate diffraction effects, which dominate the behavior of light at small length scales. Thus a new class of light-concentrating optical elements with much higher relative performance has been created. Furthermore, the new designs are much easier to fabricate than previous designs.

  5. Removal of two large-scale cosmic microwave background anomalies after subtraction of the integrated Sachs-Wolfe effect

    NASA Astrophysics Data System (ADS)

    Rassat, A.; Starck, J.-L.; Dupé, F.-X.

    2013-09-01

    Context. Although there is currently a debate over the significance of the claimed large-scale anomalies in the cosmic microwave background (CMB), their existence is not totally dismissed. In parallel to the debate over their statistical significance, recent work has also focussed on masks and secondary anisotropies as potential sources of these anomalies. Aims: In this work we investigate simultaneously the impact of the method used to account for masked regions as well as the impact of the integrated Sachs-Wolfe (ISW) effect, which is the large-scale secondary anisotropy most likely to affect the CMB anomalies. In this sense, our work is an update of previous works. Our aim is to identify trends in CMB data from different years and with different mask treatments. Methods: We reconstruct the ISW signal due to 2 Micron All-Sky Survey (2MASS) and NRAO VLA Sky Survey (NVSS) galaxies, effectively reconstructing the low-redshift ISW signal out to z ~ 1. We account for regions of missing data using the sparse inpainting technique. We test sparse inpainting of the CMB, large scale structure and ISW and find that it constitutes a bias-free reconstruction method suitable to study large-scale statistical isotropy and the ISW effect. Results: We focus on three large-scale CMB anomalies: the low quadrupole, the quadrupole/octopole alignment, and the octopole planarity. After sparse inpainting, the low quadrupole becomes more anomalous, whilst the quadrupole/octopole alignment becomes less anomalous. The significance of the low quadrupole is unchanged after subtraction of the ISW effect, while the trend amongst the CMB maps is that both the low quadrupole and the quadrupole/octopole alignment have reduced significance, yet other hypotheses remain possible as well (e.g. exotic physics). Our results also suggest that both of these anomalies may be due to the quadrupole alone. 
The octopole planarity significance is also reduced after inpainting and after ISW subtraction; however, we do not find that it was very anomalous to start with. In the spirit of participating in reproducible research, we make public all codes and resulting products which constitute the main results of this paper here: http://www.cosmostat.org/anomaliesCMB.html

  6. The influence of super-horizon scales on cosmological observables generated during inflation

    NASA Astrophysics Data System (ADS)

    Matarrese, Sabino; Musso, Marcello A.; Riotto, Antonio

    2004-05-01

    Using the techniques of out-of-equilibrium field theory, we study the influence on properties of cosmological perturbations generated during inflation on observable scales coming from fluctuations corresponding today to scales much bigger than the present Hubble radius. We write the effective action for the coarse grained inflaton perturbations, integrating out the sub-horizon modes, which manifest themselves as a coloured noise and lead to memory effects. Using the simple model of a scalar field with cubic self-interactions evolving in a fixed de Sitter background, we evaluate the two- and three-point correlation function on observable scales. Our basic procedure shows that perturbations do preserve some memory of the super-horizon scale dynamics, in the form of scale dependent imprints in the statistical moments. In particular, we find a blue tilt of the power spectrum on large scales, in agreement with the recent results of the WMAP collaboration which show a suppression of the lower multipoles in the cosmic microwave background anisotropies, and a substantial enhancement of the intrinsic non-Gaussianity on large scales.

  7. Compact component for integrated quantum optic processing

    PubMed Central

    Sahu, Partha Pratim

    2015-01-01

Quantum interference is indispensable for integrated quantum optic technologies (1–2). For further progress in the large-scale integration of quantum optic circuits, we have introduced for the first time the two-mode interference (TMI) coupler as an ultra-compact component. The quantum interference varying with coupling length, corresponding to the coupling ratio, is studied, and a large HOM dip with peak visibility ~0.963 ± 0.009 is found at half the coupling length of the TMI coupler. Our results also demonstrate complex quantum interference with high fabrication tolerance and quantum visibility in the TMI coupler. PMID:26584759

  8. A new uniformly valid asymptotic integration algorithm for elasto-plastic creep and unified viscoplastic theories including continuum damage

    NASA Technical Reports Server (NTRS)

    Chulya, Abhisak; Walker, Kevin P.

    1991-01-01

    A new scheme to integrate a system of stiff differential equations for both the elasto-plastic creep and the unified viscoplastic theories is presented. The method has high stability, allows large time increments, and is implicit and iterative. It is suitable for use with continuum damage theories. The scheme was incorporated into MARC, a commercial finite element code through a user subroutine called HYPELA. Results from numerical problems under complex loading histories are presented for both small and large scale analysis. To demonstrate the scheme's accuracy and efficiency, comparisons to a self-adaptive forward Euler method are made.
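The abstract does not give the update formulas, so the following is an illustration only: a minimal contrast, on a scalar stiff test equation rather than the paper's elasto-plastic equations, between an explicit forward Euler step and an implicit step solved iteratively, showing why implicit, iterative schemes of this kind tolerate large time increments:

```python
# Contrast on the scalar stiff test equation y' = -k*y (not the paper's
# constitutive equations): explicit forward Euler is unstable once the
# time increment exceeds 2/k, while an implicit backward Euler step,
# solved iteratively by Newton's method, remains stable for large
# increments. Illustrative sketch only.

def forward_euler(f, y0, dt, steps):
    y = y0
    for _ in range(steps):
        y = y + dt * f(y)
    return y

def backward_euler(f, y0, dt, steps, iters=20, h=1e-8):
    """Each step solves g(z) = z - y - dt*f(z) = 0 by Newton iteration
    with a numerical derivative: implicit and iterative."""
    y = y0
    for _ in range(steps):
        z = y  # initial guess: the previous value
        for _ in range(iters):
            g = z - y - dt * f(z)
            gp = 1.0 - dt * (f(z + h) - f(z)) / h
            z = z - g / gp
        y = z
    return y
```

With k = 1000 and dt = 0.01 (ten times the explicit stability limit), forward Euler diverges while backward Euler decays toward the exact solution.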

  9. Consortium biology in immunology: the perspective from the Immunological Genome Project.

    PubMed

    Benoist, Christophe; Lanier, Lewis; Merad, Miriam; Mathis, Diane

    2012-10-01

    Although the field has a long collaborative tradition, immunology has made less use than genetics of 'consortium biology', wherein groups of investigators together tackle large integrated questions or problems. However, immunology is naturally suited to large-scale integrative and systems-level approaches, owing to the multicellular and adaptive nature of the cells it encompasses. Here, we discuss the value and drawbacks of this organization of research, in the context of the long-running 'big science' debate, and consider the opportunities that may exist for the immunology community. We position this analysis in light of our own experience, both positive and negative, as participants of the Immunological Genome Project.

  10. A new uniformly valid asymptotic integration algorithm for elasto-plastic-creep and unified viscoplastic theories including continuum damage

    NASA Technical Reports Server (NTRS)

    Chulya, A.; Walker, K. P.

    1989-01-01

    A new scheme to integrate a system of stiff differential equations for both the elasto-plastic creep and the unified viscoplastic theories is presented. The method has high stability, allows large time increments, and is implicit and iterative. It is suitable for use with continuum damage theories. The scheme was incorporated into MARC, a commercial finite element code through a user subroutine called HYPELA. Results from numerical problems under complex loading histories are presented for both small and large scale analysis. To demonstrate the scheme's accuracy and efficiency, comparisons to a self-adaptive forward Euler method are made.

  11. Calculating Second-Order Effects in MOSFET's

    NASA Technical Reports Server (NTRS)

    Benumof, Reuben; Zoutendyk, John A.; Coss, James R.

    1990-01-01

    Collection of mathematical models includes second-order effects in n-channel, enhancement-mode, metal-oxide-semiconductor field-effect transistors (MOSFET's). When dimensions of circuit elements relatively large, effects neglected safely. However, as very-large-scale integration of microelectronic circuits leads to MOSFET's shorter or narrower than 2 micrometer, effects become significant in design and operation. Such computer programs as widely-used "Simulation Program With Integrated Circuit Emphasis, Version 2" (SPICE 2) include many of these effects. In second-order models of n-channel, enhancement-mode MOSFET, first-order gate-depletion region diminished by triangular-cross-section deletions on end and augmented by circular-wedge-cross-section bulges on sides.

  12. Multi-time-scale hydroclimate dynamics of a regional watershed and links to large-scale atmospheric circulation: Application to the Seine river catchment, France

    NASA Astrophysics Data System (ADS)

    Massei, N.; Dieppois, B.; Hannah, D. M.; Lavers, D. A.; Fossa, M.; Laignel, B.; Debret, M.

    2017-03-01

    In the present context of global changes, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability or provide future scenarios of water resources. In the aim of a better understanding of hydrological changes, it is of crucial importance to determine how and to what extent trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating correlation between large and local scales, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP) in order to gain additional insights on the atmospheric patterns associated with the regional hydrology. We hypothesized that: (i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and (ii) definition of those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the links between large and local scales were not necessarily constant according to time-scale (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach, which integrated discrete wavelet multiresolution analysis for reconstructing monthly regional hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector). 
This approach consisted of three steps: (1) decomposing the large-scale climate and hydrological signals (SLP field, precipitation or streamflow) using discrete wavelet multiresolution analysis; (2) generating a statistical downscaling model per time-scale; and (3) summing all scale-dependent models to obtain a final reconstruction of the predictand. The results revealed a significant improvement of the reconstructions for both precipitation and streamflow when using the multiresolution ESD model instead of basic ESD. In particular, the multiresolution ESD model handled very well the significant changes in variance through time observed in both precipitation and streamflow. For instance, the post-1980 period, characterized by particularly high amplitudes of interannual-to-interdecadal variability associated with alternating flood and extremely low-flow/drought periods (e.g., winter/spring 2001, summer 2003), could not be reconstructed without integrating wavelet multiresolution analysis into the model. In accordance with previous studies, the wavelet components detected in SLP, precipitation and streamflow on interannual to interdecadal time-scales could be interpreted in terms of the influence of the Gulf Stream oceanic front on atmospheric circulation.
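The three-step scheme described above (decompose per scale, fit one downscaling model per scale, sum the scale-wise reconstructions) can be sketched in pure Python. This is an illustrative toy, not the authors' model: it uses a Haar multiresolution decomposition, a single scalar predictor series in place of the SLP field, and ordinary least squares per scale; the function names are invented, and the series length is assumed to be a power of two.

```python
def haar_mra(x):
    """Additive Haar multiresolution decomposition: returns detail components
    at each dyadic scale plus the final smooth, which sum back to x.
    Assumes len(x) is a power of two."""
    n = len(x)
    components = []
    smooth = list(x)
    step = 1
    while step * 2 <= n:
        new_smooth = []
        for i in range(0, n, step * 2):
            block = smooth[i:i + step * 2]
            mean = sum(block) / len(block)
            new_smooth.extend([mean] * len(block))
        # Detail at this scale = what the coarser smooth loses.
        components.append([s - ns for s, ns in zip(smooth, new_smooth)])
        smooth = new_smooth
        step *= 2
    components.append(smooth)  # coarsest approximation
    return components

def fit_slope(x, y):
    """Ordinary least-squares slope and intercept for one predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx if sxx else 0.0
    return b, my - b * mx

def multiresolution_esd(predictor, predictand):
    """Step 1: decompose both series; step 2: one regression per scale;
    step 3: sum the scale-dependent reconstructions."""
    px, py = haar_mra(predictor), haar_mra(predictand)
    recon = [0.0] * len(predictand)
    for cx, cy in zip(px, py):
        b, a = fit_slope(cx, cy)
        recon = [r + a + b * v for r, v in zip(recon, cx)]
    return recon
```

In the real method the per-scale models relate a hydrological series to a large-scale SLP field rather than to one scalar series; the additive structure of the decomposition is what makes step 3 a simple sum.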

  13. Varying the forcing scale in low Prandtl number dynamos

    NASA Astrophysics Data System (ADS)

    Brandenburg, A.; Haugen, N. E. L.; Li, Xiang-Yu; Subramanian, K.

    2018-06-01

    Small-scale dynamos are expected to operate in all astrophysical fluids that are turbulent and electrically conducting, for example the interstellar medium, stellar interiors, and accretion disks, where they may also be affected by, or compete with, large-scale dynamos. However, the possibility of small-scale dynamos being excited at small and intermediate ratios of viscosity to magnetic diffusivity (the magnetic Prandtl number) has been debated, and the possibility that they depend on the large-scale forcing wavenumber has been raised. Here we show, using four values of the forcing wavenumber, that the small-scale dynamo does not depend on the scale separation between the size of the simulation domain and the integral scale of the turbulence, i.e., the forcing scale. Moreover, the spectral bottleneck in turbulence, which has been implicated in raising the excitation conditions of small-scale dynamos, is found to be invariant under changes of the forcing wavenumber. However, when forcing at the lowest few wavenumbers, the effective forcing wavenumber that enters the definition of the magnetic Reynolds number is found to be about twice the minimum wavenumber of the domain. Our work is relevant to future studies of small-scale dynamos, several applications of which are discussed.

  14. Describing Ecosystem Complexity through Integrated Catchment Modeling

    NASA Astrophysics Data System (ADS)

    Shope, C. L.; Tenhunen, J. D.; Peiffer, S.

    2011-12-01

    Land use and climate change have been implicated in reduced ecosystem services (i.e., high-quality water yield, biodiversity, and agricultural yield). The prediction of ecosystem services expected under future land use decisions and changing climate conditions has become increasingly important. Complex policy and management decisions require the integration of physical, economic, and social data over several scales to assess effects on water resources and ecology. Field-based meteorology, hydrology, soil physics, plant production, solute and sediment transport, economic, and social behavior data were measured in a South Korean catchment. A variety of models are being used to simulate plot- and field-scale experiments within the catchment. Results from each of the local-scale models identify sensitive local-scale parameters, which are then used as inputs into a large-scale watershed model. We used the spatially distributed SWAT model to synthesize the experimental field data throughout the catchment. The premise of our study was that the range in local-scale model parameter results can be used to define the sensitivity and uncertainty in the large-scale watershed model. Further, this example shows how research can be structured for scientific results describing complex ecosystems and landscapes where cross-disciplinary linkages benefit the end result. The field-based and modeling framework described is being used to develop scenarios to examine spatial and temporal changes in land use practices and climatic effects on water quantity, water quality, and sediment transport. Development of accurate modeling scenarios requires understanding the social relationship between individual and policy-driven land management practices and the value of sustainable resources to all stakeholders.

  15. Biological production models as elements of coupled, atmosphere-ocean models for climate research

    NASA Technical Reports Server (NTRS)

    Platt, Trevor; Sathyendranath, Shubha

    1991-01-01

    Process models of phytoplankton production are discussed with respect to their suitability for incorporation into global-scale numerical ocean circulation models. Exact solutions are given for integrals over the mixed layer and over the day of analytic, wavelength-independent models of primary production. Within this class of model, the bias incurred by using a triangular approximation (rather than a sinusoidal one) to the variation of surface irradiance through the day is computed. Efficient computation algorithms are given for the nonspectral models. More exact calculations require a spectrally sensitive treatment. Such models exist but must be integrated numerically over depth and time. For these integrations, resolution in wavelength, depth, and time is considered and recommendations are made for efficient computation. The extrapolation of the one-(spatial)-dimension treatment to large horizontal scales is discussed.
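The triangular-versus-sinusoidal bias mentioned above can be illustrated numerically. This is a hedged sketch, not the paper's analytic solutions: it assumes a saturating tanh-form P-I curve and compares the daily-integrated production under the two idealized irradiance shapes with the same noon maximum; the parameter names (pm, ik) and values are invented for illustration.

```python
import math

def daily_production(irradiance, pm=1.0, ik=0.5, steps=1000):
    """Trapezoidal integral of P(I(t)) over a normalized day t in [0, 1],
    with a saturating P-I curve P = pm * tanh(I / ik)."""
    total = 0.0
    for k in range(steps):
        t0, t1 = k / steps, (k + 1) / steps
        p0 = pm * math.tanh(irradiance(t0) / ik)
        p1 = pm * math.tanh(irradiance(t1) / ik)
        total += 0.5 * (p0 + p1) / steps
    return total

# Two idealized surface-irradiance shapes with the same noon maximum:
sin_irr = lambda t: math.sin(math.pi * t)        # sinusoidal day
tri_irr = lambda t: 1.0 - abs(2.0 * t - 1.0)     # triangular approximation

# Relative bias of the triangular approximation (negative: it underestimates,
# since the triangle lies below the sinusoid at every instant of the day).
bias = daily_production(tri_irr) / daily_production(sin_irr) - 1.0
```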

  16. Experiences Integrating Transmission and Distribution Simulations for DERs with the Integrated Grid Modeling System (IGMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan; Hale, Elaine; Hodge, Bri-Mathias

    2016-08-11

    This paper discusses the development of, approaches for, experiences with, and some results from a large-scale, high-performance-computer-based (HPC-based) co-simulation of electric power transmission and distribution systems using the Integrated Grid Modeling System (IGMS). IGMS was developed at the National Renewable Energy Laboratory (NREL) as a novel Independent System Operator (ISO)-to-appliance scale electric power system modeling platform that combines off-the-shelf tools to simultaneously model 100s to 1000s of distribution systems in co-simulation with detailed ISO markets, transmission power flows, and AGC-level reserve deployment. Lessons learned from the co-simulation architecture development are shared, along with a case study that explores the reactive power impacts of PV inverter voltage support on the bulk power system.

  17. Kingsbury Bay-Grassy Point habitat restoration project: A Health Impact Assessment-oral presentation

    EPA Science Inventory

    Undertaking large-scale aquatic habitat restoration projects in prominent waterfront locations, such as city parks, provides an opportunity to both improve ecological integrity and enhance community well-being. However, to consider both opportunities simultaneously, a community-b...

  18. Computer-aided design of large-scale integrated circuits - A concept

    NASA Technical Reports Server (NTRS)

    Schansman, T. T.

    1971-01-01

    Circuit design and the mask development sequence are improved by using a general-purpose computer with interactive graphics capability, establishing an efficient two-way communications link between the design engineer and the system. The interactive graphics capability places the design engineer in direct control of circuit development.

  19. Incorporation of DNA barcoding into a large-scale biomonitoring program: opportunities and pitfalls

    EPA Science Inventory

    Taxonomic identification of benthic macroinvertebrates is critical to protocols used to assess the biological integrity of aquatic ecosystems. The time, expense, and inherent error rate of species-level morphological identifications has necessitated use of genus- or family-level ...

  20. ROADNET: A Real-time Data Aware System for Earth, Oceanographic, and Environmental Applications

    NASA Astrophysics Data System (ADS)

    Vernon, F.; Hansen, T.; Lindquist, K.; Ludascher, B.; Orcutt, J.; Rajasekar, A.

    2003-12-01

    The Real-time Observatories, Application, and Data management Network (ROADNet) Program aims to develop an integrated, seamless, and transparent environmental information network that will deliver geophysical, oceanographic, hydrological, ecological, and physical data to a variety of users in real time. ROADNet is a multidisciplinary, multinational partnership of researchers, policymakers, natural resource managers, educators, and students who aim to use the data to advance our understanding and management of coastal, ocean, riparian, and terrestrial Earth systems in Southern California, Mexico, and well offshore. To date, project activity and funding have focused on the design and deployment of network linkages and on the exploratory development of the real-time data management system. We are currently adapting powerful "Data Grid" technologies to the unique challenges associated with the management and manipulation of real-time data. Current "Grid" projects deal with static data files, and significant technical innovation is required to address fundamental problems of real-time data processing, integration, and distribution. The technologies developed through this research will create a system that dynamically adapts downstream processing, cataloging, and data access interfaces when sensors are added to or removed from the system; provides for real-time processing and monitoring of data streams, detecting events and triggering computations, sensor and logger modifications, and other actions; integrates heterogeneous data from multiple (signal) domains; and provides for large-scale archival and querying of "consolidated" data. The software tools that must be developed do not yet exist, although limited prototype systems are available. 
This research has implications for the success of large-scale NSF initiatives in the Earth sciences (EarthScope), ocean sciences (OOI- Ocean Observatories Initiative), biological sciences (NEON - National Ecological Observatory Network) and civil engineering (NEES - Network for Earthquake Engineering Simulation). Each of these large scale initiatives aims to collect real-time data from thousands of sensors, and each will require new technologies to process, manage, and communicate real-time multidisciplinary environmental data on regional, national, and global scales.

  1. WikiPEATia - a web based platform for assembling peatland data through ‘crowd sourcing’

    NASA Astrophysics Data System (ADS)

    Wisser, D.; Glidden, S.; Fieseher, C.; Treat, C. C.; Routhier, M.; Frolking, S. E.

    2009-12-01

    The Earth System Science community is realizing that peatlands are an important and unique terrestrial ecosystem that has not yet been well integrated into large-scale earth system analyses. A major hurdle is the lack of accessible geospatial data on peatland distribution, coupled with data on peatland properties (e.g., vegetation composition, peat depth, basal dates, soil chemistry, peatland class) at the global scale. These data, however, are available at the local scale. Although a comprehensive global database on peatlands probably lags similar data on more economically important ecosystems such as forests, grasslands, and croplands, a large amount of field data have been collected over the past several decades. A few efforts have been made to map peatlands at large scales, but the existing data either have not been assembled into a single, publicly accessible geospatial database or do not depict data with the level of detail needed by the Earth System Science community. A global peatland database would contribute to advances in a number of research fields such as hydrology, vegetation and ecosystem modeling, permafrost modeling, and earth system modeling. We present a Web 2.0 approach that uses state-of-the-art webserver and innovative online mapping technologies and is designed to create such a global database through ‘crowd-sourcing’. Primary functions of the online system include form-driven textual user input of peatland research metadata, spatial data input of peatland areas via a mapping interface, database editing and querying capabilities, as well as advanced visualization and data analysis tools. WikiPEATia provides an integrated information technology platform for assembling, integrating, and posting peatland-related geospatial datasets, facilitating and encouraging research community involvement. A successful effort will make existing peatland data much more useful to the research community, and will help to identify significant data gaps.

  2. Large Scale Software Building with CMake in ATLAS

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.

  3. Zooming into local active galactic nuclei: the power of combining SDSS-IV MaNGA with higher resolution integral field unit observations

    NASA Astrophysics Data System (ADS)

    Wylezalek, Dominika; Schnorr Müller, Allan; Zakamska, Nadia L.; Storchi-Bergmann, Thaisa; Greene, Jenny E.; Müller-Sánchez, Francisco; Kelly, Michael; Liu, Guilin; Law, David R.; Barrera-Ballesteros, Jorge K.; Riffel, Rogemar A.; Thomas, Daniel

    2017-05-01

    Ionized gas outflows driven by active galactic nuclei (AGN) are ubiquitous in high-luminosity AGN with outflow speeds apparently correlated with the total bolometric luminosity of the AGN. This empirical relation and theoretical work suggest that in the range Lbol ˜ 1043-45 erg s-1 there must exist a threshold luminosity above which the AGN becomes powerful enough to launch winds that will be able to escape the galaxy potential. In this paper, we present pilot observations of two AGN in this transitional range that were taken with the Gemini North Multi-Object Spectrograph integral field unit (IFU). Both sources have also previously been observed within the Sloan Digital Sky Survey-IV (SDSS) Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) survey. While the MaNGA IFU maps probe the gas fields on galaxy-wide scales and show that some regions are dominated by AGN ionization, the new Gemini IFU data zoom into the centre with four times better spatial resolution. In the object with the lower Lbol we find evidence of a young or stalled biconical AGN-driven outflow where none was obvious at the MaNGA resolution. In the object with the higher Lbol we trace the large-scale biconical outflow into the nuclear region and connect the outflow from small to large scales. These observations suggest that AGN luminosity and galaxy potential are crucial in shaping wind launching and propagation in low-luminosity AGN. The transition from small and young outflows to galaxy-wide feedback can only be understood by combining large-scale IFU data that trace the galaxy velocity field with higher resolution, small-scale IFU maps.

  4. Deformation of leaky-dielectric fluid globules under strong electric fields: Boundary layers and jets at large Reynolds numbers

    NASA Astrophysics Data System (ADS)

    Schnitzer, Ory; Frankel, Itzchak; Yariv, Ehud

    2013-11-01

    In Taylor's theory of electrohydrodynamic drop deformation (Proc. R. Soc. Lond. A, vol. 291, 1966, pp. 159-166), inertia is neglected at the outset, resulting in a fluid velocity that scales as the square of the applied-field magnitude. For large drops, with increasing field strength the Reynolds number predicted by this scaling may actually become large, suggesting the need for a complementary large-Reynolds-number investigation. Balancing viscous stresses and electrical shear forces in this limit reveals a different velocity scaling, with the 4/3-power of the applied-field magnitude. We focus here on the flow over a gas bubble. It is essentially confined to two boundary layers propagating from the poles to the equator, where they collide to form a radial jet. At leading order in the Capillary number, the bubble deforms due to (i) Maxwell stresses; (ii) the hydrodynamic boundary-layer pressure associated with centripetal acceleration; and (iii) the intense pressure distribution acting over the narrow equatorial deflection zone, appearing as a concentrated load. Remarkably, the unique flow topology and associated scalings allow us to obtain a closed-form expression for this deformation through application of integral mass and momentum balances. On the bubble scale, the concentrated pressure load is manifested in the appearance of a non-smooth equatorial dimple.
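The 4/3-power scaling quoted above can be rationalized by a standard boundary-layer estimate. The following is a schematic reconstruction, not the authors' derivation; the symbols (u for the surface velocity scale, a for the bubble radius, mu and nu for dynamic and kinematic viscosity, epsilon for permittivity, E for the applied-field magnitude) are the conventional ones assumed here:

```latex
% Tangential Maxwell stress balanced against viscous shear in a
% boundary layer of thickness \delta \sim (\nu a / u)^{1/2}:
\mu \frac{u}{\delta} \sim \epsilon E^{2}
\quad\Longrightarrow\quad
\mu\, u \left(\frac{u}{\nu a}\right)^{1/2} \sim \epsilon E^{2}
\quad\Longrightarrow\quad
u \sim \left(\frac{\epsilon E^{2}}{\mu}\right)^{2/3} (\nu a)^{1/3}
\;\propto\; E^{4/3}.
```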

  5. Advanced Packaging for VLSI/VHSIC (Very Large Scale Integrated Circuits/Very High Speed Integrated Circuits) Applications: Electrical, Thermal, and Mechanical Considerations - An IR&D Report.

    DTIC Science & Technology

    1987-11-01

    Design guidelines have been developed that can be used by circuit engineers to extract the maximum performance from the devices on various board technologies, including multilayer ceramic. Topics covered include attenuation and dispersion effects and the skin effect.

  6. Ultrasonically Absorptive Coatings for Hypersonic

    DTIC Science & Technology

    2008-05-13

    The program aims to integrate UAC and TPS functions. To aid in the design of UAC with regular microstructure to be tested in the CUBRC LENS I tunnel, parametric studies of the UAC-LFC interaction were conducted. The effort is approaching the large-scale demonstration stage in the CUBRC LENS tunnel, as well as fabrication of ceramic UAC samples integrated into TPS.

  7. Integrated analysis of remote sensing products from basic geological surveys. [Brazil

    NASA Technical Reports Server (NTRS)

    Dasilvafagundesfilho, E. (Principal Investigator)

    1984-01-01

    Recent advances in remote sensing have led to the development of several techniques for obtaining image information. These techniques are analyzed as effective tools in geological mapping. A strategy for optimizing the use of images in basic geological surveying is presented. It embraces an integrated analysis of spatial, spectral, and temporal data through photoptic (color additive viewer) and computer processing at different scales, allowing large areas to be surveyed in a fast, precise, and low-cost manner.

  8. VizieR Online Data Catalog: Orion Integral Filament ALMA+IRAM30m N2H+(1-0) data (Hacar+, 2018)

    NASA Astrophysics Data System (ADS)

    Hacar, A.; Tafalla, M.; Forbrich, J.; Alves, J.; Meingast, S.; Grossschedl, J.; Teixeira, P. S.

    2018-01-01

    Combined ALMA+IRAM30m large-scale N2H+(1-0) emission in the Orion ISF. Two datasets are presented here in FITS format: (1) the full data cube, with a spectral resolution of 0.1 km s-1; and (2) the total integrated line intensity (moment 0) map. Units are Jy/beam. See also: https://sites.google.com/site/orion4dproject/home (2 data files).

  9. Activity-based protein profiling for biochemical pathway discovery in cancer

    PubMed Central

    Nomura, Daniel K.; Dix, Melissa M.; Cravatt, Benjamin F.

    2011-01-01

    Large-scale profiling methods have uncovered numerous gene and protein expression changes that correlate with tumorigenesis. However, determining the relevance of these expression changes and which biochemical pathways they affect has been hindered by our incomplete understanding of the proteome and its myriad functions and modes of regulation. Activity-based profiling platforms enable both the discovery of cancer-relevant enzymes and selective pharmacological probes to perturb and characterize these proteins in tumour cells. When integrated with other large-scale profiling methods, activity-based proteomics can provide insight into the metabolic and signalling pathways that support cancer pathogenesis and illuminate new strategies for disease diagnosis and treatment. PMID:20703252

  10. Sensory integration balance training in patients with multiple sclerosis: A randomized, controlled trial.

    PubMed

    Gandolfi, Marialuisa; Munari, Daniele; Geroin, Christian; Gajofatto, Alberto; Benedetti, Maria Donata; Midiri, Alessandro; Carla, Fontana; Picelli, Alessandro; Waldner, Andreas; Smania, Nicola

    2015-10-01

    Impaired sensory integration contributes to balance disorders in patients with multiple sclerosis (MS). The objective of this paper is to compare the effects of sensory integration balance training and conventional rehabilitation on balance disorders, perceived balance confidence, quality of life, fatigue, frequency of falls, and sensory integration processing in a large sample of patients with MS. This single-blind, randomized, controlled trial involved 80 outpatients with MS (EDSS: 1.5-6.0) and subjective symptoms of balance disorders. The experimental group (n = 39) received specific training to improve central integration of afferent sensory inputs; the control group (n = 41) received conventional rehabilitation (15 treatment sessions of 50 minutes each). Before treatment, after treatment, and at one month post-treatment, patients were evaluated by a blinded rater using the Berg Balance Scale (BBS), Activities-specific Balance Confidence Scale (ABC), Multiple Sclerosis Quality of Life-54, Fatigue Severity Scale (FSS), number of falls, and the Sensory Organization Balance Test (SOT). The experimental training produced greater improvements than the control training on the BBS (p < 0.001), the FSS (p < 0.002), number of falls (p = 0.002) and SOT (p < 0.05). Specific training to improve central integration of afferent sensory inputs may ameliorate balance disorders in patients with MS. Clinical Trial Registration (NCT01040117). © The Author(s), 2015.

  11. Evaluation of the WRF model for precipitation downscaling on orographic complex islands

    NASA Astrophysics Data System (ADS)

    Díaz, Juan P.; González, Albano; Expósito, Francisco; Pérez, Juan C.

    2010-05-01

    General Circulation Models (GCMs) have proven to be an effective tool to simulate many aspects of large-scale and global climate. However, their applicability to climate impact studies is limited by their ability to resolve regional-scale situations. In this sense, dynamical downscaling techniques are an appropriate alternative for estimating high-resolution regional climatologies. In this work, the Weather Research and Forecasting model (WRF) has been used to simulate precipitation over the Canary Islands region during 2009. The precipitation patterns over the Canary Islands, located in the North Atlantic, show large gradients over a relatively small geographical area due to large-scale factors, such as the Trade Winds regime predominant in the area, and mesoscale factors, mainly due to the complex terrain. A sensitivity study of simulated WRF precipitation to variations in model setup and parameterizations was carried out. WRF experiments were performed using two-way nesting at 3 km horizontal grid spacing and 28 vertical levels in the Canaries inner domain. The initial and lateral and lower boundary conditions for the outer domain were provided at 6-hourly intervals by NCEP FNL (Final) Operational Global Analysis data at 1.0x1.0 degree resolution, interpolated onto the WRF model grid. Numerous model options have been tested, including different microphysics schemes, cumulus parameterizations and nudging configurations. The positive-definite moisture advection condition was also checked. Two integration approaches were analyzed: a 1-year continuous long-term integration and consecutive short-term monthly reinitialized integrations. To assess the accuracy of our simulations, model results are compared against observational datasets obtained from a network of meteorological stations in the region. 
In general, we can observe that the regional model is able to reproduce the spatial distribution of precipitation, but overestimates rainfall, mainly during strong precipitation events.

  12. Dopaminergic modulation of hemodynamic signal variability and the functional connectome during cognitive performance.

    PubMed

    Alavash, Mohsen; Lim, Sung-Joo; Thiel, Christiane; Sehm, Bernhard; Deserno, Lorenz; Obleser, Jonas

    2018-05-15

    Dopamine underlies important aspects of cognition and has been suggested to boost cognitive performance. However, how dopamine modulates large-scale cortical dynamics during cognitive performance has remained elusive. Using functional MRI during a working memory task in healthy young human listeners, we investigated the effect of levodopa (l-dopa) on two aspects of cortical dynamics: blood oxygen-level-dependent (BOLD) signal variability and the functional connectome of large-scale cortical networks. We show that enhanced dopaminergic signaling modulates these two potentially interrelated aspects of large-scale cortical dynamics during cognitive performance, and that the degree of these modulations explains inter-individual differences in l-dopa-induced behavioral benefits. Relative to placebo, l-dopa increased BOLD signal variability in task-relevant temporal, inferior frontal, parietal and cingulate regions. On the connectome level, however, l-dopa diminished functional integration across temporal and cingulo-opercular regions. This hypo-integration was expressed as a reduction in network efficiency and modularity in more than two thirds of the participants and to different degrees. Hypo-integration co-occurred with relative hyper-connectivity in the paracentral lobule and precuneus, as well as the posterior putamen. Both l-dopa-induced BOLD signal variability modulations and functional connectome modulations proved predictive of an individual's l-dopa-induced benefits in behavioral performance, namely response speed and perceptual sensitivity. Lastly, l-dopa-induced modulations of BOLD signal variability were correlated with l-dopa-induced modulations of nodal connectivity and network efficiency. Our findings underline the role of dopamine in maintaining the dynamic range of, and communication between, cortical systems, and their explanatory power for inter-individual differences in benefits from dopamine during cognitive performance. 
Copyright © 2018 Elsevier Inc. All rights reserved.

  13. Fast dimension reduction and integrative clustering of multi-omics data using low-rank approximation: application to cancer molecular classification.

    PubMed

    Wu, Dingming; Wang, Dongfang; Zhang, Michael Q; Gu, Jin

    2015-12-01

    One major goal of large-scale cancer omics studies is to identify molecular subtypes for more accurate cancer diagnoses and treatments. To deal with high-dimensional cancer multi-omics data, a promising strategy is to find an effective low-dimensional subspace of the original data and then cluster cancer samples in the reduced subspace. However, due to data-type diversity and big data volume, few methods can integratively and efficiently find the principal low-dimensional manifold of high-dimensional cancer multi-omics data. In this study, we proposed a novel low-rank-approximation-based integrative probabilistic model to rapidly find the shared principal subspace across multiple data types: the convexity of the low-rank regularized likelihood function of the probabilistic model ensures efficient and stable model fitting. Candidate molecular subtypes can be identified by unsupervised clustering of hundreds of cancer samples in the reduced low-dimensional subspace. On testing datasets, our method LRAcluster (low-rank approximation based multi-omics data clustering) runs much faster with better clustering performance than the existing method. We then applied LRAcluster to large-scale cancer multi-omics data from TCGA. The pan-cancer analysis results show that cancers of different tissue origins are generally grouped as independent clusters, except squamous-like carcinomas, while the single-cancer-type analyses suggest that the omics data have different subtyping abilities for different cancer types. LRAcluster is a very useful method for fast dimension reduction and unsupervised clustering of large-scale multi-omics data. LRAcluster is implemented in R and freely available via http://bioinfo.au.tsinghua.edu.cn/software/lracluster/.
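The core idea, projecting samples from several stacked omics data types into one shared low-rank subspace and then clustering there, can be sketched in pure Python. This toy is not the released LRAcluster implementation: it replaces the probabilistic low-rank model with a plain leading principal component obtained by power iteration, and the crude two-way split, function names, and data are invented for illustration.

```python
import random

def centered_gram(views):
    """Stack several (features x samples) matrices and accumulate the sample
    Gram matrix of row-centered features, pooling evidence across data types."""
    n = len(views[0][0])
    g = [[0.0] * n for _ in range(n)]
    for x in views:
        for row in x:
            m = sum(row) / n
            c = [v - m for v in row]
            for i in range(n):
                for j in range(n):
                    g[i][j] += c[i] * c[j]
    return g

def leading_coordinate(g, iters=200):
    """Dominant eigenvector of the symmetric Gram matrix by power iteration:
    a 1-D shared coordinate for every sample."""
    random.seed(0)
    v = [random.random() for _ in g]
    for _ in range(iters):
        w = [sum(gi * vi for gi, vi in zip(row, v)) for row in g]
        norm = sum(x * x for x in w) ** 0.5 or 1.0
        v = [x / norm for x in w]
    return v

def two_clusters(views):
    """Crude two-way subtype split: sign of the leading shared coordinate."""
    coord = leading_coordinate(centered_gram(views))
    return [0 if c < 0.0 else 1 for c in coord]
```

A real analysis would keep several components and run a proper clustering algorithm in the reduced subspace; the sign split merely shows how clusters become separable once the shared subspace is found.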

  14. On-chip synthesis of circularly polarized emission of light with integrated photonic circuits.

    PubMed

    He, Li; Li, Mo

    2014-05-01

    The helicity of circularly polarized (CP) light plays an important role in the light-matter interaction in magnetic and quantum material systems. Exploiting CP light in integrated photonic circuits could lead to on-chip integration of novel optical helicity-dependent devices for applications ranging from spintronics to quantum optics. In this Letter, we demonstrate a silicon photonic circuit coupled with a 2D grating emitter operating at a telecom wavelength to synthesize vertically emitting CP light from a quasi-TE waveguide mode. The handedness of the emitted circularly polarized light can be thermally controlled with an integrated microheater. The compact device footprint enables a small beam diameter, which is desirable for large-scale integration.

  15. A large scale software system for simulation and design optimization of mechanical systems

    NASA Technical Reports Server (NTRS)

    Dopker, Bernhard; Haug, Edward J.

    1989-01-01

    The concept of an advanced integrated, networked simulation and design system is outlined. Such an advanced system can be developed utilizing existing codes without compromising the integrity and functionality of the system. An example has been used to demonstrate the applicability of the concept of the integrated system outlined here. The development of an integrated system can be done incrementally. Initial capabilities can be developed and implemented without having a detailed design of the global system. Only a conceptual global system must exist. For a fully integrated, user friendly design system, further research is needed in the areas of engineering data bases, distributed data bases, and advanced user interface design.

  16. The Potential of Text Mining in Data Integration and Network Biology for Plant Research: A Case Study on Arabidopsis

    PubMed Central

    Van Landeghem, Sofie; De Bodt, Stefanie; Drebert, Zuzanna J.; Inzé, Dirk; Van de Peer, Yves

    2013-01-01

    Despite the availability of various data repositories for plant research, a wealth of information currently remains hidden within the biomolecular literature. Text mining provides the necessary means to retrieve these data through automated processing of texts. However, only recently has advanced text mining methodology been implemented with sufficient computational power to process texts at a large scale. In this study, we assess the potential of large-scale text mining for plant biology research in general and for network biology in particular using a state-of-the-art text mining system applied to all PubMed abstracts and PubMed Central full texts. We present extensive evaluation of the textual data for Arabidopsis thaliana, assessing the overall accuracy of this new resource for usage in plant network analyses. Furthermore, we combine text mining information with both protein–protein and regulatory interactions from experimental databases. Clusters of tightly connected genes are delineated from the resulting network, illustrating how such an integrative approach is essential to grasp the current knowledge available for Arabidopsis and to uncover gene information through guilt by association. All large-scale data sets, as well as the manually curated textual data, are made publicly available, hereby stimulating the application of text mining data in future plant biology studies. PMID:23532071

  17. Multimodal integration of fMRI and EEG data for high spatial and temporal resolution analysis of brain networks

    PubMed Central

    Mantini, D.; Marzetti, L.; Corbetta, M.; Romani, G.L.; Del Gratta, C.

    2017-01-01

    Two major non-invasive brain mapping techniques, electroencephalography (EEG) and functional magnetic resonance imaging (fMRI), have complementary advantages with regard to their spatial and temporal resolution. We propose an approach based on the integration of EEG and fMRI, enabling the EEG temporal dynamics of information processing to be characterized within spatially well-defined fMRI large-scale networks. First, the fMRI data are decomposed into networks by means of spatial independent component analysis (sICA), and those associated with intrinsic activity and/or responding to task performance are selected using information from the related time-courses. Next, the EEG data over all sensors are averaged with respect to event timing, thus calculating event-related potentials (ERPs). The ERPs are subjected to temporal ICA (tICA), and the resulting components are localized with the weighted minimum norm (WMNLS) algorithm using the task-related fMRI networks as priors. Finally, the temporal contribution of each ERP component in the areas belonging to the fMRI large-scale networks is estimated. The proposed approach has been evaluated on visual target detection data. Our results confirm that two different components, commonly observed in EEG when presenting novel and salient stimuli respectively, are related to the neuronal activation in large-scale networks, operating at different latencies and associated with different functional processes. PMID:20052528
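
    The temporal-ICA step in the pipeline above can be illustrated in isolation: unmix sensor-space ERPs into independent component time courses. The ERP waveforms, mixing matrix, and use of scikit-learn's FastICA are illustrative assumptions; the full fMRI-prior source localization is not reproduced here.

```python
# Illustrative sketch of the temporal-ICA (tICA) step only, not the full
# EEG-fMRI integration pipeline: recover independent ERP time courses
# from mixed sensor-space signals.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 500)                 # seconds after stimulus

# Two hypothetical ERP components at different latencies (e.g. an early
# sensory deflection and a later P300-like wave).
s1 = np.exp(-((t - 0.15) ** 2) / 0.002)
s2 = np.exp(-((t - 0.40) ** 2) / 0.008)
S = np.c_[s1, s2]                          # time x true components

# Mix into 8 "sensors" with a random forward model, plus small noise.
A = rng.normal(size=(2, 8))
X = S @ A + 0.01 * rng.normal(size=(500, 8))

# Temporal ICA: treat time points as samples, sensors as mixtures.
ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)               # time x recovered components

# Each recovered component should correlate strongly with one true source
# (sign and order of ICA components are arbitrary, hence abs + max).
corr = np.abs(np.corrcoef(S.T, S_hat.T))[:2, 2:]
print(corr.max(axis=1))
```

    In the actual method, the recovered component time courses would then be localized with the weighted-minimum-norm inverse, constrained by the fMRI networks.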

  18. Neurodevelopmental alterations of large-scale structural networks in children with new-onset epilepsy.

    PubMed

    Bonilha, Leonardo; Tabesh, Ali; Dabbs, Kevin; Hsu, David A; Stafstrom, Carl E; Hermann, Bruce P; Lin, Jack J

    2014-08-01

    Recent neuroimaging and behavioral studies have revealed that children with new onset epilepsy already exhibit brain structural abnormalities and cognitive impairment. How the organization of large-scale brain structural networks is altered near the time of seizure onset and whether network changes are related to cognitive performances remain unclear. Recent studies also suggest that regional brain volume covariance reflects synchronized brain developmental changes. Here, we test the hypothesis that epilepsy during early-life is associated with abnormalities in brain network organization and cognition. We used graph theory to study structural brain networks based on regional volume covariance in 39 children with new-onset seizures and 28 healthy controls. Children with new-onset epilepsy showed a suboptimal topological structural organization with enhanced network segregation and reduced global integration compared with controls. At the regional level, structural reorganization was evident with redistributed nodes from the posterior to more anterior head regions. The epileptic brain network was more vulnerable to targeted but not random attacks. Finally, a subgroup of children with epilepsy, namely those with lower IQ and poorer executive function, had a reduced balance between network segregation and integration. Taken together, the findings suggest that the neurodevelopmental impact of new onset childhood epilepsies alters large-scale brain networks, resulting in greater vulnerability to network failure and cognitive impairment. Copyright © 2014 Wiley Periodicals, Inc.
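
    The covariance-network construction this abstract relies on can be sketched as follows: correlate regional volumes across subjects, threshold into a binary graph, and compute a segregation metric (clustering) and an integration metric (global efficiency). The synthetic volumes, threshold value, and use of NetworkX are illustrative assumptions, not the study's actual pipeline.

```python
# Hedged sketch of covariance-based structural-network analysis.
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)

# Synthetic regional volumes: 40 subjects x 20 regions, with two modules
# of regions that co-vary within themselves.
latent = rng.normal(size=(40, 2))
loadings = np.zeros((2, 20))
loadings[0, :10] = 1.0   # module 1
loadings[1, 10:] = 1.0   # module 2
volumes = latent @ loadings + 0.5 * rng.normal(size=(40, 20))

# Region x region covariance network, thresholded into an adjacency matrix.
R = np.corrcoef(volumes.T)
np.fill_diagonal(R, 0.0)
A = (np.abs(R) > 0.4).astype(int)

G = nx.from_numpy_array(A)
segregation = nx.average_clustering(G)     # local interconnectedness
integration = nx.global_efficiency(G)      # inverse-path-length measure
print(round(segregation, 3), round(integration, 3))
```

    A shift toward higher segregation with lower integration, as reported for the new-onset epilepsy group, would show up as a rise in the first metric relative to the second.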

  19. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    NASA Astrophysics Data System (ADS)

    Steiakakis, Chrysanthos; Agioutantis, Zacharias; Apostolou, Evangelia; Papavgeri, Georgia; Tripolitsiotis, Achilles

    2016-01-01

    The geotechnical challenges for safe slope design in large-scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden-to-ore ratio and therefore dramatically improve the economics of the operation, while large-scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions. This paper presents an integrated data management system that was developed over a number of years, and illustrates its advantages through a specific application. The presented case study shows how the high-production slopes of a mine that exceed depths of 100-120 m were successfully mined with an average displacement rate of 10-20 mm/day, approaching a slow to moderate landslide velocity. Monitoring data from the past four years are included in the database and can be analyzed to produce valuable results. Time-series correlations of movements, precipitation records, etc. are evaluated and presented in this case study. The results can be used to successfully manage mine operations and ensure the safety of the mine and the workforce.
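
    The kind of time-series correlation described (displacement rate versus precipitation records) can be sketched with a simple lag scan. The synthetic data, 5-day response delay, and creep rate used here are illustrative assumptions, not the mine's actual records.

```python
# Illustrative lag-correlation of slope displacement against precipitation,
# of the kind the integrated monitoring database supports.
import numpy as np

rng = np.random.default_rng(3)
days = 365
precip = rng.gamma(0.3, 8.0, size=days)          # mm/day, spiky rainfall

# Assume displacement responds to rainfall with a ~5-day delay on top of a
# background creep rate of ~15 mm/day (within the 10-20 mm/day range cited).
displacement = 15.0 + 0.4 * np.roll(precip, 5) + rng.normal(0, 0.5, days)

# Scan candidate lags and report the one with the highest correlation.
def corr_at_lag(lag):
    return np.corrcoef(precip[:days - lag], displacement[lag:])[0, 1]

best = max(range(15), key=corr_at_lag)
print(best, round(corr_at_lag(best), 2))
```

    Recovering the response delay from monitoring data in this way is one route to the "timely and informed decisions" the system is meant to support.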

  20. Thermal Testing and Integration: Magnetospheric MultiScale (MMS) Observatories with Digital 1-Wire Sensors

    NASA Technical Reports Server (NTRS)

    Solimani, Jason A.; Rosanova, Santino

    2015-01-01

    Thermocouples require two thin wires to be routed out of the spacecraft to connect to the ground support equipment used to monitor and record the temperature data. This large number of wires exiting the observatory complicates integration and creates an undesirable heat path during testing. These wires need to be characterized as a thermal short that will not exist during flight. To minimize complexity and reduce thermal variables from these ground support equipment (GSE) wires, MMS pursued a hybrid path for temperature monitoring, utilizing thermocouples and digital 1-wire temperature sensors. Digital 1-wire sensors can greatly reduce harness mass, length and complexity, as they can be spliced together. For MMS, 350 digital 1-wire sensors were installed on the spacecraft with only 18 wires exiting, as opposed to a potential 700 thermocouple wires. Digital 1-wire sensors had not been used on such a large scale at NASA GSFC prior to the MMS mission. During MMS thermal vacuum testing, a lessons-learned matrix was formulated that will assist future integration of 1-wire sensors into thermal testing, and one day into flight.
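
    The wire savings come from each 1-Wire device returning an addressed digital reading over a shared bus. As a hedged illustration, the parser below handles the DS18B20-style readout format exposed by the Linux w1 kernel driver; MMS's actual GSE readout software is not documented here, so this format is an assumption for illustration only.

```python
# Parse a DS18B20-style 1-Wire temperature readout (Linux w1_slave format):
# line 1 ends in "YES" if the on-sensor CRC check passed,
# line 2 carries the temperature in millidegrees C after "t=".
def parse_w1_slave(text):
    """Return temperature in deg C, or None if the reading is invalid."""
    lines = text.strip().splitlines()
    if len(lines) != 2 or not lines[0].endswith("YES"):
        return None                       # CRC failed or malformed reading
    _, _, millideg = lines[1].rpartition("t=")
    return int(millideg) / 1000.0

sample = ("6e 01 4b 46 7f ff 02 10 57 : crc=57 YES\n"
          "6e 01 4b 46 7f ff 02 10 57 t=22875\n")
print(parse_w1_slave(sample))             # 22.875 deg C
```

    Because every sensor carries its own 64-bit address and CRC, hundreds of such devices can share one bus pair, which is exactly the harness reduction MMS exploited.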

  1. A technique for estimating the probability of radiation-stimulated failures of integrated microcircuits in low-intensity radiation fields: Application to the Spektr-R spacecraft

    NASA Astrophysics Data System (ADS)

    Popov, V. D.; Khamidullina, N. M.

    2006-10-01

    In developing radio-electronic devices (RED) for spacecraft operating in the fields of ionizing radiation in space, one of the most important problems is the correct estimation of their radiation tolerance. The “weakest link” in the element base of onboard microelectronic devices under radiation exposure is the integrated microcircuit (IMC), especially those with large-scale (LSI) and very-large-scale (VLSI) degrees of integration. The main characteristic of an IMC, which is taken into account when deciding whether to use a particular type of IMC in onboard RED, is the probability of non-failure operation (NFO) at the end of the spacecraft’s lifetime. It should be noted that, until now, the NFO has been calculated only from reliability characteristics, disregarding radiation effects. This paper presents the so-called “reliability” approach to determining the radiation tolerance of IMC, which allows one to estimate the probability of non-failure operation of various types of IMC with due account of radiation-stimulated dose failures. The described technique is applied to the RED onboard the Spektr-R spacecraft to be launched in 2007.
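
    The "reliability" approach described combines the conventional NFO with the probability that the accumulated mission dose stays below the part's (statistically scattered) failure dose. The sketch below assumes a lognormal failure-dose distribution and uses made-up numbers; it is not the paper's model or Spektr-R data, only an illustration of the combination.

```python
# Hedged sketch: NFO accounting for radiation-stimulated dose failures.
import math

def nfo_with_radiation(p_reliability, mission_dose, median_fail_dose, sigma):
    """P(no failure) = P_reliability x P(failure dose > accumulated dose),
    with failure dose assumed lognormal (median and log-sigma given)."""
    # Standard-normal CDF via erf, evaluated at the log-dose ratio.
    z = (math.log(mission_dose) - math.log(median_fail_dose)) / sigma
    p_rad = 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return p_reliability * p_rad

# Example (illustrative numbers): 0.999 conventional NFO, 10 krad
# accumulated over the mission, median failure dose 30 krad, 30% log-scatter.
print(round(nfo_with_radiation(0.999, 10.0, 30.0, 0.3), 4))
```

    The key point of the approach survives the simplification: ignoring the radiation term overstates the end-of-life NFO whenever the mission dose is a non-negligible fraction of the failure dose.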

  2. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, the Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and the UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and the Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  3. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses

    PubMed Central

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-01-01

    With the coming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval presents significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tool integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600

  4. Research on precision grinding technology of large scale and ultra thin optics

    NASA Astrophysics Data System (ADS)

    Zhou, Lian; Wei, Qiancai; Li, Jie; Chen, Xianhua; Zhang, Qinghua

    2018-03-01

    The flatness and parallelism errors of large-scale, ultra-thin optics have an important influence on subsequent polishing efficiency and accuracy. In order to realize high-precision grinding of these elements, a low-deformation vacuum chuck was designed first, which was used to clamp the optics with high supporting rigidity over the full aperture. The optics was then plane-ground under vacuum adsorption. After machining, the vacuum system was turned off and the form error of the optics was measured on-machine using a displacement sensor after elastic recovery. The flatness was converged to high accuracy by compensation machining, whose trajectories were generated from the measurement result. To achieve high parallelism, the optics was turned over and compensation-ground using the form error of the vacuum chuck. Finally, a grinding experiment on a large-scale, ultra-thin fused silica optic with an aperture of 430 mm × 430 mm × 10 mm was performed. The best P-V flatness of the optic was below 3 μm, and the parallelism was below 3″. This machining technique has been applied in batch grinding of large-scale, ultra-thin optics.

  5. VisIRR: A Visual Analytics System for Information Retrieval and Recommendation for Large-Scale Document Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choo, Jaegul; Kim, Hannah; Clarkson, Edward

    In this paper, we present an interactive visual information retrieval and recommendation system, called VisIRR, for large-scale document discovery. VisIRR effectively combines the paradigms of (1) a passive pull through query processes for retrieval and (2) an active push that recommends items of potential interest to users based on their preferences. Equipped with an efficient dynamic query interface against a large-scale corpus, VisIRR organizes the retrieved documents into high-level topics and visualizes them in a 2D space, representing the relationships among the topics along with their keyword summary. In addition, based on interactive personalized preference feedback with regard to documents, VisIRR provides document recommendations from the entire corpus, which are beyond the retrieved sets. Such recommended documents are visualized in the same space as the retrieved documents, so that users can seamlessly analyze both existing and newly recommended ones. This article presents novel computational methods, which make these integrated representations and fast interactions possible for a large-scale document corpus. We illustrate how the system works by providing detailed usage scenarios. Finally, we present preliminary user study results for evaluating the effectiveness of the system.

  6. VisIRR: A Visual Analytics System for Information Retrieval and Recommendation for Large-Scale Document Data

    DOE PAGES

    Choo, Jaegul; Kim, Hannah; Clarkson, Edward; ...

    2018-01-31

    In this paper, we present an interactive visual information retrieval and recommendation system, called VisIRR, for large-scale document discovery. VisIRR effectively combines the paradigms of (1) a passive pull through query processes for retrieval and (2) an active push that recommends items of potential interest to users based on their preferences. Equipped with an efficient dynamic query interface against a large-scale corpus, VisIRR organizes the retrieved documents into high-level topics and visualizes them in a 2D space, representing the relationships among the topics along with their keyword summary. In addition, based on interactive personalized preference feedback with regard to documents, VisIRR provides document recommendations from the entire corpus, which are beyond the retrieved sets. Such recommended documents are visualized in the same space as the retrieved documents, so that users can seamlessly analyze both existing and newly recommended ones. This article presents novel computational methods, which make these integrated representations and fast interactions possible for a large-scale document corpus. We illustrate how the system works by providing detailed usage scenarios. Finally, we present preliminary user study results for evaluating the effectiveness of the system.

  7. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    PubMed

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    With the coming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval presents significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tool integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. New Markets for Solar Photovoltaic Power Systems

    NASA Astrophysics Data System (ADS)

    Thomas, Chacko; Jennings, Philip; Singh, Dilawar

    2007-10-01

    Over the past five years solar photovoltaic (PV) power supply systems have matured and are now being deployed on a much larger scale. The traditional small-scale remote area power supply systems are still important and village electrification is also a large and growing market but large scale, grid-connected systems and building integrated systems are now being deployed in many countries. This growth has been aided by imaginative government policies in several countries and the overall result is a growth rate of over 40% per annum in the sales of PV systems. Optimistic forecasts are being made about the future of PV power as a major source of sustainable energy. Plans are now being formulated by the IEA for very large-scale PV installations of more than 100 MW peak output. The Australian Government has announced a subsidy for a large solar photovoltaic power station of 154 MW in Victoria, based on the concentrator technology developed in Australia. In Western Australia a proposal has been submitted to the State Government for a 2 MW photovoltaic power system to provide fringe of grid support at Perenjori. This paper outlines the technologies, designs, management and policies that underpin these exciting developments in solar PV power.

  9. Concepts: Integrating population survey data from different spatial scales, sampling methods, and species

    USGS Publications Warehouse

    Dorazio, Robert; Delampady, Mohan; Dey, Soumen; Gopalaswamy, Arjun M.; Karanth, K. Ullas; Nichols, James D.

    2017-01-01

    Conservationists and managers are continually under pressure from the public, the media, and political policy makers to provide “tiger numbers,” not just for protected reserves, but also for large spatial scales, including landscapes, regions, states, nations, and even globally. Estimating the abundance of tigers within relatively small areas (e.g., protected reserves) is becoming increasingly tractable (see Chaps. 9 and 10), but doing so for larger spatial scales still presents a formidable challenge. Those who seek “tiger numbers” are often not satisfied by estimates of tiger occupancy alone, regardless of the reliability of the estimates (see Chaps. 4 and 5). As a result, wherever tiger conservation efforts are underway, either substantially or nominally, scientists and managers are frequently asked to provide putative large-scale tiger numbers based either on a total count or on an extrapolation of some sort (see Chaps. 1 and 2).

  10. Scale Up of Malonic Acid Fermentation Process: Cooperative Research and Development Final Report, CRADA Number CRD-16-612

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schell, Daniel J

    The goal of this work is to use the large fermentation vessels in the National Renewable Energy Laboratory's (NREL) Integrated Biorefinery Research Facility (IBRF) to scale up Lygos' biological process for producing malonic acid and to generate performance data. Initially, work at the 1-L scale validated successful transfer of Lygos' fermentation protocols to NREL using a glucose substrate. Outside the scope of the CRADA with NREL, Lygos tested their process on lignocellulosic sugars produced by NREL at Lawrence Berkeley National Laboratory's (LBNL) Advanced Biofuels Process Development Unit (ABPDU). NREL produced these cellulosic sugar solutions from corn stover using a separate cellulose/hemicellulose process configuration. Finally, NREL performed fermentations using glucose in large fermentors (1,500- and 9,000-L vessels) to produce intermediate product and to demonstrate successful performance of Lygos' technology at larger scales.

  11. Large-Scale Cubic-Scaling Random Phase Approximation Correlation Energy Calculations Using a Gaussian Basis.

    PubMed

    Wilhelm, Jan; Seewald, Patrick; Del Ben, Mauro; Hutter, Jürg

    2016-12-13

    We present an algorithm for computing the correlation energy in the random phase approximation (RPA) in a Gaussian basis requiring O(N³) operations and O(N²) memory. The method is based on the resolution of the identity (RI) with the overlap metric, a reformulation of RI-RPA in the Gaussian basis, imaginary time and imaginary frequency integration techniques, and the use of sparse linear algebra. Additional memory reduction without extra computations can be achieved by an iterative scheme that overcomes the memory bottleneck of canonical RPA implementations. We report a massively parallel implementation that is the key for the application to large systems. Finally, cubic-scaling RPA is applied to a thousand water molecules using a correlation-consistent triple-ζ quality basis.

  12. Graphene-Si heterogeneous nanotechnology

    NASA Astrophysics Data System (ADS)

    Akinwande, Deji; Tao, Li

    2013-05-01

    It is widely envisioned that graphene, an atomic sheet of carbon that has generated very broad interest, has the greatest prospects for flexible smart systems and for integrated graphene-silicon (G-Si) heterogeneous very-large-scale integrated (VLSI) nanoelectronics. In this work, we focus on the latter and elucidate the research progress that has been achieved in integrating graphene with Si-CMOS, including: wafer-scale graphene growth by chemical vapor deposition on Cu/SiO2/Si substrates, wafer-scale graphene transfer that afforded the fabrication of over 10,000 devices, wafer-scalable mitigation strategies to restore graphene's device characteristics via fluoropolymer interaction, and demonstrations of graphene integrated with commercial Si-CMOS chips for hybrid nanoelectronics and sensors. Metrology at the wafer scale has led to the development of custom Raman processing software (GRISP), now available on the nanohub portal. The metrology reveals that graphene grown on 4-in substrates has monolayer quality comparable to exfoliated flakes. At room temperature, high-performance passivated graphene devices on SiO2/Si can afford average mobilities of 3000 cm2/V-s and gate modulation that exceeds an order of magnitude. The latest growth research has yielded graphene with high mobilities greater than 10,000 cm2/V-s on oxidized silicon. Further progress requires track-compatible graphene-Si integration via wafer bonding in order to translate graphene research from basic to applied research in commercial R and D laboratories and ultimately yield a viable nanotechnology.

  13. Genome-Scale Metabolic Reconstructions and Theoretical Investigation of Methane Conversion in Methylomicrobium buryatense Strain 5G(B1)

    DOE PAGES

    de la Torre, Andrea; Metivier, Aisha; Chu, Frances; ...

    2015-11-25

    Methane-utilizing bacteria (methanotrophs) are capable of growth on methane and are attractive systems for bio-catalysis. However, the application of natural methanotrophic strains to large-scale production of value-added chemicals/biofuels requires a number of physiological and genetic alterations. An accurate metabolic model coupled with flux balance analysis can provide a solid interpretative framework for experimental data analyses and integration.

  14. Combined Monte Carlo and path-integral method for simulated library of time-resolved reflectance curves from layered tissue models

    NASA Astrophysics Data System (ADS)

    Wilson, Robert H.; Vishwanath, Karthik; Mycek, Mary-Ann

    2009-02-01

    Monte Carlo (MC) simulations are considered the "gold standard" for mathematical description of photon transport in tissue, but they can require large computation times. Therefore, it is important to develop simple and efficient methods for accelerating MC simulations, especially when a large "library" of related simulations is needed. A semi-analytical method involving MC simulations and a path-integral (PI) based scaling technique generated time-resolved reflectance curves from layered tissue models. First, a zero-absorption MC simulation was run for a tissue model with fixed scattering properties in each layer. Then, a closed-form expression for the average classical path of a photon in tissue was used to determine the percentage of time that the photon spent in each layer, to create a weighted Beer-Lambert factor to scale the time-resolved reflectance of the simulated zero-absorption tissue model. This method is a unique alternative to other scaling techniques in that it does not require the path length or number of collisions of each photon to be stored during the initial simulation. Effects of various layer thicknesses and absorption and scattering coefficients on the accuracy of the method will be discussed.
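
    The scaling step described can be sketched directly: take a zero-absorption time-resolved reflectance curve and rescale it with a weighted Beer-Lambert factor, where the weights are the fractions of time the average photon path spends in each layer. The curve shape, layer fractions, and absorption coefficients below are illustrative assumptions, not the paper's simulated values.

```python
# Minimal sketch of the weighted Beer-Lambert scaling of a zero-absorption
# Monte Carlo reflectance curve for a two-layer tissue model.
import numpy as np

c_tissue = 0.214        # speed of light in tissue, mm/ps (n ~ 1.4)
t = np.linspace(10, 2000, 400)                  # ps after the source pulse

# Stand-in zero-absorption reflectance from a prior MC run (arbitrary shape).
R0 = t ** -1.5 * np.exp(-200.0 / t)

# Two-layer model: absorption coefficients (1/mm) and the fraction of the
# average classical photon path spent in each layer.
mu_a = np.array([0.002, 0.010])
frac = np.array([0.7, 0.3])

# Weighted Beer-Lambert scaling: R(t) = R0(t) * exp(-sum_i mu_a_i f_i c t).
R = R0 * np.exp(-(mu_a * frac).sum() * c_tissue * t)

# Absorption can only attenuate, and does so more strongly at late times,
# since late-arriving photons have traveled longer paths.
print(bool(np.all(R <= R0)), bool(R[-1] / R0[-1] < R[0] / R0[0]))
```

    The attraction of this approach, as the abstract notes, is that neither per-photon path lengths nor collision counts need to be stored: one zero-absorption simulation generates the whole library of absorbing curves.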

  15. The Best of Two Worlds: ALMA + IRAM30M Observations of the Orion Integral Shape Filament

    NASA Astrophysics Data System (ADS)

    Hacar Gonzalez, Alvaro

    2018-01-01

    We have investigated the internal gas structure of the Orion Integral Shape filament using two large-scale, 150-pointing ALMA-12m mosaics and previous IRAM30m single-dish (SD) observations. From the combination of both single-dish and interferometric data we have produced a high-dynamic range and high-sensitivity map describing the internal gas structure of this filament at scales between 2 pc and 2000 AU (Hacar et al, submitted to A&A). In a series of individual CASA reductions (w/o SD data + w/o feathering), we have investigated the impact of the different uv-coverages on both the total flux and line velocity structure of our ALMA maps. Our analysis highlights the critical role played by the zero-spacing data at the different stages of the cleaning process. The results of these ALMA+IRAM30m experiments emphasize the need of high-sensitivity SD observations for the analysis of large-scale interferometric maps. During my talk, I will discuss the implications of these experiments on the dawn of the ALMA era and in the context of the new AtLAST telescope.

  16. A Commercialization Roadmap for Carbon-Negative Energy Systems

    NASA Astrophysics Data System (ADS)

    Sanchez, D.

    2016-12-01

    The Intergovernmental Panel on Climate Change (IPCC) envisages the need for large-scale deployment of net-negative CO2 emissions technologies by mid-century to meet stringent climate mitigation goals and yield a net drawdown of atmospheric carbon. Yet there are few commercial deployments of BECCS outside of niche markets, creating uncertainty about commercialization pathways and sustainability impacts at scale. This uncertainty is exacerbated by the absence of a strong policy framework, such as high carbon prices and research coordination. Here, we propose a strategy for the potential commercial deployment of BECCS. This roadmap proceeds via three steps: 1) via capture and utilization of biogenic CO2 from existing bioenergy facilities, notably ethanol fermentation, 2) via thermochemical co-conversion of biomass and fossil fuels, particularly coal, and 3) via dedicated, large-scale BECCS. Although biochemical conversion is a proven first market for BECCS, this trajectory alone is unlikely to drive commercialization of BECCS at the gigatonne scale. In contrast to biochemical conversion, thermochemical conversion of coal and biomass enables large-scale production of fuels and electricity with a wide range of carbon intensities, process efficiencies and process scales. Aside from systems integration, primarily technical barriers are involved in large-scale biomass logistics, gasification and gas cleaning. Key uncertainties around large-scale BECCS deployment are not limited to commercialization pathways; rather, they include physical constraints on biomass cultivation or CO2 storage, as well as social barriers, including public acceptance of new technologies and conceptions of renewable and fossil energy, which co-conversion systems confound. Despite sustainability risks, this commercialization strategy presents a pathway where energy suppliers, manufacturers and governments could transition from laggards to leaders in climate change mitigation efforts.

  17. Nanowire active-matrix circuitry for low-voltage macroscale artificial skin.

    PubMed

    Takei, Kuniharu; Takahashi, Toshitake; Ho, Johnny C; Ko, Hyunhyub; Gillies, Andrew G; Leu, Paul W; Fearing, Ronald S; Javey, Ali

    2010-10-01

    Large-scale integration of high-performance electronic components on mechanically flexible substrates may enable new applications in electronics, sensing and energy. Over the past several years, tremendous progress in the printing and transfer of single-crystalline, inorganic micro- and nanostructures on plastic substrates has been achieved through various process schemes. For instance, contact printing of parallel arrays of semiconductor nanowires (NWs) has been explored as a versatile route to enable fabrication of high-performance, bendable transistors and sensors. However, truly macroscale integration of ordered NW circuitry has not yet been demonstrated, with the largest-scale active systems being of the order of 1 cm(2) (refs 11,15). This limitation is in part due to assembly- and processing-related obstacles, although larger-scale integration has been demonstrated for randomly oriented NWs (ref. 16). Driven by this challenge, here we demonstrate macroscale (7×7 cm(2)) integration of parallel NW arrays as the active-matrix backplane of a flexible pressure-sensor array (18×19 pixels). The integrated sensor array effectively functions as an artificial electronic skin, capable of monitoring applied pressure profiles with high spatial resolution. The active-matrix circuitry operates at a low operating voltage of less than 5 V and exhibits superb mechanical robustness and reliability, without performance degradation on bending to small radii of curvature (2.5 mm) for over 2,000 bending cycles. This work presents the largest integration of ordered NW-array active components, and demonstrates a model platform for future integration of nanomaterials for practical applications.

  18. Tradeoffs and synergies between biofuel production and large-scale solar infrastructure in deserts

    NASA Astrophysics Data System (ADS)

    Ravi, S.; Lobell, D. B.; Field, C. B.

    2012-12-01

Solar energy installations in deserts are on the rise, fueled by technological advances and policy changes. Deserts, with a combination of high solar radiation and the availability of large areas unusable for crop production, are ideal locations for large-scale solar installations. For efficient power generation, solar infrastructures require large amounts of water for operation (mostly for cleaning panels and dust suppression), leading to significant moisture additions to desert soil. A pertinent question is how to use these moisture inputs for sustainable agriculture/biofuel production. We investigated the water requirements of large solar infrastructures in North American deserts and explored the possibilities for integrating biofuel production with solar infrastructure. In co-located systems, the possible decline in yields due to shading by solar panels may be offset by the benefits of periodic water addition to biofuel crops, simpler dust management and more efficient power generation in solar installations, and decreased impacts on natural habitats and scarce resources in deserts. In particular, we evaluated the potential to integrate solar infrastructure with biomass feedstocks that grow in arid and semi-arid lands (Agave spp.), which are found to produce high yields with minimal water inputs. To this end, we conducted a detailed life cycle analysis of these coupled agave biofuel - solar energy systems to explore the tradeoffs and synergies in the context of energy input-output, water use and carbon emissions.

  19. Validating a Geographical Image Retrieval System.

    ERIC Educational Resources Information Center

    Zhu, Bin; Chen, Hsinchun

    2000-01-01

    Summarizes a prototype geographical image retrieval system that demonstrates how to integrate image processing and information analysis techniques to support large-scale content-based image retrieval. Describes an experiment to validate the performance of this image retrieval system against that of human subjects by examining similarity analysis…

  20. Transforming Power Systems; 21st Century Power Partnership

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2015-05-20

The 21st Century Power Partnership - a multilateral effort of the Clean Energy Ministerial - serves as a platform for public-private collaboration to advance integrated solutions for the large-scale deployment of renewable energy in combination with deep energy efficiency and smart grid solutions.

  1. Code modernization and modularization of APEX and SWAT watershed simulation models

    USDA-ARS?s Scientific Manuscript database

SWAT (Soil and Water Assessment Tool) and APEX (Agricultural Policy / Environmental eXtender) are respectively large and small watershed simulation models derived from EPIC (Environmental Policy Integrated Climate), a field-scale agroecology simulation model. All three models are coded in FORTRAN an...

  2. Microprocessor Seminar, phase 2

    NASA Technical Reports Server (NTRS)

    Scott, W. R.

    1977-01-01

    Workshop sessions and papers were devoted to various aspects of microprocessor and large scale integrated circuit technology. Presentations were made on advanced LSI developments for high reliability military and NASA applications. Microprocessor testing techniques were discussed, and test data were presented. High reliability procurement specifications were also discussed.

  3. Energy Systems Integration Partnerships: NREL + Giner

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-03-22

    This fact sheet highlights work done at the ESIF in partnership with Giner. Giner, a developer of proton-exchange membrane (PEM) technologies, has contracted with NREL to validate the performance of its large-scale PEM electrolyzer stacks. PEM electrolyzers work much like fuel cells run in reverse.

  4. An integration of minimum local feature representation methods to recognize large variation of foods

    NASA Astrophysics Data System (ADS)

    Razali, Mohd Norhisham bin; Manshor, Noridayu; Halin, Alfian Abdul; Mustapha, Norwati; Yaakob, Razali

    2017-10-01

    Local invariant features have shown to be successful in describing object appearances for image classification tasks. Such features are robust towards occlusion and clutter and are also invariant against scale and orientation changes. This makes them suitable for classification tasks with little inter-class similarity and large intra-class difference. In this paper, we propose an integrated representation of the Speeded-Up Robust Feature (SURF) and Scale Invariant Feature Transform (SIFT) descriptors, using late fusion strategy. The proposed representation is used for food recognition from a dataset of food images with complex appearance variations. The Bag of Features (BOF) approach is employed to enhance the discriminative ability of the local features. Firstly, the individual local features are extracted to construct two kinds of visual vocabularies, representing SURF and SIFT. The visual vocabularies are then concatenated and fed into a Linear Support Vector Machine (SVM) to classify the respective food categories. Experimental results demonstrate impressive overall recognition at 82.38% classification accuracy based on the challenging UEC-Food100 dataset.
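The late-fusion pipeline the abstract describes (two visual vocabularies, concatenated BoF histograms, linear SVM) can be sketched generically. This is a minimal illustration, not the authors' implementation: it uses scikit-learn, and random arrays stand in for real SURF (64-D) and SIFT (128-D) descriptors; the vocabulary size and all function names are invented for the sketch.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

def build_vocabulary(descriptors, k, seed=0):
    """Cluster local descriptors into a visual vocabulary of k words."""
    return KMeans(n_clusters=k, n_init=10, random_state=seed).fit(descriptors)

def bof_histogram(image_descriptors, vocab):
    """Quantize one image's descriptors against a vocabulary and return a
    normalized Bag-of-Features histogram."""
    words = vocab.predict(image_descriptors)
    hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

def fused_representation(surf_desc, sift_desc, surf_vocab, sift_vocab):
    """Late fusion: concatenate the per-vocabulary BoF histograms."""
    return np.concatenate([bof_histogram(surf_desc, surf_vocab),
                           bof_histogram(sift_desc, sift_vocab)])

# Toy demo: synthetic descriptors stand in for SURF (64-D) and SIFT (128-D).
rng = np.random.default_rng(0)
n_img, per_img, k = 40, 20, 8
surf_all = rng.normal(size=(n_img * per_img, 64))
sift_all = rng.normal(size=(n_img * per_img, 128))
surf_vocab = build_vocabulary(surf_all, k)
sift_vocab = build_vocabulary(sift_all, k)

X = np.stack([fused_representation(surf_all[i*per_img:(i+1)*per_img],
                                   sift_all[i*per_img:(i+1)*per_img],
                                   surf_vocab, sift_vocab)
              for i in range(n_img)])
y = rng.integers(0, 2, size=n_img)   # dummy food-category labels
clf = LinearSVC().fit(X, y)          # linear SVM on the fused histograms
```

Each row of `X` is the concatenation of two normalized histograms, so it sums to 2; in the paper's setting the classifier would be trained on real descriptors extracted from the UEC-Food100 images.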

  5. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, databases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  6. Turbulent thermal superstructures in Rayleigh-Bénard convection

    NASA Astrophysics Data System (ADS)

    Stevens, Richard J. A. M.; Blass, Alexander; Zhu, Xiaojue; Verzicco, Roberto; Lohse, Detlef

    2018-04-01

We report the observation of superstructures, i.e., very large-scale and long-lived coherent structures, in highly turbulent Rayleigh-Bénard convection up to Rayleigh number Ra = 10^9. We perform direct numerical simulations in horizontally periodic domains with aspect ratios up to Γ = 128. In the considered Ra number regime the thermal superstructures have a horizontal extent of six to seven times the height of the domain, and their size is independent of Ra. Many laboratory experiments and numerical simulations have focused on small-aspect-ratio cells in order to achieve the highest possible Ra. However, here we show that for very high Ra, integral quantities such as the Nusselt number and the volume-averaged Reynolds number only converge to the large-aspect-ratio limit around Γ ≈ 4, while horizontally averaged statistics such as the standard deviation and kurtosis converge around Γ ≈ 8, the integral scale converges around Γ ≈ 32, and the peak positions of the temperature variance and turbulent kinetic energy spectra only converge around Γ ≈ 64.

  7. New strategy for drug discovery by large-scale association analysis of molecular networks of different species.

    PubMed

    Zhang, Bo; Fu, Yingxue; Huang, Chao; Zheng, Chunli; Wu, Ziyin; Zhang, Wenjuan; Yang, Xiaoyan; Gong, Fukai; Li, Yuerong; Chen, Xiaoyu; Gao, Shuo; Chen, Xuetong; Li, Yan; Lu, Aiping; Wang, Yonghua

    2016-02-25

The development of modern omics technology has not significantly improved the efficiency of drug development; rather, precise and targeted drug discovery remains unsolved. Here a large-scale cross-species molecular network association (CSMNA) approach for targeted drug screening from natural sources is presented. The algorithm integrates molecular network omics data from humans and 267 plants and microbes, establishing the biological relationships between them and extracting evolutionarily convergent chemicals. This technique allows the researcher to assess targeted drugs for specific human diseases based on specific plant or microbe pathways. In a prospective validation, connections between the plant Halliwell-Asada (HA) cycle and the human Nrf2-ARE pathway were verified, and the manner in which the HA cycle molecules act on the human Nrf2-ARE pathway as antioxidants was determined. This shows the potential applicability of this approach in drug discovery. The current method integrates disparate evolutionary species into chemico-biologically coherent circuits, suggesting a new cross-species omics analysis strategy for rational drug development.

  8. Consciousness, cognition and brain networks: New perspectives.

    PubMed

    Aldana, E M; Valverde, J L; Fábregas, N

    2016-10-01

A detailed analysis of the literature on consciousness and cognition mechanisms based on neural network theory is presented. The immune and inflammatory response to the anesthetic-surgical procedure induces modulation of neuronal plasticity by influencing higher cognitive functions. Anesthetic drugs can cause unconsciousness by producing a functional disruption of the cortical and thalamocortical integration complex. External and internal perceptions are processed through an intricate network of neural connections, involving the higher nervous activity centers, especially the cerebral cortex. This requires an integrated model, formed by neural networks and their interactions with highly specialized regions, through large-scale networks distributed throughout the brain that collect the information flow of these perceptions. Functional and effective connectivity between large-scale networks is essential for consciousness, unconsciousness and cognition. This is what is called the "human connectome", or map of neural networks.

  9. A precise integration method for solving coupled vehicle-track dynamics with nonlinear wheel-rail contact

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Gao, Q.; Tan, S. J.; Zhong, W. X.

    2012-10-01

    A new method is proposed as a solution for the large-scale coupled vehicle-track dynamic model with nonlinear wheel-rail contact. The vehicle is simplified as a multi-rigid-body model, and the track is treated as a three-layer beam model. In the track model, the rail is assumed to be an Euler-Bernoulli beam supported by discrete sleepers. The vehicle model and the track model are coupled using Hertzian nonlinear contact theory, and the contact forces of the vehicle subsystem and the track subsystem are approximated by the Lagrange interpolation polynomial. The response of the large-scale coupled vehicle-track model is calculated using the precise integration method. A more efficient algorithm based on the periodic property of the track is applied to calculate the exponential matrix and certain matrices related to the solution of the track subsystem. Numerical examples demonstrate the computational accuracy and efficiency of the proposed method.
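The response calculation described above rests on the precise integration method (PIM), which evaluates the state-transition matrix exp(H·Δt) with a 2^N scaling-and-squaring scheme. The sketch below illustrates the generic PIM idea only, not the authors' vehicle-track code: the key trick is storing the small increment Ta = exp(Hτ) − I separately from the identity so that round-off does not destroy precision when τ is tiny. The function name and the 2×2 toy system are invented for illustration.

```python
import numpy as np

def pim_expm(H, dt, N=20):
    """Precise integration method: exp(H*dt) via 2^N scaling and squaring.
    Ta holds exp(H*tau) - I, kept apart from the identity to preserve the
    tiny increment against floating-point round-off."""
    n = H.shape[0]
    tau = dt / (2 ** N)
    Ht = H * tau
    eye = np.eye(n)
    # 4th-order Taylor expansion of exp(H*tau) - I, in Horner form
    Ta = Ht @ (eye + Ht @ (eye + Ht @ (eye + Ht / 4.0) / 3.0) / 2.0)
    # Squaring identity: exp(2t) - I = 2*(exp(t) - I) + (exp(t) - I)^2
    for _ in range(N):
        Ta = 2.0 * Ta + Ta @ Ta
    return eye + Ta

# Demo: advance x' = A x one step for a lightly damped 1-DOF oscillator
# written in first-order (state-space) form.
A = np.array([[0.0, 1.0], [-4.0, -0.1]])
T = pim_expm(A, 0.01)
x1 = T @ np.array([1.0, 0.0])  # state after dt = 0.01
```

In the paper's setting, H would be the assembled vehicle-track system matrix and the nonlinear Hertzian contact forces would enter as an interpolated inhomogeneous term between steps; the exponential map itself is the reusable building block.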

  10. Virtual reality adaptive stimulation of limbic networks in the mental readiness training.

    PubMed

    Cosić, Kresimir; Popović, Sinisa; Kostović, Ivica; Judas, Milos

    2010-01-01

A significant proportion of severe psychological problems in recent large-scale peacekeeping operations underscores the importance of effective methods for strengthening stress resilience. Virtual reality (VR) adaptive stimulation, based on the estimation of the participant's emotional state from physiological signals, may enhance mental readiness training (MRT). Understanding the neurobiological mechanisms by which MRT based on VR adaptive stimulation can affect resilience to stress is important for practical application in stress resilience management. After the delivery of a traumatic audio-visual stimulus in the VR, a cascade of events occurs in the brain, evoking various physiological manifestations. In addition to the "limbic" emotional and visceral brain circuitry, other large-scale sensory, cognitive, and memory brain networks participate, with less known impact, in this physiological response. MRT based on VR adaptive stimulation may strengthen stress resilience through targeted brain-body interactions. Interdisciplinary efforts that integrate brain imaging with the proposed approach may contribute to clarifying the neurobiological foundation of resilience to stress.

  11. Rapid underway profiling of water quality in Queensland estuaries.

    PubMed

    Hodge, Jonathan; Longstaff, Ben; Steven, Andy; Thornton, Phillip; Ellis, Peter; McKelvie, Ian

    2005-01-01

    We present an overview of a portable underway water quality monitoring system (RUM-Rapid Underway Monitoring), developed by integrating several off-the-shelf water quality instruments to provide rapid, comprehensive, and spatially referenced 'snapshots' of water quality conditions. We demonstrate the utility of the system from studies in the Northern Great Barrier Reef (Daintree River) and the Moreton Bay region. The Brisbane dataset highlights RUM's utility in characterising plumes as well as its ability to identify the smaller scale structure of large areas. RUM is shown to be particularly useful when measuring indicators with large small-scale variability such as turbidity and chlorophyll-a. Additionally, the Daintree dataset shows the ability to integrate other technologies, resulting in a more comprehensive analysis, whilst sampling offshore highlights some of the analytical issues required for sampling low concentration data. RUM is a low cost, highly flexible solution that can be modified for use in any water type, on most vessels and is only limited by the available monitoring technologies.

  12. Distributed 3D Information Visualization - Towards Integration of the Dynamic 3D Graphics and Web Services

    NASA Astrophysics Data System (ADS)

    Vucinic, Dean; Deen, Danny; Oanta, Emil; Batarilo, Zvonimir; Lacor, Chris

This paper focuses on visualization and manipulation of graphical content in distributed network environments. The developed graphical middleware and 3D desktop prototypes were specialized for situational awareness. This research was done in the LArge Scale COllaborative decision support Technology (LASCOT) project, which explored and combined software technologies to support a human-centred decision support system for crisis management (earthquake, tsunami, flooding, airplane or oil-tanker incidents, chemical, radio-active or other pollutant spreading, etc.). The state-of-the-art review performed did not identify any publicly available large-scale distributed application of this kind; existing proprietary solutions rely on conventional technologies and 2D representations. Our challenge was to apply the "latest" available technologies, such as Java3D, X3D and SOAP, compatible with average computer graphics hardware. The selected technologies are integrated, and we demonstrate: the flow of data originating from heterogeneous data sources; interoperability across different operating systems; and 3D visual representations to enhance end-user interactions.

  13. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    NASA Astrophysics Data System (ADS)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component of the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP, and then integrated this code into the TeraShake computational platform, which provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the parallel performance of the TS-AWP on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  14. A modeling approach to assess coastal management effects on benthic habitat quality: A case study on coastal defense and navigability

    NASA Astrophysics Data System (ADS)

    Cozzoli, Francesco; Smolders, Sven; Eelkema, Menno; Ysebaert, Tom; Escaravage, Vincent; Temmerman, Stijn; Meire, Patrick; Herman, Peter M. J.; Bouma, Tjeerd J.

    2017-01-01

The natural coastal hydrodynamics and morphology worldwide are altered by human interventions such as embankments, shipping and dredging, which may have consequences for ecosystem functionality. Ensuring long-term ecological sustainability requires the capability to predict the long-term, large-scale ecological effects of altered hydromorphology. As empirical data sets at the relevant scales are missing, there is a need to integrate ecological modeling with physical modeling. This paper presents a case study showing the long-term, large-scale macrozoobenthic community response to two contrasting human alterations of the hydromorphological habitat: deepening of estuarine channels to enhance navigability (Westerschelde) vs. realization of a storm surge barrier to enhance coastal safety (Oosterschelde). A multidisciplinary integration of empirical data and modeling of estuarine morphology, hydrodynamics and benthic ecology was used to reconstruct the hydrological evolution and the resulting long-term (50 years), large-scale ecological trends in both estuaries. Our model indicated that the hydrodynamic alterations following the deepening of the Westerschelde had negative implications for benthic life, while the realization of the Oosterschelde storm surge barrier elicited mixed, habitat-dependent responses, including an unexpected improvement of environmental quality. Our analysis illustrates long-term trends in the natural community caused by opposing management strategies. The divergent human pressures on the Oosterschelde and Westerschelde are examples of what could happen in the near future to many coastal ecosystems worldwide. The comparative analysis of the two basins is a valuable source of information for understanding (and communicating) the future ecological consequences of human coastal development.

  15. Accurate and Efficient Parallel Implementation of an Effective Linear-Scaling Direct Random Phase Approximation Method.

    PubMed

    Graf, Daniel; Beuerle, Matthias; Schurkus, Henry F; Luenser, Arne; Savasci, Gökcen; Ochsenfeld, Christian

    2018-05-08

An efficient algorithm for calculating the random phase approximation (RPA) correlation energy is presented that is as accurate as the canonical molecular orbital resolution-of-the-identity RPA (RI-RPA), with the important advantage of an effective linear-scaling behavior (instead of quartic) for large systems due to a formulation in the local atomic orbital space. The high accuracy is achieved by utilizing optimized minimax integration schemes and the local Coulomb metric attenuated by the complementary error function for the RI approximation. The memory bottleneck of former atomic orbital (AO)-RI-RPA implementations (Schurkus, H. F.; Ochsenfeld, C. J. Chem. Phys. 2016, 144, 031101 and Luenser, A.; Schurkus, H. F.; Ochsenfeld, C. J. Chem. Theory Comput. 2017, 13, 1647-1655) is addressed by precontraction of the large 3-center integral matrix with the Cholesky factors of the ground state density, reducing the memory requirements of that matrix by a factor of [Formula: see text]. Furthermore, we present a parallel implementation of our method, which not only leads to faster RPA correlation energy calculations but also to a scalable decrease in memory requirements, opening the door for investigations of large molecules even on small- to medium-sized computing clusters. Although it is known that AO methods are highly efficient for extended systems, where sparsity allows the linear-scaling regime to be reached, we show that our work also extends the applicability when considering highly delocalized systems for which no linear scaling can be achieved. As an example, the interlayer distance of two covalent organic framework pore fragments (comprising 384 atoms in total) is analyzed.

  16. Using a framework to implement large-scale innovation in medical education with the intent of achieving sustainability.

    PubMed

    Hudson, Judith N; Farmer, Elizabeth A; Weston, Kathryn M; Bushnell, John A

    2015-01-16

Particularly when undertaken on a large scale, implementing innovation in higher education poses many challenges. Sustaining the innovation requires early adoption of a coherent implementation strategy. Using an example from clinical education, this article describes a process used to implement a large-scale innovation with the intent of achieving sustainability. Desire to improve the effectiveness of undergraduate medical education has led to growing support for a longitudinal integrated clerkship (LIC) model. This involves a move away from the traditional clerkship of 'block rotations' with frequent changes in disciplines, to a focus upon clerkships with longer duration and opportunity for students to build sustained relationships with supervisors, mentors, colleagues and patients. A growing number of medical schools have adopted the LIC model for a small percentage of their students. At a time when increasing medical school numbers and class sizes are leading to competition for clinical supervisors, it is however a daunting challenge to provide a longitudinal clerkship for an entire medical school class. This challenge is presented to illustrate the strategy used to implement sustainable large-scale innovation. A strategy to implement and build a sustainable longitudinal integrated community-based clerkship experience for all students was derived from a framework arising from Roberto and Levesque's research in business. The framework's four core processes: chartering, learning, mobilising and realigning, provided guidance in preparing and rolling out the 'whole of class' innovation. Roberto and Levesque's framework proved useful for identifying the foundations of the implementation strategy, with special emphasis on the relationship building required to implement such an ambitious initiative. Although this was innovation in a new School, it required change within the school, the wider university and the health community. Challenges encountered included some resistance to moving away from traditional hospital-centred education, initial student concern, resource limitations, workforce shortage and potential burnout of the innovators. Large-scale innovations in medical education may productively draw upon research from other disciplines for guidance on how to lay the foundations for successfully achieving sustainability.

  17. A monolithically integrated polarization entangled photon pair source on a silicon chip

    PubMed Central

    Matsuda, Nobuyuki; Le Jeannic, Hanna; Fukuda, Hiroshi; Tsuchizawa, Tai; Munro, William John; Shimizu, Kaoru; Yamada, Koji; Tokura, Yasuhiro; Takesue, Hiroki

    2012-01-01

    Integrated photonic circuits are one of the most promising platforms for large-scale photonic quantum information systems due to their small physical size and stable interferometers with near-perfect lateral-mode overlaps. Since many quantum information protocols are based on qubits defined by the polarization of photons, we must develop integrated building blocks to generate, manipulate, and measure the polarization-encoded quantum state on a chip. The generation unit is particularly important. Here we show the first integrated polarization-entangled photon pair source on a chip. We have implemented the source as a simple and stable silicon-on-insulator photonic circuit that generates an entangled state with 91 ± 2% fidelity. The source is equipped with versatile interfaces for silica-on-silicon or other types of waveguide platforms that accommodate the polarization manipulation and projection devices as well as pump light sources. Therefore, we are ready for the full-scale implementation of photonic quantum information systems on a chip. PMID:23150781

  18. Analysis on the Critical Rainfall Value For Predicting Large Scale Landslides Caused by Heavy Rainfall In Taiwan.

    NASA Astrophysics Data System (ADS)

    Tsai, Kuang-Jung; Chiang, Jie-Lun; Lee, Ming-Hsi; Chen, Yie-Ruey

    2017-04-01

The accumulated rainfall brought by Typhoon Morakot in August 2009 exceeded 2,900 mm within 3 consecutive days. Very serious landslides and sediment-related disasters were induced by this heavy rainfall event. The satellite image analysis project conducted by the Soil and Water Conservation Bureau after the Morakot event identified more than 10,904 landslide sites with a total sliding area of 18,113 ha. At the same time, all severe sediment-related disaster areas were characterized by disaster type, scale, topography, major bedrock formations and geologic structures during the period of extremely heavy rainfall events in southern Taiwan. Characteristics and mechanisms of large-scale landslides are collected through field investigation integrated with GPS/GIS/RS techniques. To decrease the risk of large-scale landslides on slope land, a slope-land conservation strategy and a critical rainfall database should be established and put into effect as soon as possible. Meanwhile, establishing a critical rainfall value for predicting large-scale landslides induced by heavy rainfall has become an important issue of serious concern to the government and the people of Taiwan. 
The mechanism of large-scale landslides, rainfall frequency analysis, sediment budget estimation and river hydraulic analysis under the extreme climate change of the past 10 years are addressed by this research. The results developed from this research are intended to serve as a warning system for predicting large-scale landslides in southern Taiwan. Keywords: heavy rainfall, large scale, landslides, critical rainfall value

  19. Co-governing decentralised water systems: an analytical framework.

    PubMed

    Yu, C; Brown, R; Morison, P

    2012-01-01

    Current discourses in urban water management emphasise a diversity of water sources and scales of infrastructure for resilience and adaptability. During the last 2 decades, in particular, various small-scale systems emerged and developed so that the debate has largely moved from centralised versus decentralised water systems toward governing integrated and networked systems of provision and consumption where small-scale technologies are embedded in large-scale centralised infrastructures. However, while centralised systems have established boundaries of ownership and management, decentralised water systems (such as stormwater harvesting technologies for the street, allotment/house scales) do not, therefore the viability for adoption and/or continued use of decentralised water systems is challenged. This paper brings together insights from the literature on public sector governance, co-production and social practices model to develop an analytical framework for co-governing such systems. The framework provides urban water practitioners with guidance when designing co-governance arrangements for decentralised water systems so that these systems continue to exist, and become widely adopted, within the established urban water regime.

  20. Cosmological measurements with general relativistic galaxy correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raccanelli, Alvise; Montanari, Francesco; Durrer, Ruth

We investigate the cosmological dependence and the constraining power of large-scale galaxy correlations, including all redshift-space distortion, wide-angle, lensing and gravitational potential effects on linear scales. We analyze the cosmological information present in the lensing convergence and in the gravitational potential terms describing the so-called "relativistic effects", and we find that, while smaller than the information contained in intrinsic galaxy clustering, it is not negligible. We investigate how neglecting these terms would bias cosmological measurements performed by future spectroscopic and photometric large-scale surveys such as SKA and Euclid. We perform a Fisher analysis using the CLASS code, modified to include scale-dependent galaxy bias and redshift-dependent magnification and evolution bias. Our results show that neglecting relativistic terms, especially lensing convergence, introduces an error in the forecasted precision in measuring cosmological parameters of the order of a few tens of percent, in particular when measuring the matter content of the Universe and primordial non-Gaussianity parameters. The analysis suggests a possible substantial systematic error in cosmological parameter constraints. Therefore, we argue that radial correlations and integrated relativistic terms need to be taken into account when forecasting the constraining power of future large-scale number counts from galaxy surveys.
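The Fisher forecasting step mentioned above follows a standard recipe that can be sketched generically: for Gaussian-distributed observables with fixed covariance C, the Fisher matrix is F_ij = (∂μ/∂θ_i)ᵀ C⁻¹ (∂μ/∂θ_j), and the forecast marginal 1σ errors are the square roots of the diagonal of F⁻¹. This is a toy illustration, not the modified CLASS pipeline; the model function, covariance and parameter values below are invented placeholders.

```python
import numpy as np

def fisher_matrix(model, theta0, cov, eps=1e-6):
    """Gaussian Fisher matrix F_ij = dmu/dtheta_i . C^-1 . dmu/dtheta_j,
    with derivatives of the observable vector mu(theta) taken by central
    finite differences around the fiducial parameters theta0."""
    theta0 = np.asarray(theta0, dtype=float)
    cinv = np.linalg.inv(cov)
    derivs = []
    for i in range(theta0.size):
        up, down = theta0.copy(), theta0.copy()
        up[i] += eps
        down[i] -= eps
        derivs.append((model(up) - model(down)) / (2.0 * eps))
    D = np.stack(derivs)          # shape: (n_params, n_observables)
    return D @ cinv @ D.T

# Toy example: two observables mu = (a, a + b), unit noise covariance.
model = lambda th: np.array([th[0], th[0] + th[1]])
F = fisher_matrix(model, [1.0, 2.0], np.eye(2))
sigma = np.sqrt(np.diag(np.linalg.inv(F)))  # forecast 1-sigma marginal errors
```

In a survey forecast, `model` would return the binned angular power spectra (with or without the relativistic terms) and `cov` their estimated covariance; comparing the two resulting Fisher matrices quantifies the bias from neglecting the relativistic contributions.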

  1. Engineering large-scale agent-based systems with consensus

    NASA Technical Reports Server (NTRS)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge-based agents (KBAs) which engage in a collaborative problem-solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code, thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  2. An Integrated Knowledge Framework to Characterize and Scaffold Size and Scale Cognition (FS2C)

    NASA Astrophysics Data System (ADS)

    Magana, Alejandra J.; Brophy, Sean P.; Bryan, Lynn A.

    2012-09-01

    Size and scale cognition is a critical ability associated with reasoning about concepts across the disciplines of science, technology, engineering, and mathematics. As such, researchers and educators have identified the need for young learners and their educators to become scale-literate. Informed by the developmental psychology literature and recent findings in nanoscale science and engineering education, we propose an integrated knowledge framework for characterizing and scaffolding size and scale cognition, the FS2C framework. Five ad hoc assessment tasks were designed, informed by the FS2C framework, with the goal of identifying participants' understandings of size and scale. Findings revealed participants' difficulty in discerning the different sizes of microscale and nanoscale objects, and a generally low level of sophistication in identifying scale worlds. Results also showed that the larger the difference in size between two objects, the harder it was for participants to judge how many times bigger or smaller one object is than the other. Similarly, participants struggled to estimate the approximate sizes of sub-macroscopic objects as well as the sizes of very large objects. Accurately locating objects on a logarithmic scale also proved challenging.
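    The "how many times bigger" and logarithmic-scale tasks that such assessments probe reduce to simple ratio and log10 computations. A small illustration (the object sizes are rough textbook values chosen for the example, not the study's actual assessment items):

    ```python
    # Relative size ("how many times bigger") and position on a log10 number line.
    import math

    sizes_m = {                    # approximate sizes in metres (illustrative)
        "water molecule": 3e-10,
        "virus": 1e-7,
        "human hair width": 1e-4,
        "human": 2e0,
    }

    # A virus is roughly 333 times bigger than a water molecule...
    ratio = sizes_m["virus"] / sizes_m["water molecule"]
    # ...and each object sits at log10(size) on a logarithmic scale.
    positions = {name: round(math.log10(s), 1) for name, s in sizes_m.items()}
    print(round(ratio), positions["virus"])  # 333 -7.0
    ```
    
    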

  3. Taking Open Innovation to the Molecular Level - Strengths and Limitations.

    PubMed

    Zdrazil, Barbara; Blomberg, Niklas; Ecker, Gerhard F

    2012-08-01

    The ever-growing availability and maturation of large-scale open data is having a significant impact on industrial drug discovery, as well as on academic and non-profit research. As industry shifts to an 'open innovation' business model, precompetitive initiatives and strong public-private partnerships, including academic research cooperation partners, are gaining ever more importance. The bioinformatics and cheminformatics communities are now seeking web tools that allow the integration of the large volume of life science datasets available in the public domain. Such a data exploitation tool would ideally be able to answer complex biological questions from a single search query. In this short review/perspective, we outline the use of semantic web approaches for data and knowledge integration. Further, we discuss the strengths and current limitations of publicly available data retrieval tools and integrated platforms.

  4. Integrated water and renewable energy management: the Acheloos-Peneios region case study

    NASA Astrophysics Data System (ADS)

    Koukouvinos, Antonios; Nikolopoulos, Dionysis; Efstratiadis, Andreas; Tegos, Aristotelis; Rozos, Evangelos; Papalexiou, Simon-Michael; Dimitriadis, Panayiotis; Markonis, Yiannis; Kossieris, Panayiotis; Tyralis, Christos; Karakatsanis, Georgios; Tzouka, Katerina; Christofides, Antonis; Karavokiros, George; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris

    2015-04-01

    Within the ongoing research project "Combined Renewable Systems for Sustainable Energy Development" (CRESSENDO), we have developed a novel stochastic simulation framework for the optimal planning and management of large-scale hybrid renewable energy systems in which hydropower plays the dominant role. The methodology and associated computer tools are tested in two major adjacent river basins in Greece (Acheloos, Peneios) extending over 15,500 km² (12% of the Greek territory). The river Acheloos is characterized by very high runoff and holds ~40% of the installed hydropower capacity of Greece. On the other hand, the Thessaly plain drained by the Peneios, a key agricultural region for the national economy, usually suffers from water scarcity and systematic environmental degradation. The two basins are interconnected through existing and planned diversion projects, forming a unique large-scale hydrosystem whose future has been the subject of great controversy. The study area is viewed as a hypothetically closed, energy-autonomous system, in order to evaluate the prospects for sustainable development of its water and energy resources. In this context we seek an efficient configuration of the necessary hydraulic and renewable energy projects through integrated modelling of the water and energy balance. We investigate several scenarios of energy demand for domestic, industrial and agricultural use, assuming that part of the demand is met by wind and solar energy, while the excess or deficit of energy is regulated through large hydroelectric works equipped with pumped-storage facilities. The overall goal is to examine under which conditions a fully renewable energy system can be technically and economically viable at such a large spatial scale.
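    The regulation logic described above (variable wind and solar meet part of the demand, with surpluses pumped into storage and deficits covered from it) can be sketched as a toy balance simulation. All figures below are hypothetical illustrations, not CRESSENDO data:

    ```python
    # Toy energy-autonomy balance: renewables serve demand; a pumped-storage
    # reservoir absorbs surpluses and covers deficits (hypothetical numbers).
    demand   = [100, 120, 90, 110]    # GWh per period (assumed)
    wind_sun = [80, 140, 70, 100]     # variable renewable generation (assumed)

    storage, cap = 50.0, 200.0        # current level and capacity, GWh
    unmet = 0.0                       # demand the system fails to cover
    for d, r in zip(demand, wind_sun):
        surplus = r - d
        if surplus >= 0:
            # Pump the surplus uphill, losing 25% to round-trip inefficiency.
            storage = min(cap, storage + 0.75 * surplus)
        else:
            # Cover the deficit from storage, tracking any shortfall.
            draw = min(storage, -surplus)
            storage -= draw
            unmet += -surplus - draw
    print(round(storage, 2), unmet)  # 15.0 0.0
    ```

    In the full framework this balance is evaluated stochastically over many synthetic hydrology and demand scenarios rather than a single fixed series.
    
    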

  5. A Microcomputer-Based Program for Printing Check Plots of Integrated Circuits Specified in Caltech Intermediate Form.

    DTIC Science & Technology

    1984-12-01

    only four transistors[5]. Each year since that time, the semiconductor industry has consistently improved the quality of the fabrication techniques...rarely took place at universities and was almost exclusively confined to industry. IC design techniques were developed, tested, and taught only in the...community, it is not uncommon for industry to borrow ideas and even particular programs from these university-designed tools. The Very Large Scale Integration

  6. Facilitating CCS Business Planning by Extending the Functionality of the SimCCS Integrated System Model

    DOE PAGES

    Ellett, Kevin M.; Middleton, Richard S.; Stauffer, Philip H.; ...

    2017-08-18

    The application of integrated system models for evaluating carbon capture and storage technology has expanded steadily over the past few years. To date, such models have focused largely on hypothetical scenarios of complex source-sink matching involving numerous large-scale CO2 emitters and high-volume, continuous reservoirs, such as deep saline formations, functioning as geologic sinks for carbon storage. Though these models have provided unique insight into the potential costs and feasibility of deploying complex networks of integrated infrastructure, there remains a pressing need to translate such insight to the business community if this technology is ever to achieve a truly meaningful impact on greenhouse gas mitigation. Here, we present a new integrated system modelling tool termed SimCCUS, aimed at providing crucial decision support for businesses by extending the functionality of a previously developed model called SimCCS. The primary innovation of SimCCUS is the incorporation of stacked geological reservoir systems, with explicit consideration of the processes and costs associated with operating multiple CO2 utilization and storage targets from a single geographic location. Such locations provide significant efficiencies through economies of scale, effectively minimizing CO2 storage costs while simultaneously maximizing revenue streams via the utilization of CO2 as a commodity for enhanced hydrocarbon recovery.

  8. Evaluating landscape health: Integrating societal goals and biophysical process

    USGS Publications Warehouse

    Rapport, D.J.; Gaudet, C.; Karr, J.R.; Baron, Jill S.; Bohlen, C.; Jackson, W.; Jones, Bruce; Naiman, R.J.; Norton, B.; Pollock, M. M.

    1998-01-01

    Evaluating landscape change requires the integration of the social and natural sciences. The social sciences contribute to articulating societal values that govern landscape change, while the natural sciences contribute to understanding the biophysical processes that are influenced by human activity and result in ecological change. Building upon Aldo Leopold's criteria for landscape health, the roles of societal values and biophysical processes in shaping the landscape are explored. A framework is developed for indicators of landscape health and integrity. Indicators of integrity are useful in measuring biological condition relative to the condition in landscapes largely unaffected by human activity, while indicators of health are useful in evaluating changes in highly modified landscapes. Integrating societal goals and biophysical processes requires identification of ecological services to be sustained within a given landscape. It also requires the proper choice of temporal and spatial scales. Societal values are based upon inter-generational concerns at regional scales (e.g. soil and ground water quality). Assessing the health and integrity of the environment at the landscape scale over a period of decades best integrates societal values with underlying biophysical processes. These principles are illustrated in two contrasting case studies: (1) the South Platte River study demonstrates the role of complex biophysical processes acting at a distance; and (2) the Kissimmee River study illustrates the critical importance of social, cultural and economic concerns in the design of remedial action plans. In both studies, however, interactions between the social and the biophysical governed the landscape outcomes. The legacy of evolution and the legacy of culture require integration if we are to cope effectively with environmental change.

  9. Secure Data Aggregation with Fully Homomorphic Encryption in Large-Scale Wireless Sensor Networks.

    PubMed

    Li, Xing; Chen, Dexin; Li, Chunyan; Wang, Liangmin

    2015-07-03

    With the rapid development of wireless communication, sensing, and information acquisition and processing technologies, sensor networks will ultimately influence many aspects of people's lives. The battery resources of sensor nodes must be managed efficiently to prolong network lifetime in large-scale wireless sensor networks (LWSNs). Data aggregation is an important method of removing redundancy and unnecessary data transmission, and hence of cutting the energy used in communication. Because sensor nodes are deployed in hostile environments, the confidentiality and integrity of sensitive information must also be protected. This paper proposes Fully homomorphic Encryption based Secure data Aggregation (FESA) for LWSNs, which protects end-to-end data confidentiality and supports arbitrary aggregation operations over encrypted data. In addition, by utilizing message authentication codes (MACs), the scheme can verify data integrity during data aggregation and forwarding, so that false data can be detected as early as possible. Although FHE increases the computation overhead due to its large public key size, simulation results show that the scheme is implementable in LWSNs and performs well. Compared with other protocols, our scheme reduces both the transmitted data and the network overhead.
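    The core idea of aggregating over encrypted data can be shown with a much simpler additively homomorphic stream cipher (in the style of the Castelluccia-Mykletun-Tsudik scheme; FESA itself uses fully homomorphic encryption, so this is only an illustrative stand-in): an aggregator can sum ciphertexts without ever seeing a plaintext, and the sink removes the summed keys.

    ```python
    # Toy additively homomorphic aggregation: c_i = (m_i + k_i) mod M.
    # Summing ciphertexts yields an encryption of the summed readings.
    import random

    M = 2**32                       # modulus; must exceed the largest aggregate

    def encrypt(m, k):
        return (m + k) % M

    def decrypt(c, k):
        return (c - k) % M

    # Each node shares a pairwise key with the sink (keys are per-round).
    readings = [17, 42, 8, 99]      # hypothetical sensor values
    keys = [random.randrange(M) for _ in readings]

    # The aggregator sums ciphertexts only -- it never sees plaintexts.
    ciphertexts = [encrypt(m, k) for m, k in zip(readings, keys)]
    aggregate_c = sum(ciphertexts) % M

    # The sink subtracts the sum of keys to recover the sum of readings.
    total = decrypt(aggregate_c, sum(keys) % M)
    print(total)  # 166
    ```

    FESA layers MACs on top of such encrypted aggregation so that tampered partial aggregates are rejected en route rather than at the sink.
    
    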

  10. Deployment of ERP Systems at Automotive Industries, Security Inspection (Case Study: IRAN KHODRO Automotive Company)

    NASA Astrophysics Data System (ADS)

    Ali, Hatamirad; Hasan, Mehrjerdi

    The automotive industry's production process is among the most complex and large-scale of all manufacturing processes. Today, information technology (IT) and ERP systems support a large portion of these production processes; without an integrated system such as ERP, production and supply chain processes would become tangled. ERP systems, the latest generation of MRP systems, simplify the production and sales processes of these industries and have been a major factor in their development. Many large-scale companies are now developing and deploying ERP systems, which streamline organizational processes and increase efficiency. Security is a critical part of an organization's ERP strategy: because ERP systems are integrated and extensive, their security matters more than that of local, legacy systems, and neglecting it can determine the success or failure of such systems. IRANKHODRO is the biggest automotive factory in the Middle East, with an annual production of over 600,000 cars. This paper presents the ERP security deployment experience at the IRANKHODRO Company, which recently took a major step forward by launching ERP systems.

  11. Relativistic wide-angle galaxy bispectrum on the light cone

    NASA Astrophysics Data System (ADS)

    Bertacca, Daniele; Raccanelli, Alvise; Bartolo, Nicola; Liguori, Michele; Matarrese, Sabino; Verde, Licia

    2018-01-01

    Given the important role that the galaxy bispectrum has recently acquired in cosmology and the scale and precision of forthcoming galaxy clustering observations, it is timely to derive the full expression of the large-scale bispectrum, going beyond approximate treatments which neglect integrated terms or higher-order bias terms, or which use the Limber approximation. On cosmological scales, relativistic effects that arise from observing the past light cone alter the observed galaxy number counts, therefore leaving their imprints on N-point correlators at all orders. In this paper we compute for the first time the bispectrum including all general relativistic, local and integrated, effects at second order, the tracers' bias at second order, geometric effects, as well as the primordial non-Gaussianity contribution. This is timely considering that future surveys will probe scales comparable to the horizon, where approximations widely used currently may not hold; neglecting these effects may introduce biases in the estimation of cosmological parameters as well as of primordial non-Gaussianity.

  12. Communication: A reduced scaling J-engine based reformulation of SOS-MP2 using graphics processing units.

    PubMed

    Maurer, S A; Kussmann, J; Ochsenfeld, C

    2014-08-07

    We present a low-prefactor, cubically scaling scaled-opposite-spin second-order Møller-Plesset perturbation theory (SOS-MP2) method which is highly suitable for massively parallel architectures such as graphics processing units (GPUs). The scaling is reduced from O(N⁵) to O(N³) by a reformulation of the MP2 expression in the atomic orbital basis via Laplace transformation and the resolution-of-the-identity (RI) approximation of the integrals, in combination with efficient sparse algebra for the 3-center integral transformation. In contrast to previous works that employ GPUs for post-Hartree-Fock calculations, we do not simply employ GPU-based linear algebra libraries to accelerate the conventional algorithm. Instead, our reformulation allows us to replace the rate-determining contraction step with a modified J-engine algorithm, which has been proven to be highly efficient on GPUs. Thus, our SOS-MP2 scheme enables us to treat large molecular systems in an accurate and efficient manner on a single GPU server.
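    The Laplace-transform step underlying this reformulation rests on the identity 1/x = ∫₀^∞ e^(-xt) dt, which replaces orbital-energy denominators with a short sum of exponentials and thereby decouples the indices that force O(N⁵) scaling. A minimal numerical sketch (in Python rather than the authors' implementation, with a crude trapezoidal grid standing in for the few-point minimax quadratures used in real codes):

    ```python
    # Approximate 1/x as a finite sum of exponentials, the identity behind
    # Laplace-transformed MP2: 1/x = integral_0^inf exp(-x*t) dt.
    import math

    def laplace_weights(t_max=40.0, n=40000):
        """Trapezoidal points/weights on [0, t_max]; production codes use
        optimized quadratures with only a handful of points."""
        h = t_max / n
        pts = [i * h for i in range(n + 1)]
        wts = [h] * (n + 1)
        wts[0] = wts[-1] = h / 2
        return pts, wts

    def inv_via_laplace(x, pts, wts):
        # 1/x ~= sum_k w_k * exp(-x * t_k)
        return sum(w * math.exp(-x * t) for t, w in zip(pts, wts))

    pts, wts = laplace_weights()
    for x in (0.5, 1.0, 2.0, 5.0):   # stand-ins for energy denominators
        assert abs(inv_via_laplace(x, pts, wts) - 1.0 / x) < 1e-3
    ```
    
    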

  13. Tomato functional genomics database (TFGD): a comprehensive collection and analysis package for tomato functional genomics

    USDA-ARS?s Scientific Manuscript database

    Tomato Functional Genomics Database (TFGD; http://ted.bti.cornell.edu) provides a comprehensive systems biology resource to store, mine, analyze, visualize and integrate large-scale tomato functional genomics datasets. The database is expanded from the previously described Tomato Expression Database...

  14. Interactive graphical computer-aided design system

    NASA Technical Reports Server (NTRS)

    Edge, T. M.

    1975-01-01

    System is used for design, layout, and modification of large-scale-integrated (LSI) metal-oxide semiconductor (MOS) arrays. System is structured around small computer which provides real-time support for graphics storage display unit with keyboard, slave display unit, hard copy unit, and graphics tablet for designer/computer interface.

  15. Optimizing C-17 Pacific Basing

    DTIC Science & Technology

    2014-05-01

    Starlifter and C-5 Galaxy turbofan-powered aircraft (Owen, 2014). Pax Americana was characterized by a busy and steady operating environment for US global...center of the Pacific as well as the north and western Pacific rim (Owen, 2014). Turbofan airlifters began to be integrated into the large-scale

  16. Urban Elementary STEM Initiative

    ERIC Educational Resources Information Center

    Parker, Carolyn; Abel, Yolanda; Denisova, Ekaterina

    2015-01-01

    The new standards for K-12 science education suggest that student learning should be more integrated and should focus on crosscutting concepts and core ideas from the areas of physical science, life science, Earth/space science, and engineering/technology. This paper describes large-scale, urban elementary-focused science, technology, engineering,…

  17. PeanutBase and other bioinformatic resources for peanut

    USDA-ARS?s Scientific Manuscript database

    Large-scale genomic data for peanut have only become available in the last few years, with the advent of low-cost sequencing technologies. To make the data accessible to researchers and to integrate across diverse types of data, the International Peanut Genomics Consortium funded the development of ...

  18. Power Grid Data Analysis with R and Hadoop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, Ryan P.; Gibson, Tara D.; Kleese van Dam, Kerstin

    This book chapter presents an approach to the analysis of large-scale time-series sensor data based on our experience with power grid data. We use the R-Hadoop Integrated Programming Environment (RHIPE) to analyze a 2 TB data set and present code and results for this analysis.
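    The divide-and-recombine pattern that RHIPE implements on Hadoop can be sketched outside R as well: partition records by key (the map/shuffle), summarize each partition, and recombine the per-key summaries. A minimal Python illustration (the key/value structure and per-sensor summary are hypothetical, not the chapter's actual code):

    ```python
    # Divide-and-recombine: group time-series records by sensor key, then
    # apply a per-partition summary and recombine the results.
    from collections import defaultdict
    from statistics import mean

    records = [                      # (sensor_id, frequency reading) pairs
        ("pmu1", 59.98), ("pmu2", 60.01), ("pmu1", 60.02),
        ("pmu2", 59.97), ("pmu1", 60.00),
    ]

    # Divide: group records by key, as the Hadoop shuffle would.
    by_key = defaultdict(list)
    for key, value in records:
        by_key[key].append(value)

    # Recombine: apply a summary statistic to each partition.
    summary = {k: {"n": len(v), "mean": round(mean(v), 3)}
               for k, v in by_key.items()}
    print(summary)
    ```

    At the 2 TB scale, the grouping and per-partition summaries run as distributed Hadoop jobs, but the analyst writes only the small divide and recombine functions.
    
    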

  19. Systems Proteomics for Translational Network Medicine

    PubMed Central

    Arrell, D. Kent; Terzic, Andre

    2012-01-01

    Universal principles underlying network science, and their ever-increasing applications in biomedicine, underscore the unprecedented capacity of systems biology-based strategies to synthesize and resolve massive high-throughput datasets. Enabling previously unattainable comprehension of biological complexity, systems approaches have accelerated progress in elucidating disease prediction, progression, and outcome. Applied to the spectrum of states spanning health and disease, network proteomics establishes a collation, integration, and prioritization algorithm to guide mapping and decoding of proteome landscapes from large-scale raw data. Providing unparalleled deconvolution of protein lists into global interactomes, integrative systems proteomics enables objective, multi-modal interpretation at molecular, pathway, and network scales, merging individual molecular components, their plurality of interactions, and functional contributions for systems comprehension. As such, network systems approaches are increasingly exploited for objective interpretation of cardiovascular proteomics studies. Here, we highlight network systems proteomic analysis pipelines for integration and biological interpretation through protein cartography, ontological categorization, pathway and functional enrichment, and complex network analysis. PMID:22896016

  20. Sapphire Energy - Integrated Algal Biorefinery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Rebecca L.; Tyler, Mike

    2015-07-22

    Sapphire Energy, Inc. (SEI) is a leader in large-scale photosynthetic algal biomass production, with a strongly cohesive research, development, and operations program. SEI takes a multidisciplinary approach, integrating lab-based strain selection, cultivation and harvest at production scale, and extraction for the production of Green Crude oil, a drop-in replacement for traditional crude oil. SEI's technical accomplishments since 2007 have produced a multifunctional platform that can address needs for fuel, feed, and other higher-value products. Figure 1 outlines SEI's commercialization process, including Green Crude production and refinement into drop-in fuel replacements. The large-scale algal biomass production facility, the SEI Integrated Algal Biorefinery (IABR), was built in Luna County near Columbus, New Mexico (see fig 2). The extraction unit was located at the existing SEI facility in Las Cruces, New Mexico, approximately 95 miles from the IABR. The IABR facility was constructed on time and on budget, and the extraction unit expansion to accommodate the biomass output from the IABR was completed in October 2012. The IABR facility uses open-pond cultivation with a proprietary harvesting method to produce algal biomass; this biomass is then shipped to the extraction facility for conversion to Green Crude. The operation of the IABR and extraction facilities has demonstrated the critical integration of traditional agricultural techniques with algae cultivation knowledge for algal biomass production, and the successful conversion of the biomass to Green Crude. All primary unit operations are de-risked, and at a scale suitable for process demonstration. The results are stable, reliable, long-term cultivation of strains for year-round algal biomass production. From June 2012 to November 2014, the IABR and extraction facilities produced 524 metric tons (MT) of biomass (on a dry weight basis) and 2,587 gallons of Green Crude.
    Additionally, the IABR demonstrated significant year-over-year yield improvements (2013 to 2014) and a reduction in the cost of biomass production. The IABR thus fulfills a number of critical functions in SEI's integrated development pipeline, functions that are critical in general for the commercialization of algal biomass production and the production of biofuels from algal biomass.

  1. Quantifying biological integrity of California sage scrub communities using plant life-form cover.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamada, Y.; Stow, D. A.; Franklin, J.

    2010-01-01

    The California sage scrub (CSS) community type in California's Mediterranean-type ecosystems supports a large number of rare, threatened, and endangered species, and is critically degraded and endangered. Monitoring ecological variables that provide information about community integrity is vital to conserving these biologically diverse communities. Fractional cover of true shrub, subshrub, herbaceous vegetation, and bare ground should fill the information gap between generalized vegetation type maps and detailed field-based plot measurements of species composition, and provide an effective means of quantifying CSS community integrity. Remote sensing is the only tool available for estimating spatially comprehensive fractional cover over large extents, and fractional cover of plant life-form types is one of the measures of vegetation state most amenable to remote sensing. The use of remote sensing does not eliminate the need for either field surveying or vegetation type mapping; rather, a combination of approaches will likely be required to reliably estimate life-form cover and to provide comprehensive information for communities. According to our review and synthesis, life-form fractional cover has strong potential for providing ecologically meaningful intermediate-scale information that is unattainable from vegetation type maps and species-level field measurements. Thus, we strongly recommend incorporating fractional cover of true shrub, subshrub, herb, and bare ground in CSS community monitoring methods. Estimating life-form cover at a 25 m × 25 m spatial scale using remote sensing would be an appropriate approach for initial implementation.
    Investigation of remote sensing techniques and an appropriate spatial scale, collaboration among resource managers, biologists, and remote sensing specialists, and refinement of protocols are essential for integrating life-form fractional cover mapping into strategies for sustainable long-term CSS community management.

  2. Hydrologic connectivity and the contribution of stream headwaters to ecological integrity at regional scales

    USGS Publications Warehouse

    Freeman, Mary C.; Pringle, C.M.; Jackson, C.R.

    2007-01-01

    Cumulatively, headwater streams contribute to maintaining hydrologic connectivity and ecosystem integrity at regional scales. Hydrologic connectivity is the water-mediated transport of matter, energy and organisms within or between elements of the hydrologic cycle. Headwater streams compose over two-thirds of total stream length in a typical river drainage and directly connect the upland and riparian landscape to the rest of the stream ecosystem. Altering headwater streams, e.g., by channelization, diversion through pipes, impoundment and burial, modifies fluxes between uplands and downstream river segments and eliminates distinctive habitats. The large-scale ecological effects of altering headwaters are amplified by land uses that alter runoff and nutrient loads to streams, and by widespread dam construction on larger rivers (which frequently leaves free-flowing upstream portions of river systems essential to sustaining aquatic biodiversity). We discuss three examples of large-scale consequences of cumulative headwater alteration. Downstream eutrophication and coastal hypoxia result, in part, from agricultural practices that alter headwaters and wetlands while increasing nutrient runoff. Extensive headwater alteration is also expected to lower secondary productivity of river systems by reducing stream-system length and trophic subsidies to downstream river segments, affecting aquatic communities and terrestrial wildlife that utilize aquatic resources. Reduced viability of freshwater biota may occur with cumulative headwater alteration, including for species that occupy a range of stream sizes but for which headwater streams diversify the network of interconnected populations or enhance survival for particular life stages. 
Developing a more predictive understanding of ecological patterns that may emerge on regional scales as a result of headwater alterations will require studies focused on components and pathways that connect headwaters to river, coastal and terrestrial ecosystems. Linkages between headwaters and downstream ecosystems cannot be discounted when addressing large-scale issues such as hypoxia in the Gulf of Mexico and global losses of biodiversity.

  3. Filling in the blanks. An estimation of illicit cannabis growers' profits in Belgium.

    PubMed

    Vanhove, Wouter; Surmont, Tim; Van Damme, Patrick; De Ruyver, Brice

    2014-05-01

    As a result of increased pressure on cannabis cultivation in The Netherlands, the number of confiscated indoor cannabis plantations in Belgium is rising. Although increases are reported for all plantation sizes, half of the seized plantations contain fewer than 50 plants. In this study, the factors and variables that influence the costs and benefits of indoor cannabis cultivation are investigated, as well as how these costs and benefits vary between different cannabis grower types. Real-world data from four growers were used to perform financial analyses. Costs included fixed and variable material costs as well as opportunity costs. Gross revenue per grow cycle was calculated based on the most recent forensic findings for illicit Belgian cannabis plantations and was adjusted for the risk of getting caught. Finally, gross revenues and return on costs (ROC) were calculated over one year (4 cycles). The financial analysis shows that in all cases gross revenues as well as ROC are considerable, even after a single grow cycle. The highest profitability was found for large-scale (600 plants, ROC=6.8) and mid-scale plantations (150 plants, ROC=6.0). However, industrial plantations (23,000 plants, ROC=1.4) and micro-scale plantations (5 plants, ROC=2.8) are also highly remunerative. A shift of police focus away from micro-scale growers, who are least likely to be involved in criminal gangs, toward large-scale and industrial-scale plantations would influence costs by changing the risk of getting caught. However, sensitivity analysis shows that this does not significantly affect the conclusions on the profitability of different types of indoor cannabis growers. Seizure and confiscation of profits are important elements of the integral, integrated policy approach required for tackling illicit indoor cannabis plantations. The large return on costs evidenced in the present study underpins the policy relevance of confiscating those illicit profits as part of enforcement.
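    The ROC calculation described above (gross revenue per cycle, adjusted for the risk of getting caught, with profit divided by total costs over four cycles) is simple arithmetic. The numbers below are hypothetical, chosen only to illustrate the calculation, not the study's grower data:

    ```python
    # Illustrative return-on-costs (ROC) over one year of 4 grow cycles
    # (all figures hypothetical, not the study's data).
    cycles = 4
    revenue_per_cycle = 60000.0   # gross revenue per cycle, EUR (assumed)
    p_caught = 0.2                # per-cycle risk of seizure (assumed)
    fixed_costs = 25000.0         # equipment etc., paid once (assumed)
    variable_per_cycle = 8000.0   # electricity, nutrients, opportunity costs

    # Risk-adjusted expected revenue over the year.
    expected_revenue = cycles * revenue_per_cycle * (1 - p_caught)
    total_costs = fixed_costs + cycles * variable_per_cycle

    # ROC as net profit divided by total costs.
    roc = (expected_revenue - total_costs) / total_costs
    print(round(roc, 2))  # 2.37
    ```

    A higher seizure probability lowers expected revenue and hence ROC, which is why shifting enforcement focus changes the profitability of each grower type.
    
    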

  4. Integrating Bioregenerative Foods into the Exploration Spaceflight Food System

    NASA Technical Reports Server (NTRS)

    Douglas, Grace L.

    2017-01-01

    Food, the nutrition it provides, and the eating experiences surrounding it, are central to performance, health, and psychosocial wellbeing on long duration spaceflight missions. Exploration missions will require a spaceflight food system that is safe, nutritious, and acceptable for up to five years, possibly without cold storage. Many of the processed and packaged spaceflight foods currently used on the International Space Station will not retain acceptable quality or required levels of key nutrients under these conditions. The addition of bioregenerative produce to exploration missions may become an important countermeasure to the nutritional gaps and a resource to support psychosocial health. Bioregenerative produce will be central to establishment of Earth-independence as exploration extends deeper into space. However, bioregenerative foods introduce food safety and scarcity risks that must be eliminated prior to crew reliance on these systems. The pathway to Earth independence will require small-scale integration and validation prior to large scale bioregenerative dependence. Near term exploration missions offer the opportunity to establish small scale supplemental salad crop and fruit systems and validate infrastructure reliability, nutritional potential, and the psychosocial benefits necessary to promote further bioregenerative integration.

  5. Delineating the Construct Network of the Personnel Reaction Blank: Associations with Externalizing Tendencies and Normal Personality

    PubMed Central

    Blonigen, Daniel M.; Patrick, Christopher J.; Gasperi, Marianna; Steffen, Benjamin; Ones, Deniz S.; Arvey, Richard D.; de Oliveira Baumgartl, Viviane; do Nascimento, Elizabeth

    2010-01-01

    Integrity testing has long been utilized in personnel selection to screen for tendencies toward counterproductive workplace behaviors. The construct of externalizing from the psychopathology literature represents a coherent spectrum marked by disinhibitory traits and behaviors. The present study used a sample of male and female undergraduates to examine the construct network of the Personnel Reaction Blank (PRB; Gough, Arvey, & Bradley, 2004), a measure of integrity, in relation to externalizing as well as normal-range personality constructs assessed by the Multidimensional Personality Questionnaire (MPQ; Tellegen & Waller, 2008). Results revealed moderate to strong associations between several PRB scales and externalizing, which were largely accounted for by MPQ traits subsumed by Negative Emotionality and Constraint. After accounting for MPQ traits in the prediction of externalizing, a modest predictive increment was achieved when adding the PRB scales, particularly biographical indicators from the Prosocial Background subscale. The findings highlight externalizing as a focal criterion for scale development in the integrity testing literature, and help delineate the construct network of the PRB within the domains of personality and psychopathology. PMID:21171783

  6. Carbon Nanotube Integration with a CMOS Process

    PubMed Central

    Perez, Maximiliano S.; Lerner, Betiana; Resasco, Daniel E.; Pareja Obregon, Pablo D.; Julian, Pedro M.; Mandolesi, Pablo S.; Buffa, Fabian A.; Boselli, Alfredo; Lamagna, Alberto

    2010-01-01

    This work shows the integration of a sensor based on carbon nanotubes using CMOS technology. A chip sensor (CS) was designed and manufactured using a 0.30 μm CMOS process, leaving a free window on the passivation layer that allowed the deposition of SWCNTs over the electrodes. Using the CS, we successfully investigated the effect of humidity and temperature on the electrical transport properties of SWCNTs. The possibility of large-scale integration of SWCNTs with a CMOS process opens a new route to the design of more efficient, low-cost sensors with high reproducibility in their manufacture. PMID:22319330

  7. Computer-aided engineering of semiconductor integrated circuits

    NASA Astrophysics Data System (ADS)

    Meindl, J. D.; Dutton, R. W.; Gibbons, J. F.; Helms, C. R.; Plummer, J. D.; Tiller, W. A.; Ho, C. P.; Saraswat, K. C.; Deal, B. E.; Kamins, T. I.

    1980-07-01

    Economical procurement of small quantities of high-performance custom integrated circuits for military systems is impeded by inadequate process, device, and circuit models that handicap low-cost computer-aided design. The principal objective of this program is to formulate physical models of fabrication processes, devices, and circuits to allow total computer-aided design of custom large-scale integrated circuits. The basic areas under investigation are (1) thermal oxidation, (2) ion implantation and diffusion, (3) chemical vapor deposition of silicon and refractory metal silicides, and (4) device simulation and analytical measurements. This report discusses the fourth year of the program.

  8. High throughput platforms for structural genomics of integral membrane proteins.

    PubMed

    Mancia, Filippo; Love, James

    2011-08-01

    Structural genomics approaches on integral membrane proteins have been postulated for over a decade, yet specific efforts are lagging years behind their soluble counterparts. Indeed, high-throughput methodologies for the production and characterization of prokaryotic integral membrane proteins are only now emerging, while large-scale efforts for eukaryotic ones are still in their infancy. Presented here is a review of recent literature on actively ongoing structural genomics initiatives for membrane proteins, with a focus on those implementing techniques aimed at increasing our rate of success for this class of macromolecules. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Role of Concentrating Solar Power in Integrating Solar and Wind Energy: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denholm, P.; Mehos, M.

    2015-06-03

    As wind and solar photovoltaics (PV) increase in penetration, it is increasingly important to examine enabling technologies that can help integrate these resources at large scale. Concentrating solar power (CSP), when deployed with thermal energy storage (TES), can provide multiple services that help integrate variable generation (VG) resources such as wind and PV. CSP with TES can provide firm, highly flexible capacity, reducing the minimum generation constraints which limit penetration and result in curtailment. By acting as an enabling technology, CSP can complement PV and wind, substantially increasing their penetration in locations with adequate solar resource.

  10. Bioenergy Development Policy and Practice Must Recognize Potential Hydrologic Impacts: Lessons from the Americas.

    PubMed

    Watkins, David W; de Moraes, Márcia M G Alcoforado; Asbjornsen, Heidi; Mayer, Alex S; Licata, Julian; Lopez, Jose Gutierrez; Pypker, Thomas G; Molina, Vivianna Gamez; Marques, Guilherme Fernandes; Carneiro, Ana Cristina Guimaraes; Nuñez, Hector M; Önal, Hayri; da Nobrega Germano, Bruna

    2015-12-01

    Large-scale bioenergy production will affect the hydrologic cycle in multiple ways, including changes in canopy interception, evapotranspiration, infiltration, and the quantity and quality of surface runoff and groundwater recharge. As such, the water footprints of bioenergy sources vary significantly by type of feedstock, soil characteristics, cultivation practices, and hydro-climatic regime. Furthermore, water management implications of bioenergy production depend on existing land use, relative water availability, and competing water uses at a watershed scale. This paper reviews previous research on the water resource impacts of bioenergy production, from plot-scale hydrologic and nutrient cycling impacts to watershed and regional scale hydro-economic systems relationships. Primary gaps in knowledge that hinder policy development for integrated management of water-bioenergy systems are highlighted. Four case studies in the Americas are analyzed to illustrate relevant spatial and temporal scales for impact assessment, along with unique aspects of biofuel production compared to other agroforestry systems, such as energy-related conflicts and tradeoffs. Based on the case studies, the potential benefits of integrated resource management are assessed, as is the need for further case-specific research.

  11. Bioenergy Development Policy and Practice Must Recognize Potential Hydrologic Impacts: Lessons from the Americas

    NASA Astrophysics Data System (ADS)

    Watkins, David W.; de Moraes, Márcia M. G. Alcoforado; Asbjornsen, Heidi; Mayer, Alex S.; Licata, Julian; Lopez, Jose Gutierrez; Pypker, Thomas G.; Molina, Vivianna Gamez; Marques, Guilherme Fernandes; Carneiro, Ana Cristina Guimaraes; Nuñez, Hector M.; Önal, Hayri; da Nobrega Germano, Bruna

    2015-12-01

    Large-scale bioenergy production will affect the hydrologic cycle in multiple ways, including changes in canopy interception, evapotranspiration, infiltration, and the quantity and quality of surface runoff and groundwater recharge. As such, the water footprints of bioenergy sources vary significantly by type of feedstock, soil characteristics, cultivation practices, and hydro-climatic regime. Furthermore, water management implications of bioenergy production depend on existing land use, relative water availability, and competing water uses at a watershed scale. This paper reviews previous research on the water resource impacts of bioenergy production—from plot-scale hydrologic and nutrient cycling impacts to watershed and regional scale hydro-economic systems relationships. Primary gaps in knowledge that hinder policy development for integrated management of water-bioenergy systems are highlighted. Four case studies in the Americas are analyzed to illustrate relevant spatial and temporal scales for impact assessment, along with unique aspects of biofuel production compared to other agroforestry systems, such as energy-related conflicts and tradeoffs. Based on the case studies, the potential benefits of integrated resource management are assessed, as is the need for further case-specific research.

  12. Multi-scale integration and predictability in resting state brain activity

    PubMed Central

    Kolchinsky, Artemy; van den Heuvel, Martijn P.; Griffa, Alessandra; Hagmann, Patric; Rocha, Luis M.; Sporns, Olaf; Goñi, Joaquín

    2014-01-01

    The human brain displays heterogeneous organization in both structure and function. Here we develop a method to characterize brain regions and networks in terms of information-theoretic measures. We look at how these measures scale when larger spatial regions as well as larger connectome sub-networks are considered. This framework is applied to human brain fMRI recordings of resting-state activity and DSI-inferred structural connectivity. We find that strong functional coupling across large spatial distances distinguishes functional hubs from unimodal low-level areas, and that this long-range functional coupling correlates with structural long-range efficiency on the connectome. We also find a set of connectome regions that are both internally integrated and coupled to the rest of the brain, and which resemble previously reported resting-state networks. Finally, we argue that information-theoretic measures are useful for characterizing the functional organization of the brain at multiple scales. PMID:25104933
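One of the information-theoretic measures the study builds on can be illustrated with a standard plug-in (histogram) estimate of mutual information between two regional signals. This is a generic sketch on synthetic data, not the study's actual pipeline or estimator.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in (histogram) estimate of mutual information, in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal distribution of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal distribution of y
    nz = pxy > 0                          # avoid log(0) on empty bins
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)            # stand-in for one region's signal
y = x + 0.5 * rng.standard_normal(5000)  # strongly coupled region
z = rng.standard_normal(5000)            # independent region
```

Coupled signals (x, y) yield a markedly larger estimate than independent ones (x, z), which is the kind of contrast used to distinguish integrated hubs from weakly coupled areas.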

  13. A low-frequency chip-scale optomechanical oscillator with 58 kHz mechanical stiffening and more than 100th-order stable harmonics.

    PubMed

    Huang, Yongjun; Flores, Jaime Gonzalo Flor; Cai, Ziqiang; Yu, Mingbin; Kwong, Dim-Lee; Wen, Guangjun; Churchill, Layne; Wong, Chee Wei

    2017-06-29

    For sensitive high-resolution force- and field-sensing applications, large-mass microelectromechanical systems (MEMS) and optomechanical cavities have been proposed to realize sub-aN/Hz^(1/2) resolution levels. In optomechanical cavity-based force and field sensors, the optomechanical coupling is the key parameter for achieving high sensitivity and resolution. Here we demonstrate a chip-scale optomechanical cavity with large mass which operates at a ≈77.7 kHz fundamental mode and intrinsically exhibits a large optomechanical coupling of 44 GHz/nm or more for both optical resonance modes. A mechanical stiffening range of ≈58 kHz and more than 100th-order harmonics are obtained, with which the free-running frequency instability is lower than 10^-6 at a 100 ms integration time. Such results can be applied to further improve the sensing performance of optomechanically inspired chip-scale sensors.

  14. Statistical Analysis of Large-Scale Structure of Universe

    NASA Astrophysics Data System (ADS)

    Tugay, A. V.

    While galaxy cluster catalogs were compiled many decades ago, other structural elements of the cosmic web have been detected with confidence only in the newest works. For example, extragalactic filaments have been described by the velocity field and the SDSS galaxy distribution during the last years. The large-scale structure of the Universe could also be mapped in the future using ATHENA observations in X-rays and SKA observations in the radio band. Until detailed observations are available for most of the volume of the Universe, integral statistical parameters can be used for its description. Methods such as the galaxy correlation function, power spectrum, statistical moments, and peak statistics are commonly used for this purpose. The parameters of the power spectrum and other statistics are important for constraining models of dark matter, dark energy, inflation, and brane cosmology. In the present work we describe the growth of large-scale density fluctuations in the one- and three-dimensional cases with Fourier harmonics of hydrodynamical parameters. As a result, we obtain a power-law relation for the matter power spectrum.
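As a generic illustration of describing structure through Fourier harmonics, the sketch below estimates a 1-D power spectrum from a synthetic density contrast field. The normalization convention and the white-noise test field are illustrative assumptions, not the paper's calculation.

```python
import numpy as np

def power_spectrum_1d(delta, box_size):
    """Estimate P(k) from the Fourier harmonics of a 1-D density
    contrast field, normalized so a unit-variance white-noise field
    gives P(k) ~ box_size / N on average."""
    n = delta.size
    delta_k = np.fft.rfft(delta)
    k = 2 * np.pi * np.fft.rfftfreq(n, d=box_size / n)  # wavenumbers
    pk = np.abs(delta_k) ** 2 * box_size / n**2
    return k[1:], pk[1:]   # drop the k = 0 (mean-density) mode

rng = np.random.default_rng(1)
delta = rng.standard_normal(4096)        # toy white-noise density field
k, pk = power_spectrum_1d(delta, box_size=100.0)
```

For a real density field one would bin P(k) and fit a power law in log-log space to extract the spectral index.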

  15. Resolving the Circumstellar Environment of the Galactic B[e] Supergiant Star MWC 137 from Large to Small Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kraus, Michaela; Nickeler, Dieter H.; Liimets, Tiina

    The Galactic object MWC 137 has been suggested to belong to the group of B[e] supergiants. However, with its large-scale optical bipolar ring nebula and high-velocity jet and knots, it is a rather atypical representative of this class. We performed multiwavelength observations spreading from the optical to the radio regimes. Based on optical imaging and long-slit spectroscopic data, we found that the northern parts of the large-scale nebula are predominantly blueshifted, while the southern regions appear mostly redshifted. We developed a geometrical model consisting of two double cones. Although various observational features can be approximated with such a scenario, the observed velocity pattern is more complex. Using near-infrared integral-field unit spectroscopy, we studied the hot molecular gas in the vicinity of the star. The emission from the hot CO gas arises in a small-scale disk revolving around the star on Keplerian orbits. Although the disk itself cannot be spatially resolved, its emission is reflected by the dust arranged in arc-like structures and the clumps surrounding MWC 137 on small scales. In the radio regime, we mapped the cold molecular gas in the outskirts of the optical nebula. We found that large amounts of cool molecular gas and warm dust embrace the optical nebula in the east, south, and west. No cold gas or dust was detected in the north and northwestern regions. Despite the new insights into the nebula kinematics gained from our studies, the real formation scenario of the large-scale nebula remains an open issue.

  16. Commercial-scale biotherapeutics manufacturing facility for plant-made pharmaceuticals.

    PubMed

    Holtz, Barry R; Berquist, Brian R; Bennett, Lindsay D; Kommineni, Vally J M; Munigunti, Ranjith K; White, Earl L; Wilkerson, Don C; Wong, Kah-Yat I; Ly, Lan H; Marcel, Sylvain

    2015-10-01

    Rapid, large-scale manufacture of medical countermeasures can be uniquely met by the plant-made-pharmaceutical platform technology. As a participant in the Defense Advanced Research Projects Agency (DARPA) Blue Angel project, the Caliber Biotherapeutics facility was designed, constructed, commissioned and released a therapeutic target (H1N1 influenza subunit vaccine) in <18 months from groundbreaking. As of 2015, this facility was one of the world's largest plant-based manufacturing facilities, with the capacity to process over 3500 kg of plant biomass per week in an automated multilevel growing environment using proprietary LED lighting. The facility can commission additional plant grow rooms that are already built to double this capacity. In addition to the commercial-scale manufacturing facility, a pilot production facility was designed based on the large-scale manufacturing specifications as a way to integrate product development and technology transfer. The primary research, development and manufacturing system employs vacuum-infiltrated Nicotiana benthamiana plants grown in a fully contained, hydroponic system for transient expression of recombinant proteins. This expression platform has been linked to a downstream process system, analytical characterization, and assessment of biological activity. This integrated approach has demonstrated rapid, high-quality production of therapeutic monoclonal antibody targets, including a panel of rituximab biosimilar/biobetter molecules and antiviral antibodies against influenza and dengue fever. © 2015 Society for Experimental Biology, Association of Applied Biologists and John Wiley & Sons Ltd.

  17. Smectic viral capsids and the aneurysm instability

    NASA Astrophysics Data System (ADS)

    Dharmavaram, S.; Rudnick, J.; Lawrence, C. M.; Bruinsma, R. F.

    2018-05-01

    The capsids of certain Archaea-infecting viruses undergo large shape changes, while maintaining their integrity against rupture by osmotic pressure. We propose that these capsids are in a smectic liquid crystalline state, with the capsid proteins assembling along spirals. We show that smectic capsids are intrinsically stabilized against the formation of localized bulges with non-zero Gauss curvature while still allowing for large-scale cooperative shape transformation that involves global changes in the Gauss curvature.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Jong-Won; Hirao, Kimihiko

    Long-range corrected density functional theory (LC-DFT) attracts much attention from chemists as a quantum chemical method applicable to large molecular systems and their property calculations. However, the expensive time cost of evaluating the long-range HF exchange is a big obstacle to overcome before the method can be applied to large molecular systems and solid-state materials. To address this problem, we propose a linear-scaling method for the HF exchange integration, in particular for the LC-DFT hybrid functional.
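As a toy illustration of the general idea behind linear-scaling evaluation, the sketch below shows how screening negligible distant pairs of a rapidly decaying interaction bounds the work per center. This is not the authors' algorithm for the long-range HF exchange, only the scaling principle that such methods exploit.

```python
import math

def screened_pair_sum(sorted_centers, cutoff):
    """Sum a rapidly decaying pair interaction exp(-r) over a sorted 1-D
    chain of centers, skipping pairs beyond the cutoff. Because the
    centers are sorted, the inner loop breaks early, so the number of
    evaluated pairs grows linearly with system size."""
    total = 0.0
    pairs = 0
    n = len(sorted_centers)
    for i in range(n):
        xi = sorted_centers[i]
        for j in range(i + 1, n):
            r = sorted_centers[j] - xi
            if r > cutoff:
                break          # all remaining pairs are even farther
            total += math.exp(-r)
            pairs += 1
    return total, pairs

centers = [0.5 * i for i in range(1000)]   # 1-D chain, spacing 0.5
total, pairs = screened_pair_sum(centers, cutoff=5.0)
```

With 1000 centers, only about 10 pairs per center survive the screen, so the evaluated-pair count stays O(N) instead of O(N^2).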

  19. Read-In Integrated Circuits for Large-Format Multi-Chip Emitter Arrays

    DTIC Science & Technology

    2015-03-31

    A chip has been designed and fabricated using the ONSEMI C5N process to verify our approach. Keywords: large-scale arrays; tiling; mosaic; abutment. ... X and y addressing is not a sustainable and easily expanded addressing architecture, nor will it work well with abutted RIICs. ... Abutting RIICs into an array is challenging because of the precise positioning required to achieve a uniform image.

  20. Operationalizing ecological resilience at a landscape scale: A framework and case study from Silicon Valley

    NASA Astrophysics Data System (ADS)

    Beller, E.; Robinson, A.; Grossinger, R.; Grenier, L.; Davenport, A.

    2015-12-01

    Adaptation to climate change requires redesigning our landscapes and watersheds to maximize ecological resilience at large scales and integrated across urban areas, wildlands, and a diversity of ecosystem types. However, it can be difficult for environmental managers and designers to access, interpret, and apply resilience concepts at meaningful scales and across a range of settings. To address this gap, we produced a Landscape Resilience Framework that synthesizes the latest science on the qualitative mechanisms that drive resilience of ecological functions to climate change and other large-scale stressors. The framework is designed to help translate resilience science into actionable ecosystem conservation and restoration recommendations and adaptation strategies by providing a concise but comprehensive list of considerations that will help integrate resilience concepts into urban design, conservation planning, and natural resource management. The framework is composed of seven principles that represent core attributes which determine the resilience of ecological functions within a landscape. These principles are: setting, process, connectivity, redundancy, diversity/complexity, scale, and people. For each principle we identify several key operationalizable components that help illuminate specific recommendations and actions that are likely to contribute to landscape resilience for locally appropriate species, habitats, and biological processes. We are currently using the framework to develop landscape-scale recommendations for ecological resilience in the heavily urbanized Silicon Valley, California, in collaboration with local agencies, companies, and regional experts. The resilience framework is being applied across the valley, including urban, suburban, and wildland areas and terrestrial and aquatic ecosystems. 
Ultimately, the framework will underpin the development of strategies that can be implemented to bolster ecological resilience from a site to landscape scale.

  1. A survey on routing protocols for large-scale wireless sensor networks.

    PubMed

    Li, Changle; Zhang, Hanxiao; Hao, Binbin; Li, Jiandong

    2011-01-01

    With the advances in micro-electronics, wireless sensor devices have been made much smaller and more integrated, and large-scale wireless sensor networks (WSNs) based on the cooperation among a significant number of nodes have become a hot topic. "Large-scale" mainly means a large area or high density of a network. Accordingly, the routing protocols must scale well to network scope extension and node density increases. A sensor node is normally energy-limited and cannot be recharged, and thus its energy consumption has a quite significant effect on the scalability of the protocol. To the best of our knowledge, the mainstream methods currently used to solve the energy problem in large-scale WSNs are hierarchical routing protocols. In a hierarchical routing protocol, all the nodes are divided into several groups with different assignment levels. The nodes within the high level are responsible for data aggregation and management work, and the low-level nodes for sensing their surroundings and collecting information. Hierarchical routing protocols are proved to be more energy-efficient than flat ones, in which all the nodes play the same role, especially in terms of data aggregation and the flooding of control packets. With a focus on the hierarchical structure, in this paper we provide an insight into routing protocols designed specifically for large-scale WSNs. According to their different objectives, the protocols are generally classified based on criteria such as control overhead reduction, energy consumption mitigation, and energy balance. In order to gain a comprehensive understanding of each protocol, we highlight their innovative ideas, describe the underlying principles in detail, and analyze their advantages and disadvantages.
Moreover, a comparison of the routing protocols is conducted to demonstrate the differences between them in terms of message complexity, memory requirements, localization, data aggregation, clustering manner, and other metrics. Finally, some open issues in routing protocol design for large-scale wireless sensor networks and conclusions are presented.
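The cluster-head rotation at the heart of many hierarchical WSN protocols can be sketched as a LEACH-style threshold election. LEACH is an illustrative choice here, one of many protocols the survey covers; the parameters below are assumptions.

```python
import random

def elect_cluster_heads(nodes, p, round_no, recent_heads, rng):
    """LEACH-style election: a node that has not served as cluster head
    in the current epoch of 1/p rounds volunteers with threshold
    probability T(n) = p / (1 - p * (r mod 1/p)), which rises toward 1
    as the epoch progresses so every node eventually takes a turn."""
    epoch = int(1 / p)
    threshold = p / (1 - p * (round_no % epoch))
    return [n for n in nodes
            if n not in recent_heads and rng.random() < threshold]

rng = random.Random(42)
heads = elect_cluster_heads(range(100), p=0.05, round_no=0,
                            recent_heads=set(), rng=rng)
```

Rotating the energy-hungry head role this way is what gives hierarchical schemes their energy-balancing advantage over flat routing.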

  2. Sand waves in environmental flows: Insights gained by coupling large-eddy simulation with morphodynamics

    NASA Astrophysics Data System (ADS)

    Sotiropoulos, Fotis; Khosronejad, Ali

    2016-02-01

    Sand waves arise in subaqueous and Aeolian environments as the result of the complex interaction between turbulent flows and mobile sand beds. They occur across a wide range of spatial scales, evolve at temporal scales much slower than the integral scale of the transporting turbulent flow, dominate river morphodynamics, undermine streambank stability and infrastructure during flooding, and sculpt terrestrial and extraterrestrial landscapes. In this paper, we present the vision for our work over the last ten years, which has sought to develop computational tools capable of simulating the coupled interactions of sand waves with turbulence across the broad range of relevant scales: from small-scale ripples in laboratory flumes to mega-dunes in large rivers. We review the computational advances that have enabled us to simulate the genesis and long-term evolution of arbitrarily large and complex sand dunes in turbulent flows using large-eddy simulation and summarize numerous novel physical insights derived from our simulations. Our findings explain the role of turbulent sweeps in the near-bed region as the primary mechanism for destabilizing the sand bed, show that the seeds of the emergent structure in dune fields lie in the heterogeneity of the turbulence and bed shear stress fluctuations over the initially flatbed, and elucidate how large dunes at equilibrium give rise to energetic coherent structures and modify the spectra of turbulence. We also discuss future challenges and our vision for advancing a data-driven simulation-based engineering science approach for site-specific simulations of river flooding.

  3. Simulations of turbulent rotating flows using a subfilter scale stress model derived from the partially integrated transport modeling method

    NASA Astrophysics Data System (ADS)

    Chaouat, Bruno

    2012-04-01

    The partially integrated transport modeling (PITM) method [B. Chaouat and R. Schiestel, "A new partially integrated transport model for subgrid-scale stresses and dissipation rate for turbulent developing flows," Phys. Fluids 17, 065106 (2005), 10.1063/1.1928607; R. Schiestel and A. Dejoan, "Towards a new partially integrated transport model for coarse grid and unsteady turbulent flow simulations," Theor. Comput. Fluid Dyn. 18, 443 (2005), 10.1007/s00162-004-0155-z; B. Chaouat and R. Schiestel, "From single-scale turbulence models to multiple-scale and subgrid-scale models by Fourier transform," Theor. Comput. Fluid Dyn. 21, 201 (2007), 10.1007/s00162-007-0044-3; B. Chaouat and R. Schiestel, "Progress in subgrid-scale transport modelling for continuous hybrid non-zonal RANS/LES simulations," Int. J. Heat Fluid Flow 30, 602 (2009), 10.1016/j.ijheatfluidflow.2009.02.021], viewed as a continuous approach for hybrid RANS/LES (Reynolds-averaged Navier-Stokes equations/large eddy simulation) simulations with seamless coupling between RANS and LES regions, is used to derive a subfilter-scale stress model in the framework of second-moment closure applicable in a rotating frame of reference. The present subfilter-scale model is based on the transport equations for the subfilter stresses and the dissipation rate, and appears well suited to simulating unsteady flows on relatively coarse grids or flows with strong departure from spectral equilibrium, because the cutoff wave number can be located almost anywhere inside the energy spectrum. According to the spectral theory developed in wave number space [B. Chaouat and R. Schiestel, "From single-scale turbulence models to multiple-scale and subgrid-scale models by Fourier transform," Theor. Comput. Fluid Dyn. 21, 201 (2007), 10.1007/s00162-007-0044-3], the coefficients used in this model are no longer constants but analytical functions of a dimensionless parameter controlling the spectral distribution of turbulence. The pressure-strain correlation term encompassed in this model is inspired by the nonlinear SSG model [C. G. Speziale, S. Sarkar, and T. B. Gatski, "Modelling the pressure-strain correlation of turbulence: an invariant dynamical systems approach," J. Fluid Mech. 227, 245 (1991), 10.1017/S0022112091000101], developed initially for homogeneous rotating flows in RANS methodology. It is modeled in system rotation using the principle of objectivity, and its modeling is extended in a low-Reynolds-number version for handling non-homogeneous wall flows. The present subfilter-scale stress model is then used for simulating the large scales of rotating turbulent flows on coarse and medium grids at moderate, medium, and high rotation rates. It is also applied to perform a simulation on a refined grid at the highest rotation rate. As a result, it is found that the PITM simulations reproduce fairly well the mean features of rotating channel flows, allowing a drastic reduction of the computational cost in comparison with the one required for performing highly resolved LES. Overall, the mean velocities and turbulent stresses are found to be in good agreement with the data of highly resolved LES [E. Lamballais, O. Metais, and M. Lesieur, "Spectral-dynamic model for large-eddy simulations of turbulent rotating flow," Theor. Comput. Fluid Dyn. 12, 149 (1998)]. The anisotropic character of the flow resulting from the rotation effects is also well reproduced, in accordance with the reference data. Moreover, the PITM2 simulations performed on the medium grid predict qualitatively well the three-dimensional flow structures as well as the longitudinal roll cells which appear in the anticyclonic wall region of the rotating flows. As expected, the PITM3 simulation performed on the refined grid reverts to highly resolved LES. The present model, based on a rational formulation, appears to be an interesting candidate for tackling a large variety of engineering flows subjected to rotation.

  4. Evaluating the implementation of a national disclosure policy for large-scale adverse events in an integrated health care system: identification of gaps and successes.

    PubMed

    Maguire, Elizabeth M; Bokhour, Barbara G; Wagner, Todd H; Asch, Steven M; Gifford, Allen L; Gallagher, Thomas H; Durfee, Janet M; Martinello, Richard A; Elwy, A Rani

    2016-11-11

    Many healthcare organizations, including the Veterans Health Administration (VA), have developed disclosure policies for large-scale adverse events. This study evaluated VA's national large-scale disclosure policy and identified gaps and successes in its implementation. Semi-structured qualitative interviews were conducted with leaders, hospital employees, and patients at nine sites to elicit their perceptions of recent large-scale adverse event notifications and the national disclosure policy. Data were coded using the constructs of the Consolidated Framework for Implementation Research (CFIR). We conducted 97 interviews. Insights included how to handle the communication of large-scale disclosures through multiple levels of a large healthcare organization and how to manage ongoing communications about the event with employees. Of the five CFIR constructs and 26 sub-constructs assessed, seven were prominent in interviews. Leaders and employees specifically mentioned key problem areas involving 1) networks and communications during disclosure, 2) organizational culture, 3) engagement of external change agents during disclosure, and 4) a need for reflecting on and evaluating the policy implementation and the disclosure itself. Patients shared 5) preferences for personal outreach by phone in place of the current use of certified letters. All interviewees discussed 6) issues with execution and 7) costs of the disclosure. The CFIR analysis reveals key problem areas that need to be addressed during disclosure: timely communication patterns throughout the organization; establishing a supportive culture prior to implementation; using patient-approved, effective communication strategies during disclosures; providing follow-up support for employees and patients; and sharing lessons learned.

  5. cOSPREY: A Cloud-Based Distributed Algorithm for Large-Scale Computational Protein Design

    PubMed Central

    Pan, Yuchao; Dong, Yuxi; Zhou, Jingtian; Hallen, Mark; Donald, Bruce R.; Xu, Wei

    2016-01-01

    Finding the global minimum energy conformation (GMEC) of a huge combinatorial search space is the key challenge in computational protein design (CPD) problems. Traditional algorithms lack a scalable and efficient distributed design scheme, preventing researchers from taking full advantage of current cloud infrastructures. We design cloud OSPREY (cOSPREY), an extension to the widely used protein design software OSPREY, to allow the original design framework to scale to commercial cloud infrastructures. We propose several novel designs to integrate both algorithm and system optimizations, such as GMEC-specific pruning, state search partitioning, asynchronous algorithm state sharing, and fault tolerance. We evaluate cOSPREY on three different cloud platforms using different technologies and show that it can solve a number of large-scale protein design problems that have not been possible with previous approaches. PMID:27154509
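The GMEC search that cOSPREY distributes can be sketched in miniature as minimizing a singleton-plus-pairwise energy over rotamer assignments. The random energies and exhaustive enumeration below are placeholders for illustration only; real solvers like OSPREY prune the space (e.g. with dead-end elimination and A*) before searching.

```python
from itertools import product
import random

rng = random.Random(7)
n_pos, n_rot = 4, 3     # 4 design positions, 3 candidate rotamers each

# Placeholder energy tables: singleton energies per (position, rotamer)
# and pairwise energies per (position pair, rotamer pair).
single = [[rng.uniform(0.0, 2.0) for _ in range(n_rot)]
          for _ in range(n_pos)]
pair = {(i, j): [[rng.uniform(0.0, 1.0) for _ in range(n_rot)]
                 for _ in range(n_rot)]
        for i in range(n_pos) for j in range(i + 1, n_pos)}

def energy(conf):
    """Total energy of one rotamer assignment (tuple of rotamer indices)."""
    e = sum(single[i][r] for i, r in enumerate(conf))
    e += sum(pair[i, j][conf[i]][conf[j]] for i, j in pair)
    return e

# Exhaustive search over all 3**4 conformations; the GMEC is the minimum.
gmec = min(product(range(n_rot), repeat=n_pos), key=energy)
```

At realistic sizes the conformation space is astronomically large, which is why pruning and distributed partitioning of the search are the core contributions described above.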

  6. Small, fast, and tough: Shrinking down integrated elastomer transducers

    NASA Astrophysics Data System (ADS)

    Rosset, Samuel; Shea, Herbert R.

    2016-09-01

    We review recent progress in miniaturized dielectric elastomer actuators (DEAs), sensors, and energy harvesters. We focus primarily on configurations where the large strain, high compliance, stretchability, and high level of integration offered by dielectric elastomer transducers provide significant advantages over other mm or μm-scale transduction technologies. We first present the most active application areas, including: tunable optics, soft robotics, haptics, microfluidics, biomedical devices, and stretchable sensors. We then discuss the fabrication challenges related to miniaturization, such as thin membrane fabrication, precise patterning of compliant electrodes, and reliable batch fabrication of multilayer devices. We finally address the impact of miniaturization on strain, force, and driving voltage, as well as the important effect of boundary conditions on the performance of mm-scale DEAs.

  7. Large-scale brain networks in affective and social neuroscience: Towards an integrative functional architecture of the brain

    PubMed Central

    Barrett, Lisa Feldman; Satpute, Ajay

    2013-01-01

    Understanding how a human brain creates a human mind ultimately depends on mapping psychological categories and concepts to physical measurements of neural response. Although it has long been assumed that emotional, social, and cognitive phenomena are realized in the operations of separate brain regions or brain networks, we demonstrate that it is possible to understand the body of neuroimaging evidence using a framework that relies on domain general, distributed structure-function mappings. We review current research in affective and social neuroscience and argue that the emerging science of large-scale intrinsic brain networks provides a coherent framework for a domain-general functional architecture of the human brain. PMID:23352202

  8. Requirements and principles for the implementation and construction of large-scale geographic information systems

    NASA Technical Reports Server (NTRS)

    Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.

    1987-01-01

    This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.
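
    A classical data structure for the efficient storage and retrieval of spatially-indexed data of the kind this survey discusses is the quadtree. The sketch below is illustrative only (it is not taken from the paper): a point quadtree supporting rectangular range queries, with a made-up capacity threshold.

```python
class QuadTree:
    """Toy point quadtree: leaves hold up to CAPACITY points, then subdivide."""
    CAPACITY = 4

    def __init__(self, x0, y0, x1, y1):
        self.bounds = (x0, y0, x1, y1)
        self.points = []     # (x, y, payload) triples stored at this node
        self.children = None

    def _contains(self, x, y):
        x0, y0, x1, y1 = self.bounds
        return x0 <= x < x1 and y0 <= y < y1

    def insert(self, x, y, payload=None):
        if not self._contains(x, y):
            return False
        if self.children is None:
            if len(self.points) < self.CAPACITY:
                self.points.append((x, y, payload))
                return True
            self._subdivide()
        return any(c.insert(x, y, payload) for c in self.children)

    def _subdivide(self):
        x0, y0, x1, y1 = self.bounds
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        self.children = [QuadTree(x0, y0, mx, my), QuadTree(mx, y0, x1, my),
                         QuadTree(x0, my, mx, y1), QuadTree(mx, my, x1, y1)]
        for p in self.points:            # push stored points down one level
            any(c.insert(*p) for c in self.children)
        self.points = []

    def query(self, qx0, qy0, qx1, qy1, out=None):
        """Collect all points inside the query rectangle [qx0,qx1) x [qy0,qy1)."""
        out = [] if out is None else out
        x0, y0, x1, y1 = self.bounds
        if qx1 <= x0 or qx0 >= x1 or qy1 <= y0 or qy0 >= y1:
            return out                   # query rectangle misses this node entirely
        for (x, y, payload) in self.points:
            if qx0 <= x < qx1 and qy0 <= y < qy1:
                out.append((x, y, payload))
        if self.children:
            for c in self.children:
                c.query(qx0, qy0, qx1, qy1, out)
        return out

qt = QuadTree(0, 0, 100, 100)
for x, y, tag in [(10, 10, "a"), (50, 50, "b"), (90, 90, "c"),
                  (12, 14, "d"), (11, 11, "e")]:
    qt.insert(x, y, tag)
hits = qt.query(0, 0, 20, 20)
print(sorted(tag for _, _, tag in hits))
```

    The pruning in `query` (skipping subtrees whose bounds miss the rectangle) is what makes spatial indexes pay off at GIS scale, where a linear scan over all features would be prohibitive.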

  9. Research on unit commitment with large-scale wind power connected power system

    NASA Astrophysics Data System (ADS)

    Jiao, Ran; Zhang, Baoqun; Chi, Zhongjun; Gong, Cheng; Ma, Longfei; Yang, Bing

    2017-01-01

    Large-scale integration of wind power generators into the power grid brings severe challenges to power system economic dispatch due to wind's stochastic volatility. Unit commitment including wind farms is analyzed in terms of both modeling and solution methods. Formulations are classified according to their objective functions and constraints, and their structures and characteristics are summarized. Finally, open issues and possible directions of future research and development are discussed, addressing the requirements of the electricity market, energy-saving generation dispatch and the smart grid, and providing a reference for researchers and practitioners in this field.
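
    As a toy illustration of the net-load idea common in such formulations (not a method from this paper), the sketch below treats forecast wind as negative load and commits thermal units by a marginal-cost priority list until capacity covers net load plus a reserve margin. All unit data and loads are invented.

```python
# Toy single-period unit commitment with wind treated as "negative load" (must-take).
UNITS = [  # (name, capacity in MW, marginal cost in $/MWh) -- hypothetical fleet
    ("coal_1", 400, 22.0),
    ("gas_1",  200, 35.0),
    ("gas_2",  150, 41.0),
]

def commit(load_mw, wind_forecast_mw, reserve_frac=0.10):
    """Commit cheapest units first until capacity covers net load plus reserve.
    The reserve margin is one crude hedge against wind's stochastic volatility."""
    net_load = max(load_mw - wind_forecast_mw, 0.0)
    target = net_load * (1.0 + reserve_frac)
    committed, capacity = [], 0.0
    for name, cap, cost in sorted(UNITS, key=lambda u: u[2]):  # priority list by cost
        if capacity >= target:
            break
        committed.append(name)
        capacity += cap
    return committed, net_load

units, net = commit(load_mw=600, wind_forecast_mw=180)
print(units, net)
```

    Real unit commitment adds multi-period coupling (minimum up/down times, ramp limits, start-up costs) and is typically solved as a mixed-integer program; the priority list only conveys the shape of the problem.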

  10. A multi-landing pad DNA integration platform for mammalian cell engineering

    PubMed Central

    Gaidukov, Leonid; Wroblewska, Liliana; Teague, Brian; Nelson, Tom; Zhang, Xin; Liu, Yan; Jagtap, Kalpana; Mamo, Selamawit; Tseng, Wen Allen; Lowe, Alexis; Das, Jishnu; Bandara, Kalpanie; Baijuraj, Swetha; Summers, Nevin M; Zhang, Lin; Weiss, Ron

    2018-01-01

    Abstract Engineering mammalian cell lines that stably express many transgenes requires the precise insertion of large amounts of heterologous DNA into well-characterized genomic loci, but current methods are limited. To facilitate reliable large-scale engineering of CHO cells, we identified 21 novel genomic sites that supported stable long-term expression of transgenes, and then constructed cell lines containing one, two or three ‘landing pad’ recombination sites at selected loci. By using a highly efficient BxB1 recombinase along with different selection markers at each site, we directed recombinase-mediated insertion of heterologous DNA to selected sites, including targeting all three with a single transfection. We used this method to controllably integrate up to nine copies of a monoclonal antibody, representing about 100 kb of heterologous DNA in 21 transcriptional units. Because the integration was targeted to pre-validated loci, recombinant protein expression remained stable for weeks and additional copies of the antibody cassette in the integrated payload resulted in a linear increase in antibody expression. Overall, this multi-copy site-specific integration platform allows for controllable and reproducible insertion of large amounts of DNA into stable genomic sites, which has broad applications for mammalian synthetic biology, recombinant protein production and biomanufacturing. PMID:29617873

  11. In-orbit assembly mission for the Space Solar Power Station

    NASA Astrophysics Data System (ADS)

    Cheng, ZhengAi; Hou, Xinbin; Zhang, Xinghua; Zhou, Lu; Guo, Jifeng; Song, Chunlin

    2016-12-01

    The Space Solar Power Station (SSPS) is a large spacecraft that utilizes solar power in space to supply power to an electric grid on Earth. A large symmetrical integrated concept has been proposed by the China Academy of Space Technology (CAST). Considering its large scale, the SSPS requires a modular design with unitized general interfaces so that it can be assembled in orbit. The facilities supporting the assembly procedures, which include a Reusable Heavy Lift Launch Vehicle, orbital transfer and space robots, are introduced. An integrated assembly scheme utilizing space robots to realize this platform SSPS concept is presented. This paper gives a preliminary discussion of minimizing the time and energy cost of the assembly mission by optimizing the assembly sequence and route. This optimized assembly mission planning allows the SSPS to be built in orbit rapidly, effectively and reliably.

  12. Assessing the effects of fire disturbances on ecosystems: A scientific agenda for research and management

    USGS Publications Warehouse

    Schmoldt, D.L.; Peterson, D.L.; Keane, R.E.; Lenihan, J.M.; McKenzie, D.; Weise, D.R.; Sandberg, D.V.

    1999-01-01

    A team of fire scientists and resource managers convened 17-19 April 1996 in Seattle, Washington, to assess the effects of fire disturbance on ecosystems. Objectives of this workshop were to develop scientific recommendations for future fire research and management activities. These recommendations included a series of numerically ranked scientific and managerial questions and responses focusing on (1) links among fire effects, fuels, and climate; (2) fire as a large-scale disturbance; (3) fire-effects modeling structures; and (4) managerial concerns, applications, and decision support. At the present time, understanding of fire effects and the ability to extrapolate fire-effects knowledge to large spatial scales are limited, because most data have been collected at small spatial scales for specific applications. Although we clearly need more large-scale fire-effects data, it will be more expedient to concentrate efforts on improving and linking existing models that simulate fire effects in a georeferenced format while integrating empirical data as they become available. A significant component of this effort should be improved communication between modelers and managers to develop modeling tools to use in a planning context. Another component of this modeling effort should improve our ability to predict the interactions of fire and potential climatic change at very large spatial scales. The priority issues and approaches described here provide a template for fire science and fire management programs in the next decade and beyond.

  13. Sampling scales define occupancy and underlying occupancy-abundance relationships in animals

    Treesearch

    Robin Steenweg; Mark Hebblewhite; Jesse Whittington; Paul Lukacs; Kevin McKelvey

    2018-01-01

    Occupancy-abundance (OA) relationships are a foundational ecological phenomenon and field of study, and occupancy models are increasingly used to track population trends and understand ecological interactions. However, these two fields of ecological inquiry remain largely isolated, despite growing appreciation of the importance of integration. For example, using...

  14. High density circuit technology, part 2

    NASA Technical Reports Server (NTRS)

    Wade, T. E.

    1982-01-01

    A multilevel metal interconnection system for very large scale integration (VLSI) systems utilizing polyimides as the interlayer dielectric material is described. A complete characterization of polyimide materials is given as well as experimental methods accomplished using a double level metal test pattern. A low temperature, double exposure polyimide patterning procedure is also presented.

  15. Applying Bayesian Modeling and Receiver Operating Characteristic Methodologies for Test Utility Analysis

    ERIC Educational Resources Information Center

    Wang, Qiu; Diemer, Matthew A.; Maier, Kimberly S.

    2013-01-01

    This study integrated Bayesian hierarchical modeling and receiver operating characteristic analysis (BROCA) to evaluate how interest strength (IS) and interest differentiation (ID) predicted low–socioeconomic status (SES) youth's interest-major congruence (IMC). Using large-scale Kuder Career Search online-assessment data, this study fit three…
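
    The ROC half of the study's BROCA approach reduces, for a single summary number, to the area under the ROC curve, which can be computed from ranks alone. The function below is a generic sketch of that computation, not the study's code; the scores and labels in the example are invented.

```python
def roc_auc(scores, labels):
    """Rank-based AUC: the probability that a randomly chosen positive case
    scores higher than a randomly chosen negative one (ties count half).
    Equivalent to the normalized Mann-Whitney U statistic."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# e.g. predicting interest-major congruence (label 1) from an interest-strength score
auc = roc_auc([0.9, 0.8, 0.3, 0.2], [1, 0, 1, 0])
print(auc)  # 0.75
```

    An AUC of 0.5 means the score is uninformative; values near 1.0 indicate strong separation, which is what a test utility analysis is after.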

  16. A Rich Metadata Filesystem for Scientific Data

    ERIC Educational Resources Information Center

    Bui, Hoang

    2012-01-01

    As scientific research becomes more data intensive, there is an increasing need for scalable, reliable, and high performance storage systems. Such data repositories must provide both data archival services and rich metadata, and cleanly integrate with large scale computing resources. ROARS is a hybrid approach to distributed storage that provides…

  17. What can one sample tell us? Stable isotopes can assess complex processes in national assessments of lakes, rivers and streams.

    EPA Science Inventory

    Stable isotopes can be very useful in large-scale monitoring programs because samples for isotopic analysis are easy to collect, and isotopes integrate information about complex processes such as evaporation from water isotopes and denitrification from nitrogen isotopes. Traditi...

  18. Computers in Electrical Engineering Education at Virginia Polytechnic Institute.

    ERIC Educational Resources Information Center

    Bennett, A. Wayne

    1982-01-01

    Discusses use of computers in Electrical Engineering (EE) at Virginia Polytechnic Institute. Topics include: departmental background, level of computing power using large scale systems, mini and microcomputers, use of digital logic trainers and analog/hybrid computers, comments on integrating computers into EE curricula, and computer use in…

  19. Implementing Technology: A Change Process

    ERIC Educational Resources Information Center

    Atwell, Nedra; Maxwell, Marge; Romero, Elizabeth

    2008-01-01

    The state of Kentucky has embarked upon a large scale systems change effort to integrate Universal Design for Learning (UDL) principles, including use of digital curriculum and computerized reading supports to improve overall student achievement. A major component of this initiative is the use of Read & Write Gold. As higher expectations are…

  20. THE SAN PEDRO RIVER, AN EXAMPLE OF AN INTEGRATED SYSTEM FOR TRANSBORDER ENVIRONMENTAL MANAGEMENT USING GEOSPATIAL DATA AND PROCESS MODELS

    EPA Science Inventory

    These technologies provide the basis for developing landscape composition and pattern indicators as sensitive measures of large-scale environmental change and thus may provide an effective and economical method for evaluating watershed condition related to disturbance from human an...
