Sample records for scale model store

  1. Heteroassociative storage of hippocampal pattern sequences in the CA3 subregion

    PubMed Central

    Recio, Renan S.; Reyes, Marcelo B.

    2018-01-01

    Background Recent research suggests that the CA3 subregion of the hippocampus has properties of both an autoassociative network, due to its ability to complete partial cues, tolerate noise, and store associations between memories, and a heteroassociative one, due to its ability to store and retrieve sequences of patterns. Although there are several computational models of the CA3 as an autoassociative network, more detailed evaluations of its heteroassociative properties are missing. Methods We developed a model of the CA3 subregion containing 10,000 integrate-and-fire neurons with both recurrent excitatory and inhibitory connections, which exhibits coupled oscillations in the gamma and theta ranges. We stored thousands of pattern sequences using a heteroassociative learning rule with competitive synaptic scaling. Results We showed that a purely heteroassociative network model can (i) retrieve pattern sequences from partial cues with external noise and incomplete connectivity, (ii) achieve homeostasis in the number of connections per neuron when many patterns are stored using synaptic scaling, and (iii) continuously update the set of retrievable patterns, guaranteeing that the most recently stored patterns can be retrieved and older ones forgotten. Discussion Heteroassociative networks with synaptic scaling rules seem sufficient to achieve many desirable features regarding connectivity homeostasis, pattern sequence retrieval, noise tolerance and updating of the set of retrievable patterns. PMID:29312826
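
    To make the learning rule concrete, here is a minimal NumPy sketch of heteroassociative sequence storage with a competitive synaptic-scaling step. It is a toy binary-pattern network, not the authors' 10,000-neuron integrate-and-fire model; the sizes, weight cap, and k-winners-take-all readout are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, P, active = 200, 30, 20          # neurons, sequence length, active units per pattern

    # Random binary pattern sequence x_0 ... x_{P-1}
    seq = np.zeros((P, N))
    for p in range(P):
        seq[p, rng.choice(N, active, replace=False)] = 1.0

    # Heteroassociative outer-product rule: W maps pattern t -> pattern t+1
    W = np.zeros((N, N))
    for t in range(P - 1):
        W += np.outer(seq[t + 1], seq[t])

    # Competitive synaptic scaling: cap each neuron's summed incoming weight
    cap = 5.0
    row_sums = W.sum(axis=1, keepdims=True)
    W *= np.minimum(1.0, cap / np.maximum(row_sums, 1e-12))

    def recall(cue, steps):
        """Iteratively retrieve the sequence, keeping the 'active' strongest units."""
        x = cue.copy()
        out = [x]
        for _ in range(steps):
            h = W @ x
            x = np.zeros(N)
            x[np.argsort(h)[-active:]] = 1.0   # k-winners-take-all threshold
            out.append(x)
        return out

    # Partial cue: roughly half of pattern 0's active units
    cue = seq[0] * (rng.random(N) < 0.5)
    retrieved = recall(cue, P - 1)
    overlap = [r @ s / active for r, s in zip(retrieved, seq)]
    print("overlap with stored sequence:", np.round(overlap, 2))
    ```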

  2. Optimal management of nutrient reserves in microorganisms under time-varying environmental conditions.

    PubMed

    Nev, Olga A; Nev, Oleg A; van den Berg, Hugo A

    2017-09-21

    Intracellular reserves are a conspicuous feature of many bacteria; such internal stores are often present in the form of inclusions in which polymeric storage compounds are accumulated. Such reserves tend to increase in times of plenty and be used up in times of scarcity. Mathematical models that describe the dynamical nature of reserve build-up and use are known as "cell quota," "dynamic energy/nutrient budget," or "variable-internal-stores" models. Here we present a stoichiometrically consistent macro-chemical model that accounts for variable stores as well as adaptive allocation of building blocks to various types of catalytic machinery. The model posits feedback loops linking expression of assimilatory machinery to reserve density. The precise form of the "regulatory law" at the heart of such a loop expresses how the cell manages internal stores. We demonstrate how this "regulatory law" can be recovered from experimental data using several empirical data sets. We find that stores should be expected to be negligibly small in stable growth-sustaining environments, but prominent in environments characterised by marked fluctuations on time scales commensurate with the inherent dynamic time scale of the organismal system. Copyright © 2017 Elsevier Ltd. All rights reserved.
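
    A toy illustration of the feedback loop described: the reserve fills in times of plenty and drains in scarcity, while a "regulatory law" down-regulates expression of assimilatory machinery as reserves accumulate. This is a hand-written Euler integration with made-up parameters, not the authors' stoichiometric model.

    ```python
    import numpy as np

    dt, T = 0.01, 60.0
    t = np.arange(0.0, T, dt)
    # Fluctuating external nutrient: plenty in the first half, scarcity after
    S_ext = np.where(t < T / 2, 1.0, 0.05)

    R = 0.1          # reserve density (per unit biomass)
    A = 0.5          # fraction of machinery allocated to assimilation
    mu = 0.3         # growth (reserve consumption) rate constant
    R_hist = []
    for S in S_ext:
        uptake = A * S / (0.2 + S)          # Michaelis-Menten assimilation
        growth = mu * R / (0.1 + R)         # growth draws down the reserve
        R += dt * (uptake - growth)
        # "Regulatory law": assimilatory machinery expression falls as reserves fill
        A += dt * 0.5 * ((1.0 / (1.0 + 5.0 * R)) - A)
        R_hist.append(R)

    print(f"reserve density: end of plenty {R_hist[int(len(t)/2)-1]:.3f}, "
          f"end of scarcity {R_hist[-1]:.3f}")
    ```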

  3. Combining remote sensing and watershed modeling for regional-scale carbon cycling studies in disturbance-prone systems

    NASA Astrophysics Data System (ADS)

    Hanan, E. J.; Tague, C.; Choate, J.; Liu, M.; Adam, J. C.

    2016-12-01

    Disturbance is a major force regulating C dynamics in terrestrial ecosystems. Evaluating future C balance in disturbance-prone systems requires understanding the underlying mechanisms that drive ecosystem processes over multiple scales of space and time. Simulation modeling is a powerful tool for bridging these scales; however, model projections are limited by large uncertainties in the initial state of vegetation C and N stores. Watershed models typically use one of two methods to initialize these stores. Spin-up involves running a model until vegetation reaches steady state based on climate. This "potential" state, however, assumes that the vegetation across the entire watershed has reached maturity and has a homogeneous age distribution. Yet to reliably represent C and N dynamics in disturbance-prone systems, models should be initialized to reflect their non-equilibrium conditions. Alternatively, remote sensing of a single vegetation parameter (typically leaf area index; LAI) can be combined with allometric relationships to allocate C and N to model stores and can reflect non-steady-state conditions. However, allometric relationships are species- and region-specific and do not account for environmental variation, thus resulting in C and N stores that may be unstable. To address this problem, we developed a new approach for initializing C and N pools using the watershed-scale ecohydrologic model RHESSys. The new approach merges the mechanistic stability of spin-up with the spatial fidelity of remote sensing. Unlike traditional spin-up, this approach supports non-homogeneous stand ages. We tested our approach in a pine-dominated watershed in central Idaho, which partially burned in July of 2000. We used LANDSAT and MODIS data to calculate LAI across the watershed following the 2000 fire. We then ran three sets of simulations using spin-up, direct measurements, and the combined approach to initialize vegetation C and N stores, and compared our results to remotely sensed LAI following the simulation period. Model estimates of C, N, and water fluxes varied depending on which approach was used. The combined approach provided the best LAI estimates after 10 years of simulation. This method shows promise for improving projections of C, N, and water fluxes in disturbance-prone watersheds.

  4. Spin-Tunnel Investigation of a 1/20-Scale Model of the Northrop F-5E Airplane

    NASA Technical Reports Server (NTRS)

    Scher, Stanley H.; White, William L.

    1977-01-01

    An investigation has been conducted in the Langley spin tunnel to determine the spin and recovery characteristics of a 1/20-scale model of the Northrop F-5E airplane. The investigation included erect and inverted spins, a range of center-of-gravity locations and moments of inertia, symmetric and asymmetric store loadings, and a determination of the parachute size required for emergency spin recovery. The effects of increased elevator trailing-edge-up deflections, of leading-edge and trailing-edge flap deflections, and of simulating the geometry of large external stores were also determined.

  5. Estimating resource acquisition and at-sea body condition of a marine predator

    PubMed Central

    Schick, Robert S; New, Leslie F; Thomas, Len; Costa, Daniel P; Hindell, Mark A; McMahon, Clive R; Robinson, Patrick W; Simmons, Samantha E; Thums, Michele; Harwood, John; Clark, James S

    2013-01-01

    Body condition plays a fundamental role in many ecological and evolutionary processes at a variety of scales and across a broad range of animal taxa. An understanding of how body condition changes at fine spatial and temporal scales as a result of interaction with the environment provides necessary information about how animals acquire resources. However, comparatively little is known about intra- and interindividual variation of condition in marine systems. Where condition has been studied, changes typically are recorded at relatively coarse time-scales. By quantifying how fine-scale interaction with the environment influences condition, we can broaden our understanding of how animals acquire resources and allocate them to body stores. Here we used a hierarchical Bayesian state-space model to estimate body condition, as measured by the size of an animal's lipid store, in two closely related species of marine predator that occupy different hemispheres: northern elephant seals (Mirounga angustirostris) and southern elephant seals (Mirounga leonina). The observation model linked drift dives to lipid stores. The process model quantified daily changes in lipid stores as a function of the physiological condition of the seal (lipid:lean tissue ratio, departure lipid and departure mass), its foraging location, two measures of behaviour and environmental covariates. We found that physiological condition significantly impacted lipid gain at two time-scales (daily and at departure from the colony), that foraging location was significantly associated with lipid gain in both species of elephant seals, and that long-term behavioural phase was associated with positive lipid gain in northern and southern elephant seals. In northern elephant seals, the occurrence of short-term behavioural states assumed to represent foraging was correlated with lipid gain. Lipid gain was a function of covariates in both species. Southern elephant seals performed fewer drift dives than northern elephant seals and gained lipids at a lower rate. We have demonstrated a new way to obtain time series of body condition estimates for a marine predator at fine spatial and temporal scales. This modelling approach accounts for uncertainty at many levels and has the potential to integrate physiological and movement ecology of top predators. The observation model we used was specific to elephant seals, but the process model can readily be applied to other species, providing an opportunity to understand how animals respond to their environment at a fine spatial scale. PMID:23869551
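
    The essence of the approach (a latent daily lipid process observed with error) can be illustrated with a linear-Gaussian stand-in and a Kalman filter. The authors' actual model is hierarchical and Bayesian, with a drift-dive observation model, so everything below (the foraging covariate, noise levels, parameters) is an invented simplification.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    T = 120                              # days at sea
    forage = (rng.random(T) < 0.6).astype(float)   # 1 = foraging day (covariate)

    # Process: lipid_t = lipid_{t-1} + beta * forage_t + process noise
    beta, q, r = 0.8, 0.2**2, 1.0**2
    lipid = np.empty(T)
    lipid[0] = 50.0
    for tt in range(1, T):
        lipid[tt] = lipid[tt - 1] + beta * forage[tt] + rng.normal(0, np.sqrt(q))
    y = lipid + rng.normal(0, np.sqrt(r), T)       # noisy dive-based observations

    # Kalman filter for the random-walk-with-drift state
    m, P = 50.0, 10.0
    est = np.empty(T)
    for tt in range(T):
        if tt > 0:                       # predict
            m, P = m + beta * forage[tt], P + q
        K = P / (P + r)                  # update
        m, P = m + K * (y[tt] - m), (1 - K) * P
        est[tt] = m
    print("RMSE of filtered lipid store:", np.sqrt(np.mean((est - lipid) ** 2)).round(3))
    ```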

  6. Property of Fluctuations of Sales Quantities by Product Category in Convenience Stores.

    PubMed

    Fukunaga, Gaku; Takayasu, Hideki; Takayasu, Misako

    2016-01-01

    The ability to ascertain the extent of product sale fluctuations for each store and locality is indispensable to inventory management. This study analyzed POS data from 158 convenience stores in Kawasaki City, Kanagawa Prefecture, Japan and found a power scaling law between the mean and standard deviation of product sales quantities for several product categories. For the statistical domains of low sales quantities the power index was 1/2; for large sales quantities the power index was 1, so that the so-called Taylor's law holds. The sales quantity at which the power index changes differed according to product category. We derived a Poissonian compound distribution model taking into account fluctuations in customer numbers to show that the scaling law could be explained theoretically for most items. We also examined why the scaling law did not hold in some exceptional cases.
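
    The reported crossover from a 1/2 power index to Taylor's-law exponent 1 can be reproduced with a toy compound model in which daily sales are binomial draws over a fluctuating customer count. The customer statistics and purchase probabilities below are invented, not taken from the POS data.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    days = 365
    n_mean = 2000
    # Shared day-to-day customer-count fluctuations (~30%) around the mean
    customers = np.maximum((n_mean * (1 + 0.3 * rng.standard_normal(days))).astype(int), 1)

    means, stds = [], []
    for p_buy in np.logspace(-5, -0.5, 25):   # per-customer purchase probability per product
        daily_sales = rng.binomial(customers, p_buy)
        means.append(daily_sales.mean())
        stds.append(daily_sales.std())
    means, stds = np.array(means), np.array(stds)

    for mask, label in [(means < 2, "low-sales "), (means > 50, "high-sales")]:
        slope = np.polyfit(np.log(means[mask]), np.log(stds[mask]), 1)[0]
        print(f"{label} products: scaling exponent ~ {slope:.2f}")
    # Expected: ~0.5 (Poissonian regime) for low sales, ~1.0 (Taylor's law) for high sales
    ```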

  7. Development and Application of a Process-based River System Model at a Continental Scale

    NASA Astrophysics Data System (ADS)

    Kim, S. S. H.; Dutta, D.; Vaze, J.; Hughes, J. D.; Yang, A.; Teng, J.

    2014-12-01

    Existing global and continental scale river models, mainly designed for integration with global climate models, are of very coarse spatial resolution and lack many important hydrological processes, such as overbank flow, irrigation diversion, and groundwater seepage/recharge, which operate at a much finer resolution. Thus, these models are not suitable for producing streamflow forecasts at fine spatial resolution or water accounts at sub-catchment levels, which are important for water resources planning and management at regional and national scales. A large-scale river system model has been developed and implemented for water accounting in Australia as part of the Water Information Research and Development Alliance between Australia's Bureau of Meteorology (BoM) and CSIRO. The model, developed using a node-link architecture, includes all major hydrological processes, anthropogenic water utilisation and storage routing that influence the streamflow in both regulated and unregulated river systems. It includes an irrigation model to compute water diversion for irrigation use and associated fluxes and stores, and a storage-based floodplain inundation model to compute overbank flow from river to floodplain and associated floodplain fluxes and stores. An auto-calibration tool has been built within the modelling system to automatically calibrate the model in large river systems using the Shuffled Complex Evolution optimiser and user-defined objective functions. The auto-calibration tool makes the model computationally efficient and practical for large basin applications. The model has been implemented in several large basins in Australia, including the Murray-Darling Basin, covering more than 2 million km2. The results of calibration and validation of the model show highly satisfactory performance. The model has been operationalised in the BoM for producing various fluxes and stores for national water accounting. This paper introduces this newly developed river system model, describing the conceptual hydrological framework, the methods used for representing different hydrological processes in the model, and the results and evaluation of the model performance. The operational implementation of the model for water accounting is discussed.

  8. Unleashing spatially distributed ecohydrology modeling using Big Data tools

    NASA Astrophysics Data System (ADS)

    Miles, B.; Idaszak, R.

    2015-12-01

    Physically based spatially distributed ecohydrology models are useful for answering science and management questions related to the hydrology and biogeochemistry of prairie, savanna, forested, as well as urbanized ecosystems. However, these models can produce hundreds of gigabytes of spatial output for a single model run over decadal time scales when run at regional spatial scales and moderate spatial resolutions (~100 km2 or more at 30-m spatial resolution) or when run for small watersheds at high spatial resolutions (~1 km2 at 3-m spatial resolution). Numerical data formats such as HDF5 can store arbitrarily large datasets. However even in HPC environments, there are practical limits on the size of single files that can be stored and reliably backed up. Even when such large datasets can be stored, querying and analyzing these data can suffer from poor performance due to memory limitations and I/O bottlenecks, for example on single workstations where memory and bandwidth are limited, or in HPC environments where data are stored separately from computational nodes. The difficulty of storing and analyzing spatial data from ecohydrology models limits our ability to harness these powerful tools. Big Data tools such as distributed databases have the potential to surmount the data storage and analysis challenges inherent to large spatial datasets. Distributed databases solve these problems by storing data close to computational nodes while enabling horizontal scalability and fault tolerance. Here we present the architecture of and preliminary results from PatchDB, a distributed datastore for managing spatial output from the Regional Hydro-Ecological Simulation System (RHESSys). The initial version of PatchDB uses message queueing to asynchronously write RHESSys model output to an Apache Cassandra cluster. Once stored in the cluster, these data can be efficiently queried to quickly produce both spatial visualizations for a particular variable (e.g. maps and animations), as well as point time series of arbitrary variables at arbitrary points in space within a watershed or river basin. By treating ecohydrology modeling as a Big Data problem, we hope to provide a platform for answering transformative science and management questions related to water quantity and quality in a world of non-stationary climate.
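
    A minimal sketch of the asynchronous write path described, with a Python queue and worker thread standing in for the message queue and an in-memory dict standing in for the Apache Cassandra cluster. The schema (patch id, variable, timestep, value) is assumed for illustration.

    ```python
    import queue
    import threading
    from collections import defaultdict

    # Stand-in for the Cassandra cluster: (patch_id, variable) -> list of (timestep, value)
    datastore = defaultdict(list)
    msg_queue: "queue.Queue" = queue.Queue()
    STOP = object()

    def writer():
        """Consumer: drain the queue and write asynchronously to the datastore."""
        while True:
            msg = msg_queue.get()
            if msg is STOP:
                break
            patch_id, variable, timestep, value = msg
            datastore[(patch_id, variable)].append((timestep, value))
            msg_queue.task_done()

    t = threading.Thread(target=writer, daemon=True)
    t.start()

    # Producer: the model emits per-patch output without blocking on storage
    for timestep in range(3):
        for patch_id in range(4):
            msg_queue.put((patch_id, "streamflow", timestep, 0.1 * patch_id + timestep))

    msg_queue.put(STOP)
    t.join()

    # Point time series query for one patch ...
    print(datastore[(2, "streamflow")])
    # ... or a "map" of one variable at one timestep across patches
    print({p: v for (p, var), series in datastore.items()
           for ts, v in series if var == "streamflow" and ts == 1})
    ```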

  9. Property of Fluctuations of Sales Quantities by Product Category in Convenience Stores

    PubMed Central

    Fukunaga, Gaku; Takayasu, Hideki; Takayasu, Misako

    2016-01-01

    The ability to ascertain the extent of product sale fluctuations for each store and locality is indispensable to inventory management. This study analyzed POS data from 158 convenience stores in Kawasaki City, Kanagawa Prefecture, Japan and found a power scaling law between the mean and standard deviation of product sales quantities for several product categories. For the statistical domains of low sales quantities the power index was 1/2; for large sales quantities the power index was 1, so that the so-called Taylor’s law holds. The sales quantity at which the power index changes differed according to product category. We derived a Poissonian compound distribution model taking into account fluctuations in customer numbers to show that the scaling law could be explained theoretically for most items. We also examined why the scaling law did not hold in some exceptional cases. PMID:27310915

  10. Water and salt balance modelling to predict the effects of land-use changes in forested catchments. 1. Small catchment water balance model

    NASA Astrophysics Data System (ADS)

    Sivapalan, Murugesu; Ruprecht, John K.; Viney, Neil R.

    1996-03-01

    A long-term water balance model has been developed to predict the hydrological effects of land-use change (especially forest clearing) in small experimental catchments in the south-west of Western Australia. This small catchment model has been used as the building block for the development of a large catchment-scale model, and has also formed the basis for a coupled water and salt balance model, developed to predict the changes in stream salinity resulting from land-use and climate change. The application of the coupled salt and water balance model to predict stream salinities in two small experimental catchments, and the application of the large catchment-scale model to predict changes in water yield in a medium-sized catchment that is being mined for bauxite, are presented in Parts 2 and 3, respectively, of this series of papers. The small catchment model has been designed as a simple, robust, conceptually based model of the basic daily water balance fluxes in forested catchments. The responses of the catchment to rainfall and pan evaporation are conceptualized in terms of three interdependent subsurface stores A, B and F. Store A depicts a near-stream perched aquifer system; B represents a deeper, permanent groundwater system; and F is an intermediate, unsaturated infiltration store. The responses of these stores are characterized by a set of constitutive relations which involves a number of conceptual parameters. These parameters are estimated by calibration by comparing observed and predicted runoff. The model has performed very well in simulations carried out on Salmon and Wights, two small experimental catchments in the Collie River basin in south-west Western Australia. The results from the application of the model to these small catchments are presented in this paper.
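
    A toy daily bucket model in the spirit of the three-store conceptualization (A: near-stream perched aquifer; B: deep groundwater; F: unsaturated infiltration store). The constitutive relations and parameter values below are invented placeholders, not the calibrated relations of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    days = 365
    rain = np.maximum(rng.normal(2.0, 6.0, days), 0.0)   # mm/day
    pan_evap = np.full(days, 3.0)                        # mm/day

    A, B, F = 5.0, 100.0, 20.0     # store levels (mm)
    F_cap = 50.0
    runoff = np.empty(days)
    for d in range(days):
        infil = rain[d] * (1.0 - F / F_cap)       # infiltration into unsaturated store F
        excess = rain[d] - infil                  # remainder reaches perched aquifer A
        drain_FB = 0.02 * F                       # slow recharge from F to groundwater B
        seep_BA = 0.005 * B                       # groundwater seepage toward the stream zone
        q = 0.3 * A                               # quick drainage from A: streamflow
        et = pan_evap[d] * F / F_cap              # evapotranspiration limited by wetness
        F = min(max(F + infil - drain_FB - et, 0.0), F_cap)
        B = max(B + drain_FB - seep_BA, 0.0)
        A = max(A + excess + seep_BA - q, 0.0)
        runoff[d] = q
    print(f"annual rain {rain.sum():.0f} mm, runoff {runoff.sum():.0f} mm")
    ```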

  11. Evaluating GCM land surface hydrology parameterizations by computing river discharges using a runoff routing model: Application to the Mississippi basin

    NASA Technical Reports Server (NTRS)

    Liston, G. E.; Sud, Y. C.; Wood, E. F.

    1994-01-01

    To relate general circulation model (GCM) hydrologic output to readily available river hydrographic data, a runoff routing scheme that routes gridded runoffs through regional- or continental-scale river drainage basins is developed. By following the basin overland flow paths, the routing model generates river discharge hydrographs that can be compared to observed river discharges, thus allowing an analysis of the GCM representation of monthly, seasonal, and annual water balances over large regions. The runoff routing model consists of two linear reservoirs, a surface reservoir and a groundwater reservoir, which store and transport water. The water transport mechanisms operating within these two reservoirs are differentiated by their time scales; the groundwater reservoir transports water much more slowly than the surface reservoir. The groundwater reservoir feeds the corresponding surface store, and the surface stores are connected via the river network. The routing model is implemented over the Global Energy and Water Cycle Experiment (GEWEX) Continental-Scale International Project Mississippi River basin on a rectangular grid of 2 deg X 2.5 deg. Two land surface hydrology parameterizations provide the gridded runoff data required to run the runoff routing scheme: the variable infiltration capacity model, and the soil moisture component of the simple biosphere model. These parameterizations are driven with 4 deg X 5 deg gridded climatological potential evapotranspiration and 1979 First Global Atmospheric Research Program (GARP) Global Experiment precipitation. These investigations have quantified the importance of physically realistic soil moisture holding capacities, evaporation parameters, and runoff mechanisms in land surface hydrology formulations.
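
    A minimal sketch of the two-linear-reservoir routing idea on a toy one-dimensional chain of grid cells: groundwater feeds the surface store, and surface stores pass flow downstream. Residence times, the recharge fraction, and the topology are invented.

    ```python
    import numpy as np

    # Toy 1-D chain of grid cells draining to an outlet (cell 0 is most upstream)
    n_cells, n_days = 5, 100
    k_surf, k_gw = 2.0, 30.0           # residence times (days): surface fast, groundwater slow
    f_gw = 0.4                         # fraction of runoff that recharges groundwater

    rng = np.random.default_rng(4)
    runoff = np.maximum(rng.normal(1.0, 2.0, (n_days, n_cells)), 0.0)  # gridded runoff (mm/d)

    S = np.zeros(n_cells)              # surface stores
    G = np.zeros(n_cells)              # groundwater stores
    hydrograph = np.empty(n_days)
    for t in range(n_days):
        G += f_gw * runoff[t]
        baseflow = G / k_gw            # groundwater feeds the surface store
        G -= baseflow
        S += (1 - f_gw) * runoff[t] + baseflow
        Q = S / k_surf                 # linear-reservoir outflow
        S -= Q
        S[1:] += Q[:-1]                # river network: each cell passes flow downstream
        hydrograph[t] = Q[-1]          # discharge at the outlet cell
    print("peak outlet discharge:", hydrograph.max().round(2))
    ```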

  12. Highlights of X-Stack ExM Deliverable: MosaStore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ripeanu, Matei

    2016-07-20

    This brief report highlights the experience gained with MosaStore, an exploratory part of the X-Stack project “ExM: System support for extreme-scale, many-task applications”. The ExM project proposed to use concurrent workflows supported by the Swift language and runtime as an innovative programming model to exploit parallelism in exascale computers. MosaStore aims to support this endeavor by improving storage support for workflow-based applications, more precisely by exploring the gains that can be obtained from co-designing the storage system and the workflow runtime engine. MosaStore has been developed primarily at the University of British Columbia.

  13. Calibration Development for an Unsteady Two-Strut Store Balance

    NASA Astrophysics Data System (ADS)

    Schmit, Ryan; Maatz, Ian; Johnson, Rudy

    2017-11-01

    This paper addresses measurements of unsteady store forces and moments in and around a weapons bay cavity. The cavity dimensions are: length 8.5 inches, depth 1.5 inches, and width 2.5 inches, giving an L/D ratio of 5.67. Test conditions are Mach 0.7 and 1.5 with Re = 2.0e6/ft. The 7.2-inch-long aluminum store is held in the cavity with two struts, and the strut lengths are varied to move the store to different cavity depth locations. The normal forces and pitching moments are measured with two miniature 25-pound load cells with a natural frequency of 24 kHz. The store-strut-load cell balance can also produce unwanted structural eigenfrequencies at or near the cavity's Rossiter tones. Moving the eigenfrequencies away from the cavity's Rossiter tones calls for detailed design and Finite Element Modeling (FEM) before wind tunnel testing. Included are the issues in developing a calibration method for an unsteady two-strut store balance for use inside a scaled wind tunnel weapons bay cavity model.

  14. Development and application of a large scale river system model for National Water Accounting in Australia

    NASA Astrophysics Data System (ADS)

    Dutta, Dushmanta; Vaze, Jai; Kim, Shaun; Hughes, Justin; Yang, Ang; Teng, Jin; Lerat, Julien

    2017-04-01

    Existing global and continental scale river models, mainly designed for integrating with global climate models, are of very coarse spatial resolutions and lack many important hydrological processes, such as overbank flow, irrigation diversion, groundwater seepage/recharge, which operate at a much finer resolution. Thus, these models are not suitable for producing water accounts, which have become increasingly important for water resources planning and management at regional and national scales. A continental scale river system model called the Australian Water Resource Assessment River System model (AWRA-R) has been developed and implemented for national water accounting in Australia using a node-link architecture. The model includes major hydrological processes, anthropogenic water utilisation and storage routing that influence the streamflow in both regulated and unregulated river systems. Two key components of the model are an irrigation model to compute water diversion for irrigation use and associated fluxes and stores, and a storage-based floodplain inundation model to compute overbank flow from river to floodplain and associated floodplain fluxes and stores. The results in the Murray-Darling Basin show highly satisfactory performance of the model, with a median daily Nash-Sutcliffe Efficiency (NSE) of 0.64 and a median annual bias of less than 1% for the period of calibration (1970-1991), and a median daily NSE of 0.69 and a median annual bias of 12% for the validation period (1992-2014). The results have demonstrated that the performance of the model is less satisfactory when key processes such as overbank flow, groundwater seepage and irrigation diversion are switched off. The AWRA-R model, which has been operationalised by the Australian Bureau of Meteorology for continental scale water accounting, has contributed to improvements in the national water account by substantially reducing the unaccounted difference volume (gain/loss).
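
    The two reported skill metrics are standard and easy to state in code; a small sketch (with made-up observed and simulated series) follows.

    ```python
    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe Efficiency: 1 is perfect, 0 matches the mean of obs."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def volume_bias(obs, sim):
        """Relative volume bias over the period."""
        return (np.sum(sim) - np.sum(obs)) / np.sum(obs)

    obs = np.array([1.0, 3.0, 8.0, 4.0, 2.0, 1.5])
    sim = np.array([1.2, 2.5, 7.0, 4.5, 2.2, 1.4])
    print(f"NSE = {nse(obs, sim):.2f}, bias = {volume_bias(obs, sim):+.1%}")
    ```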

  15. Hierarchical Distributed-Lag Models: Exploring Varying Geographic Scale and Magnitude in Associations Between the Built Environment and Health.

    PubMed

    Baek, Jonggyu; Sanchez-Vaznaugh, Emma V; Sánchez, Brisa N

    2016-03-15

    It is well known that associations between features of the built environment and health depend on the geographic scale used to construct environmental attributes. In the built environment literature, it has long been argued that geographic scales may vary across study locations. However, this hypothesized variation has not been systematically examined due to a lack of available statistical methods. We propose a hierarchical distributed-lag model (HDLM) for estimating the underlying overall shape of food environment-health associations as a function of distance from locations of interest. This method enables indirect assessment of relevant geographic scales and captures area-level heterogeneity in the magnitudes of associations, along with relevant distances within areas. The proposed model was used to systematically examine area-level variation in the association between availability of convenience stores around schools and children's weights. For this case study, body mass index (weight (kg)/height (m)²) z scores (BMIz) for 7th grade children collected via California's 2001-2009 FitnessGram testing program were linked to a commercial database that contained locations of food outlets statewide. Findings suggested that convenience store availability may influence BMIz only in some places and at varying distances from schools. Future research should examine localized environmental or policy differences that may explain the heterogeneity in convenience store-BMIz associations. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
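
    A stripped-down version of the distributed-lag idea: regress an outcome on counts of stores in concentric distance rings and read the relevant geographic scale off the fitted lag curve. This ordinary least-squares toy omits the hierarchical (area-level) layer of the proposed HDLM; all data are simulated.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_schools, n_rings = 500, 10            # rings: 0-100m, 100-200m, ... from each school

    # Counts of convenience stores per distance ring around each school
    X = rng.poisson(2.0, (n_schools, n_rings)).astype(float)

    # "True" lag function: association decays with distance, zero beyond ring 5
    beta_true = np.r_[0.05 * (1 - np.arange(5) / 5), np.zeros(n_rings - 5)]
    bmiz = X @ beta_true + rng.normal(0, 0.5, n_schools)

    # Distributed-lag fit via least squares on the ring counts
    beta_hat, *_ = np.linalg.lstsq(np.c_[np.ones(n_schools), X], bmiz, rcond=None)
    print("estimated lag curve:", np.round(beta_hat[1:], 3))
    # The distance at which the curve decays to ~0 indicates the relevant geographic scale
    ```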

  16. Atmospheric Turbulence Modeling for Aerospace Vehicles: Fractional Order Fit

    NASA Technical Reports Server (NTRS)

    Kopasakis, George (Inventor)

    2015-01-01

    An improved model for simulating atmospheric disturbances is disclosed. A Kolmogorov spectrum may be scaled to convert it into a finite-energy von Karman spectrum, and a fractional-order pole-zero transfer function (TF) may be derived from the von Karman spectrum. Fractional-order atmospheric turbulence may be approximated with an integer-order pole-zero TF fit, and the approximation may be stored in memory.
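
    A short sketch of the von Karman longitudinal turbulence spectrum (standard form, with illustrative scale length and airspeed) showing the fractional-order behavior an integer-order pole-zero fit must approximate.

    ```python
    import numpy as np

    def von_karman_psd(omega, sigma=1.0, L=762.0, V=200.0):
        """Longitudinal von Karman turbulence PSD (finite energy, -5/3 inertial slope)."""
        return (sigma**2 * 2 * L / (np.pi * V)) / (1 + (1.339 * L * omega / V) ** 2) ** (5 / 6)

    w = np.logspace(-3, 2, 400)            # rad/s
    phi = von_karman_psd(w)

    # High-frequency log-log slope should approach Kolmogorov's -5/3
    hf = w > 10
    slope = np.polyfit(np.log(w[hf]), np.log(phi[hf]), 1)[0]
    print(f"high-frequency slope ~ {slope:.2f} (Kolmogorov: -5/3 = {-5/3:.2f})")
    # The fractional exponent (...)**(5/6) has no finite integer-order realization,
    # hence the need for an integer-order pole-zero fit before storing/implementing it.
    ```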

  17. Accounting for disturbance history in models: using remote sensing to constrain carbon and nitrogen pool spin-up.

    PubMed

    Hanan, Erin J; Tague, Christina; Choate, Janet; Liu, Mingliang; Kolden, Crystal; Adam, Jennifer

    2018-03-24

    Disturbances such as wildfire, insect outbreaks, and forest clearing play an important role in regulating carbon, nitrogen, and hydrologic fluxes in terrestrial watersheds. Evaluating how watersheds respond to disturbance requires understanding mechanisms that interact over multiple spatial and temporal scales. Simulation modeling is a powerful tool for bridging these scales; however, model projections are limited by uncertainties in the initial state of plant carbon and nitrogen stores. Watershed models typically use one of two methods to initialize these stores: spin-up to steady state or remote sensing with allometric relationships. Spin-up involves running a model until vegetation reaches equilibrium based on climate. This approach assumes that vegetation across the watershed has reached maturity and is of uniform age, which fails to account for landscape heterogeneity and non-steady-state conditions. By contrast, remote sensing can provide data for initializing such conditions. However, methods for assimilating remote sensing into model simulations can also be problematic. They often rely on empirical allometric relationships between a single vegetation variable and modeled carbon and nitrogen stores. Because allometric relationships are species- and region-specific, they do not account for the effects of local resource limitation, which can influence carbon allocation (to leaves, stems, roots, etc.). To address this problem, we developed a new initialization approach using the catchment-scale ecohydrologic model RHESSys. The new approach merges the mechanistic stability of spin-up with the spatial fidelity of remote sensing. It uses remote sensing to define spatially explicit targets for one or several vegetation state variables, such as leaf area index, across a watershed. The model then simulates the growth of carbon and nitrogen stores until the defined targets are met for all locations. We evaluated this approach in a mixed pine-dominated watershed in central Idaho, and a chaparral-dominated watershed in southern California. In the pine-dominated watershed, model estimates of carbon, nitrogen, and water fluxes varied among methods, while the target-driven method increased correspondence between observed and modeled streamflow. In the chaparral watershed, where vegetation was more homogeneously aged, there were no major differences among methods. Thus, in heterogeneous, disturbance-prone watersheds, the target-driven approach shows potential for improving biogeochemical projections. © 2018 by the Ecological Society of America.
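
    A toy version of the target-driven initialization: each patch's carbon store is grown until its simulated LAI reaches the remotely sensed target. The logistic growth and allocation constants stand in for RHESSys dynamics and are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n_patches = 8
    lai_target = rng.uniform(0.5, 4.0, n_patches)   # remotely sensed, spatially explicit

    C = np.full(n_patches, 10.0)       # plant carbon stores, small initial stands
    sla = 0.01                         # toy leaf area per unit leaf carbon
    leaf_frac = 0.3                    # toy allocation of carbon to leaves
    done = np.zeros(n_patches, bool)

    year = 0
    while not done.all() and year < 10_000:
        lai = sla * leaf_frac * C
        done = lai >= lai_target
        # Grow only patches still below target; logistic growth stands in for the model
        C = np.where(done, C, C + 0.05 * C * (1 - C / 5000.0))
        year += 1

    print("years of simulated growth needed:", year)
    print("final LAI vs target:", np.round(sla * leaf_frac * C, 2), np.round(lai_target, 2))
    ```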

  18. Modelling and operation strategies of DLR's large scale thermocline test facility (TESIS)

    NASA Astrophysics Data System (ADS)

    Odenthal, Christian; Breidenbach, Nils; Bauer, Thomas

    2017-06-01

    In this work an overview of the TESIS:store thermocline test facility and its current construction status will be given. Based on this, the TESIS:store facility using sensible solid filler material is modelled with a fully transient model, implemented in MATLAB®. Results on the impact of filler size and operation strategies will be presented. While low porosity and small particle diameters for the filler material are beneficial, the operation strategy is a key element with potential for optimization. It is shown that plant operators have to trade off utilization against exergetic efficiency. Different durations of the charging and discharging periods offer further potential for optimization.

  19. Wind-tunnel simulation of store jettison with the aid of magnetic artificial gravity

    NASA Technical Reports Server (NTRS)

    Stephens, T.; Adams, R.

    1972-01-01

    A method for simulating the jettison of stores from aircraft by means of small-scale wind-tunnel drop tests from a model of the parent aircraft is described. Proper scaling of such experiments generally dictates that the gravitational acceleration should ideally be a test variable. A method of introducing a controllable artificial component of gravity by magnetic means has been proposed. The use of a magnetic artificial gravity facility based upon this idea, in conjunction with small-scale wind-tunnel drop tests, would improve the accuracy of simulation. A review of the scaling laws as they apply to the design of such a facility is presented. The design constraints involved in the integration of such a facility with a wind tunnel are defined. A detailed performance analysis procedure applicable to such a facility is developed. A practical magnet configuration is defined which is capable of controlling the strength and orientation of the magnetic artificial gravity field in the vertical plane, thereby allowing simulation of store jettison from a diving or climbing aircraft. The factors involved in the choice between continuous or intermittent operation of the facility, and the use of normal or superconducting magnets, are defined.
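
    The classic scaling argument behind gravity as a test variable is Froude-number matching; a short worked computation (with illustrative numbers) shows how the required artificial gravity follows from the geometric and velocity scales.

    ```python
    # Froude-number matching for a store-drop test: V^2 / (g L) equal in model and flight.
    # Solving for model gravity: g_m = g_f * (V_m / V_f)^2 * (L_f / L_m).
    g_f = 9.81          # full-scale gravity, m/s^2
    scale = 1 / 20      # geometric scale L_m / L_f
    V_ratio = 0.30      # tunnel-to-flight velocity ratio V_m / V_f (illustrative)

    g_m = g_f * V_ratio**2 / scale
    print(f"required model gravity: {g_m:.1f} m/s^2 ({g_m / g_f:.1f} g)")
    # If the tunnel speed ratio exceeds sqrt(scale), g_m > g and the magnetic field must
    # augment gravity; below it, the field must oppose gravity.
    ```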

  20. A model of attention-guided visual perception and recognition.

    PubMed

    Rybak, I A; Gusakova, V I; Golovan, A V; Podladchikova, L N; Shevtsova, N A

    1998-08-01

    A model of visual perception and recognition is described. The model contains: (i) a low-level subsystem which performs both a fovea-like transformation and detection of primary features (edges), and (ii) a high-level subsystem which includes separated 'what' (sensory memory) and 'where' (motor memory) structures. Image recognition occurs during the execution of a 'behavioral recognition program' formed during the primary viewing of the image. The recognition program contains both programmed attention window movements (stored in the motor memory) and predicted image fragments (stored in the sensory memory) for each consecutive fixation. The model shows the ability to recognize complex images (e.g. faces) invariantly with respect to shift, rotation and scale.

  21. Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.

    2016-12-01

    The resolution of NASA climate and weather simulations has grown dramatically over the past few years, with the highest-fidelity models reaching down to 1.5 km global resolution. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS) that combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of data stored include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes of storage. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements from traditional TeraSort and Wordcount to large-scale climate analytical operations on NetCDF data.
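
    A minimal PySpark sketch of the kind of MapReduce-style analytics described, computing a per-grid-cell temporal mean from (time, cell, value) records. The records are synthetic, and the custom NetCDF/HDFS connectors the NCCS uses are not modeled here.

    ```python
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("climate-mean").getOrCreate()

    # Stand-in records; in the DASS these would be read in place from stored files
    records = [(t, (lat, lon), 270.0 + lat + 0.1 * t)
               for t in range(10) for lat in range(3) for lon in range(3)]
    rdd = spark.sparkContext.parallelize(records)

    # Map each record to (cell, (value, 1)), reduce to sums, then divide: a cell-wise mean
    cell_means = (rdd.map(lambda r: (r[1], (r[2], 1)))
                     .reduceByKey(lambda a, b: (a[0] + b[0], a[1] + b[1]))
                     .mapValues(lambda s: s[0] / s[1]))
    print(cell_means.collect())
    spark.stop()
    ```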

  22. Position, scale, and rotation invariant holographic associative memory

    NASA Astrophysics Data System (ADS)

    Fielding, Kenneth H.; Rogers, Steven K.; Kabrisky, Matthew; Mills, James P.

    1989-08-01

    This paper describes the development and characterization of a holographic associative memory (HAM) system that is able to recall stored objects whose inputs were changed in position, scale, and rotation. The HAM is based on the single iteration model described by Owechko et al. (1987); however, the system described uses a self-pumped BaTiO3 phase conjugate mirror rather than the degenerate four-wave mixing proposed by Owechko and his coworkers. The HAM system can store objects in a position, scale, and rotation invariant feature space. The angularly multiplexed diffuse Fourier transform holograms of the HAM feature space are characterized as the memory unit; distorted input objects are correlated with the hologram, and the nonlinear phase conjugate mirror reduces cross-correlation noise and provides object discrimination. Applications of the HAM system are presented.

  23. Water-tunnel investigation of concepts for alleviation of adverse inlet spillage interactions with external stores

    NASA Technical Reports Server (NTRS)

    Neuhart, Dan H.; Rhode, Matthew N.

    1990-01-01

    A test was conducted in the NASA Langley 16- by 24-Inch Water Tunnel to study alleviation of the adverse interactions of inlet spillage flow on the external stores of a fighter aircraft. A 1/48-scale model of a fighter aircraft was used to simulate the flow environment around the aircraft inlets and on the downstream underside of the fuselage. A controlled inlet mass flow was simulated by drawing water into the inlets. Various flow control devices were used on the underside of the aircraft model to manipulate the vortical inlet spillage flow.

  24. Healthy bodegas: increasing and promoting healthy foods at corner stores in New York City.

    PubMed

    Dannefer, Rachel; Williams, Donya A; Baronberg, Sabrina; Silver, Lynn

    2012-10-01

    We assessed the effectiveness of an initiative to increase the stock and promotion of healthy foods in 55 corner stores in underserved neighborhoods. We evaluated the intervention through in-store observations and preintervention and postintervention surveys of all 55 store owners as well as surveys with customers at a subset of stores. We observed an average of 4 changes on a 15-point criteria scale. The most common were placing refrigerated water at eye level, stocking canned fruit with no sugar added, offering a healthy sandwich, and identifying healthier items. Forty-six (84%) store owners completed both surveys. Owners reported increased sales of healthier items, but identified barriers including consumer demand and lack of space and refrigeration. The percentage of customers surveyed who purchased items for which we promoted a healthier option (low-sodium canned goods, low-fat milk, whole-grain bread, healthier snacks and sandwiches) increased from 5% to 16%. Corner stores are important vehicles for access to healthy foods. The approach described here achieved improvements in participating corner stores and in some consumer purchases and may be a useful model for other locales.

  25. Real-time learning of predictive recognition categories that chunk sequences of items stored in working memory

    PubMed Central

    Kazerounian, Sohrob; Grossberg, Stephen

    2014-01-01

    How are sequences of events that are temporarily stored in a cognitive working memory unitized, or chunked, through learning? Such sequential learning is needed by the brain in order to enable language, spatial understanding, and motor skills to develop. In particular, how does the brain learn categories, or list chunks, that become selectively tuned to different temporal sequences of items in lists of variable length as they are stored in working memory, and how does this learning process occur in real time? The present article introduces a neural model that simulates learning of such list chunks. In this model, sequences of items are temporarily stored in an Item-and-Order, or competitive queuing, working memory before learning categorizes them using a categorization network, called a Masking Field, which is a self-similar, multiple-scale, recurrent on-center off-surround network that can weigh the evidence for variable-length sequences of items as they are stored in the working memory through time. A Masking Field hereby activates the learned list chunks that represent the most predictive item groupings at any time, while suppressing less predictive chunks. In a network with a given number of input items, all possible ordered sets of these item sequences, up to a fixed length, can be learned with unsupervised or supervised learning. The self-similar multiple-scale properties of Masking Fields interacting with an Item-and-Order working memory provide a natural explanation of George Miller's Magical Number Seven and Nelson Cowan's Magical Number Four. The article explains why linguistic, spatial, and action event sequences may all be stored by Item-and-Order working memories that obey similar design principles, and thus how the current results may apply across modalities. Item-and-Order properties may readily be extended to Item-Order-Rank working memories in which the same item can be stored in multiple list positions, or ranks, as in the list ABADBD. Comparisons with other models, including TRACE, MERGE, and TISK, are made. PMID:25339918

  26. Dynamic Investigation of Release Characteristics of a Streamlined Internal Store from a Simulated Bomb Bay of the Republic F-105 Airplane at Mach Numbers of 0.8, 1.4, and 1.98, Coord. No. AF-222

    NASA Technical Reports Server (NTRS)

    Lee, John B.

    1956-01-01

    An investigation has been conducted in the 27- by 27-inch preflight jet of the Langley Pilotless Aircraft Research Station at Wallops Island, Va., of the release characteristics of a dynamically scaled streamlined-type internally carried store from a simulated bomb bay at Mach numbers M(sub o) of 0.8, 1.4, and 1.98. A 1/17-scale model of the Republic F-105 half-fuselage and bomb-bay configuration was used with a streamlined store shape of a fineness ratio of 6.00. Simulated altitudes were 3,400 feet at M(sub o) = 0.8, 3,400 and 29,000 feet at M(sub o) = 1.4, and 29,000 feet at M(sub o) = 1.98. At supersonic speeds, high pitching moments are induced on the store in the vicinity of the bomb bay at high dynamic pressures. Successful ejections could not be made with the original configuration at supersonic speeds at near sea-level conditions. The pitching moments caused by unsymmetrical pressures on the store in a disturbed flow field were overcome by replacing the high-aspect-ratio fin with a low-aspect-ratio fin that had a 30-percent area increase and was less subject to aeroelastic effects. Release characteristics of the store were improved by orienting the fins so that they were in a more uniform flow field at the point of store release. The store pitching moments were shown to be reduced by increasing the simulated altitude. Favorable ejections were made at subsonic speeds at near sea-level conditions.

  27. Water and salt balance modelling to predict the effects of land-use changes in forested catchments. 3. The large catchment model

    NASA Astrophysics Data System (ADS)

    Sivapalan, Murugesu; Viney, Neil R.; Jeevaraj, Charles G.

    1996-03-01

    This paper presents an application of a long-term, large catchment-scale, water balance model developed to predict the effects of forest clearing in the south-west of Western Australia. The conceptual model simulates the basic daily water balance fluxes in forested catchments before and after clearing. The large catchment is divided into a number of sub-catchments (1-5 km2 in area), which are taken as the fundamental building blocks of the large catchment model. The responses of the individual subcatchments to rainfall and pan evaporation are conceptualized in terms of three inter-dependent subsurface stores A, B and F, which are considered to represent the moisture states of the subcatchments. Details of the subcatchment-scale water balance model have been presented earlier in Part 1 of this series of papers. The response of any subcatchment is a function of its local moisture state, as measured by the local values of the stores. The variations of the initial values of the stores among the subcatchments are described in the large catchment model through simple, linear equations involving a number of similarity indices representing topography, mean annual rainfall and level of forest clearing. The model is applied to the Conjurunup catchment, a medium-sized (39.6 km2) catchment in the south-west of Western Australia. The catchment has been heterogeneously (in space and time) cleared for bauxite mining and subsequently rehabilitated. For this application, the catchment is divided into 11 subcatchments. The model parameters are estimated by calibration, by comparing observed and predicted runoff values over an 18-year period, for the large catchment and two of the subcatchments. Excellent fits are obtained.

  28. Scaling laws describe memories of host-pathogen riposte in the HIV population.

    PubMed

    Barton, John P; Kardar, Mehran; Chakraborty, Arup K

    2015-02-17

    The enormous genetic diversity and mutability of HIV has prevented effective control of this virus by natural immune responses or vaccination. Evolution of the circulating HIV population has thus occurred in response to diverse, ultimately ineffective, immune selection pressures that randomly change from host to host. We show that the interplay between the diversity of human immune responses and the ways that HIV mutates to evade them results in distinct sets of sequences defined by similar collectively coupled mutations. Scaling laws that relate these sets of sequences resemble those observed in linguistics and other branches of inquiry, and dynamics reminiscent of neural networks are observed. Like neural networks that store memories of past stimulation, the circulating HIV population stores memories of host-pathogen combat won by the virus. We describe an exactly solvable model that captures the main qualitative features of the sets of sequences and a simple mechanistic model for the origin of the observed scaling laws. Our results define collective mutational pathways used by HIV to evade human immune responses, which could guide vaccine design.

  29. Healthy Bodegas: Increasing and Promoting Healthy Foods at Corner Stores in New York City

    PubMed Central

    Williams, Donya A.; Baronberg, Sabrina; Silver, Lynn

    2012-01-01

    Objectives. We assessed the effectiveness of an initiative to increase the stock and promotion of healthy foods in 55 corner stores in underserved neighborhoods. Methods. We evaluated the intervention through in-store observations and preintervention and postintervention surveys of all 55 store owners as well as surveys with customers at a subset of stores. Results. We observed an average of 4 changes on a 15-point criteria scale. The most common were placing refrigerated water at eye level, stocking canned fruit with no sugar added, offering a healthy sandwich, and identifying healthier items. Forty-six (84%) store owners completed both surveys. Owners reported increased sales of healthier items, but identified barriers including consumer demand and lack of space and refrigeration. The percentage of customers surveyed who purchased items for which we promoted a healthier option (low-sodium canned goods, low-fat milk, whole-grain bread, healthier snacks and sandwiches) increased from 5% to 16%. Conclusions. Corner stores are important vehicles for access to healthy foods. The approach described here achieved improvements in participating corner stores and in some consumer purchases and may be a useful model for other locales. PMID:22897534

  30. Subsonic Wind Tunnel Tests of the FBTV Configuration in Proximity of the B-52

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Priebe, R.W.

    1966-12-01

    Wind tunnel tests were conducted on a 0.075-scale Sandia FBTV store model in an 8-foot transonic wind tunnel during December 1966. These tests were performed to obtain longitudinal and lateral stability characteristics.

  31. Hierarchical Distributed-Lag Models: Exploring Varying Geographic Scale and Magnitude in Associations Between the Built Environment and Health

    PubMed Central

    Baek, Jonggyu; Sanchez-Vaznaugh, Emma V.; Sánchez, Brisa N.

    2016-01-01

    It is well known that associations between features of the built environment and health depend on the geographic scale used to construct environmental attributes. In the built environment literature, it has long been argued that geographic scales may vary across study locations. However, this hypothesized variation has not been systematically examined due to a lack of available statistical methods. We propose a hierarchical distributed-lag model (HDLM) for estimating the underlying overall shape of food environment–health associations as a function of distance from locations of interest. This method enables indirect assessment of relevant geographic scales and captures area-level heterogeneity in the magnitudes of associations, along with relevant distances within areas. The proposed model was used to systematically examine area-level variation in the association between availability of convenience stores around schools and children's weights. For this case study, body mass index (weight (kg)/height (m)²) z scores (BMIz) for 7th grade children collected via California's 2001–2009 FitnessGram testing program were linked to a commercial database that contained locations of food outlets statewide. Findings suggested that convenience store availability may influence BMIz only in some places and at varying distances from schools. Future research should examine localized environmental or policy differences that may explain the heterogeneity in convenience store–BMIz associations. PMID:26888753

  32. Rethinking key–value store for parallel I/O optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kougkas, Anthony; Eslami, Hassan; Sun, Xian-He

    2015-01-26

    Key-value stores are being widely used as the storage system for large-scale internet services and cloud storage systems. However, they are rarely used in HPC systems, where parallel file systems are the dominant storage solution. In this study, we examine the architecture differences and performance characteristics of parallel file systems and key-value stores. We propose using key-value stores to optimize overall Input/Output (I/O) performance, especially for workloads that parallel file systems cannot handle well, such as the cases with intense data synchronization or heavy metadata operations. We conducted experiments with several synthetic benchmarks, an I/O benchmark, and a real application. We modeled the performance of these two systems using collected data from our experiments, and we provide a predictive method to identify which system offers better I/O performance given a specific workload. The results show that we can optimize the I/O performance in HPC systems by utilizing key-value stores.
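
    The metadata-heavy workload case can be illustrated with a toy benchmark: many small per-file writes versus one batched transaction into a key-value store, with SQLite standing in for the key-value system (the paper's actual systems and results are not reproduced here).

    ```python
    import os
    import sqlite3
    import tempfile
    import time

    n, payload = 2000, b"x" * 512        # many small records, as in metadata-heavy workloads

    with tempfile.TemporaryDirectory() as d:
        # One file per record: every write pays file-system metadata costs
        t0 = time.perf_counter()
        for i in range(n):
            with open(os.path.join(d, f"rec{i}.bin"), "wb") as f:
                f.write(payload)
        t_files = time.perf_counter() - t0

        # Key-value store stand-in (sqlite3): one batched transaction
        t0 = time.perf_counter()
        db = sqlite3.connect(os.path.join(d, "kv.db"))
        db.execute("CREATE TABLE kv (k INTEGER PRIMARY KEY, v BLOB)")
        db.executemany("INSERT INTO kv VALUES (?, ?)",
                       ((i, payload) for i in range(n)))
        db.commit()
        db.close()
        t_kv = time.perf_counter() - t0

    print(f"per-file writes: {t_files:.3f}s, batched key-value store: {t_kv:.3f}s")
    ```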

  33. Statistical Compression for Climate Model Output

    NASA Astrophysics Data System (ADS)

    Hammerling, D.; Guinness, J.; Soh, Y. J.

    2017-12-01

    Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and thus is it important to develop methods for representing the full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. We decompress the data by computing conditional expectations and conditional simulations from the model given the summary statistics. Conditional expectations represent our best estimate of the original data but are subject to oversmoothing in space and time. Conditional simulations introduce realistic small-scale noise so that the decompressed fields are neither too smooth nor too rough compared with the original data. Considerable attention is paid to accurately modeling the original dataset (one year of daily mean temperature data), particularly with regard to the inherent spatial nonstationarity in global fields, and to determining the statistics to be stored, so that the variation in the original data can be closely captured, while allowing for fast decompression and conditional emulation on modest computers.
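
    A toy version of the compress/decompress cycle described: store per-block summary statistics, then decompress either as the conditional expectation (smooth) or as a conditional simulation that restores realistic small-scale noise. The "climate field" here is a synthetic random walk, and the blockwise-independent model is far simpler than the paper's nonstationary spatial model.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    field = np.cumsum(rng.standard_normal(10_000)) * 0.1 + 280.0   # toy temperature series
    block = 100
    blocks = field.reshape(-1, block)

    # Compress: keep only per-block summary statistics (100x fewer numbers than the data)
    means, stds = blocks.mean(axis=1), blocks.std(axis=1)

    # Decompress two ways:
    expectation = np.repeat(means, block)                  # conditional expectation (smooth)
    simulation = expectation + np.repeat(stds, block) * rng.standard_normal(field.size)

    for name, rec in [("expectation", expectation), ("simulation", simulation)]:
        print(f"{name}: RMSE {np.sqrt(np.mean((rec - field) ** 2)):.3f}, "
              f"roughness {np.mean(np.abs(np.diff(rec))):.3f}")
    print(f"original roughness: {np.mean(np.abs(np.diff(field))):.3f}")
    # The expectation is oversmoothed; the simulation matches the original roughness better.
    ```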

  34. Coarse-Grain Bandwidth Estimation Scheme for Large-Scale Network

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming; Jennings, Esther H.; Sergui, John S.

    2013-01-01

    A large-scale network that supports a large number of users can have an aggregate data rate of hundreds of Mbps at any time. High-fidelity simulation of a large-scale network might be too complicated and memory-intensive for typical commercial-off-the-shelf (COTS) tools. Unlike a large commercial wide-area-network (WAN) that shares diverse network resources among diverse users and has a complex topology that requires routing mechanisms and flow control, the ground communication links of a space network operate under the assumption of a guaranteed dedicated bandwidth allocation between specific sparse endpoints in a star-like topology. This work solved the network design problem of estimating the bandwidths of a ground network architecture option that offers different service classes to meet the latency requirements of different user data types. In this work, a top-down analysis and simulation approach was created to size the bandwidths of a store-and-forward network for a given network topology, a mission traffic scenario, and a set of data types with different latency requirements. These techniques were used to estimate the WAN bandwidths of the ground links for different architecture options of the proposed Integrated Space Communication and Navigation (SCaN) Network. A new analytical approach, called the "leveling scheme," was developed to model the store-and-forward mechanism of the network data flow. The term "leveling" refers to the spreading of data across a longer time horizon without violating the corresponding latency requirement of the data type. Two versions of the leveling scheme were developed: (1) a straightforward version that simply spreads the data of each data type across the time horizon, does not take into account the interactions among data types within a pass or between data types across overlapping passes at a network node, and is inherently sub-optimal; and (2) a two-state Markov leveling scheme that takes into account the second-order behavior of the store-and-forward mechanism and the interactions among data types within a pass. The novelty of this approach lies in the modeling of the store-and-forward mechanism of each network node. The term store-and-forward refers to the data traffic regulation technique in which data is sent to an intermediate network node where it is temporarily stored and sent at a later time to the destination node or to another intermediate node. Store-and-forward can be applied both to space-based networks that have intermittent connectivity and to ground-based networks with deterministic connectivity. For ground-based networks, the store-and-forward mechanism is used to regulate the network data flow and link resource utilization such that the user data types can be delivered to their destination nodes without violating their respective latency requirements.
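
    The straightforward leveling version reduces to spreading each data volume uniformly over its latency window and sizing the link for the peak summed rate; a small sketch with invented traffic numbers follows.

    ```python
    import numpy as np

    horizon = 240                       # planning horizon, minutes
    rate = np.zeros(horizon)

    # (arrival minute, volume in Mb, latency requirement in minutes) per data type/pass
    traffic = [(10, 600.0, 5),          # low-latency telemetry: little room to spread
               (10, 9000.0, 120),       # science data: can be leveled over two hours
               (90, 4000.0, 60)]

    for arrival, volume, latency in traffic:
        # "Leveling": spread over [arrival, arrival + latency] without violating latency
        rate[arrival:arrival + latency] += volume / latency

    print(f"required bandwidth with leveling:    {rate.max():.0f} Mb/min")

    # Without leveling, everything is sent within the minute it arrives
    burst = np.zeros(horizon)
    for arrival, volume, _ in traffic:
        burst[arrival] += volume
    print(f"required bandwidth without leveling: {burst.max():.0f} Mb/min")
    ```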

  35. A systemic approach to explore the flexibility of energy stores at the cellular scale: Examples from muscle cells.

    PubMed

    Taghipoor, Masoomeh; van Milgen, Jaap; Gondret, Florence

    2016-09-07

    Variations in energy storage and expenditure are key elements of animal adaptation to rapidly changing environments. Because of the multiplicity of metabolic pathways, metabolic crossroads and interactions between anabolic and catabolic processes within and between different cells, the flexibility of energy stores in animal cells is difficult to describe in simple verbal, textual or graphical terms. We propose a mathematical model to study the influence of internal and external challenges on the dynamic behavior of energy stores and its consequence on cell energy status. The role of the flexibility of energy stores in the energy equilibrium at the cellular level is illustrated through three case studies: variation in eating frequency (i.e., glucose input), level of physical activity (i.e., ATP requirement), and changes in cell characteristics (i.e., maximum capacity of glycogen storage). Sensitivity analysis has been performed to highlight the most relevant parameters of the model; model simulations have then been performed to illustrate how variation in these key parameters affects cellular energy balance. According to this analysis, glycogen maximum accumulation capacity and homeostatic energy demand are among the most important parameters regulating muscle cell metabolism to ensure its energy equilibrium. Copyright © 2016 Elsevier Ltd. All rights reserved.
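
    A toy Euler integration in the spirit of the case studies (eating frequency versus store dynamics): glycogen fills from meal-driven glucose input, saturating as the store approaches its maximum capacity, and is mobilized to meet ATP demand. All rates and scales below are arbitrary, not the authors' calibrated model.

    ```python
    import numpy as np

    dt, minutes = 0.1, 24 * 60
    t = np.arange(0.0, minutes, dt)

    def glucose_input(times, n_meals):
        """Meal-driven glucose input: the same daily total split over n_meals Gaussians."""
        g = np.zeros_like(times)
        for m in np.linspace(8 * 60, 20 * 60, n_meals):
            g += np.exp(-0.5 * ((times - m) / 30.0) ** 2)
        return g / n_meals

    # Elevated ATP demand during an evening exercise bout
    atp_demand = 0.001 + 0.010 * ((t > 17 * 60) & (t < 18 * 60))

    for n_meals in (3, 6):
        glc = glucose_input(t, n_meals)
        G, G_max, G_min = 0.5, 1.0, 1.0    # glycogen store, normalized to capacity
        for g_in, demand in zip(glc, atp_demand):
            storage = 0.05 * g_in * (1 - G / G_max)   # uptake saturates as the store fills
            mobilization = demand * G / (0.1 + G)     # ATP demand is met from the store
            G = float(np.clip(G + dt * (storage - mobilization), 0.0, G_max))
            G_min = min(G_min, G)
        print(f"{n_meals} meals/day: minimum store {G_min:.2f}, final store {G:.2f}")
    ```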

  16. Global-scale patterns of nutrient density and partitioning in forests in relation to climate.

    PubMed

    Zhang, Kerong; Song, Conghe; Zhang, Yulong; Dang, Haishan; Cheng, Xiaoli; Zhang, Quanfa

    2018-01-01

    Knowledge of nutrient storage and partitioning in forests is imperative for ecosystem models and ecological theory. Whether the nutrients (N, P, K, Ca, and Mg) stored in forest biomass and their partitioning patterns vary systematically across climatic gradients remains unknown. Here, we explored the global-scale patterns of nutrient density and partitioning using a newly compiled dataset including 372 forest stands. We found that temperature and precipitation were key factors driving the nutrients stored in the living biomass of forests at the global scale. The N, K, and Mg stored in living biomass tended to be greater in increasingly warm climates. The mean biomass N density was 577.0, 530.4, 513.2, and 336.7 kg/ha for tropical, subtropical, temperate, and boreal forests, respectively. Around 76% of the variation in biomass N density could be accounted for by an empirical model combining biomass density, phylogeny (i.e., angiosperm or gymnosperm), and the interaction of mean annual temperature and precipitation. Climate, stand age, and biomass density significantly affected nutrient partitioning at the forest community level. The fractional distribution of nutrients to roots decreased significantly with temperature, suggesting that forests in cold climates allocate a greater share of nutrients to roots. Gymnosperm forests tended to allocate more nutrients to leaves than angiosperm forests, whereas angiosperm forests distributed more nutrients to stems. The nutrient-based root:shoot ratios (R:S) averaged 0.30 for N, 0.36 for P, 0.32 for K, 0.27 for Ca, and 0.35 for Mg. The scaling exponents of the relationships describing root nutrients as a function of shoot nutrients exceeded 1.0, indicating that root nutrient content increases faster than linearly with shoot nutrient content. Soil type significantly affected the total N, P, K, Ca, and Mg stored in the living biomass of forests, with the Acrisols group displaying the lowest P, K, Ca, and Mg. © 2017 John Wiley & Sons Ltd.
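
    The reported scaling exponents are the kind of quantity obtained from an ordinary log-log regression; the sketch below demonstrates the procedure on synthetic data (the generating exponent of 1.1 is arbitrary, not the paper's value):

        # Estimating a root-vs-shoot nutrient scaling exponent by log-log
        # regression (synthetic data; an exponent > 1 means root nutrient
        # mass grows super-linearly with shoot nutrient mass).
        import numpy as np

        rng = np.random.default_rng(0)
        shoot = 10 ** rng.uniform(1, 4, 200)               # kg/ha, synthetic
        root = 0.2 * shoot ** 1.1 * rng.lognormal(0, 0.2, 200)

        slope, intercept = np.polyfit(np.log10(shoot), np.log10(root), 1)
        print(f"scaling exponent ~ {slope:.2f}")           # ~1.1 here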

  17. High-resolution Continental Scale Land Surface Model incorporating Land-water Management in United States

    NASA Astrophysics Data System (ADS)

    Shin, S.; Pokhrel, Y. N.

    2016-12-01

    Land surface models have been used to assess water resources sustainability under a changing Earth environment and increasing human water needs. Overwhelming observational records indicate that human activities have ubiquitous and pertinent effects on the hydrologic cycle; however, these activities have been crudely represented in large-scale land surface models. In this study, we enhance an integrated continental-scale land hydrology model named Leaf-Hydro-Flood to better represent land-water management. The model is implemented at high resolution (5 km grid) over the continental US. Surface water and groundwater are withdrawn based on actual practices. Newly added irrigation, water diversion, and dam operation schemes allow better simulations of streamflow, evapotranspiration, and infiltration. Results for various hydrologic fluxes and stores from two sets of simulations (one with and the other without human activities) are compared over a range of river basin and aquifer scales. The improved simulation of land hydrology has the potential to provide a consistent modeling framework for human-water-climate interactions.

  18. To the Cloud! A Grassroots Proposal to Accelerate Brain Science Discovery

    PubMed Central

    Vogelstein, Joshua T.; Mensh, Brett; Hausser, Michael; Spruston, Nelson; Evans, Alan; Kording, Konrad; Amunts, Katrin; Ebell, Christoph; Muller, Jeff; Telefont, Martin; Hill, Sean; Koushika, Sandhya P.; Cali, Corrado; Valdés-Sosa, Pedro Antonio; Littlewood, Peter; Koch, Christof; Saalfeld, Stephan; Kepecs, Adam; Peng, Hanchuan; Halchenko, Yaroslav O.; Kiar, Gregory; Poo, Mu-Ming; Poline, Jean-Baptiste; Milham, Michael P.; Schaffer, Alyssa Picchini; Gidron, Rafi; Okano, Hideyuki; Calhoun, Vince D; Chun, Miyoung; Kleissas, Dean M.; Vogelstein, R. Jacob; Perlman, Eric; Burns, Randal; Huganir, Richard; Miller, Michael I.

    2018-01-01

    The revolution in neuroscientific data acquisition is creating an analysis challenge. We propose leveraging cloud-computing technologies to enable large-scale neurodata storing, exploring, analyzing, and modeling. This utility will empower scientists globally to generate and test theories of brain function and dysfunction. PMID:27810005

  19. The storage capacity of Potts models for semantic memory retrieval

    NASA Astrophysics Data System (ADS)

    Kropff, Emilio; Treves, Alessandro

    2005-08-01

    We introduce and analyse a minimal network model of semantic memory in the human brain. The model is a global associative memory structured as a collection of N local modules, each coding a feature, which can take S possible values, with a global sparseness a (the average fraction of features describing a concept). We show that, under optimal conditions, the number c_M of modules connected on average to a module can range widely between very sparse connectivity (high dilution, c_M/N -> 0) and full connectivity (c_M -> N), maintaining a global network storage capacity (the maximum number p_c of stored and retrievable concepts) that scales like p_c ~ c_M S^2 / a, with logarithmic corrections consistent with the constraint that each synapse may store up to a fraction of a bit.
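
    The quoted capacity scaling translates directly into a back-of-envelope estimate; in the sketch below the proportionality constant k and the logarithmic corrections are omitted, and the input values are arbitrary:

        # Potts-network capacity scaling p_c ~ c_M * S**2 / a (the constant
        # and logarithmic corrections are omitted; all values illustrative).
        def potts_capacity(c_m, s, a, k=1.0):
            return k * c_m * s ** 2 / a

        print(potts_capacity(c_m=500, s=7, a=0.2))  # ~122500 concepts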

  20. Modelling Groundwater Depletion at Regional and Global Scales: Present State and Future Prospects.

    NASA Technical Reports Server (NTRS)

    Wada, Yoshihide

    2015-01-01

    Except for frozen water in ice and glaciers, groundwater is the world's largest distributed store of freshwater and has strategic importance to global food and water security. In this paper, the most recent advances quantifying groundwater depletion (GWD) are comprehensively reviewed. This paper critically evaluates the recently advanced modeling approaches estimating GWD at regional and global scales, and the evidence of feedbacks to the Earth system including sea-level rise associated with GWD. Finally, critical challenges and opportunities in the use of groundwater are identified for the adaptation to growing food demand and uncertain climate.

  1. Modeling Groundwater Depletion at Regional and Global Scales: Present State and Future Prospects

    NASA Astrophysics Data System (ADS)

    Wada, Yoshihide

    2016-03-01

    Except for frozen water in ice and glaciers, groundwater is the world's largest distributed store of freshwater and has strategic importance to global food and water security. In this paper, the most recent advances quantifying groundwater depletion (GWD) are comprehensively reviewed. This paper critically evaluates the recently advanced modeling approaches estimating GWD at regional and global scales, and the evidence of feedbacks to the Earth system including sea-level rise associated with GWD. Finally, critical challenges and opportunities in the use of groundwater are identified for the adaptation to growing food demand and uncertain climate.

  2. Probing the scale of new physics by Advanced LIGO/VIRGO

    NASA Astrophysics Data System (ADS)

    Dev, P. S. Bhupal; Mazumdar, A.

    2016-05-01

    We show that if the new physics beyond the standard model is associated with a first-order phase transition around 10^7 - 10^8 GeV, the energy density stored in the resulting stochastic gravitational waves and the corresponding peak frequency are within the projected final sensitivity of the advanced LIGO/VIRGO detectors. We discuss some possible new physics scenarios that could arise at such energies, and in particular, the consequences for Peccei-Quinn and supersymmetry breaking scales.

  3. Solar Magnetic Carpet III: Coronal Modelling of Synthetic Magnetograms

    NASA Astrophysics Data System (ADS)

    Meyer, K. A.; Mackay, D. H.; van Ballegooijen, A. A.; Parnell, C. E.

    2013-09-01

    This article is the third in a series working towards the construction of a realistic, evolving, non-linear force-free coronal-field model for the solar magnetic carpet. Here, we present preliminary results of 3D time-dependent simulations of the small-scale coronal field of the magnetic carpet. Four simulations are considered, each with the same evolving photospheric boundary condition: a 48-hour time series of synthetic magnetograms produced from the model of Meyer et al. (Solar Phys. 272, 29, 2011). Three simulations include a uniform, overlying coronal magnetic field of differing strength; the fourth simulation includes no overlying field. The build-up, storage, and dissipation of magnetic energy within the simulations are studied. In particular, we study their dependence upon the evolution of the photospheric magnetic field and the strength of the overlying coronal field. We also consider where energy is stored and dissipated within the coronal field. The free magnetic energy built up is found to be more than sufficient to power small-scale, transient phenomena such as nanoflares and X-ray bright points, with the bulk of the free energy stored low down, between 0.5 and 0.8 Mm. The energy dissipated is currently found to be too small to account for the heating of the entire quiet-Sun corona. However, the form and location of the energy-dissipation regions qualitatively agree with what is observed on small scales on the Sun. Future MHD modelling using the same synthetic magnetograms may lead to a higher energy release.

  4. Bigger Stores, More Stores, or No Stores: Paths of Retail Restructuring in Rural America

    ERIC Educational Resources Information Center

    Vias, Alexander C.

    2004-01-01

    Changes such as the development of large international retail chains, retail concentration, locational changes, technological innovation, new labor practices, and the increasing scale of individual stores have revolutionized the retail sector. This broad restructuring will have profound impacts in rural America because employment in retail is a…

  5. A model for optimizing file access patterns using spatio-temporal parallelism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boonthanome, Nouanesengsy; Patchett, John; Geveci, Berk

    2013-01-01

    For many years now, I/O read time has been recognized as the primary bottleneck for parallel visualization and analysis of large-scale data. In this paper, we introduce a model that can estimate the read time for a file stored in a parallel filesystem when given the file access pattern. Read times ultimately depend on how the file is stored and the access pattern used to read the file. The file access pattern will be dictated by the type of parallel decomposition used. We employ spatio-temporal parallelism, which combines both spatial and temporal parallelism, to provide greater flexibility in possible file access patterns. Using our model, we were able to configure the spatio-temporal parallelism to design optimized read access patterns that resulted in a speedup factor of approximately 400 over traditional file access patterns.
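
    A generic latency-plus-bandwidth cost model in the spirit of the paper (not the authors' actual model; all parameter values are illustrative) shows why fragmented access patterns dominate read time:

        # Generic read-time estimate for a parallel filesystem (illustrative,
        # not the paper's model): each reader pays a per-request latency plus
        # transfer time, and readers proceed in parallel.
        def read_time(total_bytes, n_readers, requests_per_reader,
                      latency_s=1e-3, bandwidth_Bps=1e9):
            per_reader_bytes = total_bytes / n_readers
            transfer = per_reader_bytes / bandwidth_Bps
            overhead = requests_per_reader * latency_s
            return transfer + overhead   # readers run concurrently

        # contiguous access (few requests) vs fragmented access (many requests)
        print(read_time(1e12, 256, 10), read_time(1e12, 256, 100000))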

  6. Effects of rainfall seasonality and soil moisture capacity on mean annual water balance for Australian catchments

    USGS Publications Warehouse

    Potter, N.J.; Zhang, L.; Milly, P.C.D.; McMahon, T.A.; Jakeman, A.J.

    2005-01-01

    An important factor controlling catchment-scale water balance is the seasonal variation of climate. The aim of this study is to investigate the effect of the seasonal distributions of water and energy, and their interactions with the soil moisture store, on mean annual water balance in Australia at catchment scales, using a stochastic model of soil moisture balance with seasonally varying forcing. The rainfall regime at 262 catchments around Australia was modeled as a Poisson process, with the mean storm arrival rate and the mean storm depth varying throughout the year as cosine curves with annual periods. The soil moisture dynamics were represented by a single, finite water store having infinite infiltration capacity, and the potential evapotranspiration rate was modeled as an annual cosine curve. The mean annual water budget was calculated numerically using Monte Carlo simulation. The model predicted that, for a given level of climatic aridity, the ratio of mean annual evapotranspiration to rainfall was larger where potential evapotranspiration and rainfall were in phase, that is, in summer-dominant rainfall catchments, than where they were out of phase. The observed mean annual evapotranspiration ratios show the opposite pattern. As a result, estimates of mean annual evapotranspiration from the model compared poorly with observational data. Because the inclusion of seasonally varying forcing alone was not sufficient to explain variability in the mean annual water balance, other catchment properties may play a role. Further analysis showed that the water balance was highly sensitive to the catchment-scale soil moisture capacity. Calibrations of this parameter indicated that infiltration-excess runoff might be an important process, especially for the summer-dominant rainfall catchments; most similar studies have shown that modeling of infiltration-excess runoff is not required at the mean annual timescale.
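
    The stochastic forcing described above can be sketched in a few lines; the following Monte Carlo fragment (all parameter values invented) couples Poisson storm arrivals with cosine-seasonal storm depth and potential evapotranspiration to a finite bucket store:

        # Monte Carlo sketch of the paper's setup (parameters illustrative):
        # Poisson storm arrivals with cosine seasonal rate and depth, a finite
        # soil store, and a cosine potential-evapotranspiration curve.
        import numpy as np

        rng = np.random.default_rng(1)
        dt, years = 1.0, 50                      # daily steps
        t = np.arange(0, 365 * years, dt)
        doy = t % 365
        lam = 0.2 * (1 + 0.5 * np.cos(2 * np.pi * doy / 365))     # storms/day
        depth = 10.0 * (1 + 0.5 * np.cos(2 * np.pi * doy / 365))  # mm/storm
        pet = 3.0 * (1 - 0.8 * np.cos(2 * np.pi * doy / 365))     # mm/day

        smax, s = 100.0, 50.0                    # bucket capacity, state (mm)
        rain_tot = et_tot = 0.0
        for i in range(len(t)):
            rain = depth[i] * rng.poisson(lam[i] * dt)
            et = pet[i] * s / smax               # simple linear ET reduction
            s = min(max(s + rain - et, 0.0), smax)  # overflow = runoff
            rain_tot += rain; et_tot += et
        print("E/P ratio:", et_tot / rain_tot)

    Shifting the sign of the cosine in the pet curve switches the forcing between in-phase and out-of-phase regimes, which is the contrast the study examines.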

  7. Impact of downslope soil transport on carbon storage and fate in permafrost dominated landscapes

    NASA Astrophysics Data System (ADS)

    Shelef, E.; Rowland, J. C.; Wilson, C. J.; Altmann, G.; Hilley, G. E.

    2014-12-01

    A large fraction of high-latitude permafrost-dominated landscapes is covered by soil-mantled hillslopes. In these landscapes, soil organic carbon (SOC) accumulates and is lost through lateral transport processes. At present, these processes are not included in regional or global land surface climate models. We present preliminary results of a soil transport and storage model for a permafrost-dominated hillslope. In this model, soil carbon is transported downslope within a mobile layer that thaws every summer. The model tracks soil transport and its subsequent storage at the hillslope's base. In a scenario where a carbon-poor subsurface is blanketed by a carbon-rich surface layer, progressive downslope soil transport can result in net carbon sequestration. This sequestration occurs because SOC is carried from the hillslope's near-surface layer, where it is produced by plants and is capable of decomposing, into depositional sites at the hillslope's base, where it is stored in frozen deposits such that its decomposition rate is effectively zero. We use the model to evaluate the quantities of carbon stored in depositional settings during the Holocene, and to predict changes in sequestration rate in response to the thaw-depth thickening expected to occur within the next century due to climate change. At the Holocene time scale, we show that a large amount of SOC is likely stored in depositional sites that comprise only a small fraction of arctic landscapes. The convergent topography of these sites makes them susceptible to fluvial erosion and suggests that increased fluvial incision in response to climate-change-induced thawing has the potential to release significant amounts of carbon to the river system, and potentially to the atmosphere. At the time scale of the next century, increased thaw depth may increase soil-transport rates on hillslopes and therefore increase SOC sequestration rates at a magnitude that may partly compensate for the carbon release expected from permafrost thawing. Model-guided field data collection is essential to reduce the uncertainty of these estimates.

  8. The potential of detecting intermediate-scale biomass and canopy interception in a coniferous forest using cosmic-ray neutron intensity measurements and neutron transport modeling

    NASA Astrophysics Data System (ADS)

    Andreasen, M.; Looms, M. C.; Bogena, H. R.; Desilets, D.; Zreda, M. G.; Sonnenborg, T. O.; Jensen, K. H.

    2014-12-01

    The water stored in the various compartments of the terrestrial ecosystem (in snow, canopy interception, soil and litter) controls the exchange of water and energy between the land surface and the atmosphere. Therefore, measurements of the water stored within these pools are critical for the prediction of, e.g., evapotranspiration and groundwater recharge. The detection of cosmic-ray neutron intensity is a novel non-invasive method for the continuous quantification of intermediate-scale soil moisture. The footprint of the cosmic-ray neutron probe is a hemisphere a few hectometers across, reaching subsurface depths of 10-70 cm depending on wetness. The cosmic-ray neutron method offers measurements at a scale between point-scale measurements and large-scale satellite retrievals. The cosmic-ray neutron intensity is inversely correlated with the hydrogen stored within the footprint. Overall, soil moisture represents the largest pool of hydrogen, and changes in soil moisture clearly affect the cosmic-ray neutron signal. However, the neutron intensity is also sensitive to variations of hydrogen in snow, canopy interception and biomass, offering the potential to determine the water content of such pools from the signal. In this study we tested the potential of determining canopy interception and biomass using cosmic-ray neutron intensity measurements within the framework of the Danish Hydrologic Observatory (HOBE) and the Terrestrial Environmental Observatories (TERENO). Continuous measurements at the ground and canopy levels, along with profile measurements, were conducted at towers at forest field sites. Field experiments, including shielding the cosmic-ray neutron probes with cadmium foil (to remove lower-energy neutrons) and measuring reference intensity rates at completely water-saturated conditions (on the sea close to the HOBE site), were further conducted to obtain an increased understanding of the physics controlling cosmic-ray neutron transport and the equipment used. Additionally, neutron transport modeling, using the extended version of the Monte Carlo N-Particle Transport Code, was conducted. The responses of the cosmic-ray neutron intensity to the reference condition, different amounts of biomass, soil moisture and canopy interception were simulated and compared to the measurements.
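
    The inverse count-to-moisture relation mentioned above is commonly expressed with a Desilets-type shape function; the sketch below uses the widely quoted default coefficients, with the dry-soil count rate N0 being site-specific (this illustrates standard practice, not code from this record):

        # Commonly used Desilets-type calibration converting corrected neutron
        # counts N to water content; a0, a1, a2 are the usual published
        # constants and N0 (counts over dry soil) is site-specific.
        def neutrons_to_moisture(N, N0, a0=0.0808, a1=0.372, a2=0.115):
            return a0 / (N / N0 - a1) - a2

        print(neutrons_to_moisture(N=2400.0, N0=3000.0))  # wetter -> fewer counts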

  9. Wildfire and drought dynamics destabilize carbon stores of fire-suppressed forests

    Treesearch

    J. Mason Earles; Malcolm P. North; Matthew D. Hurteau

    2014-01-01

    Widespread fire suppression and thinning have altered the structure and composition of many forests in the western United States, making them more susceptible to the synergy of large-scale drought and fire events. We examine how these changes affect carbon storage and stability compared to historic fire-adapted conditions. We modeled carbon dynamics under possible...

  10. Using RDF to Model the Structure and Process of Systems

    NASA Astrophysics Data System (ADS)

    Rodriguez, Marko A.; Watkins, Jennifer H.; Bollen, Johan; Gershenson, Carlos

    Many systems can be described in terms of networks of discrete elements and their various relationships to one another. A semantic network, or multi-relational network, is a directed labeled graph consisting of a heterogeneous set of entities connected by a heterogeneous set of relationships. Semantic networks serve as a promising general-purpose modeling substrate for complex systems. Various standardized formats and tools are now available to support practical, large-scale semantic network models. First, the Resource Description Framework (RDF) offers a standardized semantic network data model that can be further formalized by ontology modeling languages such as RDF Schema (RDFS) and the Web Ontology Language (OWL). Second, the recent introduction of highly performant triple-stores (i.e. semantic network databases) allows semantic network models on the order of 10^9 edges to be efficiently stored and manipulated. RDF and its related technologies are currently used extensively in the domains of computer science, digital library science, and the biological sciences. This article will provide an introduction to RDF/RDFS/OWL and an examination of its suitability to model discrete element complex systems.
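
    A minimal semantic-network example using the Python rdflib package (the entities in the example namespace are invented) shows the triple-based data model in practice:

        # Minimal semantic-network example with rdflib (assumes the rdflib
        # package; names in the example namespace are illustrative).
        from rdflib import Graph, Namespace, Literal

        EX = Namespace("http://example.org/")
        g = Graph()
        g.add((EX.neuron42, EX.connectsTo, EX.neuron7))   # labeled edge
        g.add((EX.neuron42, EX.hasLabel, Literal("pyramidal")))

        # traverse one relationship type of the heterogeneous graph
        for s, o in g.subject_objects(EX.connectsTo):
            print(s, "->", o)
        print(g.serialize(format="turtle"))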

  11. Intelligent monitoring system for real-time geologic CO2 storage, optimization and reservoir management

    NASA Astrophysics Data System (ADS)

    Dou, S.; Commer, M.; Ajo Franklin, J. B.; Freifeld, B. M.; Robertson, M.; Wood, T.; McDonald, S.

    2017-12-01

    Archer Daniels Midland Company's (ADM) world-scale agricultural processing and biofuels production complex located in Decatur, Illinois, is host to two industrial-scale carbon capture and storage projects. The first operation within the Illinois Basin-Decatur Project (IBDP) is a large-scale pilot that injected 1,000,000 metric tons of CO2 over a three year period (2011-2014) in order to validate the Illinois Basin's capacity to permanently store CO2. Injection for the second operation, the Illinois Industrial Carbon Capture and Storage Project (ICCS), started in April 2017, with the purpose of demonstrating the integration of carbon capture and storage (CCS) technology at an ethanol plant. The capacity to store over 1,000,000 metric tons of CO2 per year is anticipated. The latter project is accompanied by the development of an intelligent monitoring system (IMS) that will, among other tasks, perform hydrogeophysical joint analysis of pressure, temperature and seismic reflection data. Using a preliminary radial model assumption, we carry out synthetic joint inversion studies of these data combinations. We validate the history-matching process to be applied to field data once CO2-breakthrough at observation wells occurs. This process will aid the estimation of permeability and porosity for a reservoir model that best matches monitoring observations. The reservoir model will further be used for forecasting studies in order to evaluate different leakage scenarios and develop appropriate early-warning mechanisms. Both the inversion and forecasting studies aim at building an IMS that will use the seismic and pressure-temperature data feeds for providing continuous model calibration and reservoir status updates.

  12. Towards methodical modelling: Differences between the structure and output dynamics of multiple conceptual models

    NASA Astrophysics Data System (ADS)

    Knoben, Wouter; Woods, Ross; Freer, Jim

    2016-04-01

    Conceptual hydrologic models consist of a particular arrangement of stores, fluxes and transformation functions representing spatial and temporal dynamics, depending on the modeller's choices and intended use. They have the advantages of being computationally efficient, having model structures that are relatively easy to reconfigure, and having relatively low input data demands. This makes them well-suited for large-scale and large-sample hydrology, where appropriately representing the dominant hydrologic functions of a catchment is a main concern. Given these requirements, the number of parameters in the model cannot be too high, to avoid equifinality and identifiability issues. This limits the number and complexity of dominant hydrologic processes the model can represent. Specific purposes and places thus require specific models, and this has led to an abundance of conceptual hydrologic models. No structured overview of these models exists, and there is no clear method for selecting appropriate model structures for different catchments. This study is a first step towards creating an overview of the elements that make up conceptual models, which may later assist a modeller in finding an appropriate model structure for a given catchment. To this end, this study brings together over 30 past and present conceptual models. The reviewed model structures are simply different configurations of three basic model elements (stores, fluxes and transformation functions), depending on the hydrologic processes the models are intended to represent. Differences also exist in the inner workings of the stores, fluxes and transformations, i.e. the mathematical formulations that describe each model element's intended behaviour. We investigate the hypothesis that different model structures can produce similar behavioural simulations. This can clarify the overview of model elements by grouping elements which are similar, which can improve model structure selection.
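
    A minimal sketch of the three basic elements, assuming the simplest possible configuration (a single linear reservoir; all values illustrative):

        # Minimal illustration of the three basic elements of a conceptual
        # model: one store S, an input flux P, and a transformation function
        # Q = S / k draining the store (a linear reservoir).
        def linear_reservoir(precip, k=10.0, s0=0.0, dt=1.0):
            s, flows = s0, []
            for p in precip:
                q = s / k                 # transformation: storage -> outflow
                s = s + dt * (p - q)      # store update from flux balance
                flows.append(q)
            return flows

        print(linear_reservoir([5, 0, 0, 10, 0, 0, 0]))

    More complex model structures in the review are, in essence, several such stores connected by additional fluxes and nonlinear transformation functions.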

  13. Numerical Modeling of Propellant Boil-Off in a Cryogenic Storage Tank

    NASA Technical Reports Server (NTRS)

    Majumdar, A. K.; Steadman, T. E.; Maroney, J. L.; Sass, J. P.; Fesmire, J. E.

    2007-01-01

    A numerical model to predict boil-off of stored propellant in large spherical cryogenic tanks has been developed. Accurate prediction of tank boil-off rates for different thermal insulation systems was the goal of this collaborative effort. The Generalized Fluid System Simulation Program, integrating flow analysis and conjugate heat transfer for solving complex fluid system problems, was used to create the model. Calculation of the tank boil-off rate requires simultaneous simulation of heat transfer processes among the liquid propellant, the vapor ullage space, and the tank structure. The reference tank for the boil-off model was the 850,000 gallon liquid hydrogen tank at Launch Complex 39B (LC-39B) at Kennedy Space Center, which is under study for future infrastructure improvements to support the Constellation program. The methodology employed in the numerical model was validated using a sub-scale model and tank. Experimental test data from a 1/15th-scale version of the LC-39B tank using both liquid hydrogen and liquid nitrogen were used to anchor the analytical predictions of the sub-scale model. Favorable correlations between the sub-scale model and experimental test data have provided confidence in full-scale tank boil-off predictions. These methods are now being used in the preliminary design for other cases, including future launch vehicles.

  14. Collected Data of The Boreal Ecosystem and Atmosphere Study (BOREAS)

    NASA Technical Reports Server (NTRS)

    Newcomer, J. (Editor); Landis, D. (Editor); Conrad, S. (Editor); Curd, S. (Editor); Huemmrich, K. (Editor); Knapp, D. (Editor); Morrell, A. (Editor); Nickerson, J. (Editor); Papagno, A. (Editor); Rinker, D. (Editor)

    2000-01-01

    The Boreal Ecosystem-Atmosphere Study (BOREAS) was a large-scale international interdisciplinary climate-ecosystem interaction experiment in the northern boreal forests of Canada. Its goal was to improve our understanding of the boreal forests -- how they interact with the atmosphere, how much CO2 they can store, and how climate change will affect them. BOREAS wanted to learn to use satellite data to monitor the forests, and to improve computer simulation and weather models so scientists can anticipate the effects of global change. This BOREAS CD-ROM set is a set of 12 CD-ROMs containing the finalized point data sets and compressed image data from the BOREAS Project. All point data are stored in ASCII text files, and all image and GIS products are stored as binary images, compressed using GZip. Additional descriptions of the various data sets on this CD-ROM are available in other documents in the BOREAS series.

  15. Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency.

    PubMed

    Aniceto, Rodrigo; Xavier, Rene; Guimarães, Valeria; Hondo, Fernanda; Holanda, Maristela; Walter, Maria Emilia; Lifschitz, Sérgio

    2015-01-01

    Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them refers to the management of the massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. Finding an alternative to the frequently considered relational database model becomes a compelling task. Other data models may be more effective when dealing with very large amounts of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB.
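
    A sketch of the kind of write/read workload discussed, using the DataStax Python driver against a hypothetical local cluster (the keyspace and table schema are invented for illustration, not taken from the paper):

        # Sketch of storing processed reads in Cassandra with the DataStax
        # Python driver (assumes a local cluster; schema is illustrative).
        from cassandra.cluster import Cluster

        cluster = Cluster(["127.0.0.1"])
        session = cluster.connect()
        session.execute("""CREATE KEYSPACE IF NOT EXISTS genomics
            WITH replication = {'class': 'SimpleStrategy',
                                'replication_factor': 1}""")
        session.execute("""CREATE TABLE IF NOT EXISTS genomics.reads (
            sample_id text, read_id text, sequence text,
            PRIMARY KEY (sample_id, read_id))""")
        session.execute(
            "INSERT INTO genomics.reads (sample_id, read_id, sequence) "
            "VALUES (%s, %s, %s)", ("S1", "r0001", "ACGTACGT"))
        rows = session.execute(
            "SELECT sequence FROM genomics.reads WHERE sample_id = %s", ("S1",))
        print([r.sequence for r in rows])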

  16. Efficient Storage Scheme of Covariance Matrix during Inverse Modeling

    NASA Astrophysics Data System (ADS)

    Mao, D.; Yeh, T. J.

    2013-12-01

    During stochastic inverse modeling, the covariance matrix of geostatistics-based methods carries information about the geologic structure. Its update during iterations reflects the decrease of uncertainty as observed data are incorporated. For large-scale problems, storing and updating this matrix demands excessive memory and computational resources. In this study, we propose a new efficient scheme for its storage and update. The Compressed Sparse Column (CSC) format is used to store the covariance matrix, and users can choose how much data to store based on correlation scales, since entries beyond several correlation scales are usually not very informative for inverse modeling. After every iteration, only the diagonal terms of the covariance matrix are updated. The off-diagonal terms are calculated and updated based on shortened correlation scales with a pre-assigned exponential model. The correlation scales are shortened by a coefficient, e.g. 0.95, every iteration to reflect the decrease of uncertainty. There is no universal coefficient for all problems, and users are encouraged to experiment with several values. This new scheme is tested with 1D examples first. The estimated results and uncertainty are compared with those of the traditional full-storage method. Finally, a large-scale numerical model is used to validate the new scheme.
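
    A sketch of the storage idea under stated assumptions (a 1D grid, an exponential covariance, and invented taper and shrink coefficients; not the authors' code):

        # Sketch: keep only covariance entries within a few correlation
        # scales, store them in Compressed Sparse Column form, and shrink
        # the scale each iteration while updating the diagonal.
        import numpy as np
        from scipy.sparse import csc_matrix

        def exp_cov_csc(x, scale, cutoff=3.0, sigma2=1.0):
            rows, cols, vals = [], [], []
            for j, xj in enumerate(x):
                near = np.where(np.abs(x - xj) <= cutoff * scale)[0]
                rows.extend(near); cols.extend([j] * len(near))
                vals.extend(sigma2 * np.exp(-np.abs(x[near] - xj) / scale))
            n = len(x)
            return csc_matrix((vals, (rows, cols)), shape=(n, n))

        x = np.linspace(0, 100, 500)            # 1D grid
        for it in range(3):                     # mimic iterative update
            scale = 5.0 * 0.95 ** (it + 1)      # shorten correlation scale
            cov = exp_cov_csc(x, scale)
            cov.setdiag(cov.diagonal() * 0.9)   # updated (shrinking) variances
        print(cov.nnz, "stored entries of", 500 * 500)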

  17. From high-scale leptogenesis to low-scale one-loop neutrino mass generation

    NASA Astrophysics Data System (ADS)

    Zhou, Hang; Gu, Pei-Hong

    2018-02-01

    We show that a high-scale leptogenesis can be consistent with low-scale one-loop neutrino mass generation. Our models are based on the SU(3)_c × SU(2)_L × U(1)_Y × U(1)_{B-L} gauge groups. Except for a complex singlet scalar responsible for the U(1)_{B-L} symmetry breaking, the other new scalars and fermions (one scalar doublet, two or more real scalar singlets/triplets, and three right-handed neutrinos) are odd under an unbroken Z_2 discrete symmetry. The real scalar decays can produce an asymmetry stored in the new scalar doublet, which subsequently decays into the standard-model lepton doublets and the right-handed neutrinos. The lepton asymmetry in the standard-model leptons can then be partially converted to a baryon asymmetry by the sphaleron processes. By integrating out the heavy scalar singlets/triplets, we can realize an effective theory that radiatively generates the small neutrino masses at the TeV scale. Furthermore, the lightest right-handed neutrino can serve as a dark matter candidate.

  18. A model of forest floor carbon mass for United States forest types

    Treesearch

    James E. Smith; Linda S. Heath

    2002-01-01

    This report compiles a large set of published values of forest floor mass and develops large-scale estimates of carbon mass according to region and forest type. Estimates of average forest floor carbon mass per hectare of forest, applied to a 1997 summary forest inventory, sum to 4.5 Gt of carbon stored in forests of the 48 contiguous United States.

  19. Design and Implementation of a Metadata-rich File System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ames, S; Gokhale, M B; Maltzahn, C

    2010-01-19

    Despite continual improvements in the performance and reliability of large scale file systems, the management of user-defined file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and semantic metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, user-defined attributes, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS incorporates Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.

  20. Slot-like capacity and resource-like coding in a neural model of multiple-item working memory.

    PubMed

    Standage, Dominic; Pare, Martin

    2018-06-27

    For the past decade, research on the storage limitations of working memory has been dominated by two fundamentally different hypotheses. On the one hand, the contents of working memory may be stored in a limited number of 'slots', each with a fixed resolution. On the other hand, any number of items may be stored, but with decreasing resolution. These two hypotheses have been invaluable in characterizing the computational structure of working memory, but neither provides a complete account of the available experimental data, nor speaks to the neural basis of the limitations it characterizes. To address these shortcomings, we simulated a multiple-item working memory task with a cortical network model, the cellular resolution of which allowed us to quantify the coding fidelity of memoranda as a function of memory load, as measured by the discriminability, regularity and reliability of simulated neural spiking. Our simulations account for a wealth of neural and behavioural data from human and non-human primate studies, and they demonstrate that feedback inhibition lowers both capacity and coding fidelity. Because the strength of inhibition scales with the number of items stored by the network, increasing this number progressively lowers fidelity until capacity is reached. Crucially, the model makes specific, testable predictions for neural activity on multiple-item working memory tasks.

  1. Numerical Field Model Simulation of Full Scale Fire Tests in a Closed Spherical/Cylindrical Vessel.

    DTIC Science & Technology

    1987-12-01

    …the behavior of an actual fire on board a ship. The computer model will be verified by the experimental data obtained in Fire-1. It is important to … behavior in simulations where convection is important. The upwind differencing scheme takes into account the unsymmetrical phenomenon of convection by using … [the remainder of the indexed excerpt consists of Fortran comment-card fragments from the source listing, e.g. an array COSUMS(I,J) storing values for neighboring fire cells, and is not recoverable as prose].

  2. Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency

    PubMed Central

    Aniceto, Rodrigo; Xavier, Rene; Guimarães, Valeria; Hondo, Fernanda; Holanda, Maristela; Walter, Maria Emilia; Lifschitz, Sérgio

    2015-01-01

    Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them refers to the management of the massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. Finding an alternative to the frequently considered relational database model becomes a compelling task. Other data models may be more effective when dealing with very large amounts of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB. PMID:26558254

  3. Is working memory stored along a logarithmic timeline? Converging evidence from neuroscience, behavior and models.

    PubMed

    Singh, Inder; Tiganj, Zoran; Howard, Marc W

    2018-04-23

    A growing body of evidence suggests that short-term memory does not only store the identity of recently experienced stimuli, but also information about when they were presented. This representation of 'what' happened 'when' constitutes a neural timeline of the recent past. Behavioral results suggest that people can sequentially access memories for the recent past, as if they were stored along a timeline to which attention is sequentially directed. In the short-term judgment of recency (JOR) task, the time to choose between two probe items depends on the recency of the more recent probe but not on the recency of the more remote probe. This pattern of results suggests a backward self-terminating search model. We review recent neural evidence from the macaque lateral prefrontal cortex (lPFC) (Tiganj, Cromer, Roy, Miller, & Howard, in press) and behavioral evidence from the human JOR task (Singh & Howard, 2017) bearing on this question. Notably, both lines of evidence suggest that the timeline is logarithmically compressed, as predicted by Weber-Fechner scaling. Taken together, these findings provide an integrative perspective on the temporal organization and neural underpinnings of short-term memory. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. What Is Being Played in the World? Mobile eSport Applications

    ERIC Educational Resources Information Center

    Atalay, Ahmet; Topuz, Arif Cem

    2018-01-01

    In this study, the aim is to examine the most popular eSport applications at a global scale. In this context, the App Store and Google Play Store application platforms which have the highest number of users at a global scale were focused on. For this reason, the eSport applications included in these two platforms constituted the sampling of the…

  5. Modeling the temporal dynamics of nonstructural carbohydrate pools in forest trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richardson, Andrew D.

    Trees store carbohydrates, in the form of sugars and starch, as reserves to be used to power both future growth and day-to-day metabolic functions. These reserves are particularly important in the context of how trees cope with disturbance and stress, for example as related to pest outbreaks, wind or ice damage, and extreme climate events. In this project, we measured the size of carbon reserves in forest trees, and determined how quickly these reserves are used and replaced, i.e., their "turnover time". Our work was conducted at Harvard Forest, a temperate deciduous forest in central Massachusetts. Through field sampling, laboratory-based chemical analyses, and allometric modeling, we scaled these measurements up to whole-tree NSC budgets. We used these data to test and improve computer simulation models of carbon flow through forest ecosystems. Our modeling focused on the mathematical representation of these stored carbon reserves, and we examined the sensitivity of model performance to different model structures. This project contributes to DOE's goal to improve next-generation models of the earth system, and to understand the impacts of climate change on terrestrial ecosystems.

  6. Wind-Tunnel Tests of a 1/8-Scale Powered Model of the XTB3F-1 Airplane, TED No. NACA 2382

    NASA Technical Reports Server (NTRS)

    McKee, John W.; Vogler, Raymond D.

    1947-01-01

    A 1/8-scale model of the Grumman XTB3F-1 airplane was tested in the Langley 7- by 10-foot tunnel to determine its stability and control characteristics and to provide data for estimating the airplane's handling qualities. The report includes longitudinal and lateral stability and control characteristics of the complete model, the characteristics of the isolated horizontal tail, the effects of various flow conditions through the jet duct, tests with external stores attached to the underside of the wing, and tests simulating landing and take-off conditions with a ground board. The handling characteristics of the airplane have not been computed, but some conclusions were indicated by the data. An improvement in longitudinal stability was obtained by tilting the thrust line down. It is shown that if the wing flap is spring loaded so that the flap deflection varies with airspeed, the airplane will be less stable than with the flap retracted or fully deflected. An increase in the size of the vertical tail and of the dorsal fin gave more desirable yawing-moment characteristics than the original vertical tail and dorsal fin. Preventing air flow through the jet duct system or simulating jet operation with unheated air produced only small changes in the model characteristics. The external stores on the underside of the wing had only small effects on the model characteristics. After completion of the investigation, the model was returned to the contractor for modifications indicated by the test results.

  7. North American water availability under stress and duress: building understanding from simulations, observations and data products

    NASA Astrophysics Data System (ADS)

    Maxwell, R. M.; Condon, L. E.; Atchley, A. L.; Hector, B.

    2017-12-01

    Quantifying the available freshwater for human use and ecological function depends on fluxes and stores that are hard to observe. Evapotranspiration (ET) is the largest terrestrial flux of water behind precipitation but is observed with low spatial density. Likewise, groundwater is the largest freshwater store, yet is equally uncertain. The ability to upscale observations of these variables is an additional complication; point measurements are made at scales orders of magnitude smaller than remote sensing data products. Integrated hydrologic models that simulate continental extents at fine spatial resolution are now becoming an additional tool to constrain fluxes and address interconnections. For example, recent work has shown connections between water table depth and transpiration partitioning, and demonstrated the ability to reconcile point observations and large-scale inferences. Here we explore the dynamics of large hydrologic systems experiencing change and stress across continental North America using integrated model simulations, observations and data products. Simulations of aquifer depletion due to pervasive groundwater pumping diagnose both stream depletion and changes in ET. Simulations of systematic increases in temperature are used to understand the relationship between snowpack dynamics, surface and groundwater flow, ET and a changing climate. Remotely sensed products including the GRACE estimates of total storage change are downscaled using model simulations to better understand human impacts to the hydrologic cycle. These example applications motivate a path forward to better use simulations to understand water availability.

  8. Influence of the Magnitude and Spatial Distribution of Water Storage in Aquifers on the Character of Baseflow Recessions

    NASA Astrophysics Data System (ADS)

    Nieber, J. L.; Li, W.

    2017-12-01

    The instantaneous groundwater discharge (Qgw) from a watershed is related to the volume of drainable water stored (Sgw) within the watershed aquifer(s). The relation is hysteretic, and the magnitude of the hysteresis is completely scale-dependent. In the research reported here we apply a previously calibrated (USGS) GSFLOW model to the simulation of surface and subsurface runoff for the Sagehen Creek watershed. This 29.3 km2 watershed is located in the eastern range of the Sierra Nevada Mountains, and most of the precipitation falls as snow. The GSFLOW model is composed of a surface water and shallow subsurface flow hydrology model, PRMS, and a groundwater flow component based on MODFLOW. PRMS is a semi-distributed watershed model, very similar in character to the well-known SWAT model. The PRMS model is coupled with the MODFLOW model in that deep percolation generated within the PRMS model feeds into the MODFLOW model. The simulated baseflow recessions, plotted as -dQ/dt vs Q, show a strong dependence on watershed topography and plot concave downward. These plots show a somewhat weaker dependence on the hydrologic fluxes of evapotranspiration and recharge, with the concave-downward shape maintained but somewhat modified by these fluxes. As expected, the Qgw vs Sgw relation is markedly hysteretic. The cause of this hysteresis is related to the magnitude of water stored, and also to the spatial distribution of water stored in the watershed, with antecedent storage in upland areas controlling the recession flow in late time, while the valley area dominates the recession flow in early time. Both the minimum streamflow (Qmin; the flow at the transition between early-time and late-time uninterrupted recession) and the intercept (of the regression line fit to the recession data on a log-log scale) show a strong relationship with antecedent streamflows. The minimum streamflow, Qmin, is found to be a valid normalizing parameter for producing a unique normalized -dQ/dt vs. Q relation from data manifesting the effects of hysteresis. It is proposed that this normalized relation can be used to improve the performance of low-dimension dynamic models of watershed hydrology that would otherwise not account for hysteresis in the Qgw vs Sgw relation.
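
    Recession analysis of this kind is commonly performed by plotting -dQ/dt against Q on log-log axes; the sketch below demonstrates the computation on a synthetic exponential recession (real applications first isolate uninterrupted recession limbs after rainfall ends):

        # Computing the -dQ/dt vs Q recession relation from a daily
        # streamflow series (synthetic data for illustration).
        import numpy as np

        t = np.arange(0, 100.0)                     # days
        Q = 10.0 * np.exp(-t / 15.0)                # synthetic recession
        dQdt = np.gradient(Q, t)
        mask = dQdt < 0                             # keep the receding limb

        # slope of log(-dQ/dt) vs log(Q): 1.0 for a linear reservoir
        b, log_a = np.polyfit(np.log(Q[mask]), np.log(-dQdt[mask]), 1)
        print("recession exponent b =", round(b, 2))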

  9. Numerical simulation of ozone concentration profile and flow characteristics in paddy bulks.

    PubMed

    Pandiselvam, Ravi; Chandrasekar, Veerapandian; Thirupathi, Venkatachalam

    2017-08-01

    Ozone has shown the potential to control stored-product insect pests. The high reactivity of ozone leads to special problems when it passes through an organic medium such as stored grain. Thus, there is a need for a simulation study to understand the concentration profile and flow characteristics of ozone in stored paddy bulks as a function of time. The simulation of ozone concentration through the paddy grain bulks was based on the law of conservation of mass together with a continuity equation. Higher ozone concentrations were observed in regions near the ozone diffuser, whereas lower concentrations were observed in regions away from the diffuser. The relative error between the experimental and predicted ozone concentration values for the entire bin geometry was less than 42.8%. The simulation model described a non-linear change of ozone concentration in stored paddy bulks. The results of this study provide a valuable source for estimating the parameters needed to effectively design a storage bin for fumigation of paddy grains in a commercial-scale continuous-flow ozone fumigation system. © 2017 Society of Chemical Industry.

  10. Aerodynamic Loads on an External Store Adjacent to a 45 Degree Sweptback Wing at Mach Numbers from 0.70 to 1.96, Including an Evaluation of Techniques Used

    NASA Technical Reports Server (NTRS)

    Guy, Lawrence D; Hadaway, William M

    1955-01-01

    Aerodynamic forces and moments have been obtained in the Langley 9- by 12-inch blowdown tunnel on an external store and on a 45 degree sweptback wing-body combination measured separately at Mach numbers from 0.70 to 1.96. The wing was cantilevered and had an aspect ratio of 4.0; the store was independently sting-mounted and had a Douglas Aircraft Co. (DAC) store shape. The angle-of-attack range was from -3 degrees to 12 degrees, and the Reynolds number (based on the wing mean aerodynamic chord) varied from 1.2 x 10^6 to 1.7 x 10^6. Wing-body transonic forces and moments have been compared with data for a geometrically similar full-scale model tested in the Langley 16-foot and 8-foot transonic tunnels in order to aid in the evaluation of transonic-tunnel interference. The principal effect of the store, for the position tested, was that of delaying the wing-fuselage pitch-up tendency to higher angles of attack at Mach numbers from 0.70 to 0.90, in a manner similar to that of a wing chord extension. The most critical loading condition on the store was that due to side force, not only because the loads were of large magnitude but also because they were in the direction of least structural strength of the supporting pylon. These side loads were greatest at high angles of attack in the supersonic speed range. Removal of the supporting pylon (or increasing the gap between the store and wing) reduced the variation of side-force coefficient with angle of attack by about 50 percent at all test Mach numbers, indicating that important reductions in store side force may be realized by proper design or location of the necessary supporting pylon. A change of the store skew angle (nose inboard) was found to relieve the excessive store side loads throughout the Mach number range. It was also determined that the position of the fuselage nose relative to the store can appreciably affect the store side forces at supersonic speeds.

  11. Integrated modelling of H-mode pedestal and confinement in JET-ILW

    NASA Astrophysics Data System (ADS)

    Saarelma, S.; Challis, C. D.; Garzotti, L.; Frassinetti, L.; Maggi, C. F.; Romanelli, M.; Stokes, C.; Contributors, JET

    2018-01-01

    A pedestal prediction model Europed is built on the existing EPED1 model by coupling it with core transport simulation using a Bohm-gyroBohm transport model to self-consistently predict JET-ILW power scan for hybrid plasmas that display weaker power degradation than the IPB98(y, 2) scaling of the energy confinement time. The weak power degradation is reproduced in the coupled core-pedestal simulation. The coupled core-pedestal model is further tested for a 3.0 MA plasma with the highest stored energy achieved in JET-ILW so far, giving a prediction of the stored plasma energy within the error margins of the measured experimental value. A pedestal density prediction model based on the neutral penetration is tested on a JET-ILW database giving a prediction with an average error of 17% from the experimental data when a parameter taking into account the fuelling rate is added into the model. However the model fails to reproduce the power dependence of the pedestal density implying missing transport physics in the model. The future JET-ILW deuterium campaign with increased heating power is predicted to reach plasma energy of 11 MJ, which would correspond to 11-13 MW of fusion power in equivalent deuterium-tritium plasma but with isotope effects on pedestal stability and core transport ignored.

  12. Scalable Automated Model Search

    DTIC Science & Technology

    2014-05-20

    …machines. Categories and Subject Descriptors: Big Data [Distributed Computing]: large-scale optimization. 1. INTRODUCTION: Modern scientific and … from Continuum Analytics [1], and Apache Spark 0.8.1. Additionally, we made use of Hadoop 1.0.4 configured on local disks as our data store for the large… [citation fragments from the indexed excerpt: Borkar et al., "Hyracks: A flexible and extensible foundation for data-intensive computing," ICDE, 2011; [16] J. Canny and H. Zhao, "Big data…"]

  13. Where Does Water Go During Hydraulic Fracturing?

    PubMed

    O'Malley, D; Karra, S; Currier, R P; Makedonska, N; Hyman, J D; Viswanathan, H S

    2016-07-01

    During hydraulic fracturing, millions of gallons of water are typically injected at high pressure into deep shale formations. This water can be housed in fractures and within the shale matrix, and can potentially migrate beyond the shale formation via fractures and/or faults, raising environmental concerns. We describe a generic framework for estimating the volume available in fractures and undamaged shale matrix where water injected during hydraulic fracturing could reside, and apply it to a representative site that incorporates available field data. The amount of water that can be stored in the fractures is estimated by calculating the volume of all the fractures in a discrete fracture network (DFN) based on real data, and by using probability theory to estimate the volume of smaller fractures that fall below the lower cutoff for fracture radius in the DFN. The amount of water stored in the matrix is estimated using two distinct methods: one using a two-phase model at the pore scale and the other using a single-phase model at the continuum scale. Based on these calculations, it appears that most of the water resides in the matrix, with a lesser amount in the fractures. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
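
    The probability-theory step for sub-cutoff fractures can be sketched as an integral over an assumed power-law radius distribution; the exponent, cutoffs, and aperture model below are illustrative assumptions, not the site's values:

        # Sketch of extending DFN fracture volume below the radius cutoff by
        # integrating a power-law radius density n(r) ~ C * r**-a (exponent,
        # cutoffs and aperture model are all illustrative assumptions).
        from scipy.integrate import quad
        import numpy as np

        a, C = 2.5, 1e4            # power-law exponent and density constant
        aperture = lambda r: 1e-4 * np.sqrt(r)   # aperture grows with radius

        def vol_density(r):        # volume of a penny-shaped fracture, radius r
            return C * r ** -a * np.pi * r ** 2 * aperture(r)

        v_dfn, _ = quad(vol_density, 1.0, 100.0)    # explicitly meshed fractures
        v_small, _ = quad(vol_density, 0.01, 1.0)   # below the DFN cutoff
        print("small fractures add", 100 * v_small / v_dfn, "% more volume")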

  14. Random access in large-scale DNA data storage.

    PubMed

    Organick, Lee; Ang, Siena Dumas; Chen, Yuan-Jyue; Lopez, Randolph; Yekhanin, Sergey; Makarychev, Konstantin; Racz, Miklos Z; Kamath, Govinda; Gopalan, Parikshit; Nguyen, Bichlien; Takahashi, Christopher N; Newman, Sharon; Parker, Hsing-Yeh; Rashtchian, Cyrus; Stewart, Kendall; Gupta, Gagan; Carlson, Robert; Mulligan, John; Carmean, Douglas; Seelig, Georg; Ceze, Luis; Strauss, Karin

    2018-03-01

    Synthetic DNA is durable and can encode digital data with high density, making it an attractive medium for data storage. However, recovering stored data at large scale currently requires all the DNA in a pool to be sequenced, even if only a subset of the information needs to be extracted. Here, we encode and store 35 distinct files (over 200 MB of data) in more than 13 million DNA oligonucleotides, and show that we can recover each file individually and with no errors, using a random access approach. We design and validate a large library of primers that enable individual recovery of all files stored within the DNA. We also develop an algorithm that greatly reduces the sequencing read coverage required for error-free decoding by maximizing information from all sequence reads. These advances demonstrate a viable, large-scale system for DNA data storage and retrieval.
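
    For intuition about the density claim, here is a toy two-bits-per-nucleotide codec (real systems like the one described additionally use error correction, addressing primers, and sequence constraints):

        # Toy illustration of density: two bits per nucleotide (real systems
        # add error-correcting codes, primers and run-length limits).
        B2N = {"00": "A", "01": "C", "10": "G", "11": "T"}
        N2B = {v: k for k, v in B2N.items()}

        def encode(data: bytes) -> str:
            bits = "".join(f"{byte:08b}" for byte in data)
            return "".join(B2N[bits[i:i+2]] for i in range(0, len(bits), 2))

        def decode(strand: str) -> bytes:
            bits = "".join(N2B[n] for n in strand)
            return bytes(int(bits[i:i+8], 2) for i in range(0, len(bits), 8))

        strand = encode(b"hello")
        assert decode(strand) == b"hello"
        print(strand)    # 4 bases per byte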

  15. Model Store Curriculum. A Developmental Model for North Dakota Schools. Final Report. Research Series No. 13.

    ERIC Educational Resources Information Center

    Goschen, Todd; Warcup, Dennis

    The final report evaluates the activities of the first nine weeks of a project designed to develop a curriculum guide for a school-model store at a North Dakota high school. The program combines the favorable aspects of both the school store and the model store, providing "live" experiences as well as simulated ones. The Distributive…

  16. Scaling Issues Between Plot and Satellite Radiobrightness Observations of Arctic Tundra

    NASA Technical Reports Server (NTRS)

    Kim, Edward J.; England, Anthony W.; Judge, Jasmeet; Zukor, Dorothy J. (Technical Monitor)

    2000-01-01

    Data from a new generation of satellite microwave radiometers will allow the detection of seasonal to decadal changes in the arctic hydrologic cycle as expressed in temporal and spatial patterns of moisture stored in soil and snow. This new capability will require calibrated Land Surface Process/Radiobrightness (LSP/R) models for the principal terrains found in the circumpolar Arctic. These LSP/R models can then be used in weak-constraint dimensional data assimilation (DDA) of the daily satellite observations to estimate temperature and moisture profiles within the permafrost active layer.

  17. Geospatial Data as a Service: Towards planetary scale real-time analytics

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Larraondo, P. R.; Antony, J.; Richards, C. J.

    2017-12-01

    The rapid growth of earth systems, environmental and geophysical datasets poses a challenge to both end-users and infrastructure providers. For infrastructure and data providers, tasks like managing, indexing and storing large collections of geospatial data need to take into consideration the various use cases by which consumers will want to access and use the data. Considerable investment has been made by the Earth Science community to produce suitable real-time analytics platforms for geospatial data. Different interfaces have been defined to provide data services; unfortunately, there are considerable differences among the standards, protocols and data models, which have been designed to target specific communities or working groups. The Australian National University's National Computational Infrastructure (NCI) is used for a wide range of activities in the geospatial community. Earth observations, climate and weather forecasting are examples of these activities, which generate large amounts of geospatial data. The NCI has invested significant effort in developing a data and services model that enables the cross-disciplinary use of data. Recent developments in cloud and distributed computing provide a publicly accessible platform on which new infrastructures can be built. One of the key capabilities these technologies offer is the possibility of having "limitless" compute power next to where the data is stored. This model is rapidly transforming data delivery from centralised monolithic services towards ubiquitous distributed services that scale up and down, adapting to fluctuations in demand. NCI has developed GSKY, a scalable, distributed server which presents a new approach to geospatial data discovery and delivery based on OGC standards. We will present the architecture and motivating use-cases that drove GSKY's collaborative design, development and production deployment. We show that our approach offers the community valuable exploratory analysis capabilities for dealing with petabyte-scale geospatial data collections.

  18. Is fracture a bigger problem for smaller animals? Force and fracture scaling for a simple model of cutting, puncture and crushing

    PubMed Central

    Choi, Seunghee; Coon, Joshua J.; Goggans, Matthew Scott; Kreisman, Thomas F.; Silver, Daniel M.; Nesson, Michael H.

    2016-01-01

    Many of the materials that are challenging for large animals to cut or puncture are also cut and punctured by much smaller organisms that are limited to much smaller forces. Small organisms can overcome their force limitations by using sharper tools, but one drawback may be an increased susceptibility to fracture. We use simple contact mechanics models to estimate how much smaller the diameter of the tips or edges of tools such as teeth, claws and cutting blades must be in smaller organisms in order for them to puncture or cut the same materials as larger organisms. In order to produce the same maximum stress when maximum force scales as the square of body length, the diameter of the tool region that is in contact with the target material must scale isometrically for punch-like tools (e.g. scorpion stings) on thick targets, and for crushing tools (e.g. molars). For punch-like tools on thin targets, and for cutting blades on thick targets, the tip or edge diameters must be even smaller than expected from isometry in smaller animals. The diameters of a small sample of unworn punch-like tools from a large range of animal sizes are consistent with the model, scaling isometrically or more steeply (positively allometric). In addition, we find that the force required to puncture a thin target using real biological tools scales linearly with tip diameter, as predicted by the model. We argue that, for smaller tools, the minimum energy to fracture the tool will be a greater fraction of the minimum energy required to puncture the target, making fracture more likely. Finally, energy stored in tool bending, relative to the energy to fracture the tool, increases rapidly with the aspect ratio (length/width), and we expect that smaller organisms often have to employ higher aspect ratio tools in order to puncture or cut to the required depth with available force. The extra stored energy in higher aspect ratio tools is likely to increase the probability of fracture. We discuss some of the implications of the suggested scaling rules and possible adaptations to compensate for fracture sensitivity in smaller organisms. PMID:27274804
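    The punch-tool argument can be written as a short worked equation (a sketch using sigma_max for peak contact stress, F for available force, d for tip diameter, and L for body length):

```latex
% Peak contact stress under a punch-like tool of tip diameter d
\sigma_{\max} \sim \frac{F}{\pi d^{2}/4}
% Muscle-limited force scales with cross-sectional area: F \propto L^{2}.
% Holding \sigma_{\max} constant across body sizes therefore requires
d \propto \sqrt{F} \propto L \qquad \text{(isometric tip scaling)}
```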

  19. Assessment of CO2 Storage Potential in Naturally Fractured Reservoirs With Dual-Porosity Models

    NASA Astrophysics Data System (ADS)

    March, Rafael; Doster, Florian; Geiger, Sebastian

    2018-03-01

    Naturally fractured reservoirs (NFRs) have received little attention as potential CO2 storage sites. Two main facts deter storage projects in fractured reservoirs: (1) CO2 tends to be nonwetting in target formations, so capillary forces will keep CO2 in the fractures, which typically have low pore volume; and (2) the high conductivity of the fractures may lead to increased spatial spreading of the CO2 plume. Numerical simulations are a powerful tool for understanding the physics behind brine-CO2 flow in NFRs. Dual-porosity models are typically used to simulate multiphase flow in fractured formations. However, existing dual-porosity models are based on crude approximations of the matrix-fracture fluid transfer processes and often fail to capture the dynamics of fluid exchange accurately. Therefore, more accurate transfer functions are needed in order to evaluate the CO2 transfer to the matrix. This work presents an assessment of CO2 storage potential in NFRs using dual-porosity models. We investigate the impact of a system of fractures on storage in a saline aquifer by analyzing the time scales of brine drainage by CO2 in the matrix blocks and the maximum amount of CO2 that can be stored in the rock matrix. A new model to estimate drainage time scales is developed and used in a transfer function for dual-porosity simulations. We then analyze how injection rates should be limited in order to avoid early spill of CO2 (loss of control of the plume) in a conceptual anticline model. Numerical simulations on the anticline show that naturally fractured reservoirs may be used to store CO2.
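    The role of a transfer function can be illustrated with a first-order relaxation of matrix saturation toward equilibrium on a drainage time scale. A minimal Python sketch; the rate form and all parameter values are illustrative assumptions, not the paper's calibrated model:

```python
import numpy as np

def simulate_matrix_drainage(s0, s_eq, tau, t_end, dt=0.1):
    """First-order dual-porosity transfer sketch: matrix CO2 saturation s
    relaxes toward an equilibrium value s_eq on a drainage time scale tau
    (both hypothetical inputs here)."""
    times = np.arange(0.0, t_end, dt)
    s = np.empty_like(times)
    s[0] = s0
    for i in range(1, len(times)):
        # explicit Euler step of ds/dt = (s_eq - s) / tau
        s[i] = s[i - 1] + dt * (s_eq - s[i - 1]) / tau
    return times, s

times, sat = simulate_matrix_drainage(s0=0.0, s_eq=0.6, tau=50.0, t_end=500.0)
print(f"matrix saturation after {times[-1]:.0f} time units: {sat[-1]:.3f}")
```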

  20. The inclusion of ocean-current effects in a tidal-current model as forcing in the convection term and its application to the mesoscale fate of CO2 seeping from the seafloor

    NASA Astrophysics Data System (ADS)

    Sakaizawa, Ryosuke; Kawai, Takaya; Sato, Toru; Oyama, Hiroyuki; Tsumune, Daisuke; Tsubono, Takaki; Goto, Koichi

    2018-03-01

    The target seas of tidal-current models are usually semi-closed bays, minimally affected by ocean currents. For these models, tidal currents are simulated in computational domains with a spatial scale of a couple hundred kilometers or less, by setting tidal elevations at their open boundaries. However, when ocean currents cannot be ignored in the sea areas of interest, such as in open seas near coastlines, it is necessary to include ocean-current effects in these tidal-current models. In this study, we developed a numerical method to analyze tidal currents near coasts by incorporating pre-calculated ocean-current velocities. First, a large regional-scale simulation with a spatial scale of several thousand kilometers was conducted and temporal changes in the ocean-current velocity at each grid point were stored. Next, the spatially and temporally interpolated ocean-current velocity was incorporated as forcing into the cross terms of the convection term of a tidal-current model having computational domains with spatial scales of hundreds of kilometers or less. Then, we applied this method to the diffusion of dissolved CO2 in a sea area off Tomakomai, Japan, and compared the numerical results and measurements to validate the proposed method.

  1. Evaluating, predicting and mapping belowground carbon stores in Kenyan mangroves.

    PubMed

    Gress, Selena K; Huxham, Mark; Kairo, James G; Mugi, Lilian M; Briers, Robert A

    2017-01-01

    Despite covering only approximately 138,000 km2, mangroves are globally important carbon sinks, with carbon density values three to four times that of terrestrial forests. A key challenge in evaluating the carbon benefits from mangrove forest conservation is the lack of rigorous spatially resolved estimates of mangrove sediment carbon stocks; most mangrove carbon is stored belowground. Previous work has focused on detailed estimations of carbon stores over relatively small areas, which has obvious limitations in terms of generality and scope of application. Most studies have focused only on quantifying the top 1 m of belowground carbon (BGC). Carbon stored at depths beyond 1 m, and the effects of mangrove species, location and environmental context on these stores, are poorly studied. This study investigated these variables at two sites (Gazi and Vanga in the south of Kenya) and used the data to produce a country-specific BGC predictive model for Kenya and to map BGC store estimates throughout Kenya at spatial scales relevant for climate change research, forest management and REDD+ (reduced emissions from deforestation and degradation). The results revealed that mangrove species was the most reliable predictor of BGC; Rhizophora mucronata had the highest mean BGC with 1485.5 t C ha-1. Applying the species-based predictive model to a base map of species distribution in Kenya for the year 2010 at 2.5 m2 resolution produced an estimate of 69.41 Mt C [±9.15, 95% confidence interval (C.I.)] for BGC in Kenyan mangroves. When applied to a 1992 mangrove distribution map, the BGC estimate was 75.65 Mt C (±12.21, 95% C.I.), an 8.3% loss in BGC stores between 1992 and 2010 in Kenya. The country-level mangrove map provides a valuable tool for assessing carbon stocks and visualizing the distribution of BGC. Estimates at the 2.5 m2 resolution provide sufficient detail for highlighting and prioritizing areas for mangrove conservation and restoration. © 2016 John Wiley & Sons Ltd.

  2. Semantic Representation and Scale-Up of Integrated Air Traffic Management Data

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Ranjan, Shubha; Wei, Mei Y.; Eshow, Michelle M.

    2016-01-01

    Each day, the global air transportation industry generates a vast amount of heterogeneous data from air carriers, air traffic control providers, and secondary aviation entities handling baggage, ticketing, catering, fuel delivery, and other services. Generally, these data are stored in isolated data systems, separated from each other by significant political, regulatory, economic, and technological divides. These realities aside, integrating aviation data into a single, queryable, big data store could enable insights leading to major efficiency, safety, and cost advantages. In this paper, we describe an implemented system for combining heterogeneous air traffic management data using semantic integration techniques. The system transforms data from its original disparate source formats into a unified semantic representation within an ontology-based triple store. Our initial prototype stores only a small sliver of air traffic data covering one day of operations at a major airport. The paper also describes our analysis of difficulties ahead as we prepare to scale up data storage to accommodate successively larger quantities of data -- eventually covering all US commercial domestic flights over an extended multi-year timeframe. We review several approaches to mitigating scale-up related query performance concerns.
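    The ontology-based triple store approach can be sketched with rdflib; the namespace, class, and property names below are illustrative placeholders, not the system's actual vocabulary:

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

# Hypothetical mini-ontology for flight records
ATM = Namespace("http://example.org/atm#")

g = Graph()
flight = URIRef("http://example.org/atm/flight/UA123-20160101")
g.add((flight, RDF.type, ATM.Flight))
g.add((flight, ATM.carrier, Literal("UA")))
g.add((flight, ATM.departureTime,
       Literal("2016-01-01T08:15:00", datatype=XSD.dateTime)))

# SPARQL query over the unified semantic representation
results = g.query("""
    PREFIX atm: <http://example.org/atm#>
    SELECT ?f WHERE { ?f a atm:Flight ; atm:carrier "UA" . }
""")
for row in results:
    print(row.f)
```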

  3. Geographic scale matters in detecting the relationship between neighbourhood food environments and obesity risk: an analysis of driver license records in Salt Lake County, Utah.

    PubMed

    Fan, Jessie X; Hanson, Heidi A; Zick, Cathleen D; Brown, Barbara B; Kowaleski-Jones, Lori; Smith, Ken R

    2014-08-19

    Empirical studies of the association between neighbourhood food environments and individual obesity risk have found mixed results. One possible cause of these mixed findings is the variation in the neighbourhood geographic scale used. The purpose of this paper was to examine how various neighbourhood geographic scales affected the estimated relationship between food environments and obesity risk. The study was a cross-sectional secondary data analysis, set in Salt Lake County, Utah, USA, of 403,305 Salt Lake County adults aged 25-64 in the Utah driver license database between 1995 and 2008. Utah driver license data were geo-linked to 2000 US Census data and Dun & Bradstreet business data. Food outlets were classified into the categories of large grocery stores, convenience stores, limited-service restaurants and full-service restaurants, and measured at four neighbourhood geographic scales: Census block group, Census tract, ZIP code and a 1 km buffer around the resident's house. These measures were regressed on individual obesity status using multilevel random-intercept regressions. The food environment was important for obesity, but the scale of the relevant neighbourhood differed for different types of outlets: large grocery stores were not significant at any of the four geographic scales, limited-service restaurants were significant at the medium-to-large scales (Census tract or larger), and convenience stores and full-service restaurants at the smallest scales (Census tract or smaller). The choice of neighbourhood geographic scale can affect the estimated significance of the association between neighbourhood food environments and individual obesity risk. However, variations in geographic scale alone do not explain the mixed findings in the literature. If researchers are constrained to use one geographic scale with multiple categories of food outlets, using the Census tract or a 1 km buffer as the neighbourhood geographic unit is likely to allow researchers to detect most significant relationships. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
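    A multilevel random-intercept regression of this kind can be sketched with statsmodels. Note that mixedlm fits a linear (not logistic) mixed model, so this treats obesity status as a linear-probability outcome; the file and column names are assumptions for illustration:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical columns: 'obese' (0/1), per-scale food-outlet counts,
# demographic controls, and 'tract' as the grouping unit.
df = pd.read_csv("slc_adults.csv")  # illustrative file name

model = smf.mixedlm(
    "obese ~ grocery + convenience + limited_svc + full_svc + age + female",
    data=df,
    groups=df["tract"],  # random intercept per Census tract
)
fit = model.fit()
print(fit.summary())
```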

  4. The Convergence of High Performance Computing and Large Scale Data Analytics

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.

    2015-12-01

    As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets are stored in this system in a write once/read many file system, such as Landsat, MODIS, MERRA, and NGA. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS) enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.
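    The spatiotemporal indexing step can be sketched as a small relational table mapping (variable, time, bounding box) to data locations; the schema, names, and values below are assumptions for illustration, not the NCCS design:

```python
import sqlite3

# Relational index from query coordinates to HDFS chunk locations
con = sqlite3.connect("st_index.db")
con.execute("""
    CREATE TABLE IF NOT EXISTS chunk_index (
        variable TEXT, t_start TEXT, t_end TEXT,
        lat_min REAL, lat_max REAL, lon_min REAL, lon_max REAL,
        hdfs_path TEXT, byte_offset INTEGER, byte_length INTEGER
    )
""")
con.execute("CREATE INDEX IF NOT EXISTS idx_var_time "
            "ON chunk_index (variable, t_start, t_end)")

# Fast lookup: which NetCDF chunks cover this variable, date, and point?
rows = con.execute(
    """SELECT hdfs_path, byte_offset, byte_length FROM chunk_index
       WHERE variable = ? AND t_start <= ? AND t_end >= ?
         AND lat_min <= ? AND lat_max >= ?
         AND lon_min <= ? AND lon_max >= ?""",
    ("MERRA_T2M", "2015-07-01", "2015-07-01", 40.0, 40.0, -105.0, -105.0),
).fetchall()
print(rows)
```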

  5. NDEx - the Network Data Exchange, A Network Commons for Biologists | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Network models of biology, whether curated or derived from large-scale data analysis, are critical tools in the understanding of cancer mechanisms and in the design and personalization of therapies. The NDEx Project (Network Data Exchange) will create, deploy, and maintain an open-source, web-based software platform and public website to enable scientists, organizations, and software applications to share, store, manipulate, and publish biological networks.

  6. Twisted magnetosphere with quadrupolar fields in the exterior of a neutron star

    NASA Astrophysics Data System (ADS)

    Kojima, Yasufumi

    2018-04-01

    The magnetar magnetosphere is gradually twisted by shearing from footpoint motion, and stored magnetic energy also increases at the same time. When a state exceeds a threshold, flares/outbursts manifest themselves as a result of a catastrophic transition. Axisymmetric static solutions for a relativistic force-free magnetosphere with dipole-quadrupole mixed fields at the surface have been calculated. The quadrupole component represents a kind of magnetic-field irregularity at a small scale. Locally twisted models are constructed by limiting current flow regions, where the small part originates from a dipole-quadrupole mixture. The energy along a sequence of equilibria increases and becomes sufficient to open the magnetic field in some models. In energetically metastable states, a magnetic flux rope is formed in the vicinity of the star. The excess energy may be ejected as a magnetar flare/outburst. The general relativistic gravity is sufficient to confine the flux rope and to store huge magnetic energy, and the mechanism is also discussed.

  7. Summary of Low-Lift Drag and Directional Stability Data from Rocket Models of the Douglas XF4D-1 Airplane with and without External Stores and Rocket Packets at Mach Numbers from 0.8 to 1.38 TED No. NACA DE-349

    NASA Technical Reports Server (NTRS)

    Mitcham, Grady L.; Blanchard, Willard S.; Hastings, Earl C., Jr.

    1952-01-01

    At the request of the Bureau of Aeronautics, Department of the Navy, an investigation at transonic and low supersonic speeds of the drag and longitudinal trim characteristics of the Douglas XF4D-1 airplane is being conducted by the Langley Pilotless Aircraft Research Division. The Douglas XF4D-1 is a jet-propelled, low-aspect-ratio, swept-wing, tailless, interceptor-type airplane designed to fly at low supersonic speeds. As a part of this investigation, flight tests were made using rocket-propelled 1/10-scale models to determine the effect of the addition of external stores and rocket packets on the drag at low lift coefficients. In addition to these data, some qualitative values of the directional stability parameter C(sub n beta) and duct total-pressure recovery are also presented.

  8. Twisted magnetosphere with quadrupolar fields in the exterior of a neutron star

    NASA Astrophysics Data System (ADS)

    Kojima, Yasufumi

    2018-07-01

    The magnetar magnetosphere is gradually twisted by shearing from footpoint motion, and stored magnetic energy also increases at the same time. When a state exceeds a threshold, flares/outbursts manifest themselves as a result of a catastrophic transition. Axisymmetric static solutions for a relativistic force-free magnetosphere with dipole-quadrupole mixed fields at the surface have been calculated. The quadrupole component represents a kind of magnetic-field irregularity at a small scale. Locally twisted models are constructed by limiting current flow regions, where the small part originates from a dipole-quadrupole mixture. The energy along a sequence of equilibria increases and becomes sufficient to open the magnetic field in some models. In energetically metastable states, a magnetic flux rope is formed in the vicinity of the star. The excess energy may be ejected as a magnetar flare/outburst. The general relativistic gravity is sufficient to confine the flux rope and to store huge magnetic energy, and the mechanism is also discussed.

  9. Short-Term Retrospective Land Data Assimilation Schemes

    NASA Technical Reports Server (NTRS)

    Houser, P. R.; Cosgrove, B. A.; Entin, J. K.; Lettenmaier, D.; ODonnell, G.; Mitchell, K.; Marshall, C.; Lohmann, D.; Schaake, J. C.; Duan, Q.; hide

    2000-01-01

    Subsurface moisture, temperature, and snow/ice stores exhibit persistence on various time scales, which has important implications for the extended prediction of climatic and hydrologic extremes. Hence, to improve their specification of the land surface, many numerical weather prediction (NWP) centers have incorporated complex land surface schemes in their forecast models. However, because land storages are integrated states, errors in NWP forcing accumulate in these stores, which leads to incorrect surface water and energy partitioning. This has motivated the development of Land Data Assimilation Schemes (LDAS) that can be used to constrain NWP surface storages. An LDAS is an uncoupled land surface scheme that is forced primarily by observations, and is therefore less affected by NWP forcing biases. The implementation of an LDAS also provides the opportunity to correct the model's trajectory using remotely sensed observations of soil temperature, soil moisture, and snow through data assimilation methods. The inclusion of data assimilation in LDAS will greatly increase its predictive capacity, as well as provide high-quality land surface assimilated data.

  10. High-resolution mapping and spatial variability of soil organic carbon storage of permafrost-affected soils

    NASA Astrophysics Data System (ADS)

    Siewert, Matthias; Hugelius, Gustaf

    2017-04-01

    Permafrost-affected soils store large amounts of soil organic carbon (SOC). Mapping of this SOC provides a first-order spatial input variable for research that relates carbon stored in permafrost regions to carbon cycle dynamics. High-resolution satellite imagery is becoming increasingly available even in circum-polar regions. The presented research highlights findings of high-resolution SOC mapping efforts from five study areas in the northern circum-polar permafrost region, located in Siberia (Kytalyk, Spasskaya Pad/Neleger, Lena delta), northern Sweden (Abisko) and northwestern Canada (Herschel Island). Our high-spatial-resolution analyses show that geomorphology has a strong influence on the distribution of SOC, organized at different spatial scales. Periglacial landforms and processes, such as non-sorted circles and ice-wedge polygons of different ages and scales, dictate local-scale SOC distribution through patterned ground. Palsas and peat plateaus form and can cover larger areas in sub-Arctic environments. Study areas that have not been affected by Pleistocene glaciation feature ice-rich Yedoma sediments that dominate the local relief through thermokarst formation and create landscape-scale macro-environments that dictate the distribution of SOC. A general trend indicates higher SOC storage in Arctic tundra soils compared to forested boreal or sub-Arctic taiga soils. Yet, due to the shallower active-layer depth in the Arctic, much of the SOC may be permanently frozen and thus not available to ecosystem processes. Significantly more SOC is stored in soils than in vegetation, indicating that vegetation growth and incorporation of carbon into plant phytomass alone will not be able to offset SOC released from permafrost. This contribution also addresses advances in thematic mapping methods and digital soil mapping of SOC in permafrost terrain. In particular, machine-learning methods such as support vector machines, artificial neural networks and random forests show promising results as a toolbox for mapping permafrost-affected soils. Yet these new methods do not decrease our dependence on soil pedon data from the field; on the contrary, such data represent an urgent research priority. Statistical analyses are provided as an indication of best practice for soil pedon sampling for the quantification and model representation of SOC stored in permafrost-affected soils.
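    As one example of the machine-learning toolbox mentioned above, a random-forest SOC mapping workflow can be sketched as follows; the covariates and data are synthetic placeholders, not the study's inputs:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Hypothetical training set: SOC at soil pedon sites (kg C m-2) with
# remote-sensing covariates (e.g., NDVI, elevation, land-cover index).
X_pedons = rng.random((200, 3))
y_soc = 20.0 + 30.0 * X_pedons[:, 0] - 10.0 * X_pedons[:, 1] \
        + rng.normal(0.0, 2.0, 200)

rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X_pedons, y_soc)
print(f"out-of-bag R^2: {rf.oob_score_:.2f}")

# Predict SOC for every pixel's covariate vector to produce the map
X_pixels = rng.random((10_000, 3))
soc_map = rf.predict(X_pixels)
```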

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soltanian, Mohamad Reza; Sun, Alexander; Dai, Zhenxue

    Yucca Mountain, Nevada, had been extensively investigated as a potential deep geologic repository for storing high-level nuclear waste. Previous field investigations of the stratified alluvial aquifer downstream of the site revealed a hierarchy of sedimentary facies types, with corresponding log-conductivity and reactive-surface-area subpopulations within each facies at each scale of sedimentary architecture. In this paper, we use a Lagrangian-based transport model to analyze radionuclide dispersion in the saturated alluvium of Fortymile Wash, Nevada. First, we validate the Lagrangian model using high-resolution flow and reactive transport simulations. Then, we use the validated model to investigate how each scale of sedimentary architecture may affect long-term radionuclide transport at Yucca Mountain. Results show that the reactive solute dispersion developed by the Lagrangian model matches the ensemble average of the numerical simulations well. The link between the alluvium's spatial variability and reactive solute dispersion at different spatiotemporal scales is demonstrated using the Lagrangian model. Finally, the longitudinal dispersivity of the reactive plume can be on the order of hundreds to thousands of meters, and it may not reach its asymptotic value even after 10,000 years of travel time and 2-3 km of travel distance.
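    The Lagrangian (particle-tracking) idea can be sketched in a toy one-dimensional form; the velocity, dispersivity, and retardation values below are illustrative, not the paper's calibrated parameters:

```python
import numpy as np

rng = np.random.default_rng(42)

def track_particles(n, v_mean, alpha_l, retardation, t_end, dt):
    """Toy 1-D Lagrangian random walk: advection at v_mean/retardation
    (linear-sorption retardation of a reactive solute) plus a dispersive
    step with longitudinal dispersivity alpha_l."""
    x = np.zeros(n)
    v = v_mean / retardation        # retarded advection velocity, m/yr
    d_l = alpha_l * v               # dispersion coefficient D = alpha * v
    for _ in range(int(t_end / dt)):
        x += v * dt + rng.normal(0.0, np.sqrt(2.0 * d_l * dt), size=n)
    return x

plume = track_particles(n=10_000, v_mean=5.0, alpha_l=10.0,
                        retardation=4.0, t_end=1_000.0, dt=1.0)
print(f"mean displacement: {plume.mean():.0f} m, spread (std): {plume.std():.0f} m")
```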

  12. Surface Freshwater Storage and Variability in the Amazon Basin from Multi-Satellite Observations, 1993-2007

    NASA Technical Reports Server (NTRS)

    Papa, Fabrice; Frappart, Frederic; Guntner, Andreas; Prigent, Catherine; Aires, Filipe; Getirana, Augusto; Maurer, Raffael

    2013-01-01

    The amount of water stored in and moving through the surface water bodies of large river basins (rivers, floodplains, wetlands) plays a major role in the global water and biochemical cycles and is a critical parameter for water resources management. However, the spatio-temporal variations of these freshwater reservoirs are still widely unknown at the global scale. Here, we propose a hypsographic curve approach to estimate surface freshwater storage variations over the Amazon basin, combining surface water extent from a multi-satellite technique with topographic data from the Global Digital Elevation Model (GDEM) of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER). Monthly surface water storage variations for 1993-2007 are presented, showing a strong seasonal and interannual variability, and are evaluated against in situ river discharge and precipitation. The basin-scale mean annual amplitude of approx. 1200 cu km is in the range of previous estimates and contributes about half of the Gravity Recovery and Climate Experiment (GRACE) total water storage variations. For the first time, we map the surface water volume anomaly during the extreme droughts of 1997 (October-November) and 2005 (September-October) and find that during these dry events the water stored in the rivers and floodplains of the Amazon basin was, respectively, approx. 230 (approx. 40%) and 210 (approx. 50%) cu km below the 1993-2007 average. This new 15-year data set of surface water volume represents an unprecedented source of information for future hydrological or climate modeling of the Amazon. It is also a first step toward the development of such a database at the global scale.
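    The hypsographic-curve approach amounts to converting a flooded-area fraction into a stored volume through a cell's cumulative area-elevation curve. A minimal sketch of that conversion, with all inputs illustrative rather than the paper's processing chain:

```python
import numpy as np

def storage_volume(extent_frac, hypso_elev, hypso_area_frac, cell_area_km2):
    """Given the flooded fraction of a cell (from the multi-satellite water
    extent) and the cell's cumulative area-elevation (hypsographic) curve
    from the ASTER GDEM, integrate water depth over the flooded part to
    estimate stored volume in km3. Inputs are illustrative arrays."""
    # water level = elevation at which cumulative area equals the extent
    level_m = np.interp(extent_frac, hypso_area_frac, hypso_elev)
    flooded = hypso_area_frac <= extent_frac
    depth_m = level_m - hypso_elev[flooded]
    d_area_km2 = np.gradient(hypso_area_frac)[flooded] * cell_area_km2
    return np.sum(depth_m * 1e-3 * d_area_km2)   # m -> km, volume in km3

elev = np.linspace(50.0, 60.0, 101)              # toy hypsographic curve
area_frac = np.linspace(0.0, 1.0, 101)
print(f"{storage_volume(0.4, elev, area_frac, 100.0):.3f} km3 stored")
```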

  13. Toward uniform implementation of parametric map Digital Imaging and Communication in Medicine standard in multisite quantitative diffusion imaging studies.

    PubMed

    Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C

    2018-01-01

    This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.

  14. Comparative calibration of IP scanning equipment

    NASA Astrophysics Data System (ADS)

    Ingenito, F.; Andreoli, P.; Batani, D.; Boutoux, G.; Cipriani, M.; Consoli, F.; Cristofari, G.; Curcio, A.; De Angelis, R.; Di Giorgio, G.; Ducret, J.; Forestier-Colleoni, P.; Hulin, S.; Jakubowska, K.; Rabhi, N.

    2016-05-01

    Imaging plates (IPs) are diagnostic devices that contain a photostimulable phosphor layer that stores the incident radiation dose as a latent image. The image is read with a scanner that stimulates the decay of electrons, previously excited by the incident radiation, by exposure to a laser beam. This results in emitted light, which is detected by photomultiplier tubes, and so the latent image is reconstructed. IPs have the useful feature that they can be reused many times after the stored information is erased. Algorithms to convert the signals stored in the detector to photostimulated luminescence (PSL) counts depend on the scanner and are not available for every model. A comparative cross-calibration of the Dürr CR35 BIO IP scanner used in the ABC laboratory was performed, using the Fujifilm FLA 7000 scanner as a reference, to find the equivalence between the grey-scale values given by the Dürr scanner and PSL counts. Using an IP and a 55Fe β-source, we produced pairs of samples with the same exposure times, which were analysed with both scanners, paying particular attention to the fading times of the image stored on the IPs. Data analysis led us to determine a conversion formula that can be used to compare experimental data obtained in different laboratories and to use IP calibrations that were, until now, available only for Fujifilm scanners.
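    For reference, the grey-level-to-PSL relation commonly cited for Fujifilm scanners has the form sketched below; a cross-calibration like the one described amounts to mapping the Dürr scanner's grey scale onto this PSL scale. The scanner settings used here (resolution, sensitivity, latitude, bit depth) are example values:

```python
def grey_to_psl(ql, res_um=25.0, sensitivity=4000.0, latitude=5.0, bits=16):
    """Commonly cited Fujifilm relation between a scanner grey level QL and
    photostimulated luminescence (PSL):
        PSL = (R/100)^2 * (4000/S) * 10^(L * (QL/G - 1/2)),
    with R the scan resolution in microns, S the sensitivity, L the
    latitude, and G = 2^bits - 1 the maximum grey level."""
    g_max = 2 ** bits - 1
    return ((res_um / 100.0) ** 2
            * (4000.0 / sensitivity)
            * 10.0 ** (latitude * (ql / g_max - 0.5)))

print(f"QL = 40000 -> PSL = {grey_to_psl(40000):.3f}")
```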

  15. 50 CFR 680.23 - Equipment and operational requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... stored in the scale computer memory is replaced. Scale weights must not be adjusted by the scale operator... for an inspection to NMFS, Alaska Region. An inspection must be requested no less than 10 working days...

  16. Development of a Model of Nitrogen Cycling in Stormwater Control Measures and Application of the Model at the Watershed Scale

    NASA Astrophysics Data System (ADS)

    Bell, C.; Tague, C.; McMillan, S. K.

    2016-12-01

    Stormwater control measures (SCMs) create ecosystems in urban watersheds that store water and promote nitrogen (N) retention and removal. This work used computer modeling at two spatial scales (the individual SCM and the watershed) to quantify how SCMs affect runoff and nitrogen export in urban watersheds. First, routines that simulate the dynamic hydrologic and water quality processes of an individual wet-pond SCM were developed and applied to quantify N processing under different environmental and design scenarios. Results showed that deeper SCMs have greater inorganic N removal efficiencies because they hold a larger stored volume of relatively N-deplete water, and therefore have a greater capacity to dilute relatively N-rich inflow. N removal by the SCM was more sensitive to this design parameter than to variations in air temperature, inflow N concentrations, and inflow volume. Next, these SCM model routines were used to simulate processes in a suburban watershed in Charlotte, NC, with 16 SCMs. The watershed configuration was varied to simulate runoff under different scenarios of impervious-surface connectivity to SCMs, with the goal of developing a simple predictive relationship between watershed condition and N loads. We used unmitigated imperviousness (UI), the percent of the impervious area that is unmitigated by SCMs, to quantify watershed condition. Results showed that as SCM mitigation decreased, or as UI increased from 3% to 15%, runoff ratios and loads of nitrite and total dissolved N increased by 26% (21-32%), 14% (3-26%) and 13% (2-25%), respectively. The relationship between these response variables and UI was linear, which indicates that mitigation of any impervious surfaces will result in proportional reductions. However, the range of UI included in this study is on the low end of urban watersheds, and future work will assess the behavior of this relationship at higher total imperviousness (TI) and UI levels.

  17. Impacts of insect disturbance on the structure, composition, and functioning of oak-pine forests

    NASA Astrophysics Data System (ADS)

    Medvigy, D.; Schafer, K. V.; Clark, K. L.

    2011-12-01

    Episodic disturbance is an essential feature of terrestrial ecosystems, and strongly modulates their structure, composition, and functioning. However, dynamic global vegetation models that are commonly used to make ecosystem and terrestrial carbon budget predictions rarely have an explicit representation of disturbance. One reason why disturbance is seldom included is that it tends to operate on spatial scales much smaller than typical model resolutions. In response to this problem, the Ecosystem Demography model 2 (ED2) was developed as a way of tracking the fine-scale heterogeneity arising from disturbances. In this study, we used ED2 to simulate an oak-pine forest that experiences episodic defoliation by gypsy moth (Lymantria dispar L.). The model was carefully calibrated against site-level data, and then used to simulate changes in ecosystem composition, structure, and functioning on century time scales. Compared to simulations that include gypsy moth defoliation, we show that simulations that ignore defoliation events lead to much larger ecosystem carbon stores and a larger fraction of deciduous trees relative to evergreen trees. Furthermore, we find that it is essential to preserve the fine-scale nature of the disturbance. Attempts to "smooth out" the defoliation event over an entire grid cell led to large biases in ecosystem structure and functioning.

  18. Where Does Wood Most Effectively Enhance Storage? Network-Scale Distribution of Sediment and Organic Matter Stored by Instream Wood

    NASA Astrophysics Data System (ADS)

    Pfeiffer, Andrew; Wohl, Ellen

    2018-01-01

    We used 48 reach-scale measurements of large wood and wood-associated sediment and coarse particulate organic matter (CPOM) storage within an 80 km2 catchment to examine spatial patterns of storage relative to stream order. Wood, sediment, and CPOM are not distributed uniformly across the drainage basin. Third- and fourth-order streams (23% of total stream length) disproportionately store wood and coarse and fine sediments: 55% of total wood volume, 78% of coarse sediment, and 49% of fine sediment, respectively. Fourth-order streams store 0.8 m3 of coarse sediment and 0.2 m3 of fine sediment per cubic meter of wood. CPOM storage is highest in first-order streams (60% of storage in 47% of total network stream length). First-order streams can store up to 0.3 m3 of CPOM for each cubic meter of wood. Logjams in third- and fourth-order reaches are primary sediment storage agents, whereas roots in small streams may be more important for storage of CPOM. We propose the large wood particulate storage index to quantify average volume of sediment or CPOM stored by a cubic meter of wood.

  19. Modelling impulsive factors for electronics and restaurant coupons’ e-store display

    NASA Astrophysics Data System (ADS)

    Ariningsih, P. K.; Nainggolan, M.; Sandy, I. A.

    2018-04-01

    In many cases, an increase in e-store visitors is not followed by an increase in sales. Most purchases through e-commerce are impulsive buys, yet only a small number of studies are available to help understand the impulsive factors of e-store display. This paper proposes a preliminary concept for understanding the impulsive factors in the display of electronics and restaurant-coupon e-stores, two of the more popular product groups sold through e-commerce. Through a literature study and a survey, 31 attributes were identified as impulsive factors in electronics e-store display and 20 attributes as impulsive factors for restaurant-coupon e-stores. The attributes were then grouped into comprehensive impulsive factors by factor analysis, yielding 3 factors for each product group. Accessibility Factors and Trust Factors appeared for both product groups; the remaining factor was Internal Factors for the electronics e-store and Marketing Factors for the restaurant-coupon e-store. A Structural Equation Model of the impulsive factors was developed for each type of e-store, capturing the covariance between Trust Factors and Accessibility Factors. Based on the preliminary model, Internal Factors and Trust Factors influence impulsive buying in electronics e-stores.

  20. Significance of exchanging SSURGO and STATSGO data when modeling hydrology in diverse physiographic terranes

    USGS Publications Warehouse

    Williamson, Tanja N.; Taylor, Charles J.; Newson, Jeremy K.

    2013-01-01

    The Water Availability Tool for Environmental Resources (WATER) is a TOPMODEL-based hydrologic model that depends on spatially accurate soils data to function in diverse terranes. In Kentucky, this includes mountainous regions, karstic plateau, and alluvial plains. Soils data are critical because they quantify the space to store water, as well as how water moves through the soil to the stream during storm events. We compared how the model performs using two different sources of soils data--Soil Survey Geographic Database (SSURGO) and State Soil Geographic Database laboratory data (STATSGO)--for 21 basins ranging in size from 17 to 1564 km2. Model results were consistently better when SSURGO data were used, likely due to the higher field capacity, porosity, and available-water holding capacity, which cause the model to store more soil-water in the landscape and improve streamflow estimates for both low- and high-flow conditions. In addition, there were significant differences in the conductivity multiplier and scaling parameter values that describe how water moves vertically and laterally, respectively, as quantified by TOPMODEL. We also evaluated whether partitioning areas that drain to streams via sinkholes in karstic basins as separate hydrologic modeling units (HMUs) improved model performance. There were significant differences between HMUs in properties that control soil-water storage in the model, although the effect of partitioning these HMUs on streamflow simulation was inconclusive.

  1. Estimating Vegetation Rainfall Interception Using Remote Sensing Observations at Very High Resolution

    NASA Astrophysics Data System (ADS)

    Cui, Y.; Zhao, P.; Hong, Y.; Fan, W.; Yan, B.; Xie, H.

    2017-12-01

    As an important component of evapotranspiration, vegetation rainfall interception is the proportion of gross rainfall that is intercepted, stored and subsequently evaporated from all parts of the vegetation during or following rainfall. Accurately quantifying vegetation rainfall interception at high resolution is critical for rainfall-runoff modeling and flood forecasting, and is also essential for understanding its further impact on local, regional, and even global water-cycle dynamics. In this study, the Remote Sensing-based Gash model (RS-Gash model) is developed from a modified Gash model for interception-loss estimation using remote sensing observations at the regional scale, and has been applied and validated in the upper reach of the Heihe River Basin of China for different types of vegetation. To eliminate scale error and the effect of mixed pixels, the RS-Gash model is applied at a fine scale of 30 m with the high-resolution vegetation area index retrieved using the unified model of the bidirectional reflectance distribution function (BRDF-U) for the vegetation canopy. Field validation shows that the RMSE and R2 of the interception ratio are 3.7% and 0.9, respectively, indicating the model's strong stability and reliability at fine scale. The temporal variation of vegetation rainfall interception loss and its relationship with precipitation are further investigated. In summary, the RS-Gash model has demonstrated its effectiveness and reliability in estimating vegetation rainfall interception. When compared to coarse-resolution results, applying the model at 30-m resolution is necessary to resolve the scaling issues shown in this study. Keywords: rainfall interception; remote sensing; RS-Gash analytical model; high resolution
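    The analytical Gash formulation at the core of an RS-Gash-type model can be sketched per storm as follows; this is a simplified version that omits trunk and stemflow terms, and all parameter values are illustrative:

```python
import numpy as np

def gash_interception(p_gross, canopy_frac, s_canopy, e_bar, r_bar):
    """Per-storm interception loss (same units as p_gross) from a
    simplified sparse-Gash formulation: canopy_frac is fractional canopy
    cover c, s_canopy the canopy storage capacity per unit cover, and
    e_bar/r_bar the mean evaporation and rainfall rates while the canopy
    is saturated."""
    # gross rainfall needed to just saturate the canopy
    p_sat = -(r_bar * s_canopy / e_bar) * np.log(1.0 - e_bar / r_bar)
    p_gross = np.asarray(p_gross, dtype=float)
    i_unsat = canopy_frac * p_gross                  # canopy never saturates
    i_sat = canopy_frac * (p_sat + (e_bar / r_bar) * (p_gross - p_sat))
    return np.where(p_gross < p_sat, i_unsat, i_sat)

storms_mm = [2.0, 8.0, 25.0]
print(gash_interception(storms_mm, canopy_frac=0.7, s_canopy=1.2,
                        e_bar=0.3, r_bar=2.0))
```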

  2. Combined Monte Carlo and path-integral method for simulated library of time-resolved reflectance curves from layered tissue models

    NASA Astrophysics Data System (ADS)

    Wilson, Robert H.; Vishwanath, Karthik; Mycek, Mary-Ann

    2009-02-01

    Monte Carlo (MC) simulations are considered the "gold standard" for mathematical description of photon transport in tissue, but they can require large computation times. Therefore, it is important to develop simple and efficient methods for accelerating MC simulations, especially when a large "library" of related simulations is needed. A semi-analytical method involving MC simulations and a path-integral (PI) based scaling technique generated time-resolved reflectance curves from layered tissue models. First, a zero-absorption MC simulation was run for a tissue model with fixed scattering properties in each layer. Then, a closed-form expression for the average classical path of a photon in tissue was used to determine the percentage of time that the photon spent in each layer, to create a weighted Beer-Lambert factor to scale the time-resolved reflectance of the simulated zero-absorption tissue model. This method is a unique alternative to other scaling techniques in that it does not require the path length or number of collisions of each photon to be stored during the initial simulation. Effects of various layer thicknesses and absorption and scattering coefficients on the accuracy of the method will be discussed.
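    The scaling step described above can be written directly: multiply the zero-absorption curve by a Beer-Lambert factor built from the per-layer time fractions. A minimal sketch with illustrative optical properties and a placeholder in place of a real MC curve:

```python
import numpy as np

def scale_reflectance(t_ns, r_zero_abs, mu_a, layer_frac, n_tissue=1.37):
    """Scale a zero-absorption time-resolved reflectance curve with a
    path-weighted Beer-Lambert factor: total path length at time t is
    v*t, partitioned among layers by the fractions layer_frac (values
    here are illustrative, not from a real path-integral calculation)."""
    c_mm_ns = 299.792458 / n_tissue           # speed of light in tissue, mm/ns
    mu_eff = np.dot(layer_frac, mu_a)         # path-weighted absorption, 1/mm
    return r_zero_abs * np.exp(-mu_eff * c_mm_ns * t_ns)

t = np.linspace(0.01, 2.0, 200)               # ns
r0 = t ** -1.5 * np.exp(-0.5 / t)             # placeholder zero-absorption curve
r = scale_reflectance(t, r0, mu_a=np.array([0.01, 0.03]),
                      layer_frac=np.array([0.6, 0.4]))
```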

  3. Beyond Darcy's law: The role of phase topology and ganglion dynamics for two-fluid flow

    DOE PAGES

    Armstrong, Ryan T.; McClure, James E.; Berrill, Mark A.; ...

    2016-10-27

    Relative permeability quantifies the ease with which immiscible phases flow through porous rock and is one of the most well-known constitutive relationships for petroleum engineers. However, it exhibits troubling dependencies on experimental conditions and is not a unique function of phase saturation, as commonly accepted in industry practice. The problem lies in the multi-scale nature of the system, where underlying disequilibrium processes create anomalous macroscopic behavior. Here we show that relative permeability rate dependencies are explained by ganglion-dynamic flow. We utilize fast X-ray micro-tomography and pore-scale simulations to identify unique flow regimes during the fractional flow of immiscible phases and quantify the contribution of ganglion flux to the overall flux of the non-wetting phase. We anticipate our approach to be the starting point for the development of sophisticated multi-scale flow models that directly link pore-scale parameters to macro-scale behavior. Such models will have a major impact on how we recover hydrocarbons from the subsurface, store sequestered CO2 in geological formations, and remove non-aqueous environmental hazards from the vadose zone.

  4. Microbial Ecology of Stored Swine Manure and Reduction of Emissions Using Condensed Tannins.

    USDA-ARS?s Scientific Manuscript database

    Management practices from large-scale swine production facilities have resulted in the increased collection and storage of manure for off-season fertilization use. Stored swine manure serves as a habitat for billions of microorganisms and is associated with the generation of odorous compounds and g...

  5. Will growing forests make the global warming problem better or worse?

    NASA Astrophysics Data System (ADS)

    Caldeira, K.; Gibbard, S.; Bala, G.; Wickett, M. E.; Phillips, T. J.

    2005-12-01

    Carbon storage in forests has been promoted as a means to slow global warming. However, forests affect climate not only through the carbon cycle; forests also affect both the absorption of solar radiation and evapotranspiration. Previously, it has been shown that boreal forests have the potential to warm the planet, offsetting the benefits of carbon storage in boreal forests (Betts, Nature 408, 187-190, 2000). Here, we show that direct climate effects of forest growth in mid-latitudes also have the potential to offset benefits of carbon storage. This suggests that mid-latitude afforestation projects must be evaluated very carefully, taking direct climate effects into account. In contrast, low-latitude tropical forests appear to cool the planet both by storing carbon and by increasing evapotranspiration; thus, slowing or reversing tropical deforestation is a win/win strategy from both carbon storage and direct climate perspectives. Evaluation of costs and benefits of afforestation depends on the time scales under consideration. On the shortest time scale, each unit of CO2 taken up by a plant is removed from the atmosphere. However, over centuries most of this CO2 taken up from the atmosphere by plants is replaced by outgassing from the ocean. On the longest time scales, atmospheric carbon dioxide content is controlled by the carbonate-silicate cycle, so the amount of carbon stored in a forest is not relevant to long-term climate change. While atmospheric CO2 impacts of afforestation diminish over time, the direct effects on climate (and silicate weathering) persist, so these effects become more important as the time scale of concern lengthens. In some cases, afforestation is predicted to lead to cooling on the time scale of decades followed by warming on the time scale of centuries. Our study involves simulations using the NCAR CAM3 atmospheric general circulation model with a slab ocean to perform idealized (and extreme) land-cover change simulations. We explore the time-dependent carbon-cycle/climate implications of these results using a schematic model of the long-term carbon cycle and climate.

  6. Genome-Scale Metabolic Model for the Green Alga Chlorella vulgaris UTEX 395 Accurately Predicts Phenotypes under Autotrophic, Heterotrophic, and Mixotrophic Growth Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuniga, Cristal; Li, Chien -Ting; Huelsman, Tyler

    The green microalga Chlorella vulgaris has been widely recognized as a promising candidate for biofuel production due to its ability to store high lipid content and its natural metabolic versatility. Compartmentalized genome-scale metabolic models constructed from genome sequences enable quantitative insight into the transport and metabolism of compounds within a target organism. These metabolic models have long been utilized to generate optimized design strategies for an improved production process. Here, we describe the reconstruction, validation, and application of a genome-scale metabolic model for C. vulgaris UTEX 395, iCZ843. The reconstruction represents the most comprehensive model for any eukaryotic photosynthetic organism to date, based on the genome size and number of genes in the reconstruction. The highly curated model accurately predicts phenotypes under photoautotrophic, heterotrophic, and mixotrophic conditions. The model was validated against experimental data and lays the foundation for model-driven strain design and medium alteration to improve yield. Calculated flux distributions under different trophic conditions show that a number of key pathways are affected by nitrogen starvation conditions, including central carbon metabolism and amino acid, nucleotide, and pigment biosynthetic pathways. Moreover, model prediction of growth rates under various medium compositions and subsequent experimental validation showed an increased growth rate with the addition of tryptophan and methionine.

  7. Genome-Scale Metabolic Model for the Green Alga Chlorella vulgaris UTEX 395 Accurately Predicts Phenotypes under Autotrophic, Heterotrophic, and Mixotrophic Growth Conditions

    DOE PAGES

    Zuniga, Cristal; Li, Chien -Ting; Huelsman, Tyler; ...

    2016-07-02

    The green microalga Chlorella vulgaris has been widely recognized as a promising candidate for biofuel production due to its ability to store high lipid content and its natural metabolic versatility. Compartmentalized genome-scale metabolic models constructed from genome sequences enable quantitative insight into the transport and metabolism of compounds within a target organism. These metabolic models have long been utilized to generate optimized design strategies for an improved production process. Here, we describe the reconstruction, validation, and application of a genome-scale metabolic model for C. vulgaris UTEX 395, iCZ843. The reconstruction represents the most comprehensive model for any eukaryotic photosynthetic organism to date, based on the genome size and number of genes in the reconstruction. The highly curated model accurately predicts phenotypes under photoautotrophic, heterotrophic, and mixotrophic conditions. The model was validated against experimental data and lays the foundation for model-driven strain design and medium alteration to improve yield. Calculated flux distributions under different trophic conditions show that a number of key pathways are affected by nitrogen starvation conditions, including central carbon metabolism and amino acid, nucleotide, and pigment biosynthetic pathways. Moreover, model prediction of growth rates under various medium compositions and subsequent experimental validation showed an increased growth rate with the addition of tryptophan and methionine.

  8. Genome-Scale Metabolic Model for the Green Alga Chlorella vulgaris UTEX 395 Accurately Predicts Phenotypes under Autotrophic, Heterotrophic, and Mixotrophic Growth Conditions.

    PubMed

    Zuñiga, Cristal; Li, Chien-Ting; Huelsman, Tyler; Levering, Jennifer; Zielinski, Daniel C; McConnell, Brian O; Long, Christopher P; Knoshaug, Eric P; Guarnieri, Michael T; Antoniewicz, Maciek R; Betenbaugh, Michael J; Zengler, Karsten

    2016-09-01

    The green microalga Chlorella vulgaris has been widely recognized as a promising candidate for biofuel production due to its ability to store high lipid content and its natural metabolic versatility. Compartmentalized genome-scale metabolic models constructed from genome sequences enable quantitative insight into the transport and metabolism of compounds within a target organism. These metabolic models have long been utilized to generate optimized design strategies for an improved production process. Here, we describe the reconstruction, validation, and application of a genome-scale metabolic model for C. vulgaris UTEX 395, iCZ843. The reconstruction represents the most comprehensive model for any eukaryotic photosynthetic organism to date, based on the genome size and number of genes in the reconstruction. The highly curated model accurately predicts phenotypes under photoautotrophic, heterotrophic, and mixotrophic conditions. The model was validated against experimental data and lays the foundation for model-driven strain design and medium alteration to improve yield. Calculated flux distributions under different trophic conditions show that a number of key pathways are affected by nitrogen starvation conditions, including central carbon metabolism and amino acid, nucleotide, and pigment biosynthetic pathways. Furthermore, model prediction of growth rates under various medium compositions and subsequent experimental validation showed an increased growth rate with the addition of tryptophan and methionine. © 2016 American Society of Plant Biologists. All rights reserved.

  9. Genome-Scale Metabolic Model for the Green Alga Chlorella vulgaris UTEX 395 Accurately Predicts Phenotypes under Autotrophic, Heterotrophic, and Mixotrophic Growth Conditions1

    PubMed Central

    Zuñiga, Cristal; Li, Chien-Ting; Zielinski, Daniel C.; Guarnieri, Michael T.; Antoniewicz, Maciek R.; Zengler, Karsten

    2016-01-01

    The green microalga Chlorella vulgaris has been widely recognized as a promising candidate for biofuel production due to its ability to store high lipid content and its natural metabolic versatility. Compartmentalized genome-scale metabolic models constructed from genome sequences enable quantitative insight into the transport and metabolism of compounds within a target organism. These metabolic models have long been utilized to generate optimized design strategies for an improved production process. Here, we describe the reconstruction, validation, and application of a genome-scale metabolic model for C. vulgaris UTEX 395, iCZ843. The reconstruction represents the most comprehensive model for any eukaryotic photosynthetic organism to date, based on the genome size and number of genes in the reconstruction. The highly curated model accurately predicts phenotypes under photoautotrophic, heterotrophic, and mixotrophic conditions. The model was validated against experimental data and lays the foundation for model-driven strain design and medium alteration to improve yield. Calculated flux distributions under different trophic conditions show that a number of key pathways are affected by nitrogen starvation conditions, including central carbon metabolism and amino acid, nucleotide, and pigment biosynthetic pathways. Furthermore, model prediction of growth rates under various medium compositions and subsequent experimental validation showed an increased growth rate with the addition of tryptophan and methionine. PMID:27372244
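    Flux-balance predictions of the kind these records describe are commonly run with COBRApy. A minimal sketch; the published model name is used here only illustratively, and the exchange-reaction ID is an assumption that varies between reconstructions:

```python
import cobra

# Load an SBML genome-scale model; "iCZ843.xml" stands in for any
# reconstruction file available locally.
model = cobra.io.read_sbml_model("iCZ843.xml")

# Autotrophic-style condition: close the organic-carbon uptake so growth
# must be driven by photon/CO2 supply (reaction ID is hypothetical).
model.reactions.get_by_id("EX_glc__D_e").lower_bound = 0.0

solution = model.optimize()   # maximize the biomass objective
print(f"predicted growth rate: {solution.objective_value:.3f} 1/h")
```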

  10. Identifying Financially Sustainable Pricing Interventions to Promote Healthier Beverage Purchases in Small Neighborhood Stores

    PubMed Central

    Kumanyika, Shiriki; Gittelsohn, Joel; Adam, Atif; Wong, Michelle S.; Mui, Yeeli; Lee, Bruce Y.

    2018-01-01

    Introduction: Residents of low-income communities often purchase sugar-sweetened beverages (SSBs) at small, neighborhood “corner” stores. Lowering water prices and increasing SSB prices are potentially complementary public health strategies to promote more healthful beverage purchasing patterns in these stores. Sustainability, however, depends on financial feasibility. Because in-store pricing experiments are complex and require retailers to take business risks, we used a simulation approach to identify profitable pricing combinations for corner stores. Methods: The analytic approach was based on inventory models, which are suitable for modeling business operations. We used discrete-event simulation to build inventory models that use data representing beverage inventory, wholesale costs, changes in retail prices, and consumer demand for 2 corner stores in Baltimore, Maryland. Model outputs yielded ranges for water and SSB prices that increased water demand without loss of profit from combined water and SSB sales. Results: A 20% SSB price increase allowed lowering water prices by up to 20% while maintaining profit and increased water demand by 9% and 14%, for stores selling SSBs in 12-oz cans and 16- to 20-oz bottles, respectively. Without changing water prices, profits could increase by 4% and 6%, respectively. Sensitivity analysis showed that stores with a higher volume of SSB sales could reduce water prices the most without loss of profit. Conclusion: Various combinations of SSB and water prices could encourage water consumption while maintaining or increasing store owners’ profits. This model is a first step in designing and implementing profitable pricing strategies in collaboration with store owners. PMID:29369758
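    The spirit of these pricing experiments can be sketched with a toy demand-and-profit check; the elasticities, baselines, and costs below are illustrative assumptions, not the paper's fitted inventory model:

```python
def weekly_profit(price_w, price_s, base_w=100, base_s=300,
                  cost_w=0.40, cost_s=0.60, eps_w=-1.2, eps_s=-0.9, cross=0.3):
    """Toy constant-elasticity demand for water (w) and SSBs (s) with a
    cross-price term (dearer SSBs shift some demand toward water).
    Returns (weekly profit in $, water units sold)."""
    ref_w, ref_s = 1.00, 1.50                       # reference prices, $
    d_w = base_w * (price_w / ref_w) ** eps_w * (price_s / ref_s) ** cross
    d_s = base_s * (price_s / ref_s) ** eps_s
    return d_w * (price_w - cost_w) + d_s * (price_s - cost_s), d_w

base, _ = weekly_profit(1.00, 1.50)
new, d_w = weekly_profit(0.80, 1.80)                # -20% water, +20% SSB
print(f"profit change: {new - base:+.2f} $/week, water demand: {d_w:.0f} units")
```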

  11. Identifying Financially Sustainable Pricing Interventions to Promote Healthier Beverage Purchases in Small Neighborhood Stores.

    PubMed

    Nau, Claudia; Kumanyika, Shiriki; Gittelsohn, Joel; Adam, Atif; Wong, Michelle S; Mui, Yeeli; Lee, Bruce Y

    2018-01-25

    Residents of low-income communities often purchase sugar-sweetened beverages (SSBs) at small, neighborhood "corner" stores. Lowering water prices and increasing SSB prices are potentially complementary public health strategies to promote more healthful beverage purchasing patterns in these stores. Sustainability, however, depends on financial feasibility. Because in-store pricing experiments are complex and require retailers to take business risks, we used a simulation approach to identify profitable pricing combinations for corner stores. The analytic approach was based on inventory models, which are suitable for modeling business operations. We used discrete-event simulation to build inventory models that use data representing beverage inventory, wholesale costs, changes in retail prices, and consumer demand for 2 corner stores in Baltimore, Maryland. Model outputs yielded ranges for water and SSB prices that increased water demand without loss of profit from combined water and SSB sales. A 20% SSB price increase allowed lowering water prices by up to 20% while maintaining profit and increased water demand by 9% and 14%, for stores selling SSBs in 12-oz cans and 16- to 20-oz bottles, respectively. Without changing water prices, profits could increase by 4% and 6%, respectively. Sensitivity analysis showed that stores with a higher volume of SSB sales could reduce water prices the most without loss of profit. Various combinations of SSB and water prices could encourage water consumption while maintaining or increasing store owners' profits. This model is a first step in designing and implementing profitable pricing strategies in collaboration with store owners.

  12. Accounting for small scale heterogeneity in ecohydrologic watershed models

    NASA Astrophysics Data System (ADS)

    Bhaskar, A.; Fleming, B.; Hogan, D. M.

    2016-12-01

    Spatially distributed ecohydrologic models are inherently constrained by the spatial resolution of their smallest units, below which land and processes are assumed to be homogeneous. At coarse scales, heterogeneity is often accounted for by computing stores and fluxes of interest over a distribution of land cover types (or other sources of heterogeneity) within spatially explicit modeling units. However, this approach ignores spatial organization and the lateral transfer of water and materials downslope. The challenge is to account both for the role of flow network topology and for fine-scale heterogeneity. We present a new approach that defines two levels of spatial aggregation and integrates a spatially explicit network approach with a flexible representation of finer-scale aspatial heterogeneity. Critically, this solution does not simply increase the resolution of the smallest spatial unit, and so, by comparison, results in improved computational efficiency. The approach is demonstrated by adapting the Regional Hydro-Ecologic Simulation System (RHESSys), an ecohydrologic model widely used to simulate climate, land use, and land management impacts. We illustrate the utility of our approach by showing how the model can be used to better characterize forest thinning impacts on ecohydrology. Forest thinning is typically done at the scale of individual trees, and yet management responses of interest include impacts on watershed-scale hydrology and on downslope riparian vegetation. Our approach allows us to characterize the variability in tree size/carbon reduction and water transfers between neighboring trees while still capturing hillslope- to watershed-scale effects. Our illustrative example demonstrates that accounting for these fine-scale effects can substantially alter model estimates, in some cases shifting the impacts of thinning on downslope water availability from increases to decreases. We conclude by describing other use cases that may benefit from this approach, including characterizing urban vegetation and stormwater management features and their impact on watershed-scale hydrology and biogeochemical cycling.

  13. Accounting for small scale heterogeneity in ecohydrologic watershed models

    NASA Astrophysics Data System (ADS)

    Burke, W.; Tague, C.

    2017-12-01

    Spatially distributed ecohydrologic models are inherently constrained by the spatial resolution of their smallest units, below which land and processes are assumed to be homogeneous. At coarse scales, heterogeneity is often accounted for by computing stores and fluxes of interest over a distribution of land cover types (or other sources of heterogeneity) within spatially explicit modeling units. However, this approach ignores spatial organization and the lateral transfer of water and materials downslope. The challenge is to account both for the role of flow network topology and for fine-scale heterogeneity. We present a new approach that defines two levels of spatial aggregation and integrates a spatially explicit network approach with a flexible representation of finer-scale aspatial heterogeneity. Critically, this solution does not simply increase the resolution of the smallest spatial unit, and so, by comparison, results in improved computational efficiency. The approach is demonstrated by adapting the Regional Hydro-Ecologic Simulation System (RHESSys), an ecohydrologic model widely used to simulate climate, land use, and land management impacts. We illustrate the utility of our approach by showing how the model can be used to better characterize forest thinning impacts on ecohydrology. Forest thinning is typically done at the scale of individual trees, and yet management responses of interest include impacts on watershed-scale hydrology and on downslope riparian vegetation. Our approach allows us to characterize the variability in tree size/carbon reduction and water transfers between neighboring trees while still capturing hillslope- to watershed-scale effects. Our illustrative example demonstrates that accounting for these fine-scale effects can substantially alter model estimates, in some cases shifting the impacts of thinning on downslope water availability from increases to decreases. We conclude by describing other use cases that may benefit from this approach, including characterizing urban vegetation and stormwater management features and their impact on watershed-scale hydrology and biogeochemical cycling.
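
    The two records above describe the same two-level aggregation idea. As a purely illustrative toy (this is not RHESSys code), the sketch below keeps spatially explicit patches that route water downslope while carrying within-patch heterogeneity as aspatial cover-class fractions rather than additional grid cells. All process equations and parameters are invented stand-ins.

```python
from dataclasses import dataclass, field

# Illustrative two-level aggregation: explicit patches route water along
# a hillslope; within-patch heterogeneity (e.g., thinned vs. unthinned
# trees) is represented as aspatial cover classes, not extra grid cells.

@dataclass
class CoverClass:
    fraction: float      # areal fraction of the patch
    lai: float           # leaf area index of this class

@dataclass
class Patch:
    storage_mm: float                        # soil water store
    covers: list = field(default_factory=list)

    def evapotranspiration(self, pet_mm):
        # Area-weighted ET over aspatial cover classes; a toy stand-in
        # for the per-class physics a real model would run.
        et = sum(c.fraction * pet_mm * min(1.0, c.lai / 3.0)
                 for c in self.covers)
        et = min(et, self.storage_mm)
        self.storage_mm -= et
        return et

def route_downslope(patches, rain_mm, pet_mm, k=0.2):
    """One daily step: rain in, ET out, lateral transfer downslope."""
    outflow = 0.0
    for upslope, downslope in zip(patches, patches[1:] + [None]):
        upslope.storage_mm += rain_mm
        upslope.evapotranspiration(pet_mm)
        lateral = k * upslope.storage_mm     # linear-reservoir drainage
        upslope.storage_mm -= lateral
        if downslope is not None:
            downslope.storage_mm += lateral  # lateral transfer downslope
        else:
            outflow = lateral                # last patch drains out
    return outflow

hillslope = [Patch(100.0, [CoverClass(0.6, 4.0), CoverClass(0.4, 1.0)]),
             Patch(120.0, [CoverClass(1.0, 3.5)])]
print(route_downslope(hillslope, rain_mm=5.0, pet_mm=3.0))
```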

  14. Particle backscatter and relative humidity measured across cirrus clouds and comparison with state-of-the-art cirrus modelling

    NASA Astrophysics Data System (ADS)

    Brabec, M.; Wienhold, F. G.; Luo, B.; Vömel, H.; Immler, F.; Steiner, P.; Peter, T.

    2012-04-01

    Advanced measurement and modelling techniques are employed to determine the partitioning of atmospheric water between the gas phase and the condensed phase in and around cirrus clouds, and thus to identify in-cloud and out-of-cloud supersaturations with respect to ice. In November 2008 the newly developed balloon-borne backscatter sonde COBALD (Compact Optical Backscatter and AerosoL Detector) was flown 14 times together with a CFH (Cryogenic Frost point Hygrometer) from Lindenberg, Germany (52° N, 14° E). The case discussed here in detail shows two cirrus layers with in-cloud relative humidities with respect to ice between 50% and 130%. Global operational analysis data of ECMWF (roughly 1° × 1° horizontal and 1 km vertical resolution, 6-hourly stored fields) fail to represent ice water contents and relative humidities. Conversely, regional COSMO-7 forecasts (6.6 km × 6.6 km, 5-min stored fields) capture the measured humidities and cloud positions remarkably well. The main difference between ECMWF and COSMO data is the resolution of the small-scale vertical features responsible for cirrus formation. Nevertheless, ice water contents in COSMO-7 are still off by factors of 2-10, likely reflecting limitations in COSMO's ice-phase bulk scheme. Significant improvements can be achieved by comprehensive size-resolved microphysical and optical modelling along backward trajectories based on COSMO-7 wind and temperature fields, which allows accurate computation of humidities, ice particle size distributions and backscatter ratios at the COBALD wavelengths. However, only by superimposing small-scale temperature fluctuations, which remain unresolved by the NWP models, can we obtain satisfying agreement with the observations and reconcile the measured in-cloud non-equilibrium humidities with conventional ice cloud microphysics.

  15. Practical Considerations for Use of Mobile Apps at the Tactical Edge

    DTIC Science & Technology

    2014-06-01

    The commercial app stores must scale to very large numbers of users (e.g., iTunes has over 800 million accounts, most with credit cards). Over a million Android apps and a million iOS apps are available for download from the Google Play and Apple iTunes app stores, respectively. Of these, most would not be …

  16. Immune networks: multi-tasking capabilities at medium load

    NASA Astrophysics Data System (ADS)

    Agliari, E.; Annibale, A.; Barra, A.; Coolen, A. C. C.; Tantari, D.

    2013-08-01

    Associative network models featuring multi-tasking properties have been introduced recently and studied in the low-load regime, where the number P of simultaneously retrievable patterns scales with the number N of nodes as P ∼ log N. In addition to their relevance in artificial intelligence, these models are increasingly important in immunology, where stored patterns represent strategies to fight pathogens and nodes represent lymphocyte clones. They allow us to understand the crucial ability of the immune system to respond simultaneously to multiple distinct antigen invasions. Here we develop further the statistical mechanical analysis of such systems, by studying the medium-load regime, P ∼ N^δ with δ ∈ (0, 1]. We derive three main results. First, we reveal the nontrivial architecture of these networks: they exhibit a high degree of modularity and clustering, which is linked to their retrieval abilities. Second, by solving the model we demonstrate for δ < 1 the existence of large regions in the phase diagram where the network can retrieve all stored patterns simultaneously. Finally, in the high-load regime δ = 1 we find that the system behaves as a spin glass, suggesting that finite-connectivity frameworks are required to achieve effective retrieval.

  17. Optimal causal inference: estimating stored information and approximating causal architecture.

    PubMed

    Still, Susanne; Crutchfield, James P; Ellison, Christopher J

    2010-09-01

    We introduce an approach to inferring the causal architecture of stochastic dynamical systems that extends rate-distortion theory to use causal shielding--a natural principle of learning. We study two distinct cases of causal inference: optimal causal filtering and optimal causal estimation. Filtering corresponds to the ideal case in which the probability distribution of measurement sequences is known, giving a principled method to approximate a system's causal structure at a desired level of representation. We show that in the limit in which a model-complexity constraint is relaxed, filtering finds the exact causal architecture of a stochastic dynamical system, known as the causal-state partition. From this, one can estimate the amount of historical information the process stores. More generally, causal filtering finds a graded model-complexity hierarchy of approximations to the causal architecture. Abrupt changes in the hierarchy, as a function of approximation, capture distinct scales of structural organization. For nonideal cases with finite data, we show how the correct number of the underlying causal states can be found by optimal causal estimation. A previously derived model-complexity control term allows us to correct for the effect of statistical fluctuations in probability estimates and thereby avoid overfitting.

  18. Quantitative model of price diffusion and market friction based on trading as a mechanistic random process.

    PubMed

    Daniels, Marcus G; Farmer, J Doyne; Gillemot, László; Iori, Giulia; Smith, Eric

    2003-03-14

    We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.

  19. Quantitative Model of Price Diffusion and Market Friction Based on Trading as a Mechanistic Random Process

    NASA Astrophysics Data System (ADS)

    Daniels, Marcus G.; Farmer, J. Doyne; Gillemot, László; Iori, Giulia; Smith, Eric

    2003-03-01

    We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.
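
    Both records above describe the same zero-intelligence order-flow model. The sketch below is a toy in that spirit, not the authors' calibrated version: limit orders, market orders, and cancellations occur as random events on a discrete price grid (the event probabilities and the 50-tick placement window are invented), and the mid-price diffuses even though the order flow carries no information.

```python
import random

# Toy zero-intelligence order book: limit orders, market orders, and
# cancellations arrive as random events on a discrete tick grid. All
# rates and the placement window are invented for illustration.

def simulate(steps=20000, p_limit=0.6, p_market=0.2, grid=500, seed=7):
    """Return the mid-price series under purely random order flow."""
    rng = random.Random(seed)
    bids = {grid // 2 - 1: 5}      # tick -> number of resting orders
    asks = {grid // 2 + 1: 5}
    mids = []

    def pop(book, tick):
        book[tick] -= 1
        if book[tick] == 0:
            del book[tick]

    for _ in range(steps):
        u, buy = rng.random(), rng.random() < 0.5
        best_bid, best_ask = max(bids), min(asks)
        if u < p_limit:                      # new limit order on own side
            if buy:
                tick = rng.randint(max(1, best_ask - 50), best_ask - 1)
                bids[tick] = bids.get(tick, 0) + 1
            else:
                tick = rng.randint(best_bid + 1, min(grid, best_bid + 50))
                asks[tick] = asks.get(tick, 0) + 1
        elif u < p_limit + p_market:         # market order hits best quote
            if buy and len(asks) > 1:
                pop(asks, best_ask)
            elif not buy and len(bids) > 1:
                pop(bids, best_bid)
        else:                                # cancel a random resting order
            book = bids if buy else asks
            if len(book) > 1:
                pop(book, rng.choice(list(book)))
        mids.append((max(bids) + min(asks)) / 2)
    return mids

mids = simulate()
increments = [b - a for a, b in zip(mids, mids[1:])]
print("mean squared mid-price increment:",
      sum(d * d for d in increments) / len(increments))
```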

  20. Precision of working memory for speech sounds.

    PubMed

    Joseph, Sabine; Iverson, Paul; Manohar, Sanjay; Fox, Zoe; Scott, Sophie K; Husain, Masud

    2015-01-01

    Memory for speech sounds is a key component of models of verbal working memory (WM). But how good is verbal WM? Most investigations assess this using binary report measures to derive a fixed number of items that can be stored. However, recent findings in visual WM have challenged such "quantized" views by employing measures of recall precision with an analogue response scale. WM for speech sounds might rely on both continuous and categorical storage mechanisms. Using a novel speech matching paradigm, we measured WM recall precision for phonemes. Vowel qualities were sampled from a formant space continuum. A probe vowel had to be adjusted to match the vowel quality of a target on a continuous, analogue response scale. Crucially, this provided an index of the variability of a memory representation around its true value and thus allowed us to estimate how memories were distorted from the original sounds. Memory load affected the quality of speech sound recall in two ways. First, there was a gradual decline in recall precision with increasing number of items, consistent with the view that WM representations of speech sounds become noisier with an increase in the number of items held in memory, just as for vision. Based on multidimensional scaling (MDS), the level of noise appeared to be reflected in distortions of the formant space. Second, as memory load increased, there was evidence of greater clustering of participants' responses around particular vowels. A mixture model captured both continuous and categorical responses, demonstrating a shift from continuous to categorical memory with increasing WM load. This suggests that direct acoustic storage can be used for single items, but when more items must be stored, categorical representations must be used.

  1. Modeling Sediment Detention Ponds Using Reactor Theory and Advection-Diffusion Concepts

    NASA Astrophysics Data System (ADS)

    Wilson, Bruce N.; Barfield, Billy J.

    1985-04-01

    An algorithm is presented to model the sedimentation process in detention ponds. This algorithm is based on a mass balance for an infinitesimal layer that couples reactor theory concepts with advection-diffusion processes. Reactor theory concepts are used to (1) determine the residence time of sediment particles and (2) mix influent sediment with previously stored flow. Advection-diffusion processes are used to model (1) the settling characteristics of sediment and (2) the vertical diffusion of sediment due to turbulence. Predicted results of the model are compared to those observed on two pilot-scale ponds for a total of 12 runs. The average percent error between predicted and observed trap efficiency was 5.2%. Overall, the observed sedimentology values were predicted with reasonable accuracy.

  2. A Mixed-dimensional Model for Determining the Impact of Permafrost Polygonal Ground Degradation on Arctic Hydrology.

    NASA Astrophysics Data System (ADS)

    Coon, E.; Jan, A.; Painter, S. L.; Moulton, J. D.; Wilson, C. J.

    2017-12-01

    Many permafrost-affected regions in the Arctic manifest a polygonal patterned ground, which contains large carbon stores and is vulnerable to climate change as warming temperatures drive melting ice wedges, polygon degradation, and thawing of the underlying carbon-rich soils. Understanding the fate of this carbon is difficult. The system is controlled by complex, nonlinear physics coupling biogeochemistry, thermal hydrology, and geomorphology, and there is a strong spatial scale separation between microtopography (at the scale of an individual polygon) and the scale of landscape change (at the scale of many thousands of polygons). Physics-based models have come a long way, and are now capable of representing the diverse set of processes, but only on individual polygons or a few polygons. Empirical models have been used to upscale across land types, including ecotypes evolving from low-centered (pristine) polygons to high-centered (degraded) polygons, and do so over large spatial extents, but are limited in their ability to discern causal process mechanisms. Here we present a novel strategy that uses physics-based models across scales, bringing together multiple capabilities to capture polygon degradation under a warming climate and its impacts on thermal hydrology. We use fine-scale simulations on individual polygons to motivate a mixed-dimensional strategy that couples one-dimensional columns representing individual polygons through two-dimensional surface flow. A subgrid model is used to incorporate the effects of surface microtopography on surface flow; this model is described and calibrated to fine-scale simulations. Critically, a subsidence model that tracks volume loss in bulk ice wedges is used to alter the subsurface structure and subgrid parameters, enabling inclusion of the feedbacks associated with polygon degradation. This combined strategy results in a model that is able to capture the key features of polygon permafrost degradation while simulating across a large spatial extent of polygonal tundra.

  3. Multiscale musculoskeletal modelling, data–model fusion and electromyography-informed modelling

    PubMed Central

    Zhang, J.; Heidlauf, T.; Sartori, M.; Besier, T.; Röhrle, O.; Lloyd, D.

    2016-01-01

    This paper proposes methods and technologies that advance the state of the art for modelling the musculoskeletal system across the spatial and temporal scales; and storing these using efficient ontologies and tools. We present population-based modelling as an efficient method to rapidly generate individual morphology from only a few measurements and to learn from the ever-increasing supply of imaging data available. We present multiscale methods for continuum muscle and bone models; and efficient mechanostatistical methods, both continuum and particle-based, to bridge the scales. Finally, we examine both the importance that muscles play in bone remodelling stimuli and the latest muscle force prediction methods that use electromyography-assisted modelling techniques to compute musculoskeletal forces that best reflect the underlying neuromuscular activity. Our proposal is that, in order to have a clinically relevant virtual physiological human, (i) bone and muscle mechanics must be considered together; (ii) models should be trained on population data to permit rapid generation and use underlying principal modes that describe both muscle patterns and morphology; and (iii) these tools need to be available in an open-source repository so that the scientific community may use, personalize and contribute to the database of models. PMID:27051510

  4. New test techniques to evaluate near field effects for supersonic store carriage and separation

    NASA Technical Reports Server (NTRS)

    Sawyer, Wallace C.; Stallings, Robert L., Jr.; Wilcox, Floyd J., Jr.; Blair, A. B., Jr.; Monta, William J.; Plentovich, Elizabeth B.

    1989-01-01

    Store separation and store carriage drag studies were conducted. A primary purpose is to develop new experimental methods to evaluate near field effects of store separation and levels of store carriage drag associated with a variety of carriage techniques for different store shapes and arrangements. Flow field measurements consisting of surface pressure distributions and vapor screen photographs are used to analyze the variations of the store separation characteristics with cavity geometry. Store carriage drag measurements representative of tangent, semi-submerged, and internal carriage installations are presented and discussed. Results are included from both fully metric models and models with only metric segments (metric pallets) and the relative merits of the two are discussed. Carriage drag measurements for store installations on an aircraft parent body are compared both with prediction methods and with installations on a generic parent body.

  5. Reactive transport in the complex heterogeneous alluvial aquifer of Fortymile Wash, Nevada

    DOE PAGES

    Soltanian, Mohamad Reza; Sun, Alexander; Dai, Zhenxue

    2017-04-02

    Yucca Mountain, Nevada, had been extensively investigated as a potential deep geologic repository for storing high-level nuclear wastes. Previous field investigations of the stratified alluvial aquifer downstream of the site revealed a hierarchy of sedimentary facies types, with corresponding log-conductivity and reactive-surface-area subpopulations within each facies at each scale of sedimentary architecture. In this paper, we use a Lagrangian-based transport model to analyze radionuclide dispersion in the saturated alluvium of Fortymile Wash, Nevada. First, we validate the Lagrangian model using high-resolution flow and reactive transport simulations. Then, we use the validated model to investigate how each scale of sedimentary architecture may affect long-term radionuclide transport at Yucca Mountain. Results show that the reactive solute dispersion developed by the Lagrangian model matches the ensemble average of the numerical simulations well. The link between the alluvium spatial variability and reactive solute dispersion at different spatiotemporal scales is demonstrated using the Lagrangian model. Finally, the longitudinal dispersivity of the reactive plume can be on the order of hundreds to thousands of meters, and it may not reach its asymptotic value even after 10,000 years of travel time and 2-3 km of travel distance.

  6. Do marginalized neighbourhoods have less healthy retail food environments? An analysis using Bayesian spatial latent factor and hurdle models.

    PubMed

    Luan, Hui; Minaker, Leia M; Law, Jane

    2016-08-22

    Findings of whether marginalized neighbourhoods have less healthy retail food environments (RFE) are mixed across countries, in part because inconsistent approaches have been used to characterize RFE 'healthfulness' and marginalization, and because researchers have used non-spatial statistical methods to address this ultimately spatial issue. This study uses in-store features to categorize healthy and less healthy food outlets. Bayesian spatial hierarchical models are applied to explore the association between marginalization dimensions and RFE healthfulness (i.e., relative healthy food access, modelled via a probability distribution) at various geographical scales. Marginalization dimensions are derived from a spatial latent factor model. Zero-inflation occurring at the walkable-distance scale is accounted for with a spatial hurdle model. Neighbourhoods with higher residential instability, material deprivation, and population density are more likely to have access to healthy food outlets within a walkable distance, from a binary 'have' or 'not have' access perspective. At the walkable-distance scale, however, materially deprived neighbourhoods are found to have less healthy RFE (lower relative healthy food access). Food intervention programs should be developed to strike a balance between healthy and less healthy food access in the study region and to improve opportunities for residents to buy and consume foods consistent with dietary recommendations.

  7. Examining the Spatial Distribution of Marijuana Establishments in Colorado

    ERIC Educational Resources Information Center

    Kerski, Joseph

    2018-01-01

    In this 22-question activity, high school students investigate the spatial distribution of marijuana stores in Colorado using an interactive web map containing stores, centers, highways, population, and other data at several scales. After completing this lesson, students will know and be able to: (1) Use interactive maps, layers, and tools in…

  8. Predicting moisture and economic value of solid forest fuel piles for improving the profitability of bioenergy use

    NASA Astrophysics Data System (ADS)

    Lauren, Ari; Kinnunen, Jyrki-Pekko; Sikanen, Lauri

    2016-04-01

    Bioenergy contributes 26% of the total energy use in Finland, and 60% of this is provided by solid forest fuel consisting of small stems and logging residues such as tops, branches, roots and stumps. Typically the logging residues are stored as piles on site before being transported to regional combined heat and power plants for combustion. Profitability of forest fuel use depends on smart control of the feedstock. Fuel moisture, dry matter loss, and the rate of interest during storage are the key variables affecting the economic value of the fuel. The value increases with drying, but decreases with wetting, dry matter loss and a positive rate of interest. We compiled a simple simulation model computing the moisture change, dry matter loss, transportation costs and present value of feedstock piles. The model was used to predict the time of the maximum value of the stock, and to compose feedstock allocation strategies under the question: how should we choose the piles and the combustion time so that the total energy yield and the economic value of the energy production are maximized? The question was assessed with respect to the demand of the energy plant. The model parameterization was based on field-scale studies. The initial moisture and the rates of daily moisture change and dry matter loss in the feedstock piles depended on the day of the year according to empirical field measurements. The time step of the computation was one day. Effects of pile use timing on the total energy yield and profitability were studied using combinatorial optimization. Results show that storing increases the maximum pile value if natural drying begins soon after harvesting; otherwise dry matter loss and the capital cost of storing overcome the benefits gained by drying. Optimized timing of pile use can slightly improve profitability, based on the increased total energy yield and because transportation costs per unit of energy decrease as the water content of the biomass decreases.
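
    As a back-of-envelope illustration of the bookkeeping this record describes, the sketch below steps a pile forward one day at a time and discounts its energy value. The drying and dry-matter-loss rates here are flat invented constants (the paper's rates are empirical, day-of-year dependent, and allow wetting), and the moisture-dependent heating value uses a common wood-fuel approximation.

```python
# Daily time-step pile-value bookkeeping. Rates and prices are invented
# placeholders; the wet-basis heating value uses the common approximation
# q = 19.2*(1 - M) - 2.44*M  [MJ/kg, M = wet-basis moisture fraction].

def pile_value(days, mass0_kg=30000.0, moisture0=0.55,
               dry_rate=0.002, loss_rate=0.0002,
               price_eur_per_mwh=21.0, annual_interest=0.03):
    """Present value (EUR) of one residue pile stored for `days`."""
    mass_dry = mass0_kg * (1.0 - moisture0)   # dry matter, kg
    moisture = moisture0
    for _ in range(days):
        moisture = max(0.20, moisture - dry_rate)   # natural drying floor
        mass_dry *= (1.0 - loss_rate)               # dry-matter loss
    mass_wet = mass_dry / (1.0 - moisture)
    q_mj_per_kg = 19.2 * (1.0 - moisture) - 2.44 * moisture
    energy_mwh = mass_wet * q_mj_per_kg / 3600.0    # 1 MWh = 3600 MJ
    value = energy_mwh * price_eur_per_mwh
    # Discount back to today: storing ties up capital.
    return value / (1.0 + annual_interest) ** (days / 365.0)

# Scan storage durations to find the most valuable combustion day.
best_day = max(range(0, 721, 10), key=pile_value)
print(best_day, round(pile_value(best_day), 2))
```

    With these toy rates the optimum falls where the drying gain flattens out and dry-matter loss plus discounting start to dominate, which is the qualitative trade-off the record describes.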

  9. Microstructurally-sensitive fatigue crack nucleation in Ni-based single and oligo crystals

    NASA Astrophysics Data System (ADS)

    Chen, Bo; Jiang, Jun; Dunne, Fionn P. E.

    2017-09-01

    An integrated experimental, characterisation and computational crystal plasticity study of cyclic plastic beam loading has been carried out for nickel single-crystal (CMSX4) and oligocrystal (MAR002) alloys in order to assess quantitatively the mechanistic drivers for fatigue crack nucleation. The experimentally validated modelling provides knowledge of key microstructural quantities (accumulated slip, stress and GND density) at experimentally observed fatigue crack nucleation sites, and it is shown that while each of these quantities is potentially important in crack nucleation, none of them in its own right is sufficient to be predictive. However, the local (elastic) stored energy density, measured over a length scale determined by the density of SSDs and GNDs, has been shown to predict crack nucleation sites in the single-crystal and oligocrystal tests. In addition, once primary nucleated cracks develop and are represented in the crystal model using XFEM, the stored energy correctly identifies where secondary fatigue cracks are observed to nucleate in experiments. This (Griffith-Stroh type) quantity also correctly differentiates and explains intergranular and transgranular fatigue crack nucleation.

  10. Entomological Collections in the Age of Big Data.

    PubMed

    Short, Andrew Edward Z; Dikow, Torsten; Moreau, Corrie S

    2018-01-07

    With a million described species and more than half a billion preserved specimens, the large scale of insect collections is unequaled by those of any other group. Advances in genomics, collection digitization, and imaging have begun to more fully harness the power that such large data stores can provide. These new approaches and technologies have transformed how entomological collections are managed and utilized. While genomic research has fundamentally changed the way many specimens are collected and curated, advances in technology have shown promise for extracting sequence data from the vast holdings already in museums. Efforts to mainstream specimen digitization have taken root and have accelerated traditional taxonomic studies as well as distribution modeling and global change research. Emerging imaging technologies such as microcomputed tomography and confocal laser scanning microscopy are changing how morphology can be investigated. This review provides an overview of how the realization of big data has transformed our field and what may lie in store.

  11. Feasibility of increasing access to healthy foods in neighborhood corner stores.

    PubMed

    O'Malley, Keelia; Gustat, Jeanette; Rice, Janet; Johnson, Carolyn C

    2013-08-01

    The feasibility of working with neighborhood corner stores to increase the availability of fresh fruit and vegetables in low-income neighborhoods in New Orleans was assessed. Household interviews and 24-hour dietary recalls (n = 97), corner store customer intercept interviews (n = 60) and interviews with corner store operators (owners/managers) (n = 12) were conducted in three neighborhoods without supermarkets. Regional produce wholesalers were contacted by phone. Results indicated that the majority of neighborhood residents use supermarkets or super stores as their primary food source. Those who did shop at corner stores typically purchased prepared foods and/or beverages making up nearly one third of their daily energy intake. Most individuals would be likely to purchase fresh fruit and vegetables from the corner stores if these foods were offered. Store operators identified cost, infrastructure and lack of customer demand as major barriers to stocking more fresh produce. Produce wholesalers did not see much business opportunity in supplying fresh produce to neighborhood corner stores on a small scale. Increasing availability of fresh fruit and vegetables in corner stores may be more feasible with the addition of systems changes that provide incentives and make it easier for neighborhood corner stores to stock and sell fresh produce.

  12. GRACE storage-runoff hystereses reveal the dynamics of ...

    EPA Pesticide Factsheets

    Watersheds function as integrated systems where climate and geology govern the movement of water. In situ instrumentation can provide local-scale insights into the non-linear relationship between streamflow and water stored in a watershed as snow, soil moisture, and groundwater. However, there is a poor understanding of these processes at the regional scale, primarily because of our inability to measure water stores and fluxes in the subsurface. Now NASA's Gravity Recovery and Climate Experiment (GRACE) satellites quantify changes in the amount of water stored across and through the Earth, providing measurements of regional hydrologic behavior. Here we apply GRACE data to characterize for the first time how regional watersheds function as simple, dynamic systems through a series of hysteresis loops. While the physical processes underlying the loops are inherently complex, the vertical integration of terrestrial water in the GRACE signal provides process-based insights into the dynamic and non-linear function of regional-scale watersheds. We use this process-based understanding with GRACE data to effectively forecast seasonal runoff (mean R² of 0.91) and monthly runoff (mean R² of 0.77) in three regional-scale watersheds (>150,000 km²) of the Columbia River Basin, USA. Data from the Gravity Recovery and Climate Experiment (GRACE) satellites provide a novel dataset for understanding changes in the amount of water stored across and through the surface of the Earth.
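
    The forecasting step described above amounts to regressing runoff on GRACE-derived storage. The sketch below does this on simulated numbers only; the slope, noise level, and resulting R² are invented and are not the paper's values.

```python
import numpy as np

# Toy storage-to-runoff regression in the spirit of the record above.
# twsa stands in for a GRACE terrestrial water storage anomaly; all
# numbers are simulated, not Columbia River Basin data.

rng = np.random.default_rng(3)
years = 12
twsa = rng.normal(0.0, 50.0, years)                      # spring TWSA, mm
runoff = 1.8 * twsa + 300.0 + rng.normal(0.0, 20.0, years)  # seasonal, mm

A = np.column_stack([twsa, np.ones(years)])   # slope + intercept design
coef, *_ = np.linalg.lstsq(A, runoff, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((runoff - pred) ** 2) / np.sum((runoff - runoff.mean()) ** 2)
print("slope, intercept:", np.round(coef, 2), " R2:", round(r2, 2))
```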

  13. Energy capture and storage in asymmetrically multistable modular structures inspired by skeletal muscle

    NASA Astrophysics Data System (ADS)

    Kidambi, Narayanan; Harne, Ryan L.; Wang, K. W.

    2017-08-01

    The remarkable versatility and adaptability of skeletal muscle that arises from the assembly of its nanoscale cross-bridges into micro-scale assemblies known as sarcomeres provides great inspiration for the development of advanced adaptive structures and material systems. Motivated by the capability of cross-bridges to capture elastic strain energy to improve the energetic efficiency of sudden movements and repeated motions, and by models of cross-bridge power stroke motions and sarcomere contractile behaviors that incorporate asymmetric, bistable potential energy landscapes, this research develops and studies modular mechanical structures that trap and store energy in higher-energy configurations. Modules exhibiting tailorable asymmetric bistability are first designed and fabricated, revealing how geometric parameters influence the asymmetry of the resulting double-well energy landscapes. These experimentally-observed characteristics are then investigated with numerical and analytical methods to characterize the dynamics of asymmetrically bistable modules. The assembly of such modules into greater structures generates complex, multi-well energy landscapes with stable system configurations exhibiting different quantities of stored elastic potential energy. Dynamic analyses illustrate the ability of these structures to capture a portion of the initial kinetic energy due to impulsive excitations as recoverable strain potential energy, and reveal how stiffness parameters, damping, and the presence of thermal noise in micro- and nano-scale applications influence energy capture behaviors. The insights gained could foster the development of advanced structural/material systems inspired by skeletal muscle, including actuators that effectively capture, store, and release energy, as well as adaptive, robust, and reusable armors and protective devices.
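
    The central object in this record is an asymmetric double-well energy landscape. As a stand-in for the geometry-derived landscapes of the actual modules, the sketch below uses the canonical quartic V(x) = x⁴/4 - x²/2 + c·x (the tilt c is invented) and locates its two minima; the energy offset between the wells is the recoverable energy a module traps when switched into its higher-energy state.

```python
import numpy as np

# Canonical asymmetric double-well as an illustrative stand-in for the
# module energy landscapes (the paper derives landscapes from module
# geometry; this quartic and the tilt c are invented for illustration).

def V(x, c=0.1):
    return x**4 / 4 - x**2 / 2 + c * x

x = np.linspace(-2.0, 2.0, 4001)
v = V(x)
# Interior local minima: grid points lower than both neighbors.
mask = (v[1:-1] < v[:-2]) & (v[1:-1] < v[2:])
wells = x[1:-1][mask]
for xm in wells:
    print(f"minimum at x = {xm:+.3f}, V = {V(xm):+.4f}")
# Energy trapped by holding the module in the shallow (higher) well:
print("stored offset:", round(abs(V(wells[0]) - V(wells[1])), 4))
```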

  14. The food environment and adult obesity in US metropolitan areas.

    PubMed

    Michimi, Akihiko; Wimberly, Michael C

    2015-11-26

    This research examines the larger-scale associations between obesity and food environments in metropolitan areas in the United States (US). The US Census County Business Patterns dataset for 2011 was used to construct various indices of food environments for selected metropolitan areas. The numbers of employees engaged in supermarkets, convenience stores, full service restaurants, fast food restaurants, and snack/coffee shops were standardised using the location quotients, and factor analysis was used to produce two uncorrelated factors measuring food environments. Data on obesity were obtained from the 2011 Behavioral Risk Factor Surveillance System. Individual level obesity measures were linked to the metropolitan area level food environment factors. Models were fitted using generalised estimating equations to control for metropolitan area level intra-correlation and individual level sociodemographic characteristics. It was found that adults residing in cities with a large share of supermarket and full-service restaurant workers were less likely to be obese, while adults residing in cities with a large share of convenience store and fast food restaurant workers were more likely to be obese. Supermarkets and full-service restaurant workers are concentrated in the Northeast and West of the US, where obesity prevalence is relatively lower, while convenience stores and fast-food restaurant workers are concentrated in the South and Midwest, where obesity prevalence is relatively higher. The food environment landscapes measured at the metropolitan area level explain the continental-scale patterns of obesity prevalence. The types of food that are readily available and widely served may translate into obesity disparities across metropolitan areas.

  15. Thermodynamic modelling of an onsite methanation reactor for upgrading producer gas from commercial small scale biomass gasifiers.

    PubMed

    Vakalis, S; Malamis, D; Moustakas, K

    2018-06-15

    Small-scale biomass gasifiers have the advantage of higher electrical efficiency in comparison to other conventional small-scale energy systems. Nonetheless, a major drawback of small-scale biomass gasifiers is the relatively poor quality of the producer gas. In addition, several EU Member States are seeking ways to store the excess energy that is produced from renewables like wind power and hydropower. A recent development is the storage of energy by electrolysis of water and the production of hydrogen in a process that is commonly known as "power-to-gas". The present manuscript proposes an onsite secondary reactor for upgrading producer gas by mixing it with hydrogen in order to initiate methanation reactions. A thermodynamic model has been developed for assessing the potential of the proposed methanation process. The model utilized input parameters from a representative small-scale biomass gasifier and molar ratios of hydrogen from 1:0 to 1:4.1. The Villars-Cruise-Smith algorithm was used for minimizing the Gibbs free energy. The model returned the molar fractions of the permanent gases, the heating values and the Wobbe index. For mixtures of hydrogen and producer gas at a 1:0.9 ratio, the increase in heating value is maximized, at 78%. For ratios higher than 1:3, the Wobbe index increases significantly and surpasses 30 MJ/Nm³. Copyright © 2017 Elsevier Ltd. All rights reserved.
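
    The Wobbe index reported above is simple arithmetic on the gas composition: the volumetric heating value divided by the square root of the relative density. The sketch below computes it from standard handbook heating values; the example mixture fractions are invented and are not the model's outputs.

```python
# Wobbe index of a fuel-gas mixture. LHV and molar-mass entries are
# standard handbook values; the example composition is invented.

LHV = {"H2": 10.8, "CO": 12.6, "CH4": 35.8, "CO2": 0.0, "N2": 0.0}  # MJ/Nm3
M = {"H2": 2.016, "CO": 28.01, "CH4": 16.04, "CO2": 44.01, "N2": 28.01}
M_AIR = 28.96  # g/mol

def wobbe_index(x):
    """Wobbe index (MJ/Nm3) given mole fractions x (dict summing to 1)."""
    lhv = sum(x[g] * LHV[g] for g in x)
    relative_density = sum(x[g] * M[g] for g in x) / M_AIR
    return lhv / relative_density ** 0.5

# Hypothetical upgraded producer gas after partial methanation
# (CO + 3 H2 -> CH4 + H2O raises the CH4 share at the expense of H2/CO).
mix = {"H2": 0.15, "CO": 0.05, "CH4": 0.45, "CO2": 0.20, "N2": 0.15}
print(round(wobbe_index(mix), 1), "MJ/Nm3")
```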

  16. Parallel distributed, reciprocal Monte Carlo radiation in coupled, large eddy combustion simulations

    NASA Astrophysics Data System (ADS)

    Hunsaker, Isaac L.

    Radiation is the dominant mode of heat transfer in high-temperature combustion environments. Radiative heat transfer affects the gas and particle phases, including all the associated combustion chemistry. The radiative properties are in turn affected by the turbulent flow field. This bi-directional coupling of radiation-turbulence interactions poses a major challenge in creating parallel-capable, high-fidelity combustion simulations. In this work, a new model was developed in which reciprocal Monte Carlo radiation was coupled with a turbulent, large-eddy simulation combustion model. A technique wherein domain patches are stitched together was implemented to allow for scalable parallelism. The combustion model runs in parallel on a decomposed domain. The radiation model runs in parallel on a recomposed domain. The recomposed domain is stored on each processor after information sharing of the decomposed domain is handled via the message passing interface. Verification and validation testing of the new radiation model were favorable. Strong scaling analyses were performed on the Ember cluster and the Titan cluster for the CPU-radiation model and GPU-radiation model, respectively. The model demonstrated strong scaling to over 1,700 and 16,000 processing cores on Ember and Titan, respectively.

  17. Stream-water storage in the ocean using an impermeable membrane

    NASA Astrophysics Data System (ADS)

    Murabayashi, E. T.; Asuka, M.; Yamada, R.; Fok, Y. S.; Gee, H. K.

    1983-05-01

    The conceptual feasibility of storing fresh water in the ocean was investigated using a plastic membrane as the reservoir liner. In the initial phase, two physical hydraulic models were constructed to test the concept. The first was a water-filled, glass-sided box to observe the movement and reaction of the membrane to various simulated effects of currents, waves, and sediment deposition. The second was a 1:400-scale model (6.7 x 6.1 m) of West Loch, Pearl Harbor (a potential field application site), with 1:24 vertical exaggeration for similitude. The curtain method was used because it can enclose a large water body. The effect of wind, waves, tides, and currents on the curtain were simulated and the reactions observed. Although modeling is a useful tool for investigating initial concepts, its direct field application is limited because of scaling. Curtains, floating reservoirs, and bags were constructed of polyethylene sheets and deployed. All worked well after modifications were made following initial testing.

  18. Kinota: An Open-Source NoSQL implementation of OGC SensorThings for large-scale high-resolution real-time environmental monitoring

    NASA Astrophysics Data System (ADS)

    Miles, B.; Chepudira, K.; LaBar, W.

    2017-12-01

    The Open Geospatial Consortium (OGC) SensorThings API (STA) specification, ratified in 2016, is a next-generation open standard for enabling real-time communication of sensor data. Building on over a decade of OGC Sensor Web Enablement (SWE) standards, STA offers a rich data model that can represent a range of sensor and phenomena types (e.g. fixed sensors sensing fixed phenomena, fixed sensors sensing moving phenomena, mobile sensors sensing fixed phenomena, and mobile sensors sensing moving phenomena) and is data agnostic. Additionally, and in contrast to previous SWE standards, STA is developer-friendly, as is evident from its convenient JSON serialization and expressive OData-based query language (with support for geospatial queries); with its Message Queue Telemetry Transport (MQTT) extension, STA is also well-suited to efficient real-time data publishing and discovery. All these attributes make STA potentially useful in environmental monitoring sensor networks. Here we present Kinota(TM), an open-source NoSQL implementation of OGC SensorThings for large-scale, high-resolution, real-time environmental monitoring. Kinota, which roughly stands for Knowledge from Internet of Things Analyses, relies on Cassandra as its underlying data store, a horizontally scalable, fault-tolerant open-source database that is often used to store time-series data for Big Data applications (though integration with other NoSQL or relational databases is possible). With this foundation, Kinota can scale to store data from an arbitrary number of sensors collecting data every 500 milliseconds. Additionally, the Kinota architecture is very modular, allowing for customization by adopters who can choose to replace parts of the existing implementation when desirable. The architecture is also highly portable, providing the flexibility to choose between cloud providers like Azure, Amazon, Google, etc. The scalable, flexible and cloud-friendly architecture of Kinota makes it ideal for use in next-generation large-scale and high-resolution real-time environmental monitoring networks used in domains such as hydrology, geomorphology, and geophysics, as well as management applications such as flood early warning and regulatory enforcement.
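
    The STA resource paths and OData query options mentioned above ($top, $orderby, $filter with spatial functions) are part of the OGC standard, so a client needs only HTTP and JSON. The sketch below assumes a hypothetical server URL; whether a particular Kinota deployment exposes exactly these options is an assumption.

```python
import json
import urllib.parse
import urllib.request

# Minimal SensorThings API v1.0 client sketch. The base URL is a
# placeholder; the paths and OData options follow the OGC STA standard.

BASE = "https://example.org/SensorThings/v1.0"  # hypothetical endpoint

def get(path, odata=None):
    """GET a SensorThings resource, with optional OData query options."""
    url = BASE + path
    if odata:
        url += "?" + urllib.parse.urlencode(odata)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Latest ten observations for one datastream, newest first.
obs = get("/Datastreams(1)/Observations",
          {"$orderby": "phenomenonTime desc", "$top": 10})

# Locations inside a bounding polygon, via an OData spatial function.
locs = get("/Locations",
           {"$filter": "st_within(location, geography'POLYGON((-77 38,"
                       "-76 38,-76 39,-77 39,-77 38))')"})

for o in obs.get("value", []):
    print(o["phenomenonTime"], o["result"])
```

    For the real-time path the standard also defines MQTT topics mirroring these resource paths, so the same Observations can be pushed to subscribers instead of polled.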

  19. Enhancing dielectric permittivity for energy-storage devices through tricritical phenomenon

    PubMed Central

    Gao, Jinghui; Wang, Yan; Liu, Yongbin; Hu, Xinghao; Ke, Xiaoqin; Zhong, Lisheng; He, Yuting; Ren, Xiaobing

    2017-01-01

    Although dielectric energy-storage devices are usually operated at high voltage, the rapid growth of portable and wearable electronics has been increasing the demand for devices that store energy at finite electric field strength. This paper proposes an approach to enhancing energy density under low electric field by compositionally inducing tricriticality in the Ba(Ti,Sn)O3 ferroelectric material system with enlarged dielectric response. The optimal dielectric permittivity at the tricritical point reaches εr = 5.4 × 10⁴, and the associated energy density is around 30 mJ/cm³ at an electric field of 10 kV/cm, which exceeds most of the selected ferroelectric materials at the same field strength. The microstructural signature of this tricritical behavior is polarization inhomogeneity at the nanometre scale, which indicates a large polarizability under an external electric field. Further phenomenological Landau modeling suggests that the large dielectric permittivity and energy density can be ascribed to the vanishing of the energy barrier for polarization switching caused by tricriticality. Our results may shed light on developing energy-storage dielectrics with large permittivity and energy density at low electric field. PMID:28098249

  20. A balanced memory network.

    PubMed

    Roudi, Yasser; Latham, Peter E

    2007-09-01

    A fundamental problem in neuroscience is understanding how working memory--the ability to store information at intermediate timescales, like tens of seconds--is implemented in realistic neuronal networks. The most likely candidate mechanism is the attractor network, and a great deal of effort has gone toward investigating it theoretically. Yet, despite almost a quarter century of intense work, attractor networks are not fully understood. In particular, there are still two unanswered questions. First, how is it that attractor networks exhibit irregular firing, as is observed experimentally during working memory tasks? And second, how many memories can be stored under biologically realistic conditions? Here we answer both questions by studying an attractor neural network in which inhibition and excitation balance each other. Using mean-field analysis, we derive a three-variable description of attractor networks. From this description it follows that irregular firing can exist only if the number of neurons involved in a memory is large. The same mean-field analysis also shows that the number of memories that can be stored in a network scales with the number of excitatory connections, a result that has been suggested for simple models but never shown for realistic ones. Both of these predictions are verified using simulations with large networks of spiking neurons.

  1. A Coupled fcGCM-GCE Modeling System: A 3D Cloud Resolving Model and a Regional Scale Model

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2005-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. Using these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF). The use of a GCM enables global coverage, and the use of a CRM allows for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud-related datasets can provide initial conditions as well as validation for both the MMF and CRMs. The Goddard MMF is based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite-volume general circulation model (fvGCM), and it has started production runs with two years of results (1998 and 1999). Also, at Goddard, we have implemented several Goddard microphysical schemes (two-ice and several three-ice schemes), Goddard radiation (including explicitly calculated cloud optical properties), and the Goddard Land Information System (LIS, which includes the CLM and NOAH land surface models) into a next-generation regional-scale model, WRF. In this talk, I will present: (1) a brief review of the GCE model and its applications to precipitation processes (microphysical and land processes), (2) the Goddard MMF, the major differences between the two existing MMFs (CSU MMF and Goddard MMF), and preliminary results (the comparison with traditional GCMs), (3) a discussion of the Goddard WRF version (its developments and applications), and (4) the characteristics of the four-dimensional cloud datasets (or cloud library) stored at Goddard.

  2. The role of porous matrix in water flow regulation within a karst unsaturated zone: an integrated hydrogeophysical approach

    NASA Astrophysics Data System (ADS)

    Carrière, Simon D.; Chalikakis, Konstantinos; Danquigny, Charles; Davi, Hendrik; Mazzilli, Naomi; Ollivier, Chloé; Emblanch, Christophe

    2016-11-01

    Some portions of the porous rock matrix in the karst unsaturated zone (UZ) can contain large volumes of water and play a major role in water flow regulation. We present the essential results of a local-scale study conducted in 2011 and 2012 above the Low Noise Underground Laboratory (LSBB - Laboratoire Souterrain à Bas Bruit) at Rustrel, southeastern France. Previous research revealed the geological structure and water-related features of the study site and illustrated the feasibility of specific hydrogeophysical measurements. In this study, the focus is on hydrodynamics at the seasonal and event timescales. Magnetic resonance sounding (MRS) measured a high water content (more than 10%) in a large volume of rock. This large volume of water cannot be stored in fractures and conduits within the UZ. MRS was also used to measure the seasonal variation of water stored in the karst UZ. A process-based model was developed to simulate the effect of vegetation on groundwater recharge dynamics. In addition, electrical resistivity tomography (ERT) monitoring was used to assess preferential water pathways during a rain event. This study demonstrates the major influence of water flow within the porous rock matrix on the hydrogeological functioning of the UZ at both the local (LSBB) and regional (Fontaine de Vaucluse) scales. By taking into account the role of the porous matrix in water flow regulation, these findings may significantly improve karst groundwater hydrodynamic modelling, exploitation, and sustainable management.

  3. Evaporation suppression from reservoirs using floating covers: Lab scale wind-tunnel observations and mechanistic model predictions

    NASA Astrophysics Data System (ADS)

    Or, Dani; Lehmann, Peter; Aminzadeh, Milad; Sommer, Martina; Wey, Hannah; Krentscher, Christiane; Wunderli, Hans; Breitenstein, Daniel

    2017-04-01

    The competition over dwindling fresh water resources is expected to intensify with the projected increase in human population in arid regions, expansion of irrigated land, and changes in climate and drought patterns. The volume of water stored in reservoirs would also increase to mitigate seasonal shortages due to rainfall variability and to meet irrigation water needs. By some estimates, up to half of the stored water is lost to evaporation, thereby exacerbating the water scarcity problem. Recently, there has been an upsurge in the use of self-assembling floating covers to suppress evaporation, yet their design and implementation remain largely empirical. We report a systematic experimental evaluation of the effects of different cover types and external drivers (radiation, wind, wind plus radiation) on evaporation suppression and the energy balance of a 1.4 m² basin placed in a wind tunnel. Surprisingly, evaporation suppression by black and white floating covers (balls and plates) was similar despite significantly different energy balance regimes over the cover surfaces. Moreover, the evaporation suppression efficiency was a simple function of the uncovered area (square root of the uncovered fraction), with linear relations with the covered area in some cases. The thermally decoupled floating covers offer an efficient solution to evaporation suppression with limited influence on the surface energy balance (water temperature for black and white covers was similar and remained nearly constant). The results will be linked with a predictive evaporation-energy balance model, and issues of spatial scales and long exposure times will be studied.

  4. Distributed Lag Models: Examining Associations between the Built Environment and Health

    PubMed Central

    Baek, Jonggyu; Sánchez, Brisa N.; Berrocal, Veronica J.; Sanchez-Vaznaugh, Emma V.

    2016-01-01

    Built environment factors constrain individual-level behaviors and choices, and thus are receiving increasing attention to assess their influence on health. Traditional regression methods have been widely used to examine associations between built environment measures and health outcomes, where a fixed, pre-specified spatial scale (e.g., a 1-mile buffer) is used to construct environment measures. However, the spatial scale for these associations remains largely unknown, and misspecifying it introduces bias. We propose the use of distributed lag models (DLMs) to describe the association between built environment features and health as a function of distance from the locations of interest, circumventing a priori selection of a spatial scale. Based on simulation studies, we demonstrate that traditional regression models produce associations biased away from the null when there is spatial correlation among the built environment features. Inference based on DLMs is robust under a range of scenarios of the built environment. We use this innovative application of DLMs to examine the association between the availability of convenience stores near California public schools, which may affect children's dietary choices both through direct access to junk food and through exposure to advertisement, and children's body mass index z-scores (BMIz). PMID:26414942
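
    A minimal version of the DLM idea can be shown in a few lines: count stores in concentric distance rings, constrain the ring coefficients to a smooth polynomial in distance (an Almon-style basis), and fit by least squares, so no single buffer radius must be chosen. The data below are simulated and all effect sizes are invented; this is not the authors' model or data.

```python
import numpy as np

# Toy distributed-lag fit: exposure counted in concentric distance rings,
# ring coefficients constrained to a degree-2 polynomial in distance.
# All data and effect sizes are simulated for illustration.

rng = np.random.default_rng(0)
n_schools, n_rings = 500, 10
X = rng.poisson(2.0, size=(n_schools, n_rings))  # store counts per ring

true_theta = 0.08 * np.exp(-np.arange(n_rings) / 2.0)  # decays w/ distance
y = X @ true_theta + rng.normal(0.0, 0.5, n_schools)   # BMIz-like outcome

d = np.arange(n_rings)
B = np.vander(d, 3, increasing=True)  # Almon basis columns: 1, d, d^2
Z = X @ B                             # reduced design (n_schools x 3)
beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
theta_hat = B @ beta                  # estimated distance-lag curve

print("estimated:", np.round(theta_hat, 3))
print("true:     ", np.round(true_theta, 3))
```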

  5. ATLAS Data Management Accounting with Hadoop Pig and HBase

    NASA Astrophysics Data System (ADS)

    Lassnig, Mario; Garonne, Vincent; Dimitrov, Gancho; Canali, Luca

    2012-12-01

    The ATLAS Distributed Data Management system requires accounting of its contents at the metadata layer. This presents a hard problem due to the large scale of the system, the high dimensionality of attributes, and the high rate of concurrent modifications of data. The system must efficiently account for more than 90 PB of disk and tape that store upwards of 500 million files across 100 sites globally. In this work a generic accounting system is presented which is able to scale to the requirements of ATLAS. The design and architecture are presented, and the implementation is discussed. An emphasis is placed on design choices such that the underlying data models are generally applicable to different kinds of accounting, reporting and monitoring.

  6. Discrete Choice Model of Food Store Trips Using National Household Food Acquisition and Purchase Survey (FoodAPS).

    PubMed

    Hillier, Amy; Smith, Tony E; Whiteman, Eliza D; Chrisinger, Benjamin W

    2017-09-27

    Where households across income levels shop for food is of central concern within a growing body of research focused on where people live relative to where they shop, what they purchase and eat, and how those choices influence the risk of obesity and chronic disease. We analyzed data from the National Household Food Acquisition and Purchase Survey (FoodAPS) using a conditional logit model to determine where participants shop for food to be prepared and eaten at home and how individual and household characteristics of food shoppers interact with store characteristics and distance from home in determining store choice. Store size, whether or not it was a full-service supermarket, and the driving distance from home to the store constituted the three significant main effects on store choice. Overall, participants were more likely to choose larger stores, conventional supermarkets rather than super-centers and other types of stores, and stores closer to home. Interaction effects show that participants receiving Supplemental Nutrition Assistance Program (SNAP) were even more likely to choose larger stores. Hispanic participants were more likely than non-Hispanics to choose full-service supermarkets while White participants were more likely to travel further than non-Whites. This study demonstrates the value of explicitly spatial discrete choice models and provides evidence of national trends consistent with previous smaller, local studies.

  7. Discrete Choice Model of Food Store Trips Using National Household Food Acquisition and Purchase Survey (FoodAPS)

    PubMed Central

    Hillier, Amy; Smith, Tony E.; Whiteman, Eliza D.

    2017-01-01

    Where households across income levels shop for food is of central concern within a growing body of research focused on where people live relative to where they shop, what they purchase and eat, and how those choices influence the risk of obesity and chronic disease. We analyzed data from the National Household Food Acquisition and Purchase Survey (FoodAPS) using a conditional logit model to determine where participants shop for food to be prepared and eaten at home and how individual and household characteristics of food shoppers interact with store characteristics and distance from home in determining store choice. Store size, whether or not it was a full-service supermarket, and the driving distance from home to the store constituted the three significant main effects on store choice. Overall, participants were more likely to choose larger stores, conventional supermarkets rather than super-centers and other types of stores, and stores closer to home. Interaction effects show that participants receiving Supplemental Nutrition Assistance Program (SNAP) were even more likely to choose larger stores. Hispanic participants were more likely than non-Hispanics to choose full-service supermarkets while White participants were more likely to travel further than non-Whites. This study demonstrates the value of explicitly spatial discrete choice models and provides evidence of national trends consistent with previous smaller, local studies. PMID:28953221
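
    Under a conditional (McFadden) logit like the one used in both records above, each household's choice probabilities are a softmax over store utilities built from store attributes and distance. The sketch below uses invented coefficients and a three-store choice set for illustration; these are not FoodAPS estimates.

```python
import numpy as np

# Conditional logit store choice: utility is linear in store attributes,
# and choice probabilities are a softmax over the choice set. The
# attribute values and coefficients below are invented, not FoodAPS fits.

def choice_probabilities(attrs, beta):
    """attrs: (n_stores, n_features) matrix for one household's choice set."""
    v = attrs @ beta                 # systematic utility per store
    expv = np.exp(v - v.max())       # numerically stable softmax
    return expv / expv.sum()

# Columns: log(store size), full-service supermarket flag, distance (miles)
stores = np.array([[9.2, 1, 1.5],
                   [8.1, 0, 0.4],
                   [10.0, 1, 6.0]])
beta = np.array([0.5, 0.8, -0.3])    # hypothetical: bigger and closer win
print(choice_probabilities(stores, beta).round(3))
```

    Interaction effects like those reported (e.g., SNAP participation amplifying the size preference) would enter as additional columns formed by multiplying a household characteristic with a store attribute.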

  8. Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models.

    PubMed

    Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A

    2014-01-01

    Multiple software programs are available for designing and running large-scale systems-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which increases model development time, IT costs, and so on. It is therefore desirable to have a single platform for setting up and running large-scale simulations of models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database stores and manages all aspects of the system, such as models, virtual patients, user interface settings, and results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.
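    The execution pattern described above, in which a self-contained model executable exposes all model specifics as input parameters, can be sketched as follows. The executable name, its `--params` flag, and the JSON result format are assumptions for illustration, not the ViSP interface.

```python
import json
import subprocess
from concurrent.futures import ProcessPoolExecutor

# Sketch of the ViSP-style execution pattern: the compiled model is a
# self-contained executable whose parameters are all supplied as inputs.
# The executable name and flag are hypothetical.

def run_simulation(params):
    """Launch one model run; parameters are passed as a JSON string."""
    result = subprocess.run(
        ["./model_executable", "--params", json.dumps(params)],
        capture_output=True, text=True, check=True)
    return json.loads(result.stdout)      # assume the model writes JSON results

# A large-scale batch: one virtual patient per parameter set.
virtual_patients = [{"metformin_dose_mg": d, "body_weight_kg": w}
                    for d in (500, 1000, 2000) for w in (60, 80, 100)]

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_simulation, virtual_patients))
```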

  9. Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models

    PubMed Central

    Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A.

    2014-01-01

    Multiple software programs are available for designing and running large-scale systems-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which increases model development time, IT costs, and so on. It is therefore desirable to have a single platform for setting up and running large-scale simulations of models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database stores and manages all aspects of the system, such as models, virtual patients, user interface settings, and results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients. PMID:25374542

  10. Multi-Scale Mapping of Vegetation Biomass

    NASA Astrophysics Data System (ADS)

    Hudak, A. T.; Fekety, P.; Falkowski, M. J.; Kennedy, R. E.; Crookston, N.; Smith, A. M.; Mahoney, P.; Glenn, N. F.; Dong, J.; Kane, V. R.; Woodall, C. W.

    2016-12-01

    Vegetation biomass mapping at multiple scales is important for carbon inventory and monitoring, reporting, and verification (MRV). Project-level lidar collections allow biomass estimation with high confidence where associated with field plot measurements. Predictive models developed from such datasets are customarily used to generate landscape-scale biomass maps. We tested the feasibility of predicting biomass in landscapes surveyed with lidar but without field plots, by withholding plot datasets from a reduced model applied to the landscapes, and found support for a generalized model in the northern Idaho ecoregion. We are also upscaling a generalized model to all forested lands in Idaho. Our regional modeling approach is to sample the 30-m biomass predictions from the landscape-scale maps and use them to train a regional biomass model, using Landsat time series, topographic derivatives, and climate variables as predictors. Our regional map validation approach is to aggregate the regional, annual biomass predictions to the county level and compare them to annual county-level biomass summarized independently from systematic, field-based, annual inventories conducted by the US Forest Inventory and Analysis (FIA) Program nationally. A national-scale forest cover map generated independently from 2010 PALSAR data at 25-m resolution is being used to mask non-forest pixels from the aggregations. Effects of climate change on future regional biomass stores are also being explored, using biomass estimates projected from stand-level inventory data collected in the National Forests and comparing them to FIA plot data collected independently on public and private lands, projected under the same climate change scenarios, with disturbance trends extracted from the Landsat time series. Our ultimate goal is to demonstrate, focusing on the ecologically diverse Northwest region of the USA, a carbon monitoring system (CMS) that is accurate, objective, repeatable, and transparent.
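    A minimal sketch of the upscaling step, in which 30-m biomass predictions sampled from the landscape-scale maps train a regional model on Landsat time series, topographic, and climate predictors. The random forest learner and the synthetic data are assumptions; the abstract does not name the regression method.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Sketch: fit a regional biomass model on samples drawn from lidar-based
# landscape maps. Predictors and data are synthetic stand-ins for Landsat
# trends, topographic derivatives, and climate variables.

rng = np.random.default_rng(0)
n = 5000
predictors = rng.normal(size=(n, 6))   # e.g. spectral trends, elevation,
                                       # slope, aspect, temperature, precipitation
biomass = 120 + predictors @ np.array([25, -10, 5, 3, -8, 12]) + rng.normal(0, 15, n)

X_train, X_test, y_train, y_test = train_test_split(predictors, biomass, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out samples:", model.score(X_test, y_test))
```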

  11. Relativistic jets without large-scale magnetic fields

    NASA Astrophysics Data System (ADS)

    Parfrey, K.; Giannios, D.; Beloborodov, A.

    2014-07-01

    The canonical model of relativistic jets from black holes requires a large-scale ordered magnetic field to provide a significant magnetic flux through the ergosphere--in the Blandford-Znajek process, the jet power scales with the square of the magnetic flux. In many jet systems the presence of the required flux in the environment of the central engine is questionable. I will describe an alternative scenario, in which jets are produced by the continuous sequential accretion of small magnetic loops. The magnetic energy stored in these coronal flux systems is amplified by the differential rotation of the accretion disc and by the rotating spacetime of the black hole, leading to runaway field line inflation, magnetic reconnection in thin current layers, and the ejection of discrete bubbles of Poynting-flux-dominated plasma. For illustration I will show the results of general-relativistic force-free electrodynamic simulations of rotating black hole coronae, performed using a new resistivity model. The dissipation of magnetic energy by coronal reconnection events, as demonstrated in these simulations, is a potential source of the observed high-energy emission from accreting compact objects.

  12. Whole-body iron transport and metabolism: Mechanistic, multi-scale model to improve treatment of anemia in chronic kidney disease

    PubMed Central

    Sarkar, Joydeep

    2018-01-01

    Iron plays vital roles in the human body, including enzymatic processes, oxygen transport via hemoglobin, and immune response. Iron metabolism is characterized by ~95% recycling and minor replenishment through diet. Anemia of chronic kidney disease (CKD) is characterized by a lack of synthesis of erythropoietin, leading to reduced red blood cell (RBC) formation and aberrant iron recycling. Treatment of CKD anemia aims to normalize RBC count and serum hemoglobin. Clinically, the various fluxes of iron transport and accumulation are not measured, so changes during disease (e.g., CKD) and treatment are unknown, and unwanted iron accumulation in patients is known to lead to adverse effects. Current whole-body models lack the mechanistic details of iron transport related to RBC maturation and transferrin (Tf and TfR) dynamics, and assume passive iron efflux from macrophages. Hence, they are not predictive of whole-body iron dynamics and cannot be used to design individualized patient treatment. For prediction, we developed a mechanistic, multi-scale computational model of whole-body iron metabolism incorporating four compartments containing the major pools of iron and the RBC generation process. The model accounts for multiple forms of iron in vivo, the mechanisms involved in iron uptake and release, and their regulation. Furthermore, the model is interfaced with drug pharmacokinetics to allow simulation of treatment dynamics. We calibrated our model with experimental and clinical data from peer-reviewed literature to reliably simulate CKD anemia and the effects of current treatment involving a combination of epoietin-alpha and iron dextran. This in silico whole-body model of iron metabolism predicts that a year of treatment can potentially lead to 90% downregulation of ferroportin (FPN) levels and a 15-fold increase in iron stores, with only a 20% increase in iron flux from the reticulo-endothelial system (RES). Model simulations quantified unmeasured iron fluxes and previously unknown effects of treatment on FPN levels and iron stores in the RES. This mechanistic whole-body model can be the basis for future studies that incorporate iron metabolism together with related clinical experiments. Such an approach could pave the way for development of effective personalized treatment of CKD anemia. PMID:29659573
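    A toy sketch of the compartmental structure such a model rests on, reduced to two iron pools with ferroportin-scaled release. All rate constants and initial values are illustrative; the published model has four compartments plus explicit regulation and drug pharmacokinetics.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy two-compartment sketch of a whole-body iron model's structure:
# plasma iron and iron stored in the reticulo-endothelial system (RES),
# with FPN-mediated release. Rates are illustrative only.

def iron_odes(t, y, k_uptake, k_release, fpn_level):
    plasma, res = y
    d_plasma = fpn_level * k_release * res - k_uptake * plasma
    d_res = k_uptake * plasma - fpn_level * k_release * res
    return [d_plasma, d_res]

# Simulate with FPN downregulated to 10% of baseline (0.1), as in the
# 90%-downregulation scenario predicted above.
sol = solve_ivp(iron_odes, (0, 30), y0=[3.0, 600.0],
                args=(0.8, 0.5, 0.1))
print("plasma iron after 30 days:", sol.y[0, -1])
print("RES iron store after 30 days:", sol.y[1, -1])
```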

  13. Dangerous (toxic) atmospheres in UK wood pellet and wood chip fuel storage.

    PubMed

    Simpson, Andrew T; Hemingway, Michael A; Seymour, Cliff

    2016-09-01

    There is growing use of wood pellet and wood chip boilers in the UK. Elsewhere, fatalities have been reported, caused by carbon monoxide poisoning following entry into wood pellet storage areas. The aim of this work was to obtain information on how safely these two fuels are being stored in the UK. Site visits were made to six small-scale boiler systems and one large-scale pellet warehouse to assess storage practice, risk management systems and controls, user knowledge, and potential for exposure to dangerous atmospheres. Real-time measurements were made of gases in the store rooms and during laboratory tests on pellets and chips. The volatile organic compounds (VOCs) emitted and the microbiological content of the fuel were also determined. Knowledge of the hazards associated with these fuels, including confined space entry, was found to be limited at the smaller sites, but greater at the large pellet warehouse. There has been limited risk communication between companies supplying and maintaining boilers, those manufacturing and supplying fuel, and users. Risk is controlled by restricting access to the store rooms with locked entries; some store rooms have warning signs and carbon monoxide alarms. Nevertheless, some store rooms are accessed for inspection and maintenance. Laboratory tests showed that potentially dangerous atmospheres of carbon monoxide and carbon dioxide, with depleted levels of oxygen, may be generated by these fuels, but this was not observed at the sites visited. Unplanned ventilation within store rooms was thought to be reducing the build-up of dangerous atmospheres. Microbiological contamination was confined to wood chips.

  14. Dangerous (toxic) atmospheres in UK wood pellet and wood chip fuel storage

    PubMed Central

    Simpson, Andrew T.; Hemingway, Michael A.; Seymour, Cliff

    2016-01-01

    There is growing use of wood pellet and wood chip boilers in the UK. Elsewhere, fatalities have been reported, caused by carbon monoxide poisoning following entry into wood pellet storage areas. The aim of this work was to obtain information on how safely these two fuels are being stored in the UK. Site visits were made to six small-scale boiler systems and one large-scale pellet warehouse to assess storage practice, risk management systems and controls, user knowledge, and potential for exposure to dangerous atmospheres. Real-time measurements were made of gases in the store rooms and during laboratory tests on pellets and chips. The volatile organic compounds (VOCs) emitted and the microbiological content of the fuel were also determined. Knowledge of the hazards associated with these fuels, including confined space entry, was found to be limited at the smaller sites, but greater at the large pellet warehouse. There has been limited risk communication between companies supplying and maintaining boilers, those manufacturing and supplying fuel, and users. Risk is controlled by restricting access to the store rooms with locked entries; some store rooms have warning signs and carbon monoxide alarms. Nevertheless, some store rooms are accessed for inspection and maintenance. Laboratory tests showed that potentially dangerous atmospheres of carbon monoxide and carbon dioxide, with depleted levels of oxygen, may be generated by these fuels, but this was not observed at the sites visited. Unplanned ventilation within store rooms was thought to be reducing the build-up of dangerous atmospheres. Microbiological contamination was confined to wood chips. PMID:27030057

  15. Particle backscatter and relative humidity measured across cirrus clouds and comparison with microphysical cirrus modelling

    NASA Astrophysics Data System (ADS)

    Brabec, M.; Wienhold, F. G.; Luo, B. P.; Vömel, H.; Immler, F.; Steiner, P.; Hausammann, E.; Weers, U.; Peter, T.

    2012-10-01

    Advanced measurement and modelling techniques are employed to estimate the partitioning of atmospheric water between the gas phase and the condensed phase in and around cirrus clouds, and thus to identify in-cloud and out-of-cloud supersaturations with respect to ice. In November 2008 the newly developed balloon-borne backscatter sonde COBALD (Compact Optical Backscatter and AerosoL Detector) was flown 14 times together with a CFH (Cryogenic Frost point Hygrometer) from Lindenberg, Germany (52° N, 14° E). The case discussed here in detail shows two cirrus layers with in-cloud relative humidities with respect to ice between 50% and 130%. Global operational analysis data of ECMWF (roughly 1° × 1° horizontal and 1 km vertical resolution, 6-hourly stored fields) fail to represent ice water contents and relative humidities. Conversely, regional COSMO-7 forecasts (6.6 km × 6.6 km, 5-min stored fields) capture the measured humidities and cloud positions remarkably well. The main difference between the ECMWF and COSMO data is the resolution of the small-scale vertical features responsible for cirrus formation. Nevertheless, ice water contents in COSMO-7 are still off by factors of 2-10, likely reflecting limitations in COSMO's ice-phase bulk scheme. Significant improvements can be achieved by comprehensive size-resolved microphysical and optical modelling along backward trajectories based on COSMO-7 wind and temperature fields, which allows accurate computation of humidities, homogeneous ice nucleation, the resulting ice particle size distributions, and backscatter ratios at the COBALD wavelengths. However, only by superimposing small-scale temperature fluctuations, which remain unresolved by the numerical weather prediction models, can we obtain satisfactory agreement with the observations and reconcile the measured in-cloud non-equilibrium humidities with conventional ice cloud microphysics. Conversely, the model-data comparison provides no evidence that additional changes to ice-cloud microphysics - such as heterogeneous nucleation or changing the water vapour accommodation coefficient on ice - are required.

  16. Vesicle capture, not delivery, scales up neuropeptide storage in neuroendocrine terminals.

    PubMed

    Bulgari, Dinara; Zhou, Chaoming; Hewes, Randall S; Deitcher, David L; Levitan, Edwin S

    2014-03-04

    Neurons vary in their capacity to produce, store, and release neuropeptides packaged in dense-core vesicles (DCVs). Specifically, neurons used for cotransmission have terminals that contain few DCVs and many small synaptic vesicles, whereas neuroendocrine neuron terminals contain many DCVs. Although the mechanistic basis for presynaptic variation is unknown, past research demonstrated transcriptional control of neuropeptide synthesis suggesting that supply from the soma limits presynaptic neuropeptide accumulation. Here neuropeptide release is shown to scale with presynaptic neuropeptide stores in identified Drosophila cotransmitting and neuroendocrine terminals. However, the dramatic difference in DCV number in these terminals occurs with similar anterograde axonal transport and DCV half-lives. Thus, differences in presynaptic neuropeptide stores are not explained by DCV delivery from the soma or turnover. Instead, greater neuropeptide accumulation in neuroendocrine terminals is promoted by dramatically more efficient presynaptic DCV capture. Greater capture comes with tradeoffs, however, as fewer uncaptured DCVs are available to populate distal boutons and replenish neuropeptide stores following release. Finally, expression of the Dimmed transcription factor in cotransmitting neurons increases presynaptic DCV capture. Therefore, DCV capture in the terminal is genetically controlled and determines neuron-specific variation in peptidergic function.

  17. Vesicle capture, not delivery, scales up neuropeptide storage in neuroendocrine terminals

    PubMed Central

    Bulgari, Dinara; Zhou, Chaoming; Hewes, Randall S.; Deitcher, David L.; Levitan, Edwin S.

    2014-01-01

    Neurons vary in their capacity to produce, store, and release neuropeptides packaged in dense-core vesicles (DCVs). Specifically, neurons used for cotransmission have terminals that contain few DCVs and many small synaptic vesicles, whereas neuroendocrine neuron terminals contain many DCVs. Although the mechanistic basis for presynaptic variation is unknown, past research demonstrated transcriptional control of neuropeptide synthesis suggesting that supply from the soma limits presynaptic neuropeptide accumulation. Here neuropeptide release is shown to scale with presynaptic neuropeptide stores in identified Drosophila cotransmitting and neuroendocrine terminals. However, the dramatic difference in DCV number in these terminals occurs with similar anterograde axonal transport and DCV half-lives. Thus, differences in presynaptic neuropeptide stores are not explained by DCV delivery from the soma or turnover. Instead, greater neuropeptide accumulation in neuroendocrine terminals is promoted by dramatically more efficient presynaptic DCV capture. Greater capture comes with tradeoffs, however, as fewer uncaptured DCVs are available to populate distal boutons and replenish neuropeptide stores following release. Finally, expression of the Dimmed transcription factor in cotransmitting neurons increases presynaptic DCV capture. Therefore, DCV capture in the terminal is genetically controlled and determines neuron-specific variation in peptidergic function. PMID:24550480

  18. Computationally Efficient Modeling and Simulation of Large Scale Systems

    NASA Technical Reports Server (NTRS)

    Jain, Jitesh (Inventor); Koh, Cheng-Kok (Inventor); Balakrishnan, Vankataramanan (Inventor); Cauley, Stephen F (Inventor); Li, Hong (Inventor)

    2014-01-01

    A system for simulating operation of a VLSI interconnect structure having capacitive and inductive coupling between its nodes, including a processor and a memory. The processor is configured to obtain a matrix X and a matrix Y containing different combinations of passive circuit element values for the interconnect structure, the element values for each matrix including inductance L and inverse capacitance P; to obtain an adjacency matrix A associated with the interconnect structure; to store the matrices X, Y, and A in the memory; and to perform numerical integration to solve first and second equations.
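    The patent abstract leaves the two equations unspecified; a generic illustration of the kind of numerical integration involved is trapezoidal time-stepping of a linear state-space system x' = Ax + Bu, which is standard for RLC interconnect simulation. The matrices and input below are illustrative only, not the patent's X, Y, and A.

```python
import numpy as np

# Generic sketch: trapezoidal (Crank-Nicolson) integration of a linear
# state-space interconnect model x' = A x + B u. Not the patented method.

def trapezoidal_step(A, B, x, u_now, u_next, h):
    """Advance x' = A x + B u by one step of size h."""
    n = A.shape[0]
    lhs = np.eye(n) - (h / 2) * A
    rhs = (np.eye(n) + (h / 2) * A) @ x + (h / 2) * B @ (u_now + u_next)
    return np.linalg.solve(lhs, rhs)

# Two coupled RC nodes driven by a unit step input.
A = np.array([[-2.0, 1.0], [1.0, -2.0]])
B = np.array([[1.0], [0.0]])
x = np.zeros(2)
for _ in range(100):
    x = trapezoidal_step(A, B, x, np.array([1.0]), np.array([1.0]), h=0.01)
print("node voltages:", x)
```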

  19. Weather prediction using a genetic memory

    NASA Technical Reports Server (NTRS)

    Rogers, David

    1990-01-01

    Kanerva's sparse distributed memory (SDM) is an associative memory model based on the mathematical properties of high-dimensional binary address spaces. Holland's genetic algorithms are a search technique for high-dimensional spaces inspired by the evolutionary processes of DNA. Genetic Memory is a hybrid of the two systems, in which the memory uses a genetic algorithm to dynamically reconfigure its physical storage locations to reflect correlations between the stored addresses and data. This architecture is designed to maximize the ability of the system to scale up to handle real-world problems.
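    A minimal sketch of the SDM read/write cycle, without the genetic reconfiguration of storage locations: data are written into counters at all hard locations within a Hamming radius of the address and read back by summing and thresholding. All sizes are illustrative.

```python
import numpy as np

# Minimal sparse distributed memory: random hard locations, radius-based
# activation, counter-based storage. The genetic reconfiguration described
# above is omitted for brevity.

rng = np.random.default_rng(1)
N, M, RADIUS = 256, 1000, 112      # address width, hard locations, activation radius
hard_addresses = rng.integers(0, 2, size=(M, N))
counters = np.zeros((M, N), dtype=int)

def active(address):
    """Boolean mask of hard locations within RADIUS of the address."""
    return np.count_nonzero(hard_addresses != address, axis=1) <= RADIUS

def write(address, data):
    counters[active(address)] += 2 * data - 1    # +1 for 1 bits, -1 for 0 bits

def read(address):
    return (counters[active(address)].sum(axis=0) >= 0).astype(int)

pattern = rng.integers(0, 2, size=N)
write(pattern, pattern)                          # autoassociative storage
noisy = pattern.copy(); noisy[:20] ^= 1          # flip 20 bits of the cue
print("bits recovered:", np.count_nonzero(read(noisy) == pattern), "of", N)
```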

  20. Atomic-scale studies on the effect of boundary coherency on stability in twinned Cu

    NASA Astrophysics Data System (ADS)

    Niu, Rongmei; Han, Ke; Su, Yi-Feng; Salters, Vincent J.

    2014-01-01

    The stored energy and hardness of nanotwinned (NT) Cu are governed by the interaction between dislocations and {111} twin boundaries (TBs), studied here at atomic scale by high-angle annular dark-field scanning transmission electron microscopy. The lack of mobile dislocations at coherent TBs (CTBs) gives as-deposited NT Cu a rare combination of stability and hardness. The introduction of numerous incoherent TBs (ITBs) reduces both stability and hardness. While storing more energy in its ITBs than in its CTBs, deformed NT Cu also exhibits high dislocation density and TB mobility, and therefore an increased driving force for recovery, coarsening, and recrystallization.

  1. A Computationally Efficient Parallel Levenberg-Marquardt Algorithm for Large-Scale Big-Data Inversion

    NASA Astrophysics Data System (ADS)

    Lin, Y.; O'Malley, D.; Vesselinov, V. V.

    2015-12-01

    Inverse modeling seeks model parameters given a set of observed state variables. However, for many practical problems, because the observed data sets are often large and the model parameters numerous, conventional methods for solving the inverse problem can be computationally expensive. We have developed a new, computationally efficient Levenberg-Marquardt method for large-scale inverse modeling. Levenberg-Marquardt methods require the solution of a dense linear system of equations, which can be prohibitively expensive to compute for large-scale inverse problems. Our novel method projects the original large-scale linear problem down to a Krylov subspace, such that the dimensionality of the measurements can be significantly reduced. Furthermore, instead of solving the linear system anew for every Levenberg-Marquardt damping parameter, we store the Krylov subspace computed when solving for the first damping parameter and recycle it for all the following damping parameters. The efficiency of our new inverse modeling algorithm is significantly improved by these computational techniques. We apply the method to invert for a random transmissivity field. Our algorithm is fast enough to solve for the distributed model parameters (transmissivity) at each computational node in the model domain. The inversion is also aided by the use of regularization techniques. The algorithm is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). Julia is an advanced high-level scientific programming language that allows for efficient memory management and utilization of high-performance computational resources. Compared with a Levenberg-Marquardt method using standard linear inversion techniques, our method yields a speed-up ratio of 15 in a multi-core computational environment and a speed-up ratio of 45 in a single-core computational environment. Our new inverse modeling method is therefore a powerful tool for large-scale applications.
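    The damping-parameter recycling can be illustrated with a dense factorization standing in for the Krylov projection: factor the Jacobian once, then obtain the Levenberg-Marquardt step for every damping value at negligible extra cost. This is a sketch of the idea, not the MADS implementation.

```python
import numpy as np

# Solve (J^T J + lam*I) dx = J^T r for many damping values lam while
# factoring J only once: dx(lam) = V diag(s / (s^2 + lam)) U^T r.
# A dense SVD stands in for the paper's Krylov-subspace projection.

def lm_steps_for_all_dampings(J, r, dampings):
    U, s, Vt = np.linalg.svd(J, full_matrices=False)   # factor once
    Utr = U.T @ r
    return [Vt.T @ (s / (s**2 + lam) * Utr) for lam in dampings]

rng = np.random.default_rng(0)
J = rng.normal(size=(200, 50))   # Jacobian of residuals w.r.t. parameters
r = rng.normal(size=200)         # residual vector
steps = lm_steps_for_all_dampings(J, r, dampings=[1e-3, 1e-1, 1e1])
print([float(np.linalg.norm(dx)) for dx in steps])     # step shrinks as lam grows
```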

  2. Numerical modelling and experimental study of liquid evaporation during gel formation

    NASA Astrophysics Data System (ADS)

    Pokusaev, B. G.; Khramtsov, D. P.

    2017-11-01

    Gels are promising materials in biotechnology and medicine as a medium for storing cells for bioprinting applications. A gel is a two-phase system consisting of a solid medium and a liquid phase. Understanding the evolution of gel structure and gel aging during liquid evaporation is a crucial step in developing new additive bioprinting technologies. A numerical and experimental study of liquid evaporation was performed. In the experimental study, evaporation from an agarose gel layer on a Petri dish was observed, and the mass change was measured with electronic scales. The numerical model was based on the smoothed particle hydrodynamics method: the gel was represented as a solid-liquid system, and liquid evaporation was modelled through capillary forces and heat transfer. Comparison of the experimental data and numerical results demonstrated that the model can adequately represent the evaporation process in agarose gel.

  3. A Simple Technique for Securing Data at Rest Stored in a Computing Cloud

    NASA Astrophysics Data System (ADS)

    Sedayao, Jeff; Su, Steven; Ma, Xiaohao; Jiang, Minghao; Miao, Kai

    "Cloud Computing" offers many potential benefits, including cost savings, the ability to deploy applications and services quickly, and the ease of scaling those application and services once they are deployed. A key barrier for enterprise adoption is the confidentiality of data stored on Cloud Computing Infrastructure. Our simple technique implemented with Open Source software solves this problem by using public key encryption to render stored data at rest unreadable by unauthorized personnel, including system administrators of the cloud computing service on which the data is stored. We validate our approach on a network measurement system implemented on PlanetLab. We then use it on a service where confidentiality is critical - a scanning application that validates external firewall implementations.

  4. Semantic Web repositories for genomics data using the eXframe platform.

    PubMed

    Merrill, Emily; Corlosquet, Stéphane; Ciccarese, Paolo; Clark, Tim; Das, Sudeshna

    2014-01-01

    With the advent of inexpensive assay technologies, there has been unprecedented growth in genomics data as well as in the number of databases in which it is stored. In these databases, sample annotation using ontologies and controlled vocabularies is becoming more common. However, the annotation is rarely available as Linked Data, in a machine-readable format, or for standardized queries using SPARQL. This makes large-scale reuse, or integration with other knowledge bases, very difficult. To address this challenge, we have developed the second generation of our eXframe platform, a reusable framework for creating online repositories of genomics experiments. This second-generation platform now publishes Semantic Web data. To accomplish this, we created an experiment model that covers provenance, citations, external links, assays, biomaterials used in the experiment, and the data collected during the process. The elements of our model are mapped to classes and properties from various established biomedical ontologies. Resource Description Framework (RDF) data is automatically produced using these mappings and indexed in an RDF store with a built-in SPARQL Protocol and RDF Query Language (SPARQL) endpoint. Using the open-source eXframe software, institutions and laboratories can create Semantic Web repositories of their experiments, integrate them with heterogeneous resources, and make them interoperable with the vast Semantic Web of biomedical knowledge.
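    A minimal sketch of the publish-and-query pattern with rdflib: experiment metadata become RDF triples and are retrieved through SPARQL. The ad hoc namespace and properties below are illustrative; eXframe maps to established biomedical ontologies instead.

```python
from rdflib import Graph, Literal, Namespace, RDF, URIRef

# Publish an experiment as RDF triples, then query with SPARQL.
# The vocabulary is an illustrative stand-in for real biomedical ontologies.

EX = Namespace("http://example.org/genomics/")
g = Graph()
experiment = URIRef(EX["experiment/42"])
g.add((experiment, RDF.type, EX.Experiment))
g.add((experiment, EX.usesBiomaterial, EX["biomaterial/liver-sample-7"]))
g.add((experiment, EX.hasAssay, EX["assay/rna-seq-1"]))
g.add((experiment, EX.title, Literal("Expression profiling of treated samples")))

# Standardised retrieval through a SPARQL query.
results = g.query("""
    PREFIX ex: <http://example.org/genomics/>
    SELECT ?exp ?assay WHERE { ?exp a ex:Experiment ; ex:hasAssay ?assay . }
""")
for row in results:
    print(row.exp, row.assay)
```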

  5. Shallow aquifer storage and recovery (SASR): Initial findings from the Willamette Basin, Oregon

    NASA Astrophysics Data System (ADS)

    Neumann, P.; Haggerty, R.

    2012-12-01

    A novel mode of shallow aquifer management could increase the volumetric potential and distribution of groundwater storage. We refer to this mode as shallow aquifer storage and recovery (SASR) and gauge its potential as a freshwater storage tool. By this mode, water is stored in hydraulically connected aquifers with minimal impact to surface water resources. Basin-scale numerical modeling provides a linkage between storage efficiency and hydrogeological parameters, which in turn guides rulemaking for how and where water can be stored. Increased understanding of regional groundwater-surface water interactions is vital to effective SASR implementation. In this study we (1) use a calibrated model of the central Willamette Basin (CWB), Oregon to quantify SASR storage efficiency at 30 locations; (2) estimate SASR volumetric storage potential throughout the CWB based on these results and pertinent hydrogeological parameters; and (3) introduce a methodology for management of SASR by such parameters. Of 3 shallow, sedimentary aquifers in the CWB, we find the moderately conductive, semi-confined, middle sedimentary unit (MSU) to be most efficient for SASR. We estimate that users overlying 80% of the area in this aquifer could store injected water with greater than 80% efficiency, and find efficiencies of up to 95%. As a function of local production well yields, we estimate a maximum annual volumetric storage potential of 30 million m3 using SASR in the MSU. This volume constitutes roughly 9% of the current estimated summer pumpage in the Willamette basin at large. The dimensionless quantity lag #—calculated using modeled specific capacity, distance to nearest in-layer stream boundary, and injection duration—exhibits relatively high correlation to SASR storage efficiency at potential locations in the CWB. This correlation suggests that basic field measurements could guide SASR as an efficient shallow aquifer storage tool.

  6. Flight Dynamic Simulation of Fighter In the Asymmetric External Store Release Process

    NASA Astrophysics Data System (ADS)

    Safi’i, Imam; Arifianto, Ony; Nurohman, Chandra

    2018-04-01

    In fighter design, it is important to evaluate and analyze the flight dynamics of the aircraft early in the development process. One such case is the dynamics of the external-store release process. A simulation tool can be used to analyze the fighter/external-store system's dynamics in the preliminary design stage. This paper reports the flight dynamics of the Jet Fighter Experiment (JF-1E) in the asymmetric release of an Advanced Medium-Range Air-to-Air Missile (AMRAAM) through simulations. The JF-1E and AIM-120 AMRAAM models are built using the Advanced Aircraft Analysis (AAA) and Missile Datcom software. With these tools, the aerodynamic stability and control derivatives can be obtained and used to model the dynamic characteristics of the fighter and the external store. The dynamic system is modeled using MATLAB/Simulink, with which both the fighter/external-store integration and the external-store release process are simulated and the dynamics of the system can be analyzed.

  7. The impact of Indonesian peatland degradation on downstream marine ecosystems and the global carbon cycle.

    PubMed

    Abrams, Jesse F; Hohn, Sönke; Rixen, Tim; Baum, Antje; Merico, Agostino

    2016-01-01

    Tropical peatlands are among the most space-efficient stores of carbon on Earth, containing approximately 89 Gt C. Of this, 57 Gt (65%) is stored in Indonesian peatlands. Large-scale exploitation of land, including deforestation and drainage for the establishment of oil palm plantations, is changing the carbon balance of Indonesian peatlands, turning them from a natural sink into a source via outgassing of CO2 to the atmosphere and leakage of dissolved organic carbon (DOC) into the coastal ocean. The impacts of this perturbation on the coastal environment and at the global scale are largely unknown. Here, we evaluate the downstream effects of released Indonesian peat carbon on coastal ecosystems and on the global carbon cycle. We use a biogeochemical box model in combination with novel and literature observations to investigate the impact of different carbon emission scenarios on the combined ocean-atmosphere system. The release of all carbon stored in the Indonesian peat pool, considered as a worst-case scenario, would increase atmospheric pCO2 by 8 ppm to 15 ppm within the next 200 years. The expected impact on the Java Sea ecosystems is most significant on the short term (over a few hundred years) and is characterized by increases of 3.3% in phytoplankton biomass and 32% in seagrass biomass, and a 5% decrease in coral biomass. On the long term, however, the coastal ecosystems will recover to reach near pre-excursion conditions. Our results suggest that the ultimate fate of the peat carbon is in the deep ocean, with 69% of it ending up in the deep dissolved inorganic carbon (DIC) pool after 1000 years, but the effects on the global ocean carbonate chemistry will be marginal. © 2015 John Wiley & Sons Ltd.

  8. An open source web interface for linking models to infrastructure system databases

    NASA Astrophysics Data System (ADS)

    Knox, S.; Mohamed, K.; Harou, J. J.; Rheinheimer, D. E.; Medellin-Azuara, J.; Meier, P.; Tilmant, A.; Rosenberg, D. E.

    2016-12-01

    Models of networked engineered resource systems such as water or energy systems are often built collaboratively by developers from different domains working at different locations. These models can be linked to large-scale real-world databases, and they are constantly being improved and extended. As the development and application of these models becomes more sophisticated, and the computing power required for simulations and/or optimisations increases, so has the need for online services and tools which enable the efficient development and deployment of these models. Hydra Platform is an open-source, web-based data management system which allows modellers of network-based models to remotely store network topology and associated data in a generalised manner, allowing it to serve multiple disciplines. Hydra Platform exposes a JSON web API that allows external programs (referred to as 'Apps') to interact with its stored networks and perform actions such as importing data, running models, or exporting the networks to different formats. Hydra Platform supports multiple users accessing the same network and has a suite of functions for managing users and data. We present ongoing development in Hydra Platform, the Hydra Web User Interface, through which users can collaboratively manage network data and models in a web browser. The web interface allows multiple users to graphically access, edit and share their networks, run apps and view results. Through apps, which are located on the server, the web interface can give users access to external data sources and models without the need to install or configure any software. This also ensures model results can be reproduced by removing platform or version dependence. Managing data and deploying models via the web interface provides a way for multiple modellers to collaboratively manage data, deploy and monitor model runs, and analyse results.
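    The App interaction pattern might look like the following sketch. The endpoint paths, login payload, and network schema here are hypothetical illustrations of a JSON web API, not Hydra Platform's actual interface.

```python
import requests

# Hypothetical sketch of an external "App" storing a network topology
# through a JSON web API. URLs and payload shapes are illustrative only.

server = "https://hydra.example.org/api"
session = requests.Session()
session.post(f"{server}/login", json={"username": "modeller", "password": "..."})

network = {
    "name": "demo water network",
    "nodes": [{"name": "reservoir"}, {"name": "city"}],
    "links": [{"name": "main canal", "node_1": "reservoir", "node_2": "city"}],
}
response = session.post(f"{server}/networks", json=network)
network_id = response.json()["id"]      # server-assigned identifier (assumed)
print("stored network", network_id)
```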

  9. Digital Rocks Portal: a sustainable platform for imaged dataset sharing, translation and automated analysis

    NASA Astrophysics Data System (ADS)

    Prodanovic, M.; Esteva, M.; Hanlon, M.; Nanda, G.; Agarwal, P.

    2015-12-01

    Recent advances in imaging have provided a wealth of 3D datasets that reveal pore space microstructure (nm to cm length scales) and allow investigation of nonlinear flow and mechanical phenomena from first principles using numerical approaches. This framework has popularly been called "digital rock physics". Researchers, however, have trouble storing and sharing the datasets, both because of their size and because of the lack of standardized image types and associated metadata for volumetric datasets. This impedes scientific cross-validation of the numerical approaches that characterize large-scale porous media properties, as well as development of the multiscale approaches required for correct upscaling. A single research group typically specializes in one imaging modality and/or related modeling on a single length scale, and the lack of data-sharing infrastructure makes it difficult to integrate different length scales. We developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of geoscience and engineering researchers not necessarily trained in computer science or data analysis. Once widely accepted, the repository will jumpstart productivity and enable scientific inquiry and engineering decisions founded on a data-driven basis. This is the first repository of its kind. We show initial results on incorporating essential software tools and pipelines that make it easier for researchers to store and reuse data, and for educators to quickly visualize and illustrate concepts to a wide audience. For data sustainability and continuous access, the portal is implemented within the reliable, 24/7-maintained High Performance Computing infrastructure supported by the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. Long-term storage is provided through the University of Texas System Research Cyberinfrastructure initiative.

  10. Fleet DNA Phase 1 Refinement & Phase 2 Implementation; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, Kenneth; Duran, Adam

    2015-06-11

    Fleet DNA acts as a secure data warehouse for medium- and heavy-duty vehicle data. It demonstrates that vehicle drive cycle data can be collected and stored for large-scale analysis and modeling applications. The data serve as a real-world data source for model development and validation. Storage of the results of past/present/future data collection efforts improves analysis efficiency through pooling of shared data and provides the opportunity for 'big data' type analyses. Fleet DNA shows it is possible to develop a common database structure that can store/analyze/report on data sourced from multiple parties, each with unique data formats/types. Data filtration and normalization algorithms developed for the project allow for a wide range of data types and inputs, expanding the project's potential. Fleet DNA demonstrates the power of integrating Big Data with existing and future tools and analyses: it provides an enhanced understanding and education of users, users can explore greenhouse gases and economic opportunities via AFLEET and ADOPT modeling, drive cycles can be characterized and visualized using DRIVE, high-level vehicle modeling can be performed using real-world drive cycles via FASTSim, and data reporting through the Fleet DNA Phase 1 and 2 websites provides external users access to analysis results and gives them the opportunity to explore on their own.

  11. Factors associated with supermarket and convenience store closure: a discrete time spatial survival modelling approach.

    PubMed

    Warren, Joshua L; Gordon-Larsen, Penny

    2018-06-01

    While there is a literature on the distribution of food stores across geographic and social space, much of this research uses cross-sectional data. Analyses attempting to understand whether the availability of stores across neighborhoods is associated with diet and/or health outcomes are limited by a lack of understanding of factors that shape the emergence of new stores and the closure of others. We used quarterly data on supermarket and convenience store locations spanning seven years (2006-2012) and tract-level census data in four US cities: Birmingham, Alabama; Chicago, Illinois; Minneapolis, Minnesota; San Francisco, California. A spatial discrete-time survival model was used to identify factors associated with an earlier and/or later closure time of a store. Sales volume was typically the strongest indicator of store survival. We identified heterogeneity in the association between tract-level poverty and racial composition with respect to store survival. Stores in high poverty, non-White tracts were often at a disadvantage in terms of survival length. The observed patterns of store survival varied by some of the same neighborhood sociodemographic factors associated with lifestyle and health outcomes, which could lead to confusion in interpretation in studies of the estimated effects of introduction of food stores into neighborhoods on health.
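    A hedged sketch of the discrete-time survival setup: expanding each store into one row per quarter at risk turns hazard estimation into a logistic regression on the store-quarter file. The simulated covariates below stand in for sales volume and tract poverty, and the spatial random effects of the actual model are omitted.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Discrete-time survival as logistic regression on store-quarter rows:
# each store contributes one row per quarter it remains open, with
# closure (0/1) as the outcome. Data are simulated for illustration.

rng = np.random.default_rng(0)
n_stores, n_quarters = 500, 28               # 2006-2012, quarterly
rows = []
for store in range(n_stores):
    poverty = rng.uniform(0, 0.5)            # tract poverty rate
    log_sales = rng.normal(10, 1)            # store sales volume
    for quarter in range(n_quarters):
        hazard = 1 / (1 + np.exp(-(-4.0 + 3.0 * poverty - 0.2 * (log_sales - 10))))
        closed = rng.random() < hazard
        rows.append((store, quarter, poverty, log_sales, int(closed)))
        if closed:
            break                            # store exits the risk set

df = pd.DataFrame(rows, columns=["store", "quarter", "poverty", "log_sales", "closed"])
X = sm.add_constant(df[["quarter", "poverty", "log_sales"]])
fit = sm.Logit(df["closed"], X).fit(disp=0)
print(fit.params)                            # higher sales -> lower closure hazard
```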

  12. Investigating the Relationships between Canopy Characteristics and Snow Depth Distribution at Fine Scales: Preliminary Results from the SnowEX TLS Campaign

    NASA Astrophysics Data System (ADS)

    Glenn, N. F.; Uhlmann, Z.; Spaete, L.; Tennant, C.; Hiemstra, C. A.; McNamara, J.

    2017-12-01

    Predicting changes in forested seasonal snowpacks under altered climate scenarios is one of the most pressing hydrologic challenges facing today's society. Airborne- and satellite-based remote sensing methods hold the potential to transform measurements of terrestrial water stores in snowpack, improve process representations of snowpack accumulation and ablation, and to generate high quality predictions that inform potential strategies to better manage water resources. While the effects of forest on snowpack are well documented, many of the fine-scale processes influenced by the forest-canopy are not directly accounted for because most snow models don't explicitly represent canopy structure and canopy heterogeneity. This study investigates the influence of forest canopy on snowpack distribution at fine scales and quantifies the influence of canopy heterogeneity on snowpack accumulation and ablation processes. We use terrestrial laser scanning (TLS) data collected during the SnowEX campaign to discover how the relationships between canopy and snow distributions change across scales. Our sample scales range from individual trees to patches of trees across the Grand Mesa, CO, SnowEx site.

  13. Investigating energy-saving potentials in the cloud.

    PubMed

    Lee, Da-Sheng

    2014-02-20

    Collecting webpage messages can serve as a sensor for investigating the energy-saving potential of buildings. Focusing on stores, a cloud sensor system is developed to collect data and determine their energy-saving potential. The owner of a store under investigation must register online and report the store address, area, and the customer ID number on the electric meter. The cloud sensor system automatically surveys the energy usage records by connecting to the power company website and calculating the energy use index (EUI) of the store. Other data include the chain-store check, company capital, location price, and the influence of weather conditions on the store; even the exposure frequency of the store under investigation, collected online, may impact its energy usage. After collecting data from numerous stores, a multi-dimensional data array is constructed to determine energy-saving potential by identifying stores with similarity conditions. Similarity conditions refer to analyzed results indicating that two stores have similar capital, business scale, weather conditions, and exposure frequency on the web. By calculating the EUI difference or pure technical efficiency of the stores, the energy-saving potential is determined. In this study, a real case study is performed. An 8-dimensional (8D) data array is constructed by surveying web data related to 67 stores. This study then investigated the saving potential of 33 of the stores using site visits and employed the cloud sensor system to determine the saving potential. The case study results show good agreement between the data obtained by the site visits and the cloud investigation, with errors within 4.17%. Among the 33 samples, eight stores have low saving potentials of less than 5%. The developed cloud sensor successfully identifies them as having low saving potential and avoids wasting money on site visits.
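    A sketch of the two computations described above, with illustrative column names and data: the EUI is annual energy use divided by floor area, and saving potential is estimated as the EUI gap to the most similar peer store under a nearest-neighbour similarity rule (an assumed stand-in for the paper's similarity conditions).

```python
import numpy as np
import pandas as pd

# Compute each store's energy use index (EUI), then estimate saving
# potential as the EUI gap to its most similar peer. Data are illustrative.

stores = pd.DataFrame({
    "annual_kwh":   [120000, 95000, 180000, 60000],
    "area_m2":      [400,    350,   420,    180],
    "capital_musd": [2.0,    1.8,   2.1,    0.9],
    "web_hits":     [800,    750,   820,    300],
})
stores["eui"] = stores["annual_kwh"] / stores["area_m2"]   # kWh per m2 per year

# Similarity: nearest neighbour in normalised (capital, area, web exposure) space.
features = stores[["capital_musd", "area_m2", "web_hits"]]
z = (features - features.mean()) / features.std()
for i in range(len(stores)):
    dists = np.linalg.norm(z - z.iloc[i], axis=1)
    dists[i] = np.inf                                      # exclude the store itself
    peer = int(np.argmin(dists))
    potential = max(0.0, stores["eui"][i] - stores["eui"][peer]) / stores["eui"][i]
    print(f"store {i}: saving potential vs peer {peer}: {potential:.1%}")
```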

  14. Investigating Energy-Saving Potentials in the Cloud

    PubMed Central

    Lee, Da-Sheng

    2014-01-01

    Collecting webpage messages can serve as a sensor for investigating the energy-saving potential of buildings. Focusing on stores, a cloud sensor system is developed to collect data and determine their energy-saving potential. The owner of a store under investigation must register online and report the store address, area, and the customer ID number on the electric meter. The cloud sensor system automatically surveys the energy usage records by connecting to the power company website and calculating the energy use index (EUI) of the store. Other data include the chain-store check, company capital, location price, and the influence of weather conditions on the store; even the exposure frequency of the store under investigation, collected online, may impact its energy usage. After collecting data from numerous stores, a multi-dimensional data array is constructed to determine energy-saving potential by identifying stores with similarity conditions. Similarity conditions refer to analyzed results indicating that two stores have similar capital, business scale, weather conditions, and exposure frequency on the web. By calculating the EUI difference or pure technical efficiency of the stores, the energy-saving potential is determined. In this study, a real case study is performed. An 8-dimensional (8D) data array is constructed by surveying web data related to 67 stores. This study then investigated the saving potential of 33 of the stores using site visits and employed the cloud sensor system to determine the saving potential. The case study results show good agreement between the data obtained by the site visits and the cloud investigation, with errors within 4.17%. Among the 33 samples, eight stores have low saving potentials of less than 5%. The developed cloud sensor successfully identifies them as having low saving potential and avoids wasting money on site visits. PMID:24561405

  15. A balanced water layer concept for subglacial hydrology in large scale ice sheet models

    NASA Astrophysics Data System (ADS)

    Goeller, S.; Thoma, M.; Grosfeld, K.; Miller, H.

    2012-12-01

    There is currently no doubt about the existence of a widespread hydrological network under the Antarctic ice sheet, which lubricates the ice base and thus leads to increased ice velocities. Consequently, ice models should incorporate basal hydrology to obtain meaningful results for future ice dynamics and their contribution to global sea level rise. Here, we introduce the balanced water layer concept, covering two prominent subglacial hydrological features for ice sheet modeling on a continental scale: the evolution of subglacial lakes and balanced water fluxes. We couple it to the thermomechanical ice-flow model RIMBAY and apply it to a synthetic model domain inspired by the Gamburtsev Mountains, Antarctica. In our experiments we demonstrate the dynamic generation of subglacial lakes and their impact on the velocity field of the overlying ice sheet, resulting in a negative ice mass balance. Furthermore, we introduce an elementary parametrization of the coupling between water flux and basal sliding, and reveal the predominance of ice loss through the resulting ice streams over the stabilizing influence of less hydrologically active areas. We point out that established balance-flux schemes quantify these effects only partially, as they lack the ability to store subglacial water.

  16. Using dry spell dynamics of land surface temperature to evaluate large-scale model representation of soil moisture control on evapotranspiration

    NASA Astrophysics Data System (ADS)

    Taylor, Christopher M.; Harris, Philip P.; Gallego-Elvira, Belen; Folwell, Sonja S.

    2017-04-01

    The soil moisture control on the partition of land surface fluxes between sensible and latent heat is a key aspect of land surface models used within numerical weather prediction and climate models. As soils dry out, evapotranspiration (ET) decreases, and the excess energy is used to warm the atmosphere. Poor simulations of this dynamic process can affect predictions of mean, and in particular, extreme air temperatures, and can introduce substantial biases into projections of climate change at regional scales. The lack of reliable observations of fluxes and root zone soil moisture at spatial scales that atmospheric models use (typically from 1 to several hundred kilometres), coupled with spatial variability in vegetation and soil properties, makes it difficult to evaluate the flux partitioning at the model grid box scale. To overcome this problem, we have developed techniques to use Land Surface Temperature (LST) to evaluate models. As soils dry out, LST rises, so it can be used under certain circumstances as a proxy for the partition between sensible and latent heat. Moreover, long time series of reliable LST observations under clear skies are available globally at resolutions of the order of 1km. Models can exhibit large biases in seasonal mean LST for various reasons, including poor description of aerodynamic coupling, uncertainties in vegetation mapping, and errors in down-welling radiation. Rather than compare long-term average LST values with models, we focus on the dynamics of LST during dry spells, when negligible rain falls, and the soil moisture store is drying out. The rate of warming of the land surface, or, more precisely, its warming rate relative to the atmosphere, emphasises the impact of changes in soil moisture control on the surface energy balance. Here we show the application of this approach to model evaluation, with examples at continental and global scales. We can compare the behaviour of both fully-coupled land-atmosphere models, and land surface models forced by observed meteorology. This approach provides insight into a fundamental process that affects predictions on multiple time scales, and which has an important impact for society.
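    A sketch of the dry-spell diagnostic under stated assumptions (synthetic data, a 10-day minimum spell length): rain-free runs are identified in a daily series, and the warming rate of LST relative to air temperature is estimated over each run.

```python
import numpy as np
import pandas as pd

# Identify rain-free spells in a daily series and estimate the warming
# rate of land surface temperature (LST) relative to air temperature
# over each spell. Data and thresholds are illustrative.

rng = np.random.default_rng(2)
n = 120
rain = rng.choice([0.0, 5.0], size=n, p=[0.85, 0.15])     # mm/day
dry = rain == 0.0
spell_id = np.cumsum(~dry)                                # new id after each rain day
days_since_rain = pd.Series(dry).groupby(spell_id).cumsum().to_numpy()

t_air = 25 + rng.normal(0, 1, n)
lst = t_air + 0.05 * days_since_rain + rng.normal(0, 0.3, n)  # surface warms as soil dries

df = pd.DataFrame({"dry": dry, "spell": spell_id, "dT": lst - t_air})
for _, spell in df[df["dry"]].groupby("spell"):
    if len(spell) >= 10:                                  # dry spells of >= 10 days
        slope = np.polyfit(np.arange(len(spell)), spell["dT"], 1)[0]
        print(f"dry spell of {len(spell)} days: relative warming {slope:+.3f} K/day")
```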

  17. Neural network modeling of associative memory: Beyond the Hopfield model

    NASA Astrophysics Data System (ADS)

    Dasgupta, Chandan

    1992-07-01

    A number of neural network models, in which fixed-point and limit-cycle attractors of the underlying dynamics are used to store and associatively recall information, are described. In the first class of models, a hierarchical structure is used to store an exponentially large number of strongly correlated memories. The second class of models uses limit cycles to store and retrieve individual memories. A neurobiologically plausible network that generates low-amplitude periodic variations of activity, similar to the oscillations observed in electroencephalographic recordings, is also described. Results obtained from analytic and numerical studies of the properties of these networks are discussed.
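    A minimal fixed-point attractor sketch in the spirit of the models described: memories are stored in a Hebbian weight matrix and recalled from corrupted cues by iterating the dynamics. The sizes are illustrative, and the hierarchical and limit-cycle variants are not shown.

```python
import numpy as np

# Hopfield-type associative memory: Hebbian storage of P patterns in a
# weight matrix, associative recall from a corrupted cue.

rng = np.random.default_rng(3)
N, P = 200, 10
patterns = rng.choice([-1, 1], size=(P, N))

W = (patterns.T @ patterns) / N          # Hebbian learning rule
np.fill_diagonal(W, 0)                   # no self-connections

def recall(cue, steps=20):
    state = cue.copy()
    for _ in range(steps):
        state = np.sign(W @ state)       # synchronous update
        state[state == 0] = 1
    return state

cue = patterns[0].copy()
cue[:40] *= -1                           # corrupt 20% of the bits
overlap = (recall(cue) @ patterns[0]) / N
print(f"overlap with stored memory: {overlap:.2f}")   # ~1.0 indicates retrieval
```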

  18. Natural flood risk management in flashy headwater catchments: managing runoff peaks, timing, water quality and sediment regimes

    NASA Astrophysics Data System (ADS)

    Wilkinson, Mark; Addy, Steve; Ghimire, Sohan; Kenyon, Wendy; Nicholson, Alex; Quinn, Paul; Stutter, Marc; Watson, Helen

    2013-04-01

    Over the past decade many European catchments have experienced an unusually high number of flood events. A large number of these events are the result of intense rainfall in small headwater catchments which are dominated by surface runoff generation, resulting in flash flooding of local communities. Soil erosion and related water quality issues, among others, are typically associated with such rapid runoff generation. The hazard of flooding is increasing owing to the impacts of changing climatic patterns (including more intense summer storms), intensification of agriculture within rural catchments, and continued pressure to build on floodplains. Concurrently, the costs of constructing and maintaining traditional flood defences in small communities outweigh the potential benefits. Hence, there is a growing interest in more cost-effective natural approaches that also have multipurpose benefits in terms of sediment, water quality, and habitat creation. Many catchments in Europe are intensively farmed and there is great potential for agriculture to be part of the solution to flood risk management. Natural flood management (NFM) is the alteration, restoration or use of landscape features with the aim of reducing flood risk by slowing down, storing (and filtering) rapid surface runoff. NFM includes measures such as temporarily storing water in ponds/wetlands, increasing soil infiltration, planting trees on floodplains and within catchments, re-meandering, and wood placements in streams/ditches. In this presentation we highlight case studies from densely instrumented research sites across the UK (which could be typical of many European catchments) where NFM measures have been installed in small flashy catchments. The presentation will give an overview of how these measures function in these catchments and how other multiple benefits are being accrued. Study catchments include the headwater catchments of the Bowmont (3 to 8 km2) and Belford Burn (6 km2). These catchments are known for their rapid runoff generation and have downstream local communities at risk of flash flooding. In the Bowmont, NFM measures are currently being put in place to restore river bars and to store water more effectively on the floodplains during these flashy events. For example, apex engineered wood structures in the river channel and riparian zones are designed to trap sediment, and log bank-protection structures are being installed to stop bank erosion. Tree planting in the catchment is also taking place. In the Belford catchment, storage ponds and woody debris have been installed over the past five years to help reduce the flood risk to the village of Belford. A dense instrumentation network has provided data for analysis and modelling which shows evidence of local-scale flood peak reductions along with the collection of large amounts of sediment. A modelling study carried out (using a pond network model) during an intense summer storm showed that 30 small-scale pond features used in sequence could reduce the flood peak by ~35% at the local scale. Findings show that managing surface runoff and local ditch flow in headwater catchments is a cost-effective way of managing flashy catchments for flood risk and sediment control. Working with catchment stakeholders is vital: information given by the local community after flooding has been useful in placing NFM measures throughout the catchments, and involving the local communities in these projects and giving them access to the data and model outputs has helped to develop these projects further.

  19. The use of stored carbon reserves in growth of temperate tree roots and leaf buds: Analyses using radiocarbon measurements and modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaudinski, J.B.; Torn, M.S.; Riley, W.J.

    2009-02-01

    Characterizing the use of carbon (C) reserves in trees is important for understanding regional and global C cycles, stress responses, asynchrony between photosynthetic activity and growth demand, and isotopic exchanges in studies of tree physiology and ecosystem C cycling. Using an inadvertent, whole-ecosystem radiocarbon (¹⁴C) release in a temperate deciduous oak forest and numerical modeling, we estimated that the mean age of stored C used to grow both leaf buds and new roots is 0.7 years and that about 55% of new-root growth annually comes from stored C. Therefore, the calculated mean age of C used to grow new-root tissue is ~0.4 years. In short, new roots contain a lot of stored C, but it is young in age. Additionally, the type of structure used to model stored C input is important. Model structures that did not include storage, or that assumed stored and new C mixed well (within root or shoot tissues) before being used for root growth, did not fit the data nearly as well as when a distinct storage pool was used. Consistent with these whole-ecosystem labeling results, the mean age of C in new-root tissues determined using 'bomb-¹⁴C' at three additional forest sites in North America and Europe (one deciduous, two coniferous) was less than 1-2 years. The effect of stored reserves on estimated ages of fine roots is unlikely to be large in most natural-abundance isotope studies. However, models of root C dynamics should take stored reserves into account, particularly for pulse-labeling studies and fast-cycling roots (<1 year).

  20. Estimates of live-tree carbon stores in the Pacific Northwest are sensitive to model selection

    Treesearch

    Susanna L. Melson; Mark E. Harmon; Jeremy S. Fried; James B. Domingo

    2011-01-01

    Estimates of live-tree carbon stores are influenced by numerous uncertainties. One of them is model-selection uncertainty: one has to choose among multiple empirical equations and conversion factors that can be plausibly justified as locally applicable to calculate the carbon store from inventory measurements such as tree height and diameter at breast height (DBH)....

  1. Small Convenience Stores and the Local Food Environment: An Analysis of Resident Shopping Behavior Using Multilevel Modeling.

    PubMed

    Ruff, Ryan Richard; Akhund, Ali; Adjoian, Tamar

    2016-01-01

    Local food environments can influence the diet and health of individuals through food availability, proximity to retail stores, pricing, and promotion. This study focused on how small convenience stores, known in New York City as bodegas, influence resident shopping behavior and the food environment. Design: cross-sectional; 171 bodegas and 2118 shoppers were sampled. Setting: small convenience stores in New York City. Participants: any bodega shopper aged 18+ who purchased food or beverage from a participating store. Data collection consisted of a store assessment, a health and behavior survey given to exiting customers, and a bag check that recorded product information for all customer purchases. Descriptive statistics were generated for bodega store characteristics, shopper demographics, and purchase behavior. Multilevel models were used to assess the influence of product availability, placement, and advertising on consumer purchases of sugar-sweetened beverages (SSBs), water, and fruits and vegetables. Seventy-one percent of participants reported shopping at bodegas five or more times per week, and 35% reported purchasing all or most of their monthly food allotment at bodegas. Model results indicated that lower amounts of available fresh produce were significantly and independently associated with a higher likelihood of SSB purchases. A second, stratified multilevel model showed that the likelihood of purchasing an SSB increased with decreasing varieties of produce when produce was located at the front of the store. No significant effects were found for water placement and beverage advertising. Small convenience stores in New York City are an easily accessible source of foods and beverages. Bodegas may be suitable for interventions designed to improve food choice and diet.
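
    The shape of such a multilevel analysis (shoppers nested within stores, a binary SSB outcome, a store-level predictor) can be sketched as follows. This is a generic illustration on synthetic data, not the study's dataset or exact specification; statsmodels' Bayesian mixed GLM stands in for whatever software the authors used, and all names and coefficients are hypothetical.

```python
# Two-level model: shoppers (level 1) nested in stores (level 2), predicting
# the odds of an SSB purchase from store-level produce variety, with a random
# intercept per store. Synthetic data; names and effect sizes are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)
n_stores, n_per_store = 50, 12
store = np.repeat(np.arange(n_stores), n_per_store)
produce_variety = rng.poisson(6, n_stores)[store]      # store-level predictor
store_effect = rng.normal(0, 0.5, n_stores)[store]     # random intercept
logit = 0.8 - 0.15 * produce_variety + store_effect    # less produce -> more SSB
ssb = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({"ssb": ssb, "produce_variety": produce_variety, "store": store})
model = BinomialBayesMixedGLM.from_formula(
    "ssb ~ produce_variety", {"store": "0 + C(store)"}, df)
result = model.fit_vb()   # variational Bayes fit
print(result.summary())
```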

  2. A Balanced Memory Network

    PubMed Central

    Roudi, Yasser; Latham, Peter E

    2007-01-01

    A fundamental problem in neuroscience is understanding how working memory—the ability to store information at intermediate timescales, like tens of seconds—is implemented in realistic neuronal networks. The most likely candidate mechanism is the attractor network, and a great deal of effort has gone toward investigating it theoretically. Yet, despite almost a quarter century of intense work, attractor networks are not fully understood. In particular, there are still two unanswered questions. First, how is it that attractor networks exhibit irregular firing, as is observed experimentally during working memory tasks? And second, how many memories can be stored under biologically realistic conditions? Here we answer both questions by studying an attractor neural network in which inhibition and excitation balance each other. Using mean-field analysis, we derive a three-variable description of attractor networks. From this description it follows that irregular firing can exist only if the number of neurons involved in a memory is large. The same mean-field analysis also shows that the number of memories that can be stored in a network scales with the number of excitatory connections, a result that has been suggested for simple models but never shown for realistic ones. Both of these predictions are verified using simulations with large networks of spiking neurons. PMID:17845070
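
    The scaling claim in this abstract (the number of stored memories grows with the number of excitatory connections) can be illustrated with a much simpler classical model. The sketch below uses a binary Hopfield network with a Hebbian rule, a deliberate simplification of the paper's spiking balanced network; the 0.1 N pattern load is an assumption chosen to sit below the classical ~0.14 K capacity limit.

```python
# Toy illustration of "capacity scales with connections" using a classical
# binary Hopfield network (fully connected, so connections per neuron K = N-1).
# Storing 0.1*N random patterns stays below the ~0.14*K capacity estimate,
# so a stored pattern should remain a stable fixed point at every size N.
import numpy as np

rng = np.random.default_rng(1)

def recall_overlap(n_neurons, n_patterns, n_steps=20):
    """Store random patterns with a Hebbian rule; relax from pattern 0."""
    xi = rng.choice([-1, 1], size=(n_patterns, n_neurons))
    W = (xi.T @ xi).astype(float) / n_neurons
    np.fill_diagonal(W, 0.0)
    s = xi[0].copy()
    for _ in range(n_steps):
        s = np.where(W @ s >= 0, 1, -1)   # synchronous sign update
    return (s == xi[0]).mean()            # fraction of bits recalled correctly

for n in (200, 400, 800):
    p = int(0.1 * n)                      # load grows with network size
    print(n, p, recall_overlap(n, p))     # overlap stays close to 1.0
```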

  3. The flow structure of pyroclastic density currents: evidence from particle models and large-scale experiments

    NASA Astrophysics Data System (ADS)

    Dellino, Pierfrancesco; Büttner, Ralf; Dioguardi, Fabio; Doronzo, Domenico Maria; La Volpe, Luigi; Mele, Daniela; Sonder, Ingo; Sulpizio, Roberto; Zimanowski, Bernd

    2010-05-01

    Pyroclastic flows are ground-hugging, hot, gas-particle flows. They represent the most hazardous events of explosive volcanism, one striking example being the famous AD 79 eruption of Vesuvius, which destroyed Pompeii. Much of our knowledge on the mechanics of pyroclastic flows comes from theoretical models and numerical simulations. Valuable data are also stored in the geological record of past eruptions, i.e. the particles contained in pyroclastic deposits, but they are rarely used for quantifying the destructive potential of pyroclastic flows. In this paper, by means of experiments, we validate a model that is based on data from pyroclastic deposits. It allows the reconstruction of the current's fluid-dynamic behaviour. We show that our model results in likely values of dynamic pressure and particle volumetric concentration, and allows quantifying the hazard potential of pyroclastic flows.

  4. A disposable, self-contained PCR chip.

    PubMed

    Kim, Jitae; Byun, Doyoung; Mauk, Michael G; Bau, Haim H

    2009-02-21

    A disposable, self-contained polymerase chain reaction (PCR) chip with on-board stored, just-on-time releasable, paraffin-passivated, dry reagents is described. During both storage and sample preparation, the paraffin immobilizes and protects the stored reagents. Fluid flow through the reactor leaves the reagents undisturbed. Prior to the amplification step, the chamber is filled with target analyte suspended in water. Upon heating the PCR chamber to the DNA's denaturation temperature, the paraffin melts and moves out of the way, and the reagents are released and hydrated. To better understand the reagent release process, a scaled up model of the reactor was constructed and the paraffin migration was visualized. Experiments were carried out with a 30 microl reactor demonstrating detectable amplification (with agarose gel electrophoresis) of 10 fg ( approximately 200 copies) of lambda DNA template. The in-reactor storage and on-time release of the PCR reagents reduce the number of needed operations and significantly simplify the flow control that would otherwise be needed in lab-on-chip devices.

  5. A Disposable, Self-Contained PCR Chip

    PubMed Central

    Kim, Jitae; Byun, Doyoung; Mauk, Michael G.; Bau, Haim H.

    2009-01-01

    A disposable, self-contained polymerase chain reaction (PCR) chip with on-board stored, just on time releasable, paraffin-passivated, dry reagents is described. During both storage and sample preparation, the paraffin immobilizes and protects the stored reagents. Fluid flow through the reactor leaves the reagents undisturbed. Prior to the amplification step, the chamber is filled with target analyte suspended in water. Upon heating the PCR chamber to the DNA’s denaturation temperature, the paraffin melts and moves out of the way, and the reagents are released and hydrated. To better understand the reagent release process, a scaled up model of the reactor was constructed and the paraffin migration was visualized. Experiments were carried out with a 30 μl reactor demonstrating detectable amplification (with agarose gel electrophoresis) of 10 fg (~200 copies) of lambda DNA template. The in-reactor storage and on-time release of the PCR reagents reduce the number of needed operations and significantly simplify the flow control that would, otherwise, be needed in lab-on-chip devices. PMID:19190797

  6. Distributed cerebellar plasticity implements generalized multiple-scale memory components in real-robot sensorimotor tasks.

    PubMed

    Casellato, Claudia; Antonietti, Alberto; Garrido, Jesus A; Ferrigno, Giancarlo; D'Angelo, Egidio; Pedrocchi, Alessandra

    2015-01-01

    The cerebellum plays a crucial role in motor learning and it acts as a predictive controller. Modeling it and embedding it into sensorimotor tasks allows us to create functional links between plasticity mechanisms, neural circuits and behavioral learning. Moreover, if applied to real-time control of a neurorobot, the cerebellar model has to deal with a real noisy and changing environment, thus showing its robustness and effectiveness in learning. A biologically inspired cerebellar model with distributed plasticity, both at cortical and nuclear sites, has been used. Two cerebellum-mediated paradigms have been designed: an associative Pavlovian task and a vestibulo-ocular reflex, with multiple sessions of acquisition and extinction and with different stimuli and perturbation patterns. The cerebellar controller succeeded in generating conditioned responses and finely tuned eye-movement compensation, thus reproducing human-like behaviors. Through a productive plasticity transfer from cortical to nuclear sites, the distributed cerebellar controller showed in both tasks the capability to optimize learning on multiple time scales, to store motor memory and to adapt effectively to dynamic ranges of stimuli.

  7. Data Model as an Architectural View

    DTIC Science & Technology

    2009-10-01

    ...store order-processing system. Logical: the logical data model is an evolution of the conceptual data model towards a data management technology (e.g., ...). ...online store order-processing system at different stages. Perhaps the first draft was elaborated by the architect during discussion of requirements...

  8. Spatial-Temporal Modeling of Neighborhood Sociodemographic Characteristics and Food Stores

    PubMed Central

    Lamichhane, Archana P.; Warren, Joshua L.; Peterson, Marc; Rummo, Pasquale; Gordon-Larsen, Penny

    2015-01-01

    The literature on food stores, neighborhood poverty, and race/ethnicity is mixed and lacks methods of accounting for complex spatial and temporal clustering of food resources. We used quarterly data on supermarket and convenience store locations from Nielsen TDLinx (Nielsen Holdings N.V., New York, New York) spanning 7 years (2006–2012) and census tract-based neighborhood sociodemographic data from the American Community Survey (2006–2010) to assess associations between neighborhood sociodemographic characteristics and food store distributions in the Metropolitan Statistical Areas (MSAs) of 4 US cities (Birmingham, Alabama; Chicago, Illinois; Minneapolis, Minnesota; and San Francisco, California). We fitted a space-time Poisson regression model that accounted for the complex spatial-temporal correlation structure of store locations by introducing space-time random effects in an intrinsic conditionally autoregressive model within a Bayesian framework. After accounting for census tract–level area, population, their interaction, and spatial and temporal variability, census tract poverty was significantly and positively associated with increasing expected numbers of supermarkets among tracts in all 4 MSAs. A similar positive association was observed for convenience stores in Birmingham, Minneapolis, and San Francisco; in Chicago, a positive association was observed only for predominantly white and predominantly black tracts. Our findings suggest a positive association between greater numbers of food stores and higher neighborhood poverty, with implications for policy approaches related to food store access by neighborhood poverty. PMID:25515169
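
    Stripped of the space-time CAR random effects, the core of such a model is a Poisson regression of tract-level store counts with an exposure offset for tract area. A minimal non-spatial sketch on synthetic data follows; all variable names and coefficients are hypothetical, and the Bayesian spatial-temporal machinery of the paper is deliberately omitted.

```python
# Poisson regression of store counts on tract poverty, with log(area) as an
# exposure offset, mimicking the non-spatial core of the model described above.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "poverty": rng.uniform(0, 0.6, n),     # hypothetical tract poverty rate
    "area_km2": rng.uniform(0.5, 10.0, n), # hypothetical tract area
})
lam = np.exp(0.2 + 1.1 * df.poverty) * df.area_km2   # counts rise with poverty
df["stores"] = rng.poisson(lam)

fit = smf.glm("stores ~ poverty", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df.area_km2)).fit()
print(fit.params)   # recovered poverty coefficient should be near 1.1
```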

  9. Markov chain sampling of the O(n) loop models on the infinite plane

    NASA Astrophysics Data System (ADS)

    Herdeiro, Victor

    2017-07-01

    A numerical method was recently proposed in Herdeiro and Doyon [Phys. Rev. E 94, 043322 (2016), 10.1103/PhysRevE.94.043322] showing a precise sampling of the infinite plane two-dimensional critical Ising model for finite lattice subsections. The present note extends the method to a larger class of models, namely the O(n) loop gas models for n ∈ (1, 2]. We argue that even though the Gibbs measure is nonlocal, it is factorizable on finite subsections when sufficient information on the loops touching the boundaries is stored. Our results attempt to show that, provided an efficient Markov chain mixing algorithm and an improved discrete lattice dilation procedure, the planar limit of the O(n) models can be numerically studied with efficiency similar to the Ising case. This confirms that scale invariance is the only requirement for the present numerical method to work.
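
    For readers unfamiliar with the baseline this note builds on, the sketch below is a plain Metropolis sampler for the 2D Ising model on a periodic lattice, the simplest relative of the O(n) loop gases. It is a generic textbook sampler, not the authors' infinite-plane algorithm; lattice size, sweep count and coupling are illustrative choices.

```python
# Metropolis sampling of the 2D Ising model on a small periodic lattice.
# Single-spin flips are accepted with probability min(1, exp(-beta * dE)).
import numpy as np

rng = np.random.default_rng(3)
L = 32
beta = 0.4406868   # near the critical coupling of the square-lattice Ising model
s = rng.choice([-1, 1], size=(L, L))

for sweep in range(200):
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        nb = s[(i+1) % L, j] + s[(i-1) % L, j] + s[i, (j+1) % L] + s[i, (j-1) % L]
        dE = 2 * s[i, j] * nb          # energy change from flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i, j] *= -1

print("magnetization per spin:", s.mean())
```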

  10. Conceptual model of sediment processes in the upper Yuba River watershed, Sierra Nevada, CA

    USGS Publications Warehouse

    Curtis, J.A.; Flint, L.E.; Alpers, Charles N.; Yarnell, S.M.

    2005-01-01

    This study examines the development of a conceptual model of sediment processes in the upper Yuba River watershed, and we hypothesize how components of the conceptual model may be spatially distributed using a geographical information system (GIS). The conceptual model illustrates key processes controlling sediment dynamics in the upper Yuba River watershed and was tested and revised using field measurements, aerial photography, and low elevation videography. Field reconnaissance included mass wasting and channel storage inventories, assessment of annual channel change in upland tributaries, and evaluation of the relative importance of sediment sources and transport processes. Hillslope erosion rates throughout the study area are relatively low when compared to more rapidly eroding landscapes such as the Pacific Northwest, and notable hillslope sediment sources include highly erodible andesitic mudflows, serpentinized ultramafics, and unvegetated hydraulic mine pits. Mass wasting dominates surface erosion on the hillslopes; however, erosion of stored channel sediment is the primary contributor to annual sediment yield. We used GIS to spatially distribute the components of the conceptual model and created hillslope erosion potential and channel storage models. The GIS models exemplify the conceptual model in that landscapes with low potential evapotranspiration, sparse vegetation, steep slopes, erodible geology and soils, and high road densities display the greatest hillslope erosion potential, and channel storage increases with increasing stream order. In-channel storage in upland tributaries impacted by hydraulic mining is an exception. Reworking of stored hydraulic mining sediment in low-order tributaries continues to elevate upper Yuba River sediment yields. Finally, we propose that spatially distributing the components of a conceptual model in a GIS framework provides a guide for developing more detailed sediment budgets or numerical models, making it an inexpensive way to develop a roadmap for understanding sediment dynamics at a watershed scale.

  11. Length and area equivalents for interpreting wildland resource maps

    Treesearch

    Elliot L. Amidon; Marilyn S. Whitfield

    1969-01-01

    Map users must refer to an appropriate scale in interpreting wildland resource maps. Length and area equivalents for nine map scales commonly used have been computed. For each scale a 1-page table consists of map-to-ground equivalents, buffer strip or road widths, and cell dimensions required for a specified acreage. The conversion factors are stored in a Fortran...
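
    The tabulated map-to-ground equivalents reduce to one line of arithmetic; the sketch below reproduces it for a representative scale (this is an illustration, not the report's Fortran program).

```python
# Convert a measured map distance to ground distance at a 1:D map scale.
def ground_distance_m(map_distance_cm: float, scale_denominator: int) -> float:
    """Ground distance (m) for a map measurement at a 1:scale_denominator scale."""
    return map_distance_cm * scale_denominator / 100.0  # cm on the ground -> m

# 2 cm on a 1:24,000 map corresponds to 480 m on the ground.
print(ground_distance_m(2.0, 24_000))
```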

  12. Not in My Back Yard: A Comparative Analysis of Crime Around Publicly Funded Drug Treatment Centers, Liquor Stores, Convenience Stores, and Corner Stores in One Mid-Atlantic City.

    PubMed

    Furr-Holden, C Debra M; Milam, Adam J; Nesoff, Elizabeth D; Johnson, Renee M; Fakunle, David O; Jennings, Jacky M; Thorpe, Roland J

    2016-01-01

    This research examined whether publicly funded drug treatment centers (DTCs) were associated with violent crime in excess of the violence happening around other commercial businesses. Violent crime data and locations of community entities were geocoded and mapped. DTCs and other retail outlets were matched based on a Neighborhood Disadvantage score at the census tract level. Street network buffers ranging from 100 to 1,400 feet were placed around each location. Negative binomial regression models were used to estimate the relationship between the count of violent crimes and the distance from each business type. Compared with the mean count of violent crime around drug treatment centers, the mean count of violent crime (M = 2.87) was significantly higher around liquor stores (M = 3.98; t test; p < .01) and corner stores (M = 3.78; t test; p < .01), and there was no statistically significant difference between the count around convenience stores (M = 2.65; t test; p = .32). In the adjusted negative binomial regression models, there was a negative and significant relationship between the count of violent crime and the distance from drug treatment centers (β = -.069, p < .01), liquor stores (β = -.081, p < .01), corner stores (β = -.116, p < .01), and convenience stores (β = -.154, p < .01). Violent crime associated with drug treatment centers is similar to that associated with liquor stores and is less frequent than that associated with convenience stores and corner stores.
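
    The adjusted analysis has the shape of a negative binomial regression of crime counts on distance to an outlet. A minimal sketch on synthetic data follows; variable names, the dispersion value and the coefficient are hypothetical, chosen so crime declines with distance as in the paper.

```python
# Negative binomial regression of violent-crime counts on distance (feet) from
# an outlet; a negative distance coefficient means crime falls with distance.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n_obs = 1000
distance_ft = rng.uniform(100, 1400, n_obs)
mu = np.exp(2.0 - 0.001 * distance_ft)            # mean count decays with distance
counts = rng.negative_binomial(5, 5.0 / (5.0 + mu))  # NB draws with mean mu

X = sm.add_constant(distance_ft)
fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.2)).fit()
print(fit.params)   # distance coefficient recovered near -0.001
```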

  13. National-Scale Hydrologic Classification & Agricultural Decision Support: A Multi-Scale Approach

    NASA Astrophysics Data System (ADS)

    Coopersmith, E. J.; Minsker, B.; Sivapalan, M.

    2012-12-01

    Classification frameworks can help organize catchments exhibiting similarity in hydrologic and climatic terms. Focusing this assessment of "similarity" upon specific hydrologic signatures, in this case the annual regime curve, can facilitate the prediction of hydrologic responses. Agricultural decision-support over a diverse set of catchments throughout the United States depends upon successful modeling of the wetting/drying process without necessitating separate model calibration at every site where such insights are required. To this end, a holistic classification framework is developed to describe both climatic variability (humid vs. arid, winter rainfall vs. summer rainfall) and the draining, storing, and filtering behavior of any catchment, including ungauged or minimally gauged basins. At the national scale, over 400 catchments from the MOPEX database are analyzed to construct the classification system, with over 77% of these catchments ultimately falling into only six clusters. At individual locations, soil moisture models, receiving only rainfall as input, produce correlation values in excess of 0.9 with respect to observed soil moisture measurements. By deploying physical models for predicting soil moisture exclusively from precipitation that are calibrated at gauged locations, overlaying machine learning techniques to improve these estimates, then generalizing the calibration parameters for catchments in a given class, agronomic decision-support becomes available where it is needed rather than only where sensing data are located. [Figure: Classifications of 428 U.S. catchments on the basis of hydrologic regime data (Coopersmith et al., 2012).]
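
    The prediction step described above (soil moisture driven only by rainfall, with parameters shared across a catchment class) can be caricatured with a one-bucket water balance. All parameter values below are illustrative assumptions, not calibrated results.

```python
# One-bucket soil-moisture model driven only by rainfall: add rain up to a
# storage capacity, spill the excess, and lose a fixed fraction to ET each day.
import numpy as np

def bucket_soil_moisture(rain_mm, capacity=120.0, et_rate=0.02, s0=60.0):
    """Daily soil-moisture storage (mm) for a rainfall series (mm/day)."""
    s, out = s0, []
    for r in rain_mm:
        s = min(s + r, capacity)   # infiltration capped at storage capacity
        s -= et_rate * s           # simple proportional evapotranspiration loss
        out.append(s)
    return np.array(out)

rng = np.random.default_rng(5)
rain = rng.gamma(0.3, 8.0, size=365)   # hypothetical daily rainfall (mm)
print(bucket_soil_moisture(rain)[:5])
```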

  14. Constraining the dynamics of the water budget at high spatial resolution in the world's water towers using models and remote sensing data; Snake River Basin, USA

    NASA Astrophysics Data System (ADS)

    Watson, K. A.; Masarik, M. T.; Flores, A. N.

    2016-12-01

    Mountainous, snow-dominated basins are often referred to as the water towers of the world because they store precipitation in seasonal snowpacks, which gradually melt and provide water supplies to downstream communities. Yet significant uncertainties remain in terms of quantifying the stores and fluxes of water in these regions as well as the associated energy exchanges. Constraining these stores and fluxes is crucial for advancing process understanding and managing these water resources in a changing climate. Remote sensing data are particularly important to these efforts due to the remoteness of these landscapes and high spatial variability in water budget components. We have developed a high resolution regional climate dataset extending from 1986 to the present for the Snake River Basin in the northwestern USA. The Snake River Basin is the largest tributary of the Columbia River by volume and a critically important basin for regional economies and communities. The core of the dataset was developed using a regional climate model, forced by reanalysis data. Specifically the Weather Research and Forecasting (WRF) model was used to dynamically downscale the North American Regional Reanalysis (NARR) over the region at 3 km horizontal resolution for the period of interest. A suite of satellite remote sensing products provide independent, albeit uncertain, constraint on a number of components of the water and energy budgets for the region across a range of spatial and temporal scales. For example, GRACE data are used to constrain basinwide terrestrial water storage and MODIS products are used to constrain the spatial and temporal evolution of evapotranspiration and snow cover. The joint use of both models and remote sensing products allows for both better understanding of water cycle dynamics and associated hydrometeorologic processes, and identification of limitations in both the remote sensing products and regional climate simulations.

  15. Tools and Methods for Visualization of Mesoscale Ocean Eddies

    NASA Astrophysics Data System (ADS)

    Bemis, K. G.; Liu, L.; Silver, D.; Kang, D.; Curchitser, E.

    2017-12-01

    Mesoscale ocean eddies form in the Gulf Stream and transport heat and nutrients across the ocean basin. The internal structure of these three-dimensional eddies and the kinematics with which they move are critical to a full understanding of their transport capacity. A series of visualization tools have been developed to extract, characterize, and track ocean eddies from 3D modeling results, to visually show the ocean eddy story by applying various illustrative visualization techniques, and to interactively view results stored on a server from a conventional browser. In this work, we apply a feature-based method to track instances of ocean eddies through the time steps of a high-resolution multidecadal regional ocean model and generate a series of eddy paths which reflect the life cycle of individual eddy instances. The basic method uses the Okubo-Weiss parameter to define eddy cores but could be adapted to alternative specifications of an eddy. Stored results include pixel-lists for each eddy instance, tracking metadata for eddy paths, and physical and geometric properties. In the simplest view, isosurfaces are used to display eddies along an eddy path. Individual eddies can then be selected and viewed independently or an eddy path can be viewed in the context of all eddy paths (longer than a specified duration) and the ocean basin. To tell the story of mesoscale ocean eddies, we combined illustrative visualization techniques, including visual effectiveness enhancement, focus+context, and smart visibility, with the extracted volume features to explore eddy characteristics at multiple scales from ocean basin to individual eddy. An evaluation by domain experts indicates that combining our feature-based techniques with illustrative visualization techniques provides an insight into the role eddies play in ocean circulation. A web-based GUI is under development to facilitate easy viewing of stored results. The GUI provides the user control to choose amongst available datasets, to specify the variables (such as temperature or salinity) to display on the isosurfaces, and to choose the scale and orientation of the view. These techniques allow an oceanographer to browse the data based on eddy paths and individual eddies rather than slices or volumes of data.
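
    The eddy-core criterion used above reduces to a short array computation. The sketch below evaluates the Okubo-Weiss parameter on a regular grid and checks its sign inside an idealized solid-body vortex; the grid, spacing, and test flow are illustrative assumptions, not the regional ocean model's fields.

```python
# Okubo-Weiss parameter W = s_n^2 + s_s^2 - omega^2 from horizontal velocity;
# W < 0 marks vorticity-dominated regions, i.e. candidate eddy cores.
import numpy as np

def okubo_weiss(u, v, dx, dy):
    """W from velocity components on a regular (ny, nx) grid."""
    du_dy, du_dx = np.gradient(u, dy, dx)
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    s_n = du_dx - dv_dy           # normal strain
    s_s = dv_dx + du_dy           # shear strain
    omega = dv_dx - du_dy         # relative vorticity
    return s_n**2 + s_s**2 - omega**2

# Hypothetical solid-body vortex: W should be negative throughout the core.
y, x = np.mgrid[-1:1:64j, -1:1:64j]
u, v = -y, x
W = okubo_weiss(u, v, dx=2 / 63, dy=2 / 63)
print("W at center:", W[32, 32])   # < 0 inside the vortex
```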

  16. Evaluating Benefits of LID Practices at Multiple Spatial Scales Using SUSTAIN

    EPA Science Inventory

    Low impact development (LID) is a storm water management approach that essentially mimics the way nature works: infiltrate, filter, store, evaporate, and detain runoff close to its source. LID practices are distributed in nature, and they work on decentralized micro-scales and m...

  17. Estimating the Uncertain Mathematical Structure of Hydrological Model via Bayesian Data Assimilation

    NASA Astrophysics Data System (ADS)

    Bulygina, N.; Gupta, H.; O'Donell, G.; Wheater, H.

    2008-12-01

    The structure of a hydrological model at the macro scale (e.g., watershed) is inherently uncertain due to many factors, including the lack of a robust hydrological theory at the macro scale. In this work, we assume that a suitable conceptual model for the hydrologic system has already been determined - i.e., the system boundaries have been specified, the important state variables and input and output fluxes to be included have been selected, and the major hydrological processes and geometries of their interconnections have been identified. The structural identification problem then is to specify the mathematical form of the relationships between the inputs, state variables and outputs, so that a computational model can be constructed for making simulations and/or predictions of system input-state-output behaviour. We show how Bayesian data assimilation can be used to merge prior beliefs in the form of pre-assumed model equations with information derived from the data to construct a posterior model. The approach, entitled Bayesian Estimation of Structure (BESt), is used to estimate a hydrological model for a small basin in England, at hourly time scales, conditioned on the assumption of a three-dimensional-state (soil moisture storage, fast and slow flow stores) conceptual model structure. Inputs to the system are precipitation and potential evapotranspiration, and outputs are actual evapotranspiration and streamflow discharge. Results show the difference between prior and posterior mathematical structures, as well as provide prediction confidence intervals that reflect three types of uncertainty: due to initial conditions, due to input and due to mathematical structure.

  19. The compression–error trade-off for large gridded data sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silver, Jeremy D.; Zender, Charles S.

    The netCDF-4 format is widely used for large gridded scientific data sets and includes several compression methods: lossy linear scaling and the non-lossy deflate and shuffle algorithms. Many multidimensional geoscientific data sets exhibit considerable variation over one or several spatial dimensions (e.g., vertically) with less variation in the remaining dimensions (e.g., horizontally). On such data sets, linear scaling with a single pair of scale and offset parameters often entails considerable loss of precision. We introduce an alternative compression method called "layer-packing" that simultaneously exploits lossy linear scaling and lossless compression. Layer-packing stores arrays (instead of a scalar pair) of scale and offset parameters. An implementation of this method is compared with lossless compression, storing data at fixed relative precision (bit-grooming) and scalar linear packing in terms of compression ratio, accuracy and speed. When viewed as a trade-off between compression and error, layer-packing yields similar results to bit-grooming (storing between 3 and 4 significant figures). Bit-grooming and layer-packing offer significantly better control of precision than scalar linear packing. Relative performance, in terms of compression and errors, of bit-groomed and layer-packed data was strongly predicted by the entropy of the exponent array, and lossless compression was well predicted by entropy of the original data array. Layer-packed data files must be "unpacked" to be readily usable. The compression and precision characteristics make layer-packing a competitive archive format for many scientific data sets.

  20. The compression–error trade-off for large gridded data sets

    DOE PAGES

    Silver, Jeremy D.; Zender, Charles S.

    2017-01-27

    The netCDF-4 format is widely used for large gridded scientific data sets and includes several compression methods: lossy linear scaling and the non-lossy deflate and shuffle algorithms. Many multidimensional geoscientific data sets exhibit considerable variation over one or several spatial dimensions (e.g., vertically) with less variation in the remaining dimensions (e.g., horizontally). On such data sets, linear scaling with a single pair of scale and offset parameters often entails considerable loss of precision. We introduce an alternative compression method called "layer-packing" that simultaneously exploits lossy linear scaling and lossless compression. Layer-packing stores arrays (instead of a scalar pair) of scale and offset parameters. An implementation of this method is compared with lossless compression, storing data at fixed relative precision (bit-grooming) and scalar linear packing in terms of compression ratio, accuracy and speed. When viewed as a trade-off between compression and error, layer-packing yields similar results to bit-grooming (storing between 3 and 4 significant figures). Bit-grooming and layer-packing offer significantly better control of precision than scalar linear packing. Relative performance, in terms of compression and errors, of bit-groomed and layer-packed data was strongly predicted by the entropy of the exponent array, and lossless compression was well predicted by entropy of the original data array. Layer-packed data files must be "unpacked" to be readily usable. The compression and precision characteristics make layer-packing a competitive archive format for many scientific data sets.
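
    A minimal sketch of the layer-packing idea: quantize each vertical layer to 16-bit integers with its own scale/offset pair, rather than one scalar pair for the whole array. This illustrates the concept on synthetic data and is not the authors' implementation.

```python
# Per-layer 16-bit quantization: each (ny, nx) layer gets its own scale and
# offset, so layers with very different magnitudes all keep good precision.
import numpy as np

def layer_pack(data):
    """Pack a (nlev, ny, nx) float array into int16, one scale/offset per layer."""
    lo = data.min(axis=(1, 2), keepdims=True)
    hi = data.max(axis=(1, 2), keepdims=True)
    scale = np.where(hi > lo, (hi - lo) / 65533.0, 1.0)   # per-layer scale
    offset = lo + 32766.0 * scale                         # centers range in int16
    packed = np.round((data - offset) / scale).astype(np.int16)
    return packed, scale, offset

def layer_unpack(packed, scale, offset):
    return packed * scale + offset

field = np.random.default_rng(6).normal(0, 1, (40, 90, 180))
field *= np.logspace(0, 3, 40)[:, None, None]   # strong variation across levels
packed, scale, offset = layer_pack(field)
err = np.abs(layer_unpack(packed, scale, offset) - field).max()
print("max abs reconstruction error:", err)     # bounded by half a scale step
```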

  1. Carbon cycling in extratropical terrestrial ecosystems of the Northern Hemisphere during the 20th century: a modeling analysis of the influences of soil thermal dynamics

    USGS Publications Warehouse

    Zhuang, Q.; McGuire, A.D.; Melillo, J.M.; Clein, Joy S.; Dargaville, R.J.; Kicklighter, D.W.; Myneni, Ranga B.; Dong, J.; Romanovsky, V.E.; Harden, J.; Hobbie, J.E.

    2003-01-01

    There is substantial evidence that soil thermal dynamics are changing in terrestrial ecosystems of the Northern Hemisphere and that these dynamics have implications for the exchange of carbon between terrestrial ecosystems and the atmosphere. To date, large-scale biogeochemical models have been slow to incorporate the effects of soil thermal dynamics on processes that affect carbon exchange with the atmosphere. In this study we incorporated a soil thermal module (STM), appropriate to both permafrost and non-permafrost soils, into a large-scale ecosystem model, version 5.0 of the Terrestrial Ecosystem Model (TEM). We then compared observed regional and seasonal patterns of atmospheric CO2 to simulations of carbon dynamics for terrestrial ecosystems north of 30°N between TEM 5.0 and an earlier version of TEM (version 4.2) that lacked a STM. The timing of the draw-down of atmospheric CO2 at the start of the growing season and the degree of draw-down during the growing season were substantially improved by the consideration of soil thermal dynamics. Both versions of TEM indicate that climate variability and change promoted the loss of carbon from temperate ecosystems during the first half of the 20th century, and promoted carbon storage during the second half of the century. The results of the simulations by TEM suggest that land-use change in temperate latitudes (30–60°N) plays a stronger role than climate change in driving trends for increased uptake of carbon in extratropical terrestrial ecosystems (30–90°N) during recent decades. In the 1980s the TEM 5.0 simulation estimated that extratropical terrestrial ecosystems stored 0.55 Pg C yr−1, with 0.24 Pg C yr−1 in North America and 0.31 Pg C yr−1 in northern Eurasia. From 1990 through 1995 the model simulated that these ecosystems stored 0.90 Pg C yr−1, with 0.27 Pg C yr−1 stored in North America and 0.63 Pg C yr−1 stored in northern Eurasia. Thus, in comparison to the 1980s, simulated net carbon storage in the 1990s was enhanced by an additional 0.35 Pg C yr−1 in extratropical terrestrial ecosystems, with most of the additional storage in northern Eurasia. The carbon storage simulated by TEM 5.0 in the 1980s and 1990s was lower than estimates based on other methodologies, including estimates by atmospheric inversion models and remote sensing and inventory analyses. This suggests that other issues besides the role of soil thermal dynamics may be responsible, in part, for the temporal and spatial dynamics of carbon storage of extratropical terrestrial ecosystems. In conclusion, the consideration of soil thermal dynamics and terrestrial cryospheric processes in modeling the global carbon cycle has helped to reduce biases in the simulation of the seasonality of carbon dynamics of extratropical terrestrial ecosystems. This progress should lead to an enhanced ability to clarify the role of other issues that influence carbon dynamics in terrestrial regions that experience seasonal freezing and thawing of soil.

  2. The North American Carbon Program Multi-scale Synthesis and Terrestrial Model Intercomparison Project – Part 2: Environmental driver data

    DOE PAGES

    Wei, Yaxing; Liu, Shishi; Huntzinger, Deborah N.; ...

    2014-12-05

    Ecosystems are important and dynamic components of the global carbon cycle, and terrestrial biospheric models (TBMs) are crucial tools in further understanding of how terrestrial carbon is stored and exchanged with the atmosphere across a variety of spatial and temporal scales. Improving TBM skills, and quantifying and reducing their estimation uncertainties, pose significant challenges. The Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) is a formal multi-scale and multi-model intercomparison effort set up to tackle these challenges. The MsTMIP protocol prescribes standardized environmental driver data that are shared among model teams to facilitate model-model and model-observation comparisons. In this article, we describe the global and North American environmental driver data sets prepared for the MsTMIP activity to both support their use in MsTMIP and make these data, along with the processes used in selecting/processing these data, accessible to a broader audience. Based on project needs and lessons learned from past model intercomparison activities, we compiled climate, atmospheric CO2 concentrations, nitrogen deposition, land use and land cover change (LULCC), C3/C4 grass fractions, major crops, phenology and soil data into a standard format for global (0.5° × 0.5° resolution) and regional (North American: 0.25° × 0.25° resolution) simulations. In order to meet the needs of MsTMIP, improvements were made to several of the original environmental data sets, by improving the quality, and/or changing their spatial and temporal coverage, and resolution. The resulting standardized model driver data sets are being used by over 20 different models participating in MsTMIP. Lastly, the data are archived at the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC, http://daac.ornl.gov) to provide long-term data management and distribution.

  3. An introduction to the healthy corner store intervention model in Canada.

    PubMed

    Mah, Catherine L; Minaker, Leia M; Jameson, Kristie; Rappaport, Lissie; Taylor, Krystal; Graham, Marketa; Moody, Natalie; Cook, Brian

    2017-09-14

    The majority of Canadians' food acquisition occurs in retail stores. Retail science has become increasingly sophisticated in demonstrating how consumer environments influence population-level diet quality and health status. The retail food environment literature is new but growing rapidly in Canada, and there is a relative paucity of evidence from intervention research implemented in Canada. The healthy corner store model is a comprehensive complex population health intervention in small retail stores, intended to transform an existing business model to a health-promoting one through intersectoral collaboration. Healthy corner store interventions typically involve conversions of existing stores with the participation of health, community, and business sector partners, addressing business fundamentals, merchandising, and consumer demand. This article introduces pioneering experiences with the healthy corner store intervention in Canada. First, we offer a brief overview of the state of evidence within and outside Canada. Second, we discuss three urban and one rural healthy corner store initiatives, led through partnerships among community food security organizations, public health units, academics, and business partners, in Manitoba, Ontario, and Newfoundland and Labrador. Third, we synthesize the promising practices from these local examples, including aspects of both intervention science (e.g., refinements in measuring the food environment) and community-based practice (e.g., dealing with unhealthy food items and economic impact for the retailer). This article will synthesize practical experiences with healthy corner stores in Canada. It offers a baseline assessment of promising aspects of this intervention for health and health equity, and identifies opportunities to strengthen both science and practice in this area of retail food environment work.

  4. Magnetic Reconnection and Particle Acceleration in the Solar Corona

    NASA Astrophysics Data System (ADS)

    Neukirch, Thomas

    Reconnection plays a major role for the magnetic activity of the solar atmosphere, for example solar flares. An interesting open problem is how magnetic reconnection acts to redistribute the stored magnetic energy released during an eruption into other energy forms, e.g. generating bulk flows, plasma heating and non-thermal energetic particles. In particular, finding a theoretical explanation for the observed acceleration of a large number of charged particles to high energies during solar flares is presently one of the most challenging problems in solar physics. One difficulty is the vast difference between the microscopic (kinetic) and the macroscopic (MHD) scales involved. Whereas the phenomena observed to occur on large scales are reasonably well explained by the so-called standard model, this does not seem to be the case for the small-scale (kinetic) aspects of flares. Over the past years, observations, in particular by RHESSI, have provided evidence that a naive interpretation of the data in terms of the standard solar flare/thick target model is problematic. As a consequence, the role played by magnetic reconnection in the particle acceleration process during solar flares may have to be reconsidered.

  5. Poromechanical response of naturally fractured sorbing media

    NASA Astrophysics Data System (ADS)

    Kumar, Hemant

    The injection of CO2 into coal seams has been utilized for enhanced gas recovery and potential CO2 sequestration in unmineable coal seams. It is advantageous because it enhances production while significant volumes of CO2 may be stored simultaneously. The key issues for enhanced gas recovery and geologic sequestration of CO2 include (1) injectivity prediction: the chemical and physical processes initiated by the injection of CO2 into the coal seam lead to permeability/porosity changes; (2) upscaling: development of a full-scale coupled reservoir model which may predict the enhanced production, associated permeability changes and quantity of sequestered CO2; (3) reservoir stimulation: coalbeds are often fractured and proppants are placed into the fractures to prevent permeability reduction, but the permeability evolution in such cases is poorly understood. These issues are largely governed by the dynamic coupling of adsorption, fluid exchange, transport, water content, stress regime, fracture geometry and physiomechanical changes in coals, which are triggered by CO2 injection. These complex interactions in coal have been investigated through laboratory experiments, and full reservoir-scale models are developed to address the key issues. (Abstract shortened by ProQuest.)

  6. The personal shopper – a pilot randomized trial of grocery store-based dietary advice

    PubMed Central

    Lewis, K H; Roblin, D W; Leo, M; Block, J P

    2015-01-01

    The objective of this study was to test the feasibility and preliminary efficacy of a store-based dietary education intervention against traditional clinic-based advice. Patients with obesity (n = 55, mean [standard deviation, SD] age 44.3[9.2] years, 64% women, 87% non-Hispanic Black) were randomized to receive dietary counselling either in a grocery store or a clinic. Change between groups (analysis of covariance) was assessed for outcomes including: dietary quality (Healthy Eating Index – 2005 [0–100 points] ), and nutritional knowledge (0–65-point knowledge scale). Both groups reported improved diet quality at the end of the study. Grocery participants had greater increases in knowledge (mean [SD] change = 5.7 [6.1] points) than clinic participants (mean [SD] change = 3.2 [4.0] points) (P = 0.04). Participants enjoyed the store-based sessions. Grocery store-based visits offer a promising approach for dietary counselling. PMID:25873139

  7. The personal shopper--a pilot randomized trial of grocery store-based dietary advice.

    PubMed

    Lewis, K H; Roblin, D W; Leo, M; Block, J P

    2015-06-01

    The objective of this study was to test the feasibility and preliminary efficacy of a store-based dietary education intervention against traditional clinic-based advice. Patients with obesity (n = 55, mean [standard deviation, SD] age 44.3[9.2] years, 64% women, 87% non-Hispanic Black) were randomized to receive dietary counselling either in a grocery store or a clinic. Change between groups (analysis of covariance) was assessed for outcomes including: dietary quality (Healthy Eating Index--2005 [0-100 points]), and nutritional knowledge (0-65-point knowledge scale). Both groups reported improved diet quality at the end of the study. Grocery participants had greater increases in knowledge (mean [SD] change = 5.7 [6.1] points) than clinic participants (mean [SD] change = 3.2 [4.0] points) (P = 0.04). Participants enjoyed the store-based sessions. Grocery store-based visits offer a promising approach for dietary counselling. © 2015 The Authors. Clinical Obesity published by John Wiley & Sons Ltd on behalf of World Obesity.

  8. Comparison of batch sorption tests, pilot studies, and modeling for estimating GAC bed life.

    PubMed

    Scharf, Roger G; Johnston, Robert W; Semmens, Michael J; Hozalski, Raymond M

    2010-02-01

    Saint Paul Regional Water Services (SPRWS) in Saint Paul, MN experiences annual taste and odor episodes during the warm summer months. These episodes are attributed primarily to geosmin that is produced by cyanobacteria growing in the chain of lakes used to convey and store the source water pumped from the Mississippi River. Batch experiments, pilot-scale experiments, and model simulations were performed to determine the geosmin removal performance and bed life of a granular activated carbon (GAC) filter-sorber. Using batch adsorption isotherm parameters, the estimated bed life for the GAC filter-sorber ranged from 920 to 1241 days when challenged with a constant concentration of 100 ng/L of geosmin. The estimated bed life obtained using the AdDesignS model and the actual pilot-plant loading history was 594 days. Based on the pilot-scale GAC column data, the actual bed life (>714 days) was much longer than the simulated values because bed life was extended by biological degradation of geosmin. The continuous feeding of high concentrations of geosmin (100-400 ng/L) in the pilot-scale experiments enriched for a robust geosmin-degrading culture that was sustained when the geosmin feed was turned off for 40 days. It is unclear, however, whether a geosmin-degrading culture can be established in a full-scale filter that experiences taste and odor episodes for only 1 or 2 months per year. The results of this research indicate that care must be exercised in the design and interpretation of pilot-scale experiments and model simulations for predicting taste and odor removal in full-scale GAC filter-sorbers. Adsorption and the potential for biological degradation must be considered to estimate GAC bed life for the conditions of intermittent geosmin loading typically experienced by full-scale systems. (c) 2009 Elsevier Ltd. All rights reserved.
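
    The batch-isotherm bed-life estimates quoted above come from a capacity-versus-load mass balance. The sketch below shows that arithmetic in its simplest form (no biodegradation, constant inlet concentration); every numerical value is an illustrative assumption, not SPRWS data.

```python
# Back-of-envelope GAC bed-life estimate: time until the carbon's equilibrium
# capacity (q_e * M) is consumed by the cumulative geosmin load (C0 * Q * t).
def bed_life_days(q_e_ng_per_g, gac_mass_g, c0_ng_per_L, flow_L_per_day):
    """Days until stored adsorption capacity equals cumulative contaminant load."""
    return (q_e_ng_per_g * gac_mass_g) / (c0_ng_per_L * flow_L_per_day)

# Hypothetical filter-sorber: 1 tonne of GAC with an assumed capacity of
# 1e5 ng/g at C0 = 100 ng/L, treating 1e6 L/day -> 1000 days of bed life.
print(bed_life_days(1.0e5, 1.0e6, 100.0, 1.0e6))
```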

  9. Advanced Flywheel Composite Rotors: Low-Cost, High-Energy Density Flywheel Storage Grid Demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-10-01

    GRIDS Project: Boeing is developing a new material for use in the rotor of a low-cost, high-energy flywheel storage technology. Flywheels store energy by increasing the speed of an internal rotor; slowing the rotor releases the energy back to the grid when needed. The faster the rotor spins, the more energy it can store. Boeing's new material could drastically improve the energy stored in the rotor. The team will work to improve the storage capacity of their flywheels and increase the duration over which they store energy. The ultimate goal of this project is to create a flywheel system that can be scaled up for use by electric utility companies and produce power for a full hour at a cost of $100 per kilowatt hour.
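
    The speed dependence mentioned above is just rotational kinetic energy, E = (1/2) I omega^2, which grows with the square of rotor speed. A quick illustration follows; the inertia and speeds are hypothetical, not Boeing's rotor parameters.

```python
# Stored energy of a flywheel rotor: E = 0.5 * I * omega^2 (J), shown in kWh.
import math

def flywheel_energy_kwh(inertia_kg_m2: float, rpm: float) -> float:
    omega = 2 * math.pi * rpm / 60.0          # rotor speed in rad/s
    joules = 0.5 * inertia_kg_m2 * omega**2
    return joules / 3.6e6                     # J -> kWh

# Doubling the speed quadruples the stored energy:
print(flywheel_energy_kwh(10.0, 15_000))   # ~3.4 kWh
print(flywheel_energy_kwh(10.0, 30_000))   # ~13.7 kWh
```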

  10. Re-caching by Western scrub-jays (Aphelocoma californica) cannot be attributed to stress.

    PubMed

    Thom, James M; Clayton, Nicola S

    2013-01-01

    Western scrub-jays (Aphelocoma californica) live double lives, storing food for the future while raiding the stores of other birds. One tactic scrub-jays employ to protect stores is "re-caching"-relocating caches out of sight of would-be thieves. Recent computational modelling work suggests that re-caching might be mediated not by complex cognition, but by a combination of memory failure and stress. The "Stress Model" asserts that re-caching is a manifestation of a general drive to cache, rather than a desire to protect existing stores. Here, we present evidence strongly contradicting the central assumption of these models: that stress drives caching, irrespective of social context. In Experiment (i), we replicate the finding that scrub-jays preferentially relocate food they were watched hiding. In Experiment (ii) we find no evidence that stress increases caching. In light of our results, we argue that the Stress Model cannot account for scrub-jay re-caching.

  11. Preliminary analytical study on the feasibility of using reinforced concrete pile foundations for renewable energy storage by compressed air energy storage technology

    NASA Astrophysics Data System (ADS)

    Tulebekova, S.; Saliyev, D.; Zhang, D.; Kim, J. R.; Karabay, A.; Turlybek, A.; Kazybayeva, L.

    2017-11-01

    Compressed air energy storage technology is one of the promising methods that offer high reliability, economic feasibility and low environmental impact. Current applications of the technology are mainly limited to energy storage for power plants using large-scale underground caverns. This paper explores the possibility of making use of reinforced concrete pile foundations to store renewable energy generated from solar panels or windmills attached to building structures. The energy is stored as compressed air inside pile foundations with hollow sections. Given the relatively small volume of storage provided by the foundation, the required storage pressure is expected to be higher than that in a large-scale underground cavern. The high air pressure, typically associated with a large temperature increase, combined with structural loads, places the pile foundation in a complicated loading condition, which might raise structural and geotechnical safety issues. This paper presents a preliminary analytical study on the performance of the pile foundation subjected to high pressure, large temperature increase and structural loads. Finite element analyses on pile foundation models, which are built from selected prototype structures, have been conducted. The analytical study identifies maximum stresses in the concrete of the pile foundation under combined pressure, temperature change and structural loads. Recommendations have been made for the use of reinforced concrete pile foundations for renewable energy storage.
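
    For a sense of scale, the ideal isothermal exergy of compressed air gives a first estimate of how much energy a hollow pile could hold: E = p V ln(p/p0) - (p - p0) V, with p0 the ambient pressure. The formula is a standard CAES sizing relation; the volume and pressure below are illustrative assumptions, not values from the paper.

```python
# Ideal isothermal exergy of air stored at pressure p in volume V (ambient p0):
# E = p*V*ln(p/p0) - (p - p0)*V, reported in kWh.
import math

def caes_energy_kwh(volume_m3: float, p_mpa: float, p0_mpa: float = 0.1) -> float:
    p, p0 = p_mpa * 1e6, p0_mpa * 1e6          # MPa -> Pa
    joules = p * volume_m3 * math.log(p / p0) - (p - p0) * volume_m3
    return joules / 3.6e6                       # J -> kWh

# A hollow pile holding 2 m^3 of air at 10 MPa stores roughly 20 kWh (ideal):
print(caes_energy_kwh(2.0, 10.0))
```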

  12. D-Light on promoters: a client-server system for the analysis and visualization of cis-regulatory elements

    PubMed Central

    2013-01-01

    Background The binding of transcription factors to DNA plays an essential role in the regulation of gene expression. Numerous experiments elucidated binding sequences which subsequently have been used to derive statistical models for predicting potential transcription factor binding sites (TFBS). The rapidly increasing number of genome sequence data requires sophisticated computational approaches to manage and query experimental and predicted TFBS data in the context of other epigenetic factors and across different organisms. Results We have developed D-Light, a novel client-server software package to store and query large amounts of TFBS data for any number of genomes. Users can add small-scale data to the server database and query them in a large scale, genome-wide promoter context. The client is implemented in Java and provides simple graphical user interfaces and data visualization. Here we also performed a statistical analysis showing what a user can expect for certain parameter settings and we illustrate the usage of D-Light with the help of a microarray data set. Conclusions D-Light is an easy to use software tool to integrate, store and query annotation data for promoters. A public D-Light server, the client and server software for local installation and the source code under GNU GPL license are available at http://biwww.che.sbg.ac.at/dlight. PMID:23617301

  13. The AFFDL-Nielsen Flow-Field Study

    DTIC Science & Technology

    1976-04-01

    This investigation was conducted in the von Kármán Gas Dynamics Facility (VKF) Supersonic Wind Tunnel (A) for Nielsen... (1) flow-field surveys, using a cone probe rake to determine the local velocity field; (2) pressure distributions on a store model; and (3) force and moment data on a store model. In addition, free-stream (interference-free) data were obtained with the probe rake and on the force and pressure store...

  14. Effective combination of DIC, AE, and UPV nondestructive techniques on a scaled model of the Belgian nuclear waste container

    NASA Astrophysics Data System (ADS)

    Iliopoulos, Sokratis N.; Areias, Lou; Pyl, Lincy; Vantomme, John; Van Marcke, Philippe; Coppens, Erik; Aggelis, Dimitrios G.

    2015-03-01

    Protecting the environment and future generations against the potential hazards arising from high-level and heat-emitting radioactive waste is a worldwide concern. Following this direction, the Belgian Agency for Radioactive Waste and Enriched Fissile Materials has come up with the reference design, which considers the geological disposal of the waste in poorly indurated clay. In this design the wastes are first post-conditioned in massive concrete structures called Supercontainers before being transported to the underground repositories. The Supercontainers are cylindrical structures which consist of four engineering barriers that from the inner to the outer surface are namely: the overpack, the filler, the concrete buffer and possibly the envelope. The overpack, which is made of carbon steel, is the place where the vitrified wastes and spent fuel are stored. The buffer, which is made of concrete, creates a highly alkaline environment ensuring slow and uniform overpack corrosion as well as radiological shielding. In order to evaluate the feasibility of constructing such Supercontainers, two scaled models have so far been designed and tested. The first scaled model indicated crack formation on the surface of the concrete buffer, but the absence of a crack detection and monitoring system precluded defining the exact time of crack initiation, as well as the origin, the penetration depth, the crack path and the propagation history. For this reason, the second scaled model test was performed to obtain further insight by answering the aforementioned questions using the Digital Image Correlation, Acoustic Emission and Ultrasonic Pulse Velocity nondestructive testing techniques.

  15. Not in My Back Yard: A Comparative Analysis of Crime Around Publicly Funded Drug Treatment Centers, Liquor Stores, Convenience Stores, and Corner Stores in One Mid-Atlantic City

    PubMed Central

    Furr-Holden, C. Debra M.; Milam, Adam J.; Nesoff, Elizabeth D.; Johnson, Renee M.; Fakunle, David O.; Jennings, Jacky M.; Thorpe, Roland J.

    2016-01-01

    Objective: This research examined whether publicly funded drug treatment centers (DTCs) were associated with violent crime in excess of the violence happening around other commercial businesses. Method: Violent crime data and locations of community entities were geocoded and mapped. DTCs and other retail outlets were matched based on a Neighborhood Disadvantage score at the census tract level. Street network buffers ranging from 100 to 1,400 feet were placed around each location. Negative binomial regression models were used to estimate the relationship between the count of violent crimes and the distance from each business type. Results: Compared with the mean count of violent crime around drug treatment centers, the mean count of violent crime (M = 2.87) was significantly higher around liquor stores (M = 3.98; t test; p < .01) and corner stores (M = 3.78; t test; p < .01), and there was no statistically significant difference between the count around convenience stores (M = 2.65; t test; p = .32). In the adjusted negative binomial regression models, there was a negative and significant relationship between the count of violent crime and the distance from drug treatment centers (β = -.069, p < .01), liquor stores (β = -.081, p < .01), corner stores (β = -.116, p < .01), and convenience stores (β = -.154, p < .01). Conclusions: Violent crime associated with drug treatment centers is similar to that associated with liquor stores and is less frequent than that associated with convenience stores and corner stores. PMID:26751351

  16. Semantic Web repositories for genomics data using the eXframe platform

    PubMed Central

    2014-01-01

    Background With the advent of inexpensive assay technologies, there has been an unprecedented growth in genomics data as well as the number of databases in which it is stored. In these databases, sample annotation using ontologies and controlled vocabularies is becoming more common. However, the annotation is rarely available as Linked Data, in a machine-readable format, or for standardized queries using SPARQL. This makes large-scale reuse, or integration with other knowledge bases very difficult. Methods To address this challenge, we have developed the second generation of our eXframe platform, a reusable framework for creating online repositories of genomics experiments. This second generation model now publishes Semantic Web data. To accomplish this, we created an experiment model that covers provenance, citations, external links, assays, biomaterials used in the experiment, and the data collected during the process. The elements of our model are mapped to classes and properties from various established biomedical ontologies. Resource Description Framework (RDF) data is automatically produced using these mappings and indexed in an RDF store with a built-in SPARQL Protocol and RDF Query Language (SPARQL) endpoint. Conclusions Using the open-source eXframe software, institutions and laboratories can create Semantic Web repositories of their experiments, integrate them with heterogeneous resources and make them interoperable with the vast Semantic Web of biomedical knowledge. PMID:25093072

  17. Multitemporal L- and C-Band Synthetic Aperture Radar To Highlight Differences in Water Status Among Boreal Forest and Wetland Systems in the Yukon Flats, Interior Alaska

    USGS Publications Warehouse

    Balser, Andrew W.; Wylie, Bruce K.

    2010-01-01

    Tracking landscape-scale water status in high-latitude boreal systems is indispensable to understanding the fate of stored and sequestered carbon in a climate change scenario. Spaceborne synthetic aperture radar (SAR) imagery provides critical information on water and moisture status in Alaskan boreal environments at the landscape scale. When combined with results from optical sensor analyses, a complementary picture of vegetation, biomass, and water status emerges. Whereas L-band SAR showed better inherent capacity to map water status, C-band had much more temporal coverage in this study. Analysis using L- and C-band SAR combined with Landsat Enhanced Thematic Mapper Plus (ETM+) enables landscape stratification by vegetation and by seasonal and interannual hydrology. The resultant classifications are highly relevant to biogeochemistry at the landscape scale. These results enhance our understanding of ecosystem processes relevant to carbon balance and may be scaled up to inform regional carbon flux estimates and better parameterize general circulation models (GCMs).

  18. Current status of postnatal depression smartphone applications available on application stores: an information quality analysis.

    PubMed

    Zhang, Melvyn Wb; Ho, Roger Cm; Loh, Alvona; Wing, Tracey; Wynne, Olivia; Chan, Sally Wai Chi; Car, Josip; Fung, Daniel Shuen Sheng

    2017-11-14

    The aim of the current research was to identify common functionalities of postnatal depression applications and to determine the quality of their information content using validated scales that have been applied to applications in other specialties. To determine the information quality of postnatal depression smartphone applications, the two most widely used smartphone application stores, Apple iTunes and the Google Android Play store, were searched between 20 May and 31 May. No participants were involved. The inclusion criteria were that an application must be retrievable using the keywords 'postnatal', 'pregnancy', 'perinatal', 'postpartum' and 'depression', and must be in English. The Silberg Scale was used to assess the information quality of the smartphone applications, with the information quality score as the primary outcome measure. Our results highlight that, while there is currently a myriad of applications, only 14 are specifically focused on postnatal depression. In addition, the majority of the currently available applications have disclosed only their last date of modification and their ownership. There remain very limited disclosures about the authors, as well as about the references for the information included in the applications themselves. The average Silberg Scale score for the postnatal applications we analysed was 3.0. There remains a need for healthcare professionals and developers to jointly conceptualise new applications with better information quality and evidence base. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  19. Transient thermal analysis for radioactive liquid mixing operations in a large-scaled tank

    DOE PAGES

    Lee, S. Y.; Smith, III, F. G.

    2014-07-25

    A transient heat balance model was developed to assess the impact of a Submersible Mixer Pump (SMP) on radioactive liquid temperature during the process of waste mixing and removal for the high-level radioactive materials stored in Savannah River Site (SRS) tanks. The model results will be used mainly to determine the SMP design impacts on waste tank temperature during operations and to develop a specification for a new SMP design to replace the existing long-shaft mixer pumps used during waste removal. The model was benchmarked against test data obtained from tank measurements to examine the quantitative thermal response of the tank and to establish reference conditions for the operating variables with no SMP in operation. The results showed that the model predictions agreed with the measured waste temperatures to within about 10%.
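
    A lumped-parameter version of such a transient heat balance can be sketched in a few lines; the thermal capacity, heat load, and loss coefficient below are illustrative placeholders, not SRS tank parameters.

        # Forward-Euler integration of m*cp*dT/dt = Q_in - UA*(T - T_amb)
        m_cp  = 4.2e9   # tank thermal capacity, J/K (assumed)
        Q_in  = 5.0e4   # decay heat plus mixer pump work, W (assumed)
        UA    = 2.0e3   # heat loss coefficient to surroundings, W/K (assumed)
        T_amb = 25.0    # ambient temperature, deg C

        dt, T = 3600.0, 30.0         # 1 h time step, initial liquid temperature
        for _ in range(24 * 30):     # simulate 30 days
            T += (Q_in - UA * (T - T_amb)) / m_cp * dt
        print(f"temperature after 30 days: {T:.1f} C")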

  20. Washington State Spirits Privatization: How Satisfied were Liquor Purchasers Before and After, and by Type of Retail Store in 2014?

    PubMed

    Greenfield, Thomas K; Williams, Edwina; Kerr, William C; Subbaraman, Meenakshi S; Ye, Yu

    2018-07-03

    In 2012 Washington State ended a wholesale/retail monopoly on liquor, permitting the sale of spirits in stores larger than 10,000 square feet. Implementation resulted in average price increases, but also in a fivefold increase in the number of stores selling liquor. As part of a privatization evaluation, we studied purchase experiences pre-post and between store types. A 2010 Washington State Liquor Control Board (LCB) survey of liquor purchasers (n = 599) and the 2014 baseline of a repeated telephone survey (1,202 residents; n = 465 purchasers) each included 10 LCB questions on satisfaction with purchase experiences, each attribute graded on a response scale from A = 4 to D = 1, with F = 0 (fail). Analyses used t tests for satisfaction differences by time and analysis of variance (ANOVA) for 2014 between-store-type differences in satisfaction levels. Five purchase features were rated more favorably after privatization (p values ranging from < .05 to < .001), including product supply, staff professionalism, location convenience, store hours, and prices (though price was rated lowest at both times); selection offered, courtesy, and checkout speed were unaltered, and number of staff and staff knowledge declined (both p < .001). Eight consumer experiences differed by store type: five satisfaction aspects (supply, selection, number of staff, operating hours, and checkout speed) were highest for liquor superstores, while location convenience favored grocery and drug stores, and price satisfaction favored wholesale (Costco) stores, with staff knowledge highest at liquor stores. Satisfaction with liquor purchases increased after privatization for half the consumer experiences. Availability (location convenience and store hours) was important to liquor purchasers. Such results are relevant to sustained support for the policy of privatizing spirits retail monopolies.

  1. Study on store-space assignment based on logistic AGV in e-commerce goods to person picking pattern

    NASA Astrophysics Data System (ADS)

    Xu, Lijuan; Zhu, Jie

    2017-10-01

    This paper studied store-space assignment based on logistic AGVs in the e-commerce goods-to-person picking pattern. A store-space assignment model minimizing picking cost was established, and a store-space assignment algorithm was designed following a cluster analysis based on similarity coefficients. An example analysis then compared the picking cost of the proposed algorithm with assignment by item number and with storage according to ABC classification, verifying the effectiveness of the designed store-space assignment algorithm.
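
    The core idea, that frequently picked items should get the cheapest slots, can be sketched as a greedy assignment; the SKUs, pick frequencies, and travel costs below are invented, and the similarity-coefficient clustering of correlated items described in the paper is omitted.

        # Greedy store-space assignment by pick frequency
        picks = {"sku_a": 120, "sku_b": 45, "sku_c": 300, "sku_d": 80}
        slot_cost = {"s1": 1.0, "s2": 1.5, "s3": 2.2, "s4": 3.0}  # AGV travel cost

        skus = sorted(picks, key=picks.get, reverse=True)   # busiest SKUs first
        slots = sorted(slot_cost, key=slot_cost.get)        # cheapest slots first
        assignment = dict(zip(skus, slots))

        total_cost = sum(picks[k] * slot_cost[assignment[k]] for k in skus)
        print(assignment, total_cost)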

  2. A body composition model to estimate mammalian energy stores and metabolic rates from body mass and body length, with application to polar bears.

    PubMed

    Molnár, Péter K; Klanjscek, Tin; Derocher, Andrew E; Obbard, Martyn E; Lewis, Mark A

    2009-08-01

    Many species experience large fluctuations in food availability and depend on energy from fat and protein stores for survival, reproduction and growth. Body condition and, more specifically, energy stores thus constitute key variables in the life history of many species. Several indices exist to quantify body condition but none can provide the amount of stored energy. To estimate energy stores in mammals, we propose a body composition model that differentiates between structure and storage of an animal. We develop and parameterize the model specifically for polar bears (Ursus maritimus Phipps) but all concepts are general and the model could be easily adapted to other mammals. The model provides predictive equations to estimate structural mass, storage mass and storage energy from an appropriately chosen measure of body length and total body mass. The model also provides a means to estimate basal metabolic rates from body length and consecutive measurements of total body mass. Model estimates of body composition, structural mass, storage mass and energy density of 970 polar bears from Hudson Bay were consistent with the life history and physiology of polar bears. Metabolic rate estimates of fasting adult males derived from the body composition model corresponded closely to theoretically expected and experimentally measured metabolic rates. Our method is simple, non-invasive and provides considerably more information on the energetic status of individuals than currently available methods.
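
    The structure/storage split lends itself to a short worked sketch: structural mass scales allometrically with body length, and whatever body mass exceeds it is treated as storage. The coefficient and energy density below are invented placeholders, not the published polar bear parameters.

        # Storage energy from straight-line body length and total body mass
        def storage_energy(length_m, mass_kg, a=65.0, energy_density=2.0e7):
            structural_mass = a * length_m ** 3          # assumed allometry, kg
            storage_mass = max(mass_kg - structural_mass, 0.0)
            return storage_mass * energy_density         # joules

        print(f"{storage_energy(2.2, 450.0):.2e} J")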

  3. Choosing the best partition of the output from a large-scale simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Challacombe, Chelsea Jordan; Casleton, Emily Michele

    Data partitioning becomes necessary when a large-scale simulation produces more data than can be feasibly stored. The goal is to partition the data, typically so that every element belongs to one and only one partition, and store summary information about each partition, either a representative value plus an estimate of the error or a distribution. Once the partitions are determined and the summary information stored, the raw data is discarded. This process can be performed in situ, meaning while the simulation is running. When creating the partitions there are many decisions that researchers must make: for instance, how to determine once an adequate number of partitions has been created, how the partitions are created with respect to dividing the data, or how many variables should be considered simultaneously. In addition, decisions must be made about how to summarize the information within each partition. Because of the combinatorial number of possible ways to partition and summarize the data, a method of comparing the different possibilities will help guide researchers toward choosing a good partitioning and summarization scheme for their application.
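
    A minimal sketch of the summarize-and-discard idea, assuming equal-sized partitions and a mean-plus-error summary (one of the choices the authors enumerate):

        import numpy as np

        rng = np.random.default_rng(0)
        raw = rng.normal(size=1_000_000)     # stand-in for simulation output

        partitions = np.array_split(raw, 100)
        # Keep a representative value plus an error estimate per partition;
        # the raw data would then be discarded (possibly in situ)
        summary = [(p.mean(), p.std(ddof=1)) for p in partitions]
        print(summary[0])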

  4. Water age and stream solute dynamics at the Hubbard Brook Experimental Forest (US)

    NASA Astrophysics Data System (ADS)

    Botter, Gianluca; Benettin, Paolo; McGuire, Kevin; Rinaldo, Andrea

    2016-04-01

    The contribution discusses experimental and modeling results from a headwater catchment at the Hubbard Brook Experimental Forest (New Hampshire, USA) to explore the link between stream solute dynamics and water age. A theoretical framework based on water age dynamics, which represents a general basis for characterizing solute transport at the catchment scale, is used to model both conservative and weathering-derived solutes. Based on the available information about the hydrology of the site, an integrated transport model was developed and used to estimate the relevant hydrochemical fluxes. The model was designed to reproduce the deuterium content of streamflow and allowed estimation of catchment water storage and dynamic travel time distributions (TTDs). Within this framework, dissolved silicon and sodium concentrations in streamflow were simulated by implementing first-order chemical kinetics based explicitly on dynamic TTDs, thus upscaling local geochemical processes to the catchment scale. Our results highlight the key role of water stored within the subsoil glacial material in both the short-term and long-term solute circulation at Hubbard Brook. The analysis of the results provided by the calibrated model allowed a robust estimate of the emerging concentration-discharge relationship, streamflow age distributions (including the fraction of event water) and storage size, and their evolution in time due to hydrologic variability.
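
    The upscaling step, local first-order kinetics weighted by the travel time distribution, can be sketched as an age integral; the rate constant, equilibrium concentration, and exponential TTD below are assumptions for illustration, not the calibrated Hubbard Brook values.

        import numpy as np

        k, c_eq = 0.01, 4.0                    # 1/day and mg/L (assumed)
        T = np.linspace(0.0, 3000.0, 6001)     # water age, days
        dT = T[1] - T[0]
        p = np.exp(-T / 300.0) / 300.0         # assumed exponential TTD, mean 300 d

        # Older water sits closer to chemical equilibrium; streamflow
        # concentration is the age-weighted mixture
        c_age = c_eq * (1.0 - np.exp(-k * T))
        c_stream = np.sum(c_age * p) * dT
        print(f"stream concentration: {c_stream:.2f} mg/L")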

  5. Evaporation suppression from water reservoirs using floating covers: Lab scale observations and model predictions

    NASA Astrophysics Data System (ADS)

    Or, D.; Lehmann, P.; Aminzadeh, M.; Sommer, M.; Wey, H.; Wunderli, H.; Breitenstein, D.

    2016-12-01

    The competition over dwindling fresh water resources is expected to intensify with the projected increase in human population in arid regions, the expansion of irrigated land, and changes in climate and drought patterns. The volume of water stored in reservoirs would also increase to mitigate seasonal shortages due to rainfall variability and to meet irrigation water needs. By some estimates, up to half of the stored water is lost to evaporation, thereby exacerbating the water scarcity problem. Recently, there has been an upsurge in the use of self-assembling floating covers to suppress evaporation, yet their design and implementation remain largely empirical. Studies have shown that evaporation suppression is highly nonlinear, as also known from a century of research on gas exchange from plant leaves (which often evaporate like free water surfaces through stomata that make up only 1% of leaf area). We report a systematic evaluation of the effects of different cover types and external drivers (radiation, wind, wind+radiation) on the evaporation suppression and energy balance of a 1.4 m2 basin placed in a wind tunnel. Surprisingly, evaporation suppression by black and white floating covers (balls and plates) was similar despite significantly different energy balance regimes over the cover surfaces. Moreover, the evaporation suppression efficiency was a simple function of the uncovered area (square root of the uncovered fraction), with linear relations with the covered area in some cases. The thermally decoupled floating covers offer an efficient solution for evaporation suppression with limited influence on the surface energy balance (water temperature for black and white covers was similar and remained nearly constant). The results will be linked with a predictive evaporation-energy balance model, and issues of spatial scales and long exposure times will be studied.
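
    The reported square-root relation makes the nonlinearity easy to see in a few lines; note how 99% cover still permits about 10% of open-water evaporation, echoing the leaf-stomata analogy above. This is a sketch of the stated empirical relation, not a fitted model.

        import numpy as np

        f_cover = np.array([0.0, 0.5, 0.8, 0.9, 0.99])
        relative_evaporation = np.sqrt(1.0 - f_cover)   # E/E0 = sqrt(uncovered)
        for fc, rel in zip(f_cover, relative_evaporation):
            print(f"cover {fc:4.2f} -> evaporation {rel:4.2f} of open water")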

  6. Temporal Organization of Sound Information in Auditory Memory.

    PubMed

    Song, Kun; Luo, Huan

    2017-01-01

    Memory is a constructive and organizational process. Instead of being stored with all the fine details, external information is reorganized and structured at certain spatiotemporal scales. It is well acknowledged that time plays a central role in audition by segmenting sound inputs into temporal chunks of appropriate length. However, it remains largely unknown whether critical temporal structures exist to mediate sound representation in auditory memory. To address the issue, we designed an auditory memory transfer study, combining a previously developed unsupervised white noise memory paradigm with a reversed sound manipulation method. Specifically, we systematically measured the memory transfer from a random white noise sound to its locally temporally reversed version at various temporal scales in seven experiments. We demonstrate a U-shaped memory transfer pattern with the minimum value around a temporal scale of 200 ms. Furthermore, neither auditory perceptual similarity nor physical similarity as a function of the manipulated temporal scale can account for the memory transfer results. Our results suggest that sounds are not stored with all their fine spectrotemporal details but are organized and structured in discrete temporal chunks in long-term auditory memory representation.
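
    The local temporal-reversal manipulation is straightforward to express in code; the sample rate and chunk length here are arbitrary choices, not the experiment's exact settings.

        import numpy as np

        def reverse_in_chunks(signal, fs, chunk_ms):
            """Reverse a waveform locally within fixed-length temporal chunks."""
            n = max(int(fs * chunk_ms / 1000.0), 1)
            return np.concatenate(
                [signal[i:i + n][::-1] for i in range(0, len(signal), n)])

        fs = 16000                                          # assumed, Hz
        noise = np.random.default_rng(1).normal(size=fs)    # 1 s white-noise token
        manipulated = reverse_in_chunks(noise, fs, chunk_ms=200)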

  7. Simulating hydrologic and hydraulic processes throughout the Amazon River Basin

    USGS Publications Warehouse

    Beighley, R.E.; Eggert, K.G.; Dunne, T.; He, Y.; Gummadi, V.; Verdin, K.L.

    2009-01-01

    Presented here is a model framework based on a land surface topography that can be represented with various degrees of resolution and capable of providing representative channel/floodplain hydraulic characteristics on a daily to hourly scale. The framework integrates two models: (1) a water balance model (WBM) for the vertical fluxes and stores of water in and through the canopy and soil layers based on the conservation of mass and energy, and (2) a routing model for the horizontal routing of surface and subsurface runoff and channel and floodplain waters based on kinematic and diffusion wave methodologies. The WBM is driven by satellite-derived precipitation (TRMM_3B42) and air temperature (MOD08_M3). The model's use of an irregular computational grid is intended to facilitate parallel processing for applications to continental and global scales. Results are presented for the Amazon Basin over the period Jan 2001 through Dec 2005. The model is shown to capture annual runoff totals, annual peaks, seasonal patterns, and daily fluctuations over a range of spatial scales (>1,000 to <4.7 million km2). For the period of study, results suggest basin-wide total water storage changes in the Amazon vary by approximately +/- 5 to 10 cm, and the fractional components accounting for these changes are: root zone soil moisture (20%), subsurface water being routed laterally to channels (40%) and channel/floodplain discharge (40%). Annual variability in monthly water storage changes by +/- 2.5 cm is likely due to 0.5 to 1 month variability in the arrival of significant rainfall periods throughout the basin.
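
    The vertical water balance component can be caricatured as a single bucket with saturation-excess runoff and a linear subsurface release; the parameters and forcing below are illustrative only, not the Amazon configuration.

        # One daily step of a minimal bucket water balance (units: mm)
        def bucket_step(storage, rain, pet, capacity=150.0, k=0.05):
            et = min(pet, storage)                  # evapotranspiration
            storage = storage - et + rain
            runoff = max(storage - capacity, 0.0)   # saturation excess
            storage -= runoff
            baseflow = k * storage                  # linear reservoir release
            return storage - baseflow, runoff + baseflow

        S, flows = 100.0, []
        for rain, pet in [(12, 4), (0, 5), (30, 3), (5, 4)]:
            S, q = bucket_step(S, rain, pet)
            flows.append(round(q, 2))
        print(flows)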

  8. Effect of increases in energy-related labor forces upon retailing in Alabama

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robicheaux, R.A.

    1983-06-01

    The heightened mining employment that will result from increased extraction of coal from Alabama's Warrior Coal Basin will boost retail sales and employment. The Warrior Coal Basin counties (Fayette, Jefferson, Tuscaloosa and Walker) are heavily dependent upon coal mining as a source of employment and wages. Further, since the counties' economies grew increasingly dependent upon coal mining activities throughout the 1970s, it was believed that it would be possible to measure, with some acceptable level of reliability, the impact of the steadily rising mining activity upon the area's retailing sector. Therefore, a small-scale econometric model was developed which represents the interrelationships among income, mining and trade employment and retail sales in the four-county Warrior Coal Basin area. The results of two versions of the model are presented. In the first version, area-wide retail sales are treated in the aggregate. In the second version, retail sales are disaggregated into twelve categories (e.g., food, apparel, furniture, etc.). The models were specified using 1960 to 1976 data. The mining employment growth scenario used in this report called for steady increases in mining employment that culminated in an employment level that is 4000 above the baseline employment projections by 1985. Both versions of the model predicted that cumulative real regional income would increase by $1.39 billion over seven years with the added mining employment. The predicted impacts on trade employment and real retail sales varied between the two models, however. The aggregate model predicts the addition of 7500 trade workers and an additional $1.35 billion in real retail sales. The disaggregate model suggests that food stores, automobile dealers, general merchandise stores, gas stations and lumber and building materials retailers would enjoy the greatest positive benefits.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shao, Jiahang; Antipov, Sergey P.; Baryshev, Sergey V.

    Field emission from a solid metal surface has been continuously studied for a century over macroscopic to atomic scales. It is general knowledge that, other than the surface properties, the emitted current is governed solely by the applied electric field. A pin cathode has been used to study the dependence of field emission on stored energy in an L-band rf gun. The stored energy was changed by adjusting the axial position (the distance between the cathode base and the gun back surface) of the cathode while the applied electric field on the cathode tip was kept constant. A very strong correlation of the field-emission current with the stored energy was observed. While eliminating all possible interfering sources, an enhancement of the current by a factor of 5 was obtained as the stored energy was increased by a factor of 3. This implies that under certain circumstances a localized field emission may be significantly altered by the global parameters of a system.

  10. Spatial-temporal modeling of neighborhood sociodemographic characteristics and food stores.

    PubMed

    Lamichhane, Archana P; Warren, Joshua L; Peterson, Marc; Rummo, Pasquale; Gordon-Larsen, Penny

    2015-01-15

    The literature on food stores, neighborhood poverty, and race/ethnicity is mixed and lacks methods of accounting for complex spatial and temporal clustering of food resources. We used quarterly data on supermarket and convenience store locations from Nielsen TDLinx (Nielsen Holdings N.V., New York, New York) spanning 7 years (2006-2012) and census tract-based neighborhood sociodemographic data from the American Community Survey (2006-2010) to assess associations between neighborhood sociodemographic characteristics and food store distributions in the Metropolitan Statistical Areas (MSAs) of 4 US cities (Birmingham, Alabama; Chicago, Illinois; Minneapolis, Minnesota; and San Francisco, California). We fitted a space-time Poisson regression model that accounted for the complex spatial-temporal correlation structure of store locations by introducing space-time random effects in an intrinsic conditionally autoregressive model within a Bayesian framework. After accounting for census tract-level area, population, their interaction, and spatial and temporal variability, census tract poverty was significantly and positively associated with increasing expected numbers of supermarkets among tracts in all 4 MSAs. A similar positive association was observed for convenience stores in Birmingham, Minneapolis, and San Francisco; in Chicago, a positive association was observed only for predominantly white and predominantly black tracts. Our findings suggest a positive association between greater numbers of food stores and higher neighborhood poverty, with implications for policy approaches related to food store access by neighborhood poverty. © The Author 2014. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Comparison between sparsely distributed memory and Hopfield-type neural network models

    NASA Technical Reports Server (NTRS)

    Keeler, James D.

    1986-01-01

    The Sparsely Distributed Memory (SDM) model (Kanerva, 1984) is compared to Hopfield-type neural-network models. A mathematical framework for comparing the two is developed, and the capacity of each model is investigated. The capacity of the SDM can be increased independently of the dimension of the stored vectors, whereas the Hopfield capacity is limited to a fraction of this dimension. However, the total number of stored bits per matrix element is the same in the two models, as well as for extended models with higher order interactions. The models are also compared in their ability to store sequences of patterns. The SDM is extended to include time delays so that contextual information can be used to recover sequences. Finally, it is shown how a generalization of the SDM allows storage of correlated input pattern vectors.

  12. Traceability, reproducibility and wiki-exploration for “à-la-carte” reconstructions of genome-scale metabolic models

    PubMed Central

    Got, Jeanne; Cortés, María Paz; Maass, Alejandro

    2018-01-01

    Genome-scale metabolic models have become the tool of choice for the global analysis of microorganism metabolism, and their reconstruction has attained high standards of quality and reliability. Improvements in this area have been accompanied by the development of some major platforms and databases, and an explosion of individual bioinformatics methods. Consequently, many recent models result from “à la carte” pipelines, combining the use of platforms, individual tools and biological expertise to enhance the quality of the reconstruction. Although very useful, introducing heterogeneous tools that hardly interact with each other causes a loss of traceability and reproducibility in the reconstruction process. This represents a real obstacle, especially when considering less studied species whose metabolic reconstruction can greatly benefit from comparison with good-quality models of related organisms. This work proposes an adaptable workspace, AuReMe, for sustainable reconstructions or improvements of genome-scale metabolic models involving personalized pipelines. At each step, relevant information related to the modifications brought to the model by a method is stored. This ensures that the process is reproducible and documented regardless of the combination of tools used. Additionally, the workspace establishes a way to browse metabolic models and their metadata through the automatic generation of ad hoc local wikis dedicated to monitoring and facilitating the process of reconstruction. AuReMe supports exploration and semantic query based on RDF databases. We illustrate how this workspace allowed handling, in an integrated way, the metabolic reconstructions of non-model organisms such as an extremophile bacterium or eukaryote algae. Among relevant applications, the latter reconstruction led to putative evolutionary insights into a metabolic pathway. PMID:29791443

  13. Efficient spiking neural network model of pattern motion selectivity in visual cortex.

    PubMed

    Beyeler, Michael; Richert, Micah; Dutt, Nikil D; Krichmar, Jeffrey L

    2014-07-01

    Simulating large-scale models of biological motion perception is challenging, due to the required memory to store the network structure and the computational power needed to quickly solve the neuronal dynamics. A low-cost yet high-performance approach to simulating large-scale neural network models in real-time is to leverage the parallel processing capability of graphics processing units (GPUs). Based on this approach, we present a two-stage model of visual area MT that we believe to be the first large-scale spiking network to demonstrate pattern direction selectivity. In this model, component-direction-selective (CDS) cells in MT linearly combine inputs from V1 cells that have spatiotemporal receptive fields according to the motion energy model of Simoncelli and Heeger. Pattern-direction-selective (PDS) cells in MT are constructed by pooling over MT CDS cells with a wide range of preferred directions. Responses of our model neurons are comparable to electrophysiological results for grating and plaid stimuli as well as speed tuning. The behavioral response of the network in a motion discrimination task is in agreement with psychophysical data. Moreover, our implementation outperforms a previous implementation of the motion energy model by orders of magnitude in terms of computational speed and memory usage. The full network, which comprises 153,216 neurons and approximately 40 million synapses, processes 20 frames per second of a 40 × 40 input video in real-time using a single off-the-shelf GPU. To promote the use of this algorithm among neuroscientists and computer vision researchers, the source code for the simulator, the network, and analysis scripts are publicly available.

  14. Weighted-outer-product associative neural network

    NASA Astrophysics Data System (ADS)

    Ji, Han-Bing

    1991-11-01

    A weighted outer product learning (WOPL) scheme for associative memory neural networks is presented in which learning orders are incorporated into the Hopfield model. WOPL can be guaranteed to achieve correct recall of certain stored patterns regardless of whether or not they are stable in the Hopfield model, and regardless of whether the number of stored patterns is small or large. A technically sufficient condition is also discussed for how to suitably choose learning orders so as to fully utilize WOPL for correct recall of as many stored patterns as possible.
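
    A minimal sketch of the weighted outer-product idea, with each bipolar pattern entering the coupling matrix under its own weight; the patterns, weights, and recall loop are illustrative, not the paper's parameterization.

        import numpy as np

        def wopl_weights(patterns, orders):
            """Couplings as a weighted sum of pattern outer products."""
            n = patterns.shape[1]
            W = np.zeros((n, n))
            for xi, w in zip(patterns, orders):
                W += w * np.outer(xi, xi)
            np.fill_diagonal(W, 0.0)   # no self-coupling
            return W

        patterns = np.array([[1, -1, 1, -1, 1],
                             [1, 1, -1, -1, 1]])
        W = wopl_weights(patterns, orders=[2.0, 1.0])  # weights are illustrative

        x = np.array([1, -1, 1, -1, -1])     # noisy cue for the first pattern
        for _ in range(5):
            x = np.where(W @ x >= 0, 1, -1)  # synchronous sign updates
        print(x)                             # converges to the first pattern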

  15. Force Reconstruction from Ejection Tests of Stores from Aircraft Used for Model Predictions and Missing/Bad Gages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Michael; Cap, Jerome S.; Starr, Michael J.

    One of the more severe environments for a store on an aircraft is the ejection of the store. During this environment it is not possible to instrument all component responses, and it is also likely that some instruments may fail during environment testing. This work provides a method for developing these responses for failed gages and uninstrumented locations. First, the forces observed by the store during the environment are reconstructed; a simple sampling method is used to reconstruct these forces given various parameters. Then, these forces are applied to a model to generate the component responses. Validation is performed on this methodology.

  16. Integrated experimental and modeling assessment of potential effects of gas leakages on groundwater composition

    NASA Astrophysics Data System (ADS)

    Berta, Marton; Dethlefsen, Frank; Ebert, Markus; Schäfer, Dirk

    2017-04-01

    Storing renewably produced energy is one of the major challenges for the energy systems of the upcoming decades. Power-to-gas technologies coupled to geological storage of compressed air, methane, and hydrogen offer a comparatively safe and cost-efficient way of large-scale energy storage. However, the stored gases can potentially escape from their geological reservoir and may thus affect protected natural goods such as groundwater. The geochemical reactions responsible for the resulting composition changes are usually investigated separately, in experiments and in numerical models. Here we present the outcomes of an integrated experimental and modeling approach through the example of a compressed air leakage scenario. A main consequence of the presence of oxygen in an aquifer to be assessed is pyrite oxidation, well known from acid mine drainage sites. However, in contrast to acid mine drainage sites, which exhibit unsaturated sediments and are fed by meteoric low-carbonate water, aquifers such as those in Northern Germany contain a considerable amount of solid and dissolved inorganic carbon species that can potentially buffer pH changes. High-pressure flow-through column experiments representing an intrusion of compressed air into an aquifer were carried out to quantify pyrite oxidation kinetics and to incorporate the observations into a descriptive reaction model. Surface passivation was found to decrease the reactivity of pyrite by more than 90% after a few months of experimental run time. We propose that the carbonate buffer system enables the precipitation of a passivating mineral layer on the pyrite surface, reducing the overall reaction rate significantly. Consequently, an established rate law from the literature was extended by a reactive surface passivation term [1]. This improved reaction rate equation was incorporated into a 3D numerical model using OpenGeoSys, with parameters representing the same typical aquifer conditions that the experiments had characterized. These boundary conditions include pyrite content, oxygen dissolution kinetics, groundwater composition including the carbonate buffer, and diffusive and advective transport parameters. The results of site-scale multiphase reactive transport modeling revealed the expected spatial distribution of redox-sensitive species such as oxygen, pyrite, and sulfate in an aquifer following a leakage. The changes in concentration of sulfate, dissolved oxygen, and H+ observed in the lab-scale experiments were qualitatively reproduced by the models applying the same boundary conditions at the site scale. This integrated study showed that the combination of experiments and models is a powerful tool for predicting the geochemical consequences of gas leakage at the site scale. However, it is as yet unknown how the passivation would be affected if the carbonate buffer were depleted over the long term, and under what circumstances a transition occurs from the passivating pyrite oxidation process to the non-passivating process observed, for instance, in acid mine drainage settings. These restrictions mark the limits of validity of our experimental and modeling concept. This conclusion suggests the feasibility of the presented integrated approach also when evaluating comparable scenarios of methane and hydrogen storage, based on experimental results gathered similarly [2]. [1] Berta et al., Environ Earth Sci (2016) 75:1175, DOI 10.1007/s12665-016-5985-7. [2] Berta et al., First Break (2015) 33, 93-95, ISSN 1365-2397. This work is part of the ANGUS+ project funded by the BMBF (FK 03EK3022).
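
    As a hedged sketch of how a passivation term can damp a first-order oxidation rate, consider the form below; the functional shape and constants are illustrative stand-ins, not the extended rate law published in [1].

        # Pyrite oxidation rate damped by a shrinking reactive surface
        def pyrite_rate(o2_molar, reacted_fraction, k=1e-8, alpha=0.9):
            passivation = max(1.0 - alpha * reacted_fraction, 0.0)
            return k * o2_molar * passivation   # mol per m2 per s

        # At 50% surface coverage the rate drops accordingly
        print(pyrite_rate(o2_molar=2.5e-4, reacted_fraction=0.5))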

  17. Supervised Learning Using Spike-Timing-Dependent Plasticity of Memristive Synapses.

    PubMed

    Nishitani, Yu; Kaneko, Yukihiro; Ueda, Michihito

    2015-12-01

    We propose a supervised learning model that enables error backpropagation for spiking neural network hardware. The method is modeled by modifying an existing model to suit the hardware implementation. An example of a network circuit for the model is also presented. In this circuit, a three-terminal ferroelectric memristor (3T-FeMEM), which is a field-effect transistor with a gate insulator composed of ferroelectric materials, is used as an electric synapse device to store the analog synaptic weight. Our model can be implemented by reflecting the network error in the write voltage of the 3T-FeMEMs and introducing a spike-timing-dependent learning function to the device. An XOR problem was successfully demonstrated as a benchmark learning task in numerical simulations, using the circuit properties to estimate the learning performance. In principle, the learning time per step of this supervised learning model and circuit is independent of the number of neurons in each layer, promising high-speed and low-power calculation in large-scale neural networks.
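
    The pair-based timing rule at the heart of such schemes can be sketched as an exponential STDP window; the amplitudes and time constant below are generic textbook values, not the 3T-FeMEM device characteristics.

        import numpy as np

        def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau=20.0):
            """Weight change for a spike-time difference (post minus pre)."""
            if dt_ms >= 0:                            # pre before post: potentiate
                return a_plus * np.exp(-dt_ms / tau)
            return -a_minus * np.exp(dt_ms / tau)     # post before pre: depress

        # In the proposed hardware, the network error would shift updates
        # like these via the memristor write voltage
        for dt in (-40, -10, 0, 10, 40):
            print(dt, round(stdp_dw(dt), 5))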

  18. Body mass dependence of glycogen stores in the anoxia-tolerant crucian carp (Carassius carassius L.)

    NASA Astrophysics Data System (ADS)

    Vornanen, Matti; Asikainen, Juha; Haverinen, Jaakko

    2011-03-01

    Glycogen is a vital energy substrate for anaerobic organisms, and the size of glycogen stores can be a limiting factor for the anoxia tolerance of animals. To this end, glycogen stores in 12 different tissues of the crucian carp (Carassius carassius L.), an anoxia-tolerant fish species, were examined. Glycogen content of the different tissues was 2-10 times higher in winter (0.68-18.20% of tissue wet weight) than in summer (0.12-4.23%). In scale, bone, and brain, glycogen stores were strongly dependent on body mass (range 0.6-785 g), with small fish having significantly more glycogen than large fish (p < 0.05). In fin and skin, size dependence was evident in winter but not in summer, while in the other tissues (ventricle, atrium, intestine, liver, muscle, and spleen) no size dependence was found. The liver was much bigger in small than in large fish (p < 0.001), and there was a prominent enlargement of the liver in winter irrespective of fish size. As a consequence, the whole-body glycogen reserves, measured as the sum of glycogen from the different tissues, varied from 6.1% of body mass in 1-g fish to 2.0% in 800-g fish. Since anaerobic metabolic rate scales down with body size, the whole-body glycogen reserves could provide energy for approximately 79 and 88 days of anoxia in small and large fish, respectively. There was, however, a drastic difference in the tissue distribution of glycogen between large and small fish: in small fish the liver was the major glycogen store (68% of the stores), while in large fish the white myotomal muscle was the principal deposit of glycogen (57%). Since muscle glycogen is considered to be unavailable for blood glucose regulation, its usefulness in the anoxia tolerance of large crucian carp might be limited, although not excluded. Therefore, mobilization of muscle glycogen under anoxia needs to be rigorously tested.

  19. Spatial Distribution of Fate and Transport Parameters Using Cxtfit in a Karstified Limestone Model

    NASA Astrophysics Data System (ADS)

    Toro, J.; Padilla, I. Y.

    2017-12-01

    Karst environments have a high capacity to transport and store large amounts of water. This makes karst aquifers a productive resource for human consumption and ecological integrity, but also makes them vulnerable to potential contamination by hazardous chemical substances. The high heterogeneity and anisotropy of karst aquifer properties make them very difficult to characterize for accurate prediction of contaminant mobility and persistence in groundwater. Current technologies to characterize and quantify flow and transport processes at the field scale are limited by the low resolution of spatiotemporal data. To enhance this resolution and provide the essential knowledge of karst groundwater systems, studies at the laboratory scale can be conducted. This work uses an intermediate karstified lab-scale physical model (IKLPM) to study fate and transport processes and assess viable tools to characterize heterogeneities in karst systems. Transport experiments are conducted in the IKLPM using step injections of calcium chloride, uranine, and rhodamine WT tracers. Temporal concentration distributions (TCDs) obtained from the experiments are analyzed using the method of moments and CXTFIT to quantify fate and transport parameters in the system at various flow rates. The spatial distribution of the estimated fate and transport parameters for the tracers revealed high variability related to preferential flow heterogeneities and scale dependence. Results are integrated to define spatially variable transport regions within the system and assess their fate and transport characteristics.
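
    The method of moments applied to a breakthrough curve reduces to a few weighted sums; the synthetic Gaussian pulse below stands in for a measured temporal concentration distribution.

        import numpy as np

        def temporal_moments(t, c):
            """Recovered mass proxy, mean travel time, and variance of a TCD."""
            dt = t[1] - t[0]
            m0 = np.sum(c) * dt
            mean = np.sum(t * c) / np.sum(c)
            var = np.sum((t - mean) ** 2 * c) / np.sum(c)
            return m0, mean, var

        t = np.linspace(0.0, 100.0, 501)          # minutes (illustrative)
        c = np.exp(-(t - 30.0) ** 2 / 50.0)       # synthetic tracer pulse
        print(temporal_moments(t, c))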

  20. The Good Food Junction: a Community-Based Food Store Intervention to Address Nutritional Health Inequities

    PubMed Central

    Muhajarine, Nazeem; Ridalls, Tracy; Abonyi, Sylvia; Vatanparast, Hassan; Whiting, Susan; Walker, Ryan

    2016-01-01

    Background This is a 2-year study to assess the early impacts of a new grocery store intervention in a former food desert. Objective The purpose of the study is to understand the early health effects of the introduction of a large-scale food and nutrition-focused community-based population health intervention, the Good Food Junction (GFJ) Cooperative Store, in a geographically bounded group of socially disadvantaged neighborhoods (the “core neighborhoods”) in a midsized Canadian city. The GFJ grocery store was tasked with improving the access of residents to healthy, affordable food. The 5 research questions are: (1) What is the awareness and perception of the GFJ store among residents of the core neighborhoods? (2) Are there differences in awareness and perception among those who do and do not shop at the GFJ? (3) Will healthy food purchasing at the GFJ by residents of the core neighborhoods change over time, and what purchases are these individuals making at this store? (4) What early impact(s) will the GFJ have on key health-related outcomes (such as household food security status, vegetable and fruit intake, key aspects of self-reported mental health, self-reported health)? and (5) Are the effects of the intervention seen for specific vulnerable population groups, such as Aboriginal people, seniors (65 years old or older) and new immigrants (settled in Saskatoon for less than 5 years)? Methods The research project examined initial impacts of the GFJ on the health of the residents in surrounding neighborhoods through a door-to-door cross-sectional survey of food access and household demographics; an examination of GFJ sales data by location of shoppers' residences; and a 1-year, 3-time-point longitudinal study of self-reported health of GFJ shoppers. Results Analyses are on-going, but preliminary results show that shoppers are using the store for its intended purpose, which is to improve access to healthy food in a former food desert. Conclusions To our knowledge this is the first large-scale study of a full-service grocery store intervention in a former food desert in Canada that has used multiple data sources, as well as longitudinal analyses, to examine its effects. Its findings will contribute significantly to the knowledge base on food environment interventions. PMID:27079140

  1. Modeling and Simulating Multiple Failure Masking enabled by Local Recovery for Stencil-based Applications at Extreme Scales

    DOE PAGES

    Gamell, Marc; Teranishi, Keita; Mayo, Jackson; ...

    2017-04-24

    Obtaining multi-process hard-failure resilience at the application level is a key challenge that must be overcome before the promise of exascale can be fully realized. Previous work has shown that online global recovery can dramatically reduce the overhead of failures when compared to the more traditional approach of terminating the job and restarting it from the last stored checkpoint. Performing online recovery in a local manner enables further scalability, not only due to the intrinsically lower costs of recovering locally, but also due to derived effects in some application types. In this paper we model one such effect, namely multiple failure masking, which manifests when running Stencil parallel computations in an environment where failures are recovered locally. First, the delay propagation shape of one or multiple locally recovered failures is modeled to enable several analyses of the probability of different levels of failure masking under certain Stencil application behaviors. These results indicate that failure masking is an extremely desirable effect at scale, whose manifestation becomes more evident and beneficial as the machine size or the failure rate increases.

  2. Modeling and Simulating Multiple Failure Masking enabled by Local Recovery for Stencil-based Applications at Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamell, Marc; Teranishi, Keita; Mayo, Jackson

    Obtaining multi-process hard-failure resilience at the application level is a key challenge that must be overcome before the promise of exascale can be fully realized. Previous work has shown that online global recovery can dramatically reduce the overhead of failures when compared to the more traditional approach of terminating the job and restarting it from the last stored checkpoint. Performing online recovery in a local manner enables further scalability, not only due to the intrinsically lower costs of recovering locally, but also due to derived effects in some application types. In this paper we model one such effect, namely multiple failure masking, which manifests when running Stencil parallel computations in an environment where failures are recovered locally. First, the delay propagation shape of one or multiple locally recovered failures is modeled to enable several analyses of the probability of different levels of failure masking under certain Stencil application behaviors. These results indicate that failure masking is an extremely desirable effect at scale, whose manifestation becomes more evident and beneficial as the machine size or the failure rate increases.

  3. [Activities of System Studies and Simulation, Inc.

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Contents include the following: 1. Launch Vehicle Interface Work Performed: a. S3 provided to KSC the new launch inclination targets needed for the April '04 launch date. 2. Prelaunch operations work performed: a. S3 updated the staffing plan for MSFC on-console personnel during the Final Countdown prior to launch. 3. Software Assessment Work Performed: a. S3 evaluated and recommended approval for Program Control Board (PCB) proposed change 649 for ground software changes, as well as change 650 and 650A for Stored Program Commands. 4. Education and Public Outreach Work Performed: a. S3 continues to coordinate the effort for the design and fabrication of scale models of the GP-3 for use at the launch site, education forums, and management/technical briefings. S3 also prepared a Change Request for additional funds needed for fabrication of additional scale models. S3 drafted the planned uses of these models, including the possibility of participation in the Boston, MA showings of the traveling Einstein Exhibit. 5. Program Management Support Work Performed: a. S3 prepared the input for and closed three MSFC Centerwide Action Item Tracking Systems (CAITS) actions during this period.

  4. Marketing Strategies to Encourage Rural Residents of High-Obesity Counties to Buy Fruits and Vegetables in Grocery Stores

    PubMed Central

    Liu, Emily; Stephenson, Tammy; Houlihan, Jessica

    2017-01-01

    Introduction Obesity rates in Appalachia are among the highest in the United States, and knowledge of upstream approaches to decrease prevalence among this vulnerable population is limited. The primary aim of this study was to examine the association between healthy, diet-based, social marketing interventions in grocery stores and frequency of fruit and vegetable intake. Methods A social marketing campaign was conducted among 17 grocery stores (N = 240 participant surveys) over 4 months in 5 rural Kentucky counties. Interventions included providing food samples, recipe cards, and promotional discounts on fruits and vegetables and moving high-calorie foods to side aisles. Results Most survey participants reported that recipe cards influenced their desire to purchase ingredients as well as fruits and vegetables in general. Results indicated a significant association between the influence of recipe cards and frequency of fruit and vegetable consumption. Conclusion Small-scale interventions in grocery stores influenced purchasing choices among Appalachian residents. Working with various store managers and food venues in rural high-obesity communities is a promising way to encourage purchasing of fruits and vegetables. PMID:29023231

  5. Marketing Strategies to Encourage Rural Residents of High-Obesity Counties to Buy Fruits and Vegetables in Grocery Stores.

    PubMed

    Liu, Emily; Stephenson, Tammy; Houlihan, Jessica; Gustafson, Alison

    2017-10-12

    Obesity rates in Appalachia are among the highest in the United States, and knowledge of upstream approaches to decrease prevalence among this vulnerable population is limited. The primary aim of this study was to examine the association between healthy, diet-based, social marketing interventions in grocery stores and frequency of fruit and vegetable intake. A social marketing campaign was conducted among 17 grocery stores (N = 240 participant surveys) over 4 months in 5 rural Kentucky counties. Interventions included providing food samples, recipe cards, and promotional discounts on fruits and vegetables and moving high-calorie foods to side aisles. Most survey participants reported that recipe cards influenced their desire to purchase ingredients as well as fruits and vegetables in general. Results indicated a significant association between the influence of recipe cards and frequency of fruit and vegetable consumption. Small-scale interventions in grocery stores influenced purchasing choices among Appalachian residents. Working with various store managers and food venues in rural high-obesity communities is a promising way to encourage purchasing of fruits and vegetables.

  6. Multifluid geo-energy systems: Using geologic CO2 storage for geothermal energy production and grid-scale energy storage in sedimentary basins

    DOE PAGES

    Buscheck, Thomas A.; Bielicki, Jeffrey M.; Edmunds, Thomas A.; ...

    2016-05-05

    We present an approach that uses the huge fluid and thermal storage capacity of the subsurface, together with geologic carbon dioxide (CO2) storage, to harvest, store, and dispatch energy from subsurface (geothermal) and surface (solar, nuclear, fossil) thermal resources, as well as excess energy on electric grids. Captured CO2 is injected into saline aquifers to store pressure, generate artesian flow of brine, and provide a supplemental working fluid for efficient heat extraction and power conversion. Concentric rings of injection and production wells create a hydraulic mound to store pressure, CO2, and thermal energy. This energy storage can take excess power from the grid and excess/waste thermal energy, and dispatch that energy when it is demanded, thus enabling higher penetration of variable renewable energy technologies (e.g., wind, solar). CO2 stored in the subsurface functions as a cushion gas to provide enormous pressure-storage capacity and displace large quantities of brine, some of which can be treated for a variety of beneficial uses. Geothermal power and energy-storage applications may generate enough revenues to compensate for CO2 capture costs. While our approach can use nitrogen (N2), in addition to CO2, as a supplemental fluid, and can store thermal energy, this study focuses on using CO2 for geothermal energy production and grid-scale energy storage. We conduct a techno-economic assessment to determine the levelized cost of electricity of using this approach to generate geothermal power. We present a reservoir pressure-management strategy that diverts a small portion of the produced brine for beneficial consumptive use to reduce the pumping cost of fluid recirculation, while reducing the risk of seismicity, caprock fracture, and CO2 leakage.

  7. Adiabatic Quantum Optimization for Associative Memory Recall

    NASA Astrophysics Data System (ADS)

    Seddiqi, Hadayat; Humble, Travis

    2014-12-01

    Hopfield networks are a variant of associative memory that recall patterns stored in the couplings of an Ising model. Stored memories are conventionally accessed as fixed points in the network dynamics that correspond to energetic minima of the spin state. We show that memories stored in a Hopfield network may also be recalled by energy minimization using adiabatic quantum optimization (AQO). Numerical simulations of the underlying quantum dynamics allow us to quantify AQO recall accuracy with respect to the number of stored memories and noise in the input key. We investigate AQO performance with respect to how memories are stored in the Ising model according to different learning rules. Our results demonstrate that AQO recall accuracy varies strongly with learning rule, a behavior that is attributed to differences in energy landscapes. Consequently, learning rules offer a family of methods for programming adiabatic quantum optimization that we expect to be useful for characterizing AQO performance.

  8. Coupling tree rings and eddy covariance to estimate long-term above and belowground carbon storage at the stand level

    NASA Astrophysics Data System (ADS)

    Dye, A.; Alexander, M. R.; Bishop, D.; Pederson, N.; Hessl, A. E.

    2016-12-01

    Storage of carbon in terrestrial plants and soils directly reduces atmospheric carbon concentration, and it is therefore imperative to improve our understanding of where carbon is being stored and released in an ecosystem and how storage and release are changing over time. At data-rich sites, coupling alternative measurements of carbon flux can improve this understanding. Here, we present a methodology to inversely model stand-level net storage and release of above- and belowground carbon over a period of 1-2 decades using co-located tree-ring plots and eddy covariance towers at three eastern U.S. forests. We reconstructed annual aboveground wood production (aNPP) from tree rings collected near eddy covariance towers. We compared our aNPP reconstructions with annual tower NEE to address whether interannual variations are correlated. Despite modest correlation, we observed magnitude differences between the two records that vary annually. We interpret these differences as indicative of changes in belowground carbon storage, i.e., an aNPP:NEE ratio > 1 indicates a net release of belowground carbon and a ratio < 1 a net storage of belowground carbon. For this interpretation, we assume the following: a) carbon not directed to above- or belowground pools is insignificant, b) carbon not stored above ground is stored below ground, and c) mature trees do not add to a storage pool at a higher level every year. While the offset between biometric aNPP and tower NEE could partially be attributed to the diversion of assimilated carbon to nonstructural carbohydrates instead of growth, we argue that this becomes a less important factor over longer time scales in a mature tree. Our approach does not quantify belowground NPP or allocation, but we present a method for estimating belowground carbon storage and release at the stand level, an otherwise difficult task at this scale due to heterogeneity across the stand.
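
    Under the three assumptions above, the bookkeeping is simple arithmetic, as in this sketch (values invented, in gC per m2 per yr, with NEE expressed as net uptake):

        # aNPP:NEE ratio as an indicator of belowground storage or release
        annual_aNPP = [210, 250, 190]
        annual_NEE = [260, 230, 240]

        for anpp, nee in zip(annual_aNPP, annual_NEE):
            below = nee - anpp    # positive -> net belowground storage
            label = "storage" if below > 0 else "release"
            print(f"aNPP:NEE = {anpp / nee:4.2f} -> belowground {label}: {abs(below)}")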

  9. The Grand Challenge of Basin-Scale Groundwater Quality Management Modelling

    NASA Astrophysics Data System (ADS)

    Fogg, G. E.

    2017-12-01

    The last 50+ years of agricultural, urban and industrial land and water use practices have accelerated the degradation of groundwater quality in the upper portions of many major aquifer systems upon which much of the world relies for water supply. In the deepest and most extensive systems (e.g., sedimentary basins) that typically have the largest groundwater production rates and hold fresh groundwaters on decadal to millennial time scales, most of the groundwater is not yet contaminated. Predicting the long-term future groundwater quality in such basins is a grand scientific challenge. Moreover, determining what changes in land and water use practices would avert future, irreversible degradation of these massive freshwater stores is a grand challenge both scientifically and societally. It is naïve to think that the problem can be solved by eliminating or reducing enough of the contaminant sources, for human exploitation of land and water resources will likely always result in some contamination. The key lies in both reducing the contaminant sources and more proactively managing recharge in terms of both quantity and quality, such that the net influx of contaminants is sufficiently moderate and appropriately distributed in space and time to reverse ongoing groundwater quality degradation. Just as sustainable groundwater quantity management is greatly facilitated with groundwater flow management models, sustainable groundwater quality management will require the use of groundwater quality management models. These constitute a new genre of hydrologic models that does not yet exist, partly because of the lack of modeling tools and of the supporting research needed to model non-reactive as well as reactive transport on large space and time scales. It is essential that the contaminant hydrogeology community, which has heretofore focused almost entirely on point-source plume-scale problems, direct its efforts toward the development of process-based transport modeling tools and analyses capable of appropriately upscaling advection-dispersion and reactions at the basin scale (10^2 km). A road map for research and development in groundwater quality management modeling and its application toward securing future groundwater resources will be discussed.

  10. Reducing Risk in CO2 Sequestration: A Framework for Integrated Monitoring of Basin Scale Injection

    NASA Astrophysics Data System (ADS)

    Seto, C. J.; Haidari, A. S.; McRae, G. J.

    2009-12-01

    Geological sequestration of CO2 is an option for stabilization of atmospheric CO2 concentrations. Technical ability to safely store CO2 in the subsurface has been demonstrated through pilot projects and a long history of enhanced oil recovery and acid gas disposal operations. To address climate change, current injection operations must be scaled up by a factor of 100, raising issues of safety and security. Monitoring and verification is an essential component in ensuring safe operations and managing risk. Monitoring provides assurance that CO2 is securely stored in the subsurface, and the mechanisms governing transport and storage are well understood. It also provides an early warning mechanism for identification of anomalies in performance, and a means for intervention and remediation through the ability to locate the CO2. Through theoretical studies, bench scale experiments and pilot tests, a number of technologies have demonstrated their ability to monitor CO2 in the surface and subsurface. Because the focus of these studies has been to demonstrate feasibility, individual techniques have not been integrated to provide a more robust method for monitoring. Considering the large volumes required for injection, size of the potential footprint, length of time a project must be monitored and uncertainty, operational considerations of cost and risk must balance safety and security. Integration of multiple monitoring techniques will reduce uncertainty in monitoring injected CO2, thereby reducing risk. We present a framework for risk management of large scale injection through model based monitoring network design. This framework is applied to monitoring CO2 in a synthetic reservoir where there is uncertainty in the underlying permeability field controlling fluid migration. Deformation and seismic data are used to track plume migration. A modified Ensemble Kalman filter approach is used to estimate flow properties by jointly assimilating flow and geomechanical observations. Issues of risk, cost and uncertainty are considered.
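
    For intuition, a basic stochastic ensemble Kalman analysis step is sketched below; this is a generic textbook update, not the modified filter of the study, and the parameter/observation setup is invented.

        import numpy as np

        def enkf_update(ens, obs, H, obs_var, rng):
            """One EnKF analysis step; ensemble members are columns of ens."""
            n = ens.shape[1]
            Hx = H @ ens
            X = ens - ens.mean(axis=1, keepdims=True)
            Y = Hx - Hx.mean(axis=1, keepdims=True)
            Pxy = X @ Y.T / (n - 1)                        # state-obs covariance
            Pyy = Y @ Y.T / (n - 1) + obs_var * np.eye(len(obs))
            K = Pxy @ np.linalg.inv(Pyy)                   # Kalman gain
            d = obs[:, None] + np.sqrt(obs_var) * rng.normal(size=Hx.shape)
            return ens + K @ (d - Hx)

        rng = np.random.default_rng(0)
        ens = rng.normal(1.0, 0.3, size=(3, 50))   # 3 permeability params, 50 members
        H = np.array([[1.0, 0.5, 0.0]])            # assumed linear obs operator
        post = enkf_update(ens, np.array([1.2]), H, obs_var=0.05, rng=rng)
        print(post.mean(axis=1))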

  11. Pricing of Staple Foods at Supermarkets versus Small Food Stores

    PubMed Central

    Caspi, Caitlin E.; Pelletier, Jennifer E.; Harnack, Lisa J.; Erickson, Darin J.; Laska, Melissa N.

    2017-01-01

    Prices affect food purchase decisions, particularly in lower-income communities, where access to a range of food retailers (including supermarkets) is limited. The aim of this study was to examine differences in staple food pricing between small urban food stores and the closest supermarkets, as well as whether pricing differentials varied based on proximity between small stores and larger retailers. In 2014, prices were measured for 15 staple foods during store visits in 140 smaller stores (corner stores, gas-marts, dollar stores, and pharmacies) in Minneapolis/St. Paul, MN and their closest supermarket. Mixed models controlling for store type were used to estimate the average price differential between: (a) smaller stores and supermarkets; (b) isolated smaller stores (>1 mile to closest supermarket) and non-isolated smaller stores; and (c) isolated smaller stores inside versus outside USDA-identified food deserts. On average, all items except white bread were 10–54% more expensive in smaller stores than in supermarkets (p < 0.001). Prices were generally not significantly different in isolated stores compared with non-isolated stores for most items. Among isolated stores, there were no price differences inside versus outside food deserts. We conclude that smaller food stores have higher prices for most staple foods compared to their closest supermarket, regardless of proximity. More research is needed to examine staple food prices in different retail spaces. PMID:28809795
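
    As a rough sketch of the kind of mixed model the study describes, the snippet below regresses price on store type with a random intercept for each small-store/supermarket pair, using statsmodels. The column names and data file are hypothetical placeholders, not the authors' code.

```python
# Sketch of a mixed model for price differentials: fixed effect of store
# type, random intercept per store pair. All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

prices = pd.read_csv("staple_food_prices.csv")  # hypothetical survey data

model = smf.mixedlm("price ~ C(store_type)", data=prices,
                    groups=prices["pair_id"])
result = model.fit()
print(result.summary())
```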

  12. Pricing of Staple Foods at Supermarkets versus Small Food Stores.

    PubMed

    Caspi, Caitlin E; Pelletier, Jennifer E; Harnack, Lisa J; Erickson, Darin J; Lenk, Kathleen; Laska, Melissa N

    2017-08-15

    Prices affect food purchase decisions, particularly in lower-income communities, where access to a range of food retailers (including supermarkets) is limited. The aim of this study was to examine differences in staple food pricing between small urban food stores and the closest supermarkets, as well as whether pricing differentials varied based on proximity between small stores and larger retailers. In 2014, prices were measured for 15 staple foods during store visits in 140 smaller stores (corner stores, gas-marts, dollar stores, and pharmacies) in Minneapolis/St. Paul, MN and their closest supermarket. Mixed models controlling for store type were used to estimate the average price differential between: (a) smaller stores and supermarkets; (b) isolated smaller stores (>1 mile to closest supermarket) and non-isolated smaller stores; and (c) isolated smaller stores inside versus outside USDA-identified food deserts. On average, all items except white bread were 10–54% more expensive in smaller stores than in supermarkets (p < 0.001). Prices were generally not significantly different in isolated stores compared with non-isolated stores for most items. Among isolated stores, there were no price differences inside versus outside food deserts. We conclude that smaller food stores have higher prices for most staple foods compared to their closest supermarket, regardless of proximity. More research is needed to examine staple food prices in different retail spaces.

  13. The observation, simulation and evaluation of the lake-air interaction process over a high-altitude small lake on the Tibetan Plateau

    NASA Astrophysics Data System (ADS)

    Wang, Binbin; Ma, Yaoming; Ma, Weiqiang; Su, Bob

    2017-04-01

    Lakes are an important part of the landscape on the Tibetan Plateau. The area that contains most of the plateau lakes has been expanding in recent years, but the impact of lakes on lake-atmosphere energy and water interactions is poorly understood because of a lack of observational data and adequate modeling systems. Furthermore, precise measurements of evaporation and an understanding of the physical controls on turbulent heat flux over lakes at different time scales have fundamental significance for catchment-scale water balance analysis and local-scale climate modeling. To test the performance of lake-air turbulent exchange models over high-altitude lakes, to understand the driving forces behind turbulent heat flux, and to obtain the actual evaporation over small high-altitude lakes, an eddy covariance observational system was built above the water surface of the small Nam Co Lake (altitude 4715 m, area approximately 1 km²) in April 2012. First, we proposed a Charnock coefficient of 0.031 and a roughness Reynolds number of 0.54 for simulation using turbulent data from 2012, and validated the results independently using data from 2013. Second, wind speed dominates lake-air turbulent heat exchange at half-hourly time scales, whereas water vapor and temperature gradients correlate more strongly at daily and monthly time scales. Third, the total evaporation from this small lake (812 mm) is approximately 200 mm larger than that from adjacent Nam Co (approximately 627 mm) during their ice-free seasons. Moreover, the energy stored from April to June is mainly released from September to November, suggesting an energy balance closure value of 0.97 over the entire ice-free season. Lastly, 10 evaporation estimation methods are evaluated with the prepared datasets.

  14. Overview: Channel morphology and sediment transport in steepland streams

    Treesearch

    T. E. Lisle

    1987-01-01

    New understanding of how steepland channels form is being pursued over a large range of scales, from entrainment of bed particles to the transfer of stored sediment down channel systems. Low submergence of bed particles during transport and wide heterogeneity in particle sizes strongly affect bedload transport. At the scale of a reach, scour-lobes are...

  15. DTN routing in body sensor networks with dynamic postural partitioning.

    PubMed

    Quwaider, Muhannad; Biswas, Subir

    2010-11-01

    This paper presents novel store-and-forward packet routing algorithms for Wireless Body Area Networks (WBAN) with frequent postural partitioning. A prototype WBAN has been constructed for experimentally characterizing on-body topology disconnections in the presence of ultra-short-range radio links, unpredictable RF attenuation, and human postural mobility. On-body DTN routing protocols are then developed using a stochastic link cost formulation, capturing multi-scale topological localities in human postural movements. Performance of the proposed protocols is evaluated experimentally and via simulation, and is compared with a number of existing single-copy DTN routing protocols and an on-body packet flooding mechanism that serves as a performance benchmark with a delay lower bound. It is shown that via multi-scale modeling of the spatio-temporal locality of on-body link disconnection patterns, the proposed algorithms can provide better routing performance than a number of existing probabilistic, opportunistic, and utility-based DTN routing protocols in the literature.
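
    The paper's stochastic link cost formulation is not reproduced in the abstract, but the single-copy, utility-based forwarding idea it alludes to can be sketched. Below is a minimal illustration in which each node keeps an exponentially weighted estimate of the delay to reach a destination and hands a packet over only to a better-positioned peer; the EWMA rule and all names are simplifications, not the authors' cost model.

```python
# Toy single-copy, utility-based store-and-forward rule. The EWMA delay
# estimate stands in for the paper's stochastic link cost (hypothetical).
class LinkCostTable:
    def __init__(self, alpha=0.2):
        self.cost = {}          # destination id -> expected delay to reach it
        self.alpha = alpha

    def observe(self, node, delay):
        """Update the delay estimate after an observed (re)connection."""
        old = self.cost.get(node, delay)
        self.cost[node] = (1 - self.alpha) * old + self.alpha * delay

    def expected(self, node):
        return self.cost.get(node, float("inf"))

def should_hand_over(my_table, peer_table, dest):
    # Single-copy rule: forward the stored packet only to a peer whose
    # expected delay to the destination is strictly lower than ours.
    return peer_table.expected(dest) < my_table.expected(dest)
```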

  16. Numerical investigation of a helicopter combustion chamber using LES and tabulated chemistry

    NASA Astrophysics Data System (ADS)

    Auzillon, Pierre; Riber, Eléonore; Gicquel, Laurent Y. M.; Gicquel, Olivier; Darabiha, Nasser; Veynante, Denis; Fiorina, Benoît

    2013-01-01

    This article presents Large Eddy Simulations (LES) of a realistic aeronautical combustor device: the chamber CTA1 designed by TURBOMECA. Under nominal operating conditions, experiments show hot spots on the combustor walls in the vicinity of the injectors. These high temperature regions disappear when modifying the fuel stream equivalence ratio. In order to account for detailed chemistry effects within LES, the numerical simulation uses the recently developed turbulent combustion model F-TACLES (Filtered TAbulated Chemistry for LES). The principle of this model is first to generate a lookup table where thermochemical variables are computed from a set of filtered laminar unstrained premixed flamelets. To model the interactions between the flame and the turbulence at the subgrid scale, a flame wrinkling analytical model is introduced and the Filtered Density Function (FDF) of the mixture fraction is modeled by a β function. Filtered thermochemical quantities are stored as a function of three coordinates: the filtered progress variable, the filtered mixture fraction and the mixture fraction subgrid scale variance. The chemical lookup table is then coupled with the LES using a mathematical formalism that ensures an accurate prediction of the flame dynamics. The numerical simulation of the CTA1 chamber with the F-TACLES turbulent combustion model reproduces the temperature fields observed in experiments fairly well. In particular, the influence of the fuel stream equivalence ratio on the flame position is well captured.
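
    The three-coordinate lookup table described above can be illustrated with a small interpolation sketch. The grids match the coordinates named in the abstract, but the tabulated values here are random placeholders; a real F-TACLES table would be filled from filtered laminar flamelet solutions.

```python
# Sketch of a three-coordinate chemistry lookup table: filtered progress
# variable, filtered mixture fraction, and subgrid mixture-fraction variance.
# Table contents are placeholders, not actual flamelet data.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

c_grid = np.linspace(0.0, 1.0, 51)      # filtered progress variable
z_grid = np.linspace(0.0, 1.0, 41)      # filtered mixture fraction
v_grid = np.linspace(0.0, 0.25, 11)     # subgrid mixture-fraction variance

# One table per filtered thermochemical quantity, e.g. a source term.
omega_table = np.random.rand(len(c_grid), len(z_grid), len(v_grid))

lookup = RegularGridInterpolator((c_grid, z_grid, v_grid), omega_table)

# During the LES, each cell queries the table with its local filtered state.
cells = np.array([[0.3, 0.05, 0.01],
                  [0.7, 0.06, 0.02]])
print(lookup(cells))                     # interpolated source terms
```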

  17. Contextual Compression of Large-Scale Wind Turbine Array Simulations: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny M; Brunhart-Lupo, Nicholas J; Potter, Kristin C

    Data sizes are becoming a critical issue particularly for HPC applications. We have developed a user-driven lossy wavelet-based storage model to facilitate the analysis and visualization of large-scale wind turbine array simulations. The model stores data as heterogeneous blocks of wavelet coefficients, providing high-fidelity access to user-defined data regions believed the most salient, while providing lower-fidelity access to less salient regions on a block-by-block basis. In practice, by retaining the wavelet coefficients as a function of feature saliency, we have seen data reductions in excess of 94 percent, while retaining lossless information in the turbine-wake regions most critical to analysis and providing enough (low-fidelity) contextual information in the upper atmosphere to track incoming coherent turbulent structures. Our contextual wavelet compression approach has allowed us to deliver interactive visual analysis while providing the user control over where data loss, and thus reduction in accuracy, in the analysis occurs. We argue this reduced but contextualized representation is a valid approach and encourages contextual data management.
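
    The block-by-block fidelity control can be sketched with a small thresholding example, assuming PyWavelets: salient blocks keep (nearly) all wavelet coefficients, while context blocks keep only a small fraction. The saliency rule itself is omitted, and this is an illustration of the general technique rather than the authors' storage model.

```python
# Saliency-driven wavelet compression of one 2-D data block: keep only the
# largest coefficients, with the kept fraction set per block.
import numpy as np
import pywt

def compress_block(block, keep_fraction):
    coeffs = pywt.wavedec2(block, "db2", level=3)
    arr, slices = pywt.coeffs_to_array(coeffs)
    k = max(1, int(keep_fraction * arr.size))
    thresh = np.partition(np.abs(arr).ravel(), -k)[-k]
    arr = np.where(np.abs(arr) >= thresh, arr, 0.0)   # drop small coefficients
    coeffs = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
    return pywt.waverec2(coeffs, "db2")

block = np.random.rand(64, 64)
wake = compress_block(block, keep_fraction=1.0)      # salient: near-lossless
context = compress_block(block, keep_fraction=0.05)  # context: heavy reduction
```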

  18. Contextual Compression of Large-Scale Wind Turbine Array Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny M; Brunhart-Lupo, Nicholas J; Potter, Kristin C

    Data sizes are becoming a critical issue particularly for HPC applications. We have developed a user-driven lossy wavelet-based storage model to facilitate the analysis and visualization of large-scale wind turbine array simulations. The model stores data as heterogeneous blocks of wavelet coefficients, providing high-fidelity access to user-defined data regions believed the most salient, while providing lower-fidelity access to less salient regions on a block-by-block basis. In practice, by retaining the wavelet coefficients as a function of feature saliency, we have seen data reductions in excess of 94 percent, while retaining lossless information in the turbine-wake regions most critical to analysis and providing enough (low-fidelity) contextual information in the upper atmosphere to track incoming coherent turbulent structures. Our contextual wavelet compression approach has allowed us to deliver interactive visual analysis while providing the user control over where data loss, and thus reduction in accuracy, in the analysis occurs. We argue this reduced but contextualized representation is a valid approach and encourages contextual data management.

  19. Cerebellar input configuration toward object model abstraction in manipulation tasks.

    PubMed

    Luque, Niceto R; Garrido, Jesus A; Carrillo, Richard R; Coenen, Olivier J-M D; Ros, Eduardo

    2011-08-01

    It is widely assumed that the cerebellum is one of the main nervous centers involved in correcting and refining planned movement and accounting for disturbances occurring during movement, for instance, due to the manipulation of objects which affect the kinematics and dynamics of the robot-arm plant model. In this brief, we evaluate a way in which a cerebellar-like structure can store a model in the granular and molecular layers. Furthermore, we study how its microstructure and input representations (context labels and sensorimotor signals) can efficiently support model abstraction toward delivering accurate corrective torque values for increasing precision during different-object manipulation. We also describe how the explicit (object-related input labels) and implicit state input representations (sensorimotor signals) complement each other to better handle different models and allow interpolation between two already stored models. This facilitates accurate corrections during manipulations of new objects taking advantage of already stored models.

  20. Improved water balance component estimates through joint assimilation of GRACE water storage and SMOS soil moisture retrievals

    NASA Astrophysics Data System (ADS)

    Tian, Siyuan; Tregoning, Paul; Renzullo, Luigi J.; van Dijk, Albert I. J. M.; Walker, Jeffrey P.; Pauwels, Valentijn R. N.; Allgeyer, Sébastien

    2017-03-01

    The accuracy of global water balance estimates is limited by the lack of observations at large scale and the uncertainties of model simulations. Global retrievals of terrestrial water storage (TWS) change and soil moisture (SM) from satellites provide an opportunity to improve model estimates through data assimilation. However, combining these two data sets is challenging due to the disparity in temporal and spatial resolution at both vertical and horizontal scales. For the first time, TWS observations from the Gravity Recovery and Climate Experiment (GRACE) and near-surface SM observations from the Soil Moisture and Ocean Salinity (SMOS) were jointly assimilated into a water balance model using the Ensemble Kalman Smoother from January 2010 to December 2013 for the Australian continent. The performance of joint assimilation was assessed against open-loop model simulations and the assimilation of either GRACE TWS anomalies or SMOS SM alone. The SMOS-only assimilation improved SM estimates but reduced the accuracy of groundwater and TWS estimates. The GRACE-only assimilation improved groundwater estimates but did not always produce accurate estimates of SM. The joint assimilation typically led to more accurate water storage profile estimates with improved surface SM, root-zone SM, and groundwater estimates against in situ observations. The assimilation successfully downscaled GRACE-derived integrated water storage horizontally and vertically into individual water stores at the same spatial scale as the model and SMOS, and partitioned monthly averaged TWS into daily estimates. These results demonstrate that satellite TWS and SM measurements can be jointly assimilated to produce improved water balance component estimates.

  1. Research Directions in Database Security IV

    DTIC Science & Technology

    1993-07-01

    second algorithm, which is based on multiversion timestamp ordering, is that high level transactions can be forced to read arbitrarily old data values...system. The first, the single version model, stores only the latest version of each data item, while the second, the multiversion model, stores... Multiversion Database Model In the standard database model, where there is only one version of each data item, all transactions compete for the most recent

  2. Capacity for patterns and sequences in Kanerva's SDM as compared to other associative memory models

    NASA Technical Reports Server (NTRS)

    Keeler, James D.

    1987-01-01

    The information capacity of Kanerva's Sparse Distributed Memory (SDM) and Hopfield-type neural networks is investigated. Under the approximations used, it is shown that the total information stored in these systems is proportional to the number of connections in the network. The proportionality constant is the same for the SDM and Hopfield-type models independent of the particular model, or the order of the model. The approximations are checked numerically. This same analysis can be used to show that the SDM can store sequences of spatiotemporal patterns, and the addition of time-delayed connections allows the retrieval of context dependent temporal patterns. A minor modification of the SDM can be used to store correlated patterns.
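
    A toy Sparse Distributed Memory illustrates the mechanism the analysis refers to: random binary hard locations, counter-based storage, and Hamming-radius activation. All parameter values below are illustrative choices, and sequence storage is shown by the heteroassociative trick of writing pattern t+1 at the address of pattern t.

```python
# Toy SDM in the spirit of Kanerva's model. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)

class SDM:
    def __init__(self, n_locations=2000, dim=256, radius=112):
        self.addresses = rng.integers(0, 2, (n_locations, dim))
        self.counters = np.zeros((n_locations, dim), dtype=np.int64)
        self.radius = radius

    def _active(self, addr):
        # Locations whose address is within the Hamming radius of addr.
        return (self.addresses != addr).sum(axis=1) <= self.radius

    def write(self, addr, data):
        self.counters[self._active(addr)] += 2 * data - 1  # +1 for 1, -1 for 0

    def read(self, addr):
        return (self.counters[self._active(addr)].sum(axis=0) > 0).astype(int)

# Heteroassociative sequence storage: write pattern t+1 at address pattern t,
# so reading with one pattern retrieves (an approximation of) the next.
mem = SDM()
seq = rng.integers(0, 2, (5, 256))
for a, d in zip(seq[:-1], seq[1:]):
    mem.write(a, d)
recalled = mem.read(seq[0])   # should approximate seq[1]
```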

  3. Capacity for patterns and sequences in Kanerva's SDM as compared to other associative memory models. [Sparse, Distributed Memory

    NASA Technical Reports Server (NTRS)

    Keeler, James D.

    1988-01-01

    The information capacity of Kanerva's Sparse Distributed Memory (SDM) and Hopfield-type neural networks is investigated. Under the approximations used here, it is shown that the total information stored in these systems is proportional to the number of connections in the network. The proportionality constant is the same for the SDM and Hopfield-type models independent of the particular model, or the order of the model. The approximations are checked numerically. This same analysis can be used to show that the SDM can store sequences of spatiotemporal patterns, and the addition of time-delayed connections allows the retrieval of context dependent temporal patterns. A minor modification of the SDM can be used to store correlated patterns.

  4. Physical controls on half-hourly, daily, and monthly turbulent flux and energy budget over a high-altitude small lake on the Tibetan Plateau

    NASA Astrophysics Data System (ADS)

    Wang, Binbin; Ma, Yaoming; Ma, Weiqiang; Su, Zhongbo

    2017-02-01

    Precise measurements of evaporation and understanding of the physical controls on turbulent heat flux over lakes have fundamental significance for catchment-scale water balance analysis and local-scale climate modeling. The observation and simulation of lake-air turbulent flux processes have been widely carried out, but studies that examine high-altitude lakes on the Tibetan Plateau are still rare, especially for small lakes. An eddy covariance (EC) system, together with a four-component radiation sensor and instruments for measuring water temperature profiles, was set up in a small lake within the Nam Co basin in April 2012 for long-term evaporation and energy budget observations. With the valuable measurements collected during the ice-free periods in 2012 and 2013, the main conclusions are summarized as follows: First, a bulk aerodynamic transfer model (B model), with parameters optimized for the specific wave pattern in the small lake, could provide reliable results consistent with EC measurements, and B model simulations are suitable for interpolating data gaps caused by an inadequate footprint or malfunction of the EC instrument. Second, the total evaporation from this small lake (812 mm) is approximately 200 mm larger than that from adjacent Nam Co (approximately 627 mm) during their ice-free seasons. Third, wind speed dominates lake-air turbulent heat exchange at half-hourly time scales, whereas water vapor and temperature gradients correlate more strongly at daily and monthly time scales. Finally, energy stored from April to June is mainly released from September to November, suggesting an energy balance closure value of 0.97.
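
    The bulk aerodynamic transfer ("B") model amounts to the standard bulk formulas for sensible and latent heat, H = ρ·cp·Ch·U·(Ts − Ta) and LE = ρ·Lv·Ce·U·(qs − qa). The sketch below uses illustrative constant transfer coefficients; in the study they follow from the optimized Charnock coefficient and roughness Reynolds number, which are not reproduced here.

```python
# Minimal bulk aerodynamic transfer sketch for lake-air turbulent fluxes.
# Transfer coefficients and constants are illustrative values only.
RHO_A = 1.0   # air density at ~4700 m altitude, kg m^-3 (approximate)
CP = 1004.0   # specific heat of air, J kg^-1 K^-1
LV = 2.47e6   # latent heat of vaporization, J kg^-1

def bulk_fluxes(u, t_s, t_a, q_s, q_a, ch=1.2e-3, ce=1.2e-3):
    """Sensible heat H and latent heat LE (W m^-2) from wind speed u (m s^-1),
    surface/air temperatures (K) and surface/air specific humidities (kg/kg)."""
    h = RHO_A * CP * ch * u * (t_s - t_a)
    le = RHO_A * LV * ce * u * (q_s - q_a)
    return h, le

h, le = bulk_fluxes(u=4.0, t_s=285.0, t_a=281.0, q_s=0.0088, q_a=0.0050)
evap_mm_per_day = le / LV * 86400.0   # LE converted to evaporation depth
```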

  5. Soil Functional Mapping: A Geospatial Framework for Scaling Soil Carbon Cycling

    NASA Astrophysics Data System (ADS)

    Lawrence, C. R.

    2017-12-01

    Climate change is dramatically altering biogeochemical cycles in most terrestrial ecosystems, particularly the cycles of water and carbon (C). These changes will affect myriad ecosystem processes of importance, including plant productivity, C exports to aquatic systems, and terrestrial C storage. Soil C storage represents a critical feedback to climate change as soils store more C than the atmosphere and aboveground plant biomass combined. While we know plant and soil C cycling are strongly coupled with soil moisture, substantial unknowns remain regarding how these relationships can be scaled up from soil profiles to ecosystems. This greatly limits our ability to build a process-based understanding of the controls on and consequences of climate change at regional scales. In an effort to address this limitation we: (1) describe an approach to classifying soils that is based on underlying differences in soil functional characteristics and (2) examine the utility of this approach as a scaling tool that honors the underlying soil processes. First, geospatial datasets are analyzed in the context of our current understanding of soil C and water cycling in order to predict soil functional units that can be mapped at the scale of ecosystems or watersheds. Next, the integrity of each soil functional unit is evaluated using available soil C data and mapping units are refined as needed. Finally, targeted sampling is conducted to further differentiate functional units or fill in any data gaps that are identified. Completion of this workflow provides new geospatial datasets that are based on specific soil functions, in this case the coupling of soil C and water cycling, and are well suited for integration with regional-scale soil models. Preliminary results from this effort highlight the advantages of a scaling approach that balances theory, measurement, and modeling.

  6. Hierarchical Nearest-Neighbor Gaussian Process Models for Large Geostatistical Datasets.

    PubMed

    Datta, Abhirup; Banerjee, Sudipto; Finley, Andrew O; Gelfand, Alan E

    2016-01-01

    Spatial process models for analyzing geostatistical data entail computations that become prohibitive as the number of spatial locations becomes large. This article develops a class of highly scalable nearest-neighbor Gaussian process (NNGP) models to provide fully model-based inference for large geostatistical datasets. We establish that the NNGP is a well-defined spatial process providing legitimate finite-dimensional Gaussian densities with sparse precision matrices. We embed the NNGP as a sparsity-inducing prior within a rich hierarchical modeling framework and outline how computationally efficient Markov chain Monte Carlo (MCMC) algorithms can be executed without storing or decomposing large matrices. The number of floating-point operations (flops) per iteration of this algorithm is linear in the number of spatial locations, thereby rendering substantial scalability. We illustrate the computational and inferential benefits of the NNGP over competing methods using simulation studies and also analyze forest biomass from a massive U.S. Forest Inventory dataset at a scale that precludes alternative dimension-reducing methods. Supplementary materials for this article are available online.
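
    The nearest-neighbor idea underlying the NNGP can be illustrated with a Vecchia-type likelihood sketch: each ordered location conditions only on its m nearest previously ordered neighbors, so the likelihood factors into small Gaussian conditionals and no large matrix is ever formed or decomposed. The covariance function, ordering, and parameter values below are illustrative choices, not the paper's hierarchical model.

```python
# Vecchia-style nearest-neighbor Gaussian log-likelihood (illustrative).
import numpy as np
from scipy.spatial.distance import cdist

def exp_cov(d, sigma2=1.0, phi=0.5):
    return sigma2 * np.exp(-d / phi)

def nn_loglik(coords, y, m=10, sigma2=1.0, phi=0.5):
    order = np.argsort(coords.sum(axis=1))      # simple fixed ordering
    c, z = coords[order], y[order]
    ll = 0.0
    for i in range(len(z)):
        if i == 0:
            mu, var = 0.0, sigma2
        else:
            d = np.linalg.norm(c[:i] - c[i], axis=1)
            nb = np.argsort(d)[: min(m, i)]     # m nearest earlier neighbors
            Cnn = exp_cov(cdist(c[nb], c[nb]), sigma2, phi)
            Cni = exp_cov(d[nb], sigma2, phi)
            b = np.linalg.solve(Cnn, Cni)       # kriging weights
            mu = b @ z[nb]                      # conditional mean
            var = sigma2 - Cni @ b              # conditional variance
        ll += -0.5 * (np.log(2 * np.pi * var) + (z[i] - mu) ** 2 / var)
    return ll

coords = np.random.rand(500, 2)
y = np.random.randn(500)
print(nn_loglik(coords, y))
```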

  7. Hierarchical Nearest-Neighbor Gaussian Process Models for Large Geostatistical Datasets

    PubMed Central

    Datta, Abhirup; Banerjee, Sudipto; Finley, Andrew O.; Gelfand, Alan E.

    2018-01-01

    Spatial process models for analyzing geostatistical data entail computations that become prohibitive as the number of spatial locations becomes large. This article develops a class of highly scalable nearest-neighbor Gaussian process (NNGP) models to provide fully model-based inference for large geostatistical datasets. We establish that the NNGP is a well-defined spatial process providing legitimate finite-dimensional Gaussian densities with sparse precision matrices. We embed the NNGP as a sparsity-inducing prior within a rich hierarchical modeling framework and outline how computationally efficient Markov chain Monte Carlo (MCMC) algorithms can be executed without storing or decomposing large matrices. The number of floating-point operations (flops) per iteration of this algorithm is linear in the number of spatial locations, thereby rendering substantial scalability. We illustrate the computational and inferential benefits of the NNGP over competing methods using simulation studies and also analyze forest biomass from a massive U.S. Forest Inventory dataset at a scale that precludes alternative dimension-reducing methods. Supplementary materials for this article are available online. PMID:29720777

  8. Store and Restaurant Advertising and Health of Public Housing Residents

    ERIC Educational Resources Information Center

    Heinrich, Katie M.; Li, Dongmei; Regan, Gail R.; Howard, Hugh H.; Ahluwalia, Jasjit S.; Lee, Rebecca E.

    2012-01-01

    Objectives: To determine relationships between food and beverage signs and health. Methods: In 12 public housing neighborhoods, food and alcohol signs were counted for stores and restaurants. Health and demographic data were from 373 adults. Results: Multilevel modeling showed higher BMI was related to more store and restaurant alcohol signs,…

  9. Consumers' quality perception of national branded, national store branded, and imported store branded beef.

    PubMed

    Banović, Marija; Grunert, Klaus G; Barreira, Maria Madalena; Fontes, Magda Aguiar

    2010-01-01

    This study investigated the differences in consumers' quality perception of national branded, national store branded, and imported store branded beef. Partial Least Squares analysis is used for modelling the quality perception process. Results show that consumers perceived the national branded Carnalentejana beef as better on all quality cues and quality aspects than the other two store branded beefs. Preference for Carnalentejana beef stayed highly consistent even after the blind test, where consumers differentiated this beef from the other two beef brands on all sensory dimensions: taste, tenderness, and juiciness, and chose it as the preferred one. Consumers utilized more perceived intrinsic cues to infer the expected eating quality of the store branded beefs.

  10. Rock.XML - Towards a library of rock physics models

    NASA Astrophysics Data System (ADS)

    Jensen, Erling Hugo; Hauge, Ragnar; Ulvmoen, Marit; Johansen, Tor Arne; Drottning, Åsmund

    2016-08-01

    Rock physics modelling provides tools for correlating physical properties of rocks and their constituents to the geophysical observations we measure on a larger scale. Many different theoretical and empirical models exist to cover the range of different rock types. However, upon reviewing these, we see that they are all built around a few main concepts. Based on this observation, we propose a format for digitally storing the specifications of rock physics models, which we have named Rock.XML. It contains not only data about the various constituents but also the theories and how they are used to combine these building blocks into a representative model for a particular rock. The format is based on the Extensible Markup Language (XML), making it flexible enough to handle complex models and scalable enough to be extended with new theories and models. This technology has great advantages for documenting and exchanging models in an unambiguous way between people and between software. Rock.XML can become a platform for creating a library of rock physics models, making them more accessible to everyone.
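
    The abstract does not reproduce the actual Rock.XML schema, so the document below is an invented illustration of the concept: constituents plus a named combination theory, parsed with Python's standard library. Every element and attribute name here is hypothetical.

```python
# Invented illustration of a Rock.XML-style document (hypothetical schema):
# constituents as building blocks, and a theory that combines them.
import xml.etree.ElementTree as ET

ROCK_XML = """
<rockModel name="brine-saturated sandstone">
  <constituents>
    <mineral name="quartz" bulkModulusGPa="36.6" shearModulusGPa="45.0" fraction="0.9"/>
    <mineral name="clay" bulkModulusGPa="21.0" shearModulusGPa="7.0" fraction="0.1"/>
    <fluid name="brine" bulkModulusGPa="2.8" densityKgM3="1030"/>
  </constituents>
  <theory name="Gassmann" porosity="0.25"/>
</rockModel>
"""

root = ET.fromstring(ROCK_XML)
for mineral in root.iter("mineral"):
    print(mineral.get("name"), mineral.get("bulkModulusGPa"))
print("combination theory:", root.find("theory").get("name"))
```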

  11. Various-scale controls of complex subduction dynamics on magmatic-hydrothermal processes in eastern Mediterranean

    NASA Astrophysics Data System (ADS)

    Menant, Armel; Jolivet, Laurent; Sternai, Pietro; Ducoux, Maxime; Augier, Romain; Rabillard, Aurélien; Gerya, Taras; Guillou-Frottier, Laurent

    2014-05-01

    In subduction environment, magmatic-hydrothermal processes, responsible for the emplacement of magmatic bodies and related mineralization, are strongly controlled by slab dynamics. This 3D dynamics is often complex, resulting notably in spatial evolution through time of mineralization and magmatism types and in fast kinematic changes at the surface. Study at different scales of the distribution of these magmatic and hydrothermal products is useful to better constrain subduction dynamics. This work is focused on the eastern Mediterranean, where the complex dynamics of the Tethyan active margin since the upper Cretaceous is still largely debated. We propose new kinematic reconstructions of the region also showing the distribution of magmatic products and mineralization in space and time. Three main periods have thus been identified with a general southward migration of magmatic and ore bodies. (1) From late Cretaceous to lower Paleocene, calc-alkaline magmatism and porphyry Cu deposits emplaced notably in the Balkans, along a long linear cordillera. (2) From late Paleocene to Eocene, a barren period occurred while the Pelagonian microcontinent was buried within the subduction zone. (3) Since the Oligocene, Au-rich deposits and related K-rich magmatism emplaced in the Rhodopes, the Aegean and western Anatolian extensional domains in response to fast slab retreat and related mantle flow inducing the partial melting of the lithospheric mantle or the base of the upper crust where Au was previously stored. The emplacement at shallow level of this mineralization was largely controlled by large-scale structures that drained the magmatic-hydrothermal fluids. In the Cyclades for instance, field studies show that Au-rich but also base metal-rich ore deposits are syn-extensional and spatially related to large-scale detachment systems (e.g. on Tinos, Mykonos, Serifos islands), which are recognized as subduction-related structures. These results highlight the importance at different scales of subduction dynamics and related mantle flow on the emplacement of mineralization and magmatic bodies. Indeed, besides a general southward migration of the magmatic-hydrothermal activity since the upper Cretaceous from the Balkans to the present-day Aegean volcanic arc, a secondary westward migration is observed during the Miocene from the Menderes massif to the Cyclades. This feature is a possible consequence of a slab tearing event and related mantle flow, as suggested notably by tomographic models below western Anatolia. To further test the effects of slab retreat and tearing on the flow and temperature field within the mantle, we performed 3D thermo-mechanical numerical modeling. Models suggest that the asthenospheric flow induced by the development of a slab tear controls the migration of magmatic products stored at the base of the crust, influencing the distribution of potentially fertile magmas within the upper crust.

  12. Store and restaurant advertising and health of public housing residents.

    PubMed

    Heinrich, Katie M; Li, Dongmei; Regan, Gail R; Howard, Hugh H; Ahluwalia, Jasjit S; Lee, Rebecca E

    2012-01-01

    To determine relationships between food and beverage signs and health. In 12 public housing neighborhoods, food and alcohol signs were counted for stores and restaurants. Health and demographic data were from 373 adults. Multilevel modeling showed higher BMI was related to more store and restaurant alcohol signs, higher blood pressure, nonsmokers, and females. Higher dietary fat consumption was related to more store and restaurant alcohol and fewer low-calorie healthy signs, lower fruit consumption, fewer minutes walked, and white and Hispanic/Latino ethnicity. Signs in stores and restaurants are related to BMI and dietary fat consumption among residents.

  13. An FPGA-Based Massively Parallel Neuromorphic Cortex Simulator

    PubMed Central

    Wang, Runchun M.; Thakur, Chetan S.; van Schaik, André

    2018-01-01

    This paper presents a massively parallel and scalable neuromorphic cortex simulator designed for simulating large and structurally connected spiking neural networks, such as complex models of various areas of the cortex. The main novelty of this work is the abstraction of a neuromorphic architecture into clusters represented by minicolumns and hypercolumns, analogously to the fundamental structural units observed in neurobiology. Without this approach, simulating large-scale fully connected networks needs prohibitively large memory to store look-up tables for point-to-point connections. Instead, we use a novel architecture, based on the structural connectivity in the neocortex, such that all the required parameters and connections can be stored in on-chip memory. The cortex simulator can be easily reconfigured for simulating different neural networks without any change in hardware structure by programming the memory. A hierarchical communication scheme allows one neuron to have a fan-out of up to 200 k neurons. As a proof-of-concept, an implementation on one Altera Stratix V FPGA was able to simulate 20 million to 2.6 billion leaky-integrate-and-fire (LIF) neurons in real time. We verified the system by emulating a simplified auditory cortex (with 100 million neurons). This cortex simulator achieved a low power dissipation of 1.62 μW per neuron. With the advent of commercially available FPGA boards, our system offers an accessible and scalable tool for the design, real-time simulation, and analysis of large-scale spiking neural networks. PMID:29692702
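
    The neuron model such simulators implement can be sketched compactly. Below is a generic leaky integrate-and-fire update with forward-Euler integration; the constants are illustrative textbook values, not the FPGA implementation's parameters.

```python
# Minimal leaky integrate-and-fire (LIF) population update (illustrative).
import numpy as np

def lif_step(v, i_syn, dt=1e-4, tau=20e-3, v_rest=-70e-3,
             v_th=-50e-3, v_reset=-65e-3, r_m=1e8):
    """Advance membrane potentials v (volts) one time step given synaptic
    currents i_syn (amps); returns updated potentials and a spike mask."""
    v = v + dt / tau * ((v_rest - v) + r_m * i_syn)
    spikes = v >= v_th
    v = np.where(spikes, v_reset, v)   # reset neurons that fired
    return v, spikes

v = np.full(1000, -70e-3)              # a population of 1000 neurons
for _ in range(100):
    i_syn = np.random.uniform(0, 3e-10, v.size)
    v, spikes = lif_step(v, i_syn)
```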

  14. An FPGA-Based Massively Parallel Neuromorphic Cortex Simulator.

    PubMed

    Wang, Runchun M; Thakur, Chetan S; van Schaik, André

    2018-01-01

    This paper presents a massively parallel and scalable neuromorphic cortex simulator designed for simulating large and structurally connected spiking neural networks, such as complex models of various areas of the cortex. The main novelty of this work is the abstraction of a neuromorphic architecture into clusters represented by minicolumns and hypercolumns, analogously to the fundamental structural units observed in neurobiology. Without this approach, simulating large-scale fully connected networks needs prohibitively large memory to store look-up tables for point-to-point connections. Instead, we use a novel architecture, based on the structural connectivity in the neocortex, such that all the required parameters and connections can be stored in on-chip memory. The cortex simulator can be easily reconfigured for simulating different neural networks without any change in hardware structure by programming the memory. A hierarchical communication scheme allows one neuron to have a fan-out of up to 200 k neurons. As a proof-of-concept, an implementation on one Altera Stratix V FPGA was able to simulate 20 million to 2.6 billion leaky-integrate-and-fire (LIF) neurons in real time. We verified the system by emulating a simplified auditory cortex (with 100 million neurons). This cortex simulator achieved a low power dissipation of 1.62 μW per neuron. With the advent of commercially available FPGA boards, our system offers an accessible and scalable tool for the design, real-time simulation, and analysis of large-scale spiking neural networks.

  15. Long-Term Recency in Anterograde Amnesia

    PubMed Central

    Talmi, Deborah; Caplan, Jeremy B.; Richards, Brian; Moscovitch, Morris

    2015-01-01

    Amnesia is usually described as an impairment of long-term memory (LTM) despite an intact short-term memory (STM). The intact recency effect in amnesia had supported this view. Although dual-store models of memory have been challenged by single-store models based on interference theory, this has had relatively little influence on our understanding and treatment of amnesia, perhaps because the debate has centred on experiments in the neurologically intact population. Here we tested a key prediction of single-store models for free recall in amnesia: that people with amnesia will exhibit a memory advantage for the most recent items even when all items are stored in and retrieved from LTM, an effect called long-term recency. People with amnesia and matched controls studied, and then free-recalled, word lists with a distractor task following each word, including the last (continual distractor free recall, CDFR). This condition was compared to an Immediate Free Recall (IFR, no distractors) and a Delayed Free Recall (DFR, end-of-list distractor only) condition. People with amnesia demonstrated the full long-term recency pattern: the recency effect was attenuated in DFR and returned in CDFR. The advantage of recency over midlist items in CDFR was comparable to that of controls, confirming a key prediction of single-store models. Memory deficits appeared only after the first word recalled in each list, suggesting the impairment in amnesia may emerge only as the participant's recall sequence develops, perhaps due to increased susceptibility to output interference. Our findings suggest that interference mechanisms are preserved in amnesia despite the overall impairment to LTM, and challenge strict dual-store models of memory and their dominance in explaining amnesia. We discuss the implications of our findings for rehabilitation. PMID:26046770

  16. Baltimore City Stores Increased The Availability Of Healthy Food After WIC Policy Change.

    PubMed

    Cobb, Laura K; Anderson, Cheryl A M; Appel, Lawrence; Jones-Smith, Jessica; Bilal, Usama; Gittelsohn, Joel; Franco, Manuel

    2015-11-01

    As part of a 2009 revision to the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) program, the Department of Agriculture required WIC-authorized stores to stock additional varieties of healthy food. The long-term effects of this policy on access to healthy food are unknown. Using surveys conducted in 118 Baltimore City, Maryland, food stores in 2006 and 2012, we examined associations of the change in healthy food availability with store type, neighborhood demographics, and the 2009 WIC policy change. Overall, healthy food availability improved significantly between 2006 and 2012, with the greatest increases in corner stores and in census tracts with more than 60 percent black residents. On an 11-point scale measuring availability of fruit (3 points), vegetables (4 points), bread (2 points), and milk (2 points), the WIC policy change was associated with a 0.72-point increase in WIC-relevant healthy food availability, while joining WIC was associated with a 0.99-point increase. Stores that carry a limited variety of food items may be more receptive to stocking healthier food than previously thought, particularly within neighborhoods with a majority of black residents. Policies targeting healthy food availability have the potential to increase availability and decrease health disparities.

  17. Variable-Internal-Stores models of microbial growth and metabolism with dynamic allocation of cellular resources.

    PubMed

    Nev, Olga A; van den Berg, Hugo A

    2017-01-01

    Variable-Internal-Stores models of microbial metabolism and growth have proven to be invaluable in accounting for changes in cellular composition as microbial cells adapt to varying conditions of nutrient availability. Here, such a model is extended with explicit allocation of molecular building blocks among various types of catalytic machinery. Such an extension allows a reconstruction of the regulatory rules employed by the cell as it adapts its physiology to changing environmental conditions. Moreover, the extension proposed here creates a link between classic models of microbial growth and analyses based on detailed transcriptomics and proteomics data sets. We ascertain the compatibility between the extended Variable-Internal-Stores model and the classic models, demonstrate its behaviour by means of simulations, and provide a detailed treatment of the uniqueness and the stability of its equilibrium point as a function of the availabilities of the various nutrients.
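
    The core of a Variable-Internal-Stores model is the classic cell-quota (Droop) structure: uptake fills an internal reserve, and growth draws it down. The sketch below integrates a minimal version with illustrative, unfitted parameters; the paper's extension with explicit allocation to catalytic machinery is not shown.

```python
# Minimal variable-internal-stores ("cell quota"/Droop) model sketch.
# Parameter values are illustrative, not fitted.
from scipy.integrate import solve_ivp

MU_MAX, Q_MIN = 1.0, 0.05   # max growth rate (1/h), subsistence quota
V_MAX, K_S = 0.5, 0.1       # max uptake rate, half-saturation constant

def rhs(t, x):
    s, q, b = x                          # nutrient, reserve density, biomass
    uptake = V_MAX * s / (K_S + s)       # Michaelis-Menten uptake per biomass
    mu = MU_MAX * (1.0 - Q_MIN / q)      # Droop growth rate
    return [-uptake * b,                 # external nutrient depletion
            uptake - mu * q,             # reserve: filled by uptake, diluted by growth
            mu * b]                      # biomass growth

sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 0.1, 0.01], max_step=0.1)
s_end, q_end, b_end = sol.y[:, -1]       # reserves run down as nutrient is exhausted
```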

  18. Earthquake and failure forecasting in real-time: A Forecasting Model Testing Centre

    NASA Astrophysics Data System (ADS)

    Filgueira, Rosa; Atkinson, Malcolm; Bell, Andrew; Main, Ian; Boon, Steven; Meredith, Philip

    2013-04-01

    Across Europe there are a large number of rock deformation laboratories, each of which runs many experiments. Similarly, there are a large number of theoretical rock physicists who develop constitutive and computational models both for rock deformation and for changes in geophysical properties. Here we consider how to open up opportunities for sharing experimental data in a way that is integrated with multiple-hypothesis testing. We present a prototype for a new forecasting model testing centre based on e-infrastructures for capturing and sharing data and models to accelerate Rock Physicist (RP) research. This proposal is triggered by our work on data assimilation in the NERC EFFORT (Earthquake and Failure Forecasting in Real Time) project, using data provided by the NERC CREEP 2 experimental project as a test case. EFFORT is a multi-disciplinary collaboration between geoscientists, rock physicists, and computer scientists. Brittle failure of the crust is likely to play a key role in controlling the timing of a range of geophysical hazards, such as volcanic eruptions, yet the predictability of brittle failure is unknown. Our aim is to provide a facility for developing and testing models to forecast brittle failure in experimental and natural data. Model testing is performed in real-time, verifiably prospective mode in order to avoid the selection biases that are possible in retrospective analyses. The project will ultimately quantify the predictability of brittle failure, and how this predictability scales from simple, controlled laboratory conditions to the complex, uncontrolled real world. Experimental data are collected from controlled laboratory experiments, including data from the UCL laboratory and from the CREEP 2 project, which will undertake experiments in a deep-sea laboratory. We illustrate the properties of the prototype testing centre by streaming and analysing realistically noisy synthetic data, as an aid to generating and improving testing methodologies in imperfect conditions. The forecasting model testing centre uses a repository to hold all the data and models and a catalogue to hold the corresponding metadata, and it supports the following activities. Data transfer (upload experimental data): we have developed FAST (Flexible Automated Streaming Transfer), a tool that uploads data from RP laboratories to the repository, sets up the data-transfer requirements, selects the transfer protocol automatically, and creates and stores the metadata automatically. Web data access covers the remaining activities. Create synthetic data: users choose a generator and supply parameters, and the synthetic data are stored automatically with corresponding metadata. Select data and models: the metadata are searched using criteria designed for RP; each dataset (synthetic or from a laboratory) and each model is described in catalogues accessible through the web portal. Upload models: a model is uploaded and stored with associated metadata, which the web portal solicits and creates, providing an opportunity to share models. Run models and visualise results: selected data and a model are submitted to a high-performance computing resource that hides the technical details; results are displayed in accelerated time and stored, allowing retrieval, inspection, and aggregation. The proposed forecasting model testing centre could be integrated into EPOS. Its expected benefits are improved understanding of brittle-failure prediction and its scalability to natural phenomena, accelerated and extensive testing and rapid sharing of insights, increased impact and visibility of RP and geoscience research, and resources for education and training. A key challenge is to agree on the framework for sharing RP data and models. Our work is a provocative first step.

  19. Development and implementation of a food store-based intervention to improve diet in the Republic of the Marshall Islands.

    PubMed

    Gittelsohn, Joel; Dyckman, William; Tan, May Lynn; Boggs, Malia K; Frick, Kevin D; Alfred, Julie; Winch, Peter J; Haberle, Heather; Palafox, Neal A

    2006-10-01

    Effective approaches for the prevention and reduction of obesity and obesity-related chronic diseases are urgently needed. Food store-centered programs represent one approach that may be both effective and sustainable. The authors developed a food store-based intervention in the Marshall Islands using qualitative and quantitative formative research methods, including a store usage survey (n = 184) and in-depth interviews with large-store managers (n = 13), small-store managers (n = 7), customers (n = 10), and community leaders (n = 4). This process was followed up by development and piloting of specific intervention components and workshops to finalize materials. The final intervention combined mass media (newspaper articles, video, radio announcements) and in-store components (shelf labels, cooking demonstrations, posters, recipe cards) and had high store-owner support and participation. High levels of exposure to the intervention were achieved during the 10-week period of implementation. This model for developing food store-based interventions is applicable to other settings.

  20. Efficient vibration mode analysis of aircraft with multiple external store configurations

    NASA Technical Reports Server (NTRS)

    Karpel, M.

    1988-01-01

    A coupling method for efficient vibration mode analysis of aircraft with multiple external store configurations is presented. A set of low-frequency vibration modes, including rigid-body modes, represent the aircraft. Each external store is represented by its vibration modes with clamped boundary conditions, and by its rigid-body inertial properties. The aircraft modes are obtained from a finite-element model loaded by dummy rigid external stores with fictitious masses. The coupling procedure unloads the dummy stores and loads the actual stores instead. The analytical development is presented, the effects of the fictitious mass magnitudes are discussed, and a numerical example is given for a combat aircraft with external wing stores. Comparison with vibration modes obtained by a direct (full-size) eigensolution shows very accurate coupling results. Once the aircraft and stores data bases are constructed, the computer time for analyzing any external store configuration is two to three orders of magnitude less than that of a direct solution.

  1. Verification of hydrologic landscape derived basin-scale classifications in the Pacific Northwest

    Treesearch

    Keith Sawicz

    2016-01-01

    The interaction between the physical and climatic attributes of a basin (form) controls how water is partitioned, stored, and conveyed through a catchment (function). Hydrologic Landscapes (HLs) were previously...

  2. Thinking Like a Whole Building: Whole Foods Market New Construction Summary, U.S. Department of Energy's Commercial Building Partnerships (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2011-04-01

    Whole Foods Market participates in the U.S. Department of Energy's Commercial Building Partnerships (CBP) to identify and develop cost-effective, readily deployed, replicable energy efficiency measures (EEMs) for commercial buildings. Whole Foods Market is working with the National Renewable Energy Laboratory (NREL) on a retrofit and a new construction CBP project. Whole Foods Market's CBP new construction project is a standalone store in Raleigh, North Carolina. Whole Foods Market examined the energy systems and the interactions between those systems in the design for the new Raleigh store. Based on this collaboration and preliminary energy modeling, Whole Foods Market and NREL identified a number of cost-effective EEMs that can be readily deployed in other Whole Foods Market stores and in other U.S. supermarkets. If the actual savings in the Raleigh store - which NREL will monitor and verify - match the modeling results, each year this store will save nearly $100,000 in operating costs (Raleigh's rates are about $0.06/kWh for electricity and $0.83/therm for natural gas). The store will also use 41% less energy than a Standard 90.1-compliant store and avoid about 3.7 million pounds of carbon dioxide emissions.

  3. Thinking Like a Whole Building: A Whole Foods Market New Construction Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deru, M.; Bonnema, E.; Doebber, I.

    2011-04-01

    Whole Foods Market participates in the U.S. Department of Energy's Commercial Building Partnerships (CBP) to identify and develop cost-effective, readily deployed, replicable energy efficiency measures (EEMs) for commercial buildings. Whole Foods Market is working with the National Renewable Energy Laboratory (NREL) on a retrofit and a new construction CBP project. Whole Foods Market's CBP new construction project is a standalone store in Raleigh, North Carolina. Whole Foods Market examined the energy systems and the interactions between those systems in the design for the new Raleigh store. Based on this collaboration and preliminary energy modeling, Whole Foods Market and NREL identified a number of cost-effective EEMs that can be readily deployed in other Whole Foods Market stores and in other U.S. supermarkets. If the actual savings in the Raleigh store - which NREL will monitor and verify - match the modeling results, each year this store will save nearly $100,000 in operating costs (Raleigh's rates are about $0.06/kWh for electricity and $0.83/therm for natural gas). The store will also use 41% less energy than a Standard 90.1-compliant store and avoid about 3.7 million pounds of carbon dioxide emissions.

  4. Carbon storage in mountainous headwater streams: The role of old-growth forest and logjams

    NASA Astrophysics Data System (ADS)

    Beckman, Natalie D.; Wohl, Ellen

    2014-03-01

    We measured wood piece characteristics and particulate organic matter (POM) in stored sediments in 30 channel-spanning logjams along headwater streams in the Colorado Front Range, USA. Logjams are on streams flowing through old-growth (>200 years), disturbed (<200 years, natural disturbance), or altered (<200 years, logged) subalpine conifer forest. We examined how channel-spanning logjams influence riverine carbon storage (measured as the total volatile carbon fraction of stored sediment and instream wood). Details of carbon storage associated with logjams reflect age and disturbance history of the adjacent riparian forest. A majority of the carbon within jams is stored as wood. Wood volume is significantly larger in old-growth and disturbed reaches than in altered reaches. Carbon storage also differs in relation to forest characteristics. Sediment from old-growth streams has significantly higher carbon content than altered streams. Volume of carbon stored in jam sediment correlates with jam wood volume in old-growth and disturbed forests, but not in altered forests. Forest stand age and wood volume within a jam explain 43% of the variation of carbon stored in jam sediment. First-order estimates of the amount of carbon stored within a stream reach show an order of magnitude difference between disturbed and altered reaches. Our first-order estimates of reach-scale riverine carbon storage suggest that the carbon per hectare stored in streams is on the same order of magnitude as the carbon stored as dead biomass in terrestrial subalpine forests of the region. Of particular importance, old-growth forest correlates with more carbon storage in rivers.

  5. Dense mesh sampling for video-based facial animation

    NASA Astrophysics Data System (ADS)

    Peszor, Damian; Wojciechowska, Marzena

    2016-06-01

    The paper describes an approach for selecting feature points on a three-dimensional triangle mesh obtained using various techniques from several video recordings. This approach has a dual purpose. First, it allows the data stored for facial animation to be minimized, so that instead of storing the position of each vertex in each frame, one can store only a small subset of vertices for each frame and calculate the positions of the others from that subset. The second purpose is to select feature points that can be used for anthropometry-based retargeting of recorded mimicry to another model, with a sampling density beyond what can be achieved using marker-based performance capture techniques. The developed approach was successfully tested on artificial models, models constructed using a structured light scanner, and models constructed from video recordings using stereophotogrammetry.

  6. Leadership: Where and What Is Leadership Excellence?

    DTIC Science & Technology

    1986-04-01

    of Sam Neaman who came to McCrory's chain of stores and motivated the personnel of one store (Indianapolis) to go out and explore their competition. He...around from a loser to a winner. Neaman could have used the changed Indiana store as a model for other McCrory stores. He did, in a way. However, his... Neaman followed the same strategy in Flushing, New York. He motivated the local employees to do it themselves. In their own way. A different kind of

  7. Identification of fine scale and landscape scale drivers of urban aboveground carbon stocks using high-resolution modeling and mapping.

    PubMed

    Mitchell, Matthew G E; Johansen, Kasper; Maron, Martine; McAlpine, Clive A; Wu, Dan; Rhodes, Jonathan R

    2018-05-01

    Urban areas are sources of land use change and CO2 emissions that contribute to global climate change. Despite this, assessments of urban vegetation carbon stocks often fail to identify important landscape-scale drivers of variation in urban carbon, especially the potential effects of landscape structure variables at different spatial scales. We combined field measurements with Light Detection And Ranging (LiDAR) data to build high-resolution models of woody plant aboveground carbon across the urban portion of Brisbane, Australia, and then identified landscape scale drivers of these carbon stocks. First, we used LiDAR data to quantify the extent and vertical structure of vegetation across the city at high resolution (5 × 5 m). Next, we paired this data with aboveground carbon measurements at 219 sites to create boosted regression tree models and map aboveground carbon across the city. We then used these maps to determine how spatial variation in land cover/land use and landscape structure affects these carbon stocks. Foliage densities above 5 m height, tree canopy height, and the presence of ground openings had the strongest relationships with aboveground carbon. Using these fine-scale relationships, we estimate that 2.2 ± 0.4 Tg C are stored aboveground in the urban portion of Brisbane, with mean densities of 32.6 ± 5.8 Mg C ha⁻¹ calculated across the entire urban land area, and 110.9 ± 19.7 Mg C ha⁻¹ calculated within treed areas. Predicted carbon densities within treed areas showed strong positive relationships with the proportion of surrounding tree cover and how clumped that tree cover was at both 1 km² and 1 ha resolutions. Our models predict that even dense urban areas with low tree cover can have high carbon densities at fine scales. We conclude that actions and policies aimed at increasing urban carbon should focus on those areas where urban tree cover is most fragmented. Copyright © 2017 Elsevier B.V. All rights reserved.
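
    As a rough sketch of the boosted regression tree step, the snippet below fits a gradient-boosted model of plot-level carbon on LiDAR-derived structure variables with scikit-learn. The feature names and training file are hypothetical placeholders standing in for the study's 219-site dataset.

```python
# Sketch of a boosted regression tree mapping LiDAR structure to aboveground
# carbon. File and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

plots = pd.read_csv("field_plots.csv")   # hypothetical plot-level training data
features = ["foliage_density_above_5m", "canopy_height", "ground_opening"]
X, y = plots[features], plots["carbon_mg_ha"]

brt = GradientBoostingRegressor(n_estimators=500, learning_rate=0.01,
                                max_depth=3, subsample=0.5)
print(cross_val_score(brt, X, y, cv=5, scoring="r2"))
brt.fit(X, y)
# brt.predict(...) can then be applied to every 5 m x 5 m LiDAR cell.
```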

  8. A computationally efficient parallel Levenberg-Marquardt algorithm for highly parameterized inverse model analyses

    NASA Astrophysics Data System (ADS)

    Lin, Youzuo; O'Malley, Daniel; Vesselinov, Velimir V.

    2016-09-01

    Inverse modeling seeks model parameters given a set of observations. However, for practical problems, because the number of measurements is often large and the model parameters are also numerous, conventional methods for inverse modeling can be computationally expensive. We have developed a new, computationally efficient parallel Levenberg-Marquardt method for solving inverse modeling problems with a highly parameterized model space. Levenberg-Marquardt methods require the solution of a linear system of equations which can be prohibitively expensive to compute for moderate to large-scale problems. Our novel method projects the original linear problem down to a Krylov subspace such that the dimensionality of the problem can be significantly reduced. Furthermore, we store the Krylov subspace computed when using the first damping parameter and recycle the subspace for the subsequent damping parameters. The efficiency of our new inverse modeling algorithm is significantly improved using these computational techniques. We apply this new inverse modeling method to invert for random transmissivity fields in 2-D and a random hydraulic conductivity field in 3-D. Our algorithm is fast enough to solve for the distributed model parameters (transmissivity) in the model domain. The algorithm is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). By comparing with Levenberg-Marquardt methods using standard linear inversion techniques such as QR or SVD methods, our Levenberg-Marquardt method yields a speed-up ratio on the order of 10¹ to 10² in a multicore computational environment. Therefore, our new inverse modeling method is a powerful tool for characterizing subsurface heterogeneity for moderate to large-scale problems.
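
    To make the idea concrete, the sketch below solves the damped normal equations of one Levenberg-Marquardt step, (JᵀJ + λI)δ = −Jᵀr, with a Krylov (conjugate gradient) iteration so the Jacobian is only touched through matrix-vector products and the normal matrix is never formed or factorized. The paper's recycling of the Krylov subspace across damping parameters is omitted for brevity, and all values are illustrative.

```python
# One Levenberg-Marquardt step via conjugate gradients on the damped normal
# equations (illustrative; no subspace recycling).
import numpy as np

def lm_step_cg(J, r, lam, tol=1e-10, max_iter=200):
    b = -J.T @ r                       # right-hand side
    delta = np.zeros(b.size)
    res = b.copy()                     # residual of the linear system
    p = res.copy()
    for _ in range(max_iter):
        Ap = J.T @ (J @ p) + lam * p   # (J^T J + lam I) p without forming J^T J
        alpha = res @ res / (p @ Ap)
        delta += alpha * p
        res_new = res - alpha * Ap
        if np.linalg.norm(res_new) < tol:
            break
        p = res_new + (res_new @ res_new) / (res @ res) * p
        res = res_new
    return delta

J = np.random.randn(400, 100)          # Jacobian of residuals w.r.t. parameters
r = np.random.randn(400)               # current residual vector
step = lm_step_cg(J, r, lam=1e-2)
```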

  9. Cloud Computing: A model Construct of Real-Time Monitoring for Big Dataset Analytics Using Apache Spark

    NASA Astrophysics Data System (ADS)

    Alkasem, Ameen; Liu, Hongwei; Zuo, Decheng; Algarash, Basheer

    2018-01-01

    The volume of data being collected, analyzed, and stored has exploded in recent years, particularly in relation to activity in cloud computing. Much of this growth is driven by large-scale data processing, analysis, and storage platforms such as the cloud, whose use continues to increase. The major challenge today is how to monitor and control these massive amounts of data and perform analysis in real time at scale. Traditional methods and model systems are unable to cope with such quantities of data in real time. Here we present a new methodology for constructing a model that optimizes the performance of real-time monitoring of big datasets; it combines machine learning algorithms with Apache Spark Streaming to accomplish fine-grained fault diagnosis and repair of big datasets. As a case study, we use the failure of Virtual Machines (VMs) to start up. The proposed methodology ensures that the most sensible action is carried out during fine-grained monitoring and achieves effective, cost-saving fault repair through three control steps: (I) data collection; (II) analysis engine; and (III) decision engine. We found that running this methodology can save a considerable amount of time compared to the Hadoop model, without sacrificing classification accuracy or performance. The accuracy of the proposed method (92.13%) is an improvement on traditional approaches.
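
    A minimal stand-in for the collect/analyze/decide loop is sketched below in plain Python with scikit-learn. The metric names, thresholds, and repair action are hypothetical; a production version would replace the loop with an actual Spark Streaming job and a classifier trained on real VM telemetry.

      import numpy as np
      from sklearn.naive_bayes import GaussianNB

      # Hypothetical training data: rows of VM metrics (CPU, memory, I/O wait);
      # label 1 = start-up failure, 0 = healthy.
      rng = np.random.default_rng(0)
      X_train = rng.random((500, 3))
      y_train = (X_train[:, 0] + X_train[:, 2] > 1.4).astype(int)
      clf = GaussianNB().fit(X_train, y_train)

      def decide(batch):
          """Decision engine: map diagnoses to a repair action for one micro-batch."""
          for metrics, fault in zip(batch, clf.predict(batch)):
              if fault:
                  print("restart VM, metrics:", np.round(metrics, 2))

      for _ in range(3):                    # stand-in for the streaming micro-batch loop
          decide(rng.random((8, 3)))        # (I) collect  (II) analyze  (III) decide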

  10. Federated learning of predictive models from federated Electronic Health Records.

    PubMed

    Brisimi, Theodora S; Chen, Ruidi; Mela, Theofanie; Olshevsky, Alex; Paschalidis, Ioannis Ch; Shi, Wei

    2018-04-01

    In an era of "big data," computationally efficient and privacy-aware solutions for large-scale machine learning problems become crucial, especially in the healthcare domain, where large amounts of data are stored in different locations and owned by different entities. Past research has focused on centralized algorithms, which assume the existence of a central data repository (database) that stores and can process the data from all participants. Such an architecture, however, can be impractical when data are not centrally located; it does not scale well to very large datasets, and it introduces single-point-of-failure risks which could compromise the integrity and privacy of the data. Given scores of data widely spread across hospitals/individuals, a decentralized, computationally scalable methodology is very much in need. We aim at solving a binary supervised classification problem to predict hospitalizations for cardiac events using a distributed algorithm. We seek to develop a general decentralized optimization framework enabling multiple data holders to collaborate and converge to a common predictive model, without explicitly exchanging raw data. We focus on the soft-margin l1-regularized sparse Support Vector Machine (sSVM) classifier. We develop an iterative cluster Primal Dual Splitting (cPDS) algorithm for solving the large-scale sSVM problem in a decentralized fashion. Such a distributed learning scheme is relevant for multi-institutional collaborations or peer-to-peer applications, allowing the data holders to collaborate while keeping every participant's data private. We test cPDS on the problem of predicting hospitalizations due to heart diseases within a calendar year based on information in the patients' Electronic Health Records prior to that year. cPDS converges faster than centralized methods at the cost of some communication between agents. It also converges faster and with less communication overhead compared to an alternative distributed algorithm. In both cases, it achieves similar prediction accuracy measured by the Area Under the Receiver Operating Characteristic Curve (AUC) of the classifier. We extract important features discovered by the algorithm that are predictive of future hospitalizations, thus providing a way to interpret the classification results and inform prevention efforts. Copyright © 2018 Elsevier B.V. All rights reserved.
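
    The cPDS updates themselves are spelled out in the paper; the sketch below shows the simpler idea it builds on, a decentralized subgradient method for the l1-regularized hinge loss in which agents exchange only weight vectors with neighbors, never raw records. The mixing matrix W and the step-size schedule are illustrative assumptions.

      import numpy as np

      def local_subgrad(w, X, y, lam):
          """Subgradient of average hinge loss + l1 penalty on one agent's private data."""
          viol = y * (X @ w) < 1
          g = -(X[viol] * y[viol][:, None]).sum(axis=0) / len(y)
          return g + lam * np.sign(w)

      def decentralized_svm(data, W, lam=0.01, steps=200, eta=0.1):
          """data: list of (X_i, y_i), one per agent; W: doubly stochastic mixing
          matrix matching the communication graph."""
          m, d = len(data), data[0][0].shape[1]
          w = np.zeros((m, d))
          for t in range(steps):
              w = W @ w                                   # consensus averaging
              for i, (X, y) in enumerate(data):
                  w[i] -= eta / np.sqrt(t + 1) * local_subgrad(w[i], X, y, lam)
          return w.mean(axis=0)

      rng = np.random.default_rng(0)
      data = [(rng.standard_normal((50, 4)), rng.choice([-1, 1], 50)) for _ in range(3)]
      W = np.full((3, 3), 1 / 3)                          # fully connected toy graph
      w_hat = decentralized_svm(data, W)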

  11. BioModels Database: An enhanced, curated and annotated resource for published quantitative kinetic models

    PubMed Central

    2010-01-01

    Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely-accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. PMID:20587024

  12. The influence of the WIC food package changes on the retail food environment in New Orleans.

    PubMed

    Rose, Donald; O'Malley, Keelia; Dunaway, Lauren Futrell; Bodor, J Nicholas

    2014-01-01

    To examine the effect of the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) food package changes on availability of healthy foods in small stores. Pre-post comparison group design with repeat in-store observations. New Orleans. Small stores (n = 102; 77% of total) were visited in 2009. Of these, 91% were observed again in 2010, including both WIC (n = 27) and non-WIC (n = 66) stores. The 2009 WIC food package changes to include healthier foods. Change in store availability of fruits, vegetables, lower-fat milks, whole wheat bread, and brown rice. Change in number of varieties and shelf length of fruits and vegetables. Difference-in-differences analysis using logit models for change in availability and regression models for change in number of varieties or shelf length. The WIC stores were more likely to improve availability of lower-fat milks than non-WIC stores (adjusted odds ratio, 5.0, 95% confidence interval, 1.2-21.0). An even greater relative improvement was seen with whole grains. The WIC stores showed a relative increase in number of varieties of fresh fruits (0.9 ± 0.3; P < .01) and shelf length of vegetables (1.2 ± 0.4 meters; P < .01). Results suggest that WIC changes improved the availability of healthy foods in small stores in New Orleans. Similar changes throughout the country could have a significant impact on neighborhood food environments. Copyright © 2014 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  13. Short and long-term energy intake patterns and their implications for human body weight regulation.

    PubMed

    Chow, Carson C; Hall, Kevin D

    2014-07-01

    Adults consume millions of kilocalories over the course of a few years, but the typical weight gain amounts to only a few thousand kilocalories of stored energy. Furthermore, food intake is highly variable from day to day and yet body weight is remarkably stable. These facts have been used as evidence to support the hypothesis that human body weight is regulated by active control of food intake operating on both short and long time scales. Here, we demonstrate that active control of human food intake on short time scales is not required for body weight stability and that the current evidence for long term control of food intake is equivocal. To provide more data on this issue, we emphasize the urgent need for developing new methods for accurately measuring energy intake changes over long time scales. We propose that repeated body weight measurements can be used along with mathematical modeling to calculate long-term changes in energy intake and thereby quantify adherence to a diet intervention and provide dynamic feedback to individuals that seek to control their body weight. Published by Elsevier Inc.
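
    The proposal above, inferring long-term intake changes from repeated weights, reduces in its linearized form to inverting an energy-balance equation, ΔI = ρ dΔW/dt + ε ΔW. The constants below are common rule-of-thumb values (roughly 7700 kcal per kg of tissue and about 22 kcal/kg/day of weight-dependent expenditure), assumed here for illustration rather than taken from the paper.

      import numpy as np

      RHO, EPS = 7700.0, 22.0   # assumed: kcal per kg stored; kcal/kg/day expenditure

      def intake_change(days, weights_kg, baseline_kg):
          """Estimate the change in daily energy intake from repeated weights,
          using the linearized balance model dW/dt = (dI - EPS*dW)/RHO."""
          dW = np.asarray(weights_kg) - baseline_kg
          slope = np.gradient(dW, days)         # kg/day, handles uneven visit spacing
          return RHO * slope + EPS * dW         # kcal/day relative to baseline

      days = np.array([0, 30, 60, 90, 120])
      weights = np.array([85.0, 84.1, 83.5, 83.1, 82.9])
      print(np.round(intake_change(days, weights, 85.0)))   # estimated daily deficit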

  14. Large-scale filament formation inhibits the activity of CTP synthetase

    PubMed Central

    Barry, Rachael M; Bitbol, Anne-Florence; Lorestani, Alexander; Charles, Emeric J; Habrian, Chris H; Hansen, Jesse M; Li, Hsin-Jung; Baldwin, Enoch P; Wingreen, Ned S; Kollman, Justin M; Gitai, Zemer

    2014-01-01

    CTP Synthetase (CtpS) is a universally conserved and essential metabolic enzyme. While many enzymes form small oligomers, CtpS forms large-scale filamentous structures of unknown function in prokaryotes and eukaryotes. By simultaneously monitoring CtpS polymerization and enzymatic activity, we show that polymerization inhibits activity, and CtpS's product, CTP, induces assembly. To understand how assembly inhibits activity, we used electron microscopy to define the structure of CtpS polymers. This structure suggests that polymerization sterically hinders a conformational change necessary for CtpS activity. Structure-guided mutagenesis and mathematical modeling further indicate that coupling activity to polymerization promotes cooperative catalytic regulation. This previously uncharacterized regulatory mechanism is important for cellular function since a mutant that disrupts CtpS polymerization disrupts E. coli growth and metabolic regulation without reducing CTP levels. We propose that regulation by large-scale polymerization enables ultrasensitive control of enzymatic activity while storing an enzyme subpopulation in a conformationally restricted form that is readily activatable. DOI: http://dx.doi.org/10.7554/eLife.03638.001 PMID:25030911

  15. Short and long-term energy intake patterns and their implications for human body weight regulation

    PubMed Central

    Chow, Carson C.; Hall, Kevin D.

    2014-01-01

    Adults consume millions of kilocalories over the course of a few years, but the typical weight gain amounts to only a few thousand kilocalories of stored energy. Furthermore, food intake is highly variable from day to day and yet body weight is remarkably stable. These facts have been used as evidence to support the hypothesis that human body weight is regulated by active control of food intake operating on both short and long time scales. Here, we demonstrate that active control of human food intake on short time scales is not required for body weight stability and that the current evidence for long term control of food intake is equivocal. To provide more data on this issue, we emphasize the urgent need for developing new methods for accurately measuring energy intake changes over long time scales. We propose that repeated body weight measurements can be used along with mathematical modeling to calculate long-term changes in energy intake and thereby quantify adherence to a diet intervention and provide dynamic feedback to individuals that seek to control their body weight. PMID:24582679

  16. Increased autumn rainfall disrupts predator-prey interactions in fragmented boreal forests.

    PubMed

    Terraube, Julien; Villers, Alexandre; Poudré, Léo; Varjonen, Rauno; Korpimäki, Erkki

    2017-04-01

    There is a pressing need to understand how changing climate interacts with land-use change to affect predator-prey interactions in fragmented landscapes. This is particularly true in boreal ecosystems facing fast climate change and intensification in forestry practices. Here, we investigated the relative influence of autumn climate and habitat quality on the food-storing behaviour of a generalist predator, the pygmy owl, using a unique data set of 15 850 prey items recorded in western Finland over 12 years. Our results highlighted strong effects of autumn climate (number of days with rainfall and with temperature <0 °C) on food-store composition. Increasing frequency of days with precipitation in autumn triggered a decrease in (i) total prey biomass stored, (ii) the number of bank voles (main prey) stored, and (iii) the scaled mass index of pygmy owls. Increasing proportions of old spruce forests strengthened the functional response of owls to variations in vole abundance and were more prone to switch from main prey to alternative prey (passerine birds) depending on local climate conditions. High-quality habitat may allow pygmy owls to buffer negative effects of inclement weather and cyclic variation in vole abundance. Additionally, our results evidenced sex-specific trends in body condition, as the scaled mass index of smaller males increased while the scaled mass index of larger females decreased over the study period, probably due to sex-specific foraging strategies and energy requirements. Long-term temporal stability in local vole abundance refutes the hypothesis of climate-driven change in vole abundance and suggests that rainier autumns could reduce the vulnerability of small mammals to predation by pygmy owls. As small rodents are key prey species for many predators in northern ecosystems, our findings raise concern about the impact of global change on boreal food webs through changes in main prey vulnerability. © 2016 John Wiley & Sons Ltd.

  17. Enabling Energy-Awareness in the Semantic 3d City Model of Vienna

    NASA Astrophysics Data System (ADS)

    Agugiaro, G.

    2016-09-01

    This paper presents and discusses the first results regarding the selection, analysis, preparation and eventual integration of a number of energy-related datasets, chosen in order to enrich a CityGML-based semantic 3D city model of Vienna. CityGML is an international standard conceived specifically as an information and data model for semantic city models at urban and territorial scale. The still-in-development Energy Application Domain Extension (ADE) is a CityGML extension conceived specifically to model, manage and store energy-related features and attributes for buildings. The work presented in this paper is embedded within the European Marie-Curie ITN project "CINERGY, Smart cities with sustainable energy systems", which aims, among other goals, at developing urban decision-making and operational optimisation software tools to minimise non-renewable energy use in cities. Given the scope and scale of the project, it is vital to set up a common, unique and spatio-semantically coherent urban data model to be used as the information hub for all applications being developed. This paper reports on the experience gained so far: it describes the test area in Vienna, Austria, and the available data sources, and it shows and exemplifies the main data integration issues and the strategies developed to solve them in order to obtain the enriched 3D city model. The first results, together with some comments about their quality and limitations, are presented, along with a discussion of the next steps and some planned improvements.

  18. Sparsey™: event recognition via deep hierarchical sparse distributed codes

    PubMed Central

    Rinkus, Gerard J.

    2014-01-01

    The visual cortex's hierarchical, multi-level organization is captured in many biologically inspired computational vision models, the general idea being that progressively larger scale (spatially/temporally) and more complex visual features are represented in progressively higher areas. However, most earlier models use localist representations (codes) in each representational field (which we equate with the cortical macrocolumn, “mac”), at each level. In localism, each represented feature/concept/event (hereinafter “item”) is coded by a single unit. The model we describe, Sparsey, is hierarchical as well but crucially, it uses sparse distributed coding (SDC) in every mac in all levels. In SDC, each represented item is coded by a small subset of the mac's units. The SDCs of different items can overlap and the size of overlap between items can be used to represent their similarity. The difference between localism and SDC is crucial because SDC allows the two essential operations of associative memory, storing a new item and retrieving the best-matching stored item, to be done in fixed time for the life of the model. Since the model's core algorithm, which does both storage and retrieval (inference), makes a single pass over all macs on each time step, the overall model's storage/retrieval operation is also fixed-time, a criterion we consider essential for scalability to the huge (“Big Data”) problems. A 2010 paper described a nonhierarchical version of this model in the context of purely spatial pattern processing. Here, we elaborate a fully hierarchical model (arbitrary numbers of levels and macs per level), describing novel model principles like progressive critical periods, dynamic modulation of principal cells' activation functions based on a mac-level familiarity measure, representation of multiple simultaneously active hypotheses, a novel method of time warp invariant recognition, and we report results showing learning/recognition of spatiotemporal patterns. PMID:25566046
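
    The storage/retrieval contrast with localist coding is easy to see in a toy version: each item receives a small random subset of a mac's units, and code overlap measures similarity. The sketch below (sizes and item keys are arbitrary) shows a degraded cue still retrieving the right item. Note that this toy retrieval scans all stored items, whereas Sparsey's actual algorithm achieves its fixed-time property differently.

      import numpy as np
      rng = np.random.default_rng(0)

      N_UNITS, CODE_SIZE = 2000, 40     # mac size and sparse-code size (assumed)
      memory = {}

      def store(item_key):
          """Assign a new item a random sparse code over the mac's units."""
          memory[item_key] = frozenset(rng.choice(N_UNITS, CODE_SIZE, replace=False))

      def best_match(probe_code):
          """Return the stored item whose code overlaps the probe the most;
          overlap size doubles as a graded similarity measure."""
          return max(memory, key=lambda k: len(memory[k] & probe_code))

      for key in ["walk", "run", "jump"]:
          store(key)

      # Degraded cue: 30 of "run"'s 40 units plus 10 random units.
      noisy = set(list(memory["run"])[:30]) | set(rng.choice(N_UNITS, 10))
      print(best_match(noisy))          # -> "run"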

  19. Store and Restaurant Advertising and Health of Public Housing Residents

    PubMed Central

    Heinrich, Katie M.; Li, Dongmei; Regan, Gail R.; Howard, Hugh H.; Ahluwalia, Jasjit S.; Lee, Rebecca E.

    2015-01-01

    Objectives To determine relationships between food and beverage signs and health. Methods In 12 public housing neighborhoods, food and alcohol signs were counted for stores and restaurants. Health and demographic data were from 373 adults. Results Multilevel modeling showed that higher BMI was related to more store and restaurant alcohol signs, as well as to higher blood pressure, nonsmoking status, and female sex. Higher dietary fat consumption was related to more store and restaurant alcohol signs and fewer low-calorie/healthy signs, lower fruit consumption, fewer minutes walked, and white or Hispanic/Latino ethnicity. Conclusions Signs in stores and restaurants are related to BMI and dietary fat consumption among residents. PMID:22251784

  20. Combined quantity management and biological treatment of sludge liquor at Hamburg's wastewater treatment plants--first experience in operation with the Store and Treat process.

    PubMed

    Laurich, F

    2004-01-01

    Store and Treat (SAT) is a new concept for the management of ammonium-rich process waste waters at wastewater treatment plants. It combines the advantages of quantity management and separate biological treatment, whereby both operations are carried out in the same tank. The first full-scale application of the method has now been realized in Hamburg. Initial operating experience shows that the process can help to increase nitrogen removal and to reduce energy consumption.

  1. Hydrogeologic settings and groundwater-flow simulations for regional investigations of the transport of anthropogenic and natural contaminants to public-supply wells—Investigations begun in 2004

    USGS Publications Warehouse

    Eberts, Sandra M.

    2011-01-01

    A study of the Transport of Anthropogenic and Natural Contaminants to public-supply wells (TANC study) was begun in 2001 as part of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program. The study was designed to shed light on factors that affect the vulnerability of groundwater and, more specifically, water from public-supply wells to contamination to provide a context for the NAWQA Program's earlier finding of mixtures of contaminants at low concentrations in groundwater near the water table in urban areas across the Nation. The TANC study has included investigations at both the regional (tens to thousands of square kilometers) and local (generally less than 25 square kilometers) scales. At the regional scale, the approach to investigation involves refining conceptual models of groundwater flow in hydrologically distinct settings and then constructing or updating a groundwater-flow model with particle tracking for each setting to help quantify regional water budgets, public-supply well contributing areas (areas contributing recharge to wells and zones of contribution for wells), and traveltimes from recharge areas to selected wells. A great deal of information about each contributing area is captured from the model output, including values for 170 variables that describe physical and (or) geochemical characteristics of the contributing areas. The information is subsequently stored in a relational database. Retrospective water-quality data from monitoring, domestic, and many of the public-supply wells, as well as data from newly collected samples at selected public-supply wells, also are stored in the database and are used with the model output to help discern the more important factors affecting vulnerability in many, if not most, settings. The study began with investigations in seven regional areas, and it benefits from being conducted as part of the NAWQA Program, in which consistent methods are used so that meaningful comparisons can be made. The hydrogeologic settings and regional-scale groundwater-flow models from the initial seven regional areas are documented in Chapter A of this U.S. Geological Survey Professional Paper. Also documented in Chapter A are the methods used to collect and compile the water-quality data, determine contributing areas of the public-supply wells, and characterize the oxidation-reduction (redox) conditions in each setting. A data dictionary for the database that was designed to enable joint storage and access to water-quality data and groundwater-flow model particle-tracking output is included as Appendix 1 of Chapter A. This chapter, Chapter B, documents modifications to the study methods and presents descriptions of two regional areas that were added to the TANC study in 2004.

  2. Food mirages: geographic and economic barriers to healthful food access in Portland, Oregon.

    PubMed

    Breyer, Betsy; Voss-Andreae, Adriana

    2013-11-01

    This paper investigated the role of grocery store prices in structuring food access for low-income households in Portland, Oregon. We conducted a detailed healthful-foods market basket survey and developed an index of store cost based on the USDA Thrifty Food Plan. Using this index, we estimated the difference in street-network distance between the nearest low-cost grocery store and the nearest grocery store irrespective of cost. Spatial regression of this metric in relation to income, poverty, and gentrification at the census tract scale led to a new theory regarding food access in the urban landscape. Food deserts are sparse in Portland, but food mirages are abundant, particularly in gentrifying areas where poverty remains high. In a food mirage, grocery stores are plentiful but prices are beyond the means of low-income households, making them functionally equivalent to food deserts in that a long journey to obtain affordable, nutritious food is required in either case. Results suggested that evaluation of food environments should, at a minimum, consider both proximity and price in assessing healthy food access for low-income households. © 2013 Elsevier Ltd. All rights reserved.
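
    The distance-difference metric lends itself to a short sketch: on a street network, take the shortest-path distance to the nearest store of any price and to the nearest low-cost store, and treat the gap as a mirage indicator. Below is a toy version using networkx; the graph, node names, and the "length" edge attribute are invented for illustration.

      import networkx as nx

      def mirage_gap(G, origin, stores, low_cost_stores):
          """Network distance to the nearest low-cost store minus distance to the
          nearest store of any cost; a large gap marks a 'food mirage'."""
          dist = nx.single_source_dijkstra_path_length(G, origin, weight="length")
          d_any = min(dist[s] for s in stores if s in dist)
          d_low = min(dist[s] for s in low_cost_stores if s in dist)
          return d_low - d_any

      G = nx.Graph()
      G.add_weighted_edges_from([("tract", "A", 0.3), ("tract", "B", 1.8),
                                 ("A", "B", 1.6)], weight="length")
      print(mirage_gap(G, "tract", stores={"A", "B"}, low_cost_stores={"B"}))  # 1.5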

  3. Cross-Service Investigation of Geographical Information Systems

    DTIC Science & Technology

    2004-03-01

    Figure 8 illustrates the combined layers. Information for the layers is stored in a database format. The two types of storage are vector and...raster models. In a vector model, the image and information are stored as geometric objects such as points, lines, or polygons. In a raster model...DNCs are a vector -based digital database with selected maritime significant physical features from hydrographic charts. Layers within the DNC are data

  4. Quantum Vertex Model for Reversible Classical Computing

    NASA Astrophysics Data System (ADS)

    Chamon, Claudio; Mucciolo, Eduardo; Ruckenstein, Andrei; Yang, Zhicheng

    We present a planar vertex model that encodes the result of a universal reversible classical computation in its ground state. The approach involves Boolean variables (spins) placed on links of a two-dimensional lattice, with vertices representing logic gates. Large short-ranged interactions between at most two spins implement the operation of each gate. The lattice is anisotropic with one direction corresponding to computational time, and with transverse boundaries storing the computation's input and output. The model displays no finite temperature phase transitions, including no glass transitions, independent of circuit. The computational complexity is encoded in the scaling of the relaxation rate into the ground state with the system size. We use thermal annealing and a novel and more efficient heuristic, "annealing with learning", to study various computational problems. To explore faster relaxation routes, we construct an explicit mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating a novel approach to reversible classical computation based on quantum annealing.

  5. Heritage House Maintenance Using 3d City Model Application Domain Extension Approach

    NASA Astrophysics Data System (ADS)

    Mohd, Z. H.; Ujang, U.; Liat Choon, T.

    2017-11-01

    Heritage houses are part of the architectural heritage of Malaysia and are highly valued. The Department of Heritage has made many efforts to preserve these houses, such as monitoring their damage problems, which may be caused by wood decay, roof leakage and exfoliation of walls. One initiative for maintaining and documenting heritage houses is three-dimensional (3D) technology. 3D city models are now in wide use and are much employed by researchers for management and analysis. CityGML is the standard tool usually used by researchers to exchange, store and manage both the geometric and the semantic information of virtual 3D city models. Moreover, it can represent a 3D model at multiple scales in five levels of detail (LoDs), each level serving a distinct function. An extension of CityGML was recently introduced and can be used for monitoring damage problems and recording the number of inhabitants of a house.

  6. Force-directed visualization for conceptual data models

    NASA Astrophysics Data System (ADS)

    Battigaglia, Andrew; Sutter, Noah

    2017-03-01

    Conceptual data models are increasingly stored in an eXtensible Markup Language (XML) format because of its portability between different systems and the ability of databases to use this format for storing data. However, when attempting to capture business or design needs, an organized graphical format is preferred, in order to facilitate communication and to receive as much input as possible from users and subject-matter experts. Existing methods for achieving this conversion are often not specific enough to capture all of the needs of conceptual data modeling, and they cannot handle a large number of relationships between entities. This paper describes an implementation of a modeling solution that clearly illustrates conceptual data models stored in XML formats in well-organized and structured diagrams. A force layout with several different parameters is applied to the diagram to create compact and easily traversable relationships between entities.
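
    The core pipeline, parse entities and relationships from XML, then run a force-directed (spring) layout, can be sketched briefly. The XML schema below is hypothetical, and networkx's Fruchterman-Reingold layout stands in for whatever force model the paper's implementation uses.

      import networkx as nx
      import xml.etree.ElementTree as ET

      # Hypothetical schema: <entity name="..."/> and <relation from=".." to=".."/>.
      xml = """<model>
        <entity name="Customer"/><entity name="Order"/><entity name="Product"/>
        <relation from="Customer" to="Order"/><relation from="Order" to="Product"/>
      </model>"""

      root = ET.fromstring(xml)
      G = nx.Graph()
      G.add_nodes_from(e.get("name") for e in root.iter("entity"))
      G.add_edges_from((r.get("from"), r.get("to")) for r in root.iter("relation"))

      # Force-directed layout; k tunes the ideal spring length between entities.
      pos = nx.spring_layout(G, k=1.0, iterations=100, seed=42)
      for name, (x, y) in pos.items():
          print(f"{name:8s} ({x:+.2f}, {y:+.2f})")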

  7. Local curvature entropy-based 3D terrain representation using a comprehensive Quadtree

    NASA Astrophysics Data System (ADS)

    Chen, Qiyu; Liu, Gang; Ma, Xiaogang; Mariethoz, Gregoire; He, Zhenwen; Tian, Yiping; Weng, Zhengping

    2018-05-01

    Large-scale 3D digital terrain modeling is a crucial part of many real-time applications in geoinformatics. In recent years, improved speed and precision in spatial data collection have made raw terrain data larger and more complex, which poses challenges for data management, visualization and analysis. In this work, we present an effective and comprehensive 3D terrain representation based on local curvature entropy and a dynamic Quadtree. Level-of-detail (LOD) models of significant terrain features are employed to generate hierarchical terrain surfaces. In order to reduce radical changes in grid density between adjacent LODs, the local entropy of terrain curvature is used as the measure for subdividing terrain grid cells. An efficient approach is then presented to eliminate the cracks among different LODs by directly updating the Quadtree, thanks to an edge-based structure proposed in this work. Furthermore, we utilize a threshold of local entropy stored in each parent node of the Quadtree to flexibly control the depth of the Quadtree and dynamically schedule large-scale LOD terrain. Several experiments were implemented to test the performance of the proposed method. The results demonstrate that our method can be applied to construct LOD 3D terrain models with good performance in terms of computational cost and the maintenance of terrain features. Our method has already been deployed in a geographic information system (GIS) for practical use, and it is able to support the real-time dynamic scheduling of large-scale terrain models easily and efficiently.
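
    The entropy-driven subdivision test can be illustrated compactly: compute the Shannon entropy of the curvature distribution inside a cell and split while it exceeds a stored threshold. This is a simplified stand-in for the paper's criterion; the bin count, threshold, and synthetic curvature field are assumptions.

      import numpy as np

      def curvature_entropy(curv_patch, bins=16):
          """Shannon entropy of the curvature histogram inside one grid cell;
          high entropy = rough terrain that deserves a finer LOD."""
          hist, _ = np.histogram(curv_patch, bins=bins)
          p = hist[hist > 0] / hist.sum()
          return -(p * np.log2(p)).sum()

      def subdivide(cell, curv, threshold, depth=0, max_depth=6):
          """Recursively split a quadtree cell while its curvature entropy
          exceeds the threshold stored for it."""
          x0, y0, x1, y1 = cell
          patch = curv[y0:y1, x0:x1]
          if depth >= max_depth or patch.size < 4 or curvature_entropy(patch) <= threshold:
              return [cell]
          xm, ym = (x0 + x1) // 2, (y0 + y1) // 2
          quads = [(x0, y0, xm, ym), (xm, y0, x1, ym), (x0, ym, xm, y1), (xm, ym, x1, y1)]
          return [leaf for q in quads
                  for leaf in subdivide(q, curv, threshold, depth + 1, max_depth)]

      curv = np.abs(np.random.default_rng(1).normal(size=(64, 64)))  # toy curvature field
      print(len(subdivide((0, 0, 64, 64), curv, threshold=2.5)))     # number of leaf cells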

  8. Challenges in Managing Trustworthy Large-scale Digital Science

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories: model outputs, including coupled models and ensembles; data products that have been processed to a level of usability; and increasingly heuristically driven data analysis. These data products are increasingly the ones usable by broad communities, and they far exceed the raw instrument outputs in volume. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support reliable management of the information across distributed resources. Users necessarily rely on these underlying "black boxes" in order to be productive and produce new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems, ranging from the fundamental reliability of the compute hardware, through system software stacks and libraries, to the model software itself. Because of these complexities and the capacity limits of the infrastructure, there is an increased emphasis on transparency of approach and robustness of methods over full reproducibility. Furthermore, with large-volume data management it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access updated products, and on the reliability with which previous outcomes remain relevant and can be updated with new information. We will discuss these challenges and some of the approaches underway to address these issues.

  9. ODM2 (Observation Data Model): The EarthChem Use Case

    NASA Astrophysics Data System (ADS)

    Lehnert, Kerstin; Song, Lulin; Hsu, Leslie; Horsburgh, Jeffrey S.; Aufdenkampe, Anthony K.; Mayorga, Emilio; Tarboton, David; Zaslavsky, Ilya

    2014-05-01

    PetDB is an online data system created in the late 1990s to serve a synthesis of published geochemical and petrological data on igneous and metamorphic rocks. PetDB today holds 2.5 million analytical values for nearly 70,000 rock samples. PetDB's data model (Lehnert et al., G-Cubed 2000) was designed to store sample-based observational data generated by the analysis of rocks, together with a wide range of metadata documenting provenance of the samples, analytical procedures, data quality, and data source. Attempts to store additional types of geochemical data, such as time-series data of seafloor hydrothermal springs and volcanic gases, depth-series data for marine sediments and soils, and mineral or mineral-inclusion data, revealed the limitations of the schema: the inability to properly record sample hierarchies (for example, a garnet included in a diamond that is included in a xenolith that is included in a kimberlite rock sample), the inability to properly store time-series data, the inability to accommodate classification schemes other than rock lithologies, and deficiencies in identifying and documenting datasets that are not part of publications. In order to overcome these deficiencies, PetDB has been developing a new data schema using the ODM2 information model (ODM = Observation Data Model). The development of ODM2 is a collaborative project that leverages the experience of several existing information representations, including PetDB and EarthChem, and the CUAHSI HIS Observations Data Model (ODM), as well as the general specification for encoding observational data called Observations and Measurements (O&M), to develop a uniform information model that seamlessly manages spatially discrete, feature-based earth observations from environmental samples and sample fractions as well as in-situ sensors, and to test its initial implementation in a variety of user scenarios. The O&M model, adopted as an international standard by the Open Geospatial Consortium and later by ISO, is the foundation of several domain markup languages such as OGC WaterML 2, used for exchanging hydrologic time series. O&M profiles for samples and sample fractions have not been standardized yet, and there is significant variety in the sample data representations used across agencies and academic projects. The intent of the ODM2 project is to create a unified relational representation for different types of spatially discrete observational data, ensuring that the data can be efficiently stored, transferred, catalogued and queried within a variety of earth science applications. We will report on the initial design and implementation of the new model for PetDB, and on the results of testing the model against a set of common queries. We have explored several aspects of the model, including semantic consistency, validation and integrity checking, portability and maintainability, query efficiency, and scalability. Sample datasets from PetDB have been loaded into the initial physical implementation for testing. The results of the experiments point to both benefits and challenges of the initial design, and illustrate the key trade-off between generality of design, ease of interpretation, and query efficiency, especially as the system needs to scale to millions of records.

  10. Organic carbon stock modelling for the quantification of the carbon sinks in terrestrial ecosystems

    NASA Astrophysics Data System (ADS)

    Durante, Pilar; Algeet, Nur; Oyonarte, Cecilio

    2017-04-01

    Given the recent environmental policies derived from the serious threats posed by global change, practical measures to decrease net CO2 emissions have to be put in place. Carbon sequestration is a major measure for reducing atmospheric CO2 concentrations over the short and medium term, and terrestrial ecosystems play a basic role as carbon sinks. The development of tools for the quantification, assessment and management of organic carbon in ecosystems, at different scales and under different management scenarios, is essential to achieve these commitments. The aim of this study is to establish a methodological framework for modeling such a tool, applied to sustainable land-use planning and management at spatial and temporal scales. The methodology for estimating ecosystem carbon stocks is based on fusing estimates of the carbon stored in soils with estimates of the carbon stored in aboveground biomass. For this purpose, both a spatial variability map of soil organic carbon (SOC) and algorithms for calculating the biomass of forest species will be created. Modeling the spatial distribution of SOC at different map scales requires harmonizing and screening the available legacy soil databases. SOC modeling will then be based on the SCORPAN model, a quantitative model used to assess the correlation among soil-forming factors measured at the same site location. These factors will be selected from both static variables (terrain morphometric variables) and dynamic variables (climatic variables and vegetation indexes such as NDVI), giving the model its spatio-temporal character. Once the predictive model is fitted, spatial inference techniques will be used to produce the final map and to extrapolate to areas without data (automated random forest regression kriging). The estimated uncertainty will be calculated to assess model performance at the different scales considered. The organic carbon in aboveground biomass will be estimated using LiDAR (Light Detection And Ranging) algorithms, drawing on the available LiDAR databases. LiDAR statistics (which describe the LiDAR point cloud in order to calculate forest stand parameters) will be correlated with different canopy cover variables. The regression models, applied to the total area, will produce a continuous geo-information map for each canopy variable. The CO2 estimate will be calculated with dry-mass conversion factors for each forest species (kg C to kg CO2 equivalent). The result is a spatio-temporal model of organic carbon, with different levels of uncertainty associated with the predictive models and the various scales of detail. One of the main expected problems, however, is the heterogeneous spatial distribution of the soil information, which affects the predictions of the models at different spatial scales and, consequently, the scale of the SOC map. In addition, the variability and mixture of forest species in the aboveground biomass reduce the accuracy of the organic carbon estimates.
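
    The regression step of random forest regression kriging can be sketched quickly: fit a random forest to SCORPAN-style covariates, then (in the full method) krige the residuals and add them back at prediction time. The sketch below uses synthetic covariates and a toy SOC response; the predictor names and coefficients are assumptions, and the kriging step is only indicated in a comment.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(0)

      # Synthetic SCORPAN covariates at sampled sites: elevation, slope,
      # mean temperature, NDVI (stand-ins for the real predictors).
      X = rng.random((300, 4))
      soc = 40 * X[:, 3] - 10 * X[:, 1] + 5 * rng.standard_normal(300)  # toy SOC, Mg/ha

      rf = RandomForestRegressor(n_estimators=300, oob_score=True, random_state=0)
      rf.fit(X, soc)
      print("OOB R^2:", round(rf.oob_score_, 2))

      # Regression-kriging step (sketched only): fit an ordinary-kriging model to
      # these residuals and add the interpolated residual to rf.predict(X_new).
      residuals = soc - rf.oob_prediction_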

  11. GraphReduce: Processing Large-Scale Graphs on Accelerator-Based Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Dipanjan; Song, Shuaiwen; Agarwal, Kapil

    2015-11-15

    Recent work on real-world graph analytics has sought to leverage the massive amount of parallelism offered by GPU devices, but challenges remain due to the inherent irregularity of graph algorithms and limitations in GPU-resident memory for storing large graphs. We present GraphReduce, a highly efficient and scalable GPU-based framework that operates on graphs that exceed the device's internal memory capacity. GraphReduce adopts a combination of edge- and vertex-centric implementations of the Gather-Apply-Scatter programming model and operates on multiple asynchronous GPU streams to fully exploit the high degrees of parallelism in GPUs, with efficient graph data movement between the host and device.
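
    The Gather-Apply-Scatter (GAS) model referenced above is easy to show on a small example. The sketch below expresses PageRank in GAS terms in plain Python; it illustrates the programming model only, not GraphReduce's GPU streaming or out-of-core machinery.

      import numpy as np

      def gas_pagerank(edges, n, d=0.85, iters=30):
          """PageRank in GAS form: gather neighbor contributions along in-edges,
          apply the rank update per vertex, scatter rank/out-degree along out-edges."""
          out_deg = np.bincount([u for u, _ in edges], minlength=n)
          rank = np.full(n, 1.0 / n)
          for _ in range(iters):
              gathered = np.zeros(n)
              for u, v in edges:                      # gather + scatter over edges
                  gathered[v] += rank[u] / out_deg[u]
              rank = (1 - d) / n + d * gathered       # apply, per vertex
          return rank

      edges = [(0, 1), (1, 2), (2, 0), (2, 1)]        # toy graph; every node has out-edges
      print(np.round(gas_pagerank(edges, 3), 3))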

  12. Computer aided systems human engineering: A hypermedia tool

    NASA Technical Reports Server (NTRS)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  13. The NASA Langley building solar project and the supporting Lewis solar technology program

    NASA Technical Reports Server (NTRS)

    Ragsdale, R. G.; Namkoong, D.

    1974-01-01

    A solar energy technology program is described that includes solar collector testing in an indoor solar simulator facility and in an outdoor test facility, property measurements of solar panel coatings, and operation of a laboratory-scale solar model system test facility. Early results from simulator tests indicate that non-selective coatings behave more nearly in accord with predicted performance than do selective coatings. Initial experiments on the decay rate of thermally stratified hot water in a storage tank have been run. Results suggest that where high temperature water is required, excess solar energy collected by a building solar system should be stored overnight in the form of chilled water rather than hot water.

  14. Using selection bias to explain the observed structure of Internet diffusions

    PubMed Central

    Golub, Benjamin; Jackson, Matthew O.

    2010-01-01

    Recently, large datasets stored on the Internet have enabled the analysis of processes, such as large-scale diffusions of information, at new levels of detail. In a recent study, Liben-Nowell and Kleinberg [(2008) Proc Natl Acad Sci USA 105:4633–4638] observed that the flow of information on the Internet exhibits surprising patterns whereby a chain letter reaches its typical recipient through long paths of hundreds of intermediaries. We show that a basic Galton–Watson epidemic model combined with the selection bias of observing only large diffusions suffices to explain these patterns. Thus, selection biases of which data we observe can radically change the estimation of classical diffusion processes. PMID:20534439
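
    The selection-bias argument can be reproduced in a few lines: simulate many near-critical Galton-Watson trees, then condition on observing only the large ones, whose depths are systematically long. The offspring distribution, mean, and size cutoff below are illustrative choices, not the paper's calibrated values.

      import numpy as np
      rng = np.random.default_rng(0)

      def gw_tree(mean_offspring=0.99, cap=100_000):
          """One Galton-Watson epidemic with Poisson offspring; returns
          (total nodes, maximum depth), truncating runaway trees at cap."""
          size, depth, frontier = 1, 0, 1
          while frontier and size < cap:
              frontier = rng.poisson(mean_offspring, frontier).sum()
              size += frontier
              if frontier:
                  depth += 1
          return size, depth

      runs = [gw_tree() for _ in range(20_000)]
      big = [d for s, d in runs if s >= 200]   # condition on seeing a large diffusion
      print("median depth overall:", int(np.median([d for _, d in runs])))
      print("median depth | large:", int(np.median(big)) if big else "none")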

  15. Development and validation of the Italian version of the Mobile Application Rating Scale and its generalisability to apps targeting primary prevention.

    PubMed

    Domnich, Alexander; Arata, Lucia; Amicizia, Daniela; Signori, Alessio; Patrick, Bernard; Stoyanov, Stoyan; Hides, Leanne; Gasparini, Roberto; Panatto, Donatella

    2016-07-07

    A growing body of literature affirms the usefulness of mobile technologies, including mobile applications (apps), in the primary prevention field. The quality of health apps, which today number in the thousands, is a crucial parameter, as it may affect health-related decision-making and outcomes among app end-users. The Mobile Application Rating Scale (MARS) has recently been developed to evaluate the quality of such apps and has shown good psychometric properties. Since there is no standardised tool for assessing the apps available in Italian app stores, the present study developed and validated an Italian version of MARS on apps targeting primary prevention. The original 23-item version of MARS assesses mobile app quality along four objective quality dimensions (engagement, functionality, aesthetics, information) and one subjective dimension. Validation of this tool involved several steps; the universalist approach to achieving equivalence was adopted. Following two backward translations, a reconciled Italian version of MARS was produced and compared with the original scale. On the basis of a sample-size estimation, 48 apps from three major app stores were downloaded; the first 5 were used for piloting, while the remaining 43 were used in the main study to assess the psychometric properties of the scale. The apps were assessed by two raters, each working independently. The psychometric properties of the final version of the scale were assessed, including inter-rater reliability, internal consistency, and convergent, divergent and concurrent validity. The intralingual equivalence of the Italian version of MARS was confirmed by the authors of the original scale. A total of 43 apps targeting primary prevention were tested. The MARS displayed acceptable psychometric properties. The MARS total score showed an excellent level of both inter-rater agreement (intra-class correlation coefficient of .96) and internal consistency (Cronbach's α of .90 and .91 for the two raters, respectively). Other types of validity, including convergent, divergent, discriminative, known-groups and scalability, were also established. The Italian version of MARS is a valid and reliable tool for assessing the health-related primary prevention apps available in Italian app stores.
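
    The two reliability statistics reported above are standard and compact to compute. The sketch below implements Cronbach's alpha and a two-way agreement ICC(2,1) from the usual ANOVA decomposition; the synthetic scores are invented for illustration, and the study's exact ICC variant may differ.

      import numpy as np

      def cronbach_alpha(items):
          """items: (n_apps, n_items) matrix of one rater's MARS item scores."""
          k = items.shape[1]
          return k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                                / items.sum(axis=1).var(ddof=1))

      def icc_2_1(scores):
          """Two-way random-effects, absolute-agreement, single-measure ICC;
          scores: (n_apps, n_raters) matrix of total scores."""
          n, k = scores.shape
          grand = scores.mean()
          ss_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum()
          ss_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum()
          ss_tot = ((scores - grand) ** 2).sum()
          ms_r = ss_rows / (n - 1)
          ms_c = ss_cols / (k - 1)
          ms_e = (ss_tot - ss_rows - ss_cols) / ((n - 1) * (k - 1))
          return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

      rng = np.random.default_rng(0)
      true_q = rng.uniform(2, 5, size=43)                        # latent app quality
      items_r1 = true_q[:, None] + rng.normal(0, 0.4, (43, 23))  # rater 1 item scores
      totals = np.column_stack([items_r1.mean(axis=1),
                                true_q + rng.normal(0, 0.2, 43)])
      print(round(cronbach_alpha(items_r1), 2), round(icc_2_1(totals), 2))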

  16. A 3D Full-Stokes Calving Model Applied to a West Greenland Outlet Glacier

    NASA Astrophysics Data System (ADS)

    Todd, Joe; Christoffersen, Poul; Zwinger, Thomas; Råback, Peter; Chauché, Nolwenn; Hubbard, Alun; Toberg, Nick; Luckman, Adrian; Benn, Doug; Slater, Donald; Cowton, Tom

    2017-04-01

    Iceberg calving from outlet glaciers accounts for around half of all mass loss from both the Greenland and Antarctic ice sheets. The diverse nature of calving and its complex links to both internal dynamics and external climate make it challenging to incorporate into models of glaciers and ice sheets. Consequently, calving represents one of the most significant uncertainties in predictions of future sea level rise. Here, we present results from a new 3D full-Stokes calving model developed in Elmer/Ice and applied to Store Glacier, the second largest outlet glacier in West Greenland. The calving model implements the crevasse depth criterion, which states that calving occurs when surface and basal crevasses penetrate the full thickness of the glacier. The model also implements a new 3D rediscretization approach and a time-evolution scheme that allow the calving front to evolve realistically through time. We use the model to test Store's sensitivity to two seasonal environmental processes believed to significantly influence calving: submarine melt undercutting and ice mélange buttressing. Store Glacier discharges 13.9 km³ of ice annually, and this calving rate shows a strong seasonal trend. We aim to reproduce this seasonal trend by forcing the model with present-day levels of submarine melting and ice mélange buttressing. Sensitivity to changes in these frontal processes was also investigated by forcing the model with (a) increased submarine melt rates acting over longer periods of time and (b) decreased mélange buttressing force acting over a reduced period. The model displays a range of observed calving behaviour and provides a good match to the observed seasonal evolution of Store's terminus. The results indicate that ice mélange is the primary driver of the observed seasonal advance of the terminus and the associated seasonal variation in calving rate. The model also demonstrates a significant influence of submarine melting on calving rate. The results further highlight the importance of topographic setting: Store Glacier terminates on a large bedrock sill, and this was found to exert a first-order control on calving rate, explaining Store Glacier's comparative stability during a period when many Greenland outlet glaciers underwent concurrent retreat.

  17. A Computational Workflow for the Automated Generation of Models of Genetic Designs.

    PubMed

    Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil

    2018-06-05

    Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model-construction approaches, computational methodologies to bridge the knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for the automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Synthetic Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy-to-use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.

  18. Using a computational model to quantify the potential impact of changing the placement of healthy beverages in stores as an intervention to "Nudge" adolescent behavior choice.

    PubMed

    Wong, Michelle S; Nau, Claudia; Kharmats, Anna Yevgenyevna; Vedovato, Gabriela Milhassi; Cheskin, Lawrence J; Gittelsohn, Joel; Lee, Bruce Y

    2015-12-23

    Product placement influences consumer choices in retail stores. While sugar-sweetened beverage (SSB) manufacturers expend considerable effort and resources to determine how product placement may increase SSB purchases, the information is proprietary and not available to the public health and research community. This study aims to quantify the effect of non-SSB product placement in corner stores on adolescent beverage purchasing behavior. Corner stores are small privately owned retail stores that are important beverage providers in low-income neighborhoods--where adolescents have higher rates of obesity. Using data from a community-based survey in Baltimore and parameters from the marketing literature, we developed a decision-analytic model to simulate and quantify how the placement of healthy beverages (placement in the beverage cooler closest to the entrance, distance from the back of the store, and vertical placement within each cooler) affects the probability of adolescents purchasing non-SSBs. In our simulation, non-SSB purchases were 2.8 times higher when placed in the "optimal location"--on the second or third shelves of the front cooler--compared to the worst location, on the bottom shelf of the cooler farthest from the entrance. Based on our model results and survey data, we project that moving non-SSBs from the worst to the optimal location would result in approximately 5.2 million more non-SSBs purchased by Baltimore adolescents annually. Our study is the first to quantify the potential impact of changing the placement of beverages in corner stores. Our findings suggest that this could be a low-cost yet impactful strategy to nudge this population--highly susceptible to obesity--towards healthier beverage decisions.
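
    The shape of such a decision-analytic model can be sketched in a few lines: a baseline purchase odds is scaled by placement multipliers and converted back to a probability. The multipliers and baseline below are invented for illustration, not the study's fitted parameters (the study reports purchases 2.8 times higher in the optimal location).

      import numpy as np

      # Assumed placement multipliers on the odds of a healthy-beverage purchase.
      COOLER_POS = {"front": 1.0, "middle": 0.7, "back": 0.5}
      SHELF = {"eye_level": 1.0, "top": 0.8, "bottom": 0.55}

      def purchase_prob(base_odds, cooler, shelf):
          """Convert baseline odds through placement multipliers to a probability."""
          odds = base_odds * COOLER_POS[cooler] * SHELF[shelf]
          return odds / (1 + odds)

      best = purchase_prob(0.25, "front", "eye_level")
      worst = purchase_prob(0.25, "back", "bottom")
      print(f"best {best:.3f}  worst {worst:.3f}  ratio {best / worst:.1f}x")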

  19. Tyrosine kinases activate store-mediated Ca2+ entry in human platelets through the reorganization of the actin cytoskeleton.

    PubMed Central

    Rosado, J A; Graves, D; Sage, S O

    2000-01-01

    We have recently reported that store-mediated Ca(2+) entry in platelets is likely to be mediated by a reversible trafficking and coupling of the endoplasmic reticulum with the plasma membrane, a model termed 'secretion-like coupling'. In this model the actin cytoskeleton plays a key regulatory role. Since tyrosine kinases have been shown to be important for Ca(2+) entry in platelets and other cells, we have now investigated the possible involvement of tyrosine kinases in the secretion-like-coupling model. Treatment of platelets with thrombin or thapsigargin induced actin polymerization by a calcium-independent pathway. Methyl 2,5-dihydroxycinnamate, a tyrosine kinase inhibitor, prevented thrombin- or thapsigargin-induced actin polymerization. The effects of tyrosine kinases in store-mediated Ca(2+) entry were found to be entirely dependent on the actin cytoskeleton. PP1, an inhibitor of the Src family of proteins, partially inhibited store-mediated Ca(2+) entry. In addition, depletion of intracellular Ca(2+) stores stimulated cytoskeletal association of the cytoplasmic tyrosine kinase pp60(src), a process that was sensitive to treatment with cytochalasin D and PP1, but not to inhibition of Ras proteins using prenylcysteine analogues. Finally, combined inhibition of both Ras proteins and tyrosine kinases resulted in complete inhibition of Ca(2+) entry, suggesting that these two families of proteins have independent effects in the activation of store-mediated Ca(2+) entry in human platelets. PMID:11023829

  20. A Hierarchical Visualization Analysis Model of Power Big Data

    NASA Astrophysics Data System (ADS)

    Li, Yongjie; Wang, Zheng; Hao, Yang

    2018-01-01

    Based on the concept of integrating VR scenes with power big data analysis, a hierarchical visualization analysis model for power big data is proposed, in which the levels are designed around different abstraction modules: transaction, engine, computation, control and storage. The traditionally separate modules for power data storage, data mining and analysis, and data visualization are integrated into one platform by this model. It provides a visual analysis solution for power big data.

  1. Review of the Scientific Understanding of Radioactive Waste at the U.S. DOE Hanford Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Reid A.; Buck, Edgar C.; Chun, Jaehun

    This paper reviews the origin and the chemical and rheological complexity of radioactive waste at the U.S. Department of Energy's Hanford Site. The waste, stored in underground tanks, was generated via three distinct processes over decades of plutonium extraction operations. Although close records were kept of original waste disposition, tank-to-tank transfers and conditions that impede equilibrium complicate our understanding of the chemistry, phase composition, and rheology of the waste. Tank waste slurries comprise particles and aggregates from nano to micron scales, with varying densities, morphologies, heterogeneous compositions, and complicated responses to flow regimes and process conditions. Further, remnant or changing radiation fields may affect the stability and rheology of the waste. These conditions pose challenges for transport through conduits or pipes to treatment plants for vitrification. Additionally, recalcitrant boehmite degrades glass quality and must be reduced prior to vitrification, but dissolves much more slowly than predicted given surface-normalized rates. Existing empirical models based on ex situ experiments and observations lack true predictive capabilities. Recent advances in in situ microscopy, aberration-corrected TEM, theoretical modeling across scales, and experimental methods for probing the physics and chemistry at mineral-fluid and mineral-mineral interfaces are being implemented to build robustly predictive physics-based models.

  2. Radiation preservation and test marketing of fruits and vegetables

    NASA Astrophysics Data System (ADS)

    Zhicheng, Xu; Dong, Cai; Fuying, He; Deyao, Zhao

    1993-07-01

    To develop the technology for radiation preservation of fruits and vegetables, many varieties of fruits and vegetables were studied. The results showed that low-dose irradiation is useful for the preservation of fruits and vegetables. On the basis of this research, 1900 tons of garlic, 950 tons of onions, 500 tons of potatoes, 710 tons of apples and 1000 kg of litchi were irradiated at commercial scale. Quality control standards for irradiated garlic, onion and potato were established and used for commercial-scale irradiation. In order to collect consumers' in-store responses to irradiated foods, a special counter was set up for selling irradiated apples on Nan Jing Road (W), Shanghai. A total of 634 consumer in-store response investigation forms were returned and analysed. The results showed that when consumers understand the benefits of irradiation preservation, such as higher quality, greater safety, longer shelf-life, wider product availability, and good value for the price, they will willingly buy irradiated food.

  3. Availability, quality and price of produce in low-income neighbourhood food stores in California raise equity issues.

    PubMed

    Gosliner, Wendi; Brown, Daniel M; Sun, Betty C; Woodward-Lopez, Gail; Crawford, Patricia B

    2018-06-01

    To assess produce availability, quality and price in a large sample of food stores in low-income neighbourhoods in California. Cross-sectional statewide survey. Between 2011 and 2015, local health departments assessed store type, WIC (Supplemental Nutrition Program for Women, Infants, and Children)/SNAP (Supplemental Nutrition Assistance Program) participation, produce availability, quality and price of selected items in stores in low-income neighbourhoods. Secondary data provided reference chain supermarket produce prices matched by county and month. t Tests and ANOVA examined differences by store type; regression models examined factors associated with price. Large grocery stores (n 231), small markets (n 621) and convenience stores (n 622) in 225 neighbourhoods. Produce in most large groceries was rated high quality (97 % of fruits, 98 % of vegetables), but not in convenience stores (25 % fruits, 14 % vegetables). Small markets and convenience stores participating in WIC and/or SNAP had better produce availability, variety and quality than non-participating stores. Produce prices across store types were, on average, higher than reference prices from matched chain supermarkets (27 % higher in large groceries, 37 % higher in small markets, 102 % higher in convenience stores). Price was significantly inversely associated with produce variety, adjusting for quality, store type, and SNAP and WIC participation. The study finds that fresh produce is more expensive in low-income neighbourhoods and that convenience stores offer more expensive, poorer-quality produce than other stores. Variety is associated with price and most limited in convenience stores, suggesting more work is needed to determine how convenience stores can provide low-income consumers with access to affordable, high-quality produce. WIC and SNAP can contribute to the solution.
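
    As an illustration of the kind of regression reported above (price modeled on produce variety, adjusting for quality, store type and WIC/SNAP participation), here is a hedged Python sketch on synthetic data; the column names and coefficients are invented and do not reproduce the study's estimates.

    ```python
    # Hedged sketch of a price-on-variety regression; data are synthetic.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 300
    df = pd.DataFrame({
        "variety": rng.integers(1, 30, n),    # distinct produce items offered
        "quality": rng.integers(1, 5, n),     # 1 = poor .. 4 = high
        "store_type": rng.choice(["large", "small", "convenience"], n),
        "wic_snap": rng.integers(0, 2, n),    # programme participation flag
    })
    # Synthetic price: higher variety -> lower price, as the study reports.
    df["price"] = (3.0 - 0.04 * df["variety"] + 0.1 * df["quality"]
                   + rng.normal(0, 0.3, n))

    model = smf.ols("price ~ variety + quality + C(store_type) + wic_snap",
                    data=df).fit()
    print(model.params["variety"])  # expected to be negative here
    ```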

  4. Racial Bias in the Manager-Employee Relationship: An Analysis of Quits, Dismissals, and Promotions at a Large Retail Firm

    ERIC Educational Resources Information Center

    Giuliano, Laura; Levine, David I.; Leonard, Jonathan

    2011-01-01

    Using data from a large U.S. retail firm, we examine how racial matches between managers and their employees affect rates of employee quits, dismissals, and promotions. We exploit changes in management at hundreds of stores to estimate hazard models with store fixed effects that control for all unobserved differences across store locations. We…

  5. Effect of conventional and square stores on the longitudinal aerodynamic characteristics of a fighter aircraft model at supersonic speeds [in the Langley Unitary Plan Wind Tunnel]

    NASA Technical Reports Server (NTRS)

    Monta, W. J.

    1980-01-01

    The effects of conventional and square stores on the longitudinal aerodynamic characteristics of a fighter aircraft configuration at Mach numbers of 1.6, 1.8, and 2.0 were investigated. Five conventional store configurations and six arrangements of a square store configuration were studied. All store configurations produced small, positive increments in pitching moment throughout the angle-of-attack range, but the configuration with area-ruled wing tanks also showed a slight decrease in stability at the higher angles of attack. The addition of the stores caused some small changes in lift coefficient, causing the drag increment to vary with lift coefficient; as a result, there were corresponding changes in the increments of the maximum lift-drag ratios. The store drag coefficient based on the cross-sectional area of the stores ranged from a maximum of 1.1 for the configuration with three Maverick missiles to a minimum of about 0.40 for the two MK-84 bombs and the arrangements with four square stores touching or two square stores in tandem. Square stores located side by side yielded about 0.50 in the aft position compared with 0.74 in the forward position.
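
    The store drag coefficient quoted above is referenced to the store cross-sectional area, C_D = D / (q A), with q the freestream dynamic pressure (q = 0.5 gamma p M^2 for a compressible freestream). A small Python sketch of this bookkeeping, with invented numbers:

    ```python
    # Back-of-envelope drag-coefficient bookkeeping; values are illustrative.

    def store_drag_coefficient(drag_n, q_pa, area_m2):
        """Drag coefficient referenced to store cross-sectional area."""
        return drag_n / (q_pa * area_m2)

    def dynamic_pressure(p_pa, mach, gamma=1.4):
        """Compressible freestream dynamic pressure, q = 0.5 * gamma * p * M^2."""
        return 0.5 * gamma * p_pa * mach**2

    q = dynamic_pressure(p_pa=20_000.0, mach=1.8)  # hypothetical tunnel static pressure
    print(store_drag_coefficient(drag_n=50.0, q_pa=q, area_m2=0.002))  # ~0.55
    ```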

  6. The missing path to gain customers loyalty in pharmacy retail: The role of the store in developing satisfaction and trust.

    PubMed

    Castaldo, Sandro; Grosso, Monica; Mallarini, Erika; Rindone, Marco

    2016-01-01

    Community pharmacies have experienced increased competition, both among themselves and from new entrants to the sector such as grocery retailers. Applying retail marketing strategies aimed at developing store loyalty may be an appropriate strategic path for pharmacies wanting to compete in this new arena. This study aimed to develop and test a two-step model to identify the determinants of store loyalty for community pharmacies in Italy. Based on the retail literature, qualitative research was conducted to identify key variables determining loyalty to community pharmacies. The model was then tested by means of a phone survey, from which a total of 735 usable questionnaires were collected. The study highlights the key role of the relationship between pharmacists and their customers in the loyalty-building path: trust in pharmacists is the first driver of satisfaction and a direct and indirect (through satisfaction) driver of trust in pharmacies, which leads to store loyalty. Retail-level levers, such as the store environment, assortment, and communication, influence trust in pharmacies. This model is a first step toward investigating loyalty-building by applying the retail management literature's concepts to the community pharmacy sector. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. The role of climate on inter-annual variation in stream nitrate fluxes and concentrations.

    PubMed

    Gascuel-Odoux, Chantal; Aurousseau, Pierre; Durand, Patrick; Ruiz, Laurent; Molenat, Jérôme

    2010-11-01

    In recent decades, temporal variations in nitrate fluxes and concentrations in temperate rivers have resulted from the interaction of anthropogenic and climatic factors. The effect of climatic drivers remains unclear, while the relative importance of the drivers seems to be highly site dependent. This paper focuses on 2-6 year variations, called meso-scale variations, and analyses the climatic drivers of these variations in a study site characterized by high N inputs from intensive animal farming systems and shallow aquifers with impervious bedrock in a temperate climate. Three approaches are developed: 1) an analysis of long-term records of nitrate fluxes and concentrations in 30 coastal rivers of Western France, which show well-marked meso-scale cycles in fluxes and concentrations with a slight hysteresis; 2) a test of climatic control using a lumped two-box model, which demonstrates that hydrological assumptions are sufficient to explain these meso-scale cycles; and 3) a model of nitrate fluxes and concentrations in two contrasting catchments subject to recent mitigation measures, which analyses nitrate fluxes and concentrations in relation to N stored in groundwater. In coastal rivers, hydrological drivers (i.e., effective rainfall), and particularly the dynamics of the water table combined with rather stable nitrate concentrations, explain the meso-scale cyclic patterns. In the headwater catchment, agricultural and hydrological drivers can interact according to their settings. The requirements for better distinguishing the effects of climate and human changes in integrated water management are addressed: long-term monitoring, and coupled analysis and modelling of large sets of catchments incorporating different sizes, land uses and environmental factors. Copyright © 2009 Elsevier B.V. All rights reserved.

  8. Soil moisture at local scale: Measurements and simulations

    NASA Astrophysics Data System (ADS)

    Romano, Nunzio

    2014-08-01

    Soil moisture refers to the water present in the uppermost part of a field soil and is a state variable controlling a wide array of ecological, hydrological, geotechnical, and meteorological processes. The literature on soil moisture is very extensive and is developing so rapidly that it might be considered ambitious to seek to present the state of the art concerning research into this key variable; even when covering only one aspect of the problem, there is a risk of some inevitable omission. A specific feature of the present essay, which may make this overview, if not comprehensive, at least of particular interest, is that the reader is guided through the various traditional and more up-to-date methods by the central thread of techniques developed to measure soil moisture, interwoven with applications of modeling tools that exploit the observed datasets. This paper restricts its analysis to the evolution of soil moisture at the local (spatial) scale. Though a somewhat loosely defined term, the local scale is linked here to a characteristic length of the soil volume investigated by the soil moisture sensing probe. After presenting the most common concepts and definitions concerning the amount of water stored in a certain volume of soil close to the land surface, this paper reviews ground-based methods for monitoring soil moisture and evaluates modeling tools for the analysis of the gathered information in various applications. Concluding remarks address questions of monitoring and modeling of soil moisture at scales larger than the local scale, with the related issue of data aggregation. An extensive, but not exhaustive, list of references is provided, enabling the reader to gain further insights into this subject.

  9. Association between store food environment and customer purchases in small grocery stores, gas-marts, pharmacies and dollar stores.

    PubMed

    Caspi, Caitlin E; Lenk, Kathleen; Pelletier, Jennifer E; Barnes, Timothy L; Harnack, Lisa; Erickson, Darin J; Laska, Melissa N

    2017-06-05

    Purchases at small/non-traditional food stores tend to have poor nutritional quality and have been associated with poor health outcomes, including increased obesity risk. The purpose of this study was to examine whether customers who shop at small/non-traditional food stores with more health-promoting features make healthier purchases. In a cross-sectional design, data collectors assessed store features in a sample of 99 small and non-traditional food stores not participating in the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) in Minneapolis/St. Paul, MN in 2014. Customer intercept interviews (n = 594) collected purchase data from a bag check and demographics from a survey. Store measures included fruit/vegetable and whole grain availability, an overall Healthy Food Supply Score (HFSS), healthy food advertisements and in-store placement, and shelf space of key items. Customer nutritional measures were analyzed using the Nutrition Data System for Research (NDSR), and included the purchase of ≥1 serving of fruits/vegetables; ≥1 serving of whole grains; and overall Healthy Eating Index-2010 (HEI-2010) score for foods/beverages purchased. Associations between store and customer measures were estimated in multilevel linear and logistic regression models, controlling for customer characteristics and store type. Few customers purchased fruits and vegetables (8%) or whole grains (8%). In fully adjusted models, purchase HEI-2010 scores were associated with fruit/vegetable shelf space (p = 0.002) and the ratio of shelf space devoted to healthy vs. less healthy items (p = 0.0002). Offering ≥14 varieties of fruits/vegetables was associated with produce purchases (OR 3.9, 95% CI 1.2-12.3), as was having produce visible from the store entrance (OR 2.3, 95% CI 1.0-5.8), but whole grain availability measures were not associated with whole grain purchases. Strategies addressing both customer demand and the availability of healthy food may be necessary to improve customer purchases. ClinicalTrials.gov: NCT02774330. Registered May 4, 2016 (retrospectively registered).
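
    A hedged sketch of the purchase model described above, on synthetic data: a logistic regression of produce purchase on store features, with odds ratios recovered as exp(coefficient). The real analysis used multilevel models nesting customers within stores; that detail is omitted here.

    ```python
    # Synthetic logistic-regression sketch; variable names are invented.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 594
    df = pd.DataFrame({
        "wide_variety": rng.integers(0, 2, n),     # >=14 produce varieties
        "produce_visible": rng.integers(0, 2, n),  # visible from entrance
    })
    logit_p = -2.5 + 1.3 * df["wide_variety"] + 0.8 * df["produce_visible"]
    df["bought_produce"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

    fit = smf.logit("bought_produce ~ wide_variety + produce_visible", df).fit(disp=0)
    print(np.exp(fit.params))  # odds ratios
    ```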

  10. A Model to Operate an On-Campus Retail Store for Workplace Experiential Learning

    ERIC Educational Resources Information Center

    Truman, Kiru; Mason, Roger B.; Venter, Petrus

    2017-01-01

    Many retailers argue that university students do not have the practical experience and skills required in the workplace when graduating. This paper reports on research undertaken to address this issue and to identify a model to guide development and implementation of a retail store, on a university campus, to be used for work-integrated learning.…

  11. C++ and data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naumann, Axel; /CERN; Canal, Philippe

    2008-01-01

    High performance computing with a large code base and C++ has proved to be a good combination. But when it comes to storing data, C++ is a problematic choice: it offers no support for serialization, type definitions are amazingly complex to parse, and the dependency analysis (what does object A need to be stored?) is incredibly difficult. Nevertheless, the LHC data consists of C++ objects that are serialized with help from ROOT's reflection database and interpreter CINT. The fact that we can do it on that scale, and the performance with which we do it, makes this approach unique and stirs interest even outside HEP. I will show how CINT collects and stores information about C++ types, what the current major challenges are (dictionary size), and what CINT and ROOT have done and plan to do about it.
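
    A minimal sketch of ROOT's dictionary-based object serialization, driven from Python rather than C++ for brevity (assumes a ROOT installation; the file and histogram names are arbitrary):

    ```python
    # Write and read back a ROOT object; the streaming relies on the same
    # reflection dictionaries the abstract describes.
    import ROOT

    f = ROOT.TFile.Open("demo.root", "RECREATE")
    h = ROOT.TH1F("h_demo", "demo histogram", 100, -3.0, 3.0)
    h.FillRandom("gaus", 10000)   # fill from a standard Gaussian
    h.Write()                     # streamed to disk via ROOT's dictionaries
    f.Close()

    f = ROOT.TFile.Open("demo.root")
    h_back = f.Get("h_demo")      # reconstructed as a full C++ object
    print(h_back.GetEntries())    # 10000.0
    f.Close()
    ```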

  12. Djeen (Database for Joomla!'s Extensible Engine): a research information management system for flexible multi-technology project administration.

    PubMed

    Stahl, Olivier; Duvergey, Hugo; Guille, Arnaud; Blondin, Fanny; Vecchio, Alexandre Del; Finetti, Pascal; Granjeaud, Samuel; Vigy, Oana; Bidaut, Ghislain

    2013-06-06

    With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. We developed Djeen (Database for Joomla!'s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application, designed to streamline data storage and annotation collaboratively. Its database model, kept simple, is compliant with most technologies and allows storing and managing heterogeneous data within the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy and user permissions are finely grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material.

  13. Bigger is better: Improved nature conservation and economic returns from landscape-level mitigation.

    PubMed

    Kennedy, Christina M; Miteva, Daniela A; Baumgarten, Leandro; Hawthorne, Peter L; Sochi, Kei; Polasky, Stephen; Oakleaf, James R; Uhlhorn, Elizabeth M; Kiesecker, Joseph

    2016-07-01

    Impact mitigation is a primary mechanism on which countries rely to reduce environmental externalities and balance development with conservation. Mitigation policies are transitioning from traditional project-by-project planning to landscape-level planning. Although this larger-scale approach is expected to provide greater conservation benefits at the lowest cost, empirical justification is still scarce. Using commercial sugarcane expansion in the Brazilian Cerrado as a case study, we apply economic and biophysical steady-state models to quantify the benefits of the Brazilian Forest Code (FC) under landscape- and property-level planning. We find that FC compliance imposes small costs to business but can generate significant long-term benefits to nature: supporting 32 (±37) additional species (largely habitat specialists), storing 593,000 to 2,280,000 additional tons of carbon worth $69 million to $265 million ($ pertains to U.S. dollars), and marginally improving surface water quality. Relative to property-level compliance, we find that landscape-level compliance reduces total business costs by $19 million to $35 million per 6-year sugarcane growing cycle while often supporting more species and storing more carbon. Our results demonstrate that landscape-level mitigation provides cost-effective conservation and can be used to promote sustainable development.

  14. Microstructure and Property Modifications of Cold Rolled IF Steel by Local Laser Annealing

    NASA Astrophysics Data System (ADS)

    Hallberg, Håkan; Adamski, Frédéric; Baïz, Sarah; Castelnau, Olivier

    2017-10-01

    Laser annealing experiments are performed on cold rolled IF steel whereby highly localized microstructure and property modification are achieved. The microstructure is seen to develop by strongly heterogeneous recrystallization to provide steep gradients, across the submillimeter scale, of grain size and crystallographic texture. Hardness mapping by microindentation is used to reveal the corresponding gradients in macroscopic properties. A 2D level set model of the microstructure development is established as a tool to further optimize the method and to investigate, for example, the development of grain size variations due to the strong and transient thermal gradient. Particular focus is given to the evolution of the beneficial γ-fiber texture during laser annealing. The simulations indicate that the influence of selective growth based on anisotropic grain boundary properties only has a minor effect on texture evolution compared to heterogeneous stored energy, temperature variations, and nucleation conditions. It is also shown that although the α-fiber has an initial frequency advantage, the higher probability of γ-nucleation, in combination with a higher stored energy driving force in this fiber, promotes a stronger presence of the γ-fiber as also observed in experiments.

  15. Uniformity of cylindrical imploding underwater shockwaves at very small radii

    NASA Astrophysics Data System (ADS)

    Yanuka, D.; Rososhek, A.; Bland, S. N.; Krasik, Ya. E.

    2017-11-01

    We compare the convergent shockwaves generated from underwater, cylindrical arrays of copper wire exploded by multiple kilo-ampere current pulses on nanosecond and microsecond scales. In both cases, the pulsed power devices used for the experiments had the same stored energy (~500 J) and the wire mass was adjusted to optimize energy transfer to the shockwave. Laser-backlit framing images of the shock front were achieved down to a radius of 30 μm. It was found that even in the case of initial azimuthal non-symmetry, the shock wave self-repairs in the final stages of its motion, leading to a highly uniform implosion. In both these and previous experiments, interference fringes have been observed in streak and framing images as the shockwave approached the axis. We have been able to accurately model the origin of the fringes, which is due to the propagation of the laser beam diffracting off the uniform converging shock front. The dynamics of the shockwave and its uniformity at small radii indicate that even with only 500 J of stored energy, this technique should produce pressures above 10^10 Pa on the axis, with temperatures and densities ideal for warm dense matter research.

  16. Djeen (Database for Joomla!’s Extensible Engine): a research information management system for flexible multi-technology project administration

    PubMed Central

    2013-01-01

    Background With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. Findings We developed Djeen (Database for Joomla!’s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application, designed to streamline data storage and annotation collaboratively. Its database model, kept simple, is compliant with most technologies and allows storing and managing heterogeneous data within the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Conclusion Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy and user permissions are finely grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material. PMID:23742665

  17. Designing for Peta-Scale in the LSST Database

    NASA Astrophysics Data System (ADS)

    Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.

    2007-10-01

    The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. To do this in real-time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements, horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables that contain many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.

  18. Long-term evolution of the force-free twisted magnetosphere of a magnetar

    NASA Astrophysics Data System (ADS)

    Akgün, T.; Cerdá-Durán, P.; Miralles, J. A.; Pons, J. A.

    2017-12-01

    We study the long-term quasi-steady evolution of the force-free magnetosphere of a magnetar coupled to its internal magnetic field. We find that magnetospheric currents can be maintained on long time-scales of the order of thousands of years. Meanwhile, the energy, helicity and twist stored in the magnetosphere all gradually increase over the course of this evolution, until a critical point is reached, beyond which a force-free magnetosphere cannot be constructed. At this point, some large-scale magnetospheric rearrangement, possibly resulting in an outburst or a flare, must occur, releasing a large fraction of the stored energy, helicity and twist. After that, the quasi-steady evolution should continue in a similar manner from the new initial conditions. The time-scale for reaching this critical point depends on the overall magnetic field strength and on the relative fraction of the toroidal field. The energy stored in the force-free magnetosphere is found to be up to ∼30 per cent larger than the corresponding vacuum energy. This implies that for a 10^14 G field at the pole, the energy budget available for fast magnetospheric events is of the order of a few 10^44 erg. The spin-down rate is estimated to increase by up to ∼60 per cent, since the dipole content in the magnetosphere is enhanced by the currents present there. A rough estimate of the braking index n reveals that it is systematically n < 3 for most of the evolution, consistent with actual measurements for pulsars and early estimates for several magnetars.
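
    The quoted energy budget can be sanity-checked against the standard external vacuum dipole energy, E_vac = B_p^2 R^3 / 12 in Gaussian units; a back-of-envelope Python check, assuming a canonical 10 km stellar radius:

    ```python
    # Order-of-magnitude check of the energy budget quoted above (cgs units).
    B_p = 1e14        # polar field strength, G
    R = 1e6           # neutron-star radius, cm (a standard assumption)

    E_vac = B_p**2 * R**3 / 12.0   # vacuum dipole energy outside radius R
    E_excess = 0.30 * E_vac        # the ~30% surplus the paper reports
    print(f"E_vac    ~ {E_vac:.1e} erg")     # ~8e44 erg
    print(f"E_excess ~ {E_excess:.1e} erg")  # a few 1e44 erg, as quoted
    ```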

  19. Systems Modeling of Ca2+ Homeostasis and Mobilization in Platelets Mediated by IP3 and Store-Operated Ca2+ Entry

    PubMed Central

    Dolan, Andrew T.; Diamond, Scott L.

    2014-01-01

    Resting platelets maintain a stable level of low cytoplasmic calcium ([Ca2+]cyt) and high dense tubular system calcium ([Ca2+]dts). During thrombosis, activators cause a transient rise in inositol trisphosphate (IP3) to trigger calcium mobilization from stores and elevation of [Ca2+]cyt. Another major source of [Ca2+]cyt elevation is store-operated calcium entry (SOCE) through plasmalemmal calcium channels that open in response to store depletion as [Ca2+]dts drops. A 34-species systems model employed kinetics describing IP3-receptor, DTS-plasmalemma puncta formation, SOCE via assembly of STIM1 and Orai1, and the plasmalemma and sarco/endoplasmic reticulum Ca2+-ATPases. Four constraints were imposed: calcium homeostasis before activation; stable in zero extracellular calcium; IP3-activatable; and functional SOCE. Using a Monte Carlo method to sample three unknown parameters and nine initial concentrations in a 12-dimensional space near measured or expected values, we found that model configurations that were responsive to stimuli and demonstrated significant SOCE required high inner membrane electric potential (>−70 mV) and low resting IP3 concentrations. The absence of puncta in resting cells was required to prevent spontaneous store depletion in calcium-free media. Ten-fold increases in IP3 caused saturated calcium mobilization. This systems model represents a critical step in being able to predict platelets’ phenotypes during hemostasis or thrombosis. PMID:24806937
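
    A drastically simplified two-pool sketch of the mobilization scheme (the published model has 34 species): IP3-gated release from the dense tubular system into the cytosol, opposed by SERCA-like reuptake. All rate constants below are invented for illustration.

    ```python
    # Two-compartment toy model of Ca2+ mobilization; units are arbitrary.
    from scipy.integrate import solve_ivp

    def rhs(t, y, ip3):
        ca_cyt, ca_dts = y
        release = 5.0 * ip3 * (ca_dts - ca_cyt)   # IP3R-mediated release
        serca = 2.0 * ca_cyt                      # reuptake into the DTS
        return [release - serca, -(release - serca)]

    # Start with low cytosolic and high DTS calcium, then apply IP3 = 0.5.
    sol = solve_ivp(rhs, (0.0, 10.0), [0.1, 100.0], args=(0.5,), max_step=0.01)
    print(f"final [Ca]cyt ~ {sol.y[0, -1]:.1f} (arbitrary units)")
    ```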

  20. Downscaling of a global climate model for estimation of runoff, sediment yield and dam storage: A case study of Pirapama basin, Brazil

    NASA Astrophysics Data System (ADS)

    Braga, Ana Cláudia F. Medeiros; Silva, Richarde Marques da; Santos, Celso Augusto Guimarães; Galvão, Carlos de Oliveira; Nobre, Paulo

    2013-08-01

    The coastal zone of northeastern Brazil is characterized by intense human activity and large settlements, and also experiences high soil losses that can contribute to environmental damage. It is therefore necessary to build an integrated modeling-forecasting system for rainfall-runoff erosion, so that plans for water availability and sediment yield can be conceived and implemented. In this work, we present an evaluation of an integrated modeling system for a basin located in this region with relatively low predictability of seasonal rainfall and a small area (600 km^2). The National Centers for Environmental Prediction (NCEP) Regional Spectral Model (RSM), nested within the Center for Weather Forecasting and Climate Studies (CPTEC) Atmospheric General Circulation Model (AGCM), was investigated in this study, and both models are addressed in the simulation work. The rainfall analysis shows that: (1) the dynamic downscaling carried out by the regional RSM model approximates the frequency distribution of the daily observed data set, although errors were detected in magnitude and timing (anticipation of peaks, for example) at the daily scale; (2) an unbiased precipitation forecast seemed to be essential for use of the results in hydrological models; and (3) the information directly extracted from the global model may also be useful. The simulated runoff and reservoir-stored volumes are strongly linked to rainfall, and their estimation accuracy was significantly improved at the monthly scale, thus rendering the results useful for management purposes. The runoff-erosion forecasting displayed a large sediment yield that was consistent with the predicted rainfall.
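
    The abstract stresses that an unbiased precipitation forecast is essential before the hydrological step. One standard correction, not necessarily the one used in this study, is empirical quantile mapping: replace each forecast value with the observed value at the same quantile. A Python sketch on synthetic rainfall:

    ```python
    # Empirical quantile mapping for bias correction; data are synthetic.
    import numpy as np

    def quantile_map(forecast, obs_climatology, fcst_climatology):
        """Map forecast values onto the observed distribution."""
        ranks = np.searchsorted(np.sort(fcst_climatology), forecast)
        quantiles = np.clip(ranks / len(fcst_climatology), 0.0, 1.0)
        return np.quantile(obs_climatology, quantiles)

    rng = np.random.default_rng(2)
    obs = rng.gamma(2.0, 5.0, 1000)    # observed daily rainfall, mm
    fcst = rng.gamma(2.0, 7.0, 1000)   # model rainfall with a wet bias
    print(quantile_map(np.array([10.0, 30.0]), obs, fcst))
    ```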

  1. The actin cytoskeleton in store-mediated calcium entry

    PubMed Central

    Rosado, Juan A; Sage, Stewart O

    2000-01-01

    Store-mediated Ca2+ entry is the main pathway for Ca2+ influx in platelets and many other cells. Several hypotheses have considered both direct and indirect coupling mechanisms between the endoplasmic reticulum and the plasma membrane. Here we pay particular attention to new insights into the regulation of store-mediated Ca2+ entry: the role of the cytoskeleton in a secretion-like coupling model. In this model, Ca2+ entry may be mediated by a reversible trafficking and coupling of the endoplasmic reticulum with the plasma membrane, that shows close parallels to the events mediating secretion. As with secretion, the actin cytoskeleton plays an inhibitory role in the activation of Ca2+ entry by preventing the approach and coupling of the endoplasmic reticulum with the plasma membrane, making cytoskeletal remodelling a key event in the activation of Ca2+ entry. We also review recent advances investigating the regulation of store-mediated Ca2+ entry by small GTPases and phosphoinositides, which might be involved in the store-mediated Ca2+ entry pathway through roles in the remodelling of the cytoskeleton. PMID:10896713

  2. Enabling Object Storage via shims for Grid Middleware

    NASA Astrophysics Data System (ADS)

    Cadellin Skipsey, Samuel; De Witt, Shaun; Dewhurst, Alastair; Britton, David; Roy, Gareth; Crooks, David

    2015-12-01

    The Object Store model has quickly become the basis of most commercially successful mass storage infrastructure, backing so-called "Cloud" storage such as Amazon S3, but also underlying the implementation of most parallel distributed storage systems. Many of the assumptions in Object Store design are similar, but not identical, to concepts in the design of Grid Storage Elements, although the requirement for "POSIX-like" filesystem structures on top of SEs makes the disjunction seem larger. As modern Object Stores provide many features that most Grid SEs do not (block-level striping, parallel access, automatic file repair, etc.), it is of interest to see how easily we can provide interfaces to typical Object Stores via plugins and shims for Grid tools, and how well experiments can adapt their data models to them. We present evaluations of, and first-deployment experiences with, for example, Xrootd-Ceph interfaces for direct object-store access, as part of an initiative within GridPP[1] hosted at RAL. Additionally, we discuss the tradeoffs and experience of developing plugins for the currently popular Ceph parallel distributed filesystem for the GFAL2 access layer, at Glasgow.
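
    For readers unfamiliar with the gap the shims bridge: an object store exposes a flat key namespace with whole-object put/get rather than POSIX paths. A minimal Python sketch against an S3-compatible endpoint (the bucket name, endpoint URL and credentials are assumptions, not details from the paper):

    ```python
    # Flat-namespace object semantics via the S3 API; hypothetical endpoint.
    import boto3

    s3 = boto3.client("s3", endpoint_url="https://objectstore.example.org")

    # "Directories" are only a naming convention inside the key.
    s3.put_object(Bucket="gridpp-demo",
                  Key="lhc/run1/event-000.dat",
                  Body=b"payload bytes")

    obj = s3.get_object(Bucket="gridpp-demo", Key="lhc/run1/event-000.dat")
    print(obj["Body"].read())
    ```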

  3. Landscape Conservation of Aquatic Habitats Promotes Watershed-scale Biological, Biogeochemical, and Hydrological Functions

    EPA Science Inventory

    Wetlands are exceptionally productive landscape features that provide critical habitat for endemic species, threatened/endangered and migratory animals, store floodwaters and maintain baseflows in stream systems, recharge groundwaters, and biogeochemically and physically affect n...

  4. The MIT Alewife Machine: A Large-Scale Distributed-Memory Multiprocessor

    DTIC Science & Technology

    1991-06-01

  5. Convenience stores are the key food environment influence on nutrients available from household food supplies in Texas Border Colonias

    PubMed Central

    2013-01-01

    Background Few studies have focused on the relationship between the retail food environment and household food supplies. This study examines spatial access to retail food stores, food shopping habits, and nutrients available in household food supplies among 50 Mexican-origin families residing in Texas border colonias. Methods The design was cross-sectional; data were collected in the home March to June 2010 by promotora-researchers. Ground-truthed methods enumerated traditional (supercenters, supermarkets, grocery stores), convenience (convenience stores and food marts), and non-traditional (dollar stores, discount stores) retail food stores. Spatial access was computed using the network distance from each participant’s residence to each food store. Data included survey data and two household food inventories (HFI) of the presence and amount of food items in the home. The Spanish-language, interviewer-administered survey included demographics, transportation access, food purchasing, food and nutrition assistance program participation, and the 18-item Core Food Security Module. Nutrition Data System for Research (NDS-R) was used to calculate HFI nutrients. Adult equivalent adjustment constants (AE), based on age and gender calorie needs, were calculated from the age and gender composition of each household and used to adjust HFI nutrients for household composition. Data were analyzed using bivariate analysis and linear regression models to determine the association of independent variables with the availability of each AE-adjusted nutrient. Results Regression models showed that households in which the child independently purchased food from a convenience store at least once a week had foods and beverages with increased amounts of total energy, total fat, and saturated fat. A greater distance to the nearest convenience store was associated with reduced amounts of total energy, vitamin D, total sugar, added sugar, total fat, and saturated fat. Participation in the National School Lunch Program (NSLP) was associated with lower household levels of total energy, calcium, vitamin C, sodium, vitamin D, and saturated fat. Spatial access to and utilization of supermarkets and dollar stores were not associated with nutrient availability. Conclusions Although household members frequently purchased food items from supermarkets or dollar stores, it was spatial access to and frequent utilization of convenience food stores that influenced the amount of nutrients present in Texas border colonia households. These findings also suggest that households which participate in NSLP have reduced AE-adjusted nutrients available in the home. The next step will target changes within convenience stores to improve in-store marketing of foods and beverages to children and adults. PMID:23327426

  6. Convenience stores are the key food environment influence on nutrients available from household food supplies in Texas Border Colonias.

    PubMed

    Sharkey, Joseph R; Dean, Wesley R; Nalty, Courtney C; Xu, Jin

    2013-01-17

    Few studies have focused on the relationship between the retail food environment and household food supplies. This study examines spatial access to retail food stores, food shopping habits, and nutrients available in household food supplies among 50 Mexican-origin families residing in Texas border colonias. The design was cross-sectional; data were collected in the home March to June 2010 by promotora-researchers. Ground-truthed methods enumerated traditional (supercenters, supermarkets, grocery stores), convenience (convenience stores and food marts), and non-traditional (dollar stores, discount stores) retail food stores. Spatial access was computed using the network distance from each participant's residence to each food store. Data included survey data and two household food inventories (HFI) of the presence and amount of food items in the home. The Spanish-language, interviewer-administered survey included demographics, transportation access, food purchasing, food and nutrition assistance program participation, and the 18-item Core Food Security Module. Nutrition Data System for Research (NDS-R) was used to calculate HFI nutrients. Adult equivalent adjustment constants (AE), based on age and gender calorie needs, were calculated from the age and gender composition of each household and used to adjust HFI nutrients for household composition. Data were analyzed using bivariate analysis and linear regression models to determine the association of independent variables with the availability of each AE-adjusted nutrient. Regression models showed that households in which the child independently purchased food from a convenience store at least once a week had foods and beverages with increased amounts of total energy, total fat, and saturated fat. A greater distance to the nearest convenience store was associated with reduced amounts of total energy, vitamin D, total sugar, added sugar, total fat, and saturated fat. Participation in the National School Lunch Program (NSLP) was associated with lower household levels of total energy, calcium, vitamin C, sodium, vitamin D, and saturated fat. Spatial access to and utilization of supermarkets and dollar stores were not associated with nutrient availability. Although household members frequently purchased food items from supermarkets or dollar stores, it was spatial access to and frequent utilization of convenience food stores that influenced the amount of nutrients present in Texas border colonia households. These findings also suggest that households which participate in NSLP have reduced AE-adjusted nutrients available in the home. The next step will target changes within convenience stores to improve in-store marketing of foods and beverages to children and adults.
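
    The "network distance" measure used in both versions of this study is a street-network shortest-path length from a residence to each store. A toy Python sketch with an invented graph (node names and edge lengths are illustrative):

    ```python
    # Shortest-path network distance on a toy street graph.
    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from([
        ("home", "corner", 150.0),        # metres along the street network
        ("corner", "convenience", 90.0),
        ("corner", "plaza", 400.0),
        ("plaza", "supermarket", 120.0),
    ])

    stores = ["convenience", "supermarket"]
    dist = {s: nx.shortest_path_length(G, "home", s, weight="weight")
            for s in stores}
    print(dist)                     # {'convenience': 240.0, 'supermarket': 670.0}
    print(min(dist, key=dist.get))  # nearest store by network distance
    ```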

  7. Surface temperatures in New York City: Geospatial data enables the accurate prediction of radiative heat transfer.

    PubMed

    Ghandehari, Masoud; Emig, Thorsten; Aghamohamadnia, Milad

    2018-02-02

    Despite decades of research seeking to derive the urban energy budget, the dynamics of thermal exchange in the densely constructed environment are not yet well understood. Using New York City as a study site, we present a novel hybrid experimental-computational approach for a better understanding of radiative heat transfer in complex urban environments. The aim of this work is to contribute to the calculation of the urban energy budget, particularly the stored energy; we focus our attention on surface thermal radiation. Improved understanding of urban thermodynamics incorporating the interaction of various bodies, particularly in high-rise cities, will have implications for energy conservation at the building scale, and for human health and comfort at the urban scale. The platform presented is based on longwave hyperspectral imaging of nearly 100 blocks of Manhattan, in addition to a geospatial radiosity model that describes the collective radiative heat exchange between multiple buildings. Despite assumptions about the surface emissivity and thermal conductivity of building walls, the close comparison of temperatures derived from measurements and computations is promising. The results imply that the presented geospatial thermodynamic model of urban structures can enable accurate and high-resolution analysis of instantaneous urban surface temperatures.
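
    The core quantity in such a radiosity model is the net longwave exchange between surface pairs. A simplified blackbody sketch (the study additionally fits surface emissivities, omitted here), with hypothetical facade temperatures:

    ```python
    # Net longwave exchange between two surfaces, blackbody limit.
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def net_longwave_w(area_m2, view_factor, t1_k, t2_k):
        """Net radiative power from surface 1 to surface 2."""
        return area_m2 * view_factor * SIGMA * (t1_k**4 - t2_k**4)

    # Hypothetical facade pair: a sunlit wall at 35 C facing a shaded wall at 22 C.
    print(f"{net_longwave_w(100.0, 0.4, 308.15, 295.15):.0f} W")
    ```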

  8. Unleashing elastic energy: dynamics of energy release in rubber bands and impulsive biological systems

    NASA Astrophysics Data System (ADS)

    Ilton, Mark; Cox, Suzanne; Egelmeers, Thijs; Patek, S. N.; Crosby, Alfred J.

    Impulsive biological systems - which include mantis shrimp, trap-jaw ants, and Venus flytraps - can reach high speeds by using elastic elements to store and rapidly release energy. The material behavior and shape changes critical to achieving rapid energy release in these systems are largely unknown due to the limitations of materials testing instruments operating at high speed and large displacement. In this work, we perform fundamental, proof-of-concept measurements on the tensile retraction of elastomers. Using high speed imaging, the kinematics of retraction are measured for elastomers with varying mechanical properties and geometry. Based on the kinematics, the rate of energy dissipation in the material is determined as a function of strain and strain rate, along with a scaling relation that describes the dependence of maximum velocity on material properties. Understanding this scaling relation along with the material failure limits of the elastomer allows the prediction of the material properties required for optimal performance. We demonstrate this concept experimentally by optimizing for maximum velocity in our synthetic model system, and achieve retraction velocities that exceed those in biological impulsive systems. This model system provides a foundation for future work connecting continuum performance to molecular architecture in impulsive systems.

  9. Public-Private Partnerships in Cloud-Computing Services in the Context of Genomic Research.

    PubMed

    Granados Moreno, Palmira; Joly, Yann; Knoppers, Bartha Maria

    2017-01-01

    Public-private partnerships (PPPs) have been increasingly used to spur and facilitate innovation in a number of fields. In healthcare, the purpose of using a PPP is commonly to develop and/or provide vaccines and drugs against communicable diseases, mainly in developing or underdeveloped countries. With the advancement of technology and of the area of genomics, these partnerships also focus on large-scale genomic research projects that aim to advance the understanding of diseases that have a genetic component and to develop personalized treatments. This new focus has created new forms of PPPs that involve information technology companies, which provide computing infrastructure and services to store, analyze, and share the massive amounts of data genomic-related projects produce. In this article, we explore models of PPPs proposed to handle, protect, and share the genomic data collected and to further develop genomic-based medical products. We also identify the reasons that make these models suitable and the challenges they have yet to overcome. To achieve this, we describe the details and complexities of MSSNG, International Cancer Genome Consortium, and 100,000 Genomes Project, the three PPPs that focus on large-scale genomic research to better understand the genetic components of autism, cancer, rare diseases, and infectious diseases with the intention to find appropriate treatments. Organized as PPP and employing cloud-computing services, the three projects have advanced quickly and are likely to be important sources of research and development for future personalized medicine. However, there still are unresolved matters relating to conflicts of interest, commercialization, and data control. Learning from the challenges encountered by past PPPs allowed us to establish that developing guidelines to adequately manage personal health information stored in clouds and ensuring the protection of data integrity and privacy would be critical steps in the development of future PPPs.

  10. The Genealogical Population Dynamics of HIV-1 in a Large Transmission Chain: Bridging within and among Host Evolutionary Rates

    PubMed Central

    Vrancken, Bram; Rambaut, Andrew; Suchard, Marc A.; Drummond, Alexei; Baele, Guy; Derdelinckx, Inge; Van Wijngaerden, Eric; Vandamme, Anne-Mieke; Van Laethem, Kristel; Lemey, Philippe

    2014-01-01

    Transmission lies at the interface of human immunodeficiency virus type 1 (HIV-1) evolution within and among hosts and separates distinct selective pressures that impose differences in both the mode of diversification and the tempo of evolution. In the absence of comprehensive direct comparative analyses of the evolutionary processes at different biological scales, our understanding of how fast within-host HIV-1 evolutionary rates translate to lower rates at the between-host level remains incomplete. Here, we address this by analyzing pol and env data from a large HIV-1 subtype C transmission chain for which both the timing and the direction are known for most transmission events. To this purpose, we develop a new transmission model in a Bayesian genealogical inference framework and demonstrate how to constrain the viral evolutionary history to be compatible with the transmission history while simultaneously inferring the within-host evolutionary and population dynamics. We show that accommodating a transmission bottleneck affords the best fit to our data, but the sparse within-host HIV-1 sampling prevents accurate quantification of the concomitant loss in genetic diversity. We draw inference under the transmission model to estimate HIV-1 evolutionary rates among epidemiologically related patients and demonstrate that they lie in between fast intra-host rates and lower rates among epidemiologically unrelated individuals infected with HIV subtype C. Using a new molecular clock approach, we quantify and find support for a lower evolutionary rate along branches that accommodate a transmission event or branches that represent the entire backbone of transmitted lineages in our transmission history. Finally, we recover the rate differences at the different biological scales for both synonymous and non-synonymous substitution rates, which is only compatible with the ‘store and retrieve’ hypothesis positing that viruses stored early in latently infected cells preferentially transmit or establish new infections upon reactivation. PMID:24699231

  11. Public–Private Partnerships in Cloud-Computing Services in the Context of Genomic Research

    PubMed Central

    Granados Moreno, Palmira; Joly, Yann; Knoppers, Bartha Maria

    2017-01-01

    Public–private partnerships (PPPs) have been increasingly used to spur and facilitate innovation in a number of fields. In healthcare, the purpose of using a PPP is commonly to develop and/or provide vaccines and drugs against communicable diseases, mainly in developing or underdeveloped countries. With the advancement of technology and of the area of genomics, these partnerships also focus on large-scale genomic research projects that aim to advance the understanding of diseases that have a genetic component and to develop personalized treatments. This new focus has created new forms of PPPs that involve information technology companies, which provide computing infrastructure and services to store, analyze, and share the massive amounts of data genomic-related projects produce. In this article, we explore models of PPPs proposed to handle, protect, and share the genomic data collected and to further develop genomic-based medical products. We also identify the reasons that make these models suitable and the challenges they have yet to overcome. To achieve this, we describe the details and complexities of MSSNG, International Cancer Genome Consortium, and 100,000 Genomes Project, the three PPPs that focus on large-scale genomic research to better understand the genetic components of autism, cancer, rare diseases, and infectious diseases with the intention to find appropriate treatments. Organized as PPP and employing cloud-computing services, the three projects have advanced quickly and are likely to be important sources of research and development for future personalized medicine. However, there still are unresolved matters relating to conflicts of interest, commercialization, and data control. Learning from the challenges encountered by past PPPs allowed us to establish that developing guidelines to adequately manage personal health information stored in clouds and ensuring the protection of data integrity and privacy would be critical steps in the development of future PPPs. PMID:28164085

  12. Long-Term Memory Stabilized by Noise-Induced Rehearsal

    PubMed Central

    Wei, Yi

    2014-01-01

    Cortical networks can maintain memories for decades despite the short lifetime of synaptic strengths. Can a neural network store long-lasting memories in unstable synapses? Here, we study the effects of ongoing spike-timing-dependent plasticity (STDP) on the stability of memory patterns stored in synapses of an attractor neural network. We show that certain classes of STDP rules can stabilize all stored memory patterns despite a short lifetime of synapses. In our model, unstructured neural noise, after passing through the recurrent network connections, carries the imprint of all memory patterns in temporal correlations. STDP, combined with these correlations, leads to reinforcement of all stored patterns, even those that are never explicitly visited. Our findings may provide the functional reason for irregular spiking displayed by cortical neurons and justify models of system memory consolidation. Therefore, we propose that irregular neural activity is the feature that helps cortical networks maintain stable connections. PMID:25411507
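
    For reference, a minimal pair-based STDP rule of the general kind studied here: potentiation when the presynaptic spike precedes the postsynaptic spike, depression otherwise, with exponential dependence on the lag. Amplitudes and time constants below are illustrative, not the paper's values.

    ```python
    # Pair-based exponential STDP weight update; parameters are invented.
    import math

    def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_ms=20.0):
        """Weight change for post-minus-pre spike lag dt (ms)."""
        if dt_ms > 0:    # pre before post -> potentiation
            return a_plus * math.exp(-dt_ms / tau_ms)
        else:            # post before pre -> depression
            return -a_minus * math.exp(dt_ms / tau_ms)

    for dt in (5.0, 20.0, -5.0):
        print(f"dt = {dt:+.0f} ms -> dw = {stdp_dw(dt):+.5f}")
    ```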

  13. Adiabatic quantum optimization for associative memory recall

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seddiqi, Hadayat; Humble, Travis S.

    Hopfield networks are a variant of associative memory that recall patterns stored in the couplings of an Ising model. Stored memories are conventionally accessed as fixed points in the network dynamics that correspond to energetic minima of the spin state. We show that memories stored in a Hopfield network may also be recalled by energy minimization using adiabatic quantum optimization (AQO). Numerical simulations of the underlying quantum dynamics allow us to quantify AQO recall accuracy with respect to the number of stored memories and noise in the input key. We investigate AQO performance with respect to how memories are stored in the Ising model according to different learning rules. Our results demonstrate that AQO recall accuracy varies strongly with learning rule, a behavior that is attributed to differences in energy landscapes. Consequently, learning rules offer a family of methods for programming adiabatic quantum optimization that we expect to be useful for characterizing AQO performance.

  14. Adiabatic quantum optimization for associative memory recall

    DOE PAGES

    Seddiqi, Hadayat; Humble, Travis S.

    2014-12-22

    Hopfield networks are a variant of associative memory that recall patterns stored in the couplings of an Ising model. Stored memories are conventionally accessed as fixed points in the network dynamics that correspond to energetic minima of the spin state. We show that memories stored in a Hopfield network may also be recalled by energy minimization using adiabatic quantum optimization (AQO). Numerical simulations of the underlying quantum dynamics allow us to quantify AQO recall accuracy with respect to the number of stored memories and noise in the input key. We investigate AQO performance with respect to how memories are stored in the Ising model according to different learning rules. Our results demonstrate that AQO recall accuracy varies strongly with learning rule, a behavior that is attributed to differences in energy landscapes. Consequently, learning rules offer a family of methods for programming adiabatic quantum optimization that we expect to be useful for characterizing AQO performance.
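
    For contrast with the quantum approach, here is the classical Hopfield recall task these records map onto AQO: patterns stored in Hebbian Ising couplings, recalled by deterministic energy descent from a corrupted key. A compact Python sketch:

    ```python
    # Classical Hopfield storage and recall; AQO instead minimizes the same
    # Ising energy by quantum annealing.
    import numpy as np

    rng = np.random.default_rng(3)
    n, n_patterns = 64, 3
    patterns = rng.choice([-1, 1], size=(n_patterns, n))

    # Hebbian learning rule for the couplings.
    W = (patterns.T @ patterns) / n
    np.fill_diagonal(W, 0.0)

    key = patterns[0].copy()
    flip = rng.choice(n, size=10, replace=False)
    key[flip] *= -1                      # corrupt 10 of 64 bits in the key

    state = key
    for _ in range(20):                  # synchronous descent to a fixed point
        state = np.sign(W @ state).astype(int)
        state[state == 0] = 1
    print("overlap with stored pattern:", int(state @ patterns[0]), "/", n)
    ```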

  15. A computationally efficient parallel Levenberg-Marquardt algorithm for highly parameterized inverse model analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Youzuo; O'Malley, Daniel; Vesselinov, Velimir V.

    Inverse modeling seeks model parameters given a set of observations. However, for practical problems, because the number of measurements is often large and the model parameters are also numerous, conventional methods for inverse modeling can be computationally expensive. We have developed a new, computationally efficient parallel Levenberg-Marquardt method for solving inverse modeling problems with a highly parameterized model space. Levenberg-Marquardt methods require the solution of a linear system of equations which can be prohibitively expensive to compute for moderate- to large-scale problems. Our novel method projects the original linear problem down to a Krylov subspace, such that the dimensionality of the problem can be significantly reduced. Furthermore, we store the Krylov subspace computed when using the first damping parameter and recycle the subspace for the subsequent damping parameters. The efficiency of our new inverse modeling algorithm is significantly improved using these computational techniques. We apply this new inverse modeling method to invert for random transmissivity fields in 2D and a random hydraulic conductivity field in 3D. Our algorithm is fast enough to solve for the distributed model parameters (transmissivity) in the model domain. The algorithm is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). By comparing with Levenberg-Marquardt methods using standard linear inversion techniques such as QR or SVD methods, our Levenberg-Marquardt method yields a speed-up ratio on the order of ~10^1 to ~10^2 in a multi-core computational environment. Furthermore, our new inverse modeling method is a powerful tool for characterizing subsurface heterogeneity for moderate- to large-scale problems.

  16. A computationally efficient parallel Levenberg-Marquardt algorithm for highly parameterized inverse model analyses

    DOE PAGES

    Lin, Youzuo; O'Malley, Daniel; Vesselinov, Velimir V.

    2016-09-01

    Inverse modeling seeks model parameters given a set of observations. However, for practical problems, because the number of measurements is often large and the model parameters are also numerous, conventional methods for inverse modeling can be computationally expensive. We have developed a new, computationally efficient parallel Levenberg-Marquardt method for solving inverse modeling problems with a highly parameterized model space. Levenberg-Marquardt methods require the solution of a linear system of equations which can be prohibitively expensive to compute for moderate- to large-scale problems. Our novel method projects the original linear problem down to a Krylov subspace, such that the dimensionality of the problem can be significantly reduced. Furthermore, we store the Krylov subspace computed when using the first damping parameter and recycle the subspace for the subsequent damping parameters. The efficiency of our new inverse modeling algorithm is significantly improved using these computational techniques. We apply this new inverse modeling method to invert for random transmissivity fields in 2D and a random hydraulic conductivity field in 3D. Our algorithm is fast enough to solve for the distributed model parameters (transmissivity) in the model domain. The algorithm is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). By comparing with Levenberg-Marquardt methods using standard linear inversion techniques such as QR or SVD methods, our Levenberg-Marquardt method yields a speed-up ratio on the order of ~10^1 to ~10^2 in a multi-core computational environment. Furthermore, our new inverse modeling method is a powerful tool for characterizing subsurface heterogeneity for moderate- to large-scale problems.
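
    The classical step these records accelerate solves (J^T J + lambda I) delta = -J^T r for each damping parameter lambda. A toy Python sketch of this step on an exponential fit; the papers' Krylov projection and subspace recycling are not reproduced here.

    ```python
    # One damped Levenberg-Marquardt step on a toy exponential model.
    import numpy as np

    def lm_step(residual, jacobian, params, lam):
        r = residual(params)
        J = jacobian(params)
        A = J.T @ J + lam * np.eye(params.size)   # damped normal equations
        delta = np.linalg.solve(A, -J.T @ r)
        return params + delta

    t = np.linspace(0.0, 1.0, 50)
    y_obs = 2.0 * np.exp(-1.5 * t)

    def residual(p):                      # p = [amplitude, decay rate]
        return p[0] * np.exp(-p[1] * t) - y_obs

    def jacobian(p):
        e = np.exp(-p[1] * t)
        return np.column_stack([e, -p[0] * t * e])

    p = np.array([1.0, 1.0])
    for _ in range(10):
        p = lm_step(residual, jacobian, p, lam=1e-3)
    print(p)                              # approaches [2.0, 1.5]
    ```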

  17. A corner store intervention to improve access to fruits and vegetables in two Latino communities.

    PubMed

    Albert, Stephanie L; Langellier, Brent A; Sharif, Mienah Z; Chan-Golston, Alec M; Prelip, Michael L; Elena Garcia, Rosa; Glik, Deborah C; Belin, Thomas R; Brookmeyer, Ron; Ortega, Alexander N

    2017-08-01

    Investments have been made to alter the food environment of neighbourhoods that have a disproportionate number of unhealthy food venues. Corner store conversions are one strategy to increase access to fruits and vegetables (F&V). Although the literature shows modest success, the effectiveness of these interventions remains equivocal. The present paper reports on the evaluation of Proyecto MercadoFRESCO, a corner store conversion intervention in two Latino communities. A repeated cross-sectional design was employed. Data were stratified by intervention arm and bivariate tests assessed changes over time. Logistic and multiple regression models with intervention arm, time and the interaction of intervention and time were conducted. Supplementary analyses account for clustering of patrons within stores and staggering of store conversions. Three stores were converted and five stores served as comparisons in East Los Angeles and Boyle Heights, California, USA. Store patrons were interviewed before (n 550) and after (n 407) the intervention. Relative to patrons of comparison stores, patrons of intervention stores demonstrated more favourable perceptions of corner stores and increased purchasing of F&V during that store visit. Changes were not detected in store patronage, percentage of weekly dollars spent on food for F&V or daily consumption of F&V. Consistent with some extant food environment literature, findings demonstrate limited effects. Investments should be made in multilevel, comprehensive interventions that target a variety of retail food outlets rather than focusing on corner stores exclusively. Complementary policies limiting the availability, affordability and marketing of energy-dense, nutrient-poor foods should also be pursued.

  18. A corner store intervention to improve access to fruits and vegetables in two Latino communities

    PubMed Central

    Albert, Stephanie L; Langellier, Brent A; Sharif, Mienah Z; Chan-Golston, Alec M; Prelip, Michael L; Garcia, Rosa Elena; Glik, Deborah C; Belin, Thomas R; Brookmeyer, Ron; Ortega, Alexander N

    2017-01-01

    Objective Investments have been made to alter the food environment of neighbourhoods that have a disproportionate number of unhealthy food venues. Corner store conversions are one strategy to increase access to fruits and vegetables (F&V). Although the literature shows modest success, the effectiveness of these interventions remains equivocal. The present paper reports on the evaluation of Proyecto MercadoFRESCO, a corner store conversion intervention in two Latino communities. Design A repeated cross-sectional design was employed. Data were stratified by intervention arm and bivariate tests assessed changes over time. Logistic and multiple regression models with intervention arm, time and the interaction of intervention and time were conducted. Supplementary analyses account for clustering of patrons within stores and staggering of store conversions. Setting Three stores were converted and five stores served as comparisons in East Los Angeles and Boyle Heights, California, USA. Subjects Store patrons were interviewed before (n = 550) and after (n = 407) the intervention. Results Relative to patrons of comparison stores, patrons of intervention stores demonstrated more favourable perceptions of corner stores and increased purchasing of F&V during that store visit. Changes were not detected in store patronage, percentage of weekly dollars spent on food for F&V or daily consumption of F&V. Conclusions Consistent with some extant food environment literature, findings demonstrate limited effects. Investments should be made in multilevel, comprehensive interventions that target a variety of retail food outlets rather than focusing on corner stores exclusively. Complementary policies limiting the availability, affordability and marketing of energy-dense, nutrient-poor foods should also be pursued. PMID:28578744

  19. Establishing a Framework for Community Modeling in Hydrologic Science: Recommendations from the CUAHSI CHyMP Initiative

    NASA Astrophysics Data System (ADS)

    Arrigo, J. S.; Famiglietti, J. S.; Murdoch, L. C.; Lakshmi, V.; Hooper, R. P.

    2012-12-01

    The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) continues a major effort towards supporting Community Hydrologic Modeling. From 2009 - 2011, the Community Hydrologic Modeling Platform (CHyMP) initiative held three workshops, the ultimate goal of which was to produce recommendations and an implementation plan to establish a community modeling program that enables comprehensive simulation of water anywhere on the North American continent. Such an effort would include connections to and advances in global climate models, biogeochemistry, and efforts of other disciplines that require an understanding of water patterns and processes in the environment. To achieve such a vision will require substantial investment in human and cyber-infrastructure and significant advances in the science of hydrologic modeling and spatial scaling. CHyMP concluded with a final workshop, held March 2011, and produced several recommendations. CUAHSI and the university community continue to advance community modeling and implement these recommendations through several related and follow-on efforts. Key results from the final 2011 workshop included agreement among participants that the community is ready to move forward with implementation. It is recognized that initial implementation of this larger effort can begin with simulation capabilities that currently exist, or that can be easily developed. CHyMP identified four key activities in support of community modeling: benchmarking, dataset evaluation and development, platform evaluation, and developing a national water model framework. Key findings included: 1) The community supported the idea of a National Water Model framework; a community effort is needed to explore what the ultimate implementation of a National Water Model is. A true community modeling effort would support the modeling of "water anywhere" and would include all relevant scales and processes. 2) Implementation of a community modeling program could initially focus on continental-scale modeling of water quantity (rather than quality). The goal of this initial model is the comprehensive description of water stores and fluxes in such a way as to permit linkage to GCMs and to biogeochemical, ecological, and geomorphic models. This continental-scale focus allows systematic evaluation of our current state of knowledge and data, leverages existing efforts by large-scale modelers, contributes to scientific discovery that informs globally and societally relevant questions, and provides an initial framework to evaluate hydrologic information relevant to other disciplines and a structure into which to incorporate other classes of hydrologic models. 3) Dataset development will be a key aspect of any successful national water model implementation. Our current knowledge of the subsurface limits our ability to truly integrate soil and groundwater into large-scale models, and to answer critical science questions with societal relevance (e.g., groundwater's influence on climate). 4) The CHyMP workshops and efforts to date have achieved collaboration between university scientists, government agencies and the private sector that must be maintained. Follow-on efforts in community modeling should aim at leveraging and maintaining this collaboration for maximum scientific and societal benefit.

  20. Threshold voltage variation depending on single grain boundary and stored charges in an adjacent cell for vertical silicon–oxide–nitride–oxide–silicon NAND flash memory

    NASA Astrophysics Data System (ADS)

    Oh, Hyeongwan; Kim, Jiwon; Baek, Rock-Hyun; Lee, Jeong-Soo

    2018-04-01

    The effects of single grain boundary (SGB) position and stored electron charges in an adjacent cell on variations of the threshold voltage (Vth) in silicon–oxide–nitride–oxide–silicon (SONOS) structures were investigated using technology computer-aided design (TCAD) simulation. As the bit line voltage increases, the SGB position causing the maximum Vth variation shifts from the center of the channel toward the source side, owing to the drain-induced grain barrier lowering effect. When the SGB is located in the spacer region, the potential interaction from both the SGB and the stored electron charges in the adjacent cell becomes significant, resulting in a larger Vth variation. In contrast, when the SGB is located at the center of the channel, the peak position of the potential barrier shifts to the center, so that the influence of the adjacent cell is diminished. As the gate length is scaled down to 20 nm, the influence of stored charges in adjacent cells becomes significant, resulting in larger Vth variations.

  1. Effective Pb2+ removal from water using nanozerovalent iron stored 10 months

    NASA Astrophysics Data System (ADS)

    Ahmed, M. A.; Bishay, Samiha T.; Ahmed, Fatma M.; El-Dek, S. I.

    2017-10-01

    Heavy metal removal from water requires reliable and cost-effective methods that offer fast separation and easy implementation. In this piece of research, nanozerovalent iron (NZVI) was prepared as an ideal sorbent for Pb2+ removal. The sample was characterized using X-ray diffraction (XRD), high-resolution transmission electron microscopy (HRTEM), and atomic force microscopy (AFM-SPM). Batch experiments examined the effect of pH value and contact time on the adsorption process. The same NZVI was stored for a shelf time of 10 months and the batch experiment was repeated. The outcomes of the investigation showed that NZVI exhibited an extraordinarily large metal uptake (98%) after a short contact time (10 h). The stored sample revealed the same effectiveness for Pb2+ removal under the same conditions. The results of the physical properties, magnetic susceptibility, and conductance were correlated with the adsorption efficiency. This work offers evidence that these NZVI particles, prepared and stored for a long time using a simple, green, and cost-effective methodology, could be potential candidates for large-scale Pb2+ removal and represent a real advance in wastewater treatment.
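    For readers unfamiliar with batch-experiment bookkeeping, percent removal and sorbent uptake capacity follow directly from the initial and equilibrium concentrations; a minimal sketch with illustrative numbers (not the paper's data):

```python
# Standard batch-adsorption metrics: percent removal and uptake capacity qe.
def batch_metrics(c0_mg_l, ce_mg_l, volume_l, sorbent_g):
    removal_pct = 100.0 * (c0_mg_l - ce_mg_l) / c0_mg_l
    qe_mg_g = (c0_mg_l - ce_mg_l) * volume_l / sorbent_g  # mg Pb2+ per g NZVI
    return removal_pct, qe_mg_g

# Hypothetical run: 100 mg/L initial Pb2+, 2 mg/L at equilibrium,
# 100 mL solution, 50 mg of sorbent.
removal, qe = batch_metrics(c0_mg_l=100.0, ce_mg_l=2.0, volume_l=0.1, sorbent_g=0.05)
print(f"removal = {removal:.1f} %, qe = {qe:.1f} mg/g")   # -> 98.0 %, 196.0 mg/g
```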

  2. Comparison of dietary supplement product knowledge and confidence between pharmacists and health food store employees.

    PubMed

    Coon, Scott A; Stevens, Vanessa W; Brown, Jack E; Wolff, Stephen E; Wrobel, Mark J

    2015-01-01

    To determine pharmacists' and health food store employees' knowledge about the safety and efficacy of common, nonvitamin, nonmineral dietary supplements in a retail setting and their confidence in discussing, recommending, and acquiring knowledge about complementary and alternative medicine (CAM). Cross-sectional survey. Central and western New York in May and June 2012. Knowledge and confidence survey scores based on true/false and Likert scale responses. Pharmacists' mean knowledge score was significantly higher than that of health food store employees (8.42 vs. 6.15 items of 15 total knowledge questions). Adjusting for differences in experience, education, occupation, and confidence, knowledge scores were significantly higher for pharmacists and for those with a higher total confidence score. Pharmacists were significantly less confident than health food store employees about the safety and efficacy of CAM (13 vs. 16 items of 20 total questions). Pharmacists scored significantly higher than health food store employees on a survey assessing knowledge of dietary supplements' safety and efficacy. Despite the significant difference, scores were unacceptably low for pharmacists, highlighting a knowledge deficit in the subject matter.

  3. Water, gravity and trees: Relationship of tree-ring widths and total water storage dynamics

    NASA Astrophysics Data System (ADS)

    Creutzfeldt, B.; Heinrich, I.; Merz, B.; Blume, T.; Güntner, A.

    2012-04-01

    Water stored in the subsurface as groundwater or soil moisture is the main fresh water source not only for drinking water and food production but also for the natural vegetation. In a changing environment, water availability becomes a critical issue in many different regions. Long-term observations of the past are needed to improve the understanding of the hydrological system and the prediction of future developments. Tree-ring data have repeatedly proved to be valuable sources for reconstructing long-term climate dynamics, e.g. temperature, precipitation and different hydrological variables. In water-limited environments, tree growth is primarily influenced by the total water stored in the subsurface and hence, tree-ring records usually contain information about subsurface water storage. The challenge is to retrieve the information on total water storage from tree rings, because a training dataset of water stored in the subsurface is required for calibration against the tree-ring series. However, measuring water stored in the subsurface is notoriously difficult. We here present high-precision temporal gravimeter measurements which allow for the depth-integrated quantification of total water storage dynamics at the field scale. In this study, we evaluate the relationship of total water storage change and tree-ring growth, also in the context of the complex interactions of other meteorological forcing factors. A tree-ring chronology was derived from a Norway spruce stand in the Bavarian Forest, Germany. Total water storage dynamics were measured directly by the superconducting gravimeter of the Geodetic Observatory Wettzell for a 9-year period. The time series was extended to a 63-year period by a hydrological model using gravity data as the only calibration constraint. Finally, water storage changes were reconstructed based on the relationship between the hydrological model and the tree-ring chronology. Measurement results indicate that tree-ring growth is primarily controlled by total water storage in the subsurface, but wide uncertainty intervals for the correlation coefficient call for an extension of the measurement period. This multi-disciplinary study, combining hydrology, dendrochronology and geodesy, shows that temporal gravimeter measurements may give us the unique opportunity to retrieve the information on total water storage contained in tree-ring records to reconstruct total water storage dynamics. Knowing the relationship of water storage and tree-ring growth can also support the reconstruction of other climate records based on tree-ring series, help with hydrological model testing, and improve our knowledge of long-term variations of water storage in the past.
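    The calibrate-then-reconstruct logic described above can be sketched in a few lines: fit a linear relationship between ring width and storage over the instrumented overlap period, then invert it across the full chronology. Everything below is synthetic and illustrative, not the study's data or code:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1949, 2012)                       # a 63-year window
# Synthetic "true" storage signal and a ring-width series that tracks it.
storage = 50.0 * np.sin(2 * np.pi * years / 11.0) + rng.normal(0, 10, years.size)
ring_width = 0.8 + 0.004 * storage + rng.normal(0, 0.05, years.size)

calib = years >= 2003                               # 9 years with gravimeter data
slope, intercept = np.polyfit(storage[calib], ring_width[calib], 1)

reconstructed = (ring_width - intercept) / slope    # hindcast the full 63 years
rmse = np.sqrt(np.mean((reconstructed[~calib] - storage[~calib]) ** 2))
print(f"hindcast RMSE = {rmse:.1f} (synthetic storage units)")
```

    The short 9-year overlap is exactly why the abstract stresses wide uncertainty intervals: a slope fitted on few noisy points propagates directly into the hindcast.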

  4. Neighborhood socioeconomic characteristics and differences in the availability of healthy food stores and restaurants in Sao Paulo, Brazil

    PubMed Central

    Duran, Ana Clara; Diez Roux, Ana V; do Rosario DO Latorre, Maria; Jaime, Patricia C

    2013-01-01

    Differential access to healthy foods has been hypothesized to contribute to health disparities, but evidence from low- and middle-income countries is still scarce. This study examines whether access to healthy foods varies across store types and neighborhoods of different socioeconomic status (SES) in a large Brazilian city. A cross-sectional study was conducted in 2010–2011 across 52 census tracts. Healthy food access was measured by a comprehensive in-store data collection, summarized into two indexes developed for retail food stores (HFSI) and restaurants (HMRI). Descriptive analyses and multilevel models were used to examine associations of store type and neighborhood SES with healthy food access. Fast food restaurants were more likely to be located in low-SES neighborhoods, whereas supermarkets and full-service restaurants were more likely to be found in higher-SES neighborhoods. Multilevel analyses showed that both store type and neighborhood SES were independently associated with in-store food measures. We found differences in the availability of healthy food stores and restaurants in Sao Paulo city favoring middle- and high-SES neighborhoods. PMID:23747923

  5. Interstitial and Interlayer Ion Diffusion Geometry Extraction in Graphitic Nanosphere Battery Materials.

    PubMed

    Gyulassy, Attila; Knoll, Aaron; Lau, Kah Chun; Wang, Bei; Bremer, Peer-Timo; Papka, Michael E; Curtiss, Larry A; Pascucci, Valerio

    2016-01-01

    Large-scale molecular dynamics (MD) simulations are commonly used for simulating the synthesis and ion diffusion of battery materials. A good battery anode material is determined by its capacity to store ions or other diffusers. However, modeling of ion diffusion dynamics and transport properties at large length and long time scales would be impossible with current MD codes. To analyze the fundamental properties of these materials, therefore, we turn to geometric and topological analysis of their structure. In this paper, we apply a novel technique inspired by discrete Morse theory to the Delaunay triangulation of the simulated geometry of a thermally annealed carbon nanosphere. We utilize our computed structures to drive further geometric analysis to extract the interstitial diffusion structure as a single mesh. Our results provide a new approach to analyze the geometry of the simulated carbon nanosphere, and new insights into the role of carbon defect size and distribution in determining the charge capacity and charge dynamics of these carbon-based battery materials.
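    The paper's discrete-Morse pipeline is considerably more sophisticated, but a crude stand-in conveys the geometric idea: triangulate the atom positions and keep the Delaunay tetrahedra whose circumspheres could host a probe ion, approximating the interstitial space. The atom positions and probe radius below are illustrative assumptions:

```python
import numpy as np
from scipy.spatial import Delaunay

def circumradius(pts):
    """Circumradius of a tetrahedron given its 4 vertices (rows of pts)."""
    a = pts[0]
    M = 2.0 * (pts[1:] - a)                       # linear system for the center
    rhs = np.sum(pts[1:] ** 2 - a ** 2, axis=1)
    center = np.linalg.solve(M, rhs)
    return np.linalg.norm(center - a)

rng = np.random.default_rng(2)
atoms = rng.uniform(0, 30.0, size=(500, 3))       # stand-in carbon positions (angstrom)
tri = Delaunay(atoms)

probe = 1.2                                       # hypothetical ion radius (angstrom)
void = [s for s in tri.simplices if circumradius(atoms[s]) > probe]
print(f"{len(void)} of {len(tri.simplices)} tetrahedra can host the probe ion")
```

    Stitching those void tetrahedra into one connected mesh is roughly what the extracted "interstitial diffusion structure" provides, with discrete Morse theory supplying a principled, noise-robust way to do the stitching.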

  6. Interstitial and Interlayer Ion Diffusion Geometry Extraction in Graphitic Nanosphere Battery Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gyulassy, Attila; Knoll, Aaron; Lau, Kah Chun

    2016-01-01

    Large-scale molecular dynamics (MD) simulations are commonly used for simulating the synthesis and ion diffusion of battery materials. A good battery anode material is determined by its capacity to store ions or other diffusers. However, modeling of ion diffusion dynamics and transport properties at large length and long time scales would be impossible with current MD codes. To analyze the fundamental properties of these materials, therefore, we turn to geometric and topological analysis of their structure. In this paper, we apply a novel technique inspired by discrete Morse theory to the Delaunay triangulation of the simulated geometry of a thermally annealed carbon nanosphere. We utilize our computed structures to drive further geometric analysis to extract the interstitial diffusion structure as a single mesh. Our results provide a new approach to analyze the geometry of the simulated carbon nanosphere, and new insights into the role of carbon defect size and distribution in determining the charge capacity and charge dynamics of these carbon-based battery materials.

  7. Interstitial and interlayer ion diffusion geometry extraction in graphitic nanosphere battery materials

    DOE PAGES

    Gyulassy, Attila; Knoll, Aaron; Lau, Kah Chun; ...

    2016-01-31

    Large-scale molecular dynamics (MD) simulations are commonly used for simulating the synthesis and ion diffusion of battery materials. A good battery anode material is determined by its capacity to store ions or other diffusers. However, modeling of ion diffusion dynamics and transport properties at large length and long time scales would be impossible with current MD codes. To analyze the fundamental properties of these materials, therefore, we turn to geometric and topological analysis of their structure. In this paper, we apply a novel technique inspired by discrete Morse theory to the Delaunay triangulation of the simulated geometry of a thermally annealed carbon nanosphere. We utilize our computed structures to drive further geometric analysis to extract the interstitial diffusion structure as a single mesh. Lastly, our results provide a new approach to analyze the geometry of the simulated carbon nanosphere, and new insights into the role of carbon defect size and distribution in determining the charge capacity and charge dynamics of these carbon-based battery materials.

  8. Object permanence in the food-storing coal tit (Periparus ater) and the non-storing great tit (Parus major): Is the mental representation required?

    PubMed

    Marhounová, Lucie; Frynta, Daniel; Fuchs, Roman; Landová, Eva

    2017-05-01

    Object permanence is a cognitive ability that enables animals to mentally represent the continuous existence of temporarily hidden objects. Generally, it develops gradually through six qualitative stages, the evolution of which may be connected with specific ecological and behavioral factors. In birds, advanced object permanence skills have been reported in several food-storing species of the Corvidae family. In order to test the association between food-storing and achieved performance within the stages, we compared food-storing coal tits (Periparus ater) and non-storing great tits (Parus major) using an adapted version of Uzgiris & Hunt's Scale 1 tasks. The coal tits significantly outperformed the great tits in searching for completely hidden objects. Most of the great tits could not solve the task when the object disappeared completely. However, the upper limit for both species is likely to be Stage 4. The coal tits could solve problems with simply hidden objects, but they used alternative strategies rather than mental representation when searching for completely hidden objects, especially when choosing between two locations. Our results also suggest that neophobia did not affect the overall performance in the object permanence tasks. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. Multi-scale interactions affecting transport, storage, and processing of solutes and sediments in stream corridors (Invited)

    NASA Astrophysics Data System (ADS)

    Harvey, J. W.; Packman, A. I.

    2010-12-01

    Surface water and groundwater flow interact with the channel geomorphology and sediments in ways that determine how material is transported, stored, and transformed in stream corridors. Solute and sediment transport affect important ecological processes such as carbon and nutrient dynamics and stream metabolism, processes that are fundamental to stream health and function. Many individual mechanisms of transport and storage of solute and sediment have been studied, including surface water exchange between the main channel and side pools, hyporheic flow through shallow and deep subsurface flow paths, and sediment transport during both baseflow and floods. A significant challenge arises from non-linear and scale-dependent transport resulting from natural, fractal fluvial topography and associated broad, multi-scale hydrologic interactions. Connections between processes and linkages across scales are not well understood, imposing significant limitations on system predictability. The whole-stream tracer experimental approach is popular because of the spatial averaging of heterogeneous processes; however, the tracer results, implemented alone and analyzed using typical models, cannot usually predict transport beyond the very specific conditions of the experiment. Furthermore, the results of whole-stream tracer experiments tend to be biased due to unavoidable limitations associated with sampling frequency, measurement sensitivity, and experiment duration. We recommend that whole-stream tracer additions be augmented with hydraulic and topographic measurements and also with additional tracer measurements made directly in storage zones. We present examples of measurements that encompass interactions across spatial and temporal scales and models that are transferable to a wide range of flow and geomorphic conditions. These results show how the competitive effects between the different forces driving hyporheic flow, operating at different spatial scales, create a situation where hyporheic fluxes cannot be accurately estimated without considering multi-scale effects. Our modeling captures the dominance of small-scale features such as bedforms that drive the majority of hyporheic flow, but it also captures how hyporheic flow is substantially modified by relatively small changes in streamflow or groundwater flow. The additional field measurements add sensitivity and power to whole-stream tracer additions by improving resolution of the relative importance of storage at different scales (e.g. bar-scale versus bedform-scale). This information is critical in identifying hot spots where important biogeochemical reactions occur. In summary, interpreting multi-scale interactions in streams requires models that are physically based and that incorporate non-linear process dynamics. Such models can take advantage of increasingly comprehensive field data to integrate transport processes across spatially variable flow and geomorphic conditions. The most useful field and modeling approaches will be those that are simple enough to be easily implemented by users from various disciplines but comprehensive enough to produce meaningful predictions for a wide range of flow and geomorphic scenarios. This capability is needed to support improved strategies for protecting stream ecological health in the face of accelerating land use and climate change.
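    The storage-zone behaviour that whole-stream tracer models try to capture is classically written as a one-dimensional transient-storage model: advection and dispersion in the channel plus first-order exchange with a storage zone. A minimal explicit finite-difference sketch of that classic formulation follows; the parameters are illustrative, not fitted to any experiment discussed above:

```python
import numpy as np

L, nx = 1000.0, 400                  # reach length (m), grid cells
dx = L / nx
u, D = 0.3, 1.0                      # velocity (m/s), dispersion (m^2/s)
alpha, area_ratio = 2e-4, 0.5        # exchange rate (1/s), channel/storage area ratio

dt = 0.4 * min(dx / u, dx**2 / (2 * D))      # stable explicit time step
c = np.zeros(nx)                     # channel concentration
cs = np.zeros(nx)                    # storage-zone concentration
peak = 0.0

for step in range(int(3600 / dt)):           # one hour of simulation
    c[0] = 1.0 if step * dt < 600 else 0.0   # 10-minute pulse injection
    adv = -u * (c - np.roll(c, 1)) / dx      # upwind advection
    disp = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    exch = alpha * (cs - c)                  # exchange with the storage zone
    c_new = c + dt * (adv + disp + exch)
    cs += dt * alpha * area_ratio * (c - cs)
    c = c_new
    c[-1] = c[-2]                            # crude outflow boundary
    peak = max(peak, c[-1])

print(f"peak relative concentration at outlet: {peak:.3f}")
```

    The long, skewed tail this exchange term produces in the outlet breakthrough curve is exactly the signature that, as the abstract notes, a single tracer experiment cannot uniquely attribute to bar-scale versus bedform-scale storage without additional measurements.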

  10. The Effect of Mystery Shopper Reports on Age Verification for Tobacco Purchases

    PubMed Central

    KREVOR, BRAD S.; PONICKI, WILLIAM R.; GRUBE, JOEL W.; DeJONG, WILLIAM

    2011-01-01

    Mystery shops (MS) involving attempted tobacco purchases by young buyers have been employed to monitor retail stores’ performance in refusing underage sales. Anecdotal evidence suggests that MS visits with immediate feedback to store personnel can improve age verification. This study investigated the impact of monthly and twice-monthly MS reports on age verification. Forty-five Walgreens stores were each visited 20 times by mystery shoppers. The stores were randomly assigned to one of three conditions. Control group stores received no feedback, whereas two treatment groups received feedback communications every visit (twice monthly) or every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Post-baseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement than control group stores. Verification rates increased significantly during the study period for all three groups, with delayed improvement among control group stores. Communication between managers regarding the MS program may account for the delayed age-verification improvements observed in the control group stores. Encouraging inter-store communication might extend the benefits of MS programs beyond those stores that receive this intervention. PMID:21541874

  11. The effect of mystery shopper reports on age verification for tobacco purchases.

    PubMed

    Krevor, Brad S; Ponicki, William R; Grube, Joel W; DeJong, William

    2011-09-01

    Mystery shops involving attempted tobacco purchases by young buyers have been implemented in order to monitor retail stores' performance in refusing underage sales. Anecdotal evidence suggests that mystery shop visits with immediate feedback to store personnel can improve age verification. This study investigated the effect of monthly and twice-monthly mystery shop reports on age verification. Mystery shoppers visited 45 Walgreens stores 20 times. The stores were randomly assigned to 1 of 3 conditions. Control group stores received no feedback, whereas 2 treatment groups received feedback communications on every visit (twice monthly) or on every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Postbaseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement compared with the control group stores. Verification rates increased significantly during the study period for all 3 groups, with delayed improvement among control group stores. Communication between managers regarding the mystery shop program may account for the delayed age-verification improvements observed in the control group stores. Encouraging interstore communication might extend the benefits of mystery shop programs beyond those stores that receive this intervention. Copyright © Taylor & Francis Group, LLC

  12. Small organic molecule based flow battery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huskinson, Brian; Marshak, Michael; Aziz, Michael J.

    The invention provides an electrochemical cell based on a new chemistry for a flow battery for large-scale (e.g., grid-scale) electrical energy storage. Electrical energy is stored chemically at an electrochemical electrode by the protonation of small organic molecules called quinones to hydroquinones. The proton is provided by a complementary electrochemical reaction at the other electrode. These reactions are reversed to deliver electrical energy. A flow battery based on this concept can operate as a closed system. The flow battery architecture has scaling advantages over solid-electrode batteries for large-scale energy storage.

  13. Piezoelectric energy harvesting computer controlled test bench

    NASA Astrophysics Data System (ADS)

    Vázquez-Rodriguez, M.; Jiménez, F. J.; de Frutos, J.; Alonso, D.

    2016-09-01

    In this paper a new computer controlled (C.C.) laboratory test bench is presented. The patented test bench is made up of a C.C. road traffic simulator, C.C. electronic hardware involved in automating measurements, and a test bench control software interface programmed in LabVIEW™. Our research is focused on characterizing electronic energy harvesting piezoelectric-based elements in road traffic environments to extract (or "harvest") maximum power. In mechanical to electrical energy conversion, mechanical impacts or vibrational behavior are commonly used, and several major problems need to be solved to build optimal harvesting systems, including, but not limited to, primary energy source modeling, energy conversion, and energy storage. A novel C.C. test bench is described that obtains, in an accurate and automated process, a generalized linear equivalent electrical model of piezoelectric elements and piezoelectric-based energy-store harvesting circuits, in order to scale energy generation with multiple devices integrated in different topologies.

  14. Assessing hydrometeorological impacts with terrestrial and aerial Lidar data in Monterrey, México

    NASA Astrophysics Data System (ADS)

    Yepez Rincon, F.; Lozano Garcia, D.; Vela Coiffier, P.; Rivera Rivera, L.

    2013-10-01

    Light Detection and Ranging (Lidar) is an efficient tool to gather points reflected from a terrain and store them in an xyz coordinate system, allowing the generation of 3D data sets to manage geoinformation. Translation of these coordinates from an arbitrary system into a geographical base makes the data usable for calculating volumes and defining topographic characteristics at different scales. Lidar technological advancement in topographic mapping enables the generation of highly accurate and densely sampled elevation models, which are in high demand by many industries such as construction, mining and forestry. This study merges terrestrial and aerial Lidar data to evaluate the effectiveness of these tools in assessing volumetric changes in riverbeds and bridge scour after a hurricane event. The resulting information could improve hydrological and hydraulic models and aid authorities in decision making for construction, urban planning, and homeland security.

  15. Deep ART Neural Model for Biologically Inspired Episodic Memory and Its Application to Task Performance of Robots.

    PubMed

    Park, Gyeong-Moon; Yoo, Yong-Ho; Kim, Deok-Hwa; Kim, Jong-Hwan

    2018-06-01

    Robots are expected to perform smart services and to undertake various troublesome or difficult tasks in the place of humans. Since these human-scale tasks consist of a temporal sequence of events, robots need episodic memory to store and retrieve the sequences to perform the tasks autonomously in similar situations. As episodic memory, in this paper we propose a novel Deep adaptive resonance theory (ART) neural model and apply it to the task performance of the humanoid robot, Mybot, developed in the Robot Intelligence Technology Laboratory at KAIST. Deep ART has a deep structure to learn events, episodes, and even higher-level sequences such as daily episodes. Moreover, it can retrieve the correct episode from partial input cues robustly. To demonstrate the effectiveness and applicability of the proposed Deep ART, experiments are conducted with the humanoid robot, Mybot, for performing the three tasks of arranging toys, making cereal, and disposing of garbage.

  16. From cosmos to connectomes: the evolution of data-intensive science.

    PubMed

    Burns, Randal; Vogelstein, Joshua T; Szalay, Alexander S

    2014-09-17

    The analysis of data requires computation: originally by hand and more recently by computers. Different models of computing are designed and optimized for different kinds of data. In data-intensive science, the scale and complexity of data exceeds the comfort zone of local data stores on scientific workstations. Thus, cloud computing emerges as the preeminent model, utilizing data centers and high-performance clusters, enabling remote users to access and query subsets of the data efficiently. We examine how data-intensive computational systems originally built for cosmology, the Sloan Digital Sky Survey (SDSS), are now being used in connectomics, at the Open Connectome Project. We list lessons learned and outline the top challenges we expect to face. Success in computational connectomics would drastically reduce the time between idea and discovery, as SDSS did in cosmology. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Piezoelectric energy harvesting computer controlled test bench.

    PubMed

    Vázquez-Rodriguez, M; Jiménez, F J; de Frutos, J; Alonso, D

    2016-09-01

    In this paper a new computer controlled (C.C.) laboratory test bench is presented. The patented test bench is made up of a C.C. road traffic simulator, C.C. electronic hardware involved in automating measurements, and a test bench control software interface programmed in LabVIEW™. Our research is focused on characterizing electronic energy harvesting piezoelectric-based elements in road traffic environments to extract (or "harvest") maximum power. In mechanical to electrical energy conversion, mechanical impacts or vibrational behavior are commonly used, and several major problems need to be solved to build optimal harvesting systems, including, but not limited to, primary energy source modeling, energy conversion, and energy storage. A novel C.C. test bench is described that obtains, in an accurate and automated process, a generalized linear equivalent electrical model of piezoelectric elements and piezoelectric-based energy-store harvesting circuits, in order to scale energy generation with multiple devices integrated in different topologies.

  18. Practical modeling approaches for geological storage of carbon dioxide.

    PubMed

    Celia, Michael A; Nordbotten, Jan M

    2009-01-01

    The relentless increase of anthropogenic carbon dioxide emissions and the associated concerns about climate change have motivated new ideas about carbon-constrained energy production. One technological approach to control carbon dioxide emissions is carbon capture and storage, or CCS. The underlying idea of CCS is to capture the carbon before it is emitted to the atmosphere and store it somewhere other than the atmosphere. Currently, the most attractive option for large-scale storage is in deep geological formations, including deep saline aquifers. Many physical and chemical processes can affect the fate of the injected CO2, with the overall mathematical description of the complete system becoming very complex. Our approach to the problem has been to reduce complexity as much as possible, so that we can focus on the few truly important questions about the injected CO2, most of which involve leakage out of the injection formation. Toward this end, we have established a set of simplifying assumptions that allow us to derive simplified models, which can be solved numerically or, for the most simplified cases, analytically. These simplified models allow calculation of solutions to large-scale injection and leakage problems in ways that traditional multicomponent multiphase simulators cannot. Such simplified models provide important tools for system analysis, screening calculations, and overall risk-assessment calculations. We believe this is a practical and important approach to modeling geological storage of carbon dioxide. It also serves as an example of how complex systems can be simplified while retaining the essential physics of the problem.

  19. Interannual variation of carbon fluxes from three contrasting evergreen forests: the role of forest dynamics and climate.

    PubMed

    Sierra, Carlos A; Loescher, Henry W; Harmon, Mark E; Richardson, Andrew D; Hollinger, David Y; Perakis, Steven S

    2009-10-01

    Interannual variation of carbon fluxes can be attributed to a number of biotic and abiotic controls that operate at different spatial and temporal scales. Type and frequency of disturbance, forest dynamics, and climate regimes are important sources of variability. Assessing the variability of carbon fluxes from these specific sources can enhance the interpretation of past and current observations. Being able to separate the variability caused by forest dynamics from that induced by climate will also give us the ability to determine whether the current observed carbon fluxes are within an expected range or whether the ecosystem is undergoing unexpected change. Sources of interannual variation in ecosystem carbon fluxes from three evergreen ecosystems, a tropical, a temperate coniferous, and a boreal forest, were explored using the simulation model STANDCARB. We identified key processes that introduced variation in annual fluxes, but their relative importance differed among the ecosystems studied. In the tropical site, intrinsic forest dynamics contributed approximately 30% of the total variation in annual carbon fluxes. In the temperate and boreal sites, where many forest processes occur over longer temporal scales than those at the tropical site, climate controlled more of the variation among annual fluxes. These results suggest that climate-related variability affects the rates of carbon exchange differently among sites. Simulations in which temperature, precipitation, and radiation varied from year to year (based on historical records of climate variation) had lower net carbon stores than simulations in which these variables were held constant (based on historical records of monthly average climate), a result caused by the functional relationship between temperature and respiration. This suggests that, under a more variable temperature regime, large respiratory pulses may become more frequent and high enough to cause a reduction in ecosystem carbon stores. Our results also show that the variation of annual carbon fluxes poses an important challenge to our ability to determine whether an ecosystem is a source, a sink, or neutral in regard to CO2 at longer timescales. In simulations where climate change negatively affected ecosystem carbon stores, there was a 20% chance of committing a Type II error, even with 20 years of sequential data.

  20. Ecosystem growth and development.

    PubMed

    Fath, Brian D; Jørgensen, Sven E; Patten, Bernard C; Straskraba, Milan

    2004-11-01

    One of the most important features of biosystems is how they are able to maintain local order (low entropy) within their system boundaries. At the ecosystem scale, this organization can be observed in the thermodynamic parameters that describe it, such that these parameters can be used to track ecosystem growth and development during succession. Thermodynamically, ecosystem growth is the increase of energy throughflow and stored biomass, and ecosystem development is the internal reorganization of these energy and mass stores, which affects transfers, transformations, and time lags within the system. Several proposed hypotheses describe thermodynamically the orientation or natural tendency that ecosystems follow during succession, and here, we consider five: minimize specific entropy production, maximize dissipation, maximize exergy storage (includes biomass and information), maximize energy throughflow, and maximize retention time. These thermodynamic orientors were previously all shown to occur to some degree during succession, and here we present a refinement by observing them during different stages of succession. We view ecosystem succession as a series of four growth and development stages: boundary, structural, network, and informational. We demonstrate how each of these ecological thermodynamic orientors behaves during the different growth and development stages, and show that while all apply during some stages, only maximizing energy throughflow and maximizing exergy storage are applicable during all four stages. Therefore, we conclude that the movement away from thermodynamic equilibrium, and the subsequent increase in organization during ecosystem growth and development, is a result of system components and configurations that maximize the flux of useful energy and the amount of stored exergy. Empirical data and theoretical models support these conclusions.

  1. Storage filters upland suspended sediment signals delivered from watersheds

    USGS Publications Warehouse

    Pizzuto, James E.; Keeler, Jeremy; Skalak, Katherine; Karwan, Diana

    2017-01-01

    Climate change, tectonics, and humans create long- and short-term temporal variations in the supply of suspended sediment to rivers. These signals, generated in upland erosional areas, are filtered by alluvial storage before reaching the basin outlet. We quantified this filter using a random walk model driven by sediment budget data, a power-law distributed probability density function (PDF) to determine how long sediment remains stored, and a constant downstream drift velocity during transport of 157 km/yr. For 25 km of transport, few particles are stored, and the median travel time is 0.2 yr. For 1000 km of transport, nearly all particles are stored, and the median travel time is 2.5 m.y. Both travel-time distributions are power laws. The 1000 km travel-time distribution was then used to filter sinusoidal input signals with periods of 10 yr and 104 yr. The 10 yr signal is delayed by 12.5 times its input period, damped by a factor of 380, and is output as a power law. The 104 yr signal is delayed by 0.15 times its input period, damped by a factor of 3, and the output signal retains its sinusoidal input form (but with a power-law “tail”). Delivery time scales for these two signals are controlled by storage; in-channel transport time is insignificant, and low-frequency signals are transmitted with greater fidelity than high-frequency signals. These signal modifications are essential to consider when evaluating watershed restoration schemes designed to control sediment loading, and where source-area geomorphic processes are inferred from the geologic record.
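    A Monte Carlo sketch of the random-walk idea makes the filtering intuition concrete: particles drift at the stated 157 km/yr while in transport and accumulate power-law distributed storage delays along the way. The hop spacing, storage probability, and Pareto parameters below are illustrative assumptions, not the paper's calibrated sediment-budget values, so the medians differ from those reported:

```python
import numpy as np

rng = np.random.default_rng(3)
drift_km_yr = 157.0          # in-channel drift velocity (from the abstract)
hop_km = 1.0                 # assumed spacing between potential storage episodes
p_store = 0.05               # assumed probability of entering storage per hop
tail_exponent = 1.5          # assumed Pareto tail of storage durations
t_min_yr = 0.01              # assumed minimum storage duration (yr)

def travel_times(distance_km, n_particles=5000):
    """Total travel time = transport time + sum of power-law storage delays."""
    n_hops = int(distance_km / hop_km)
    transport = distance_km / drift_km_yr
    stored = rng.random((n_particles, n_hops)) < p_store
    durations = t_min_yr * (1.0 + rng.pareto(tail_exponent, (n_particles, n_hops)))
    return transport + (stored * durations).sum(axis=1)

for d in (25.0, 1000.0):
    t = travel_times(d)
    print(f"{d:6.0f} km: median travel time {np.median(t):.2g} yr")
```

    Because the storage-time distribution is heavy-tailed, longer transport distances accumulate ever more (and ever longer) storage episodes, which is why the model damps and delays high-frequency upland signals far more strongly than low-frequency ones.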

  2. The association of point-of-sale cigarette marketing with cravings to smoke: results from a cross-sectional population-based study

    PubMed Central

    Siahpush, Mohammad; Shaikh, Raees A; Cummings, K Michael; Hyland, Andrew; Dodd, Michael; Carlson, Les; Kessler, Asia Sikora; Meza, Jane; Wan, Neng; Wakefield, Melanie

    2015-01-01

    Objective To examine the association between recalled exposure to point-of-sale (POS) cigarette marketing (ie, pack displays, advertisements and promotions such as discounts) and reported cravings to smoke while visiting a store. Methods Data were collected using a telephone survey of a cross-sectional sample of 999 adult smokers in Omaha, Nebraska. Recalled exposure to POS cigarette marketing was measured by asking respondents about noticing (a) pack displays, (b) advertisements and (c) promotions in store in their neighbourhood. A 3-item scale indicating the frequency of experiencing cravings to smoke in locations where cigarettes are sold was created by asking respondents: (1) “feel a craving for a cigarette?” (2) “feel like nothing would be better than smoking a cigarette?” and (3) “feel like all you want is a cigarette?” The association between recalled exposure to POS cigarette marketing and cravings was estimated using ordinary least squares linear regression models, controlling for nicotine dependence, gender, age, race/ethnicity, income, education, frequency of visiting stores in one’s neighbourhood and method of recruitment into the study. Results Recalled exposure to POS cigarette displays (p<0.001) and advertisements (p=0.002), but not promotions (p=0.06), was associated with more frequent cravings to smoke. Conclusions Recalled exposure to POS cigarette marketing is associated with cravings to smoke as predicted by laboratory studies on the effects of smoking cues on cigarette craving. Policies that reduce or eliminate POS cigarette marketing could reduce cigarette cravings and might attenuate impulse buying of cigarettes. PMID:26024797

  3. From Fibrils to Toughness: Multi-Scale Mechanics of Fibrillating Interfaces in Stretchable Electronics

    PubMed Central

    van der Sluis, Olaf; Vossen, Bart; Geers, Marc

    2018-01-01

    Metal-elastomer interfacial systems, often encountered in stretchable electronics, demonstrate remarkably high interface fracture toughness values. Evidently, a large gap exists between the rather small adhesion energy levels at the microscopic scale ('intrinsic adhesion') and the large measured macroscopic work-of-separation. This energy gap is closed here by unravelling the underlying dissipative mechanisms through a systematic numerical/experimental multi-scale approach. This self-contained contribution collects and reviews previously published results and addresses the remaining open questions by providing new and independent results obtained from an alternative experimental set-up. In particular, the experimental studies on Cu-PDMS (Poly(dimethylsiloxane)) samples conclusively reveal the essential role of fibrillation mechanisms at the micro-meter scale during the metal-elastomer delamination process. The micro-scale numerical analyses on single and multiple fibrils show that the dynamic release of the stored elastic energy by multiple fibril fracture, including the interaction with the adjacent deforming bulk PDMS and its highly nonlinear behaviour, provide a mechanistic understanding of the high work-of-separation. An experimentally validated quantitative relation between the macroscopic work-of-separation and peel front height is established from the simulation results. Finally, it is shown that a micro-mechanically motivated shape of the traction-separation law in cohesive zone models is essential to describe the delamination process in fibrillating metal-elastomer systems in a physically meaningful way. PMID:29393908

  4. Memory reduction through higher level language hardware

    NASA Technical Reports Server (NTRS)

    Kerner, H.; Gellman, L.

    1972-01-01

    Application of large scale integration in computers to reduce size and manufacturing costs and to produce improvements in logic function is discussed. Use of FORTRAN 4 as computer language for this purpose is described. Effectiveness of method in storing information is illustrated.

  5. Validation of Pacific Northwest hydrologic landscapes at the catchment scale

    EPA Science Inventory

    The interaction between the physical properties of a catchment (form) and climatic forcing of precipitation and energy control how water is partitioned, stored, and conveyed through a catchment (function). Hydrologic Landscapes (HLs) were previously developed across Oregon and de...

  6. Genome-to-Watershed Predictive Understanding of Terrestrial Environments

    NASA Astrophysics Data System (ADS)

    Hubbard, S. S.; Agarwal, D.; Banfield, J. F.; Beller, H. R.; Brodie, E.; Long, P.; Nico, P. S.; Steefel, C. I.; Tokunaga, T. K.; Williams, K. H.

    2014-12-01

    Although terrestrial environments play a critical role in cycling water, greenhouse gases, and other life-critical elements, the complexity of interactions among component microbes, plants, minerals, migrating fluids and dissolved constituents hinders predictive understanding of system behavior. The 'Sustainable Systems 2.0' project is developing genome-to-watershed scale predictive capabilities to quantify how the microbiome affects biogeochemical watershed functioning, how watershed-scale hydro-biogeochemical processes affect microbial functioning, and how these interactions co-evolve with climate and land-use changes. Development of such predictive capabilities is critical for guiding the optimal management of water resources, contaminant remediation, carbon stabilization, and agricultural sustainability - now and with global change. Initial investigations are focused on floodplains in the Colorado River Basin, and include iterative model development, experiments and observations with an early emphasis on subsurface aspects. Field experiments include local-scale experiments at Rifle, CO, to quantify spatiotemporal metabolic and geochemical responses to O2 and nitrate amendments, as well as floodplain-scale monitoring to quantify genomic and biogeochemical responses to natural hydrological perturbations. Information obtained from such experiments is represented within GEWaSC, a Genome-Enabled Watershed Simulation Capability, which is being developed to allow mechanistic interrogation of how genomic information stored in a subsurface microbiome affects biogeochemical cycling. This presentation will describe the genome-to-watershed scale approach as well as early highlights associated with the project. Highlights include: first insights into the diversity of the subsurface microbiome and the metabolic roles of organisms involved in subsurface nitrogen, sulfur, hydrogen, and carbon cycling; the extreme variability of subsurface DOC and hydrological controls on carbon and nitrogen cycling; geophysical identification of floodplain hotspots that are useful for model parameterization; and GEWaSC demonstration of how incorporation of identified microbial metabolic processes improves prediction of the larger system's biogeochemical behavior.

  7. When GIS zooms in: spatio-genetic maps of multipaternity in Armadillidium vulgare.

    PubMed

    Bech, Nicolas; Depeux, Charlotte; Durand, Sylvine; Debenest, Catherine; Lafitte, Alexandra; Beltran-Bech, Sophie

    2017-12-01

    Geographic information system (GIS) tools are designed to illustrate, analyse and integrate geographic or spatial data, usually on a macroscopic scale. By contrast, genetic tools focus on a microscopic scale. Because in reality landscapes have no predefined scale, our original study aims to develop a new approach, combining both cartographic and genetic approaches, to explore microscopic landscapes. For this, we focused on Armadillidium vulgare, a terrestrial isopod model in which evolutionary pressures imposed by terrestrial life have led to the development of internal fertilisation and, consequently, to associated physiological changes. Among these, the emergence of internal receptacles, found in many taxa ranging from mammals to arthropods, allowed females to store sperm from several partners, enabling multipaternity. Among arthropods, terrestrial isopods like the polygynandrous A. vulgare present a female structure, the marsupium, in which fertilised eggs migrate and develop into mancae (larval stage). To test our innovative combined approach, we presented different males to four independent females and, at the end of incubation in the marsupium, we mapped (using GIS methods) and genotyped (using 12 microsatellite markers) all the incubated mancae. This methodology made it possible to obtain spatio-genetic maps describing the heterozygosity and spatial distribution of mancae, and of multipaternity, within the marsupial landscape. We discuss the value of this kind of multidisciplinary approach, which in this case could improve our understanding of sexual selection mechanisms in this terrestrial crustacean. Beyond the interesting model-focused insights, the main challenge of this study was the transfer of GIS techniques to a microscopic scale, and our results are pioneering in rendering GIS tools available for studies involving imagery, whatever their scale.

  8. Topsoil N-budget model in orchard farming to evaluate groundwater nitrate contamination

    NASA Astrophysics Data System (ADS)

    Wijayanti, Yureana; Budihardjo, Kadarwati; Sakamoto, Yasushi; Setyandito, Oki

    2017-12-01

    A small-scale field study was conducted in an orchard farming area in Kofu, Japan, where nitrate contamination was found in groundwater. The purpose of assessing nitrate leaching in this study is to understand the transformation and transport of the N-source in topsoil that leads to nitrate contamination of groundwater. To calculate the N-budget in the soil, a model was used to predict nitrogen leaching. In this research, the N-budget model was modified to evaluate the influence of precipitation and of the application pattern of fertilizer and manure compost. The results show that, by the time of the next addition of manure compost and fertilizer, about 75% of the applied fertilizer N has leached from the topsoil. Every month, the average fractions of nitrate remaining in the soil from fertilizer and manure compost are 22% and 50%, respectively. The accumulation of this monthly manure compost nitrate stored in the soil should be carefully monitored, as it could become a potential source of nitrate leaching to groundwater in the future.
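    A minimal monthly mass balance reproduces the bookkeeping implied above, using the quoted retention fractions (22% of fertilizer nitrate and 50% of manure compost nitrate remaining each month); the application rates are illustrative assumptions:

```python
def monthly_n_budget(months, fert_n=10.0, manure_n=8.0):
    """Track topsoil nitrate pools and leaching (kg N/ha) from two sources."""
    fert_pool = manure_pool = 0.0
    rows = []
    for m in range(1, months + 1):
        fert_avail = fert_pool + fert_n        # carry-over plus fresh application
        manure_avail = manure_pool + manure_n
        fert_pool = 0.22 * fert_avail          # 22% of fertilizer N remains
        manure_pool = 0.50 * manure_avail      # 50% of manure compost N remains
        leached = 0.78 * fert_avail + 0.50 * manure_avail
        rows.append((m, fert_pool, manure_pool, leached))
    return rows

for m, f, mn, leach in monthly_n_budget(12):
    print(f"month {m:2d}: fert pool {f:5.2f}, manure pool {mn:5.2f}, "
          f"leached {leach:5.2f} kg N/ha")
```

    Running this shows the fertilizer pool settling quickly to a small steady state while the manure pool builds toward a larger one, which is why the abstract singles out the accumulating manure compost nitrate as the store to monitor.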

  9. Large Survey Database: A Distributed Framework for Storage and Analysis of Large Datasets

    NASA Astrophysics Data System (ADS)

    Juric, Mario

    2011-01-01

    The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to >10^2 nodes, and can be made to function in "shared nothing" architectures. An LSD database consists of a set of vertically and horizontally partitioned tables, physically stored as compressed HDF5 files. Vertically, we partition the tables into groups of related columns ('column groups'), storing together logically related data (e.g., astrometry, photometry). Horizontally, the tables are partitioned into partially overlapping 'cells' by position in space (lon, lat) and time (t). This organization allows for fast lookups based on spatial and temporal coordinates, as well as data and task distribution. The design was inspired by the success of Google BigTable (Chang et al., 2006). Our programming model is a pipelined extension of MapReduce (Dean and Ghemawat, 2004). An SQL-like query language is used to access data. For complex tasks, map-reduce 'kernels' that operate on query results on a per-cell basis can be written, with the framework taking care of scheduling and execution. The combination leverages users' familiarity with SQL, while offering a fully distributed computing environment. LSD adds little overhead compared to direct Python file I/O. In tests, we swept through 1.1 G rows of Pan-STARRS+SDSS data (220 GB) in less than 15 minutes on a dual-CPU machine. In a cluster environment, we achieved bandwidths of 17 Gbit/s (I/O limited). Based on current experience, we believe LSD should scale to be useful for analysis and storage of LSST-scale datasets. It can be downloaded from http://mwscience.net/lsd.
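    A toy version of the horizontal partitioning scheme shows why positional queries are fast: rows are bucketed by a (lon, lat, epoch) cell key, and a box query only scans the overlapping cells. The cell sizes and row schema below are arbitrary illustrative choices, not LSD's actual layout:

```python
import collections

CELL_DEG = 5.0           # spatial cell size (degrees); illustrative
CELL_DAYS = 365.0        # temporal cell size (days of MJD); illustrative

def cell_key(lon, lat, mjd):
    return (int(lon // CELL_DEG), int(lat // CELL_DEG), int(mjd // CELL_DAYS))

table = collections.defaultdict(list)    # cell key -> rows (stand-in for HDF5 files)

def insert(row):
    table[cell_key(row["lon"], row["lat"], row["mjd"])].append(row)

def query_box(lon0, lon1, lat0, lat1):
    """Scan only the cells whose extent overlaps the search box."""
    out = []
    for (i, j, k), rows in table.items():
        if (lon0 <= (i + 1) * CELL_DEG and lon1 >= i * CELL_DEG
                and lat0 <= (j + 1) * CELL_DEG and lat1 >= j * CELL_DEG):
            out.extend(r for r in rows
                       if lon0 <= r["lon"] < lon1 and lat0 <= r["lat"] < lat1)
    return out

insert({"lon": 120.3, "lat": -5.2, "mjd": 55800.0, "mag": 17.2})
print(len(query_box(118.0, 122.0, -8.0, -3.0)))   # -> 1
```

    Because cell keys also determine physical file placement, the same bucketing that prunes a query doubles as the unit of work for the map-reduce kernels described above.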

  10. Method to predict external store carriage characteristics at transonic speeds

    NASA Technical Reports Server (NTRS)

    Rosen, Bruce S.

    1988-01-01

    Development of a computational method for prediction of external store carriage characteristics at transonic speeds is described. The geometric flexibility required for treatment of pylon-mounted stores is achieved by computing finite difference solutions on a five-level embedded grid arrangement. A completely automated grid generation procedure facilitates applications. Store modeling capability consists of bodies of revolution with multiple fore and aft fins. A body-conforming grid improves the accuracy of the computed store body flow field. A nonlinear relaxation scheme developed specifically for modified transonic small disturbance flow equations enhances the method's numerical stability and accuracy. As a result, treatment of lower aspect ratio, more highly swept and tapered wings is possible. A limited supersonic freestream capability is also provided. Pressure, load distribution, and force/moment correlations show good agreement with experimental data for several test cases. A detailed computer program description for the Transonic Store Carriage Loads Prediction (TSCLP) Code is included.

  11. Enhancing CIDOC-CRM and compatible models with the concept of multiple interpretation

    NASA Astrophysics Data System (ADS)

    Van Ruymbeke, M.; Hallot, P.; Billen, R.

    2017-08-01

    Modelling cultural heritage and archaeological objects serves management as much as research purposes. To ensure the sustainable benefit of digital data, models should take the data specificities of the historical and archaeological domains into account. Starting from a conceptual model tailored to storing these specificities, we present, in this paper, an extended mapping to CIDOC-CRM and its compatible models. Offering an ideal framework to structure and highlight best modelling practices, these ontologies are essentially dedicated to storing semantic data which provides information about cultural heritage objects. Based on this standard, our proposal focuses on multiple interpretation and sequential reality.

  12. Fabrication and optimization of a conducting polymer sensor array using stored grain model volatiles.

    PubMed

    Hossain, Md Eftekhar; Rahman, G M Aminur; Freund, Michael S; Jayas, Digvir S; White, Noel D G; Shafai, Cyrus; Thomson, Douglas J

    2012-03-21

    During storage, grain can experience significant degradation in quality due to a variety of physical, chemical, and biological interactions. Most commonly, these losses are associated with insects or fungi. Continuous monitoring and an ability to differentiate between sources of spoilage are critical for rapid and effective intervention to minimize deterioration or losses. Therefore, there is a keen interest in developing a straightforward, cost-effective, and efficient method for monitoring of stored grain. Sensor arrays are currently used for classifying liquors, perfumes, and the quality of food products by mimicking the mammalian olfactory system. The use of this technology for monitoring of stored grain and identification of the source of spoilage is a new application, which has the potential for broad impact. The main focus of the work described herein is on the fabrication and optimization of a carbon black (CB) polymer sensor array to monitor stored grain model volatiles associated with insect secretions (benzene derivatives) and fungi (aliphatic hydrocarbon derivatives). Various methods of statistical analysis (RSD, PCA, LDA, t-test) were used to select polymers for the array that were optimal for distinguishing between important compound classes (quinones, alcohols) and to minimize the sensitivity to other parameters such as humidity. The performance of the developed sensor array was sufficient to demonstrate identification and separation of stored grain model volatiles at ambient conditions.
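    As a hedged sketch of the selection step, the Python example below uses PCA to summarize the variance structure of a candidate array and LDA to rank each sensor by how well it alone separates two volatile classes. The synthetic response matrix and the scoring criterion are illustrative stand-ins for the paper's RSD/PCA/LDA/t-test procedure.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Hypothetical responses: rows = exposures, columns = candidate
      # CB-polymer sensors; labels mark the class (0 = quinones, 1 = alcohols).
      rng = np.random.default_rng(0)
      X = rng.normal(size=(40, 6))
      X[20:, :3] += 1.5            # pretend the first 3 polymers discriminate
      y = np.repeat([0, 1], 20)

      # PCA: overall variance structure of the candidate array.
      pca = PCA(n_components=2).fit(X)
      print("variance explained:", np.round(pca.explained_variance_ratio_, 2))

      # LDA: rank each candidate sensor by single-sensor separability.
      lda = LinearDiscriminantAnalysis()
      acc = [lda.fit(X[:, [j]], y).score(X[:, [j]], y) for j in range(X.shape[1])]
      print("per-sensor separability:", np.round(acc, 2))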

  13. A petabyte size electronic library using the N-Gram memory engine

    NASA Technical Reports Server (NTRS)

    Bugajski, Joseph M.

    1993-01-01

    A model library containing petabytes of data is proposed by Triada, Ltd., Ann Arbor, Michigan. The library uses the newly patented N-Gram Memory Engine (Neurex) for storage, compression, and retrieval. Neurex splits data into two parts: a hierarchical network of associative memories that store 'information' from data and a permutation operator that preserves sequence. Neurex is expected to offer four advantages in mass storage systems. Neurex representations are dense and fully reversible, hence less expensive to store. Neurex becomes exponentially more stable with increasing data flow; thus its contents and the inverting algorithm may be mass produced for low-cost distribution. Only a small permutation operator would be recalled from the library to recover data. Neurex may be enhanced to recall patterns using a partial pattern. Neurex nodes are measures of their pattern. Researchers might use nodes in statistical models to avoid costly sorting and counting procedures. Neurex subsumes a theory of learning and memory that the author believes extends information theory. Its first axiom is a symmetry principle: learning creates memory and memory evidences learning. The theory treats an information store that evolves from a null state to stationarity. A Neurex extracts information from data without a priori knowledge; i.e., unlike neural networks, neither feedback nor training is required. The model consists of an energetically conservative field of uniformly distributed events with variable spatial and temporal scale, and an observer walking randomly through this field. A bank of band-limited transducers (an 'eye'), each transducer in a bank being tuned to a sub-band, outputs signals upon registering events. Output signals are 'observed' by another transducer bank (a mid-brain), except that the band limit of the second bank is narrower than that of the first. The banks are arrayed as n 'levels' or 'time domains' (td). The banks form the hierarchical network (a cortex) and the transducers are (associative) memories. A model Neurex was built and studied. Data were 50 MB to 10 GB samples of text, databases, and images: black/white, grey scale, and high resolution in several spectral bands. Memories at td, S(m_td), were plotted against outputs of memories at td-1. S(m_td) was Boltzmann distributed, and memory frequencies exhibited self-organized criticality (SOC), i.e., '1/f^beta' behavior after long exposures to data. Whereas output signals from level n may be encoded with B_output = O(-log2 f^beta) bits, input data require B_input = O((S(td)/S(td-1))^n) bits; since B_output/B_input is always much less than 1, the Neurex determines a canonical code for data and is a lossless data compressor. Further tests are underway to confirm these results with more data types and larger samples.

  14. Carbon storage in US wetlands

    PubMed Central

    Nahlik, A. M.; Fennessy, M. S.

    2016-01-01

    Wetland soils contain some of the highest stores of soil carbon in the biosphere. However, there is little understanding of the quantity and distribution of carbon stored in our remaining wetlands or of the potential effects of human disturbance on these stocks. Here we use field data from the 2011 National Wetland Condition Assessment to provide unbiased estimates of soil carbon stocks for wetlands at regional and national scales. We find that wetlands in the conterminous United States store a total of 11.52 PgC, much of which is within soils deeper than 30 cm. Freshwater inland wetlands, in part due to their substantial areal extent, hold nearly ten-fold more carbon than tidal saltwater sites—indicating their importance in regional carbon storage. Our data suggest a possible relationship between carbon stocks and anthropogenic disturbance. These data highlight the need to protect wetlands to mitigate the risk of avoidable contributions to climate change. PMID:27958272

  15. Carbon storage in US wetlands

    NASA Astrophysics Data System (ADS)

    Nahlik, A. M.; Fennessy, M. S.

    2016-12-01

    Wetland soils contain some of the highest stores of soil carbon in the biosphere. However, there is little understanding of the quantity and distribution of carbon stored in our remaining wetlands or of the potential effects of human disturbance on these stocks. Here we use field data from the 2011 National Wetland Condition Assessment to provide unbiased estimates of soil carbon stocks for wetlands at regional and national scales. We find that wetlands in the conterminous United States store a total of 11.52 PgC, much of which is within soils deeper than 30 cm. Freshwater inland wetlands, in part due to their substantial areal extent, hold nearly ten-fold more carbon than tidal saltwater sites--indicating their importance in regional carbon storage. Our data suggest a possible relationship between carbon stocks and anthropogenic disturbance. These data highlight the need to protect wetlands to mitigate the risk of avoidable contributions to climate change.

  16. Fabrication of two one-fifth scale fiber composite flywheels

    NASA Astrophysics Data System (ADS)

    1980-04-01

    Two fiber composite flywheel rotors were fabricated. These flywheels are scaled down versions of two automotive flywheels built for Sandia Laboratories. The flywheels store a maximum of 0.97 kWh of kinetic energy. Energy density is 87 Wh/kg at the maximum operating speed of 56,000 rpm. The nominal flywheel dimensions are 35 cm diameter by 13 cm axial height. The weight of the flywheel rotor assembly is 11.16 kg.
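    The quoted figures are mutually consistent; a quick check of the energy density, derived from the numbers above:

      \frac{E}{m} \;=\; \frac{0.97\ \mathrm{kWh}}{11.16\ \mathrm{kg}}
                  \;=\; \frac{970\ \mathrm{Wh}}{11.16\ \mathrm{kg}}
                  \;\approx\; 87\ \mathrm{Wh/kg}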

  17. Reducing a congestion with introduce the greedy algorithm on traffic light control

    NASA Astrophysics Data System (ADS)

    Catur Siswipraptini, Puji; Hendro Martono, Wisnu; Hartanti, Dian

    2018-03-01

    Vehicle density causes congestion at every junction in the city of Jakarta because the traffic light timing system is static or manually set, so the queue length at each junction is unpredictable. This research aimed to design a sensor-based traffic system that detects vehicle queue length in order to optimize the duration of the green light. Infrared sensors placed along each intersection approach detect the queue length, and a greedy algorithm is then applied to extend the green-light duration for the approach that needs it; the traffic light control program based on the greedy algorithm is stored on an Arduino Mega 2560 microcontroller. When the developed system applies the greedy algorithm with the help of the infrared sensors, it extends the green-light duration for long vehicle queues and shortens it at intersections whose queues are less dense. Furthermore, a physical scale model (simple simulator, hereafter "scale model of simulator") of the intersection was built to mimic the actual situation and then tested. The infrared sensors on the scale model are placed 10 cm apart along each approach and serve as queue detectors. Tests on the scale model showed that longer queues received longer green-light times, addressing the problem of long vehicle queues. The greedy algorithm adds 2 seconds of green time on approaches whose queues reach at least the third sensor level, and speeds up the cycle at other intersections whose queue sensor levels are below level three.
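    A minimal Python sketch of the greedy allocation described above, with the base green time as an assumed constant (the abstract reports the 2-second extension and the third-sensor-level threshold):

      BASE_GREEN = 10      # seconds of green every approach receives (assumed)
      BONUS = 2            # extra seconds for a long queue (from the abstract)
      THRESHOLD = 3        # sensor level that counts as a "long" queue

      def next_phase(queue_levels):
          """Greedy choice: serve the approach with the longest queue first,
          extending its green time when the queue reaches the threshold."""
          approach = max(queue_levels, key=queue_levels.get)
          green = BASE_GREEN + (BONUS if queue_levels[approach] >= THRESHOLD else 0)
          return approach, green

      print(next_phase({"north": 3, "south": 1, "east": 2, "west": 0}))
      # -> ('north', 12): the dense approach receives the 2-second bonus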

  18. Grocery Shopping: How Individuals and Built Environments Influence Choice of Travel Mode.

    PubMed

    Jiao, Junfeng; Moudon, Anne Vernez; Drewnowski, Adam

    This research investigated the influences of socioeconomic characteristics of individual travelers and of the environments where the travelers live and shop on choice of travel mode for grocery shopping. The data on travel for grocery shopping came from 2,001 respondents to the 2009 Seattle Obesity Study survey in King County, Washington. Eighty-eight percent of the respondents drove to their grocery stores, whereas 12% used transit or taxis, walked, biked, or carpooled. The addresses of 1,994 homes and 1,901 primary grocery stores used by respondents were geographically coded. The characteristics of built environments in the neighborhoods around homes and grocery stores and the distances between those homes and stores were measured in a geographic information system. Four binary logistic models estimated the impact of individual socioeconomic characteristics, distance, and built environments around homes and grocery stores on the travel mode used for grocery shopping. Fourteen variables were significantly related to mode choice. The strongest predictors of driving to the grocery store were more cars per adult household member, more adults per household, living in a single-family house, longer distances between homes and grocery stores (both the stores used and the nearest stores), and more at-ground parking around the grocery store used. Higher street density, more quick-service restaurants around homes, and more nonchain grocery stores near the primary grocery store used were related to not driving. Results suggested that reductions of distances between homes and grocery stores, clustering of grocery stores and other food establishments, and reductions in the amount of the parking around them could lead to less driving for grocery shopping.
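    As a hedged sketch of the kind of binary logit model the study estimates, the following Python example (statsmodels) fits driving choice against three illustrative predictors. The variables, coefficients, and simulated data are stand-ins, not the study's dataset.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 500
      cars_per_adult = rng.uniform(0, 2, n)
      distance_km = rng.exponential(2.0, n)
      street_density = rng.uniform(0, 1, n)

      # Simulate the reported direction of effects: driving rises with car
      # ownership and distance, falls with dense street networks.
      logit = cars_per_adult + 0.8 * distance_km - 1.5 * street_density - 0.5
      drive = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

      X = sm.add_constant(np.column_stack([cars_per_adult, distance_km,
                                           street_density]))
      print(sm.Logit(drive, X).fit(disp=0).summary())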

  19. Grocery Shopping: How Individuals and Built Environments Influence Choice of Travel Mode

    PubMed Central

    Jiao, Junfeng; Moudon, Anne Vernez; Drewnowski, Adam

    2014-01-01

    This research investigated the influences of socioeconomic characteristics of individual travelers and of the environments where the travelers live and shop on choice of travel mode for grocery shopping. The data on travel for grocery shopping came from 2,001 respondents to the 2009 Seattle Obesity Study survey in King County, Washington. Eighty-eight percent of the respondents drove to their grocery stores, whereas 12% used transit or taxis, walked, biked, or carpooled. The addresses of 1,994 homes and 1,901 primary grocery stores used by respondents were geographically coded. The characteristics of built environments in the neighborhoods around homes and grocery stores and the distances between those homes and stores were measured in a geographic information system. Four binary logistic models estimated the impact of individual socioeconomic characteristics, distance, and built environments around homes and grocery stores on the travel mode used for grocery shopping. Fourteen variables were significantly related to mode choice. The strongest predictors of driving to the grocery store were more cars per adult household member, more adults per household, living in a single-family house, longer distances between homes and grocery stores (both the stores used and the nearest stores), and more at-ground parking around the grocery store used. Higher street density, more quick-service restaurants around homes, and more nonchain grocery stores near the primary grocery store used were related to not driving. Results suggested that reductions of distances between homes and grocery stores, clustering of grocery stores and other food establishments, and reductions in the amount of the parking around them could lead to less driving for grocery shopping. PMID:25729127

  20. Effects of large deep-seated landslides on hillslope morphology, western Southern Alps, New Zealand

    NASA Astrophysics Data System (ADS)

    Korup, Oliver

    2006-03-01

    Morphometric analysis and air photo interpretation highlight geomorphic imprints of large landslides (i.e., affecting ≥1 km^2) on hillslopes in the western Southern Alps (WSA), New Zealand. Large landslides attain kilometer-scale runout, affect >50% of total basin relief, and in 70% of cases are slope-clearing, and thus relief-limiting. Landslide terrain shows lower mean local relief, relief variability, slope angles, steepness, and concavity than surrounding terrain. Measuring mean slope angle smoothes out local landslide morphology, masking any relationship between large landslides and possible threshold hillslopes. Large failures also occurred on low-gradient slopes, indicating persistent low-frequency/high-magnitude hillslope adjustment independent of fluvial bedrock incision. At the basin and hillslope scale, slope-area plots partly constrain the effects of landslides on geomorphic process regimes. Landslide imprints gradually blend with relief characteristics at orogen scale (10^2 km), while being sensitive to length scales of slope failure, topography, sampling, and digital elevation model resolution. This limits means of automated detection, and underlines the importance of local morphologic contrasts for detecting large landslides in the WSA. Landslide controls on low-order drainage include divide lowering and shifting, formation of headwater basins and hanging valleys, and stream piracy. Volumes typically mobilized, yet still stored in numerous deposits despite high denudation rates, are >10^7 m^3, and theoretically equal to 10^2 years of basin-wide debris production from historic shallow landslides; lack of absolute ages precludes further estimates. Deposit size and mature forest cover indicate residence times of 10^1-10^4 years. On these timescales, large landslides require further attention in landscape evolution models of tectonically active orogens.

  1. GraphReduce: Large-Scale Graph Analytics on Accelerator-Based HPC Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Dipanjan; Agarwal, Kapil; Song, Shuaiwen

    2015-09-30

    Recent work on real-world graph analytics has sought to leverage the massive amount of parallelism offered by GPU devices, but challenges remain due to the inherent irregularity of graph algorithms and limitations in GPU-resident memory for storing large graphs. We present GraphReduce, a highly efficient and scalable GPU-based framework that operates on graphs that exceed the device's internal memory capacity. GraphReduce adopts a combination of both edge- and vertex-centric implementations of the Gather-Apply-Scatter programming model and operates on multiple asynchronous GPU streams to fully exploit the high degrees of parallelism in GPUs, with efficient graph data movement between the host and the device.
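    For orientation, here is a toy, single-threaded Python rendering of the Gather-Apply-Scatter model that GraphReduce implements on GPUs; the PageRank-style update and the four-edge graph are illustrative only.

      # (src, dst) edge list of a tiny directed graph.
      edges = [(0, 1), (0, 2), (1, 2), (2, 0)]
      rank = {v: 1.0 for v in {u for e in edges for u in e}}
      out_deg = {v: sum(1 for s, _ in edges if s == v) for v in rank}

      for _ in range(10):
          # Gather: each vertex accumulates contributions over in-edges.
          gathered = {v: 0.0 for v in rank}
          for s, d in edges:
              gathered[d] += rank[s] / out_deg[s]
          # Apply: fold the gathered value into the vertex state.
          rank = {v: 0.15 + 0.85 * g for v, g in gathered.items()}
          # Scatter: push updates along out-edges / activate neighbors;
          # implicit here because every vertex is active each iteration.
      print(rank)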

  2. Physics-based and human-derived information fusion for analysts

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Nagy, James; Scott, Steve; Okoth, Joshua; Hinman, Michael

    2017-05-01

    Recent trends in physics-based and human-derived information fusion (PHIF) have amplified the capabilities of analysts; however, with the big data opportunities there is a need for open architecture designs, methods of distributed team collaboration, and visualizations. In this paper, we explore recent trends in information fusion to support user interaction and machine analytics. Challenging scenarios requiring PHIF include combining physics-based video data with human-derived text data for enhanced simultaneous tracking and identification. A driving effort would be to provide analysts with applications, tools, and interfaces that afford effective and affordable solutions for timely decision making. Fusion at scale should be developed to allow analysts to access data, call analytics routines, enter solutions, update models, and store results for distributed decision making.

  3. Database recovery using redundant disk arrays

    NASA Technical Reports Server (NTRS)

    Mourad, Antoine N.; Fuchs, W. K.; Saab, Daniel G.

    1992-01-01

    Redundant disk arrays provide a way for achieving rapid recovery from media failures with a relatively low storage cost for large scale database systems requiring high availability. In this paper, a method is proposed for using redundant disk arrays to support rapid recovery from system crashes and transaction aborts in addition to their role in providing media failure recovery. A twin page scheme is used to store the parity information in the array so that the time for transaction commit processing is not degraded. Using an analytical model, it is shown that the proposed method achieves a significant increase in the throughput of database systems using redundant disk arrays by reducing the number of recovery operations needed to maintain the consistency of the database.
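    A minimal Python sketch of the twin-page idea used in this entry and the two that follow, assuming simple XOR parity over data pages; commit ordering and media-failure handling in the real scheme are more involved.

      def xor(a, b):
          return bytes(x ^ y for x, y in zip(a, b))

      class TwinParity:
          """Keep two physical copies of a parity page and alternate writes,
          so one consistent copy always survives a crash mid-update."""
          def __init__(self, page_size=4):
              self.twins = [bytes(page_size), bytes(page_size)]
              self.current = 0                  # index of the valid twin

          def update(self, old_data, new_data):
              new_parity = xor(xor(self.twins[self.current], old_data), new_data)
              target = 1 - self.current         # write to the *other* twin
              self.twins[target] = new_parity   # crash here leaves old twin valid
              self.current = target             # switching commits the update

      p = TwinParity()
      p.update(b"\x00\x00\x00\x00", b"\x0f\x00\xf0\x00")
      print(p.twins[p.current].hex())           # parity reflects the new data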

  4. Recovery issues in databases using redundant disk arrays

    NASA Technical Reports Server (NTRS)

    Mourad, Antoine N.; Fuchs, W. K.; Saab, Daniel G.

    1993-01-01

    Redundant disk arrays provide a way for achieving rapid recovery from media failures with a relatively low storage cost for large scale database systems requiring high availability. In this paper we propose a method for using redundant disk arrays to support rapid recovery from system crashes and transaction aborts in addition to their role in providing media failure recovery. A twin page scheme is used to store the parity information in the array so that the time for transaction commit processing is not degraded. Using an analytical model, we show that the proposed method achieves a significant increase in the throughput of database systems using redundant disk arrays by reducing the number of recovery operations needed to maintain the consistency of the database.

  5. Performance evaluation of redundant disk array support for transaction recovery

    NASA Technical Reports Server (NTRS)

    Mourad, Antoine N.; Fuchs, W. Kent; Saab, Daniel G.

    1991-01-01

    Redundant disk arrays provide a way of achieving rapid recovery from media failures with a relatively low storage cost for large scale data systems requiring high availability. Here, we propose a method for using redundant disk arrays to support rapid recovery from system crashes and transaction aborts in addition to their role in providing media failure recovery. A twin page scheme is used to store the parity information in the array so that the time for transaction commit processing is not degraded. Using an analytical model, we show that the proposed method achieves a significant increase in the throughput of database systems using redundant disk arrays by reducing the number of recovery operations needed to maintain the consistency of the database.

  6. An overview of available crop growth and yield models for studies and assessments in agriculture.

    PubMed

    Di Paola, Arianna; Valentini, Riccardo; Santini, Monia

    2016-02-01

    The scientific community offers numerous crop models with different levels of sophistication. Within such a wide range of crop models, users should be able to choose the one most suitable, in terms of detail, scale and representativeness, for their objectives. However, even when an appropriate choice is made, model limitations should be clarified so that modelling studies are put in the proper perspective and robust applications are achieved. This work is an overview of available models to simulate crop growth and yield. A summary matrix of more than 70 crop models is provided, storing the main model characteristics that can help users choose the proper tool for their purposes. Overall, we found that two main aspects of the models, despite their importance, are not always clear from the published references: the versatility of the models, in terms of reliable transferability to different conditions, and their degree of complexity. Hence, model developers should be encouraged to pay more attention to clarifying model limitations and limits of applicability, and users should make an effort to select an appropriate model, saving the time often devoted to iterative tuning steps that force an inappropriate model to fit their purpose. © 2015 Society of Chemical Industry.

  7. Experimental investigations and geochemical modelling of site-specific fluid-fluid and fluid-rock interactions in underground storage of CO2/H2/CH4 mixtures: the H2STORE project

    NASA Astrophysics Data System (ADS)

    De Lucia, Marco; Pilz, Peter

    2015-04-01

    Underground gas storage is increasingly regarded as a technically viable option for meeting the energy demand and environmental targets of many industrialized countries. Besides long-term CO2 sequestration, energy can be chemically stored in the form of CO2/CH4/H2 mixtures, for example resulting from excess wind energy. Gas storage in salt caverns is nowadays a mature technology; in regions where favorable geologic structures such as salt diapirs are not available, however, gas storage can only be implemented in porous media such as depleted gas and oil reservoirs or suitable saline aquifers. In such settings, a significant amount of in-situ gas components such as CO2, CH4 (and N2) will always be present, making the CO2/CH4/H2 system of particular interest. A precise estimation of the impact of such gas mixtures on the mineralogical, geochemical and petrophysical properties of specific reservoirs and caprocks is therefore crucial for site selection and optimization of storage depth. In the framework of the collaborative research project H2STORE, the feasibility of industrial-scale gas storage in porous media is being investigated by means of experiments and modelling on actual core materials from several potential siliciclastic depleted gas and oil reservoirs and suitable saline aquifers. Among them are the Altmark depleted gas reservoir in Saxony-Anhalt and the Ketzin pilot site for CO2 storage in Brandenburg (Germany). Further sites are located in the Molasse basin in South Germany and Austria. In particular, two work packages hosted at the German Research Centre for Geosciences (GFZ) focus on the fluid-fluid and fluid-rock interactions triggered by CO2, H2 and their mixtures. Laboratory experiments expose core samples to hydrogen and CO2/hydrogen mixtures under site-specific conditions (temperatures up to 200 °C and pressures up to 300 bar). The resulting qualitative and, where possible, quantitative data are expected to improve the precision of predictive geochemical and reactive transport modelling, which is also performed within the project. The combination of experiments, chemical and mineralogical analyses and models is needed to improve knowledge about: (1) solubility models and mixing rules for multicomponent gas mixtures in highly saline formation fluids: no data are available in the literature for H2-charged gas mixtures under the conditions expected at the potential sites; (2) the chemical reactivity of different mineral assemblages and formation fluids over a broad spectrum of P-T conditions and compositions of the stored gas mixtures; (3) the thermodynamics and kinetics of relevant reactions involving mineral dissolution or precipitation. The resulting improvement in site characterization and the overall enhancement in understanding the potential processes will benefit the operational reliability, the ecological tolerance, and the economic efficiency of future energy storage plants, all crucial aspects for public acceptance and for industrial investors.

  8. Development of a Model for a Small Store Operation for Fashion Merchandising Students to be Utilized by Instructors in Fashion Merchandising Programs.

    ERIC Educational Resources Information Center

    Tans, Nancy

    The report describes a model for establishing a small store to be operated by fashion merchandising students for academic credit within a post-secondary school program. It is intended to bridge the gap between graduation and employment by offering the student a hands-on retailing and merchandising experience during school hours before graduation.…

  9. Category Accessibility and Recall Accuracy: The Impact of Exposure to Mass Media in Witness Recall Situations.

    ERIC Educational Resources Information Center

    Tamborini, Ron; And Others

    The R.S. Wyer and T.K. Srull model suggests that when humans process information and store it in memory they create construct categories that are somewhat like storage bins. According to this model, when information is placed in these bins, it is stored in the order that it is received or used, with the most recently processed information always…

  10. Squeezed states and graviton-entropy production in the early universe

    NASA Technical Reports Server (NTRS)

    Giovannini, Massimo

    1994-01-01

    Squeezed states are a very useful framework for the quantum treatment of tensor perturbations (i.e., graviton production) in the early universe. In particular, the non-equilibrium entropy growth in a cosmological process of pair production is completely determined by the associated squeezing parameter and is insensitive to the number of particles in the initial state. The total produced entropy may represent a significant fraction of the entropy stored today in the cosmic blackbody radiation, provided pair production originates from a change in the background metric at a curvature scale of the Planck order. Within the formalism of squeezed thermal states it is also possible to discuss the stimulated emission of gravitons from an initial thermal bath, under the action of the cosmic gravitational background field. We find that at low energy the graviton production is enhanced compared with spontaneous creation from the vacuum; as a consequence, the inflation scale must be lowered in order not to exceed the observed CMB quadrupole anisotropy. This effect is important, in particular, for models based on a symmetry-breaking transition which require, as an initial condition, a state of thermal equilibrium at temperatures higher than the inflation scale, and in which inflation has a minimal duration.
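    For reference, a standard result from the squeezed-state literature (an assumption here, since the abstract states only the dependence) gives the total produced entropy in the large-squeezing limit as

      S \;\simeq\; \sum_{k} 2\, r_{k}, \qquad r_{k} \gg 1,

    where r_k is the squeezing parameter of mode k and the entropy is measured in units of Boltzmann's constant.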

  11. Verification of Hydrologic Landscape Derived Basin-Scale Classifications in the Pacific Northwest

    EPA Science Inventory

    The interaction between the physical properties of a catchment (form) and climatic forcing of precipitation and energy control how water is partitioned, stored, and conveyed through a catchment (function). Hydrologic Landscapes (HLs) were previously developed across Oregon and de...

  12. GRACE storage-runoff hystereses reveal the dynamics of regional watersheds

    EPA Science Inventory

    Watersheds function as integrated systems where climate and geology govern the movement of water. In situ instrumentation can provide local-scale insights into the non-linear relationship between streamflow and water stored in a watershed as snow, soil moisture, and groundwater. ...

  13. Survivability of porcine epidemic diarrhea virus (PEDV) in bovine plasma submitted to spray drying processing and held at different time by temperature storage conditions.

    PubMed

    Pujols, Joan; Segalés, Joaquim

    2014-12-05

    Bovine plasma was inoculated with porcine epidemic diarrhea virus (PEDV) at an average final titer of 4.2 log10 TCID50/mL to determine the effect of spray drying on viral inactivation. Using a laboratory-scale drier, inoculated plasma was spray dried at an inlet temperature of 200 °C and a throughout-substance temperature of either 70 or 80 °C. Both liquid and dried samples were subjected to three passages on VERO cell monolayers to determine PEDV infectivity. Results indicated liquid samples contained infective virus, but none of the spray dried samples were infectious. Also, survivability of PEDV inoculated on spray dried bovine plasma (SDBP) and stored at 4, 12 or 22 °C was determined after 7, 14 and 21 days. Commercial SDBP powder was inoculated with PEDV to an average final titer of 2.8 log10 TCID50/g. Five samples per time-and-temperature condition were subjected to three passages on VERO cell monolayers to determine PEDV infectivity. The virus was non-infectious in all samples stored at 22 °C for 7, 14 and 21 days. PEDV was infective in 1 out of 5 samples stored at 12 °C for 7 days, but none of the samples stored for 14 and 21 days were infectious in cell culture. For samples stored at 4 °C, 4 out of 5 samples were infectious at 7 days, 1 out of 5 samples was infectious at 14 days, but none were infectious at 21 days. In summary, PEDV was not infectious on cell culture within 7 days when stored at room temperature and within 21 days when stored at refrigerated temperature. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Neuronvisio: A Graphical User Interface with 3D Capabilities for NEURON.

    PubMed

    Mattioni, Michele; Cohen, Uri; Le Novère, Nicolas

    2012-01-01

    The NEURON simulation environment is a commonly used tool to perform electrical simulation of neurons and neuronal networks. The NEURON User Interface, based on the now discontinued InterViews library, provides some limited facilities to explore models and to plot their simulation results. Other limitations include the inability to generate a three-dimensional visualization, the absence of a standard means to save the results of simulations, and the inability to store the model geometry within the results. Neuronvisio (http://neuronvisio.org) aims to address these deficiencies through a set of well designed python APIs and provides an improved UI, allowing users to explore and interact with the model. Neuronvisio also facilitates access to previously published models, allowing users to browse, download, and locally run NEURON models stored in ModelDB. Neuronvisio uses the matplotlib library to plot simulation results and uses the HDF standard format to store simulation results. Neuronvisio can be viewed as an extension of NEURON, facilitating typical user workflows such as model browsing, selection, download, compilation, and simulation. The 3D viewer simplifies the exploration of complex model structure, while matplotlib permits the plotting of high-quality graphs. The newly introduced ability of saving numerical results allows users to perform additional analysis on their previous simulations.
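    As a generic illustration of the storage and plotting choices mentioned above (HDF5 for results, matplotlib for figures), the Python sketch below writes and re-plots a toy voltage trace with h5py and matplotlib. It does not use Neuronvisio's actual API; names and values are hypothetical.

      import h5py
      import numpy as np
      import matplotlib.pyplot as plt

      t = np.linspace(0, 100, 1000)                   # ms
      v = -65 + 30 * np.exp(-((t - 50) ** 2) / 20)    # toy "spike"

      with h5py.File("results.h5", "w") as f:
          grp = f.create_group("soma")
          grp.create_dataset("t", data=t)
          grp.create_dataset("v", data=v)
          grp.attrs["units"] = "mV"                   # metadata kept with results

      with h5py.File("results.h5") as f:
          plt.plot(f["soma/t"][:], f["soma/v"][:])
      plt.xlabel("time (ms)")
      plt.ylabel("Vm (mV)")
      plt.savefig("trace.png")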

  15. The origin of volatiles in the Earth's mantle

    NASA Astrophysics Data System (ADS)

    Hier-Majumder, Saswata; Hirschmann, Marc M.

    2017-08-01

    The Earth's deep interior contains significant reservoirs of volatiles such as H, C, and N. Due to the incompatible nature of these volatile species, it has been difficult to reconcile their storage in the residual mantle immediately following crystallization of the terrestrial magma ocean (MO). As the magma ocean freezes, it is commonly assumed that very small amounts of melt are retained in the residual mantle, limiting the trapped volatile concentration in the primordial mantle. In this article, we show that inefficient melt drainage out of the freezing front can retain large amounts of volatiles hosted in the trapped melt in the residual mantle while creating a thick early atmosphere. Using a two-phase flow model, we demonstrate that compaction within the moving freezing front is inefficient over time scales characteristic of magma ocean solidification. We employ a scaling relation between the trapped melt fraction, the rate of compaction, and the rate of freezing in our magma ocean evolution model. For cosmochemically plausible fractions of volatiles delivered during the later stages of accretion, our calculations suggest that up to 77% of total H2O and 12% of CO2 could have been trapped in the mantle during magma ocean crystallization. The assumption of a constant trapped melt fraction underestimates the mass of volatiles in the residual mantle by more than an order of magnitude.Plain Language SummaryThe Earth's deep interior contains substantial amounts of volatile elements like C, H, and N. How these elements got sequestered in the Earth's interior has long been a topic of debate. It is generally assumed that most of these elements escaped the interior of the Earth during the first few hundred thousand years to create a primitive atmosphere, leaving the mantle reservoir nearly empty. In this work, we show that the key to this paradox involves the very early stages of crystallization of the mantle from a global magma ocean. Using numerical models, we show that the mantle stored substantially higher amounts of volatiles than previously thought, thanks to large quantities of melt trapped in the mantle due to rapid freezing of the magma ocean. Our models show that up to 77% of the total planetary budget of water and 12% of CO2 can be stored in the mantle due to this previously unaccounted process.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA569965','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA569965"><span>Chaining for Flexible and High-Performance Key-Value Systems</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2012-09-01</p> <p>store that is fault tolerant achieves high performance and availability, and offers strong data consistency? We present a new replication protocol...effective high performance data access and analytics, many sites use simpler data model “ NoSQL ” systems. ese systems store and retrieve data only by...DRAM, Flash, and disk-based storage; can act as an unreliable cache or a durable store ; and can offer strong or weak data consistency. 
e value of</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/8723904','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/8723904"><span>Depletion of intracellular calcium stores facilitates the influx of extracellular calcium in platelet derived growth factor stimulated A172 glioblastoma cells.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Vereb, G; Szöllösi, J; Mátyus, L; Balázs, M; Hyun, W C; Feuerstein, B G</p> <p>1996-05-01</p> <p>Calcium signaling in non-excitable cells is the consequence of calcium release from intracellular stores, at times followed by entry of extracellular calcium through the plasma membrane. To study whether entry of calcium depends upon the level of saturation of intracellular stores, we measured calcium channel opening in the plasma membrane of single confluent A172 glioblastoma cells stimulated with platelet derived growth factor (PDGF) and/or bradykinin (BK). We monitored the entry of extracellular calcium by measuring manganese quenching of Indo-1 fluorescence. PDGF raised intracellular calcium concentration ([Ca2+]i) after a dose-dependent delay (tdel) and then opened calcium channels after a dose-independent delay (tch). At higher doses (> 3 nM), BK increased [Ca2+]i after a tdel approximately 0 s, and tch decreased inversely with both dose and peak [Ca2+]i. Experiments with thapsigargin (TG), BK, and PDGF indicated that BK and PDGF share intracellular Ca2+ pools that are sensitive to TG. When these stores were depleted by treatment with BK and intracellular BAPTA, tdel did not change, but tch fell to almost 0 s in PDGF stimulated cells, indicating that depletion of calcium stores affects calcium channel opening in the plasma membrane. Our data support the capacitative model for calcium channel opening and the steady-state model describing quantal Ca2+ release from intracellular stores.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/1344652-user-assisted-store-recycling-dynamic-task-graph-schedulers','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/1344652-user-assisted-store-recycling-dynamic-task-graph-schedulers"><span>User-Assisted Store Recycling for Dynamic Task Graph Schedulers</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Kurt, Mehmet Can; Krishnamoorthy, Sriram; Agrawal, Gagan</p> <p></p> <p>The emergence of the multi-core era has led to increased interest in designing effective yet practical parallel programming models. Models based on task graphs that operate on single-assignment data are attractive in several ways: they can support dynamic applications and precisely represent the available concurrency. However, they also require nuanced algorithms for scheduling and memory management for efficient execution. In this paper, we consider memory-efficient dynamic scheduling of task graphs. Specifically, we present a novel approach for dynamically recycling the memory locations assigned to data items as they are produced by tasks. We develop algorithms to identify memory-efficient store recyclingmore » functions by systematically evaluating the validity of a set of (user-provided or automatically generated) alternatives. 
Because recycling function can be input data-dependent, we have also developed support for continued correct execution of a task graph in the presence of a potentially incorrect store recycling function. Experimental evaluation demonstrates that our approach to automatic store recycling incurs little to no overheads, achieves memory usage comparable to the best manually derived solutions, often produces recycling functions valid across problem sizes and input parameters, and efficiently recovers from an incorrect choice of store recycling functions.« less</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28767093','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28767093"><span>Investigating the Spatial Dimension of Food Access.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Yenerall, Jackie; You, Wen; Hill, Jennie</p> <p>2017-08-02</p> <p>The purpose of this article is to investigate the sensitivity of food access models to a dataset's spatial distribution and the empirical definition of food access, which contributes to understanding the mixed findings of previous studies. Data was collected in the Dan River Region in the United States using a telephone survey for individual-level variables ( n = 784) and a store audit for the location of food retailers and grocery store quality. Spatial scanning statistics assessed the spatial distribution of obesity and detected a cluster of grocery stores overlapping with a cluster of obesity centered on a grocery store suggesting that living closer to a grocery store increased the likelihood of obesity. Logistic regression further examined this relationship while controlling for demographic and other food environment variables. Similar to the cluster analysis results, increased distance to a grocery store significantly decreased the likelihood of obesity in the urban subsample (average marginal effects, AME = -0.09, p -value = 0.02). However, controlling for grocery store quality nullified these results (AME = -0.12, p -value = 0.354). Our findings suggest that measuring grocery store accessibility as the distance to the nearest grocery store captures variability in the spatial distribution of the health outcome of interest that may not reflect a causal relationship between the food environment and health.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5580570','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5580570"><span>Investigating the Spatial Dimension of Food Access</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Yenerall, Jackie; You, Wen</p> <p>2017-01-01</p> <p>The purpose of this article is to investigate the sensitivity of food access models to a dataset’s spatial distribution and the empirical definition of food access, which contributes to understanding the mixed findings of previous studies. Data was collected in the Dan River Region in the United States using a telephone survey for individual-level variables (n = 784) and a store audit for the location of food retailers and grocery store quality. 
Spatial scanning statistics assessed the spatial distribution of obesity and detected a cluster of grocery stores overlapping with a cluster of obesity centered on a grocery store suggesting that living closer to a grocery store increased the likelihood of obesity. Logistic regression further examined this relationship while controlling for demographic and other food environment variables. Similar to the cluster analysis results, increased distance to a grocery store significantly decreased the likelihood of obesity in the urban subsample (average marginal effects, AME = −0.09, p-value = 0.02). However, controlling for grocery store quality nullified these results (AME = −0.12, p-value = 0.354). Our findings suggest that measuring grocery store accessibility as the distance to the nearest grocery store captures variability in the spatial distribution of the health outcome of interest that may not reflect a causal relationship between the food environment and health. PMID:28767093</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_19");'>19</a></li> <li><a href="#" onclick='return showDiv("page_20");'>20</a></li> <li class="active"><span>21</span></li> <li><a href="#" onclick='return showDiv("page_22");'>22</a></li> <li><a href="#" onclick='return showDiv("page_23");'>23</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_21 --> <div id="page_22" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_20");'>20</a></li> <li><a href="#" onclick='return showDiv("page_21");'>21</a></li> <li class="active"><span>22</span></li> <li><a href="#" onclick='return showDiv("page_23");'>23</a></li> <li><a href="#" onclick='return showDiv("page_24");'>24</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="421"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/22045805','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/22045805"><span>The economic impact of state cigarette taxes and smoke-free air policies on convenience stores.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Huang, Jidong; Chaloupka, Frank J</p> <p>2013-03-01</p> <p>To investigate whether increasing state cigarette taxes and/or enacting stronger smoke-free air (SFA) policies have negative impact on convenience store density in a state, a proxy that is determined by store openings and closings, which reflects store profits. State-level business count estimates for convenience stores for 50 states and District of Columbia from 1997 to 2009 were analysed using two-way fixed effects regression techniques that control for state-specific and year-specific determinants of convenience store density. The impact of tax and SFA policies was examined using a quasi-experimental research design that exploits changes in cigarette taxes and SFA policies within a state over time. 
Taxes are found to be uncorrelated with the density of combined convenience stores and gas stations in a state. Taxes are positively correlated with the density of convenience stores; however, the magnitude of this correlation is small, with a 10% increase in state cigarette taxes associated with a 0.19% (p<0.05) increase in the number of convenience stores per million people in a state. State-level SFA policies do not correlate with convenience store density in a state, regardless whether gas stations were included. These results are robust across different model specifications. In addition, they are robust with regard to the inclusion/exclusion of other state-level tobacco control measures and gasoline prices. Contrary to tobacco industry and related organisations' claims, higher cigarette taxes and stronger SFA policies do not negatively affect convenience stores.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2004EP%26S...56..773O','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2004EP%26S...56..773O"><span>Earthquake cycles and physical modeling of the process leading up to a large earthquake</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ohnaka, Mitiyasu</p> <p>2004-08-01</p> <p>A thorough discussion is made on what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology of forecasting large earthquakes, the entire process of one cycle for a typical, large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate-and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of the elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale-dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling, in order to construct realistic, unified scenarios with predictive capability. 
The scenario presented may be significant and useful as a necessary first step for establishing the methodology for forecasting large earthquakes.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5685056','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5685056"><span>Short-term effects of stored homologous red blood cell transfusion on cardiorespiratory function and inflammation: an experimental study in a hypovolemia model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Biagini, S.; Dale, C.S.; Real, J.M.; Moreira, E.S.; Carvalho, C.R.R.; Schettino, G.P.P.; Wendel, S.; Azevedo, L.C.P.</p> <p>2017-01-01</p> <p>The pathophysiological mechanisms associated with the effects of red blood cell (RBC) transfusion on cardiopulmonary function and inflammation are unclear. We developed an experimental model of homologous 14-days stored RBC transfusion in hypovolemic swine to evaluate the short-term effects of transfusion on cardiopulmonary system and inflammation. Sixteen healthy male anesthetized swine (68±3.3 kg) were submitted to controlled hemorrhage (25% of blood volume). Two units of non-filtered RBC from each animal were stored under blood bank conditions for 14 days. After 30 min of hypovolemia, the control group (n=8) received an infusion of lactated Ringer's solution (three times the removed volume). The transfusion group (n=8) received two units of homologous 14-days stored RBC and lactated Ringer's solution in a volume that was three times the difference between blood removed and blood transfusion infused. Both groups were followed up for 6 h after resuscitation with collection of hemodynamic and respiratory data. Cytokines and RNA expression were measured in plasma and lung tissue. Stored RBC transfusion significantly increased mixed oxygen venous saturation and arterial oxygen content. Transfusion was not associated with alterations on pulmonary function. Pulmonary concentrations of cytokines were not different between groups. Gene expression for lung cytokines demonstrated a 2-fold increase in mRNA level for inducible nitric oxide synthase and a 0.5-fold decrease in mRNA content for IL-21 in the transfused group. Thus, stored homologous RBC transfusion in a hypovolemia model improved cardiovascular parameters but did not induce significant effects on microcirculation, pulmonary inflammation and respiratory function up to 6 h after transfusion. PMID:29185590</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/23747923','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/23747923"><span>Neighborhood socioeconomic characteristics and differences in the availability of healthy food stores and restaurants in Sao Paulo, Brazil.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Duran, Ana Clara; Diez Roux, Ana V; Latorre, Maria do Rosario D O; Jaime, Patricia Constante</p> <p>2013-09-01</p> <p>Differential access to healthy foods has been hypothesized to contribute to health disparities, but evidence from low and middle-income countries is still scarce. 
This study examines whether the access of healthy foods varies across store types and neighborhoods of different socioeconomic statuses (SES) in a large Brazilian city. A cross-sectional study was conducted in 2010-2011 across 52 census tracts. Healthy food access was measured by a comprehensive in-store data collection, summarized into two indexes developed for retail food stores (HFSI) and restaurants (HMRI). Descriptive analyses and multilevel models were used to examine associations of store type and neighborhood SES with healthy food access. Fast food restaurants were more likely to be located in low SES neighborhoods whereas supermarkets and full service restaurants were more likely to be found in higher SES neighborhoods. Multilevel analyses showed that both store type and neighborhood SES were independently associated with in-store food measures. We found differences in the availability of healthy food stores and restaurants in Sao Paulo city favoring middle and high SES neighborhoods. © 2013 Elsevier Ltd. All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2008AGUFM.H11J..09P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2008AGUFM.H11J..09P"><span>On the Role of Multi-Scale Processes in CO2 Storage Security and Integrity</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Pruess, K.; Kneafsey, T. J.</p> <p>2008-12-01</p> <p>Consideration of multiple scales in subsurface processes is usually referred to the spatial domain, where we may attempt to relate process descriptions and parameters from pore and bench (Darcy) scale to much larger field and regional scales. However, multiple scales occur also in the time domain, and processes extending over a broad range of time scales may be very relevant to CO2 storage and containment. In some cases, such as in the convective instability induced by CO2 dissolution in saline waters, space and time scales are coupled in the sense that perturbations induced by CO2 injection will grow concurrently over many orders of magnitude in both space and time. In other cases, CO2 injection may induce processes that occur on short time scales, yet may affect large regions. Possible examples include seismicity that may be triggered by CO2 injection, or hypothetical release events such as "pneumatic eruptions" that may discharge substantial amounts of CO2 over a short time period. This paper will present recent advances in our experimental and modeling studies of multi-scale processes. Specific examples that will be discussed include (1) the process of CO2 dissolution-diffusion-convection (DDC), that can greatly accelerate the rate at which free-phase CO2 is stored as aqueous solute; (2) self- enhancing and self-limiting processes during CO2 leakage through faults, fractures, or improperly abandoned wells; and (3) porosity and permeability reduction from salt precipitation near CO2 injection wells, and mitigation of corresponding injectivity loss. This work was supported by the Office of Basic Energy Sciences and by the Zero Emission Research and Technology project (ZERT) under Contract No. DE-AC02-05CH11231 with the U.S. 
Department of Energy.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70131489','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70131489"><span>Africa-wide monitoring of small surface water bodies using multisource satellite data: a monitoring system for FEWS NET: chapter 5</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Velpuri, Naga Manohar; Senay, Gabriel B.; Rowland, James; Verdin, James P.; Alemu, Henok; Melesse, Assefa M.; Abtew, Wossenu; Setegn, Shimelis G.</p> <p>2014-01-01</p> <p>Continental Africa has the highest volume of water stored in wetlands, large lakes, reservoirs, and rivers, yet it suffers from problems such as water availability and access. With climate change intensifying the hydrologic cycle and altering the distribution and frequency of rainfall, the problem of water availability and access will increase further. Famine Early Warning Systems Network (FEWS NET) funded by the United States Agency for International Development (USAID) has initiated a large-scale project to monitor small to medium surface water points in Africa. Under this project, multisource satellite data and hydrologic modeling techniques are integrated to monitor several hundreds of small to medium surface water points in Africa. This approach has been already tested to operationally monitor 41 water points in East Africa. The validation of modeled scaled depths with field-installed gauge data demonstrated the ability of the model to capture both the spatial patterns and seasonal variations. Modeled scaled estimates captured up to 60 % of the observed gauge variability with a mean root-mean-square error (RMSE) of 22 %. The data on relative water level, precipitation, and evapotranspiration (ETo) for water points in East and West Africa were modeled since 1998 and current information is being made available in near-real time. This chapter presents the approach, results from the East African study, and the first phase of expansion activities in the West Africa region. The water point monitoring network will be further expanded to cover much of sub-Saharan Africa. The goal of this study is to provide timely information on the water availability that would support already established FEWS NET activities in Africa. This chapter also presents the potential improvements in modeling approach to be implemented during future expansion in Africa.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=19950065501&hterms=self+recognition&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D90%26Ntt%3Dself%2Brecognition','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=19950065501&hterms=self+recognition&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D90%26Ntt%3Dself%2Brecognition"><span>Associative Pattern Recognition In Analog VLSI Circuits</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Tawel, Raoul</p> <p>1995-01-01</p> <p>Winner-take-all circuit selects best-match stored pattern. Prototype cascadable very-large-scale integrated (VLSI) circuit chips built and tested to demonstrate concept of electronic associative pattern recognition. 
  427. Associative Pattern Recognition In Analog VLSI Circuits

    NASA Technical Reports Server (NTRS)

    Tawel, Raoul

    1995-01-01

    Winner-take-all circuit selects best-match stored pattern. Prototype cascadable very-large-scale integrated (VLSI) circuit chips built and tested to demonstrate concept of electronic associative pattern recognition. Based on low-power, sub-threshold analog complementary metal-oxide/semiconductor (CMOS) VLSI circuitry, each chip can store 128 sets (vectors) of 16 analog values (vector components), the vectors representing known patterns as diverse as spectra, histograms, graphs, or brightnesses of pixels in images. Chips exploit parallel nature of vector-quantization architecture to implement highly parallel processing in relatively simple computational cells. Through collective action, cells classify input pattern in fraction of microsecond while consuming power of a few microwatts.
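    The selection step this record describes is, in software terms, a nearest-neighbor search over a small codebook. A minimal Python sketch of that winner-take-all logic follows; the codebook contents and noise level are invented for illustration, and the chip of course performs the comparison in parallel analog hardware rather than in a loop over stored vectors.

        import numpy as np

        rng = np.random.default_rng(0)
        codebook = rng.uniform(0.0, 1.0, size=(128, 16))  # 128 stored vectors of 16 analog components

        def winner_take_all(pattern):
            """Return the index of the stored vector closest to the input pattern."""
            distances = np.linalg.norm(codebook - pattern, axis=1)
            return int(np.argmin(distances))

        noisy = codebook[42] + rng.normal(0.0, 0.05, size=16)  # a corrupted version of pattern 42
        print(winner_take_all(noisy))  # expected: 42, when the noise is small enough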
  428. An evaluation of condition indices for birds

    USGS Publications Warehouse

    Johnson, D.H.; Krapu, G.L.; Reinecke, K.J.; Jorde, Dennis G.

    1985-01-01

    A Lipid Index, the ratio of fat to fat-free dry weight, is proposed as a measure of fat stores in birds. The estimation of the index from field measurements of live birds is illustrated with data on the sandhill crane (Grus canadensis) and greater white-fronted goose (Anser albifrons). Of the various methods of assessing fat stores, lipid extraction is the most accurate but also the most involved. Water extraction is a simpler laboratory method that provides a good index to fat and can be calibrated to serve as an estimator. Body weight itself is often inadequate as a condition index, but scaling by morphological measurements can markedly improve its value.
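    The proposed index is a simple ratio, so a worked example may help fix the definition. The masses below are invented for illustration only.

        def lipid_index(fat_g, fat_free_dry_g):
            """Lipid Index: fat mass divided by fat-free dry mass."""
            return fat_g / fat_free_dry_g

        # A bird carrying 120 g of extractable lipid on a 400 g fat-free dry
        # body mass has a Lipid Index of 0.30.
        print(lipid_index(120.0, 400.0))  # 0.3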
  429. Oscillation, Conduction Delays, and Learning Cooperate to Establish Neural Competition in Recurrent Networks

    PubMed Central

    Kato, Hideyuki; Ikeguchi, Tohru

    2016-01-01

    Specific memory might be stored in a subnetwork consisting of a small population of neurons. To select the neurons involved in memory formation, neural competition might be essential. In this paper, we show that excitable neurons are competitive and organize into two assemblies in a recurrent network with spike-timing-dependent synaptic plasticity (STDP) and axonal conduction delays. Neural competition is established by the cooperation of spontaneously induced neural oscillation, axonal conduction delays, and STDP. We also suggest that this competition mechanism is one of the basic functions required to organize memory-storing subnetworks into fine-scale cortical networks. PMID:26840529

  430. Long-term memory stabilized by noise-induced rehearsal.

    PubMed

    Wei, Yi; Koulakov, Alexei A.

    2014-11-19

    Cortical networks can maintain memories for decades despite the short lifetime of synaptic strengths. Can a neural network store long-lasting memories in unstable synapses? Here, we study the effects of ongoing spike-timing-dependent plasticity (STDP) on the stability of memory patterns stored in synapses of an attractor neural network. We show that certain classes of STDP rules can stabilize all stored memory patterns despite a short lifetime of synapses. In our model, unstructured neural noise, after passing through the recurrent network connections, carries the imprint of all memory patterns in temporal correlations. STDP, combined with these correlations, leads to reinforcement of all stored patterns, even those that are never explicitly visited. Our findings may provide the functional reason for the irregular spiking displayed by cortical neurons and justify models of system memory consolidation. We therefore propose that irregular neural activity is the feature that helps cortical networks maintain stable connections. Copyright © 2014 the authors.

  431. Quantum Darwinism: Entanglement, branches, and the emergent classicality of redundantly stored quantum information

    NASA Astrophysics Data System (ADS)

    Blume-Kohout, Robin; Zurek, Wojciech H.

    2006-06-01

    We lay a comprehensive foundation for the study of redundant information storage in decoherence processes. Redundancy has been proposed as a prerequisite for objectivity, the defining property of classical objects. We consider two ensembles of states for a model universe consisting of one system and many environments: the first consisting of arbitrary states, and the second consisting of "singly branching" states consistent with a simple decoherence model. Typical states from the random ensemble do not store information about the system redundantly, but information stored in branching states has a redundancy proportional to the environment's size. We compute the specific redundancy for a wide range of model universes, and fit the results to a simple first-principles theory. Our results show that the presence of redundancy divides information about the system into three parts: classical (redundant); purely quantum; and the borderline, undifferentiated or "nonredundant," information.

  432. A Mathematical Model for Storage and Recall of Images using Targeted Synchronization of Coupled Maps.

    PubMed

    Palaniyandi, P.; Rangarajan, Govindan

    2017-08-21

    We propose a mathematical model for storage and recall of images using coupled maps. We start by theoretically investigating targeted synchronization in coupled map systems, wherein only a desired (partial) subset of the maps is made to synchronize. A simple method is introduced to specify coupling coefficients such that targeted synchronization is ensured. The principle of this method is extended to the storage and recall of images using coupled Rulkov maps. The process of adjusting coupling coefficients between Rulkov maps (often used to model neurons) for the purpose of storing a desired image mimics the process of adjusting synaptic strengths between neurons to store memories. Our method uses both synchronization and synaptic weight modification, as the human brain is thought to do. The stored image can be recalled by providing an initial random pattern to the dynamical system. The storage and recall of the standard image of Lena is explicitly demonstrated.
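    The synchronization mechanism underlying this model can be illustrated with a much simpler pair of coupled maps. The sketch below uses chaotic logistic maps rather than the Rulkov maps of the paper, and the coupling strength is a made-up value; it shows only how diffusive coupling pulls a response map onto a drive map's trajectory.

        def f(x):
            return 4.0 * x * (1.0 - x)  # chaotic logistic map

        def drive_response(steps=200, eps=0.8):
            x_drive, x_resp = 0.3, 0.9  # different initial conditions
            for _ in range(steps):
                x_drive = f(x_drive)
                # the response map is nudged toward the drive at each step
                x_resp = (1.0 - eps) * f(x_resp) + eps * x_drive
            return abs(x_drive - x_resp)

        print(drive_response())  # near zero: the two maps have synchronized

    For coupling strengths above roughly 0.5 the error contracts faster than the map's chaos stretches it, and the two trajectories lock together; targeted synchronization generalizes this idea to a chosen subset of a larger coupled-map network.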
  433. Image interpolation used in three-dimensional range data compression.

    PubMed

    Zhang, Shaoze; Zhang, Jianqi; Huang, Xi; Liu, Delian

    2016-05-20

    Advances in the field of three-dimensional (3D) scanning have made the acquisition of 3D range data easier and easier. However, with the large size of 3D range data comes the challenge of storing and transmitting it. To address this challenge, this paper presents a framework to further compress 3D range data using image interpolation. We first use a virtual fringe-projection system to store 3D range data as images, and then apply an interpolation algorithm to the images to reduce their resolution, and hence the data size. When the 3D range data are needed, the low-resolution image is scaled up to its original resolution by applying the interpolation algorithm, the scaled-up image is decoded, and the 3D range data are recovered from the decoded result. Experimental results show that the proposed method can further reduce the data size while maintaining a low rate of error.
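    The compression step is essentially down-sampling with a recoverable up-sampling path. A minimal sketch of that round trip follows, using scipy.ndimage.zoom as a stand-in for the paper's interpolation algorithm and a smooth synthetic array in place of a real encoded range image.

        import numpy as np
        from scipy.ndimage import zoom

        x = np.linspace(0.0, 4.0 * np.pi, 256)
        range_image = np.sin(x)[:, None] * np.cos(x)[None, :]  # synthetic "encoded" range data

        small = zoom(range_image, 0.5)   # store/transmit at a quarter of the pixel count
        restored = zoom(small, 2.0)      # scale back up when the data are needed

        err = np.abs(restored[:256, :256] - range_image).mean()
        print(f"mean absolute reconstruction error: {err:.5f}")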
  434. Rotating waves during human sleep spindles organize global patterns of activity that repeat precisely through the night

    PubMed Central

    Muller, Lyle; Piantoni, Giovanni; Koller, Dominik; Cash, Sydney S.; Halgren, Eric; Sejnowski, Terrence J.

    2016-01-01

    During sleep, the thalamus generates a characteristic pattern of transient, 11-15 Hz sleep spindle oscillations, which synchronize the cortex through large-scale thalamocortical loops. Spindles have been increasingly demonstrated to be critical for sleep-dependent consolidation of memory, but the specific neural mechanism for this process remains unclear. We show here that cortical spindles are spatiotemporally organized into circular wave-like patterns, organizing neuronal activity over tens of milliseconds, within the timescale for storing memories in large-scale networks across the cortex via spike-timing-dependent plasticity. These circular patterns repeat over hours of sleep with millisecond temporal precision, allowing reinforcement of the activity patterns through hundreds of reverberations. These results provide a novel mechanistic account for how global sleep oscillations and synaptic plasticity could strengthen networks distributed across the cortex to store coherent and integrated memories. DOI: http://dx.doi.org/10.7554/eLife.17267.001 PMID:27855061

  435. A theoretical framework for constructing elastic/plastic constitutive models of triaxial tests

    NASA Astrophysics Data System (ADS)

    Collins, Ian F.; Hilder, Tamsyn

    2002-11-01

    Modern ideas of thermomechanics are used to develop families of models describing the elastic/plastic behaviour of cohesionless soils deforming under triaxial conditions. Once the forms of the free energy and dissipation potential functions have been specified, the corresponding yield loci, flow rules, isotropic and kinematic hardening rules, as well as the elasticity law, are deduced in a systematic manner. The families contain the classical linear frictional (Coulomb type) models and the classical critical state models as special cases. The generalized models discussed here include non-associated flow rules, shear as well as volumetric hardening, anisotropic responses and rotational yield loci. The various parameters needed to describe the models can be interpreted in terms of the ratio of the plastic work which is dissipated to that which is stored. Non-associated behaviour is found to occur whenever this division between dissipated and stored work is unequal. Micro-level interpretations of stored plastic work are discussed. The models automatically satisfy the laws of thermodynamics, and there is no need to invoke any stability postulates. Some classical forms of the peak-strength/dilatancy relationship are established theoretically, and some representative drained and undrained paths are computed.

  436. Complexity, Robustness, and Network Thermodynamics in Large-Scale and Multiagent Systems: A Hybrid Control Approach

    DTIC Science & Technology

    2012-01-11

    ... dynamic behavior, wherein a dissipative dynamical system can deliver only a fraction of its energy to its surroundings and can store only a fraction of the ... collection of interacting subsystems. The behavior and properties of the aggregate large-scale system can then be deduced from the behaviors of the ... uniqueness is established. This state space formalism of thermodynamics shows that the behavior of heat, as described by the conservation equations of ...

  437. Planetary Waves and Mesoscale Disturbances in the Middle and Upper Atmosphere

    DTIC Science & Technology

    1998-05-14

    ... processing of ionogram records led us to begin designing a computer-controlled system to collect, store, display and scale the ionograms in digital ... circuit board "L-154". The L-154 passed signals from the receiver and the control system to the computer in order to collect information ... the main purpose of the PSMOS project is the establishment of a ground-based mesopause observing system for the investigation of planetary-scale ...

  438. Production of nitrous oxide from anaerobic digester centrate and its use as a co-oxidant of biogas to enhance energy recovery.

    PubMed

    Scherson, Yaniv D.; Woo, Sung-Geun; Criddle, Craig S.

    2014-05-20

    Coupled Aerobic-anoxic Nitrous Decomposition Operation (CANDO) is a new process for wastewater treatment that removes nitrogen from wastewater and recovers energy from the nitrogen in three steps: (1) NH4(+) oxidation to NO2(-); (2) NO2(-) reduction to N2O gas; and (3) N2O conversion to N2 with energy production. In this work, we optimize Steps 1 and 2 for anaerobic digester centrate, and we evaluate Step 3 for a full-scale biogas-fed internal combustion engine. Using a continuous stirred reactor coupled to a bench-scale sequencing batch reactor, we observed sustained partial oxidation of NH4(+) to NO2(-) and sustained (3 months) partial reduction of NO2(-) to N2O (75-80% conversion, mass basis), with >95% nitrogen removal (Step 2). Alternating pulses of acetate and NO2(-) selected for Comamonas (38%), Ciceribacter (16%), and Clostridium (11%). Some species stored polyhydroxybutyrate (PHB) and coupled oxidation of PHB to reduction of NO2(-) to N2O. Some species also stored phosphorus as polyphosphate granules. Injections of N2O into a biogas-fed engine at flow rates simulating a full-scale system increased power output by 5.7-7.3%. The results underscore the need for more detailed assessment of bioreactor community ecology and justify pilot- and full-scale testing.
  439. Field of genes: using Apache Kafka as a bioinformatic data repository.

    PubMed

    Lawlor, Brendan; Lynch, Richard; Mac Aogáin, Micheál; Walsh, Paul

    2018-04-01

    Bioinformatic research is increasingly dependent on large-scale datasets, accessed from either private or public repositories. An example of a public repository is the National Center for Biotechnology Information's (NCBI's) Reference Sequence database (RefSeq). These repositories must decide in what form to make their data available. Unstructured data can be put to almost any use but are limited in how access to them can be scaled. Highly structured data offer improved performance for specific algorithms but limit the wider usefulness of the data. We present an alternative: lightly structured data stored in Apache Kafka in a way that is amenable to parallel access and streamed processing, including subsequent transformations into more highly structured representations. We contend that this approach could provide a flexible and powerful nexus of bioinformatic data, bridging the gap between low structure on one hand, and high performance and scale on the other. To demonstrate this, we present a proof-of-concept version of NCBI's RefSeq database using this technology, and we measure its performance and scalability characteristics with respect to flat files. The proof of concept scales almost linearly as more compute nodes are added, outperforming the standard approach using files. Apache Kafka merits consideration as a fast and more scalable but general-purpose way to store and retrieve bioinformatic data, for public, centralized reference datasets such as RefSeq and for private clinical and experimental data.
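    As a rough illustration of the "lightly structured" idea, the sketch below writes and reads sequence records with the kafka-python client. The broker address, topic name, and record format are assumptions for illustration; the paper's actual schema and client may differ.

        from kafka import KafkaProducer, KafkaConsumer

        producer = KafkaProducer(bootstrap_servers="localhost:9092")
        # One sequence record per message, keyed by accession so consumers can
        # partition the stream for parallel access.
        producer.send("refseq", key=b"NC_000913.3", value=b">NC_000913.3 example record")
        producer.flush()

        consumer = KafkaConsumer("refseq",
                                 bootstrap_servers="localhost:9092",
                                 auto_offset_reset="earliest",
                                 consumer_timeout_ms=1000)
        for message in consumer:
            # Downstream jobs stream records like this and may emit more highly
            # structured representations (e.g., indexed stores) to new topics.
            print(message.key, len(message.value))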
  440. Process-based modelling of NH3 exchange with grazed grasslands

    NASA Astrophysics Data System (ADS)

    Móring, Andrea; Vieno, Massimo; Doherty, Ruth M.; Milford, Celia; Nemitz, Eiko; Twigg, Marsailidh M.; Horváth, László; Sutton, Mark A.

    2017-09-01

    In this study the GAG model, a process-based ammonia (NH3) emission model for urine patches, was extended and applied at the field scale. The new model (GAG_field) was tested over two modelling periods for which micrometeorological NH3 flux data were available. Acknowledging uncertainties in the measurements, the model was able to simulate the main features of the observed fluxes. The temporal evolution of the simulated NH3 exchange flux was found to be dominated by NH3 emission from the urine patches, offset by simultaneous NH3 deposition to areas of the field not affected by urine. The simulations show how NH3 fluxes over a grazed field on a given day can be affected by urine patches deposited several days earlier, linked to the interaction of volatilization processes with soil pH dynamics. Sensitivity analysis showed that GAG_field was more sensitive to soil buffering capacity (β), field capacity (θfc) and permanent wilting point (θpwp) than the patch-scale model. The reason for these different sensitivities is twofold. First, the difference originates from the different scales. Second, the difference can be explained by the different initial soil pH and physical properties, which determine the maximum volume of urine that can be stored in the NH3 source layer. It was found that in the case of urine patches with a higher initial soil pH and higher initial soil water content, the sensitivity of NH3 exchange to β was stronger. Also, in the case of a higher initial soil water content, NH3 exchange was more sensitive to changes in θfc and θpwp. The sensitivity analysis showed that the nitrogen content of urine (cN) is associated with high uncertainty in the simulated fluxes. However, model experiments based on cN values randomized from an estimated statistical distribution indicated that this uncertainty is considerably smaller in practice. Finally, GAG_field was tested with a constant soil pH of 7.5. The variation of NH3 fluxes simulated in this way showed good agreement with those from simulations with the original approach, which accounts for a dynamically changing soil pH. These results suggest a way to simplify the model when GAG_field is later applied at the regional scale.
  441. A new climate modeling framework for convection-resolving simulation at continental scale

    NASA Astrophysics Data System (ADS)

    Charpilloz, Christophe; di Girolamo, Salvatore; Arteaga, Andrea; Fuhrer, Oliver; Hoefler, Torsten; Schulthess, Thomas; Schär, Christoph

    2017-04-01

    Major uncertainties remain in our understanding of the processes that govern the water cycle in a changing climate and their representation in weather and climate models. Of particular concern are heavy precipitation events of convective origin (thunderstorms and rain showers). The aim of the crCLIM project [1] is to propose a new climate modeling framework that alleviates the I/O bottleneck in large-scale, convection-resolving climate simulations and thus enables new analysis techniques for climate scientists. Owing to their large computational costs, convection-resolving simulations are currently restricted to small computational domains or very short time scales, unless the largest available supercomputing systems, such as hybrid CPU-GPU architectures, are used [3]. Hence, the COSMO model has been adapted to run on these architectures for research and production purposes [2]. However, the amount of generated data also increases, and storing these data becomes infeasible, making the analysis of simulation results impractical. To circumvent this problem and enable high-resolution models in climate science, we propose a data-virtualization layer (DVL) that re-runs simulations on demand and transparently manages the data for the analysis; that is, we trade computational effort (time) for storage (space). This approach also requires a bit-reproducible version of the COSMO model that produces identical results on different architectures (CPUs and GPUs) [4], which will be coupled with a performance model in order to enable optimal re-runs depending on the requirements of the re-run and the available resources. In this contribution, we discuss the strategy for developing the DVL, a first performance model, the challenge of bit-reproducibility, and the first results of the crCLIM project. [1] http://www.c2sm.ethz.ch/research/crCLIM.html [2] O. Fuhrer, C. Osuna, X. Lapillonne, T. Gysi, M. Bianco, and T. Schulthess. "Towards gpu-accelerated operational weather forecasting." In The GPU Technology Conference, GTC, 2013. [3] D. Leutwyler, O. Fuhrer, X. Lapillonne, D. Lüthi, and C. Schär. "Towards European-scale convection-resolving climate simulations with GPUs: a study with COSMO 4.19." Geoscientific Model Development 9, no. 9 (2016): 3393. [4] A. Arteaga, O. Fuhrer, and T. Hoefler. "Designing bit-reproducible portable high-performance applications." In Parallel and Distributed Processing Symposium, 2014 IEEE 28th International, pp. 1235-1244. IEEE, 2014.
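    The DVL's core trade of compute time for storage space can be captured in a few lines. The sketch below is a conceptual illustration under the assumption of a deterministic, bit-reproducible simulator; all names are hypothetical, and it stands in for, rather than reproduces, the crCLIM design.

        class DataVirtualizationLayer:
            def __init__(self, simulate, cache_size=8):
                self.simulate = simulate   # deterministic, bit-reproducible re-run
                self.cache = {}
                self.cache_size = cache_size

            def fetch(self, variable, timestep):
                key = (variable, timestep)
                if key not in self.cache:  # not stored: recompute on demand
                    if len(self.cache) >= self.cache_size:
                        self.cache.pop(next(iter(self.cache)))  # evict the oldest entry
                    self.cache[key] = self.simulate(variable, timestep)
                return self.cache[key]

        dvl = DataVirtualizationLayer(lambda var, t: f"{var}@{t}".encode())
        print(dvl.fetch("precip", 42))  # first access triggers a re-run; repeats hit the cache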
  442. Applying A Multi-Objective Based Procedure to SWAT Modelling in Alpine Catchments

    NASA Astrophysics Data System (ADS)

    Tuo, Y.; Disse, M.; Chiogna, G.

    2017-12-01

    In alpine catchments, water management practices can lead to conflicts between upstream and downstream stakeholders, as in the Adige river basin (Italy). Correct prediction of available water resources plays an important part, for example, in defining how much water can be stored for hydropower production in upstream reservoirs without affecting agricultural activities downstream. Snow is a crucial hydrological component that strongly affects the seasonal behavior of streamflow. Therefore, a realistic representation of snow dynamics is fundamental for water management operations in alpine catchments. The Soil and Water Assessment Tool (SWAT) model has been applied in alpine catchments worldwide. However, during model calibration of catchment-scale applications, snow parameters have generally been estimated from streamflow records rather than from snow measurements. This may lead to streamflow predictions with the wrong snowmelt contribution. This work highlights the importance of considering snow measurements in the calibration of the SWAT model for alpine hydrology and compares various calibration methodologies. In addition to discharge records, snow water equivalent time series at both the subbasin scale and the monitoring-station scale were used to evaluate model performance against the SWAT subbasin and elevation-band snow outputs. Comparing model results obtained by calibrating against discharge data only with those obtained by calibrating against discharge data together with snow water equivalent data, we show that the latter approach improves the reliability of snow simulations while maintaining good streamflow estimates. With a more reliable representation of snow dynamics, the hydrological model can provide more accurate references for proposing adequate water management solutions. This study offers the wide SWAT user community an effective approach to improve streamflow predictions in alpine catchments and hence support decision makers in water allocation.
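    The calibration idea amounts to scoring each candidate parameter set against two observation types instead of one. A minimal sketch follows, using Nash-Sutcliffe efficiency (NSE) as the skill score; the equal weights are an illustrative choice, not the study's.

        import numpy as np

        def nse(sim, obs):
            """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the mean of obs."""
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def combined_objective(q_sim, q_obs, swe_sim, swe_obs, w_q=0.5, w_swe=0.5):
            # Calibrating on discharge alone can hide a wrong snowmelt contribution;
            # an SWE term penalizes such compensating parameter choices.
            return w_q * nse(q_sim, q_obs) + w_swe * nse(swe_sim, swe_obs)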
  443. The Collective Impact Model and Its Potential for Health Promotion: Overview and Case Study of a Healthy Retail Initiative in San Francisco.

    PubMed

    Flood, Johnna; Minkler, Meredith; Hennessey Lavery, Susana; Estrada, Jessica; Falbe, Jennifer

    2015-10-01

    As resources for health promotion become more constricted, it is increasingly important to collaborate across sectors, including the private sector. Although many excellent models for cross-sector collaboration have shown promise in the health field, collective impact (CI), an emerging model for creating larger-scale change, has yet to receive much study. Complementing earlier collaboration approaches, CI has five core tenets: a shared agenda, shared measurement systems, mutually reinforcing activities, continuous communication, and a central infrastructure. In this article, we describe the CI model and its key dimensions and constructs. We briefly compare CI to community coalition action theory and discuss our use of the latter to provide needed detail as we apply CI in a critical case study of the Tenderloin Healthy Corner Store Coalition in San Francisco, California. Using Yin's multimethod approach, we illustrate how CI strategies, augmented by community coalition action theory, are being used, and with what successes or challenges, to help effect community- and policy-level change to reduce tobacco and alcohol advertising and sales while improving healthy, affordable, and sustainable food access. We discuss the strengths and weaknesses of CI as a framework for health promotion, as well as the benefits, challenges, and initial outcomes of the healthy retail project and its opportunities for scale-up. Implications for health promotion practice and research are also discussed. © 2015 Society for Public Health Education.

  444. Research for Future Training Modeling and Simulation Strategies

    DTIC Science & Technology

    2011-09-01

    ... it developed an "ecosystem" for the content industry, first for iTunes and now in the iPad for publishers and gamers. The iTunes Store that Apple launched in 2003 provides an excellent analogy to training users. Initially, users could purchase 200,000 iTunes items. Today, the store has over ... its iPod and iTunes Store has fundamentally changed the music industry and the way end users expect to buy things. iPod owners used to buy albums ...
  445. Bring NASA Scientific Data into GIS

    NASA Astrophysics Data System (ADS)

    Xu, H.

    2016-12-01

    NASA's Earth Observation System (EOS) and many other missions produce large volumes of near-real-time data that drive the research and understanding of climate change. Geographic Information System (GIS) technology is used for the management, visualization and analysis of spatial data. Since its inception in the 1960s, GIS has been applied to many fields at the city, state, national, and world scales, and people continue to use it today to analyze and visualize trends, patterns, and relationships in massive scientific datasets. There is great interest in both the scientific and GIS communities in improving technologies that can bring scientific data into a GIS environment, where research and analysis can be shared through the GIS platform with the public. Most NASA scientific data are delivered in the Hierarchical Data Format (HDF), a format that is both flexible and powerful. However, this flexibility results in challenges when developing supporting GIS software: data stored in HDF formats lack a unified standard and convention across products. This presentation introduces an information model that enables ArcGIS software to ingest NASA scientific data and create a multidimensional raster - univariate and multivariate hypercubes - for scientific visualization and analysis. We present the framework by which ArcGIS leverages the open-source GDAL (Geospatial Data Abstraction Library) to support raster data access, discuss how we overcame the limitations of the GDAL drivers in handling scientific products stored in HDF4 and HDF5 formats, and describe how we improved the way multidimensionality is modeled with GDAL. In addition, we discuss the direction of ArcGIS support for NASA products and demonstrate how the multidimensional information model can help scientists work with data products such as MODIS, MOPITT and SMAP in a GIS environment.
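    As a small illustration of the GDAL route described above, the sketch below lists the subdatasets of an HDF granule and reads one of them as an array. The file name is a hypothetical MODIS granule; the subdataset layout varies by product, which is precisely the difficulty the presentation addresses.

        from osgeo import gdal

        ds = gdal.Open("MOD13Q1.A2016161.h09v05.006.hdf")  # hypothetical granule name
        for name, description in ds.GetSubDatasets():
            print(description)  # each subdataset is a separate scientific variable

        # Open one subdataset and read its first band into a NumPy array.
        sub = gdal.Open(ds.GetSubDatasets()[0][0])
        array = sub.GetRasterBand(1).ReadAsArray()
        print(array.shape)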
  446. Deglacial climate, carbon cycle and ocean chemistry changes in response to a terrestrial carbon release

    NASA Astrophysics Data System (ADS)

    Simmons, C. T.; Matthews, H. D.; Mysak, L. A.

    2016-02-01

    Researchers have proposed that a significant portion of the post-glacial rise in atmospheric CO2 could be due to the respiration of permafrost carbon stocks that formed over the course of glaciation. In this paper, we used the University of Victoria Earth System Climate Model v. 2.9 to simulate the deglacial and interglacial carbon cycle from the last glacial maximum to the present. The model's sensitivity to mid- and high-latitude terrestrial carbon storage is evaluated by including a 600 Pg C carbon pool parameterized to respire in concert with decreases in ice sheet surface area. The respiration of this stored carbon during the early stages of deglaciation had a large effect on the carbon cycle in these simulations, allowing atmospheric CO2 to increase by 40 ppmv in the model, with an additional 20 ppmv increase occurring in the case of a more realistic, prescribed CO2 radiative warming. These increases occurred prior to the large-scale carbon uptake associated in the proxy record with the reestablishment of boreal forests and peatlands (beginning in the early Holocene). Surprisingly, the large external carbon input to the atmosphere and oceans did not increase sediment dissolution and mean ocean alkalinity relative to a control simulation without the high-latitude carbon reservoir. In addition, our simulations suggest that an early deglacial terrestrial carbon release may come closer to explaining some observed deglacial changes in deep-ocean carbonate concentrations than simulations without such a release. We conclude that the respiration of glacial soil carbon stores may have been an important contributor to the deglacial CO2 rise, particularly in the early stages of deglaciation.

  447. Plants Regulate Soil Organic Matter Decomposition in Response to Sea Level Rise

    NASA Astrophysics Data System (ADS)

    Megonigal, P.; Mueller, P.; Jensen, K.

    2014-12-01

    Tidal wetlands have a large capacity for producing and storing organic matter, making their role in the global carbon budget disproportionate to their land area. Most of the organic matter stored in these systems is in soils, where it contributes 2-5 times more to surface accretion than an equal mass of minerals. Soil organic matter (SOM) sequestration is the primary process by which tidal wetlands become perched high in the tidal frame, decreasing their vulnerability to accelerated sea level rise. Plant growth responses to sea level rise are well understood and represented in century-scale forecast models of soil surface elevation change; we understand far less about the response of soil organic matter decomposition to rapid sea level rise. Here we quantified the effects of sea level on SOM decomposition rates by exposing planted and unplanted tidal marsh monoliths to experimentally manipulated flood durations. The study was performed in a field-based mesocosm facility at the Smithsonian's Global Change Research Wetland. SOM decomposition rate was quantified as CO2 efflux, with plant- and SOM-derived CO2 separated using a two end-member δ13C-CO2 mixing model. Despite the dogma that decomposition rates are inversely related to flooding, SOM mineralization was not sensitive to flood duration over a 35 cm range in soil surface elevation. However, decomposition rates were strongly and positively related to aboveground biomass (R2 ≥ 0.59, p ≤ 0.01). We conclude that soil carbon loss through decomposition is driven by plant responses to sea level in this intensively studied tidal marsh. If this result applies more generally to tidal wetlands, it has important implications for modeling soil organic matter and surface elevation change in response to accelerated sea level rise.
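    The two end-member partitioning used here reduces to a linear mixing equation, illustrated below with invented δ13C values; real end members come from plant and soil measurements.

        def fraction_from_source_a(delta_mix, delta_a, delta_b):
            """Fraction of the CO2 efflux attributable to end member A."""
            return (delta_mix - delta_b) / (delta_a - delta_b)

        # Suppose plant-derived CO2 has a delta-13C of -28 permil, SOM-derived
        # CO2 -18 permil, and the measured efflux is -22 permil:
        f_plant = fraction_from_source_a(-22.0, -28.0, -18.0)
        print(f_plant)  # 0.4 -> 40% plant-derived, 60% SOM-derived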
  448. Strategies to reduce the complexity of hydrologic data assimilation for high-dimensional models

    NASA Astrophysics Data System (ADS)

    Hernandez, F.; Liang, X.

    2017-12-01

    Probabilistic forecasts in the geosciences offer invaluable information by allowing the uncertainty of predicted conditions (including threats like floods and droughts) to be estimated. However, while forecast systems based on modern data assimilation algorithms are capable of producing multivariate probability distributions of future conditions, the computational resources required to fully characterize the dependencies between the model's state variables render their application impractical for high-resolution cases. This occurs because of the quadratic space complexity of storing the covariance matrices that encode these dependencies and the cubic time complexity of performing inference operations with them. In this work we introduce two complementary strategies to reduce the size of the covariance matrices that are at the heart of Bayesian assimilation methods, such as some variants of (ensemble) Kalman filters and particle filters, and of variational methods. The first strategy involves the optimized grouping of state variables by clustering individual cells of the model into "super-cells." A dynamic fuzzy clustering approach is used to take into account the states (e.g., soil moisture) and forcings (e.g., precipitation) of each cell at each time step. The second strategy consists in finding a compressed representation of the covariance matrix that still encodes the most relevant information but can be more efficiently stored and processed. A learning algorithm and a belief-propagation inference algorithm are developed to take advantage of this modified low-rank representation. The two proposed strategies are incorporated into OPTIMISTS, a state-of-the-art hybrid Bayesian/variational data assimilation algorithm, and comparative streamflow forecasting tests are performed using two watersheds modeled with the Distributed Hydrology Soil Vegetation Model (DHSVM). Contrasts are made between the efficiency gains and forecast accuracy losses of each strategy used in isolation and of those achieved through their coupling. We expect these developments to help catalyze improvements in the predictive accuracy of large-scale forecasting operations by lowering the costs of deploying advanced data assimilation techniques.
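    The second strategy can be pictured with a truncated eigendecomposition: keep an n-by-k factor instead of the full n-by-n covariance. The matrix below is a smooth synthetic covariance chosen for illustration; operational state dimensions and ranks would differ.

        import numpy as np

        n, k = 500, 20                        # state dimension, retained rank
        idx = np.arange(n)
        cov = np.exp(-((idx[:, None] - idx[None, :]) / 50.0) ** 2)  # smooth correlations

        vals, vecs = np.linalg.eigh(cov)      # eigendecomposition of the symmetric matrix
        top = np.argsort(vals)[-k:]           # keep the k largest eigenpairs
        factor = vecs[:, top] * np.sqrt(vals[top])   # n x k factor: O(nk) storage

        approx = factor @ factor.T
        rel_err = np.linalg.norm(cov - approx) / np.linalg.norm(cov)
        print(f"storage {n * k} vs {n * n} entries, relative error {rel_err:.4f}")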
  449. Application of the SCALE TSUNAMI Tools for the Validation of Criticality Safety Calculations Involving 233U

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Don; Rearden, Bradley T.; Hollenbach, Daniel F.

    2009-02-01

    The Radiochemical Development Facility at Oak Ridge National Laboratory has been storing solid materials containing 233U for decades. Preparations are under way to process these materials into a form that is inherently safe from a nuclear criticality safety perspective. This will be accomplished by down-blending the 233U materials with depleted or natural uranium. At the request of the U.S. Department of Energy, a study has been performed using the SCALE sensitivity and uncertainty analysis tools to demonstrate how these tools could be used to validate nuclear criticality safety calculations of selected process and storage configurations. ISOTEK nuclear criticality safety staff provided four models that are representative of the criticality safety calculations for which validation will be needed. The SCALE TSUNAMI-1D and TSUNAMI-3D sequences were used to generate energy-dependent keff sensitivity profiles for each nuclide and reaction present in the four safety analysis models, also referred to as the applications, and in a large set of critical experiments. The SCALE TSUNAMI-IP module was used together with the sensitivity profiles and the cross-section uncertainty data contained in the SCALE covariance data files to propagate the cross-section uncertainties (Δσ/σ) to keff uncertainties (Δk/k) for each application model. The SCALE TSUNAMI-IP module was also used to evaluate the similarity of each of the 672 critical experiments with each application. Results of the uncertainty analysis and similarity assessment are presented in this report. A total of 142 experiments were judged to be similar to application 1, and 68 experiments were judged to be similar to application 2. None of the 672 experiments were judged to be adequately similar to applications 3 and 4. Discussion of the uncertainty analysis and similarity assessment is provided for each of the four applications. Example upper subcritical limits (USLs) were generated for application 1 based on trending of the energy of average lethargy of neutrons causing fission, trending of the TSUNAMI similarity parameters, and use of data adjustment techniques.
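    The uncertainty propagation step is often called the "sandwich rule": the relative variance of keff is the sensitivity vector sandwiched around the cross-section covariance matrix. The numbers below are invented for illustration; real TSUNAMI profiles are energy- and reaction-resolved and far larger.

        import numpy as np

        # Relative keff sensitivities for three nuclide-reaction pairs,
        # integrated over energy (illustrative values only).
        S = np.array([0.35, -0.12, 0.05])

        # Relative covariance matrix of the corresponding cross sections.
        C = np.array([[4.0e-4, 1.0e-5, 0.0],
                      [1.0e-5, 9.0e-4, 0.0],
                      [0.0,    0.0,    2.5e-3]])

        dk_over_k = np.sqrt(S @ C @ S)   # (dk/k)^2 = S^T C S
        print(f"propagated keff uncertainty: {dk_over_k:.4%}")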
  450. Pore-scale mechanisms of gas flow in tight sand reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silin, D.; Kneafsey, T.J.; Ajo-Franklin, J.B.

    2010-11-30

    Tight gas sands are an unconventional hydrocarbon resource storing large volumes of natural gas. Microscopy and 3D imaging of reservoir samples at different scales and resolutions provide insights into the grain and pore structure. Although the grains are not significantly smaller in size than in conventional sandstones, the extremely dense grain packing makes the pore space tortuous, and the porosity is small. In some cases the inter-granular void space consists of micron-scale slits, whose geometry requires imaging at submicron resolutions. Maximal Inscribed Spheres computations simulate different scenarios of capillary-equilibrium two-phase fluid displacement. For tight sands, the simulations predict an unusually low wetting-fluid saturation threshold at which the non-wetting phase becomes disconnected. Flow simulations in combination with Maximal Inscribed Spheres computations evaluate relative permeability curves. The computations show that at the threshold saturation, when the non-wetting fluid becomes disconnected, the flow of both fluids is practically blocked: the non-wetting phase is immobile due to the disconnectedness, while the permeability to the wetting phase remains essentially zero due to the pore space geometry. This observation explains the "Permeability Jail" defined earlier by others. The gas is trapped by capillarity, and the brine is immobile due to dynamic effects. At the same time, in drainage, the simulations predict that the mobility of at least one of the fluids is greater than zero at all saturations. A pore-scale model of gas condensate dropout predicts the rate to be proportional to the scalar product of the fluid velocity and the pressure gradient; the narrowest constriction in the flow path is subject to the highest rate of condensation. The pore-scale model naturally upscales to Panfilov's Darcy-scale model, which implies that the condensate dropout rate is proportional to the pressure gradient squared. The pressure gradient is greatest near the matrix-fracture interface. The distinctive two-phase flow properties of tight sands imply that a small amount of gas condensate can seriously affect the recovery rate by blocking gas flow. Dry gas injection, pressure maintenance, or heating can help to preserve the mobility of the gas phase. A small amount of water can increase the mobility of gas condensate.

  451. Simulation of Porous Medium Hydrogen Storage - Estimation of Storage Capacity and Deliverability for a North German anticlinal Structure

    NASA Astrophysics Data System (ADS)

    Wang, B.; Bauer, S.; Pfeiffer, W. T.

    2015-12-01

    Large-scale energy storage will be required to mitigate offsets between electric energy demand and the fluctuating electric energy production from renewable sources like wind farms, if renewables come to dominate the energy supply. Porous formations in the subsurface could provide the large storage capacities required if chemical energy carriers, such as hydrogen gas produced during phases of energy surplus, are stored. This work assesses the behavior of a porous-medium hydrogen storage operation through numerical scenario simulation of a synthetic, heterogeneous sandstone formation formed by an anticlinal structure. The structural model is parameterized using data available for the North German Basin as well as data given for formations with similar characteristics. Based on the geological setting at the storage site, a total of 15 facies distributions is generated and the hydrogeological parameters are assigned accordingly; the spatially distributed parameters include permeability, porosity, relative permeability and capillary pressure. The storage is designed to supply energy in times of deficiency on the order of seven days, which represents the typical duration of weather conditions with no wind. It is found that using five injection/extraction wells, 21.3 million sm³ of hydrogen gas can be stored and retrieved to supply 62,688 MWh of energy within 7 days. This requires a ratio of working to cushion gas of 0.59. The retrievable energy within this time represents the demand of about 450,000 people. Furthermore, it is found that for longer storage times larger gas volumes have to be used, while for higher delivery rates the number of wells has to be increased as well. The formation investigated here thus appears to offer sufficient capacity and deliverability for a large-scale hydrogen gas storage operation.
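    The quoted capacity and delivery figures can be sanity-checked with simple arithmetic, assuming a lower heating value for hydrogen of roughly 3 kWh per standard cubic metre (the study's exact conversion factor is not stated here).

        stored_sm3 = 21.3e6      # hydrogen stored and retrieved, sm3
        energy_mwh = 62_688      # energy supplied over 7 days, MWh

        kwh_per_sm3 = energy_mwh * 1000 / stored_sm3
        print(f"implied energy content: {kwh_per_sm3:.2f} kWh/sm3")  # about 2.9

        mean_power_mw = energy_mwh / (7 * 24)
        print(f"mean delivery rate: {mean_power_mw:.0f} MW")         # about 373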
  452. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. We therefore developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on virtualized computing infrastructure and a distributed computing architecture: OpenStack and Docker are used to build a multi-user cloud computing infrastructure, and virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing, within which SuperMap GIScript and various open-source GIS libraries can be integrated. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow, Orange). These associated facilities, in conjunction with exploratory spatial data analysis tools and temporal data management and analysis systems, make GISpark a powerful geospatial computing tool. GISpark not only provides spatiotemporal big-data processing capacity in the geospatial field, but also provides spatiotemporal computational models and advanced geospatial visualization tools that apply to other domains with a spatial dimension. We tested the performance of the platform with a taxi trajectory analysis; the results suggest that GISpark achieves excellent run-time performance in spatiotemporal big data applications.
  453. On applications of chimera grid schemes to store separation

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.; Benek, J. A.; Steger, J. L.

    1985-01-01

    A finite difference scheme which uses multiple overset meshes to simulate the aerodynamics of aircraft/store interaction and store separation is described. In this chimera, or multiple-mesh, scheme, a complex configuration is mapped using a major grid about the main component of the configuration, and minor overset meshes are used to map each additional component such as a store. As a first step in modeling the aerodynamics of store separation, two-dimensional inviscid flow calculations were carried out in which one of the minor meshes is allowed to move with respect to the major grid. Solutions of calibrated two-dimensional problems indicate that allowing one mesh to move with respect to another does not adversely affect the time accuracy of an unsteady solution. Steady, inviscid three-dimensional computations demonstrate the capability to simulate complex configurations, including closely packed multiple bodies.

  454. Fjordic Environments of Scotland: A National Inventory of Sedimentary Blue Carbon.

    NASA Astrophysics Data System (ADS)

    Smeaton, Craig; Austin, William; Davies, Althea; Baltzer, Agnes; Howe, John

    2016-04-01

    Coastal sediments potentially hold a significant store of carbon, yet there has been no comprehensive attempt to quantitatively determine the quantity of carbon in these stores. Using Scottish sea lochs (fjords), we have established a Holocene record of the quantity and type of carbon held within the sediment store of a typical Scottish sea loch. Through the use of both seismic geophysics and geochemical measurements, we have developed a methodology to make first-order estimates of the carbon held within sea loch sediments. This methodology was applied to four sea lochs with differing geographical locations, catchments and freshwater inputs to produce the first sedimentary Blue Carbon estimates. The resulting carbon inventories show clearly that these sea lochs hold a significant store of sedimentary carbon; for example, Loch Sunart in Argyll stores an estimated 26.88 ± 0.52 Mt C. A direct comparison of the organic carbon content per unit area suggests that sea lochs have a greater OC storage potential than Scottish peatlands on long, Holocene timescales (Loch Sunart: 0.234 Mt OC km-2; peatland: 0.093 Mt OC km-2; Chapman et al. 2009). The carbon values calculated for these sea lochs have been used to estimate the total carbon held within Scotland's 110 sea lochs, and these up-scaled estimates are, for the first time, reviewed in the context of Scotland's known terrestrial stores. Chapman, S. J., Bell, J., Donnelly, D. and Lilly, A.: Carbon stocks in Scottish peatlands, Soil Use Manag., 25(2), 105-112, doi:10.1111/j.1475-2743.2009.00219.x, 2009.
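    A first-order inventory of this kind multiplies a seismically derived sediment volume by a dry bulk density and an organic carbon fraction. The numbers below are invented placeholders, not the Loch Sunart values.

        sediment_volume_m3 = 2.0e9   # from seismic survey (hypothetical)
        dry_bulk_density = 900.0     # kg of dry sediment per m3 (hypothetical)
        oc_fraction = 0.015          # organic carbon mass fraction (hypothetical)

        carbon_kg = sediment_volume_m3 * dry_bulk_density * oc_fraction
        print(f"~{carbon_kg / 1e9:.1f} Mt C stored")  # 1 Mt = 1e9 kg; here ~27 Mt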
  455. Scaling an expert system data mart: more facilities in real-time.

    PubMed

    McNamee, L. A.; Launsby, B. D.; Frisse, M. E.; Lehmann, R.; Ebker, K.

    1998-01-01

    Clinical data repositories are being rapidly adopted by large healthcare organizations as a method of centralizing and unifying clinical data currently stored in diverse and isolated information systems. Once the data are stored in a clinical data repository, healthcare organizations seek to use this centralized resource to store, analyze, interpret, and influence clinical care, quality and outcomes. A recent trend in the repository field has been the adoption of data marts: specialized subsets of enterprise-wide data taken from a larger repository and designed specifically to answer highly focused questions. A data mart exploits the data stored in the repository, but can use unique structures or summary statistics generated specifically for an area of study. Data marts thus benefit from the existence of a repository and are less general than a repository, but provide more effective and efficient support for an enterprise-wide data analysis task. In previous work, we described the use of batch processing to populate data marts directly from legacy systems. In this paper, we describe an architecture that uses both primary data sources and an evolving enterprise-wide clinical data repository to create real-time data sources for a clinical data mart supporting highly specialized clinical expert systems.

  456. Investigation of Ejection Releases of an MB-1 Rocket from a 0.04956-Scaled Model of the Convair F-106A Airplane at Several Mach Numbers and Simulated Altitudes

    NASA Technical Reports Server (NTRS)

    Lee, J. B.; Basford, R. C.

    1957-01-01

    As a continuation of an investigation of the ejection release characteristics of an internally carried MB-1 rocket in the Convair F-106A airplane, fin modifications at additional Mach numbers and simulated altitudes have been studied in the 27- by 27-inch preflight jet of the Langley Pilotless Aircraft Research Station at Wallops Island, Va. The MB-1 rocket was ejected with fins open, fins closed, fins closed with a shroud around the fins, and fins folded with a "boattail" placed in between the fins. Dynamically scaled models (0.04956 scale) were tested at simulated altitudes of 12,000, 18,850, and 27,500 feet at subsonic Mach numbers, and at 18,850, 27,500, and 40,000 feet for Mach numbers of 1.39, 1.59, and 1.98. Successful ejections can be obtained for over 10 store diameters from the release point by the use of a shroud around the folded fins, with the proper ejection velocity and nose-down pitching moment at release. In one case investigated it was found desirable to close off the front one-third of the bomb bay. It appeared that the fins should be opened after release and within 5 to 6 rocket diameters if no modifications are made on the rocket. An increase in fuselage angle of attack caused higher nose-up pitch rates after release.
The database is expanded from the previously described Tomato Expression Database...

Querying Large Biological Network Datasets

ERIC Educational Resources Information Center

Gulsoy, Gunhan

2013-01-01

New experimental methods have resulted in increasing amounts of genetic interaction data being generated every day. Biological networks are used to store the genetic interaction data gathered. The increasing amount of available data requires fast, large-scale analysis methods. Therefore, we address the problem of querying large biological network datasets.…

CONNECTICUT GROUND WATER QUALITY CLASSIFICATIONS - WELLS

EPA Science Inventory

This is a 1:24,000-scale datalayer of Ground Water Quality Classifications for public supply wells in Connecticut. It is a polygon Shapefile that includes GAA areas for public water supply wells. Each polygon is assigned a GAA ground water quality class, which is stored in the d...

Improving Performance and Predictability of Storage Arrays

ERIC Educational Resources Information Center

Altiparmak, Nihat

2013-01-01

Massive amounts of data are generated every day through sensors, Internet transactions, social networks, video, and all other digital sources available. Many organizations store this data to enable breakthrough discoveries and innovation in science, engineering, medicine, and commerce.
Such a massive scale of data poses new research problems called big…

Susceptibility to price discounting of soda by neighbourhood educational status: an ecological analysis of disparities in soda consumption using point-of-purchase transaction data in Montreal, Canada.

PubMed

Mamiya, Hiroshi; Moodie, Erica E M; Ma, Yu; Buckeridge, David L

2018-06-22

Price discounting is a marketing tactic used frequently by food industries and retailers, but the extent to which education modifies the effect of discounting on the purchasing of unhealthy foods has received little attention. We investigated whether there was a differential association of price discounting of soda with store-level soda purchasing records between 2008 and 2013 by store-neighbourhood education in Montreal, Canada. Using data on grocery purchase transactions from a sample of supermarkets, pharmacies, supercentres and convenience stores, we performed an ecological time-series analysis, modelling weekly store-level sales of soda as a function of store-level price discounting, store- and neighbourhood-level confounders, and an interaction term between discounting and categorical education in the neighbourhood of each store. Analysis by store type (n = 18,743, 12,437, 3,965 and 49,533 store-weeks for superstores, pharmacies, supercentres and convenience stores, respectively) revealed that the effect measure modification of discounting by neighbourhood education on soda purchasing was lower in stores in the more educated neighbourhoods, most notably in pharmacies: -0.020 (95% confidence interval (CI): -0.028, -0.012) and -0.038 (95% CI: -0.051, -0.025) for the middle- and high-education categories, respectively. Weaker effect modification was observed in convenience stores. There was no evidence of effect modification in supercentres or superstores. Price discounting is an important environmental risk factor for soda purchasing and can widen inequalities in excess sugar intake across levels of education. Interventions to regulate price discounting warrant further investigation as a public health strategy to improve population nutrition, particularly in lower-education neighbourhoods.
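To make the shape of such a model concrete, the sketch below fits an ecological regression with a discount-by-education interaction using the statsmodels formula API. The variable names and the synthetic data are illustrative stand-ins, not the study's actual variables or data.

    # Hypothetical OLS model with a discount x education interaction,
    # loosely following the analysis described above.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "log_sales": rng.normal(5, 1, n),          # weekly store-level sales
        "discount_pct": rng.uniform(0, 30, n),     # price discounting depth
        "education": rng.choice(["low", "middle", "high"], n),
    })

    # The product term lets the discounting slope differ by education level.
    model = smf.ols("log_sales ~ discount_pct * C(education)", data=df).fit()
    print(model.params)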
Implementation of Dynamic Extensible Adaptive Locally Exchangeable Measures (IDEALEM) v 0.1

DOE Office of Scientific and Technical Information (OSTI.GOV)

Sim, Alex; Lee, Dongeun; Wu, K. John

2016-03-04

Handling large streaming data is essential for various applications such as network traffic analysis, social networks, energy cost trends, and environment modeling. However, it is in general intractable to store, compute, search, and retrieve large streaming data. This software addresses a fundamental issue, which is to reduce the size of large streaming data and still obtain accurate statistical analysis. As an example, when a high-speed network such as a 100 Gbps network is monitored, the collected measurement data grow so rapidly that polynomial-time algorithms (e.g., Gaussian processes) become intractable. One possible solution to reduce the storage of vast amounts of measured data is to store a random sample, such as one out of every 1000 network packets. However, such static sampling methods (linear sampling) have drawbacks: (1) they are not scalable for high-rate streaming data, and (2) there is no guarantee of reflecting the underlying distribution. In this software, we implemented a dynamic sampling algorithm, based on the recent technology of relational dynamic Bayesian online locally exchangeable measures, that reduces the storage of data records at large scale and still provides accurate analysis of large streaming data. The software can be used for both online and offline data records.
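As a point of contrast with the static one-in-1000 sampling criticized above, the classic reservoir-sampling idea keeps a fixed-size, uniformly random sample of an unbounded stream in constant memory. The sketch below illustrates only that general notion of fixed-memory stream reduction; it is not IDEALEM's exchangeable-measures algorithm.

    import random

    def reservoir_sample(stream, k, seed=42):
        """Keep a uniform random sample of k items from a stream of unknown length."""
        rng = random.Random(seed)
        reservoir = []
        for i, item in enumerate(stream):
            if i < k:
                reservoir.append(item)    # fill the reservoir first
            else:
                j = rng.randint(0, i)     # inclusive bounds: keep with prob k/(i+1)
                if j < k:
                    reservoir[j] = item
        return reservoir

    print(reservoir_sample(range(1_000_000), k=10))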
Bigger is better: Improved nature conservation and economic returns from landscape-level mitigation

PubMed Central

Kennedy, Christina M.; Miteva, Daniela A.; Baumgarten, Leandro; Hawthorne, Peter L.; Sochi, Kei; Polasky, Stephen; Oakleaf, James R.; Uhlhorn, Elizabeth M.; Kiesecker, Joseph

2016-01-01

Impact mitigation is a primary mechanism on which countries rely to reduce environmental externalities and balance development with conservation. Mitigation policies are transitioning from traditional project-by-project planning to landscape-level planning. Although this larger-scale approach is expected to provide greater conservation benefits at the lowest cost, empirical justification is still scarce. Using commercial sugarcane expansion in the Brazilian Cerrado as a case study, we apply economic and biophysical steady-state models to quantify the benefits of the Brazilian Forest Code (FC) under landscape- and property-level planning. We find that FC compliance imposes small costs on business but can generate significant long-term benefits to nature: supporting 32 (±37) additional species (largely habitat specialists), storing 593,000 to 2,280,000 additional tons of carbon worth $69 million to $265 million ($ pertains to U.S. dollars), and marginally improving surface water quality. Relative to property-level compliance, we find that landscape-level compliance reduces total business costs by $19 million to $35 million per 6-year sugarcane growing cycle while often supporting more species and storing more carbon. Our results demonstrate that landscape-level mitigation provides cost-effective conservation and can be used to promote sustainable development. PMID:27419225

Hysteresis, neural avalanches, and critical behavior near a first-order transition of a spiking neural network

NASA Astrophysics Data System (ADS)

Scarpetta, Silvia; Apicella, Ilenia; Minati, Ludovico; de Candia, Antonio

2018-06-01

Many experimental results, both in vivo and in vitro, support the idea that the brain cortex operates near a critical point and at the same time works as a reservoir of precise spatiotemporal patterns. However, the mechanism at the basis of these observations is still not clear. In this paper we introduce a model which combines both these features, showing that scale-free avalanches are the signature of a system posed near the spinodal line of a first-order transition, with many spatiotemporal patterns stored as dynamical metastable attractors. Specifically, we studied a network of leaky integrate-and-fire neurons whose connections are the result of the learning of multiple spatiotemporal dynamical patterns, each with a randomly chosen ordering of the neurons. We found that the network shows a first-order transition between a low-spiking-rate disordered state (down) and a high-rate state characterized by the emergence of collective activity and the replay of one of the stored patterns (up). The transition is characterized by hysteresis, or alternation of up and down states, depending on the lifetime of the metastable states. In both cases, critical features and neural avalanches are observed. Notably, critical phenomena occur at the edge of a discontinuous phase transition, as recently observed in a network of glow lamps.
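The building block of the network described above is the leaky integrate-and-fire neuron. A minimal single-neuron sketch with Euler integration follows; all parameters are illustrative, not those of the paper.

    dt, T = 0.1, 100.0                      # time step and duration (ms)
    tau_m, v_rest, v_th, v_reset = 10.0, 0.0, 1.0, 0.0
    i_ext = 0.12                            # constant external drive

    v, spikes = v_rest, []
    for step in range(int(T / dt)):
        # Euler step of dv/dt = -(v - v_rest)/tau_m + i_ext
        v += dt * (-(v - v_rest) / tau_m + i_ext)
        if v >= v_th:                       # threshold crossing: spike and reset
            spikes.append(step * dt)
            v = v_reset

    print(f"{len(spikes)} spikes, first at t = {spikes[0]:.1f} ms")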
Modeling wood dynamics, jam formation, and sediment storage in a gravel-bed stream

NASA Astrophysics Data System (ADS)

Eaton, B. C.; Hassan, M. A.; Davidson, S. L.

2012-12-01

In small and intermediate-sized streams, the interaction between wood and bed material transport often determines the nature of the physical habitat, which in turn influences the health of the stream's ecosystem. We present a stochastic model that can be used to simulate the effects on physical habitat of forest fires, climate change, and other environmental disturbances that alter wood recruitment. The model predicts large wood (LW) loads in a stream as well as the volume of sediment stored by the wood; while it is parameterized to describe gravel-bed streams similar to a well-studied field prototype, Fishtrap Creek, British Columbia, it can be calibrated to other systems as well. In the model, LW pieces are produced and modified over time as a result of random tree-fall, LW breakage, LW movement, and piece interaction to form LW jams. Each LW piece traps a portion of the annual bed material transport entering the reach and releases the stored sediment when the LW piece is entrained and moved. The equations governing sediment storage are based on a set of flume experiments also scaled to the field prototype. The model predicts wood loads ranging from 70 m3/ha to more than 300 m3/ha, with a mean value of 178 m3/ha: both the range and the mean value are consistent with field data from streams with similar riparian forest types and climate. The model also predicts an LW jam spacing that is consistent with field data. Furthermore, our modeling results demonstrate that the high spatial and temporal variability in sediment storage, sediment transport, and channel morphology associated with LW-dominated streams occurs only when LW pieces interact and form jams. Model runs that do not include jam formation are much less variable. These results suggest that river restoration efforts using engineered LW pieces that are fixed in place and not permitted to interact will be less successful at restoring the geomorphic processes responsible for producing diverse, productive physical habitats than efforts using LW pieces that are free to move, interact, and form LW jams.

Sampling stored product insect pests: a comparison of four statistical sampling models for probability of pest detection

USDA-ARS's Scientific Manuscript database

Statistically robust sampling strategies form an integral component of grain storage and handling activities throughout the world. Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult due to species biology and behavioral characteristics. ...
Stored grain pack factors for wheat: comparison of three methods to field measurements

USDA-ARS's Scientific Manuscript database

Storing grain in bulk storage units results in grain packing from overbearing pressure, which increases grain bulk density and storage-unit capacity. This study compared pack factors of hard red winter (HRW) wheat in vertical storage bins using different methods: the existing packing model (WPACKING...

INDIVIDUAL-BASED MODELS: POWERFUL OR POWER STRUGGLE?

PubMed

Willem, L; Stijven, S; Hens, N; Vladislavleva, E; Broeckhove, J; Beutels, P

2015-01-01

Individual-based models (IBMs) offer endless possibilities to explore various research questions, but come with high model complexity and computational burden. Large-scale IBMs have become feasible, but the novel hardware architectures require adapted software. The increased model complexity also requires systematic exploration to gain thorough system understanding. We elaborate on the development of IBMs for vaccine-preventable infectious diseases and on model exploration with active learning. Investment in IBM simulator code can lead to significant runtime reductions. We found large performance differences due to data locality. Sorting the population once reduced simulation time by a factor of two. Storing person attributes separately instead of using person objects also seemed more efficient. Next, we improved model performance by up to 70% by structuring potential contacts based on health status before processing disease transmission. The active learning approach we present is based on iterative surrogate modelling and model-guided experimentation. Symbolic regression is used for nonlinear response surface modelling with automatic feature selection. We illustrate our approach using an IBM for influenza vaccination. After optimizing the parameter space, we observed an inverse relationship between vaccination coverage and the clinical attack rate, reinforced by herd immunity.
These insights can be used to focus and optimise research activities, and to reduce both dimensionality and decision uncertainty.

Lithospheric structure of the Western Alps as seen by full-waveform inversion of CIFALPS teleseismic data

NASA Astrophysics Data System (ADS)

Beller, Stephen; Monteiller, Vadim; Operto, Stéphane; Nolet, Guust; Paul, Anne; Zhao, Liang

2017-04-01

Full-waveform inversion (FWI) is a powerful but computationally intensive technique that aims to recover 3D multiparameter images of the subsurface by minimising the waveform difference between the full recorded and modelled seismograms. This method has recently been adapted and successfully applied in lithospheric settings by tackling teleseismic waveform modelling with hybrid methods. For each event, a global-scale simulation is performed once and for all to store the wavefield solutions on the edges of the lithospheric target. Then, for each modelling step involved in the FWI process, these global-scale solutions are injected into the lithospheric medium from the boundaries. We present the results of the application of teleseismic FWI to the data acquired by the CIFALPS experiment, which was conducted in the Western Alps to gain new insights into its lithospheric structure and the geodynamic evolution of the Alpine range. Nine teleseismic events were inverted to infer 3D models of density, P-wave velocity and S-wave velocity of the crust and the upper mantle down to 200 km depth. Our models show clear evidence of continental subduction during the Alpine orogeny. They outline a dipping European Moho down to 75 km depth and finely delineate the geometry of the Ivrea body at the suture between the European and Adriatic plates. Deeper in the mantle, a slow S-wave velocity anomaly might indicate the location of the European slab detachment. Overall, the FWI models give access to new seismic images that fill the resolution gap between smooth tomographic models and sharp receiver-function images of the lithosphere, and enable integrated interpretations of crustal and upper-mantle structures.

CARS microscopy for the monitoring of lipid storage in C. elegans

NASA Astrophysics Data System (ADS)

Enejder, Annika; Brackmann, Christian; Axäng, Claes; Åkeson, Madeleine; Pilon, Marc

2008-02-01

After several years of proof-of-principle measurements and focus on technological development, it is timely to make full use of the capabilities of CARS microscopy within the biosciences. We have here identified an urgent biological problem to which CARS microscopy provides unique insights, and through which it may consequently become a widely accepted experimental procedure.
In order to improve present understanding of the mechanisms underlying the dysfunctional metabolic regulation reported for many of our most widespread diseases (obesity, diabetes, cardiovascular disease, etc.), we have monitored genetic and environmental impacts on cellular lipid storage in the model organism C. elegans in vivo in a full-scale biological study. Important advantages of CARS microscopy could be demonstrated compared to present technology, i.e. fluorescence microscopy of labelled lipid stores. The fluorescence signal varies not only with the presence of lipids, but also with the systemic distribution of the fluorophore and the chemical properties of the surrounding medium. By instead probing high-density regions of C-H bonds naturally occurring in the sample, the CARS process was shown to provide a consistent representation of the lipid stores. The increased accumulation of lipid stores in mutants with deficiencies in the insulin and transforming growth factor signalling pathways could hereby be visualized and quantified. Furthermore, spectral CARS microscopy measurements in the C-H bond region of 2780-2930 cm-1 provided the interesting observation that this accumulation comes with a shift in the ordering of the lipids from gel to liquid phase. The present study illustrates that CARS microscopy has a strong potential to become an important instrument for systemic studies of lipid storage mechanisms in living organisms, providing new insights into the phenomena underlying metabolic disorders.

The association of point-of-sale cigarette marketing with cravings to smoke: results from a cross-sectional population-based study.

PubMed

Siahpush, Mohammad; Shaikh, Raees A; Cummings, K Michael; Hyland, Andrew; Dodd, Michael; Carlson, Les; Kessler, Asia Sikora; Meza, Jane; Wan, Neng; Wakefield, Melanie

2016-07-01

To examine the association between recalled exposure to point-of-sale (POS) cigarette marketing (i.e., pack displays, advertisements and promotions such as discounts) and reported cravings to smoke while visiting a store. Data were collected using a telephone survey of a cross-sectional sample of 999 adult smokers in Omaha, Nebraska. Recalled exposure to POS cigarette marketing was measured by asking respondents about noticing (a) pack displays, (b) advertisements and (c) promotions in stores in their neighbourhood. A 3-item scale indicating the frequency of experiencing cravings to smoke in locations where cigarettes are sold was created by asking respondents how often they: (1) "feel a craving for a cigarette?" (2) "feel like nothing would be better than smoking a cigarette?" and (3) "feel like all you want is a cigarette?" The association between recalled exposure to POS cigarette marketing and cravings was estimated using ordinary least squares linear regression models, controlling for nicotine dependence, gender, age, race/ethnicity, income, education, frequency of visiting stores in one's neighbourhood, and method of recruitment into the study. Recalled exposure to POS cigarette displays (p<0.001) and advertisements (p=0.002), but not promotions (p=0.06), was associated with more frequent cravings to smoke.
Recalled exposure to POS cigarette marketing is associated with cravings to smoke, as predicted by laboratory studies on the effects of smoking cues on cigarette craving. Policies that reduce or eliminate POS cigarette marketing could reduce cigarette cravings and might attenuate impulse buying of cigarettes. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

Evaluating a Local Ensemble Transform Kalman Filter snow cover data assimilation method to estimate SWE within a high-resolution hydrologic modeling framework across Western US mountainous regions

NASA Astrophysics Data System (ADS)

Oaida, C. M.; Andreadis, K.; Reager, J. T., II; Famiglietti, J. S.; Levoe, S.

2017-12-01

Accurately estimating how much snow water equivalent (SWE) is stored in mountainous regions characterized by complex terrain and snowmelt-driven hydrologic cycles is not only greatly desirable, but also a big challenge. Mountain snowpack exhibits high spatial variability across a broad range of spatial and temporal scales due to a multitude of physical and climatic factors, making it difficult to observe or estimate in its entirety. Combining remotely sensed data and high-resolution hydrologic modeling through data assimilation (DA) has the potential to provide a spatially and temporally continuous SWE dataset at horizontal scales that capture sub-grid snow spatial variability and are also relevant to stakeholders such as water resource managers. Here, we present the evaluation of a new snow DA approach that uses a Local Ensemble Transform Kalman Filter (LETKF) in tandem with the Variable Infiltration Capacity (VIC) macro-scale hydrologic model across the Western United States, at a daily temporal resolution and a horizontal resolution of 1.75 km x 1.75 km. The LETKF is chosen for its relative simplicity, ease of implementation, and computational efficiency and scalability. The modeling/DA system assimilates daily MODIS Snow Covered Area and Grain Size (MODSCAG) fractional snow cover, and has been developed to efficiently calculate SWE estimates over extended periods of time and over large regional-scale areas at relatively high spatial resolution, ultimately producing a snow reanalysis-type dataset. Here we focus on the assessment of SWE produced by the DA scheme over several basins in California's Sierra Nevada Mountain range where Airborne Snow Observatory data are available, during the last five water years (2013-2017), which include both one of the driest and one of the wettest years. Comparison against such a spatially distributed SWE observational product provides a greater understanding of the model's ability to estimate SWE and SWE spatial variability, and highlights the conditions under which snow cover DA can add value in estimating SWE.
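To show the mechanics of ensemble Kalman updating, here is a toy perturbed-observation EnKF update for a single scalar state (say, SWE in one grid cell). The LETKF used in the study performs a more elaborate deterministic transform in ensemble space; this plain stochastic variant with invented numbers only conveys the common prior-to-posterior step.

    import numpy as np

    rng = np.random.default_rng(1)
    ens = rng.normal(0.5, 0.15, size=50)    # prior SWE ensemble (m), hypothetical
    y_obs, r = 0.65, 0.05 ** 2              # observation and its error variance

    hx = ens                                # observation operator: identity here
    p = np.var(ens, ddof=1)                 # forecast (ensemble) variance
    k = p / (p + r)                         # Kalman gain for a scalar state
    # Perturbed-observation update: one independent observation draw per member.
    ens_a = ens + k * (y_obs + rng.normal(0, 0.05, ens.size) - hx)

    print(f"prior mean {ens.mean():.3f} -> posterior mean {ens_a.mean():.3f}")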
Producer-retailer integrated EMQ system with machine breakdown, rework failures, and a discontinuous inventory issuing policy.

PubMed

Chiu, Singa Wang; Chen, Shin-Wei; Chiu, Yuan-Shyi Peter; Li, Ting-Wei

2016-01-01

This study develops two extended economic manufacturing quantity (EMQ)-based models with a discontinuous product issuing policy, random machine breakdown, and rework failures. Various real conditions in production processes, end-product delivery, and intra-supply chains, such as a producer-retailer integrated scheme, are examined. The first model incorporates a discontinuous multi-delivery policy into a prior work (Chiu et al. in Proc Inst Mech Eng B J Eng 223:183-194, 2009) in lieu of their continuous policy. Such an enhanced model can address situations in supply chain environments where finished products are transported to outside retail stores (or customers). The second model further incorporates the retailer's stock holding costs into the first model. This extended EMQ model is applicable to situations in present-day manufacturing firms where finished products are distributed to the company's own retail stores (or regional sales offices) and stocked there for sale. The two extended EMQ models are investigated in turn. Mathematical modeling, along with iterative algorithms, is employed to derive the optimal production run times that minimize the expected total system costs, including the costs incurred in production units, transportation, and retail stores, for these integrated EMQ systems. Numerical examples are provided to demonstrate the practical application of the research results.

Capability of a Mobile Monitoring System to Provide Real-Time Data Broadcasting and Near Real-Time Source Attribution

NASA Astrophysics Data System (ADS)

Erickson, M.; Olaguer, J.; Wijesinghe, A.; Colvin, J.; Neish, B.; Williams, J.

2014-12-01

It is becoming increasingly important to understand the emissions and health effects of industrial facilities. Many areas have no or limited sustained monitoring capabilities, making it difficult to quantify the major pollution sources affecting human health, especially in fence-line communities. Developments in real-time monitoring and micro-scale modeling offer unique ways to tackle these complex issues. This presentation will demonstrate the capability of coupling real-time observations with micro-scale modeling to provide real-time information and near real-time source attribution.
The Houston Advanced Research Center constructed the Mobile Acquisition of Real-time Concentrations (MARC) laboratory. MARC consists of a Ford E-350 passenger van outfitted with a Proton Transfer Reaction Mass Spectrometer (PTR-MS) and meteorological equipment. This allows for the fast measurement of various VOCs important to air quality. The data recorded from the van are uploaded to an off-site database and the information is broadcast to a website in real time. This provides for off-site monitoring of MARC's observations, which allows off-site personnel to provide immediate input to the MARC operators on how best to achieve project objectives. The information stored in the database can also be used to provide near real-time source attribution. An inverse model has been used to ascertain the amount, location, and timing of emissions based on MARC measurements in the vicinity of industrial sites. The inverse model is based on a 3D micro-scale Eulerian forward and adjoint air quality model known as the HARC model. The HARC model uses output from the Quick Urban and Industrial Complex (QUIC) wind model and requires a 3D digital model of the monitored facility based on lidar or industrial permit data. MARC is one of the instrument platforms deployed during the 2014 Benzene and other Toxics Exposure Study (BEE-TEX) in Houston, TX. The main goal of the study is to quantify and explain the origin of ambient exposure to hazardous air pollutants in an industrial fence-line community near the Houston Ship Channel. Preliminary results derived from analysis of MARC observations during the BEE-TEX experiment will be presented.

The Photovoltaic Heat Island Effect: Larger solar power plants increase local temperatures

PubMed Central

Barron-Gafford, Greg A.; Minor, Rebecca L.; Allen, Nathan A.; Cronin, Alex D.; Brooks, Adria E.; Pavao-Zuckerman, Mitchell A.

2016-01-01

While photovoltaic (PV) renewable energy production has surged, concerns remain about whether or not PV power plants induce a "heat island" (PVHI) effect, much like the increase in ambient temperatures relative to wildlands that generates the Urban Heat Island effect in cities. Transitions to PV plants alter the way that incoming energy is reflected back to the atmosphere or absorbed, stored, and reradiated, because PV plants change the albedo, vegetation, and structure of the terrain. Prior work on the PVHI has been mostly theoretical or based upon simulated models. Furthermore, past empirical work has been limited in scope to a single biome. Because there are still large uncertainties surrounding the potential for a PVHI effect, we examined the PVHI empirically with experiments that spanned three biomes. We found temperatures over a PV plant were regularly 3-4 °C warmer than wildlands at night, which is in direct contrast to other studies based on models that suggested that PV systems should decrease ambient temperatures.
Deducing the underlying cause and scale of the PVHI effect and identifying mitigation strategies are key to supporting decision-making regarding PV development, particularly in semiarid landscapes, which are among the most likely sites for large-scale PV installations. PMID:27733772

Multi-resolution extension for transmission of geodata in a mobile context

NASA Astrophysics Data System (ADS)

Follin, Jean-Michel; Bouju, Alain; Bertrand, Frédéric; Boursier, Patrice

2005-03-01

A solution is proposed for the management of multi-resolution vector data in a mobile spatial information visualization system. The client-server architecture and the data and transfer models of the system are presented first. The aim of this system is to reduce the data exchanged between client and server by reusing data already present on the client side. Then, an extension of this system to multi-resolution data is proposed. Our solution is based on the use of increments in a multi-scale database. A database architecture is adopted in which datasets for different predefined scales are precomputed and stored on the server side. In this model, the objects representing the same real-world entities at different levels of detail have to be linked beforehand. Increments correspond to the difference between two datasets with different levels of detail. They are transmitted in order to increase (or decrease) the detail on the client upon request. They include generalization and refinement operators allowing transitions between the different levels. Finally, a framework suited to the transfer of multi-resolution data in a mobile context is presented. This allows reuse of data locally available at different levels of detail and, in this way, reduces the amount of data transferred between client and server.
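The increment idea can be sketched in a few lines: the client already holds the coarse representation of a feature, so the server transmits only the refinement operators needed to reach the finer level of detail. The data model below is hypothetical and greatly simplified relative to the paper's.

    # Client-side cache holds a feature at the coarse 1:50k level.
    coarse = {"road_17": {"lod": "50k", "points": [(0, 0), (10, 10)]}}

    # Increment: refinement operators taking this object from 50k to 10k.
    increment = {"road_17": {"lod": "10k",
                             "insert_points": {1: [(4, 3), (7, 8)]}}}

    def apply_increment(features, inc):
        for fid, patch in inc.items():
            pts = list(features[fid]["points"])
            # Insert detail vertices, highest index first so indices stay valid.
            for idx in sorted(patch["insert_points"], reverse=True):
                pts[idx:idx] = patch["insert_points"][idx]
            features[fid] = {"lod": patch["lod"], "points": pts}

    apply_increment(coarse, increment)
    print(coarse["road_17"])    # now four vertices at LOD 10k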
Developing Higher-Order Materials Knowledge Systems

NASA Astrophysics Data System (ADS)

Fast, Anthony Nathan

2011-12-01

Advances in computational materials science and novel characterization techniques have allowed scientists to probe deeply into a diverse range of materials phenomena. These activities are producing enormous amounts of information regarding the roles of various hierarchical material features in the overall performance characteristics displayed by the material. Connecting the hierarchical information over disparate domains is at the crux of multiscale modeling. The inherent challenge of performing multiscale simulations is developing scale-bridging relationships to couple material information between well-separated length scales. Much progress has been made in the development of homogenization relationships, which replace heterogeneous material features with effective homogeneous descriptions. These relationships facilitate the flow of information from lower length scales to higher length scales. Meanwhile, most localization relationships that link the information from a higher length scale to a lower length scale are plagued by computationally intensive techniques which are not readily integrated into multiscale simulations. The challenge of executing fully coupled multiscale simulations is augmented by the need to incorporate the evolution of the material structure that may occur under conditions such as material processing. To address these challenges with multiscale simulation, a novel framework called the Materials Knowledge System (MKS) has been developed. This methodology efficiently extracts, stores, and recalls microstructure-property-processing localization relationships. The approach is built on the statistical continuum theories developed by Kroner, which express the localization of the response field at the microscale using a series of highly complex convolution integrals that have historically been evaluated analytically. The MKS approach dramatically improves the accuracy of these expressions by calibrating the convolution kernels to results from previously validated physics-based models. These novel tools have been validated for elastic strain localization in moderate-contrast dual-phase composites by direct comparison with predictions from a finite element model. The versatility of the approach is further demonstrated by its successful application to capturing the structure evolution during spinodal decomposition of a binary alloy. Lastly, some key features of the future application of the MKS approach are developed using the Portevin-Le Chatelier effect. These case studies show that the MKS approach is capable of accurately reproducing the results of physics-based models with a drastic reduction in computational requirements.
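In its first-order form, the MKS localization expresses the local response field as a sum of convolutions of influence kernels with the microstructure function. The sketch below shows that structure on a 1D two-phase microstructure; the kernel values here are invented, whereas in the MKS they are calibrated against physics-based (e.g., finite element) results.

    import numpy as np

    n = 64
    rng = np.random.default_rng(5)
    phase = (rng.random(n) > 0.5).astype(float)
    m = np.stack([phase, 1.0 - phase])      # microstructure function, two phases

    dist = np.minimum(np.arange(n), n - np.arange(n))   # distance on a ring
    alpha = np.stack([np.exp(-dist / 2.0),              # hypothetical influence
                      0.5 * np.exp(-dist / 4.0)])       # kernels, one per phase

    # response[s] = sum over phases h of (alpha_h convolved with m_h)[s],
    # computed as a circular convolution via FFT.
    response = np.fft.ifft(
        (np.fft.fft(alpha, axis=1) * np.fft.fft(m, axis=1)).sum(axis=0)
    ).real
    print(response[:5])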
Process evaluation of a food marketing and environmental change intervention in Tiendas that serve Latino immigrants in North Carolina.

PubMed

Baquero, Barbara; Linnan, Laura; Laraia, Barbara A; Ayala, Guadalupe X

2014-11-01

This article describes a comprehensive process evaluation of an efficacious store-based intervention that increased store customers' fruit and vegetable consumption. The process evaluation plan was designed at study inception and implemented at baseline, during the intervention, and at immediate post-intervention. Four Latino food stores were randomly assigned either to an intervention or to a control condition. Data were collected from store managers, employees, and 139 Latino customers. Researchers used manager, employee, and customer interviews; weekly observations of the store environment; and implementation logs to assess reach, dose delivered, dose received, and fidelity. Results indicated that it is possible to reach customers in a store-based intervention. Indicators of dose delivered demonstrated that the intervention was implemented as planned, and in the case of employee training, it exceeded the plan. Dose received data indicated that customers engaged moderately with the intervention activities. Together these results suggest that the intervention was delivered with good fidelity. Comprehensive process evaluation efforts can facilitate the identification and elimination of barriers to implementation. This approach can serve as a model for future store-based interventions. The study demonstrated that it is feasible to implement Latino food store-based interventions to increase access to and consumption of fruits and vegetables. © 2014 Society for Public Health Education.

Building energy analysis tool

DOEpatents

Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

2016-04-12

A building energy analysis system includes a building component library configured to store a plurality of building components; a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and selected building components stored in the library; a building analysis engine configured to operate the building model, generate a baseline energy model of the building under analysis, and apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models; and a recommendation tool configured to assess the optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

Exploring landscapes and ecosystems by studying their streams

NASA Astrophysics Data System (ADS)

Kirchner, J. W.

2016-12-01

Streams integrate fluxes of water, solutes, and sediment from their catchments, and thus they act as mirrors of the surrounding landscape. Patterns of streamflow, chemistry, and sediment flux can therefore shed light on physical, chemical, and biological processes at the scale of whole ecosystems. However, landscapes also exhibit preferential flow and pervasive heterogeneity on all scales, and therefore store waters over a wide spectrum of time scales, complicating efforts to interpret hydrological and geochemical signals in streamwaters. Here I review current and recent research exploring how landscapes store, mix, and release water and solutes to streams. Groundwater levels and stream flows exhibit diurnal cycles in response to snowmelt in springtime and transpiration during the growing season. These cycles vividly illustrate how aquifers and streams mirror ecological processes in their surrounding landscapes.
Stream networks extend and retract, both seasonally and in response to individual rainfall events, dynamically mapping out variations in subsurface transmissivity and in the balance between precipitation and transpiration. Water quality time series spanning the periodic table, from H+ to U, exhibit universal fractal scaling on time scales from hours to decades. This scaling behavior is a temporal expression of the spatial heterogeneity that pervades the subsurface, and it confounds efforts to identify water quality trends. Isotope tracers such as 18O, 2H, 3H, and 14C can be used to quantify water ages over seven orders of magnitude, from hours to thousands of years. These tracers show that substantial fractions of streamflow are hours, days, and months old, even in streams fed by aquifers with significant proportions of pre-Holocene groundwater. Examples such as these will be presented to illustrate the close coupling between landscapes and the waters that drain them, and to demonstrate how streams can be used as windows into landscape processes.

A genome-scale metabolic flux model of Escherichia coli K–12 derived from the EcoCyc database

PubMed Central

2014-01-01

Background Constraint-based models of Escherichia coli metabolic flux have played a key role in computational studies of cellular metabolism at the genome scale. We sought to develop a next-generation constraint-based E. coli model that achieved improved phenotypic prediction accuracy while being frequently updated and easy to use. We also sought to compare model predictions with experimental data to highlight open questions in E. coli biology. Results We present EcoCyc–18.0–GEM, a genome-scale model of the E. coli K–12 MG1655 metabolic network. The model is automatically generated from the current state of EcoCyc using the MetaFlux software, enabling the release of multiple model updates per year. EcoCyc–18.0–GEM encompasses 1445 genes, 2286 unique metabolic reactions, and 1453 unique metabolites. We demonstrate a three-part validation of the model that breaks new ground in breadth and accuracy: (i) comparison of simulated growth in aerobic and anaerobic glucose culture with experimental results from chemostat culture and simulation results from the E. coli modeling literature; (ii) essentiality prediction for the 1445 genes represented in the model, in which EcoCyc–18.0–GEM achieves an improved accuracy of 95.2% in predicting the growth phenotype of experimental gene knockouts; and (iii) nutrient utilization predictions under 431 different media conditions, for which the model achieves an overall accuracy of 80.7%. The model's derivation from EcoCyc enables query and visualization via the EcoCyc website, facilitating model reuse and validation by inspection. We present an extensive investigation of disagreements between EcoCyc–18.0–GEM predictions and experimental data to highlight areas of interest to E. coli modelers and experimentalists, including 70 incorrect predictions of gene essentiality on glucose, 80 incorrect predictions of gene essentiality on glycerol, and 83 incorrect predictions of nutrient utilization. Conclusion Significant advantages can be derived from the combination of model organism databases and flux balance modeling represented by MetaFlux. Interpretation of the EcoCyc database as a flux balance model results in a highly accurate metabolic model and provides a rigorous consistency check for information stored in the database. PMID:24974895
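The constraint-based method behind such genome-scale models, flux balance analysis, reduces to a linear program: maximize a biomass flux subject to steady state (S v = 0) and flux bounds. The toy sketch below shrinks this to an invented three-reaction network purely to show the mechanics.

    import numpy as np
    from scipy.optimize import linprog

    # Metabolites: A, B. Reactions: uptake (-> A), conversion (A -> B),
    # biomass drain (B ->). Stoichiometric matrix S (metabolites x reactions).
    S = np.array([[1, -1,  0],     # A
                  [0,  1, -1]])    # B
    bounds = [(0, 10), (0, 5), (0, None)]   # uptake capped at 10, enzyme at 5
    c = [0, 0, -1]                 # linprog minimizes, so negate biomass flux

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print("optimal fluxes:", res.x)         # the conversion cap (5) binds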
Impact of Data Placement on Resilience in Large-Scale Object Storage Systems

DOE Office of Scientific and Technical Information (OSTI.GOV)

Carns, Philip; Harms, Kevin; Jenkins, John

Distributed object storage architectures have become the de facto standard for high-performance storage in big data, cloud, and HPC computing. Object storage deployments using commodity hardware to reduce costs often employ object replication as a method to achieve data resilience. Repairing object replicas after failure is a daunting task for systems with thousands of servers and billions of objects, however, and it is increasingly difficult to evaluate such scenarios at scale on real-world systems. Resilience and availability are both compromised if objects are not repaired in a timely manner. In this work we leverage a high-fidelity discrete-event simulation model to investigate replica reconstruction on large-scale object storage systems with thousands of servers, billions of objects, and petabytes of data. We evaluate the behavior of CRUSH, a well-known object placement algorithm, and identify configuration scenarios in which aggregate rebuild performance is constrained by object placement policies. After determining the root cause of this bottleneck, we then propose enhancements to CRUSH and the usage policies atop it to enable scalable replica reconstruction. We use these methods to demonstrate a simulated aggregate rebuild rate of 410 GiB/s (within 5% of projected ideal linear scaling) on a 1,024-node commodity storage system. We also uncover an unexpected phenomenon in rebuild performance based on the characteristics of the data stored on the system.
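CRUSH's central idea, that any client can deterministically compute where an object's replicas live without a central lookup table, can be illustrated with the simpler rendezvous (highest-random-weight) hashing scheme below. This is a stand-in sketch, not the CRUSH algorithm itself.

    import hashlib

    def place(object_id, servers, n_replicas=3):
        """Return the n_replicas servers holding object_id's replicas."""
        def weight(server):
            h = hashlib.sha256(f"{object_id}:{server}".encode()).hexdigest()
            return int(h, 16)
        # The servers with the highest hash weights hold the replicas.
        return sorted(servers, key=weight, reverse=True)[:n_replicas]

    servers = [f"osd{i}" for i in range(16)]
    print(place("object-42", servers))
    # If one server fails, only the objects mapped to it move; all other
    # assignments are unchanged, which keeps rebuild traffic localized.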
Neutral Theory and Scale-Free Neural Dynamics

NASA Astrophysics Data System (ADS)

Martinello, Matteo; Hidalgo, Jorge; Maritan, Amos; di Santo, Serena; Plenz, Dietmar; Muñoz, Miguel A.

2017-10-01

Neural tissues have been consistently observed to be spontaneously active and to generate highly variable (scale-free distributed) outbursts of activity in vivo and in vitro. Understanding whether these heterogeneous patterns of activity stem from the underlying neural dynamics operating at the edge of a phase transition is a fascinating possibility, as criticality has been argued to entail many important functional advantages in biological computing systems. Here, we employ a well-accepted model for neural dynamics to elucidate an alternative scenario in which diverse neuronal avalanches, obeying scaling, can coexist simultaneously, even if the network operates in a regime far from the edge of any phase transition. We show that perturbations to the system state unfold dynamically according to a "neutral drift" (i.e., guided only by stochasticity) with respect to the background of endogenous spontaneous activity, and that such neutral dynamics, akin to neutral theories of population genetics and of biogeography, implies marginal propagation of perturbations and scale-free distributed causal avalanches. We argue that causal information, not easily accessible to experiments, is essential to elucidate the nature and statistics of neural avalanches, and that neutral dynamics is likely to play an important role in cortical functioning. We discuss the implications of these findings for the design of new empirical approaches to shed further light on how the brain processes and stores information.
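The canonical toy model for scale-free avalanches is a critical branching process, in which each active unit triggers on average one successor. The sketch below shows that standard cartoon, not the paper's spiking network with stored patterns; at sigma = 1 the avalanche-size distribution develops a power-law tail.

    import numpy as np

    rng = np.random.default_rng(3)

    def avalanche_size(sigma=1.0, cap=10**5):
        active, size = 1, 1
        while active and size < cap:
            active = rng.poisson(sigma * active)   # offspring of this generation
            size += active
        return size

    sizes = [avalanche_size() for _ in range(2000)]
    print("mean size:", np.mean(sizes), "max size:", max(sizes))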
  Neutral Theory and Scale-Free Neural Dynamics

    NASA Astrophysics Data System (ADS)

    Martinello, Matteo; Hidalgo, Jorge; Maritan, Amos; di Santo, Serena; Plenz, Dietmar; Muñoz, Miguel A.

    2017-10-01

    Neural tissues have been consistently observed to be spontaneously active and to generate highly variable (scale-free distributed) outbursts of activity in vivo and in vitro. Understanding whether these heterogeneous patterns of activity stem from the underlying neural dynamics operating at the edge of a phase transition is a fascinating possibility, as criticality has been argued to entail many important functional advantages in biological computing systems. Here, we employ a well-accepted model for neural dynamics to elucidate an alternative scenario in which diverse neuronal avalanches, obeying scaling, can coexist simultaneously, even if the network operates in a regime far from the edge of any phase transition. We show that perturbations to the system state unfold dynamically according to a "neutral drift" (i.e., guided only by stochasticity) with respect to the background of endogenous spontaneous activity, and that such neutral dynamics, akin to neutral theories of population genetics and of biogeography, implies marginal propagation of perturbations and scale-free distributed causal avalanches. We argue that causal information, not easily accessible to experiments, is essential to elucidate the nature and statistics of neural avalanches, and that neutral dynamics is likely to play an important role in cortical functioning. We discuss the implications of these findings for the design of new empirical approaches to shed further light on how the brain processes and stores information.
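    The "marginal propagation" the authors describe can be caricatured with a critical branching process: when each active unit triggers on average exactly one successor, causal avalanche sizes are scale-free even though nothing is tuned to a phase transition. A minimal sketch, with invented offspring rules rather than the paper's neural model:

        # Marginal (neutral) propagation sketch: a branching process whose mean
        # offspring number is exactly 1, so a tagged perturbation drifts neutrally
        # and causal avalanche sizes are heavy-tailed (P(S) ~ S^(-3/2)).
        import random
        from collections import Counter

        random.seed(7)

        def avalanche_size(p=0.5, max_size=10_000):
            """Each active unit activates 0 or 2 successors with equal probability,
            so the expected offspring is 1 (the marginal point)."""
            active, size = 1, 1
            while active and size < max_size:
                active = sum(2 if random.random() < p else 0 for _ in range(active))
                size += active
            return size

        sizes = Counter(avalanche_size() for _ in range(20_000))
        for s in (1, 3, 7, 15, 31):   # sizes are always odd with 0-or-2 offspring
            print(f"P(S={s}) ~ {sizes[s] / 20_000:.4f}")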
  Glucose 6-phosphate dehydrogenase deficient subjects may be better "storers" than donors of red blood cells.

    PubMed

    Tzounakas, Vassilis L; Kriebardis, Anastasios G; Georgatzakou, Hara T; Foudoulaki-Paparizos, Leontini E; Dzieciatkowska, Monika; Wither, Matthew J; Nemkov, Travis; Hansen, Kirk C; Papassideri, Issidora S; D'Alessandro, Angelo; Antonelou, Marianna H

    2016-07-01

    Storage of packed red blood cells (RBCs) is associated with progressive accumulation of lesions, mostly triggered by energy and oxidative stresses, which potentially compromise the effectiveness of transfusion therapy. Concerns arise as to whether glucose 6-phosphate dehydrogenase deficient subjects (G6PD(-)), ~5% of the population in the Mediterranean area, should be accepted as routine donors in light of the increased oxidative stress their RBCs suffer from. To address this question, we first performed morphology (scanning electron microscopy), physiology and omics (proteomics and metabolomics) analyses on stored RBCs from healthy or G6PD(-) donors. We then used an in vitro model of transfusion to simulate transfusion outcomes involving G6PD(-) donors or recipients, by reconstituting G6PD(-) stored or fresh blood with fresh or stored blood from healthy volunteers, respectively, at body temperature. We found that G6PD(-) cells store well in relation to energy, calcium and morphology related parameters, though at the expense of a compromised anti-oxidant system. Additional stimuli mimicking post-transfusion conditions (37°C, reconstitution with fresh healthy blood, incubation with oxidants) promoted hemolysis and oxidative lesions in stored G6PD(-) cells in comparison to controls. On the other hand, stored healthy RBC units showed better oxidative parameters and lower removal signaling when reconstituted with G6PD(-) fresh blood compared to control. Although the measured parameters of stored RBCs from the G6PD deficient donors appeared to be acceptable, the results from the in vitro model of transfusion suggest that G6PD(-) RBCs could be more susceptible to hemolysis and oxidative stresses post-transfusion. On the other hand, their chronic exposure to oxidative stress might make them good recipients, as they better tolerate exposure to oxidatively damaged long-stored healthy RBCs. Copyright © 2016 Elsevier Inc. All rights reserved.

  Combination and selection of traffic safety expert judgments for the prevention of driving risks.

    PubMed

    Cabello, Enrique; Conde, Cristina; de Diego, Isaac Martín; Moguerza, Javier M; Redchuk, Andrés

    2012-11-02

    In this paper, we describe a new framework to combine experts' judgments for the prevention of driving risks in a truck cabin. In addition, the methodology shows how to choose, among the experts, the one whose predictions best fit the environmental conditions. The methodology is applied to data sets obtained from a highly immersive truck cabin simulator under natural driving conditions. A nonparametric model based on Nearest Neighbors combined with Restricted Least Squares methods is developed. Three experts were asked to evaluate the driving risk using a Visual Analog Scale (VAS), in order to measure the driving risk in a truck simulator where the vehicle dynamics factors were stored. Numerical results show that the methodology is suitable for embedding in real-time systems.
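    One common reading of a restricted-least-squares combination is to fit expert weights constrained to the simplex (nonnegative, summing to one). The sketch below does this on synthetic ratings; it is only a guess at the spirit of the method, not the paper's exact nearest-neighbor formulation.

        # Hedged sketch: combine three experts' VAS risk ratings with weights
        # restricted to the simplex -- one reading of "restricted least squares";
        # the paper's exact method may differ. All data here are synthetic.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        true_risk = rng.uniform(0, 10, size=200)           # hypothetical ground truth
        experts = np.column_stack(
            [true_risk + rng.normal(0, s, 200) for s in (0.5, 1.0, 2.0)])

        def sse(w):
            return np.sum((experts @ w - true_risk) ** 2)

        cons = ({"type": "eq", "fun": lambda w: w.sum() - 1},)
        res = minimize(sse, x0=np.full(3, 1 / 3), bounds=[(0, 1)] * 3, constraints=cons)
        print("restricted weights:", np.round(res.x, 3))   # most weight on the best expert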
  Towards quantum chemistry on a quantum computer.

    PubMed

    Lanyon, B P; Whitfield, J D; Gillett, G G; Goggin, M E; Almeida, M P; Kassal, I; Biamonte, J D; Mohseni, M; Powell, B J; Barbieri, M; Aspuru-Guzik, A; White, A G

    2010-02-01

    Exact first-principles calculations of molecular properties are currently intractable because their computational cost grows exponentially with both the number of atoms and the basis set size. A solution is to move to a radically different model of computing by building a quantum computer, a device that uses quantum systems themselves to store and process data. Here we report the application of the latest photonic quantum computer technology to calculate properties of the smallest molecular system: the hydrogen molecule in a minimal basis. We calculate the complete energy spectrum to 20 bits of precision and discuss how the technique can be expanded to solve large-scale chemical problems that lie beyond the reach of modern supercomputers. These results represent an early practical step toward a powerful tool with a broad range of quantum-chemical applications.
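    For a molecule this small, the quantum computation can be cross-checked classically: the complete spectrum in a minimal basis is just the eigenvalues of a small Hermitian matrix. The matrix below is a placeholder with invented entries, not the actual H2 Hamiltonian.

        # Classical cross-check of the kind of calculation described above: the
        # complete spectrum of a small configuration-interaction Hamiltonian is
        # the eigenvalue set of a Hermitian matrix. Entries below are placeholders,
        # not the actual H2 minimal-basis matrix elements.
        import numpy as np

        H = np.array([[-1.85,  0.18],
                      [ 0.18, -0.25]])     # hypothetical 2x2 CI matrix, in hartree

        energies = np.linalg.eigvalsh(H)   # complete energy spectrum
        for k, E in enumerate(energies):
            # 20-bit precision ~ resolving energies to 2**-20 of the spectral range.
            print(f"E_{k} = {E:.6f} Ha "
                  f"(resolution target ~{np.ptp(energies) * 2**-20:.2e} Ha)")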
  Are Judgments of Semantic Relatedness Systematically Impaired in Alzheimer's Disease?

    ERIC Educational Resources Information Center

    Hornberger, M.; Bell, B.; Graham, K. S.; Rogers, T. T.

    2009-01-01

    We employed a triadic comparison task in patients with Alzheimer's disease (AD) and healthy controls to contrast (a) multidimensional scaling (MDS) and accuracy-based assessments of semantic memory, and (b) degraded-store versus degraded-access accounts of semantic impairment in AD. Similar to other studies using triadic…

  Vitrification of plutonium at Rocky Flats: the argument for a pilot plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, L.

    1996-05-01

    Current plans for stabilizing and storing the plutonium at Rocky Flats Plant fail to put the material in a form suitable for disposition and resistant to proliferation. Vitrification should be considered as an alternative technology. The vitrification effort should begin with a small-scale pilot plant.

  Single atoms in a MOT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meschede, Dieter; Ueberholz, Bernd; Gomer, Victor

    1999-06-11

    We are experimenting with individual neutral cesium atoms stored in a magneto-optical trap. The atoms are detected by their resonance fluorescence, and fluorescence fluctuations carry signatures of the atomic internal and external degrees of freedom. This noninvasive probe provides a rich source of information about atomic dynamics at all relevant time scales.

  Discovering and Mitigating Software Vulnerabilities through Large-Scale Collaboration

    ERIC Educational Resources Information Center

    Zhao, Mingyi

    2016-01-01

    In today's rapidly digitizing society, people place their trust in a wide range of digital services and systems that deliver the latest news, process financial transactions, store sensitive information, etc. However, this trust does not have a solid foundation, because the software code that supports this digital world has security vulnerabilities. These…

  Mining a Web Citation Database for Author Co-Citation Analysis.

    ERIC Educational Resources Information Center

    He, Yulan; Hui, Siu Cheung

    2002-01-01

    Proposes a mining process to automate author co-citation analysis based on the Web Citation Database, a data warehouse for storing citation indices of Web publications. Describes the use of agglomerative hierarchical clustering for author clustering and multidimensional scaling for displaying author cluster maps, and explains PubSearch, a…
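    The two techniques named in this record are readily sketched with scikit-learn on a hypothetical co-citation count matrix (real counts would come from the Web Citation Database):

        # Sketch of the two analysis steps named above, on a hypothetical author
        # co-citation matrix; requires scikit-learn >= 1.2 for the `metric` argument.
        import numpy as np
        from sklearn.cluster import AgglomerativeClustering
        from sklearn.manifold import MDS

        rng = np.random.default_rng(42)
        n_authors = 12
        cocite = rng.integers(0, 20, size=(n_authors, n_authors))
        cocite = (cocite + cocite.T) // 2                  # symmetric co-citation counts
        dist = (cocite.max() - cocite).astype(float)       # high co-citation -> small distance
        np.fill_diagonal(dist, 0.0)

        labels = AgglomerativeClustering(n_clusters=3, metric="precomputed",
                                         linkage="average").fit_predict(dist)
        coords = MDS(n_components=2, dissimilarity="precomputed",
                     random_state=0).fit_transform(dist)   # 2-D author cluster map
        for a in range(n_authors):
            print(f"author {a}: cluster {labels[a]}, map position {np.round(coords[a], 2)}")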
  A dam-reservoir module for a semi-distributed hydrological model

    NASA Astrophysics Data System (ADS)

    de Lavenne, Alban; Thirel, Guillaume; Andréassian, Vazken; Perrin, Charles; Ramos, Maria-Helena

    2017-04-01

    Developing modeling tools that help assess the spatial distribution of water resources is a key issue for achieving better solutions for the optimal management of water availability among users in a river basin. Streamflow dynamics depend on (i) the spatial variability of rainfall, (ii) the heterogeneity of catchment behavior and response, and (iii) local human regulations (e.g., reservoirs) that store and control surface water. These aspects can be successfully handled by distributed or semi-distributed hydrological models. In this study, we develop a dam-reservoir module within a semi-distributed rainfall-runoff model (de Lavenne et al., 2016). The model runs at the daily time step and has five parameters for each sub-catchment as well as a streamflow velocity parameter for flow routing. Its structure is based on two stores, one for runoff production and one for routing. The calibration of the model is performed from upstream to downstream sub-catchments, which makes efficient use of spatially distributed streamflow measurements. In a previous study, Payan et al. (2008) described a strategy to implement a dam module within a lumped rainfall-runoff model. Here we propose to adapt this strategy to a semi-distributed hydrological modelling framework. In this way, the specific location of existing reservoirs inside a river basin is explicitly accounted for. Our goal is to develop a tool that can provide answers to the different issues involved in spatial water management in human-influenced contexts and at large modelling scales. The approach is tested on the Seine basin in France. Results are shown for model performance with and without the dam module. A comparison with the lumped GR5J model also highlights the improvements in model performance obtained by considering human influences more explicitly and by facilitating parameter identifiability. This work opens up new perspectives for streamflow naturalization analyses and scenario-based spatial assessment of water resources under global change. References: de Lavenne, A.; Thirel, G.; Andréassian, V.; Perrin, C. & Ramos, M.-H. (2016), 'Spatial variability of the parameters of a semi-distributed hydrological model', PIAHS 373, 87-94. Payan, J.-L.; Perrin, C.; Andréassian, V. & Michel, C. (2008), 'How can man-made water reservoirs be accounted for in a lumped rainfall-runoff model?', Water Resour. Res. 44(3), W03420.
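    A two-store production/routing structure of the kind described can be sketched in a few lines. The equations and parameters below are simplified stand-ins, not the GR-family formulation or the paper's dam module:

        # Minimal two-store daily rainfall-runoff sketch with a crude reservoir rule.
        # Illustrative only: store equations and parameter values are invented.
        def simulate(rain_mm, pet_mm, x_prod=300.0, x_rout=60.0, k_dam=0.1):
            S, R, dam = 150.0, 30.0, 0.0   # production store, routing store, reservoir (mm)
            releases = []
            for P, E in zip(rain_mm, pet_mm):
                inflow = S + P
                S = min(inflow, x_prod)              # fill the production store
                spill = inflow - S                   # excess becomes runoff
                S = max(S - E * (S / x_prod), 0.0)   # evaporation scaled by store level
                R = min(R + spill, x_rout)           # runoff enters the routing store
                q = R * R / (R + x_rout)             # nonlinear drainage of routing store
                R -= q
                dam += q                             # the reservoir intercepts streamflow...
                release = k_dam * dam                # ...and releases a fraction each day
                dam -= release
                releases.append(release)
            return releases

        rain = [0, 12, 30, 5, 0, 0, 18, 2, 0, 0]
        pet = [2] * 10
        print([round(q, 2) for q in simulate(rain, pet)])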
  Simulating air temperature in an urban street canyon in all weather conditions using measured data at a reference meteorological station

    NASA Astrophysics Data System (ADS)

    Erell, E.; Williamson, T.

    2006-10-01

    A model is proposed that adapts data from a standard meteorological station to provide realistic site-specific air temperature in a city street exposed to the same meso-scale environment. In addition to a rudimentary description of the two sites, the canyon air temperature (CAT) model requires only inputs measured at standard weather stations; yet it is capable of accurately predicting the evolution of air temperature in all weather conditions for extended periods. It simulates the effect of urban geometry on radiant exchange; the effect of moisture availability on latent heat flux; energy stored in the ground and in building surfaces; air flow in the street based on wind above roof height; and the sensible heat flux from individual surfaces and from the street canyon as a whole. The CAT model has been tested on field data measured in a monitoring program carried out in Adelaide, Australia, in 2000-2001. After calibration, predicted air temperature correlated well with measured data in all weather conditions over extended periods. The experimental validation provides additional evidence in support of a number of parameterisation schemes incorporated in the model to account for sensible heat and storage flux.

  Protein Simulation Data in the Relational Model.

    PubMed

    Simms, Andrew M; Daggett, Valerie

    2012-10-01

    High performance computing is leading to unprecedented volumes of data. Relational databases offer a robust and scalable model for storing and analyzing scientific data. However, these features do not come without a cost: significant design effort is required to build a functional and efficient repository. Modeling protein simulation data in a relational database presents several challenges: the data captured from individual simulations are large, multi-dimensional, and must integrate with both simulation software and external data sites. Here we present the dimensional design and relational implementation of a comprehensive data warehouse for storing and analyzing molecular dynamics simulations using SQL Server. PMID:23204646
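    A dimensional (star-schema) layout of the kind the authors describe can be sketched with Python's built-in sqlite3 standing in for SQL Server; all table and column names here are hypothetical:

        # Minimal star-schema sketch for simulation data, using sqlite3 in place of
        # the SQL Server warehouse described above. Names are hypothetical.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE dim_simulation (           -- one row per MD run
            sim_id INTEGER PRIMARY KEY,
            protein TEXT, temperature_K REAL, force_field TEXT);
        CREATE TABLE dim_residue (
            residue_id INTEGER PRIMARY KEY,
            sim_id INTEGER REFERENCES dim_simulation,
            chain TEXT, position INTEGER, amino_acid TEXT);
        CREATE TABLE fact_frame_measure (       -- large multi-dimensional fact table
            sim_id INTEGER REFERENCES dim_simulation,
            residue_id INTEGER REFERENCES dim_residue,
            frame INTEGER, rmsd REAL, sasa REAL);
        """)
        con.execute("INSERT INTO dim_simulation VALUES (1, '1ubq', 298.0, 'AMBER')")
        con.execute("INSERT INTO dim_residue VALUES (1, 1, 'A', 42, 'LYS')")
        con.execute("INSERT INTO fact_frame_measure VALUES (1, 1, 0, 0.0, 95.3)")
        # Typical warehouse query: aggregate facts, slicing by dimension attributes.
        row = con.execute("""
            SELECT s.protein, AVG(f.rmsd)
            FROM fact_frame_measure f JOIN dim_simulation s USING (sim_id)
            GROUP BY s.protein""").fetchone()
        print(row)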
  Seasonal Dynamics of River Corridor Exchange Across the Continental United States

    NASA Astrophysics Data System (ADS)

    Gomez-Velez, J. D.; Harvey, J. W.; Scott, D.; Boyer, E. W.; Schmadel, N. M.

    2017-12-01

    River corridors store and convey mass and energy from landscapes to the ocean, altering water quality and ecosystem functioning at the local, reach, and watershed scales. As water moves through river corridors from headwater streams to coastal estuaries, dynamic exchange between the river channel and its adjacent riparian, floodplain, and hyporheic zones, combined with ponded waters such as lakes and reservoirs, results in the emergence of hot spots and moments for biogeochemical transformations. In this work, we used the model Networks with EXchange and Subsurface Storage (NEXSS) to estimate seasonal variations in river corridor exchange fluxes and residence times across the continental United States. Using a simple routing scheme, we translate these estimates into a cumulative measure of river corridor connectivity at the watershed scale, differentiating the contributions of hyporheic zones, floodplains, and ponded waters. We find that the relative role of these exchange subsystems changes seasonally, driven by the intra-seasonal variability of discharge. In addition, we find that seasonal variations in discharge and the biogeochemical potential of hyporheic zones are out of phase. This behavior results in a significant reduction in hyporheic water quality functions during high flows and emphasizes the potential importance of reconnecting floodplains for managing water quality during seasonal high flows. Physical parameterizations of river corridor processes are critical for modeling and predicting water quality and for sustainably managing water resources under present and future socio-economic and climatic conditions. Parsimonious models like NEXSS can play a key role in the design, implementation, and evaluation of sustainable management practices that target both water quantity and quality at the scale of the nation. This research is a product of the John Wesley Powell Center River Corridor Working Group.

  A matched case-control study of convenience store robbery risk factors.

    PubMed

    Hendricks, S A; Landsittel, D P; Amandus, H E; Malcan, J; Bell, J

    1999-11-01

    Convenience store clerks have been shown to be at high risk for assault and homicide, mostly owing to robbery or robbery attempts. Although the literature consistently indicates that at least some environmental designs are effective deterrents of robbery, the significance of individual interventions and policies has differed across past studies. To address these issues, a matched case-control study of 400 convenience store robberies in three metropolitan areas of Virginia was conducted. Conditional logistic regression was implemented to evaluate the significance of various environmental designs and other factors possibly related to convenience store robbery. Findings indicate that numerous characteristics of the surrounding environment and population were significantly associated with convenience store robbery. Results also showed that, on a univariate level, most crime prevention factors were significantly associated with a lower risk of robbery. Using a forward selection process, a multivariate model was identified that included cash handling policy, bullet-resistant shielding, and numerous characteristics of the surrounding area and population. This study addressed numerous limitations of the previous literature by prospectively collecting extensive data on a large sample of diverse convenience stores and by directly addressing current theory on robbers' selection of a target store through a matched case-control design.
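    For a 1:1 matched design like this one, the conditional logistic likelihood reduces to a logistic model in the within-pair covariate differences. A sketch on synthetic data (the predictors are invented stand-ins for the study's variables):

        # Conditional logistic regression for 1:1 matched case-control data: each
        # pair contributes exp(b'x_case) / (exp(b'x_case) + exp(b'x_control)),
        # i.e. a logistic likelihood in the within-pair difference. Data below are
        # synthetic, not from the Virginia convenience-store study.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(3)
        n_pairs = 400
        # Hypothetical binary predictors, e.g. cash-handling policy, shielding.
        x_case = rng.binomial(1, [0.3, 0.2], size=(n_pairs, 2))
        x_ctrl = rng.binomial(1, [0.6, 0.5], size=(n_pairs, 2))
        d = (x_case - x_ctrl).astype(float)     # within-pair differences

        def neg_log_lik(beta):
            # -sum over pairs of log sigmoid(beta . d)
            return np.sum(np.log1p(np.exp(-d @ beta)))

        fit = minimize(neg_log_lik, x0=np.zeros(2), method="BFGS")
        print("log odds ratios:", np.round(fit.x, 2))   # negative -> protective factor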
  Trends in racial/ethnic and income disparities in foods and beverages consumed and purchased from stores among US households with children, 2000-2013.

    PubMed

    Ng, Shu Wen; Poti, Jennifer M; Popkin, Barry M

    2016-09-01

    It is unclear whether racial/ethnic and income differences in foods and beverages obtained from stores contribute to disparities in caloric intake over time. We sought to determine whether there are disparities in calories obtained from store-bought consumer packaged goods (CPGs), whether brands (name brands compared with private labels) matter, and whether disparities have changed over time. We used NHANES individual dietary intake data among households with children along with Nielsen Homescan data on CPG purchases among households with children. With NHANES, we compared survey-weighted energy intakes for 2003-2006 and 2009-2012 from store and nonstore sources by race/ethnicity [non-Hispanic whites (NHWs), non-Hispanic blacks (NHBs), and Hispanic Mexican-Americans] and income [≤185% of the federal poverty line (FPL), 186-400% FPL, and >400% FPL]. With the Nielsen data, we compared 2000-2013 trends in calories purchased from CPGs (obtained from stores) across brands by race/ethnicity (NHW, NHB, and Hispanic) and income. We fit random-effects models to derive adjusted trends and differences in calories purchased (708,175 observations from 64,709 unique households) and tested whether trends were heterogeneous by race/ethnicity or income. Store-bought foods and beverages represented the largest component of dietary intake, with greater decreases in energy intakes from nonstore sources for foods and from store sources for beverages. Beverages from stores consistently decreased in all subpopulations. However, in adjusted models, reductions in CPG calories purchased in 2009-2012 were slower for NHB and low-income households than for NHW and high-income households, respectively. The decline in calories from name-brand food purchases was slower among NHB, Hispanic, and lowest-income households. NHW and high-income households had the highest absolute calories purchased in 2000. Across two large data sources, we found decreases in intake and purchases of beverages from stores across racial/ethnic and income groups. However, potentially beneficial reductions in calories purchased were more pronounced in some subgroups than in others. © 2016 American Society for Nutrition. PMID:27488233
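    Trend heterogeneity of this kind is commonly fit with a mixed (random-effects) model. A sketch using statsmodels on synthetic household panel data, with hypothetical variable names and effect sizes rather than the Homescan estimates:

        # Random-effects trend model of the kind described above, fit on synthetic
        # household panel data with statsmodels MixedLM. Names/effects are invented.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n_hh, n_years = 300, 8
        df = pd.DataFrame({
            "household": np.repeat(np.arange(n_hh), n_years),
            "year": np.tile(np.arange(n_years), n_hh),
            "low_income": np.repeat(rng.binomial(1, 0.4, n_hh), n_years),
        })
        hh_effect = np.repeat(rng.normal(0, 50, n_hh), n_years)
        # Calories decline over time, but more slowly for low-income households.
        df["kcal"] = (2000 - 25 * df.year + 10 * df.low_income * df.year
                      + hh_effect + rng.normal(0, 30, len(df)))

        # Random intercept per household; the interaction term captures the
        # difference in trend between income groups.
        result = smf.mixedlm("kcal ~ year * low_income", df,
                             groups=df["household"]).fit()
        print(result.summary())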
  The Two-Store Model of Memory: Past Criticisms, Current Status, and Future Directions (Het Twee-Stadia Model van het Geheugen: Terugblik, Huidige Status en Toekomstige Ontwikkelingen)

    DTIC Science & Technology

    1990-06-01

    …whether the "levels-of-processing" (Craik & Lockhart, 1972) and "working memory" (Baddeley & Hitch, 1974) approaches [explain] these phenomena better… …of storing information in LTS. 2.2 The levels-of-processing framework. Craik and Lockhart (1972), in a very influential paper, proposed what they… …Cognitive theory (Vol. 1, pp. 151-171). Hillsdale, NJ: Erlbaum. Craik, F.I.M. & Lockhart, R.S. (1972). Levels of processing: A framework…