How much a galaxy knows about its large-scale environment?: An information theoretic perspective
NASA Astrophysics Data System (ADS)
Pandey, Biswajit; Sarkar, Suman
2017-05-01
The small-scale environment characterized by the local density is known to play a crucial role in determining galaxy properties, but the role of the large-scale environment in galaxy formation and evolution remains less clear. We propose an information-theoretic framework to investigate the influence of the large-scale environment on galaxy properties and apply it to data from the Galaxy Zoo project, which provides visual morphological classifications of ~1 million galaxies from the Sloan Digital Sky Survey. We find a non-zero mutual information between morphology and environment that decreases with increasing length-scale but persists across the entire range of length-scales probed. We estimate the conditional mutual information and the interaction information between morphology and environment by conditioning the environment on different length-scales, and find a synergic interaction between them that operates up to a length-scale of at least ~30 h^-1 Mpc. Our analysis indicates that these interactions largely arise from the mutual information shared between the environments on different length-scales.
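The information-theoretic quantities used here can be illustrated with a minimal plug-in estimator; the sketch below (not the authors' code, variable names are ours) estimates the mutual information between a discrete morphology label and a coarse environment class from joint histogram counts, which is the basic building block behind the conditional mutual information and interaction information as well.

    import numpy as np

    def mutual_information(x_labels, y_labels):
        """Plug-in (histogram) estimate of I(X;Y) in bits for two discrete label arrays."""
        x_vals, x_idx = np.unique(x_labels, return_inverse=True)
        y_vals, y_idx = np.unique(y_labels, return_inverse=True)
        joint = np.zeros((x_vals.size, y_vals.size))
        np.add.at(joint, (x_idx, y_idx), 1.0)          # joint counts n(x, y)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

    # Toy example: morphology (0 = spiral, 1 = elliptical) vs. a three-level density class,
    # constructed so that denser environments host more ellipticals.
    rng = np.random.default_rng(0)
    env = rng.integers(0, 3, size=100000)
    morph = (rng.random(env.size) < 0.2 + 0.2 * env).astype(int)
    print(f"I(morphology; environment) = {mutual_information(morph, env):.4f} bits")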
Large-scale environments of narrow-line Seyfert 1 galaxies
NASA Astrophysics Data System (ADS)
Järvelä, E.; Lähteenmäki, A.; Lietzen, H.; Poudel, A.; Heinämäki, P.; Einasto, M.
2017-09-01
Studying the large-scale environments of narrow-line Seyfert 1 (NLS1) galaxies gives a new perspective on their properties, particularly their radio loudness. The large-scale environment is believed to have an impact on the evolution and intrinsic properties of galaxies; however, NLS1 sources have not previously been studied in this context. We have a large and diverse sample of 1341 NLS1 galaxies and three separate environment data sets constructed using the Sloan Digital Sky Survey. We use various statistical methods to investigate how the properties of NLS1 galaxies are connected to the large-scale environment, and compare the large-scale environments of NLS1 galaxies with other active galactic nuclei (AGN) classes, for example, other jetted AGN and broad-line Seyfert 1 (BLS1) galaxies, to study how they are related. NLS1 galaxies reside in less dense environments than any of the comparison samples, thus confirming their young age. The average large-scale environment density and environmental distribution of NLS1 sources are clearly different from those of BLS1 galaxies, so it is improbable that BLS1 galaxies could be the parent population of NLS1 galaxies unified by orientation. Within the NLS1 class there is a trend of increasing radio loudness with increasing large-scale environment density, indicating that the large-scale environment affects their intrinsic properties. Our results suggest that the NLS1 class of sources is not homogeneous and, furthermore, that a considerable fraction of them are misclassified. We further support a published proposal to replace the traditional classification into radio-loud and radio-quiet or radio-silent sources with a division into jetted and non-jetted sources.
SCALE(ing)-UP Teaching: A Case Study of Student Motivation in an Undergraduate Course
ERIC Educational Resources Information Center
Chittum, Jessica R.; McConnell, Kathryne Drezek; Sible, Jill
2017-01-01
Teaching large classes is increasingly common; thus, demand for effective large-class pedagogy is rising. One method, titled "SCALE-UP" (Student-Centered Active Learning Environment for Undergraduate Programs), is intended for large classes and involves collaborative, active learning in a technology-rich and student-centered environment.…
Interaction of a cumulus cloud ensemble with the large-scale environment
NASA Technical Reports Server (NTRS)
Arakawa, A.; Schubert, W.
1973-01-01
Large-scale modification of the environment by cumulus clouds is discussed in terms of entrainment, detrainment, evaporation, and subsidence. Drying, warming, and condensation by vertical displacement of air are considered as well as budget equations for mass, static energy, water vapor, and liquid water.
The large-scale effect of environment on galactic conformity
NASA Astrophysics Data System (ADS)
Sun, Shuangpeng; Guo, Qi; Wang, Lan; Lacey, Cedric G.; Wang, Jie; Gao, Liang; Pan, Jun
2018-07-01
We use a volume-limited galaxy sample from the Sloan Digital Sky Survey Data Release 7 to explore the dependence of galactic conformity on the large-scale environment, measured on ˜4 Mpc scales. We find that the star formation activity of neighbour galaxies depends more strongly on the environment than on the activity of their primary galaxies. In underdense regions most neighbour galaxies tend to be active, while in overdense regions neighbour galaxies are mostly passive, regardless of the activity of their primary galaxies. At a given stellar mass, passive primary galaxies reside in higher density regions than active primary galaxies, leading to the apparently strong conformity signal. The dependence of the activity of neighbour galaxies on environment can be explained by the corresponding dependence of the fraction of satellite galaxies. Similar results are found for galaxies in a semi-analytical model, suggesting that no new physics is required to explain the observed large-scale conformity.
Large-Scale Networked Virtual Environments: Architecture and Applications
ERIC Educational Resources Information Center
Lamotte, Wim; Quax, Peter; Flerackers, Eddy
2008-01-01
Purpose: Scalability is an important research topic in the context of networked virtual environments (NVEs). This paper aims to describe the ALVIC (Architecture for Large-scale Virtual Interactive Communities) approach to NVE scalability. Design/methodology/approach: The setup and results from two case studies are shown: a 3-D learning environment…
ERIC Educational Resources Information Center
Kraemer, David J. M.; Schinazi, Victor R.; Cawkwell, Philip B.; Tekriwal, Anand; Epstein, Russell A.; Thompson-Schill, Sharon L.
2017-01-01
Using novel virtual cities, we investigated the influence of verbal and visual strategies on the encoding of navigation-relevant information in a large-scale virtual environment. In 2 experiments, participants watched videos of routes through 4 virtual cities and were subsequently tested on their memory for observed landmarks and their ability to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shih, Patrick
2012-03-22
Patrick Shih, representing both the University of California, Berkeley and JGI, gives a talk titled "CyanoGEBA: A Better Understanding of Cyanobacterial Diversity through Large-scale Genomics" at the JGI 7th Annual Users Meeting: Genomics of Energy & Environment Meeting on March 22, 2012 in Walnut Creek, California.
ERIC Educational Resources Information Center
Niklas, Frank; Nguyen, Cuc; Cloney, Daniel S.; Tayler, Collette; Adams, Raymond
2016-01-01
Favourable home learning environments (HLEs) support children's literacy, numeracy and social development. In large-scale research, HLE is typically measured by self-report survey, but there is little consistency between studies and many different items and latent constructs are observed. Little is known about the stability of these items and…
The Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey
NASA Astrophysics Data System (ADS)
Squires, Gordon K.; Lubin, L. M.; Gal, R. R.
2007-05-01
We present the motivation, design, and latest results from the Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 Mpc around 20 known galaxy clusters at z > 0.6. When complete, the survey will cover nearly 5 square degrees, all targeted at high-density regions, making it complementary and comparable to field surveys such as DEEP2, GOODS, and COSMOS. For the survey, we are using the Large Format Camera on the Palomar 5-m and SuPRIME-Cam on the Subaru 8-m to obtain optical/near-infrared imaging of an approximately 30 arcmin region around previously studied high-redshift clusters. Colors are used to identify likely member galaxies, which are targeted for follow-up spectroscopy with the DEep Imaging Multi-Object Spectrograph on the Keck 10-m. This technique has been used to successfully identify the Cl 1604 supercluster at z = 0.9, a large-scale structure containing at least eight clusters (Gal & Lubin 2004; Gal, Lubin & Squires 2005). We present the most recent structures to be photometrically and spectroscopically confirmed through this program, discuss the properties of the member galaxies as a function of environment, and describe our planned multi-wavelength (radio, mid-IR, and X-ray) observations of these systems. The goal of this survey is to identify and examine a statistical sample of large-scale structures during an active period in the assembly history of the most massive clusters. With such a sample, we can begin to constrain large-scale cluster dynamics and determine the effect of the larger environment on galaxy evolution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahn, Kyungjin, E-mail: kjahn@chosun.ac.kr
We study the dynamical effect of the relative velocity between dark matter and baryonic fluids, which remained supersonic after the epoch of recombination. The impact of this supersonic motion on the formation of cosmological structures was first formulated by Tseliakhovich and Hirata, in terms of the linear theory of small-scale fluctuations coupled to large-scale, relative velocities in mean-density regions. In their formalism, they limited the large-scale density environment to be that of the global mean density. We improve on their formulation by allowing variation in the density environment as well as the relative velocities. This leads to a new type of coupling between large-scale and small-scale modes. We find that the small-scale fluctuation grows in a biased way: faster in the overdense environment and slower in the underdense environment. We also find that the net effect on the global power spectrum of the density fluctuation is to boost its overall amplitude from the prediction by Tseliakhovich and Hirata. Correspondingly, the conditional mass function of cosmological halos and the halo bias parameter are both affected in a similar way. The discrepancy between our prediction and that of Tseliakhovich and Hirata is significant, and therefore, the related cosmology and high-redshift astrophysics should be revisited. The mathematical formalism of this study can be used for generating cosmological initial conditions of small-scale perturbations in generic, overdense (underdense) background patches.
On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data
NASA Astrophysics Data System (ADS)
Hua, H.
2016-12-01
Next generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are an order of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require rapid turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences on deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment driven by market forces.
The HI Content of Galaxies as a Function of Local Density and Large-Scale Environment
NASA Astrophysics Data System (ADS)
Thoreen, Henry; Cantwell, Kelly; Maloney, Erin; Cane, Thomas; Brough Morris, Theodore; Flory, Oscar; Raskin, Mark; Crone-Odekon, Mary; ALFALFA Team
2017-01-01
We examine the HI content of galaxies as a function of environment, based on a catalogue of 41527 galaxies that are part of the 70% complete Arecibo Legacy Fast-ALFA (ALFALFA) survey. We use nearest-neighbor methods to characterize local environment, and a modified version of the algorithm developed for the Galaxy and Mass Assembly (GAMA) survey to classify large-scale environment as group, filament, tendril, or void. We compare the HI content in these environments using statistics that include both HI detections and the upper limits on detections from ALFALFA. The large size of the sample allows us to statistically compare the HI content in different environments for early-type galaxies as well as late-type galaxies. This work is supported by NSF grants AST-1211005 and AST-1637339, the Skidmore Faculty-Student Summer Research program, and the Schupf Scholars program.
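As an illustration of the nearest-neighbour characterization of local environment (a toy sketch with random positions, not the ALFALFA/SDSS pipeline), the projected k-th nearest-neighbour surface density can be computed as follows:

    import numpy as np
    from scipy.spatial import cKDTree

    def sigma_k(positions, k=5):
        """Sigma_k = k / (pi * d_k^2), with d_k the projected distance to the k-th neighbour."""
        tree = cKDTree(positions)
        # query k+1 neighbours because the closest match is the galaxy itself (distance 0)
        dists, _ = tree.query(positions, k=k + 1)
        d_k = dists[:, -1]
        return k / (np.pi * d_k ** 2)

    rng = np.random.default_rng(1)
    xy = rng.uniform(0.0, 100.0, size=(2000, 2))   # toy projected positions in Mpc
    print(sigma_k(xy, k=5)[:5])                    # local surface densities, galaxies per Mpc^2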
NASA Astrophysics Data System (ADS)
Velten, Andreas
2017-05-01
Light scattering is a primary obstacle to optical imaging in a variety of different environments and across many size and time scales. Scattering complicates imaging on large scales when imaging through the atmosphere from airborne or space-borne platforms, through marine fog, or through fog and dust in vehicle navigation, for example in self-driving cars. On smaller scales, scattering is the major obstacle when imaging through human tissue in biomedical applications. Despite the large variety of participating materials and size scales, light transport in all these environments is usually described with very similar scattering models that are defined by the same small set of parameters, including scattering and absorption length and phase function. We attempt a study of scattering and methods of imaging through scattering across different scales and media, particularly with respect to the use of time-of-flight information. We show that using time of flight, in addition to spatial information, provides distinct advantages in scattering environments. By performing a comparative study of scattering across scales and media, we are able to suggest scale models for scattering environments to aid lab research. We can also transfer knowledge and methodology between different fields.
Halo assembly bias and the tidal anisotropy of the local halo environment
NASA Astrophysics Data System (ADS)
Paranjape, Aseem; Hahn, Oliver; Sheth, Ravi K.
2018-05-01
We study the role of the local tidal environment in determining the assembly bias of dark matter haloes. Previous results suggest that the anisotropy of a halo's environment (i.e. whether it lies in a filament or in a more isotropic region) can play a significant role in determining the eventual mass and age of the halo. We statistically isolate this effect, using correlations between the large-scale and small-scale environments of simulated haloes at z = 0 with masses in the range 10^11.6 ≲ m/(h^-1 M⊙) ≲ 10^14.9. We probe the large-scale environment using a novel halo-by-halo estimator of linear bias. For the small-scale environment, we identify a variable α_R that captures the tidal anisotropy in a region of radius R = 4R_200b around the halo and correlates strongly with halo bias at fixed mass. Segregating haloes by α_R reveals two distinct populations. Haloes in highly isotropic local environments (α_R ≲ 0.2) behave as expected from the simplest, spherically averaged analytical models of structure formation, showing a negative correlation between their concentration and large-scale bias at all masses. In contrast, haloes in anisotropic, filament-like environments (α_R ≳ 0.5) tend to show a positive correlation between bias and concentration at any mass. Our multiscale analysis cleanly demonstrates how the overall assembly bias trend across halo mass emerges as an average over these different halo populations, and provides valuable insights towards building analytical models that correctly incorporate assembly bias. We also discuss potential implications for the nature and detectability of galaxy assembly bias.
Probabilistic double guarantee kidnapping detection in SLAM.
Tian, Yang; Ma, Shugen
2016-01-01
To determine whether kidnapping has happened, and which type of kidnapping it is, while a robot performs autonomous tasks in an unknown environment, a double guarantee kidnapping detection (DGKD) method has previously been proposed. DGKD performs well in relatively small environments. However, our recent work has revealed a limitation of DGKD in large-scale environments. In order to increase the adaptability of DGKD in a large-scale environment, an improved method called probabilistic double guarantee kidnapping detection is proposed in this paper, combining the probabilities of features' positions and the robot's posture. Simulation results demonstrate the validity and accuracy of the proposed method.
The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications
NASA Technical Reports Server (NTRS)
Johnston, William E.
2002-01-01
With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.
Evaluating the Large-Scale Environment of Extreme Events Using Reanalyses
NASA Astrophysics Data System (ADS)
Bosilovich, M. G.; Schubert, S. D.; Koster, R. D.; da Silva, A. M., Jr.; Eichmann, A.
2014-12-01
Extreme conditions and events have always been a long-standing concern in weather forecasting and national security. While some evidence indicates that extreme weather will increase in global change scenarios, extremes are often related to the large-scale atmospheric circulation and, by their nature, occur infrequently. Reanalyses assimilate substantial amounts of weather data, and a primary strength of reanalysis data is the representation of the large-scale atmospheric environment. In this effort, we link the occurrences of extreme events or climate indicators to the underlying regional and global weather patterns. Now, with more than 30 years of data, reanalyses can include multiple cases of extreme events, and thereby identify commonality among the weather patterns to better characterize the large-scale to global environment linked to the indicator or extreme event. Since these features are certainly regionally dependent, and the indicators of climate are continually being developed, we outline various methods to analyze the reanalysis data and the development of tools to support regional evaluation of the data. Here, we provide some examples of both individual case studies and composite studies of similar events. For example, we will compare the large-scale environment for Northeastern US extreme precipitation with that of the highest mean precipitation seasons. Likewise, southerly winds can be shown to be a major contributor to very warm days in the Northeast winter. While most of our development has involved NASA's MERRA reanalysis, we are also looking forward to MERRA-2, which includes several new features that greatly improve the representation of weather and climate, especially for the regions and sectors involved in the National Climate Assessment.
Ground-water flow in low permeability environments
Neuzil, Christopher E.
1986-01-01
Certain geologic media are known to have small permeability; subsurface environments composed of these media and lacking well-developed secondary permeability have groundwater flow systems with many distinctive characteristics. Moreover, groundwater flow in these environments appears to influence the evolution of certain hydrologic, geologic, and geochemical systems, may affect the accumulation of petroleum and ores, and probably has a role in the structural evolution of parts of the crust. Such environments are also important in the context of waste disposal. This review attempts to synthesize the diverse contributions of various disciplines to the problem of flow in low-permeability environments. Problems hindering analysis are enumerated together with suggested approaches to overcoming them. A common thread running through the discussion is the significance of size- and time-scale limitations of the ability to directly observe flow behavior and make measurements of parameters. These limitations have resulted in rather distinct small- and large-scale approaches to the problem. The first part of the review considers experimental investigations of low-permeability flow, including in situ testing; these are generally conducted on temporal and spatial scales which are relatively small compared with those of interest. Results from this work have provided increasingly detailed information about many aspects of the flow but leave certain questions unanswered. Recent advances in laboratory and in situ testing techniques have permitted measurements of permeability and storage properties in progressively “tighter” media and investigation of transient flow under these conditions. However, very large hydraulic gradients are still required for the tests; an observational gap exists for typical in situ gradients. The applicability of Darcy's law in this range is therefore untested, although claims of observed non-Darcian behavior appear flawed. Two important nonhydraulic flow phenomena, osmosis and ultrafiltration, are experimentally well established in prepared clays but have been incompletely investigated, particularly in undisturbed geologic media. Small-scale experimental results form much of the basis for analyses of flow in low-permeability environments which occurs on scales of time and size too large to permit direct observation. Such large-scale flow behavior is the focus of the second part of the review. Extrapolation of small-scale experimental experience becomes an important and sometimes controversial problem in this context. In large flow systems under steady-state conditions the regional permeability can sometimes be determined, but systems with transient flow are more difficult to analyze. The complexity of the problem is enhanced by the sensitivity of large-scale flow to the effects of slow geologic processes. One-dimensional studies have begun to elucidate how simple burial or exhumation can generate transient flow conditions by changing the state of stress and temperature and by burial metamorphism. Investigation of the more complex problem of the interaction of geologic processes and flow in two and three dimensions is just beginning.
Because these transient flow analyses have largely been based on flow in experimental scale systems or in relatively permeable systems, deformation in response to effective stress changes is generally treated as linearly elastic; however, this treatment creates difficulties for the long periods of interest because viscoelastic deformation is probably significant. Also, large-scale flow simulations in argillaceous environments generally have neglected osmosis and ultrafiltration, in part because extrapolation of laboratory experience with coupled flow to large scales under in situ conditions is controversial. Nevertheless, the effects are potentially quite important because the coupled flow might cause ultra long lived transient conditions. The difficulties associated with analysis are matched by those of characterizing hydrologic conditions in tight environments; measurements of hydraulic head and sampling of pore fluids have been done only rarely because of the practical difficulties involved. These problems are also discussed in the second part of this paper.
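For reference alongside the review's discussion of Darcy's law and transient flow (standard textbook forms, not taken from the review itself), the specific discharge q and hydraulic head h obey

    q = -K \nabla h ,
    \qquad
    S_s \,\frac{\partial h}{\partial t} = \nabla \cdot \left( K \nabla h \right) ,

where K is the hydraulic conductivity and S_s the specific storage; in low-permeability media K is very small, so head disturbances diffuse extremely slowly and transient conditions can persist over geologic time scales.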
Efficient On-Demand Operations in Large-Scale Infrastructures
ERIC Educational Resources Information Center
Ko, Steven Y.
2009-01-01
In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…
SmallTool - a toolkit for realizing shared virtual environments on the Internet
NASA Astrophysics Data System (ADS)
Broll, Wolfgang
1998-09-01
With increasing graphics capabilities of computers and higher network communication speed, networked virtual environments have become available to a large number of people. While the virtual reality modelling language (VRML) provides users with the ability to exchange 3D data, there is still a lack of appropriate support to realize large-scale multi-user applications on the Internet. In this paper we will present SmallTool, a toolkit to support shared virtual environments on the Internet. The toolkit consists of a VRML-based parsing and rendering library, a device library, and a network library. This paper will focus on the networking architecture, provided by the network library - the distributed worlds transfer and communication protocol (DWTP). DWTP provides an application-independent network architecture to support large-scale multi-user environments on the Internet.
NASA Astrophysics Data System (ADS)
Tan, Zhihong; Kaul, Colleen M.; Pressel, Kyle G.; Cohen, Yair; Schneider, Tapio; Teixeira, João.
2018-03-01
Large-scale weather forecasting and climate models are beginning to reach horizontal resolutions of kilometers, at which common assumptions made in existing parameterization schemes of subgrid-scale turbulence and convection—such as that they adjust instantaneously to changes in resolved-scale dynamics—cease to be justifiable. Additionally, the common practice of representing boundary-layer turbulence, shallow convection, and deep convection by discontinuously different parameterization schemes, each with its own set of parameters, has contributed to the proliferation of adjustable parameters in large-scale models. Here we lay the theoretical foundations for an extended eddy-diffusivity mass-flux (EDMF) scheme that has explicit time-dependence and memory of subgrid-scale variables and is designed to represent all subgrid-scale turbulence and convection, from boundary layer dynamics to deep convection, in a unified manner. Coherent updrafts and downdrafts in the scheme are represented as prognostic plumes that interact with their environment and potentially with each other through entrainment and detrainment. The more isotropic turbulence in their environment is represented through diffusive fluxes, with diffusivities obtained from a turbulence kinetic energy budget that consistently partitions turbulence kinetic energy between plumes and environment. The cross-sectional area of updrafts and downdrafts satisfies a prognostic continuity equation, which allows the plumes to cover variable and arbitrarily large fractions of a large-scale grid box and to have life cycles governed by their own internal dynamics. Relatively simple preliminary proposals for closure parameters are presented and are shown to lead to a successful simulation of shallow convection, including a time-dependent life cycle.
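As a schematic of the prognostic area-fraction continuity equation described above (notation is ours and simplified, not necessarily the authors' exact formulation), the area fraction a_i of updraft or downdraft i with vertical velocity w_i in air of density ρ evolves as

    \frac{\partial (\rho a_i)}{\partial t} + \frac{\partial (\rho a_i w_i)}{\partial z} = E_i - D_i ,

with E_i and D_i the entrainment and detrainment mass exchange rates with the environment; because a_i is prognostic rather than diagnosed, plumes can occupy variable fractions of the grid box and carry memory between time steps.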
NASA Astrophysics Data System (ADS)
Wang, Peng; Luo, Yu; Kang, Xi; Libeskind, Noam I.; Wang, Lei; Zhang, Youcai; Tempel, Elmo; Guo, Quan
2018-06-01
The alignment between satellites and central galaxies has been studied in detail in both observational and theoretical works. The widely accepted fact is that satellites preferentially reside along the major axis of their central galaxy. However, the origin and large-scale environmental dependence of this alignment are still unknown. In an attempt to address these questions, we use data constructed from the Sloan Digital Sky Survey DR7 to investigate the large-scale environmental dependence of this alignment, with emphasis on examining the alignment's dependence on the color of the central galaxy. We find a very strong large-scale environmental dependence of the satellite-central alignment (SCA) in groups with blue centrals. Satellites of blue centrals in knots are preferentially located perpendicular to the major axes of the centrals, and the alignment angle decreases with environment, namely, when going from knots to voids. The alignment angle strongly depends on the ^0.1(g-r) color of centrals. We suggest that the SCA is the result of a competition between satellite accretion within the large-scale structure (LSS) and galaxy evolution inside host halos. For groups containing red central galaxies, the SCA is mainly determined by the evolution effect, while for blue-central-dominated groups, the effect of the LSS plays a more important role, especially in knots. Our results provide an explanation for how the SCA forms within different large-scale environments. The perpendicular case in groups and knots with blue centrals may also provide insight into understanding similar polar arrangements, such as the formation of the Milky Way and Centaurus A's satellite system.
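A schematic of the alignment measurement (toy geometry, not the authors' pipeline): the satellite-central alignment angle is the angle between the central's projected major axis and the direction to each satellite, folded into 0-90 degrees; mean values below 45 degrees indicate alignment along the major axis, values above 45 degrees the perpendicular configuration reported for blue centrals in knots.

    import numpy as np

    def alignment_angles(central_xy, major_axis_deg, satellite_xy):
        """Angles (0-90 deg) between the central's projected major axis and satellite directions."""
        sep = satellite_xy - central_xy
        sat_angle = np.degrees(np.arctan2(sep[:, 1], sep[:, 0]))
        diff = np.abs(sat_angle - major_axis_deg) % 180.0
        return np.minimum(diff, 180.0 - diff)

    rng = np.random.default_rng(2)
    sats = rng.normal(scale=0.3, size=(50, 2))                 # toy satellite positions (Mpc)
    theta = alignment_angles(np.zeros(2), 30.0, sats)
    print(f"mean alignment angle = {theta.mean():.1f} deg")    # ~45 deg for an isotropic distribution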
ERIC Educational Resources Information Center
Glazer, Joshua L.; Peurach, Donald J.
2013-01-01
The development and scale-up of school improvement networks is among the most important educational innovations of the last decade, and current federal, state, and district efforts attempt to use school improvement networks as a mechanism for supporting large-scale change. The potential of improvement networks, however, rests on the extent to…
Fan, Jessie X; Hanson, Heidi A; Zick, Cathleen D; Brown, Barbara B; Kowaleski-Jones, Lori; Smith, Ken R
2014-08-19
Empirical studies of the association between neighbourhood food environments and individual obesity risk have found mixed results. One possible cause of these mixed findings is the variation in the neighbourhood geographic scale used. The purpose of this paper was to examine how various neighbourhood geographic scales affect the estimated relationship between food environments and obesity risk. Design: Cross-sectional secondary data analysis. Setting: Salt Lake County, Utah, USA. Participants: 403,305 Salt Lake County adults aged 25-64 in the Utah driver license database between 1995 and 2008. Utah driver license data were geo-linked to 2000 US Census data and Dun & Bradstreet business data. Food outlets were classified into the categories of large grocery stores, convenience stores, limited-service restaurants and full-service restaurants, and measured at four neighbourhood geographic scales: Census block group, Census tract, ZIP code and a 1 km buffer around the resident's house. These measures were regressed on individual obesity status using multilevel random-intercept regressions. Primary outcome measure: Obesity. Results: The food environment was important for obesity, but the scale of the relevant neighbourhood differs for different types of outlets: large grocery stores were not significant at any of the four geographic scales, limited-service restaurants were significant at the medium-to-large scale (Census tract or larger), and convenience stores and full-service restaurants at the smallest scales (Census tract or smaller). Conclusions: The choice of neighbourhood geographic scale can affect the estimated significance of the association between neighbourhood food environments and individual obesity risk. However, variations in geographic scale alone do not explain the mixed findings in the literature. If researchers are constrained to use one geographic scale with multiple categories of food outlets, using the Census tract or a 1 km buffer as the neighbourhood geographic unit is likely to allow researchers to detect the most significant relationships. Published by the BMJ Publishing Group Limited.
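The scale dependence at the heart of this study can be sketched with toy data (hypothetical coordinates and tract IDs, not the Utah data): the same set of outlets yields different exposure measures when counted within a 1 km buffer around each residence versus linked by Census-tract membership.

    import numpy as np
    import pandas as pd
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(3)
    homes = pd.DataFrame({"x_m": rng.uniform(0, 20000, 500),
                          "y_m": rng.uniform(0, 20000, 500)})
    outlets = pd.DataFrame({"x_m": rng.uniform(0, 20000, 200),
                            "y_m": rng.uniform(0, 20000, 200)})
    # toy "tracts" on a 2 km grid so both linkage methods can be compared
    for df in (homes, outlets):
        df["tract"] = (df.x_m // 2000).astype(int) * 100 + (df.y_m // 2000).astype(int)

    # exposure measure 1: outlet count within a 1 km buffer of the residence
    tree = cKDTree(outlets[["x_m", "y_m"]].to_numpy())
    homes["n_buffer_1km"] = [len(ix) for ix in
                             tree.query_ball_point(homes[["x_m", "y_m"]].to_numpy(), r=1000.0)]

    # exposure measure 2: outlet count in the residence's Census tract
    homes["n_tract"] = homes["tract"].map(outlets.groupby("tract").size()).fillna(0).astype(int)

    print(homes[["n_buffer_1km", "n_tract"]].corr())   # the two scales need not agree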
Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2013-03-01
To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. For the purposes of this Guide, large-scale Federal renewable energy projects are defined as renewable energy facilities larger than 10 megawatts (MW) that are sited on Federal property and lands and typically financed and owned by third parties. The U.S. Department of Energy's Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the necessary private capital to complete them. This Guide is intended to provide a general resource that will begin to develop the Federal employee's awareness and understanding of the project developer's operating environment and the private sector's awareness and understanding of the Federal environment. Because the vast majority of the investment that is required to meet the goals for large-scale renewable energy projects will come from the private sector, this Guide has been organized to match Federal processes with typical phases of commercial project development. FEMP collaborated with the National Renewable Energy Laboratory (NREL) and professional project developers on this Guide to ensure that Federal projects have key elements recognizable to private sector developers and investors. The main purpose of this Guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project. This framework begins the translation between the Federal and private sector operating environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
FTHENAKIS,V.; ZWEIBEL,K.; MOSKOWITZ,P.
1999-02-01
The objective of the workshop "Photovoltaics and the Environment" was to bring together PV manufacturers and industry analysts to define environment, health, and safety (EH&S) issues related to the large-scale commercialization of PV technologies.
NASA Astrophysics Data System (ADS)
Widyaningrum, E.; Gorte, B. G. H.
2017-05-01
LiDAR data acquisition is recognized as one of the fastest solutions for providing base data for large-scale topographical base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme for accelerating large-scale topographic base map provision by the Geospatial Information Agency in Indonesia. As a progressively advancing technology, Geographic Information Systems (GIS) open up possibilities for automatic processing and analysis of geospatial data. Considering further needs for spatial data sharing and integration, one-stop processing of LiDAR data in a GIS environment is considered a powerful and efficient approach for base map provision. The quality of the automated topographic base map is assessed and analysed based on its completeness, correctness, and quality, as well as the confusion matrix.
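The assessment metrics named in the last sentence are commonly computed from a confusion matrix of true positives (TP), false positives (FP) and false negatives (FN); a minimal sketch using the standard definitions (the agency's exact formulas may differ) is:

    def completeness(tp, fn):
        return tp / (tp + fn)          # share of reference features that were extracted

    def correctness(tp, fp):
        return tp / (tp + fp)          # share of extracted features that are real

    def quality(tp, fp, fn):
        return tp / (tp + fp + fn)     # combined measure penalizing both error types

    tp, fp, fn = 820, 95, 60           # e.g. extracted building footprints vs. a reference map
    print(f"completeness={completeness(tp, fn):.2f}, "
          f"correctness={correctness(tp, fp):.2f}, quality={quality(tp, fp, fn):.2f}")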
Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities (Book)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2013-03-01
To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. The U.S. Department of Energy's Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the necessary private capital to complete them. This guide is intended to provide a general resource that will begin to develop the Federal employee's awareness and understanding of the project developer's operating environment and the private sector's awareness and understanding of the Federal environment. Because the vast majority of the investment that is required to meet the goals for large-scale renewable energy projects will come from the private sector, this guide has been organized to match Federal processes with typical phases of commercial project development. The main purpose of this guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trujillo, Angelina Michelle
Strategy, planning, and acquisition: very large-scale computing platforms come and go, and planning for immensely scalable machines often precedes actual procurement by three years; procurement can take another year or more. Integration: after acquisition, machines must be integrated into the computing environments at LANL, including connection to scalable storage via large-scale storage networking and assurance of correct and secure operations. Management and utilization: ongoing operations, maintenance, and troubleshooting of the hardware and system software at massive scale are required.
NASA Astrophysics Data System (ADS)
West, Ruth G.; Margolis, Todd; Prudhomme, Andrew; Schulze, Jürgen P.; Mostafavi, Iman; Lewis, J. P.; Gossmann, Joachim; Singh, Rajvikram
2014-02-01
Scalable Metadata Environments (MDEs) are an artistic approach to designing immersive environments for large-scale data exploration, in which users interact with data by forming multiscale patterns that they alternately disrupt and reform. Developed and prototyped as part of an art-science research collaboration, an MDE is defined as a 4D virtual environment structured by quantitative and qualitative metadata describing multidimensional data collections. Entire data sets (e.g., tens of millions of records) can be visualized and sonified at multiple scales and at different levels of detail so they can be explored interactively in real time within MDEs. They are designed to reflect similarities and differences in the underlying data or metadata such that patterns can be visually/aurally sorted in an exploratory fashion by an observer who is not familiar with the details of the mapping from data to visual, auditory or dynamic attributes. While many approaches for visual and auditory data mining exist, MDEs are distinct in that they utilize qualitative and quantitative data and metadata to construct multiple interrelated conceptual coordinate systems. These "regions" function as conceptual lattices for scalable auditory and visual representations within virtual environments computationally driven by multi-GPU CUDA-enabled fluid dynamics systems.
Information Power Grid Posters
NASA Technical Reports Server (NTRS)
Vaziri, Arsi
2003-01-01
This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provide seamless and uniform access to the geographically dispersed computational, data storage, networking, instrument, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of a large-scale computing node into a distributed environment, remote access to high-data-rate instruments, and an exploratory grid environment.
Copy of Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adalsteinsson, Helgi; Armstrong, Robert C.; Chiang, Ken
2008-10-01
We report on the work done in the late-start LDRD "Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet". We describe the creation of a research platform that emulates many thousands of machines to be used for the study of large-scale internet behavior. We describe a proof-of-concept simple attack we performed in this environment. We describe the successful capture of a Storm bot and, from the study of the bot and further literature search, establish large-scale aspects we seek to understand via emulation of Storm on our research platform in possible follow-on work. Finally, we discuss possible future work.
Jiao, Jialong; Ren, Huilong; Adenya, Christiaan Adika; Chen, Chaohe
2017-01-01
Wave-induced motion and load responses are important criteria for ship performance evaluation. Physical experiments have long been an indispensable tool in the prediction of a ship's navigation state, speed, motions, accelerations, sectional loads and wave impact pressure. Currently, the majority of experiments are conducted in a laboratory tank environment, where the wave conditions differ from realistic sea waves. In this paper, a laboratory tank testing system for ship motion and load measurement is reviewed and reported first. Then, a novel large-scale model measurement technique is developed based on the laboratory testing foundations to obtain accurate motion and load responses of ships in realistic sea conditions. For this purpose, a suite of advanced remote-control and telemetry experimental systems was developed in-house to allow for the implementation of large-scale model seakeeping measurements at sea. The experimental system includes a series of sensors, e.g., the Global Position System/Inertial Navigation System (GPS/INS) module, course top, optical fiber sensors, strain gauges, pressure sensors and accelerometers. The developed measurement system was tested by field experiments in coastal seas, which indicates that the proposed large-scale model testing scheme is feasible and effective. Meaningful data including ocean environment parameters, ship navigation state, motions and loads were obtained through the sea trial campaign. PMID:29109379
The build up of the correlation between halo spin and the large-scale structure
NASA Astrophysics Data System (ADS)
Wang, Peng; Kang, Xi
2018-01-01
Both simulations and observations have confirmed that the spin of haloes/galaxies is correlated with the large-scale structure (LSS), with a mass dependence such that the spin of low-mass haloes/galaxies tends to be parallel to the LSS, while that of massive haloes/galaxies tends to be perpendicular to it. It is still unclear how this mass dependence is built up over time. We use N-body simulations to trace the evolution of the halo spin-LSS correlation and find that at early times the spin of all halo progenitors is parallel to the LSS. As time goes on, the mass collapsing around massive haloes becomes more isotropic; in particular, recent mass accretion along the slowest collapsing direction is significant and turns the halo spin perpendicular to the LSS. Adopting the fractional anisotropy (FA) parameter to describe the degree of anisotropy of the large-scale environment, we find that the spin-LSS correlation is a strong function of environment, such that a higher FA (a more anisotropic environment) leads to an aligned signal, while lower anisotropy leads to a misaligned signal. In general, our results show that the spin-LSS correlation is a combined consequence of mass flow and halo growth within the cosmic web. Our predicted environmental dependence between spin and large-scale structure can be further tested using galaxy surveys.
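A hedged sketch of the fractional anisotropy statistic referred to above, computed here in the standard way from the eigenvalues of a local shear or deformation tensor (the paper's exact tensor definition may differ): FA is 0 for a perfectly isotropic environment and approaches 1 for strongly filamentary or sheet-like surroundings.

    import numpy as np

    def fractional_anisotropy(l1, l2, l3):
        """FA in [0, 1] from the three eigenvalues of a local shear/deformation tensor."""
        num = (l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2
        den = 2.0 * (l1 ** 2 + l2 ** 2 + l3 ** 2)
        return np.sqrt(num / den)

    print(fractional_anisotropy(1.0, 1.0, 1.0))   # 0.0, isotropic collapse
    print(fractional_anisotropy(1.0, 0.1, 0.0))   # ~0.95, strongly anisotropic (filament-like)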
Environment and host as large-scale controls of ectomycorrhizal fungi.
van der Linde, Sietse; Suz, Laura M; Orme, C David L; Cox, Filipa; Andreae, Henning; Asi, Endla; Atkinson, Bonnie; Benham, Sue; Carroll, Christopher; Cools, Nathalie; De Vos, Bruno; Dietrich, Hans-Peter; Eichhorn, Johannes; Gehrmann, Joachim; Grebenc, Tine; Gweon, Hyun S; Hansen, Karin; Jacob, Frank; Kristöfel, Ferdinand; Lech, Paweł; Manninger, Miklós; Martin, Jan; Meesenburg, Henning; Merilä, Päivi; Nicolas, Manuel; Pavlenda, Pavel; Rautio, Pasi; Schaub, Marcus; Schröck, Hans-Werner; Seidling, Walter; Šrámek, Vít; Thimonier, Anne; Thomsen, Iben Margrete; Titeux, Hugues; Vanguelova, Elena; Verstraeten, Arne; Vesterdal, Lars; Waldner, Peter; Wijk, Sture; Zhang, Yuxin; Žlindra, Daniel; Bidartondo, Martin I
2018-06-06
Explaining the large-scale diversity of soil organisms that drive biogeochemical processes, and their responses to environmental change, is critical. However, identifying consistent drivers of belowground diversity and abundance for some soil organisms at large spatial scales remains problematic. Here we investigate a major guild, the ectomycorrhizal fungi, across European forests at a spatial scale and resolution that is, to our knowledge, unprecedented, to explore key biotic and abiotic predictors of ectomycorrhizal diversity and to identify dominant responses and thresholds for change across complex environmental gradients. We show the effect of 38 host, environment, climate and geographical variables on ectomycorrhizal diversity, and define thresholds of community change for key variables. We quantify host specificity and reveal plasticity in functional traits involved in soil foraging across gradients. We conclude that environmental and host factors explain most of the variation in ectomycorrhizal diversity, that the environmental thresholds used as major ecosystem assessment tools need adjustment and that the importance of belowground specificity and plasticity has previously been underappreciated.
ERIC Educational Resources Information Center
Dowd, Amy Jo; Pisani, Lauren
2013-01-01
Children's reading skill development is influenced by availability of reading materials, reading habits and opportunity to read. Save the Children's Literacy Boost data have replicated this finding across numerous developing contexts. Meanwhile international large-scale reading assessments do not capture detail on current home literacy. The…
Large-Scale Innovation and Change in UK Higher Education
ERIC Educational Resources Information Center
Brown, Stephen
2013-01-01
This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ…
BactoGeNIE: A large-scale comparative genome visualization for big displays
Aurisano, Jillian; Reda, Khairi; Johnson, Andrew; ...
2015-08-13
The volume of complete bacterial genome sequence data available to comparative genomics researchers is rapidly increasing. However, visualizations in comparative genomics--which aim to enable analysis tasks across collections of genomes--suffer from visual scalability issues. While large, multi-tiled and high-resolution displays have the potential to address scalability issues, new approaches are needed to take advantage of such environments, in order to enable the effective visual analysis of large genomics datasets. In this paper, we present Bacterial Gene Neighborhood Investigation Environment, or BactoGeNIE, a novel and visually scalable design for comparative gene neighborhood analysis on large display environments. We evaluate BactoGeNIE through a case study on close to 700 draft Escherichia coli genomes, and present lessons learned from our design process. In conclusion, BactoGeNIE accommodates comparative tasks over substantially larger collections of neighborhoods than existing tools and explicitly addresses visual scalability. Given current trends in data generation, scalable designs of this type may inform visualization design for large-scale comparative research problems in genomics.
BactoGeNIE: a large-scale comparative genome visualization for big displays
2015-01-01
Background: The volume of complete bacterial genome sequence data available to comparative genomics researchers is rapidly increasing. However, visualizations in comparative genomics--which aim to enable analysis tasks across collections of genomes--suffer from visual scalability issues. While large, multi-tiled and high-resolution displays have the potential to address scalability issues, new approaches are needed to take advantage of such environments, in order to enable the effective visual analysis of large genomics datasets. Results: In this paper, we present Bacterial Gene Neighborhood Investigation Environment, or BactoGeNIE, a novel and visually scalable design for comparative gene neighborhood analysis on large display environments. We evaluate BactoGeNIE through a case study on close to 700 draft Escherichia coli genomes, and present lessons learned from our design process. Conclusions: BactoGeNIE accommodates comparative tasks over substantially larger collections of neighborhoods than existing tools and explicitly addresses visual scalability. Given current trends in data generation, scalable designs of this type may inform visualization design for large-scale comparative research problems in genomics. PMID:26329021
NASA Astrophysics Data System (ADS)
Zhang, Pengsong; Jiang, Shanping; Yang, Linhua; Zhang, Bolun
2018-01-01
In order to meet the requirement of high-precision thermal distortion measurement for a Φ4.2 m deployable mesh satellite antenna in a vacuum and cryogenic environment, a large-scale antenna distortion measurement system for vacuum and cryogenic environments is developed in this paper, based on digital close-range photogrammetry and spacecraft space environment test technology. The antenna distortion measurement system (ADMS) is the first domestically and independently developed thermal distortion measurement system for large antennas, and it solves the problem of non-contact, high-precision distortion measurement of large spacecraft structures under vacuum and cryogenic conditions. The measurement accuracy of the ADMS is better than 50 μm over 5 m, reaching an internationally advanced level. The experimental results show that the measurement system has great advantages in large-structure measurement of spacecraft, and also has broad application prospects in space and other related fields.
He, Bo; Zhang, Shujing; Yan, Tianhong; Zhang, Tao; Liang, Yan; Zhang, Hongjin
2011-01-01
Mobile autonomous systems are very important for marine scientific investigation and military applications. Many algorithms have been studied to deal with the computational efficiency problem required for large-scale simultaneous localization and mapping (SLAM) and its related accuracy and consistency. Among these methods, submap-based SLAM is one of the more effective. By combining the strengths of two popular mapping algorithms, the Rao-Blackwellised particle filter (RBPF) and the extended information filter (EIF), this paper presents a combined SLAM approach: an efficient submap-based solution to the SLAM problem in a large-scale environment. RBPF-SLAM is used to produce local maps, which are periodically fused into an EIF-SLAM algorithm. RBPF-SLAM avoids linearization of the robot model during operation and provides robust data association, while EIF-SLAM improves the overall computational speed and avoids the tendency of RBPF-SLAM to become over-confident. In order to further improve the computational speed in a real-time environment, a binary-tree-based decision-making strategy is introduced. Simulation experiments show that the proposed combined SLAM algorithm significantly outperforms currently existing algorithms in terms of accuracy and consistency, as well as computing efficiency. Finally, the combined SLAM algorithm is experimentally validated in a real environment using the Victoria Park dataset.
Measures of galaxy environment - I. What is 'environment'?
NASA Astrophysics Data System (ADS)
Muldrew, Stuart I.; Croton, Darren J.; Skibba, Ramin A.; Pearce, Frazer R.; Ann, Hong Bae; Baldry, Ivan K.; Brough, Sarah; Choi, Yun-Young; Conselice, Christopher J.; Cowan, Nicolas B.; Gallazzi, Anna; Gray, Meghan E.; Grützbauch, Ruth; Li, I.-Hui; Park, Changbom; Pilipenko, Sergey V.; Podgorzec, Bret J.; Robotham, Aaron S. G.; Wilman, David J.; Yang, Xiaohu; Zhang, Youcai; Zibetti, Stefano
2012-01-01
The influence of a galaxy's environment on its evolution has been studied and compared extensively in the literature, although differing techniques are often used to define environment. Most methods fall into two broad groups: those that use nearest neighbours to probe the underlying density field and those that use fixed apertures. The differences between the two inhibit a clean comparison between analyses and leave open the possibility that, even with the same data, different properties are actually being measured. In this work, we apply 20 published environment definitions to a common mock galaxy catalogue constrained to look like the local Universe. We find that nearest-neighbour-based measures best probe the internal densities of high-mass haloes, while at low masses the interhalo separation dominates and acts to smooth out local density variations. The resulting correlation also shows that nearest-neighbour galaxy environment is largely independent of dark matter halo mass. Conversely, aperture-based methods that probe superhalo scales accurately identify high-density regions corresponding to high-mass haloes. Both methods show how galaxies in dense environments tend to be redder, with the exception of the largest apertures, although these are the strongest at recovering the background dark matter environment. We also warn against using photometric redshifts to define environment in all but the densest regions. When considering environment, there are two regimes: the 'local environment' internal to a halo, best measured with nearest-neighbour methods, and the 'large-scale environment' external to a halo, best measured with apertures. This leads to the conclusion that there is no universal environment measure and that the most suitable method depends on the scale being probed.
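The two families of measures can be contrasted directly on toy data (random positions, not the mock catalogue used in the paper): a nearest-neighbour density adapts its smoothing scale to the local galaxy separation, whereas a fixed-aperture count always probes the same physical volume.

    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(4)
    gals = rng.uniform(0.0, 100.0, size=(5000, 3))      # toy 3D positions in Mpc
    tree = cKDTree(gals)

    # nearest-neighbour measure: density from the distance to the 5th neighbour
    d5 = tree.query(gals, k=6)[0][:, -1]                # k=6 because the first match is the galaxy itself
    density_nn = 5.0 / (4.0 / 3.0 * np.pi * d5 ** 3)

    # fixed-aperture measure: companion count within an 8 Mpc sphere
    counts_aperture = np.array([len(ix) - 1 for ix in tree.query_ball_point(gals, r=8.0)])

    # the two measures correlate but are not equivalent, which is the paper's central point
    print(np.corrcoef(np.log10(density_nn), counts_aperture)[0, 1])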
Homogenization of a Directed Dispersal Model for Animal Movement in a Heterogeneous Environment.
Yurk, Brian P
2016-10-01
The dispersal patterns of animals moving through heterogeneous environments have important ecological and epidemiological consequences. In this work, we apply the method of homogenization to analyze an advection-diffusion (AD) model of directed movement in a one-dimensional environment in which the scale of the heterogeneity is small relative to the spatial scale of interest. We show that the large (slow) scale behavior is described by a constant-coefficient diffusion equation under certain assumptions about the fast-scale advection velocity, and we determine a formula for the slow-scale diffusion coefficient in terms of the fast-scale parameters. We extend the homogenization result to predict invasion speeds for an advection-diffusion-reaction (ADR) model with directed dispersal. For periodic environments, the homogenization approximation of the solution of the AD model compares favorably with numerical simulations. Invasion speed approximations for the ADR model also compare favorably with numerical simulations when the spatial period is sufficiently small.
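As a schematic illustration of the setup (a generic two-scale form, not the paper's exact model or its derived coefficient), the directed-movement model and its homogenized limit can be written as:

```latex
% Fast-scale advection velocity v varies on the scale \epsilon \ll 1:
\frac{\partial u}{\partial t}
  = \frac{\partial}{\partial x}\!\left( D\,\frac{\partial u}{\partial x}
      - v\!\left(\tfrac{x}{\epsilon}\right) u \right).

% Under suitable assumptions on v, homogenization yields a constant-coefficient
% diffusion equation on the slow scale, with \bar{D} determined by averages of
% the fast-scale parameters:
\frac{\partial \bar{u}}{\partial t} = \bar{D}\,\frac{\partial^{2} \bar{u}}{\partial x^{2}}.
```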
High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing
NASA Astrophysics Data System (ADS)
Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.
2015-12-01
Next generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are orders of magnitude larger than in present-day missions. Additionally, traditional means of procuring hardware on-premise are already limited by facility capacity constraints for these new missions. Existing missions, such as OCO-2, may also require rapid turn-around for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of their Level-2 full physics data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment driven by market forces. We will present how we enabled fault-tolerant computing in order to achieve large-scale computing as well as operational cost savings.
Heslot, Nicolas; Akdemir, Deniz; Sorrells, Mark E; Jannink, Jean-Luc
2014-02-01
Development of models to predict genotype by environment interactions, in unobserved environments, using environmental covariates, a crop model and genomic selection. Application to a large winter wheat dataset. Genotype by environment interaction (G*E) is one of the key issues when analyzing phenotypes. The use of environment data to model G*E has long been a subject of interest but is limited by the same problems as those addressed by genomic selection methods: a large number of correlated predictors each explaining a small amount of the total variance. In addition, non-linear responses of genotypes to stresses are expected to further complicate the analysis. Using a crop model to derive stress covariates from daily weather data for predicted crop development stages, we propose an extension of the factorial regression model to genomic selection. This model is further extended to the marker level, enabling the modeling of quantitative trait loci (QTL) by environment interaction (Q*E), on a genome-wide scale. A newly developed ensemble method, soft rule fit, was used to improve this model and capture non-linear responses of QTL to stresses. The method is tested using a large winter wheat dataset, representative of the type of data available in a large-scale commercial breeding program. Accuracy in predicting genotype performance in unobserved environments for which weather data were available increased by 11.1% on average and the variability in prediction accuracy decreased by 10.8%. By leveraging agronomic knowledge and the large historical datasets generated by breeding programs, this new model provides insight into the genetic architecture of genotype by environment interactions and could predict genotype performance based on past and future weather scenarios.
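The factorial-regression idea underlying the model above can be written in a generic textbook form (illustrative notation; the authors' exact specification, stress covariates, and the soft rule fit ensemble are not reproduced here):

```latex
% y_{ij}: phenotype of genotype i in environment j; x_{jk}: k-th stress covariate of
% environment j (here derived from a crop model); b_{ik}: sensitivity of genotype i.
y_{ij} = \mu + g_i + e_j + \sum_{k} b_{ik}\, x_{jk} + \varepsilon_{ij}.

% Marker-level extension (Q*E): sensitivities are decomposed onto markers m with
% genotype scores z_{im}, giving genome-wide marker-by-covariate interaction terms.
b_{ik} = \sum_{m} z_{im}\, \beta_{mk}
\;\;\Longrightarrow\;\;
y_{ij} = \mu + g_i + e_j + \sum_{m}\sum_{k} z_{im}\, \beta_{mk}\, x_{jk} + \varepsilon_{ij}.
```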
The impact of Lyman-α radiative transfer on large-scale clustering in the Illustris simulation
NASA Astrophysics Data System (ADS)
Behrens, C.; Byrohl, C.; Saito, S.; Niemeyer, J. C.
2018-06-01
Context. Lyman-α emitters (LAEs) are a promising probe of the large-scale structure at high redshift, z ≳ 2. In particular, the Hobby-Eberly Telescope Dark Energy Experiment aims at observing LAEs at 1.9 < z < 3.5 to measure the baryon acoustic oscillation (BAO) scale and the redshift-space distortion (RSD). However, it has been pointed out that the complicated radiative transfer (RT) of the resonant Lyman-α emission line generates an anisotropic selection bias in the LAE clustering on large scales, s ≳ 10 Mpc. This effect could potentially induce a systematic error in the BAO and RSD measurements. There is also a recent claim of observational evidence for the effect in the Lyman-α intensity map, albeit statistically insignificant. Aims: We aim at quantifying the impact of the Lyman-α RT on the large-scale galaxy clustering in detail. For this purpose, we study the correlations between the large-scale environment and the ratio of an apparent Lyman-α luminosity to an intrinsic one, which we call the "observed fraction", at 2 < z < 6. Methods: We apply our Lyman-α RT code by post-processing the full Illustris simulations. We simply assume that the intrinsic luminosity of the Lyman-α emission is proportional to the star formation rate of galaxies in Illustris, yielding a sufficiently large sample of LAEs to measure the anisotropic selection bias. Results: We find little correlation between the large-scale environment and the observed fraction induced by the RT, and hence a smaller anisotropic selection bias than has previously been claimed. We argue that the anisotropy was overestimated in previous work due to insufficient spatial resolution; it is important to keep the resolution such that it resolves the high-density region down to the scale of the interstellar medium, that is, 1 physical kpc. We also find that the correlation can be further enhanced by assumptions in modeling the intrinsic Lyman-α emission.
Moon-based Earth Observation for Large Scale Geoscience Phenomena
NASA Astrophysics Data System (ADS)
Guo, Huadong; Liu, Guang; Ding, Yixing
2016-07-01
The capability of Earth observation for large, global-scale natural phenomena needs to be improved, and new observing platforms are required. In recent years we have studied the concept of the Moon as an Earth observation platform. Compared with man-made satellite platforms, Moon-based Earth observation can obtain multi-spherical, full-band, active and passive information, and offers the following advantages: a large observation range, variable view angles, long-term continuous observation, and an extra-long life cycle, with the characteristics of longevity, consistency, integrity, stability and uniqueness. Moon-based Earth observation is suitable for monitoring large-scale geoscience phenomena, including large-scale atmospheric change, ocean change, land-surface dynamic change, and solid-Earth dynamic change. To establish a Moon-based Earth observation platform, we plan to study five aspects: mechanisms and models of Moon-based observation of macroscopic Earth science phenomena; sensor parameter optimization and observation methods; site selection and environment; the observation platform itself; and the fundamental scientific framework for Moon-based Earth observation.
Lim, Chun Yi; Law, Mary; Khetani, Mary; Rosenbaum, Peter; Pollock, Nancy
2018-08-01
To estimate the psychometric properties of a culturally adapted version of the Young Children's Participation and Environment Measure (YC-PEM) for use among Singaporean families. This is a prospective cohort study. Caregivers of 151 Singaporean children with (n = 83) and without (n = 68) developmental disabilities, between 0 and 7 years, completed the YC-PEM (Singapore) questionnaire with 3 participation scales (frequency, involvement, and change desired) and 1 environment scale for three settings: home, childcare/preschool, and community. Setting-specific estimates of internal consistency, test-retest reliability, and construct validity were obtained. Internal consistency estimates varied from .59 to .92 for the participation scales and .73 to .79 for the environment scale. Test-retest reliability estimates from the YC-PEM conducted on two occasions, 2-3 weeks apart, varied from .39 to .89 for the participation scales and from .65 to .80 for the environment scale. Moderate to large differences were found in participation and perceived environmental support between children with and without a disability. YC-PEM (Singapore) scales have adequate psychometric properties, except for low internal consistency for the childcare/preschool participation frequency scale and low test-retest reliability for the home participation frequency scale. The YC-PEM (Singapore) may be used for population-level studies involving young children with and without developmental disabilities.
Zaehringer, Julie G; Wambugu, Grace; Kiteme, Boniface; Eckert, Sandra
2018-05-01
Africa has been heavily targeted by large-scale agricultural investments (LAIs) throughout the last decade, with scarcely known impacts on local social-ecological systems. In Kenya, a large number of LAIs were made in the region northwest of Mount Kenya. These large-scale farms produce vegetables and flowers mainly for European markets. However, land use in the region remains dominated by small-scale crop and livestock farms with less than 1 ha of land each, which produce both for their own subsistence and for local markets. We interviewed 100 small-scale farmers living near five different LAIs to elicit their perceptions of the impacts that these LAIs have on their land use and the overall environment. Furthermore, we analyzed remotely sensed land cover and land use data to assess land use change in the vicinity of the five LAIs. While land use change did not follow a clear trend, a number of small-scale farmers did adapt their crop management to environmental changes such as reduced river water flow and increased pests, which they attributed to the presence of LAIs. Despite the high number of open conflicts between small-scale land users and LAIs around the issue of river water abstraction, the main environmental impact, felt by almost half of the interviewed land users, was air pollution from agrochemicals sprayed on the LAIs' land. Even though only a low percentage of local land users and their household members were directly involved with LAIs, a large majority of respondents favored the presence of LAIs nearby, as they are believed to contribute to the region's overall economic development.
Gannotti, Mary E; Law, Mary; Bailes, Amy F; OʼNeil, Margaret E; Williams, Uzma; DiRezze, Briano
2016-01-01
A step toward advancing research about rehabilitation services associated with positive outcomes for children with cerebral palsy is consensus about a conceptual framework and measures. A Delphi process was used to establish consensus among clinicians and researchers in North America. Directors of large pediatric rehabilitation centers, clinicians from large hospitals, and researchers with expertise in outcomes participated (N = 18). Andersen's model of health care utilization framed outcomes: consumer satisfaction, activity, participation, quality of life, and pain. Measures agreed upon included the Participation and Environment Measure for Children and Youth, Measure of Processes of Care, PEDI-CAT, KIDSCREEN-10, PROMIS Pediatric Pain Interference Scale, Visual Analog Scale for pain intensity, PROMIS Global Health Short Form, Family Environment Scale, Family Support Scale, and functional classification levels for gross motor, manual ability, and communication. Universal forms for documenting service use are needed. Findings inform clinicians and researchers concerned with outcome assessment.
Large-Scale Distributed Computational Fluid Dynamics on the Information Power Grid Using Globus
NASA Technical Reports Server (NTRS)
Barnard, Stephen; Biswas, Rupak; Saini, Subhash; VanderWijngaart, Robertus; Yarrow, Maurice; Zechtzer, Lou; Foster, Ian; Larsson, Olle
1999-01-01
This paper describes an experiment in which a large-scale scientific application developed for tightly-coupled parallel machines is adapted to the distributed execution environment of the Information Power Grid (IPG). A brief overview of the IPG and a description of the computational fluid dynamics (CFD) algorithm are given. The Globus metacomputing toolkit is used as the enabling device for the geographically-distributed computation. Modifications related to latency hiding and load balancing were required for an efficient implementation of the CFD application in the IPG environment. Performance results on a pair of SGI Origin 2000 machines indicate that real scientific applications can be effectively implemented on the IPG; however, a significant amount of continued effort is required to make such an environment useful and accessible to scientists and engineers.
Assessment of Disturbance at Three Spatial Scales in Two Large Tropical Reservoirs
Large reservoirs vary from lentic to lotic systems in time and space. Therefore, our objective was to assess disturbance gradients for two large tropical reservoirs and their influences on benthic macroinvertebrates. We tested three hypotheses: 1) a disturbance gradient of environ...
Tracking a head-mounted display in a room-sized environment with head-mounted cameras
NASA Astrophysics Data System (ADS)
Wang, Jih-Fang; Azuma, Ronald T.; Bishop, Gary; Chi, Vernon; Eyles, John; Fuchs, Henry
1990-10-01
This paper presents our efforts to accurately track a Head-Mounted Display (HMD) in a large environment. We review our current benchtop prototype (introduced in [WCF90]), then describe our plans for building the full-scale system. Both systems use an inside-out optical tracking scheme, where lateral-effect photodiodes mounted on the user's helmet view flashing infrared beacons placed in the environment. Church's method uses the measured 2D image positions and the known 3D beacon locations to recover the 3D position and orientation of the helmet in real time. We discuss the implementation and performance of the benchtop prototype. The full-scale system design includes ceiling panels that hold the infrared beacons and a new sensor arrangement of two photodiodes with holographic lenses. In the full-scale system, the user can walk almost anywhere under the grid of ceiling panels, making the working volume nearly as large as the room.
NASA Astrophysics Data System (ADS)
Cautun, Marius; van de Weygaert, Rien; Jones, Bernard J. T.; Frenk, Carlos S.; Hellwing, Wojciech A.
2015-01-01
One of the important unknowns of current cosmology concerns the effects of the large-scale distribution of matter on the formation and evolution of dark matter haloes and galaxies. One main difficulty in answering this question lies in the absence of a robust and natural way of identifying the large-scale environments and their characteristics. This work summarizes the NEXUS+ formalism, which extends and improves our multiscale scale-space MMF method. The new algorithm is very successful in tracing the Cosmic Web components, mainly due to its novel filtering of the density in logarithmic space. The method, due to its multiscale and hierarchical character, has the advantage of detecting all the cosmic structures, whether prominent or tenuous, without preference for a certain size or shape. The resulting filamentary and wall networks can easily be characterized by their direction, thickness, mass density and density profile. These additional environmental properties allow us to investigate not only the effect of environment on haloes, but also how it correlates with the environment characteristics.
NASA Astrophysics Data System (ADS)
Reed, K. A.; Chavas, D. R.
2017-12-01
Hazardous Convective Weather (HCW), such as severe thunderstorms and tornadoes, poses significant risk to life and property in the United States every year. While these HCW events are small scale, they develop principally within favorable larger-scale environments (i.e., HCW environments). Why these large-scale environments are confined to specific regions, particularly the Eastern United States, is not well understood. This is due, in part, to limited fundamental knowledge of how the climate system creates HCW environments, which introduces uncertainty in how HCW environments may be altered in a changing climate. Previous research has identified the Gulf of Mexico to the south and elevated terrain upstream as key geographic contributors to the generation of HCW environments over the Eastern United States. This work investigates the relative role of these geographic features through "component denial" experiments in the Community Atmosphere Model version 5 (CAM5). In particular, CAM5 simulations in which topography is removed (globally and regionally) and/or the Gulf of Mexico is converted to land are compared to a CAM5 control simulation of the current climate following the Atmospheric Model Intercomparison Project (AMIP) protocols. In addition to exploring differences in the general characteristics of the large-scale environments among the experiments, changes in HCW will be explored through the combination of high shear and high Convective Available Potential Energy (CAPE) environments. Preliminary work suggests that the removal of elevated terrain reduces the inland extent of HCW environments in the United States, but not the existence of these events altogether. This indicates that topography is crucial for inland HCW environments but perhaps not for their existence in general (e.g., near the Gulf of Mexico). This initial work is a crucial first step toward building a reduced-complexity framework within CAM5 to quantify how land-ocean contrast and elevated terrain control HCW environments.
The Use of Bacteria for Remediation of Mercury Contaminated Groundwater
Many processes of mercury transformation in the environment are mediated by bacteria. The properties of mercury complicate the remediation of mercury-contaminated environments. Despite the significance of the problem of mercury pollution, methods of large scale bioremediation ...
Lagrangian statistics of mesoscale turbulence in a natural environment: The Agulhas return current.
Carbone, Francesco; Gencarelli, Christian N; Hedgecock, Ian M
2016-12-01
The properties of mesoscale geophysical turbulence in an oceanic environment have been investigated through the Lagrangian statistics of sea surface temperature measured by a drifting buoy within the Agulhas return current, where strong temperature mixing produces locally sharp temperature gradients. By disentangling the large-scale forcing which affects the small-scale statistics, we found that the statistical properties of intermittency are identical to those obtained from the multifractal prediction in the Lagrangian frame for the velocity trajectory. The results suggest a possible universality of turbulence scaling.
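For reference, the scaling framework being tested can be summarized in standard Lagrangian structure-function form (generic notation for a signal θ tracked along the drifter trajectory; this is textbook material, not the specific multifractal model fitted in the paper):

```latex
% Lagrangian increments and structure functions:
\delta_{\tau}\theta = \theta(t+\tau) - \theta(t), \qquad
S_p(\tau) = \langle |\delta_{\tau}\theta|^{p} \rangle \sim \tau^{\,\zeta(p)} .

% Self-similar (non-intermittent) scaling gives a linear exponent,
% \zeta(p) = p\,\zeta(2)/2; intermittency appears as a concave departure from
% this line, which multifractal models predict quantitatively.
```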
Yurk, Brian P
2018-07-01
Animal movement behaviors vary spatially in response to environmental heterogeneity. An important problem in spatial ecology is to determine how large-scale population growth and dispersal patterns emerge within highly variable landscapes. We apply the method of homogenization to study the large-scale behavior of a reaction-diffusion-advection model of population growth and dispersal. Our model includes small-scale variation in the directed and random components of movement and growth rates, as well as large-scale drift. Using the homogenized model we derive simple approximate formulas for persistence conditions and asymptotic invasion speeds, which are interpreted in terms of residence index. The homogenization results show good agreement with numerical solutions for environments with a high degree of fragmentation, both with and without periodicity at the fast scale. The simplicity of the formulas, and their connection to residence index make them appealing for studying the large-scale effects of a variety of small-scale movement behaviors.
Virtual Environments Supporting Learning and Communication in Special Needs Education
ERIC Educational Resources Information Center
Cobb, Sue V. G.
2007-01-01
Virtual reality (VR) describes a set of technologies that allow users to explore and experience 3-dimensional computer-generated "worlds" or "environments." These virtual environments can contain representations of real or imaginary objects on a small or large scale (from modeling of molecular structures to buildings, streets, and scenery of a…
NASA Technical Reports Server (NTRS)
Braun, R. D.; Kroo, I. M.
1995-01-01
Collaborative optimization is a design architecture applicable in any multidisciplinary analysis environment but specifically intended for large-scale distributed analysis applications. In this approach, a complex problem is hierarchically decomposed along disciplinary boundaries into a number of subproblems which are brought into multidisciplinary agreement by a system-level coordination process. When applied to problems in a multidisciplinary design environment, this scheme has several advantages over traditional solution strategies. These advantages include a reduction in the amount of information transferred between disciplines, the removal of large iteration loops, the ability to use different subspace optimizers among the various analysis groups, an analysis framework which is easily parallelized and can operate on heterogeneous equipment, and a structural framework that is well suited to conventional disciplinary organizations. In this article, the collaborative architecture is developed and its mathematical foundation is presented. An example application is also presented which highlights the potential of this method for use in large-scale design applications.
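A toy numerical sketch of the architecture may help fix ideas. The problem, constraints, penalty weight, and solver choices below are hypothetical: the system level proposes targets z for the shared variables, each discipline independently minimizes its squared discrepancy from those targets subject to its own constraints, and the system level drives the summed discrepancies toward zero while minimizing the design objective.

```python
import numpy as np
from scipy.optimize import minimize

def discipline_1(z):
    """Discipline 1: local constraint x0 + x1 >= 2. Returns its minimal squared
    discrepancy from the system-level targets z."""
    res = minimize(lambda x: np.sum((x - z) ** 2), x0=np.array(z),
                   constraints=[{"type": "ineq", "fun": lambda x: x[0] + x[1] - 2.0}])
    return res.fun

def discipline_2(z):
    """Discipline 2: local constraint x1 <= 1.5."""
    res = minimize(lambda x: np.sum((x - z) ** 2), x0=np.array(z),
                   constraints=[{"type": "ineq", "fun": lambda x: 1.5 - x[1]}])
    return res.fun

def system_objective(z, penalty=100.0):
    # Design objective plus a penalty enforcing interdisciplinary agreement
    # (in strict collaborative optimization the discrepancies are equality constraints).
    return z[0] ** 2 + z[1] ** 2 + penalty * (discipline_1(z) + discipline_2(z))

result = minimize(system_objective, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
print(result.x)   # approaches the minimum-norm design satisfying both disciplines, near (1, 1)
```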
Effect of dry large-scale vertical motions on initial MJO convective onset
NASA Astrophysics Data System (ADS)
Powell, Scott W.; Houze, Robert A.
2015-05-01
Anomalies of eastward propagating large-scale vertical motion with ~30 day variability at Addu City, Maldives, move into the Indian Ocean from the west and are implicated in Madden-Julian Oscillation (MJO) convective onset. Using ground-based radar and large-scale forcing data derived from a sounding array, typical profiles of environmental heating, moisture sink, vertical motion, moisture advection, and Eulerian moisture tendency are computed for periods prior to those during which deep convection is prevalent and those during which moderately deep cumulonimbi do not form into deep clouds. Convection with 3-7 km tops is ubiquitous but present in greater numbers when tropospheric moistening occurs below 600 hPa. Vertical eddy convergence of moisture in shallow to moderately deep clouds is likely responsible for moistening during a 3-7 day long transition period between suppressed and active MJO conditions, although moistening via evaporation of cloud condensate detrained into the environment of such clouds may also be important. Reduction in large-scale subsidence, associated with a vertical velocity structure that travels with a dry eastward propagating zonal wavenumbers 1-1.5 structure in zonal wind, drives a steepening of the lapse rate below 700 hPa, which supports an increase in moderately deep moist convection. As the moderately deep cumulonimbi moisten the lower troposphere, more deep convection develops, which itself moistens the upper troposphere. Reduction in large-scale subsidence associated with the eastward propagating feature reinforces the upper tropospheric moistening, helping to then rapidly make the environment conducive to formation of large stratiform precipitation regions, whose heating is critical for MJO maintenance.
Xu, Jiuping; Feng, Cuiying
2014-01-01
This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method.
Zheng, Wei; Yan, Xiaoyong; Zhao, Wei; Qian, Chengshan
2017-12-20
A novel large-scale multi-hop localization algorithm based on regularized extreme learning is proposed in this paper. The large-scale multi-hop localization problem is formulated as a learning problem. Unlike other similar localization algorithms, the proposed algorithm overcomes the shortcoming of traditional algorithms, which are only applicable to isotropic networks, and therefore has a strong adaptability to complex deployment environments. The proposed algorithm is composed of three stages: data acquisition, modeling and location estimation. In the data acquisition stage, the training information between nodes of the given network is collected. In the modeling stage, the model relating the hop counts and the physical distances between nodes is constructed using regularized extreme learning. In the location estimation stage, each node finds its specific location in a distributed manner. Theoretical analysis and several experiments show that the proposed algorithm can adapt to different topological environments with low computational cost. Furthermore, high accuracy can be achieved by this method without setting complex parameters.
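A minimal sketch of the modeling and location-estimation stages is given below, under simplifying assumptions: hop counts to a set of anchor nodes are generated with a crude distance proxy (real counts would come from network flooding), a random hidden layer with ridge-regularized output weights plays the role of the regularized extreme learning machine, and all sizes and the regularization constant are arbitrary choices, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(1)
n_train, n_anchors, n_hidden, lam = 200, 8, 60, 1e-2

anchors = rng.uniform(0, 100, size=(n_anchors, 2))
train_xy = rng.uniform(0, 100, size=(n_train, 2))

def hop_counts(points, anchors, hop_range=10.0):
    # Crude proxy: hop count ~ ceil(distance / radio range).
    d = np.linalg.norm(points[:, None, :] - anchors[None, :, :], axis=2)
    return np.ceil(d / hop_range)

# Modeling stage: fixed random hidden layer, then ridge-regularized output weights.
W = rng.normal(size=(n_anchors, n_hidden))
b = rng.normal(size=n_hidden)
def hidden(X):
    return np.tanh(X @ W + b)

H = hidden(hop_counts(train_xy, anchors))
beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ train_xy)

# Location-estimation stage: map hop counts of unlocalized nodes to coordinates.
test_xy = rng.uniform(0, 100, size=(20, 2))
pred_xy = hidden(hop_counts(test_xy, anchors)) @ beta
print(np.mean(np.linalg.norm(pred_xy - test_xy, axis=1)))   # mean localization error
```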
de Boer, B; Hamers, J P H; Beerens, H C; Zwakhalen, S M G; Tan, F E S; Verbeek, H
2015-11-02
In nursing home care, new care environments directed towards small-scale and homelike environments are developing. The green care farm, which provides 24-h nursing home care for people with dementia, is one such new care environment. Knowledge is needed on the relation between environmental features of green care farms, such as nature, domesticity and offering care in small groups, and their influence on the daily lives of residents. The aim of this study is to explore (1) the daily lives of residents, (2) the quality of care and (3) the experiences of caregivers on green care farms compared with other nursing home care environments. An observational longitudinal study including a baseline and a six-month follow-up measurement is carried out. Four types of nursing home care environments are included: (1) large-scale nursing home ward, (2) small-scale living facility on the terrain of a larger nursing home, (3) stand-alone small-scale living facility and (4) green care farm. Quality of care is examined through structure, process and outcome indicators. The primary outcome measure is the daily life of residents, assessed by ecological momentary assessments. Aspects of daily life include (1) activity (activity performed by the resident, the engagement in this activity and the degree of physical effort); (2) physical environment (the location of the resident and the interaction with the physical environment); (3) social environment (the level and type of social interaction, and with whom this social interaction took place) and (4) psychological well-being (mood and agitation). In addition, social engagement, quality of life, behavioral symptoms and agitation are evaluated through questionnaires. Furthermore, demographics, cognitive impairment, functional dependence and the severity of dementia are assessed. Semi-structured interviews are performed with caregivers regarding their experiences with the different nursing home care environments. This is the first study investigating green care farms providing 24-h nursing home care for people with dementia. The study provides valuable insight into the daily lives of residents, the quality of care, and the experiences of caregivers at green care farms in comparison with other nursing home care environments, including small-scale care environments and large-scale nursing home wards.
Learning, climate and the evolution of cultural capacity.
Whitehead, Hal
2007-03-21
Patterns of environmental variation influence the utility, and thus evolution, of different learning strategies. I use stochastic, individual-based evolutionary models to assess the relative advantages of 15 different learning strategies (genetic determination, individual learning, vertical social learning, horizontal/oblique social learning, and contingent combinations of these) when competing in variable environments described by 1/f noise. When environmental variation has little effect on fitness, then genetic determinism persists. When environmental variation is large and equal over all time-scales ("white noise") then individual learning is adaptive. Social learning is advantageous in "red noise" environments when variation over long time-scales is large. Climatic variability increases with time-scale, so that short-lived organisms should be able to rely largely on genetic determination. Thermal climates usually are insufficiently red for social learning to be advantageous for species whose fitness is largely determined by temperature. In contrast, population trajectories of many species, especially large mammals and aquatic carnivores, are sufficiently red to promote social learning in their predators. The ocean environment is generally redder than that on land. Thus, while individual learning should be adaptive for many longer-lived organisms, social learning will often be found in those dependent on the populations of other species, especially if they are marine. This provides a potential explanation for the evolution of a prevalence of social learning, and culture, in humans and cetaceans.
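The "colour" of the environmental variation discussed above can be made concrete with a short spectral-synthesis sketch: a time series whose power spectrum scales as 1/f^beta is white for beta = 0 and increasingly red (dominated by long time-scales) for larger beta. The generator below is a generic illustration, not the paper's simulation.

```python
import numpy as np

def one_over_f_noise(n, beta, rng=None):
    """Time series whose power spectrum scales as 1/f**beta
    (beta = 0: white noise; beta = 2: strongly red, Brownian-like)."""
    rng = rng or np.random.default_rng()
    freqs = np.fft.rfftfreq(n, d=1.0)
    amplitude = np.zeros_like(freqs)
    amplitude[1:] = freqs[1:] ** (-beta / 2.0)      # |X(f)| ~ f^(-beta/2)  =>  S(f) ~ f^(-beta)
    spectrum = amplitude * np.exp(1j * rng.uniform(0, 2 * np.pi, size=freqs.size))
    series = np.fft.irfft(spectrum, n=n)
    return (series - series.mean()) / series.std()

# Environmental time series on which the competing learning strategies would be evaluated.
white_env = one_over_f_noise(4096, beta=0.0)        # variance spread equally over time-scales
red_env = one_over_f_noise(4096, beta=2.0)          # variance concentrated at long time-scales
print(white_env[:3], red_env[:3])
```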
Capabilities of the Large-Scale Sediment Transport Facility
2016-04-01
experiments in wave/current environments. INTRODUCTION: The LSTF (Figure 1) is a large-scale laboratory facility capable of simulating conditions...comparable to low-wave-energy coasts. The facility was constructed to address deficiencies in existing methods for calculating longshore sediment...transport. The LSTF consists of a 30 m wide, 50 m long, 1.4 m deep basin. Waves are generated by four digitally controlled wave makers capable of producing
The dynamics and evolution of clusters of galaxies
NASA Technical Reports Server (NTRS)
Geller, Margaret; Huchra, John P.
1987-01-01
Research was undertaken to produce a coherent picture of the formation and evolution of large-scale structures in the universe. The program is divided into projects which examine four areas: the relationship between individual galaxies and their environment; the structure and evolution of individual rich clusters of galaxies; the nature of superclusters; and the large-scale distribution of individual galaxies. A brief review of results in each area is provided.
Lanktree, Matthew B; Hegele, Robert A
2009-02-26
Despite the recent success of genome-wide association studies (GWASs) in identifying loci consistently associated with coronary artery disease (CAD), a large proportion of the genetic components of CAD and its metabolic risk factors, including plasma lipids, type 2 diabetes and body mass index, remain unattributed. Gene-gene and gene-environment interactions might produce a meaningful improvement in quantification of the genetic determinants of CAD. Testing for gene-gene and gene-environment interactions is thus a new frontier for large-scale GWASs of CAD. There are several anecdotal examples of monogenic susceptibility to CAD in which the phenotype was worsened by an adverse environment. In addition, small-scale candidate gene association studies with functional hypotheses have identified gene-environment interactions. For future evaluation of gene-gene and gene-environment interactions to achieve the same success as the single gene associations reported in recent GWASs, it will be important to pre-specify agreed standards of study design and statistical power, environmental exposure measurement, phenomic characterization and analytical strategies. Here we discuss these issues, particularly in relation to the investigation and potential clinical utility of gene-gene and gene-environment interactions in CAD.
STAR FORMATION AND SUPERCLUSTER ENVIRONMENT OF 107 NEARBY GALAXY CLUSTERS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cohen, Seth A.; Hickox, Ryan C.; Wegner, Gary A.
We analyze the relationship between star formation (SF), substructure, and supercluster environment in a sample of 107 nearby galaxy clusters using data from the Sloan Digital Sky Survey. Previous works have investigated the relationships between SF and cluster substructure, and cluster substructure and supercluster environment, but definitive conclusions relating all three of these variables have remained elusive. We find an inverse relationship between cluster SF fraction (f_SF) and supercluster environment density, calculated using the galaxy luminosity density field at a smoothing length of 8 h-1 Mpc (D8). The slope of f_SF versus D8 is −0.008 ± 0.002. The f_SF of clusters located in low-density large-scale environments, 0.244 ± 0.011, is higher than for clusters located in high-density supercluster cores, 0.202 ± 0.014. We also divide superclusters, according to their morphology, into filament- and spider-type systems. The inverse relationship between cluster f_SF and large-scale density is dominated by filament- rather than spider-type superclusters. In high-density cores of superclusters, we find a higher f_SF in spider-type superclusters, 0.229 ± 0.016, than in filament-type superclusters, 0.166 ± 0.019. Using principal component analysis, we confirm these results and the direct correlation between cluster substructure and SF. These results indicate that cluster SF is affected by both the dynamical age of the cluster (younger systems exhibit higher amounts of SF); the large-scale density of the supercluster environment (high-density core regions exhibit lower amounts of SF); and supercluster morphology (spider-type superclusters exhibit higher amounts of SF at high densities).
NASA Technical Reports Server (NTRS)
Bezos, Gaudy M.; Cambell, Bryan A.; Melson, W. Edward
1989-01-01
A research technique to obtain large-scale aerodynamic data in a simulated natural rain environment has been developed. A 10-ft chord NACA 64-210 wing section equipped with leading-edge and trailing-edge high-lift devices was tested as part of a program to determine the effect of highly-concentrated, short-duration rainfall on airplane performance. Preliminary dry aerodynamic data are presented for the high-lift configuration at a velocity of 100 knots and an angle of attack of 18 deg. Also, data are presented on rainfield uniformity and rainfall concentration intensity levels obtained during the calibration of the rain simulation system.
Homogenization of Large-Scale Movement Models in Ecology
Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.
2011-01-01
A difficulty in using diffusion models to predict large-scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small-scale (10-100 m) habitat variability on large-scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models.
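The distinction drawn above between Fickian and ecological diffusion can be written compactly. These are standard forms; the specific homogenized coefficient derived in the paper is not reproduced here, only the generic statement that the fast-scale motility μ(x) is replaced by an effective constant on the slow scale.

```latex
% Fickian diffusion: flux follows gradients of density,
\frac{\partial u}{\partial t} = \nabla \cdot \bigl( \mu(\mathbf{x})\, \nabla u \bigr).

% Ecological diffusion: motility depends only on local habitat information,
\frac{\partial u}{\partial t} = \nabla^{2} \bigl( \mu(\mathbf{x})\, u \bigr).

% Homogenization seeks a slow-scale approximation with an effective constant
% coefficient \bar{\mu} obtained by averaging \mu over the fast (10-100 m) scale:
\frac{\partial \bar{u}}{\partial t} \approx \bar{\mu}\, \nabla^{2} \bar{u}.
```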
Surveying the CGM and IGM across 4 orders of magnitude in environmental density
NASA Astrophysics Data System (ADS)
Burchett, Joseph
2017-08-01
Environment matters when it comes to galaxy evolution, and the mechanisms driving this evolution are reflected in the diffuse gas residing within the large-scale structures enveloping the cosmic galaxy population. QSO absorption lines effectively probe the circumgalactic medium (CGM) and intragroup and intracluster media, and work thus far hints at profound environmental effects on the CGM. However, sample sizes remain small, and a unifying picture of the gas characteristics across diverse environments has yet to emerge. Within the Sloan Digital Sky Survey, we have identified a sample volume containing a remarkable diversity in large-scale environment with an array of voids, >10,000 groups, several filaments, and 5 clusters, including the Coma Supercluster and CfA Great Wall. Leveraging the Hubble Spectroscopic Legacy Archive (HSLA), we propose a study using >360 background QSOs probing this volume to study the effects of large-scale environment on CGM and intergalactic medium (IGM) gas. The z = 0.019-0.028 spectroscopic galaxy sample is uniformly complete to galaxies L > 0.03 L* and, with the HSLA, produces 200 galaxy/sightline pairs within 300-kpc impact parameters across a wide range of environmental densities and structures. Upon quantifying the galaxy environment and identifying/measuring the QSO absorption lines at z = 0.019-0.028, we will pursue the following primary science goals: (1) constrain the CGM/IGM physical conditions across four orders of magnitude in galaxy density; (2) compare ionic abundances and ionization states in the CGM of galaxies in filaments vs. voids; and (3) statistically investigate the IGM/CGM gas properties from structure to structure.
Using Computing and Data Grids for Large-Scale Science and Engineering
NASA Technical Reports Server (NTRS)
Johnston, William E.
2001-01-01
We use the term "Grid" to refer to a software system that provides uniform and location independent access to geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. These emerging data and computing Grids promise to provide a highly capable and scalable environment for addressing large-scale science problems. We describe the requirements for science Grids, the resulting services and architecture of NASA's Information Power Grid (IPG) and DOE's Science Grid, and some of the scaling issues that have come up in their implementation.
NASA Astrophysics Data System (ADS)
Huveneers, François
2018-04-01
We investigate the long-time behavior of a passive particle evolving in a one-dimensional diffusive random environment, with diffusion constant D. We consider two cases: (a) the particle is pulled forward by a small external constant force, and (b) there is no systematic bias. Theoretical arguments and numerical simulations provide evidence that the particle is eventually trapped by the environment. This is diagnosed in two ways: the asymptotic speed of the particle scales quadratically with the external force as it goes to zero, and the fluctuations scale diffusively in the unbiased environment, up to possible logarithmic corrections in both cases. Moreover, in the large D limit (homogenized regime), we find an important transient region giving rise to other, finite-size scalings, and we describe the crossover to the true asymptotic behavior.
ERIC Educational Resources Information Center
Reynolds, Amy L.; Weigand, Matthew J.
2010-01-01
This study examined the relationships among academic and psychological attitudes and academic achievement of first-year students. The College Resilience Scale, the Academic Motivation Scale, the College Self-Efficacy Inventory, and the University Environment Scale were administered to 164 first-year undergraduate students enrolled at a large RU/VH…
The potential for pharmaceuticals in the environment to cause adverse ecological effects is of increasing concern. Given the thousands of active pharmaceutical ingredients (APIs) which can enter the aquatic environment through various means, a current challenge in aquatic toxicol...
Sawata, Hiroshi; Ueshima, Kenji; Tsutani, Kiichiro
2011-04-14
Clinical evidence is important for improving the treatment of patients by health care providers. In the study of cardiovascular diseases, large-scale clinical trials involving thousands of participants are required to evaluate the risks of cardiac events and/or death. The problems encountered in conducting the Japanese Acute Myocardial Infarction Prospective (JAMP) study highlighted the difficulties involved in obtaining the financial and infrastructural resources necessary for conducting large-scale clinical trials. The objectives of the current study were: 1) to clarify the current funding and infrastructural environment surrounding large-scale clinical trials in cardiovascular and metabolic diseases in Japan, and 2) to find ways to improve the environment surrounding clinical trials in Japan more generally. We examined clinical trials of cardiovascular diseases that evaluated true endpoints and involved 300 or more participants, using PubMed, Ichushi (by the Japan Medical Abstracts Society, a non-profit organization), websites of related medical societies, the University Hospital Medical Information Network (UMIN) Clinical Trials Registry, and clinicaltrials.gov at three points in time: 30 November 2004, 25 February 2007 and 25 July 2009. We found a total of 152 trials that met our criteria for 'large-scale clinical trials' examining cardiovascular diseases in Japan. Of these, 72.4% were randomized controlled trials (RCTs), 9.2% examined more than 10,000 participants, and 42.8% examined between 1,000 and 10,000 participants. The number of large-scale clinical trials markedly increased from 2001 to 2004, but suddenly decreased in 2007, then began to increase again. Ischemic heart disease (39.5%) was the most common target disease. Most of the larger-scale trials were funded by private organizations such as pharmaceutical companies. The designs and results of 13 trials were not disclosed. To improve the quality of clinical trials, all sponsors should register trials and disclose the funding sources before the enrolment of participants, and publish their results after the completion of each study.
The f(R) halo mass function in the cosmic web
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braun-Bates, F. von; Winther, H.A.; Alonso, D.
An important indicator of modified gravity is the effect of the local environment on halo properties. This paper examines the influence of the local tidal structure on the halo mass function, the halo orientation, spin and the concentration-mass relation. We use the excursion set formalism to produce a halo mass function conditional on large-scale structure. Our simple model agrees well with simulations on large scales at which the density field is linear or weakly non-linear. Beyond this, our principal result is that f(R) does affect halo abundances, the halo spin parameter and the concentration-mass relationship in an environment-independent way, whereas we find no appreciable deviation from ΛCDM for the mass function with fixed environment density, nor the alignment of the orientation and spin vectors of the halo to the eigenvectors of the local cosmic web. There is a general trend for greater deviation from ΛCDM in underdense environments and for high-mass haloes, as expected from chameleon screening.
A Framework for Voxel-Based Global Scale Modeling of Urban Environments
NASA Astrophysics Data System (ADS)
Gehrung, Joachim; Hebel, Marcus; Arens, Michael; Stilla, Uwe
2016-10-01
The generation of 3D city models is a very active field of research. Modeling environments as point clouds may be fast, but has disadvantages that are readily addressed by volumetric representations, especially when considering selective data acquisition, change detection and fast-changing environments. Therefore, this paper proposes a framework for the volumetric modeling and visualization of large-scale urban environments. Besides an architecture and the right mix of algorithms for the task, two compression strategies for volumetric models as well as a data-quality-based approach for the import of range measurements are proposed. The capabilities of the framework are shown on a mobile laser scanning dataset of the Technical University of Munich. Furthermore, the loss introduced by the compression techniques is evaluated and their memory consumption is compared to that of raw point clouds. The presented results show that generation, storage and real-time rendering of large urban models are feasible, even with off-the-shelf hardware.
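As a point of reference for the volumetric representation discussed above, the sketch below shows a sparse voxel grid with log-odds occupancy updates from simulated range measurements. It is a generic illustration under assumed parameters (voxel size, update weights), not the authors' framework or its compression strategies.

```python
import numpy as np

class SparseVoxelMap:
    """Sparse voxel occupancy map: only observed voxels are stored."""
    def __init__(self, voxel_size=0.5, l_occ=0.85, l_free=-0.4):
        self.voxel_size, self.l_occ, self.l_free = voxel_size, l_occ, l_free
        self.log_odds = {}                                   # (i, j, k) -> log-odds

    def _key(self, p):
        return tuple(np.floor(np.asarray(p, float) / self.voxel_size).astype(int))

    def integrate_ray(self, origin, endpoint, n_steps=100):
        """Mark voxels along the ray as free and the endpoint voxel as occupied."""
        origin, endpoint = np.asarray(origin, float), np.asarray(endpoint, float)
        end_key = self._key(endpoint)
        for t in np.linspace(0.0, 1.0, n_steps, endpoint=False):
            k = self._key(origin + t * (endpoint - origin))
            if k != end_key:
                self.log_odds[k] = self.log_odds.get(k, 0.0) + self.l_free
        self.log_odds[end_key] = self.log_odds.get(end_key, 0.0) + self.l_occ

    def occupancy(self, p):
        l = self.log_odds.get(self._key(p), 0.0)
        return 1.0 / (1.0 + np.exp(-l))                      # log-odds -> probability

vmap = SparseVoxelMap(voxel_size=0.5)
vmap.integrate_ray(origin=[0, 0, 0], endpoint=[10, 0, 1.5])  # one simulated laser return
print(vmap.occupancy([10, 0, 1.5]), vmap.occupancy([5, 0, 0.75]))
```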
Pluess, Andrea R; Frank, Aline; Heiri, Caroline; Lalagüe, Hadrien; Vendramin, Giovanni G; Oddou-Muratorio, Sylvie
2016-04-01
The evolutionary potential of long-lived species, such as forest trees, is fundamental for their local persistence under climate change (CC). Genome-environment association (GEA) analyses reveal whether species in heterogeneous environments at the regional scale are under differential selection, resulting in populations with potential preadaptation to CC within this area. In 79 natural Fagus sylvatica populations, neutral genetic patterns were characterized using 12 simple sequence repeat (SSR) markers, and genomic variation (144 single nucleotide polymorphisms (SNPs) out of 52 candidate genes) was related to 87 environmental predictors in the latent factor mixed model, logistic regressions and isolation by distance/environment (IBD/IBE) tests. SSR diversity revealed relatedness at up to 150 m intertree distance but an absence of large-scale spatial genetic structure and IBE. In the GEA analyses, 16 SNPs in 10 genes responded to one or several environmental predictors and IBE, corrected for IBD, was confirmed. The GEA often reflected the proposed gene functions, including indications for adaptation to water availability and temperature. Genomic divergence and the lack of large-scale neutral genetic patterns suggest that gene flow allows the spread of advantageous alleles in adaptive genes. Thereby, adaptation processes are likely to take place in species occurring in heterogeneous environments, which might reduce their regional extinction risk under CC.
Large-scale P2P network based distributed virtual geographic environment (DVGE)
NASA Astrophysics Data System (ADS)
Tan, Xicheng; Yu, Liang; Bian, Fuling
2007-06-01
Virtual geographic environments (VGE) have attracted wide attention as a kind of software information system that helps us understand and analyze the real geographic environment, and they have expanded into application service systems in distributed environments, namely the distributed virtual geographic environment (DVGE) system, with some notable achievements. However, limited by the massive data volumes of VGE, network bandwidth, the large number of concurrent requests, and economic factors, DVGE still faces challenges and problems that prevent the current DVGE from providing the public with high-quality service under the current network mode. The rapid development of peer-to-peer (P2P) network technology offers new solutions to these challenges and problems. P2P network technology can effectively publish and search network resources so as to realize efficient information sharing. Accordingly, this paper proposes a large-scale P2P network extension of DVGE and presents an in-depth study of the network framework, routing mechanism, and DVGE data management on a P2P network.
Software environment for implementing engineering applications on MIMD computers
NASA Technical Reports Server (NTRS)
Lopez, L. A.; Valimohamed, K. A.; Schiff, S.
1990-01-01
In this paper the concept for a software environment for developing engineering application systems for multiprocessor hardware (MIMD) is presented. The philosophy employed is to solve the largest problems possible in a reasonable amount of time, rather than solve existing problems faster. In the proposed environment most of the problems concerning parallel computation and handling of large distributed data spaces are hidden from the application program developer, thereby facilitating the development of large-scale software applications. Applications developed under the environment can be executed on a variety of MIMD hardware; it protects the application software from the effects of a rapidly changing MIMD hardware technology.
NASA Technical Reports Server (NTRS)
Kaplan, Michael L.; Huffman, Allan W.; Lux, Kevin M.; Charney, Joseph J.; Riordan, Allan J.; Lin, Yuh-Lang; Proctor, Fred H. (Technical Monitor)
2002-01-01
A 44-case study analysis of the large-scale atmospheric structure associated with the development of accident-producing aircraft turbulence is described. Categorization is a function of the accident location, altitude, time of year, time of day, and the turbulence category, which classifies the disturbances. National Centers for Environmental Prediction Reanalyses data sets and satellite imagery are employed to diagnose synoptic-scale predictor fields associated with the large-scale environment preceding severe turbulence. These analyses indicate a predominance of severe accident-producing turbulence within the entrance region of a jet stream at the synoptic scale. Typically, a flow curvature region is just upstream within the jet entrance region, convection is within 100 km of the accident, vertical motion is upward, absolute vorticity is low, vertical wind shear is increasing, and horizontal cold advection is substantial. The most consistent predictor is upstream flow curvature, and nearby convection is the second most frequent predictor.
Fan, Yuying; Zheng, Qiulan; Liu, Shiqing; Li, Qiujie
2016-07-01
To explore the relationships among perceived work environment, psychological empowerment and job engagement of clinical nurses in Harbin, China. Previous studies have focused on organisational factors or nurses' personal characteristics contributing to job engagement. Limited studies have examined the effects of perceived work environment and psychological empowerment on job engagement among Chinese nurses. A cross-sectional quantitative survey of 923 registered nurses at four large university hospitals in China was carried out. Research instruments included the Chinese versions of the perceived nurse work environment scale, the psychological empowerment scale and the job engagement scale. The relationships of the variables were tested using structural equation modelling. Structural equation modelling revealed a good fit of the model, χ²/df = 4.46, GFI = 0.936, CFI = 0.957. Perceived work environment was a significant positive direct predictor of psychological empowerment and job engagement. Psychological empowerment was a significant positive direct contributor to job engagement and had a mediating effect on the relationship between perceived work environment and job engagement. Perceived work environment may result in increased job engagement by facilitating the development of psychological empowerment. For nurse managers wishing to increase nurse engagement and to achieve effective management, both perceived work environment and psychological empowerment are factors that need to be well controlled in the process of nurse administration.
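The mediation structure tested above can be summarized with two path equations (generic mediation notation, not the authors' exact parameterization), where PWE is perceived work environment, PE is psychological empowerment, and JE is job engagement:

```latex
\mathrm{PE} = a\,\mathrm{PWE} + \varepsilon_{1}, \qquad
\mathrm{JE} = c'\,\mathrm{PWE} + b\,\mathrm{PE} + \varepsilon_{2}.

% Direct effect of PWE on JE: c'; mediated (indirect) effect through PE: a b;
% total effect: c' + a b.
```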
NASA Astrophysics Data System (ADS)
Choi, Jin-Ho; Seo, Kyong-Hwan
2017-06-01
This work seeks to find the most effective parameters in a deep convection scheme (the relaxed Arakawa-Schubert scheme) of the National Centers for Environmental Prediction Climate Forecast System model for improved simulation of the Madden-Julian Oscillation (MJO). A suite of sensitivity experiments is performed by changing physical components such as the relaxation parameter of mass flux for adjustment of the environment, the evaporation rate from large-scale precipitation, the moisture trigger threshold using relative humidity of the boundary layer, and the fraction of re-evaporation of convective (subgrid-scale) rainfall. Among them, the last two parameters are found to produce a significant improvement. Increasing the strength of these two parameters reduces the light rainfall that inhibits complete formation of the tropical convective system, or supplies more moisture that helps increase the potential energy of the large-scale environment in the lower troposphere (especially at 700 hPa), leading to moisture preconditioning favorable for further development and eastward propagation of the MJO. In a more humid environment, a more organized MJO structure (i.e., space-time spectral signal, eastward propagation, and tilted vertical structure) is produced.
Peace Operations in Mali: Theory into Practice Then Measuring Effectiveness
2017-06-09
community’s response along two broad lines of effort (LOE): Creating a Safe and Secure Environment and promoting Stable Governance. When seeking to achieve a... Safe and Secure Environment , two objectives were measured. Objective #1 sought the Cessation of Large Scale Violence. Success was attained, as...Creating a Safe and Secure Environment and promoting Stable Governance. When seeking to achieve a Safe and Secure Environment , two objectives were
Impact of spectral nudging on the downscaling of tropical cyclones in regional climate simulations
NASA Astrophysics Data System (ADS)
Choi, Suk-Jin; Lee, Dong-Kyou
2016-06-01
This study investigated simulations of three months of seasonal tropical cyclone (TC) activity over the western North Pacific using the Advanced Research WRF Model. In the control experiment (CTL), the TC frequency was considerably overestimated. Additionally, the tracks of some TCs tended to have larger radii of curvature and were shifted eastward. The large-scale environments of westerly monsoon flows and subtropical Pacific highs were poorly simulated. The overestimated frequency of TC formation was attributed to a strengthened westerly wind field in the southern quadrants of the TC center. In comparison with the experiment using the spectral nudging method, the strengthened wind speed was mainly modulated by large-scale flow on scales greater than approximately 1000 km in the model domain. The spurious formation and undesirable tracks of TCs in the CTL were considerably improved by reproducing realistic large-scale atmospheric monsoon circulation, with substantial adjustment between large-scale flow in the model domain and large-scale boundary forcing modified by the spectral nudging method. The realistic monsoon circulation played a vital role in simulating realistic TCs. This reveals that, in downscaling from large-scale fields for regional climate simulations, the scale interaction between model-generated regional features and forced large-scale fields should be considered, and spectral nudging is a desirable approach for such downscaling.
Responding to Environmental Challenges in Central Asia and the Caspian Basin
2001-03-01
regional environment, political tension, economic destabilization and loss, to rendering large agricultural areas unsuitable for cultivation as...large-scale deforestation and inappropriate farming practices, particularly the cultivation of marginal lands without soil conservation measures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tonnesen, Stephanie; Cen, Renyue, E-mail: stonnes@gmail.com, E-mail: cen@astro.princeton.edu
2015-10-20
The connection between dark matter halos and galactic baryons is often not well constrained nor well resolved in cosmological hydrodynamical simulations. Thus, halo occupation distribution models that assign galaxies to halos based on halo mass are frequently used to interpret clustering observations, even though it is well known that the assembly history of dark matter halos is related to their clustering. In this paper we use high-resolution hydrodynamical cosmological simulations to compare the halo and stellar mass growth of galaxies in a large-scale overdensity to those in a large-scale underdensity (on scales of about 20 Mpc). The simulation reproduces assembly bias, in which halos have earlier formation times in overdense environments than in underdense regions. We find that the ratio of stellar mass to halo mass is larger in overdense regions in central galaxies residing in halos with masses between 10^11 and 10^12.9 M_⊙. When we force the local density (within 2 Mpc) at z = 0 to be the same for galaxies in the large-scale over- and underdensities, we find the same results. We posit that this difference can be explained by a combination of earlier formation times, more interactions at early times with neighbors, and more filaments feeding galaxies in overdense regions. This result puts the standard practice of assigning stellar mass to halos based only on their mass, rather than considering their larger environment, into question.
The large-scale environment from cosmological simulations - I. The baryonic cosmic web
NASA Astrophysics Data System (ADS)
Cui, Weiguang; Knebe, Alexander; Yepes, Gustavo; Yang, Xiaohu; Borgani, Stefano; Kang, Xi; Power, Chris; Staveley-Smith, Lister
2018-01-01
Using a series of cosmological simulations that includes one dark-matter-only (DM-only) run, one gas cooling-star formation-supernova feedback (CSF) run and one that additionally includes feedback from active galactic nuclei (AGNs), we classify the large-scale structures with both a velocity-shear-tensor code (VWEB) and a tidal-tensor code (PWEB). We find that the baryonic processes have almost no impact on large-scale structures - at least not when classified using the aforementioned techniques. More importantly, our results confirm that the gas component alone can be used to infer the filamentary structure of the universe practically unbiased, which could be applied to cosmological constraints. In addition, the gas filaments are classified using their velocity (VWEB) and density (PWEB) fields, which can in principle be connected to radio observations such as H I surveys. This will help link the radio observations to the large-scale dark matter distribution in an unbiased way.
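Both VWEB and PWEB classify the cosmic web from the eigenvalues of a tensor field (the velocity shear or tidal tensor). A minimal sketch of the counting step, assuming the tensors have already been computed on a grid, is given below; the threshold value is only an illustrative choice, not the one adopted in the paper.

```python
import numpy as np

def classify_web(tensors, lambda_th=0.44):
    """Void/sheet/filament/knot classification from 3x3 shear or tidal tensors.

    tensors   : array of shape (ncells, 3, 3), symmetric tensors per grid cell.
    lambda_th : eigenvalue threshold (an illustrative value, not the paper's).
    Returns the number of eigenvalues above the threshold in each cell:
    0 = void, 1 = sheet, 2 = filament, 3 = knot.
    """
    eigvals = np.linalg.eigvalsh(tensors)       # ascending eigenvalues per cell
    return np.sum(eigvals > lambda_th, axis=1)
```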
NASA Technical Reports Server (NTRS)
1985-01-01
The goal of defining a CO2 laser transmitter approach suited to Shuttle Coherent Atmospheric Lidar Experiment (SCALE) requirements is discussed. The adaptation of the existing WINDVAN system to the shuttle environment is addressed. The size, weight, reliability, and efficiency of the existing WINDVAN system are largely compatible with SCALE requirements. Repackaging is needed for compatibility with vacuum and thermal environments. Changes are required to ensure survival through launch and landing mechanical, vibration, and acoustic loads. Existing WINDVAN thermal management approaches that depend on convection need to be upgraded for zero-gravity operations.
Successes and Challenges in Transitioning to Large Enrollment NEXUS/Physics IPLS Labs
NASA Astrophysics Data System (ADS)
Moore, Kimberly
2017-01-01
UMd-PERG's NEXUS/Physics for Life Sciences laboratory curriculum, piloted in 2012-2013 in small test classes, has been implemented in large-enrollment environments at UMD from 2013-present. These labs address physical issues at biological scales using microscopy, image and video analysis, electrophoresis, and spectroscopy in an open, non-protocol-driven environment. We have collected a wealth of data (surveys, video analysis, etc.) that enables us to get a sense of the students' responses to this curriculum in a large-enrollment environment and with teaching assistants both `new to' and `experienced in' the labs. In this talk, we will provide a brief overview of what we have learned, including the challenges of transitioning to large N, student perception then and now, and comparisons of our large-enrollment results to the results from our pilot study. We will close with a discussion of the acculturation of teaching assistants to this novel environment and suggestions for sustainability.
The Saskatchewan River Basin - a large scale observatory for water security research (Invited)
NASA Astrophysics Data System (ADS)
Wheater, H. S.
2013-12-01
The 336,000 km2 Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of flood and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and multiple jurisdictions. The SaskRB has therefore been developed as a large scale observatory, now a Regional Hydroclimate Project of the World Climate Research Programme's GEWEX project, and is available to contribute to the emerging North American Water Program. State-of-the-art hydro-ecological experimental sites have been developed for the key biomes, and a river and lake biogeochemical research facility, focussed on impacts of nutrients and exotic chemicals. Data are integrated at SaskRB scale to support the development of improved large scale climate and hydrological modelling products, the development of DSS systems for local, provincial and basin-scale management, and the development of related social science research, engaging stakeholders in the research and exploring their values and priorities for water security. The observatory provides multiple scales of observation and modelling required to develop: a) new climate, hydrological and ecological science and modelling tools to address environmental change in key environments, and their integrated effects and feedbacks at large catchment scale, b) new tools needed to support river basin management under uncertainty, including anthropogenic controls on land and water management and c) the place-based focus for the development of new transdisciplinary science.
Advanced Connectivity Analysis (ACA): a Large Scale Functional Connectivity Data Mining Environment.
Chen, Rong; Nixon, Erika; Herskovits, Edward
2016-04-01
Using resting-state functional magnetic resonance imaging (rs-fMRI) to study functional connectivity is of great importance to understand normal development and function as well as a host of neurological and psychiatric disorders. Seed-based analysis is one of the most widely used rs-fMRI analysis methods. Here we describe a freely available large scale functional connectivity data mining software package called Advanced Connectivity Analysis (ACA). ACA enables large-scale seed-based analysis and brain-behavior analysis. It can seamlessly examine a large number of seed regions with minimal user input. ACA has a brain-behavior analysis component to delineate associations among imaging biomarkers and one or more behavioral variables. We demonstrate applications of ACA to rs-fMRI data sets from a study of autism.
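A bare-bones version of the seed-based step that ACA automates over many seed regions might look like the sketch below. The array shapes and names are assumptions made for illustration; the package's actual interface is not shown here.

```python
import numpy as np

def seed_connectivity(voxel_ts, seed_ts):
    """Seed-based functional connectivity map from rs-fMRI time series.

    voxel_ts : array (ntime, nvoxels); seed_ts : array (ntime,), e.g. the mean
    time series of a seed region.  Returns Fisher z-transformed correlations.
    """
    v = (voxel_ts - voxel_ts.mean(0)) / voxel_ts.std(0)
    s = (seed_ts - seed_ts.mean()) / seed_ts.std()
    r = v.T @ s / len(s)                               # Pearson r per voxel
    return np.arctanh(np.clip(r, -0.999999, 0.999999))  # Fisher z

# Looping over many seeds, as a large-scale seed-based analysis would:
# maps = {name: seed_connectivity(voxel_ts, ts) for name, ts in seed_series.items()}
```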
Social sciences in Puget Sound recovery
Katharine F. Wellman; Kelly Biedenweg; Kathleen Wolf
2014-01-01
Advancing the recovery of large-scale ecosystems, such as the Puget Sound in Washington State, requires improved knowledge of the interdependencies between nature and humans in that basin region. As Biedenweg et al. (this issue) illustrate, human wellbeing and human behavior do not occur independently of the biophysical environment. Natural environments contribute to...
NASA Technical Reports Server (NTRS)
Yanai, M.; Esbensen, S.; Chu, J.
1972-01-01
The bulk properties of tropical cloud clusters, such as the vertical mass flux, the excess temperature and moisture, and the liquid water content of the clouds, are determined from a combination of the observed large-scale heat and moisture budgets over an area covering the cloud cluster, and a model of a cumulus ensemble which exchanges mass, heat, vapor and liquid water with the environment through entrainment and detrainment. The method also provides an understanding of how the environmental air is heated and moistened by the cumulus convection. An estimate of the average cloud cluster properties and the heat and moisture balance of the environment, obtained from 1956 Marshall Islands data, is presented.
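In this budget-residual approach, the observed large-scale heat and moisture budgets are usually summarized by the apparent heat source Q1 and apparent moisture sink Q2, written in pressure coordinates in the standard advective form (the paper's exact formulation may differ in detail):

```latex
Q_1 \equiv \frac{\partial \bar{s}}{\partial t}
    + \bar{\mathbf{V}} \cdot \nabla \bar{s}
    + \bar{\omega}\, \frac{\partial \bar{s}}{\partial p},
\qquad
Q_2 \equiv -L \left( \frac{\partial \bar{q}}{\partial t}
    + \bar{\mathbf{V}} \cdot \nabla \bar{q}
    + \bar{\omega}\, \frac{\partial \bar{q}}{\partial p} \right),
```

where s = c_p T + gz is the dry static energy, q the water-vapour mixing ratio, L the latent heat of vaporization, and overbars denote averages over the budget area; the cloud-ensemble properties are then inferred by requiring the modelled cumulus transports of mass, heat, and moisture to account for these Q1 and Q2 residuals.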
Sign: large-scale gene network estimation environment for high performance computing.
Tamada, Yoshinori; Shimamura, Teppei; Yamaguchi, Rui; Imoto, Seiya; Nagasaki, Masao; Miyano, Satoru
2011-01-01
Our research group is currently developing software for estimating large-scale gene networks from gene expression data. The software, called SiGN, is specifically designed for the Japanese flagship supercomputer "K computer" which is planned to achieve 10 petaflops in 2012, and other high performance computing environments including Human Genome Center (HGC) supercomputer system. SiGN is a collection of gene network estimation software with three different sub-programs: SiGN-BN, SiGN-SSM and SiGN-L1. In these three programs, five different models are available: static and dynamic nonparametric Bayesian networks, state space models, graphical Gaussian models, and vector autoregressive models. All these models require a huge amount of computational resources for estimating large-scale gene networks and therefore are designed to be able to exploit the speed of 10 petaflops. The software will be available freely for "K computer" and HGC supercomputer system users. The estimated networks can be viewed and analyzed by Cell Illustrator Online and SBiP (Systems Biology integrative Pipeline). The software project web site is available at http://sign.hgc.jp/ .
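One of the five model classes mentioned above, the graphical Gaussian model, estimates a gene network as the sparsity pattern of the inverse covariance of expression data. The sketch below applies scikit-learn's graphical lasso to toy data purely to illustrate that model class; it is unrelated to SiGN's own petascale implementation.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

# Toy expression matrix: 200 samples x 30 genes of random data (stand-in only).
rng = np.random.default_rng(0)
expr = rng.normal(size=(200, 30))

model = GraphicalLassoCV().fit(expr)
precision = model.precision_                                   # sparse inverse covariance
edges = np.argwhere(np.triu(np.abs(precision) > 1e-6, k=1))    # putative gene-gene links
print(f"{len(edges)} edges recovered (expected near zero for pure noise)")
```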
NASA Astrophysics Data System (ADS)
Hristova-Veleva, S.; Chao, Y.; Vane, D.; Lambrigtsen, B.; Li, P. P.; Knosp, B.; Vu, Q. A.; Su, H.; Dang, V.; Fovell, R.; Tanelli, S.; Garay, M.; Willis, J.; Poulsen, W.; Fishbein, E.; Ao, C. O.; Vazquez, J.; Park, K. J.; Callahan, P.; Marcus, S.; Haddad, Z.; Fetzer, E.; Kahn, R.
2007-12-01
In spite of recent improvements in hurricane track forecast accuracy, currently there are still many unanswered questions about the physical processes that determine hurricane genesis, intensity, track and impact on the large-scale environment. Furthermore, a significant amount of work remains to be done in validating hurricane forecast models, understanding their sensitivities and improving their parameterizations. None of this can be accomplished without a comprehensive set of multiparameter observations that are relevant to both the large-scale and the storm-scale processes in the atmosphere and in the ocean. To address this need, we have developed a prototype of a comprehensive hurricane information system of high-resolution satellite, airborne and in-situ observations and model outputs pertaining to: i) the thermodynamic and microphysical structure of the storms; ii) the air-sea interaction processes; iii) the larger-scale environment as depicted by the SST, ocean heat content and the aerosol loading of the environment. Our goal was to create a one-stop place to provide the researchers with an extensive set of observed hurricane data, and their graphical representation, together with large-scale and convection-resolving model output, all organized in an easy way to determine when coincident observations from multiple instruments are available. Analysis tools will be developed in the next step. The analysis tools will be used to determine spatial, temporal and multiparameter covariances that are needed to evaluate model performance, provide information for data assimilation and characterize and compare observations from different platforms. We envision that the developed hurricane information system will help in the validation of the hurricane models, in the systematic understanding of their sensitivities and in the improvement of the physical parameterizations employed by the models. Furthermore, it will help in studying the physical processes that affect hurricane development and their impact on the large-scale environment. This talk will describe the developed prototype of the hurricane information system. Furthermore, we will use a set of WRF hurricane simulations and compare simulated to observed structures to illustrate how the information system can be used to discriminate between simulations that employ different physical parameterizations. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.
Homogenization techniques for population dynamics in strongly heterogeneous landscapes.
Yurk, Brian P; Cobbold, Christina A
2018-12-01
An important problem in spatial ecology is to understand how population-scale patterns emerge from individual-level birth, death, and movement processes. These processes, which depend on local landscape characteristics, vary spatially and may exhibit sharp transitions through behavioural responses to habitat edges, leading to discontinuous population densities. Such systems can be modelled using reaction-diffusion equations with interface conditions that capture local behaviour at patch boundaries. In this work we develop a novel homogenization technique to approximate the large-scale dynamics of the system. We illustrate our approach, which also generalizes to multiple species, with an example of logistic growth within a periodic environment. We find that population persistence and the large-scale population carrying capacity are influenced by patch residence times that depend on patch preference, as well as movement rates in adjacent patches. The forms of the homogenized coefficients yield key theoretical insights into how large-scale dynamics arise from the small-scale features.
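For orientation, the classical one-dimensional result that such homogenization generalizes is the harmonic-mean effective diffusivity for a periodic coefficient D(x) with period cell Y,

```latex
D_{\mathrm{hom}} \;=\; \left( \frac{1}{|Y|} \int_{Y} \frac{\mathrm{d}x}{D(x)} \right)^{-1}
\;=\; \left( \frac{f_1}{D_1} + \frac{f_2}{D_2} \right)^{-1}
\quad \text{(two patches with fractions } f_1, f_2\text{),}
```

whereas the interface conditions and patch preferences treated in the paper modify both the effective diffusion and growth coefficients in ways this classical formula does not capture.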
NASA Astrophysics Data System (ADS)
Vardoulaki, Eleni; Faustino Jimenez Andrade, Eric; Delvecchio, Ivan; Karim, Alexander; Smolčić, Vernesa; Magnelli, Benjamin; Bertoldi, Frank; Schinnener, Eva; Sargent, Mark; Finoguenov, Alexis; VLA COSMOS Team
2018-01-01
The radio sources associated with active galactic nuclei (AGN) can exhibit a variety of radio structures, from simple to more complex, giving rise to a variety of classification schemes. The question which still remains open, given deeper surveys revealing new populations of radio sources, is whether this plethora of radio structures can be attributed to the physical properties of the host or to the environment. Here we present an analysis of the radio structure of radio-selected AGN from the VLA-COSMOS Large Project at 3 GHz (JVLA-COSMOS; Smolčić et al.) in relation to: 1) their linear projected size, 2) the Eddington ratio, and 3) the environment their hosts lie within. We classify these as FRI (jet-like) and FRII (lobe-like) based on the FR-type classification scheme, and compare them to a sample of jet-less radio AGN in JVLA-COSMOS. We measure their linear projected sizes using a semi-automatic machine learning technique. Their Eddington ratios are calculated from X-ray data available for COSMOS. As environmental probes we take the X-ray groups (hundreds of kpc) and the density fields (~Mpc-scale) in COSMOS. We find that FRII radio sources are on average larger than FRIs, which agrees with the literature. But contrary to past studies, we find no dichotomy in FR objects in JVLA-COSMOS given their Eddington ratios, as on average they exhibit similar values. Furthermore, our results show that the large-scale environment does not explain the observed dichotomy in lobe- and jet-like FR-type objects, as both types are found in similar environments, but it does affect the shape of the radio structure, introducing bends for objects closer to the centre of an X-ray group.
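Converting a measured angular extent into the linear projected size used in such comparisons is a one-line cosmology calculation; the sketch below uses astropy with a Planck 2015 cosmology as an illustrative assumption (the survey's adopted parameters may differ).

```python
from astropy.cosmology import Planck15
import astropy.units as u

def projected_size_kpc(theta_arcsec, z):
    """Linear projected size of a radio source from its angular size and redshift."""
    scale = Planck15.kpc_proper_per_arcmin(z)          # kpc per arcmin at redshift z
    return (theta_arcsec * u.arcsec * scale).to(u.kpc)

print(projected_size_kpc(5.0, 1.2))   # e.g. a 5-arcsec source at z = 1.2
```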
Kongelf, Anine; Bandewar, Sunita V S; Bharat, Shalini; Collumbien, Martine
2015-01-01
In the last decade, community mobilisation (CM) interventions targeting female sex workers (FSWs) have been scaled up in India's national response to the HIV epidemic. This included the Bill and Melinda Gates Foundation's Avahan programme, which adopted a business approach to plan and manage implementation at scale. With the focus of evaluation efforts on measuring effectiveness and health impacts, there has been little analysis thus far of the interaction of the CM interventions with the sex work industry in complex urban environments. Between March and July 2012, semi-structured, in-depth interviews and focus group discussions were conducted with 63 HIV intervention implementers to explore challenges of HIV prevention among FSWs in Mumbai. A thematic analysis identified contextual factors that impact CM implementation. Large-scale interventions are not only impacted by, but were shown to shape, the dynamic social context. Registration practices and programme monitoring were experienced as stigmatising, reflected in shifting client preferences towards women not disclosing as 'sex workers'. This, combined with urban redevelopment and gentrification of traditional red light areas, forced dispersal and more 'hidden' ways of solicitation, further challenging outreach and collectivisation. Participants reported that brothel owners and 'pimps' continued to restrict access to sex workers, and the heterogeneous 'community' of FSWs remains fragmented with high levels of mobility. Stakeholder engagement was poor, and mobilising around HIV prevention was not compelling. Interventions largely failed to respond to community needs as strong target-orientation skewed activities towards those most easily measured and reported. Large-scale interventions have been impacted by and contributed to an increasingly complex sex work environment in Mumbai, challenging outreach and mobilisation efforts. Sex workers remain a vulnerable and disempowered group needing continued support and more comprehensive services.
Quality of life in small-scaled homelike nursing homes: an 8-month controlled trial.
Kok, Jeroen S; Nielen, Marjan M A; Scherder, Erik J A
2018-02-27
Quality of life is a clinically highly relevant outcome for residents with dementia. The question arises whether small-scale homelike facilities are associated with better quality of life than regular larger-scale nursing homes. A sample of 145 residents living in a large-scale care facility was followed over 8 months. Half of the sample (N = 77) subsequently moved to a small-scale facility. Quality of life aspects were measured with the QUALIDEM and GIP before and after relocation. We found a significant Group × Time interaction on measures of anxiety, meaning that residents who moved to small-scale units became less anxious than residents who stayed on the regular large-scale care units. No significant differences were found on other aspects of quality of life. This study demonstrates that residents who move from a large-scale facility to a small-scale environment can improve one aspect of quality of life by showing a reduction in anxiety. Current Controlled Trials ISRCTN11151241. Registration date: 21-06-2017. Retrospectively registered.
Biology-Inspired Distributed Consensus in Massively-Deployed Sensor Networks
NASA Technical Reports Server (NTRS)
Jones, Kennie H.; Lodding, Kenneth N.; Olariu, Stephan; Wilson, Larry; Xin, Chunsheng
2005-01-01
Promises of ubiquitous control of the physical environment by large-scale wireless sensor networks open avenues for new applications that are expected to redefine the way we live and work. Most recent research has concentrated on developing techniques for performing relatively simple tasks in small-scale sensor networks assuming some form of centralized control. The main contribution of this work is to propose a new way of looking at large-scale sensor networks, motivated by lessons learned from the way biological ecosystems are organized. Indeed, we believe that techniques used in small-scale sensor networks are not likely to scale to large networks; that such large-scale networks must be viewed as an ecosystem in which the sensors/effectors are organisms whose autonomous actions, based on local information, combine in a communal way to produce global results. As an example of a useful function, we demonstrate that fully distributed consensus can be attained in a scalable fashion in massively deployed sensor networks where individual motes operate based on local information, making local decisions that are aggregated across the network to achieve globally-meaningful effects.
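A minimal sketch of the kind of fully local, ecosystem-style aggregation argued for above is neighborhood gossip, in which every mote repeatedly averages its reading with those of its radio neighbors. The names and the update rule below are illustrative, not the paper's protocol.

```python
import numpy as np

def gossip_consensus(values, neighbors, n_rounds=50):
    """Distributed consensus: each mote repeatedly averages its value with those
    of its radio neighbors, using only local information.

    values    : initial sensor readings, shape (n_nodes,)
    neighbors : list of neighbor-index lists (the communication graph)
    """
    x = np.asarray(values, dtype=float).copy()
    for _ in range(n_rounds):
        x_new = x.copy()
        for i, nbrs in enumerate(neighbors):
            x_new[i] = np.mean([x[i]] + [x[j] for j in nbrs])
        x = x_new
    return x   # converges toward a common value on a connected graph

# Example: 4 motes in a ring, noisy temperature readings.
print(gossip_consensus([20.1, 19.8, 20.4, 20.0], [[1, 3], [0, 2], [1, 3], [2, 0]]))
```

Note that this simple rule drives the motes to agreement; reaching the exact global average requires carefully chosen (doubly stochastic) averaging weights.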
New design for interfacing computers to the Octopus network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sloan, L.J.
1977-03-14
The Lawrence Livermore Laboratory has several large-scale computers which are connected to the Octopus network. Several difficulties arise in providing adequate resources along with reliable performance. To alleviate some of these problems a new method of bringing large computers into the Octopus environment is proposed.
Cloud-enabled large-scale land surface model simulations with the NASA Land Information System
NASA Astrophysics Data System (ADS)
Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.
2017-12-01
Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists who are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple run-time environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that have taken weeks and months can now be performed in minutes. This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and describe the potential deployment of this information technology with other NASA applications.
Role of substrate quality on IC performance and yields
NASA Technical Reports Server (NTRS)
Thomas, R. N.
1981-01-01
The development of silicon and gallium arsenide crystal growth for the production of large-diameter substrates is discussed. Large-area substrates of significantly improved compositional purity, dopant distribution and structural perfection on a microscopic as well as macroscopic scale are important requirements. The exploratory use of magnetic fields to suppress convection effects in Czochralski crystal growth is addressed. The growth of large crystals in space appears impractical at present; however, the efforts to improve substrate quality could benefit from the experience gained in smaller-scale growth experiments conducted in the zero-gravity environment of space.
Using stroboscopic flow imaging to validate large-scale computational fluid dynamics simulations
NASA Astrophysics Data System (ADS)
Laurence, Ted A.; Ly, Sonny; Fong, Erika; Shusteff, Maxim; Randles, Amanda; Gounley, John; Draeger, Erik
2017-02-01
The utility and accuracy of computational modeling often requires direct validation against experimental measurements. The work presented here is motivated by taking a combined experimental and computational approach to determine the ability of large-scale computational fluid dynamics (CFD) simulations to understand and predict the dynamics of circulating tumor cells in clinically relevant environments. We use stroboscopic light sheet fluorescence imaging to track the paths and measure the velocities of fluorescent microspheres throughout a human aorta model. Performed over complex physiologically realistic 3D geometries, large data sets are acquired with microscopic resolution over macroscopic distances.
Not a Copernican observer: biased peculiar velocity statistics in the local Universe
NASA Astrophysics Data System (ADS)
Hellwing, Wojciech A.; Nusser, Adi; Feix, Martin; Bilicki, Maciej
2017-05-01
We assess the effect of the local large-scale structure on the estimation of two-point statistics of the observed radial peculiar velocities of galaxies. A large N-body simulation is used to examine these statistics from the perspective of random observers as well as 'Local Group-like' observers conditioned to reside in an environment resembling the observed Universe within 20 Mpc. The local environment systematically distorts the shape and amplitude of velocity statistics with respect to ensemble-averaged measurements made by a Copernican (random) observer. The Virgo cluster has the most significant impact, introducing large systematic deviations in all the statistics. For a simple 'top-hat' selection function, an idealized survey extending to ˜160 h-1 Mpc or deeper is needed to completely mitigate the effects of the local environment. Using shallower catalogues leads to systematic deviations of the order of 50-200 per cent depending on the scale considered. For a flat redshift distribution similar to the one of the CosmicFlows-3 survey, the deviations are even more prominent in both the shape and amplitude at all separations considered (≲100 h-1 Mpc). Conclusions based on statistics calculated without taking into account the impact of the local environment should be revisited.
A novel representation of groundwater dynamics in large-scale land surface modelling
NASA Astrophysics Data System (ADS)
Rahman, Mostaquimur; Rosolem, Rafael; Kollet, Stefan
2017-04-01
Land surface processes are connected to groundwater dynamics via shallow soil moisture. For example, groundwater affects evapotranspiration (by influencing the variability of soil moisture) and runoff generation mechanisms. However, contemporary Land Surface Models (LSMs) generally consider isolated soil columns and a free-drainage lower boundary condition for simulating hydrology. This is mainly due to the fact that incorporating detailed groundwater dynamics in LSMs usually requires considerable computing resources, especially for large-scale applications (e.g., continental to global). Yet these simplifications neglect the potential effect of groundwater dynamics on land surface mass and energy fluxes. In this study, we present a novel approach for representing high-resolution groundwater dynamics in LSMs that is computationally efficient for large-scale applications. This new parameterization is incorporated in the Joint UK Land Environment Simulator (JULES) and tested at the continental scale.
Large-scale solar magnetic fields and H-alpha patterns
NASA Technical Reports Server (NTRS)
Mcintosh, P. S.
1972-01-01
Coronal and interplanetary magnetic fields computed from measurements of large-scale photospheric magnetic fields suffer from interruptions in day-to-day observations and the limitation of using only measurements made near the solar central meridian. Procedures were devised for inferring the lines of polarity reversal from H-alpha solar patrol photographs that map the same large-scale features found on Mt. Wilson magnetograms. These features may be monitored without interruption by combining observations from the global network of observatories associated with NOAA's Space Environment Services Center. The patterns of inferred magnetic fields may be followed accurately as far as 60 deg from central meridian. Such patterns will be used to improve predictions of coronal features during the next solar eclipse.
Applications of Parallel Process HiMAP for Large Scale Multidisciplinary Problems
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.; Potsdam, Mark; Rodriguez, David; Kwak, Dochay (Technical Monitor)
2000-01-01
HiMAP is a three level parallel middleware that can be interfaced to a large scale global design environment for code independent, multidisciplinary analysis using high fidelity equations. Aerospace technology needs are rapidly changing. Computational tools compatible with the requirements of national programs such as space transportation are needed. Conventional computation tools are inadequate for modern aerospace design needs. Advanced, modular computational tools are needed, such as those that incorporate the technology of massively parallel processors (MPP).
Scale-Up: Improving Large Enrollment Physics Courses
NASA Astrophysics Data System (ADS)
Beichner, Robert
1999-11-01
The Student-Centered Activities for Large Enrollment University Physics (SCALE-UP) project is working to establish a learning environment that will promote increased conceptual understanding, improved problem-solving performance, and greater student satisfaction, while still maintaining class sizes of approximately 100. We are also addressing the new ABET engineering accreditation requirements for inquiry-based learning along with communication and team-oriented skills development. Results of studies of our latest classroom design, plans for future classroom space, and the current iteration of instructional materials will be discussed.
NASA Technical Reports Server (NTRS)
Fisher, Scott S.
1986-01-01
A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture has been developed for use as a multipurpose interface environment. The system provides a multisensory, interactive display environment in which a user can virtually explore a 360-degree synthesized or remotely sensed environment and can viscerally interact with its components. Primary applications of the system are in telerobotics, management of large-scale integrated information systems, and human factors research. System configuration, application scenarios, and research directions are described.
Online Learning Experiences of New versus Continuing Learners: A Large-Scale Replication Study
ERIC Educational Resources Information Center
Li, Nai; Marsh, Vicky; Rienties, Bart; Whitelock, Denise
2017-01-01
A vast body of research has indicated the importance of distinguishing new vs. continuing students' learning experiences in blended and online environments. Continuing learners may have developed learning and coping mechanisms for "surviving" in such learning environments, while new learners might still need to adjust their learning…
One Spatial Map or Many? Spatial Coding of Connected Environments
ERIC Educational Resources Information Center
Han, Xue; Becker, Suzanna
2014-01-01
We investigated how humans encode large-scale spatial environments using a virtual taxi game. We hypothesized that if 2 connected neighborhoods are explored jointly, people will form a single integrated spatial representation of the town. However, if the neighborhoods are first learned separately and later observed to be connected, people will…
NASA Astrophysics Data System (ADS)
Koyama, Yusei; Hayashi, Masao; Tanaka, Masayuki; Kodama, Tadayuki; Shimakawa, Rhythm; Yamamoto, Moegi; Nakata, Fumiaki; Tanaka, Ichi; Suzuki, Tomoko L.; Tadaki, Ken-ichi; Nishizawa, Atsushi J.; Yabe, Kiyoto; Toba, Yoshiki; Lin, Lihwai; Jian, Hung-Yu; Komiyama, Yutaka
2018-01-01
We present the environmental dependence of color, stellar mass, and star formation (SF) activity in Hα-selected galaxies along the large-scale structure at z = 0.4 hosting twin clusters in the DEEP2-3 field, discovered by the Subaru Strategic Program of Hyper Suprime-Cam (HSC SSP). By combining photo-z-selected galaxies and Hα emitters selected with broadband and narrowband (NB) data from the recent data release of HSC SSP (DR1), we confirm that galaxies in higher-density environments or galaxies in cluster central regions show redder colors. We find that there still remains a possible color-density and color-radius correlation even if we restrict the sample to Hα-selected galaxies, probably due to the presence of massive Hα emitters in denser regions. We also find a hint of increased star formation rates (SFR) amongst Hα emitters toward the highest-density environment, again primarily driven by the excess of red/massive Hα emitters in high-density environments, while their specific SFRs do not significantly change with environment. This work demonstrates the power of the HSC SSP NB data for studying SF galaxies across environments in the distant universe.
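Local density estimates of the kind used for color-density relations are often built from the distance to the Nth nearest neighbor. A deliberately simplified projected version is sketched below; it applies no edge correction, photo-z treatment, or comoving conversion, so it is only an illustration and not the HSC SSP team's estimator.

```python
import numpy as np
from scipy.spatial import cKDTree

def local_density(ra_deg, dec_deg, n_neighbor=5):
    """Projected local galaxy density via the distance to the Nth nearest neighbor.

    Small-angle sketch: Sigma_N = N / (pi * d_N^2), with d_N in degrees.
    """
    coords = np.column_stack([ra_deg * np.cos(np.radians(dec_deg)), dec_deg])
    tree = cKDTree(coords)
    d, _ = tree.query(coords, k=n_neighbor + 1)   # k=1 is the point itself
    d_n = d[:, -1]                                # distance to the Nth neighbor
    return n_neighbor / (np.pi * d_n ** 2)
```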
ERIC Educational Resources Information Center
Guth, Douglas J.
2017-01-01
A community college's success hinges in large part on the effectiveness of its teaching faculty, no more so than in times of major organizational change. However, any large-scale foundational shift requires institutional buy-in, with the onus on leadership to create an environment where everyone is working together toward the same endpoint.…
NASA Technical Reports Server (NTRS)
Strom, Stephen; Sargent, Wallace L. W.; Wolff, Sidney; Ahearn, Michael F.; Angel, J. Roger; Beckwith, Steven V. W.; Carney, Bruce W.; Conti, Peter S.; Edwards, Suzan; Grasdalen, Gary
1991-01-01
Optical/infrared (O/IR) astronomy in the 1990's is reviewed. The following subject areas are included: research environment; science opportunities; technical development of the 1980's and opportunities for the 1990's; and ground-based O/IR astronomy outside the U.S. Recommendations are presented for: (1) large scale programs (Priority 1: a coordinated program for large O/IR telescopes); (2) medium scale programs (Priority 1: a coordinated program for high angular resolution; Priority 2: a new generation of 4-m class telescopes); (3) small scale programs (Priority 1: near-IR and optical all-sky surveys; Priority 2: a National Astrometric Facility); and (4) infrastructure issues (develop, purchase, and distribute optical CCDs and infrared arrays; a program to support large optics technology; a new generation of large filled aperture telescopes; a program to archive and disseminate astronomical databases; and a program for training new instrumentalists)
Adam E. Duerr; Tricia A. Miller; Kerri L. Cornell Duerr; Michael J. Lanzone; Amy Fesnock; Todd E. Katzner
2015-01-01
Anthropogenic development has great potential to affect fragile desert environments. Large-scale development of renewable energy infrastructure is planned for many desert ecosystems. Development plans should account for anthropogenic effects to distributions and abundance of rare or sensitive wildlife; however, baseline data on abundance and distribution of such...
Tropical agriculture is a major threat to biodiversity worldwide. In addition to the direct impacts of converting native vegetation to agriculture, this process is accompanied by a wider set of human-induced disturbances, many of which are poorly addressed by existing environment...
The Role of Scheduling in Observing Teacher-Child Interactions
ERIC Educational Resources Information Center
Cash, Anne H.; Pianta, Robert C.
2014-01-01
Observational assessment is being used on a large scale to evaluate the quality of interactions between teachers and children in classroom environments. When one performs observations at scale, features of the protocol such as the scheduling of observations can potentially influence observed scores. In this study interactions were observed for 88…
Activity-Based Introductory Physics Reform *
NASA Astrophysics Data System (ADS)
Thornton, Ronald
2004-05-01
Physics education research has shown that learning environments that engage students and allow them to take an active part in their learning can lead to large conceptual gains compared to those of good traditional instruction. Examples of successful curricula and methods include Peer Instruction, Just in Time Teaching, RealTime Physics, Workshop Physics, Scale-Up, and Interactive Lecture Demonstrations (ILDs). RealTime Physics promotes interaction among students in a laboratory setting and makes use of powerful real-time data logging tools to teach concepts as well as quantitative relationships. An active learning environment is often difficult to achieve in large lecture sessions, and Workshop Physics and Scale-Up largely eliminate lectures in favor of collaborative student activities. Peer Instruction, Just in Time Teaching, and Interactive Lecture Demonstrations (ILDs) make lectures more interactive in complementary ways. This presentation will introduce these reforms and use Interactive Lecture Demonstrations (ILDs) with the audience to illustrate the types of curricula and tools used in the methods above. ILDs make use of real experiments, real-time data logging tools and student interaction to create an active learning environment in large lecture classes. A short video of students involved in interactive lecture demonstrations will be shown. The results of research studies at various institutions to measure the effectiveness of these methods will be presented.
Scaling Relations between Gas and Star Formation in Nearby Galaxies
NASA Astrophysics Data System (ADS)
Bigiel, Frank; Leroy, Adam; Walter, Fabian
2011-04-01
High resolution, multi-wavelength maps of a sizeable set of nearby galaxies have made it possible to study how the surface densities of H I, H2 and star formation rate (ΣHI, ΣH2, ΣSFR) relate on scales of a few hundred parsecs. At these scales, individual galaxy disks are comfortably resolved, making it possible to assess gas-SFR relations with respect to environment within galaxies. ΣH2, traced by CO intensity, shows a strong correlation with ΣSFR, and the ratio between these two quantities, the molecular gas depletion time, appears to be constant at about 2 Gyr in large spiral galaxies. Within the star-forming disks of galaxies, ΣSFR shows almost no correlation with ΣHI. In the outer parts of galaxies, however, ΣSFR does scale with ΣHI, though with large scatter. Combining data from these different environments yields a distribution with multiple regimes in Σgas - ΣSFR space. If the underlying assumptions to convert observables to physical quantities are matched, even combined datasets based on different SFR tracers, methodologies and spatial scales occupy a well-defined locus in Σgas - ΣSFR space.
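The quoted ~2 Gyr molecular depletion time is simply the ratio of the two surface densities; for example,

```latex
t_{\mathrm{dep}} \equiv \frac{\Sigma_{\mathrm{H_2}}}{\Sigma_{\mathrm{SFR}}}
= \frac{10\,M_\odot\,\mathrm{pc^{-2}}}
       {5\times10^{-3}\,M_\odot\,\mathrm{yr^{-1}\,kpc^{-2}}}
= 2\times10^{9}\ \mathrm{yr},
```

where the numerator and denominator are illustrative values typical of the molecular regions of large spirals rather than numbers taken from the paper.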
A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations
NASA Astrophysics Data System (ADS)
Demir, I.; Agliamzanov, R.
2014-12-01
Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them to run large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications, and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational chunks. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large-scale hydrological simulations and model runs in an open and integrated environment.
Grid cells form a global representation of connected environments.
Carpenter, Francis; Manson, Daniel; Jeffery, Kate; Burgess, Neil; Barry, Caswell
2015-05-04
The firing patterns of grid cells in medial entorhinal cortex (mEC) and associated brain areas form triangular arrays that tessellate the environment [1, 2] and maintain constant spatial offsets to each other between environments [3, 4]. These cells are thought to provide an efficient metric for navigation in large-scale space [5-8]. However, an accurate and universal metric requires grid cell firing patterns to uniformly cover the space to be navigated, in contrast to recent demonstrations that environmental features such as boundaries can distort [9-11] and fragment [12] grid patterns. To establish whether grid firing is determined by local environmental cues, or provides a coherent global representation, we recorded mEC grid cells in rats foraging in an environment containing two perceptually identical compartments connected via a corridor. During initial exposures to the multicompartment environment, grid firing patterns were dominated by local environmental cues, replicating between the two compartments. However, with prolonged experience, grid cell firing patterns formed a single, continuous representation that spanned both compartments. Thus, we provide the first evidence that in a complex environment, grid cell firing can form the coherent global pattern necessary for them to act as a metric capable of supporting large-scale spatial navigation. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Statistical Model Applied to NetFlow for Network Intrusion Detection
NASA Astrophysics Data System (ADS)
Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.
Computers and network services have become a guaranteed presence in many places. This growth has been accompanied by an increase in illicit events, and computer and network security has therefore become an essential concern in any computing environment. Many methodologies have been created to identify such events; however, with the increasing number of users and services on the Internet, monitoring a large network environment has become difficult. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol and statistical methods, monitoring the environment in a timeframe appropriate for the application.
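A toy version of the statistical flagging step is sketched below: flow records are binned into fixed time windows, and a window is flagged when its count deviates strongly from the running baseline. The column name and thresholds are assumptions, and the paper's actual NetFlow-based methodology is more elaborate than this.

```python
import pandas as pd

def flag_anomalies(flows, window="5min", z_thresh=3.0):
    """Minimal statistical anomaly detection on NetFlow-like records.

    flows : DataFrame with a datetime column 'ts', one row per flow record
            (hypothetical schema).
    """
    counts = flows.set_index("ts").resample(window).size()     # flows per window
    mu = counts.expanding(min_periods=12).mean()                # running baseline
    sigma = counts.expanding(min_periods=12).std()
    z = (counts - mu) / sigma
    return counts[z > z_thresh]        # windows with unusually many flows
```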
Lorenz, D; Armbruster, W; Vogelgesang, C; Hoffmann, H; Pattar, A; Schmidt, D; Volk, T; Kubulus, D
2016-09-01
Chief emergency physicians are regarded as an important element in the care of the injured and sick following mass casualty accidents. Their education is very theoretical; practical content, in contrast, often falls short. The limitations are usually the very high costs of realistic (large-scale) exercises, poor reproducibility of the scenarios, and correspondingly poor results. To substantially improve the educational level, given the complexity of mass casualty accidents, modified training concepts are required that teach not only the theoretical but above all the practical skills considerably more intensively than at present. Modern training concepts should make it possible for the learner to realistically simulate decision processes. This article examines how interactive virtual environments are applicable to the education of emergency personnel and how they could be designed. Virtual simulation and training environments offer the possibility of simulating complex situations in an adequately realistic manner. The so-called virtual reality (VR) used in this context is an interface technology that enables free interaction in addition to a stereoscopic and spatial representation of virtual large-scale emergencies in a virtual environment. Variables in the scenarios, such as the weather, the number of wounded, and the availability of resources, can be changed at any time. The trainees are able to practice the procedures in many virtual accident scenes and act them out repeatedly, thereby testing different variants. With the aid of the "InSitu" project, it is possible to train in a virtual reality with realistically reproduced accident situations. These integrated, interactive training environments can depict very complex situations on a scale of 1:1. Because of the highly developed interactivity, the trainees can feel as if they are a direct part of the accident scene and therefore identify much more with the virtual world than is possible with desktop systems. Interactive, identifiable, and realistic training environments based on projector systems could in future enable repetitive exercises with variations within a decision tree, with good reproducibility, and across different occupational groups. With one hardware and software environment, numerous accident situations can be depicted and practiced. The main expense is the creation of the virtual accident scenes. As the appropriate city models and other three-dimensional geographical data are already available, this expenditure is very low compared with the planning costs of a large-scale exercise.
Distributed Lag Models: Examining Associations between the Built Environment and Health
Baek, Jonggyu; Sánchez, Brisa N.; Berrocal, Veronica J.; Sanchez-Vaznaugh, Emma V.
2016-01-01
Built environment factors constrain individual level behaviors and choices, and thus are receiving increasing attention to assess their influence on health. Traditional regression methods have been widely used to examine associations between built environment measures and health outcomes, where a fixed, pre-specified spatial scale (e.g., 1 mile buffer) is used to construct environment measures. However, the spatial scale for these associations remains largely unknown and misspecifying it introduces bias. We propose the use of distributed lag models (DLMs) to describe the association between built environment features and health as a function of distance from the locations of interest and circumvent a-priori selection of a spatial scale. Based on simulation studies, we demonstrate that traditional regression models produce associations biased away from the null when there is spatial correlation among the built environment features. Inference based on DLMs is robust under a range of scenarios of the built environment. We use this innovative application of DLMs to examine the association between the availability of convenience stores near California public schools, which may affect children’s dietary choices both through direct access to junk food and exposure to advertisement, and children’s body mass index z-scores (BMIz). PMID:26414942
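One common way to implement such a distributed lag model is to tabulate exposures in concentric distance rings and constrain the ring coefficients to a smooth spline basis, as in the sketch below. The variable names, the cubic-spline basis, and the ring construction are assumptions for illustration, not necessarily the authors' exact specification.

```python
import numpy as np
import statsmodels.api as sm
from patsy import dmatrix

def fit_dlm(y, ring_counts, ring_dists, df=4):
    """Distributed-lag sketch: effect of convenience stores as a smooth function
    of distance from school, instead of a single pre-specified buffer.

    y           : outcome, e.g. BMI z-scores, shape (n,)
    ring_counts : (n, L) counts of stores in L concentric distance rings
    ring_dists  : (L,) midpoint distance of each ring
    """
    basis = np.asarray(dmatrix("cr(d, df=%d) - 1" % df, {"d": ring_dists}))
    design = sm.add_constant(ring_counts @ basis)      # n x (df + 1) reduced design
    fit = sm.OLS(y, design).fit()
    beta_lag = basis @ fit.params[1:]                  # distance-specific effects
    return fit, beta_lag
```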
Psychometric Properties of the Perceived Wellness Culture and Environment Support Scale.
Melnyk, Bernadette Mazurek; Szalacha, Laura A; Amaya, Megan
2018-05-01
This study reports on the psychometric properties of the 11-item Perceived Wellness Culture and Environment Support Scale (PWCESS) and its relationship with employee healthy lifestyle beliefs and behaviors. Faculty and staff (N = 3959) at a large public university in the midwestern United States completed the PWCESS along with healthy lifestyle beliefs and behaviors scales. Data were randomly split into two halves; the first half was used to explore the PWCESS's validity and reliability, and the second half to confirm the findings. Principal components analysis indicated a unidimensional construct. The PWCESS was positively related to healthy lifestyle beliefs and behaviors, supporting the scale's validity. Confirmatory factor analysis supported the unidimensional construct (Cronbach's α = .92). Strong evidence supports the validity and reliability of the PWCESS. Future use of this scale could guide workplace intervention strategies to improve organizational wellness culture and employee health outcomes.
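The reported internal consistency (Cronbach's α = .92) can be reproduced from item-level data in a few lines; a generic sketch follows.

```python
import numpy as np

def cronbach_alpha(items):
    """Internal-consistency estimate for a unidimensional scale.

    items : array (n_respondents, n_items) of item scores.
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the total score
    return k / (k - 1) * (1 - item_vars / total_var)
```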
a Voxel-Based Metadata Structure for Change Detection in Point Clouds of Large-Scale Urban Areas
NASA Astrophysics Data System (ADS)
Gehrung, J.; Hebel, M.; Arens, M.; Stilla, U.
2018-05-01
Mobile laser scanning has not only the potential to create detailed representations of urban environments, but also to determine changes up to a very detailed level. An environment representation for change detection in large scale urban environments based on point clouds has drawbacks in terms of memory scalability. Volumes, however, are a promising building block for memory efficient change detection methods. The challenge of working with 3D occupancy grids is that the usual raycasting-based methods applied for their generation lead to artifacts caused by the traversal of unfavorable discretized space. These artifacts have the potential to distort the state of voxels in close proximity to planar structures. In this work we propose a raycasting approach that utilizes knowledge about planar surfaces to completely prevent this kind of artifacts. To demonstrate the capabilities of our approach, a method for the iterative volumetric approximation of point clouds that allows to speed up the raycasting by 36 percent is proposed.
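The raycasting step that produces the discretization artifacts discussed above is typically an Amanatides-Woo style 3-D DDA traversal; a generic sketch is shown below. The plane-aware variant proposed in the paper, which suppresses artifacts near planar surfaces, is not reproduced here, and the voxel size is an arbitrary illustrative choice.

```python
import numpy as np

def traverse_voxels(origin, endpoint, voxel_size=0.2, eps=1e-12):
    """Generic 3-D DDA ray traversal: voxels crossed between scanner and hit point."""
    origin = np.asarray(origin, dtype=float)
    endpoint = np.asarray(endpoint, dtype=float)
    direction = endpoint - origin
    length = np.linalg.norm(direction)
    direction = direction / (length + eps)

    voxel = np.floor(origin / voxel_size).astype(int)     # starting voxel index
    last = np.floor(endpoint / voxel_size).astype(int)    # voxel containing the hit point
    step = np.where(direction >= 0, 1, -1)

    next_boundary = (voxel + (step > 0)) * voxel_size
    safe_dir = np.where(np.abs(direction) > eps, direction, eps)
    t_max = np.where(np.abs(direction) > eps, (next_boundary - origin) / safe_dir, np.inf)
    t_delta = np.where(np.abs(direction) > eps, voxel_size / np.abs(direction), np.inf)

    visited = [tuple(voxel)]
    while not np.array_equal(voxel, last) and t_max.min() <= length:
        axis = int(np.argmin(t_max))       # cross the nearest voxel boundary
        voxel[axis] += step[axis]
        t_max[axis] += t_delta[axis]
        visited.append(tuple(voxel))
    return visited

# Example: voxels crossed by a ray from the scanner origin to a measured point.
print(traverse_voxels([0.0, 0.0, 0.0], [1.0, 0.3, 0.2]))
```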
Schlecht, Ulrich; Liu, Zhimin; Blundell, Jamie R; St Onge, Robert P; Levy, Sasha F
2017-05-25
Several large-scale efforts have systematically catalogued protein-protein interactions (PPIs) of a cell in a single environment. However, little is known about how the protein interactome changes across environmental perturbations. Current technologies, which assay one PPI at a time, are too low throughput to make it practical to study protein interactome dynamics. Here, we develop a highly parallel protein-protein interaction sequencing (PPiSeq) platform that uses a novel double barcoding system in conjunction with the dihydrofolate reductase protein-fragment complementation assay in Saccharomyces cerevisiae. PPiSeq detects PPIs at a rate that is on par with current assays and, in contrast with current methods, quantitatively scores PPIs with enough accuracy and sensitivity to detect changes across environments. Both PPI scoring and the bulk of strain construction can be performed with cell pools, making the assay scalable and easily reproduced across environments. PPiSeq is therefore a powerful new tool for large-scale investigations of dynamic PPIs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. G. Little
1999-03-01
The Idaho National Engineering and Environmental Laboratory (INEEL), through the US Department of Energy (DOE), has proposed that a large-scale wind test facility (LSWTF) be constructed to study, at full scale, the behavior of low-rise structures under simulated extreme wind conditions. To determine the need for, and potential benefits of, such a facility, the Idaho Operations Office of the DOE requested that the National Research Council (NRC) perform an independent assessment of the role and potential value of an LSWTF in the overall context of wind engineering research. The NRC established the Committee to Review the Need for a Large-Scale Test Facility for Research on the Effects of Extreme Winds on Structures, under the auspices of the Board on Infrastructure and the Constructed Environment, to perform this assessment. This report conveys the results of the committee's deliberations as well as its findings and recommendations. Data developed at large scale would enhance the understanding of how structures, particularly light-frame structures, are affected by extreme winds (e.g., hurricanes, tornadoes, severe thunderstorms, and other events). With a large-scale wind test facility, full-sized structures, such as site-built or manufactured housing and small commercial or industrial buildings, could be tested under a range of wind conditions in a controlled, repeatable environment. At this time, the US has no facility specifically constructed for this purpose. During the course of this study, the committee was confronted by three difficult questions: (1) does the lack of a facility equate to a need for the facility? (2) is need alone sufficient justification for the construction of a facility? and (3) would the benefits derived from information produced in an LSWTF justify the costs of producing that information? The committee's evaluation of the need and justification for an LSWTF was shaped by these realities.
Virtual workstation - A multimodal, stereoscopic display environment
NASA Astrophysics Data System (ADS)
Fisher, S. S.; McGreevy, M.; Humphries, J.; Robinett, W.
1987-01-01
A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture has been developed for use in a multipurpose interface environment. The system provides a multisensory, interactive display environment in which a user can virtually explore a 360-degree synthesized or remotely sensed environment and can viscerally interact with its components. Primary applications of the system are in telerobotics, management of large-scale integrated information systems, and human factors research. System configuration, application scenarios, and research directions are described.
Energy harvesting: small scale energy production from ambient sources
NASA Astrophysics Data System (ADS)
Yeatman, Eric M.
2009-03-01
Energy harvesting - the collection of otherwise unexploited energy in the local environment - is attracting increasing attention for the powering of electronic devices. While the power levels that can be reached are typically modest (microwatts to milliwatts), the key motivation is to avoid the need for battery replacement or recharging in portable or inaccessible devices. Wireless sensor networks are a particularly important application: the availability of essentially maintenance free sensor nodes, as enabled by energy harvesting, will greatly increase the feasibility of large scale networks, in the paradigm often known as pervasive sensing. Such pervasive sensing networks, used to monitor buildings, structures, outdoor environments or the human body, offer significant benefits for large scale energy efficiency, health and safety, and many other areas. Sources of energy for harvesting include light, temperature differences, and ambient motion, and a wide range of miniature energy harvesters based on these sources have been proposed or demonstrated. This paper reviews the principles and practice in miniature energy harvesters, and discusses trends, suitable applications, and possible future developments.
Van Bogaert, Peter; Van Heusden, Danny; Verspuy, Martijn; Wouters, Kristien; Slootmans, Stijn; Van der Straeten, Johnny; Van Aken, Paul; White, Mark
2017-03-01
Aim To investigate the impact of the quality improvement program "Productive Ward - Releasing Time to Care™" using nurses' and midwives' reports of practice environment, burnout, quality of care, job outcomes, as well as workload, decision latitude, social capital, and engagement. Background Despite the requirement for health systems to improve quality and the proliferation of quality improvement programs designed for healthcare, the empirical evidence supporting large-scale quality improvement programs impacting patient satisfaction, staff engagement, and quality care remains sparse. Method A longitudinal study was performed in a large 600-bed acute care university hospital at two measurement intervals for nurse practice environment, burnout, quality of care, and job outcomes, and at three measurement intervals for workload, decision latitude, social capital, and engagement, between June 2011 and November 2014. Results Positive results were identified in practice environment, decision latitude, and social capital. Less favorable results were identified in relation to perceived workload, emotional exhaustion, and vigor. Moreover, measures of quality of care and job satisfaction were reported less favorably. Conclusion This study highlights the need to further understand how to implement large-scale quality improvement programs so that they integrate with daily practices and promote "quality improvement" as "business as usual."
Virtual interface environment workstations
NASA Technical Reports Server (NTRS)
Fisher, S. S.; Wenzel, E. M.; Coler, C.; Mcgreevy, M. W.
1988-01-01
A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture has been developed at NASA's Ames Research Center for use as a multipurpose interface environment. This Virtual Interface Environment Workstation (VIEW) system provides a multisensory, interactive display environment in which a user can virtually explore a 360-degree synthesized or remotely sensed environment and can viscerally interact with its components. Primary applications of the system are in telerobotics, management of large-scale integrated information systems, and human factors research. System configuration, research scenarios, and research directions are described.
Large-scale machine learning and evaluation platform for real-time traffic surveillance
NASA Astrophysics Data System (ADS)
Eichel, Justin A.; Mishra, Akshaya; Miller, Nicholas; Jankovic, Nicholas; Thomas, Mohan A.; Abbott, Tyler; Swanson, Douglas; Keller, Joel
2016-09-01
In traffic engineering, vehicle detectors are trained on limited datasets, resulting in poor accuracy when deployed in real-world surveillance applications. Annotating large-scale high-quality datasets is challenging. Typically, these datasets have limited diversity; they do not reflect the real-world operating environment. There is a need for a large-scale, cloud-based positive and negative mining process and a large-scale learning and evaluation system for the application of automatic traffic measurements and classification. The proposed positive and negative mining process addresses the quality of crowd-sourced ground truth data through machine learning review and human feedback mechanisms. The proposed learning and evaluation system uses a distributed cloud computing framework to handle data-scaling issues associated with large numbers of samples and a high-dimensional feature space. The system is trained using AdaBoost on 1,000,000 Haar-like features extracted from 70,000 annotated video frames. The trained real-time vehicle detector achieves an accuracy of at least 95% for 1/2 of the time and about 78% for 19/20 of the time when tested on ˜7,500,000 video frames. At the end of 2016, the dataset is expected to have over 1 billion annotated video frames.
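For context, a minimal single-machine sketch of the feature/classifier combination named in the abstract (Haar-like features plus AdaBoost) using scikit-image and scikit-learn. The random placeholder patches, the feature type, and the number of estimators are assumptions for the sketch; the paper's distributed cloud pipeline and mining process are not reproduced here.

```python
import numpy as np
from skimage.transform import integral_image
from skimage.feature import haar_like_feature
from sklearn.ensemble import AdaBoostClassifier

def extract_haar_features(patches, feature_type="type-2-x"):
    """Haar-like features computed on fixed-size grayscale patches."""
    feats = []
    for patch in patches:
        ii = integral_image(patch)
        feats.append(haar_like_feature(ii, 0, 0, ii.shape[1], ii.shape[0],
                                       feature_type=feature_type))
    return np.asarray(feats)

# Placeholder data: in the paper the patches would be crops mined from
# annotated video frames; here we just use random 16x16 patches.
rng = np.random.default_rng(0)
vehicle_patches = [rng.random((16, 16)) for _ in range(50)]
background_patches = [rng.random((16, 16)) for _ in range(50)]

X = extract_haar_features(vehicle_patches + background_patches)
y = np.r_[np.ones(len(vehicle_patches)), np.zeros(len(background_patches))]

# AdaBoost over decision stumps (the default base learner), in the spirit
# of Viola-Jones style detectors.
clf = AdaBoostClassifier(n_estimators=100).fit(X, y)
print(clf.score(X, y))
```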
NASA Astrophysics Data System (ADS)
Lin, Y.; O'Malley, D.; Vesselinov, V. V.
2015-12-01
Inverse modeling seeks model parameters given a set of observed state variables. However, for many practical problems, because the observed data sets are often large and the model parameters are often numerous, conventional methods for solving the inverse problem can be computationally expensive. We have developed a new, computationally efficient Levenberg-Marquardt method for large-scale inverse modeling. Levenberg-Marquardt methods require the solution of a dense linear system of equations, which can be prohibitively expensive to compute for large-scale inverse problems. Our novel method projects the original large-scale linear problem down to a Krylov subspace, such that the dimensionality of the measurements can be significantly reduced. Furthermore, instead of solving the linear system for every Levenberg-Marquardt damping parameter, we store the Krylov subspace computed when solving for the first damping parameter and recycle it for all the following damping parameters. The efficiency of our new inverse modeling algorithm is significantly improved by these computational techniques. We apply this new inverse modeling method to invert for a random transmissivity field. Our algorithm is fast enough to solve for the distributed model parameters (transmissivity) at each computational node in the model domain. The inversion is also aided by the use of regularization techniques. The algorithm is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). Julia is an advanced high-level scientific programming language that allows for efficient memory management and utilization of high-performance computational resources. Compared with a Levenberg-Marquardt method using standard linear inversion techniques, our Levenberg-Marquardt method yields a speed-up ratio of 15 in a multi-core computational environment and a speed-up ratio of 45 in a single-core computational environment. Therefore, our new inverse modeling method is a powerful tool for large-scale applications.
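A minimal dense-matrix sketch in Python of the idea described above: build a Krylov basis from the gradient J^T r once per Levenberg-Marquardt iteration and recycle it for every damping parameter. The basis size, the Lanczos-with-full-reorthogonalization construction, and the function name are illustrative assumptions; this is not the Julia/MADS implementation.

```python
import numpy as np

def lm_step_krylov(J, r, lambdas, k=20):
    """One Levenberg-Marquardt iteration: build a k-dimensional Krylov basis
    once and reuse it to solve (J^T J + lam*I) delta = -J^T r for every
    damping parameter lam. A sketch, not the MADS implementation."""
    g = J.T @ r                                  # gradient J^T r
    V = np.zeros((J.shape[1], k))
    v = g / np.linalg.norm(g)
    for j in range(k):                           # Lanczos-style basis for
        V[:, j] = v                              # span{g, (J^T J) g, ...}
        w = J.T @ (J @ v)                        # apply J^T J matrix-free
        w -= V[:, :j + 1] @ (V[:, :j + 1].T @ w) # full reorthogonalization
        nrm = np.linalg.norm(w)
        if nrm < 1e-12:
            V = V[:, :j + 1]
            break
        v = w / nrm
    H = V.T @ (J.T @ (J @ V))                    # small projected Hessian
    b = -V.T @ g
    steps = {}
    for lam in lambdas:                          # recycle V for every lam
        y = np.linalg.solve(H + lam * np.eye(H.shape[0]), b)
        steps[lam] = V @ y
    return steps
```

The recycling happens in the final loop: the expensive basis construction is paid once, and each additional damping parameter only costs a small k-by-k solve.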
The problem of ecological scaling in spatially complex, nonequilibrium ecological systems [chapter 3]
Samuel A. Cushman; Jeremy Littell; Kevin McGarigal
2010-01-01
In the previous chapter we reviewed the challenges posed by spatial complexity and temporal disequilibrium to efforts to understand and predict the structure and dynamics of ecological systems. The central theme was that spatial variability in the environment and population processes fundamentally alters the interactions between species and their environments, largely...
VET Workers' Problem-Solving Skills in Technology-Rich Environments: European Approach
ERIC Educational Resources Information Center
Hämäläinen, Raija; Cincinnato, Sebastiano; Malin, Antero; De Wever, Bram
2014-01-01
The European workplace is challenging VET adults' problem-solving skills in technology-rich environments (TREs). So far, no international large-scale assessment data has been available for VET. The PIAAC data comprise the most comprehensive source of information on adults' skills to date. The present study (N = 50 369) focuses on gaining insight…
The Relative Emphasis of Play Rules between Experienced and Trainee Caregivers of Toddlers
ERIC Educational Resources Information Center
Gyöngy, Kinga
2017-01-01
Content analysis of a large-scale (N = 920) qualitative data set with MAXQDA12 from a nationwide questionnaire of nursery practitioners in Hungary was able to demonstrate various types of rules during free play: social, health and safety, and environment-related rules. Environment-related rules, which govern space utilisation in toddler groups,…
USDA-ARS?s Scientific Manuscript database
When large-scale restorations are undertaken using local genotypes, wild-collected sources often undergo a generation in an agronomic environment for seed increase. We have little information on how a single generation of agronomic production can alter seed success in restoration. In this study, we...
Examining the Characteristics of Student Postings That Are Liked and Linked in a CSCL Environment
ERIC Educational Resources Information Center
Makos, Alexandra; Lee, Kyungmee; Zingaro, Daniel
2015-01-01
This case study is the first iteration of a large-scale design-based research project to improve Pepper, an interactive discussion-based learning environment. In this phase, we designed and implemented two social features to scaffold positive learner interactivity behaviors: a "Like" button and linking tool. A mixed-methods approach was…
Large in-stream wood studies: A call for common metrics
Ellen Wohl; Daniel A. Cenderelli; Kathleen A. Dwire; Sandra E. Ryan-Burkett; Michael K. Young; Kurt D. Fausch
2010-01-01
During the past decade, research on large in-stream wood has expanded beyond North America's Pacific Northwest to diverse environments and has shifted toward increasingly holistic perspectives that incorporate processes of wood recruitment, retention, and loss at scales from channel segments to entire watersheds. Syntheses of this rapidly expanding literature can...
A Phenomenology of Learning Large: The Tutorial Sphere of xMOOC Video Lectures
ERIC Educational Resources Information Center
Adams, Catherine; Yin, Yin; Vargas Madriz, Luis Francisco; Mullen, C. Scott
2014-01-01
The current discourse surrounding Massive Open Online Courses (MOOCs) is powerful. Despite their rapid and widespread deployment, research has yet to confirm or refute some of the bold claims rationalizing the popularity and efficacy of these large-scale virtual learning environments. Also, MOOCs' reputed disruptive, game-changing potential…
5 years of experience with a large-scale mentoring program for medical students.
Pinilla, Severin; Pander, Tanja; von der Borch, Philip; Fischer, Martin R; Dimitriadis, Konstantinos
2015-01-01
In this paper we present our 5-year experience with a large-scale mentoring program for undergraduate medical students at the Ludwig Maximilians-Universität Munich (LMU). We implemented a two-tiered program with a peer-mentoring concept for preclinical students and a 1:1 mentoring concept for clinical students, aided by a fully automated online-based matching algorithm. Approximately 20-30% of each student cohort participates in our voluntary mentoring program. Defining ideal program evaluation strategies, recruiting mentors from beyond the academic environment, and accounting for the mentoring network reality remain challenging. We conclude that a two-tiered program is well accepted by students and faculty. In addition, the online-based matching seems to be effective for large-scale mentoring programs.
Mark A. Dietenberger
2010-01-01
Effective mitigation of external fires on structures can be achieved flexibly, economically, and aesthetically by (1) preventing large-area ignition on structures by avoiding close proximity of burning vegetation; and (2) stopping flame travel from firebrands landing on combustible building objects. Using bench-scale and mid-scale fire tests to obtain flammability...
Ignition and flame travel on realistic building and landscape objects in changing environments
Mark A. Dietenberger
2007-01-01
Effective mitigation of external fires on structures can be achieved flexibly, economically, and aesthetically by (1) preventing large-area ignition on structures from close proximity of burning vegetations and (2) stopping flame travel from firebrands landing on combustible building objects. In using bench-scale and mid-scale fire tests to obtain fire growth...
Effects of individual, community and landscape drivers on the dynamics of a wildland forest epidemic
Sarah E. Haas; J. Hall Cushman; Whalen W. Dillon; Nathan E. Rank; David M. Rizzo; Ross K. Meentemeyer
2016-01-01
The challenges posed by observing host-pathogen-environment interactions across large geographic extents and over meaningful time scales limit our ability to understand and manage wildland epidemics. We conducted a landscape-scale, longitudinal study designed to analyze the dynamics of sudden oak death (an emerging forest disease caused by Phytophthora...
Students' Attitudes towards Edmodo, a Social Learning Network: A Scale Development Study
ERIC Educational Resources Information Center
Yunkul, Eyup; Cankaya, Serkan
2017-01-01
Social Learning Networks (SLNs) are the developed forms of Social Network Sites (SNSs) adapted to educational environments, and they are used by quite a large population throughout the world. In addition, in related literature, there is no scale for the measurement of students' attitudes towards such sites. The purpose of this study was to develop…
The Breast and Prostate Cancer and Hormone-Related Gene Variant Study allows large-scale analyses of breast and prostate cancer risk in relation to genetic polymorphisms and gene-environment interactions that affect hormone metabolism.
Measuring the universe with high-precision large-scale structure
NASA Astrophysics Data System (ADS)
Mehta, Kushal Tushar
Baryon acoustic oscillations (BAOs) are used to obtain precision measurements of cosmological parameters from large-scale surveys. While robust against most systematics, there are certain theoretical uncertainties that can affect BAO and galaxy clustering measurements. In this thesis I use data from the Sloan Digital Sky Survey (SDSS) to measure cosmological parameters and use N-body and smoothed-particle hydrodynamic (SPH) simulations to measure the effect of theoretical uncertainties by using halo occupation distributions (HODs). I investigate the effect of galaxy bias on BAO measurements by creating mock galaxy catalogs from large N-body simulations at z = 1. I find that there is no additional shift in the acoustic scale (0.10% +/- 0.10%) for the less biased HODs (b < 3). I present the methodology and implementation of the simple one-step reconstruction technique introduced by Eisenstein et al. (2007) applied to biased tracers in N-body simulations. Reconstruction reduces the errorbars on the acoustic scale measurement by a factor of 1.5 - 2, and removes any additional shift due to galaxy bias for all HODs (0.07% +/- 0.15%). Padmanabhan et al. (2012) and Xu et al. (2012) use this reconstruction technique on the SDSS DR7 data to measure D_V(z = 0.35) (r_s^fid/r_s) = 1356 +/- 25 Mpc. Here I use this measurement in combination with measurements from the cosmic microwave background and the Supernova Legacy Survey to measure various cosmological parameters. I find the data consistent with a LambdaCDM Universe with a flat geometry. In particular, I measure H0 = 69.8 +/- 1.2 km/s/Mpc, w = -0.97 +/- 0.17, and Omega_K = -0.004 +/- 0.005 in the LambdaCDM, wCDM, and oCDM models respectively. Next, I measure the effect of large-scale (5 Mpc) halo environment density on the HOD by using an SPH simulation at z = 0, 0.35, 0.5, 0.75, 1.0. I do not find any significant dependence of the HOD on the halo environment density for different galaxy mass thresholds, red and blue galaxies, and at different redshifts. I use the MultiDark N-body simulation to measure the possible effect of environment density on the galaxy correlation function xi(r). I find that environment density enhances xi(r) by 3% at scales of 1 - 20 Mpc/h at z = 0 and up to 12% at 0.3 Mpc/h and 8% at 1 - 4 Mpc/h for z = 1.
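For reference, the distance in the quoted BAO constraint is the standard spherically averaged distance; its definition is not spelled out in the abstract, so the expression below is added for context (c is the speed of light, D_A the angular diameter distance, H(z) the Hubble parameter, and r_s the sound horizon with fiducial value r_s^fid).

```latex
\[
  D_V(z) \equiv \left[ (1+z)^2 D_A^2(z)\,\frac{c\,z}{H(z)} \right]^{1/3},
  \qquad
  D_V(0.35)\,\frac{r_s^{\mathrm{fid}}}{r_s} = 1356 \pm 25~\mathrm{Mpc}.
\]
```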
Bakken, Tor Haakon; Aase, Anne Guri; Hagen, Dagmar; Sundt, Håkon; Barton, David N; Lujala, Päivi
2014-07-01
Climate change and the needed reductions in the use of fossil fuels call for the development of renewable energy sources. However, renewable energy production, such as hydropower (both small- and large-scale) and wind power, has adverse impacts on the local environment by causing reductions in biodiversity and loss of habitats and species. This paper compares the environmental impacts of many small-scale hydropower plants with a few large-scale hydropower projects and one wind power farm, based on the same set of environmental parameters: land occupation, reduction in wilderness areas (INON), visibility, and impacts on red-listed species. Our basis for comparison was similar energy volumes produced, without considering the quality of the energy services provided. The results show that small-scale hydropower performs less favourably on all parameters except land occupation. The land occupation of large hydropower and wind power is in the range of 45-50 m²/MWh, which is more than two times larger than that of small-scale hydropower, where the large land occupation for large hydropower is explained by the extent of the reservoirs. On all three other parameters small-scale hydropower performs more than two times worse than both large hydropower and wind power. Wind power compares similarly to large-scale hydropower regarding land occupation, much better on the reduction in INON areas, and in the same range regarding red-listed species. Our results demonstrate that the selected four parameters provide a basis for further development of a fair and consistent comparison of impacts between the analysed renewable technologies. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Soils as sinks or sources for diffuse pollution of the water cycle
NASA Astrophysics Data System (ADS)
Grathwohl, Peter
2010-05-01
Numerous chemical compounds have been released into the environment by human activities and can nowadays be found everywhere, i.e. in the compartments water, soil, and air, at the poles and in high mountains. Examples of a global distribution of toxic compounds are the persistent organic pollutants (PCB, dioxins, PAH, fluorinated surfactants and flame retardants, etc.: "the Stockholm dirty dozen") but also mercury and other metals. Many of these compounds reached a global distribution via the atmosphere; others have been and are still directly applied to top soils at the large scale by agriculture or are released into groundwater at landfill sites or by discharge of treated or untreated waste waters. Sooner or later such compounds end up in the water cycle - often via an intermediate storage in soils. Pollutants in soils are leached by seepage waters, transferred to groundwater, and transported to rivers via groundwater flow. Adsorbed compounds may be transported from soils into surface waters by erosion processes and will end up in the sediments. Diffuse pollution of the subsurface environment not only reflects the history of the economic development of modern society but is still ongoing - e.g. the number of organic pollutants released into the environment is increasing even though the concentrations may decrease compared to the past. Evidence shows that many compounds are persistent in the subsurface environment over large time scales (up to centuries). Thus polluted soils already are, or may become, a future source for pollution of adjacent compartments such as the atmosphere and groundwater. A profound understanding of how diffuse pollutants are stored and processed in the subsurface environment is crucial to assess their long-term fate and transport at large scales. Thus integrated studies, e.g. at the catchment scale, and models are needed which couple not only the relevant compartments (soil - atmosphere - groundwater/surface waters) but also flow and reactive transport. Field observations must allow long-term monitoring (e.g. in hydrological observatories, TERENO etc.), new cross-compartment monitoring strategies need to be applied, and massively parallel numerical codes for prediction of reactive transport of potential water pollutants at the catchment scale have to be developed. This is also a prerequisite to assess the impact of climate change as well as land use change on future surface and groundwater quality.
ON THE STAR FORMATION PROPERTIES OF VOID GALAXIES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moorman, Crystal M.; Moreno, Jackeline; White, Amanda
2016-11-10
We measure the star formation properties of two large samples of galaxies from the SDSS in large-scale cosmic voids on timescales of 10 and 100 Myr, using H α emission line strengths and GALEX FUV fluxes, respectively. The first sample consists of 109,818 optically selected galaxies. We find that void galaxies in this sample have higher specific star formation rates (SSFRs; star formation rates per unit stellar mass) than similar stellar mass galaxies in denser regions. The second sample is a subset of the optically selected sample containing 8070 galaxies with reliable H i detections from ALFALFA. For the full H i detected sample, SSFRs do not vary systematically with large-scale environment. However, investigating only the H i detected dwarf galaxies reveals a trend toward higher SSFRs in voids. Furthermore, we estimate the star formation rate per unit H i mass (known as the star formation efficiency; SFE) of a galaxy, as a function of environment. For the overall H i detected population, we notice no environmental dependence. Limiting the sample to dwarf galaxies still does not reveal a statistically significant difference between SFEs in voids versus walls. These results suggest that void environments, on average, provide a nurturing environment for dwarf galaxy evolution, allowing for higher specific star formation rates while forming stars with similar efficiencies to those in walls.
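Restating, for clarity, the two quantities the abstract compares across environments (the specific star formation rate and the star formation efficiency), exactly as they are defined in words in the text:

```latex
\[
  \mathrm{SSFR} = \frac{\mathrm{SFR}}{M_{\star}},
  \qquad
  \mathrm{SFE} = \frac{\mathrm{SFR}}{M_{\mathrm{HI}}}.
\]
```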
Astakhov, Vadim
2009-01-01
Interest in the simulation of large-scale metabolic networks, species development, and the genesis of various diseases requires new simulation techniques to accommodate the high complexity of realistic biological networks. Information geometry and topological formalisms are proposed to analyze information processes. We analyze the complexity of large-scale biological networks as well as the transition of system functionality due to modifications in the system architecture, system environment, and system components. The dynamic core model is developed. The term dynamic core is used to define a set of causally related network functions. Delocalization of the dynamic core model provides a mathematical formalism to analyze the migration of specific functions in biosystems which undergo structural transitions induced by the environment. The term delocalization is used to describe these processes of migration. We constructed a holographic model with self-poetic dynamic cores which preserves functional properties under those transitions. Topological constraints such as Ricci flow and Pfaff dimension were found for statistical manifolds which represent biological networks. These constraints can provide insight on processes of degeneration and recovery which take place in large-scale networks. We would like to suggest that therapies which are able to effectively implement the estimated constraints will successfully adjust biological systems and recover altered functionality. Also, we mathematically formulate the hypothesis that there is a direct consistency between biological and chemical evolution. Any set of causal relations within a biological network has its dual reimplementation in the chemistry of the system environment.
Space Weather Research at the National Science Foundation
NASA Astrophysics Data System (ADS)
Moretto, T.
2015-12-01
There is growing recognition that the space environment can have substantial, deleterious impacts on society. Consequently, research enabling specification and forecasting of hazardous space effects has become of great importance and urgency. This research requires studying the entire Sun-Earth system to understand the coupling of regions all the way from the source of disturbances in the solar atmosphere to the Earth's upper atmosphere. The traditional, region-based structure of research programs in solar and space physics is ill suited to fully support the change in research directions that the problem of space weather dictates. On the observational side, dense, distributed networks of observations are required to capture the full large-scale dynamics of the space environment. However, the cost of implementing these is typically prohibitive, especially for measurements in space. Thus, by necessity, the implementation of such new capabilities needs to build on creative and unconventional solutions. A particularly powerful idea is the utilization of new developments in data engineering and informatics research (big data). These new technologies make it possible to build systems that can collect and process huge amounts of noisy and inaccurate data and extract from them useful information. The shift in emphasis towards system-level science for geospace also necessitates the development of large-scale and multi-scale models. The development of large-scale models capable of capturing the global dynamics of the Earth's space environment requires investment in research team efforts that go beyond what can typically be funded under the traditional grants programs. This calls for effective interdisciplinary collaboration and efficient leveraging of resources both nationally and internationally. This presentation will provide an overview of current and planned initiatives, programs, and activities at the National Science Foundation pertaining to space weather research.
Banana production systems: identification of alternative systems for more sustainable production.
Bellamy, Angelina Sanderson
2013-04-01
Large-scale, monoculture production systems dependent on synthetic fertilizers and pesticides, increase yields, but are costly and have deleterious impacts on human health and the environment. This research investigates variations in banana production practices in Costa Rica, to identify alternative systems that combine high productivity and profitability, with reduced reliance on agrochemicals. Farm workers were observed during daily production activities; 39 banana producers and 8 extension workers/researchers were interviewed; and a review of field experiments conducted by the National Banana Corporation between 1997 and 2002 was made. Correspondence analysis showed that there is no structured variation in large-scale banana producers' practices, but two other banana production systems were identified: a small-scale organic system and a small-scale conventional coffee-banana intercropped system. Field-scale research may reveal ways that these practices can be scaled up to achieve a productive and profitable system producing high-quality export bananas with fewer or no pesticides.
Real-time evolution of a large-scale relativistic jet
NASA Astrophysics Data System (ADS)
Martí, Josep; Luque-Escamilla, Pedro L.; Romero, Gustavo E.; Sánchez-Sutil, Juan R.; Muñoz-Arjonilla, Álvaro J.
2015-06-01
Context. Astrophysical jets are ubiquitous in the Universe on all scales, but their large-scale dynamics and evolution in time are hard to observe since they usually develop at a very slow pace. Aims: We aim to obtain the first observational proof of the expected large-scale evolution and interaction with the environment in an astrophysical jet. Only jets from microquasars offer a chance to witness the real-time, full-jet evolution within a human lifetime, since they combine a "short", few parsec length with relativistic velocities. Methods: The methodology of this work is based on a systematic recalibration of interferometric radio observations of microquasars available in public archives. In particular, radio observations of the microquasar GRS 1758-258 over less than two decades have provided the most striking results. Results: Significant morphological variations in the extended jet structure of GRS 1758-258 are reported here that were previously missed. Its northern radio lobe underwent a major morphological variation in which the hotspot became undetectable in 2001 and reappeared in the following years. The reported changes confirm the Galactic nature of the source. We tentatively interpret them in terms of the growth of instabilities in the jet flow. There is also evidence of a surrounding cocoon. These results can provide a testbed for models accounting for the evolution of jets and their interaction with the environment.
Cyclicity in Upper Mississippian Bangor Limestone, Blount County, Alabama
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronner, R.L.
1988-01-01
The Upper Mississippian (Chesterian) Bangor Limestone in Alabama consists of a thick, complex sequence of carbonate platform deposits. A continuous core through the Bangor on Blount Mountain in north-central Alabama provides the opportunity to analyze the unit for cyclicity and to identify controls on vertical facies sequence. Lithologies from the core represent four general environments of deposition: (1) subwave-base, open marine, (2) shoal, (3) lagoon, and (4) peritidal. Analysis of the vertical sequence of lithologies in the core indicates the presence of eight large-scale cycles dominated by subtidal deposits, but defined on the basis of peritidal caps. These large-scale cycles can be subdivided into 16 small-scale cycles that may be entirely subtidal but illustrate upward shallowing followed by rapid deepening. Large-scale cycles range from 33 to 136 ft thick, averaging 68 ft; small-scale cycles range from 5 to 80 ft thick and average 34 ft. Small-scale cycles have an average duration of approximately 125,000 years, which is compatible with Milankovitch periodicity. The large-scale cycles have an average duration of approximately 250,000 years, which may simply reflect variations in amplitude of sea level fluctuation or the influence of tectonic subsidence along the southeastern margin of the North American craton.
Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan
NASA Astrophysics Data System (ADS)
Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun
2017-04-01
Typhoon Morakot, which seriously attacked southern Taiwan, awakened public awareness of large-scale landslide disasters. Large-scale landslide disasters produce large quantities of sediment, which have negative effects on the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation / disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and videos will not only help people make appropriate decisions, but their processing and value-adding are also major concerns. The study tried to define some basic data formats / standards from the various types of data collected about these reservoirs and then to provide a management platform based on these formats / standards. Meanwhile, for practicality and convenience, the large-scale landslide disaster database system is built with both information-providing and information-receiving capabilities, so that users can use the system on different types of devices. IT technology progresses extremely quickly, and even the most modern system might become out of date at any time. In order to provide long-term service, the system reserves the possibility of user-defined data formats / standards and a user-defined system structure. The system established by this study is based on the HTML5 standard language and uses responsive web design technology, so that users can easily handle and further develop this large-scale landslide disaster database system.
A general explanation on the correlation of dark matter halo spin with the large-scale environment
NASA Astrophysics Data System (ADS)
Wang, Peng; Kang, Xi
2017-06-01
Both simulations and observations have found that the spin of a halo/galaxy is correlated with the large-scale environment, and in particular that halo spin flips in filaments. A consistent picture of halo spin evolution in different environments is still lacking. Using an N-body simulation, we find that the alignment of halo spin with its environment evolves continuously from sheets to clusters, and that the flip of halo spin happens both in filaments and nodes. The flip in filaments can be explained by halo formation time and the migrating time when the halo environment changes from sheet to filament. Low-mass haloes form first in sheets and migrate into filaments later, so their mass and spin growth inside filaments are lower, and the original spin remains parallel to the filament. High-mass haloes migrate into filaments first, and most of their mass and spin growth is obtained in filaments, so the resulting spin is perpendicular to the filament. Our results explain well the overall evolution of the cosmic web in the cold dark matter model and can be tested using high-redshift data. The scenario can also be tested against alternative models of dark matter, such as warm/hot dark matter, where structure formation will proceed in a different way.
Visualizing vascular structures in virtual environments
NASA Astrophysics Data System (ADS)
Wischgoll, Thomas
2013-01-01
In order to learn more about the cause of coronary heart diseases and develop diagnostic tools, the extraction and visualization of vascular structures from volumetric scans for further analysis is an important step. By determining a geometric representation of the vasculature, the geometry can be inspected and additional quantitative data calculated and incorporated into the visualization of the vasculature. To provide a more user-friendly visualization tool, virtual environment paradigms can be utilized. This paper describes techniques for interactive rendering of large-scale vascular structures within virtual environments. This can be applied to almost any virtual environment configuration, such as CAVE-type displays. Specifically, the tools presented in this paper were tested on a Barco I-Space and a large 62x108 inch passive projection screen with a Kinect sensor for user tracking.
Scattering from Marine Sediments in a Very Shallow Water Environment
2015-12-28
Report fragment (only partially recovered): an approach to scattering from marine sediments in a range-dependent, very shallow water waveguide, taking into account only large-scale changes of the environment. Keywords: reciprocity, integral equations, volume and roughness scattering. Approved for public release, distribution unlimited.
ERIC Educational Resources Information Center
Chu, Man-Wai; Babenko, Oksana; Cui, Ying; Leighton, Jacqueline P.
2014-01-01
The study examines the role that perceptions or impressions of learning environments and assessments play in students' performance on a large-scale standardized test. Hierarchical linear modeling (HLM) was used to test aspects of the Learning Errors and Formative Feedback model to determine how much variation in students' performance was explained…
vPELS: An E-Learning Social Environment for VLSI Design with Content Security Using DRM
ERIC Educational Resources Information Center
Dewan, Jahangir; Chowdhury, Morshed; Batten, Lynn
2014-01-01
This article provides a proposal for a personal e-learning system (vPELS, where "v" stands for VLSI: very large scale integrated circuit) architecture in the context of a social network environment for VLSI Design. The main objective of vPELS is to develop individual skills on a specific subject--say, VLSI--and share resources with peers.…
The Intelligent Control System and Experiments for an Unmanned Wave Glider.
Liao, Yulei; Wang, Leifeng; Li, Yiming; Li, Ye; Jiang, Quanquan
2016-01-01
Designing the control system of an Unmanned Wave Glider (UWG) is challenging since the vehicle is weakly maneuverable and subject to large time lags and large disturbances, which makes it difficult to establish an accurate mathematical model. Meanwhile, to complete marine environment monitoring autonomously on long time scales and over large spatial scales, the UWG places high demands on intelligence and reliability. This paper focuses on the "Ocean Rambler" UWG. First, the intelligent control system architecture is designed based on the cerebrum basic function combination zone theory and a hierarchical control method. The hardware and software design of the embedded motion control system is mainly discussed, and a motion control system based on a four-layer rational behavior model is proposed. Then, combined with the line-of-sight (LOS) method, a self-adapting PID guidance law is proposed to compensate for the steady-state error in path following of the UWG caused by marine environment disturbances, especially currents. Based on the S-surface control method, an improved S-surface heading controller is proposed to solve the heading control problem of the weakly maneuverable carrier under large disturbances. Finally, simulation experiments were carried out and the UWG completed autonomous path following and marine environment monitoring in sea trials. The simulation experiments and sea trial results prove that the proposed intelligent control system, guidance law, and controller have favorable control performance, and the feasibility and reliability of the designed intelligent control system of the UWG are verified.
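For illustration, a minimal Python sketch of an LOS guidance step combined with a gain-adapting PID heading controller, in the spirit of the guidance law described above. The lookahead distance, the gain values, and the tanh-based gain adaptation are assumptions made for the sketch, not the controller actually implemented on the "Ocean Rambler".

```python
import numpy as np

def los_desired_heading(pos, wp_prev, wp_next, lookahead=25.0):
    """Line-of-sight guidance: desired heading toward a point a fixed
    lookahead distance ahead on the path segment wp_prev -> wp_next."""
    path = np.asarray(wp_next, float) - np.asarray(wp_prev, float)
    path_angle = np.arctan2(path[1], path[0])
    d = np.asarray(pos, float) - np.asarray(wp_prev, float)
    # signed cross-track error relative to the path
    e = -d[0] * np.sin(path_angle) + d[1] * np.cos(path_angle)
    return path_angle + np.arctan2(-e, lookahead)

class AdaptivePID:
    """PID heading controller whose proportional gain is scaled with the
    error magnitude (a simple stand-in for a 'self-adapting' law)."""
    def __init__(self, kp=1.0, ki=0.02, kd=0.5, dt=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def step(self, heading, heading_des):
        err = np.arctan2(np.sin(heading_des - heading),
                         np.cos(heading_des - heading))  # wrap to [-pi, pi]
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        kp_eff = self.kp * (1.0 + 0.5 * np.tanh(abs(err)))  # mild adaptation
        return kp_eff * err + self.ki * self.integral + self.kd * deriv
```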
NASA Astrophysics Data System (ADS)
Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin
2012-08-01
Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. This research is driven by issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of a virtualisation-based simulation platform (VSIM) is first proposed. Then the article investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction, and (3) achieve fault-tolerant simulation.
Couriot, Ophélie; Hewison, A J Mark; Saïd, Sonia; Cagnacci, Francesca; Chamaillé-Jammes, Simon; Linnell, John D C; Mysterud, Atle; Peters, Wibke; Urbano, Ferdinando; Heurich, Marco; Kjellander, Petter; Nicoloso, Sandro; Berger, Anne; Sustr, Pavel; Kroeschel, Max; Soennichsen, Leif; Sandfort, Robin; Gehr, Benedikt; Morellet, Nicolas
2018-05-01
Much research on large herbivore movement has focused on the annual scale to distinguish between resident and migratory tactics, commonly assuming that individuals are sedentary at the within-season scale. However, apparently sedentary animals may occupy a number of sub-seasonal functional home ranges (sfHR), particularly when the environment is spatially heterogeneous and/or temporally unpredictable. The roe deer (Capreolus capreolus) experiences sharply contrasting environmental conditions due to its widespread distribution, but appears markedly sedentary over much of its range. Using GPS monitoring from 15 populations across Europe, we evaluated the propensity of this large herbivore to be truly sedentary at the seasonal scale in relation to variation in environmental conditions. We studied movement using net square displacement to identify the possible use of sfHR. We expected that roe deer should be less sedentary within seasons in heterogeneous and unpredictable environments, while migratory individuals should be seasonally more sedentary than residents. Our analyses revealed that, across the 15 populations, all individuals adopted a multi-range tactic, occupying between two and nine sfHR during a given season. In addition, we showed that (i) the number of sfHR was only marginally influenced by variation in resource distribution, but decreased with increasing sfHR size; and (ii) the distance between sfHR increased with increasing heterogeneity and predictability in resource distribution, as well as with increasing sfHR size. We suggest that the multi-range tactic is likely widespread among large herbivores, allowing animals to track spatio-temporal variation in resource distribution and, thereby, to cope with changes in their local environment.
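Net squared displacement, the statistic used above to study within-season movement, is simply the squared straight-line distance of each GPS fix from the first fix of the track. A minimal Python sketch follows; the column names, the distance threshold, and the crude range-shift flagging are illustrative assumptions, not the sub-seasonal functional home range (sfHR) identification method used in the study.

```python
import numpy as np
import pandas as pd

def net_squared_displacement(track: pd.DataFrame) -> pd.Series:
    """NSD of each GPS fix relative to the first fix of the track.
    Expects projected coordinates in metres in columns 'x' and 'y'."""
    x0, y0 = track["x"].iloc[0], track["y"].iloc[0]
    return (track["x"] - x0) ** 2 + (track["y"] - y0) ** 2

def range_shift_candidates(track: pd.DataFrame, threshold_m=2000.0):
    """Flag fixes where the net displacement (sqrt of NSD) jumps by more
    than threshold_m between consecutive fixes, as crude candidates for a
    shift between sub-seasonal ranges."""
    r = np.sqrt(net_squared_displacement(track))
    return track.index[r.diff().abs() > threshold_m]
```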
Effect of the Large Scale Environment on the Internal Dynamics of Early-Type Galaxies
NASA Astrophysics Data System (ADS)
Maubon, G.; Prugniel, Ph.
We have studied the population-density relation in very sparse environments, from poor clusters to isolated galaxies, and we find that early-type galaxies with a young stellar population are preferentially found in the lowest density environments. We show a marginal indication that this effect is due to an enhancement of star formation independent of the morphological segregation, but we failed to find any effect on the internal dynamics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Gang
Mid-latitude extreme weather events are responsible for a large part of climate-related damage. Yet large uncertainties remain in climate model projections of heat waves, droughts, and heavy rain/snow events on regional scales, limiting our ability to effectively use these projections for climate adaptation and mitigation. These uncertainties can be attributed to both the lack of spatial resolution in the models, and to the lack of a dynamical understanding of these extremes. The approach of this project is to relate the fine-scale features to the large scales in current climate simulations, seasonal re-forecasts, and climate change projections in a very wide range of models, including the atmospheric and coupled models of ECMWF over a range of horizontal resolutions (125 to 10 km), aqua-planet configurations of the Model for Prediction Across Scales and High Order Method Modeling Environments (resolutions ranging from 240 km to 7.5 km) with various physics suites, and selected CMIP5 model simulations. The large-scale circulation will be quantified both on the basis of the well tested preferred circulation regime approach, and very recently developed measures, the finite amplitude Wave Activity (FAWA) and its spectrum. The fine-scale structures related to extremes will be diagnosed following the latest approaches in the literature. The goal is to use the large-scale measures as indicators of the probability of occurrence of the finer scale structures, and hence extreme events. These indicators will then be applied to the CMIP5 models and time-slice projections of a future climate.
Automated Scheduling of Science Activities for Titan Encounters by Cassini
NASA Technical Reports Server (NTRS)
Ray, Trina L.; Knight, Russel L.; Mohr, Dave
2014-01-01
In an effort to demonstrate the efficacy of automated planning and scheduling techniques for large missions, we have adapted ASPEN (Activity Scheduling and Planning Environment) [1] and CLASP (Compressed Large-scale Activity Scheduling and Planning) [2] to the domain of scheduling high-level science goals into conflict-free operations plans for Titan encounters by the Cassini spacecraft.
ERIC Educational Resources Information Center
Kurup, Anitha; Maithreyi, R.
2012-01-01
Large-scale sequential research developments for identification and measurement of giftedness have received ample attention in the West, whereas India's response to this has largely been lukewarm. The wide variation in parents' abilities to provide enriched environments to nurture their children's potential makes it imperative for India to develop…
Reisner, A E
2005-11-01
The building and expansion of large-scale swine facilities has created considerable controversy in many neighboring communities, but to date, no systematic analysis has been done of the types of claims made during these conflicts. This study examined how local newspapers in one state covered the transition from the dominance of smaller, diversified swine operations to large, single-purpose pig production facilities. To look at publicly made statements concerning large-scale swine facilities (LSSF), the study collected all articles related to LSSF from 22 daily Illinois newspapers over a 3-yr period (a total of 1,737 articles). The most frequent sets of claims used by proponents of LSSF were that the environment was not harmed, that state regulations were sufficiently strict, and that the state economically needed this type of agriculture. The most frequent claims made by opponents were that LSSF harmed the environment and neighboring communities and that stricter regulations were needed. Proponents' claims were primarily defensive and, to some degree, underplayed the advantages of LSSF. Pro- and anti-LSSF groups were talking at cross-purposes, to some degree. Even across similar themes, those in favor of LSSF and those opposed were addressing different sets of concerns. The newspaper claims did not indicate any effective alliances forming between local anti-LSSF groups and national environmental or animal rights groups.
VizieR Online Data Catalog: Isolated galaxies, pairs and triplets (Argudo-Fernandez+, 2015)
NASA Astrophysics Data System (ADS)
Argudo-Fernandez, M.; Verley, S.; Bergond, G.; Duarte Puertas, S.; Ramos Carmona, E.; Sabater, J.; Fernandez, Lorenzo M.; Espada, D.; Sulentic, J.; Ruiz, J. E.; Leon, S.
2015-04-01
Catalogues of isolated galaxies, isolated pairs, and isolated triplets in the local Universe with positions, redshifts, and degrees of relation with their physical and large-scale environments. (5 data files).
Management applications of discontinuity theory
1. Human impacts on the environment are multifaceted and can occur across distinct spatiotemporal scales. Ecological responses to environmental change are therefore difficult to predict, and entail large degrees of uncertainty. Such uncertainty requires robust tools for management...
Harris, Michael J; Woo, Hyung-June
2008-11-01
Energetics of conformational changes experienced by an ATP-bound myosin head detached from actin was studied by all-atom explicit water umbrella sampling simulations. The statistics of coupling between large scale domain movements and smaller scale structural features were examined, including the closing of the ATP binding pocket, and a number of key hydrogen bond formations shown to play roles in structural and biochemical studies. The statistics for the ATP binding pocket open/close transition show an evolution of the relative stability from the open state in the early stages of the recovery stroke to the stable closed state after the stroke. The change in solvation environment of the fluorescence probe Trp507 (scallop numbering; 501 in Dictyostelium discoideum) indicates that the probe faithfully reflects the closing of the binding pocket as previously shown in experimental studies, while being directly coupled to roughly the early half of the overall large scale conformational change of the converter domain rotation. The free energy change of this solvation environment change, in particular, is -1.3 kcal/mol, in close agreement with experimental estimates. In addition, our results provide direct molecular level data allowing for interpretations of the fluorescence experiments of myosin conformational change in terms of the de-solvation of Trp side chain.
Dynamics and locomotion of flexible foils in a frictional environment
NASA Astrophysics Data System (ADS)
Wang, Xiaolin; Alben, Silas
2018-01-01
Over the past few decades, oscillating flexible foils have been used to study the physics of organismal propulsion in different fluid environments. Here, we extend this work to a study of flexible foils in a frictional environment. When the foil is oscillated by heaving at one end but is not free to locomote, the dynamics change from periodic to non-periodic and chaotic as the heaving amplitude increases or the bending rigidity decreases. For friction coefficients lying in a certain range, the transition passes through a sequence of N-periodic and asymmetric states before reaching chaotic dynamics. Resonant peaks are damped and shifted by friction and large heaving amplitudes, leading to bistable states. When the foil is free to locomote, the horizontal motion smoothes the resonant behaviours. For moderate frictional coefficients, steady but slow locomotion is obtained. For large transverse friction and small tangential friction corresponding to wheeled snake robots, faster locomotion is obtained. Travelling wave motions arise spontaneously, and move with horizontal speeds that scale as transverse friction coefficient to the power 1/4 and input power that scales as the transverse friction coefficient to the power 5/12. These scalings are consistent with a boundary layer form of the solutions near the foil's leading edge.
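The two scaling laws quoted above for the travelling-wave locomotion regime, written out explicitly as stated in the abstract (U is the horizontal locomotion speed, P_in the input power, and mu_t the transverse friction coefficient):

```latex
\[
  U \;\propto\; \mu_t^{1/4},
  \qquad
  P_{\mathrm{in}} \;\propto\; \mu_t^{5/12}.
\]
```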
Kongelf, Anine; Bandewar, Sunita V. S.; Bharat, Shalini; Collumbien, Martine
2015-01-01
Background In the last decade, community mobilisation (CM) interventions targeting female sex workers (FSWs) have been scaled-up in India’s national response to the HIV epidemic. This included the Bill and Melinda Gates Foundation’s Avahan programme which adopted a business approach to plan and manage implementation at scale. With the focus of evaluation efforts on measuring effectiveness and health impacts there has been little analysis thus far of the interaction of the CM interventions with the sex work industry in complex urban environments. Methods and Findings Between March and July 2012 semi-structured, in-depth interviews and focus group discussions were conducted with 63 HIV intervention implementers, to explore challenges of HIV prevention among FSWs in Mumbai. A thematic analysis identified contextual factors that impact CM implementation. Large-scale interventions are not only impacted by, but were shown to shape the dynamic social context. Registration practices and programme monitoring were experienced as stigmatising, reflected in shifting client preferences towards women not disclosing as ‘sex workers’. This combined with urban redevelopment and gentrification of traditional red light areas, forcing dispersal and more ‘hidden’ ways of solicitation, further challenging outreach and collectivisation. Participants reported that brothel owners and ‘pimps’ continued to restrict access to sex workers and the heterogeneous ‘community’ of FSWs remains fragmented with high levels of mobility. Stakeholder engagement was poor and mobilising around HIV prevention not compelling. Interventions largely failed to respond to community needs as strong target-orientation skewed activities towards those most easily measured and reported. Conclusion Large-scale interventions have been impacted by and contributed to an increasingly complex sex work environment in Mumbai, challenging outreach and mobilisation efforts. Sex workers remain a vulnerable and disempowered group needing continued support and more comprehensive services. PMID:25811484
Preliminary simulations of the large-scale environment during the FIRE cirrus IFO
NASA Technical Reports Server (NTRS)
Westphal, Douglas L.; Toon, Owen B.
1990-01-01
Large scale forcing (scales greater than 500 km) is the dominant factor in the generation, maintenance, and dissipation of cirrus cloud systems. However, the analyses of data acquired during the first Cirrus IFO have highlighted the importance of mesoscale processes (scales of 20 to 500 km) to the development of cirrus cloud systems. Unfortunately, Starr and Wylie found that the temporal and spatial resolution of the standard and supplemental rawinsonde data were insufficient to allow an explanation of all of the mesoscale cloud features that were present on 27-28 Oct. 1986. It is described how dynamic initialization, or 4-D data assimilation (FDDA), can provide a method to address this problem. The first steps towards application of FDDA to FIRE are also described.
Cormode, Graham; Dasgupta, Anirban; Goyal, Amit; Lee, Chi Hoon
2018-01-01
Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
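To make the LSH idea above concrete, here is a minimal, illustrative sketch in Python of one common LSH family (random-hyperplane hashing for cosine similarity). It is not the paper's Hadoop-based variants or optimizations; the class name, parameter values, and single-table bucket lookup are simplifying assumptions.

```python
import numpy as np
from collections import defaultdict

class CosineLSH:
    """Random-hyperplane LSH: similar vectors tend to share a hash signature."""

    def __init__(self, dim, n_bits=12, seed=0):
        rng = np.random.default_rng(seed)
        # Each row is a random hyperplane; the sign pattern forms the hash.
        self.planes = rng.normal(size=(n_bits, dim))
        self.buckets = defaultdict(list)

    def _signature(self, vec):
        return tuple((self.planes @ vec) > 0)

    def index(self, key, vec):
        self.buckets[self._signature(vec)].append((key, vec))

    def query(self, vec):
        # Candidate neighbours are the items sharing the same bucket.
        return [k for k, _ in self.buckets.get(self._signature(vec), [])]

# Usage: index random vectors and query with a near-duplicate of item 42.
lsh = CosineLSH(dim=64)
data = np.random.default_rng(1).normal(size=(1000, 64))
for i, v in enumerate(data):
    lsh.index(i, v)
probe = data[42] + 0.01 * np.random.default_rng(2).normal(size=64)
print(lsh.query(probe))  # candidate list very likely contains 42
```

In practice, multiple hash tables and banding are used to trade recall against the number of candidates, which is the kind of tuning the paper's variants address at Hadoop scale.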
Design and Implement of Astronomical Cloud Computing Environment In China-VO
NASA Astrophysics Data System (ADS)
Li, Changhua; Cui, Chenzhou; Mi, Linying; He, Boliang; Fan, Dongwei; Li, Shanshan; Yang, Sisi; Xu, Yunfei; Han, Jun; Chen, Junyi; Zhang, Hailong; Yu, Ce; Xiao, Jian; Wang, Chuanjun; Cao, Zihuang; Fan, Yufeng; Liu, Liang; Chen, Xiao; Song, Wenming; Du, Kangyu
2017-06-01
The astronomy cloud computing environment is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). Based on virtualization technology, the astronomy cloud computing environment was designed and implemented by the China-VO team. It consists of five distributed nodes across the mainland of China. Astronomers can get computing and storage resources in this cloud computing environment. Through this environment, astronomers can easily search and analyze astronomical data collected by different telescopes and data centers, and avoid large-scale dataset transportation.
Track-based event recognition in a realistic crowded environment
NASA Astrophysics Data System (ADS)
van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.
2014-10-01
Automatic detection of abnormal behavior in CCTV cameras is important to improve the security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), by the perpetrator following the victim, or by interaction with an accomplice before and after the incident (longer time scale). This paper focuses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks that are generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assist operators to find threatening behavior and enrich the selection of videos that are to be observed.
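As an illustration of the rule-based single-track classification step described above (the paper's actual features, thresholds, and rules are not given here; the function, feature choices, and threshold values below are assumptions), such a classifier might be sketched as follows.

```python
import numpy as np

def classify_track(positions, dt=0.1,
                   stop_speed=0.3, walk_speed=2.0, loiter_radius=3.0):
    """Assign a coarse action label to one pedestrian track.

    positions: (N, 2) array of ground-plane coordinates in metres.
    Thresholds are illustrative only, not taken from the paper.
    """
    positions = np.asarray(positions, dtype=float)
    steps = np.diff(positions, axis=0)
    speeds = np.linalg.norm(steps, axis=1) / dt        # m/s per step
    mean_speed = speeds.mean()
    # Half-diagonal of the track's bounding box: how far the person ranged.
    extent = np.linalg.norm(positions.max(0) - positions.min(0)) / 2.0

    if mean_speed < stop_speed:
        return "stop"
    if extent < loiter_radius:
        return "loiter"          # moving, but staying within a small area
    return "walk" if mean_speed < walk_speed else "run"

# Usage: a synthetic track moving steadily along x at about 1.6 m/s.
track = np.column_stack([np.linspace(0, 8, 50), np.zeros(50)])
print(classify_track(track))  # -> "walk"
```

Interaction events (pass, meet, merge, split) would similarly be expressed as rules over pairs of tracks, e.g. thresholds on mutual distance and relative heading over time.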
Scales of Star Formation: Does Local Environment Matter?
NASA Astrophysics Data System (ADS)
Bittle, Lauren
2018-01-01
I will present my work on measuring molecular gas properties in local universe galaxies to assess the impact of local environment on the gas and thus star formation. I will also discuss the gas properties on spatial scales that span an order of magnitude to best understand the layers of star formation processes. Local environments within these galaxies include external mechanisms from starburst supernova shells, spiral arm structure, and superstar cluster radiation. Observations of CO giant molecular clouds (GMCs) at ~150 pc resolution in IC 10, the Local Group dwarf starburst, probe the large-scale diffuse gas, some of which lies near supernova bubble ridges. We mapped CO clouds across the spiral NGC 7793 at intermediate scales of ~20 pc resolution with ALMA. With the clouds, we can test theories of cloud formation and destruction in relation to the spiral arm pattern and cluster population from the HST LEGUS analysis. Addressing the smallest scales, I will show results of 30 Doradus ALMA observations of sub-parsec dense molecular gas clumps only 15 pc away from the superstar cluster R136. Though star formation occurs directly from the collapse of the densest molecular gas, we test theories of scale-free star formation, which suggest a constant slope of the mass function from ~150 pc GMCs to sub-parsec clumps. Probing environments including starburst supernova shells, spiral arm structure, and superstar cluster radiation sheds light on how these local external mechanisms affect the molecular gas at various scales of star formation.
Human3.6M: Large Scale Datasets and Predictive Methods for 3D Human Sensing in Natural Environments.
Ionescu, Catalin; Papava, Dragos; Olaru, Vlad; Sminchisescu, Cristian
2014-07-01
We introduce a new dataset, Human3.6M, of 3.6 Million accurate 3D Human poses, acquired by recording the performance of 5 female and 6 male subjects, under 4 different viewpoints, for training realistic human sensing systems and for evaluating the next generation of human pose estimation models and algorithms. Besides increasing the size of the datasets in the current state-of-the-art by several orders of magnitude, we also aim to complement such datasets with a diverse set of motions and poses encountered as part of typical human activities (taking photos, talking on the phone, posing, greeting, eating, etc.), with additional synchronized image, human motion capture, and time of flight (depth) data, and with accurate 3D body scans of all the subject actors involved. We also provide controlled mixed reality evaluation scenarios where 3D human models are animated using motion capture and inserted using correct 3D geometry, in complex real environments, viewed with moving cameras, and under occlusion. Finally, we provide a set of large-scale statistical models and detailed evaluation baselines for the dataset illustrating its diversity and the scope for improvement by future work in the research community. Our experiments show that our best large-scale model can leverage our full training set to obtain a 20% improvement in performance compared to a training set of the scale of the largest existing public dataset for this problem. Yet the potential for improvement by leveraging higher capacity, more complex models with our large dataset, is substantially vaster and should stimulate future research. The dataset together with code for the associated large-scale learning models, features, visualization tools, as well as the evaluation server, is available online at http://vision.imar.ro/human3.6m.
Breast and Prostate Cancer and Hormone-Related Gene Variant Study
The Breast and Prostate Cancer and Hormone-Related Gene Variant Study allows large-scale analyses of breast and prostate cancer risk in relation to genetic polymorphisms and gene-environment interactions that affect hormone metabolism.
Enhanced ICP for the Registration of Large-Scale 3D Environment Models: An Experimental Study
Han, Jianda; Yin, Peng; He, Yuqing; Gu, Feng
2016-01-01
One of the main applications of mobile robots is the large-scale perception of the outdoor environment. One of the main challenges of this application is fusing environmental data obtained by multiple robots, especially heterogeneous robots. This paper proposes an enhanced iterative closest point (ICP) method for the fast and accurate registration of 3D environmental models. First, a hierarchical searching scheme is combined with the octree-based ICP algorithm. Second, an early-warning mechanism is used to perceive the local minimum problem. Third, a heuristic escape scheme based on sampled potential transformation vectors is used to avoid local minima and achieve optimal registration. Experiments involving one unmanned aerial vehicle and one unmanned surface vehicle were conducted to verify the proposed technique. The experimental results were compared with those of normal ICP registration algorithms to demonstrate the superior performance of the proposed method. PMID:26891298
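For orientation, the baseline point-to-point ICP iteration that the paper builds on can be sketched as below: nearest-neighbour matching followed by an SVD-based rigid alignment. This sketch deliberately omits the paper's octree-based hierarchical search, early-warning mechanism, and heuristic escape from local minima.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, n_iter=30):
    """Basic point-to-point ICP. source: (N, 3), target: (M, 3) arrays."""
    src = np.asarray(source, dtype=float).copy()
    target = np.asarray(target, dtype=float)
    tree = cKDTree(target)
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(n_iter):
        # 1. Match each source point to its nearest target point.
        _, idx = tree.query(src)
        matched = target[idx]
        # 2. Best rigid transform via SVD of the cross-covariance matrix.
        src_c, tgt_c = src.mean(0), matched.mean(0)
        H = (src - src_c).T @ (matched - tgt_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:            # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = tgt_c - R @ src_c
        # 3. Apply the incremental transform and accumulate it.
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Usage: recover a known rotation/translation applied to a random cloud.
rng = np.random.default_rng(0)
pts = rng.normal(size=(500, 3))
a = 0.2
R_true = np.array([[np.cos(a), -np.sin(a), 0],
                   [np.sin(a),  np.cos(a), 0],
                   [0, 0, 1]])
moved = pts @ R_true.T + np.array([0.5, -0.2, 0.1])
R_est, t_est = icp(pts, moved)
print(np.round(R_est, 3), np.round(t_est, 3))
```

Plain ICP of this kind is prone to the local minima the paper targets, which is exactly where the authors' early-warning and escape schemes come in.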
NASA Astrophysics Data System (ADS)
Hamada, Y.; O'Connor, B. L.
2012-12-01
Development in arid environments often results in the loss and degradation of the ephemeral streams that provide habitat and critical ecosystem functions such as water delivery, sediment transport, and groundwater recharge. Quantification of these ecosystem functions is challenging because of the episodic nature of runoff events in desert landscapes and the large spatial scale of watersheds that potentially can be impacted by large-scale development. Low-impact development guidelines and regulatory protection of ephemeral streams are often lacking due to the difficulty of accurately mapping and quantifying the critical functions of ephemeral streams at scales larger than individual reaches. Renewable energy development in arid regions has the potential to disturb ephemeral streams at the watershed scale, and it is necessary to develop environmental monitoring applications for ephemeral streams to help inform land management and regulatory actions aimed at protecting and mitigating for impacts related to large-scale land disturbances. This study focuses on developing remote sensing methodologies to identify and monitor impacts on ephemeral streams resulting from the land disturbance associated with utility-scale solar energy development in the desert southwest of the United States. Airborne very high resolution (VHR) multispectral imagery is used to produce stereoscopic, three-dimensional landscape models that can be used to (1) identify and map ephemeral stream channel networks, and (2) support analyses and models of hydrologic and sediment transport processes that pertain to the critical functionality of ephemeral streams. Spectral and statistical analyses are being developed to extract information about ephemeral channel location and extent, micro-topography, riparian vegetation, and soil moisture characteristics. This presentation will demonstrate initial results and provide a framework for future work associated with this project, for developing the field measurements necessary to verify remote sensing landscape models, and for generating hydrologic models and analyses.
Mechanisation of large-scale agricultural fields in developing countries - a review.
Onwude, Daniel I; Abdulstter, Rafia; Gomes, Chandima; Hashim, Norhashila
2016-09-01
Mechanisation of large-scale agricultural fields often requires the application of modern technologies such as mechanical power, automation, control and robotics. These technologies are generally associated with relatively well developed economies. The application of these technologies in some developing countries in Africa and Asia is limited by factors such as technology compatibility with the environment, availability of resources to facilitate the technology adoption, cost of technology purchase, government policies, adequacy of technology and appropriateness in addressing the needs of the population. As a result, many of the available resources have been used inadequately by farmers, who continue to rely mostly on conventional means of agricultural production, using traditional tools and equipment in most cases. This has led to low productivity and high cost of production, among others. Therefore this paper attempts to evaluate the application of present day technology and its limitations to the advancement of large-scale mechanisation in developing countries of Africa and Asia. Particular emphasis is given to a general understanding of the various levels of mechanisation, present day technology, its management and application to large-scale agricultural fields. This review also emphasises a future outlook that would enable a gradual, evolutionary and sustainable technological change. The study concludes that large-scale agricultural farm mechanisation for sustainable food production in Africa and Asia must be anchored on a coherent strategy based on the actual needs and priorities of the large-scale farmers. © 2016 Society of Chemical Industry.
Zhao, Shanrong; Prenger, Kurt; Smith, Lance
2013-01-01
RNA-Seq is becoming a promising replacement to microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by practically applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost effective, and open-source based tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and can be used out of the box to process Illumina RNA-Seq datasets. PMID:25937948
Klein, Brennan J; Li, Zhi; Durgin, Frank H
2016-04-01
What is the natural reference frame for seeing large-scale spatial scenes in locomotor action space? Prior studies indicate an asymmetric angular expansion in perceived direction in large-scale environments: Angular elevation relative to the horizon is perceptually exaggerated by a factor of 1.5, whereas azimuthal direction is exaggerated by a factor of about 1.25. Here participants made angular and spatial judgments when upright or on their sides to dissociate egocentric from allocentric reference frames. In Experiment 1, it was found that body orientation did not affect the magnitude of the up-down exaggeration of direction, suggesting that the relevant orientation reference frame for this directional bias is allocentric rather than egocentric. In Experiment 2, the comparison of large-scale horizontal and vertical extents was somewhat affected by viewer orientation, but only to the extent necessitated by the classic (5%) horizontal-vertical illusion (HVI) that is known to be retinotopic. Large-scale vertical extents continued to appear much larger than horizontal ground extents when observers lay sideways. When the visual world was reoriented in Experiment 3, the bias remained tied to the ground-based allocentric reference frame. The allocentric HVI is quantitatively consistent with differential angular exaggerations previously measured for elevation and azimuth in locomotor space. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Coronal mass ejections and their sheath regions in interplanetary space
NASA Astrophysics Data System (ADS)
Kilpua, Emilia; Koskinen, Hannu E. J.; Pulkkinen, Tuija I.
2017-11-01
Interplanetary coronal mass ejections (ICMEs) are large-scale heliospheric transients that originate from the Sun. When an ICME is sufficiently faster than the preceding solar wind, a shock wave develops ahead of the ICME. The turbulent region between the shock and the ICME is called the sheath region. ICMEs and their sheaths and shocks are all interesting structures from the fundamental plasma physics viewpoint. They are also key drivers of space weather disturbances in the heliosphere and planetary environments. ICME-driven shock waves can accelerate charged particles to high energies. Sheaths and ICMEs drive practically all intense geospace storms at the Earth, and they can also dramatically affect planetary radiation environments and atmospheres. This review focuses on the current understanding of observational signatures and properties of ICMEs and the associated sheath regions based on five decades of studies. In addition, we discuss modelling of ICMEs and many fundamental outstanding questions on their origin, evolution and effects, largely due to the limitations of single-spacecraft observations of these macro-scale structures. We also present the current understanding of the space weather consequences of these large-scale solar wind structures, including effects at the other Solar System planets and exoplanets. We especially emphasize the different origin, properties and consequences of the sheaths and ICMEs.
Juvrud, Joshua; Gredebäck, Gustaf; Åhs, Fredrik; Lerin, Nils; Nyström, Pär; Kastrati, Granit; Rosén, Jörgen
2018-01-01
There is a need for large-scale remote data collection in a controlled environment, and the in-home availability of virtual reality (VR) and the commercial availability of eye tracking for VR present unique and exciting opportunities for researchers. We propose and provide a proof-of-concept assessment of a robust system for large-scale in-home testing using consumer products that combines psychophysiological measures and VR, here referred to as a Virtual Lab. For the first time, this method is validated by correlating autonomic responses, skin conductance response (SCR), and pupillary dilation, in response to a spider, a beetle, and a ball using commercially available VR. Participants demonstrated greater SCR and pupillary responses to the spider, and the effect was dependent on the proximity of the stimuli to the participant, with a stronger response when the spider was close to the virtual self. We replicated these effects across two experiments and in separate physical room contexts to mimic variability in home environment. Together, these findings demonstrate the utility of pupil dilation as a marker of autonomic arousal and the feasibility to assess this in commercially available VR hardware and support a robust Virtual Lab tool for massive remote testing. PMID:29867318
Sex differences in virtual navigation influenced by scale and navigation experience.
Padilla, Lace M; Creem-Regehr, Sarah H; Stefanucci, Jeanine K; Cashdan, Elizabeth A
2017-04-01
The Morris water maze is a spatial abilities test adapted from the animal spatial cognition literature and has been studied in the context of sex differences in humans. This is because its standard design, which manipulates proximal (close) and distal (far) cues, applies to human navigation. However, virtual Morris water mazes test navigation skills on a scale that is vastly smaller than natural human navigation. Many researchers have argued that navigating in large and small scales is fundamentally different, and small-scale navigation might not simulate natural human navigation. Other work has suggested that navigation experience could influence spatial skills. To address the question of how individual differences influence navigational abilities in differently scaled environments, we employed both a large- (146.4 m in diameter) and a traditional- (36.6 m in diameter) scaled virtual Morris water maze along with a novel measure of navigation experience (lifetime mobility). We found sex differences on the small maze in the distal cue condition only, but in both cue-conditions on the large maze. Also, individual differences in navigation experience modulated navigation performance on the virtual water maze, showing that higher mobility was related to better performance with proximal cues for only females on the small maze, but for both males and females on the large maze.
Tropical Cyclone Information System
NASA Technical Reports Server (NTRS)
Li, P. Peggy; Knosp, Brian W.; Vu, Quoc A.; Yi, Chao; Hristova-Veleva, Svetla M.
2009-01-01
The JPL Tropical Cyclone Information System (TCIS) is a Web portal (http://tropicalcyclone.jpl.nasa.gov) that provides researchers with an extensive set of observed hurricane parameters together with large-scale and convection resolving model outputs. It provides a comprehensive set of high-resolution satellite, airborne, and in-situ observations in both image and data formats. Large-scale datasets depict the surrounding environmental parameters such as SST (Sea Surface Temperature) and aerosol loading. Model outputs and analysis tools are provided to evaluate model performance and compare observations from different platforms. The system pertains to the thermodynamic and microphysical structure of the storm, the air-sea interaction processes, and the larger-scale environment as depicted by ocean heat content and the aerosol loading of the environment. Currently, the TCIS is populated with satellite observations of all tropical cyclones observed globally during 2005. There is a plan to extend the database both forward in time till present as well as backward to 1998. The portal is powered by a MySQL database and an Apache/Tomcat Web server on a Linux system. The interactive graphic user interface is provided by Google Map.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jang, Seogjoo; Hoyer, Stephan; Fleming, Graham
2014-10-31
A generalized master equation (GME) governing quantum evolution of modular exciton density (MED) is derived for large scale light harvesting systems composed of weakly interacting modules of multiple chromophores. The GME-MED offers a practical framework to incorporate real time coherent quantum dynamics calculations of small length scales into dynamics over large length scales, and also provides a non-Markovian generalization and rigorous derivation of the Pauli master equation employing multichromophoric Förster resonance energy transfer rates. A test of the GME-MED for four sites of the Fenna-Matthews-Olson complex demonstrates how coherent dynamics of excitonic populations over coupled chromophores can be accurately described by transitions between subgroups (modules) of delocalized excitons. Application of the GME-MED to the exciton dynamics between a pair of light harvesting complexes in purple bacteria demonstrates its promise as a computationally efficient tool to investigate large scale exciton dynamics in complex environments.
ERIC Educational Resources Information Center
Asian American Legal Defense and Education Fund, 2009
2009-01-01
Since 2002, the New York City Department of Education (DOE) has attempted to reverse the city's severe drop-out crisis through a large scale restructuring of high schools, focused mainly on closing large, comprehensive high schools and replacing them with small high schools that offer a more personalized learning environment. Unfortunately, this…
Intermountain Cyclogenesis: a Climatology and Multiscale Case Studies
NASA Astrophysics Data System (ADS)
Lee, Tiros Peijiun
1995-11-01
A detailed study of Intermountain cyclones over the western United States is conducted through climatological and case studies. An eleven-year (1976-1986) statistical survey shows that Nevada cyclogenesis is mainly a springtime (March, April) event, while a secondary maximum of cyclogenesis frequency is found in November. Nearly 75% of the Nevada cyclogenesis events (177 out of 237 cases) take place under large-scale westerly to southerly flow aloft across the Sierra Nevada Mountains, while 24% of the events (57 out of 237 cases) occur under northwesterly flow aloft. A composite study of these two types of flow is shown to demonstrate how differences in large-scale topography affect Intermountain cyclogenesis processes. The result from a case study of 9-11 February 1984 reveals that an antecedent Nevada lee trough formed as a result of large-scale southwesterly flow aloft interacting with the underlying terrain well before the surface and upper-level troughs moved onshore. Subsequent cyclogenesis took place in situ along the axis of the trough as the center of large-scale quasi-geostrophic ascent/positive potential vorticity advection began to spread across the Sierra Nevada Mountains. As the cyclone moved downstream, it was observed to weaken well before reaching the Continental Divide while a new cyclonic development occurred east of the Rocky Mountains. It is shown that the weakening of the Intermountain cyclone was associated with the ongoing interaction between the Intermountain cyclone and large-scale topography and the progressive outrunning of the large-scale dynamical forcing aloft away from the surface cyclone center. An investigation of the large-scale evolution for the 26-29 January 1980 case, which developed beneath northwesterly flow aloft, further reveals that the underlying topography plays two major roles in contributing to the initial cyclogenesis: (1) to block and to retard cold, stable air east of the Continental Divide from rushing into the Great Basin region, and (2) to produce differential pressure falls across the Sierra Nevada Mountains (more along the eastern slopes) in response to increasing cross-mountain flow. Numerous transient shortwaves in the midtroposphere rapidly move across the Great Basin and the Rocky Mountains into the Plains States, while the Intermountain cyclone moves more slowly than the disturbances aloft. There is no downstream lee trough/cyclogenesis to the east of the Rockies during the investigation period since the leeside is characterized by cold, stable air. The third case study is made of an 11-14 December 1987 Intermountain cyclogenesis case which took place in an area of relatively warm and less stable environment near the Arizona-New Mexico border beneath northwesterly flow aloft. The ensuing interaction between the large-scale flow and underlying terrain allowed the surface cyclone to remain quasi-stationary for its entire 36 h life span. We also document a cold-season small-scale Catalina eddy development in the coastal southern California waters in this case. The eddy formed as the equatorward and northeasterly flow upstream of the coastal (San Rafael and Santa Ynez) mountains increased in the lower troposphere. Weak large-scale ascent in the mid- and upper-troposphere over the incipient eddy environment provided evidence of the orographic nature of the small-scale cyclone.
The eddy was eventually displaced seaward and weakened with the arrival of powerful large-scale subsidence and increasing northeasterly downslope flow at the lower levels that reached the coastal waters.
Nuclear EMP simulation for large-scale urban environments. FDTD for electrically large problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, William S.; Bull, Jeffrey S.; Wilcox, Trevor
2012-08-13
In case of a terrorist nuclear attack in a metropolitan area, EMP measurement could provide: (1) a prompt confirmation of the nature of the explosion (chemical or nuclear) for emergency response; and (2) characterization parameters of the device (reaction history, yield) for technical forensics. However, the urban environment could affect the fidelity of the prompt EMP measurement (as well as all other types of prompt measurement): (1) the nuclear EMP wavefront would no longer be coherent, due to incoherent production, attenuation, and propagation of gammas and electrons; and (2) EMP propagation from the source region outward would undergo complicated transmission, reflection, and diffraction processes. EMP simulation for electrically-large urban environments: (1) a coupled MCNP/FDTD (finite-difference time domain Maxwell solver) approach; and (2) FDTD tends to be limited to problems that are not 'too' large compared to the wavelengths of interest because of numerical dispersion and anisotropy. We use a higher-order low-dispersion, isotropic FDTD algorithm for EMP propagation.
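As a point of reference for the FDTD method named here (this is not the authors' coupled MCNP/FDTD code, and it uses the standard second-order Yee scheme rather than their higher-order low-dispersion variant), a minimal one-dimensional vacuum FDTD update looks like the following sketch; the grid size, Courant number, and source are illustrative choices.

```python
import numpy as np

# Minimal 1D FDTD (Yee) update in vacuum, normalized field units.
nx, nt = 400, 800
S = 0.5                      # Courant number c*dt/dx (<= 1 for stability)
Ez = np.zeros(nx)            # electric field at integer grid points
Hy = np.zeros(nx - 1)        # magnetic field at half-integer points

for n in range(nt):
    # Faraday's law: advance H from the spatial difference of E.
    Hy -= S * (Ez[1:] - Ez[:-1])
    # Ampere's law: advance E from the spatial difference of H.
    Ez[1:-1] -= S * (Hy[1:] - Hy[:-1])
    # Soft Gaussian source injected at one grid point.
    Ez[50] += np.exp(-((n - 60) / 15.0) ** 2)
    # The fixed end points act as perfectly reflecting walls in this sketch.

print(float(np.abs(Ez).max()))   # a pulse propagating along the grid
```

The second-order scheme's numerical dispersion grows with electrical size, which is precisely the limitation the abstract cites as the motivation for a higher-order low-dispersion algorithm.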
Large-scale deformed QRPA calculations of the gamma-ray strength function based on a Gogny force
NASA Astrophysics Data System (ADS)
Martini, M.; Goriely, S.; Hilaire, S.; Péru, S.; Minato, F.
2016-01-01
The dipole excitations of nuclei play an important role in nuclear astrophysics processes in connection with the photoabsorption and the radiative neutron capture that take place in stellar environments. We present here the results of a large-scale axially-symmetric deformed QRPA calculation of the γ-ray strength function based on the finite-range Gogny force. The newly determined γ-ray strength is compared with experimental photoabsorption data for spherical as well as deformed nuclei. Predictions of γ-ray strength functions and Maxwellian-averaged neutron capture rates for Sn isotopes are also discussed.
Performance of Sweetpotato for Bioregenerative Life Support
NASA Technical Reports Server (NTRS)
Barta, Daniel J.; Henderson, Keith E.; Mortley, Desmond G.; Henninger, Donald L.
2001-01-01
Sweetpotato was successfully grown to harvest maturity in a large-scale atmospherically-closed controlled environment chamber. Yield of edible biomass and capacity for contributing to air revitalization and water recovery were documented. Yield was slightly less than that found in smaller-scale studies, but this is not unusual (Wheeler 1999). Continued work is suggested to improve control of storage root initiation, bulking and vine growth.
ERIC Educational Resources Information Center
Lynch, Sharon Jo; Pyke, Curtis; Grafton, Bonnie Hansen
2012-01-01
This article provides an extended, comprehensive example of how teachers, schools, districts, and external factors (e.g., parental pressure and policy mandates) shape curriculum research in the U.S. It retrospectively examines how three different middle school curriculum units were implemented and scaled-up in a large, diverse school system. The…
Globus | Informatics Technology for Cancer Research (ITCR)
Globus software services provide secure cancer research data transfer, synchronization, and sharing in distributed environments at large scale. These services can be integrated into applications and research data gateways, leveraging Globus identity management, single sign-on, search, and authorization capabilities. Globus Genomics integrates Globus with the Galaxy genomics workflow engine and Amazon Web Services to enable cancer genomics analysis that can elastically scale compute resources with demand.
Lu, Zhixiang; Wei, Yongping; Feng, Qi; Xie, Jiali; Xiao, Honglang; Cheng, Guodong
2018-09-01
There is limited quantitative understanding of interactions between human and environmental systems over the millennial scale. We aim to reveal the co-evolutionary dynamics of the human-environment system in a river basin by simulating the water use and net primary production (NPP) allocation for human and environmental systems over the last 2000 years in the Heihe River basin (HRB) in northwest China. We partition the catchment total evapotranspiration (ET) into ET for human and environmental systems with a socio-hydrological framework and estimate the NPP for human and environmental systems using the Box-Lieth model, then classify the co-evolutionary processes of the human-environment system into distinct phases using the rate of change of NPP over time, and discover the trade-off or synergy relationships between them based on the elasticity of the change in NPP for humans to the change in NPP for the environment. The co-evolutionary dynamics of the human-environment system in the HRB can be divided into four periods: Phase I (Han Dynasty-Yuan Dynasty), predevelopment, characterized by nearly no trade-offs between human and environment; Phase II (Yuan Dynasty-RC), slow agricultural development, characterized by a small human win due to small trade-offs between human and environment; Phase III (RC-2000), rapid agricultural development, characterized by a large human win due to large trade-offs between human and environment; and Phase IV (2000-2010), a rebalance, characterized by large human wins with a small environment win due to synergies, although these occurred very occasionally. This study provides a quantitative approach to describe the co-evolution of the human-environment system from the perspective of trade-offs and synergies at the millennial scale for the first time. The relationships between humans and environment changed from trade-off to synergy with the implementation of the water reallocation scheme in 2000. These findings improve the understanding of how humans influence environmental systems and respond to environmental stresses. Copyright © 2018 Elsevier B.V. All rights reserved.
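A minimal sketch of the kind of elasticity diagnostic described above, under the assumption that elasticity is the relative change in human-system NPP divided by the relative change in environment-system NPP (the exact definition, thresholds, and data are the authors'): a negative value would then read as a trade-off and a positive value as a synergy.

```python
import numpy as np

def npp_elasticity(npp_human, npp_env):
    """Elasticity of human-system NPP with respect to environment-system NPP.

    npp_human, npp_env: 1D arrays of NPP over consecutive periods.
    Returns one elasticity value per interval, using the assumed definition:
    (relative change in human NPP) / (relative change in environment NPP).
    """
    npp_human = np.asarray(npp_human, dtype=float)
    npp_env = np.asarray(npp_env, dtype=float)
    rel_h = np.diff(npp_human) / npp_human[:-1]
    rel_e = np.diff(npp_env) / npp_env[:-1]
    return rel_h / rel_e

# Illustrative numbers only: human NPP rising while environmental NPP falls
# gives negative elasticities, read here as a trade-off phase.
print(npp_elasticity([10, 14, 20], [90, 85, 78]))
```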
A place meaning scale for tropical marine settings.
Wynveen, Christopher J; Kyle, Gerard T
2015-01-01
Over the past 20 years, most of the worldwide hectares set aside for environmental protection have been added to marine protected areas. Moreover, these areas are under tremendous pressure from negative anthropogenic impacts. Given this growth and pressure, there is a need to increase the understanding of the connection between people and marine environments in order to better manage the resource. One construct that researchers have used to understand human-environment connections is place meanings. Place meanings reflect the value and significance of a setting to individuals. Most investigations of place meanings have been confined to terrestrial settings. Moreover, most studies have had small sample sizes or have used place attachment scales as a proxy to gauge the meanings individuals ascribe to a setting. Hence, it has become necessary to develop a place meaning scale for use with large samples and for use by those who are concerned about the management of marine environments. Therefore, the purpose of this investigation was to develop a scale to measure the importance people associate with the meanings they ascribe to tropical marine settings and empirically test the scale using two independent samples; that is, Great Barrier Reef Marine Park and the Florida Keys National Marine Sanctuary stakeholders.
Galaxy clusters in local Universe simulations without density constraints: a long uphill struggle
NASA Astrophysics Data System (ADS)
Sorce, Jenny G.
2018-06-01
Galaxy clusters are excellent cosmological probes provided that their formation and evolution within the large-scale environment are precisely understood. Studies with simulated galaxy clusters have therefore flourished. However, detailed comparisons between simulated and observed clusters and their population - the galaxies - are complicated by the diversity of clusters and their surrounding environment. An original way, initiated by Bertschinger as early as 1987, to legitimize the one-to-one comparison exercise down to the details is to produce simulations constrained to resemble the cluster under study within its large-scale environment. Subsequently, several methods have emerged to produce simulations that look like the local Universe. This paper highlights one of these methods and its essential steps to get simulations that not only resemble the local Large Scale Structure but that also host the local clusters. It includes a new modeling of the radial peculiar velocity uncertainties to remove the observed correlation, with distance from us, between the decrease of the simulated cluster masses and the decrease in the amount of data used as constraints. A particularity of this method is that it uses solely radial peculiar velocities as constraints: no additional density constraints are required to get local cluster simulacra. The new resulting simulations host dark matter halos that match the most prominent local clusters such as Coma. Zoom-in simulations of the latter and of a volume larger than the 30 h-1 Mpc radius inner sphere now become possible to study local clusters and their effects. Mapping the local Sunyaev-Zel'dovich and Sachs-Wolfe effects can follow.
The effect of accretion environment at large radius on hot accretion flows
NASA Astrophysics Data System (ADS)
Yang, Xiao-Hong; Bu, De-Fu
2018-05-01
We study the effects of the accretion environment (gas density, temperature, and angular momentum) at large radii (˜10 pc) on the luminosity of hot accretion flows. The radiative feedback effects from the accretion flow on the accretion environment are also self-consistently taken into account. We find that slowly rotating flows at large radii can significantly deviate from Bondi accretion when radiation heating and cooling are considered. We further find that when the temperature of the environment gas is low (e.g. T = 2 × 10^7 K), the luminosity of hot accretion flows is high. When the temperature of the gas is high (e.g. T ≥ 4 × 10^7 K), the luminosity of hot accretion flows significantly decreases. The environment gas density can also significantly influence the luminosity of accretion flows. When the density is higher than ˜4 × 10^-22 g cm^-3 and the temperature is lower than 2 × 10^7 K, a hot accretion flow with luminosity lower than 2 per cent L_Edd is not present. Therefore, the parsec-scale environment density and temperature are two important parameters determining the luminosity. The results are also useful for the subgrid models adopted by cosmological simulations.
Advanced computer architecture for large-scale real-time applications.
DOT National Transportation Integrated Search
1973-04-01
Air traffic control automation is identified as a crucial problem which provides a complex, real-time computer application environment. A novel computer architecture in the form of a pipeline associative processor is conceived to achieve greater perf...
Retrieving cosmological signal using cosmic flows
NASA Astrophysics Data System (ADS)
Bouillot, V.; Alimi, J.-M.
2011-12-01
To understand the origin of the anomalously high bulk flow at large scales, we use very large simulations in various cosmological models. To disentangle between cosmological and environmental effects, we select samples with bulk flow profiles similar to the observational data of Watkins et al. (2009), which exhibit a maximum in the bulk flow at 53 h^{-1} Mpc. The estimation of the cosmological parameters Ω_M and σ_8, done on those samples, is correct when based on the rms mass fluctuation, whereas it gives completely false values when based on bulk flow measurements, hence showing a dependence of the velocity fields on larger scales. By drawing a clear link between velocity fields at 53 h^{-1} Mpc and asymmetric patterns of the density field at 85 h^{-1} Mpc, we show that the bulk flow can depend largely on the environment. The cosmological signal is retrieved by studying the convergence of the bulk flow towards the linear prediction at very large scales (˜ 150 h^{-1} Mpc).
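For orientation, a bulk flow at radius R is often estimated as the mean peculiar velocity of tracers inside a sphere of that radius; the bare-bones top-hat version below is only a sketch (not the weighting or maximum-likelihood estimator used in the cited measurement), with synthetic placeholder data.

```python
import numpy as np

def bulk_flow(positions, velocities, radius):
    """Top-hat bulk flow: mean peculiar velocity of tracers within `radius`.

    positions:  (N, 3) comoving coordinates, e.g. in h^-1 Mpc
    velocities: (N, 3) peculiar velocities, e.g. in km/s
    Returns the bulk-flow vector and its amplitude.
    """
    positions = np.asarray(positions, dtype=float)
    velocities = np.asarray(velocities, dtype=float)
    inside = np.linalg.norm(positions, axis=1) <= radius
    v_bulk = velocities[inside].mean(axis=0)
    return v_bulk, np.linalg.norm(v_bulk)

# Usage with synthetic tracers: a 300 km/s coherent flow plus random motions.
rng = np.random.default_rng(0)
pos = rng.uniform(-100, 100, size=(20000, 3))
vel = rng.normal(0, 300, size=(20000, 3)) + np.array([300.0, 0.0, 0.0])
print(bulk_flow(pos, vel, radius=53.0)[1])   # close to 300 km/s
```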
Overview of Opportunities for Co-Location of Solar Energy Technologies and Vegetation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macknick, Jordan; Beatty, Brenda; Hill, Graham
2013-12-01
Large-scale solar facilities have the potential to contribute significantly to national electricity production. Many solar installations are large-scale or utility-scale, with a capacity over 1 MW and connected directly to the electric grid. Large-scale solar facilities offer an opportunity to achieve economies of scale in solar deployment, yet there have been concerns about the amount of land required for solar projects and the impact of solar projects on local habitat. During the site preparation phase for utility-scale solar facilities, developers often grade land and remove all vegetation to minimize installation and operational costs, prevent plants from shading panels, and minimize potential fire or wildlife risks. However, the common site preparation practice of removing vegetation can be avoided in certain circumstances, and there have been successful examples where solar facilities have been co-located with agricultural operations or have native vegetation growing beneath the panels. In this study we outline some of the impacts that large-scale solar facilities can have on the local environment, provide examples of installations where impacts have been minimized through co-location with vegetation, characterize the types of co-location, and give an overview of the potential benefits from co-location of solar energy projects and vegetation. The varieties of co-location can be replicated or modified for site-specific use at other solar energy installations around the world. We conclude with opportunities to improve upon our understanding of ways to reduce the environmental impacts of large-scale solar installations.
The XMM Large Scale Structure Survey
NASA Astrophysics Data System (ADS)
Pierre, Marguerite
2005-10-01
We propose to complete, by an additional 5 deg2, the XMM-LSS Survey region overlying the Spitzer/SWIRE field. This field already has CFHTLS and Integral coverage, and will encompass about 10 deg2. The resulting multi-wavelength medium-depth survey, which complements XMM and Chandra deep surveys, will provide a unique view of large-scale structure over a wide range of redshift, and will show active galaxies in the full range of environments. The complete coverage by optical and IR surveys provides high-quality photometric redshifts, so that cosmological results can quickly be extracted. In the spirit of a Legacy survey, we will make the raw X-ray data immediately public. Multi-band catalogues and images will also be made available on short time scales.
Automated Decomposition of Model-based Learning Problems
NASA Technical Reports Server (NTRS)
Williams, Brian C.; Millar, Bill
1996-01-01
A new generation of sensor rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pugh, C.E.; Bass, B.R.; Keeney, J.A.
This report contains 40 papers that were presented at the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at the Pollard Auditorium, Oak Ridge, Tennessee, during the week of October 26-29, 1992. The papers are printed in the order of their presentation in each session and describe recent large-scale fracture (brittle and/or ductile) experiments, analyses of these experiments, and comparisons between predictions and experimental results. The goal of the meeting was to allow international experts to examine the fracture behavior of various materials and structures under conditions relevant to nuclear reactor components and operating environments. The emphasis was on the ability of various fracture models and analysis methods to predict the wide range of experimental data now available. The individual papers have been cataloged separately.
Development of fire test methods for airplane interior materials
NASA Technical Reports Server (NTRS)
Tustin, E. A.
1978-01-01
Fire tests were conducted in a 737 airplane fuselage at NASA-JSC to characterize jet fuel fires in open steel pans (simulating post-crash fire sources and a ruptured airplane fuselage) and to characterize fires in some common combustibles (simulating in-flight fire sources). Design post-crash and in-flight fire source selections were based on these data. Large panels of airplane interior materials were exposed to closely-controlled large scale heating simulations of the two design fire sources in a Boeing fire test facility utilizing a surplused 707 fuselage section. Small samples of the same airplane materials were tested by several laboratory fire test methods. Large scale and laboratory scale data were examined for correlative factors. Published data for dangerous hazard levels in a fire environment were used as the basis for developing a method to select the most desirable material where trade-offs in heat, smoke and gaseous toxicant evolution must be considered.
NASA's Information Power Grid: Large Scale Distributed Computing and Data Management
NASA Technical Reports Server (NTRS)
Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)
2001-01-01
Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.
Gaussian processes for personalized e-health monitoring with wearable sensors.
Clifton, Lei; Clifton, David A; Pimentel, Marco A F; Watkinson, Peter J; Tarassenko, Lionel
2013-01-01
Advances in wearable sensing and communications infrastructure have allowed the widespread development of prototype medical devices for patient monitoring. However, such devices have not penetrated into clinical practice, primarily due to a lack of research into "intelligent" analysis methods that are sufficiently robust to support large-scale deployment. Existing systems are typically plagued by large false-alarm rates and an inability to cope with sensor artifact in a principled manner. This paper has two aims: 1) proposal of a novel, patient-personalized system for analysis and inference in the presence of data uncertainty, typically caused by sensor artifact and data incompleteness; 2) demonstration of the method using a large-scale clinical study in which 200 patients have been monitored using the proposed system. The latter provides much-needed evidence that personalized e-health monitoring is feasible within an actual clinical environment, at scale, and that the method is capable of improving patient outcomes via personalized healthcare.
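By way of illustration only (a generic Gaussian-process regression over a single vital-sign channel, not the patient-personalized inference system described in the paper), scikit-learn could be used as follows; the kernel choice, time grid, and heart-rate data are placeholders.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Irregularly sampled heart-rate observations (synthetic placeholders).
rng = np.random.default_rng(0)
t_obs = np.sort(rng.uniform(0, 24, 40))[:, None]              # hours
hr_obs = 70 + 5 * np.sin(2 * np.pi * t_obs.ravel() / 24) + rng.normal(0, 1.5, 40)

# RBF kernel for smooth physiological drift + white noise for sensor artifact.
kernel = 1.0 * RBF(length_scale=3.0) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t_obs, hr_obs)

# Posterior mean and uncertainty on a dense grid; wide intervals flag
# periods where the data are missing or unreliable.
t_grid = np.linspace(0, 24, 200)[:, None]
mean, std = gp.predict(t_grid, return_std=True)
print(mean[:3], std[:3])
```

The appeal of the GP framing in this setting is that the predictive variance gives a principled way to down-weight artifact-corrupted or missing samples rather than triggering a hard alarm.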
Towards building high performance medical image management system for clinical trials
NASA Astrophysics Data System (ADS)
Wang, Fusheng; Lee, Rubao; Zhang, Xiaodong; Saltz, Joel
2011-03-01
Medical image based biomarkers are being established for therapeutic cancer clinical trials, where image assessment is among the essential tasks. Large-scale image assessment is often performed by a large group of experts who retrieve images from a centralized image repository to workstations to mark up and annotate images. In such an environment, it is critical to provide a high performance image management system that supports efficient concurrent image retrievals in a distributed environment. There are several major challenges: high throughput of large-scale image data over the Internet from the server for multiple concurrent client users, efficient communication protocols for transporting data, and effective management of versioning of data for audit trails. We study the major bottlenecks for such a system, and propose and evaluate a solution that uses a hybrid image storage with solid state drives and hard disk drives, RESTful Web Services based protocols for exchanging image data, and a database based versioning scheme for efficient archive of image revision history. Our experiments show promising results of our methods, and our work provides a guideline for building enterprise level high performance medical image management systems.
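As a toy illustration of a database-backed versioning scheme of the general kind mentioned (the paper's actual schema and RESTful protocol are not reproduced here; the table and column names below are made up), each revision can be stored as a new row so that the full audit trail is preserved:

```python
import sqlite3

# Toy schema: every revision of an image's markup is a new row;
# the latest version is simply the maximum version number per image.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE image_revision (
        image_id  TEXT NOT NULL,
        version   INTEGER NOT NULL,
        author    TEXT,
        created   TEXT DEFAULT CURRENT_TIMESTAMP,
        payload   BLOB,
        PRIMARY KEY (image_id, version)
    )
""")

def save_revision(image_id, author, payload):
    # Next version = current max + 1; old rows are never overwritten (audit trail).
    cur = conn.execute(
        "SELECT COALESCE(MAX(version), 0) + 1 FROM image_revision WHERE image_id = ?",
        (image_id,))
    version = cur.fetchone()[0]
    conn.execute(
        "INSERT INTO image_revision (image_id, version, author, payload) "
        "VALUES (?, ?, ?, ?)", (image_id, version, author, payload))
    return version

def latest_revision(image_id):
    cur = conn.execute(
        "SELECT version, author, payload FROM image_revision "
        "WHERE image_id = ? ORDER BY version DESC LIMIT 1", (image_id,))
    return cur.fetchone()

print(save_revision("scan-001", "reader_a", b"markup v1"))   # -> 1
print(save_revision("scan-001", "reader_b", b"markup v2"))   # -> 2
print(latest_revision("scan-001"))
```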
Micromégas: Altered Body-Environment Scaling in Literary Fiction.
Dieguez, Sebastian
2016-01-01
Architectonic embodiment postulates a bidirectional link between bodily awareness and the architectural environment. The standard size and features of the human body, for instance, are thought to influence the structure of interiors and buildings, as well as their perception and appreciation. Whereas architectural practice and theory, the visual arts and more recently the cognitive sciences have explored this relationship of humans with their crafted environments, many fictional literary works have long experimented with alterations of body-environment scaling. This so-called Gulliver theme - popular in the science-fiction genre but also in children's literature and philosophical satire - reveals, as a recurrent thought-experiment, our preoccupation with proportions and our fascination for the infinitely small and large. Here I provide an overview of the altered scaling theme in literature, including classics such as Voltaire's Micromégas, Swift's Gulliver's Travels, Carroll's Alice, and Matheson's The Shrinking Man, closely examining issues relevant to architectonic embodiment such as bodily, perceptual, cognitive, affective, and social changes related to alterations in body size relative to people, objects and architectural environments. I next provide a taxonomy of the Gulliver theme and highlight its main psychological features, and then proceed to review relevant work from cognitive science. Although fictional alterations of body-environment scaling far outreach current possibilities in experimental research, I argue that the peripetiae and morals outlined in the literary realm, as products of the human imagination, provide a unique window into the folk-psychology of body and space.
Wu, Junjun; Du, Guocheng; Zhou, Jingwen; Chen, Jian
2014-10-20
Flavonoids possess pharmaceutical potential due to their health-promoting activities. The complex structures of these products make extraction from plants difficult, and chemical synthesis is limited because of the use of many toxic solvents. Microbial production offers an alternate way to produce these compounds on an industrial scale in a more economical and environment-friendly manner. However, at present microbial production has been achieved only on a laboratory scale and improvements and scale-up of these processes remain challenging. Naringenin and pinocembrin, which are flavonoid scaffolds and precursors for most of the flavonoids, are the model molecules that are key to solving the current issues restricting industrial production of these chemicals. The emergence of systems metabolic engineering, which combines systems biology with synthetic biology and evolutionary engineering at the systems level, offers new perspectives on strain and process optimization. In this review, current challenges in large-scale fermentation processes involving flavonoid scaffolds and the strategies and tools of systems metabolic engineering used to overcome these challenges are summarized. This will offer insights into overcoming the limitations and challenges of large-scale microbial production of these important pharmaceutical compounds. Copyright © 2014 Elsevier B.V. All rights reserved.
A kinetic energy study of the meso beta-scale storm environment during AVE-SESAME 5 (20-21 May 1979)
NASA Technical Reports Server (NTRS)
Printy, M. F.; Fuelberg, H. E.
1984-01-01
Kinetic energy of the near-storm environment was analyzed using meso beta-scale data. It was found that horizontal winds in the 400 to 150 mb layer strengthen rapidly north of the developing convection. Peak values then decrease such that the maximum disappears 6 h later. Southeast of the storms, wind speeds above 300 mb decrease by nearly 50% during the 3 h period of most intense thunderstorm activity. When the convection dissipates, wind patterns return to prestorm conditions. The mesoscale storm environment of AVE-SESAME 5 is characterized by large values of cross-contour generation of kinetic energy, transfers of energy to nonresolvable scales of motion, and horizontal flux divergence. These processes are maximized within the upper troposphere and are greatest during times of strongest convection. The patterns are shown to agree with observed weather features. The southeast area of the network is examined to determine causes for the vertical wind variations.
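The budget terms named in this abstract (cross-contour generation, flux divergence, and transfer to nonresolvable scales) come from the limited-area kinetic energy budget, written schematically in a generic textbook form below; the notation is an assumption and the paper's exact formulation may differ:

$$\frac{\partial K}{\partial t} = -\nabla \cdot (\mathbf{V} K) - \frac{\partial (\omega K)}{\partial p} - \mathbf{V} \cdot \nabla \Phi + D$$

Here $K$ is the horizontal kinetic energy per unit mass, the first two right-hand terms are the horizontal and vertical flux divergences, $-\mathbf{V} \cdot \nabla \Phi$ is the cross-contour generation term, and $D$ collects dissipation and transfers to nonresolvable (subgrid) scales of motion.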
Towards Portable Large-Scale Image Processing with High-Performance Computing.
Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A
2018-05-03
High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was native (i.e., direct installation on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein we describe recent innovations using containerization techniques with XNAT/DAX that isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from the system level to the application level, (2) flexible and dynamic software development and expansion, and (3) scalable spider deployment compatible with HPC clusters and local workstations.
Quantifying the Environmental Memory of Tropical Cyclones: Lingering Footprint or Climate Amnesia?
NASA Astrophysics Data System (ADS)
Schenkel, B. A.; Hart, R. E.
2011-12-01
One of the great remaining unanswered questions in tropical meteorology is why there are 90 tropical cyclones (TCs) globally, on average, per year as opposed to 10, 1000, or 10000 TCs. In contrast to extratropical cyclones, whose annual frequency can be roughly calculated given the large scale characteristics of the mid-latitudes, there is no equivalent theory that even justifies the order of magnitude of TCs that occur globally each year. In spite of this, there appears to be a preferential spacing of approximately 1500-2000 km between TCs during multiple TC episodes in the Eastern North Pacific, North Atlantic, and Western North Pacific, possibly suggesting that the number of storms in each basin is limited energetically by the environment. Reconciling these issues is fundamentally rooted in determining the role of TCs within the climate. Building upon previous research (e.g. Sobel and Camargo 2005, Hart et al. 2007), the following study seeks to take a preliminary step in addressing these questions by quantifying the spatiotemporal scales over which TCs and the large scale environment interact. Four-dimensional, storm-relative composites of raw variables, raw anomalies, and normalized anomalies for Western North Pacific TCs are utilized in the analysis presented here. Preliminary results show that the passage of a TC may initially be responsible for exciting a large scale cooling and drying of the atmospheric environment spanning the majority of the composite domain. Within two weeks, these anomalies become localized over the region through which the TC directly passed and manifest themselves most strongly as a drying of the lower and middle tropospheric environment. The spatial distribution of the moisture and temperature anomalies in the area immediately surrounding the TC track suggests that the suppression of convection, potentially due to the underlying sea surface temperature cold wake induced by the TC, is the predominant factor in anomaly maintenance. Furthermore, the periodic pulsation in the magnitude of the dry anomalies, approximately every 10 days following TC passage, implies the passage of waves, generated either independently of the TC or in response to the TC itself, that further stabilize the atmospheric environment. In their totality, these results suggest that TCs serve as an efficient mechanism for regulating atmospheric instability within the tropics for weeks after TC passage.
Halo Intrinsic Alignment: Dependence on Mass, Formation Time, and Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xia, Qianli; Kang, Xi; Wang, Peng
In this paper we use high-resolution cosmological simulations to study halo intrinsic alignment and its dependence on mass, formation time, and large-scale environment. In agreement with previous studies using N-body simulations, it is found that massive halos have stronger alignment. For the first time, we find that for a given halo mass older halos have stronger alignment and halos in cluster regions also have stronger alignment than those in filaments. To model these dependencies, we extend the linear alignment model with inclusion of halo bias and find that the halo alignment with its mass and formation time dependence can be explained by halo bias. However, the model cannot account for the environment dependence, as it is found that halo bias is lower in clusters and higher in filaments. Our results suggest that halo bias and environment are independent factors in determining halo alignment. We also study the halo alignment correlation function and find that halos are strongly clustered along their major axes and less clustered along the minor axes. The correlated halo alignment can extend to scales as large as 100 h-1 Mpc, where its feature is mainly driven by the baryon acoustic oscillation effect.
Reefs as cradles of evolution and sources of biodiversity in the Phanerozoic.
Kiessling, Wolfgang; Simpson, Carl; Foote, Michael
2010-01-08
Large-scale biodiversity gradients among environments and habitats are usually attributed to a complex array of ecological and evolutionary factors. We tested the evolutionary component of such gradients by compiling the environments of the geologically oldest occurrences of marine genera and using sampling standardization to assess if originations tended to be clustered in particular environments. Shallow, tropical environments and carbonate substrates all tend to have harbored high origination rates. Diversity within these environments tended to be preferentially generated in reefs, probably because of their habitat complexity. Reefs were also prolific at exporting diversity to other environments, which might be a consequence of low-diversity habitats being more susceptible to invasions.
ENCAPSULATING WASTE DISPOSAL METHODS - PHASE I
Bioimmobilization of uranium-practical tools for field applications
NASA Astrophysics Data System (ADS)
Istok, J. D.
2011-12-01
Extensive laboratory and field research has conclusively demonstrated that it is possible to stimulate indigenous microbial activity and create conditions favorable for the reductive precipitation of uranium from groundwater, reducing aqueous U concentrations below regulatory levels. A wide variety of complex and coupled biogeochemical processes have been identified, and specific reaction mechanisms and parameters have been quantified for a variety of experimental systems, including pure, mixed, and natural microbial cultures; single-mineral, artificial, and natural sediments; and groundwater aquifers at scales ranging from very small (10s of nm) to very large (10s of m). Multicomponent coupled reactive transport models have also been developed to simulate various aspects of this process in 3D heterogeneous environments. Nevertheless, full-scale application of reductive bioimmobilization of uranium (and other radionuclides and metals) remains problematic because of the technical and logistical difficulties of creating and maintaining a reducing environment in the many large U-contaminated groundwater aquifers that are currently under aerobic and oxidizing conditions and that often contain high concentrations of competing and more energetically favorable electron acceptors (esp. nitrate). This talk will discuss how simple tools, including small-scale in situ testing and geochemical reaction path modeling, can be used to quickly assess the feasibility of applying bioimmobilization to remediate U-contaminated groundwater aquifers and provide data needed for full-scale design.
A study of environmental effects on galaxy spin using MaNGA data
NASA Astrophysics Data System (ADS)
Lee, Jong Chul; Hwang, Ho Seong; Chung, Haeun
2018-06-01
We investigate environmental effects on galaxy spin using the recent public data of Mapping Nearby Galaxies at APO (MaNGA) integral field spectroscopic survey containing ˜2800 galaxies. We measure the spin parameter of 1830 galaxies through the analysis of two-dimensional stellar kinematic maps within the effective radii, and obtain their large-scale (background mass density from 20 nearby galaxies) and small-scale (distance to and morphology of the nearest neighbour galaxy) environmental parameters for 1529 and 1767 galaxies, respectively. We first examine the mass dependence of galaxy spin, and find that the spin parameter of early-type galaxies decreases with stellar mass at log (M*/M⊙) ≳ 10, consistent with the results from previous studies. We then divide the galaxies into three subsamples using their stellar masses to minimize the mass effects on galaxy spin. The spin parameters of galaxies in each subsample do not change with background mass density, but do change with distance to and morphology of the nearest neighbour. In particular, the spin parameter of late-type galaxies decreases as early-type neighbours approach within the virial radius. These results suggest that the large-scale environments hardly affect the galaxy spin, but the small-scale environments such as hydrodynamic galaxy-galaxy interactions can play a substantial role in determining galaxy spin.
Primordial Magnetic Field Effects on the CMB and Large-Scale Structure
Yamazaki, Dai G.; Ichiki, Kiyotomo; Kajino, Toshitaka; ...
2010-01-01
Magnetic fields are everywhere in nature, and they play an important role in every astronomical environment which involves the formation of plasma and currents. It is natural therefore to suppose that magnetic fields could be present in the turbulent high-temperature environment of the big bang. Such a primordial magnetic field (PMF) would be expected to manifest itself in the cosmic microwave background (CMB) temperature and polarization anisotropies, and also in the formation of large-scale structure. In this paper, we summarize the theoretical framework which we have developed to calculate the PMF power spectrum to high precision. Using this formulation, we summarize calculations of the effects of a PMF which take accurate quantitative account of the time evolution of the cutoff scale. We review the constructed numerical program, which is without approximation, and an improvement over the approach used in a number of previous works for studying the effect of the PMF on the cosmological perturbations. We demonstrate how the PMF is an important cosmological physical process on small scales. We also summarize the current constraints on the PMF amplitude B_λ and the power spectral index n_B which have been deduced from the available CMB observational data by using our computational framework.
Cowley, Lauren A; Petersen, Fernanda C; Junges, Roger; Jimson D Jimenez, Med; Morrison, Donald A; Hanage, William P
2018-06-01
Homologous recombination in the genetic transformation model organism Streptococcus pneumoniae is thought to be important in the adaptation and evolution of this pathogen. While competent pneumococci are able to scavenge DNA added to laboratory cultures, large-scale transfers of multiple kb are rare under these conditions. We used whole genome sequencing (WGS) to map transfers in recombinants arising from contact of competent cells with non-competent 'target' cells, using strains with known genomes, distinguished by a total of ~16,000 SNPs. Experiments designed to explore the effect of environment on large scale recombination events used saturating purified donor DNA, short-term cell assemblages on Millipore filters, and mature biofilm mixed cultures. WGS of 22 recombinants for each environment mapped all SNPs that were identical between the recombinant and the donor but not the recipient. The mean recombination event size was found to be significantly larger in cell-to-cell contact cultures (4051 bp in filter assemblage and 3938 bp in biofilm co-culture versus 1815 bp with saturating DNA). Up to 5.8% of the genome was transferred, through 20 recombination events, to a single recipient, with the largest single event incorporating 29,971 bp. We also found that some recombination events are clustered, that these clusters are more likely to occur in cell-to-cell contact environments, and that they cause significantly increased linkage of genes as far apart as 60,000 bp. We conclude that pneumococcal evolution through homologous recombination is more likely to occur on a larger scale in environments that permit cell-to-cell contact.
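Conceptually, the SNP-mapping step reduces to finding maximal runs of positions at which the recombinant carries the donor allele rather than the recipient allele. The sketch below is a toy illustration of that idea only; the positions and alleles are invented, and this is not the study's analysis pipeline.

```python
def donor_segments(positions, recipient, donor, recombinant):
    """Return (start_pos, end_pos, n_snps) for each maximal run of SNPs
    at which the recombinant matches the donor but not the recipient."""
    segments, run = [], []
    for pos, r, d, x in zip(positions, recipient, donor, recombinant):
        if x == d and d != r:          # donor-specific allele observed
            run.append(pos)
        else:
            if run:
                segments.append((run[0], run[-1], len(run)))
            run = []
    if run:
        segments.append((run[0], run[-1], len(run)))
    return segments

# Toy example: 8 biallelic SNPs along a chromosome
positions   = [100, 900, 2500, 2600, 3100, 7000, 7200, 9000]
recipient   = list("AAAAAAAA")
donor       = list("GGGGGGGG")
recombinant = list("AAGGGAGG")
print(donor_segments(positions, recipient, donor, recombinant))
# [(2500, 3100, 3), (7200, 9000, 2)]
```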
Scalable clustering algorithms for continuous environmental flow cytometry.
Hyrkas, Jeremy; Clayton, Sophie; Ribalet, Francois; Halperin, Daniel; Armbrust, E Virginia; Howe, Bill
2016-02-01
Recent technological innovations in flow cytometry now allow oceanographers to collect high-frequency flow cytometry data from particles in aquatic environments on a scale far surpassing conventional flow cytometers. The SeaFlow cytometer continuously profiles microbial phytoplankton populations across thousands of kilometers of the surface ocean. The data streams produced by instruments such as SeaFlow challenge the traditional sample-by-sample approach in cytometric analysis and highlight the need for scalable clustering algorithms to extract population information from these large-scale, high-frequency flow cytometers. We explore how available algorithms commonly used for medical applications perform at classifying such large-scale environmental flow cytometry data. We apply large-scale Gaussian mixture models to massive datasets using Hadoop. This approach outperforms current state-of-the-art cytometry classification algorithms in accuracy and can be coupled with manual or automatic partitioning of data into homogeneous sections for further classification gains. We propose the Gaussian mixture model with partitioning approach for classification of large-scale, high-frequency flow cytometry data. Source code available for download at https://github.com/jhyrkas/seaflow_cluster, implemented in Java for use with Hadoop. hyrkas@cs.washington.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
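The core classification step can be illustrated with an ordinary single-machine Gaussian mixture model on synthetic two-channel data; this is only a sketch of the general approach using scikit-learn, not the authors' Hadoop implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic "cytometry" data: three populations in two measurement channels
pop_means = [(1.0, 1.0), (3.0, 2.5), (2.0, 4.0)]
X = np.vstack([rng.normal(m, 0.3, size=(500, 2)) for m in pop_means])

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
labels = gmm.fit_predict(X)   # hard assignment of each particle to a population
proba = gmm.predict_proba(X)  # soft (probabilistic) memberships
print(np.bincount(labels))    # population sizes recovered by the model
```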
Evolution of the indoor biome.
Martin, Laura J; Adams, Rachel I; Bateman, Ashley; Bik, Holly M; Hawks, John; Hird, Sarah M; Hughes, David; Kembel, Steven W; Kinney, Kerry; Kolokotronis, Sergios-Orestis; Levy, Gabriel; McClain, Craig; Meadow, James F; Medina, Raul F; Mhuireach, Gwynne; Moreau, Corrie S; Munshi-South, Jason; Nichols, Lauren M; Palmer, Clare; Popova, Laura; Schal, Coby; Täubel, Martin; Trautwein, Michelle; Ugalde, Juan A; Dunn, Robert R
2015-04-01
Few biologists have studied the evolutionary processes at work in indoor environments. Yet indoor environments comprise approximately 0.5% of ice-free land area--an area as large as the subtropical coniferous forest biome. Here we review the emerging subfield of 'indoor biome' studies. After defining the indoor biome and tracing its deep history, we discuss some of its evolutionary dimensions. We restrict our examples to the species found in human houses--a subset of the environments constituting the indoor biome--and offer preliminary hypotheses to advance the study of indoor evolution. Studies of the indoor biome are situated at the intersection of evolutionary ecology, anthropology, architecture, and human ecology and are well suited for citizen science projects, public outreach, and large-scale international collaborations. Copyright © 2015 Elsevier Ltd. All rights reserved.
Spatial and Temporal Dynamics of Pacific Oyster Hemolymph Microbiota across Multiple Scales
Lokmer, Ana; Goedknegt, M. Anouk; Thieltges, David W.; Fiorentino, Dario; Kuenzel, Sven; Baines, John F.; Wegner, K. Mathias
2016-01-01
Unveiling the factors and processes that shape the dynamics of host associated microbial communities (microbiota) under natural conditions is an important part of understanding and predicting an organism's response to a changing environment. The microbiota is shaped by host (i.e., genetic) factors as well as by the biotic and abiotic environment. Studying natural variation of microbial community composition in multiple host genetic backgrounds across spatial as well as temporal scales represents a means to untangle this complex interplay. Here, we combined a spatially-stratified with a longitudinal sampling scheme within differentiated host genetic backgrounds by reciprocally transplanting Pacific oysters between two sites in the Wadden Sea (Sylt and Texel). To further differentiate contingent site from host genetic effects, we repeatedly sampled the same individuals over a summer season to examine structure, diversity and dynamics of individual hemolymph microbiota following experimental removal of resident microbiota by antibiotic treatment. While a large proportion of microbiome variation could be attributed to immediate environmental conditions, we observed persistent effects of antibiotic treatment and translocation suggesting that hemolymph microbial community dynamics is subject to within-microbiome interactions and host population specific factors. In addition, the analysis of spatial variation revealed that the within-site microenvironmental heterogeneity resulted in high small-scale variability, as opposed to large-scale (between-site) stability. Similarly, considerable within-individual temporal variability was in contrast with the overall temporal stability at the site level. Overall, our longitudinal, spatially-stratified sampling design revealed that variation in hemolymph microbiota is strongly influenced by site and immediate environmental conditions, whereas internal microbiome dynamics and oyster-related factors add to their long-term stability. The combination of small and large scale resolution of spatial and temporal observations therefore represents a crucial but underused tool to study host-associated microbiome dynamics. PMID:27630625
Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee
2016-07-01
In this article, we present strategies for collecting and coding a large longitudinal communication data set collected across multiple sites, consisting of more than 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication data sets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multisite secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a "how-to" example for managing large, digitally recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research.
Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping
NASA Astrophysics Data System (ADS)
Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.
2017-12-01
Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be available at regional to national levels and cover long time periods. As a result of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain to develop operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (˜16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating never-before-imagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.
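The SSEBop idea of scaling reference ET by a temperature-based ET fraction can be sketched as follows. This is a deliberately simplified, schematic version: the cold and hot boundary temperatures are taken as given inputs here, whereas the operational model derives them from air-temperature climatology and a predefined temperature difference.

```python
import numpy as np

def ssebop_like_eta(lst, t_cold, t_hot, et_ref):
    """Schematic SSEBop-style actual ET: ETa = ETf * ETref, with
    ETf = (Th - Ts) / (Th - Tc) clamped to the range [0, 1]."""
    etf = (t_hot - lst) / (t_hot - t_cold)
    etf = np.clip(etf, 0.0, 1.0)
    return etf * et_ref

lst    = np.array([295.0, 305.0, 315.0])  # land-surface temperature per pixel (K)
t_cold = 295.0                            # idealized well-watered (cold) boundary (K)
t_hot  = 315.0                            # idealized dry (hot) boundary (K)
et_ref = 6.0                              # reference ET for the period (mm/day)
print(ssebop_like_eta(lst, t_cold, t_hot, et_ref))  # [6.0, 3.0, 0.0] mm/day
```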
Impact of entrainment on cloud droplet spectra: theory, observations, and modeling
NASA Astrophysics Data System (ADS)
Grabowski, W.
2016-12-01
Understanding the impact of entrainment and mixing on microphysical properties of warm boundary layer clouds is an important aspect of the representation of such clouds in large-scale models of weather and climate. Entrainment leads to a reduction of the liquid water content in agreement with fundamental thermodynamics, but its impact on the droplet spectrum is difficult to quantify in observations and modeling. For in-situ (e.g., aircraft) observations, it is impossible to follow air parcels and observe the processes that lead to changes of the droplet spectrum in different regions of a cloud. For similar reasons, traditional modeling methodologies (e.g., the Eulerian large eddy simulation) are not useful either. Moreover, both observations and modeling can resolve only a relatively narrow range of spatial scales. Theory, typically focusing on differences between the idealized concepts of homogeneous and inhomogeneous mixing, is also of limited use for the multiscale turbulent mixing between a cloud and its environment. This presentation will illustrate the above points and argue that the Lagrangian large-eddy simulation with an appropriate subgrid-scale scheme may provide key insights and eventually lead to novel parameterizations for large-scale models.
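The two idealized mixing limits mentioned above can be made concrete with a toy liquid-water budget (illustrative numbers only): homogeneous mixing evaporates a little water from every droplet (number conserved, radius reduced), while extremely inhomogeneous mixing evaporates some droplets completely (radius conserved, number reduced), for the same loss of liquid water content.

```python
# Liquid water content scales as LWC ~ N * r**3 (up to a constant factor).
n0, r0 = 100.0, 10.0          # initial droplet number (cm^-3) and radius (micron)
lwc_fraction_left = 0.6       # entrainment/evaporation removes 40% of the liquid water

# Homogeneous limit: all droplets shrink, N unchanged
r_hom = r0 * lwc_fraction_left ** (1.0 / 3.0)
# Extremely inhomogeneous limit: some droplets vanish completely, r unchanged
n_inh = n0 * lwc_fraction_left

print(f"homogeneous:   N = {n0:.0f} cm^-3, r = {r_hom:.2f} um")
print(f"inhomogeneous: N = {n_inh:.0f} cm^-3, r = {r0:.2f} um")
```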
NASA Technical Reports Server (NTRS)
Berchem, J.; Raeder, J.; Ashour-Abdalla, M.; Frank, L. A.; Paterson, W. R.; Ackerson, K. L.; Kokubun, S.; Yamamoto, T.; Lepping, R. P.
1998-01-01
Understanding the large-scale dynamics of the magnetospheric boundary is an important step towards achieving the ISTP mission's broad objective of assessing the global transport of plasma and energy through the geospace environment. Our approach is based on three-dimensional global magnetohydrodynamic (MHD) simulations of the solar wind-magnetosphere-ionosphere system, and consists of using interplanetary magnetic field (IMF) and plasma parameters measured by solar wind monitors upstream of the bow shock as input to the simulations for predicting the large-scale dynamics of the magnetospheric boundary. The validity of these predictions is tested by comparing local data streams with time series measured by downstream spacecraft crossing the magnetospheric boundary. In this paper, we review results from several case studies which confirm that our MHD model reproduces very well the large-scale motion of the magnetospheric boundary. The first case illustrates the complexity of the magnetic field topology that can occur at the dayside magnetospheric boundary for periods of northward IMF with strong Bx and By components. The second comparison reviewed combines dynamic and topological aspects in an investigation of the evolution of the distant tail at 200 R_E from the Earth.
Deelman, E.; Callaghan, S.; Field, E.; Francoeur, H.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T.H.; Kesselman, C.; Maechling, P.; Mehringer, J.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.
2006-01-01
This paper discusses the process of building an environment where large-scale, complex, scientific analysis can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves for sites in the Los Angeles area. We explain which software tools were used to build the system, and describe their functionality and interactions. We show the results of running the CyberShake analysis, which included over 250,000 jobs, using resources available through SCEC and the TeraGrid. © 2006 IEEE.
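As a generic illustration of the scheduling problem described above, the sketch below applies a textbook greedy minimum-completion-time heuristic to a bag of jobs on heterogeneous machines; it is not the SCEC/TeraGrid tooling, and the job sizes and machine speeds are invented.

```python
def greedy_schedule(job_costs, machine_speeds):
    """Greedy minimum-completion-time heuristic: each job goes to the machine
    on which it would finish earliest, given that machine's current load."""
    finish = [0.0] * len(machine_speeds)          # current finish time per machine
    assignment = []
    for cost in sorted(job_costs, reverse=True):  # place big jobs first
        completions = [finish[m] + cost / machine_speeds[m]
                       for m in range(len(machine_speeds))]
        m_best = min(range(len(machine_speeds)), key=lambda m: completions[m])
        finish[m_best] = completions[m_best]
        assignment.append((cost, m_best))
    return assignment, max(finish)

jobs = [30, 10, 25, 5, 40, 15]   # job sizes in arbitrary work units
speeds = [1.0, 2.0, 4.0]         # heterogeneous machines (work units per second)
plan, makespan = greedy_schedule(jobs, speeds)
print(plan)
print("makespan:", makespan, "seconds")
```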
NASA Astrophysics Data System (ADS)
Fortmann, C. M.; Farley, M. V.; Smoot, M. A.; Fieselmann, B. F.
1988-07-01
Solarex is one of the leaders in amorphous silicon based photovoltaic production and research. The large scale production environment presents unique safety concerns related to the quantity of dangerous materials as well as the number of personnel handling these materials. The safety measures explored by this work include gas detection systems, training, and failure-resistant gas handling systems. Our experiences with flow restricting orifices in the CGA connections and the use of steel cylinders are reviewed. The hazards and efficiency of wet scrubbers for silane exhausts are examined. We have found it useful to provide the scrubber with temperature alarms.
HRLSim: a high performance spiking neural network simulator for GPGPU clusters.
Minkovich, Kirill; Thibeault, Corey M; O'Brien, Michael John; Nogin, Aleksey; Cho, Youngkwan; Srinivasa, Narayan
2014-02-01
Modeling of large-scale spiking neural models is an important tool in the quest to understand brain function and subsequently create real-world applications. This paper describes a spiking neural network simulator environment called HRL Spiking Simulator (HRLSim). This simulator is suitable for implementation on a cluster of general purpose graphical processing units (GPGPUs). Novel aspects of HRLSim are described and an analysis of its performance is provided for various configurations of the cluster. With the advent of inexpensive GPGPU cards and compute power, HRLSim offers an affordable and scalable tool for design, real-time simulation, and analysis of large-scale spiking neural networks.
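To make concrete the kind of model such simulators run, here is a tiny leaky integrate-and-fire network in plain NumPy. It is a conceptual toy only and is unrelated to the HRLSim API or its GPGPU implementation; all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n, steps, dt = 200, 1000, 1e-3            # 200 neurons, 1 s at 1 ms resolution
tau, v_th, v_reset = 20e-3, 1.0, 0.0      # membrane time constant, threshold, reset
w = rng.normal(0.0, 0.05, size=(n, n))    # random recurrent weights
v = np.zeros(n)
spike_count = np.zeros(n, dtype=int)

for _ in range(steps):
    i_ext = rng.normal(1.2, 0.5, size=n)          # noisy external drive
    v = v + (-v + i_ext) * (dt / tau)             # leaky integration
    spiked = v >= v_th
    v = np.where(spiked, v_reset, v)              # reset neurons that fired
    v = v + w @ spiked.astype(float)              # propagate spikes through the network
    spike_count += spiked

print("mean firing rate:", spike_count.mean() / (steps * dt), "Hz")
```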
Distributed intrusion detection system based on grid security model
NASA Astrophysics Data System (ADS)
Su, Jie; Liu, Yahui
2008-03-01
Grid computing has developed rapidly with the development of network technology, and it can solve the problem of large-scale complex computing by sharing large-scale computing resources. In a grid environment, we can realize a distributed and load-balanced intrusion detection system. This paper first discusses the security mechanism in grid computing and the function of PKI/CA in the grid security system, then describes how the characteristics of grid computing can be applied in a distributed intrusion detection system (IDS) based on an Artificial Immune System. Finally, it presents a distributed intrusion detection system based on the grid security system that can reduce processing delay and ensure detection rates.
Dusty Starbursts within a z=3 Large Scale Structure revealed by ALMA
NASA Astrophysics Data System (ADS)
Umehata, Hideki
The role of the large-scale structure is one of the most important themes in the study of galaxy formation and evolution. However, it still remains largely a mystery, especially at z > 2. On the basis of our ALMA 1.1 mm observations in a z ~ 3 protocluster field, it is suggested that submillimeter galaxies (SMGs) preferentially reside in the densest environment at z ~ 3. Furthermore, combining with Chandra X-ray data, we find a rich cluster of AGN-host SMGs at the core of the protocluster. Our results indicate vigorous star formation and accelerated supermassive black hole (SMBH) growth in the node of the cosmic web.
Large-scale physical activity data reveal worldwide activity inequality
Althoff, Tim; Sosič, Rok; Hicks, Jennifer L.; King, Abby C.; Delp, Scott L.; Leskovec, Jure
2018-01-01
Understanding the basic principles that govern physical activity is needed to curb the global pandemic of physical inactivity1–7 and the 5.3 million deaths per year associated with inactivity2. Our knowledge, however, remains limited owing to the lack of large-scale measurements of physical activity patterns across free-living populations worldwide1,6. Here, we leverage the wide usage of smartphones with built-in accelerometry to measure physical activity at planetary scale. We study a dataset consisting of 68 million days of physical activity for 717,527 people, giving us a window into activity in 111 countries across the globe. We find inequality in how activity is distributed within countries and that this inequality is a better predictor of obesity prevalence in the population than average activity volume. Reduced activity in females contributes to a large portion of the observed activity inequality. Aspects of the built environment, such as the walkability of a city, were associated with a smaller gender gap in activity and lower activity inequality. In more walkable cities, activity is greater throughout the day and throughout the week, across age, gender, and body mass index (BMI) groups, with the greatest increases in activity for females. Our findings have implications for global public health policy and urban planning and highlight the role of activity inequality and the built environment for improving physical activity and health. PMID:28693034
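Activity inequality in this sense can be summarized with a Gini-style coefficient over the distribution of daily step counts. The sketch below computes a generic Gini coefficient on made-up data; the study's exact inequality measure and data are not reproduced here.

```python
import numpy as np

def gini(x):
    """Gini coefficient of a non-negative sample (0 = perfect equality)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

rng = np.random.default_rng(0)
equal_country = rng.normal(5000, 500, size=10000).clip(min=0)     # steps/day, narrow spread
unequal_country = rng.lognormal(mean=8.2, sigma=0.9, size=10000)  # heavy-tailed distribution
print(round(gini(equal_country), 3), round(gini(unequal_country), 3))
```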
The origin of polygonal troughs on the northern plains of Mars
NASA Astrophysics Data System (ADS)
Pechmann, J. C.
1980-05-01
The morphology, distribution, geologic environment and relative age of large-scale polygonal trough systems on Mars are examined. The troughs are steep-walled, flat-floored, sinuous depressions typically 200-800 m wide, 20-120 m deep and spaced 5-10 km apart. The mechanics of formation of tension cracks is reviewed to identify the factors controlling the scale of tension crack systems; special emphasis is placed on thermal cracking in permafrost. It is shown that because of the extremely large scale of the Martian fracture systems, they could not have formed by thermal cracking in permafrost, desiccation cracking in sediments or contraction cracking in cooling lava. On the basis of photogeologic evidence and analog studies, it is proposed that polygonal troughs on the northern plains of Mars are grabens.
Reproducible Large-Scale Neuroimaging Studies with the OpenMOLE Workflow Management System.
Passerat-Palmbach, Jonathan; Reuillon, Romain; Leclaire, Mathieu; Makropoulos, Antonios; Robinson, Emma C; Parisot, Sarah; Rueckert, Daniel
2017-01-01
OpenMOLE is a scientific workflow engine with a strong emphasis on workload distribution. Workflows are designed using a high level Domain Specific Language (DSL) built on top of Scala. It exposes natural parallelism constructs to easily delegate the workload resulting from a workflow to a wide range of distributed computing environments. OpenMOLE hides the complexity of designing complex experiments thanks to its DSL. Users can embed their own applications and scale their pipelines from a small prototype running on their desktop computer to a large-scale study harnessing distributed computing infrastructures, simply by changing a single line in the pipeline definition. The construction of the pipeline itself is decoupled from the execution context. The high-level DSL abstracts the underlying execution environment, contrary to classic shell-script based pipelines. These two aspects allow pipelines to be shared and studies to be replicated across different computing environments. Workflows can be run as traditional batch pipelines or coupled with OpenMOLE's advanced exploration methods in order to study the behavior of an application, or perform automatic parameter tuning. In this work, we briefly present the strong assets of OpenMOLE and detail recent improvements targeting re-executability of workflows across various Linux platforms. We have tightly coupled OpenMOLE with CARE, a standalone containerization solution that allows re-executing on a Linux host any application that has been packaged on another Linux host previously. The solution is evaluated against a Python-based pipeline involving packages such as scikit-learn as well as binary dependencies. All were packaged and re-executed successfully on various HPC environments, with identical numerical results (here prediction scores) obtained on each environment. Our results show that the pair formed by OpenMOLE and CARE is a reliable solution to generate reproducible results and re-executable pipelines. A demonstration of the flexibility of our solution showcases three neuroimaging pipelines harnessing distributed computing environments as heterogeneous as local clusters or the European Grid Infrastructure (EGI).
[Research progress on hydrological scaling].
Liu, Jianmei; Pei, Tiefan
2003-12-01
With the development of hydrology and the growing impact of human activity on the environment, the scale issue has become a great challenge to many hydrologists because of the stochasticity and complexity of hydrological phenomena and natural catchments. Increasing attention has been given to scaling, i.e., inferring the hydrological characteristics of a large-scale (or small-scale) domain from those of known catchments at another scale, but the problem has not yet been solved successfully. The first part of this paper introduces some concepts related to hydrological scale, the scale issue and scaling. The key problem is the spatial heterogeneity of catchments and the temporal and spatial variability of hydrological fluxes. Three approaches to scaling are then put forward: distributed modeling, fractal theory and statistical self-similarity analyses. Existing problems and future research directions are proposed in the last part.
Effects of Animal Feeding Operations on Water Resources and the Environment
2000-01-01
and others tested swine feed and feed ingredients (grain, soybean meal, milk/whey, fats/oils, and protein products). The most frequent serotype...Swine Hepatitis E Virus (sHEV) is a recently discovered virus endemic to Midwest hog herds. The proposed zoonotic nature of Asian strains of human HEV...ground and surface water proximal to large-scale swine operations. We identified chemical pollutants and zoonotic pathogens in the environment on
Teaching the blind to find their way by playing video games.
Merabet, Lotfi B; Connors, Erin C; Halko, Mark A; Sánchez, Jaime
2012-01-01
Computer based video games are receiving great interest as a means to learn and acquire new skills. As a novel approach to teaching navigation skills in the blind, we have developed Audio-based Environment Simulator (AbES); a virtual reality environment set within the context of a video game metaphor. Despite the fact that participants were naïve to the overall purpose of the software, we found that early blind users were able to acquire relevant information regarding the spatial layout of a previously unfamiliar building using audio based cues alone. This was confirmed by a series of behavioral performance tests designed to assess the transfer of acquired spatial information to a large-scale, real-world indoor navigation task. Furthermore, learning the spatial layout through a goal directed gaming strategy allowed for the mental manipulation of spatial information as evidenced by enhanced navigation performance when compared to an explicit route learning strategy. We conclude that the immersive and highly interactive nature of the software greatly engages the blind user to actively explore the virtual environment. This in turn generates an accurate sense of a large-scale three-dimensional space and facilitates the learning and transfer of navigation skills to the physical world.
Optimal configurations of spatial scale for grid cell firing under noise and uncertainty
Towse, Benjamin W.; Barry, Caswell; Bush, Daniel; Burgess, Neil
2014-01-01
We examined the accuracy with which the location of an agent moving within an environment could be decoded from the simulated firing of systems of grid cells. Grid cells were modelled with Poisson spiking dynamics and organized into multiple ‘modules’ of cells, with firing patterns of similar spatial scale within modules and a wide range of spatial scales across modules. The number of grid cells per module, the spatial scaling factor between modules and the size of the environment were varied. Errors in decoded location can take two forms: small errors of precision and larger errors resulting from ambiguity in decoding periodic firing patterns. With enough cells per module (e.g. eight modules of 100 cells each) grid systems are highly robust to ambiguity errors, even over ranges much larger than the largest grid scale (e.g. over a 500 m range when the maximum grid scale is 264 cm). Results did not depend strongly on the precise organization of scales across modules (geometric, co-prime or random). However, independent spatial noise across modules, which would occur if modules receive independent spatial inputs and might increase with spatial uncertainty, dramatically degrades the performance of the grid system. This effect of spatial uncertainty can be mitigated by uniform expansion of grid scales. Thus, in the realistic regimes simulated here, the optimal overall scale for a grid system represents a trade-off between minimizing spatial uncertainty (requiring large scales) and maximizing precision (requiring small scales). Within this view, the temporary expansion of grid scales observed in novel environments may be an optimal response to increased spatial uncertainty induced by the unfamiliarity of the available spatial cues. PMID:24366144
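The decoding setup can be sketched in one dimension: Poisson spike counts from periodically tuned cells in a few modules, followed by maximum-likelihood decoding of position over a grid of candidate locations. The scales, cell counts, and rates below are invented for illustration; this is not the paper's simulation code.

```python
import numpy as np

rng = np.random.default_rng(0)
scales = np.array([0.3, 0.5, 0.7])        # spatial periods of three grid modules (m)
cells_per_module = 20
env = np.linspace(0.0, 5.0, 1001)         # candidate positions in a 5 m environment
r_max, dt = 20.0, 1.0                     # peak firing rate (Hz) and decoding window (s)

def rates(x):
    """Firing rates of all cells at positions x, using periodic cosine tuning."""
    blocks = []
    for s in scales:
        phases = np.linspace(0.0, s, cells_per_module, endpoint=False)
        blocks.append(r_max * 0.5 * (1.0 + np.cos(2.0 * np.pi * np.subtract.outer(phases, x) / s)))
    return np.vstack(blocks)              # shape: (n_cells_total, n_positions)

true_x = 3.217
counts = rng.poisson(rates(np.array([true_x])).ravel() * dt)   # one window of Poisson spikes
lam = rates(env) * dt
log_like = (counts[:, None] * np.log(lam + 1e-12) - lam).sum(axis=0)  # Poisson log-likelihood
x_hat = env[np.argmax(log_like)]
print(f"true position = {true_x:.3f} m, decoded = {x_hat:.3f} m")
```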
Shaw, Liam; Ribeiro, Andre L R; Levine, Adam P; Pontikos, Nikolas; Balloux, Francois; Segal, Anthony W; Roberts, Adam P; Smith, Andrew M
2017-09-12
The human microbiome is affected by multiple factors, including the environment and host genetics. In this study, we analyzed the salivary microbiomes of an extended family of Ashkenazi Jewish individuals living in several cities and investigated associations with both shared household and host genetic similarities. We found that environmental effects dominated over genetic effects. While there was weak evidence of geographical structuring at the level of cities, we observed a large and significant effect of shared household on microbiome composition, supporting the role of the immediate shared environment in dictating the presence or absence of taxa. This effect was also seen when including adults who had grown up in the same household but moved out prior to the time of sampling, suggesting that the establishment of the salivary microbiome earlier in life may affect its long-term composition. We found weak associations between host genetic relatedness and microbiome dissimilarity when using family pedigrees as proxies for genetic similarity. However, this association disappeared when using more-accurate measures of kinship based on genome-wide genetic markers, indicating that the environment rather than host genetics is the dominant factor affecting the composition of the salivary microbiome in closely related individuals. Our results support the concept that there is a consistent core microbiome conserved across global scales but that small-scale effects due to a shared living environment significantly affect microbial community composition. IMPORTANCE Previous research shows that the salivary microbiomes of relatives are more similar than those of nonrelatives, but it remains difficult to distinguish the effects of relatedness and shared household environment. Furthermore, pedigree measures may not accurately measure host genetic similarity. In this study, we include genetic relatedness based on genome-wide single nucleotide polymorphisms (SNPs) (rather than pedigree measures) and shared environment in the same analysis. We quantify the relative importance of these factors by studying the salivary microbiomes in members of a large extended Ashkenazi Jewish family living in different locations. We find that host genetics plays no significant role and that the dominant factor is the shared environment at the household level. We also find that this effect appears to persist in individuals who have moved out of the parental household, suggesting that aspects of salivary microbiome composition established during upbringing can persist over a time scale of years. Copyright © 2017 Shaw et al.
A practical large scale/high speed data distribution system using 8 mm libraries
NASA Technical Reports Server (NTRS)
Howard, Kevin
1993-01-01
Eight mm tape libraries are known primarily for their small size, large storage capacity, and low cost. However, many applications require an additional attribute which, heretofore, has been lacking -- high transfer rate. Transfer rate is particularly important in a large scale data distribution environment -- an environment in which 8 mm tape should play a very important role. Data distribution is a natural application for 8 mm for several reasons: most large laboratories have access to 8 mm tape drives, 8 mm tapes are upwardly compatible, 8 mm media are very inexpensive, 8 mm media are light weight (important for shipping purposes), and 8 mm media densely pack data (5 gigabytes now and 15 gigabytes on the horizon). If the transfer rate issue were resolved, 8 mm could offer a good solution to the data distribution problem. To that end Exabyte has analyzed four ways to increase its transfer rate: native drive transfer rate increases, data compression at the drive level, tape striping, and homogeneous drive utilization. Exabyte is actively pursuing native drive transfer rate increases and drive level data compression. However, for non-transmitted bulk data applications (which include data distribution) the other two methods (tape striping and homogeneous drive utilization) hold promise.
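The transfer-rate arithmetic behind tape striping is straightforward: striping a data set across N drives divides the wall-clock transfer time by roughly N, at the cost of requiring N drives and N media per set. The drive rate and data volume in the sketch below are illustrative assumptions, not Exabyte specifications.

```python
def transfer_hours(total_gb, drive_rate_mb_s, n_drives=1):
    """Wall-clock hours to stream total_gb through n_drives writing in parallel."""
    total_mb = total_gb * 1024
    return total_mb / (drive_rate_mb_s * n_drives) / 3600

volume_gb = 50          # assumed size of one distribution set
native = 0.5            # assumed native 8 mm drive rate, MB/s
for n in (1, 2, 4):
    print(f"{n} drive(s): {transfer_hours(volume_gb, native, n):.1f} h")
# 1 drive(s): 28.4 h   2 drive(s): 14.2 h   4 drive(s): 7.1 h
```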
NASA Astrophysics Data System (ADS)
Bowden, Jared H.; Nolte, Christopher G.; Otte, Tanya L.
2013-04-01
The impact of the simulated large-scale atmospheric circulation on the regional climate is examined using the Weather Research and Forecasting (WRF) model as a regional climate model. The purpose is to understand the potential need for interior grid nudging for dynamical downscaling of global climate model (GCM) output for air quality applications under a changing climate. In this study we downscale the NCEP-Department of Energy Atmospheric Model Intercomparison Project (AMIP-II) Reanalysis using three continuous 20-year WRF simulations: one simulation without interior grid nudging and two using different interior grid nudging methods. The biases in 2-m temperature and precipitation for the simulation without interior grid nudging are unreasonably large with respect to the North American Regional Reanalysis (NARR) over the eastern half of the contiguous United States (CONUS) during the summer when air quality concerns are most relevant. This study examines how these differences arise from errors in predicting the large-scale atmospheric circulation. It is demonstrated that the Bermuda high, which strongly influences the regional climate for much of the eastern half of the CONUS during the summer, is poorly simulated without interior grid nudging. In particular, two summers when the Bermuda high was west (1993) and east (2003) of its climatological position are chosen to illustrate problems in the large-scale atmospheric circulation anomalies. For both summers, WRF without interior grid nudging fails to simulate the placement of the upper-level anticyclonic (1993) and cyclonic (2003) circulation anomalies. The displacement of the large-scale circulation impacts the lower atmosphere moisture transport and precipitable water, affecting the convective environment and precipitation. Using interior grid nudging improves the large-scale circulation aloft and moisture transport/precipitable water anomalies, thereby improving the simulated 2-m temperature and precipitation. The results demonstrate that constraining the RCM to the large-scale features in the driving fields improves the overall accuracy of the simulated regional climate, and suggest that in the absence of such a constraint, the RCM will likely misrepresent important large-scale shifts in the atmospheric circulation under a future climate.
Introducing Large-Scale Innovation in Schools
NASA Astrophysics Data System (ADS)
Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.
2016-08-01
Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high-scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.
Gehman, Alyssa-Lois M; Grabowski, Jonathan H; Hughes, A Randall; Kimbro, David L; Piehler, Michael F; Byers, James E
2017-01-01
Not all hosts, communities or environments are equally hospitable for parasites. Direct and indirect interactions between parasites and their predators, competitors and the environment can influence variability in host exposure, susceptibility and subsequent infection, and these influences may vary across spatial scales. To determine the relative influences of abiotic, biotic and host characteristics on probability of infection across both local and estuary scales, we surveyed the oyster reef-dwelling mud crab Eurypanopeus depressus and its parasite Loxothylacus panopaei, an invasive castrating rhizocephalan, in a hierarchical design across >900 km of the southeastern USA. We quantified the density of hosts, predators of the parasite and host, the host's oyster reef habitat, and environmental variables that might affect the parasite either directly or indirectly on oyster reefs within 10 estuaries throughout this biogeographic range. Our analyses revealed that both between- and within-estuary-scale variation and host characteristics influenced L. panopaei prevalence. Several additional biotic and abiotic factors were positive predictors of infection, including predator abundance and the depth of water inundation over reefs at high tide. We demonstrate that in addition to host characteristics, biotic and abiotic community-level variables both serve as large-scale indicators of parasite dynamics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Justin; Karra, Satish; Nakshatrala, Kalyana B.
2016-07-26
It is well-known that the standard Galerkin formulation, which is often the formulation of choice under the finite element method for solving self-adjoint diffusion equations, does not meet maximum principles and the non-negative constraint for anisotropic diffusion equations. Recently, optimization-based methodologies that satisfy maximum principles and the non-negative constraint for steady-state and transient diffusion-type equations have been proposed. To date, these methodologies have been tested only on small-scale academic problems. The purpose of this paper is to systematically study the performance of the non-negative methodology in the context of high performance computing (HPC). PETSc and TAO libraries are, respectively, used for the parallel environment and optimization solvers. For large-scale problems, it is important for computational scientists to understand the computational performance of current algorithms available in these scientific libraries. The numerical experiments are conducted on state-of-the-art HPC systems, and a single-core performance model is used to better characterize the efficiency of the solvers. Furthermore, our studies indicate that the proposed non-negative computational framework for diffusion-type equations exhibits excellent strong scaling for real-world large-scale problems.
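As a rough illustration of the optimization-based, bound-constrained reformulation described above, the following Python sketch solves a small 1D diffusion problem as a quadratic program with a non-negativity bound. SciPy's L-BFGS-B is used purely as a stand-in for the PETSc/TAO solvers named in the abstract, and the mesh size, stiffness assembly, and sign-changing source are assumptions chosen only to show the mechanics; the maximum-principle violations studied in the paper arise for anisotropic diffusion in higher dimensions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy 1D steady diffusion K u = f with linear elements. Re-posing the solve
# as a bound-constrained quadratic program enforces u >= 0 even when the
# unconstrained Galerkin solution dips below zero (here triggered by a
# sign-changing source purely for demonstration).
n = 50
h = 1.0 / (n + 1)
main = 2.0 * np.ones(n) / h
off = -1.0 * np.ones(n - 1) / h
K = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)   # 1D stiffness matrix
f = np.sin(np.linspace(0, 3 * np.pi, n))                 # assumed source term

def energy(u):
    # quadratic energy whose minimizer solves K u = f
    return 0.5 * u @ K @ u - f @ u

def grad(u):
    return K @ u - f

u_unconstrained = np.linalg.solve(K, f)                  # may go negative
res = minimize(energy, np.zeros(n), jac=grad,
               bounds=[(0.0, None)] * n, method="L-BFGS-B")
print("min (unconstrained):", u_unconstrained.min())
print("min (constrained):  ", res.x.min())               # >= 0 by construction
```

In the paper's setting the same idea is carried out with TAO's bound-constrained optimizers on distributed PETSc vectors, which is what makes the strong-scaling study meaningful.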
To proceed in the investigation of potential effects of thousands of active pharmaceutical ingredients (API) which may enter the aquatic environment, a cohesive research strategy, specifically a prioritization is paramount. API are biologically active, with specific physiologica...
Macrophytes: Freshwater Forests of Lakes and Rivers.
ERIC Educational Resources Information Center
McDermid, Karla J.; Naiman, Robert J.
1983-01-01
Physical, chemical, and biological effects on macrophytes (aquatic plants) on the freshwater ecosystem are discussed. Research questions and issues related to these organisms are also discussed, including adaptations for survival in a wet environment, ecological consequences of large-scale macrophyte eradication, seasonal changes in plant…
Large scale tracking algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett
2015-01-01
Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low-resolution sensors, "blob" tracking is the norm. For higher-resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
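The report's own algorithms are not reproduced here, but the sketch below illustrates the core data-association step whose cost grows with target count: a single-frame global-nearest-neighbour assignment solved with scipy's linear_sum_assignment. The track and detection positions are synthetic assumptions; multi-hypothesis trackers expand this one-frame assignment into a tree of hypotheses over many frames, which is where the combinatorial explosion noted above comes from.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

# Hypothetical single-frame data association for many closely spaced objects:
# assign each existing track to at most one new detection by minimizing the
# total squared distance (global nearest neighbour).
rng = np.random.default_rng(0)
tracks = rng.uniform(0, 100, size=(500, 2))              # predicted track positions
detections = tracks + rng.normal(0, 0.5, tracks.shape)   # noisy measurements

cost = cdist(tracks, detections, metric="sqeuclidean")   # association cost matrix
row, col = linear_sum_assignment(cost)                    # O(n^3), no hypothesis tree
print("mean association error:", np.sqrt(cost[row, col]).mean())
```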
Nagaoka, Tomoaki; Watanabe, Soichi
2012-01-01
Electromagnetic simulation with an anatomically realistic computational human model using the finite-difference time domain (FDTD) method has recently been performed in a number of fields in biomedical engineering. To improve the method's calculation speed and realize large-scale computing with the computational human model, we adapt three-dimensional FDTD code to a multi-GPU cluster environment with Compute Unified Device Architecture and Message Passing Interface. Our multi-GPU cluster system consists of three nodes. Seven GPU boards (NVIDIA Tesla C2070) are mounted on each node. We examined the performance of the FDTD calculation in the multi-GPU cluster environment. We confirmed that the FDTD calculation on the multi-GPU cluster is faster than that on a single multi-GPU workstation, and we also found that the GPU cluster system calculates faster than a vector supercomputer. In addition, our GPU cluster system allowed us to perform the large-scale FDTD calculation because we were able to use over 100 GB of GPU memory.
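The abstract's CUDA/MPI code is not available here, but the communication pattern it relies on, exchanging boundary ("halo") layers between neighbouring subdomains every time step, can be sketched with mpi4py on the CPU. The subdomain size, step count, and the trivial stencil update below are assumptions for illustration only.

```python
import numpy as np
from mpi4py import MPI

# Toy 1D domain decomposition with halo exchange, the communication pattern a
# multi-GPU/MPI FDTD code needs between neighbouring subdomains.
comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 1000
field = np.zeros(n_local + 2)          # one ghost cell on each side
field[1:-1] = rank                     # dummy initial data

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(10):
    # exchange ghost cells with neighbours
    comm.Sendrecv(field[1:2], dest=left, recvbuf=field[-1:], source=right)
    comm.Sendrecv(field[-2:-1], dest=right, recvbuf=field[0:1], source=left)
    # simple explicit stencil update standing in for the FDTD field update
    field[1:-1] = 0.5 * (field[:-2] + field[2:])
```

In the multi-GPU setting, the same exchange happens between GPU buffers (staged through host memory or GPU-aware MPI), while the stencil update itself runs as a CUDA kernel.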
Hamadi, Hanadi; Probst, Janice C; Khan, Mahmud M; Bellinger, Jessica; Porter, Candace
2017-08-04
Home health aides (HHAs) work in a high-risk industry and experience high rates of work-related injury that have been significantly associated with reductions in worker and organisational productivity, quality and performance. The main objective of the study was to examine how work environment and ergonomic factors affect HHA risk for reporting occupational injuries. We used cross-sectional analysis of data from the 2007 National Home Health and Hospice Aide Survey (NHHAS). The study sample consisted of a nationally representative sample of home health aides (n=3,377) with a 76.6% response rate. We used two scales: (1) a Work Environment Scale and (2) an Ergonomic Scale. Univariate and bivariate analyses were conducted to describe HHA work-related injury across individual, job and organisational factors. To measure scale reliability, Cronbach's alphas were calculated. Multivariable logistic regression was used to determine predictors of reported occupational injury. In terms of the Work Environment Scale, the injury risk was decreased in HHAs who did not consistently care for the same patients (OR=0.96, 95% CI: 0.53 to 1.73). In terms of the Ergonomic Scale, the injury risk was decreased only in HHAs who reported not needing any other devices for job safety (OR=0.30, 95% CI: 0.15 to 0.61). No other Work Environment or Ergonomic Scale factors were associated with HHAs' risk of injury. This study has important implications for a segment of the workforce that has received little published study to date and for which demand is expected to grow substantially. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
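For readers unfamiliar with how odds ratios such as OR=0.30 are obtained, the sketch below fits a multivariable logistic regression on synthetic data and exponentiates the coefficients. The variable names, effect sizes, and sample are invented and do not reproduce the NHHAS analysis, which also involves survey weighting not shown here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data standing in for NHHAS-style survey responses: a binary
# injury outcome plus binary work-environment and ergonomic indicators.
rng = np.random.default_rng(1)
n = 3377
df = pd.DataFrame({
    "same_patients": rng.integers(0, 2, n),
    "needs_no_devices": rng.integers(0, 2, n),
})
# assumed underlying model used only to generate plausible fake outcomes
logit = -2.0 - 1.2 * df["needs_no_devices"] - 0.05 * df["same_patients"]
df["injured"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["same_patients", "needs_no_devices"]])
fit = sm.Logit(df["injured"], X).fit(disp=0)
odds_ratios = np.exp(fit.params)             # exponentiated coefficients = ORs
ci = np.exp(fit.conf_int())                  # 95% confidence intervals
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
```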
MIGHTEE: The MeerKAT International GHz Tiered Extragalactic Exploration
NASA Astrophysics Data System (ADS)
Taylor, A. Russ; Jarvis, Matt
2017-05-01
The MeerKAT telescope is the precursor of the Square Kilometre Array mid-frequency dish array to be deployed later this decade on the African continent. MIGHTEE is one of the MeerKAT large survey projects designed to pathfind SKA key science in cosmology and galaxy evolution. Through a tiered radio continuum deep imaging project including several fields totaling 20 square degrees to microJy sensitivities and an ultra-deep image of a single 1 square degree field of view, MIGHTEE will explore dark matter and large-scale structure, the evolution of galaxies, including AGN activity and star formation as a function of cosmic time and environment, the emergence and evolution of magnetic fields in galaxies, and the magnetic counterpart to the large-scale structure of the universe.
Robust Coordination for Large Sets of Simple Rovers
NASA Technical Reports Server (NTRS)
Tumer, Kagan; Agogino, Adrian
2006-01-01
The ability to coordinate sets of rovers in an unknown environment is critical to the long-term success of many of NASA's exploration missions. Such coordination policies must have the ability to adapt in unmodeled or partially modeled domains and must be robust against environmental noise and rover failures. In addition, such coordination policies must accommodate a large number of rovers, without excessive and burdensome hand-tuning. In this paper we present a distributed coordination method that addresses these issues in the domain of controlling a set of simple rovers. The application of these methods allows reliable and efficient robotic exploration in dangerous, dynamic, and previously unexplored domains. Most control policies for space missions are directly programmed by engineers or created through the use of planning tools, and are appropriate for single rover missions or missions requiring the coordination of a small number of rovers. Such methods typically require significant amounts of domain knowledge, and are difficult to scale to large numbers of rovers. The method described in this article aims to address cases where a large number of rovers need to coordinate to solve a complex time-dependent problem in a noisy environment. In this approach, each rover decomposes a global utility, representing the overall goal of the system, into rover-specific utilities that properly assign credit to the rover's actions. Each rover then has the responsibility to create a control policy that maximizes its own rover-specific utility. We show a method of creating rover utilities that are "aligned" with the global utility, such that when the rovers maximize their own utility, they also maximize the global utility. In addition, we show that our method creates rover utilities that allow the rovers to create their control policies quickly and reliably. Our distributed learning method allows large sets of rovers to be used in unmodeled domains, while providing robustness against rover failures and changing environments. In experimental simulations we show that our method scales well with large numbers of rovers in addition to being robust against noisy sensor inputs and noisy servo control. The results show that our method is able to scale to large numbers of rovers and achieves up to 400% performance improvement over standard machine learning methods.
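One common way to build rover-specific utilities that are aligned with a global utility, in the spirit described above, is the difference utility D_i = G(z) - G(z_{-i}), i.e. a rover's marginal contribution to the global score. The sketch below computes it for a toy point-of-interest observation task; the POI layout, observation radius, and utility definition are assumptions, not the paper's exact formulation.

```python
import numpy as np

# Difference utility: D_i = G(z) - G(z with rover i's contribution removed).
# Here G counts points of interest (POIs) observed by at least one rover.
rng = np.random.default_rng(2)
pois = rng.uniform(0, 10, size=(20, 2))      # assumed POI locations
rovers = rng.uniform(0, 10, size=(5, 2))     # assumed rover positions
obs_radius = 2.0

def global_utility(rover_positions):
    if len(rover_positions) == 0:
        return 0.0
    d = np.linalg.norm(pois[:, None, :] - rover_positions[None, :, :], axis=-1)
    return float(np.sum(d.min(axis=1) < obs_radius))   # POIs seen by any rover

G = global_utility(rovers)
for i in range(len(rovers)):
    G_without_i = global_utility(np.delete(rovers, i, axis=0))
    D_i = G - G_without_i      # rover i's marginal contribution to G
    print(f"rover {i}: D = {D_i}")
```

Because D_i changes only when rover i's own actions change the global outcome, a rover that learns to maximize D_i also pushes G upward, which is the alignment property the abstract refers to.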
Challenges in engineering large customized bone constructs.
Forrestal, David P; Klein, Travis J; Woodruff, Maria A
2017-06-01
The ability to treat large tissue defects with customized, patient-specific scaffolds is one of the most exciting applications in the tissue engineering field. While an increasing number of modestly sized tissue engineering solutions are making the transition to clinical use, successfully scaling up to large scaffolds with customized geometry is proving to be a considerable challenge. Managing often conflicting requirements of cell placement, structural integrity, and a hydrodynamic environment supportive of cell culture throughout the entire thickness of the scaffold has driven the continued development of many techniques used in the production, culturing, and characterization of these scaffolds. This review explores a range of technologies and methods relevant to the design and manufacture of large, anatomically accurate tissue-engineered scaffolds with a focus on the interaction of manufactured scaffolds with the dynamic tissue culture fluid environment. Biotechnol. Bioeng. 2017;114: 1129-1139. © 2016 Wiley Periodicals, Inc.
The food environment and adult obesity in US metropolitan areas.
Michimi, Akihiko; Wimberly, Michael C
2015-11-26
This research examines the larger-scale associations between obesity and food environments in metropolitan areas in the United States (US). The US Census County Business Patterns dataset for 2011 was used to construct various indices of food environments for selected metropolitan areas. The numbers of employees engaged in supermarkets, convenience stores, full service restaurants, fast food restaurants, and snack/coffee shops were standardised using the location quotients, and factor analysis was used to produce two uncorrelated factors measuring food environments. Data on obesity were obtained from the 2011 Behavioral Risk Factor Surveillance System. Individual level obesity measures were linked to the metropolitan area level food environment factors. Models were fitted using generalised estimating equations to control for metropolitan area level intra-correlation and individual level sociodemographic characteristics. It was found that adults residing in cities with a large share of supermarket and full-service restaurant workers were less likely to be obese, while adults residing in cities with a large share of convenience store and fast food restaurant workers were more likely to be obese. Supermarkets and full-service restaurant workers are concentrated in the Northeast and West of the US, where obesity prevalence is relatively lower, while convenience stores and fast-food restaurant workers are concentrated in the South and Midwest, where obesity prevalence is relatively higher. The food environment landscapes measured at the metropolitan area level explain the continental-scale patterns of obesity prevalence. The types of food that are readily available and widely served may translate into obesity disparities across metropolitan areas.
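The location-quotient standardisation mentioned above can be illustrated in a few lines of pandas. The employment counts are invented, and the denominator here is total food-sector employment in each metro area rather than all employment, a simplification of the published method.

```python
import pandas as pd

# Hypothetical County Business Patterns-style employment counts by metro area
# and food-retail category. The location quotient compares a category's local
# employment share with its national share.
emp = pd.DataFrame(
    {"supermarkets": [1200, 300, 800],
     "fast_food": [900, 1100, 700],
     "full_service": [1500, 600, 900]},
    index=["Metro A", "Metro B", "Metro C"])

local_share = emp.div(emp.sum(axis=1), axis=0)       # share within each metro
national_share = emp.sum(axis=0) / emp.values.sum()  # share across all metros
location_quotient = local_share / national_share     # LQ > 1: over-represented
print(location_quotient.round(2))
```

In the study, such standardized measures across several food categories are then reduced to two uncorrelated factors via factor analysis before being linked to the individual-level obesity data.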
NASA Astrophysics Data System (ADS)
Handlos, Zachary J.
Though considerable research attention has been devoted to examination of the Northern Hemispheric polar and subtropical jet streams, relatively little has been directed toward understanding the circumstances that conspire to produce the relatively rare vertical superposition of these usually separate features. This dissertation investigates the structure and evolution of large-scale environments associated with jet superposition events in the northwest Pacific. An objective identification scheme, using NCEP/NCAR Reanalysis 1 data, is employed to identify all jet superpositions in the west Pacific (30-40°N, 135-175°E) for boreal winters (DJF) between 1979/80 and 2009/10. The analysis reveals that environments conducive to west Pacific jet superposition share several large-scale features usually associated with East Asian Winter Monsoon (EAWM) northerly cold surges, including the presence of an enhanced Hadley Cell-like circulation within the jet entrance region. It is further demonstrated that several EAWM indices are statistically significantly correlated with jet superposition frequency in the west Pacific. The life cycle of EAWM cold surges promotes interaction between tropical convection and internal jet dynamics. Low potential vorticity (PV), high-θe tropical boundary layer air, exhausted by anomalous convection in the west Pacific lower latitudes, is advected poleward towards the equatorward side of the jet in upper-tropospheric isentropic layers, resulting in anomalous anticyclonic wind shear that accelerates the jet. This, along with geostrophic cold air advection in the left jet entrance region that drives the polar tropopause downward through the jet core, promotes the development of the deep, vertical PV wall characteristic of superposed jets. West Pacific jet superpositions preferentially form within an environment favoring the aforementioned characteristics regardless of EAWM seasonal strength. Post-superposition, it is shown that the west Pacific jet extends eastward and is associated with an upper-tropospheric cyclonic (anticyclonic) anomaly in its left (right) exit region. A downstream ridge is present over northwest Canada, and within the strong EAWM environment, a wavier flow over North America is observed relative to the neutral EAWM environment. Preliminary investigation of the two weak EAWM season superpositions reveals a Kona Low-type feature post-superposition. This is associated with anomalous convection reminiscent of an atmospheric river southwest of Mexico.
Human preferences for sexually dimorphic faces may be evolutionarily novel
Scott, Isabel M.; Clark, Andrew P.; Josephson, Steven C.; Boyette, Adam H.; Cuthill, Innes C.; Fried, Ruby L.; Gibson, Mhairi A.; Hewlett, Barry S.; Jamieson, Mark; Jankowiak, William; Honey, P. Lynne; Huang, Zejun; Liebert, Melissa A.; Purzycki, Benjamin G.; Shaver, John H.; Snodgrass, J. Josh; Sosis, Richard; Sugiyama, Lawrence S.; Swami, Viren; Yu, Douglas W.; Zhao, Yangke; Penton-Voak, Ian S.
2014-01-01
A large literature proposes that preferences for exaggerated sex typicality in human faces (masculinity/femininity) reflect a long evolutionary history of sexual and social selection. This proposal implies that dimorphism was important to judgments of attractiveness and personality in ancestral environments. It is difficult to evaluate, however, because most available data come from large-scale, industrialized, urban populations. Here, we report the results for 12 populations with very diverse levels of economic development. Surprisingly, preferences for exaggerated sex-specific traits are only found in the novel, highly developed environments. Similarly, perceptions that masculine males look aggressive increase strongly with development and, specifically, urbanization. These data challenge the hypothesis that facial dimorphism was an important ancestral signal of heritable mate value. One possibility is that highly developed environments provide novel opportunities to discern relationships between facial traits and behavior by exposing individuals to large numbers of unfamiliar faces, revealing patterns too subtle to detect with smaller samples. PMID:25246593
Polarization of the prompt gamma-ray emission from the gamma-ray burst of 6 December 2002.
Coburn, Wayne; Boggs, Steven E
2003-05-22
Observations of the afterglows of gamma-ray bursts (GRBs) have revealed that they lie at cosmological distances, and so correspond to the release of an enormous amount of energy. The nature of the central engine that powers these events and the prompt gamma-ray emission mechanism itself remain enigmatic because, once a relativistic fireball is created, the physics of the afterglow is insensitive to the nature of the progenitor. Here we report the discovery of linear polarization in the prompt gamma-ray emission from GRB021206, which indicates that it is synchrotron emission from relativistic electrons in a strong magnetic field. The polarization is at the theoretical maximum, which requires a uniform, large-scale magnetic field over the gamma-ray emission region. A large-scale magnetic field constrains possible progenitors to those either having or producing organized fields. We suggest that the large magnetic energy densities in the progenitor environment (comparable to the kinetic energy densities of the fireball), combined with the large-scale structure of the field, indicate that magnetic fields drive the GRB explosion.
NASA Astrophysics Data System (ADS)
Wang, Yong; Zhang, Damao; Liu, Xiaohong; Wang, Zhien
2018-01-01
Mixed-phase clouds containing both liquid droplets and ice particles occur frequently at high latitudes and in the midlatitude storm track regions. Simulations of the cloud phase partitioning between liquid and ice hydrometeors in state-of-the-art global climate models are still associated with large biases. In this study, the phase partitioning in terms of liquid mass phase ratio (MPRliq, defined as the ratio of liquid mass to total condensed water mass) simulated from the NCAR Community Atmosphere Model version 5 (CAM5) is evaluated against the observational data from A-Train satellite remote sensors. Modeled MPRliq is significantly lower than observations on the global scale, especially in the Southern Hemisphere (e.g., Southern Ocean and the Antarctic). Sensitivity tests with CAM5 are conducted to investigate the distinct contributions of heterogeneous ice nucleation, shallow cumulus detrainment, and large-scale environment (e.g., winds, temperature, and water vapor) to the low MPRliq biases. Our results show that an aerosol-aware ice nucleation parameterization increases the MPRliq especially at temperatures colder than -20°C and significantly improves the model agreements with observations in the Polar regions in summer. The decrease of threshold temperature over which all detrained cloud water is liquid from 268 to 253 K enhances the MPRliq and improves the MPRliq mostly over the Southern Ocean. By constraining water vapor in CAM5 toward reanalysis, modeled low biases in many geographical regions are largely reduced through a significant decrease of cloud ice mass mixing ratio.
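The phase-ratio diagnostic above follows directly from its definition; the brief sketch below computes MPRliq from hypothetical liquid and ice mixing ratios, guarding against grid cells with no condensate. The array values are invented and only stand in for model or satellite-retrieved fields.

```python
import numpy as np

# Liquid mass phase ratio as defined in the abstract:
# MPR_liq = liquid water mass / (liquid + ice) condensed water mass.
# Hypothetical gridded mixing ratios (kg/kg) stand in for model output.
q_liq = np.array([[0.2e-3, 0.05e-3], [0.0, 0.15e-3]])
q_ice = np.array([[0.1e-3, 0.20e-3], [0.3e-3, 0.05e-3]])

total = q_liq + q_ice
mpr_liq = np.divide(q_liq, total, out=np.zeros_like(q_liq), where=total > 0)
print(mpr_liq)   # 1.0 = all liquid, 0.0 = all ice
```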
Prediction of Acoustic Environments from Horizontal Rocket Firings
NASA Technical Reports Server (NTRS)
Giacomoni, Clothilde
2014-01-01
In recent years, advances in research and engineering have led to more powerful launch vehicles which can reach areas of space not yet explored. These more powerful vehicles yield acoustic environments potentially destructive to the vehicle or surrounding structures. Therefore, it has become increasingly important to be able to predict the acoustic environments created by these vehicles in order to avoid structural and/or component failure. The current industry standard technique for predicting launch-induced acoustic environments was developed by Eldred in the early 1970's and is published in NASA SP-8072 [1]. Recent work [2] has shown Eldred's technique to be inaccurate for current state-of-the-art launch vehicles. Due to the high cost of full-scale and even sub-scale rocket experiments, very little rocket noise data is available. Furthermore, much of the work thought to be applicable to rocket noise has been done with heated jets. Tam [3,4] has done an extensive amount of research on jets of different nozzle exit shape, diameter, velocity, and temperature. Though the values of these parameters, especially exit velocity and temperature, are often very low compared to their values in rockets, a lot can be learned about rocket noise from the jet noise literature. The turbulent nature of jet and rocket exhausts is quite similar. Both exhausts contain turbulent structures of varying scale, termed the fine- and large-scale turbulence by Tam. The fine-scale turbulence is due to small eddies from the jet plume interacting with the ambient atmosphere. According to Tam et al., the noise radiated by this envelope of small-scale turbulence is statistically isotropic. Hence, one would expect the noise from the small-scale turbulence of the jet to be nearly omni-directional. The coherent nature of the large-scale turbulence results in interference of the noise radiated from different spatial locations within the jet. This interference, whether constructive or destructive, results in highly directional noise radiation. Tam [3] has proposed a model to predict the acoustic environment due to jets and, while it works extremely well for jets, it was found to be inappropriate for rockets [8]. A model to predict the far-field acoustic environment due to a launch vehicle, incorporating concepts from both Eldred and Tam, was created. This was done using five sets of horizontally fired rocket data obtained between 2008 and 2012. Three of these rockets use solid propellant and two use liquid propellant. Through scaling analysis, it is shown that liquid and solid rocket motors exhibit similar spectra at similar amplitudes. This model is accurate for these five data sets within 5 dB of the measured data for receiver angles of 30° to 160° (with respect to the downstream exhaust centerline). The model uses the following vehicle parameters: nozzle exit diameter and velocity, radial distance from source to receiver, receiver angle, mass flow rate, and acoustic efficiency.
NASA Technical Reports Server (NTRS)
Huang, Jingfeng; Hsu, N. Christina; Tsay, Si-Chee; Zhang, Chidong; Jeong, Myeong Jae; Gautam, Ritesh; Bettenhausen, Corey; Sayer, Andrew M.; Hansell, Richard A.; Liu, Xiaohong;
2012-01-01
One of the seven scientific areas of interest of the 7-SEAS field campaign is to evaluate the impact of aerosol on cloud and precipitation (http://7-seas.gsfc.nasa.gov). However, large-scale covariability between aerosol, cloud and precipitation is complicated not only by the ambient environment and a variety of aerosol effects, but also by effects from rain washout and climate factors. This study characterizes large-scale aerosol-cloud-precipitation covariability through synergy of long-term multi-sensor satellite observations with model simulations over the 7-SEAS region [10°S-30°N, 95°E-130°E]. Results show that climate factors such as ENSO significantly modulate aerosol and precipitation over the region simultaneously. After removal of climate factor effects, aerosol and precipitation are significantly anti-correlated over the southern part of the region, where high aerosol loading is associated with overall reduced total precipitation with intensified rain rates and decreased rain frequency, decreased tropospheric latent heating, suppressed cloud top height and increased outgoing longwave radiation, and enhanced clear-sky shortwave TOA flux but reduced all-sky shortwave TOA flux in deep convective regimes; but such covariability becomes less notable over the northern counterpart of the region where low-level stratus are found. Using CO as a proxy of biomass burning aerosols to minimize the washout effect, large-scale covariability between CO and precipitation was also investigated and similar large-scale covariability was observed. Model simulations with NCAR CAM5 were found to show similar effects to observations in the spatio-temporal patterns. Results from both observations and simulations are valuable for improving our understanding of this region's meteorological system and the roles of aerosol within it. Key words: aerosol; precipitation; large-scale covariability; aerosol effects; washout; climate factors; 7-SEAS; CO; CAM5
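A minimal sketch of the "removal of climate factor effects" step: regress each series on a climate index (ENSO here) and correlate the residuals. The index and series are synthetic, and the published analysis may use a different regression or filtering approach.

```python
import numpy as np

# Removing a climate factor (e.g. an ENSO index) from two time series by
# linear regression before correlating the residuals, as a simple stand-in
# for the climate-factor adjustment described above.
rng = np.random.default_rng(4)
n = 240                                   # e.g. 20 years of monthly data (assumed)
enso = rng.normal(size=n)
aerosol = 0.8 * enso + rng.normal(size=n)
precip = -0.5 * enso + rng.normal(size=n)

def residual(y, x):
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

raw_corr = np.corrcoef(aerosol, precip)[0, 1]
adj_corr = np.corrcoef(residual(aerosol, enso), residual(precip, enso))[0, 1]
print(f"raw correlation {raw_corr:.2f}, ENSO-removed correlation {adj_corr:.2f}")
```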
Experimental investigation of large-scale vortices in a freely spreading gravity current
NASA Astrophysics Data System (ADS)
Yuan, Yeping; Horner-Devine, Alexander R.
2017-10-01
A series of laboratory experiments are presented to compare the dynamics of constant-source buoyant gravity currents propagating into laterally confined (channelized) and unconfined (spreading) environments. The plan-form structure of the spreading current and the vertical density and velocity structures on the interface are quantified using the optical thickness method and a combined particle image velocimetry and planar laser-induced fluorescence method, respectively. With lateral boundaries, the buoyant current thickness is approximately constant and Kelvin-Helmholtz instabilities are generated within the shear layer. The buoyant current structure is significantly different in the spreading case. As the current spreads laterally, nonlinear large-scale vortex structures are observed at the interface, which maintain a coherent shape as they propagate away from the source. These structures are continuously generated near the river mouth, have amplitudes close to the buoyant layer thickness, and propagate offshore at speeds approximately equal to the internal wave speed. The observed depth and propagation speed of the instabilities match well with the fastest growing mode predicted by linear stability analysis, but with a shorter wavelength. The spreading flows have much higher vorticity, which is aggregated within the large-scale structures. Secondary instabilities are generated on the leading edge of the braids between the large-scale vortex structures and ultimately break and mix on the lee side of the structures. Analysis of the vortex dynamics shows that lateral stretching intensifies the vorticity in the spreading currents, contributing to higher vorticity within the large-scale structures in the buoyant plume. The large-scale instabilities and vortex structures observed in the present study provide new insights into the origin of internal frontal structures frequently observed in coastal river plumes.
NASA Technical Reports Server (NTRS)
Parsons, David S.; Ordway, David; Johnson, Kenneth
2013-01-01
This experimental study seeks to quantify the impact various composite parameters have on the structural response of a composite structure in a pyroshock environment. The prediction of an aerospace structure's response to pyroshock induced loading is largely dependent on empirical databases created from collections of development and flight test data. While there is significant structural response data due to pyroshock induced loading for metallic structures, there is much less data available for composite structures. One challenge of developing a composite pyroshock response database as well as empirical prediction methods for composite structures is the large number of parameters associated with composite materials. This experimental study uses data from a test series planned using design of experiments (DOE) methods. Statistical analysis methods are then used to identify which composite material parameters most greatly influence a flat composite panel's structural response to pyroshock induced loading. The parameters considered are panel thickness, type of ply, ply orientation, and pyroshock level induced into the panel. The results of this test will aid in future large scale testing by eliminating insignificant parameters as well as aid in the development of empirical scaling methods for composite structures' response to pyroshock induced loading.
Factors influencing the spreading of a low-discharge river plume
NASA Astrophysics Data System (ADS)
Mestres, M.; Sierra, J. P.; Sánchez-Arcilla, A.
2007-09-01
Coastal plumes resulting from the continuous discharge of brackish or fresh river water are common features of continental and shelf seas. They are important for several aspects of the coastal environment, and can influence the local socio-economy to some degree. It is known from many studies that the evolution of plumes depends on various factors, such as the local bathymetry, hydrodynamics and meteorological conditions; most of these works, however, have focused on medium- to large-scale rivers, while the smaller-scale discharges commonly found in the microtidal environments of the Mediterranean Sea have been less studied. This paper is centred on the behaviour of a freshwater plume arising from one such outflow, in terms of both the physical configuration of the waterbody and the characteristics of the main driving mechanisms (discharge rate and wind stress). The modelled cases correspond to an open shallow bay, limited at one end by a large headland, into which a typical Mediterranean waterway discharges. This particular setup is representative of a number of different bays existing on the eastern Spanish coast. The numerical results highlight the large influence of the bay's topography on the river plume's extension and inner structure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kraus, Michaela; Nickeler, Dieter H.; Liimets, Tiina
The Galactic object MWC 137 has been suggested to belong to the group of B[e] supergiants. However, with its large-scale optical bipolar ring nebula and high-velocity jet and knots, it is a rather atypical representative of this class. We performed multiwavelength observations spanning the optical to radio regimes. Based on optical imaging and long-slit spectroscopic data, we found that the northern parts of the large-scale nebula are predominantly blueshifted, while the southern regions appear mostly redshifted. We developed a geometrical model consisting of two double cones. Although various observational features can be approximated with such a scenario, the observed velocity pattern is more complex. Using near-infrared integral-field unit spectroscopy, we studied the hot molecular gas in the vicinity of the star. The emission from the hot CO gas arises in a small-scale disk revolving around the star on Keplerian orbits. Although the disk itself cannot be spatially resolved, its emission is reflected by the dust arranged in arc-like structures and the clumps surrounding MWC 137 on small scales. In the radio regime, we mapped the cold molecular gas in the outskirts of the optical nebula. We found that large amounts of cool molecular gas and warm dust embrace the optical nebula in the east, south, and west. No cold gas or dust was detected in the north and northwestern regions. Despite the new insights into the nebula kinematics gained from our studies, the real formation scenario of the large-scale nebula remains an open issue.
NASA Astrophysics Data System (ADS)
Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.
2018-05-01
The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. To this purpose, the RD53 Collaboration has developed for the ATLAS and CMS experiments a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment has first guided the design of digital architectures, optimized for processing and buffering very high particle rates, and secondly how it has been reused for the functional verification of the first large scale demonstrator chip designed by the collaboration, which has recently been submitted.
A Computational framework for telemedicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foster, I.; von Laszewski, G.; Thiruvathukal, G. K.
1998-07-01
Emerging telemedicine applications require the ability to exploit diverse and geographically distributed resources. High-speed networks are used to integrate advanced visualization devices, sophisticated instruments, large databases, archival storage devices, PCs, workstations, and supercomputers. This form of telemedical environment is similar to networked virtual supercomputers, also known as metacomputers. Metacomputers are already being used in many scientific application areas. In this article, we analyze requirements necessary for a telemedical computing infrastructure and compare them with requirements found in a typical metacomputing environment. We will show that metacomputing environments can be used to enable a more powerful and unified computational infrastructure for telemedicine. The Globus metacomputing toolkit can provide the necessary low-level mechanisms to enable a large-scale telemedical infrastructure. The Globus toolkit components are designed in a modular fashion and can be extended to support the specific requirements of telemedicine.
What Determines Upscale Growth of Oceanic Convection into MCSs?
NASA Astrophysics Data System (ADS)
Zipser, E. J.
2017-12-01
Over tropical oceans, widely scattered convection of various depths may or may not grow upscale into mesoscale convective systems (MCSs). But what distinguishes the large-scale environment that favors such upscale growth from that favoring "unorganized", scattered convection? Is it some combination of large-scale low-level convergence and ascending motion, combined with sufficient instability? We recently put this to a test with ERA-I reanalysis data, with disappointing results. The "usual suspects" of total column water vapor, large-scale ascent, and CAPE may all be required to some extent, but their differences between large MCSs and scattered convection are small. The main positive results from this work (already published) demonstrate that the strength of convection is well correlated with the size and perhaps "organization" of convective features over tropical oceans, in contrast to tropical land, where strong convection is common for large or small convective features. So, important questions remain: Over tropical oceans, how should we define "organized" convection? By size of the precipitation area? And what environmental conditions lead to larger and better organized MCSs? Some recent attempts to answer these questions will be described, but good answers may require more data, and more insights.
Seismic and source characteristics of large chemical explosions. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adushkin, V.V.; Kostuchenko, V.N.; Pernik, L.M.
From its establishment in 1947, the Institute for Dynamics of the Geospheres RAS (formerly the Special Sector of the Institute for Physics of the Earth, RAS) has provided scientific observations of the effects of nuclear explosions, as well as large-scale detonations of high explosives, on the environment. This report presents the principal results of instrumental observations obtained from various large-scale chemical explosions conducted in the former Soviet Union between 1957 and 1989. In keeping with the principal aim of the work, tamped and equivalent chemical explosions were selected with total weights from several hundred to several thousand tons. In particular, the selected explosions were intended to study the scaling law for excavation explosions and the seismic effect of tamped explosions, and to support dam construction for hydropower stations and soil melioration. Instrumental data on surface explosions of total weight in the same range, aimed at testing military technology and special objects, are not included.
SPIN ALIGNMENTS OF SPIRAL GALAXIES WITHIN THE LARGE-SCALE STRUCTURE FROM SDSS DR7
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Youcai; Yang, Xiaohu; Luo, Wentao
Using a sample of spiral galaxies selected from the Sloan Digital Sky Survey Data Release 7 and Galaxy Zoo 2, we investigate the alignment of spin axes of spiral galaxies with their surrounding large-scale structure, which is characterized by the large-scale tidal field reconstructed from the data using galaxy groups above a certain mass threshold. We find that the spin axes only have weak tendencies to be aligned with (or perpendicular to) the intermediate (or minor) axis of the local tidal tensor. The signal is the strongest in a cluster environment where all three eigenvalues of the local tidal tensor are positive. Compared to the alignments between halo spins and the local tidal field obtained in N-body simulations, the above observational results are in best agreement with those for the spins of inner regions of halos, suggesting that the disk material traces the angular momentum of dark matter halos in the inner regions.
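The alignment statistic described above reduces to the angle between a spin vector and the eigenvectors of the local tidal tensor. The sketch below shows the computation for a single hypothetical galaxy; the tensor and spin are random stand-ins, and for random orientations |cos θ| averages 0.5, the null against which weak alignment signals are measured.

```python
import numpy as np

# Alignment of a galaxy spin vector with the eigenvectors of the local tidal
# tensor T_ij (symmetric 3x3). Eigenvalues sorted in descending order label
# the major, intermediate and minor axes.
rng = np.random.default_rng(3)
T = rng.normal(size=(3, 3))
T = 0.5 * (T + T.T)                       # hypothetical symmetric tidal tensor
spin = rng.normal(size=3)
spin /= np.linalg.norm(spin)              # unit spin vector

eigvals, eigvecs = np.linalg.eigh(T)      # returned in ascending order
order = np.argsort(eigvals)[::-1]
labels = ["major", "intermediate", "minor"]
for lab, idx in zip(labels, order):
    cos_theta = abs(spin @ eigvecs[:, idx])
    print(f"{lab} axis: |cos theta| = {cos_theta:.2f}")
```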
Carbon nanotubes (CNTs) have been incorporated into numerous consumer products, and have also been employed in various industrial areas because of their extraordinary properties. The large scale production and wide applications of CNTs make their release into the environment a ma...
Optimizing C-17 Pacific Basing
2014-05-01
Starlifter and C-5 Galaxy turbofan-powered aircraft (Owen, 2014). Pax Americana was characterized by a busy and steady operating environment for US global...center of the Pacific as well as the north and western Pacific rim (Owen, 2014). Turbofan airlifters began to be integrated into the large-scale
Ecological Effects of Weather Modification: A Problem Analysis.
ERIC Educational Resources Information Center
Cooper, Charles F.; Jolly, William C.
This publication reviews the potential hazards to the environment of weather modification techniques as they eventually become capable of producing large scale weather pattern modifications. Such weather modifications could result in ecological changes which would generally require several years to be fully evident, including the alteration of…
The National Near-Road Mobile Source Air Toxics Study: Las Vegas
EPA, in collaboration with FHWA, has been involved in a large-scale monitoring research study in an effort to characterize highway vehicle emissions in a near-road environment. The pollutants of interest include particulate matter with aerodynamic diameter less than 2.5 microns ...
The Electronic Librarian: Inching Towards the Revolution
ERIC Educational Resources Information Center
Cuesta, Emerita M.
2005-01-01
Electronic resources are transforming the way librarians work. New technological skills have been added to the librarian's tool kit. Some libraries have undertaken large-scale organizational reconfigurations to meet the challenges of the digital environment. Yet libraries still rely on traditional functions such as acquisitions, cataloging, and…
ERIC Educational Resources Information Center
Crane, Earl Newell
2013-01-01
The research problem that inspired this effort is the challenge of managing the security of systems in large-scale heterogeneous networked environments. Human intervention is slow and limited: humans operate at much slower speeds than networked computer communications and there are few humans associated with each network. Enabling each node in the…
Test and Evaluation of Architecture-Aware Compiler Environment
2011-11-01
biology, medicine, social sciences, and security applications. Challenges include extremely large graphs (the Facebook friend network has over...five challenge problems empirically, exploring their scaling properties, computation and datatype needs, memory behavior, and temporal behavior
Measurement-Driven Characterization of the Mobile Environment
ERIC Educational Resources Information Center
Soroush, Hamed
2013-01-01
The concurrent deployment of high-quality wireless networks and large-scale cloud services offers the promise of secure ubiquitous access to seemingly limitless amount of content. However, as users' expectations have grown more demanding, the performance and connectivity failures endemic to the existing networking infrastructure have become more…
SOIL NITRATE AND AMMONIUM THROUGH 2 YEARS OF SELECTIVE HERBIVORY AND CHRONIC NITROGEN ENRICHMENT
The effects of increased amounts and flux of bioavailable nitrogenous compounds in the ecosystem are of great interest to ecological researchers and a longstanding concern to land managers. Excess nitrogen in the environment is associated with many large-scale environmental concer...
Power Grid Data Analysis with R and Hadoop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hafen, Ryan P.; Gibson, Tara D.; Kleese van Dam, Kerstin
This book chapter presents an approach to analysis of large-scale time-series sensor information based on our experience with power grid data. We use the R-Hadoop Integrated Programming Environment (RHIPE) to analyze a 2TB data set and present code and results for this analysis.
Leveraging Web-Based Environments for Mass Atrocity Prevention
ERIC Educational Resources Information Center
Harding, Tucker B.; Whitlock, Mark A.
2013-01-01
A growing literature exploring large-scale, identity-based political violence, including mass killing and genocide, debates the plausibility of, and prospects for, early warning and prevention. An extension of the debate involves the prospects for creating educational experiences that result in more sophisticated analytical products that enhance…
REDUCTIVE DEHALOGENATION OF ORGANIC CONTAMINANTS IN SOILS AND GROUND WATER
Introduction and large scale production of synthetic halogenated organic chemicals over the last 50 years has resulted in a group of contaminants which tend to persist in the environment and resist both biotic and abiotic degradation. The low solubility of these types of contamin...
NASA Astrophysics Data System (ADS)
Sotiropoulos, Fotis; Khosronejad, Ali
2016-02-01
Sand waves arise in subaqueous and Aeolian environments as the result of the complex interaction between turbulent flows and mobile sand beds. They occur across a wide range of spatial scales, evolve at temporal scales much slower than the integral scale of the transporting turbulent flow, dominate river morphodynamics, undermine streambank stability and infrastructure during flooding, and sculpt terrestrial and extraterrestrial landscapes. In this paper, we present the vision for our work over the last ten years, which has sought to develop computational tools capable of simulating the coupled interactions of sand waves with turbulence across the broad range of relevant scales: from small-scale ripples in laboratory flumes to mega-dunes in large rivers. We review the computational advances that have enabled us to simulate the genesis and long-term evolution of arbitrarily large and complex sand dunes in turbulent flows using large-eddy simulation and summarize numerous novel physical insights derived from our simulations. Our findings explain the role of turbulent sweeps in the near-bed region as the primary mechanism for destabilizing the sand bed, show that the seeds of the emergent structure in dune fields lie in the heterogeneity of the turbulence and bed shear stress fluctuations over the initially flat bed, and elucidate how large dunes at equilibrium give rise to energetic coherent structures and modify the spectra of turbulence. We also discuss future challenges and our vision for advancing a data-driven simulation-based engineering science approach for site-specific simulations of river flooding.
Stochastic Reconnection for Large Magnetic Prandtl Numbers
NASA Astrophysics Data System (ADS)
Jafari, Amir; Vishniac, Ethan T.; Kowal, Grzegorz; Lazarian, Alex
2018-06-01
We consider stochastic magnetic reconnection in high-β plasmas with large magnetic Prandtl numbers, Pr m > 1. For large Pr m, field line stochasticity is suppressed at very small scales, impeding diffusion. In addition, viscosity suppresses very small-scale differential motions and therefore also the local reconnection. Here we consider the effect of high magnetic Prandtl numbers on the global reconnection rate in a turbulent medium and provide a diffusion equation for the magnetic field lines considering both resistive and viscous dissipation. We find that the width of the outflow region is unaffected unless Pr m is exponentially larger than the Reynolds number Re. The ejection velocity of matter from the reconnection region is also unaffected by viscosity unless Re ∼ 1. By these criteria the reconnection rate in typical astrophysical systems is almost independent of viscosity. This remains true for reconnection in quiet environments where current sheet instabilities drive reconnection. However, if Pr m > 1, viscosity can suppress small-scale reconnection events near and below the Kolmogorov or viscous damping scale. This will produce a threshold for the suppression of large-scale reconnection by viscosity when Pr m > √Re. In any case, for Pr m > 1 this leads to a flattening of the magnetic fluctuation power spectrum, so that its spectral index is ∼ -4/3 for length scales between the viscous dissipation scale and eddies larger by roughly Pr m^(3/2). Current numerical simulations are insensitive to this effect. We suggest that the dependence of reconnection on viscosity in these simulations may be due to insufficient resolution for the turbulent inertial range rather than a guide to the large Re limit.
NASA Astrophysics Data System (ADS)
Bryant, Gerald
2015-04-01
Large-scale soft-sediment deformation features in the Navajo Sandstone have been a topic of interest for nearly 40 years, ever since they were first explored as a criterion for discriminating between marine and continental processes in the depositional environment. For much of this time, evidence for large-scale sediment displacements was commonly attributed to processes of mass wasting. That is, gravity-driven movements of surficial sand. These slope failures were attributed to the inherent susceptibility of dune sand responding to environmental triggers such as earthquakes, floods, impacts, and the differential loading associated with dune topography. During the last decade, a new wave of research is focusing on the event significance of deformation features in more detail, revealing a broad diversity of large-scale deformation morphologies. This research has led to a better appreciation of subsurface dynamics in the early Jurassic deformation events recorded in the Navajo Sandstone, including the important role of intrastratal sediment flow. This report documents two illustrative examples of large-scale sediment displacements represented in extensive outcrops of the Navajo Sandstone along the Utah/Arizona border. Architectural relationships in these outcrops provide definitive constraints that enable the recognition of a large-scale sediment outflow, at one location, and an equally large-scale subsurface flow at the other. At both sites, evidence for associated processes of liquefaction appear at depths of at least 40 m below the original depositional surface, which is nearly an order of magnitude greater than has commonly been reported from modern settings. The surficial, mass flow feature displays attributes that are consistent with much smaller-scale sediment eruptions (sand volcanoes) that are often documented from modern earthquake zones, including the development of hydraulic pressure from localized, subsurface liquefaction and the subsequent escape of fluidized sand toward the unconfined conditions of the surface. The origin of the forces that produced the lateral, subsurface movement of a large body of sand at the other site is not readily apparent. The various constraints on modeling the generation of the lateral force required to produce the observed displacement are considered here, along with photodocumentation of key outcrop relationships.
Lithic Landscapes: Early Human Impact from Stone Tool Production on the Central Saharan Environment
Foley, Robert A.; Lahr, Marta Mirazón
2015-01-01
Humans have had a major impact on the environment. This has been particularly intense in the last millennium but has been noticeable since the development of food production and the associated higher population densities in the last 10,000 years. The use of fire and over-exploitation of large mammals have also been recognized as having an effect on the world’s ecology, going back perhaps 100,000 years or more. Here we report on an earlier anthropogenic environmental change. The use of stone tools, which dates back over 2.5 million years, and the subsequent evolution of a technologically-dependent lineage required the exploitation of very large quantities of rock. However, measures of the impact of hominin stone exploitation are rare and inherently difficult. The Messak Settafet, a sandstone massif in the Central Sahara (Libya), is littered with Pleistocene stone tools on an unprecedented scale and is, in effect, a man-made landscape. Surveys showed that parts of the Messak Settafet have as many as 75 lithics per square metre and that this fractured debris is a dominant element of the environment. The type of stone tools—Acheulean and Middle Stone Age—indicates that extensive stone tool manufacture occurred over the last half million years or more. The lithic-strewn pavement created by this ancient stone tool manufacture possibly represents the earliest human environmental impact at a landscape scale and is an example of anthropogenic change. The nature of the lithics and inferred age may suggest that hominins other than modern humans were capable of unintentionally modifying their environment. The scale of debris also indicates the significance of stone as a critical resource for hominins and so provides insights into a novel evolutionary ecology. PMID:25760999
Stellato, Giuseppina; La Storia, Antonietta; De Filippis, Francesca; Borriello, Giorgia; Villani, Francesco
2016-01-01
ABSTRACT Microbial contamination in food processing plants can play a fundamental role in food quality and safety. The aims of this study were to learn more about the possible influence of the meat processing environment on initial fresh meat contamination and to investigate the differences between small-scale retail distribution (SD) and large-scale retail distribution (LD) facilities. Samples were collected from butcheries (n = 20), including LD (n = 10) and SD (n = 10) facilities, over two sampling campaigns. Samples included fresh beef and pork cuts and swab samples from the knife, the chopping board, and the butcher's hand. The microbiota of both meat samples and environmental swabs were very complex, including more than 800 operational taxonomic units (OTUs) collapsed at the species level. The 16S rRNA sequencing analysis showed that core microbiota were shared by 80% of the samples and included Pseudomonas spp., Streptococcus spp., Brochothrix spp., Psychrobacter spp., and Acinetobacter spp. Hierarchical clustering of the samples based on the microbiota showed a certain separation between meat and environmental samples, with higher levels of Proteobacteria in meat. In particular, levels of Pseudomonas and several Enterobacteriaceae members were significantly higher in meat samples, while Brochothrix, Staphylococcus, lactic acid bacteria, and Psychrobacter prevailed in environmental swab samples. Consistent clustering was also observed when metabolic activities were considered by predictive metagenomic analysis of the samples. An increase in carbohydrate metabolism was predicted for the environmental swabs and was consistently linked to Firmicutes, while increases in pathways related to amino acid and lipid metabolism were predicted for the meat samples and were positively correlated with Proteobacteria. Our results highlighted the importance of the processing environment in contributing to the initial microbial levels of meat and clearly showed that the type of retail facility (LD or SD) did not apparently affect the contamination. IMPORTANCE The study provides an in-depth description of the microbiota of meat and meat processing environments. It highlights the importance of the environment as a contamination source of spoilage bacteria, and it shows that the size of the retail facility does not affect the level and type of contamination. PMID:27129965
Evaluation and comparison of health care Work Environment Scale in military settings.
Maloney, J P; Anderson, F D; Gladd, D L; Brown, D L; Hardy, M A
1996-05-01
The purpose of this study was to describe health care providers' perceptions of their work environment at a large U.S. Army medical center, and to compare the findings to other military medical centers. The sample (N = 112) consisted of the professional nursing staff working on the nine inpatient units. The Work Environment Scale (WES) was used to measure perceptions of the workplace relative to gender, position (head nurses, staff nurses, and agency nurses), specialty nursing (intensive care unit [ICU] versus non-ICU), education (MSN, BSN, and ADN), and patterns of differences between the WES subscales of four military medical centers. Results of the study indicate that there were no significant gender differences. Head nurses, non-ICU nurses, and MSN nurses perceived their environment more positively. There were significant differences in the WES subscales between the military hospitals. Implications for nursing practice using the WES are discussed.
NASA Technical Reports Server (NTRS)
Kaplan, Michael L.; Huffman, Allan W.; Lux, Kevin M.; Cetola, Jeffrey D.; Charney, Joseph J.; Riordan, Allen J.; Lin, Yuh-Lang; Waight, Kenneth T., III; Proctor, Fred (Technical Monitor)
2003-01-01
Simulation experiments reveal key processes that organize a hydrostatic environment conducive to severe turbulence. The paradigm requires juxtaposition of the entrance region of a curved jet stream, which is highly subgeostrophic, with the entrance region of a straight jet stream, which is highly supergeostrophic. The wind and mass fields become misphased as the entrance regions converge, resulting in the significant spatial variation of inertial forcing, centripetal forcing, and along- and cross-stream pressure gradient forcing over a mesobeta scale region. This results in frontogenesis and the along-stream divergence of cyclonic and convergence of cyclonic ageostrophic vertical vorticity. The centripetally forced mesoscale front becomes the locus of large gradients of ageostrophic vertical vorticity along an overturning isentrope. This region becomes favorable for streamwise vorticity gradient formation, enhancing the environment for organization of horizontal vortex tubes in the presence of buoyant forcing.
NASA Astrophysics Data System (ADS)
Lares, M.; Luparello, H. E.; Garcia Lambas, D.; Ruiz, A. N.; Ceccarelli, L.; Paz, D.
2017-10-01
Cosmic voids are of great interest given their relation to the large scale distribution of mass and the way they trace cosmic flows shaping the cosmic web. Here we show that the distribution of voids has, in consonance with the distribution of mass, a characteristic scale at which void pairs are preferentially located. We identify clumps of voids with similar environments and use them to define second order underdensities. Also, we characterize their properties and analyze their impact on the cosmic microwave background.
Design considerations for implementation of large scale automatic meter reading systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mak, S.; Radford, D.
1995-01-01
This paper discusses the requirements imposed on the design of an AMR system expected to serve a large (> 1 million) customer base spread over a large geographical area. Issues such as system throughput, response time, and multi-application expandability are addressed, all of which are intimately dependent on the underlying communication system infrastructure, the local geography, the customer base, and the regulatory environment. A methodology for analysis, assessment, and design of large systems is presented. For illustration, two communication systems -- a low power RF/PLC system and a power frequency carrier system -- are analyzed and discussed.
Sun, Guibo; Webster, Chris; Ni, Michael Y; Zhang, Xiaohu
2018-05-07
Uncertainty with respect to built environment (BE) data collection, measure conceptualization and spatial scales is evident in urban health research, but most findings are from relatively low-density contexts. We selected Hong Kong, an iconic high-density city, as the study area as limited research has been conducted on uncertainty in such areas. We used geocoded home addresses (n=5732) from a large population-based cohort in Hong Kong to extract BE measures for the participants' place of residence based on an internationally recognized BE framework. Variability of the measures was mapped and Spearman's rank correlation calculated to assess how well the relationships among indicators are preserved across variables and spatial scales. We found extreme variations and uncertainties for the 180 measures collected using comprehensive data and advanced geographic information systems modelling techniques. We highlight the implications of methodological selection and spatial scales of the measures. The results suggest that more robust information regarding urban health research in a high-density city would emerge if greater consideration were given to BE data, design methods and spatial scales of the BE measures.
Land Use, Livelihoods, Vulnerabilities, and Resilience in Coastal Bangladesh
NASA Astrophysics Data System (ADS)
Gilligan, J. M.; Ackerly, B.; Goodbred, S. L., Jr.; Wilson, C.
2014-12-01
The densely populated, low-lying coast of Bangladesh is famously associated with vulnerability to sea-level rise, storms, and flooding. Simultaneously, land-use change has significantly altered local sediment transport, causing elevation loss and degradation of drainage. The rapid growth of shrimp aquaculture has also affected soil chemistry in former agricultural areas and the stock of riverine fisheries through intense larval harvesting. To understand the net impact of these environmental changes on the region's communities, it is necessary to examine interactions across scale - from externally driven large scale environmental change to smaller scale, but often more intense, local change - and also between the physical environment and social, political, and economic conditions. We report on a study of interactions between changing communities and changing environment in coastal Bangladesh, exploring the role of societal and physical factors in shaping the different outcomes and their effects on people's lives. Land reclamation projects in the 1960s surrounded intertidal islands with embankments. This allowed rice farming to expand, but also produced significant elevation loss, which rendered many islands vulnerable to waterlogging and flooding from storm surges. The advent of large-scale shrimp aquaculture added environmental, economic, social, and political stresses, but also brought much export revenue to a developing nation. Locally, attempts to remedy environmental stresses have produced mixed results, with similar measures succeeding in some communities and failing in others. In this context, we find that people are continually adapting to changing opportunities and constraints for food, housing, and income. Niches that support different livelihood activities emerge and dwindle, and their occupants' desires affect the political context. Understanding and successfully responding to the impacts of environmental change requires understanding not only the physical environment, but also the human livelihoods, interpersonal interactions, and human-environmental interactions within a socio-ecological system.
Bohnhoff, Marco; Dresen, Georg; Ellsworth, William L.; Ito, Hisao; Cloetingh, Sierd; Negendank, Jörg
2010-01-01
An important discovery in crustal mechanics has been that the Earth’s crust is commonly stressed close to failure, even in tectonically quiet areas. As a result, small natural or man-made perturbations to the local stress field may trigger earthquakes. To understand these processes, Passive Seismic Monitoring (PSM) with seismometer arrays is a widely used technique that has been successfully applied to study seismicity at different magnitude levels ranging from acoustic emissions generated in the laboratory under controlled conditions, to seismicity induced by hydraulic stimulations in geological reservoirs, and up to great earthquakes occurring along plate boundaries. In all these environments the appropriate deployment of seismic sensors, i.e., directly on the rock sample, at the earth’s surface or in boreholes close to the seismic sources allows for the detection and location of brittle failure processes at sufficiently low magnitude-detection threshold and with adequate spatial resolution for further analysis. One principal aim is to develop an improved understanding of the physical processes occurring at the seismic source and their relationship to the host geologic environment. In this paper we review selected case studies and future directions of PSM efforts across a wide range of scales and environments. These include induced failure within small rock samples, hydrocarbon reservoirs, and natural seismicity at convergent and transform plate boundaries. Each example represents a milestone with regard to bridging the gap between laboratory-scale experiments under controlled boundary conditions and large-scale field studies. The common motivation for all studies is to refine the understanding of how earthquakes nucleate, how they proceed and how they interact in space and time. This is of special relevance at the larger end of the magnitude scale, i.e., for large devastating earthquakes due to their severe socio-economic impact.
Intensification and Structure Change of Super Typhoon Flo as Related to the Large-Scale Environment.
1998-06-01
Forest-fire model as a supercritical dynamic model in financial systems
NASA Astrophysics Data System (ADS)
Lee, Deokjae; Kim, Jae-Young; Lee, Jeho; Kahng, B.
2015-02-01
Recently, large-scale cascading failures in complex systems have garnered substantial attention. Such extreme events have been treated as an integral part of self-organized criticality (SOC). Recent empirical work has suggested that some extreme events systematically deviate from the SOC paradigm, requiring a different theoretical framework. We shed additional theoretical light on this possibility by studying financial crises. We build our model of financial crisis on the well-known forest fire model in scale-free networks. Our analysis shows a nontrivial scaling feature indicating supercritical behavior, which is independent of system size. Extreme events in the supercritical state result from the bursting of a fat bubble, seeds of which are sown by a protracted period of a benign financial environment with few shocks. Our findings suggest that policymakers can control the magnitude of financial meltdowns by keeping the economy operating within a reasonable duration of a benign environment.
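The mechanism described above can be illustrated with a short simulation. The sketch below assumes a Barabási-Albert network and Drossel-Schwabl-style growth/lightning rules with instantaneous burning of connected clusters; the parameter values and these simplifications are illustrative assumptions, not the authors' model specification, and the code only shows how cascade (crash) sizes would be collected for a scaling analysis.

    # Minimal sketch: forest-fire-style cascades on a scale-free network
    # (assumed Barabasi-Albert topology and illustrative parameters).
    import random
    import networkx as nx

    def forest_fire_cascades(n=2000, m=2, p_grow=0.05, p_lightning=1e-3,
                             steps=5000, seed=0):
        rng = random.Random(seed)
        g = nx.barabasi_albert_graph(n, m, seed=seed)
        tree = [False] * n          # True = occupied ("tree", i.e. fragile node)
        cascades = []
        for _ in range(steps):
            # growth: empty sites become occupied with probability p_grow
            for v in range(n):
                if not tree[v] and rng.random() < p_grow:
                    tree[v] = True
            # lightning: an occupied site ignites with probability p_lightning and
            # the whole connected cluster of occupied sites burns instantly
            for v in range(n):
                if tree[v] and rng.random() < p_lightning:
                    burned, stack = set(), [v]
                    while stack:
                        u = stack.pop()
                        if u in burned or not tree[u]:
                            continue
                        burned.add(u)
                        stack.extend(g.neighbors(u))
                    for u in burned:
                        tree[u] = False
                    cascades.append(len(burned))
        return cascades

    if __name__ == "__main__":
        sizes = forest_fire_cascades()
        print(f"{len(sizes)} cascades, largest = {max(sizes) if sizes else 0}")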
Sulfide scaling in low enthalpy geothermal environments; A survey
DOE Office of Scientific and Technical Information (OSTI.GOV)
Criaud, A.; Fouillac, C.
1989-01-01
A review of the sulfide scaling phenomena in low-temperature environments is presented. While high-temperature fluids tend to deposit metal sulfides because of their high concentrations of dissolved metals and variations of temperature, pressure and fluid chemistry, low temperature media are characterized by very low metal content but much higher dissolved sulfide. In the case of the geothermal wells of the Paris Basin, detailed studies demonstrate that the relatively large concentrations of chloride and dissolved sulfide are responsible for corrosion and consequent formation of iron sulfide scale composed of mackinawite, pyrite and pyrrhotite. The effects of the exploitation schemes are far less important than the corrosion of the casings. The low-enthalpy fluids that do not originate from sedimentary aquifers (such as in Iceland and Bulgaria) have a limited corrosion potential, and the thin sulfide film that appears may prevent the progress of corrosion.
Efficient parallelization of analytic bond-order potentials for large-scale atomistic simulations
NASA Astrophysics Data System (ADS)
Teijeiro, C.; Hammerschmidt, T.; Drautz, R.; Sutmann, G.
2016-07-01
Analytic bond-order potentials (BOPs) provide a way to compute atomistic properties with controllable accuracy. For large-scale computations of heterogeneous compounds at the atomistic level, both the computational efficiency and memory demand of BOP implementations have to be optimized. Since the evaluation of BOPs is a local operation within a finite environment, the parallelization concepts known from short-range interacting particle simulations can be applied to improve the performance of these simulations. In this work, several efficient parallelization methods for BOPs that use three-dimensional domain decomposition schemes are described. The schemes are implemented into the bond-order potential code BOPfox, and their performance is measured in a series of benchmarks. Systems of up to several millions of atoms are simulated on a high performance computing system, and parallel scaling is demonstrated for up to thousands of processors.
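Because the BOP evaluation is local within a finite environment, the parallelization follows the usual short-range pattern: each process owns one cell of a three-dimensional grid and additionally holds halo copies of atoms within one interaction cutoff of its boundaries. The sketch below illustrates that bookkeeping under stated assumptions (an orthorhombic box, a generic cutoff, serial Python rather than MPI, and no periodic wrapping of halos across box edges); it is not the BOPfox implementation.

    # Minimal sketch of a 3D domain decomposition with halo regions.
    import numpy as np

    def decompose(positions, box, grid=(2, 2, 2), cutoff=4.0):
        """Assign each atom to a domain and flag atoms needed as halo copies."""
        positions = np.asarray(positions) % box           # wrap into the box
        widths = box / np.array(grid)
        owner = np.floor(positions / widths).astype(int)  # 3D domain index per atom
        domains = {}
        for idx in np.ndindex(*grid):
            lo, hi = np.array(idx) * widths, (np.array(idx) + 1) * widths
            inside = np.all(owner == idx, axis=1)
            # halo: atoms outside the domain but within one cutoff of its faces
            # (periodic wrap across box edges omitted for brevity)
            near = np.all((positions > lo - cutoff) & (positions < hi + cutoff), axis=1)
            domains[idx] = {"local": np.where(inside)[0],
                            "halo": np.where(near & ~inside)[0]}
        return domains

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        box = np.array([40.0, 40.0, 40.0])
        doms = decompose(rng.uniform(0, 40.0, size=(1000, 3)), box)
        for idx, d in sorted(doms.items()):
            print(idx, len(d["local"]), "local,", len(d["halo"]), "halo")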
Fermi rules out the IC/CMB model for the Large-Scale Jet X-ray emission of 3C 273
NASA Astrophysics Data System (ADS)
Georganopoulos, Markos; Meyer, E. T.
2014-01-01
The process responsible for the Chandra-detected X-ray emission from the large-scale jets of powerful quasars is not yet clear. The two main models are inverse Compton scattering off the cosmic microwave background (IC/CMB) photons and synchrotron emission from a population of electrons separate from those producing the radio-IR emission. These two models imply radically different conditions in the large scale jet in terms of jet speed and maximum energy of the particle acceleration mechanism, with important implications for the impact of the jet on the larger-scale environment. Georganopoulos et al. (2006) proposed a diagnostic based on a fundamental difference between these two models: the production of synchrotron X-rays requires multi-TeV electrons, while the IC/CMB model requires a cutoff in the electron energy distribution below TeV energies. This has significant implications for the gamma-ray emission predicted by these two models. Here we present new Fermi observations that put an upper limit on the gamma-ray flux from the large-scale jet of 3C 273 that clearly violates the flux expected from the IC/CMB X-ray interpretation found by extrapolation of the UV to X-ray spectrum of knot A, thus ruling out the IC/CMB interpretation entirely for this source. Further, the Fermi upper limit constrains the Doppler beaming factor to delta < 5.
Biological mechanisms supporting adaptation to ocean acidification in coastal ecosystems
NASA Astrophysics Data System (ADS)
Hendriks, Iris E.; Duarte, Carlos M.; Olsen, Ylva S.; Steckbauer, Alexandra; Ramajo, Laura; Moore, Tommy S.; Trotter, Julie A.; McCulloch, Malcolm
2015-01-01
The direct influence of anthropogenic CO2 might play a limited role in pH regulation in coastal ecosystems as pH regulation in these areas can be complex. They experience large variability across a broad range of spatial and temporal scales, with complex external and internal drivers. Organisms influence pH at a patch scale, where community metabolic effects and hydrodynamic processes interact to produce broad ranges in pH (∼0.3-0.5 pH units) over daily cycles and spatial scales (mm to m), particularly in shallow vegetated habitats and coral reefs where both respiration and photosynthetic activity are intense. Biological interactions at the ecosystem scale, linked to patchiness in habitat landscapes and seasonal changes in metabolic processes and temperature, lead to changes of about 0.3-0.5 pH units throughout a year. Furthermore, on the scale of individual organisms, small-scale processes including changes at the Diffusive Boundary Layer (DBL), interactions with symbionts, and changes to the specific calcification environment, induce additional changes in excess of 0.5 pH units. In these highly variable pH environments calcifying organisms have developed the capacity to alter the pH of their calcifying environment, or specifically within critical tissues where calcification occurs, thus achieving homeostasis. This capacity to control the conditions for calcification at the organism scale may therefore buffer the full impacts of ocean acidification on an organism scale, although this might be at a cost to the individual. Furthermore, in some areas, calcifiers may potentially benefit from changes to ambient seawater pH, where photosynthetic organisms draw down CO2.
Tan, Zhihong; Kaul, Colleen M.; Pressel, Kyle G.; Cohen, Yair; Teixeira, João
2018-01-01
Large-scale weather forecasting and climate models are beginning to reach horizontal resolutions of kilometers, at which common assumptions made in existing parameterization schemes of subgrid-scale turbulence and convection—such as that they adjust instantaneously to changes in resolved-scale dynamics—cease to be justifiable. Additionally, the common practice of representing boundary-layer turbulence, shallow convection, and deep convection by discontinuously different parameterization schemes, each with its own set of parameters, has contributed to the proliferation of adjustable parameters in large-scale models. Here we lay the theoretical foundations for an extended eddy-diffusivity mass-flux (EDMF) scheme that has explicit time-dependence and memory of subgrid-scale variables and is designed to represent all subgrid-scale turbulence and convection, from boundary layer dynamics to deep convection, in a unified manner. Coherent up and downdrafts in the scheme are represented as prognostic plumes that interact with their environment and potentially with each other through entrainment and detrainment. The more isotropic turbulence in their environment is represented through diffusive fluxes, with diffusivities obtained from a turbulence kinetic energy budget that consistently partitions turbulence kinetic energy between plumes and environment. The cross-sectional area of up and downdrafts satisfies a prognostic continuity equation, which allows the plumes to cover variable and arbitrarily large fractions of a large-scale grid box and to have life cycles governed by their own internal dynamics. Relatively simple preliminary proposals for closure parameters are presented and are shown to lead to a successful simulation of shallow convection, including a time-dependent life cycle. PMID:29780442
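The prognostic plume formulation sketched in this abstract can be written schematically as follows. This is a generic EDMF-type form assembled from the description above (area-fraction continuity with entrainment E_i and detrainment D_i, and a total subgrid flux split into plume and diffusive environment parts); it is not the authors' exact set of equations or closures.

\[
\frac{\partial (\rho a_i)}{\partial t} + \frac{\partial (\rho a_i \bar{w}_i)}{\partial z} = E_i - D_i,
\qquad
\overline{w'\phi'} \;\approx\; \sum_i a_i \bar{w}_i \left(\bar{\phi}_i - \bar{\phi}\right) \;-\; a_0 K_\phi \frac{\partial \bar{\phi}_0}{\partial z},
\]

where a_i and \bar{w}_i are the cross-sectional area fraction and vertical velocity of plume i, subscript 0 denotes the environment, and the eddy diffusivity K_φ would be obtained from the turbulence kinetic energy budget that partitions TKE between plumes and environment.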
Fundamental tests of galaxy formation theory
NASA Technical Reports Server (NTRS)
Silk, J.
1982-01-01
The structure of the universe is studied as an environment in which traces of the seed fluctuations from which galaxies formed still exist. The evolution of the density fluctuation modes that led to the eventual formation of matter inhomogeneities is reviewed. How the resulting clumps developed into galaxies and galaxy clusters, acquiring characteristic masses, velocity dispersions, and metallicities, is discussed. Tests are described that utilize the large scale structure of the universe, including the dynamics of the local supercluster, the large scale matter distribution, and the anisotropy of the cosmic background radiation, to probe the earliest accessible stages of evolution. Finally, the role of particle physics is described with regard to its observable implications for galaxy formation.
Scientific management and implementation of the geophysical fluid flow cell for Spacelab missions
NASA Technical Reports Server (NTRS)
Hart, J.; Toomre, J.
1980-01-01
Scientific support for the spherical convection experiment to be flown on Spacelab 3 was developed. This experiment takes advantage of the zero gravity environment of the orbiting space laboratory to conduct fundamental fluid flow studies concerned with thermally driven motions inside a rotating spherical shell with radial gravity. Such a system is a laboratory analog of large scale atmospheric and solar circulations. The radial body force necessary to model gravity correctly is obtained by using dielectric polarization forces in a radially varying electric field to produce radial accelerations proportional to temperature. This experiment will answer fundamental questions concerned with establishing the preferred modes of large scale motion in planetary and stellar atmospheres.
Selecting habitat to survive: the impact of road density on survival in a large carnivore.
Basille, Mathieu; Van Moorter, Bram; Herfindal, Ivar; Martin, Jodie; Linnell, John D C; Odden, John; Andersen, Reidar; Gaillard, Jean-Michel
2013-01-01
Habitat selection studies generally assume that animals select habitat and food resources at multiple scales to maximise their fitness. However, animals sometimes prefer habitats of apparently low quality, especially when considering the costs associated with spatially heterogeneous human disturbance. We used spatial variation in human disturbance, and its consequences on lynx survival, a direct fitness component, to test the Hierarchical Habitat Selection hypothesis from a population of Eurasian lynx Lynx lynx in southern Norway. Data from 46 lynx monitored with telemetry indicated that a high proportion of forest strongly reduced the risk of mortality from legal hunting at the home range scale, while increasing road density strongly increased such risk at the finer scale within the home range. We found hierarchical effects of the impact of human disturbance, with a higher road density at a large scale reinforcing its negative impact at a fine scale. Conversely, we demonstrated that lynx shifted their habitat selection to avoid areas with the highest road densities within their home ranges, thus supporting a compensatory mechanism at fine scale enabling lynx to mitigate the impact of large-scale disturbance. Human impact, positively associated with high road accessibility, was thus a stronger driver of lynx space use at a finer scale, with home range characteristics nevertheless constraining habitat selection. Our study demonstrates the truly hierarchical nature of habitat selection, which aims at maximising fitness by selecting against limiting factors at multiple spatial scales, and indicates that scale-specific heterogeneity of the environment is driving individual spatial behaviour, by means of trade-offs across spatial scales.
NASA Astrophysics Data System (ADS)
Meertens, C. M.; Boler, F. M.; Ertz, D. J.; Mencin, D.; Phillips, D.; Baker, S.
2017-12-01
UNAVCO, in its role as an NSF facility for geodetic infrastructure and data, has succeeded for over two decades using on-premises infrastructure, and while the promise of cloud-based infrastructure is well-established, significant questions about the suitability of such infrastructure for facility-scale services remain. Primarily through the GeoSciCloud award from NSF EarthCube, UNAVCO is investigating the costs, advantages, and disadvantages of providing its geodetic data and services in the cloud versus using UNAVCO's on-premises infrastructure. (IRIS is a collaborator on the project and is performing its own suite of investigations). In contrast to the 2-3 year time scale for the research cycle, the time scale of operation and planning for NSF facilities is for a minimum of five years and for some services extends to a decade or more. Planning for on-premises infrastructure is deliberate, and migrations typically take months to years to fully implement. Migrations to a cloud environment can only go forward with similar deliberate planning and understanding of all costs and benefits. The EarthCube GeoSciCloud project is intended to address the uncertainties of facility-level operations in the cloud. Investigations are being performed in a commercial cloud environment (Amazon AWS) during the first year of the project and in a private cloud environment (NSF XSEDE resource at the Texas Advanced Computing Center) during the second year. These investigations are expected to illuminate the potential as well as the limitations of running facility scale production services in the cloud. The work involves running cloud-based services parallel and equivalent to on-premises services, including: data serving via ftp from a large data store, operation of a metadata database, production scale processing of multiple months of geodetic data, web services delivery of quality checked data and products, large-scale compute services for event post-processing, and serving real time data from a network of 700-plus GPS stations. The evaluation is based on a suite of metrics that we have developed to elucidate the effectiveness of cloud-based services in price, performance, and management. Services are currently running in AWS and evaluation is underway.
Global Scale Solar Disturbances
NASA Astrophysics Data System (ADS)
Title, A. M.; Schrijver, C. J.; DeRosa, M. L.
2013-12-01
The combination of the STEREO and SDO missions has allowed for the first time imagery of the entire Sun. This, coupled with the high cadence, broad thermal coverage, and the large dynamic range of the Atmospheric Imaging Assembly on SDO, has allowed discovery of impulsive solar disturbances that can significantly affect a hemisphere or more of the solar volume. Such events are often, but not always, associated with M and X class flares. GOES C and even B class flares are also associated with these large scale disturbances. Key to the recognition of the large scale disturbances was the creation of log difference movies. By taking the logarithm of images before differencing, events in the corona become much more evident. Because such events cover such a large portion of the solar volume, their passage can affect the dynamics of the entire corona as it adjusts to and recovers from their passage. In some cases this may lead to another flare or filament ejection, but in general direct causal evidence of 'sympathetic' behavior is lacking. However, evidence is accumulating that these large scale events create an environment that encourages other solar instabilities to occur. Understanding the source of these events and how the energy that drives them is built up, stored, and suddenly released is critical to understanding the origins of space weather. Example events and comments on their relevance will be presented.
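The log-difference technique mentioned above is simple to state in code. The sketch below assumes two co-registered image frames held as numpy arrays and a small positive floor to avoid taking the logarithm of zero; actual AIA processing involves calibration steps that are not shown.

    # Minimal sketch of "log difference" imaging of two coronal frames.
    import numpy as np

    def log_difference(image_t1, image_t2, floor=1.0):
        """Return log(I_t2) - log(I_t1), which highlights faint fractional changes."""
        a = np.log(np.maximum(np.asarray(image_t1, dtype=float), floor))
        b = np.log(np.maximum(np.asarray(image_t2, dtype=float), floor))
        return b - a

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        base = rng.uniform(100, 5000, size=(512, 512))      # synthetic corona frame
        disturbed = base * (1.0 + 0.05 * rng.standard_normal(base.shape))
        diff = log_difference(base, disturbed)
        print("typical |log difference|:", float(np.abs(diff).mean()))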
Utilizing Online Training for Child Sexual Abuse Prevention: Benefits and Limitations
ERIC Educational Resources Information Center
Paranal, Rechelle; Thomas, Kiona Washington; Derrick, Christina
2012-01-01
The prevalence of child sexual abuse demands innovative approaches to prevent further victimization. The online environment provides new opportunities to expand existing child sexual abuse prevention trainings that target adult gatekeepers and allow for large scale interventions that are fiscally viable. This article discusses the benefits and…
The Small College Administrative Environment.
ERIC Educational Resources Information Center
Buzza, Bonnie Wilson
Environmental differences for speech departments at large and small colleges are not simply of scale; there are qualitative as well as quantitative differences. At small colleges, faculty are hired as teachers, rather than as researchers. Because speech teachers at small colleges must be generalists, and because it is often difficult to replace…
Turning of COGS moves forward findings for hormonally mediated cancers.
Sakoda, Lori C; Jorgenson, Eric; Witte, John S
2013-04-01
The large-scale Collaborative Oncological Gene-environment Study (COGS) presents new findings that further characterize the genetic bases of breast, ovarian and prostate cancers. We summarize and provide insights into this collection of papers from COGS and discuss the implications of the results and future directions for such efforts.
USDA-ARS?s Scientific Manuscript database
With rapid advances in DNA sequencing, phenotyping has become the rate-limiting step in using large-scale genomic data to understand and improve agricultural crops. Here, the Bellwether Phenotyping platform for controlled-environment plant growth and automated, multimodal phenotyping is described. T...
Helping Students Interpret Large-Scale Data Tables
ERIC Educational Resources Information Center
Prodromou, Theodosia
2016-01-01
New technologies have completely altered the ways that citizens can access data. Indeed, emerging online data sources give citizens access to an enormous amount of numerical information that provides new sorts of evidence used to influence public opinion. In this new environment, two trends have had a significant impact on our increasingly…
Solutions to pervasive environmental problems often are not amenable to a straightforward application of science-based actions. These problems encompass large-scale environmental policy questions where environmental concerns, economic constraints, and societal values conflict ca...
Toward a Framework for Learner Segmentation
ERIC Educational Resources Information Center
Azarnoush, Bahareh; Bekki, Jennifer M.; Runger, George C.; Bernstein, Bianca L.; Atkinson, Robert K.
2013-01-01
Effectively grouping learners in an online environment is a highly useful task. However, datasets used in this task often have large numbers of attributes of disparate types and different scales, which traditional clustering approaches cannot handle effectively. Here, a unique dissimilarity measure based on the random forest, which handles the…
The Future of Data-Enriched Assessment
ERIC Educational Resources Information Center
Thille, Candace; Schneider, Emily; Kizilcec, René F.; Piech, Christopher; Halawa, Sherif A.; Greene, Daniel K.
2014-01-01
The article addresses the question of how the assessment process with large-scale data derived from online learning environments will be different from the assessment process without it. Following an explanation of big data and how it is different from previously available learner data, we describe three notable features that characterize…
Personalised Information Services Using a Hybrid Recommendation Method Based on Usage Frequency
ERIC Educational Resources Information Center
Kim, Yong; Chung, Min Gyo
2008-01-01
Purpose: This paper seeks to describe a personal recommendation service (PRS) involving an innovative hybrid recommendation method suitable for deployment in a large-scale multimedia user environment. Design/methodology/approach: The proposed hybrid method partitions content and user into segments and executes association rule mining,…
The "New" Competition: Serving the Learning Society in an Electronic Age.
ERIC Educational Resources Information Center
El-Khawas, Elaine
1999-01-01
Examines the changing environment for serving adult learners, with special attention to new modes of delivering instruction: distance education; use of information technology; emphasis on convenience; focus on special markets; and emergence of large-scale, profit-driven enterprises. Argues that, to develop an effective response, universities must…
A Tangled Path: Negotiating Leadership for, in, of, and with Diverse Communities
ERIC Educational Resources Information Center
Goddard, J. Tim
2015-01-01
This article addresses issues of educational leadership in postcolonial and post-conflict environments. The focus relates to large-scale testing protocols, diversity in schools, asset-based community development, and communications technology. The article highlights the importance of context to the tenets of social justice, equitable human…
Responses of timber rattlesnakes to fire: Lessons from two prescribed burns
Steven J. Beaupre; Lara E. Douglas
2012-01-01
Timber rattlesnakes (Crotalus horridus) are excellent model organisms for understanding the effects of large scale habitat manipulations because of their low-energy lifestyle, rapid response to changes in resource environment, uniform diet (small mammals), and simple behaviors. We present two case studies that illustrate interactions between timber...
Willingness to Communicate in English: A Model in the Chinese EFL Classroom Context
ERIC Educational Resources Information Center
Peng, Jian-E; Woodrow, Lindy
2010-01-01
This study involves a large-scale investigation of willingness to communicate (WTC) in Chinese English-as-a-foreign-language (EFL) classrooms. A hypothesized model integrating WTC in English, communication confidence, motivation, learner beliefs, and classroom environment was tested using structural equation modeling. Validation of the…
Environmental Risk and Young Children's Cognitive and Behavioral Development
ERIC Educational Resources Information Center
Pike, Alison; Iervolino, Alessandra C.; Eley, Thalia C.; Price, Thomas S.; Plomin, Robert
2006-01-01
Using a longitudinal, large-scale sample of British twins, we addressed the prediction of both cognitive abilities and behavioral adjustment from eight domains of environmental risk: minority status, socio-economic status, maternal medical factors, twin medical factors, maternal depression, chaos within the home environment, and parental feelings…
Techniques for inventorying manmade impacts in roadway environments.
Dale R. Potter; J. Alan. Wagar
1971-01-01
Four techniques for inventorying manmade impacts along roadway corridors were devised and compared. Ground surveillance and ground photography techniques recorded impacts within the corridor visible from the road. Techniques on large- and small-scale aerial photography recorded impacts within a more complete corridor that included areas screened from the road by...
An Introduction to the Safe Schools/Healthy Students Initiative
ERIC Educational Resources Information Center
Modzeleski, William; Mathews-Younes, Anne; Arroyo, Carmen G.; Mannix, Danyelle; Wells, Michael E.; Hill, Gary; Yu, Ping; Murray, Stephen
2012-01-01
The Safe Schools/Healthy Students (SS/HS) Initiative offers a unique opportunity to conduct large-scale, multisite, multilevel program evaluation in the context of a federal environment that places many requirements and constraints on how the grants are conducted and managed. Federal programs stress performance-based outcomes, valid and reliable…
SCIMITAR: Scalable Stream-Processing for Sensor Information Brokering
2013-11-01
Infrastructure-as-a-service (IaaS) cloud frameworks including Amazon Web Services and Eucalyptus were used. For load testing, we used The Grinder [9], a Java load testing framework. The internal Eucalyptus cluster could not be scaled as large as the Amazon environment due to a lack of computation resources.
NASA Astrophysics Data System (ADS)
Beichner, Robert
2016-03-01
The Student-Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) Project combines curricula and a specially-designed instructional space to enhance learning. SCALE-UP students practice communication and teamwork skills while performing activities that enhance their conceptual understanding and problem solving skills. This can be done with small or large classes and has been implemented at more than 250 institutions. Educational research indicates that students should collaborate on interesting tasks and be deeply involved with the material they are studying. SCALE-UP class time is spent primarily on "tangibles" and "ponderables" -- hands-on measurements/observations and interesting questions. There are also computer simulations (called "visibles") and hypothesis-driven labs. Students sit at tables designed to facilitate group interactions. Instructors circulate and engage in Socratic dialogues. The setting looks like a banquet hall, with lively interactions nearly all the time. Impressive learning gains have been measured at institutions across the US and internationally. This talk describes today's students, how lecturing got started, what happens in a SCALE-UP classroom, and how the approach has spread. The SCALE-UP project has greatly benefitted from numerous grants made by NSF and FIPSE to NCSU and other institutions.
Interactions between cumulus convection and its environment as revealed by the MC3E sounding array
Xie, Shaocheng; Zhang, Yunyan; Giangrande, Scott E.; ...
2014-10-27
This study attempts to understand interactions between midlatitude convective systems and their environments through a heat and moisture budget analysis using the sounding data collected from the Midlatitude Continental Convective Clouds Experiment (MC3E) in central Oklahoma. Distinct large-scale structures and diabatic heating and drying profiles are presented for cases of weaker and elevated thunderstorms as well as intense squall line and supercell thunderstorm events during the campaign. The elevated cell events were nocturnal convective systems occurring in an environment having low convective available potential energy (CAPE) and a very dry boundary layer. In contrast, deeper convective events happened during the morning into early afternoon within an environment associated with large CAPE and a near-saturated boundary layer. As the systems reached maturity, the diagnosed diabatic heating in the latter deep convective cases was much stronger and of greater vertical extent than the former. Both groups showed considerable diabatic cooling in the lower troposphere, associated with the evaporation of precipitation and low-level clouds. The horizontal advection of moisture also played a dominant role in moistening the lower troposphere, particularly for the deeper convective events, wherein the near-surface southeasterly flow allowed persistent low-level moisture return from the Gulf of Mexico to support convection. The moisture convergence often was present before these systems developed, suggesting a strong correlation between the large-scale moisture convergence and convection. As a result, sensitivity tests indicated that the uncertainty in the surface precipitation and the size of the analysis domain mainly affected the magnitude of these analyzed fields rather than their vertical structures.
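Heat and moisture budgets of this kind are conventionally diagnosed from a sounding array using the apparent heat source Q1 and apparent moisture sink Q2 (the standard Yanai-type budget form); the expressions below are that conventional form, quoted for orientation rather than taken verbatim from this study.

\[
Q_1 \equiv \frac{\partial \bar{s}}{\partial t} + \bar{\mathbf{v}} \cdot \nabla \bar{s} + \bar{\omega}\,\frac{\partial \bar{s}}{\partial p},
\qquad
Q_2 \equiv -L \left( \frac{\partial \bar{q}}{\partial t} + \bar{\mathbf{v}} \cdot \nabla \bar{q} + \bar{\omega}\,\frac{\partial \bar{q}}{\partial p} \right),
\]

where s = c_p T + gz is the dry static energy, q is the water-vapor mixing ratio, ω is the pressure velocity, L is the latent heat of vaporization, and overbars denote horizontal averages over the sounding array.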
Using the morphology and magnetic fields of tailed radio galaxies as environmental probes
NASA Astrophysics Data System (ADS)
Johnston-Hollitt, M.; Dehghan, S.; Pratley, L.
2015-03-01
Bent-tailed (BT) radio sources have long been known to trace overdensities in the Universe up to z ~ 1 and there is increasing evidence this association persists out to redshifts of 2. The morphology of the jets in BT galaxies is primarily a function of the environment that they have resided in, and so BTs provide invaluable clues as to their local conditions. Thus, not only can samples of BT galaxies be used as signposts of large-scale structure, but they are also valuable for obtaining a statistical measurement of properties of the intra-cluster medium, including the presence of cluster accretion shocks & winds, and as historical anemometers, preserving the dynamical history of their surroundings in their jets. We discuss the use of BTs to unveil large-scale structure and provide an example in which a BT was used to unlock the dynamical history of its host cluster. In addition to their use as density and dynamical indicators, BTs are useful probes of the magnetic field of their environment on scales which are inaccessible to other methods. Here we discuss a novel way in which a particular sub-class of BTs, the so-called `corkscrew' galaxies, might further elucidate the coherence lengths of the magnetic fields in their vicinity. Given that BTs are estimated to make up a large population in next generation surveys, we posit that the use of jets in this way could provide a unique source of environmental information for clusters and groups up to z = 2.
Similar Scaling Relations for the Gas Content of Galaxies Across Environments to z ∼ 3.5
NASA Astrophysics Data System (ADS)
Darvish, Behnam; Scoville, Nick Z.; Martin, Christopher; Mobasher, Bahram; Diaz-Santos, Tanio; Shen, Lu
2018-06-01
We study the effects of the local environment on the molecular gas content of a large sample of log(M_*/M_⊙) ≳ 10 star-forming and starburst galaxies with specific star formation rates (sSFRs) on and above the main sequence (MS) to z ∼ 3.5. ALMA observations of the dust continuum in the COSMOS field are used to estimate molecular gas masses at z ≈ 0.5–3.5. We also use a local universe sample from the ALFALFA H I survey after converting it into molecular masses. The molecular mass (M_ISM) scaling relation shows a dependence on z, M_*, and sSFR relative to the MS, but no dependence on environmental overdensity Δ (M_ISM ∝ Δ^0.03). Similarly, gas mass fraction (f_gas) and depletion timescale (τ) show no environmental dependence to z ∼ 3.5. At ⟨z⟩ ∼ 1.8, the average ⟨M_ISM⟩, ⟨f_gas⟩, and ⟨τ⟩ in the densest regions are (1.6 ± 0.2) × 10^11 M_⊙, 55 ± 2%, and 0.8 ± 0.1 Gyr, respectively, similar to those in the lowest density bin. Independent of the environment, f_gas decreases and τ increases with increasing cosmic time. Cosmic molecular mass density (ρ) in the lowest density bins peaks at z ∼ 1–2, and this peak happens at z < 1 in the densest bins. This differential evolution of ρ across environments is likely due to the growth of the large-scale structure with cosmic time. Our results suggest that the molecular gas content and the subsequent star formation activity of log(M_*/M_⊙) ≳ 10 star-forming and starburst galaxies is primarily driven by internal processes, and not by their local environment, since z ∼ 3.5.
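For orientation, the derived quantities quoted above are usually defined as follows (one common convention in this literature; the authors' exact definitions are not restated in the abstract):

\[
f_{\mathrm{gas}} = \frac{M_{\mathrm{ISM}}}{M_{\mathrm{ISM}} + M_*},
\qquad
\tau = \frac{M_{\mathrm{ISM}}}{\mathrm{SFR}},
\]

so that, under this convention, f_gas ≈ 55% at ⟨z⟩ ∼ 1.8 corresponds to a molecular gas mass comparable to or exceeding the stellar mass.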
Use of Nanostructures in Fabrication of Large Scale Electrochemical Film
NASA Astrophysics Data System (ADS)
Chen, Chien Chon; Chen, Shih Hsun; Shyu, Sheang Wen; Hsieh, Sheng Jen
Control of electrochemical parameters when preparing small-scale samples for academic research is not difficult. In mass production environments, however, maintenance of constant current density and temperature becomes a critical issue. This article describes the design of several molds for large work pieces. These molds were designed to maintain constant current density and to facilitate the occurrence of electrochemical reactions in designated areas. Large-area thin films with fine nanostructure were successfully prepared using the designed electrochemical molds and containers. In addition, current density and temperature could be controlled well. This electrochemical system has been verified in many experimental operations, including etching of Al surfaces; electro-polishing of Al, Ti and stainless steel; and fabrication of anodic alumina oxide (AAO), Ti-TiO2 interference membrane, TiO2 nanotubes, AAO-TiO2 nanotubes, Ni nanowires and porous tungsten.
An Overview of the Launch Vehicle Blast Environments Development Efforts
NASA Technical Reports Server (NTRS)
Richardson, Erin; Bangham, Mike; Blackwood, James; Skinner, Troy; Hays, Michael; Jackson, Austin; Richman, Ben
2014-01-01
NASA has been funding an ongoing development program to characterize the explosive environments produced during a catastrophic launch vehicle accident. These studies and small-scale tests are focused on the near field environments that threaten the crew. The results indicate that these environments are unlikely to result in immediate destruction of the crew modules. The effort began as an independent assessment by NASA safety organizations, followed by the Ares program and NASA Engineering and Safety Center and now as a Space Launch Systems (SLS) focused effort. The development effort is using the test and accident data available from public or NASA sources as well as focused scaled tests that are examining the fundamental aspects of uncontained explosions of Hydrogen and air and Hydrogen and Oxygen. The primary risk to the crew appears to be the high-energy fragments and these are being characterized for the SLS. The development efforts will characterize the thermal environment of the explosions as well to ensure that the risk is well understood and to document the overall energy balance of an explosion. The effort is multi-path in that analytical, computational and focused testing is being used to develop the knowledge to understand potential SLS explosions. This is an ongoing program with plans that expand the development from fundamental testing at small-scale levels to large-scale tests that can be used to validate models for commercial programs. The ultimate goal is to develop a knowledge base that can be used by vehicle designers to maximize crew survival in an explosion.
ATLAS I/O performance optimization in as-deployed environments
NASA Astrophysics Data System (ADS)
Maier, T.; Benjamin, D.; Bhimji, W.; Elmsheuser, J.; van Gemmeren, P.; Malon, D.; Krumnack, N.
2015-12-01
This paper provides an overview of an integrated program of work underway within the ATLAS experiment to optimise I/O performance for large-scale physics data analysis in a range of deployment environments. It proceeds to examine in greater detail one component of that work, the tuning of job-level I/O parameters in response to changes to the ATLAS event data model, and considers the implications of such tuning for a number of measures of I/O performance.
Coupled continuous time-random walks in quenched random environment
NASA Astrophysics Data System (ADS)
Magdziarz, M.; Szczotka, W.
2018-02-01
We introduce a coupled continuous-time random walk with coupling which is characteristic for Lévy walks. Additionally we assume that the walker moves in a quenched random environment, i.e. the site disorder at each lattice point is fixed in time. We analyze the scaling limit of such a random walk. We show that for large times the behaviour of the analyzed process is exactly the same as in the case of uncoupled quenched trap model for Lévy flights.
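A minimal numerical sketch of a walk in a quenched random environment may help make the setting concrete. The code below assumes a one-dimensional lattice, Pareto-distributed waiting-time scales fixed once per site (the quenched disorder), and unbiased nearest-neighbour jumps; the Lévy-walk-type coupling analyzed in the paper, in which the time cost is tied to the jump length, is not implemented here, and the exponent and parameters are illustrative.

    # Minimal sketch of a continuous-time random walk with quenched site disorder.
    import numpy as np

    def quenched_ctrw(n_sites=10001, alpha=0.7, t_max=1e4, seed=0):
        rng = np.random.default_rng(seed)
        # quenched disorder: one waiting-time scale per site, fixed for all visits
        tau = (1.0 - rng.random(n_sites)) ** (-1.0 / alpha)   # Pareto(alpha) tails
        x, t, path = n_sites // 2, 0.0, [(0.0, 0)]
        while t < t_max:
            t += tau[x]                              # wait at the current site
            x += rng.choice((-1, 1))                 # unbiased nearest-neighbour jump
            x = min(max(x, 0), n_sites - 1)          # reflecting boundaries
            path.append((t, x - n_sites // 2))
        return path

    if __name__ == "__main__":
        trajectory = quenched_ctrw()
        t_end, x_end = trajectory[-1]
        print(f"final time {t_end:.1f}, displacement {x_end}")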
A radiant heating test facility for space shuttle orbiter thermal protection system certification
NASA Technical Reports Server (NTRS)
Sherborne, W. D.; Milhoan, J. D.
1980-01-01
A large-scale radiant heating test facility was constructed so that thermal certification tests can be performed on the new generation of thermal protection systems developed for the space shuttle orbiter. This facility simulates surface thermal gradients, on-orbit cold-soak temperatures down to 200 K, entry heating temperatures to 1710 K in an oxidizing environment, and the dynamic entry pressure environment. The capabilities of the facility and the development of new test equipment are presented.
Ionic electroactive polymer artificial muscles in space applications.
Punning, Andres; Kim, Kwang J; Palmre, Viljar; Vidal, Frédéric; Plesse, Cédric; Festin, Nicolas; Maziz, Ali; Asaka, Kinji; Sugino, Takushi; Alici, Gursel; Spinks, Geoff; Wallace, Gordon; Must, Indrek; Põldsalu, Inga; Vunder, Veiko; Temmer, Rauno; Kruusamäe, Karl; Torop, Janno; Kaasik, Friedrich; Rinne, Pille; Johanson, Urmas; Peikolainen, Anna-Liisa; Tamm, Tarmo; Aabloo, Alvo
2014-11-05
A large-scale effort was carried out to test the performance of seven types of ionic electroactive polymer (IEAP) actuators against space-hazardous environmental factors under laboratory conditions. The results substantiate that the IEAP materials are tolerant to long-term freezing and vacuum environments as well as ionizing Gamma-, X-ray, and UV radiation at the levels corresponding to low Earth orbit (LEO) conditions. The main aim of this material behaviour investigation is to understand and predict device service time for prolonged exposure to space environment.
ERIC Educational Resources Information Center
Brand, Stephen; Felner, Robert D.; Seitsinger, Anne; Burns, Amy; Bolton, Natalie
2008-01-01
Due to changes in state and federal policies, as well as logistical and fiscal limitations, researchers must increasingly rely on teachers' reports of school climate dimensions in order to investigate the developmental impact of these dimensions, and to evaluate efforts to enhance the impact of school environments on the development of young…
A multi-scale framework to link remotely sensed metrics with socioeconomic data
NASA Astrophysics Data System (ADS)
Watmough, Gary; Svenning, Jens-Christian; Palm, Cheryl; Sullivan, Clare; Danylo, Olha; McCallum, Ian
2017-04-01
There is increasing interest in the use of remotely sensed satellite data for estimating human poverty as it can bridge data gaps that prevent fine scale monitoring of development goals across large areas. The ways in which metrics derived from satellite imagery are linked with socioeconomic data are crucial for accurate estimation of poverty. Yet, to date, the approaches used in the literature to link satellite metrics with socioeconomic data are poorly characterized. Typically, studies use a single GIS unit such as a circular buffer zone around a village or household, or an administrative boundary such as a district or census enumeration area. These polygons are then used to extract environmental data from satellite imagery and related to the socioeconomic data in statistical analyses. The use of a single polygon to link environment and socioeconomic data is inappropriate in coupled human-natural systems as processes operate over multiple scales. Human interactions with the environment occur at multiple levels, from individual (household) access to agricultural plots adjacent to homes, to communal access to common pool resources (CPR) such as forests at the village level. Here, we present a multi-scale framework that explicitly considers how people use the landscape. The framework is presented along with a case study example in Kenya. The multi-scale approach could enhance the modelling of human-environment interactions which will have important consequences for monitoring the sustainable development goals for human livelihoods and biodiversity conservation.
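As a concrete illustration of extracting the same remotely sensed metric at more than one scale, the sketch below computes the mean of a gridded variable within a small household-scale buffer and a larger village/common-pool-resource buffer. The grid, cell size, location, and radii are illustrative assumptions, not values from the study, and a real workflow would use geospatial libraries and projected survey coordinates.

    # Minimal sketch of multi-scale buffer extraction from a gridded metric.
    import numpy as np

    def buffer_mean(grid, cell_size, point_xy, radius):
        """Mean of grid cells whose centres fall within `radius` of `point_xy`."""
        ny, nx = grid.shape
        xs = (np.arange(nx) + 0.5) * cell_size
        ys = (np.arange(ny) + 0.5) * cell_size
        xx, yy = np.meshgrid(xs, ys)
        mask = (xx - point_xy[0]) ** 2 + (yy - point_xy[1]) ** 2 <= radius ** 2
        return float(grid[mask].mean())

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        ndvi = rng.uniform(0.1, 0.8, size=(200, 200))   # synthetic 30 m NDVI-like grid
        home = (3000.0, 3000.0)                         # household location in metres
        print("plot scale  (100 m):", buffer_mean(ndvi, 30.0, home, 100.0))
        print("village/CPR (2 km) :", buffer_mean(ndvi, 30.0, home, 2000.0))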
Cloud Computing for Complex Performance Codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin
This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.
The Convergence of High Performance Computing and Large Scale Data Analytics
NASA Astrophysics Data System (ADS)
Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.
2015-12-01
As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets are stored in this system in a write once/read many file system, such as Landsat, MODIS, MERRA, and NGA. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS) enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.
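The spatiotemporal indexing idea described above can be sketched with a small relational table that maps a variable, a time range, and a bounding box to the storage location of the corresponding chunk. The schema, variable name, and HDFS paths below are invented for illustration; the actual ADAPT/NCCS schema is not described in this abstract.

    # Minimal sketch of a spatiotemporal index mapping queries to data locations.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE chunk_index (
        variable TEXT, t_start TEXT, t_end TEXT,
        lat_min REAL, lat_max REAL, lon_min REAL, lon_max REAL,
        hdfs_path TEXT)""")
    conn.executemany(
        "INSERT INTO chunk_index VALUES (?,?,?,?,?,?,?,?)",
        [("T2M", "2015-07-01", "2015-07-31", -90, 0, -180, 0, "/merra2/T2M/2015/07/part-00"),
         ("T2M", "2015-07-01", "2015-07-31",   0, 90, -180, 0, "/merra2/T2M/2015/07/part-01")])

    # Map a query (variable, point in time, lat/lon region) to the chunks that hold it.
    rows = conn.execute(
        """SELECT hdfs_path FROM chunk_index
           WHERE variable = ? AND t_start <= ? AND t_end >= ?
             AND lat_max >= ? AND lat_min <= ? AND lon_max >= ? AND lon_min <= ?""",
        ("T2M", "2015-07-15", "2015-07-15", 30, 60, -120, -60)).fetchall()
    print([r[0] for r in rows])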
Investigations of grain size dependent sediment transport phenomena on multiple scales
NASA Astrophysics Data System (ADS)
Thaxton, Christopher S.
Sediment transport processes in coastal and fluvial environments resulting from disturbances such as urbanization, mining, agriculture, military operations, and climatic change have a significant impact on local, regional, and global environments. Primarily, these impacts include the erosion and deposition of sediment, channel network modification, reduction in downstream water quality, and the delivery of chemical contaminants. The scale and spatial distribution of these effects are largely attributable to the size distribution of the sediment grains that become eligible for transport. An improved understanding of advective and diffusive grain-size dependent sediment transport phenomena will lead to the development of more accurate predictive models and more effective control measures. To this end, three studies were performed that investigated grain-size dependent sediment transport on three different scales. Discrete particle computer simulations of sheet flow bedload transport on the scale of 0.1--100 millimeters were performed on a heterogeneous population of grains of various grain sizes. The relative transport rates and diffusivities of grains under both oscillatory and uniform, steady flow conditions were quantified. These findings suggest that boundary layer formalisms should describe surface roughness through a representative grain size that is functionally dependent on the applied flow parameters. On the scale of 1--10 m, experiments were performed to quantify the hydrodynamics and sediment capture efficiency of various baffles installed in a sediment retention pond, a commonly used sedimentation control measure in watershed applications. Analysis indicates that an optimum sediment capture effectiveness may be achieved based on baffle permeability, pond geometry and flow rate. Finally, on the scale of 10--1,000 m, a distributed, bivariate watershed terrain evolution module was developed within GRASS GIS. Simulation results for variable grain sizes and for distributed rainfall infiltration and land cover matched observations. Although a unique set of governing equations applies to each scale, an improved physics-based understanding of small and medium scale behavior may yield more accurate parameterization of key variables used in large scale predictive models.
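One of the quantities mentioned above, a grain diffusivity, can be illustrated by estimating it from simulated trajectories via the mean-squared displacement; the synthetic random-walk trajectories and step sizes below are assumptions for illustration only.

```python
# Sketch: estimate a 1-D grain diffusivity D from particle trajectories using
# the mean-squared displacement relation MSD = 2 * D * t. The trajectories
# here are synthetic random walks, not output from the sheet-flow simulations.
import numpy as np

rng = np.random.default_rng(0)
dt = 1e-3                                            # time step, s
steps = rng.normal(0.0, 1e-4, size=(500, 1000))      # 500 grains, 1000 1-D steps (m)
positions = np.cumsum(steps, axis=1)

times = dt * np.arange(1, positions.shape[1] + 1)
msd = np.mean(positions ** 2, axis=0)                # ensemble average over grains
D_estimate = msd[-1] / (2.0 * times[-1])             # m^2/s, 1-D diffusivity
```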
From Physics to industry: EOS outside HEP
NASA Astrophysics Data System (ADS)
Espinal, X.; Lamanna, M.
2017-10-01
In the competitive market for large-scale storage solutions, EOS, the current main disk storage system at CERN, has been showing its excellence in the multi-Petabyte, high-concurrency regime. It has also shown disruptive potential in powering sync-and-share services and in supporting innovative analysis environments alongside the storage of LHC data. EOS has further generated interest as a generic storage solution, ranging from university systems to very large installations for non-HEP applications.
A representation of place attachment: A study of spatial cognition in Latvia
NASA Astrophysics Data System (ADS)
Skilters, Jurgis; Zarina, Liga; Raita, Liva
2017-04-01
Perception of geographical space is reflected in place attachment, i.e., a multidimensional cognitive-affective link between humans and their spatial environment. Place attachment balances emotions and conceptions of proximity; it is both a social and a spatial cognitive structure. Place attachment has an impact on people's actions, which in turn affect the environment in which people live. Place attachment provides emotional regulation for humans, linking local, neighborhood-scale environments with country- and world-scale ones. In Latvia, a large-scale spatial cognition study has been conducted within the participatory research project „Telpas pavasaris" ("Spatial Spring") by the foundation Viegli. In the study, 1523 respondents reported their associations characterizing certain types of places (e.g., safe place, dangerous place, far place, close place, dear place). The answers were analyzed according to several cognitive-affective categories including modes of experience, emotional valence, geographical distance, and perceptual modality. The current results indicate that socio-cognitive and affective information takes precedence over purely spatial information (referring to spatial objects or regions and their relations). However, different types of geographical places and spatial objects (natural or artefactual) have to be distinguished and are significant to differing degrees. Our results are important for environmental and urban planning because they show how socio-cognitive and affective knowledge shapes the spatial cognition of the geographic environment.
Schweitzer, Peter; Povoroznyuk, Olga; Schiesser, Sigrid
2017-01-01
Public and academic discourses about the Polar regions typically focus on the so-called natural environment. While these discourses and inquiries continue to be relevant, the current article asks how to conceptualize the on-going industrial and infrastructural build-up of the Arctic. Acknowledging that the “built environment” is not an invention of modernity, the article nevertheless focuses on large-scale infrastructural projects of the twentieth century, which marks a watershed of industrial and infrastructural development in the north. Given that the Soviet Union was at the vanguard of these developments, the focus will be on Soviet and Russian large-scale projects. We will be discussing two cases of transportation infrastructure, one of them based on an on-going research project being conducted by the authors along the Baikal–Amur Mainline (BAM) and the other focused on the so-called Northern Sea Route, the marine passage with a long history that has recently been regaining public and academic attention. The concluding section will argue for increased attention to the interactions between humans and the built environment, serving as a kind of programmatic call for more anthropological attention to infrastructure in the Russian north and other polar regions. PMID:29098112
Teaching the Blind to Find Their Way by Playing Video Games
Merabet, Lotfi B.; Connors, Erin C.; Halko, Mark A.; Sánchez, Jaime
2012-01-01
Computer-based video games are receiving great interest as a means to learn and acquire new skills. As a novel approach to teaching navigation skills in the blind, we have developed Audio-based Environment Simulator (AbES), a virtual reality environment set within the context of a video game metaphor. Despite the fact that participants were naïve to the overall purpose of the software, we found that early blind users were able to acquire relevant information regarding the spatial layout of a previously unfamiliar building using audio-based cues alone. This was confirmed by a series of behavioral performance tests designed to assess the transfer of acquired spatial information to a large-scale, real-world indoor navigation task. Furthermore, learning the spatial layout through a goal-directed gaming strategy allowed for the mental manipulation of spatial information as evidenced by enhanced navigation performance when compared to an explicit route learning strategy. We conclude that the immersive and highly interactive nature of the software greatly engages the blind user to actively explore the virtual environment. This in turn generates an accurate sense of a large-scale three-dimensional space and facilitates the learning and transfer of navigation skills to the physical world. PMID:23028703
Demir, E; Babur, O; Dogrusoz, U; Gursoy, A; Nisanci, G; Cetin-Atalay, R; Ozturk, M
2002-07-01
The availability of entire genome sequences shifts scientific curiosity towards the large-scale identification of genome function. In the near future, data about cellular processes at the molecular level will accumulate at an accelerating rate as a result of proteomics studies. In this regard, it is essential to develop tools for storing, integrating, accessing, and analyzing these data effectively. We define an ontology for a comprehensive representation of cellular events. The ontology presented here enables integration of fragmented or incomplete pathway information and supports manipulation and incorporation of the stored data, as well as multiple levels of abstraction. Based on this ontology, we present the architecture of an integrated environment named Patika (Pathway Analysis Tool for Integration and Knowledge Acquisition). Patika is composed of a server-side, scalable, object-oriented database and client-side editors that provide an integrated, multi-user environment for visualizing and manipulating networks of cellular events. This tool features automated pathway layout, functional computation support, advanced querying and a user-friendly graphical interface. We expect that Patika will be a valuable tool for rapid knowledge acquisition, interpretation of microarray-generated large-scale data, disease gene identification, and drug development. A prototype of Patika is available upon request from the authors.
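A minimal sketch of the kind of state/transition data model such a pathway ontology rests on; the class names and example entities are illustrative and do not reproduce the actual Patika schema.

```python
# Illustrative data model for cellular events: a biological entity can exist
# in several states, and transitions convert sets of states into others,
# possibly modulated by effectors. Not the actual Patika schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class State:
    entity: str                 # e.g. "p53"
    modification: str = ""      # e.g. "phosphorylated"

@dataclass
class Transition:
    inputs: List[State]
    outputs: List[State]
    effectors: List[State] = field(default_factory=list)   # activators / inhibitors

# A pathway fragment: a hypothetical active kinase phosphorylates p53.
p53 = State("p53")
p53_p = State("p53", "phosphorylated")
kinase = State("CHK2", "active")
phosphorylation = Transition(inputs=[p53], outputs=[p53_p], effectors=[kinase])
```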
PREFACE TO: "PHARMACEUTICALS AND PERSONAL ...
Often overlooked in our daily lives are the inescapable, intimate, and immediate connections between our personal activities and the environment in which we live. This is especially true with regard to the use and disposal of consumer chemicals. A significant aspect of our global society that illustrates the potential impact of our lives on the environment is the widespread and escalating use of pharmaceuticals and personal care products - simply referred to as PPCPS. Many of these chemicals are specifically designed to elicit potent pharmacological or toxicological effects. In distinct contrast to nearly all agro/industrial chemicals, which are often used on large, relatively confined scales, the end use for PPCPs is highly dispersed and centered around the activities and actions of the individual. PPCPs enjoy worldwide usage and attendant discharge or inadvertent release to the environment. Their introduction to the environment has no geographic boundaries or climatic-use limitations as do many other synthetic chemicals - they are discharged to the environment wherever people live or visit, regardless of the time of year. It is difficult for the individual to perceive their small-scale activities as having any measurable impact on the larger environment - personal actions are often deemed minuscule or inconsequential in the larger scheme. Yet it is the combined actions and activities of individuals that indeed can significantly impact the environment in a myri
NASA Astrophysics Data System (ADS)
Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley
2015-04-01
The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and Data-intensive Science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress so far to harmonise the underlying data collections for future interdisciplinary research across these large volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially increasing data volumes at NCI. Traditional HPC and data environments are still made available in a way that flexibly provides the tools, services and supporting software systems on these new petascale infrastructures. But to enable the research to take place at this scale, the data, metadata and software now need to evolve together - creating a new integrated high performance infrastructure. The new infrastructure at NCI currently supports a catalogue of integrated, reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. One of the challenges for NCI has been to support existing techniques and methods, while carefully preparing the underlying infrastructure for the transition needed for the next class of Data-intensive Science. In doing so, a flexible range of techniques and software can be made available for application across the corpus of data collections available, and to provide a new infrastructure for future interdisciplinary research.
Lara, Alvaro R; Galindo, Enrique; Ramírez, Octavio T; Palomares, Laura A
2006-11-01
The presence of spatial gradients in fundamental culture parameters, such as dissolved gases, pH, concentration of substrates, and shear rate, among others, is an important problem that frequently occurs in large-scale bioreactors. This problem is caused by deficient mixing that results from limitations inherent to traditional scale-up methods and practical constraints during large-scale bioreactor design and operation. When cultured in a heterogeneous environment, cells are continuously exposed to fluctuating conditions as they travel through the various zones of a bioreactor. Such fluctuations can affect cell metabolism, yields, and quality of the products of interest. Here, the theoretical analyses that predict the existence of environmental gradients in bioreactors, and their experimental confirmation, are reviewed. The origins of gradients in common culture parameters and their effects on various organisms of biotechnological importance are discussed. In particular, studies based on the scale-down methodology, a convenient tool for assessing the effect of environmental heterogeneities, are surveyed.
Self-organizing network services with evolutionary adaptation.
Nakano, Tadashi; Suda, Tatsuya
2005-09-01
This paper proposes a novel framework for developing adaptive and scalable network services. In the proposed framework, a network service is implemented as a group of autonomous agents that interact in the network environment. Agents in the proposed framework are autonomous and capable of simple behaviors (e.g., replication, migration, and death). In this paper, an evolutionary adaptation mechanism is designed using genetic algorithms (GAs) for agents to evolve their behaviors and improve their fitness (e.g., response time to a service request) with respect to the environment. The proposed framework is evaluated through simulations, and the simulation results demonstrate the ability of autonomous agents to adapt to the network environment. The proposed framework may be suitable for disseminating network services in dynamic and large-scale networks where large amounts of data and services need to be replicated, moved, and deleted in a decentralized manner.
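A toy sketch of the evolutionary loop described above, in which agent behaviour parameters are selected, recombined and mutated toward a lower simulated response time; the encoding, rates and fitness model are illustrative assumptions, not the paper's implementation.

```python
# Toy sketch: agents whose behaviour weights (replication, migration) evolve
# by a genetic algorithm toward lower simulated response time.
import random

def fitness(genome):
    # Lower simulated response time -> higher fitness (invented toy model).
    replication, migration = genome
    response_time = 1.0 / (0.1 + replication) + 0.5 * migration
    return -response_time

def evolve(pop_size=20, generations=50, mutation=0.1):
    population = [[random.random(), random.random()] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]        # selection ("death" of the rest)
        offspring = []
        while len(survivors) + len(offspring) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [(x + y) / 2 + random.gauss(0, mutation) for x, y in zip(a, b)]
            offspring.append(child)                     # crossover + mutation ("replication")
        population = survivors + offspring
    return max(population, key=fitness)

best_genome = evolve()
```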
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagner, Maggie R.; Lundberg, Derek S.; del Rio, Tijana G.
Bacteria living on and in leaves and roots influence many aspects of plant health, so the extent of a plant's genetic control over its microbiota is of great interest to crop breeders and evolutionary biologists. Laboratory-based studies, because they poorly simulate true environmental heterogeneity, may misestimate or totally miss the influence of certain host genes on the microbiome. Here we report a large-scale field experiment to disentangle the effects of genotype, environment, age and year of harvest on bacterial communities associated with leaves and roots of Boechera stricta (Brassicaceae), a perennial wild mustard. Host genetic control of the microbiome is evident in leaves but not roots, and varies substantially among sites. Microbiome composition also shifts as plants age. Furthermore, a large proportion of leaf bacterial groups are shared with roots, suggesting inoculation from soil. Our results demonstrate how genotype-by-environment interactions contribute to the complexity of microbiome assembly in natural environments.
NASA Astrophysics Data System (ADS)
Pankratova, Evgeniya V.; Kalyakulina, Alena I.
2016-12-01
We study the dynamics of multielement neuronal systems taking into account both the direct interaction between the cells via linear coupling and nondiffusive cell-to-cell communication via common environment. For the cells exhibiting individual bursting behavior, we have revealed the dependence of the network activity on its scale. Particularly, we show that small-scale networks demonstrate the inability to maintain complicated oscillations: for a small number of elements in an ensemble, the phenomenon of amplitude death is observed. The existence of threshold network scales and mechanisms causing firing in artificial and real multielement neural networks, as well as their significance for biological applications, are discussed.
Large-scale parallel genome assembler over cloud computing environment.
Das, Arghya Kusum; Koppa, Praveen Kumar; Goswami, Sayan; Platania, Richard; Park, Seung-Jong
2017-06-01
The size of high throughput DNA sequencing data has already reached the terabyte scale. To manage this huge volume of data, many downstream sequencing applications started using locality-based computing over different cloud infrastructures to take advantage of elastic (pay as you go) resources at a lower cost. However, the locality-based programming model (e.g. MapReduce) is relatively new. Consequently, developing scalable data-intensive bioinformatics applications using this model and understanding the hardware environment that these applications require for good performance both require further research. In this paper, we present a de Bruijn graph oriented Parallel Giraph-based Genome Assembler (GiGA), as well as the hardware platform required for its optimal performance. GiGA uses the power of Hadoop (MapReduce) and Giraph (large-scale graph analysis) to achieve high scalability over hundreds of compute nodes by collocating the computation and data. GiGA achieves significantly higher scalability with competitive assembly quality compared to contemporary parallel assemblers (e.g. ABySS and Contrail) over a traditional HPC cluster. Moreover, we show that the performance of GiGA is significantly improved by using an SSD-based private cloud infrastructure over a traditional HPC cluster. We observe that the performance of GiGA on 256 cores of this SSD-based cloud infrastructure closely matches that of 512 cores of a traditional HPC cluster.
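For readers unfamiliar with the underlying data structure, here is a single-machine sketch of the de Bruijn graph construction that assemblers of this kind build on; the distributed Giraph/Hadoop machinery is omitted.

```python
# Minimal de Bruijn graph construction: every read is cut into k-mers, and an
# edge links each (k-1)-mer prefix to its suffix. A single-machine illustration,
# not the GiGA implementation.
from collections import defaultdict

def de_bruijn(reads, k):
    graph = defaultdict(set)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].add(kmer[1:])   # prefix node -> suffix node
    return graph

reads = ["ACGTACGT", "CGTACGTT"]
graph = de_bruijn(reads, k=4)
# In a distributed setting each node would be a message-passing vertex, which
# is what a Pregel-style framework such as Giraph parallelises.
```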
White, Richard S A; Wintle, Brendan A; McHugh, Peter A; Booker, Douglas J; McIntosh, Angus R
2017-06-14
Despite growing concerns regarding the increasing frequency of extreme climate events and declining population sizes, the influence of environmental stochasticity on the relationship between population carrying capacity and time-to-extinction has received little empirical attention. While time-to-extinction increases exponentially with carrying capacity in constant environments, theoretical models suggest increasing environmental stochasticity causes asymptotic scaling, thus making minimum viable carrying capacity vastly uncertain in variable environments. Using empirical estimates of environmental stochasticity in fish metapopulations, we showed that increasing environmental stochasticity resulting from extreme droughts was insufficient to create asymptotic scaling of time-to-extinction with carrying capacity in local populations as predicted by theory. Local time-to-extinction increased with carrying capacity due to declining sensitivity to demographic stochasticity, and the slope of this relationship declined significantly as environmental stochasticity increased. However, recent 1-in-25-yr extreme droughts were insufficient to extirpate populations with large carrying capacity. Consequently, large populations may be more resilient to environmental stochasticity than previously thought. The lack of carrying capacity-related asymptotes in persistence under extreme climate variability reveals how small populations affected by habitat loss or overharvesting may be disproportionately threatened by increases in extreme climate events with global warming. © 2017 The Author(s).
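The contrast drawn above between constant and strongly variable environments can be summarized with generic diffusion-approximation forms; the symbols a, b and the environmental variance below are schematic and are not taken from the paper's fitted models.

```latex
% Schematic scaling of mean time to extinction T with carrying capacity K.
% Demographic stochasticity alone: near-exponential growth of T with K.
% Environmental stochasticity: a power law whose exponent b shrinks as the
% environmental variance grows, flattening T(K).
\[
  T_{\mathrm{dem}}(K) \sim e^{aK},
  \qquad
  T_{\mathrm{env}}(K) \sim K^{b},
  \qquad b \text{ decreasing in } \sigma_e^{2}.
\]
```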
Lopes, Anne; Sacquin-Mora, Sophie; Dimitrova, Viktoriya; Laine, Elodie; Ponty, Yann; Carbone, Alessandra
2013-01-01
Large-scale analyses of protein-protein interactions based on coarse-grain molecular docking simulations and binding site predictions resulting from evolutionary sequence analysis are possible and realizable on hundreds of proteins with varied structures and interfaces. We demonstrated this on the 168 proteins of the Mintseris Benchmark 2.0. On the one hand, we evaluated the quality of the interaction signal and the contribution of docking information compared to evolutionary information, showing that the combination of the two improves partner identification. On the other hand, since protein interactions usually occur in crowded environments with several competing partners, we carried out a thorough analysis of the interactions of proteins with true partners but also with non-partners to evaluate whether proteins in the environment, competing with the true partner, affect its identification. We found three populations of proteins: strongly competing, never competing, and interacting with different levels of strength. Populations and levels of strength are numerically characterized and provide a signature for the behavior of a protein in the crowded environment. We showed that partner identification, to some extent, does not depend on the competing partners present in the environment, that certain biochemical classes of proteins are intrinsically easier to analyze than others, and that small proteins are not more promiscuous than large ones. Our approach brings to light that knowledge of the binding site can be used to reduce the high computational cost of docking simulations with no consequence for the quality of the results, demonstrating the possibility of applying coarse-grain docking to datasets made of thousands of proteins. A comparison with all available large-scale analyses aimed at partner prediction is provided. We release the complete decoy set produced by the coarse-grain docking simulations of both true and false interacting partners, and their evolutionary sequence analysis leading to binding site predictions. Download site: http://www.lgm.upmc.fr/CCDMintseris/ PMID:24339765
Climate of the Arctic marine environment.
Walsh, John E
2008-03-01
The climate of the Arctic marine environment is characterized by strong seasonality in the incoming solar radiation and by tremendous spatial variations arising from a variety of surface types, including open ocean, sea ice, large islands, and proximity to major landmasses. Interannual and decadal-scale variations are prominent features of Arctic climate, complicating the distinction between natural and anthropogenically driven variations. Nevertheless, climate models consistently indicate that the Arctic is the most climatically sensitive region of the Northern Hemisphere, especially near the sea ice margins. The Arctic marine environment has shown changes over the past several decades, and these changes are part of a broader global warming that exceeds the range of natural variability over the past 1000 years. Record minima of sea ice coverage during the past few summers and increased melt from Greenland have important implications for the hydrographic regime of the Arctic marine environment. The recent changes in the atmosphere (temperature, precipitation, pressure), sea ice, and ocean appear to be a coordinated response to systematic variations of the large-scale atmospheric circulation, superimposed on a general warming that is likely associated with increasing greenhouse gases. The changes have been sufficiently large in some sectors (e.g., the Bering/Chukchi Seas) that consequences for marine ecosystems appear to be underway. Global climate models indicate an additional warming of several degrees Celsius in much of the Arctic marine environment by 2050. However, the warming is seasonal (largest in autumn and winter), spatially variable, and closely associated with further retreat of sea ice. Additional changes predicted for 2050 are a general decrease of sea level pressure (largest in the Bering sector) and an increase of precipitation. While predictions of changes in storminess cannot be made with confidence, the predicted reduction of sea ice cover will almost certainly lead to increased oceanic mixing, ocean wave generation, and coastal flooding.
Environmental status of livestock and poultry sectors in China under current transformation stage.
Qian, Yi; Song, Kaihui; Hu, Tao; Ying, Tianyu
2018-05-01
Intensive animal husbandry has aroused great environmental concern in many developed countries. However, some developing countries are still suffering environmental pollution from the livestock and poultry sectors. Driven by large demand, China has experienced a remarkable increase in dairy and meat production, especially in the transformation stage from conventional household breeding to large-scale industrial breeding. At the same time, a large amount of manure from the livestock and poultry sector is released into waterbodies and soil, causing eutrophication and soil degradation. This condition is reinforced in large-scale cultivation, where the amount of manure exceeds the soil nutrient capacity if it is not treated or utilized properly. Our research aims to analyze whether the transformation of raising scale would be beneficial to the environment, as well as to present the latest status of the livestock and poultry sectors in China. The estimation of the pollutants generated and discharged from the livestock and poultry sector in China will facilitate legislation on manure management. This paper analyzes the pollutants generated from the manure of the five principal commercial animals under different farming practices. The results show that fattening pigs contribute almost half of the pollutants released from manure. Moreover, beef cattle exert the largest environmental impact per unit of production, about 2-3 times that of pork and 5-20 times that of chicken. Animals raised in large-scale feedlots generate fewer pollutants than those raised in households. The shift towards industrial production of livestock and poultry is easier to manage from the environmental perspective, but adequate large-scale cultivation is encouraged. Regulatory control, manure treatment and financial subsidies for manure treatment and utilization are recommended to achieve ecological agriculture in China. Copyright © 2017 Elsevier B.V. All rights reserved.
Study of multi-functional precision optical measuring system for large scale equipment
NASA Astrophysics Data System (ADS)
Jiang, Wei; Lao, Dabao; Zhou, Weihu; Zhang, Wenying; Jiang, Xingjian; Wang, Yongxi
2017-10-01
The effective application of high performance measurement technology can greatly improve large-scale equipment manufacturing capability. Measuring geometric parameters such as size, attitude and position therefore requires a measurement system that combines high precision, multiple functions, portability and other characteristics. However, existing measuring instruments, such as the laser tracker, total station and photogrammetry system, mostly suffer from shortcomings such as single functionality and the need to relocate stations. A laser tracker must work with a cooperative target and can hardly meet the requirements of measurement in extreme environments. A total station is mainly used for outdoor surveying and mapping and struggles to achieve the accuracy demanded in industrial measurement. A photogrammetry system can achieve wide-range, multi-point measurement, but its measuring range is limited and stations must be moved repeatedly. This paper presents a non-contact opto-electronic measuring instrument that can both work by scanning along a measurement path and measure a cooperative target by tracking. The system is based on several key technologies, such as absolute distance measurement, two-dimensional angle measurement, automatic target recognition and accurate aiming, precision control, assembly of a complex mechanical system and multi-functional 3D visualization software. Among them, the absolute distance measurement module ensures measurement with high accuracy, and the two-dimensional angle measuring module provides precision angle measurement. The system is suitable for non-contact measurement of large-scale equipment: it can ensure the quality and performance of large-scale equipment throughout the manufacturing process and improve the manufacturing capability of large-scale and high-end equipment.
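The core geometric step behind a range-plus-two-angles instrument of this kind can be sketched as a polar-to-Cartesian conversion; the angle conventions and example values are assumptions, and a real instrument adds calibration and error modelling.

```python
# Sketch: convert a single (range, azimuth, elevation) measurement into 3-D
# Cartesian coordinates of the target. Elevation is measured from the horizon.
import math

def polar_to_xyz(distance_m, azimuth_rad, elevation_rad):
    """Convert (range, horizontal angle, vertical angle) to x, y, z."""
    horizontal = distance_m * math.cos(elevation_rad)
    x = horizontal * math.cos(azimuth_rad)
    y = horizontal * math.sin(azimuth_rad)
    z = distance_m * math.sin(elevation_rad)
    return x, y, z

# Example: a target 25.000 m away, 30 degrees to the left, 5 degrees up.
x, y, z = polar_to_xyz(25.000, math.radians(30.0), math.radians(5.0))
```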
Acoustic localization at large scales: a promising method for grey wolf monitoring.
Papin, Morgane; Pichenot, Julian; Guérold, François; Germain, Estelle
2018-01-01
The grey wolf ( Canis lupus ) is naturally recolonizing its former habitats in Europe where it was extirpated during the previous two centuries. The management of this protected species is often controversial and its monitoring is a challenge for conservation purposes. However, this elusive carnivore can disperse over long distances in various natural contexts, making its monitoring difficult. Moreover, methods used for collecting signs of presence are usually time-consuming and/or costly. Currently, new acoustic recording tools are contributing to the development of passive acoustic methods as alternative approaches for detecting, monitoring, or identifying species that produce sounds in nature, such as the grey wolf. In the present study, we conducted field experiments to investigate the possibility of using a low-density microphone array to localize wolves at a large scale in two contrasting natural environments in north-eastern France. For scientific and social reasons, the experiments were based on a synthetic sound with similar acoustic properties to howls. This sound was broadcast at several sites. Then, localization estimates and the accuracy were calculated. Finally, linear mixed-effects models were used to identify the factors that influenced the localization accuracy. Among 354 nocturnal broadcasts in total, 269 were recorded by at least one autonomous recorder, thereby demonstrating the potential of this tool. Besides, 59 broadcasts were recorded by at least four microphones and used for acoustic localization. The broadcast sites were localized with an overall mean accuracy of 315 ± 617 (standard deviation) m. After setting a threshold for the temporal error value associated with the estimated coordinates, some unreliable values were excluded and the mean accuracy decreased to 167 ± 308 m. The number of broadcasts recorded was higher in the lowland environment, but the localization accuracy was similar in both environments, although it varied significantly among different nights in each study area. Our results confirm the potential of using acoustic methods to localize wolves with high accuracy, in different natural environments and at large spatial scales. Passive acoustic methods are suitable for monitoring the dynamics of grey wolf recolonization and so, will contribute to enhance conservation and management plans.
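A minimal sketch of the localization principle, solving for a source position from time-differences of arrival at synchronized recorders by least squares; the recorder layout, sound speed and arrival times are invented example values, not data from the study.

```python
# Sketch: locate an acoustic source from arrival times at four synchronised
# recorders by least squares on the time-differences of arrival (TDOA).
import numpy as np
from scipy.optimize import least_squares

SOUND_SPEED = 343.0  # m/s; temperature-dependent in the field

recorders = np.array([[0.0, 0.0], [1200.0, 0.0], [0.0, 1500.0], [1300.0, 1400.0]])
arrival_times = np.array([3.10, 1.95, 2.60, 1.40])   # seconds, synthetic values

def residuals(xy):
    dists = np.linalg.norm(recorders - xy, axis=1)
    tdoa_model = (dists - dists[0]) / SOUND_SPEED     # differences relative to recorder 0
    tdoa_obs = arrival_times - arrival_times[0]
    return tdoa_model - tdoa_obs

solution = least_squares(residuals, x0=np.array([600.0, 700.0]))
estimated_position = solution.x                       # (x, y) in metres
```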
Ishimori, Yuu; Mitsunobu, Fumihiro; Yamaoka, Kiyonori; Tanaka, Hiroshi; Kataoka, Takahiro; Sakoda, Akihiro
2011-07-01
A radon test facility for small animals was developed in order to increase the statistical validity of differences in the biological response in various radon environments. This paper illustrates the performance of that facility, the first large-scale facility of its kind in Japan. The facility has the capability to conduct approximately 150 mouse-scale tests at the same time. The apparatus for exposing small animals to radon has six animal chamber groups with five independent cages each. Different radon concentrations in each animal chamber group are available. Because the first target of this study is to examine the in vivo behaviour of radon and its effects, the major functions to control radon and to eliminate thoron were examined experimentally. Additionally, radon progeny concentrations and their particle size distributions in the cages were also examined experimentally to be considered in future projects.
PetIGA: A framework for high-performance isogeometric analysis
Dalcin, Lisandro; Collier, Nathaniel; Vignal, Philippe; ...
2016-05-25
We present PetIGA, a code framework to approximate the solution of partial differential equations using isogeometric analysis. PetIGA can be used to assemble matrices and vectors which come from a Galerkin weak form, discretized with Non-Uniform Rational B-spline basis functions. We base our framework on PETSc, a high-performance library for the scalable solution of partial differential equations, which simplifies the development of large-scale scientific codes, provides a rich environment for prototyping, and separates parallelism from algorithm choice. We describe the implementation of PetIGA, and exemplify its use by solving a model nonlinear problem. To illustrate the robustness and flexibility of PetIGA, we solve some challenging nonlinear partial differential equations that include problems in both solid and fluid mechanics. Lastly, we show strong scaling results on up to 4096 cores, which confirm the suitability of PetIGA for large scale simulations.
The Role of Forests in Regulating the River Flow Regime of Large Basins of the World
NASA Astrophysics Data System (ADS)
Salazar, J. F.; Villegas, J. C.; Mercado-Bettin, D. A.; Rodríguez, E.
2016-12-01
Many natural and social phenomena depend on river flow regimes that are being altered by global change. Understanding the mechanisms behind such alterations is crucial for predicting river flow regimes in a changing environment. Here we explore potential linkages between the presence of forests and the capacity of river basins for regulating river flows. Regulation is defined here as the capacity of river basins to attenuate the amplitude of the river flow regime, that is to reduce the difference between high and low flows. We first use scaling theory to show how scaling properties of observed river flows can be used to classify river basins as regulated or unregulated. This parsimonious classification is based on a physical interpretation of the scaling properties (particularly the scaling exponents) that is novel (most previous studies have focused on the interpretation of the scaling exponents for floods only), and widely-applicable to different basins (the only assumption is that river flows in a given river basin exhibit scaling properties through well-known power laws). Then we show how this scaling framework can be used to explore global-change-induced temporal variations in the regulation capacity of river basins. Finally, we propose a conceptual hypothesis (the "Forest reservoir concept") to explain how large-scale forests can exert important effects on the long-term water balance partitioning and regulation capacity of large basins of the world. Our quantitative results are based on data analysis (river flows and land cover features) from 22 large basins of the world, with emphasis in the Amazon river and its main tributaries. Collectively, our findings support the hypothesis that forest cover enhances the capacity of large river basins to maintain relatively high mean river flows, as well as to regulate (ameliorate) extreme river flows. Advancing towards this quantitative understanding of the relation between forest cover and river flow regimes is crucial for water management- and land cover-related decisions.
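One way to make the scaling argument concrete: if flow quantiles follow power laws in drainage area, the high-to-low flow ratio is itself a power law, and the sign of the exponent difference indicates whether the regime amplitude grows or shrinks with basin size. The notation below is generic and not taken from the paper.

```latex
% Power-law scaling of a flow quantile Q_p with drainage area A, and the
% implied scaling of the amplitude of the flow regime.
\[
  Q_p \propto A^{\theta_p}
  \;\Longrightarrow\;
  \frac{Q_{\mathrm{high}}}{Q_{\mathrm{low}}}
    \propto A^{\,\theta_{\mathrm{high}} - \theta_{\mathrm{low}}},
  \qquad
  \theta_{\mathrm{high}} < \theta_{\mathrm{low}}
  \;\Rightarrow\;
  \text{amplitude decreases with } A .
\]
```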
Extended-Range High-Resolution Dynamical Downscaling over a Continental-Scale Domain
NASA Astrophysics Data System (ADS)
Husain, S. Z.; Separovic, L.; Yu, W.; Fernig, D.
2014-12-01
High-resolution mesoscale simulations, when applied for downscaling meteorological fields over large spatial domains and for extended time periods, can provide valuable information for many practical application scenarios including the weather-dependent renewable energy industry. In the present study, a strategy has been proposed to dynamically downscale coarse-resolution meteorological fields from Environment Canada's regional analyses for a period of multiple years over the entire Canadian territory. The study demonstrates that a continuous mesoscale simulation over the entire domain is the most suitable approach in this regard. Large-scale deviations in the different meteorological fields pose the biggest challenge for extended-range simulations over continental scale domains, and the enforcement of the lateral boundary conditions is not sufficient to restrict such deviations. A scheme has therefore been developed to spectrally nudge the simulated high-resolution meteorological fields at the different model vertical levels towards those embedded in the coarse-resolution driving fields derived from the regional analyses. A series of experiments were carried out to determine the optimal nudging strategy including the appropriate nudging length scales, nudging vertical profile and temporal relaxation. A forcing strategy based on grid nudging of the different surface fields, including surface temperature, soil-moisture, and snow conditions, towards their expected values obtained from a high-resolution offline surface scheme was also devised to limit any considerable deviation in the evolving surface fields due to extended-range temporal integrations. The study shows that ensuring large-scale atmospheric similarities helps to deliver near-surface statistical scores for temperature, dew point temperature and horizontal wind speed that are better or comparable to the operational regional forecasts issued by Environment Canada. Furthermore, the meteorological fields resulting from the proposed downscaling strategy have significantly improved spatiotemporal variance compared to those from the operational forecasts, and any time series generated from the downscaled fields do not suffer from discontinuities due to switching between the consecutive forecasts.
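A schematic sketch of the spectral nudging step described above: the model field is relaxed toward the driving analysis only at wavelengths longer than a cut-off. The grid spacing, cut-off length and relaxation time below are illustrative assumptions, not the values used operationally.

```python
# Sketch: relax only the large-scale (long-wavelength) part of a 2-D model
# field toward the driving field, leaving small scales free to evolve.
import numpy as np

def spectral_nudge(model_field, driving_field, dx_km, cutoff_km, dt_s, tau_s):
    """One nudging step: filter the model-driving difference to wavelengths
    longer than cutoff_km and relax toward it with time scale tau_s."""
    diff = driving_field - model_field
    spec = np.fft.fft2(diff)
    kx = np.fft.fftfreq(diff.shape[0], d=dx_km)
    ky = np.fft.fftfreq(diff.shape[1], d=dx_km)
    kmag = np.sqrt(kx[:, None] ** 2 + ky[None, :] ** 2)
    large_scale = kmag < (1.0 / cutoff_km)            # keep only long wavelengths
    correction = np.real(np.fft.ifft2(spec * large_scale))
    return model_field + (dt_s / tau_s) * correction

# Example: a 10 km grid nudged at scales longer than ~300 km, with a 6-hour
# relaxation time and a 300 s time step.
field = spectral_nudge(np.zeros((120, 120)), np.ones((120, 120)),
                       dx_km=10.0, cutoff_km=300.0, dt_s=300.0, tau_s=6 * 3600.0)
```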
Extending large-scale forest inventories to assess urban forests.
Corona, Piermaria; Agrimi, Mariagrazia; Baffetta, Federica; Barbati, Anna; Chiriacò, Maria Vincenza; Fattorini, Lorenzo; Pompei, Enrico; Valentini, Riccardo; Mattioli, Walter
2012-03-01
Urban areas are continuously expanding today, extending their influence on an increasingly large proportion of woods and trees located in or near urban and urbanizing areas, the so-called urban forests. Although these forests have the potential for significantly improving the quality of the urban environment and the well-being of the urban population, data to quantify the extent and characteristics of urban forests are still lacking or fragmentary on a large scale. In this regard, an expansion of the domain of multipurpose forest inventories like National Forest Inventories (NFIs) towards urban forests would be required. To this end, it would be convenient to exploit the same sampling scheme applied in NFIs to assess the basic features of urban forests. This paper considers approximately unbiased estimators of abundance and coverage of urban forests, together with estimators of the corresponding variances, which can be obtained from the first phase of most large-scale forest inventories. A simulation study is carried out in order to check the performance of the considered estimators under various situations involving the spatial distribution of the urban forests over the study area. An application is worked out on the data from the Italian NFI.
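A toy illustration of a first-phase, design-based estimator of urban-forest coverage, assuming simple random point sampling purely for clarity; operational NFIs use more elaborate designs, and the estimators in the paper differ accordingly.

```python
# Toy sketch: classify randomly placed sample points as urban forest or not,
# then estimate the covered area and its variance under simple random sampling.
import random

def estimate_coverage(n_points, is_urban_forest, study_area_ha):
    hits = sum(is_urban_forest() for _ in range(n_points))
    p_hat = hits / n_points                           # estimated coverage proportion
    var_p = p_hat * (1 - p_hat) / (n_points - 1)      # SRS variance estimator
    return p_hat * study_area_ha, var_p * study_area_ha ** 2

# Illustrative "population": pretend 7% of a 250,000 ha study area is urban forest.
area_ha, var_ha = estimate_coverage(5000, lambda: random.random() < 0.07, 250000.0)
```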
Pynamic: the Python Dynamic Benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, G L; Ahn, D H; de Supinksi, B R
2007-07-10
Python is widely used in scientific computing to facilitate application development and to support features such as computational steering. Making full use of some of Python's popular features, which improve programmer productivity, leads to applications that access extremely high numbers of dynamically linked libraries (DLLs). As a result, some important Python-based applications severely stress a system's dynamic linking and loading capabilities and also cause significant difficulties for most development environment tools, such as debuggers. Furthermore, using the Python paradigm for large scale MPI-based applications can create significant file IO and further stress tools and operating systems. In this paper, we present Pynamic, the first benchmark program to support configurable emulation of a wide range of the DLL usage of Python-based applications for large scale systems. Pynamic has already accurately reproduced system software and tool issues encountered by important large Python-based scientific applications on our supercomputers. Pynamic provided insight for our system software and tool vendors, and our application developers, into the impact of several design decisions. As we describe the Pynamic benchmark, we will highlight some of the issues discovered in our large scale system software and tools using Pynamic.
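A tiny illustration of the stress pattern such a benchmark emulates: generate many trivial modules and import them all, so the loader performs a large number of loads. The counts and file layout are arbitrary; the real benchmark builds and links actual shared libraries.

```python
# Generate many trivial Python modules and import all of them, exercising the
# module loader in a way loosely analogous to heavy DLL usage.
import importlib
import os
import sys
import tempfile

workdir = tempfile.mkdtemp()
sys.path.insert(0, workdir)

n_modules = 200
for i in range(n_modules):
    with open(os.path.join(workdir, f"dummy_{i}.py"), "w") as f:
        f.write(f"def value():\n    return {i}\n")

modules = [importlib.import_module(f"dummy_{i}") for i in range(n_modules)]
total = sum(m.value() for m in modules)     # touch every loaded module
```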
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cree, Johnathan Vee; Delgado-Frias, Jose
Large scale wireless sensor networks have been proposed for applications ranging from anomaly detection in an environment to vehicle tracking. Many of these applications require the networks to be distributed across a large geographic area while supporting three to five year network lifetimes. In order to support these requirements, large scale wireless sensor networks of duty-cycled devices need a method of efficient and effective autonomous configuration/maintenance. This method should gracefully handle the synchronization tasks of duty-cycled networks. Further, an effective configuration solution needs to recognize that in-network data aggregation and analysis present significant benefits to wireless sensor networks, and should configure the network in such a way that these higher level functions benefit from the logically imposed structure. NOA, the proposed configuration and maintenance protocol, provides a multi-parent hierarchical logical structure for the network that reduces the synchronization workload. It also provides higher level functions with significant inherent benefits such as, but not limited to: removing network divisions that are created by single-parent hierarchies, guarantees for when data will be compared in the hierarchy, and redundancies for communication as well as in-network data aggregation/analysis/storage.
NASA Astrophysics Data System (ADS)
Beichner, Robert
2015-03-01
The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students to learn and succeed not only in acquiring content, but also to practice important 21st century skills like problem solving, communication, and teamsmanship. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicates highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).
A global probabilistic tsunami hazard assessment from earthquake sources
Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana
2017-01-01
Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.
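The core computation behind a hazard curve of this kind can be sketched in a few lines: sum the annual rates of all scenarios whose modelled run-up exceeds a threshold. The scenario rates and run-ups below are invented.

```python
# Sketch of the basic PTHA aggregation: given scenario annual rates and the
# modelled run-up each scenario produces at a site, the hazard curve is the
# total rate of exceeding each run-up level.
import numpy as np

scenario_rates = np.array([1e-2, 5e-3, 1e-3, 2e-4])   # events per year (invented)
runup_at_site = np.array([0.3, 1.2, 3.5, 8.0])        # metres per scenario (invented)

def exceedance_rate(threshold_m):
    return scenario_rates[runup_at_site > threshold_m].sum()

thresholds = np.array([0.5, 1.0, 2.0, 5.0])
hazard_curve = [exceedance_rate(h) for h in thresholds]
# e.g. run-up > 2 m is exceeded at roughly 1.2e-3 events per year in this toy set.
```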
Evaluative Appraisals of Environmental Mystery and Surprise
ERIC Educational Resources Information Center
Nasar, Jack L.; Cubukcu, Ebru
2011-01-01
This study used a desktop virtual environment (VE) of 15 large-scale residential streets to test the effects of environmental mystery and surprise on response. In theory, mystery and surprise should increase interest and visual appeal. For each VE, participants walked through an approach street and turned right onto a post-turn street. We designed…
USDA-ARS?s Scientific Manuscript database
The Fusarium graminearum species complex (FGSC) comprises several toxigenic species that cause Fusarium head blight (FHB) in wheat. In this study, high number (n=671 isolates) of pathogenic isolates (isolated from infected spikes) was obtained from a 3-year large-scale survey (2009-2011) conducted o...
Annual Research Review: Developmental Considerations of Gene by Environment Interactions
ERIC Educational Resources Information Center
Lenroot, Rhoshel K.; Giedd, Jay N.
2011-01-01
Biological development is driven by a complex dance between nurture and nature, determined not only by the specific features of the interacting genetic and environmental influences but also by the timing of their rendezvous. The initiation of large-scale longitudinal studies, ever-expanding knowledge of genetics, and increasing availability of…
Examining the Types, Features, and Use of Instructional Materials in Afterschool Science
ERIC Educational Resources Information Center
D'Angelo, Cynthia M.; Harris, Christopher J.; Lundh, Patrik; House, Ann; Leones, Tiffany; Llorente, Carlin
2017-01-01
Afterschool programs have garnered much attention as promising environments for learning where children can engage in rich science activities. Yet, little is known about the kinds of instructional materials used in typical, large-scale afterschool programs that implement science with diverse populations of children. In this study, we investigated…
ERIC Educational Resources Information Center
Andrews, Dee H.; Dineen, Toni; Bell, Herbert H.
1999-01-01
Discusses the use of constructive modeling and virtual simulation in team training; describes a military application of constructive modeling, including technology issues and communication protocols; considers possible improvements; and discusses applications in team-learning environments other than military, including industry and education. (LRW)
Teachers' Views about Pupil Diversity in the Primary School Classroom
ERIC Educational Resources Information Center
Kaldi, Stavroula; Govaris, Christos; Filippatou, Diamanto
2018-01-01
The present study explores Greek primary school teachers' perceptions and views on pupil diversity in the classroom environment. A large-scale survey was carried out in order to examine teachers' perceptions about pupil diversity and to identify personal and/or educational characteristics that can influence or predict these perceptions. The…
Studies in Environment--Volume III: Pollution and the Municipality.
ERIC Educational Resources Information Center
Cooper, Pamela C.; And Others
Recent studies have focused attention on the fact that residents of inner-city neighborhoods are subject to greater amounts of pollutants than are other neighborhoods of large cities. In this study, the premise is set forth and investigated at the metropolitan scale, seeking to discover differences of impact between the center city and its…
A large scale joint analysis of flowering time reveals independent temperate adaptations in maize
USDA-ARS?s Scientific Manuscript database
Modulating days to flowering is a key mechanism in plants for adapting to new environments, and variation in days to flowering drives population structure by limiting mating. To elucidate the genetic architecture of flowering across maize, a quantitative trait, we mapped flowering in five global pop...
The Characteristics and Quality of Pre-School Education in Spain
ERIC Educational Resources Information Center
Sandstrom, Heather
2012-01-01
We examined 25 four-year-old pre-school classrooms from a random sample of 15 schools within a large urban city in southern Spain. Observational measures of classroom quality included the Early Childhood Environment Rating Scale-Revised, the Classroom Assessment Scoring System and the Observation of Activities in Pre-school. Findings revealed…
Teaching and Learning International Survey TALIS 2013: Conceptual Framework. Final
ERIC Educational Resources Information Center
Rutkowski, David; Rutkowski, Leslie; Bélanger, Julie; Knoll, Steffen; Weatherby, Kristen; Prusinski, Ellen
2013-01-01
In 2008, the initial cycle of the OECD's Teaching and Learning International Survey (TALIS 2008) established, for the first time, an international, large-scale survey of the teaching workforce, the conditions of teaching, and the learning environments of schools in participating countries. The second cycle of TALIS (TALIS 2013) aims to continue…
The Design and Evaluation of a Large-Scale Real-Walking Locomotion Interface
Peck, Tabitha C.; Fuchs, Henry; Whitton, Mary C.
2014-01-01
Redirected Free Exploration with Distractors (RFED) is a large-scale real-walking locomotion interface developed to enable people to walk freely in virtual environments that are larger than the tracked space in their facility. This paper describes the RFED system in detail and reports on a user study that evaluated RFED by comparing it to walking-in-place and joystick interfaces. The RFED system is composed of two major components, redirection and distractors. This paper discusses design challenges, implementation details, and lessons learned during the development of two working RFED systems. The evaluation study examined the effect of the locomotion interface on users’ cognitive performance on navigation and wayfinding measures. The results suggest that participants using RFED were significantly better at navigating and wayfinding through virtual mazes than participants using walking-in-place and joystick interfaces. Participants traveled shorter distances, made fewer wrong turns, pointed to hidden targets more accurately and more quickly, and were able to place and label targets on maps more accurately, and more accurately estimate the virtual environment size. PMID:22184262
Monitoring Surface Climate With its Emissivity Derived From Satellite Measurements
NASA Technical Reports Server (NTRS)
Zhou, Daniel K.; Larar, Allen M.; Liu, Xu
2012-01-01
Satellite thermal infrared (IR) spectral emissivity data have been shown to be significant for atmospheric research and monitoring the Earth's environment. Long-term and large-scale observations needed for global monitoring and research can be supplied by satellite-based remote sensing. Presented here are the global surface IR emissivity data retrieved from the last 5 years of Infrared Atmospheric Sounding Interferometer (IASI) measurements observed from the MetOp-A satellite. Monthly mean surface properties (i.e., skin temperature T_s and emissivity spectra ε_v) with a spatial resolution of 0.5 x 0.5 degrees latitude-longitude are produced to monitor seasonal and inter-annual variations. We demonstrate that surface ε_v and T_s retrieved with IASI measurements can be used to assist in monitoring surface weather and surface climate change. Surface ε_v together with T_s from current and future operational satellites can be utilized as a means of long-term and large-scale monitoring of the Earth's surface weather environment and associated changes.
Coarse-grained models of key self-assembly processes in HIV-1
NASA Astrophysics Data System (ADS)
Grime, John
Computational molecular simulations can elucidate microscopic information that is inaccessible to conventional experimental techniques. However, many processes occur over time and length scales that are beyond the current capabilities of atomic-resolution molecular dynamics (MD). One such process is the self-assembly of the HIV-1 viral capsid, a biological structure that is crucial to viral infectivity. The nucleation and growth of capsid structures requires the interaction of large numbers of capsid proteins within a complicated molecular environment. Coarse-grained (CG) models, where degrees of freedom are removed to produce more computationally efficient models, can in principle access large-scale phenomena such as the nucleation and growth of HIV-1 capsid lattice. We report here studies of the self-assembly behaviors of a CG model of HIV-1 capsid protein, including the influence of the local molecular environment on nucleation and growth processes. Our results suggest a multi-stage process, involving several characteristic structures, eventually producing metastable capsid lattice morphologies that are amenable to subsequent capsid dissociation in order to transmit the viral infection.
Biogas from Macroalgae: is it time to revisit the idea?
2012-01-01
The economic and environmental viability of dedicated terrestrial energy crops is in doubt. The production of large-scale biomass (macroalgae) for biofuels in the marine environment was first tested in the late 1960s. The culture attempts failed due to the engineering challenges of farming offshore. However, the energy conversion via anaerobic digestion was successful, as the biochemical composition of macroalgae makes it an ideal feedstock. The technology for the mass production of macroalgae has developed, principally in China and elsewhere in Asia, over the last 50 years to such a degree that it is now the single largest product of aquaculture. There has also been significant technology transfer and macroalgal cultivation is now well tried and tested in Europe and America. The inherent advantage of producing biofuel feedstock in the marine environment is that it does not compete with food production for land or fresh water. Here we revisit the idea of the large-scale cultivation of macroalgae at sea for subsequent anaerobic digestion to produce biogas as a source of renewable energy, using a European case study as an example. PMID:23186536
Impact of environment on dynamics of exciton complexes in a WS2 monolayer
NASA Astrophysics Data System (ADS)
Jakubczyk, Tomasz; Nogajewski, Karol; Molas, Maciej R.; Bartos, Miroslav; Langbein, Wolfgang; Potemski, Marek; Kasprzak, Jacek
2018-07-01
Scientific curiosity to uncover original optical properties and functionalities of atomically thin semiconductors, stemming from unusual Coulomb interactions in the two-dimensional geometry and multi-valley band structure, drives the research on monolayers of transition metal dichalcogenides (TMDs). While recent works ascertained the exotic energetic schemes of exciton complexes in TMDs, we here infer their unusual coherent dynamics occurring on a subpicosecond time scale. The dynamics is largely affected by the disorder landscape on the submicron scale and thus can be uncovered using four-wave mixing in the frequency domain, which enables microscopic investigations and imaging. Focusing on a WS2 monolayer, we observe that exciton coherence is lost primarily due to interaction with phonons and relaxation processes towards optically dark excitonic states. Notably, when the temperature is low and disorder weak, the excitons' large coherence volume results in an enhanced oscillator strength, allowing us to reach the regime of radiatively limited dephasing. Additionally, we observe long valley coherence for the negatively charged exciton complex. We therefore elucidate the crucial role of the excitons' environment in TMDs in their dynamics and show that the revealed mechanisms are ubiquitous within this family.
Research on the impacts of large-scale electric vehicles integration into power grid
NASA Astrophysics Data System (ADS)
Su, Chuankun; Zhang, Jian
2018-06-01
Because of their distinctive energy supply mode, electric vehicles can improve the efficiency of energy utilization and reduce environmental pollution, and are therefore receiving growing attention. Their charging behavior, however, is random and intermittent. If electric vehicles charge in an uncoordinated manner on a large scale, they place great pressure on the structure and operation of the power grid and compromise its safe and economic operation. With the development of vehicle-to-grid (V2G) technology, studying the charging and discharging characteristics of electric vehicles is of great significance for improving the safe operation of the power grid and the efficiency of energy utilization.
Computational solutions to large-scale data management and analysis
Schadt, Eric E.; Linderman, Michael D.; Sorenson, Jon; Lee, Lawrence; Nolan, Garry P.
2011-01-01
Today we can generate hundreds of gigabases of DNA and RNA sequencing data in a week for less than US$5,000. The astonishing rate of data generation by these low-cost, high-throughput technologies in genomics is being matched by that of other technologies, such as real-time imaging and mass spectrometry-based flow cytometry. Success in the life sciences will depend on our ability to properly interpret the large-scale, high-dimensional data sets that are generated by these technologies, which in turn requires us to adopt advances in informatics. Here we discuss how we can master the different types of computational environments that exist — such as cloud and heterogeneous computing — to successfully tackle our big data problems. PMID:20717155
Remote sensing applied to numerical modelling. [water resources pollution
NASA Technical Reports Server (NTRS)
Sengupta, S.; Lee, S. S.; Veziroglu, T. N.; Bland, R.
1975-01-01
Progress and remaining difficulties in the construction of predictive mathematical models of large bodies of water as ecosystems are reviewed. Surface temperature is at present the only variable that can be measured accurately and reliably by remote sensing techniques, but satellite infrared data are of sufficient resolution for macro-scale modeling of oceans and large lakes, and airborne radiometers are useful in meso-scale analysis (of lakes, bays, and thermal plumes). Finite-element and finite-difference techniques applied to the solution of relevant coupled time-dependent nonlinear partial differential equations are compared, and the specific problem of the Biscayne Bay and environs ecosystem is tackled in a finite-differences treatment using the rigid-lid model and a rigid-line grid system.
Use of large-scale, multi-species surveys to monitor gyrfalcon and ptarmigan populations
Bart, Jonathan; Fuller, Mark; Smith, Paul; Dunn, Leah; Watson, Richard T.; Cade, Tom J.; Fuller, Mark; Hunt, Grainger; Potapov, Eugene
2011-01-01
We evaluated the ability of three large-scale, multi-species surveys in the Arctic to provide information on abundance and habitat relationships of Gyrfalcons (Falco rusticolus) and ptarmigan. The Program for Regional and International Shorebird Monitoring (PRISM) has surveyed birds widely across the arctic regions of Canada and Alaska since 2001. The Arctic Coastal Plain survey has collected abundance information on the North Slope of Alaska using fixed-wing aircraft since 1992. The Northwest Territories-Nunavut Bird Checklist has collected presence-absence information from little-known locations in northern Canada since 1995. All three surveys provide extensive information on Willow Ptarmigan (Lagopus lagopus) and Rock Ptarmigan (L. muta). For example, they show that ptarmigan are most abundant in western Alaska, next most abundant in northern Alaska and northwest Canada, and least abundant in the Canadian Archipelago. PRISM surveys were less successful in detecting Gyrfalcons, and the Arctic Coastal Plain Survey is largely outside the Gyrfalcon's breeding range. The Checklist Survey, however, reflects the expansive Gyrfalcon range in Canada. We suggest that collaboration by Gyrfalcon and ptarmigan biologists with the organizers of large-scale surveys like the ones we investigated provides an opportunity for obtaining useful information on these species and their environment across large areas.
Emissions of air pollutants from scented candles burning in a test chamber
NASA Astrophysics Data System (ADS)
Derudi, Marco; Gelosa, Simone; Sliepcevich, Andrea; Cattaneo, Andrea; Rota, Renato; Cavallo, Domenico; Nano, Giuseppe
2012-08-01
Burning scented candles in indoor environments can release a large number of toxic chemicals. However, in spite of the large market penetration of scented candles, very few works have investigated their organic pollutant emissions. This paper investigates volatile organic compound emissions, with particular reference to the priority indoor pollutants identified by the European Commission, from the burning of scented candles in a laboratory-scale test chamber. It was found that BTEX and PAH emission factors show large differences among candles, possibly due to the raw paraffinic material used, while aldehyde emission factors seem more related to the presence of additives. This clearly demonstrates the need for simple and cheap methodologies to measure the emission factors of commercial candles in order to predict the expected pollutant concentration in a given indoor environment and compare it with health safety standards.
Assessing the sustainable construction of large construction companies in Malaysia
NASA Astrophysics Data System (ADS)
Adewale, Bamgbade Jibril; Mohammed, Kamaruddeen Ahmed; Nasrun, Mohd Nawi Mohd
2016-08-01
Given the increasing concern over sustainability issues in construction project delivery within the construction industry, this paper assesses the extent of sustainable construction among large Malaysian contractors, in order to ascertain the level of the industry's impacts on both the environment and society. Sustainable construction describes the construction industry's responsibility to utilise finite resources efficiently while also reducing construction impacts on both humans and the environment throughout the phases of construction. This study used proportionate stratified random sampling to conduct a field study with a sample of 172 contractors out of 708 administered questionnaires. Data were collected from large contractors in the eleven states of peninsular Malaysia. Using a five-level rating scale (1 = Very Low; 2 = Low; 3 = Moderate; 4 = High; 5 = Very High) to describe the level of sustainable construction of Malaysian contractors based on previous studies, statistical analysis reveals that the environmental, social and economic sustainability of large Malaysian contractors is high.
Explaining human uniqueness: genome interactions with environment, behaviour and culture.
Varki, Ajit; Geschwind, Daniel H; Eichler, Evan E
2008-10-01
What makes us human? Specialists in each discipline respond through the lens of their own expertise. In fact, 'anthropogeny' (explaining the origin of humans) requires a transdisciplinary approach that eschews such barriers. Here we take a genomic and genetic perspective towards molecular variation, explore systems analysis of gene expression and discuss an organ-systems approach. Rejecting any 'genes versus environment' dichotomy, we then consider genome interactions with environment, behaviour and culture, finally speculating that aspects of human uniqueness arose because of a primate evolutionary trend towards increasing and irreversible dependence on learned behaviours and culture - perhaps relaxing allowable thresholds for large-scale genomic diversity.
Enabling communication in emergency response environments.
Aldunate, Roberto G; Schmidt, Klaus Nicholas; Herrera, Oriel
2012-01-01
The importance of effective communication among first responders during response to natural and human-made large-scale catastrophes has increased tremendously during the last decade. However, most efforts to achieve a higher degree of effectiveness in communication lack synergy between the environment and the technology used to support first responders' operations. This article presents a natural and intuitive interface to support stigmergy, or communication through the environment, based on intuitively marking and retrieving information from the environment with a pointer. A prototype of the system was built and tested in the field; however, the pointing activity revealed challenges regarding accuracy due to limitations of the sensors used. The results obtained from these field tests were the basis for this research effort and have the potential to enable communication through the environment for first responders operating in highly dynamic and inhospitable disaster relief environments.
Climate Change Potential Impacts on the Built Environment and Possible Adaptation Strategies
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.
2014-01-01
The built environment consists of components that exist at a range of scales from small (e.g., houses, shopping malls) to large (e.g., transportation networks) to highly modified landscapes such as cities. Thus, the impacts of climate change on the built environment may have a multitude of effects on humans and the land. The impact of climate change may be exacerbated by the interaction of different events that singly may be minor, but together may have a synergistic set of impacts that are significant. Also, mechanisms may exist wherein the built environment, particularly in the form of cities, may affect weather and the climate on local and regional scales. Hence, a city may be able to cope with prolonged heat waves, but if this is combined with severe drought, the overall result could be significant or even catastrophic, as accelerating demand for energy for cooling taxes water supplies needed both for energy production and for municipal use. This presentation surveys potential climate change impacts on the built environment from the perspective of the National Climate Assessment, and explores adaptation measures that can be employed to mitigate these impacts.
Segmentation and Quantitative Analysis of Epithelial Tissues.
Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne
2016-01-01
Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis have largely prevented tissue-scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.
The Army’s National Training Center: A Case Study in Management of a Large Defense Project
1983-04-26
…"Designing…," op. cit., p. 130. …and C. K. Prahalad, "Managing Multi-Organization Enterprises: The Emerging Strategic Frontier." …The Rational Actor Model "assumes that what must be explained is an action, i.e., behavior that reflects purpose or intention." It assumes "that what human… …a fourth element is essential to the success of large-scale commercialization programs: a favorable corporate strategic environment. This element
Large Eddy Simulation of Flame-Turbulence Interactions in a LOX-CH4 Shear Coaxial Injector
2012-01-01
…heat transfer from dense to light fluids. A previous study on LOX/H2 flames pointed out the limitations of central schemes in predicting such large… …Masquelet, M., Simulations of a Sub-scale Liquid Rocket Engine: Transient Heat Transfer in a Real Gas Environment, Master's thesis… …Large Eddy Simulation of a cryogenic flame issued from a LOX-CH4 shear coaxial injector. The operating pressure is above the critical pressure for both
Individual Decision-Making in Uncertain and Large-Scale Multi-Agent Environments
2009-02-18
…first method, labeled MC, limits and holds constant the number of models, 0 < K_MC < M, where M is the possibly large number of candidate models of… …equivalent and hence may be replaced by a subset of representative models without a significant loss in the optimality of the decision maker. K_MC… …for different horizons. K_MC and M are equal to 50 and 100, respectively, for both approximate and exact approaches (Pentium 4, 3.0 GHz, 1 GB RAM, Windows XP
NASA Astrophysics Data System (ADS)
Sinha, Neeraj; Zambon, Andrea; Ott, James; Demagistris, Michael
2015-06-01
Driven by the continuing rapid advances in high-performance computing, multi-dimensional high-fidelity modeling is an increasingly reliable predictive tool capable of providing valuable physical insight into complex post-detonation reacting flow fields. Utilizing a series of test cases featuring blast waves interacting with combustible dispersed clouds in a small-scale test setup under well-controlled conditions, the predictive capabilities of a state-of-the-art code are demonstrated and validated. Leveraging physics-based, first principle models and solving large system of equations on highly-resolved grids, the combined effects of finite-rate/multi-phase chemical processes (including thermal ignition), turbulent mixing and shock interactions are captured across the spectrum of relevant time-scales and length scales. Since many scales of motion are generated in a post-detonation environment, even if the initial ambient conditions are quiescent, turbulent mixing plays a major role in the fireball afterburning as well as in dispersion, mixing, ignition and burn-out of combustible clouds in its vicinity. Validating these capabilities at the small scale is critical to establish a reliable predictive tool applicable to more complex and large-scale geometries of practical interest.
Beyond theories of plant invasions: Lessons from natural landscapes
Stohlgren, Thomas J.
2002-01-01
There are a growing number of contrasting theories about plant invasions, but most are only weakly supported by small-scale field experiments, observational studies, and mathematical models. Among the most contentious theories is that species-rich habitats should be less vulnerable to plant invasion than species-poor sites, stemming from earlier theories that competition is a major force in structuring plant communities. Early ecologists such as Charles Darwin (1859) and Charles Elton (1958) suggested that a lack of intense interspecific competition on islands made these low-diversity habitats vulnerable to invasion. Small-scale field experiments have supported and contradicted this theory, as have various mathematical models. In contrast, many large-scale observational studies and detailed vegetation surveys in continental areas often report that species-rich areas are more heavily invaded than species-poor areas, but there are exceptions here as well. In this article, I show how these seemingly contrasting patterns converge once appropriate spatial and temporal scales are considered in complex natural environments. I suggest ways in which small-scale experiments, mathematical models, and large-scale observational studies can be improved and better integrated to advance a theoretically based understanding of plant invasions.
Diffuse pollution of soil and water: Long term trends at large scales?
NASA Astrophysics Data System (ADS)
Grathwohl, P.
2012-04-01
Industrialization and urbanization, which have increased pressure on the environment and degraded soil and water quality for more than a century, are still ongoing. The number of potential environmental contaminants detected in surface water and groundwater is continuously increasing, from classical industrial and agricultural chemicals to flame retardants, pharmaceuticals, and personal care products. While point sources of pollution can be managed in principle, diffuse pollution is reversible only at very long time scales, if at all. Compounds which were phased out many decades ago, such as PCBs or DDT, are still abundant in soils, sediments and biota. How diffuse pollution is processed at large scales in space (e.g. catchments) and time (centuries) is unknown. The field-scale relevance of processes that are well investigated at the laboratory scale (e.g. sorption/desorption and (bio)degradation kinetics) is not clear. Transport of compounds is often coupled to the water cycle, and in order to assess trends in diffuse pollution, detailed knowledge about the hydrology and the solute fluxes at the catchment scale is required (e.g. input/output fluxes, transformation rates at the field scale). This is also a prerequisite for assessing management options for the reversal of adverse trends.
Range-wide reproductive consequences of ocean climate variability for the seabird Cassin's Auklet.
Wolf, Shaye G; Sydeman, William J; Hipfner, J Mark; Abraham, Christine L; Tershy, Bernie R; Croll, Donald A
2009-03-01
We examine how ocean climate variability influences the reproductive phenology and demography of the seabird Cassin's Auklet (Ptychoramphus aleuticus) across approximately 2500 km of its breeding range in the oceanographically dynamic California Current System along the west coast of North America. Specifically, we determine the extent to which ocean climate conditions and Cassin's Auklet timing of breeding and breeding success covary across populations in British Columbia, central California, and northern Mexico over six years (2000-2005) and test whether auklet timing of breeding and breeding success are similarly related to local and large-scale ocean climate indices across populations. Local ocean foraging environments ranged from seasonally variable, high-productivity environments in the north to aseasonal, low-productivity environments to the south, but covaried similarly due to the synchronizing effects of large-scale climate processes. Auklet timing of breeding in the southern population did not covary with populations to the north and was not significantly related to local oceanographic conditions, in contrast to northern populations, where timing of breeding appears to be influenced by oceanographic cues that signal peaks in prey availability. Annual breeding success covaried similarly across populations and was consistently related to local ocean climate conditions across this system. Overall, local ocean climate indices, particularly sea surface height, better explained timing of breeding and breeding success than a large-scale climate index by better representing heterogeneity in physical processes important to auklets and their prey. The significant, consistent relationships we detected between Cassin's Auklet breeding success and ocean climate conditions across widely spaced populations indicate that Cassin's Auklets are susceptible to climate change across the California Current System, especially by the strengthening of climate processes that synchronize oceanographic conditions. Auklet populations in the northern and central regions of this ecosystem may be more sensitive to changes in the timing and variability of ocean climate conditions since they appear to time breeding to take advantage of seasonal productivity peaks.
Topographically Engineered Large Scale Nanostructures for Plasmonic Biosensing
NASA Astrophysics Data System (ADS)
Xiao, Bo; Pradhan, Sangram K.; Santiago, Kevin C.; Rutherford, Gugu N.; Pradhan, Aswini K.
2016-04-01
We demonstrate that a nanostructured metal thin film can achieve enhanced transmission efficiency and sharp resonances and use a large-scale and high-throughput nanofabrication technique for the plasmonic structures. The fabrication technique combines the features of nanoimprint and soft lithography to topographically construct metal thin films with nanoscale patterns. Metal nanogratings developed using this method show significantly enhanced optical transmission (up to a one-order-of-magnitude enhancement) and sharp resonances with full width at half maximum (FWHM) of ~15nm in the zero-order transmission using an incoherent white light source. These nanostructures are sensitive to the surrounding environment, and the resonance can shift as the refractive index changes. We derive an analytical method using a spatial Fourier transformation to understand the enhancement phenomenon and the sensing mechanism. The use of real-time monitoring of protein-protein interactions in microfluidic cells integrated with these nanostructures is demonstrated to be effective for biosensing. The perpendicular transmission configuration and large-scale structures provide a feasible platform without sophisticated optical instrumentation to realize label-free surface plasmon resonance (SPR) sensing.
Opportunities for Breakthroughs in Large-Scale Computational Simulation and Design
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia; Alter, Stephen J.; Atkins, Harold L.; Bey, Kim S.; Bibb, Karen L.; Biedron, Robert T.; Carpenter, Mark H.; Cheatwood, F. McNeil; Drummond, Philip J.; Gnoffo, Peter A.
2002-01-01
Opportunities for breakthroughs in the large-scale computational simulation and design of aerospace vehicles are presented. Computational fluid dynamics tools to be used within multidisciplinary analysis and design methods are emphasized. The opportunities stem from speedups and robustness improvements in the underlying unit operations associated with simulation (geometry modeling, grid generation, physical modeling, analysis, etc.). Further, an improved programming environment can synergistically integrate these unit operations to leverage the gains. The speedups result from reducing the problem setup time through geometry modeling and grid generation operations, and reducing the solution time through the operation counts associated with solving the discretized equations to a sufficient accuracy. The opportunities are addressed only at a general level here, but an extensive list of references containing further details is included. The opportunities discussed are being addressed through the Fast Adaptive Aerospace Tools (FAAST) element of the Advanced Systems Concept to Test (ASCoT) and the third Generation Reusable Launch Vehicles (RLV) projects at NASA Langley Research Center. The overall goal is to enable greater inroads into the design process with large-scale simulations.
Kinetic energy budgets during the life cycle of intense convective activity
NASA Technical Reports Server (NTRS)
Fuelberg, H. E.; Scoggins, J. R.
1978-01-01
Synoptic-scale data at three- and six-hour intervals are employed to study the relationship between changing kinetic energy variables and the life cycles of two severe squall lines. The kinetic energy budgets indicate a high degree of kinetic energy generation, especially pronounced near the jet-stream level. Energy losses in the storm environment are due to the transfer of kinetic energy from grid to subgrid scales of motion; large-scale upward vertical motion carries aloft the kinetic energy generated by storm activity at lower levels. In general, the time of maximum storm intensity is also the time of maximum energy conversion and transport.
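For readers unfamiliar with such budgets, the following is a commonly used form of the synoptic-scale kinetic-energy budget in isobaric coordinates; it is given only as an illustrative sketch and may differ in detail from the formulation used in the study. Here K is horizontal kinetic energy per unit mass, V the horizontal wind, omega the vertical p-velocity, phi the geopotential, and D(K) the residual dissipation/subgrid-transfer term.

```latex
% Illustrative synoptic-scale kinetic-energy budget in isobaric coordinates
% (not necessarily the study's exact formulation):
\[
  \frac{\partial K}{\partial t}
  = -\,\nabla \cdot \left( K\,\mathbf{V} \right)
    - \frac{\partial \left( K\,\omega \right)}{\partial p}
    - \mathbf{V}\cdot\nabla\phi
    + D(K),
\]
% where -V·∇φ is the generation of kinetic energy by cross-contour flow
% and D(K) represents dissipation and transfers to subgrid scales.
```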
Elliott, P
1993-01-01
Epidemiology is the study of the distribution and determinants of health and disease in human populations. Epidemiology on a global scale is severely constrained by the lack of data. In many countries, there are no comprehensive data on mortality or basic demographic data. Where data are available, findings on the relationship of environment to health across countries need to be interpreted with caution. For example, there is well-known variation in standards of medical practice and diagnosis, and in certification and coding, but there are also large differences in diet, the social environment and lifestyle--all of which strongly predict disease incidence. Inappropriate inference concerning aetiology made from such broad-scale studies may result in what has been termed the 'ecological fallacy'. A complementary approach is to collect and analyse data in standardized fashion as part of international collaborative studies. These can offer some important advantages over the more conventional single-centre design. Recent advances have meant that studies of environment and health can now--in some countries--be carried out using routine data at the small area level. Although problems of interpretation remain, they are generally less severe than in broad-scale studies. Examples of this approach are given.
Wood, Fiona; Kowalczuk, Jenny; Elwyn, Glyn; Mitchell, Clive; Gallacher, John
2011-08-01
Population based genetics studies are dependent on large numbers of individuals in the pursuit of small effect sizes. Recruiting and consenting a large number of participants is both costly and time consuming. We explored whether an online consent process for large-scale genetics studies is acceptable for prospective participants using an example online genetics study. We conducted semi-structured interviews with 42 members of the public stratified by age group, gender and newspaper readership (a measure of social status). Respondents were asked to use a website designed to recruit for a large-scale genetic study. After using the website a semi-structured interview was conducted to explore opinions and any issues they would have. Responses were analysed using thematic content analysis. The majority of respondents said they would take part in the research (32/42). Those who said they would decline to participate saw fewer benefits from the research, wanted more information and expressed a greater number of concerns about the study. Younger respondents had concerns over time commitment. Middle aged respondents were concerned about privacy and security. Older respondents were more altruistic in their motivation to participate. Common themes included trust in the authenticity of the website, security of personal data, curiosity about their own genetic profile, operational concerns and a desire for more information about the research. Online consent to large-scale genetic studies is likely to be acceptable to the public. The online consent process must establish trust quickly and effectively by asserting authenticity and credentials, and provide access to a range of information to suit different information preferences.
Dynamic SLA Negotiation in Autonomic Federated Environments
NASA Astrophysics Data System (ADS)
Rubach, Pawel; Sobolewski, Michael
Federated computing environments offer requestors the ability to dynamically invoke services offered by collaborating providers in the virtual service network. Without efficient resource management that includes dynamic SLA negotiation, however, the assignment of providers to customers' requests cannot be optimized, nor can high reliability be offered in the absence of relevant SLA guarantees. We propose a new SLA-based SERViceable Metacomputing Environment (SERVME) capable of matching providers based on QoS requirements and performing autonomic provisioning and deprovisioning of services according to dynamic requestor needs. This paper presents the SLA negotiation process that includes on-demand provisioning and uses an object-oriented SLA model for large-scale service-oriented systems supported by SERVME. An initial reference implementation in the SORCER environment is also described.
An evaluation of the accuracy and performance of lightweight GPS collars in a suburban environment.
Adams, Amy L; Dickinson, Katharine J M; Robertson, Bruce C; van Heezik, Yolanda
2013-01-01
The recent development of lightweight GPS collars has enabled medium-to-small sized animals to be tracked via GPS telemetry. Evaluation of the performance and accuracy of GPS collars is largely confined to devices designed for large animals for deployment in natural environments. This study aimed to assess the performance of lightweight GPS collars within a suburban environment, which may differ from natural environments in ways that are relevant to satellite signal acquisition. We assessed the effects of vegetation complexity, sky availability (percentage of clear sky not obstructed by natural or artificial features of the environment), proximity to buildings, and satellite geometry on fix success rate (FSR) and location error (LE) for lightweight GPS collars within a suburban environment. Sky availability had the largest effect on FSR, while LE was influenced by sky availability, vegetation complexity, and HDOP (Horizontal Dilution of Precision). Despite the complexity and modified nature of suburban areas, values for FSR (mean = 90.6%) and LE (mean = 30.1 m) obtained within the suburban environment are comparable to those from previous evaluations of GPS collars designed for larger animals and within less built-up environments. Due to fine-scale patchiness of habitat within urban environments, it is recommended that resource selection methods that are not reliant on buffer sizes be utilised for selection studies.
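As a rough illustration of the two metrics evaluated above, the sketch below computes fix success rate and mean location error for a stationary collar test; the data layout, the haversine helper, and the test coordinates are hypothetical and not taken from the study.

```python
import numpy as np

def haversine_m(lat1, lon1, lat2, lon2, r_earth=6_371_000.0):
    """Great-circle distance in metres between two points given in degrees."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r_earth * np.arcsin(np.sqrt(a))

def fsr_and_le(attempted, fixes, true_lat, true_lon):
    """Fix success rate (%) and mean location error (m) for a collar
    left at a surveyed point.

    attempted : number of scheduled fix attempts
    fixes     : (n, 2) array of acquired (lat, lon) fixes
    """
    fsr = 100.0 * len(fixes) / attempted
    le = haversine_m(fixes[:, 0], fixes[:, 1], true_lat, true_lon)
    return fsr, le.mean()

# hypothetical test: 100 scheduled fixes, 91 acquired, ~30 m scatter around truth
rng = np.random.default_rng(5)
truth = (-45.8740, 170.5036)
acquired = truth + rng.normal(0, 0.0003, size=(91, 2))
print(fsr_and_le(100, acquired, *truth))
```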
NASA Astrophysics Data System (ADS)
Usman, Muhammad
2018-04-01
Bismide semiconductor materials and heterostructures are considered promising candidates for the design and implementation of photonic, thermoelectric, photovoltaic, and spintronic devices. This work presents a detailed theoretical study of the electronic and optical properties of strongly coupled GaBixAs1-x/GaAs multiple quantum well (MQW) structures. Based on a systematic set of large-scale atomistic tight-binding calculations, our results reveal that the impact of atomic-scale fluctuations in alloy composition is stronger than the interwell coupling effect, and plays an important role in the electronic and optical properties of the investigated MQW structures. Independent of QW geometry parameters, alloy disorder leads to a strong confinement of charge carriers, a large broadening of the hole energies, and a red-shift in the ground-state transition wavelength. Polarization-resolved optical transition strengths exhibit a striking effect of disorder, where the inhomogeneous broadening could exceed an order of magnitude for MQWs, in comparison to a factor of about 3 for single QWs. The strong influence of alloy disorder effects persists when small variations in the size and composition of MQWs typically expected in a realistic experimental environment are considered. The presented results highlight the limited scope of continuum methods and emphasize the need for large-scale atomistic approaches to design devices with tailored functionalities based on the novel properties of bismide materials.
NASA Astrophysics Data System (ADS)
Deng, Chengbin; Wu, Changshan
2013-12-01
Urban impervious surface information is essential for urban and environmental applications at the regional/national scales. As a popular image processing technique, spectral mixture analysis (SMA) has rarely been applied to coarse-resolution imagery due to the difficulty of deriving endmember spectra using traditional endmember selection methods, particularly within heterogeneous urban environments. To address this problem, we derived endmember signatures through a least squares solution (LSS) technique with known abundances of sample pixels, and integrated these endmember signatures into SMA for mapping large-scale impervious surface fraction. In addition, with the same sample set, we carried out objective comparative analyses among SMA (i.e. fully constrained and unconstrained SMA) and machine learning (i.e. Cubist regression tree and Random Forests) techniques. Analysis of results suggests three major conclusions. First, with the extrapolated endmember spectra from stratified random training samples, the SMA approaches performed relatively well, as indicated by small MAE values. Second, Random Forests yields more reliable results than Cubist regression tree, and its accuracy is improved with increased sample sizes. Finally, comparative analyses suggest a tentative guide for selecting an optimal approach for large-scale fractional imperviousness estimation: unconstrained SMA might be a favorable option with a small number of samples, while Random Forests might be preferred if a large number of samples are available.
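To make the contrast between the SMA variants concrete, here is a minimal sketch of unconstrained SMA solved by ordinary least squares, together with a toy illustration of deriving endmember spectra from sample pixels with known abundances. Both functions are illustrative assumptions rather than the authors' implementation, and the fully constrained variant (non-negativity and sum-to-one constraints) would require a constrained solver instead.

```python
import numpy as np

def unconstrained_sma(pixels, endmembers):
    """Unconstrained linear SMA: solve pixels ~ endmembers @ fractions
    by ordinary least squares, one pixel per column.

    pixels     : (n_bands, n_pixels) reflectance matrix
    endmembers : (n_bands, n_endmembers) endmember spectra
    returns    : (n_endmembers, n_pixels) fraction estimates (may fall
                 outside [0, 1] because no constraints are imposed)
    """
    fractions, *_ = np.linalg.lstsq(endmembers, pixels, rcond=None)
    return fractions

def derive_endmembers(sample_pixels, known_fractions):
    """LSS-style step (hypothetical helper): with known abundances for
    sample pixels, solve sample_pixels ~ endmembers @ known_fractions
    for the endmember spectra."""
    # Transpose so the unknown endmember matrix sits on the right-hand side.
    endmembers_T, *_ = np.linalg.lstsq(known_fractions.T, sample_pixels.T, rcond=None)
    return endmembers_T.T

# toy example: 6 bands, 3 endmembers, 100 pixels
rng = np.random.default_rng(0)
E_true = rng.uniform(0.05, 0.6, size=(6, 3))
F_true = rng.dirichlet(np.ones(3), size=100).T          # columns sum to 1
X = E_true @ F_true + rng.normal(0, 0.005, size=(6, 100))
F_hat = unconstrained_sma(X, E_true)
print(np.abs(F_hat - F_true).mean())                    # mean abundance error
```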
NASA Astrophysics Data System (ADS)
Manfredi, Sabato
2016-06-01
Large-scale dynamic systems are becoming highly pervasive in their occurrence, with applications ranging from systems biology, environment monitoring, and sensor networks to power systems. They are characterised by high dimensionality, complexity, and uncertainty in the node dynamics/interactions, which require increasingly computationally demanding methods for their analysis and control design as the network size and node system/interaction complexity increase. It is therefore a challenging problem to find scalable computational methods for the distributed control design of large-scale networks. In this paper, we investigate the robust distributed stabilisation problem of large-scale nonlinear multi-agent systems (briefly MASs) composed of non-identical (heterogeneous) linear dynamical systems coupled by uncertain nonlinear time-varying interconnections. By employing Lyapunov stability theory and the linear matrix inequality (LMI) technique, new conditions are given for the distributed control design of large-scale MASs that can be easily solved with MATLAB toolboxes. The stabilisability of each node dynamic is a sufficient assumption for designing a globally stabilising distributed control. The proposed approach improves some of the existing LMI-based results on MASs by both overcoming their computational limits and extending the application scenario to large-scale nonlinear heterogeneous MASs. Additionally, the proposed LMI conditions are further reduced in terms of computational requirements in the case of weakly heterogeneous MASs, which is a common scenario in real applications where the network nodes and links are affected by parameter uncertainties. One of the main advantages of the proposed approach is that it allows a move from a centralised towards a distributed computing architecture, so that the expensive computational workload spent solving LMIs may be shared among processors located at the networked nodes, thus increasing the scalability of the approach as the network size grows. Finally, a numerical example shows the applicability of the proposed method and its advantage in terms of computational complexity when compared with the existing approaches.
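For orientation, the block below shows a standard single-system, state-feedback LMI stabilisability condition of the kind such toolboxes solve; it is a textbook example only, not the distributed, multi-agent conditions derived in the paper.

```latex
% Standard state-feedback stabilisability LMI (textbook form), shown only
% to indicate the type of feasibility problem an LMI toolbox solves; the
% paper's distributed, multi-agent conditions differ.
\[
  \dot{x} = A x + B u, \qquad u = K x,
\]
\[
  \exists\; P = P^{\top} \succ 0,\ Y:\quad
  A P + P A^{\top} + B Y + Y^{\top} B^{\top} \prec 0,
  \qquad K = Y P^{-1}.
\]
```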
Glimpsing the imprint of local environment on the galaxy stellar mass function
NASA Astrophysics Data System (ADS)
Tomczak, Adam R.; Lemaux, Brian C.; Lubin, Lori M.; Gal, Roy R.; Wu, Po-Feng; Holden, Bradford; Kocevski, Dale D.; Mei, Simona; Pelliccia, Debora; Rumbaugh, Nicholas; Shen, Lu
2017-12-01
We investigate the impact of local environment on the galaxy stellar mass function (SMF) spanning a wide range of galaxy densities from the field up to dense cores of massive galaxy clusters. Data are drawn from a sample of eight fields from the Observations of Redshift Evolution in Large-Scale Environments (ORELSE) survey. Deep photometry allows us to select mass-complete samples of galaxies down to 10^9 M⊙. Taking advantage of >4000 secure spectroscopic redshifts from ORELSE and precise photometric redshifts, we construct three-dimensional density maps between 0.55 < z < 1.3 using a Voronoi tessellation approach. We find that the shape of the SMF depends strongly on local environment, exhibited by a smooth, continual increase in the relative numbers of high- to low-mass galaxies towards denser environments. A straightforward implication is that local environment proportionally increases the efficiency of (a) destroying lower-mass galaxies and/or (b) growing higher-mass galaxies. This environmental dependence is also present in the SMFs of star-forming and quiescent galaxies, although not quite as strongly for the quiescent subsample. To characterize the connection between the SMF of field galaxies and that of denser environments, we devise a simple semi-empirical model. The model begins with a sample of ≈10^6 galaxies at z_start = 5 with stellar masses distributed according to the field. Simulated galaxies then evolve down to z_final = 0.8 following empirical prescriptions for star formation, quenching and galaxy-galaxy merging. We run the simulation multiple times, testing a variety of scenarios with differing overall amounts of merging. Our model suggests that a large number of mergers are required to reproduce the SMF in dense environments. Additionally, a large majority of these mergers would have to occur in intermediate density environments (e.g. galaxy groups).
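As a simplified illustration of Voronoi-based density estimation, the sketch below assigns each point a density equal to the inverse volume of its Voronoi cell; it is a bare-bones example under stated assumptions (no photometric-redshift Monte Carlo, no survey-edge treatment beyond discarding unbounded cells) and not the ORELSE pipeline.

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

def voronoi_density(points):
    """Local density estimate: 1 / volume of each point's Voronoi cell.
    Points whose cells are unbounded (sample edge) get NaN.

    points : (N, 3) array of positions
    returns: (N,) density estimates
    """
    vor = Voronoi(points)
    density = np.full(len(points), np.nan)
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if len(region) == 0 or -1 in region:      # unbounded cell
            continue
        vertices = vor.vertices[region]
        density[i] = 1.0 / ConvexHull(vertices).volume
    return density

# toy example: an overdense clump embedded in a uniform background
rng = np.random.default_rng(1)
background = rng.uniform(0, 100, size=(2000, 3))
clump = rng.normal(50, 2, size=(200, 3))
rho = voronoi_density(np.vstack([background, clump]))
print(np.nanmedian(rho[:2000]), np.nanmedian(rho[2000:]))  # clump is denser
```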
Imprints of the large-scale structure on AGN formation and evolution
NASA Astrophysics Data System (ADS)
Porqueres, Natàlia; Jasche, Jens; Enßlin, Torsten A.; Lavaux, Guilhem
2018-04-01
Black hole masses are found to correlate with several global properties of their host galaxies, suggesting that black holes and galaxies have an intertwined evolution and that active galactic nuclei (AGN) have a significant impact on galaxy evolution. Since the large-scale environment can also affect AGN, this work studies how their formation and properties depend on the environment. We have used a reconstructed three-dimensional high-resolution density field obtained from a Bayesian large-scale structure reconstruction method applied to the 2M++ galaxy sample. A web-type classification relying on the shear tensor is used to identify different structures on the cosmic web, defining voids, sheets, filaments, and clusters. We confirm that the environmental density affects AGN formation and properties. We found that the AGN abundance is equivalent to the galaxy abundance, indicating that active and inactive galaxies reside in similar dark matter halos. However, occurrence rates are different for each spectral type and accretion rate. These differences are consistent with the AGN evolutionary sequence suggested by previous authors, in which Seyferts and Transition objects transform into low-ionization nuclear emission-line regions (LINERs), the weaker counterparts of Seyferts. We conclude that AGN properties depend on the environmental density more than on the web type. More powerful starbursts and younger stellar populations are found in high densities, where interactions and mergers are more likely. AGN hosts show smaller masses in clusters for Seyferts and Transition objects, which might be due to gas stripping. In voids, the AGN population is dominated by the most massive galaxy hosts.
Li, Lun; Long, Yan; Zhang, Libin; Dalton-Morgan, Jessica; Batley, Jacqueline; Yu, Longjiang; Meng, Jinling; Li, Maoteng
2015-01-01
The prediction of the flowering time (FT) trait in Brassica napus based on genome-wide markers, and the detection of the underlying genetic factors, is important not only for oilseed producers around the world but also for the other crops grown in rotation with B. napus in China. In previous studies, the low density and mixed types of markers used hindered genomic selection in B. napus and the comprehensive mapping of FT-related loci. In this study, a high-density genome-wide SNP set was genotyped in a doubled-haploid population of B. napus. We first performed genomic prediction of FT traits in B. napus using SNPs across the genome in ten environments across three geographic regions via eight existing genomic prediction models. The results showed that all the models achieved comparably high accuracies, verifying the feasibility of genomic prediction in B. napus. Next, we performed a large-scale mapping of FT-related loci among the three regions and found 437 associated SNPs, some of which represented known FT genes, such as AP1 and PHYE. The genes tagged by the associated SNPs were enriched in biological processes involved in the formation of flowers. Epistasis analysis showed significant interactions between detected loci, even among some known FT-related genes. All the results show that our large-scale, high-density genotype data are of great practical and scientific value for B. napus. To the best of our knowledge, this is the first evaluation of genomic selection models in B. napus based on a high-density SNP dataset and a large-scale mapping of FT loci.
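As a minimal sketch of one member of that family of models, the code below runs ridge-regression (rrBLUP-like) genomic prediction with k-fold cross-validation on synthetic marker data; the data, the alpha value, and the accuracy measure (correlation of observed and predicted phenotypes) are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

def genomic_prediction_accuracy(genotypes, phenotypes, alpha=1.0, n_splits=5):
    """Predictive ability of ridge-regression genomic prediction,
    measured as the correlation between observed and cross-validated
    predicted phenotypes.

    genotypes  : (n_lines, n_snps) marker matrix coded 0/1/2
    phenotypes : (n_lines,) trait values, e.g. flowering time
    """
    preds = np.empty_like(phenotypes, dtype=float)
    for train, test in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(genotypes):
        model = Ridge(alpha=alpha).fit(genotypes[train], phenotypes[train])
        preds[test] = model.predict(genotypes[test])
    return np.corrcoef(phenotypes, preds)[0, 1]

# synthetic data: 200 lines, 5000 SNPs, 50 of them truly affecting the trait
rng = np.random.default_rng(2)
G = rng.integers(0, 3, size=(200, 5000)).astype(float)
effects = np.zeros(5000)
effects[:50] = rng.normal(0, 0.5, 50)
y = G @ effects + rng.normal(0, 1.0, 200)
print(genomic_prediction_accuracy(G, y))
```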
NASA Technical Reports Server (NTRS)
Bernstein, W.
1981-01-01
The possible use of Chamber A for the replication or simulation of space plasma physics processes which occur in the geosynchronous Earth orbit (GEO) environment is considered. It is shown that replication is not possible and that scaling of the environmental conditions is required for study of the important instability processes. Rules for such experimental scaling are given. At the present time, it does not appear technologically feasible to satisfy these requirements in Chamber A. It is, however, possible to study and qualitatively evaluate the problem of vehicle charging at GEO. In particular, Chamber A is sufficiently large that a complete operational spacecraft could be irradiated by beams and charged to high potentials. Such testing would contribute to the assessment of the operational malfunctions expected at GEO and their possible correction. However, because of the many tabulated limitations of such a testing program, its direct relevance to conditions expected in the GEO environment remains questionable.
Loxdale, H. D.
1999-01-01
The majority of insect species do not show an innate behavioural migration, but rather populations expand into favourable new habitats or contract away from unfavourable ones by random changes of spatial scale. Over the past 50 years, the scientific fascination with dramatic long-distance and directed mass migratory events has overshadowed the more universal mode of population movement, involving much smaller stochastic displacement during the lifetime of the insects concerned. This may be limiting our understanding of insect population dynamics. In the following synthesis, we provide an overview of how herbivorous insect movement is governed by both abiotic and biotic factors, making these animals essentially 'slaves of their environment'. No displaced insect or insect population can leave a resource patch, migrate and flourish, leaving descendants, unless suitable habitat and/or resources are reached during movement. This must have constrained insects over geological time, bringing about species-specific adaptation in behaviour and movements in relation to their environment at a micro- and macrogeographical scale. With insects that undergo long-range spatial displacements, e.g. aphids and locusts, there is presumably a selection against movement unless overruled by factors, such as density-dependent triggering, which cause certain genotypes within the population to migrate. However, for most insect species, spatial changes of scale and range expansion are much slower and may occur over a much longer time-scale, and are not innate (nor directed). Ecologists may say that all animals and plants are figuratively speaking 'slaves of their environments', in the sense that their distribution is defined by their ecology and genotype. But in the case of insects, a vast number must perish daily, either out at sea or over other hostile habitats, having failed to find suitable resources and/or a habitat on which to feed and reproduce. Since many are blown by the vagaries of the wind, their chances of success are serendipitous in the extreme, especially over large distances. Hence, the strategies adopted by mass migratory species (innate pre-programmed flight behaviour, large population sizes and/or fast reproduction), which improve the chances that some of these individuals will succeed. We also emphasize the dearth of knowledge in the various interactions of insect movement and their environment, and describe how molecular markers (protein and DNA) may be used to examine the details of spatial scale over which movement occurs in relation to insect ecology and genotype.
Warwick-Evans, Victoria C.; Atkinson, Philip W.; Robinson, Leonie A.; Green, Jonathan A.
2016-01-01
During the breeding season seabirds are constrained to coastal areas and are restricted in their movements, spending much of their time in near-shore waters either loafing or foraging. However, in using these areas they may be threatened by anthropogenic activities such as fishing, watersports and coastal developments including marine renewable energy installations. Although many studies describe large scale interactions between seabirds and the environment, the drivers behind near-shore, fine-scale distributions are not well understood. For example, Alderney is an important breeding ground for many species of seabird and has a diversity of human uses of the marine environment, thus providing an ideal location to investigate the near-shore fine-scale interactions between seabirds and the environment. We used vantage point observations of seabird distribution, collected during the 2013 breeding season in order to identify and quantify some of the environmental variables affecting the near-shore, fine-scale distribution of seabirds in Alderney’s coastal waters. We validate the models with observation data collected in 2014 and show that water depth, distance to the intertidal zone, and distance to the nearest seabird nest are key predictors in the distribution of Alderney’s seabirds. AUC values for each species suggest that these models perform well, although the model for shags performed better than those for auks and gulls. While further unexplained underlying localised variation in the environmental conditions will undoubtedly affect the fine-scale distribution of seabirds in near-shore waters, we demonstrate the potential of this approach in marine planning and decision making. PMID:27031616
Delvigne, Frank; Takors, Ralf; Mudde, Rob; van Gulik, Walter; Noorman, Henk
2017-09-01
Efficient optimization of microbial processes is a critical issue for achieving a number of sustainable development goals, considering the impact of microbial biotechnology in agrofood, environment, biopharmaceutical and chemical industries. Many of these applications require scale-up after proof of concept. However, the behaviour of microbial systems remains unpredictable (at least partially) when shifting from laboratory-scale to industrial conditions. Robust microbial systems are thus highly needed in this context, as is a better understanding of the interactions between fluid mechanics and cell physiology. For that purpose, a full scale-up/down computational framework is already available. This framework links computational fluid dynamics (CFD), metabolic flux analysis and agent-based modelling (ABM) for a better understanding of the cell lifelines in a heterogeneous environment. Ultimately, this framework can be used for the design of scale-down simulators and/or metabolically engineered cells able to cope with environmental fluctuations typically found in large-scale bioreactors. However, this framework still needs some refinements, such as a better integration of gas-liquid flows in CFD, and taking into account intrinsic biological noise in ABM. © 2017 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.
NASA Astrophysics Data System (ADS)
Pizette, Patrick; Govender, Nicolin; Wilke, Daniel N.; Abriak, Nor-Edine
2017-06-01
The use of the Discrete Element Method (DEM) for industrial civil engineering applications is currently limited due to the computational demands when large numbers of particles are considered. The graphics processing unit (GPU), with its highly parallelized hardware architecture, shows potential to enable the solution of civil engineering problems using discrete granular approaches. We demonstrate in this study the practical utility of a validated GPU-enabled DEM modeling environment to simulate industrial-scale granular problems. As an illustration, the flow discharge of storage silos using 8 and 17 million particles is considered. DEM simulations have been performed to investigate the influence of particle size (equivalent size for the 20/40-mesh gravel) and induced shear stress for two hopper shapes. The preliminary results indicate that the shape of the hopper significantly influences the discharge rates for the same material. Specifically, this work shows that GPU-enabled DEM modeling environments can model industrial-scale problems on a single portable computer within a day for 30 seconds of process time.
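To illustrate the contact mechanics underlying such simulations, here is a deliberately minimal serial DEM step with a linear spring-dashpot normal-contact model; the stiffness, damping, and particle properties are arbitrary assumptions, and a production GPU code for millions of particles would add neighbour lists, tangential friction, boundary geometry, and CUDA kernels.

```python
import numpy as np

def dem_step(pos, vel, radius, mass, dt, k=1e5, c=50.0, g=-9.81):
    """One explicit DEM time step with a linear spring-dashpot
    normal-contact model (no tangential friction), O(N^2) pairing.

    pos, vel : (N, 3) arrays; radius, mass : scalars (monodisperse)
    """
    n = len(pos)
    force = np.zeros_like(pos)
    force[:, 2] += mass * g                       # gravity
    for i in range(n):
        for j in range(i + 1, n):
            rij = pos[j] - pos[i]
            dist = np.linalg.norm(rij)
            overlap = 2 * radius - dist
            if overlap > 0 and dist > 0:          # particles in contact
                normal = rij / dist
                rel_vn = np.dot(vel[j] - vel[i], normal)
                fn = (k * overlap - c * rel_vn) * normal
                force[i] -= fn                    # repulsion pushes i away from j
                force[j] += fn
    vel = vel + dt * force / mass                 # symplectic Euler update
    pos = pos + dt * vel
    return pos, vel

# toy example: two overlapping particles pushed apart while falling
pos = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 0.019]])
vel = np.zeros((2, 3))
for _ in range(1000):
    pos, vel = dem_step(pos, vel, radius=0.01, mass=0.01, dt=1e-5)
print(pos[:, 2])
```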
NASA Technical Reports Server (NTRS)
Halverson, Jeffrey B.; Roy, Biswadev; Starr, David O'C. (Technical Monitor)
2002-01-01
An overview of mean convective thermodynamic and wind profiles for the Tropical Rainfall Measuring Mission (TRMM) Large Scale Biosphere-Atmosphere Experiment (LBA) and Kwajalein Experiment (KWAJEX) field campaigns will be presented, highlighting the diverse continental and marine tropical environments in which rain clouds and mesoscale convective systems evolved. An assessment of ongoing sounding quality control procedures will be shown. Additionally, we will present preliminary budgets of sensible heat source (Q1) and apparent moisture sink (Q2), which have been diagnosed from the various sounding networks.
Ionic electroactive polymer artificial muscles in space applications
Punning, Andres; Kim, Kwang J.; Palmre, Viljar; Vidal, Frédéric; Plesse, Cédric; Festin, Nicolas; Maziz, Ali; Asaka, Kinji; Sugino, Takushi; Alici, Gursel; Spinks, Geoff; Wallace, Gordon; Must, Indrek; Põldsalu, Inga; Vunder, Veiko; Temmer, Rauno; Kruusamäe, Karl; Torop, Janno; Kaasik, Friedrich; Rinne, Pille; Johanson, Urmas; Peikolainen, Anna-Liisa; Tamm, Tarmo; Aabloo, Alvo
2014-01-01
A large-scale effort was carried out to test the performance of seven types of ionic electroactive polymer (IEAP) actuators against space-hazardous environmental factors under laboratory conditions. The results substantiate that the IEAP materials are tolerant to long-term freezing and vacuum environments as well as ionizing Gamma-, X-ray, and UV radiation at the levels corresponding to low Earth orbit (LEO) conditions. The main aim of this material behaviour investigation is to understand and predict device service time for prolonged exposure to the space environment. PMID:25372857
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George
1999-01-11
A workshop on collaborative problem-solving environments (CPSEs) was held June 29 through July 1, 1999, in San Diego, California. The workshop was sponsored by the U.S. Department of Energy and the High Performance Network Applications Team of the Large Scale Networking Working Group. The workshop brought together researchers and developers from industry, academia, and government to identify, define, and discuss future directions in collaboration and problem-solving technologies in support of scientific research.
Stellato, Giuseppina; La Storia, Antonietta; De Filippis, Francesca; Borriello, Giorgia; Villani, Francesco; Ercolini, Danilo
2016-07-01
Microbial contamination in food processing plants can play a fundamental role in food quality and safety. The aims of this study were to learn more about the possible influence of the meat processing environment on initial fresh meat contamination and to investigate the differences between small-scale retail distribution (SD) and large-scale retail distribution (LD) facilities. Samples were collected from butcheries (n = 20), including LD (n = 10) and SD (n = 10) facilities, over two sampling campaigns. Samples included fresh beef and pork cuts and swab samples from the knife, the chopping board, and the butcher's hand. The microbiota of both meat samples and environmental swabs were very complex, including more than 800 operational taxonomic units (OTUs) collapsed at the species level. The 16S rRNA sequencing analysis showed that core microbiota were shared by 80% of the samples and included Pseudomonas spp., Streptococcus spp., Brochothrix spp., Psychrobacter spp., and Acinetobacter spp. Hierarchical clustering of the samples based on the microbiota showed a certain separation between meat and environmental samples, with higher levels of Proteobacteria in meat. In particular, levels of Pseudomonas and several Enterobacteriaceae members were significantly higher in meat samples, while Brochothrix, Staphylococcus, lactic acid bacteria, and Psychrobacter prevailed in environmental swab samples. Consistent clustering was also observed when metabolic activities were considered by predictive metagenomic analysis of the samples. An increase in carbohydrate metabolism was predicted for the environmental swabs and was consistently linked to Firmicutes, while increases in pathways related to amino acid and lipid metabolism were predicted for the meat samples and were positively correlated with Proteobacteria. Our results highlighted the importance of the processing environment in contributing to the initial microbial levels of meat and clearly showed that the type of retail facility (LD or SD) did not apparently affect the contamination. The study provides an in-depth description of the microbiota of meat and meat processing environments. It highlights the importance of the environment as a contamination source of spoilage bacteria, and it shows that the size of the retail facility does not affect the level and type of contamination. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
Li, Guo Chun; Song, Hua Dong; Li, Qi; Bu, Shu Hai
2017-11-01
In Abies fargesii forests of the giant panda's habitat on Mt. Taibai, the spatial distribution patterns and interspecific associations of the main tree species, and their spatial associations with the understory flowering Fargesia qinlingensis, were analyzed at multiple scales by the univariate and bivariate O-ring functions of point pattern analysis. The results showed that in the A. fargesii forest, A. fargesii was the most abundant species but its population structure was in decline. The population of Betula platyphylla was relatively young, with a stable population structure, while the population of B. albo-sinensis declined. The three populations showed aggregated distributions at small scales and gradually approached random distributions with increasing spatial scale. Spatial associations among tree species occurred mainly at small scales and gradually weakened to no association with increasing scale. A. fargesii and B. platyphylla were positively associated with flowering F. qinlingensis at large and medium scales, whereas B. albo-sinensis was negatively associated with flowering F. qinlingensis at large and medium scales. The interactions between the trees and F. qinlingensis in the giant panda's habitat promoted the dynamic succession and development of the forests, which changed the environment of the giant panda's habitat in Qinling.
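The univariate O-ring statistic used in this kind of analysis can be illustrated with a short sketch. The Python snippet below is a minimal example only: it uses synthetic coordinates, ignores the edge corrections and null-model envelopes (e.g., complete spatial randomness simulations) that a study of this kind would normally apply, and is not the authors' implementation.

```python
import numpy as np

def o_ring(points, r_edges):
    """Univariate O-ring statistic O(r): mean density of neighbours in the
    annulus [r_i, r_{i+1}) around each point (no edge correction)."""
    points = np.asarray(points, dtype=float)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                      # exclude each point itself
    counts = np.array([np.sum((d >= lo) & (d < hi), axis=1)
                       for lo, hi in zip(r_edges[:-1], r_edges[1:])])
    ring_area = np.pi * (r_edges[1:] ** 2 - r_edges[:-1] ** 2)
    return counts.mean(axis=1) / ring_area           # neighbours per unit area

# Toy usage: 200 hypothetical stem positions in a 100 m x 100 m plot.
rng = np.random.default_rng(0)
stems = rng.uniform(0, 100, size=(200, 2))
r_edges = np.arange(0.0, 32.0, 2.0)
print(o_ring(stems, r_edges))   # ~0.02 stems/m^2 at all r for a random pattern
```

Aggregation at small scales would show up as O(r) well above the overall stem density at small r; the bivariate version used in the study counts points of a second species around each focal tree instead.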
Scaling properties of European research units
Jamtveit, Bjørn; Jettestuen, Espen; Mathiesen, Joachim
2009-01-01
A quantitative characterization of the scale-dependent features of research units may provide important insight into how such units are organized and how they grow. The relative importance of top-down versus bottom-up controls on their growth may be revealed by their scaling properties. Here we show that the number of support staff in Scandinavian research units, ranging in size from 20 to 7,800 staff members, is related to the number of academic staff by a power law. The scaling exponent of ≈1.30 is broadly consistent with a simple hierarchical model of the university organization. Similar scaling behavior between small and large research units with a wide range of ambitions and strategies argues against top-down control of the growth. Top-down effects, and externally imposed effects from changing political environments, can be observed as fluctuations around the main trend. The observed scaling law implies that cost-benefit arguments for merging research institutions into larger and larger units may have limited validity unless the productivity per academic staff and/or the quality of the products are considerably higher in larger institutions. Despite the hierarchical structure of most large-scale research units in Europe, the network structures represented by the academic component of such units are strongly antihierarchical and suboptimal for efficient communication within individual units. PMID:19625626
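The reported power law, support ≈ a · (academic)^b with b ≈ 1.30, can be recovered from data by an ordinary least-squares fit in log-log space. The sketch below uses made-up staff counts purely for illustration; only the fitting procedure is the point.

```python
import numpy as np

# Hypothetical (academic staff, support staff) counts -- illustration only.
academic = np.array([15, 40, 120, 400, 1500, 5000])
support = np.array([6, 22, 80, 330, 1500, 6200])

# Fit support = a * academic**b  <=>  log(support) = b*log(academic) + log(a)
b, log_a = np.polyfit(np.log(academic), np.log(support), 1)
print(f"exponent b ~= {b:.2f}, prefactor a ~= {np.exp(log_a):.3f}")
```

An exponent above 1 means the support-staff fraction grows with unit size, which is what makes the cost-benefit argument for mergers questionable.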
Distributed intelligent urban environment monitoring system
NASA Astrophysics Data System (ADS)
Du, Jinsong; Wang, Wei; Gao, Jie; Cong, Rigang
2018-02-01
Environmental pollution and destruction have developed into a major worldwide social problem that threatens human survival and development. Environmental monitoring is the prerequisite and basis of environmental governance, but overall, current environmental monitoring systems face a series of problems. Based on electrochemical sensors, this paper designs a small, low-cost, easy-to-deploy urban environmental quality monitoring terminal, with multiple terminals forming a distributed network. The system has been deployed in small-scale demonstration applications, which confirmed that it is suitable for large-scale rollout.
Exploration–exploitation trade-off features a saltatory search behaviour
Volchenkov, Dimitri; Helbach, Jonathan; Tscherepanow, Marko; Kühnel, Sina
2013-01-01
Searching experiments conducted in different virtual environments over a gender-balanced group of people revealed a gender irrelevant scale-free spread of searching activity on large spatio-temporal scales. We have suggested and solved analytically a simple statistical model of the coherent-noise type describing the exploration–exploitation trade-off in humans (‘should I stay’ or ‘should I go’). The model exhibits a variety of saltatory behaviours, ranging from Lévy flights occurring under uncertainty to Brownian walks performed by a treasure hunter confident of the eventual success. PMID:23782535
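The qualitative difference between the two regimes mentioned in the abstract can be sketched by drawing step lengths from a finite-variance distribution versus a heavy-tailed one. This is not the coherent-noise model solved in the paper, only a minimal Python illustration; the exponent mu and the scales are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Brownian-like search: finite-variance step lengths (Rayleigh-distributed here).
brownian_steps = rng.rayleigh(scale=1.0, size=n)

# Levy-flight-like search: heavy-tailed step lengths, p(l) ~ l**(-mu) for l >= l_min,
# sampled by inverting the Pareto CDF.
mu, l_min = 2.0, 1.0
levy_steps = l_min * rng.uniform(size=n) ** (-1.0 / (mu - 1.0))

# The heavy tail shows up in the extremes, not in the medians.
print(np.median(brownian_steps), np.median(levy_steps))
print(brownian_steps.max(), levy_steps.max())
```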
Self and world: large scale installations at science museums.
Shimojo, Shinsuke
2008-01-01
This paper describes three examples of illusion installations in a science museum environment from the author's collaboration with the artist and architect. The installations amplify illusory effects, such as vection (the visually induced sensation of self-motion) and motion-induced blindness, to emphasize that perception is not just about obtaining the structure and features of objects, but rather about grasping the dynamic relationship between the self and the world. Scaling up the size and utilizing the live human body turned out to be key to installations with higher emotional impact.
EPA, in collaboration with FHWA, has been involved in a large-scale monitoring research study in an effort to characterize highway vehicle emissions in a near-road environment. The pollutants of interest include particulate matter with aerodynamic diameter less than 2.5 microns ...
Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...
Positive effects of afforestation efforts on the health of urban soils
Emily E. Oldfield; Alexander J. Felson; Stephen A. Wood; Richard A. Hallett; Michael S. Strickland; Mark A. Bradford
2014-01-01
Large-scale tree planting projects in cities are increasingly implemented as a strategy to improve the urban environment. Trees provide multiple benefits in cities, including reduction of urban temperatures, improved air quality, mitigation of storm-water run-off, and provision of wildlife habitat. How urban afforestation affects the properties and functions of urban...
Education Researchers as Bricoleurs in the Creation of Sustainable Learning Environments
ERIC Educational Resources Information Center
Mahlomaholo, Sechaba
2014-01-01
Higher education has, to date, been unable to provide effective and lasting solutions to challenges of education, because large sections thereof continue to search for knowledge for its own sake. At best, they conduct responsive research, but on a small scale they reduce the complexity that is education to a neat unilinear process which can be…
Peter Caldwell; Catalina Segura; Shelby Gull Laird; Ge Sun; Steven G. McNulty; Maria Sandercock; Johnny Boggs; James M. Vose
2015-01-01
Assessment of potential climate change impacts on stream water temperature (Ts) across large scales remains challenging for resource managers because energy exchange processes between the atmosphere and the stream environment are complex and uncertain, and few long-term datasets are available to evaluate changes over time. In this study, we...
A Unique Design for High-Impact Safety and Awareness Training
ERIC Educational Resources Information Center
Calandra, Brendan; Harmon, Stephen W.
2012-01-01
The authors were asked to design and develop a large-scale, web-based learning environment that would effectively assist international aid workers in conducting their daily tasks in the field, at home and in the office in a safer and more secure fashion. The design, development and dissemination of the training needed to be done quickly,…
Improving Real World Performance of Vision Aided Navigation in a Flight Environment
2016-09-15
(Table-of-contents excerpt: Introduction; Wide Area Search Extent; Large-Scale Image Navigation Histogram Filter, with Location Model, Measurement Model, and Iteration of Histogram Filter; Implementation and Flight Test Campaign, with Software Implementation.)
ERIC Educational Resources Information Center
Mobley, Catherine; Vagias, Wade M.; DeWard, Sarah L.
2010-01-01
It is often assumed that individuals who are knowledgeable and concerned about the environment will engage in environmentally responsible behavior (ERB). We use data from a large scale Web survey hosted on National Geographic's Web site in 2001-2002 to investigate this premise. We examine whether reading three classic environmental books…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-22
.... Chronic marine pollution stresses fish and wildlife resources, possibly delaying recovery of resources... chronic marine pollution are believed to be at least as important as those of large-scale spills. The... and lower Cook Inlet. The Council seeks to further reduce pollution in the marine environment to...
YaQ: an architecture for real-time navigation and rendering of varied crowds.
Maïm, Jonathan; Yersin, Barbara; Thalmann, Daniel
2009-01-01
The YaQ software platform is a complete system dedicated to real-time crowd simulation and rendering. Fitting multiple application domains, such as video games and VR, YaQ aims to provide efficient algorithms to generate crowds comprising up to thousands of varied virtual humans navigating in large-scale, global environments.
A Multivariate Analysis of Secondary Students' Experience of Web-Based Language Acquisition
ERIC Educational Resources Information Center
Felix, Uschi
2004-01-01
This paper reports on a large-scale project designed to replicate an earlier investigation of tertiary students (Felix, 2001) in a secondary school environment. The new project was carried out in five settings, again investigating the potential of the Web as a medium of language instruction. Data was collected by questionnaires and observational…
Physical Environment of the Pacific Missile Range Facility, Kauai, Hawaii,
1984-03-01
Macdonald, Davis, and Cox (1960), the island of Kauai and the adjacent island of Niihau are lava domes located at the top of one large marine volcanic... (The remainder of this excerpt is residue from a map of Kauai, Niihau, and Oahu with a flood-current legend and a scale in nautical miles.)
ERIC Educational Resources Information Center
Schnebele, Emily K.
2013-01-01
Flooding is the most frequently occurring natural hazard on Earth; with catastrophic, large scale floods causing immense damage to people, property, and the environment. Over the past 20 years, remote sensing has become the standard technique for flood identification because of its ability to offer synoptic coverage. Unfortunately, remote sensing…
Too Scared to Learn? The Academic Consequences of Feeling Unsafe at School. Working Paper #02-13
ERIC Educational Resources Information Center
Lacoe, Johanna
2013-01-01
A safe environment is a prerequisite for productive learning. This paper represents the first large-scale analysis of how feelings of safety at school affect educational outcomes. Using a unique longitudinal dataset of survey responses from New York City middle school students, the paper provides insight into the causal relationship between…
We Are Lost: Measuring the Accessibility of Signage in Public General Hospitals
ERIC Educational Resources Information Center
Schuster, Michal; Elroy, Irit; Elmakais, Ido
2017-01-01
Hospital signage is a critical element in patients' and visitors' understanding of directions, instructions, and warnings in the facility. In multilingual environments, organizations need to make sure that the information is accessible in the languages of the people who use their services. As part of a large-scale study that examined the…
Understanding the Relationships between Interest in Online Math Games and Academic Performance
ERIC Educational Resources Information Center
Zhang, M.
2015-01-01
Although the Internet is widely used by students in both formal and informal environments, little is known about how and where youth spend their time online. Using Internet search and Web analytics data, this study discovered a large-scale phenomenon associated with the poor performance of elementary school students in the USA that has been…
ERIC Educational Resources Information Center
Thoeun, Chanthou
2013-01-01
In this article, the author contends that there is by now a devastating catalogue of evidence revealing the depth and breadth of corporate-sponsored, government-sanctioned acts of violence against the environment across the globe. British Petroleum's (BP) oil spill, for instance, is a testament to large-scale catastrophic ecological damages…
A Day in Third Grade: A Large-Scale Study of Classroom Quality and Teacher and Student Behavior
ERIC Educational Resources Information Center
Elementary School Journal, 2005
2005-01-01
Observations of 780 third-grade classrooms described classroom activities, child-teacher interactions, and dimensions of the global classroom environment, which were examined in relation to structural aspects of the classroom and child behavior. One child per classroom was targeted for observation in relation to classroom quality and teacher and…
Chemistry and structure of giant molecular clouds in energetic environments
NASA Astrophysics Data System (ADS)
Anderson, Crystal Nicole
2016-09-01
Throughout the years many studies on Galactic star formation have been conducted. This resulted in the idea that giant molecular clouds (GMCs) are hierarchical in nature with substructures spanning a large range of sizes. The physical processes that determine how molecular clouds fragment, form clumps/cores and then stars depend strongly on both recent radiative and mechanical feedback from massive stars and, over the longer term, on enhanced cooling due to the buildup of metals. Radiative and mechanical energy input from stellar populations can alter subsequent star formation over a large part of a galaxy and hence is relevant to the evolution of galaxies. Much of our knowledge of star formation on galaxy-wide scales is based on scaling laws and other parametric descriptions. But to understand the overall evolution of star formation in galaxies we need to watch the feedback processes at work on giant molecular cloud (GMC) scales. By doing this we can begin to answer how strong feedback environments change the properties of the substructure in GMCs. Testing Galactic star formation theory in other galaxies has been challenging due to the limited resolution of current instruments. Thus, only the nearest galaxies allow us to resolve GMCs and their substructures. The Large Magellanic Cloud (LMC) is one of the closest low-metallicity dwarf galaxies (D ˜ 50 kpc) and is close enough that current instruments can resolve the substructure of its GMCs to <1 pc. The LMC has a star cluster located near the GMC 30 Doradus, producing high levels of far-ultraviolet (FUV) radiation in the interstellar medium (ISM). The dwarf galaxy NGC 5253 is also a nearby low-metallicity galaxy (3.8 Mpc) with a super star cluster, which appears to be composed of several newborn globular clusters, located within the center of the galaxy. These huge, compact collections of massive stars and their supernovae have the potential to dump large amounts of FUV radiation and momentum into the ISM. Under such hostile conditions, we cannot expect star formation to evolve in the same fashion as it does across much of the Galaxy. With the advancement of radio interferometry instruments like ALMA and the ATCA, we are able to observe nearby dwarf galaxies at 1.5-40 pc scales. Also, with the advancement of the instruments, astrochemistry is becoming an exciting and dominant field in studying star-forming regions at varying densities and evolutionary stages outside the Galaxy. In this dissertation, I discuss observations of molecular gas tracers (e.g. HCO+, HCN, HNC, CS, C2H, N2H+) detected in the LMC at 1.5-40 pc scales and in NGC 5253 at 40 pc scales. I then compare the molecular gas detections to the Central Molecular Zone in our Galaxy. Dense molecular gas was detected in all of the sources. For the regions in the LMC, molecular lines of CS, N2H+, C2H, HNC, HCO+ and HCN were all detected in N159W and N113 while only HCN, HCO+, HNC, and C2H were detected in 30Dor-10. Toward NGC 5253 only HCO+, HCN, C2H and CS were detected. I observe anomalously large HCO+/HCN line ratios of >5 for the NGC 5253 SSC, 30Dor-10 and N159W clumps. However, the ratio is <2 for N113, the least energetic source, on clump scales. NGC 5253, 30Dor-10 and N159W have anomalously faint HCN. The CMZ, however, does not have anomalously faint HCN; it has HCO+/HCN ratios typical of high-metallicity environments, active galactic nuclei and ultraluminous infrared galaxies.
These observations suggest that the fainter HCN emission relative to HCO+ results from a combination of low metallicity and energetics, both contributing to the change in the HCO+/HCN ratio. I find that the impact of massive star-forming regions on the surrounding gas, in different galaxies and from small to large scales, changes the chemistry within these regions. A more energetic region's chemistry seems to differ from that of a less energetic region. There is a richer chemistry within a less energetic region, which may suggest that the chemistry in an energetic environment is quenched due to increased photodissociation.
NASA Technical Reports Server (NTRS)
Tao, W.-K.; Shie, C.-L.; Johnson, D; Simpson, J.; Starr, David OC. (Technical Monitor)
2002-01-01
A two-dimensional version of the Goddard Cumulus Ensemble (GCE) Model is used to simulate convective systems that developed in various geographic locations. Observed large-scale advective tendencies for potential temperature, water vapor mixing ratio, and horizontal momentum derived from field campaigns are used as the main forcing. By examining the surface energy budgets, the model results show that the two largest terms are net condensation (heating/drying) and imposed large-scale forcing (cooling/moistening) for tropical oceanic cases. These two terms are opposite in sign, however. The contributions by net radiation and latent heat flux to the net condensation vary among these tropical cases. For cloud systems that developed over the South China Sea and eastern Atlantic, net radiation (cooling) accounts for about 20% or more of the net condensation. However, short-wave heating and long-wave cooling are in balance with each other for cloud systems over the West Pacific region such that the net radiation is very small. This is due to the thick anvil clouds simulated in the cloud systems over the Pacific region. Large-scale cooling exceeds large-scale moistening in the Pacific and Atlantic cases. For cloud systems over the South China Sea, however, there is more large-scale moistening than cooling even though the cloud systems developed in a very moist environment. For three cloud systems that developed over a mid-latitude continent, the net radiation and sensible and latent heat fluxes play a much more important role. This means that accurate measurement of surface fluxes and radiation is crucial for simulating these mid-latitude cases.
Additional Results of Glaze Icing Scaling in SLD Conditions
NASA Technical Reports Server (NTRS)
Tsao, Jen-Ching
2016-01-01
New guidance on acceptable means of compliance with super-cooled large drop (SLD) conditions was issued by the U.S. Department of Transportation's Federal Aviation Administration (FAA) in its Advisory Circular AC 25-28 in November 2014. Part 25, Appendix O was developed to define a representative icing environment for super-cooled large drops. Super-cooled large drops, which include freezing drizzle and freezing rain conditions, are not included in Appendix C. This paper reports results from recent glaze icing scaling tests conducted in the NASA Glenn Icing Research Tunnel (IRT) to evaluate how well the scaling methods recommended for Appendix C conditions might apply to SLD conditions. The models were straight NACA 0012 wing sections. The reference model had a chord of 72 inches and the scale model had a chord of 21 inches. Reference tests were run with airspeeds of 100 and 130.3 knots and with MVDs of 85 and 170 microns. Two scaling methods were considered. One was based on the modified Ruff method, with the scale velocity found by matching the Weber number We_L. The other was proposed and developed by Feo specifically for strong glaze icing conditions, in which the scale liquid water content and velocity were found by matching reference and scale values of the non-dimensional water-film thickness expression and the film Weber number We_f. All tests were conducted at 0 degrees angle of attack. Results will be presented for stagnation freezing fractions of 0.2 and 0.3. For non-dimensional reference and scale ice shape comparison, a new post-scanning ice shape digitization procedure was developed for extracting 2-dimensional ice shape profiles at any selected span-wise location from the high-fidelity 3-dimensional scanned ice shapes obtained in the IRT.
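As a rough illustration of what "scale velocity found by matching the Weber number" means, the sketch below assumes a Weber number of the form We = ρ_w V² L / σ_w with the model chord as the length scale. That specific form and choice of length scale are assumptions for illustration only; the modified Ruff method as actually applied may define We_L with a different characteristic length, so the numbers are indicative, not the test conditions.

```python
import math

RHO_W = 1000.0        # water density, kg/m^3 (assumed)
SIGMA_W = 0.0756      # water surface tension near 0 C, N/m (assumed)
KT_TO_MS = 0.514444   # knots -> m/s
IN_TO_M = 0.0254      # inches -> metres

def weber(v_ms, length_m):
    # Assumed form: We = rho_w * V**2 * L / sigma_w
    return RHO_W * v_ms ** 2 * length_m / SIGMA_W

c_ref, c_scale = 72 * IN_TO_M, 21 * IN_TO_M
v_ref = 100 * KT_TO_MS

# Matching We(v_ref, c_ref) == We(v_scale, c_scale) gives v_scale = v_ref*sqrt(c_ref/c_scale)
v_scale = v_ref * math.sqrt(c_ref / c_scale)
assert math.isclose(weber(v_ref, c_ref), weber(v_scale, c_scale))
print(f"scale velocity ~= {v_scale / KT_TO_MS:.0f} kn")   # ~185 kn under these assumptions
```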
Turner, B. L.; Sabloff, Jeremy A.
2012-01-01
The ninth century collapse and abandonment of the Central Maya Lowlands in the Yucatán peninsular region were the result of complex human–environment interactions. Large-scale Maya landscape alterations and demands placed on resources and ecosystem services generated high-stress environmental conditions that were amplified by increasing climatic aridity. Coincident with this stress, the flow of commerce shifted from land transit across the peninsula to sea-borne transit around it. These changing socioeconomic and environmental conditions generated increasing societal conflicts, diminished control by the Maya elite, and led to decisions to move elsewhere in the peninsular region rather than incur the high costs of maintaining the human–environment systems in place. After abandonment, the environment of the Central Maya Lowlands largely recovered, although altered from its state before Maya occupation; the population never recovered. This history and the spatial and temporal variability in the pattern of collapse and abandonment throughout the Maya lowlands support the case for different conditions, opportunities, and constraints in the prevailing human–environment systems and the decisions to confront them. The Maya case lends insights for the use of paleo- and historical analogs to inform contemporary global environmental change and sustainability. PMID:22912403
Extended X-ray emission in PKS 1718-649
NASA Astrophysics Data System (ADS)
Beuchert, T.; Rodríguez-Ardila, A.; Moss, V. A.; Schulz, R.; Kadler, M.; Wilms, J.; Angioni, R.; Callingham, J. R.; Gräfe, C.; Krauß, F.; Kreikenbohm, A.; Langejahn, M.; Leiter, K.; Maccagni, F. M.; Müller, C.; Ojha, R.; Ros, E.; Tingay, S. J.
2018-04-01
PKS 1718-649 is one of the closest and most comprehensively studied candidates of a young active galactic nucleus (AGN) that is still embedded in its optical host galaxy. The compact radio structure, with a maximal extent of a few parsecs, makes it a member of the group of compact symmetric objects (CSO). Its environment imposes a turnover of the radio synchrotron spectrum towards lower frequencies, also classifying PKS 1718-649 as gigahertz-peaked radio spectrum (GPS) source. Its close proximity has allowed the first detection of extended X-ray emission in a GPS/CSO source with Chandra that is for the most part unrelated to nuclear feedback. However, not much is known about the nature of this emission. By co-adding all archival Chandra data and complementing these datasets with the large effective area of XMM-Newton, we are able to study the detailed physics of the environment of PKS 1718-649. Not only can we confirm that the bulk of the ≲kiloparsec-scale environment emits in the soft X-rays, but we also identify the emitting gas to form a hot, collisionally ionized medium. While the feedback of the central AGN still seems to be constrained to the inner few parsecs, we argue that supernovae are capable of producing the observed large-scale X-ray emission at a rate inferred from its estimated star formation rate.
NASA Astrophysics Data System (ADS)
Holmes, K. W.; Kyriakidis, P. C.; Chadwick, O. A.; Matricardi, E.; Soares, J. V.; Roberts, D. A.
2003-12-01
The natural controls on soil variability and the spatial scales at which correlation exists among soil and environmental variables are critical information for evaluating the effects of deforestation. We detect different spatial scales of variability in soil nutrient levels over a large region (hundreds of thousands of km2) in the Amazon, analyze correlations among soil properties at these different scales, and evaluate scale-specific relationships among soil properties and the factors potentially driving soil development. Statistical relationships among physical drivers of soil formation, namely geology, precipitation, terrain attributes, classified soil types, and land cover derived from remote sensing, were included to determine which factors are related to soil biogeochemistry at each spatial scale. Surface and subsurface soil profile data from a 3000-sample database collected in Rondônia, Brazil, were used to investigate patterns in pH, phosphorus, nitrogen, organic carbon, effective cation exchange capacity, calcium, magnesium, potassium, aluminum, sand, and clay in this environment grading from closed-canopy tropical forest to savanna. We focus on pH in this presentation for simplicity, because pH is the single most important soil characteristic for determining the chemical environment of higher plants and soil microbial activity. We determined four spatial scales which characterize integrated patterns of soil chemistry: less than 3 km; 3 to 10 km; 10 to 68 km; and from 68 to 550 km (extent of study area). Although the finest observable scale was fixed by the field sampling density, the coarser scales were determined from relationships in the data through coregionalization modeling, rather than being imposed by the researcher. Processes which affect soils over short distances, such as land cover and terrain attributes, were good predictors of fine scale spatial components of nutrients; processes which affect soils over very large distances, such as precipitation and geology, were better predictors at coarse spatial scales. However, this result may be affected by the resolution of the available predictor maps. Land-cover change exerted a strong influence on soil chemistry at fine spatial scales, and had progressively less of an effect at coarser scales. It is important to note that land cover, and interactions among land cover and the other predictors, continued to be a significant predictor of soil chemistry at every spatial scale up to hundreds of kilometers.
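The scale decomposition described here rests on variogram-type statistics. The Python sketch below computes only a classical empirical semivariogram on synthetic pH samples; it is a first step toward, not a substitute for, the linear model of coregionalization used in the study, and all coordinates, values, and lag bins are hypothetical.

```python
import numpy as np

def empirical_semivariogram(coords, values, lag_edges):
    """Matheron semivariance per lag bin (isotropic, no trend removal)."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)            # unique pairs only
    d, sq = d[iu], sq[iu]
    return np.array([sq[(d >= lo) & (d < hi)].mean()
                     for lo, hi in zip(lag_edges[:-1], lag_edges[1:])])

# Toy usage: 1000 hypothetical pH samples over a 550 km x 550 km region.
rng = np.random.default_rng(2)
xy = rng.uniform(0, 550, size=(1000, 2))              # km
ph = 4.5 + 0.002 * xy[:, 0] + rng.normal(0, 0.3, 1000)
lag_edges = np.array([0.0, 3.0, 10.0, 68.0, 550.0])   # the scales quoted in the abstract
print(empirical_semivariogram(xy, ph, lag_edges))
```

Rising semivariance with lag indicates spatial structure at that scale; the coregionalization step then partitions the cross-variograms of all variables into contributions from each of these nested scales.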
Climate Dynamics and Hysteresis at Low and High Obliquity
NASA Astrophysics Data System (ADS)
Colose, C.; Del Genio, A. D.; Way, M.
2017-12-01
We explore the large-scale climate dynamics at low and high obliquity for an Earth-like planet using the ROCKE-3D (Resolving Orbital and Climate Keys of Earth and Extraterrestrial Environments with Dynamics) 3-D General Circulation model being developed at NASA GISS as part of the Nexus for Exoplanet System Science (NExSS) initiative. We highlight the role of ocean heat storage and transport in determining the seasonal cycle at high obliquity, and describe the large-scale circulation and resulting regional climate patterns using both aquaplanet and Earth topographical boundary conditions. Finally, we contrast the hysteresis structure to varying CO2 concentration for a low and high obliquity planet near the outer edge of the habitable zone. We discuss the prospects for habitability for a high obliquity planet susceptible to global glaciation.
Beta decay rates of neutron-rich nuclei
NASA Astrophysics Data System (ADS)
Marketin, Tomislav; Huther, Lutz; Martínez-Pinedo, Gabriel
2015-10-01
Heavy element nucleosynthesis models involve various properties of thousands of nuclei in order to simulate the intricate details of the process. By necessity, as most of these nuclei cannot be studied in a controlled environment, these models must rely on the nuclear structure models for input. Of all the properties, the beta-decay half-lives are one of the most important ones due to their direct impact on the resulting abundance distributions. Currently, a single large-scale calculation is available based on a QRPA calculation with a schematic interaction on top of the Finite Range Droplet Model. In this study we present the results of a large-scale calculation based on the relativistic nuclear energy density functional, where both the allowed and the first-forbidden transitions are studied in more than 5000 neutron-rich nuclei.
The cosmic web in CosmoGrid void regions
NASA Astrophysics Data System (ADS)
Rieder, Steven; van de Weygaert, Rien; Cautun, Marius; Beygu, Burcu; Portegies Zwart, Simon
2016-10-01
We study the formation and evolution of the cosmic web, using the high-resolution CosmoGrid ΛCDM simulation. In particular, we investigate the evolution of the large-scale structure around void halo groups, and compare this to observations of the VGS-31 galaxy group, which consists of three interacting galaxies inside a large void. The region around such haloes shows a great deal of tenuous structure, with most of such systems being embedded in intra-void filaments and walls. We use the NEXUS+ algorithm to detect walls and filaments in CosmoGrid, and find them to be present and detectable at every scale. The void regions embed tenuous walls, which in turn embed tenuous filaments. We hypothesize that the void galaxy group of VGS-31 formed in such an environment.
Ultraviolet and optical view of galaxies in the Coma Supercluster
NASA Astrophysics Data System (ADS)
Mahajan, Smriti; Singh, Ankit; Shobhana, Devika
2018-05-01
The Coma supercluster (100 h-1 Mpc) offers an unprecedented contiguous range of environments in the nearby Universe. In this paper we present a catalogue of spectroscopically confirmed galaxies in the Coma supercluster detected in the ultraviolet (UV) wavebands. We use the arsenal of UV and optical data for galaxies in the Coma supercluster covering ˜500 square degrees on the sky to study their photometric and spectroscopic properties as a function of environment at various scales. We identify the different components of the cosmic web: large-scale filaments and voids using the Discrete Persistent Structures Extractor, and groups and clusters using Hierarchical Density-based spatial clustering of applications with noise, respectively. We find that in the Coma supercluster the median emission in Hα increases, while the g - r and FUV - NUV colours of galaxies become bluer, moving further away from the spine of the filaments out to a radius of ˜1 Mpc. On the other hand, an opposite trend is observed as the distance between the galaxy and the centre of the nearest cluster or group decreases. Our analysis supports the hypothesis that the properties of a galaxy are not defined just by its stellar mass and large-scale density, but also by the environmental processes arising from the intrafilament medium, whose role in accelerating galaxy transformations needs to be investigated thoroughly using multi-wavelength data.
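Of the two structure finders mentioned, the density-based group finder is easy to sketch. The snippet below applies the hdbscan Python package to synthetic 3-D galaxy positions; the feature space, minimum cluster size, and all coordinates are assumptions for illustration and do not reproduce the paper's catalogue, and the DisPerSE filament extraction is not shown.

```python
import numpy as np
import hdbscan  # third-party package implementing the HDBSCAN algorithm

# Hypothetical comoving positions (Mpc/h): a uniform field plus one compact group.
rng = np.random.default_rng(3)
field = rng.uniform(0, 100, size=(2000, 3))
group = rng.normal(loc=[50.0, 50.0, 50.0], scale=1.5, size=(150, 3))
positions = np.vstack([field, group])

clusterer = hdbscan.HDBSCAN(min_cluster_size=20)   # groups of at least 20 members
labels = clusterer.fit_predict(positions)          # label -1 marks unclustered "field" galaxies

n_groups = labels.max() + 1
print(f"found {n_groups} group(s); {np.sum(labels == -1)} galaxies left as field")
```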
Large-scale clustering as a probe of the origin and the host environment of fast radio bursts
NASA Astrophysics Data System (ADS)
Shirasaki, Masato; Kashiyama, Kazumi; Yoshida, Naoki
2017-04-01
We propose to use degree-scale angular clustering of fast radio bursts (FRBs) to identify their origin and the host galaxy population. We study the information content in autocorrelation of the angular positions and dispersion measures (DM) and in cross-correlation with galaxies. We show that the cross-correlation with Sloan Digital Sky Survey (SDSS) galaxies will place stringent constraints on the mean physical quantities associated with FRBs. If ˜10,000 FRBs are detected with ≲deg resolution in the SDSS field, the clustering analysis with the intrinsic DM scatter of 100 pc/cm3 can constrain the global abundance of free electrons at z ≲ 1 and the large-scale bias of FRB host galaxies (the statistical relation between the distribution of host galaxies and cosmic matter density field) with fractional errors (with a 68% confidence level) of ˜10% and ˜20%, respectively. The mean near-source dispersion measure and the delay-time distribution of FRB rates relative to the global star forming rate can be also determined by combining the clustering and the probability distribution function of DM. Our approach will be complementary to high-resolution (≪deg) event localization using, e.g., VLA and VLBI for identifying the origin of FRBs and the source environment. We strongly encourage future observational programs such as CHIME, UTMOST, and HIRAX to survey FRBs in the SDSS field.
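The angular clustering signal at the heart of this proposal is usually measured with a pair-count estimator. The sketch below is a brute-force Landy-Szalay w(θ) suitable only for small catalogues; it is a generic illustration, not the forecasting machinery (DM-space clustering, cross-correlation with SDSS galaxies, Fisher analysis) used in the paper, and the mock positions are arbitrary.

```python
import numpy as np

def angular_sep(ra1, dec1, ra2, dec2):
    """Great-circle separation in radians (inputs in radians)."""
    c = (np.sin(dec1) * np.sin(dec2) +
         np.cos(dec1) * np.cos(dec2) * np.cos(ra1 - ra2))
    return np.arccos(np.clip(c, -1.0, 1.0))

def pair_counts(a, b, theta_edges):
    th = angular_sep(a[:, 0][:, None], a[:, 1][:, None],
                     b[:, 0][None, :], b[:, 1][None, :])
    return np.histogram(th, bins=theta_edges)[0].astype(float)

def landy_szalay(data, rand, theta_edges):
    """w(theta) = (DD - 2DR + RR)/RR with normalized pair counts (no weights)."""
    nd, nr = len(data), len(rand)
    dd = pair_counts(data, data, theta_edges) / (nd * (nd - 1))
    rr = pair_counts(rand, rand, theta_edges) / (nr * (nr - 1))
    dr = pair_counts(data, rand, theta_edges) / (nd * nr)
    return (dd - 2.0 * dr + rr) / rr

# Toy usage: degree-scale bins; mock "FRBs" and randoms drawn from the same patch.
theta_edges = np.radians(np.linspace(0.5, 10.0, 11))   # bins start above 0 to skip self-pairs
rng = np.random.default_rng(4)
mock_frbs = np.column_stack([rng.uniform(0, np.pi / 2, 500), rng.uniform(0, 0.5, 500)])
randoms = np.column_stack([rng.uniform(0, np.pi / 2, 2000), rng.uniform(0, 0.5, 2000)])
print(landy_szalay(mock_frbs, randoms, theta_edges))   # ~0 for an unclustered mock
```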
Advances in cell culture: anchorage dependence
Merten, Otto-Wilhelm
2015-01-01
Anchorage-dependent cells are of great interest for various biotechnological applications. (i) They represent a formidable production means of viruses for vaccination purposes at very large scales (in 1000–6000 l reactors) using microcarriers, and in the last decade many more novel viral vaccines have been developed using this production technology. (ii) With the advent of stem cells and their use/potential use in clinics for cell therapy and regenerative medicine purposes, the development of novel culture devices and technologies for adherent cells has accelerated greatly with a view to the large-scale expansion of these cells. Presently, the really scalable systems—microcarrier/microcarrier-clump cultures using stirred-tank reactors—for the expansion of stem cells are still in their infancy. Only laboratory scale reactors of maximally 2.5 l working volume have been evaluated because thorough knowledge and basic understanding of critical issues with respect to cell expansion while retaining pluripotency and differentiation potential, and the impact of the culture environment on stem cell fate, etc., are still lacking and require further studies. This article gives an overview on critical issues common to all cell culture systems for adherent cells as well as specifics for different types of stem cells in view of small- and large-scale cell expansion and production processes. PMID:25533097
Polarization Radiation with Turbulent Magnetic Fields from X-Ray Binaries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jian-Fu; Xiang, Fu-Yuan; Lu, Ju-Fu, E-mail: jfzhang@xtu.edu.cn, E-mail: fyxiang@xtu.edu.cn, E-mail: lujf@xmu.edu.cn
2017-02-10
We study the properties of polarized radiation in turbulent magnetic fields from X-ray binary jets. These turbulent magnetic fields are composed of large- and small-scale configurations, which result in the polarized jitter radiation when the characteristic length of turbulence is less than the non-relativistic Larmor radius. On the contrary, the polarized synchrotron emission occurs, corresponding to a large-scale turbulent environment. We calculate the spectral energy distributions and the degree of polarization for a general microquasar. Numerical results show that turbulent magnetic field configurations can indeed provide a high degree of polarization, which does not mean that a uniform, large-scale magnetic field structure exists. The model is applied to investigate the properties of polarized radiation of the black-hole X-ray binary Cygnus X-1. Under the constraint of multiband observations of this source, our studies demonstrate that the model can explain the high polarization degree at the MeV tail and predict the highly polarized properties at the high-energy γ-ray region, and that the dominant small-scale turbulent magnetic field plays an important role for explaining the highly polarized observation at hard X-ray/soft γ-ray bands. This model can be tested by polarization observations of upcoming polarimeters at high-energy γ-ray bands.
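For context on why a high polarization degree is usually read as a signature of ordered fields: for optically thin synchrotron emission from electrons with power-law index p in a perfectly uniform magnetic field, the standard textbook maximum linear polarization is

```latex
\Pi_{\max} \;=\; \frac{p+1}{p+7/3}\,,
\qquad\text{e.g.}\qquad
\Pi_{\max}(p=2.5)\approx 0.72 .
```

Field disorder averaged over the beam normally pulls the observed degree well below this ceiling, which is why the paper's point that turbulent (jitter-regime) fields can still yield high polarization is notable. The formula is a general synchrotron result, not a quantity taken from this paper.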
NASA Astrophysics Data System (ADS)
Ota, Kazuaki; Venemans, Bram P.; Taniguchi, Yoshiaki; Kashikawa, Nobunari; Nakata, Fumiaki; Harikane, Yuichi; Bañados, Eduardo; Overzier, Roderik; Riechers, Dominik A.; Walter, Fabian; Toshikawa, Jun; Shibuya, Takatoshi; Jiang, Linhua
2018-04-01
Quasars (QSOs) hosting supermassive black holes are believed to reside in massive halos harboring galaxy overdensities. However, many observations revealed average or low galaxy densities around z ≳ 6 QSOs. This could be partly because they measured galaxy densities in only tens of arcmin2 around QSOs and might have overlooked potential larger-scale galaxy overdensities. Some previous studies also observed only Lyman break galaxies (LBGs; massive older galaxies) and missed low-mass young galaxies, like Lyα emitters (LAEs), around QSOs. Here we present observations of LAE and LBG candidates in ∼700 arcmin2 around a z = 6.61 luminous QSO using the Subaru Telescope Suprime-Cam with narrowband/broadband. We compare their sky distributions, number densities, and angular correlation functions with those of LAEs/LBGs detected in the same manner and comparable data quality in our control blank field. In the QSO field, LAEs and LBGs are clustering in 4–20 comoving Mpc angular scales, but LAEs show mostly underdensity over the field while LBGs are forming 30 × 60 comoving Mpc2 large-scale structure containing 3σ–7σ high-density clumps. The highest-density clump includes a bright (23.78 mag in the narrowband) extended (≳16 kpc) Lyα blob candidate, indicative of a dense environment. The QSO could be part of the structure but is not located exactly at any of the high-density peaks. Near the QSO, LAEs show underdensity while LBGs average to 4σ excess densities compared to the control field. If these environments reflect halo mass, the QSO may not be in the most massive halo but still in a moderately massive one. Based on data collected at Subaru Telescope, which is operated by the National Astronomical Observatory of Japan.
Development of the Large-Scale Forcing Data to Support MC3E Cloud Modeling Studies
NASA Astrophysics Data System (ADS)
Xie, S.; Zhang, Y.
2011-12-01
The large-scale forcing fields (e.g., vertical velocity and advective tendencies) are required to run single-column and cloud-resolving models (SCMs/CRMs), which are the two key modeling frameworks widely used to link field data to climate model developments. In this study, we use an advanced objective analysis approach to derive the required forcing data from the soundings collected by the Midlatitude Continental Convective Cloud Experiment (MC3E) in support of its cloud modeling studies. MC3E is the latest major field campaign conducted during the period 22 April 2011 to 06 June 2011 in south-central Oklahoma through a joint effort between the DOE ARM program and the NASA Global Precipitation Measurement Program. One of its primary goals is to provide a comprehensive dataset that can be used to describe the large-scale environment of convective cloud systems and evaluate model cumulus parameterizations. The objective analysis used in this study is the constrained variational analysis method. A unique feature of this approach is the use of domain-averaged surface and top-of-the-atmosphere (TOA) observations (e.g., precipitation and radiative and turbulent fluxes) as constraints to adjust atmospheric state variables from soundings by the smallest possible amount to conserve column-integrated mass, moisture, and static energy so that the final analysis data is dynamically and thermodynamically consistent. To address potential uncertainties in the surface observations, an ensemble forcing dataset will be developed. Multi-scale forcing will also be created for simulating convective systems of various scales. At the meeting, we will provide more details about the forcing development and present some preliminary analysis of the characteristics of the large-scale forcing structures for several selected convective systems observed during MC3E.
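For reference, the column-integrated budgets typically used as constraints in this kind of variational analysis (following the Zhang and Lin approach) have the form below, where ⟨·⟩ denotes a mass-weighted vertical integral; the exact formulation adopted for the MC3E product may differ in detail.

```latex
\frac{\partial \langle q \rangle}{\partial t}
  + \langle \nabla\!\cdot\!(\mathbf{v}\,q) \rangle = E_s - P_s ,
\qquad
\frac{\partial \langle s \rangle}{\partial t}
  + \langle \nabla\!\cdot\!(\mathbf{v}\,s) \rangle
  = \langle Q_{\mathrm{rad}} \rangle + L_v P_s + H_s .
```

Here q is the water vapor mixing ratio, s the dry static energy, E_s surface evaporation, P_s surface precipitation, H_s the surface sensible heat flux, and ⟨Q_rad⟩ the column net radiative heating; the sounding-derived state variables are nudged by the smallest amount needed to satisfy these budgets given the observed surface and TOA fluxes.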
NASA Technical Reports Server (NTRS)
Deardorff, Glenn; Djomehri, M. Jahed; Freeman, Ken; Gambrel, Dave; Green, Bryan; Henze, Chris; Hinke, Thomas; Hood, Robert; Kiris, Cetin; Moran, Patrick;
2001-01-01
A series of NASA presentations for the Supercomputing 2001 conference is summarized. The topics include: (1) Mars Surveyor Landing Sites "Collaboratory"; (2) Parallel and Distributed CFD for Unsteady Flows with Moving Overset Grids; (3) IP Multicast for Seamless Support of Remote Science; (4) Consolidated Supercomputing Management Office; (5) Growler: A Component-Based Framework for Distributed/Collaborative Scientific Visualization and Computational Steering; (6) Data Mining on the Information Power Grid (IPG); (7) Debugging on the IPG; (8) DeBakey Heart Assist Device; (9) Unsteady Turbopump for Reusable Launch Vehicle; (10) Exploratory Computing Environments Component Framework; (11) OVERSET Computational Fluid Dynamics Tools; (12) Control and Observation in Distributed Environments; (13) Multi-Level Parallelism Scaling on NASA's Origin 1024 CPU System; (14) Computing, Information, & Communications Technology; (15) NAS Grid Benchmarks; (16) IPG: A Large-Scale Distributed Computing and Data Management System; and (17) ILab: Parameter Study Creation and Submission on the IPG.
Aggregation of alpha-synuclein by a coarse-grained Monte Carlo simulation
NASA Astrophysics Data System (ADS)
Farmer, Barry; Pandey, Ras
Alpha-synuclein (ASN), an intrinsically disordered protein abundant in neurons, is believed to be a major cause of neurodegenerative diseases (e.g., Alzheimer's and Parkinson's disease). Abnormal aggregation of ASN leads to Lewy bodies with specific morphologies. We investigate the self-organizing structures in a crowded environment of ASN proteins by a coarse-grained Monte Carlo simulation. ASN is a chain of 140 residues. Structural detail of the residues is neglected, but specificity is captured via unique knowledge-based residue-residue interactions. Large-scale simulations are performed to analyze a number of local and global physical quantities (e.g., mobility profile, contact map, radius of gyration, structure factor) as a function of temperature and protein concentration. The trend in multi-scale structural variations of the protein in a crowded environment is compared with that of a free protein chain.
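Of the observables listed, the radius of gyration is simple to sketch. The Python snippet below computes R_g for a one-bead-per-residue chain; the random-walk configuration, unit masses, and bond scale are placeholders and have nothing to do with the knowledge-based interaction model or Monte Carlo moves used in the study.

```python
import numpy as np

def radius_of_gyration(coords, masses=None):
    """R_g of a coarse-grained chain (one bead per residue)."""
    coords = np.asarray(coords, dtype=float)
    m = np.ones(len(coords)) if masses is None else np.asarray(masses, dtype=float)
    com = np.average(coords, axis=0, weights=m)
    return np.sqrt(np.average(np.sum((coords - com) ** 2, axis=1), weights=m))

# Toy usage: a 140-bead random walk standing in for one alpha-synuclein chain.
rng = np.random.default_rng(5)
chain = np.cumsum(rng.normal(scale=2.2, size=(140, 3)), axis=0)  # ~3.8 length units per bond
print(f"Rg ~= {radius_of_gyration(chain):.1f} (arbitrary length units)")
```

In a production run this quantity would be averaged over Monte Carlo snapshots at each temperature and concentration, which is how the crowding trend described in the abstract is extracted.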
NASA Astrophysics Data System (ADS)
Hood, Alan W.; Hughes, David W.
2011-08-01
This review provides an introduction to the generation and evolution of the Sun's magnetic field, summarising both observational evidence and theoretical models. The eleven year solar cycle, which is well known from a variety of observed quantities, strongly supports the idea of a large-scale solar dynamo. Current theoretical ideas on the location and mechanism of this dynamo are presented. The solar cycle influences the behaviour of the global coronal magnetic field and it is the eruptions of this field that can impact on the Earth's environment. These global coronal variations can be modelled to a surprising degree of accuracy. Recent high resolution observations of the Sun's magnetic field in quiet regions, away from sunspots, show that there is a continual evolution of a small-scale magnetic field, presumably produced by small-scale dynamo action in the solar interior. Sunspots, a natural consequence of the large-scale dynamo, emerge, evolve and disperse over a period of several days. Numerical simulations can help to determine the physical processes governing the emergence of sunspots. We discuss the interaction of these emerging fields with the pre-existing coronal field, resulting in a variety of dynamic phenomena.
NASA Technical Reports Server (NTRS)
Allen, B. Danette; Alexandrov, Natalia
2016-01-01
Incremental approaches to air transportation system development inherit current architectural constraints, which, in turn, place hard bounds on system capacity, efficiency of performance, and complexity. To enable airspace operations of the future, a clean-slate (ab initio) airspace design(s) must be considered. This ab initio National Airspace System (NAS) must be capable of accommodating increased traffic density, a broader diversity of aircraft, and on-demand mobility. System and subsystem designs should scale to accommodate the inevitable demand for airspace services that include large numbers of autonomous Unmanned Aerial Vehicles and a paradigm shift in general aviation (e.g., personal air vehicles) in addition to more traditional aerial vehicles such as commercial jetliners and weather balloons. The complex and adaptive nature of ab initio designs for the future NAS requires new approaches to validation, adding a significant physical experimentation component to analytical and simulation tools. In addition to software modeling and simulation, the ability to exercise system solutions in a flight environment will be an essential aspect of validation. The NASA Langley Research Center (LaRC) Autonomy Incubator seeks to develop a flight simulation infrastructure for ab initio modeling and simulation that assumes no specific NAS architecture and models vehicle-to-vehicle behavior to examine interactions and emergent behaviors among hundreds of intelligent aerial agents exhibiting collaborative, cooperative, coordinative, selfish, and malicious behaviors. The air transportation system of the future will be a complex adaptive system (CAS) characterized by complex and sometimes unpredictable (or unpredicted) behaviors that result from temporal and spatial interactions among large numbers of participants. A CAS not only evolves with a changing environment and adapts to it, it is closely coupled to all systems that constitute the environment. Thus, the ecosystem that contains the system and other systems evolves with the CAS as well. The effects of the emerging adaptation and co-evolution are difficult to capture with only combined mathematical and computational experimentation. Therefore, an ab initio flight simulation environment must accommodate individual vehicles, groups of self-organizing vehicles, and large-scale infrastructure behavior. Inspired by Massively Multiplayer Online Role Playing Games (MMORPG) and Serious Gaming, the proposed ab initio simulation environment is similar to online gaming environments in which player participants interact with each other, affect their environment, and expect the simulation to persist and change regardless of any individual player's active participation.
NASA Astrophysics Data System (ADS)
Morikawa, Y.; Murata, K. T.; Watari, S.; Kato, H.; Yamamoto, K.; Inoue, S.; Tsubouchi, K.; Fukazawa, K.; Kimura, E.; Tatebe, O.; Shimojo, S.
2010-12-01
The main methodologies of Solar-Terrestrial Physics (STP) to date have been theoretical, experimental and observational, and computer simulation approaches. Recently, "informatics" has emerged as a new (fourth) approach to STP studies. Informatics is a methodology for analyzing large-scale data (observational and computer simulation data) to obtain new findings using a variety of data processing techniques. At NICT (National Institute of Information and Communications Technology, Japan) we are developing a new research environment named "OneSpaceNet". OneSpaceNet is a cloud-computing environment specialized for scientific work, which connects many researchers through a high-speed network (JGN: Japan Gigabit Network). The JGN is a wide-area backbone network operated by NICT; it provides a 10G network and many access points (AP) across Japan. OneSpaceNet also provides rich computing resources for research, such as supercomputers, large-scale data storage, licensed applications, visualization devices (such as a tiled display wall: TDW), databases/DBMS, cluster computers (4-8 nodes) for data processing, and communication devices. A notable feature of the science cloud is that a user only needs to prepare a terminal (a low-cost PC). Once the PC is connected to JGN2plus, the user can make full use of the rich resources of the science cloud. Using communication devices such as video-conference systems, streaming and reflector servers, and media players, users on OneSpaceNet can carry out research communications as if they belonged to the same laboratory: they are members of a virtual laboratory. The specifications of the computing resources on OneSpaceNet are as follows. The data storage developed so far is almost 1 PB in size, and the number of data files managed on the cloud storage is growing and now exceeds 40,000,000. Notably, the disks forming the large-scale storage are distributed across 5 data centers over Japan, yet the storage system performs as one disk. Three supercomputers are allocated on the cloud, one in Tokyo, one in Osaka and one in Nagoya. Simulation job data from any of the supercomputers are saved to the same directory on the cloud data storage; it is a kind of virtual computing environment. The tiled display wall has 36 panels acting as one display, with a total resolution of 18000x4300 pixels. This size is sufficient to preview or analyze large-scale computer simulation data, and it also allows many researchers to view multiple images (e.g., 100 pictures) on one screen together. In our talk we also present a brief report of initial results using OneSpaceNet for Global MHD simulations as an example of successful use of our science cloud: (i) ultra-high time resolution visualization of Global MHD simulations using the large-scale storage and parallel processing system on the cloud, (ii) a database of real-time Global MHD simulations and statistical analyses of the data, and (iii) a 3D Web service for Global MHD simulations.
The X-ray emission mechanism of large scale powerful quasar jets: Fermi rules out IC/CMB for 3C 273.
NASA Astrophysics Data System (ADS)
Georganopoulos, Markos; Meyer, Eileen T.
2013-12-01
The process responsible for the Chandra-detected X-ray emission from the large-scale jets of powerful quasars is not yet clear. The two main models are inverse Compton scattering off cosmic microwave background photons (IC/CMB) and synchrotron emission from a population of electrons separate from those producing the radio-IR emission. These two models imply radically different conditions in the large-scale jet in terms of jet speed, kinetic power, and maximum energy of the particle acceleration mechanism, with important implications for the impact of the jet on the larger-scale environment. Georganopoulos et al. (2006) proposed a diagnostic based on a fundamental difference between these two models: the production of synchrotron X-rays requires multi-TeV electrons, while the IC/CMB model requires a cutoff in the electron energy distribution below TeV energies. This has significant implications for the γ-ray emission predicted by these two models. Here we present new Fermi observations that place an upper limit on the gamma-ray flux from the large-scale jet of 3C 273; this limit lies clearly below the flux expected from the IC/CMB X-ray interpretation obtained by extrapolating the UV to X-ray spectrum of knot A, thus ruling out the IC/CMB interpretation entirely for this source. Further, the Fermi upper limit constrains the Doppler beaming factor to δ < 9, assuming equipartition fields, and possibly to as low as δ < 5, assuming no major deceleration of the jet from knots A through D1.
O'Connor, Brian D.; Yuen, Denis; Chung, Vincent; Duncan, Andrew G.; Liu, Xiang Kun; Patricia, Janice; Paten, Benedict; Stein, Lincoln; Ferretti, Vincent
2017-01-01
As genomic datasets continue to grow, the feasibility of downloading data to a local organization and running analysis on a traditional compute environment is becoming increasingly problematic. Current large-scale projects, such as the ICGC PanCancer Analysis of Whole Genomes (PCAWG), the Data Platform for the U.S. Precision Medicine Initiative, and the NIH Big Data to Knowledge Center for Translational Genomics, are using cloud-based infrastructure to both host and perform analysis across large data sets. In PCAWG, over 5,800 whole human genomes were aligned and variant called across 14 cloud and HPC environments; the processed data was then made available on the cloud for further analysis and sharing. If run locally, an operation at this scale would have monopolized a typical academic data centre for many months, and would have presented major challenges for data storage and distribution. However, this scale is increasingly typical for genomics projects and necessitates a rethink of how analytical tools are packaged and moved to the data. For PCAWG, we embraced the use of highly portable Docker images for encapsulating and sharing complex alignment and variant calling workflows across highly variable environments. While successful, this endeavor revealed a limitation in Docker containers, namely the lack of a standardized way to describe and execute the tools encapsulated inside the container. As a result, we created the Dockstore ( https://dockstore.org), a project that brings together Docker images with standardized, machine-readable ways of describing and running the tools contained within. This service greatly improves the sharing and reuse of genomics tools and promotes interoperability with similar projects through emerging web service standards developed by the Global Alliance for Genomics and Health (GA4GH). PMID:28344774
Wang, Yan Jason; Nguyen, Monica T; Steffens, Jonathan T; Tong, Zheming; Wang, Yungang; Hopke, Philip K; Zhang, K Max
2013-01-15
A new methodology, referred to as the multi-scale structure, integrates "tailpipe-to-road" (i.e., on-road domain) and "road-to-ambient" (i.e., near-road domain) simulations to elucidate the environmental impacts of particulate emissions from traffic sources. The multi-scale structure is implemented in the CTAG model to 1) generate process-based on-road emission rates of ultrafine particles (UFPs) by explicitly simulating the effects of exhaust properties, traffic conditions, and meteorological conditions, and 2) characterize the impacts of traffic-related emissions on micro-environmental air quality near a highway intersection in Rochester, NY. The performance of CTAG, evaluated against field measurements, shows adequate agreement in capturing the dispersion of carbon monoxide (CO) and the number concentrations of UFPs in the near-road micro-environment. As a proof-of-concept case study, we also apply CTAG to separate the relative impacts of the shutdown of a large coal-fired power plant (CFPP) and the adoption of ultra-low-sulfur diesel (ULSD) on UFP concentrations in the intersection micro-environment. Although CTAG is still computationally expensive compared to widely used parameterized dispersion models, it has the potential to advance our capability to predict the impacts of UFP emissions and the spatial/temporal variations of air pollutants in complex environments. Furthermore, for on-road simulations CTAG can serve as a process-based emission model; combining the on-road and near-road simulations turns CTAG into a "plume-in-grid" model for mobile emissions. The resulting emission profiles can, in turn, improve regional air quality and climate predictions.
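The two-stage coupling described above can be pictured with a deliberately simplified sketch: an on-road stage turns traffic activity into a line-source emission rate, and a near-road stage dilutes that source with downwind distance. The functional forms, parameter values, and function names below are hypothetical placeholders for illustration; they are not the CTAG formulation.

# Conceptual sketch of a "tailpipe-to-road" / "road-to-ambient" coupling.
# Emission factors, dispersion parameters, and functional forms are hypothetical
# placeholders, not those used in the CTAG model.
import math

def on_road_emission_rate(vehicles_per_hour, particles_per_vehicle_per_m):
    """On-road stage: UFP line-source strength [particles per metre of road per second]."""
    return vehicles_per_hour / 3600.0 * particles_per_vehicle_per_m

def near_road_concentration(q_line, distance_m, wind_speed=2.0, sigma_z0=2.0, growth=0.2):
    """Near-road stage: crude Gaussian-style dilution of a ground-level line source."""
    sigma_z = sigma_z0 + growth * distance_m      # vertical plume depth grows downwind
    return q_line / (math.sqrt(2.0 * math.pi) * sigma_z * wind_speed)

q = on_road_emission_rate(vehicles_per_hour=4000, particles_per_vehicle_per_m=1.0e11)
for x in (10, 50, 100, 300):
    print(f"{x:4d} m downwind: ~{near_road_concentration(q, x):.2e} particles/m^3")

In CTAG itself the on-road stage is process-based, explicitly simulating exhaust, traffic, and meteorological effects rather than applying a fixed per-vehicle factor, but the hand-off between the on-road and near-road domains follows the same pattern.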
NASA Astrophysics Data System (ADS)
Boella, Elisabetta; Herrero-Gonzalez, Diego; Innocenti, Maria Elena; Bemporad, Alessandro; Lapenta, Giovanni
2017-04-01
Fully kinetic simulations of magnetic reconnection events in the solar environment are especially challenging due to the extreme range of spatial and temporal scales that characterises them. As one moves from the photosphere to the chromosphere and the corona, the temperature increases from sub-eV to 10-100 eV, while the mass density decreases from 10^-4 to 10^-12 kg/m^3 and below. The intrinsic scales of kinetic reconnection (inertial length and gyroradius) are far smaller than the maximum resolution available in observations. Furthermore, no direct information is available on the size of reconnection regions, plasmoids, and reconnection fronts, while observations suggest that the process can cascade down to very small scales (Bemporad 2008). Resolving the electron and ion scales while simulating a sufficiently large domain is a great challenge facing solar modelling. An especially challenging aspect is the need to resolve the Debye length: the very low electron temperature and the large spatial and temporal scales make these simulations hard to carry out with existing Particle-in-Cell (PIC) methods. The limiting factor is the ratio of the grid spacing to the Debye length, since explicit PIC methods retain good stability and energy conservation only when the grid spacing does not exceed the Debye length by a large factor. Semi-implicit methods (Brackbill & Forslund 1982; Langdon et al. 1983) improve on this point, and only the recently developed fully energy-conserving implicit methods have solved the problem (Markidis & Lapenta 2011; Chen et al. 2011), albeit at a high computational cost. Very recently, we have developed an efficient new semi-implicit algorithm that has been proven to conserve energy exactly to machine precision (Lapenta 2016). In this work, we illustrate the main steps that enabled this advance and report its implementation in a new massively parallel three-dimensional PIC code, called ECsim (Lapenta et al. 2016). The new approach is applied to the problem of reconnection in the solar environment. We compare results of a simple 2D configuration, similar to the so-called GEM challenge, for different ranges of electron temperature, density, and magnetic field corresponding to different distances from the photosphere, demonstrating the capability of the new code. Finally, we report on the first results (to the authors' knowledge) of realistic 3D magnetic reconnection simulations in the solar environment, considering a domain large enough to describe the interaction of large-scale dynamics with the reconnection process. References: A. Bemporad, ApJ 689, 572 (2008); J.U. Brackbill and D.W. Forslund, J. Comput. Phys. 46, 271 (1982); A. Langdon et al., J. Comput. Phys. 51, 107 (1983); S. Markidis and G. Lapenta, J. Comput. Phys. 230, 7037 (2011); G. Chen et al., J. Comput. Phys. 230, 7018 (2011); G. Lapenta, arXiv:1602.06326 (2016); G. Lapenta et al., arXiv:1612.08289 (2016).
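The grid-spacing constraint mentioned above is easy to quantify. The sketch below is an illustration only; the round-number plasma parameters are loosely representative of photospheric versus coronal conditions, not the values used in the ECsim runs. It computes the Debye length and the number of cells an explicit PIC code limited to a grid spacing of roughly the Debye length would need to span a 1000 km region.

# Debye length: lambda_D = sqrt(eps0 * k_B * T_e / (n_e * e^2)). An explicit PIC code
# needs its grid spacing to stay close to lambda_D; the parameter values below are
# illustrative round numbers only.
import math

eps0 = 8.854e-12      # vacuum permittivity [F/m]
e    = 1.602e-19      # elementary charge [C]

def debye_length(T_e_eV, n_e_m3):
    # k_B * T_e in joules equals T_e[eV] * e, so one factor of e cancels
    return math.sqrt(eps0 * T_e_eV * e / (n_e_m3 * e**2))

L_domain = 1.0e6      # a 1000 km region, just to set the scale
for label, T_eV, n_e in [("photosphere-like", 0.5, 1.0e20),
                         ("corona-like", 100.0, 1.0e15)]:
    lam = debye_length(T_eV, n_e)
    print(f"{label:16s} lambda_D ~ {lam:.1e} m, cells across 1000 km ~ {L_domain/lam:.1e}")

With these numbers the photosphere-like case would require on the order of 10^12 cells per dimension, which is why relaxing the Debye-length restriction, as the semi-implicit and energy-conserving implicit schemes do, is essential for solar-scale kinetic simulations.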
Giroux, Marie-Andrée; Valiquette, Éliane; Tremblay, Jean-Pierre; Côté, Steeve D
2015-01-01
Documenting habitat-related patterns in foraging behaviour at the individual level and over large temporal scales remains challenging for large herbivores. Stable isotope analysis could represent a valuable tool to quantify habitat-related foraging behaviour at the scale of individuals and over large temporal scales in forest-dwelling large herbivores living in coastal environments, because the carbon (δ13C) or nitrogen (δ15N) isotopic signatures of forage can differ between open and closed habitats or between terrestrial and littoral forage, respectively. Here, we examined whether we could detect isotopic differences between the different assemblages of forage taxa consumed by white-tailed deer in open, closed, supralittoral, and littoral habitats. We showed that the δ13C of assemblages of forage taxa was 3.0 ‰ lower in closed than in open habitats, while δ15N was 2.0 ‰ and 7.4 ‰ higher in supralittoral and littoral habitats, respectively, than in terrestrial habitats. Stable isotope analysis may thus represent an additional technique for ecologists interested in quantifying the consumption of terrestrial vs. marine autotrophs. Yet, given the relative isotopic proximity and the overlap between forage from open, closed, and supralittoral habitats, the next step would be to determine how well the contribution of each forage assemblage to herbivore diet can be estimated.
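The "next step" mentioned at the end is conventionally approached with an isotope mixing model. As a minimal illustration, the sketch below applies a generic two-end-member δ15N mixing calculation; the consumer value and the 3.0 ‰ trophic discrimination factor are made-up assumptions, and only the 7.4 ‰ littoral-terrestrial offset comes from the abstract. This is not the authors' analysis.

# Generic two-source isotope mixing model (illustration only). The trophic shift and
# consumer value are assumed; only the 7.4 permil littoral offset follows the abstract.
def littoral_fraction(d15N_consumer, d15N_terrestrial, d15N_littoral, trophic_shift=3.0):
    """Estimated fraction of littoral forage in the diet (linear two-source mixing)."""
    diet_signal = d15N_consumer - trophic_shift              # correct for diet-to-tissue shift
    f = (diet_signal - d15N_terrestrial) / (d15N_littoral - d15N_terrestrial)
    return min(max(f, 0.0), 1.0)                             # clamp to a physical proportion

# Hypothetical deer tissue at +5.0 permil, terrestrial forage at 0, littoral 7.4 higher.
print(littoral_fraction(5.0, 0.0, 7.4))   # ~0.27

In practice, the overlap between open, closed, and supralittoral sources noted by the authors would typically be handled with a Bayesian mixing model that propagates source variability, rather than with this simple linear calculation.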