Sample records for large scale multi-site

  1. A comparison of single- and multi-site calibration and validation: a case study of SWAT in the Miyun Reservoir watershed, China

    NASA Astrophysics Data System (ADS)

    Bai, Jianwen; Shen, Zhenyao; Yan, Tiezhu

    2017-09-01

    An essential task in evaluating global water resource and pollution problems is to obtain the optimum set of parameters in hydrological models through calibration and validation. For a large-scale watershed, single-site calibration and validation may ignore spatial heterogeneity and may not meet the needs of the entire watershed. The goal of this study is to apply a multi-site calibration and validation of the Soil and Water Assessment Tool (SWAT), using the observed flow data at three monitoring sites within the Baihe watershed of the Miyun Reservoir watershed, China. Our results indicate that the multi-site calibration parameter values are more reasonable than those obtained from single-site calibrations. These results are mainly due to significant differences in the topographic factors over the large-scale area, human activities and climate variability. The multi-site method involves dividing the large watershed into smaller watersheds and applying the calibrated parameters of the multi-site calibration to the entire watershed. It was anticipated that this case study could provide experience with multi-site calibration in a large-scale basin and a good foundation for the simulation of other pollutants in follow-up work in the Miyun Reservoir watershed and other similar large areas.
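
    As an illustration of the contrast the abstract draws between one basin-wide parameter set and gauge-specific parameter sets, a minimal multi-site calibration loop is sketched below; the `run_model` callable is a hypothetical stand-in for a SWAT execution, and the Nash-Sutcliffe efficiency is only one of several objective functions such studies use.

    ```python
    # Minimal sketch (not from the paper): multi-site calibration scores each gauged
    # sub-watershed separately, so different sub-watersheds can keep different parameters.
    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe efficiency: 1 is a perfect fit; values below 0 are worse than the mean."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def calibrate_multi_site(run_model, obs_by_gauge, candidate_param_sets):
        """run_model(gauge, params) -> simulated flow series (hypothetical SWAT wrapper).
        Returns, per gauge, the candidate parameter set with the highest NSE."""
        best = {}
        for gauge, observed in obs_by_gauge.items():
            scored = [(nse(observed, run_model(gauge, p)), p) for p in candidate_param_sets]
            best[gauge] = max(scored, key=lambda item: item[0])
        return best  # {gauge: (best_nse, best_params)}
    ```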

  2. Robust multi-site MR data processing: iterative optimization of bias correction, tissue classification, and registration.

    PubMed

    Young Kim, Eun; Johnson, Hans J

    2013-01-01

    A robust multi-modal tool for automated registration, bias correction, and tissue classification has been implemented for large-scale, heterogeneous, multi-site longitudinal MR data analysis. This work focused on improving an iterative optimization framework between bias correction, registration, and tissue classification, inspired by previous work. The primary contributions are robustness improvements from the incorporation of four elements: (1) use of multi-modal and repeated scans, (2) incorporation of highly deformable registration, (3) use of an extended set of tissue definitions, and (4) use of multi-modal-aware intensity-context priors. The benefits of these enhancements were investigated in a series of experiments with both a simulated brain data set (BrainWeb) and highly heterogeneous data from a 32-site imaging study, with quality assessed through expert visual inspection. The implementation of this tool is tailored for, but not limited to, large-scale data processing with great data variation, and it offers a flexible interface. In this paper, we describe enhancements to joint registration, bias correction, and tissue classification that improve the generalizability and robustness of processing multi-modal longitudinal MR scans collected at multiple sites. The tool was evaluated using both simulated and human-subject MRI images. With these enhancements, the results showed improved robustness for large-scale heterogeneous MRI processing.
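
    The iterative optimization the abstract refers to can be summarized as a loop that alternates registration, tissue classification, and bias correction until a cost stops improving; the sketch below is schematic only, with the individual steps passed in as callables because their implementations are not given in this record.

    ```python
    # Schematic sketch: alternate the three coupled steps until the pipeline cost stabilizes.
    def iterative_mr_pipeline(images, atlas, register, classify, correct_bias, cost_fn,
                              max_iters=5, tol=1e-3):
        """register/classify/correct_bias/cost_fn are caller-supplied callables standing in
        for the real registration, tissue-classification, bias-correction, and cost routines."""
        corrected, transform, posteriors = images, None, None
        previous_cost = float("inf")
        for _ in range(max_iters):
            transform = register(corrected, atlas, transform)    # deformable registration step
            posteriors = classify(corrected, atlas, transform)   # tissue posteriors from priors
            corrected = correct_bias(images, posteriors)         # bias field re-estimated from labels
            current_cost = cost_fn(corrected, transform, posteriors)
            if abs(previous_cost - current_cost) < tol:          # stop when improvement stalls
                break
            previous_cost = current_cost
        return corrected, transform, posteriors
    ```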

  3. The Impact of Large, Multi-Function/Multi-Site Competitions

    DTIC Science & Technology

    2003-08-01

    this approach generates larger savings and improved service quality, and is less expensive to implement. Moreover, it is a way to meet the President's...of the study is to assess the degree to which large-scale competitions completed have resulted in increased savings and service quality and decreased

  4. Ten Steps to Conducting a Large, Multi-Site, Longitudinal Investigation of Language and Reading in Young Children

    PubMed Central

    Farquharson, Kelly; Murphy, Kimberly A.

    2016-01-01

    Purpose: This paper describes methodological procedures involving execution of a large-scale, multi-site longitudinal study of language and reading comprehension in young children. Researchers in the Language and Reading Research Consortium (LARRC) developed and implemented these procedures to ensure data integrity across multiple sites, schools, and grades. Specifically, major features of our approach, as well as lessons learned, are summarized in 10 steps essential for successful completion of a large-scale longitudinal investigation in early grades. Method: Over 5 years, children in preschool through third grade were administered a battery of 35 higher- and lower-level language, listening, and reading comprehension measures (RCM). Data were collected from children, their teachers, and their parents/guardians at four sites across the United States. Substantial and rigorous effort was aimed toward maintaining consistency in processes and data management across sites for children, assessors, and staff. Conclusion: With appropriate planning, flexibility, and communication strategies in place, LARRC developed and executed a successful multi-site longitudinal research study that will meet its goal of investigating the contribution and role of language skills in the development of children's listening and reading comprehension. Through dissemination of our design strategies and lessons learned, research teams embarking on similar endeavors can be better equipped to anticipate the challenges. PMID:27064308

  5. Reduced-complexity multi-site rainfall generation: one million years over night using the model TripleM

    NASA Astrophysics Data System (ADS)

    Breinl, Korbinian; Di Baldassarre, Giuliano; Girons Lopez, Marc

    2017-04-01

    We assess uncertainties of multi-site rainfall generation across spatial scales and different climatic conditions. Many research subjects in the earth sciences, such as floods, droughts or water balance simulations, require the generation of long rainfall time series. In large study areas, simulation at multiple sites becomes indispensable to account for spatial rainfall variability, but it is more complex than single-site simulation due to the intermittent nature of rainfall. Weather generators can be used for extrapolating rainfall time series, and various models have been presented in the literature. Even though the large majority of multi-site rainfall generators are based on similar methods, such as resampling techniques or Markovian processes, they often become too complex. We think that this complexity has limited the application of such tools. Furthermore, the majority of multi-site rainfall generators found in the literature are either not publicly available or intended to be applied at small geographical scales, often only in temperate climates. Here we present a revised, and now publicly available, version of a multi-site rainfall generation code first applied in 2014 in Austria and France, which we call TripleM (Multisite Markov Model). We test this fast and robust code with daily rainfall observations from the United States, in subtropical, tropical and temperate climates, using rain gauge networks with a maximum site distance above 1,000 km, thereby generating one million years of synthetic time series. The modelling of these one million years takes one night on a recent desktop computer. In this research, we start the simulations with a small station network of three sites and progressively increase the number of sites and the spatial extent, and analyze the changing uncertainties for multiple statistical metrics such as dry and wet spells, rainfall autocorrelation, lagged cross-correlations and the inter-annual rainfall variability. Our study contributes to the earth science community and the ongoing debate on extreme precipitation in a changing climate by making a stable, and very easily applicable, multi-site rainfall generation code available to the research community and by providing a better understanding of the performance of multi-site rainfall generation depending on spatial scales and climatic conditions.
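
    The record does not reproduce TripleM's equations; as a hedged illustration of the general idea behind multi-site Markov rainfall generators, the sketch below drives per-site two-state (wet/dry) occurrence chains with spatially correlated Gaussian noise. It is a simplification in the spirit of Wilks-type generators, not the authors' code, and all parameter values are placeholders.

    ```python
    # Illustrative sketch: multi-site daily rainfall occurrence via per-site first-order
    # Markov chains coupled through a spatially correlated latent Gaussian field.
    import numpy as np
    from scipy.stats import norm

    def simulate_occurrence(n_days, p01, p11, site_corr, seed=None):
        """p01[i]: P(wet | dry yesterday) at site i; p11[i]: P(wet | wet yesterday).
        site_corr: inter-site correlation matrix of the latent Gaussian field."""
        rng = np.random.default_rng(seed)
        p01, p11 = np.asarray(p01), np.asarray(p11)
        n_sites = len(p01)
        wet = np.zeros((n_days, n_sites), dtype=bool)
        latent = rng.multivariate_normal(np.zeros(n_sites), site_corr, size=n_days)
        uniforms = norm.cdf(latent)                    # correlated uniforms in (0, 1)
        for t in range(1, n_days):
            p_wet = np.where(wet[t - 1], p11, p01)     # transition probability given yesterday
            wet[t] = uniforms[t] < p_wet
        return wet

    # Placeholder example: three sites with moderate inter-site correlation
    corr = np.array([[1.0, 0.6, 0.4], [0.6, 1.0, 0.5], [0.4, 0.5, 1.0]])
    occurrence = simulate_occurrence(365, [0.25, 0.30, 0.20], [0.60, 0.65, 0.55], corr, seed=1)
    ```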

  6. Large-scale, high-performance and cloud-enabled multi-model analytics experiments in the context of the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.

    2017-12-01

    The increasing resolution of comprehensive Earth System Models is rapidly leading to very large volumes of climate simulation output that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data from multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one more site hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments. At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.

  7. Using multi-scale distribution and movement effects along a montane highway to identify optimal crossing locations for a large-bodied mammal community.

    PubMed

    Schuster, Richard; Römer, Heinrich; Germain, Ryan R

    2013-01-01

    Roads are a major cause of habitat fragmentation that can negatively affect many mammal populations. Mitigation measures such as crossing structures are a proposed method to reduce the negative effects of roads on wildlife, but the best methods for determining where such structures should be implemented, and how their effects might differ between species in mammal communities is largely unknown. We investigated the effects of a major highway through south-eastern British Columbia, Canada on several mammal species to determine how the highway may act as a barrier to animal movement, and how species may differ in their crossing-area preferences. We collected track data of eight mammal species across two winters, along both the highway and pre-marked transects, and used a multi-scale modeling approach to determine the scale at which habitat characteristics best predicted preferred crossing sites for each species. We found evidence for a severe barrier effect on all investigated species. Freely-available remotely-sensed habitat landscape data were better than more costly, manually-digitized microhabitat maps in supporting models that identified preferred crossing sites; however, models using both types of data were better yet. Further, in 6 of 8 cases models which incorporated multiple spatial scales were better at predicting preferred crossing sites than models utilizing any single scale. While each species differed in terms of the landscape variables associated with preferred/avoided crossing sites, we used a multi-model inference approach to identify locations along the highway where crossing structures may benefit all of the species considered. By specifically incorporating both highway and off-highway data and predictions we were able to show that landscape context plays an important role for maximizing mitigation measurement efficiency. Our results further highlight the need for mitigation measures along major highways to improve connectivity between mammal populations, and illustrate how multi-scale data can be used to identify preferred crossing sites for different species within a mammal community.

  8. Using multi-scale distribution and movement effects along a montane highway to identify optimal crossing locations for a large-bodied mammal community

    PubMed Central

    Römer, Heinrich; Germain, Ryan R.

    2013-01-01

    Roads are a major cause of habitat fragmentation that can negatively affect many mammal populations. Mitigation measures such as crossing structures are a proposed method to reduce the negative effects of roads on wildlife, but the best methods for determining where such structures should be implemented, and how their effects might differ between species in mammal communities is largely unknown. We investigated the effects of a major highway through south-eastern British Columbia, Canada on several mammal species to determine how the highway may act as a barrier to animal movement, and how species may differ in their crossing-area preferences. We collected track data of eight mammal species across two winters, along both the highway and pre-marked transects, and used a multi-scale modeling approach to determine the scale at which habitat characteristics best predicted preferred crossing sites for each species. We found evidence for a severe barrier effect on all investigated species. Freely-available remotely-sensed habitat landscape data were better than more costly, manually-digitized microhabitat maps in supporting models that identified preferred crossing sites; however, models using both types of data were better yet. Further, in 6 of 8 cases models which incorporated multiple spatial scales were better at predicting preferred crossing sites than models utilizing any single scale. While each species differed in terms of the landscape variables associated with preferred/avoided crossing sites, we used a multi-model inference approach to identify locations along the highway where crossing structures may benefit all of the species considered. By specifically incorporating both highway and off-highway data and predictions we were able to show that landscape context plays an important role for maximizing mitigation measurement efficiency. Our results further highlight the need for mitigation measures along major highways to improve connectivity between mammal populations, and illustrate how multi-scale data can be used to identify preferred crossing sites for different species within a mammal community. PMID:24244912
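
    The two records above name a multi-model inference approach without stating the criterion used to weight candidate models; for orientation only, the standard Akaike-weight formulation commonly used in such analyses is:

    ```latex
    % Akaike weights for a candidate model set (standard multi-model inference; the
    % specific criterion used in the paper is not stated in this record):
    \[
      \Delta_i = \mathrm{AIC}_i - \min_j \mathrm{AIC}_j, \qquad
      w_i = \frac{e^{-\Delta_i/2}}{\sum_j e^{-\Delta_j/2}}, \qquad
      \hat{y} = \sum_i w_i\, \hat{y}_i ,
    \]
    % where \hat{y}_i is the prediction (e.g., crossing-site suitability) from model i
    % and \hat{y} is the model-averaged prediction.
    ```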

  9. Subsurface Monitoring of CO2 Sequestration - A Review and Look Forward

    NASA Astrophysics Data System (ADS)

    Daley, T. M.

    2012-12-01

    The injection of CO2 into subsurface formations is at least 50 years old, with large-scale utilization of CO2 for enhanced oil recovery (CO2-EOR) beginning in the 1970s. Early monitoring efforts had limited measurements in available boreholes. With growing interest in CO2 sequestration beginning in the 1990s, along with growth in geophysical reservoir monitoring, small to mid-size sequestration monitoring projects began to appear. The overall goals of a subsurface monitoring plan are to provide measurement of CO2-induced changes in subsurface properties at a range of spatial and temporal scales. The range of spatial scales allows tracking of the location and saturation of the plume with varying detail, while finer temporal sampling (up to continuous) allows better understanding of dynamic processes (e.g. multi-phase flow) and constraining of reservoir models. Early monitoring of small-scale pilots associated with CO2-EOR (e.g., the McElroy field and the Lost Hills field) developed many of the methodologies, including tomographic imaging and multi-physics measurements. Large (reservoir) scale sequestration monitoring began with the Sleipner and Weyburn projects. Typically, large-scale monitoring, such as 4D surface seismic, has limited temporal sampling due to costs. Smaller-scale pilots can allow more frequent measurements as either individual time-lapse 'snapshots' or as continuous monitoring. Pilot monitoring examples include the Frio, Nagaoka and Otway pilots using repeated well logging, crosswell imaging, vertical seismic profiles and CASSM (continuous active-source seismic monitoring). For saline reservoir sequestration projects, there is typically integration of characterization and monitoring, since the sites are not pre-characterized resource developments (oil or gas), which reinforces the need for multi-scale measurements. As we move beyond pilot sites, we need to quantify CO2 plume and reservoir properties (e.g. pressure) over large scales, while still obtaining high resolution. Typically the high-resolution (spatial and temporal) tools are deployed in permanent or semi-permanent borehole installations, where special well design may be necessary, such as non-conductive casing for electrical surveys. Effective utilization of monitoring wells requires an approach of modular borehole monitoring (MBM) where multiple measurements can be made. An example is recent work at the Citronelle pilot injection site, where an MBM package with seismic, fluid sampling and distributed fiber sensing was deployed. For future large-scale sequestration monitoring, an adaptive borehole-monitoring program is proposed.

  10. The Parallel System for Integrating Impact Models and Sectors (pSIMS)

    NASA Technical Reports Server (NTRS)

    Elliott, Joshua; Kelly, David; Chryssanthacopoulos, James; Glotter, Michael; Jhunjhnuwala, Kanika; Best, Neil; Wilde, Michael; Foster, Ian

    2014-01-01

    We present a framework for massively parallel climate impact simulations: the parallel System for Integrating Impact Models and Sectors (pSIMS). This framework comprises a) tools for ingesting and converting large amounts of data to a versatile datatype based on a common geospatial grid; b) tools for translating this datatype into custom formats for site-based models; c) a scalable parallel framework for performing large ensemble simulations, using any one of a number of different impacts models, on clusters, supercomputers, distributed grids, or clouds; d) tools and data standards for reformatting outputs to common datatypes for analysis and visualization; and e) methodologies for aggregating these datatypes to arbitrary spatial scales such as administrative and environmental demarcations. By automating many time-consuming and error-prone aspects of large-scale climate impacts studies, pSIMS accelerates computational research, encourages model intercomparison, and enhances reproducibility of simulation results. We present the pSIMS design and use example assessments to demonstrate its multi-model, multi-scale, and multi-sector versatility.
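
    Component (e), aggregation of gridded outputs to arbitrary spatial units, can be illustrated with a small sketch; the region-ID mask layout used here is an assumption for illustration, not the pSIMS datatype.

    ```python
    # Illustrative sketch: aggregate a gridded output (e.g., simulated yield) to arbitrary
    # spatial units defined by a region-ID mask of the same shape.
    import numpy as np

    def aggregate_to_regions(grid, region_ids, statistic=np.nanmean):
        """grid: 2-D array of cell values; region_ids: 2-D integer array labeling each cell
        with an aggregation unit (e.g., an administrative or environmental demarcation)."""
        return {int(r): float(statistic(grid[region_ids == r])) for r in np.unique(region_ids)}

    # Placeholder example: a 4x4 grid aggregated to two regions
    values = np.arange(16, dtype=float).reshape(4, 4)
    regions = np.array([[0, 0, 1, 1]] * 4)
    print(aggregate_to_regions(values, regions))   # {0: 6.5, 1: 8.5}
    ```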

  11. An Integrated Assessment of Location-Dependent Scaling for Microalgae Biofuel Production Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Andre M.; Abodeely, Jared; Skaggs, Richard

    Successful development of a large-scale microalgae-based biofuels industry requires comprehensive analysis and understanding of the feedstock supply chain—from facility siting/design through processing/upgrading of the feedstock to a fuel product. The evolution from pilot-scale production facilities to energy-scale operations presents many multi-disciplinary challenges, including a sustainable supply of water and nutrients, operational and infrastructure logistics, and economic competitiveness with petroleum-based fuels. These challenges are addressed in part by applying the Integrated Assessment Framework (IAF)—an integrated multi-scale modeling, analysis, and data management suite—to address key issues in developing and operating an open-pond facility by analyzing how variability and uncertainty in space and time affect algal feedstock production rates, and determining the site-specific “optimum” facility scale to minimize capital and operational expenses. This approach explicitly and systematically assesses the interdependence of biofuel production potential, associated resource requirements, and production system design trade-offs. The IAF was applied to a set of sites previously identified as having the potential to cumulatively produce 5 billion gallons/year in the southeastern U.S., and results indicate costs can be reduced by selecting the most effective processing technology pathway and scaling downstream processing capabilities to fit site-specific growing conditions, available resources, and algal strains.

  12. 3D fully convolutional networks for subcortical segmentation in MRI: A large-scale study.

    PubMed

    Dolz, Jose; Desrosiers, Christian; Ben Ayed, Ismail

    2018-04-15

    This study investigates a 3D fully convolutional neural network (CNN) for subcortical brain structure segmentation in MRI. 3D CNN architectures have been generally avoided due to their computational and memory requirements during inference. We address the problem via small kernels, allowing deeper architectures. We further model both local and global context by embedding intermediate-layer outputs in the final prediction, which encourages consistency between features extracted at different scales and embeds fine-grained information directly in the segmentation process. Our model is efficiently trained end-to-end on a graphics processing unit (GPU), in a single stage, exploiting the dense inference capabilities of fully convolutional networks. We performed comprehensive experiments over two publicly available datasets. First, we demonstrate state-of-the-art performance on the IBSR dataset. Then, we report a large-scale multi-site evaluation over 1112 unregistered subject datasets acquired from 17 different sites (the ABIDE dataset), with ages ranging from 7 to 64 years, showing that our method is robust to various acquisition protocols, demographics and clinical factors. Our method yielded segmentations that are highly consistent with a standard atlas-based approach, while running in a fraction of the time needed by atlas-based methods and avoiding registration/normalization steps. This makes it convenient for massive multi-site neuroanatomical imaging studies. To the best of our knowledge, our work is the first to study subcortical structure segmentation on such large-scale and heterogeneous data. Copyright © 2017 Elsevier Inc. All rights reserved.
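
    A minimal PyTorch-style sketch of the two ideas highlighted in the abstract, small 3x3x3 kernels and feeding intermediate-layer features into the final prediction, is given below; the layer sizes are illustrative and do not reproduce the authors' architecture.

    ```python
    # Illustrative sketch (not the paper's network): small 3x3x3 kernels plus concatenation
    # of intermediate feature maps before a 1x1x1 per-voxel classification layer.
    import torch
    import torch.nn as nn

    class TinyMultiScaleFCN3D(nn.Module):
        def __init__(self, in_channels=1, n_classes=4):
            super().__init__()
            self.block1 = nn.Sequential(nn.Conv3d(in_channels, 16, 3, padding=1), nn.ReLU())
            self.block2 = nn.Sequential(nn.Conv3d(16, 32, 3, padding=1), nn.ReLU())
            self.block3 = nn.Sequential(nn.Conv3d(32, 32, 3, padding=1), nn.ReLU())
            # The classifier sees features from all three depths (16 + 32 + 32 channels).
            self.classifier = nn.Conv3d(16 + 32 + 32, n_classes, kernel_size=1)

        def forward(self, x):
            f1 = self.block1(x)
            f2 = self.block2(f1)
            f3 = self.block3(f2)
            fused = torch.cat([f1, f2, f3], dim=1)   # multi-scale context for every voxel
            return self.classifier(fused)            # per-voxel class scores

    scores = TinyMultiScaleFCN3D()(torch.randn(1, 1, 32, 32, 32))   # shape (1, 4, 32, 32, 32)
    ```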

  13. An integrated assessment of location-dependent scaling for microalgae biofuel production facilities

    DOE PAGES

    Coleman, André M.; Abodeely, Jared M.; Skaggs, Richard L.; ...

    2014-06-19

    Successful development of a large-scale microalgae-based biofuels industry requires comprehensive analysis and understanding of the feedstock supply chain—from facility siting and design through processing and upgrading of the feedstock to a fuel product. The evolution from pilot-scale production facilities to energy-scale operations presents many multi-disciplinary challenges, including a sustainable supply of water and nutrients, operational and infrastructure logistics, and economic competitiveness with petroleum-based fuels. These challenges are partially addressed by applying the Integrated Assessment Framework (IAF) – an integrated multi-scale modeling, analysis, and data management suite – to address key issues in developing and operating an open-pond microalgae production facility. This is done by analyzing how variability and uncertainty over space and through time affect feedstock production rates, and determining the site-specific “optimum” facility scale to minimize capital and operational expenses. This approach explicitly and systematically assesses the interdependence of biofuel production potential, associated resource requirements, and production system design trade-offs. To provide a baseline analysis, the IAF was applied in this paper to a set of sites in the southeastern U.S. with the potential to cumulatively produce 5 billion gallons per year. Finally, the results indicate costs can be reduced by scaling downstream processing capabilities to fit site-specific growing conditions, available and economically viable resources, and specific microalgal strains.

  14. Avian movements and wetland connectivity in landscape conservation

    USGS Publications Warehouse

    Haig, Susan M.; Mehlman, D.W.; Oring, L.W.

    1998-01-01

    The current conservation crisis calls for research and management to be carried out on a long-term, multi-species basis at large spatial scales. Unfortunately, scientists, managers, and agencies often are stymied in their effort to conduct these large-scale studies because of a lack of appropriate technology, methodology, and funding. This issue is of particular concern in wetland conservation, for which the standard landscape approach may include consideration of a large tract of land but fail to incorporate the suite of wetland sites frequently used by highly mobile organisms such as waterbirds (e.g., shorebirds, wading birds, waterfowl). Typically, these species have population dynamics that require use of multiple wetlands, but this aspect of their life history has often been ignored in planning for their conservation. We outline theoretical, empirical, modeling, and planning problems associated with this issue and suggest solutions to some current obstacles. These solutions represent a tradeoff between typical in-depth single-species studies and more generic multi-species studies. They include studying within- and among-season movements of waterbirds on a spatial scale appropriate to both widely dispersing and more stationary species; multi-species censuses at multiple sites; further development and use of technology such as satellite transmitters and population-specific molecular markers; development of spatially explicit population models that consider within-season movements of waterbirds; and recognition from funding agencies that landscape-level issues cannot adequately be addressed without support for these types of studies.

  15. Large-scale Activities Associated with the 2005 Sep. 7th Event

    NASA Astrophysics Data System (ADS)

    Zong, Weiguo

    We present a multi-wavelength study of large-scale activities associated with a significant solar event. On 2005 September 7, a flare classified as bigger than X17 was observed. Combining Hα 6562.8 Å, He I 10830 Å and soft X-ray observations, three large-scale activities were found to propagate over a long distance on the solar surface. 1) The first large-scale activity emanated from the flare site, propagated westward around the solar equator and appeared as sequential brightenings. With the MDI longitudinal magnetic field map, the activity was found to propagate along the magnetic network. 2) The second large-scale activity could be well identified in both He I 10830 Å and soft X-ray images and appeared as a diffuse emission enhancement propagating away. This activity started later than the first one and was not centered on the flare site. Moreover, a rotation was found along with the bright front propagating away. 3) The third activity was ahead of the second one and was identified as a "winking" filament. The three activities have different origins and were seldom observed together in one event. Therefore this study is useful for understanding the mechanism of large-scale activities on the solar surface.

  16. Assessing the challenges of multi-scope clinical research sites: an example from NIH HIV/AIDS clinical trials networks.

    PubMed

    Rosas, Scott R; Cope, Marie T; Villa, Christie; Motevalli, Mahnaz; Utech, Jill; Schouten, Jeffrey T

    2014-04-01

    Large-scale, multi-network clinical trials are seen as a means for efficient and effective utilization of resources with greater responsiveness to new discoveries. Formal structures instituted within the National Institutes of Health (NIH) HIV/AIDS Clinical Trials facilitate collaboration and coordination across networks and emphasize an integrated approach to HIV/AIDS vaccine, prevention and therapeutics clinical trials. This study examines the joint usage of clinical research sites as means of gaining efficiency, extending capacity, and adding scientific value to the networks. A semi-structured questionnaire covering eight clinical management domains was administered to 74 (62% of sites) clinical site coordinators at single- and multi-network sites to identify challenges and efficiencies related to clinical trials management activities and coordination with multi-network units. Overall, respondents at multi-network sites did not report more challenges than single-network sites, but did report unique challenges to overcome including in the areas of study prioritization, community engagement, staff education and training, and policies and procedures. The majority of multi-network sites reported that such affiliations do allow for the consolidation and cost-sharing of research functions. Suggestions for increasing the efficiency or performance of multi-network sites included streamlining standards and requirements, consolidating protocol activation methods, using a single cross-network coordinating centre, and creating common budget and payment mechanisms. The results of this assessment provide important information to consider in the design and management of multi-network configurations for the NIH HIV/AIDS Clinical Trials Networks, as well as others contemplating and promoting the concept of multi-network settings. © 2013 John Wiley & Sons Ltd.

  17. INFORMATION MANAGEMENT AND RELATED QUALITY ASSURANCE FOR A LARGE SCALE, MULTI-SITE RESEARCH PROJECT

    EPA Science Inventory

    During the summer of 2000, as part of a U.S. Environmental Protection Agency study designed to improve microbial water quality monitoring protocols at public beaches, over 11,000 water samples were collected at five selected beaches across the country. At each beach, samples wer...

  18. Effect of thematic map misclassification on landscape multi-metric assessment.

    PubMed

    Kleindl, William J; Powell, Scott L; Hauer, F Richard

    2015-06-01

    Advancements in remote sensing and computational tools have increased our awareness of large-scale environmental problems, thereby creating a need for monitoring, assessment, and management at these scales. Over the last decade, several watershed and regional multi-metric indices have been developed to assist decision-makers with planning actions of these scales. However, these tools use remote-sensing products that are subject to land-cover misclassification, and these errors are rarely incorporated in the assessment results. Here, we examined the sensitivity of a landscape-scale multi-metric index (MMI) to error from thematic land-cover misclassification and the implications of this uncertainty for resource management decisions. Through a case study, we used a simplified floodplain MMI assessment tool, whose metrics were derived from Landsat thematic maps, to initially provide results that were naive to thematic misclassification error. Using a Monte Carlo simulation model, we then incorporated map misclassification error into our MMI, resulting in four important conclusions: (1) each metric had a different sensitivity to error; (2) within each metric, the bias between the error-naive metric scores and simulated scores that incorporate potential error varied in magnitude and direction depending on the underlying land cover at each assessment site; (3) collectively, when the metrics were combined into a multi-metric index, the effects were attenuated; and (4) the index bias indicated that our naive assessment model may overestimate floodplain condition of sites with limited human impacts and, to a lesser extent, either over- or underestimated floodplain condition of sites with mixed land use.
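
    The Monte Carlo step described, perturbing land-cover labels according to plausible misclassification rates and recomputing the metric for each realization, can be sketched as follows; the confusion matrix and the metric function are placeholders rather than values from the study.

    ```python
    # Illustrative sketch: propagate thematic-map misclassification into a landscape metric.
    # Row c of `confusion` gives P(true class | mapped class c); it and `metric_fn` are placeholders.
    import numpy as np

    def simulate_metric_error(mapped_classes, confusion, metric_fn, n_draws=1000, seed=None):
        """mapped_classes: 1-D array of mapped class IDs for the cells of one assessment site.
        Returns the metric value for n_draws label realizations consistent with `confusion`."""
        rng = np.random.default_rng(seed)
        n_classes = confusion.shape[0]
        draws = np.empty(n_draws)
        for k in range(n_draws):
            realized = np.array([rng.choice(n_classes, p=confusion[c]) for c in mapped_classes])
            draws[k] = metric_fn(realized)
        return draws

    # Placeholder example: metric = fraction of "natural" cover (class 0), 10% confusion
    confusion = np.array([[0.9, 0.1], [0.1, 0.9]])
    mapped = np.array([0, 0, 0, 1, 1, 0, 0, 1])
    sims = simulate_metric_error(mapped, confusion, lambda c: float(np.mean(c == 0)), 200, seed=0)
    print(sims.mean() - np.mean(mapped == 0))   # bias relative to the error-naive score
    ```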

  19. A multi-landing pad DNA integration platform for mammalian cell engineering

    PubMed Central

    Gaidukov, Leonid; Wroblewska, Liliana; Teague, Brian; Nelson, Tom; Zhang, Xin; Liu, Yan; Jagtap, Kalpana; Mamo, Selamawit; Tseng, Wen Allen; Lowe, Alexis; Das, Jishnu; Bandara, Kalpanie; Baijuraj, Swetha; Summers, Nevin M; Zhang, Lin; Weiss, Ron

    2018-01-01

    Engineering mammalian cell lines that stably express many transgenes requires the precise insertion of large amounts of heterologous DNA into well-characterized genomic loci, but current methods are limited. To facilitate reliable large-scale engineering of CHO cells, we identified 21 novel genomic sites that supported stable long-term expression of transgenes, and then constructed cell lines containing one, two or three ‘landing pad’ recombination sites at selected loci. By using a highly efficient BxB1 recombinase along with different selection markers at each site, we directed recombinase-mediated insertion of heterologous DNA to selected sites, including targeting all three with a single transfection. We used this method to controllably integrate up to nine copies of a monoclonal antibody, representing about 100 kb of heterologous DNA in 21 transcriptional units. Because the integration was targeted to pre-validated loci, recombinant protein expression remained stable for weeks and additional copies of the antibody cassette in the integrated payload resulted in a linear increase in antibody expression. Overall, this multi-copy site-specific integration platform allows for controllable and reproducible insertion of large amounts of DNA into stable genomic sites, which has broad applications for mammalian synthetic biology, recombinant protein production and biomanufacturing. PMID:29617873

  20. Bio-stimuli-responsive multi-scale hyaluronic acid nanoparticles for deepened tumor penetration and enhanced therapy.

    PubMed

    Huo, Mengmeng; Li, Wenyan; Chaudhuri, Arka Sen; Fan, Yuchao; Han, Xiu; Yang, Chen; Wu, Zhenghong; Qi, Xiaole

    2017-09-01

    In this study, we developed bio-stimuli-responsive multi-scale hyaluronic acid (HA) nanoparticles encapsulating polyamidoamine (PAMAM) dendrimers as the subunits. These large-scale HA/PAMAM nanoparticles (197.10 ± 3.00 nm) were stable during systemic circulation and then enriched at tumor sites; however, they were prone to degradation by the highly expressed hyaluronidase (HAase), releasing the inner PAMAM dendrimers and regaining a small scale (5.77 ± 0.25 nm) with positive charge. In a tumor spheroid penetration assay on A549 3D tumor spheroids for 8 h, the fluorescein isothiocyanate (FITC)-labeled multi-scale HA/PAMAM-FITC nanoparticles penetrated deeply into the spheroids upon degradation by HAase. Moreover, small-animal imaging in male nude mice bearing H22 tumors showed that HA/PAMAM-FITC nanoparticles had more prolonged systemic circulation than both PAMAM-FITC nanoparticles and free FITC. In addition, after intravenous administration in mice bearing H22 tumors, methotrexate (MTX)-loaded multi-scale HA/PAMAM-MTX nanoparticles exhibited 2.68-fold greater antitumor activity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Network Access to Visual Information: A Study of Costs and Uses.

    ERIC Educational Resources Information Center

    Besser, Howard

    This paper summarizes a subset of the findings of a study of digital image distribution that focused on the Museum Educational Site Licensing (MESL) project--the first large-scale multi-institutional project to explore digital delivery of art images and accompanying text/metadata from disparate sources. This Mellon Foundation-sponsored study…

  2. Estimating occupancy in large landscapes: evaluation of amphibian monitoring in the greater Yellowstone ecosystem

    USGS Publications Warehouse

    Gould, William R.; Patla, Debra A.; Daley, Rob; Corn, Paul Stephen; Hossack, Blake R.; Bennetts, Robert E.; Peterson, Charles R.

    2012-01-01

    Monitoring of natural resources is crucial to ecosystem conservation, and yet it can pose many challenges. Annual surveys for amphibian breeding occupancy were conducted in Yellowstone and Grand Teton National Parks over a 4-year period (2006–2009) at two scales: catchments (portions of watersheds) and individual wetland sites. Catchments were selected in a stratified random sample with habitat quality and ease of access serving as strata. All known wetland sites with suitable habitat were surveyed within selected catchments. Changes in breeding occurrence of tiger salamanders, boreal chorus frogs, and Columbia-spotted frogs were assessed using multi-season occupancy estimation. Numerous a priori models were considered within an information theoretic framework including those with catchment and site-level covariates. Habitat quality was the most important predictor of occupancy. Boreal chorus frogs demonstrated the greatest increase in breeding occupancy at the catchment level. Larger changes for all 3 species were detected at the finer site-level scale. Connectivity of sites explained occupancy rates more than other covariates, and may improve understanding of the dynamic processes occurring among wetlands within this ecosystem. Our results suggest monitoring occupancy at two spatial scales within large study areas is feasible and informative.
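
    For reference, multi-season occupancy estimation of the kind used here is conventionally parameterized with colonization and local extinction probabilities (the MacKenzie et al. dynamic occupancy model); the record does not spell out the recursion, so only the standard form is reproduced:

    ```latex
    % Standard dynamic (multi-season) occupancy recursion (not quoted from the paper):
    % \psi_t = occupancy probability in season t, \gamma_t = colonization probability,
    % \epsilon_t = local extinction probability; detection probability is modeled separately.
    \[
      \psi_{t+1} = \psi_t \,(1 - \epsilon_t) + (1 - \psi_t)\,\gamma_t .
    \]
    ```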

  3. Abundance and local-scale processes contribute to multi-phyla gradients in global marine diversity

    PubMed Central

    Edgar, Graham J.; Alexander, Timothy J.; Lefcheck, Jonathan S.; Bates, Amanda E.; Kininmonth, Stuart J.; Thomson, Russell J.; Duffy, J. Emmett; Costello, Mark J.; Stuart-Smith, Rick D.

    2017-01-01

    Among the most enduring ecological challenges is an integrated theory explaining the latitudinal biodiversity gradient, including discrepancies observed at different spatial scales. Analysis of Reef Life Survey data for 4127 marine species at 2406 coral and rocky sites worldwide confirms that the total ecoregion richness peaks in low latitudes, near +15°N and −15°S. However, although richness at survey sites is maximal near the equator for vertebrates, it peaks at high latitudes for large mobile invertebrates. Site richness for different groups is dependent on abundance, which is in turn correlated with temperature for fishes and nutrients for macroinvertebrates. We suggest that temperature-mediated fish predation and herbivory have constrained mobile macroinvertebrate diversity at the site scale across the tropics. Conversely, at the ecoregion scale, richness responds positively to coral reef area, highlighting potentially huge global biodiversity losses with coral decline. Improved conservation outcomes require management frameworks, informed by hierarchical monitoring, that cover differing site- and regional-scale processes across diverse taxa, including attention to invertebrate species, which appear disproportionately threatened by warming seas. PMID:29057321

  4. Abundance and local-scale processes contribute to multi-phyla gradients in global marine diversity.

    PubMed

    Edgar, Graham J; Alexander, Timothy J; Lefcheck, Jonathan S; Bates, Amanda E; Kininmonth, Stuart J; Thomson, Russell J; Duffy, J Emmett; Costello, Mark J; Stuart-Smith, Rick D

    2017-10-01

    Among the most enduring ecological challenges is an integrated theory explaining the latitudinal biodiversity gradient, including discrepancies observed at different spatial scales. Analysis of Reef Life Survey data for 4127 marine species at 2406 coral and rocky sites worldwide confirms that the total ecoregion richness peaks in low latitudes, near +15°N and -15°S. However, although richness at survey sites is maximal near the equator for vertebrates, it peaks at high latitudes for large mobile invertebrates. Site richness for different groups is dependent on abundance, which is in turn correlated with temperature for fishes and nutrients for macroinvertebrates. We suggest that temperature-mediated fish predation and herbivory have constrained mobile macroinvertebrate diversity at the site scale across the tropics. Conversely, at the ecoregion scale, richness responds positively to coral reef area, highlighting potentially huge global biodiversity losses with coral decline. Improved conservation outcomes require management frameworks, informed by hierarchical monitoring, that cover differing site- and regional-scale processes across diverse taxa, including attention to invertebrate species, which appear disproportionately threatened by warming seas.

  5. Moon-based Earth Observation for Large Scale Geoscience Phenomena

    NASA Astrophysics Data System (ADS)

    Guo, Huadong; Liu, Guang; Ding, Yixing

    2016-07-01

    The capability of Earth observation for large, global-scale natural phenomena needs to be improved, and new observing platforms are expected. We have studied the concept of the Moon as an Earth observation platform in recent years. Compared with man-made satellite platforms, Moon-based Earth observation can obtain multi-spherical, full-band, active and passive information, and has the following advantages: large observation range, variable view angle, long-term continuous observation, and an extra-long life cycle, with the characteristics of longevity, consistency, integrity, stability and uniqueness. Moon-based Earth observation is suitable for monitoring large-scale geoscience phenomena, including large-scale atmosphere change, large-scale ocean change, large-scale land surface dynamic change, solid Earth dynamic change, etc. For the purpose of establishing a Moon-based Earth observation platform, we plan to study the following five aspects: mechanisms and models of Moon-based observation of macroscopic Earth science phenomena; optimization of sensor parameters and methods for Moon-based Earth observation; site selection and environment of Moon-based Earth observation; the Moon-based Earth observation platform; and a fundamental scientific framework for Moon-based Earth observation.

  6. Comparison of rangeland vegetation sampling techniques in the Central Grasslands

    USGS Publications Warehouse

    Stohlgren, T.J.; Bull, K.A.; Otsuki, Yuka

    1998-01-01

    Maintaining native plant diversity, detecting exotic species, and monitoring rare species are becoming important objectives in rangeland conservation. Four rangeland vegetation sampling techniques were compared to see how well they captured local plant diversity. The methods tested included the commonly used Parker transects, Daubenmire transects as modified by the USDA Forest Service, a new transect and 'large quadrat' design proposed by the USDA Agricultural Research Service, and the Modified-Whittaker multi-scale vegetation plot. The 4 methods were superimposed in shortgrass steppe, mixed grass prairie, northern mixed prairie, and tallgrass prairie in the Central Grasslands of the United States with 4 replicates in each prairie type. Analysis of variance tests showed significant method effects and prairie type effects, but no significant method X type interactions for total species richness, the number of native species, the number of species with less than 1% cover, and the time required for sampling. The methods behaved similarly in each prairie type under a wide variety of grazing regimens. The Parker, large quadrat, and Daubenmire transects significantly underestimated the total species richness and the number of native species in each prairie type, and the number of species with less than 1% cover in all but the tallgrass prairie type. The transect techniques also consistently missed half the exotic species, including noxious weeds, in each prairie type. The Modified-Whittaker method, which included an exhaustive search for plant species in a 20 x 50 m plot, served as the baseline for species richness comparisons. For all prairie types, the Modified-Whittaker plot captured an average of 42 (± 2.4; 1 S.E.) plant species per site compared to 15.9 (± 1.3), 18.9 (± 1.2), and 22.8 (± 1.6) plant species per site using the Parker, large quadrat, and Daubenmire transect methods, respectively. The 4 methods captured most of the dominant species at each site and thus produced similar results for total foliar cover and soil cover. The detection and measurement of exotic plant species were greatly enhanced by using ten 1 m2 subplots in a multi-scale sampling design and searching a larger area (1,000 m2) at each site. Even with 4 replicate sites, the transect methods usually captured, and thus would monitor, 36 to 66% of the plant species at each site. To evaluate the status and trends of common, rare, and exotic plant species at local, regional, and national scales, innovative, multi-scale methods must replace the commonly used transect methods of the past.

  7. Knowledge-Guided Robust MRI Brain Extraction for Diverse Large-Scale Neuroimaging Studies on Humans and Non-Human Primates

    PubMed Central

    Wang, Yaping; Nie, Jingxin; Yap, Pew-Thian; Li, Gang; Shi, Feng; Geng, Xiujuan; Guo, Lei; Shen, Dinggang

    2014-01-01

    Accurate and robust brain extraction is a critical step in most neuroimaging analysis pipelines. In particular, for the large-scale multi-site neuroimaging studies involving a significant number of subjects with diverse age and diagnostic groups, accurate and robust extraction of the brain automatically and consistently is highly desirable. In this paper, we introduce population-specific probability maps to guide the brain extraction of diverse subject groups, including both healthy and diseased adult human populations, both developing and aging human populations, as well as non-human primates. Specifically, the proposed method combines an atlas-based approach, for coarse skull-stripping, with a deformable-surface-based approach that is guided by local intensity information and population-specific prior information learned from a set of real brain images for more localized refinement. Comprehensive quantitative evaluations were performed on the diverse large-scale populations of ADNI dataset with over 800 subjects (55∼90 years of age, multi-site, various diagnosis groups), OASIS dataset with over 400 subjects (18∼96 years of age, wide age range, various diagnosis groups), and NIH pediatrics dataset with 150 subjects (5∼18 years of age, multi-site, wide age range as a complementary age group to the adult dataset). The results demonstrate that our method consistently yields the best overall results across almost the entire human life span, with only a single set of parameters. To demonstrate its capability to work on non-human primates, the proposed method is further evaluated using a rhesus macaque dataset with 20 subjects. Quantitative comparisons with popularly used state-of-the-art methods, including BET, Two-pass BET, BET-B, BSE, HWA, ROBEX and AFNI, demonstrate that the proposed method performs favorably with superior performance on all testing datasets, indicating its robustness and effectiveness. PMID:24489639

  8. Addressing Methodological Challenges in Large Communication Datasets: Collecting and Coding Longitudinal Interactions in Home Hospice Cancer Care

    PubMed Central

    Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee

    2015-01-01

    In this paper, we present strategies for collecting and coding a large longitudinal communication dataset collected across multiple sites, consisting of over 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication datasets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multi-site secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges have the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a “how-to” example for managing large, digitally-recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research. PMID:26580414

  9. Towards large scale multi-target tracking

    NASA Astrophysics Data System (ADS)

    Vo, Ba-Ngu; Vo, Ba-Tuong; Reuter, Stephan; Lam, Quang; Dietmayer, Klaus

    2014-06-01

    Multi-target tracking is intrinsically an NP-hard problem, and the complexity of multi-target tracking solutions usually does not scale gracefully with problem size. Multi-target tracking for on-line applications involving a large number of targets is extremely challenging. This article demonstrates the capability of the random finite set approach to provide large-scale multi-target tracking algorithms. In particular, it is shown that an approximate filter known as the labeled multi-Bernoulli filter can simultaneously track one thousand five hundred targets in clutter on a standard laptop computer.

  10. Multi-scale habitat selection in highly territorial bird species: Exploring the contribution of nest, territory and landscape levels to site choice in breeding rallids (Aves: Rallidae)

    NASA Astrophysics Data System (ADS)

    Jedlikowski, Jan; Chibowski, Piotr; Karasek, Tomasz; Brambilla, Mattia

    2016-05-01

    Habitat selection often involves choices made at different spatial scales, but the underlying mechanisms are still poorly understood, and studies that investigate the relative importance of individual scales are rare. We investigated the effect of three spatial scales (landscape, territory, nest-site) on the occurrence pattern of little crake Zapornia parva and water rail Rallus aquaticus at 74 ponds in the Masurian Lakeland, Poland. Habitat structure, food abundance and water chemical parameters were measured at nests and random points within landscape plots (from 300-m to 50-m radius), territory (14-m) and nest-site plots (3-m). Regression analyses suggested that the most relevant scale was territory level, followed by landscape, and finally by nest-site for both species. Variation partitioning confirmed this pattern for water rail, but also highlighted the importance of nest-site (the level explaining the highest share of unique variation) for little crake. The most important variables determining the occurrence of both species were water body fragmentation (landscape), vegetation density (territory) and water depth (at territory level for little crake, and at nest-site level for water rail). Finally, for both species multi-scale models including factors from different levels were more parsimonious than single-scale ones, i.e. habitat selection was likely a multi-scale process. The importance of particular spatial scales seemed more related to life-history traits than to the extent of the scales considered. In the case of our study species, the territory level was highly important likely because both rallids have to obtain all the resources they need (nest site, food and mates) in relatively small areas, the multi-purpose territories they defend.

  11. A multi-scale comparison of trait linkages to environmental and spatial variables in fish communities across a large freshwater lake.

    PubMed

    Strecker, Angela L; Casselman, John M; Fortin, Marie-Josée; Jackson, Donald A; Ridgway, Mark S; Abrams, Peter A; Shuter, Brian J

    2011-07-01

    Species present in communities are affected by the prevailing environmental conditions, and the traits that these species display may be sensitive indicators of community responses to environmental change. However, interpretation of community responses may be confounded by environmental variation at different spatial scales. Using a hierarchical approach, we assessed the spatial and temporal variation of traits in coastal fish communities in Lake Huron over a 5-year time period (2001-2005) in response to biotic and abiotic environmental factors. The association of environmental and spatial variables with trophic, life-history, and thermal traits at two spatial scales (regional basin-scale, local site-scale) was quantified using multivariate statistics and variation partitioning. We defined these two scales (regional, local) on which to measure variation and then applied this measurement framework identically in all 5 study years. With this framework, we found that there was no change in the spatial scales of fish community traits over the course of the study, although there were small inter-annual shifts in the importance of regional basin- and local site-scale variables in determining community trait composition (e.g., life-history, trophic, and thermal). The overriding effects of regional-scale variables may be related to inter-annual variation in average summer temperature. Additionally, drivers of fish community traits were highly variable among study years, with some years dominated by environmental variation and others dominated by spatially structured variation. The influence of spatial factors on trait composition was dynamic, which suggests that spatial patterns in fish communities over large landscapes are transient. Air temperature and vegetation were significant variables in most years, underscoring the importance of future climate change and shoreline development as drivers of fish community structure. Overall, a trait-based hierarchical framework may be a useful conservation tool, as it highlights the multi-scaled interactive effect of variables over a large landscape.
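
    Variation partitioning between environmental (E) and spatial (S) predictor sets, as named in the abstract, is conventionally computed from adjusted R² values of three constrained-ordination models; the fractions below follow the standard Borcard-type scheme and are not quoted from the paper.

    ```latex
    % Standard two-set variation partitioning with adjusted R^2 (not quoted from the paper):
    \[
      [a{+}b{+}c] = R^2_{\mathrm{adj}}(Y \sim E + S), \quad
      [a{+}b] = R^2_{\mathrm{adj}}(Y \sim E), \quad
      [b{+}c] = R^2_{\mathrm{adj}}(Y \sim S),
    \]
    \[
      [a] = [a{+}b{+}c] - [b{+}c], \quad
      [c] = [a{+}b{+}c] - [a{+}b], \quad
      [b] = [a{+}b] + [b{+}c] - [a{+}b{+}c], \quad
      [d] = 1 - [a{+}b{+}c],
    \]
    % where [a] is the pure environmental fraction, [c] the pure spatial fraction,
    % [b] the shared fraction, and [d] the unexplained variation.
    ```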

  12. Statistical Downscaling in Multi-dimensional Wave Climate Forecast

    NASA Astrophysics Data System (ADS)

    Camus, P.; Méndez, F. J.; Medina, R.; Losada, I. J.; Cofiño, A. S.; Gutiérrez, J. M.

    2009-04-01

    Wave climate at a particular site is defined by the statistical distribution of sea state parameters, such as significant wave height, mean wave period, mean wave direction, wind velocity, wind direction and storm surge. Nowadays, long-term time series of these parameters are available from reanalysis databases obtained with numerical models. The Self-Organizing Map (SOM) technique is applied to characterize the multi-dimensional wave climate, obtaining the relevant "wave types" spanning the historical variability. This technique summarizes the multiple dimensions of wave climate in terms of a set of clusters projected onto a low-dimensional lattice with a spatial organization, providing Probability Density Functions (PDFs) on the lattice. On the other hand, wind and storm surge depend on the instantaneous local large-scale sea level pressure (SLP) fields, while waves depend on the recent history of these fields (say, 1 to 5 days). Thus, these variables are associated with large-scale atmospheric circulation patterns. In this work, a nearest-neighbors analog method is used to predict monthly multi-dimensional wave climate. This method establishes relationships between the large-scale atmospheric circulation patterns from numerical models (SLP fields as predictors) and local wave databases of observations (monthly wave climate SOM PDFs as the predictand) to set up statistical models. A wave reanalysis database developed by Puertos del Estado (Ministerio de Fomento) is used as the historical time series of local variables. The simultaneous SLP fields calculated by the NCEP atmospheric reanalysis are used as predictors. Several applications with different sizes of the sea level pressure grid and different temporal resolutions are compared to obtain the optimal statistical model that best represents the monthly wave climate at a particular site. In this work we examine the potential skill of this downscaling approach under perfect-model conditions, but we also analyze the suitability of this methodology for seasonal forecasting and for long-term climate change scenario projections of wave climate.
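
    The nearest-neighbors analog step described, matching a new large-scale SLP field to its most similar historical fields and averaging their local wave-climate PDFs, can be sketched as below; the array shapes and the Euclidean distance are assumptions made for illustration.

    ```python
    # Illustrative sketch: k-nearest-neighbor analog downscaling. Flattened historical SLP
    # fields are the predictors; the predictand for each historical month is its wave-climate
    # PDF over the SOM lattice. Distance metric and shapes are assumed.
    import numpy as np

    def analog_predict(slp_new, slp_history, pdf_history, k=10):
        """slp_new: (n_gridpoints,) field to downscale; slp_history: (n_months, n_gridpoints);
        pdf_history: (n_months, n_som_nodes). Returns the k-analog mean PDF, renormalized."""
        distances = np.linalg.norm(slp_history - slp_new, axis=1)   # distance to every month
        nearest = np.argsort(distances)[:k]                         # indices of the k best analogs
        pdf = pdf_history[nearest].mean(axis=0)
        return pdf / pdf.sum()

    # Placeholder example: 240 historical months, 500 SLP grid points, a 10x10 SOM lattice
    rng = np.random.default_rng(0)
    slp_hist = rng.normal(size=(240, 500))
    pdf_hist = rng.dirichlet(np.ones(100), size=240)
    print(analog_predict(slp_hist[0] + 0.01 * rng.normal(size=500), slp_hist, pdf_hist).shape)
    ```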

  13. Intensive agriculture erodes β-diversity at large scales.

    PubMed

    Karp, Daniel S; Rominger, Andrew J; Zook, Jim; Ranganathan, Jai; Ehrlich, Paul R; Daily, Gretchen C

    2012-09-01

    Biodiversity is declining from unprecedented land conversions that replace diverse, low-intensity agriculture with vast expanses under homogeneous, intensive production. Despite documented losses of species richness, consequences for β-diversity, changes in community composition between sites, are largely unknown, especially in the tropics. Using a 10-year data set on Costa Rican birds, we find that low-intensity agriculture sustained β-diversity across large scales on a par with forest. In high-intensity agriculture, low local (α) diversity inflated β-diversity as a statistical artefact. Therefore, at small spatial scales, intensive agriculture appeared to retain β-diversity. Unlike in forest or low-intensity systems, however, high-intensity agriculture also homogenised vegetation structure over large distances, thereby decoupling the fundamental ecological pattern of bird communities changing with geographical distance. This ~40% decline in species turnover indicates a significant decline in β-diversity at large spatial scales. These findings point the way towards multi-functional agricultural systems that maintain agricultural productivity while simultaneously conserving biodiversity. © 2012 Blackwell Publishing Ltd/CNRS.
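
    A minimal sketch of the distance-decay diagnostic implied above, assuming hypothetical abundance and coordinate arrays: pairwise Bray-Curtis dissimilarity between site-level bird communities is regressed against geographic distance, and a slope near zero would indicate the homogenization described for high-intensity agriculture.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import linregress

rng = np.random.default_rng(1)
abundances = rng.poisson(3, size=(30, 80)).astype(float)   # 30 sites x 80 species
coords = rng.uniform(0, 100, size=(30, 2))                 # site coordinates, km

beta = pdist(abundances, metric='braycurtis')   # pairwise community dissimilarity
dist = pdist(coords, metric='euclidean')        # pairwise geographic distance, km

# Slope of dissimilarity vs. distance: a slope near zero indicates biotic
# homogenization (community composition no longer changes with distance).
res = linregress(dist, beta)
print(f"distance-decay slope = {res.slope:.4f} per km (r = {res.rvalue:.2f})")
```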

  14. Evolution of Precipitation Structure During the November DYNAMO MJO Event: Cloud-Resolving Model Intercomparison and Cross Validation Using Radar Observations

    NASA Astrophysics Data System (ADS)

    Li, Xiaowen; Janiga, Matthew A.; Wang, Shuguang; Tao, Wei-Kuo; Rowe, Angela; Xu, Weixin; Liu, Chuntao; Matsui, Toshihisa; Zhang, Chidong

    2018-04-01

    The evolution of precipitation structures is simulated and compared with radar observations for the November Madden-Julian Oscillation (MJO) event during the DYNAmics of the MJO (DYNAMO) field campaign. Three ground-based, ship-borne, and spaceborne precipitation radars and three cloud-resolving models (CRMs) driven by observed large-scale forcing are used to study precipitation structures at different locations over the central equatorial Indian Ocean. Convective strength is represented by 0-dBZ echo-top heights, and convective organization by contiguous 17-dBZ areas. The multi-radar and multi-model framework allows for more stringent model validations. The emphasis is on testing models' ability to simulate subtle differences observed at different radar sites when the MJO event passed through. The results show that CRMs forced by site-specific large-scale forcing can reproduce not only common features in cloud populations but also subtle variations observed by different radars. The comparisons also reveal common deficiencies in CRM simulations, which underestimate radar echo-top heights for the strongest convection within large, organized precipitation features. Cross validations with multiple radars and models also enable quantitative comparisons in CRM sensitivity studies using different large-scale forcing, microphysical schemes and parameters, resolutions, and domain sizes. In terms of radar echo-top height temporal variations, many model sensitivity tests have better correlations than radar/model comparisons, indicating robust model performance in this respect. It is further shown that well-validated model simulations could be used to constrain uncertainties in observed echo-top heights when the low-resolution surveillance scanning strategy is used.
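
    A minimal sketch of the two structure metrics named above (0-dBZ echo-top height and contiguous 17-dBZ areas), computed here from a hypothetical gridded reflectivity volume rather than the actual radar data:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
# Hypothetical reflectivity volume: (nz, ny, nx) in dBZ on 0.5 km vertical levels.
refl = rng.normal(-5, 15, size=(40, 100, 100))
heights_km = 0.5 * (np.arange(40) + 1)

# 0-dBZ echo-top height: highest level in each column with reflectivity >= 0 dBZ.
mask0 = refl >= 0.0
top_index = (mask0.shape[0] - 1) - np.argmax(mask0[::-1], axis=0)
echo_top = np.where(mask0.any(axis=0), heights_km[top_index], np.nan)

# Convective organization: sizes of contiguous >= 17 dBZ areas in a low-level scan.
labels, nfeat = ndimage.label(refl[2] >= 17.0)
areas_km2 = np.bincount(labels.ravel())[1:]      # grid cells per feature (1 km^2 each)
print(np.nanmax(echo_top), nfeat, areas_km2.max() if nfeat else 0)
```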

  15. The Challenges and Benefits of Employing a Mobile Research Fellow to Facilitate Team Work on a Large, Interdisciplinary, Multi-Sited Project

    ERIC Educational Resources Information Center

    Sugden, Fraser; Punch, Samantha

    2014-01-01

    Over the last few years research funding has increasingly moved in favour of large, multi-partner, interdisciplinary and multi-site research projects. This article explores the benefits and challenges of employing a full-time research fellow to work across multiple field sites, with all the local research teams, on an international,…

  16. The Emerging Role of the Data Base Manager. Report No. R-1253-PR.

    ERIC Educational Resources Information Center

    Sawtelle, Thomas K.

    The Air Force Logistics Command (AFLC) is revising and enhancing its data-processing capabilities with the development of a large-scale, multi-site, on-line, integrated data base information system known as the Advanced Logistics System (ALS). A data integrity program is to be built around a Data Base Manager (DBM), an individual or a group of…

  17. Large area sub-micron chemical imaging of magnesium in sea urchin teeth.

    PubMed

    Masic, Admir; Weaver, James C

    2015-03-01

    The heterogeneous and site-specific incorporation of inorganic ions can profoundly influence the local mechanical properties of damage tolerant biological composites. Using the sea urchin tooth as a research model, we describe a multi-technique approach to spatially map the distribution of magnesium in this complex multiphase system. Through the combined use of 16-bit backscattered scanning electron microscopy, multi-channel energy dispersive spectroscopy elemental mapping, and diffraction-limited confocal Raman spectroscopy, we demonstrate a new set of high throughput, multi-spectral, high resolution methods for the large scale characterization of mineralized biological materials. In addition, instrument hardware and data collection protocols can be modified such that several of these measurements can be performed on irregularly shaped samples with complex surface geometries and without the need for extensive sample preparation. Using these approaches, in conjunction with whole animal micro-computed tomography studies, we have been able to spatially resolve micron and sub-micron structural features across macroscopic length scales on entire urchin tooth cross-sections and correlate these complex morphological features with local variability in elemental composition. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Natural Tracers and Multi-Scale Assessment of Caprock Sealing Behavior: A Case Study of the Kirtland Formation, San Juan Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason Heath; Brian McPherson; Thomas Dewers

    The assessment of caprocks for geologic CO₂ storage is a multi-scale endeavor. Investigation of a regional caprock - the Kirtland Formation, San Juan Basin, USA - at the pore-network scale indicates high capillary sealing capacity and low permeabilities. Core and well-scale data, however, indicate a potential seal bypass system as evidenced by multiple mineralized fractures and methane gas saturations within the caprock. Our interpretation of ⁴He concentrations, measured at the top and bottom of the caprock, suggests low fluid fluxes through the caprock: (1) Of the total ⁴He produced in situ (i.e., at the locations of sampling) by uranium and thorium decay since deposition of the Kirtland Formation, a large portion still resides in the pore fluids. (2) Simple advection-only and advection-diffusion models, using the measured ⁴He concentrations, indicate low permeability (approximately 10⁻²⁰ m² or lower) for the thickness of the Kirtland Formation. These findings, however, do not guarantee the lack of a large-scale bypass system. The measured data, located near the boundary conditions of the models (i.e., the overlying and underlying aquifers), limit our testing of conceptual models and the sensitivity of model parameterization. Thus, we suggest approaches for future studies to better assess the presence or lack of a seal bypass system at this particular site and for other sites in general.
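
    A minimal sketch of the kind of one-dimensional steady advection-diffusion profile used to interpret tracer concentrations across a caprock, with all numbers hypothetical rather than the study's calibrated values: for fixed boundary concentrations the profile shape is set by the Péclet number Pe = vL/D, and a near-linear, diffusion-dominated profile implies a very small vertical flux.

```python
import numpy as np

def steady_profile(z, L, v, D, c_top, c_bottom):
    """Steady 1-D advection-diffusion concentration between fixed boundaries.

    Solves D*c'' - v*c' = 0 with c(0) = c_top, c(L) = c_bottom; in-situ production
    is neglected in this sketch. Pe = v*L/D controls the shape of the profile.
    """
    pe = v * L / D
    if abs(pe) < 1e-12:                      # pure diffusion -> linear profile
        return c_top + (c_bottom - c_top) * z / L
    return c_top + (c_bottom - c_top) * np.expm1(pe * z / L) / np.expm1(pe)

# Hypothetical numbers: a 600 m thick caprock, a helium diffusivity of ~1e-10 m^2/s,
# and a candidate Darcy flux of 1e-13 m/s (roughly what a ~1e-20 m^2 permeability
# would permit under a modest head gradient).
L, D, v = 600.0, 1e-10, 1e-13
z = np.linspace(0.0, L, 5)
print("Pe =", v * L / D)
print(steady_profile(z, L, v, D, c_top=1.0, c_bottom=10.0))
```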

  19. Quantification of source impact to PM using three-dimensional weighted factor model analysis on multi-site data

    NASA Astrophysics Data System (ADS)

    Shi, Guoliang; Peng, Xing; Huangfu, Yanqi; Wang, Wei; Xu, Jiao; Tian, Yingze; Feng, Yinchang; Ivey, Cesunica E.; Russell, Armistead G.

    2017-07-01

    Source apportionment technologies are used to understand the impacts of important sources on particulate matter (PM) air quality, and are widely used for both scientific studies and air quality management. Generally, receptor models apportion speciated PM data from a single sampling site. With the development of large-scale monitoring networks, PM speciation is observed at multiple sites in an urban area. For these situations, the models should account for three factors, or dimensions, of the PM data, including the chemical species concentrations, sampling periods and sampling site information, suggesting the potential power of a three-dimensional source apportionment approach. However, the principle of the three-dimensional Parallel Factor Analysis (ordinary PARAFAC) model does not always work well in real environmental situations for multi-site receptor datasets. In this work, a new three-way receptor model, called the "multi-site three-way factor analysis" (multi-site WFA3) model, is proposed to deal with multi-site receptor datasets. Synthetic datasets were developed and introduced into the new model to test its performance. The average absolute error (AAE, between estimated and true contributions) was less than 50% for all extracted sources. Additionally, three-dimensional ambient datasets from a Chinese mega-city, Chengdu, were analyzed using this new model to assess the application. Four factors are extracted by the multi-site WFA3 model: secondary sources have the highest contributions (64.73 and 56.24 μg/m3), followed by vehicular exhaust (30.13 and 33.60 μg/m3), crustal dust (26.12 and 29.99 μg/m3) and coal combustion (10.73 and 14.83 μg/m3). The model was also compared to PMF, with general agreement, though PMF suggested a lower crustal contribution.
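
    A minimal sketch of the ordinary three-way PARAFAC baseline mentioned above, assuming the tensorly library and a hypothetical species x time x site array; the weighted multi-site WFA3 formulation of the paper is not reproduced here.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import non_negative_parafac

rng = np.random.default_rng(3)
# Hypothetical multi-site receptor data: species x sampling periods x sites.
X = tl.tensor(rng.random((20, 60, 2)) * 10.0)

# Rank-4 non-negative PARAFAC: factors[0] ~ source profiles (species loadings),
# factors[1] ~ time-varying contributions, factors[2] ~ site weights.
weights, factors = non_negative_parafac(X, rank=4, n_iter_max=500, init='random')
profiles, contributions, site_weights = factors
print(profiles.shape, contributions.shape, site_weights.shape)  # (20, 4) (60, 4) (2, 4)
```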

  20. A Framework for Multi-Scale, Multi-Disciplinary Arctic Terrestrial Field Research Design, Nomenclature and Data Management

    NASA Astrophysics Data System (ADS)

    Charsley-Groffman, L.; Killeffer, T.; Wullschleger, S. D.; Wilson, C. J.

    2016-12-01

    The Next Generation Ecosystem Experiment (NGEE Arctic) project aims to improve the representation of arctic terrestrial processes and properties in Earth System Models (ESMs) through coordinated, multi-disciplinary, field-based observations and experiments. NGEE involves nearly one hundred research staff, post docs and students from multiple DOE laboratories and universities who deploy a wide range of in-situ and remote field observation techniques to quantify and understand interactions between the climate system and surface and subsurface coupled thermal-hydrologic, biogeochemical and vegetation processes. Careful attention was given to the design and management of co-located long-term and one-off data collection efforts, as well as their data streams. Field research sites at the Barrow Environmental Observatory near Barrow, AK and on the Seward Peninsula were designed around the concept of "ecotypes", which co-evolved with readily identified and classified hydro-geomorphic features characteristic of arctic landscapes. NGEE sub-teams focused on five distinct science questions collaborated to design field sites and to develop naming conventions for locations and data types, producing coherent data sets used to parameterize, initialize and test a range of models, from site-specific process-resolving models to ESMs. Multi-layer mapping products were a critical means of developing a coordinated and coherent observation design, and a centralized data portal and data reporting framework was critical to ensuring meaningful data products for NGEE modelers and the Arctic scientific community at large. We present examples of what works and lessons learned for a large multi-disciplinary terrestrial observational research project in the Arctic.

  1. Show me the data: advances in multi-model benchmarking, assimilation, and forecasting

    NASA Astrophysics Data System (ADS)

    Dietze, M.; Raiho, A.; Fer, I.; Cowdery, E.; Kooper, R.; Kelly, R.; Shiklomanov, A. N.; Desai, A. R.; Simkins, J.; Gardella, A.; Serbin, S.

    2016-12-01

    Researchers want their data to inform carbon cycle predictions, but there are considerable bottlenecks between data collection and the use of data to calibrate and validate earth system models and inform predictions. This talk highlights recent advancements in the PEcAn project aimed at making it easier for individual researchers to confront models with their own data: (1) the development of an easily extensible site-scale benchmarking system aimed at ensuring that models capture process rather than just reproducing pattern; (2) efficient emulator-based Bayesian parameter data assimilation to constrain model parameters; (3) a novel, generalized approach to ensemble data assimilation to estimate carbon pools and fluxes and quantify process error; (4) automated processing and downscaling of CMIP climate scenarios to support forecasts that include driver uncertainty; (5) a large expansion in the number of models supported, with new tools for conducting multi-model and multi-site analyses; and (6) a network-based architecture that allows analyses to be shared with model developers and other collaborators. Application of these methods is illustrated with data across a wide range of time scales, from eddy covariance to forest inventories to tree rings to paleoecological pollen proxies.

  2. MRIQC: Advancing the automatic prediction of image quality in MRI from unseen sites

    PubMed Central

    2017-01-01

    Quality control of MRI is essential for excluding problematic acquisitions and avoiding bias in subsequent image processing and analysis. Visual inspection is subjective and impractical for large-scale datasets. Although automated quality assessments have been demonstrated on single-site datasets, it is unclear that solutions can generalize to unseen data acquired at new sites. Here, we introduce the MRI Quality Control tool (MRIQC), a tool for extracting quality measures and fitting a binary (accept/exclude) classifier. Our tool can be run both locally and as a free online service via the OpenNeuro.org portal. The classifier is trained on a publicly available, multi-site dataset (17 sites, N = 1102). We perform model selection evaluating different normalization and feature exclusion approaches aimed at maximizing across-site generalization, and estimate an accuracy of 76%±13% on new sites using leave-one-site-out cross-validation. We confirm that result on a held-out dataset (2 sites, N = 265), also obtaining 76% accuracy. Even though the performance of the trained classifier is statistically above chance, we show that it is susceptible to site effects and unable to account for artifacts specific to new sites. MRIQC performs with high accuracy in intra-site prediction, but performance on unseen sites leaves room for improvement, which may require more labeled data and new approaches to between-site variability. Overcoming these limitations is crucial for a more objective quality assessment of neuroimaging data, and to enable the analysis of extremely large and multi-site samples. PMID:28945803
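
    A minimal sketch of the leave-one-site-out evaluation described above, assuming scikit-learn, hypothetical image-quality metrics, and a random forest standing in for MRIQC's actual classifier:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(4)
# Hypothetical image-quality-metric table: 300 scans x 60 IQMs, from 6 sites.
X = rng.normal(size=(300, 60))
y = rng.integers(0, 2, size=300)          # 1 = exclude, 0 = accept
site = rng.integers(0, 6, size=300)       # site label for each scan

# Leave-one-site-out: each fold trains on 5 sites and tests on the held-out site,
# which is what exposes the between-site generalization gap discussed above.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, groups=site, cv=LeaveOneGroupOut())
print(scores.round(2), scores.mean())
```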

  3. Large scale and cloud-based multi-model analytics experiments on climate change data in the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Płóciennik, Marcin; Doutriaux, Charles; Blanquer, Ignacio; Barbera, Roberto; Donvito, Giacinto; Williams, Dean N.; Anantharaj, Valentine; Salomoni, Davide D.; Aloisio, Giovanni

    2017-04-01

    In many scientific domains such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and eco-systems where petabytes (PB) of data can be available and data can be distributed and/or replicated, such as the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). A case study on climate model intercomparison data analysis addressing several classes of multi-model experiments is being implemented in the context of the EU H2020 INDIGO-DataCloud project. Such experiments require the availability of large amounts of data (multi-terabyte scale) related to the output of several climate model simulations as well as the exploitation of scientific data management tools for large-scale data analytics. More specifically, the talk discusses in detail a use case on precipitation trend analysis in terms of requirements, architectural design solution, and infrastructural implementation. The experiment has been tested and validated on CMIP5 datasets, in the context of a large-scale distributed testbed across the EU and US involving three ESGF sites (LLNL, ORNL, and CMCC) and one central orchestrator site (PSNC). The general "environment" of the case study involves: (i) multi-model data analysis inter-comparison challenges, (ii) addressed on CMIP5 data, (iii) made available through the IS-ENES/ESGF infrastructure. The added value of the solution proposed in the INDIGO-DataCloud project is summarized in the following: (i) it implements a different paradigm (from client- to server-side); (ii) it intrinsically reduces data movement; (iii) it makes the end-user setup lightweight; (iv) it fosters re-usability (of data, final/intermediate products, workflows, sessions, etc.) since everything is managed on the server-side; (v) it complements, extends and interoperates with the ESGF stack; (vi) it provides a "tool" for scientists to run multi-model experiments; and finally (vii) it can drastically reduce the time-to-solution for these experiments from weeks to hours. At the time of writing, the proposed testbed represents the first concrete implementation of a distributed multi-model experiment in the ESGF/CMIP context joining server-side and parallel processing, end-to-end workflow management and cloud computing. As opposed to the current scenario based on search & discovery, data download, and client-based data analysis, the INDIGO-DataCloud architectural solution described in this contribution addresses the scientific computing & analytics requirements by providing a paradigm shift based on server-side and high performance big data frameworks jointly with two-level workflow management systems realized at the PaaS level via a cloud infrastructure.

  4. Sensitivity of the Modified Children's Yale-Brown Obsessive Compulsive Scale to Detect Change: Results from Two Multi-Site Trials

    ERIC Educational Resources Information Center

    Scahill, Lawrence; Sukhodolsky, Denis G.; Anderberg, Emily; Dimitropoulos, Anastasia; Dziura, James; Aman, Michael G.; McCracken, James; Tierney, Elaine; Hallett, Victoria; Katz, Karol; Vitiello, Benedetto; McDougle, Christopher

    2016-01-01

    Repetitive behavior is a core feature of autism spectrum disorder. We used 8-week data from two federally funded, multi-site, randomized trials with risperidone conducted by the Research Units on Pediatric Psychopharmacology Autism Network to evaluate the sensitivity of the Children's Yale-Brown Obsessive Compulsive Scale modified for autism…

  5. Classification of riparian forest species and health condition using multi-temporal and hyperspatial imagery from unmanned aerial system.

    PubMed

    Michez, Adrien; Piégay, Hervé; Lisein, Jonathan; Claessens, Hugues; Lejeune, Philippe

    2016-03-01

    Riparian forests are critically endangered by many anthropogenic pressures and natural hazards. The importance of riparian zones has been acknowledged by European Directives, which call for multi-scale monitoring. The use of very-high-resolution, hyperspatial imagery in a multi-temporal approach is an emerging topic. The trend is reinforced by the recent and rapid growth in the use of unmanned aerial systems (UAS), which has prompted the development of innovative methodology. Our study proposes a methodological framework to explore how a set of multi-temporal images acquired during a vegetative period can differentiate some of the deciduous riparian forest species and their health conditions. More specifically, the developed approach intends to identify, through a process of variable selection, which variables derived from UAS imagery and which scale of image analysis are the most relevant to our objectives. The methodological framework is applied to two study sites to describe the riparian forest through two fundamental characteristics: the species composition and the health condition. These characteristics were selected not only because of their use as proxies for the riparian zone ecological integrity but also because of their use for river management. The comparison of various scales of image analysis identified the smallest object-based image analysis (OBIA) objects (ca. 1 m²) as the most relevant scale. Variables derived from spectral information (band ratios) were identified as the most appropriate, followed by variables related to the vertical structure of the forest. Classification results show good overall accuracies for the species composition of the riparian forest (five classes, 79.5 and 84.1% for site 1 and site 2). The classification scenario regarding the health condition of the black alders of site 1 performed best (90.6%). The quality of the classification models developed with a UAS-based, cost-effective, and semi-automatic approach competes successfully with those developed using more expensive imagery, such as multi-spectral and hyperspectral airborne imagery. The high overall accuracy obtained for the classification of the diseased alders opens the door to applications dedicated to monitoring the health condition of riparian forests. Our methodological framework will allow UAS users to manage the large imagery-derived metric datasets produced by such dense time series.
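
    A minimal sketch of the variable-selection idea described above, assuming scikit-learn and hypothetical per-object features (band ratios and a canopy-height statistic); this is not the authors' exact OBIA workflow.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
# Hypothetical per-object features from multi-temporal UAS imagery:
# band ratios per acquisition date plus a canopy-height statistic.
n_objects = 400
features = pd.DataFrame({
    "green_red_ratio_jun": rng.normal(1.1, 0.2, n_objects),
    "green_red_ratio_sep": rng.normal(0.9, 0.2, n_objects),
    "nir_red_ratio_jun": rng.normal(2.5, 0.5, n_objects),
    "canopy_height_p90": rng.normal(18, 5, n_objects),
})
species = rng.integers(0, 5, n_objects)    # five riparian species classes

rf = RandomForestClassifier(n_estimators=300, random_state=0)
print("CV accuracy:", cross_val_score(rf, features, species, cv=5).mean().round(3))

# Variable importance approximates the "which variables matter" question above.
rf.fit(features, species)
for name, imp in sorted(zip(features.columns, rf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name:>24s}  {imp:.3f}")
```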

  6. Tigers Need Cover: Multi-Scale Occupancy Study of the Big Cat in Sumatran Forest and Plantation Landscapes

    PubMed Central

    Sunarto, Sunarto; Kelly, Marcella J.; Parakkasi, Karmila; Klenzendorf, Sybille; Septayuda, Eka; Kurniawan, Harry

    2012-01-01

    The critically endangered Sumatran tiger (Panthera tigris sumatrae Pocock, 1929) is generally known as a forest-dependent animal. With large-scale conversion of forests into plantations, however, it is crucial for restoration efforts to understand to what extent tigers use modified habitats. We investigated tiger-habitat relationships at 2 spatial scales: occupancy across the landscape and habitat use within the home range. Across major landcover types in central Sumatra, we conducted systematic detection/non-detection sign surveys in 47 grid cells of 17×17 km. Within each cell, we surveyed 40 1-km transects and recorded tiger detections and habitat variables in 100-m segments, totaling 1,857 km surveyed. We found that tigers strongly preferred forest and used plantations of acacia and oil palm far less than their availability. Tiger probability of occupancy covaried positively and strongly with altitude, positively with forest area, and negatively with distance-to-forest centroids. At the fine scale, probability of habitat use by tigers across landcover types covaried positively and strongly with understory cover and altitude, and negatively and strongly with human settlement. Within forest areas, tigers strongly preferred sites that are farther from water bodies, higher in altitude, farther from edge, and closer to centroid of large forest block; and strongly preferred sites with thicker understory cover, lower level of disturbance, higher altitude, and steeper slope. These results indicate that to thrive, tigers depend on the existence of large contiguous forest blocks, and that with adjustments in plantation management, tigers could use mosaics of plantations (as additional roaming zones), riparian forests (as corridors) and smaller forest patches (as stepping stones), potentially maintaining a metapopulation structure in fragmented landscapes. This study highlights the importance of a multi-spatial scale analysis and provides crucial information relevant to restoring tigers and other wildlife in forest and plantation landscapes through improvement in habitat extent, quality, and connectivity. PMID:22292063

  7. Tigers need cover: multi-scale occupancy study of the big cat in Sumatran forest and plantation landscapes.

    PubMed

    Sunarto, Sunarto; Kelly, Marcella J; Parakkasi, Karmila; Klenzendorf, Sybille; Septayuda, Eka; Kurniawan, Harry

    2012-01-01

    The critically endangered Sumatran tiger (Panthera tigris sumatrae Pocock, 1929) is generally known as a forest-dependent animal. With large-scale conversion of forests into plantations, however, it is crucial for restoration efforts to understand to what extent tigers use modified habitats. We investigated tiger-habitat relationships at 2 spatial scales: occupancy across the landscape and habitat use within the home range. Across major landcover types in central Sumatra, we conducted systematic detection/non-detection sign surveys in 47 grid cells of 17×17 km. Within each cell, we surveyed 40 1-km transects and recorded tiger detections and habitat variables in 100-m segments, totaling 1,857 km surveyed. We found that tigers strongly preferred forest and used plantations of acacia and oil palm far less than their availability. Tiger probability of occupancy covaried positively and strongly with altitude, positively with forest area, and negatively with distance-to-forest centroids. At the fine scale, probability of habitat use by tigers across landcover types covaried positively and strongly with understory cover and altitude, and negatively and strongly with human settlement. Within forest areas, tigers strongly preferred sites that are farther from water bodies, higher in altitude, farther from edge, and closer to centroid of large forest block; and strongly preferred sites with thicker understory cover, lower level of disturbance, higher altitude, and steeper slope. These results indicate that to thrive, tigers depend on the existence of large contiguous forest blocks, and that with adjustments in plantation management, tigers could use mosaics of plantations (as additional roaming zones), riparian forests (as corridors) and smaller forest patches (as stepping stones), potentially maintaining a metapopulation structure in fragmented landscapes. This study highlights the importance of a multi-spatial scale analysis and provides crucial information relevant to restoring tigers and other wildlife in forest and plantation landscapes through improvement in habitat extent, quality, and connectivity.

  8. Implementation of Cyberinfrastructure and Data Management Workflow for a Large-Scale Sensor Network

    NASA Astrophysics Data System (ADS)

    Jones, A. S.; Horsburgh, J. S.

    2014-12-01

    Monitoring with in situ environmental sensors and other forms of field-based observation presents many challenges for data management, particularly for large-scale networks consisting of multiple sites, sensors, and personnel. The availability and utility of these data in addressing scientific questions relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into functional data products. It also depends on the ability of researchers to share and access the data in useable formats. In addition to addressing the challenges presented by the quantity of data, monitoring networks need practices to ensure high data quality, including procedures and tools for post processing. Data quality is further enhanced if practitioners are able to track equipment, deployments, calibrations, and other events related to site maintenance and associate these details with observational data. In this presentation we will describe the overall workflow that we have developed for research groups and sites conducting long term monitoring using in situ sensors. Features of the workflow include: software tools to automate the transfer of data from field sites to databases, a Python-based program for data quality control post-processing, a web-based application for online discovery and visualization of data, and a data model and web interface for managing physical infrastructure. By automating the data management workflow, the time from collection to analysis is reduced and sharing and publication is facilitated. The incorporation of metadata standards and descriptions and the use of open-source tools enhances the sustainability and reusability of the data. We will describe the workflow and tools that we have developed in the context of the iUTAH (innovative Urban Transitions and Aridregion Hydrosustainability) monitoring network. The iUTAH network consists of aquatic and climate sensors deployed in three watersheds to monitor Gradients Along Mountain to Urban Transitions (GAMUT). The variety of environmental sensors and the multi-watershed, multi-institutional nature of the network necessitate a well-planned and efficient workflow for acquiring, managing, and sharing sensor data, which should be useful for similar large-scale and long-term networks.
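
    A minimal sketch of the kind of automated quality-control post-processing step such a workflow includes, assuming pandas and hypothetical thresholds: out-of-range values and abrupt spikes are flagged and masked before the series is published.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
# Hypothetical 15-minute water-temperature record from one aquatic sensor.
idx = pd.date_range("2014-07-01", periods=500, freq="15min")
temp = pd.Series(12 + 2 * np.sin(np.arange(500) / 30) + rng.normal(0, 0.1, 500), index=idx)
temp.iloc[100] = 45.0      # inject a spike to show the checks firing

qc = pd.DataFrame({"value": temp})
qc["range_flag"] = ~temp.between(-0.5, 35.0)                 # physically plausible range
qc["spike_flag"] = temp.diff().abs() > 5.0                   # jump threshold per step
qc["flagged"] = qc["range_flag"] | qc["spike_flag"]

# Flagged values would be set to NaN (or annotated) before publication.
clean = temp.mask(qc["flagged"])
print(qc["flagged"].sum(), "points flagged;", clean.isna().sum(), "set to NaN")
```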

  9. A Large-scale Plume in an X-class Solar Flare

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fleishman, Gregory D.; Nita, Gelu M.; Gary, Dale E.

    Ever-increasing multi-frequency imaging of solar observations suggests that solar flares often involve more than one magnetic fluxtube. Some of the fluxtubes are closed, while others can contain open fields. The relative proportion of nonthermal electrons among those distinct loops is highly important for understanding energy release, particle acceleration, and transport. The access of nonthermal electrons to the open field is also important because the open field facilitates the solar energetic particle (SEP) escape from the flaring site, and thus controls the SEP fluxes in the solar system, both directly and as seed particles for further acceleration. The large-scale fluxtubes are often filled with a tenuous plasma, which is difficult to detect in either EUV or X-ray wavelengths; however, they can dominate at low radio frequencies, where a modest component of nonthermal electrons can render the source optically thick and, thus, bright enough to be observed. Here we report the detection of a large-scale “plume” at the impulsive phase of an X-class solar flare, SOL2001-08-25T16:23, using multi-frequency radio data from Owens Valley Solar Array. To quantify the flare’s spatial structure, we employ 3D modeling utilizing force-free-field extrapolations from the line of sight SOHO/MDI magnetograms with our modeling tool GX-Simulator. We found that a significant fraction of the nonthermal electrons that accelerated at the flare site low in the corona escapes to the plume, which contains both closed and open fields. We propose that the proportion between the closed and open fields at the plume is what determines the SEP population escaping into interplanetary space.

  10. A Paradox-based data collection and management system for multi-center randomized clinical trials.

    PubMed

    Abdellatif, Mazen; Reda, Domenic J

    2004-02-01

    We have developed a Paradox-based data collection and management system for large-scale multi-site randomized clinical trials. The system runs under Windows operating system and integrates Symantec pcAnywhere32 telecommunications software for data transmission and remote control sessions, PKZIP utility for the compression/decompression of transmitted data, and Stat/Transfer for exporting the centralized Paradox database for analyses. We initially developed this system for VA Cooperative Study #399 'The Effect of Antiarrhythmic Therapy in Maintaining Stability of Sinus Rhythm in Atrial Fibrillation', which collects over 1000 variables on 706 patients at 20 sites. Patient intake for this 5-year study began in March of 1998. We have also developed an enhanced version of this system, which is being used in the NIH-funded 'Glucosamine/Chondroitin Arthritis Intervention Trial (GAIT)' that collects over 1200 variables on 1588 patients at 13 sites. Patient intake for this 4-year study began in October of 2000.

  11. Interpreting forest and grassland biome productivity utilizing nested scales of image resolution and biogeographical analysis

    NASA Technical Reports Server (NTRS)

    Iverson, Louis R.; Cook, Elizabeth A.; Graham, Robin L.; Olson, Jerry S.; Frank, Thomas; Ke, Ying; Treworgy, Colin; Risser, Paul G.

    1987-01-01

    This report summarizes progress made in our investigation of forest productivity assessment using TM and other biogeographical data during the third six-month period of the grant. Data acquisition and methodology hurdles have largely been overcome. Four study areas for which the appropriate TM and ancillary data were available are currently being intensively analyzed. Significant relationships have been found on a site-by-site basis, suggesting that forest productivity can be qualitatively assessed using TM band values and site characteristics. Perhaps the most promising results relate TM unsupervised classes to forest productivity, with enhancement from elevation data. During the final phases of the research, multi-temporal and regional comparisons of results will be addressed, as well as the predictability of forest productivity patterns over a large region using TM data and/or TM nested within AVHRR data.

  12. Natal and breeding philopatry of female Steller sea lions in southeastern Alaska.

    PubMed

    Hastings, Kelly K; Jemison, Lauri A; Pendleton, Grey W; Raum-Suryan, Kimberly L; Pitcher, Kenneth W

    2017-01-01

    Information on drivers of dispersal is critical for wildlife conservation but is rare for long-lived marine mammal species with large geographic ranges. We fit multi-state mark-recapture models to resighting data of 369 known-aged Steller sea lion (Eumetopias jubatus) females marked as pups on their natal rookeries in southeastern Alaska from 1994-2005 and monitored from 2001-15. We estimated probabilities of females being first observed parous at their natal site (natal philopatry), and of not moving breeding sites among years (breeding philopatry) at large (> 400 km, all five rookeries in southeastern Alaska) and small (< 4 km, all islands within the largest rookery, Forrester Island Complex, F) spatial scales. At the rookery scale, natal philopatry was moderately high (0.776-0.859) for most rookeries and breeding philopatry was nearly 1, with < 3% of females switching breeding rookeries between years. At more populous islands at F, natal philopatry was 0.500-0.684 versus 0.295-0.437 at less populous islands, and breeding philopatry was 0.919-0.926 versus 0.604-0.858. At both spatial scales, the probability of pupping at a non-natal site increased with population size of, and declined with distance from, the destination site. Natal philopatry of < 1 would increase gene flow, improve population resilience, and promote population recovery after decline in a heterogeneous environment. Very high breeding philopatry suggests that familiarity with neighboring females and knowledge of the breeding site (the topography of pupping sites and nearby foraging locations) may be a critical component to reproductive strategies of sea lions.

  13. A second generation experiment in fault-tolerant software

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1986-01-01

    Information was collected on the efficacy of fault-tolerant software by conducting two large-scale controlled experiments. In the first, an empirical study of multi-version software (MVS) was conducted. The second experiment is an empirical evaluation of self-testing as a method of error detection (STED). The purpose of the MVS experiment was to obtain empirical measurements of the performance of multi-version systems. Twenty versions of a program were prepared at four different sites under reasonably realistic development conditions from the same specifications. The purpose of the STED experiment was to obtain empirical measurements of the performance of assertions in error detection. Eight versions of a program were modified to include assertions at two different sites under controlled conditions. The overall structure of the testing environment for the MVS experiment and its status are described. Work to date in the STED experiment is also presented.
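
    A minimal sketch of the voting mechanism a multi-version system relies on, plus an executable assertion of the kind evaluated in the STED experiment; the toy versions and checks are illustrative only.

```python
from collections import Counter

def mvs_vote(versions, x):
    """Run all independently developed versions on the same input and return the
    majority answer; a tie or total disagreement raises an error."""
    results = [v(x) for v in versions]
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError(f"no majority among results {results}")
    return value

# Three toy 'versions' of an integer square-root routine, one of them faulty.
v1 = lambda n: int(n ** 0.5)
v2 = lambda n: next(i for i in range(n + 2) if (i + 1) * (i + 1) > n)
v3 = lambda n: int(n ** 0.5) + 1           # deliberately wrong version

result = mvs_vote([v1, v2, v3], 17)
assert result * result <= 17 < (result + 1) ** 2   # STED-style executable assertion
print(result)                                       # 4
```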

  14. Attention/Vigilance in Schizophrenia: Performance Results from a Large Multi-Site Study of the Consortium on the Genetics of Schizophrenia (COGS)

    PubMed Central

    Nuechterlein, Keith H.; Green, Michael F.; Calkins, Monica E.; Greenwood, Tiffany A.; Gur, Raquel E.; Gur, Ruben C.; Lazzeroni, Laura C.; Light, Gregory A.; Radant, Allen D.; Seidman, Larry J.; Siever, Larry J.; Silverman, Jeremy M.; Sprock, Joyce; Stone, William S.; Sugar, Catherine A.; Swerdlow, Neal R.; Tsuang, Debby W.; Tsuang, Ming T.; Turetsky, Bruce I.; Braff, David L.

    2015-01-01

    Attention/vigilance impairments are present in individuals with schizophrenia across psychotic and remitted states and in their first-degree relatives. An important question is whether deficits in attention/vigilance can be consistently and reliably measured across sites varying in many participant demographic, clinical, and functional characteristics, as needed for large-scale genetic studies of endophenotypes. We examined Continuous Performance Test (CPT) data from Phase 2 of the Consortium on the Genetics of Schizophrenia (COGS-2), the largest-scale assessment of cognitive and psychophysiological endophenotypes relevant to schizophrenia. CPT data from 2251 participants from five sites were examined. A perceptual-load vigilance task (the Degraded Stimulus CPT or DS-CPT) and a memory-load vigilance task (CPT - Identical Pairs or CPT-IP) were utilized. Schizophrenia patients performed more poorly than healthy comparison subjects (HCS) across sites, despite significant site differences in participant age, sex, education, and racial distribution. Patient-HCS differences in signal/noise discrimination (d’) in the DS-CPT varied significantly across sites, but averaged a medium effect size. CPT-IP performance showed large patient-HCS differences across sites. Poor CPT performance was independent of or weakly correlated with symptom severity, but was significantly associated with lower educational achievement and functional capacity. Current smoking was associated with poorer CPT-IP d’. Patients taking both atypical and typical antipsychotic medication performed more poorly than those on no or atypical antipsychotic medications, likely reflecting their greater severity of illness. We conclude that CPT deficits in schizophrenia can be reliably detected across sites, are relatively independent of current symptom severity, and are related to functional capacity. PMID:25749017

  15. Attention/vigilance in schizophrenia: performance results from a large multi-site study of the Consortium on the Genetics of Schizophrenia (COGS).

    PubMed

    Nuechterlein, Keith H; Green, Michael F; Calkins, Monica E; Greenwood, Tiffany A; Gur, Raquel E; Gur, Ruben C; Lazzeroni, Laura C; Light, Gregory A; Radant, Allen D; Seidman, Larry J; Siever, Larry J; Silverman, Jeremy M; Sprock, Joyce; Stone, William S; Sugar, Catherine A; Swerdlow, Neal R; Tsuang, Debby W; Tsuang, Ming T; Turetsky, Bruce I; Braff, David L

    2015-04-01

    Attention/vigilance impairments are present in individuals with schizophrenia across psychotic and remitted states and in their first-degree relatives. An important question is whether deficits in attention/vigilance can be consistently and reliably measured across sites varying in many participant demographic, clinical, and functional characteristics, as needed for large-scale genetic studies of endophenotypes. We examined Continuous Performance Test (CPT) data from phase 2 of the Consortium on the Genetics of Schizophrenia (COGS-2), the largest-scale assessment of cognitive and psychophysiological endophenotypes relevant to schizophrenia. The CPT data from 2251 participants from five sites were examined. A perceptual-load vigilance task (the Degraded Stimulus CPT or DS-CPT) and a memory-load vigilance task (CPT-Identical Pairs or CPT-IP) were utilized. Schizophrenia patients performed more poorly than healthy comparison subjects (HCS) across sites, despite significant site differences in participant age, sex, education, and racial distribution. Patient-HCS differences in signal/noise discrimination (d') in the DS-CPT varied significantly across sites, but averaged a medium effect size. CPT-IP performance showed large patient-HCS differences across sites. Poor CPT performance was independent of or weakly correlated with symptom severity, but was significantly associated with lower educational achievement and functional capacity. Current smoking was associated with poorer CPT-IP d'. Patients taking both atypical and typical antipsychotic medication performed more poorly than those on no or atypical antipsychotic medications, likely reflecting their greater severity of illness. We conclude that CPT deficits in schizophrenia can be reliably detected across sites, are relatively independent of current symptom severity, and are related to functional capacity. Copyright © 2015 Elsevier B.V. All rights reserved.
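
    A minimal sketch of the signal/noise discrimination index d' reported above, computed as the difference of the z-transformed hit and false-alarm rates (with a common correction for extreme rates); the counts below are hypothetical.

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = Z(hit rate) - Z(false-alarm rate), with a 1/(2N) correction so that
    rates of exactly 0 or 1 stay finite."""
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    hit_rate = min(max(hits / n_signal, 1 / (2 * n_signal)), 1 - 1 / (2 * n_signal))
    fa_rate = min(max(false_alarms / n_noise, 1 / (2 * n_noise)), 1 - 1 / (2 * n_noise))
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical CPT-IP block: 30 targets, 120 non-targets.
print(round(d_prime(hits=24, misses=6, false_alarms=10, correct_rejections=110), 2))
```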

  16. Synoptic-scale circulation patterns during summer derived from tree rings in mid-latitude Asia

    NASA Astrophysics Data System (ADS)

    Seim, Andrea; Schultz, Johannes A.; Leland, Caroline; Davi, Nicole; Byambasuren, Oyunsanaa; Liang, Eryuan; Wang, Xiaochun; Beck, Christoph; Linderholm, Hans W.; Pederson, Neil

    2017-09-01

    Understanding past and recent climate and atmospheric circulation variability is vital for regions that are affected by climate extremes. In mid-latitude Asia, however, the synoptic climatology is complex and not yet fully understood. The aim of this study was to investigate dominant synoptic-scale circulation patterns during the summer season using a multi-species tree-ring width (TRW) network comprising 78 sites from mid-latitude Asia. For each TRW chronology, we calculated an atmospheric circulation tree-ring index (ACTI), based on 1000 hPa geopotential height data, to directly link tree growth to 13 summertime weather types and their associated local climate conditions for the period 1871-1993. Using the ACTI, three groups of similarly responding tree-ring sites can be associated with distinct large-scale atmospheric circulation patterns: 1. growth of drought sensitive trees is positively affected by a cyclone over northern Russia; 2. temperature sensitive trees show positive associations to a cyclone over northwestern Russia and an anticyclone over Mongolia; 3. trees at two high elevation sites show positive relations to a zonal cyclone extending from mid-latitude Eurasia to the West Pacific. The identified synoptic-scale circulation patterns showed spatiotemporal variability in their intensity and position, causing temporally varying climate conditions in mid-latitude Asia. Our results highlight that for regions with less pronounced atmospheric action centers during summer such as the occurrence of large-scale cyclones and anticyclones, synoptic-scale circulation patterns can be extracted and linked to the Northern Hemisphere circulation system. Thus, we provide a new and solid envelope for climate studies covering the past to the future.

  17. Exploring Google Earth Engine platform for big data processing: classification of multi-temporal satellite imagery for crop mapping

    NASA Astrophysics Data System (ADS)

    Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii

    2017-02-01

    Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large-scale crop mapping requires processing and management of large amounts of heterogeneous satellite imagery acquired by various sensors, which consequently leads to a “Big Data” problem. The main objective of this study is to explore the efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery, with potential to apply the platform for a larger scale (e.g. country level) and multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high resolution (30 m) crop classification map for a large territory (28,100 km² and 1.0 M ha of cropland). Though this study does not involve large volumes of data, it does address efficiency of the GEE platform to effectively execute complex workflows of satellite data processing required in large-scale applications such as crop mapping. The study discusses strengths and weaknesses of classifiers, assesses accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to the benchmark classifier using a neural network approach that was developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (North of Ukraine) in 2013. We found that Google Earth Engine (GEE) provides very good performance in terms of enabling access to the remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed support vector machine (SVM), decision tree and random forest classifiers available in GEE.
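
    A minimal sketch of the kind of GEE workflow evaluated above, assuming the Earth Engine Python API with placeholder asset IDs and band/property names; the classifier name follows the current API (ee.Classifier.smileRandomForest) rather than the interface available in 2017.

```python
import ee
ee.Initialize()

# Placeholder assets: a multi-temporal composite and labeled crop training points.
composite = ee.Image("users/example/l8_multitemporal_composite_2013")
samples = ee.FeatureCollection("users/example/jecam_kyiv_training_points")
bands = composite.bandNames()

# Sample the composite at the training points (30 m pixels).
training = composite.sampleRegions(collection=samples, properties=["crop_class"], scale=30)

# Train a built-in classifier and classify the whole composite server-side.
classifier = ee.Classifier.smileRandomForest(numberOfTrees=100).train(
    features=training, classProperty="crop_class", inputProperties=bands)
crop_map = composite.classify(classifier)

# Resubstitution accuracy from the training set (an optimistic estimate).
print(training.classify(classifier)
      .errorMatrix("crop_class", "classification").accuracy().getInfo())
```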

  18. Assessment of multiple geophysical techniques for the characterization of municipal waste deposit sites

    NASA Astrophysics Data System (ADS)

    Gaël, Dumont; Tanguy, Robert; Nicolas, Marck; Frédéric, Nguyen

    2017-10-01

    In this study, we tested the ability of geophysical methods to characterize a large technical landfill installed in a former sand quarry. The geophysical surveys specifically aimed at delineating the horizontal extent of the deposit site, estimating its thickness, and characterizing the waste material composition (the moisture content in the present case). The site delimitation was conducted with electromagnetic (in-phase and out-of-phase) and magnetic (vertical gradient and total field) methods, which clearly showed the transition between the waste deposit and the host formation. Regarding waste deposit thickness evaluation, electrical resistivity tomography (ERT) appeared ineffective on this particularly thick deposit site. Thus, we propose a combination of the horizontal-to-vertical noise spectral ratio (HVNSR) and multichannel analysis of surface waves (MASW), which successfully determined the approximate waste deposit thickness in our test landfill. However, ERT appeared to be an appropriate tool to characterize the moisture content of the waste, which is key prior information for the organic waste biodegradation process. The global multi-scale and multi-method geophysical survey offers valuable information for site rehabilitation studies, water content mitigation processes for enhanced biodegradation, or landfill mining operation planning.
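
    A minimal sketch of how MASW and HVNSR results can be combined to estimate deposit thickness via the quarter-wavelength relation h ≈ Vs/(4 f0); the velocity and resonance frequency below are illustrative, not values from the paper.

```python
def thickness_from_resonance(vs_waste_m_s, f0_hz):
    """Quarter-wavelength estimate of a soft layer's thickness over a stiff
    half-space: h ~ Vs / (4 * f0)."""
    return vs_waste_m_s / (4.0 * f0_hz)

# Illustrative values: MASW-derived Vs of the waste and the HVNSR peak frequency.
vs_waste = 180.0   # m/s, hypothetical shear-wave velocity of the waste body
f0 = 1.2           # Hz, hypothetical fundamental resonance frequency
print(f"estimated waste thickness ~ {thickness_from_resonance(vs_waste, f0):.0f} m")
```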

  19. Piezometer completion report for borehole cluster sites DC-19, DC-20, and DC-22

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, R.L.; Diediker, L.D.; Ledgerwood, R.K.

    1984-07-01

    This report describes the design and installation of multi-level piezometers at borehole cluster sites DC-19, DC-20 and DC-22. The network of borehole cluster sites will provide facilities for multi-level water-level monitoring across the RRL for piezometer baseline monitoring and for large-scale hydraulic stress testing. These groundwater-monitoring facilities were installed between August 1983 and March 1984. Three series of piezometer nests (A-, C- and D-series) were installed in nine hydrogeologic units (monitoring horizons) within the Columbia River Basalt Group at each borehole cluster site. In addition to the piezometer facilities, a B-series pumping well was installed at borehole cluster sites DC-20 and DC-22. The A-series piezometer nest monitors the basal Ringold sediments and the Rattlesnake Ridge interbed. The C-series piezometer nest monitors the six deepest horizons, which are, in order of increasing depth, the Priest Rapids interflow, Sentinel Gap flow top, Ginkgo flow top, Rocky Coulee flow top, Cohassett flow top and Umtanum flow top. The D-series piezometer monitors the Mabton interbed. The B-series pumping well was completed in the Priest Rapids interflow. 21 refs., 6 figs., 6 tabs.

  20. Multi-level discriminative dictionary learning with application to large scale image classification.

    PubMed

    Shen, Li; Sun, Gang; Huang, Qingming; Wang, Shuhui; Lin, Zhouchen; Wu, Enhua

    2015-10-01

    The sparse coding technique has shown flexibility and capability in image representation and analysis. It is a powerful tool in many visual applications. Some recent work has shown that incorporating the properties of the task (such as discrimination for classification) into dictionary learning is effective for improving accuracy. However, traditional supervised dictionary learning methods suffer from high computational complexity when dealing with a large number of categories, making them less satisfactory in large-scale applications. In this paper, we propose a novel multi-level discriminative dictionary learning method and apply it to large-scale image classification. Our method takes advantage of hierarchical category correlation to encode multi-level discriminative information. Each internal node of the category hierarchy is associated with a discriminative dictionary and a classification model. The dictionaries at different layers are learned to capture information at different scales. Moreover, each node at the lower layers also inherits the dictionary of its parent, so that the categories at lower layers can be described with multi-scale information. The learning of the dictionaries and the associated classification models is jointly conducted by minimizing an overall tree loss. Experimental results on challenging data sets demonstrate that our approach achieves excellent accuracy and competitive computation cost compared with other sparse coding methods for large-scale image classification.
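
    A minimal sketch of the flat, single-node building block the paper extends, assuming scikit-learn: a sparse-coding dictionary is learned and the resulting codes are used as classification features; the discriminative, hierarchy-aware formulation of the paper is not reproduced here.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
# Hypothetical image descriptors: 600 samples x 64 dimensions, 3 categories.
X = rng.normal(size=(600, 64))
y = rng.integers(0, 3, size=600)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One dictionary for this (single) node; the paper's method would learn one per
# node of the category hierarchy and add a discriminative term to the objective.
dico = DictionaryLearning(n_components=32, alpha=1.0, max_iter=50,
                          transform_algorithm="lasso_lars", random_state=0)
codes_tr = dico.fit_transform(X_tr)
codes_te = dico.transform(X_te)

clf = LogisticRegression(max_iter=1000).fit(codes_tr, y_tr)
print("held-out accuracy:", round(clf.score(codes_te, y_te), 3))
```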

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marrinan, Thomas; Leigh, Jason; Renambot, Luc

    Mixed presence collaboration involves remote collaboration between multiple collocated groups. This paper presents the design and results of a user study that focused on mixed presence collaboration using large-scale tiled display walls. The research was conducted in order to compare data synchronization schemes for multi-user visualization applications. Our study compared three techniques for sharing data between display spaces with varying constraints and affordances. The results provide empirical evidence that using data sharing techniques with continuous synchronization between the sites leads to improved collaboration for a search and analysis task between remotely located groups. We have also identified aspects of synchronized sessions that result in increased remote collaborator awareness and parallel task coordination. It is believed that this research will lead to better utilization of large-scale tiled display walls for distributed group work.

  2. Real-time adaptive ramp metering : phase I, MILOS proof of concept (multi-objective, integrated, large-scale, optimized system).

    DOT National Transportation Integrated Search

    2006-12-01

    Over the last several years, researchers at the University of Arizona's ATLAS Center have developed an adaptive ramp metering system referred to as MILOS (Multi-Objective, Integrated, Large-Scale, Optimized System). The goal of this project is ...

  3. Multi-scale groundwater flow modeling during temperate climate conditions for the safety assessment of the proposed high-level nuclear waste repository site at Forsmark, Sweden

    NASA Astrophysics Data System (ADS)

    Joyce, Steven; Hartley, Lee; Applegate, David; Hoek, Jaap; Jackson, Peter

    2014-09-01

    Forsmark in Sweden has been proposed as the site of a geological repository for spent high-level nuclear fuel, to be located at a depth of approximately 470 m in fractured crystalline rock. The safety assessment for the repository has required a multi-disciplinary approach to evaluate the impact of hydrogeological and hydrogeochemical conditions close to the repository and in a wider regional context. Assessing the consequences of potential radionuclide releases requires quantitative site-specific information concerning the details of groundwater flow on the scale of individual waste canister locations (1-10 m) as well as details of groundwater flow and composition on the scale of groundwater pathways between the facility and the surface (500 m to 5 km). The purpose of this article is to provide an illustration of multi-scale modeling techniques and the results obtained when combining aspects of local-scale flows in fractures around a potential contaminant source with regional-scale groundwater flow and transport subject to natural evolution of the system. The approach set out is novel, as it incorporates both different scales of model and different levels of detail, combining discrete fracture network and equivalent continuous porous medium representations of fractured bedrock.

  4. Predictors of breeding site occupancy by amphibians in montane landscapes

    USGS Publications Warehouse

    Groff, Luke A.; Loftin, Cynthia S.; Calhoun, Aram J.K.

    2017-01-01

    Ecological relationships and processes vary across species’ geographic distributions, life stages, and spatial and temporal scales. Montane landscapes are characterized by low wetland densities, rugged topographies, and cold climates. Consequently, aquatic-dependent and low-vagility ectothermic species (e.g., pool-breeding amphibians) may exhibit unique ecological associations in montane landscapes. We evaluated the relative importance of breeding- and landscape-scale features associated with spotted salamander (Ambystoma maculatum) and wood frog (Lithobates sylvaticus) wetland occupancy in Maine's Upper Montane-Alpine Zone ecoregion, and we determined whether models performed better when the inclusive landscape-scale covariates were estimated with topography-weighted or circular buffers. We surveyed 135 potential breeding sites during May 2013–June 2014 and evaluated environmental relationships with multi-season implicit dynamics occupancy models. Breeding site occupancy by both species was influenced solely by breeding-scale habitat features. Spotted salamander occupancy probabilities increased with previous or current beaver (Castor canadensis) presence, and models generally were better supported when the inclusive landscape-scale covariates were estimated with topography-weighted rather than circular buffers. Wood frog occupancy probabilities increased with site area and percent shallows, but neither buffer type was better supported than the other. Model rank order and support varied between buffer types, but model inferences did not. Our results suggest that pool-breeding amphibian conservation in montane Maine should include measures to maintain beaver populations and large wetlands with proportionally large areas of shallows ≤1 m deep. Inconsistencies between our study and previous studies substantiate the value of region-specific research for augmenting species’ conservation management plans and suggest that the application of out-of-region inferences may promote ineffective conservation.

  5. A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales

    PubMed Central

    Ayton, Gary S.; Voth, Gregory A.

    2009-01-01

    A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analytic-systematic because one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center-of-mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well-known Gay-Berne ellipsoid-of-revolution liquid crystal model, and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an “aggressive” CG methodology designed to model multi-component biological membranes at very large length and time scales. PMID:19281167

  6. HammerCloud: A Stress Testing System for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    van der Ster, Daniel C.; Elmsheuser, Johannes; Úbeda García, Mario; Paladin, Massimo

    2011-12-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain a predefined number of jobs running at steady state at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running jobs and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).
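
    The core of HammerCloud's behaviour, as described in point (b) above, is keeping a fixed number of test jobs running at each site. The sketch below illustrates that steady-state logic only; `count_running` and `submit_job` are hypothetical stand-ins for the real Ganga-based backend, not the actual HammerCloud or Ganga API.

```python
# Hypothetical sketch of the steady-state test-job behaviour described above.
# count_running() and submit_job() are placeholders for the real Ganga-based
# grid backend; they are assumptions, not the HammerCloud/Ganga API.
import random
import time

TARGET_RUNNING = 50      # predefined number of jobs to keep running per site
POLL_INTERVAL_S = 5      # shortened for the sketch; real polling is much slower

def count_running(site: str) -> int:
    """Placeholder: query the backend for test jobs currently running at `site`."""
    return random.randint(0, TARGET_RUNNING)

def submit_job(site: str) -> None:
    """Placeholder: submit one analysis test job to `site`."""
    print(f"submitting test job to {site}")

def maintain_steady_state(sites: list[str], cycles: int = 2) -> None:
    """Top each site back up to TARGET_RUNNING jobs once per polling cycle."""
    for cycle in range(cycles):
        for site in sites:
            deficit = TARGET_RUNNING - count_running(site)
            for _ in range(max(deficit, 0)):
                submit_job(site)
        if cycle < cycles - 1:
            time.sleep(POLL_INTERVAL_S)

if __name__ == "__main__":
    maintain_steady_state(["SITE_A", "SITE_B"], cycles=1)
```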

  7. Contrasting Patterns of Damage and Recovery in Logged Amazon Forests From Small Footprint LiDAR Data

    NASA Technical Reports Server (NTRS)

    Morton, D. C.; Keller, M.; Cook, B. D.; Hunter, Maria; Sales, Marcio; Spinelli, L.; Victoria, D.; Andersen, H.-E.; Saleska, S.

    2012-01-01

    Tropical forest ecosystems respond dynamically to climate variability and disturbances on time scales of minutes to millennia. To date, our knowledge of disturbance and recovery processes in tropical forests is derived almost exclusively from networks of forest inventory plots. These plots typically sample small areas (less than or equal to 1 ha) in conservation units that are protected from logging and fire. Amazon forests with frequent disturbances from human activity remain under-studied. Ongoing negotiations on REDD+ (Reducing Emissions from Deforestation and Forest Degradation plus enhancing forest carbon stocks) have placed additional emphasis on identifying degraded forests and quantifying changing carbon stocks in both degraded and intact tropical forests. We evaluated patterns of forest disturbance and recovery at four ~1000 ha sites in the Brazilian Amazon using small footprint LiDAR data and coincident field measurements. Large area coverage with airborne LiDAR data in 2011-2012 included logged and unmanaged areas in Cotriguacu (Mato Grosso), Flona do Jamari (Rondonia), and Floresta Estadual do Antimary (Acre), and unmanaged forest within Reserva Ducke (Amazonas). Logging infrastructure (skid trails, log decks, and roads) was identified using LiDAR returns from understory vegetation and validated based on field data. At each logged site, canopy gaps from logging activity and LiDAR metrics of canopy heights were used to quantify differences in forest structure between logged and unlogged areas. Contrasting patterns of harvesting operations and canopy damage at the three logged sites reflect different levels of pre-harvest planning (i.e., informal logging compared to state or national logging concessions), harvest intensity, and site conditions. Finally, we used multi-temporal LiDAR data from two sites, Reserva Ducke (2009, 2012) and Antimary (2010, 2011), to evaluate gap phase dynamics in unmanaged forest areas. The rates and patterns of canopy gap formation at these sites illustrate potential issues for separating logging damage from natural forest disturbances over longer time scales. Multi-temporal airborne LiDAR data and coincident field measurements provide complementary perspectives on disturbance and recovery processes in intact and degraded Amazon forests. Compared to forest inventory plots, the large size of each individual site permitted analyses of landscape-scale processes that would require extremely high investments to study using traditional forest inventory methods.

  8. Evaluating the Performance of the Goddard Multi-Scale Modeling Framework against GPM, TRMM and CloudSat/CALIPSO Products

    NASA Astrophysics Data System (ADS)

    Chern, J. D.; Tao, W. K.; Lang, S. E.; Matsui, T.; Mohr, K. I.

    2014-12-01

    Four six-month (March-August 2014) experiments with the Goddard Multi-scale Modeling Framework (MMF) were performed to study the impacts of different Goddard one-moment bulk microphysical schemes and large-scale forcings on the performance of the MMF. Recently a new Goddard one-moment bulk microphysics scheme with four ice classes (cloud ice, snow, graupel, and frozen drops/hail) has been developed based on cloud-resolving model simulations with large-scale forcings from field campaign observations. The new scheme has been successfully implemented in the MMF, and two MMF experiments were carried out with this new scheme and the old three-ice-class (cloud ice, snow, graupel) scheme. The MMF has global coverage and can rigorously evaluate microphysics performance for different cloud regimes. The results show that the MMF with the new scheme outperformed the old one. The MMF simulations are also strongly affected by the interaction between large-scale and cloud-scale processes. Two MMF sensitivity experiments, with and without nudging large-scale forcings to those of the ERA-Interim reanalysis, were carried out to study the impacts of large-scale forcings. The simulated mean and variability of surface precipitation, cloud types, and cloud properties (such as cloud amount, hydrometeor vertical profiles, and cloud water content) in different geographic locations and climate regimes are evaluated against GPM, TRMM and CloudSat/CALIPSO satellite observations. The Goddard MMF has also been coupled with the Goddard Satellite Data Simulation Unit (G-SDSU), a system with multi-satellite, multi-sensor, and multi-spectrum satellite simulators. The statistics of MMF-simulated radiances and backscattering can be directly compared with satellite observations to assess the strengths and/or deficiencies of the MMF simulations and provide guidance on how to improve the MMF and its microphysics.

  9. Financial Management of a Large Multi-site Randomized Clinical Trial

    PubMed Central

    Sheffet, Alice J.; Flaxman, Linda; Tom, MeeLee; Hughes, Susan E.; Longbottom, Mary E.; Howard, Virginia J.; Marler, John R.; Brott, Thomas G.

    2014-01-01

    Background The Carotid Revascularization Endarterectomy versus Stenting Trial (CREST) received five years’ funding ($21,112,866) from the National Institutes of Health to compare carotid stenting to surgery for stroke prevention in 2,500 randomized participants at 40 sites. Aims Herein we evaluate the change in the CREST budget from a fixed to variable-cost model and recommend strategies for the financial management of large-scale clinical trials. Methods Projections of the original grant’s fixed-cost model were compared to the actual costs of the revised variable-cost model. The original grant’s fixed-cost budget included salaries, fringe benefits, and other direct and indirect costs. For the variable-cost model, the costs were actual payments to the clinical sites and core centers based upon actual trial enrollment. We compared annual direct and indirect costs and per-patient cost for both the fixed and variable models. Differences between clinical site and core center expenditures were also calculated. Results Using a variable-cost budget for clinical sites, funding was extended by no-cost extension from five to eight years. Randomizing sites tripled from 34 to 109. Of the 2,500 targeted sample size, 138 (5.5%) were randomized during the first five years and 1,387 (55.5%) during the no-cost extension. The actual per-patient costs of the variable model were 9% ($13,845) of the projected per-patient costs ($152,992) of the fixed model. Conclusions Performance-based budgets conserve funding, promote compliance, and allow for additional sites at modest additional cost. Costs of large-scale clinical trials can thus be reduced through effective management without compromising scientific integrity. PMID:24661748

  10. Bridging Scales: A Model-Based Assessment of the Technical Tidal-Stream Energy Resource off Massachusetts, USA

    NASA Astrophysics Data System (ADS)

    Cowles, G. W.; Hakim, A.; Churchill, J. H.

    2016-02-01

    Tidal in-stream energy conversion (TISEC) facilities provide a highly predictable and dependable source of energy. Given the economic and social incentives to migrate towards renewable energy sources, there has been tremendous interest in the technology. Key challenges to the design process stem from the wide range of problem scales, extending from device to array. In the present work we apply a multi-model approach to bridge the scales of interest and select optimal device geometries to estimate the technical resource for several realistic sites in the coastal waters of Massachusetts, USA. The approach links two computational models. To establish flow conditions at site scales (~10 m), a barotropic setup of the unstructured grid ocean model FVCOM is employed. The model is validated using shipboard and fixed ADCP data as well as pressure data. For the device scale, the structured multiblock flow solver SUmb is selected. A large ensemble of simulations of 2D cross-flow tidal turbines is used to construct a surrogate design model. The surrogate model is then queried using velocity profiles extracted from the tidal model to determine the optimal geometry for the conditions at each site. After device selection, the annual technical yield of the array is evaluated with FVCOM using a linear momentum actuator disk approach to model the turbines. Results for several key Massachusetts sites, including comparisons with theoretical approaches, will be presented.
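
    As a rough illustration of the scale-bridging step described above, the sketch below feeds a synthetic hourly velocity series (a stand-in for the FVCOM site-scale output) through a simple surrogate power curve to estimate annual technical yield. The power coefficient, swept area, cut-in and rated speeds are illustrative assumptions, not values from the study.

```python
# Sketch of bridging device and site scales: a surrogate power curve is
# queried with (synthetic) hourly site velocities to estimate annual yield.
# Cp, AREA, cut-in/rated speeds and the velocity series are illustrative
# assumptions, not values from the study.
import numpy as np

RHO = 1025.0                   # seawater density, kg/m^3
CP = 0.35                      # assumed device power coefficient
AREA = 100.0                   # assumed swept area, m^2
U_CUT_IN, U_RATED = 0.8, 2.5   # assumed device limits, m/s

def device_power(u: np.ndarray) -> np.ndarray:
    """Surrogate power curve for one device geometry (output in watts)."""
    u_eff = np.clip(u, 0.0, U_RATED)
    p = 0.5 * RHO * CP * AREA * u_eff**3
    return np.where(u_eff < U_CUT_IN, 0.0, p)

# Synthetic hourly depth-averaged speeds for one year (stand-in for FVCOM output).
hours = np.arange(365 * 24)
u_site = 1.5 + 1.0 * np.abs(np.sin(2 * np.pi * hours / 12.42))  # M2-like cycle

annual_yield_mwh = device_power(u_site).sum() / 1e6  # hourly W summed -> Wh -> MWh
print(f"estimated annual technical yield: {annual_yield_mwh:.0f} MWh")
```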

  11. Early Intervention to Reduce Alcohol Misuse and Abuse in the Ohio Army National Guard

    DTIC Science & Technology

    2016-09-01

    Brittany Brownrigg, Data Coordinator at Case Western Reserve University, with the opportunity to work on a large-scale, multi-site, USAMRAA funded study ...CHCR have completed the intro video which will be posted on the app stores. The following meetings took place between study staff and OHARNG members...background/rationale, the design and methodology of the study and study progress as of that date. She also got their feedback on the first iteration of

  12. Distribution of guidance models for cardiac resynchronization therapy in the setting of multi-center clinical trials

    NASA Astrophysics Data System (ADS)

    Rajchl, Martin; Abhari, Kamyar; Stirrat, John; Ukwatta, Eranga; Cantor, Diego; Li, Feng P.; Peters, Terry M.; White, James A.

    2014-03-01

    Multi-center trials provide the unique ability to investigate novel techniques across a range of geographical sites with sufficient statistical power, with the inclusion of multiple operators determining feasibility under a wider array of clinical environments and workflows. For this purpose, we introduce a new means of distributing pre-procedural cardiac models for image-guided interventions across a large-scale multi-center trial. In this method, a single core facility is responsible for image processing, employing a novel web-based interface for model visualization and distribution. The requirements for such an interface, being WebGL-based, are minimal and well within the realms of accessibility for participating centers. We then demonstrate the accuracy of our approach using a single-center pacemaker lead implantation trial with generic planning models.

  13. Evaluating a complex, multi-site, community-based program to improve healthcare quality: the summative research design for the Aligning Forces for Quality initiative.

    PubMed

    Scanlon, Dennis P; Wolf, Laura J; Alexander, Jeffrey A; Christianson, Jon B; Greene, Jessica; Jean-Jacques, Muriel; McHugh, Megan; Shi, Yunfeng; Leitzell, Brigitt; Vanderbrink, Jocelyn M

    2016-08-01

    The Aligning Forces for Quality (AF4Q) initiative was the Robert Wood Johnson Foundation's (RWJF's) signature effort to increase the overall quality of healthcare in targeted communities throughout the country. In addition to sponsoring this 16-site complex program, RWJF funded an independent scientific evaluation to support objective research on the initiative's effectiveness and contributions to basic knowledge in 5 core programmatic areas. The research design, data, and challenges faced during the summative evaluation phase of this near decade-long program are discussed, providing a descriptive overview of the summative research design and its development for a multi-site, community-based healthcare quality improvement initiative. The evaluation team's summative research design involved a data-driven assessment of the effectiveness of the AF4Q program at large, assessments of the impact of AF4Q in the specific programmatic areas, and an assessment of how the AF4Q alliances were positioned for the future at the end of the program. The AF4Q initiative was the largest privately funded community-based healthcare improvement initiative in the United States to date and was implemented at a time of rapid change in national healthcare policy. The implementation of large-scale, multi-site initiatives is becoming an increasingly common approach for addressing problems in healthcare. The summative evaluation research design for the AF4Q initiative, and the lessons learned from its approach, may be valuable to others tasked with evaluating similarly complex community-based initiatives.

  14. Large Scale Crop Classification in Ukraine using Multi-temporal Landsat-8 Images with Missing Data

    NASA Astrophysics Data System (ADS)

    Kussul, N.; Skakun, S.; Shelestov, A.; Lavreniuk, M. S.

    2014-12-01

    At present, there are no globally available Earth observation (EO) derived products on crop maps. This issue is being addressed within the Sentinel-2 for Agriculture initiative, where a number of test sites (including from JECAM) participate to provide coherent protocols and best practices for various global agriculture systems, and subsequently crop maps from Sentinel-2. One of the problems in dealing with optical images for large territories (more than 10,000 sq. km) is the presence of clouds and shadows that result in missing values in the data sets. In this abstract, a new approach to the classification of multi-temporal optical satellite imagery with missing data due to clouds and shadows is proposed. First, self-organizing Kohonen maps (SOMs) are used to restore missing pixel values in a time series of satellite imagery. SOMs are trained for each spectral band separately using non-missing values. Missing values are restored through a special procedure that substitutes an input sample's missing components with the neuron's weight coefficients. After missing data restoration, a supervised classification is performed for the multi-temporal satellite images. For this, an ensemble of neural networks, in particular multilayer perceptrons (MLPs), is proposed. Ensembling of neural networks is done by the technique of average committee, i.e. the average class probability is calculated over classifiers and the class with the highest average posterior probability is selected for the given input sample. The proposed approach is applied to large-scale crop classification using multi-temporal Landsat-8 images for the JECAM test site in Ukraine [1-2]. It is shown that the ensemble of MLPs provides better performance than a single neural network in terms of overall classification accuracy and kappa coefficient. The obtained classification map is also validated through estimated crop and forest areas and comparison with official statistics. 1. A.Yu. Shelestov et al., "Geospatial information system for agricultural monitoring," Cybernetics Syst. Anal., vol. 49, no. 1, pp. 124-132, 2013. 2. J. Gallego et al., "Efficiency Assessment of Different Approaches to Crop Classification Based on Satellite and Ground Observations," J. Autom. Inform. Sci., vol. 44, no. 5, pp. 67-80, 2012.
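
    A minimal sketch of the classification stage described above is given below: gaps are filled (here by simple mean imputation as a stand-in for the SOM-based restoration), several MLPs are trained, and the "average committee" rule averages their class probabilities and picks the arg-max class. The data, layer sizes and number of committee members are synthetic placeholders, not the values used in the study.

```python
# Sketch of gap filling plus "average committee" MLP ensembling on synthetic
# pixel features. SimpleImputer stands in for the SOM-based restoration; all
# sizes and labels are placeholders.
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 24))            # 500 pixels x 24 band-date features
X[rng.random(X.shape) < 0.1] = np.nan     # simulate cloud/shadow gaps
y = rng.integers(0, 4, size=500)          # four synthetic crop classes

# Stand-in for the SOM-based restoration of missing values.
X_filled = SimpleImputer(strategy="mean").fit_transform(X)

# Train a small committee of MLPs that differ only in their random seed.
committee = [
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=seed).fit(X_filled, y)
    for seed in range(5)
]

# Average committee: mean class probability across members, then arg-max.
avg_proba = np.mean([clf.predict_proba(X_filled) for clf in committee], axis=0)
y_pred = avg_proba.argmax(axis=1)
print("training-set agreement with labels:", (y_pred == y).mean())
```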

  15. Natal and breeding philopatry of female Steller sea lions in southeastern Alaska

    PubMed Central

    2017-01-01

    Information on drivers of dispersal is critical for wildlife conservation but is rare for long-lived marine mammal species with large geographic ranges. We fit multi-state mark-recapture models to resighting data of 369 known-aged Steller sea lion (Eumetopias jubatus) females marked as pups on their natal rookeries in southeastern Alaska from 1994–2005 and monitored from 2001–15. We estimated probabilities of females being first observed parous at their natal site (natal philopatry), and of not moving breeding sites among years (breeding philopatry) at large (> 400 km, all five rookeries in southeastern Alaska) and small (< 4 km, all islands within the largest rookery, Forrester Island Complex, F) spatial scales. At the rookery scale, natal philopatry was moderately high (0.776–0.859) for most rookeries and breeding philopatry was nearly 1, with < 3% of females switching breeding rookeries between years. At more populous islands at F, natal philopatry was 0.500–0.684 versus 0.295–0.437 at less populous islands, and breeding philopatry was 0.919–0.926 versus 0.604–0.858. At both spatial scales, the probability of pupping at a non-natal site increased with population size of, and declined with distance from, the destination site. Natal philopatry of < 1 would increase gene flow, improve population resilience, and promote population recovery after decline in a heterogeneous environment. Very high breeding philopatry suggests that familiarity with neighboring females and knowledge of the breeding site (the topography of pupping sites and nearby foraging locations) may be a critical component to reproductive strategies of sea lions. PMID:28591130

  16. Regional Geochemistry - an Introduction

    NASA Astrophysics Data System (ADS)

    Reimann, Clemens

    2017-04-01

    Building on the pioneering ideas and work of V. Vernadsky (1883-1945) and V.M. Goldschmidt (1888-1947), the Geological Surveys of Europe have more than 60 years of experience with geochemical mapping at a large variety of scales. Surveys using hundreds of samples per km2 for mineral exploration projects, 1 to 4 sites per km2 for mapping the urban environment, 1 site per 2 to 10 km2 in county- or country-wide mapping projects, and 1 site per 1000 to 5000 km2 for mapping at the continental scale have been successfully completed. Sample materials for these surveys include groundwater, surface water, stream sediments, floodplain sediments, different soil horizons (preferably soil O, A, B and C horizons) and plant materials from moss to trees. Surveys combining several sample materials from local to sub-continental scale in multi-media, multi-element geochemical investigations, reflecting the interplay of chemical elements between the different compartments (lithosphere, pedosphere, biosphere and hydrosphere) of the ecosystem, have also been carried out. These surveys provide ample empirical evidence that different geochemical processes become visible at different scales. Not all sample materials are suitable for all scales. A variety of scales in combination with a variety of different sample materials is needed to fully understand geochemical processes in the critical zone. Examples are shown that highlight the importance of a strategy to optimize sampling density and design for the chosen scale already during the planning stages of a project. Anthropogenic element sources are visible at a local scale, and the major impact of geology, mineralogy and climate (as a driving force for weathering) dominates geochemical maps at the continental scale. Interestingly, mineralisation can generate features which are visible at a variety of scales. Some further issues that need attention when carrying out geochemical surveys at a variety of scales are (a) the need for excellent and well-documented analytical quality control, (b) the choice of the elements to be analysed (as many as possible), (c) the required detection limits (the lowest possible) and (d) the choice of extraction (several if feasible).

  17. The 'cube' meta-model for the information system of large health sector organizations--a (platform neutral) mapping tool to integrate information system development with changing business functions and organizational development.

    PubMed

    Balkányi, László

    2002-01-01

    To develop information systems (IS) in the changing environment of the health sector, a simple but thorough model, avoiding the techno-jargon of informatics, might be useful for top management. A platform-neutral, extensible, transparent conceptual model should be established. Limitations of current methods led to a simple but comprehensive mapping in the form of a three-dimensional cube. The three 'orthogonal' views are (a) organization functionality, (b) organizational structures and (c) information technology. Each of the cube sides is described according to its nature. This approach enables any kind of IS component to be defined as a certain point/layer/domain of the cube and also enables management to label all IS components independently of any supplier(s) and/or any specific platform. The model handles changes in organization structure, business functionality and the serving information system independently of each other. Practical application extends to (a) planning complex, new ISs, (b) guiding development of multi-vendor, multi-site ISs, (c) supporting large-scale public procurement procedures and the contracting and implementation phases by establishing a platform-neutral reference, and (d) keeping an exhaustive inventory of an existing large-scale system that handles non-tangible aspects of the IS.

  18. Medical image classification based on multi-scale non-negative sparse coding.

    PubMed

    Zhang, Ruijie; Shen, Jian; Wei, Fushan; Li, Xiong; Sangaiah, Arun Kumar

    2017-11-01

    With the rapid development of modern medical imaging technology, medical image classification has become more and more important in medical diagnosis and clinical practice. Conventional medical image classification algorithms usually neglect the semantic gap problem between low-level features and high-level image semantics, which largely degrades classification performance. To solve this problem, we propose a multi-scale non-negative sparse coding based medical image classification algorithm. First, medical images are decomposed into multiple scale layers, so that diverse visual details can be extracted from different scale layers. Second, for each scale layer, a non-negative sparse coding model with Fisher discriminative analysis is constructed to obtain the discriminative sparse representation of medical images. Then, the obtained multi-scale non-negative sparse coding features are combined to form a multi-scale feature histogram as the final representation of a medical image. Finally, an SVM classifier is used to conduct medical image classification. The experimental results demonstrate that our proposed algorithm can effectively utilize multi-scale and contextual spatial information of medical images, reduce the semantic gap to a large degree and improve medical image classification performance. Copyright © 2017 Elsevier B.V. All rights reserved.
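
    The sketch below illustrates the overall pipeline in a hedged way: images are decomposed into two scale layers, a non-negative coding step (NMF here, as a simple stand-in for non-negative sparse coding with Fisher discriminative analysis) produces per-scale codes, the codes are pooled into normalised histograms, concatenated, and classified with an SVM. All data and dimensions are synthetic.

```python
# Sketch of a multi-scale non-negative coding pipeline on synthetic images.
# NMF is used as a simple stand-in for non-negative sparse coding; per-scale
# codes are pooled into normalised histograms, concatenated and fed to an SVM.
import numpy as np
from scipy.ndimage import zoom
from sklearn.decomposition import NMF
from sklearn.svm import SVC

rng = np.random.default_rng(1)
images = rng.random((60, 32, 32))     # 60 synthetic "medical" images in [0, 1]
labels = rng.integers(0, 2, size=60)  # two synthetic classes

def scale_layers(img):
    """Two scale layers: full resolution and half resolution."""
    return [img, zoom(img, 0.5)]

# Fit one non-negative coder per scale layer.
coders = []
for s in range(2):
    layer_stack = np.array([scale_layers(im)[s].ravel() for im in images])
    coders.append(NMF(n_components=16, max_iter=500, random_state=0).fit(layer_stack))

def image_feature(img):
    """Concatenate normalised per-scale code histograms into one feature vector."""
    feats = []
    for layer, coder in zip(scale_layers(img), coders):
        code = coder.transform(layer.reshape(1, -1))        # non-negative code
        feats.append(code.ravel() / (code.sum() + 1e-9))    # normalised histogram
    return np.concatenate(feats)

X = np.array([image_feature(im) for im in images])
clf = SVC(kernel="linear").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```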

  19. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2010-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer...research, we designed a pilot study utilizing large scale parallel Grid computing harnessing nationwide infrastructure for medical image analysis. Also

  20. Navajo-Hopi Land Commission Renewable Energy Development Project (NREP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas Benally, Deputy Director,

    2012-05-15

    The Navajo Hopi Land Commission Office (NHLCO), a Navajo Nation executive branch agency, has conducted capacity-building, institution-building, outreach and management activities to initiate the development of large-scale renewable energy generating projects - 100 megawatts (MW) or larger - on land in Northwestern New Mexico in the first year of a multi-year program. The Navajo Hopi Land Commission Renewable Energy Development Project (NREP) is a one-year program that will develop and market a strategic business plan; form multi-agency and public-private project partnerships; compile site-specific solar, wind and infrastructure data; and develop and use project communication and marketing tools to support outreach efforts targeting the public, vendors, investors and government audiences.

  1. Multi-scale Holocene Asian monsoon variability deduced from a twin-stalagmite record in southwestern China

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Wang, Yongjin; Cheng, Hai; Edwards, Richard Lawrence; Shen, Chuan-Chou; Liu, Dianbing; Shao, Qingfeng; Deng, Chao; Zhang, Zhenqiu; Wang, Quan

    2016-07-01

    We present two isotopic (δ18O and δ13C) sequences of a twin-stalagmite from Zhuliuping Cave, southwestern China, with 230Th dates from 14.6 to 4.6 ka. The stalagmite δ18O record characterizes orbital- to decadal-scale variability of Asian summer monsoon (ASM) intensity, with the Holocene optimum period (HOP) between 9.8 and 6.8 ka BP, which is reinforced by the co-varying δ13C data. The large multi-decadal-scale amplitude of the cave δ18O indicates its high sensitivity to climate change. Four centennial-scale weak ASM events during the early Holocene are centered at 11.2, 10.8, 9.1 and 8.2 ka. They can be correlated with cold periods in the northern high latitudes, possibly resulting from rapid dynamics of atmospheric circulation associated with North Atlantic cooling. The 8.2 ka event has an amplitude more than two-thirds that of the Younger Dryas (YD), and is significantly stronger than in other cave records in the Asian monsoon region, likely indicating a more severe dry climate condition at the cave site. At the end of the YD event, the δ13C record lags the δ18O record by 300-500 yr, suggesting a multi-centennial slow response of vegetation and soil processes to monsoon enhancement.

  2. Shared worlds: multi-sited ethnography and nursing research.

    PubMed

    Molloy, Luke; Walker, Kim; Lakeman, Richard

    2017-03-22

    Background Ethnography, originally developed for the study of supposedly small-scale societies, is now faced with an increasingly mobile, changing and globalised world. Cultural identities can exist without reference to a specific location and extend beyond regional and national boundaries. It is therefore no longer imperative that the sole object of the ethnographer's practice should be a geographically bounded site. Aim To present a critical methodological review of multi-sited ethnography. Discussion Understanding that it can no longer be taken with any certainty that location alone determines culture, multi-sited ethnography provides a method of contextualising multi-sited social phenomena. The method enables researchers to examine social phenomena that are simultaneously produced in different locations. It has been used to undertake cultural analysis of diverse areas such as organ trafficking, global organisations, technologies and anorexia. Conclusion The authors contend that multi-sited ethnography is particularly suited to nursing research as it provides researchers with an ethnographic method that is more relevant to the interconnected world of health and healthcare services. Implications for practice Multi-sited ethnography provides nurse researchers with an approach to cultural analysis in areas such as the social determinants of health, healthcare services and the effects of health policies across multiple locations.

  3. Intrahemispheric theta rhythm desynchronization impairs working memory.

    PubMed

    Alekseichuk, Ivan; Pabel, Stefanie Corinna; Antal, Andrea; Paulus, Walter

    2017-01-01

    There is a growing interest in large-scale connectivity as one of the crucial factors in working memory. Correlative evidence has revealed the anatomical and electrophysiological players in the working memory network, but understanding of the effective role of their connectivity remains elusive. In this double-blind, placebo-controlled study we aimed to identify the causal role of theta phase connectivity in visual-spatial working memory. The frontoparietal network was over- or de-synchronized in the anterior-posterior direction by multi-electrode, 6 Hz transcranial alternating current stimulation (tACS). A decrease in memory performance and an increase in reaction time were caused by frontoparietal intrahemispheric desynchronization. According to the diffusion drift model, this originated in a lower signal-to-noise ratio, known as the drift rate index, in the memory system. The EEG analysis revealed a corresponding decrease in phase connectivity between prefrontal and parietal areas after tACS-driven desynchronization. The over-synchronization did not result in any changes at either the behavioral or electrophysiological level in healthy participants. Taken together, we demonstrate the feasibility of manipulating multi-site large-scale networks in humans, and the disruptive effect of frontoparietal desynchronization on theta phase connectivity and visual-spatial working memory.

  4. Multi-temporal thermal analyses for submarine groundwater discharge (SGD) detection over large spatial scales in the Mediterranean

    NASA Astrophysics Data System (ADS)

    Hennig, Hanna; Mallast, Ulf; Merz, Ralf

    2015-04-01

    Submarine groundwater discharge (SGD) sites act as important pathways for nutrients and contaminants that deteriorate marine ecosystems. In the Mediterranean it is estimated that 75% of freshwater input is contributed by karst aquifers. Thermal remote sensing can be used for a pre-screening of potential SGD sites in order to optimize field surveys. Although different platforms (ground-, air- and spaceborne) may serve for thermal remote sensing, the most cost-effective are spaceborne platforms (satellites), which likewise cover the largest spatial scale (>100 km per image). Therefore an automated and objective approach that uses thermal satellite images from Landsat 7 and Landsat 8 was used to localize potential SGD sites on a large spatial scale. The method of Mallast et al. (2014), which uses descriptive statistical parameters, specifically the range and standard deviation, was adapted to the Mediterranean Sea. Since the method was developed for the Dead Sea, where satellite images with cloud cover are rare and no sea level change occurs through tidal cycles, it was essential to adapt the method to a region where tidal cycles occur and cloud cover is more frequent. These adaptations include: (1) an automatic and adaptive coastline detection; (2) inclusion and processing of cloud-covered scenes to enlarge the data basis; (3) implementation of tidal data in order to analyze low-tide images, as SGD is enhanced during these phases; and (4) a test of the applicability of Landsat 8 images, which will provide data in the future once Landsat 7 stops working. As previously shown, the range method shows more accurate results compared to the standard deviation. However, the result exclusively depends on two scenes (minimum and maximum) and is largely influenced by outliers. To counteract this drawback we developed a new approach. Since it is assumed that sea surface temperature (SST) is stabilized by groundwater at SGD sites, the slope of a bootstrapped linear model fitted to the sorted SST per pixel would be less steep than the slope of the surrounding area, resulting in less influence from outliers and an equal weighting of all integrated scenes. Both methods could be used to detect SGD sites in the Mediterranean regardless of the discharge characteristics (diffuse or focused); exceptions are sites with deep emergences. Better results could be shown in bays compared to more exposed sites. Since the range of the SST is mostly influenced by the maximum and minimum scenes, the slope approach can be seen as a more representative method using all scenes. References: Mallast, U., Gloaguen, R., Friesen, J., Rödiger, T., Geyer, S., Merz, R., Siebert, C., 2014. How to identify groundwater-caused thermal anomalies in lakes based on multi-temporal satellite data in semi-arid regions. Hydrol. Earth Syst. Sci. 18 (7), 2773-2787.
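
    A minimal sketch of the per-pixel slope idea is shown below: for each pixel the SST values from all scenes are sorted and a line is fitted; pixels with markedly flatter slopes than their surroundings are flagged as candidate SGD locations. The bootstrap resampling of scenes used in the study is omitted for brevity, and the SST stack, threshold and planted "SGD" pixel are synthetic.

```python
# Sketch of the per-pixel "slope" approach: sort each pixel's SST values over
# all scenes, fit a line, and flag pixels with anomalously flat slopes.
# The SST stack is synthetic and the bootstrap step is omitted for brevity.
import numpy as np

rng = np.random.default_rng(2)
n_scenes, ny, nx = 30, 20, 20
sst = 20 + 5 * rng.random((n_scenes, ny, nx))      # synthetic SST stack (deg C)
sst[:, 10, 10] = 21 + 0.5 * rng.random(n_scenes)   # planted SGD-stabilised pixel

ranks = np.arange(n_scenes)
slopes = np.empty((ny, nx))
for i in range(ny):
    for j in range(nx):
        sorted_sst = np.sort(sst[:, i, j])
        slopes[i, j] = np.polyfit(ranks, sorted_sst, 1)[0]  # deg C per rank step

threshold = np.percentile(slopes, 2)                # flag the flattest 2% of pixels
candidates = np.argwhere(slopes <= threshold)
print("candidate SGD pixels (row, col):", candidates.tolist())
```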

  5. Intensity of Territorial Marking Predicts Wolf Reproduction: Implications for Wolf Monitoring

    PubMed Central

    García, Emilio J.

    2014-01-01

    Background The implementation of intensive and complex approaches to monitor large carnivores is resource-demanding and restricted to endangered species, small populations, or small distribution ranges. Wolf monitoring over large spatial scales is difficult, but the management of such contentious species requires regular estimations of abundance to guide decision-makers. The integration of wolf marking behaviour with simple sign counts may offer a cost-effective alternative to monitor the status of wolf populations over large spatial scales. Methodology/Principal Findings We used a multi-sampling approach, based on the collection of visual and scent wolf marks (faeces and ground scratching) and the assessment of wolf reproduction using howling and observation points, to test whether the intensity of marking behaviour around the pup-rearing period (summer-autumn) could reflect wolf reproduction. Between 1994 and 2007 we collected 1,964 wolf marks in a total of 1,877 km surveyed and we searched for the pups' presence (1,497 howling and 307 observation points) in 42 sampling sites with a regular presence of wolves (120 sampling sites/year). The number of wolf marks was ca. 3 times higher in sites with a confirmed presence of pups (20.3 vs. 7.2 marks). We found a significant relationship between the number of wolf marks (mean and maximum relative abundance index) and the probability of wolf reproduction. Conclusions/Significance This research establishes a real-time relationship between the intensity of wolf marking behaviour and wolf reproduction. We suggest a conservative cut-off of 0.60 for the probability of wolf reproduction to monitor wolves on a regional scale, combined with the use of the mean relative abundance index of wolf marks in a given area. We show how the integration of wolf behaviour with simple sampling procedures permits rapid, real-time, and cost-effective assessments of the breeding status of wolf packs, with substantial implications for monitoring wolves at large spatial scales. PMID:24663068
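
    The monitoring rule suggested above can be written down compactly: fit a logistic model relating a relative abundance index of wolf marks to confirmed reproduction, then flag sites whose predicted probability exceeds the conservative 0.60 cut-off. The sketch below uses synthetic data, not the study's field records, and the fitted coefficients are therefore purely illustrative.

```python
# Sketch of the suggested monitoring rule on synthetic data: logistic model of
# reproduction vs. a relative abundance index of marks, with a 0.60 cut-off.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
marks = rng.poisson(lam=12, size=120).astype(float)       # mark index per site
true_p = 1 / (1 + np.exp(-0.3 * (marks - 12)))            # synthetic relationship
reproduction = (rng.random(120) < true_p).astype(int)     # confirmed pups (0/1)

model = LogisticRegression().fit(marks.reshape(-1, 1), reproduction)

new_sites = np.array([[4.0], [15.0], [25.0]])
p_repro = model.predict_proba(new_sites)[:, 1]
for m, p in zip(new_sites.ravel(), p_repro):
    verdict = "likely breeding" if p >= 0.60 else "uncertain"
    print(f"mark index {m:4.1f}: P(reproduction) = {p:.2f} -> {verdict}")
```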

  6. Can a workbook work? Examining whether a practitioner evaluation toolkit can promote instrumental use.

    PubMed

    Campbell, Rebecca; Townsend, Stephanie M; Shaw, Jessica; Karim, Nidal; Markowitz, Jenifer

    2015-10-01

    In large-scale, multi-site contexts, developing and disseminating practitioner-oriented evaluation toolkits has become an increasingly common strategy for building evaluation capacity. Toolkits explain the evaluation process, present evaluation design choices, and offer step-by-step guidance to practitioners. To date, there has been limited research on whether such resources truly foster the successful design, implementation, and use of evaluation findings. In this paper, we describe a multi-site project in which we developed a practitioner evaluation toolkit and then studied the extent to which the toolkit and accompanying technical assistance were effective in promoting successful completion of local-level evaluations and fostering instrumental use of the findings (i.e., whether programs directly used their findings to improve practice; see Patton, 2008). Forensic nurse practitioners from six geographically dispersed service programs completed methodologically rigorous evaluations; furthermore, all six programs used the findings to create programmatic and community-level changes to improve local practice. Implications for evaluation capacity building are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Multi-site pain and working conditions as predictors of work ability in a 4-year follow-up among food industry employees.

    PubMed

    Neupane, S; Virtanen, P; Leino-Arjas, P; Miranda, H; Siukola, A; Nygård, C-H

    2013-03-01

    We investigated the separate and joint effects of multi-site musculoskeletal pain and physical and psychosocial exposures at work on future work ability. A survey was conducted among employees of a Finnish food industry company in 2005 (n = 1201), with a follow-up survey in 2009 (n = 734). Information on self-assessed work ability (current work ability on a scale from 0 to 10; 7 = poor work ability), multi-site musculoskeletal pain (pain in at least two anatomical areas of four), leisure-time physical activity, body mass index and physical and psychosocial exposures was obtained by questionnaire. The separate and joint effects of multi-site pain and work exposures on work ability at follow-up, among subjects with good work ability at baseline, were assessed by logistic regression, and p-values for the interaction were derived. Compared with subjects with neither multi-site pain nor adverse work exposure, those with multi-site pain at baseline had an increased risk of poor work ability at follow-up, allowing for age, gender, occupational class, body mass index and leisure-time physical activity. The separate effects of the work exposures on work ability were somewhat smaller than those of multi-site pain. Multi-site pain had an interactive effect with work environment and awkward postures, such that no association of multi-site pain with poor work ability was seen when the work environment was poor or awkward postures were present. The decline in work ability connected with multi-site pain was not increased by exposure to adverse physical or psychosocial factors at work. © 2012 European Federation of International Association for the Study of Pain Chapters.
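
    The kind of model described above, with separate and joint (interaction) effects, can be expressed as a single logistic regression formula. The sketch below simulates placeholder data and fits poor work ability on multi-site pain, one work exposure, their interaction and two covariates; all variable names and effect sizes are assumptions for illustration only.

```python
# Sketch of a logistic regression with a pain-by-exposure interaction on
# simulated placeholder data; variable names and effects are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 700
df = pd.DataFrame({
    "multisite_pain": rng.integers(0, 2, n),
    "awkward_postures": rng.integers(0, 2, n),
    "age": rng.integers(25, 60, n),
    "female": rng.integers(0, 2, n),
})
# Simulated outcome with main effects and a negative interaction term.
logit_p = (-2.0 + 0.9 * df.multisite_pain + 0.5 * df.awkward_postures
           - 0.6 * df.multisite_pain * df.awkward_postures + 0.02 * (df.age - 40))
df["poor_work_ability"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit(
    "poor_work_ability ~ multisite_pain * awkward_postures + age + female",
    data=df,
).fit(disp=False)
print(np.exp(model.params))  # odds ratios; the ':' term is the joint (interaction) effect
```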

  8. FIREX (Fire Influence on Regional and Global Environments Experiment): Measurements of Nitrogen Containing Volatile Organic Compounds

    NASA Astrophysics Data System (ADS)

    Warneke, C.; Schwarz, J. P.; Yokelson, R. J.; Roberts, J. M.; Koss, A.; Coggon, M.; Yuan, B.; Sekimoto, K.

    2017-12-01

    A combination of a warmer, drier climate with fire-control practices over the last century has produced a situation in which we can expect more frequent fires and fires of larger magnitude in the Western U.S. and Canada. There are urgent needs to better understand the impacts of wildfire and biomass burning (BB) on the atmosphere and climate system, and for policy-relevant science to aid in the process of managing fires. The FIREX (Fire Influence on Regional and Global Environment Experiment) research effort is a multi-year, multi-agency measurement campaign focused on the impact of BB on climate and air quality from western North American wildfires, where research takes place on scales ranging from the flame front to the global atmosphere. FIREX includes methods development and small- and large-scale laboratory and field experiments. FIREX will include: emission factor measurements from typical North American fuels in the fire science laboratory in Missoula, Montana; mobile laboratory deployments; and ground site measurements at sites influenced by BB in several western states. The main FIREX effort will be a large field study with multiple aircraft and mobile labs in the fire season of 2019. One of the main advances of FIREX is the availability of various new measurement techniques that allow for smoke evaluation in unprecedented detail. The first major effort of FIREX was the fire science laboratory measurements in October 2016, where a large number of previously understudied nitrogen-containing volatile organic compounds (NVOCs) were measured using H3O+CIMS and I-CIMS instruments. The contribution of NVOCs to the total reactive nitrogen budget and the relationship to the nitrogen content of the fuel are investigated.

  9. Large-scale road safety programmes in low- and middle-income countries: an opportunity to generate evidence.

    PubMed

    Hyder, Adnan A; Allen, Katharine A; Peters, David H; Chandran, Aruna; Bishai, David

    2013-01-01

    The growing burden of road traffic injuries, which kill over 1.2 million people yearly, falls mostly on low- and middle-income countries (LMICs). Despite this, evidence generation on the effectiveness of road safety interventions in LMIC settings remains scarce. This paper explores a scientific approach for evaluating road safety programmes in LMICs and introduces one such road safety multi-country initiative, the Road Safety in 10 Countries Project (RS-10). By building on existing evaluation frameworks, we develop a scientific approach for evaluating large-scale road safety programmes in LMIC settings. This also draws on '13 lessons' of large-scale programme evaluation: defining the evaluation scope; selecting study sites; maintaining objectivity; developing an impact model; utilising multiple data sources; using multiple analytic techniques; maximising external validity; ensuring an appropriate time frame; the importance of flexibility and a stepwise approach; continuous monitoring; providing feedback to implementers and policy-makers; promoting the uptake of evaluation results; and understanding evaluation costs. The use of relatively new approaches for the evaluation of real-world programmes allows for the production of relevant knowledge. The RS-10 project affords an important opportunity to scientifically test these approaches for a real-world, large-scale road safety evaluation and generate new knowledge for the field of road safety.

  10. Multi-view L2-SVM and its multi-view core vector machine.

    PubMed

    Huang, Chengquan; Chung, Fu-lai; Wang, Shitong

    2016-03-01

    In this paper, a novel L2-SVM-based classifier, Multi-view L2-SVM, is proposed to address multi-view classification tasks. The proposed Multi-view L2-SVM classifier does not have any bias in its objective function and hence has flexibility like ν-SVC in the sense that the number of the yielded support vectors can be controlled by a pre-specified parameter. The proposed Multi-view L2-SVM classifier can make full use of the coherence and the difference of different views by imposing consensus among multiple views to improve the overall classification performance. Besides, based on the generalized core vector machine (GCVM), the proposed Multi-view L2-SVM classifier is extended into its GCVM version, MvCVM, which enables fast training on large-scale multi-view datasets, with asymptotic time complexity linear in the sample size and space complexity independent of the sample size. Our experimental results demonstrated the effectiveness of the proposed Multi-view L2-SVM classifier for small-scale multi-view datasets and the proposed MvCVM classifier for large-scale multi-view datasets. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Multi-approaches analysis reveals local adaptation in the emmer wheat (Triticum dicoccoides) at macro- but not micro-geographical scale.

    PubMed

    Volis, Sergei; Ormanbekova, Danara; Yermekbayev, Kanat; Song, Minshu; Shulgina, Irina

    2015-01-01

    Detecting local adaptation and its spatial scale is one of the most important questions of evolutionary biology. However, recognition of the effect of local selection can be challenging when there is considerable environmental variation across the whole species range. We analyzed patterns of local adaptation in emmer wheat, Triticum dicoccoides, at two spatial scales, small (inter-population distance less than one km) and large (inter-population distance more than 50 km), using several approaches. Plants originating from four distinct habitats at two geographic scales (cold edge, arid edge and two topographically dissimilar core locations) were reciprocally transplanted and their success over time was measured as 1) lifetime fitness in the year of planting, and 2) population growth four years after planting. In addition, we analyzed molecular (SSR) and quantitative trait variation and calculated the QST/FST ratio. No home advantage was detected at the small spatial scale. At the large spatial scale, home advantage was detected for the core population and the cold edge population in the year of introduction via measuring lifetime plant performance. However, superior performance of the arid edge population in its own environment was evident only after several generations, via measuring experimental population growth rate through genotyping with SSRs, which allowed the number of plants and seeds per introduced genotype per site to be counted. These results highlight the importance of multi-generation surveys of population growth rate in local adaptation testing. Despite the predominant self-fertilization of T. dicoccoides and the associated high degree of structuring of genetic variation, the results of the QST/FST comparison were in general agreement with the pattern of local adaptation at the two spatial scales detected by reciprocal transplanting.

  12. The Emergence of Dominant Design(s) in Large Scale Cyber-Infrastructure Systems

    ERIC Educational Resources Information Center

    Diamanti, Eirini Ilana

    2012-01-01

    Cyber-infrastructure systems are integrated large-scale IT systems designed with the goal of transforming scientific practice by enabling multi-disciplinary, cross-institutional collaboration. Their large scale and socio-technical complexity make design decisions for their underlying architecture practically irreversible. Drawing on three…

  13. Capturing remote mixing due to internal tides using multi-scale modeling tool: SOMAR-LES

    NASA Astrophysics Data System (ADS)

    Santilli, Edward; Chalamalla, Vamsi; Scotti, Alberto; Sarkar, Sutanu

    2016-11-01

    Internal tides that are generated during the interaction of an oscillating barotropic tide with the bottom bathymetry dissipate only a fraction of their energy near the generation region. The rest is radiated away in the form of low- and high-mode internal tides. These internal tides dissipate energy at remote locations when they interact with the upper ocean pycnocline, continental slope, and large-scale eddies. Capturing the wide range of length and time scales involved during the life cycle of internal tides is computationally very expensive. A recently developed multi-scale modeling tool called SOMAR-LES combines the adaptive grid refinement features of SOMAR with the turbulence modeling features of a Large Eddy Simulation (LES) to capture multi-scale processes at a reduced computational cost. Numerical simulations of internal tide generation at idealized bottom bathymetries are performed to demonstrate this multi-scale modeling technique. Although each of the remote mixing phenomena has been considered independently in previous studies, this work aims to capture remote mixing processes during the life cycle of an internal tide in more realistic settings, by allowing multi-level (coarse and fine) grids to co-exist and exchange information during the time-stepping process.

  14. Integrating ecosystems measurements from multiple eddy-covariance sites to a simple model of ecosystem process - Are there possibilities for a uniform model calibration?

    NASA Astrophysics Data System (ADS)

    Minunno, Francesco; Peltoniemi, Mikko; Launiainen, Samuli; Mäkelä, Annikki

    2014-05-01

    Biogeochemical models quantify the material and energy flux exchanges between biosphere, atmosphere and soil; however, there is still considerable uncertainty underpinning model structure and parametrization. The increasing availability of data from multiple sources provides useful information for model calibration and validation at different space and time scales. We calibrated a simplified ecosystem process model, PRELES, to data from multiple sites. Our objective was to compare a multi-site calibration with site-specific calibrations, in order to test whether PRELES is a model of general applicability and how well one parameterization can predict ecosystem fluxes. Model calibration and evaluation were carried out by means of the Bayesian method; Bayesian calibration (BC) and Bayesian model comparison (BMC) were used to quantify the uncertainty in model parameters and model structure. Evapotranspiration (ET) and gross primary production (GPP) measurements collected at nine sites in Finland and Sweden were used in the study; half of the dataset was used for model calibrations and half for the comparative analyses. Ten BCs were performed; the model was independently calibrated for each of the nine sites (site-specific calibrations) and a multi-site calibration was achieved using the data from all the sites in one BC. Then nine BMCs were carried out, one for each site, using output from the multi-site and the site-specific versions of PRELES. Similar estimates were obtained for the parameters to which model outputs are most sensitive. Not surprisingly, the joint posterior distribution achieved through the multi-site calibration was characterized by lower uncertainty, because more data were involved in the calibration process. No significant differences were encountered in the predictions of the multi-site and site-specific versions of PRELES, and after BMC we concluded that the model can be reliably used at the regional scale to simulate carbon and water fluxes of boreal forests. Despite being a simple model, PRELES provided good estimates of GPP and ET; only at one site did the multi-site version of PRELES underestimate water fluxes. Our study implies convergence of GPP and water processes in the boreal zone to the extent that their plausible prediction is possible with a simple model using a global parameterization.
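
    The contrast between site-specific and multi-site Bayesian calibration can be illustrated with a deliberately tiny toy model: a one-parameter light-use-efficiency model calibrated with a random-walk Metropolis sampler either per site or on the pooled data. As in the study, the pooled posterior is narrower because more data constrain the parameter. The model, priors and synthetic data below are stand-ins for PRELES and the flux measurements, not the actual calibration set-up.

```python
# Toy multi-site vs. site-specific Bayesian calibration with a random-walk
# Metropolis sampler; the one-parameter model GPP = lue * PAR is a deliberate
# simplification standing in for PRELES and the eddy-covariance data.
import numpy as np

rng = np.random.default_rng(5)

def simulate_site(lue_true, n=200, noise=0.5):
    par = rng.uniform(5, 30, n)                      # incident PAR
    gpp = lue_true * par + rng.normal(0, noise, n)   # noisy GPP observations
    return par, gpp

sites = [simulate_site(lue) for lue in (0.48, 0.50, 0.52)]   # three similar sites

def log_post(lue, data, sigma=0.5):
    if not 0.0 < lue < 2.0:                          # flat prior on (0, 2)
        return -np.inf
    return sum(-0.5 * np.sum((gpp - lue * par) ** 2) / sigma**2 for par, gpp in data)

def metropolis(data, n_iter=5000, step=0.01):
    chain, lue = [], 1.0
    lp = log_post(lue, data)
    for _ in range(n_iter):
        prop = lue + rng.normal(0, step)
        lp_prop = log_post(prop, data)
        if np.log(rng.random()) < lp_prop - lp:      # accept/reject
            lue, lp = prop, lp_prop
        chain.append(lue)
    return np.array(chain[1000:])                    # drop burn-in

for k, site in enumerate(sites):
    print(f"site {k}: posterior sd = {metropolis([site]).std():.4f}")
print(f"multi-site: posterior sd = {metropolis(sites).std():.4f}")
```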

  15. Quantifying the multi-scale response of avifauna to prescribed fire experiments in the southwest United States.

    PubMed

    Dickson, Brett G; Noon, Barry R; Flather, Curtis H; Jentsch, Stephanie; Block, William M

    2009-04-01

    Landscape-scale disturbance events, including ecological restoration and fuel reduction activities, can modify habitat and affect relationships between species and their environment. To reduce the risk of uncharacteristic stand-replacing fires in the southwestern United States, land managers are implementing restoration and fuels treatments (e.g., mechanical thinning, prescribed fire) in progressively larger stands of dry, lower elevation ponderosa pine (Pinus ponderosa) forest. We used a Before-After/Control-Impact experimental design to quantify the multi-scale response of avifauna to large (approximately 250-400 ha) prescribed fire treatments on four sites in Arizona and New Mexico dominated by ponderosa pine. Using distance sampling and an information-theoretic approach, we estimated changes in density for 14 bird species detected before (May-June 2002-2003) and after (May-June 2004-2005) prescribed fire treatments. We observed few site-level differences in pre- and posttreatment density, and no species responded strongly to treatment on all four sites. Point-level spatial models of individual species response to treatment, habitat variables, and fire severity revealed ecological relationships that were more easily interpreted. At this scale, pretreatment forest structure and patch characteristics were important predictors of posttreatment differences in bird species density. Five species (Pygmy Nuthatch [Sitta pygmaea], Western Bluebird [Sialia mexicana], Steller's Jay [Cyanocitta stelleri], American Robin [Turdus migratorius], and Hairy Woodpecker [Picoides villosus]) exhibited a strong treatment response, and two of these species (American Robin and Hairy Woodpecker) could be associated with meaningful fire severity response functions. The avifaunal response patterns that we observed were not always consistent with those reported by more common studies of wildland fire events. Our results suggest that, in the short-term, the distribution and abundance of common members of the breeding bird community in Southwestern ponderosa pine forests appear to be tolerant of low- to moderate-intensity prescribed fire treatments at multiple spatial scales and across multiple geographic locations.

  16. The AMMA-CATCH Gourma observatory site in Mali: Relating climatic variations to changes in vegetation, surface hydrology, fluxes and natural resources

    NASA Astrophysics Data System (ADS)

    Mougin, E.; Hiernaux, P.; Kergoat, L.; Grippa, M.; de Rosnay, P.; Timouk, F.; Le Dantec, V.; Demarez, V.; Lavenu, F.; Arjounin, M.; Lebel, T.; Soumaguel, N.; Ceschia, E.; Mougenot, B.; Baup, F.; Frappart, F.; Frison, P. L.; Gardelle, J.; Gruhier, C.; Jarlan, L.; Mangiarotti, S.; Sanou, B.; Tracol, Y.; Guichard, F.; Trichon, V.; Diarra, L.; Soumaré, A.; Koité, M.; Dembélé, F.; Lloyd, C.; Hanan, N. P.; Damesin, C.; Delon, C.; Serça, D.; Galy-Lacaux, C.; Seghieri, J.; Becerra, S.; Dia, H.; Gangneron, F.; Mazzega, P.

    2009-08-01

    The Gourma site in Mali is one of the three instrumented meso-scale sites deployed in West Africa as part of the African Monsoon Multi-disciplinary Analysis (AMMA) project. Located both in the Sahelian zone sensu stricto and in the Saharo-Sahelian transition zone, the Gourma meso-scale window is the northernmost site of the AMMA-CATCH observatory reached by the West African Monsoon. The experimental strategy includes deployment of a variety of instruments, from local to meso-scale, dedicated to monitoring and documentation of the major variables characterizing the climate forcing, and the spatio-temporal variability of surface processes and state variables such as vegetation mass, leaf area index (LAI), soil moisture and surface fluxes. This paper describes the Gourma site, its associated instrumental network and the research activities that have been carried out since 1984. In the AMMA project, emphasis is put on the relations between climate, vegetation and surface fluxes. However, the Gourma site is also important for the development and validation of satellite products, mainly due to the existence of large and relatively homogeneous surfaces. The social dimension of water resource use and governance is also briefly analyzed, relying on field enquiries and interviews. The climate of the Gourma region is semi-arid, daytime air temperatures are always high, and annual rainfall amounts exhibit strong inter-annual and seasonal variations. Measurement sites organized along a north-south transect reveal sharp gradients in surface albedo, net radiation, vegetation production, and distribution of plant functional types. However, at any point along the gradient, the surface energy budget, soil moisture and vegetation growth contrast between two main types of soil surfaces and hydrologic systems. On the one hand, sandy soils with high water infiltration rates and limited run-off support almost continuous herbaceous vegetation with scattered woody plants. On the other hand, water infiltration is poor on shallow soils, and vegetation is sparse and discontinuous, with more concentrated run-off that ends in pools or low lands within structured endorheic watersheds. The land surface in the Gourma is characterized by rapid response to climate variability and strong intra-seasonal, seasonal and inter-annual variations in vegetation growth, soil moisture and energy balance. Despite the multi-decadal drought, which still persists, ponds and lakes have increased, the grass cover has largely recovered, and there are signs of increased tree cover at least in the low lands.

  17. The 300 Area Integrated Field Research Challenge Quality Assurance Project Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fix, N. J.

    Pacific Northwest National Laboratory and a group of expert collaborators are using the U.S. Department of Energy Hanford Site 300 Area uranium plume within the footprint of the 300-FF-5 groundwater operable unit as a site for an Integrated Field-Scale Subsurface Research Challenge (IFRC). The IFRC is entitled Multi-Scale Mass Transfer Processes Controlling Natural Attenuation and Engineered Remediation: An IFRC Focused on the Hanford Site 300 Area Uranium Plume Project. The theme is investigation of multi-scale mass transfer processes. A series of forefront science questions on mass transfer are posed for research that relate to the effect of spatial heterogeneities; the importance of scale; coupled interactions between biogeochemical, hydrologic, and mass transfer processes; and measurements/approaches needed to characterize and model a mass transfer-dominated system. This Quality Assurance Project Plan provides the quality assurance requirements and processes that will be followed by the 300 Area IFRC Project. This plan is designed to be used exclusively by project staff.

  18. Impact of Scale-Dependent Coupled Processes on Solute Fate and Transport in the Critical Zone: Case Studies Involving Inorganic and Radioactive Contaminants

    NASA Astrophysics Data System (ADS)

    Jardine, P. M.; Gentry, R. W.

    2011-12-01

    Soil, the thin veneer of matter covering the Earth's surface that supports a web of living diversity, is often abused through anthropogenic inputs of toxic waste. This subsurface regime, coupled with life-sustaining surface water and groundwater, is known as the "Critical Zone". The disposal of radioactive and toxic organic and inorganic waste generated by industry and various government agencies has historically involved shallow land burial or the use of surface impoundments in unsaturated soils and sediments. Recently, contaminated sites have been closing rapidly, and many remediation strategies have chosen to leave contaminants in place. As such, contaminants will continue to interact with the geosphere, and investigations of long-term changes and interactive processes are imperative to verify risks. In this presentation we provide a snapshot of subsurface science research from the past 25 years that seeks to provide an improved understanding and predictive capability of multi-scale contaminant fate and transport processes in heterogeneous unsaturated and saturated environments. Investigations focus on coupled hydrological, geochemical, and microbial processes that control reactive contaminant transport and that involve multi-scale fundamental research ranging from the molecular scale (e.g. synchrotrons, electron sources, arrays) to in situ plume interrogation strategies at the macroscopic scale (e.g. geophysics, field biostimulation, coupled processes monitoring). We show how this fundamental research is used to provide multi-process, multi-scale predictive monitoring and modeling tools that can be used at contaminated sites to (1) inform and improve the technical basis for decision making, and (2) assess which sites are amenable to natural attenuation and which would benefit from source zone remedial intervention.

  19. Large-scale delamination of multi-layers transition metal carbides and carbonitrides “MXenes”

    DOE PAGES

    Naguib, Michael; Unocic, Raymond R.; Armstrong, Beth L.; ...

    2015-04-17

    Herein we report on a general approach to delaminate multi-layered MXenes using an organic base to induce swelling that in turn weakens the bonds between the MX layers. Simple agitation or mild sonication of the swollen MXene in water resulted in the large-scale delamination of the MXene layers. The delamination method is demonstrated for vanadium carbide and titanium carbonitride MXenes.

  20. Modeling sugar cane yield with a process-based model from site to continental scale: uncertainties arising from model structure and parameter values

    NASA Astrophysics Data System (ADS)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Huth, N.; Marin, F.; Martiné, J.-F.

    2014-01-01

    Agro-Land Surface Models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-values uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS' phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or to ORCHIDEE (other ecosystem variables including biomass) through distinct Monte-Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte-Carlo sampling method associated with the calculation of Partial Ranked Correlation Coefficients is used to quantify the sensitivity of harvested biomass to input parameters on a continental scale across the large regions of intensive sugar cane cultivation in Australia and Brazil. Ten parameters driving most of the uncertainty in the ORCHIDEE-STICS modeled biomass at the 7 sites are identified by the screening procedure. We found that the 10 most sensitive parameters control phenology (maximum rate of increase of LAI) and root uptake of water and nitrogen (root profile and root growth rate, nitrogen stress threshold) in STICS, and photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), and transpiration and respiration (stomatal conductance, growth and maintenance respiration coefficients) in ORCHIDEE. We find that the optimal carboxylation rate and photosynthesis temperature parameters contribute most to the uncertainty in harvested biomass simulations at site scale. The spatial variation of the ranked correlation between input parameters and modeled biomass at harvest is well explained by rain and temperature drivers, suggesting different climate-mediated sensitivities of modeled sugar cane yield to the model parameters for Australia and Brazil. This study reveals the spatial and temporal patterns of uncertainty variability for a highly parameterized agro-LSM and calls for more systematic uncertainty analyses of such models.

  1. Modeling sugarcane yield with a process-based model from site to continental scale: uncertainties arising from model structure and parameter values

    NASA Astrophysics Data System (ADS)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Caubel, A.; Huth, N.; Marin, F.; Martiné, J.-F.

    2014-06-01

    Agro-land surface models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-values uncertainty in the simulation of sugarcane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or to ORCHIDEE (other ecosystem variables including biomass) through distinct Monte Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte Carlo sampling method associated with the calculation of partial ranked correlation coefficients is used to quantify the sensitivity of harvested biomass to input parameters on a continental scale across the large regions of intensive sugarcane cultivation in Australia and Brazil. The ten parameters driving most of the uncertainty in the ORCHIDEE-STICS modeled biomass at the 7 sites are identified by the screening procedure. We found that the 10 most sensitive parameters control phenology (maximum rate of increase of LAI) and root uptake of water and nitrogen (root profile and root growth rate, nitrogen stress threshold) in STICS, and photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), and transpiration and respiration (stomatal conductance, growth and maintenance respiration coefficients) in ORCHIDEE. We find that the optimal carboxylation rate and photosynthesis temperature parameters contribute most to the uncertainty in harvested biomass simulations at site scale. The spatial variation of the ranked correlation between input parameters and modeled biomass at harvest is well explained by rain and temperature drivers, suggesting different climate-mediated sensitivities of modeled sugarcane yield to the model parameters, for Australia and Brazil. This study reveals the spatial and temporal patterns of uncertainty variability for a highly parameterized agro-LSM and calls for more systematic uncertainty analyses of such models.
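    The Monte Carlo screening combined with partial (ranked) correlation coefficients described in the two records above is a standard global sensitivity recipe. The sketch below is a minimal, illustrative Python version of the PRCC step only: rank-transform the sampled parameters and the output, regress out the other parameters, and correlate the residuals. The four-parameter toy "model" is a hypothetical stand-in, not ORCHIDEE-STICS.

```python
import numpy as np
from scipy.stats import rankdata

def prcc(X, y):
    """Partial rank correlation coefficient of each column of X with output y.

    X : (n_samples, n_params) Monte Carlo parameter sample
    y : (n_samples,) model output (e.g. biomass at harvest)
    """
    R = np.column_stack([rankdata(col) for col in X.T])   # rank-transform the inputs
    ry = rankdata(y)                                       # rank-transform the output
    n, k = R.shape
    out = np.empty(k)
    for j in range(k):
        Z = np.column_stack([np.ones(n), np.delete(R, j, axis=1)])  # the "other" parameters
        res_x = R[:, j] - Z @ np.linalg.lstsq(Z, R[:, j], rcond=None)[0]
        res_y = ry - Z @ np.linalg.lstsq(Z, ry, rcond=None)[0]
        out[j] = np.corrcoef(res_x, res_y)[0, 1]           # correlation of the residuals
    return out

# Hypothetical toy model: the output depends mainly on parameters 0 and 2.
rng = np.random.default_rng(0)
X = rng.uniform(size=(2000, 4))
y = 3.0 * X[:, 0] ** 2 - 2.0 * X[:, 2] + 0.1 * rng.normal(size=2000)
print(np.round(prcc(X, y), 2))   # parameters 0 and 2 dominate the ranking
```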

  2. Databases for multilevel biophysiology research available at Physiome.jp.

    PubMed

    Asai, Yoshiyuki; Abe, Takeshi; Li, Li; Oka, Hideki; Nomura, Taishin; Kitano, Hiroaki

    2015-01-01

    Physiome.jp (http://physiome.jp) is a portal site inaugurated in 2007 to support model-based research in physiome and systems biology. At Physiome.jp, several tools and databases are available to support construction of physiological, multi-hierarchical, large-scale models. There are three databases in Physiome.jp, housing mathematical models, morphological data, and time-series data. In late 2013, the site was fully renovated, and in May 2015, new functions were implemented to provide information infrastructure to support collaborative activities for developing models and performing simulations within the database framework. This article describes updates to the databases implemented since 2013, including cooperation among the three databases, interactive model browsing, user management, version management of models, management of parameter sets, and interoperability with applications.

  3. Multi-thread parallel algorithm for reconstructing 3D large-scale porous structures

    NASA Astrophysics Data System (ADS)

    Ju, Yang; Huang, Yaohui; Zheng, Jiangtao; Qian, Xu; Xie, Heping; Zhao, Xi

    2017-04-01

    Geomaterials inherently contain many discontinuous, multi-scale, geometrically irregular pores, forming a complex porous structure that governs their mechanical and transport properties. The development of an efficient reconstruction method for representing porous structures can significantly contribute toward providing a better understanding of the governing effects of porous structures on the properties of porous materials. In order to improve the efficiency of reconstructing large-scale porous structures, a multi-thread parallel scheme was incorporated into the simulated annealing reconstruction method. In the method, four correlation functions, which include the two-point probability function, the linear-path functions for the pore phase and the solid phase, and the fractal system function for the solid phase, were employed for better reproduction of the complex well-connected porous structures. In addition, a random sphere packing method and a self-developed pre-conditioning method were incorporated to cast the initial reconstructed model and select independent interchanging pairs for parallel multi-thread calculation, respectively. The accuracy of the proposed algorithm was evaluated by examining the similarity between the reconstructed structure and a prototype in terms of their geometrical, topological, and mechanical properties. Comparisons of the reconstruction efficiency of porous models with various scales indicated that the parallel multi-thread scheme significantly shortened the execution time for reconstruction of a large-scale well-connected porous model compared to a sequential single-thread procedure.
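    One of the reconstruction targets listed in the record above is the two-point probability function. The NumPy sketch below estimates that descriptor along one axis of a binary 2-D image; it is only meant to make the descriptor concrete and is not the authors' parallel simulated-annealing code, and the toy microstructure is invented.

```python
import numpy as np

def two_point_probability(phase, max_lag):
    """S2(r): probability that two pixels a lag r apart (along x) both lie in `phase`.

    phase   : 2-D boolean array, True where the pixel belongs to the phase of interest (e.g. pores)
    max_lag : largest lag (in pixels) to evaluate
    """
    s2 = np.empty(max_lag + 1)
    for r in range(max_lag + 1):
        a = phase[:, : phase.shape[1] - r]   # left ends of all pixel pairs at lag r
        b = phase[:, r:]                     # right ends
        s2[r] = np.mean(a & b)               # fraction of pairs with both ends in the phase
    return s2

# Toy binary microstructure: random field thresholded to roughly 30 % porosity.
rng = np.random.default_rng(1)
img = rng.random((256, 256)) < 0.3
s2 = two_point_probability(img, 50)
print(round(s2[0], 3))   # S2(0) equals the porosity, here about 0.3
```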

  4. Quarter Scale RLV Multi-Lobe LH2 Tank Test Program

    NASA Technical Reports Server (NTRS)

    Blum, Celia; Puissegur, Dennis; Tidwell, Zeb; Webber, Carol

    1998-01-01

    Thirty cryogenic pressure cycles have been completed on the Lockheed Martin Michoud Space Systems quarter scale RLV composite multi-lobe liquid hydrogen propellant tank assembly, completing the initial phases of testing and demonstrating technologies key to the success of large scale composite cryogenic tankage for X33, RLV, and other future launch vehicles.

  5. Geospatial Optimization of Siting Large-Scale Solar Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macknick, Jordan; Quinby, Ted; Caulfield, Emmet

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.

  6. Multi-scale temporal and spatial variation in genotypic composition of Cladophora-borne Escherichia coli populations in Lake Michigan.

    PubMed

    Badgley, Brian D; Ferguson, John; Vanden Heuvel, Amy; Kleinheinz, Gregory T; McDermott, Colleen M; Sandrin, Todd R; Kinzelman, Julie; Junion, Emily A; Byappanahalli, Muruleedhara N; Whitman, Richard L; Sadowsky, Michael J

    2011-01-01

    High concentrations of Escherichia coli in mats of Cladophora in the Great Lakes have raised concern over the continued use of this bacterium as an indicator of microbial water quality. Determining the impacts of these environmentally abundant E. coli, however, necessitates a better understanding of their ecology. In this study, the population structure of 4285 Cladophora-borne E. coli isolates, obtained over multiple three day periods from Lake Michigan Cladophora mats in 2007-2009, was examined by using DNA fingerprint analyses. In contrast to previous studies that have been done using isolates from attached Cladophora obtained over large time scales and distances, the extensive sampling done here on free-floating mats over successive days at multiple sites provided a large dataset that allowed for a detailed examination of changes in population structure over a wide range of spatial and temporal scales. While Cladophora-borne E. coli populations were highly diverse and consisted of many unique isolates, multiple clonal groups were also present and accounted for approximately 33% of all isolates examined. Patterns in population structure were also evident. At the broadest scales, E. coli populations showed some temporal clustering when examined by year, but did not show good spatial distinction among sites. E. coli population structure also showed significant patterns at much finer temporal scales. Populations were distinct on an individual mat basis at a given site, and on individual days within a single mat. Results of these studies indicate that Cladophora-borne E. coli populations consist of a mixture of stable, and possibly naturalized, strains that persist during the life of the mat, and more unique, transient strains that can change over rapid time scales. It is clear that further study of microbial processes at fine spatial and temporal scales is needed, and that caution must be taken when interpolating short term microbial dynamics from results obtained from weekly or monthly samples. Copyright © 2010 Elsevier Ltd. All rights reserved.

  7. Multi-scale temporal and spatial variation in genotypic composition of Cladophora-borne Escherichia coli populations in Lake Michigan

    USGS Publications Warehouse

    Badgley, B.D.; Ferguson, J.; Heuvel, A.V.; Kleinheinz, G.T.; McDermott, C.M.; Sandrin, T.R.; Kinzelman, J.; Junion, E.A.; Byappanahalli, M.N.; Whitman, R.L.; Sadowsky, M.J.

    2011-01-01

    High concentrations of Escherichia coli in mats of Cladophora in the Great Lakes have raised concern over the continued use of this bacterium as an indicator of microbial water quality. Determining the impacts of these environmentally abundant E. coli, however, necessitates a better understanding of their ecology. In this study, the population structure of 4285 Cladophora-borne E. coli isolates, obtained over multiple three day periods from Lake Michigan Cladophora mats in 2007-2009, was examined by using DNA fingerprint analyses. In contrast to previous studies that have been done using isolates from attached Cladophora obtained over large time scales and distances, the extensive sampling done here on free-floating mats over successive days at multiple sites provided a large dataset that allowed for a detailed examination of changes in population structure over a wide range of spatial and temporal scales. While Cladophora-borne E. coli populations were highly diverse and consisted of many unique isolates, multiple clonal groups were also present and accounted for approximately 33% of all isolates examined. Patterns in population structure were also evident. At the broadest scales, E. coli populations showed some temporal clustering when examined by year, but did not show good spatial distinction among sites. E. coli population structure also showed significant patterns at much finer temporal scales. Populations were distinct on an individual mat basis at a given site, and on individual days within a single mat. Results of these studies indicate that Cladophora-borne E. coli populations consist of a mixture of stable, and possibly naturalized, strains that persist during the life of the mat, and more unique, transient strains that can change over rapid time scales. It is clear that further study of microbial processes at fine spatial and temporal scales is needed, and that caution must be taken when interpolating short term microbial dynamics from results obtained from weekly or monthly samples.

  8. New multi-scale approach to improve explanation of patterns of contemporary morphodynamics in the badland landscapes of Central Italy: the important Quaternary context

    NASA Astrophysics Data System (ADS)

    Vergari, Francesca; Troiani, Francesco; Della Seta, Marta; Faulkner, Hazel; Schwanghart, Wolfgang; Ciccacci, Sirio; Del Monte, Maurizio; Fredi, Paola

    2016-04-01

    Spatial patterns and magnitudes of short-term erosional processes are often the result of longer-term landscape-wide morphodynamics. Their combined analysis, however, is challenged by different spatial scales, data availability and resolution. Integrating both analyses has thus rarely been done, although it is urgently needed to better understand and manage present-day erosional dynamics and land degradation. In this study we aim at overcoming these shortcomings by exploring a multi-scale approach, based on a nested experimental design that integrates the traditional monitoring of erosion processes at local and short time scales with the longer-term (over the last 10³-10⁵ yr) and basin-to-morphostructure scale analysis of landscape morphodynamics. We investigated the geomorphological behaviour of a Mediterranean active badland site located in the Upper Orcia Valley (Southern Tuscany, Italy). This choice is justified by the availability of decadal erosion monitoring datasets at a range of scales, and the rapidity of development of erosion processes. Based on the analysis of the drainage network and its longitudinal and planform pattern, we tested the hypothesis that this rejuvenating, actively erosional landscape presents hotspots of denudation processes on hillslopes and in the channel network that are largely associated with (a) knickpoints on stream longitudinal profiles, (b) sites of strong connectivity, and (c) sites of strong divide competition with adjacent, aggressive and non-aggressive systems. To illustrate and explore this nested approach, we extracted the channel network and analysed stream longitudinal profiles using the MATLAB-based TopoToolbox program, starting from the 27 × 27 m ASTER GDEM. The stream network morphometric analyses involved computing and mapping χ-values, a transformation that normalizes the longitudinal distance by upslope area and which serves as a proxy of the dynamic state of river basins based on the current geometry of the river network. Finally, we projected on the longitudinal profiles of the Orcia River and some of its main tributaries a full range of geomorphic features which are relevant for the interpretation of the landscape morphoevolution, connectivity and erosion/deposition dynamics: i) competitive divides; ii) sites with different degrees of connectivity within the drainage system; iii) sites experiencing different erosion rates; iv) sites with in-channel depositional features and landslide deposits; v) remnants of relict geomorphic surfaces. The plano-altimetric distribution of such features, compared with the drainage network evolutionary stage, allowed us to better understand the morphodynamics of badland areas and to define future scenarios with a view to better management of hazardous processes.
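    The χ-values mentioned above are obtained by integrating (A0/A)^(m/n) with distance along the stream network (in the study this was done with the MATLAB-based TopoToolbox). The standalone sketch below only illustrates that integral for a single hypothetical profile; the reference area A0 and the concavity index m/n are assumed values, not those used in the study.

```python
import numpy as np

def chi_transform(dist, area, A0=1.0e6, mn=0.45):
    """chi(x) = integral of (A0 / A)**(m/n) dx upstream along one stream profile.

    dist : along-stream distance from the outlet, increasing upstream (m)
    area : upslope drainage area at each node (m^2)
    A0   : reference drainage area (assumption: 1 km^2)
    mn   : concavity index m/n (assumption: 0.45, a commonly used value)
    """
    integrand = (A0 / np.asarray(area, dtype=float)) ** mn
    dx = np.diff(dist, prepend=dist[0])       # spacing between successive nodes
    return np.cumsum(integrand * dx)          # chi = 0 at the outlet, grows upstream

# Hypothetical 10 km profile whose drainage area shrinks upstream.
dist = np.linspace(0.0, 10_000.0, 200)
area = 5.0e7 * np.exp(-dist / 4000.0) + 1.0e6
chi = chi_transform(dist, area)
print(round(chi[-1], 1))   # chi value at the channel head
```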

  9. Challenges and opportunities for large landscape-scale management in a shifting climate: The importance of nested adaptation responses across geospatial and temporal scales

    Treesearch

    Gary M. Tabor; Anne Carlson; Travis Belote

    2014-01-01

    The Yellowstone to Yukon Conservation Initiative (Y2Y) was established over 20 years ago as an experiment in large landscape conservation. Initially, Y2Y emerged as a response to large scale habitat fragmentation by advancing ecological connectivity. It also laid the foundation for large scale multi-stakeholder conservation collaboration with almost 200 non-...

  10. NREL and IBM Improve Solar Forecasting with Big Data | Energy Systems

    Science.gov Websites

    [...] forecasting model using deep-machine-learning technology. The multi-scale, multi-model tool, named Watt-sun, [...] the first standard suite of metrics for this purpose. Validating Watt-sun at multiple sites across the [...]

  11. Detecting vegetation cover change on the summit of Cadillac Mountain using multi-temporal remote sensing datasets: 1979, 2001, and 2007.

    PubMed

    Kim, Min-Kook; Daigle, John J

    2011-09-01

    This study examines the efficacy of management strategies implemented in 2000 to reduce visitor-induced vegetation impact and enhance vegetation recovery at the summit loop trail on Cadillac Mountain at Acadia National Park, Maine. Using single-spectral high-resolution remote sensing datasets captured in 1979, 2001, and 2007, pre-classification change detection analysis techniques were applied to measure fractional vegetation cover changes between the time periods. This popular sub-alpine summit with low-lying vegetation and attractive granite outcroppings experiences dispersed visitor use away from the designated trail, so three pre-defined spatial scales (small, 0-30 m; medium, 0-60 m; and large, 0-90 m) were examined in the vicinity of the summit loop trail with visitor use (experimental site) and at a nearby site in a relatively pristine, undisturbed area (control site) at the same spatial scales. Results reveal significant changes in rates of vegetation impact between 1979 and 2001, extending out to 90 m from the summit loop trail, when no management was in place at the site. No significant differences were detected among the three spatial zones (inner, 0-30 m; middle, 30-60 m; and outer, 60-90 m) at the experimental site, but all three showed significantly higher rates of impact compared to the corresponding zones at the control site (all p < 0.001). In contrast, significant changes in rates of recovery between 2001 and 2007 were observed at the medium and large spatial scales at the experimental site under management as compared to the control site (all p < 0.05). Also during this later period a higher rate of recovery was observed in the outer zone as compared to the inner zone at the experimental site (p < 0.05). The overall study results suggest a trend in the desired direction for the site and visitor management strategies designed to reduce vegetation impact and enhance vegetation recovery at the summit loop trail of Cadillac Mountain since 2000. However, the vegetation recovery has been rather minimal and did not reach the level of cover observed during the 1979 time period. In addition, the advantages and some limitations of using remote sensing technologies to detect vegetation change in this setting, and their potential application to other recreation settings, are discussed.

  12. Detection and Monitoring of Small-Scale Mining Operations in the Eastern Democratic Republic of the Congo (DRC) Using Multi-Temporal, Multi-Sensor Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Walther, Christian; Frei, Michaela

    2017-04-01

    Mining of so-called "conflict minerals" is often associated with small-scale mining activities. The activities discussed here are located in forested areas of the eastern DRC, which are often remote, difficult to access and insecure for traditional geological field inspection. In order to accelerate their CTC (Certified Trading Chain) certification process, remote sensing data are used for detection and monitoring of these small-scale mining operations. This requires a high image acquisition frequency due to mining site relocations and to compensate for year-round high cloud coverage, especially for optical data evaluation. Freely available medium-resolution optical data from Sentinel-2 and Landsat-8, as well as SAR data from Sentinel-1, are used for detecting small mining targets with a minimum size of approximately 0.5 km². The developed method enables robust multi-temporal detection of mining sites and monitoring of their spatio-temporal relocations and of environmental changes. Since qualitatively and quantitatively comparable results are generated, the change detection approach is objective and transparent and may push the certification process forward.

  13. A Unified Multi-scale Model for Cross-Scale Evaluation and Integration of Hydrological and Biogeochemical Processes

    NASA Astrophysics Data System (ADS)

    Liu, C.; Yang, X.; Bailey, V. L.; Bond-Lamberty, B. P.; Hinkle, C.

    2013-12-01

    Mathematical representations of hydrological and biogeochemical processes in soil, plant, aquatic, and atmospheric systems vary with scale. Process-rich models are typically used to describe hydrological and biogeochemical processes at the pore and small scales, while empirical, correlation approaches are often used at the watershed and regional scales. A major challenge for multi-scale modeling is that water flow, biogeochemical processes, and reactive transport are described using different physical laws and/or expressions at the different scales. For example, the flow is governed by the Navier-Stokes equations at the pore scale in soils, by Darcy's law in soil columns and aquifers, and by the Navier-Stokes equations again in open water bodies (ponds, lakes, rivers) and the atmospheric surface layer. This research explores whether the physical laws at the different scales and in different physical domains can be unified to form a unified multi-scale model (UMSM) to systematically investigate the cross-scale, cross-domain behavior of fundamental processes at different scales. This presentation will discuss our research on the concept, mathematical equations, and numerical execution of the UMSM. Three-dimensional, multi-scale hydrological processes at the Disney Wilderness Preservation (DWP) site, Florida, will be used as an example for demonstrating the application of the UMSM. In this research, the UMSM was used to simulate hydrological processes in rooting zones at the pore and small scales, including water migration in soils under saturated and unsaturated conditions, root-induced hydrological redistribution, and the role of rooting-zone biogeochemical properties (e.g., root exudates and microbial mucilage) in water storage and wetting/draining. The small-scale simulation results were used to estimate effective water retention properties in soil columns that were superimposed on the bulk soil water retention properties at the DWP site. The UMSM parameterized from smaller-scale simulations was then used to simulate coupled flow and moisture migration in soils in saturated and unsaturated zones, surface and groundwater exchange, and surface water flow in streams and lakes at the DWP site under dynamic precipitation conditions. Laboratory measurements of soil hydrological and biogeochemical properties are used to parameterize the UMSM at the small scales, and field measurements are used to evaluate the UMSM.
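    For readers unfamiliar with the scale-dependent flow laws named above, their standard textbook forms are reproduced below in generic notation; this is context only, not the UMSM's unified formulation.

```latex
% Pore scale and open water / atmospheric surface layer: incompressible Navier-Stokes
\rho\left(\frac{\partial\mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right)
  = -\nabla p + \mu\,\nabla^{2}\mathbf{u} + \rho\,\mathbf{g},
\qquad \nabla\cdot\mathbf{u} = 0

% Soil column / aquifer scale: Darcy's law (q = Darcy flux, K = hydraulic conductivity, h = head)
\mathbf{q} = -K\,\nabla h
```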

  14. Changes in US extreme sea levels and the role of large scale climate variations

    NASA Astrophysics Data System (ADS)

    Wahl, T.; Chambers, D. P.

    2015-12-01

    We analyze a set of 20 tide gauge records covering the contiguous United States (US) coastline and the period from 1929 to 2013 to identify long-term trends and multi-decadal variations in extreme sea levels (ESLs) relative to changes in mean sea level (MSL). Significant but small long-term trends in ESLs above/below MSL are found at individual sites along most coastline stretches, but are mostly confined to the southeast coast and the winter season when storm surges are primarily driven by extra-tropical cyclones. We identify six regions with broadly coherent and considerable multi-decadal ESL variations unrelated to MSL changes. Using a quasi-non-stationary extreme value analysis approach we show that the latter would have caused variations in design relevant return water levels (RWLs; 50 to 200 year return periods) ranging from ~10 cm to as much as 110 cm across the six regions. To explore the origin of these temporal changes and the role of large-scale climate variability we develop different sets of simple and multiple linear regression models with RWLs as dependent variables and climate indices, or tailored (toward the goal of predicting multi-decadal RWL changes) versions of them, and wind stress curl as independent predictors. The models, after being tested for spatial and temporal stability, explain up to 97% of the observed variability at individual sites and almost 80% on average. Using the model predictions as covariates for the quasi-non-stationary extreme value analysis also significantly reduces the range of change in the 100-year RWLs over time, turning a non-stationary process into a stationary one. This highlights that the models - when used with regional and global climate model output of the predictors - should also be capable of projecting future RWL changes to be used by decision makers for improved flood preparedness and long-term resiliency.
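    A minimal sketch of the multiple linear regression described above (return-water-level variations regressed on climate indices and wind stress curl) is given below. The predictors and data are synthetic placeholders, and the quasi-non-stationary extreme value step of the study is not reproduced.

```python
import numpy as np

def fit_rwl_regression(predictors, rwl):
    """Least-squares fit of return-water-level anomalies on climate predictors.

    predictors : (n_years, n_predictors) array, e.g. climate indices and wind stress curl
    rwl        : (n_years,) return water level anomaly (m)
    Returns (coefficients including intercept, fraction of variance explained).
    """
    X = np.column_stack([np.ones(len(rwl)), predictors])
    beta, *_ = np.linalg.lstsq(X, rwl, rcond=None)
    fitted = X @ beta
    r2 = 1.0 - np.sum((rwl - fitted) ** 2) / np.sum((rwl - rwl.mean()) ** 2)
    return beta, r2

# Synthetic example: 85 years, three hypothetical predictors, the first one dominant.
rng = np.random.default_rng(2)
idx = rng.normal(size=(85, 3))
rwl = 0.4 * idx[:, 0] + 0.1 * idx[:, 2] + 0.05 * rng.normal(size=85)
beta, r2 = fit_rwl_regression(idx, rwl)
print(np.round(beta, 2), round(r2, 2))
```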

  15. Promising strategies for advancement in knowledge of suicide risk factors and prevention.

    PubMed

    Sareen, Jitender; Isaak, Corinne; Katz, Laurence Y; Bolton, James; Enns, Murray W; Stein, Murray B

    2014-09-01

    Suicide is an important public health problem. Although there have been advances in our knowledge of suicide, gaps remain in knowledge about suicide risk factors and prevention. Here, we discuss research pathways that have the potential to rapidly advance knowledge in suicide risk assessment and reduction of suicide deaths over the next decade. We provide a concise overview of the methodologic approaches that have the capacity to rapidly increase knowledge and change practice, which have been successful in past work in psychiatry and other areas of medicine. We suggest three specific pathways to advance knowledge of suicide risk factors and prevention. First, analysis of large-scale epidemiologic surveys and administrative data sets can advance the understanding of suicide. Second, given the low base rate of suicide, there is a need for networks/consortia of investigators in the field of suicide prevention. Such consortia have the capacity to analyze existing epidemiologic data sets, create multi-site cohort studies of high-risk groups to increase knowledge of biological and other risk factors, and create a platform for multi-site clinical trials. Third, partnerships with policymakers and researchers would facilitate careful scientific evaluation of policies and programs aimed at reducing suicide. Suicide intervention policies are often multifaceted, expensive, and rarely evaluated. Using quasi-experimental methods or sophisticated analytic strategies such as propensity score-matching techniques, the impact of large-scale interventions on suicide can be evaluated. Furthermore, such partnerships between policymakers and researchers can lead to the design and support of prospective RCTs (e.g., cluster randomized trials, stepped wedge designs, waiting list designs) in high-risk groups (e.g., people with a history of suicide attempts, multi-axial comorbidity, and offspring of people who have died by suicide). These research pathways could lead to rapid knowledge uptake between communities and have the strong potential to reduce suicide. Copyright © 2014 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  16. Improving predictions of large scale soil carbon dynamics: Integration of fine-scale hydrological and biogeochemical processes, scaling, and benchmarking

    NASA Astrophysics Data System (ADS)

    Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.

    2015-12-01

    Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. Finally, we contend that creating believable soil carbon predictions requires a robust, transparent, and community-available benchmarking framework. I will present an ILAMB evaluation of several of the above-mentioned approaches in ACME, and attempt to motivate community adoption of this evaluation approach.

  17. The Large-scale Structure of the Universe: Probes of Cosmology and Structure Formation

    NASA Astrophysics Data System (ADS)

    Noh, Yookyung

    The usefulness of large-scale structure as a probe of cosmology and structure formation is increasing as large deep surveys in multi-wavelength bands are becoming possible. The observational analysis of large-scale structure, guided by large-volume numerical simulations, is beginning to offer us complementary information and cross-checks of cosmological parameters estimated from the anisotropies in Cosmic Microwave Background (CMB) radiation. Understanding structure formation and evolution and even galaxy formation history is also being aided by observations of different redshift snapshots of the Universe, using various tracers of large-scale structure. This dissertation work covers aspects of large-scale structure from the baryon acoustic oscillation scale to that of large-scale filaments and galaxy clusters. First, I discuss the use of large-scale structure for high-precision cosmology. I investigate the reconstruction of the Baryon Acoustic Oscillation (BAO) peak within the context of Lagrangian perturbation theory, testing its validity in a large suite of cosmological volume N-body simulations. Then I consider galaxy clusters and the large-scale filaments surrounding them in a high-resolution N-body simulation. I investigate the geometrical properties of galaxy cluster neighborhoods, focusing on the filaments connected to clusters. Using mock observations of galaxy clusters, I explore the correlations of scatter in galaxy cluster mass estimates from multi-wavelength observations and different measurement techniques. I also examine the sources of the correlated scatter by considering the intrinsic and environmental properties of clusters.

  18. Cardiac Light-Sheet Fluorescent Microscopy for Multi-Scale and Rapid Imaging of Architecture and Function

    NASA Astrophysics Data System (ADS)

    Fei, Peng; Lee, Juhyun; Packard, René R. Sevag; Sereti, Konstantina-Ioanna; Xu, Hao; Ma, Jianguo; Ding, Yichen; Kang, Hanul; Chen, Harrison; Sung, Kevin; Kulkarni, Rajan; Ardehali, Reza; Kuo, C.-C. Jay; Xu, Xiaolei; Ho, Chih-Ming; Hsiai, Tzung K.

    2016-03-01

    Light Sheet Fluorescence Microscopy (LSFM) enables multi-dimensional and multi-scale imaging via illuminating specimens with a separate thin sheet of laser. It allows rapid plane illumination for reduced photo-damage and superior axial resolution and contrast. We hereby demonstrate cardiac LSFM (c-LSFM) imaging to assess the functional architecture of zebrafish embryos with a retrospective cardiac synchronization algorithm for four-dimensional reconstruction (3-D space + time). By combining our approach with tissue clearing techniques, we reveal the entire cardiac structures and hypertrabeculation of adult zebrafish hearts in response to doxorubicin treatment. By integrating the resolution enhancement technique with c-LSFM to increase the resolving power under a large field-of-view, we demonstrate the use of low power objective to resolve the entire architecture of large-scale neonatal mouse hearts, revealing the helical orientation of individual myocardial fibers. Therefore, our c-LSFM imaging approach provides multi-scale visualization of architecture and function to drive cardiovascular research with translational implication in congenital heart diseases.

  19. Multi-level structure in the large scale distribution of optically luminous galaxies

    NASA Astrophysics Data System (ADS)

    Deng, Xin-fa; Deng, Zu-gan; Liu, Yong-zhen

    1992-04-01

    Fractal dimensions in the large-scale distribution of galaxies have been calculated with the method given by Wen et al. [1]. Samples are taken from the CfA redshift survey [2] in the northern and southern galactic hemispheres, and results from these two regions are compared with each other. There are significant differences between the distributions in the two regions. However, our analyses do show some common features of the distributions in both regions. All subsamples distinctly show a multi-level fractal character. Combining this with results from analyses of IRAS galaxy samples and from redshift surveys in pencil-beam fields [3,4], we suggest that multi-level fractal structure is most likely a general and important characteristic of the large-scale distribution of galaxies. The possible implications of this characteristic are discussed.
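    The fractal dimensions above were computed with the method of Wen et al. [1], which is not reproduced here; the short box-counting sketch below only illustrates the general idea of estimating a fractal dimension from a point distribution, using an invented set of uniform points.

```python
import numpy as np

def box_counting_dimension(points, sizes):
    """Box-counting dimension estimate for 2-D points in the unit square.

    points : (n, 2) array of coordinates in [0, 1)
    sizes  : box edge lengths, e.g. [1/2, 1/4, ..., 1/64]
    """
    counts = []
    for eps in sizes:
        boxes = np.floor(points / eps).astype(int)       # index of the box holding each point
        counts.append(len(np.unique(boxes, axis=0)))     # number of occupied boxes
    # the slope of log N(eps) versus log(1/eps) estimates the dimension
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Uniformly distributed points should give an estimate close to 2.
rng = np.random.default_rng(3)
pts = rng.random((20000, 2))
print(round(box_counting_dimension(pts, [2.0 ** -k for k in range(1, 7)]), 2))
```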

  20. Study of multi-functional precision optical measuring system for large scale equipment

    NASA Astrophysics Data System (ADS)

    Jiang, Wei; Lao, Dabao; Zhou, Weihu; Zhang, Wenying; Jiang, Xingjian; Wang, Yongxi

    2017-10-01

    The effective application of high-performance measurement technology can greatly improve large-scale equipment manufacturing capability. The measurement of geometric parameters such as size, attitude and position therefore requires a measurement system with high precision, multiple functions, portability and other characteristics. However, existing measuring instruments, such as the laser tracker, total station and photogrammetry system, mostly suffer from single functionality, the need to relocate stations and other shortcomings. The laser tracker must work with a cooperative target and can hardly meet measurement requirements in extreme environments. The total station is mainly used for outdoor surveying and mapping and struggles to achieve the accuracy demanded in industrial measurement. A photogrammetry system can achieve multi-point measurement over a wide area, but its measuring range is limited and the station must be moved repeatedly. The paper presents a non-contact opto-electronic measuring instrument that can both work by scanning along a measurement path and measure a cooperative target by tracking. The system is based on several key technologies, such as absolute distance measurement, two-dimensional angle measurement, automatic target recognition and accurate aiming, precision control, assembly of complex mechanical systems and multi-functional 3D visualization software. Among them, the absolute distance measurement module ensures measurement with high accuracy, and the two-dimensional angle measuring module provides precision angle measurement. The system is suitable for non-contact measurement of large-scale equipment; it can ensure the quality and performance of large-scale equipment throughout the manufacturing process and improve the manufacturing capability of large-scale, high-end equipment.

  1. Field Trials of the Multi-Source Approach for Resistivity and Induced Polarization Data Acquisition

    NASA Astrophysics Data System (ADS)

    LaBrecque, D. J.; Morelli, G.; Fischanger, F.; Lamoureux, P.; Brigham, R.

    2013-12-01

    Implementing systems of distributed receivers and transmitters for resistivity and induced polarization data is an almost inevitable result of the availability of wireless data communication modules and GPS modules offering precise timing and instrument locations. Such systems have a number of advantages; for example, they can be deployed around obstacles such as rivers, canyons, or mountains, which would be difficult with traditional 'hard-wired' systems. However, deploying a system of identical, small, battery-powered transceivers, each capable of injecting a known current and measuring the induced potential, has an additional and less obvious advantage: multiple units can inject current simultaneously. The original purpose for using multiple simultaneous current sources (multi-source) was to increase signal levels. In traditional systems, to double the received signal you inject twice the current, which requires you to apply twice the voltage and thus four times the power. Alternatively, one approach to increasing signal levels for large-scale surveys collected using small, battery-powered transceivers is to allow multiple units to transmit in parallel. In theory, using four 400 watt transmitters on separate, parallel dipoles yields roughly the same signal as a single 6400 watt transmitter. Furthermore, implementing the multi-source approach creates the opportunity to apply more complex current flow patterns than simple, parallel dipoles. For a perfect, noise-free system, the multi-source approach adds no new information to a data set that contains a comprehensive set of data collected using single sources. However, for realistic, noisy systems, it appears that multi-source data can substantially impact survey results. In preliminary model studies, the multi-source data produced such startling improvements in subsurface images that even the authors questioned their veracity. Between December of 2012 and July of 2013, we completed multi-source surveys at five sites with depths of exploration ranging from 150 to 450 m. The sites included shallow geothermal sites near Reno, Nevada, Pomarance, Italy, and Volterra, Italy; a mineral exploration site near Timmins, Quebec; and a landslide investigation near the Vajont Dam in northern Italy. These sites provided a series of challenges in survey design and deployment, including some extremely difficult terrain and a broad range of background resistivity and induced polarization values. Despite these challenges, comparison of multi-source results to resistivity and induced polarization data collected with more traditional methods supports the thesis that the multi-source approach is capable of providing substantial improvements in both depth of penetration and resolution over conventional approaches.
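    The power argument in the record above can be made explicit. Assuming a fixed ground/contact resistance R, the received signal is proportional to the injected current while the transmitted power grows as the square of that current, which recovers the four-by-400 W versus 6400 W comparison:

```latex
% Signal \propto I, power P = I^{2} R  \Rightarrow  I \propto \sqrt{P}.
% Four parallel 400 W transmitters inject a total current of 4\,I_{400}; a single transmitter
% matching that current would need
P_{\text{single}} = (4 I_{400})^{2} R = 16\, I_{400}^{2} R = 16 \times 400\ \mathrm{W} = 6400\ \mathrm{W}.
```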

  2. Hydropower and sustainability: resilience and vulnerability in China's powersheds.

    PubMed

    McNally, Amy; Magee, Darrin; Wolf, Aaron T

    2009-07-01

    Large dams represent a whole complex of social, economic and ecological processes, perhaps more than any other large infrastructure project. Today, countries with rapidly developing economies are constructing new dams to provide energy and flood control to growing populations in riparian and distant urban communities. If the system lacks the institutional capacity to absorb these physical and institutional changes, there is potential for conflict, thereby threatening human security. In this paper, we propose analyzing sustainability (political, socioeconomic, and ecological) in terms of resilience versus vulnerability, framed within the spatial abstraction of a powershed. The powershed framework facilitates multi-scalar and transboundary analysis while remaining focused on the questions of resilience and vulnerability relating to hydropower dams. Focusing on examples from China, this paper describes the complex nature of dams using the sustainability and powershed frameworks. We then analyze the roles of institutions in China to understand the relationships between power, human security and the socio-ecological system. To inform the study of conflicts over dams, China is a particularly useful case study because we can examine what happens at the international, national and local scales. The powershed perspective allows us to examine resilience and vulnerability across political boundaries from a dynamic, process-defined analytical scale while remaining focused on a host of questions relating to hydro-development that invoke drivers and impacts on national and sub-national scales. The ability to disaggregate the effects of hydropower dam construction from political boundaries allows for a deeper analysis of resilience and vulnerability. From our analysis we find that reforms in China's hydropower sector since 1996 have been motivated by the need to create stability at the national scale rather than by resilient solutions to China's growing demand for energy and water resource control at the local and international scales. Some measures that improved economic development through the market economy and a combination of dam construction and institutional reform may indeed improve hydro-political resilience at a single scale. However, if China does not address large-scale hydropower construction's potential to create multi-scale geopolitical tensions, it may be vulnerable to conflict - though not necessarily violent conflict - in domestic and international political arenas. We conclude with a look toward a resilient basin institution for the Nu/Salween River, the site of a proposed large-scale hydropower development effort in China and Myanmar.

  3. Child/Adolescent Anxiety Multimodal Study (CAMS): rationale, design, and methods

    PubMed Central

    2010-01-01

    Objective: To present the design, methods, and rationale of the Child/Adolescent Anxiety Multimodal Study (CAMS), a recently completed federally-funded, multi-site, randomized placebo-controlled trial that examined the relative efficacy of cognitive-behavior therapy (CBT), sertraline (SRT), and their combination (COMB) against pill placebo (PBO) for the treatment of separation anxiety disorder (SAD), generalized anxiety disorder (GAD) and social phobia (SoP) in children and adolescents. Methods: Following a brief review of the acute outcomes of the CAMS trial, as well as the psychosocial and pharmacologic treatment literature for pediatric anxiety disorders, the design and methods of the CAMS trial are described. Results: CAMS was a six-year, six-site, randomized controlled trial. Four hundred eighty-eight (N = 488) children and adolescents (ages 7-17 years) with DSM-IV-TR diagnoses of SAD, GAD, or SoP were randomly assigned to one of four treatment conditions: CBT, SRT, COMB, or PBO. Assessments of anxiety symptoms, safety, and functional outcomes, as well as putative mediators and moderators of treatment response were completed in a multi-measure, multi-informant fashion. Manual-based therapies, trained clinicians and independent evaluators were used to ensure treatment and assessment fidelity. A multi-layered administrative structure with representation from all sites facilitated cross-site coordination of the entire trial, study protocols and quality assurance. Conclusions: CAMS offers a model for clinical trials methods applicable to psychosocial and psychopharmacological comparative treatment trials by using state-of-the-art methods and rigorous cross-site quality controls. CAMS also provided a large-scale examination of the relative and combined efficacy and safety of the best evidence-based psychosocial (CBT) and pharmacologic (SSRI) treatments to date for the most commonly occurring pediatric anxiety disorders. Primary and secondary results of CAMS will hold important implications for informing practice-relevant decisions regarding the initial treatment of youth with anxiety disorders. Trial registration: ClinicalTrials.gov NCT00052078. PMID:20051130

  4. The Multi-Scale Mass Transfer Processes Controlling Natural Attenuation and Engineered Remediation: An IFC Focused on Hanford’s 300 Area Uranium Plume Quality Assurance Project Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fix, N. J.

    The purpose of the project is to conduct research at an Integrated Field-Scale Research Challenge Site in the Hanford Site 300 Area, CERCLA OU 300-FF-5 (Figure 1), to investigate multi-scale mass transfer processes associated with a subsurface uranium plume impacting both the vadose zone and groundwater. The project will investigate a series of science questions posed for research related to the effect of spatial heterogeneities, the importance of scale, coupled interactions between biogeochemical, hydrologic, and mass transfer processes, and measurements/approaches needed to characterize a mass-transfer dominated system. The research will be conducted by evaluating three (3) different hypotheses focused on multi-scale mass transfer processes in the vadose zone and groundwater, their influence on field-scale U(VI) biogeochemistry and transport, and their implications to natural systems and remediation. The project also includes goals to 1) provide relevant materials and field experimental opportunities for other ERSD researchers and 2) generate a lasting, accessible, and high-quality field experimental database that can be used by the scientific community for testing and validation of new conceptual and numerical models of subsurface reactive transport.

  5. Multi-Scale Models for the Scale Interaction of Organized Tropical Convection

    NASA Astrophysics Data System (ADS)

    Yang, Qiu

    Assessing the upscale impact of organized tropical convection from small spatial and temporal scales is a research imperative, not only for having a better understanding of the multi-scale structures of dynamical and convective fields in the tropics, but also for eventually helping in the design of new parameterization strategies to improve the next-generation global climate models. Here self-consistent multi-scale models are derived systematically by following the multi-scale asymptotic methods and used to describe the hierarchical structures of tropical atmospheric flows. The advantages of using these multi-scale models lie in isolating the essential components of multi-scale interaction and providing assessment of the upscale impact of the small-scale fluctuations onto the large-scale mean flow through eddy flux divergences of momentum and temperature in a transparent fashion. Specifically, this thesis includes three research projects about multi-scale interaction of organized tropical convection, involving tropical flows at different scaling regimes and utilizing different multi-scale models correspondingly. Inspired by the observed variability of tropical convection on multiple temporal scales, including daily and intraseasonal time scales, the goal of the first project is to assess the intraseasonal impact of the diurnal cycle on the planetary-scale circulation such as the Hadley cell. As an extension of the first project, the goal of the second project is to assess the intraseasonal impact of the diurnal cycle over the Maritime Continent on the Madden-Julian Oscillation. In the third project, the goals are to simulate the baroclinic aspects of the ITCZ breakdown and assess its upscale impact on the planetary-scale circulation over the eastern Pacific. These simple multi-scale models should be useful to understand the scale interaction of organized tropical convection and help improve the parameterization of unresolved processes in global climate models.
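    As context for the "eddy flux divergences of momentum and temperature" mentioned above, the generic form such upscale-impact terms take after averaging over the small, fast scales is shown below; the notation is illustrative and does not reproduce the specific asymptotic models of the thesis.

```latex
% Averaging (overbar) over the small scales leaves source terms in the large-scale mean equations:
\frac{\partial\bar{u}}{\partial t} + \cdots = -\,\frac{\partial}{\partial z}\,\overline{w'u'}
  \qquad \text{(eddy momentum flux divergence)}
\frac{\partial\bar{\theta}}{\partial t} + \cdots = -\,\frac{\partial}{\partial z}\,\overline{w'\theta'}
  \qquad \text{(eddy temperature flux divergence)}
```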

  6. Streamflow prediction using multi-site rainfall obtained from hydroclimatic teleconnection

    NASA Astrophysics Data System (ADS)

    Kashid, S. S.; Ghosh, Subimal; Maity, Rajib

    2010-12-01

    Summary: Simultaneous variations in weather and climate over widely separated regions are commonly known as "hydroclimatic teleconnections". Rainfall and runoff patterns over continents are found to be significantly teleconnected with large-scale circulation patterns through such hydroclimatic teleconnections. Though such teleconnections exist in nature, it is very difficult to model them due to their inherent complexity. Statistical techniques and Artificial Intelligence (AI) tools have gained popularity in modeling hydroclimatic teleconnections because of their ability to capture the complicated relationship between the predictors (e.g. sea surface temperatures) and the predictand (e.g., rainfall). Genetic Programming is one such AI tool, capable of capturing nonlinear relationships between predictor and predictand due to its flexible functional structure. In the present study, gridded multi-site weekly rainfall is predicted from El Niño Southern Oscillation (ENSO) indices, Equatorial Indian Ocean Oscillation (EQUINOO) indices, Outgoing Longwave Radiation (OLR) and lagged rainfall at grid points over the catchment, using Genetic Programming. The predicted rainfall is further used in a Genetic Programming model to predict streamflows. The model is applied for weekly forecasting of streamflow in the Mahanadi River, India, and satisfactory performance is observed.
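    A minimal sketch of the two-stage prediction chain described above (large-scale indices and lagged rainfall to multi-site rainfall, then rainfall to streamflow) is given below. A scikit-learn gradient-boosting regressor is used purely as a stand-in for the Genetic Programming models of the study, and all data and variable names are synthetic.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(4)
n_weeks = 600

# Hypothetical weekly predictors: three climate indices (e.g. ENSO, EQUINOO, OLR) and lagged rainfall.
climate = rng.normal(size=(n_weeks, 3))
lag_rain = rng.gamma(2.0, 10.0, size=n_weeks)
rainfall = 20 + 8 * climate[:, 0] - 5 * climate[:, 1] + 0.3 * lag_rain + rng.normal(0, 3, n_weeks)
streamflow = 5 + 2.5 * rainfall + 0.8 * lag_rain + rng.normal(0, 5, n_weeks)

# Stage 1: predict rainfall from the large-scale predictors (a GP model in the paper).
X_rain = np.column_stack([climate, lag_rain])
stage1 = GradientBoostingRegressor().fit(X_rain[:500], rainfall[:500])
rain_hat = stage1.predict(X_rain)

# Stage 2: predict weekly streamflow from the predicted rainfall plus lagged rainfall.
X_flow = np.column_stack([rain_hat, lag_rain])
stage2 = GradientBoostingRegressor().fit(X_flow[:500], streamflow[:500])
flow_hat = stage2.predict(X_flow[500:])
print(round(np.corrcoef(flow_hat, streamflow[500:])[0, 1], 2))   # skill on the held-out weeks
```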

  7. Multi-scale Visualization of Molecular Architecture Using Real-Time Ambient Occlusion in Sculptor.

    PubMed

    Wahle, Manuel; Wriggers, Willy

    2015-10-01

    The modeling of large biomolecular assemblies relies on an efficient rendering of their hierarchical architecture across a wide range of spatial levels of detail. We describe a paradigm shift currently under way in computer graphics towards the use of more realistic global illumination models, and we apply the so-called ambient occlusion approach in our open-source multi-scale modeling program, Sculptor. While there are many other, higher-quality global illumination approaches, going all the way up to full GPU-accelerated ray tracing, they do not provide size-specificity of the features they shade. Ambient occlusion is an aspect of global lighting that offers great visual benefits and powerful user customization. By estimating how other molecular shape features affect the reception of light at a surface point, it effectively simulates indirect shadowing. This effect occurs between molecular surfaces that are close to each other, or in pockets such as protein or ligand binding sites. By adding ambient occlusion, large macromolecular systems look much more natural, and the perception of characteristic surface features is strongly enhanced. In this work, we present a real-time implementation of screen-space ambient occlusion that delivers realistic cues about tunable spatial-scale characteristics of macromolecular architecture. Heretofore, the visualization of large biomolecular systems, comprising e.g. hundreds of thousands of atoms or megadalton-size electron microscopy maps, did not take into account the length scales of interest or the spatial resolution of the data. Our approach has been uniquely customized with shading that is tuned for pockets and cavities of a user-defined size, making it useful for visualizing molecular features at multiple scales of interest. This is a feature that none of the conventional ambient occlusion approaches provide. Actual Sculptor screen shots illustrate how our implementation supports the size-dependent rendering of molecular surface features.
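
    A minimal, illustrative version of screen-space ambient occlusion on a depth buffer is sketched below in Python/numpy. It is not the Sculptor GPU implementation; the sampling radius and thresholds are assumptions standing in for the user-tunable pocket size described above.

    ```python
    import numpy as np

    def ssao(depth, radius=4, strength=1.0, n_samples=16, rng=None):
        """Toy screen-space ambient occlusion on a 2-D depth buffer.

        For each pixel, randomly sample nearby pixels; neighbours that are
        closer to the viewer by a small margin count as occluders. The
        returned factor in [0, 1] darkens pockets and cavities.
        """
        rng = np.random.default_rng(rng)
        occlusion = np.zeros_like(depth, dtype=float)
        for _ in range(n_samples):
            # Random screen-space offset within the sampling radius (sets the feature size)
            dy, dx = rng.integers(-radius, radius + 1, size=2)
            shifted = np.roll(np.roll(depth, dy, axis=0), dx, axis=1)
            # A neighbour slightly in front of the pixel occludes it
            occluder = (depth - shifted) > 1e-3
            # Range check: ignore neighbours that are far away in depth
            in_range = np.abs(depth - shifted) < radius * 1e-2
            occlusion += (occluder & in_range).astype(float)
        return 1.0 - strength * occlusion / n_samples
    ```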

  8. Estimation of future flow regime for a spatially varied Himalayan watershed using improved multi-site calibration method of SWAT model.

    NASA Astrophysics Data System (ADS)

    Pradhanang, S. M.; Hasan, M. A.; Booth, P.; Fallatah, O.

    2016-12-01

    The monsoon- and snow-driven regime in the Himalayan region has received increasing attention in the recent decade regarding the effects of climate change on hydrologic regimes. Modeling streamflow in such a spatially varied catchment requires proper calibration and validation of the hydrologic model. Because calibration and validation are time consuming and computationally intensive, an effective regionalized approach using multi-site information is crucial for flow estimation, especially at the daily scale. In this study, we adopted a multi-site approach to calibration and validation of the Soil and Water Assessment Tool (SWAT) model for the Karnali river catchment, which is characterized as the catchment most vulnerable to climate change in the Himalayan region. APHRODITE's (Asian Precipitation - Highly-Resolved Observational Data Integration Towards Evaluation) daily gridded precipitation data, one of the more accurate and reliable weather datasets over this region, were utilized in this study. The model was evaluated over the entire catchment, divided into four sub-catchments, utilizing discharge records from 1963 to 2010. In previous studies, multi-site calibration used only a single set of calibration parameters for all sub-catchments of a large watershed. In this study, we introduce a technique that incorporates a different set of calibration parameters for each sub-basin, which ultimately improves the simulated flow for the whole watershed. Results show that the model calibrated with the new method can capture an almost identical pattern of flow over the region. The predicted daily streamflow matched the observed values, with a Nash-Sutcliffe coefficient of 0.73 during the calibration period and 0.71 during the validation period. The method performed better than existing multi-site calibration methods. To assess the influence of continued climate change on hydrologic processes, we modified the weather inputs for the model using precipitation and temperature changes for two Representative Concentration Pathway (RCP) scenarios, RCP 4.5 and RCP 8.5. Climate simulations for the RCP scenarios were conducted for 1981-2100, where 1981-2005 was considered the baseline and 2006-2100 the future projection. The results show that the probability of flooding will increase in future years due to increased flow under both scenarios.
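
    A sketch of the per-sub-basin calibration idea is given below, assuming a hypothetical run_model wrapper around a SWAT run for a single sub-basin; the optimizer and parameter bounds are placeholders, but the Nash-Sutcliffe objective matches the statistic reported above.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    def nse(sim, obs):
        """Nash-Sutcliffe efficiency (1 is perfect, values below 0 are worse than the mean)."""
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def calibrate_subbasin(run_model, obs, bounds):
        """Find the parameter set maximising NSE for one sub-basin.

        run_model : callable(params) -> simulated daily flow (hypothetical wrapper
                    around a SWAT run for this sub-basin)
        obs       : observed daily flow at the sub-basin outlet
        bounds    : list of (low, high) tuples, one per calibration parameter
        """
        result = differential_evolution(lambda p: -nse(run_model(p), obs), bounds)
        return result.x, -result.fun

    # One independent parameter set per sub-basin (the key difference from a
    # single shared parameter set in earlier multi-site calibrations):
    # params = {sb: calibrate_subbasin(make_runner(sb), flows[sb], bounds) for sb in subbasins}
    ```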

  9. Scale Interactions in the Tropics from a Simple Multi-Cloud Model

    NASA Astrophysics Data System (ADS)

    Niu, X.; Biello, J. A.

    2017-12-01

    Our lack of a complete understanding of the interaction between moist convection and equatorial waves remains an impediment to the numerical simulation of large-scale organization, such as the Madden-Julian Oscillation (MJO). The aim of this project is to understand interactions across spatial scales in the tropics within a simplified framework for scale interactions, while using a similarly simplified framework to describe the basic features of moist convection. Using multiple asymptotic scales, Biello and Majda [1] derived a multi-scale model of moist tropical dynamics (the IMMD [1]), which separates three regimes: the planetary-scale climatology, the synoptic-scale waves, and the planetary-scale anomalies. The scales and strength of the observed MJO would place it in the regime of planetary-scale anomalies, which are themselves forced by nonlinear upscale fluxes from the synoptic-scale waves. In order to close this model and determine whether it provides a self-consistent theory of the MJO, a model for diabatic heating due to moist convection must be implemented along with the IMMD. The multi-cloud parameterization is a model proposed by Khouider and Majda [2] to describe the three basic cloud types (congestus, deep and stratiform) that are most responsible for tropical diabatic heating. We implement a simplified version of the multi-cloud model based on results derived from large-eddy simulations of convection [3]. We present this simplified multi-cloud model and show results of numerical experiments beginning with a variety of convective forcing states. Preliminary results on upscale fluxes, from synoptic scales to planetary-scale anomalies, will be presented. [1] Biello J A, Majda A J. Intraseasonal multi-scale moist dynamics of the tropical atmosphere. Communications in Mathematical Sciences, 2010, 8(2): 519-540. [2] Khouider B, Majda A J. A simple multicloud parameterization for convectively coupled tropical waves. Part I: Linear analysis. Journal of the Atmospheric Sciences, 2006, 63(4): 1308-1323. [3] Dorrestijn J, Crommelin D T, Biello J A, et al. A data-driven multi-cloud model for stochastic parametrization of deep convection. Philosophical Transactions of the Royal Society of London A: Mathematical, Physical and Engineering Sciences, 2013, 371(1991): 20120374.

  10. Multiscale characterization and mechanical modeling of an Al-Zn-Mg electron beam weld

    NASA Astrophysics Data System (ADS)

    Puydt, Quentin; Flouriot, Sylvain; Ringeval, Sylvain; Parry, Guillaume; De Geuser, Frédéric; Deschamps, Alexis

    Welding of precipitation hardening alloys results in multi-scale microstructural heterogeneities, from the hardening nano-scale precipitates to the micron-scale solidification structures and to the component geometry. This heterogeneity results in a complex mechanical response, with gradients in strength, stress triaxiality and damage initiation sites.

  11. Multi-scale signed envelope inversion

    NASA Astrophysics Data System (ADS)

    Chen, Guo-Xin; Wu, Ru-Shan; Wang, Yu-Qing; Chen, Sheng-Chang

    2018-06-01

    Envelope inversion based on the modulation signal model was proposed to reconstruct large-scale structures of underground media. To overcome the shortcomings of conventional envelope inversion, multi-scale envelope inversion was proposed, using a new envelope Fréchet derivative and a multi-scale inversion strategy to invert strong-contrast models. In multi-scale envelope inversion, amplitude demodulation is used to extract the low-frequency information from the envelope data. However, using the amplitude demodulation method alone causes the loss of wavefield polarity information, thus increasing the likelihood that the inversion yields multiple solutions. In this paper we propose a new demodulation method that retains both the amplitude and the polarity information of the envelope data. We then introduce this demodulation method into multi-scale envelope inversion and propose a new misfit functional: multi-scale signed envelope inversion. In numerical tests, we applied the new inversion method to a salt-layer model and the SEG/EAGE 2-D Salt model using a low-cut source (frequency components below 4 Hz were truncated). The results of the numerical tests demonstrate the effectiveness of this method.
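
    One plausible form of a polarity-preserving ("signed") envelope is sketched below with scipy's Hilbert transform; the exact demodulation operator used by the authors may differ, so this is only an illustration of the idea.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def signed_envelope(trace):
        """Envelope that keeps wavefield polarity (illustrative form only).

        The magnitude of the analytic signal gives the conventional envelope;
        multiplying by the sign of the trace re-attaches polarity so that the
        demodulated data retain phase-reversal information.
        """
        env = np.abs(hilbert(trace))      # conventional (unsigned) envelope
        return np.sign(trace) * env

    # Low-frequency content for a multi-scale inversion could then be obtained by
    # low-pass filtering signed_envelope(d_obs) - signed_envelope(d_syn).
    ```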

  12. Characterizing multi-scale self-similar behavior and non-statistical properties of fluctuations in financial time series

    NASA Astrophysics Data System (ADS)

    Ghosh, Sayantan; Manimaran, P.; Panigrahi, Prasanta K.

    2011-11-01

    We make use of the wavelet transform to study the multi-scale, self-similar behavior, and deviations thereof, in the stock prices of large companies belonging to different economic sectors. The stock market returns exhibit multi-fractal characteristics, with some of the companies showing deviations at small and large scales. The fact that wavelets belonging to the Daubechies (Db) basis enable one to isolate local polynomial trends of different degrees plays the key role in isolating fluctuations at different scales. One of the primary motivations of this work is to study the emergence of the k^-3 behavior [X. Gabaix, P. Gopikrishnan, V. Plerou, H. Stanley, A theory of power law distributions in financial market fluctuations, Nature 423 (2003) 267-270] of the fluctuations starting from high-frequency fluctuations. We use the Db4 and Db6 basis sets to isolate, respectively, local linear and quadratic trends at different scales in order to study the statistical characteristics of these financial time series. The fluctuations reveal fat-tailed non-Gaussian behavior and unstable periodic modulations at finer scales, from which the characteristic k^-3 power-law behavior emerges at sufficiently large scales. We further identify stable periodic behavior through the continuous Morlet wavelet.
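
    The trend/fluctuation separation with Daubechies wavelets can be sketched as follows using PyWavelets; the decomposition level and the use of log-returns are assumptions for illustration, not the exact procedure of the paper.

    ```python
    import numpy as np
    import pywt

    def wavelet_fluctuations(prices, wavelet="db4", level=6):
        """Separate local polynomial trends from fluctuations with a Daubechies basis.

        Db4 removes local linear trends and Db6 local quadratic trends; the detail
        coefficients at each level carry the scale-by-scale fluctuations whose
        statistics (fat tails, scaling) can then be examined.
        """
        logret = np.diff(np.log(prices))
        coeffs = pywt.wavedec(logret, wavelet, level=level)
        approx, details = coeffs[0], coeffs[1:]

        # Reconstruct the smooth trend from the approximation coefficients only
        trend = pywt.waverec([approx] + [np.zeros_like(d) for d in details], wavelet)

        # Fluctuations = signal minus local trend, truncated to matching length
        fluct = logret - trend[: len(logret)]
        return trend[: len(logret)], fluct, details
    ```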

  13. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2009-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer ... pilot study to utilize large-scale parallel Grid computing to harness the nationwide cluster infrastructure for optimization of medical image ... analysis parameters. Additionally, we investigated the use of cutting-edge data analysis/mining techniques as applied to Ultrasound, FFDM, and DCE-MRI Breast

  14. By land, sea and air (and space): Verifying UK methane emissions at a range of scales by integrating multiple measurement platforms

    NASA Astrophysics Data System (ADS)

    Rigby, M. L.; Lunt, M. F.; Ganesan, A.

    2015-12-01

    The Greenhouse gAs Uk and Global Emissions (GAUGE) programme and the Department of Energy and Climate Change (DECC) network aim to quantify the magnitude and uncertainty of UK greenhouse gas (GHG) emissions at a resolution and accuracy higher than has previously been possible. The ongoing DECC tall-tower network consists of three sites, plus an eastern background site in Ireland. The GAUGE project adds instruments at two additional tall-tower sites, a high-density measurement network over agricultural land in eastern England, a ferry that performs near-daily transects along the east coast of the UK, and a research aircraft deployed on a campaign basis. Together with data collected by the GOSAT satellite, these data make up the GAUGE/DECC GHG measurement network that is being used to quantify UK GHG fluxes. As part of the wider GAUGE modelling efforts, we have derived methane flux estimates for the UK and northwest Europe using the UK Met Office NAME atmospheric transport model and a novel hierarchical Bayesian "trans-dimensional" inversion framework. We will show that our estimated fluxes for the UK as a whole are largely consistent between individual measurement platforms, albeit with very different uncertainties. Our novel inversion approach uses the data to objectively determine the extent to which we can further refine our national estimates to the level of large urban areas, major hotspots or larger sub-national regions. In this talk, we will outline some initial findings of the GAUGE project, tackling questions such as: At what spatial scale can we effectively derive greenhouse gas fluxes with a dense, multi-platform national network? Can we resolve individual metropolitan areas or major hotspots? What is the relative impact of individual stations, platforms and network configurations on flux estimates for a country the size of the UK? How can we effectively use multi-platform observations to cross-validate flux estimates and determine likely errors in model transport?
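
    As a hedged illustration of the flux-estimation step, the sketch below performs a fixed-dimension Gaussian (analytic) Bayesian inversion; the hierarchical, trans-dimensional sampler used in GAUGE is considerably more elaborate, and all matrix names here are hypothetical.

    ```python
    import numpy as np

    def gaussian_flux_inversion(H, y, x_prior, P, R):
        """Analytic Bayesian update for a linear atmospheric inversion.

        H       : (n_obs, n_flux) transport (sensitivity) matrix, e.g. from a dispersion model
        y       : (n_obs,) observed mole-fraction enhancements
        x_prior : (n_flux,) prior flux estimate
        P, R    : prior-flux and observation error covariance matrices

        Returns the posterior mean flux and posterior covariance.
        """
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman-type gain
        x_post = x_prior + K @ (y - H @ x_prior)
        P_post = P - K @ H @ P
        return x_post, P_post
    ```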

  15. Multi-site precipitation downscaling using a stochastic weather generator

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Chen, Hua; Guo, Shenglian

    2018-03-01

    Statistical downscaling is an efficient way to resolve the spatiotemporal mismatch between climate model outputs and the data requirements of hydrological models. However, the most commonly used downscaling methods only produce climate change scenarios for a specific site or a watershed average, which cannot drive distributed hydrological models to study the spatial variability of climate change impacts. By coupling a single-site downscaling method and a multi-site weather generator, this study proposes a multi-site downscaling approach for hydrological climate change impact studies. Multi-site downscaling is done in two stages. The first stage spatially downscales climate model-simulated monthly precipitation from the grid scale to a specific site using a quantile mapping method, and the second stage temporally disaggregates monthly precipitation to daily values by adjusting the parameters of a multi-site weather generator. The inter-station correlation is specifically considered using a distribution-free approach along with an iterative algorithm. The performance of the downscaling approach is illustrated using a 10-station watershed as an example. The precipitation time series derived from the National Centers for Environmental Prediction (NCEP) reanalysis dataset is used as the climate model simulation. The precipitation time series of each station is divided into the 30 odd-numbered years for calibration and the 29 even-numbered years for validation. Several metrics, including the frequencies of wet and dry spells and statistics of the daily, monthly and annual precipitation, are used as criteria to evaluate the multi-site downscaling approach. The results show that the frequencies of wet and dry spells are well reproduced for all stations. In addition, the multi-site downscaling approach performs well with respect to reproducing precipitation statistics, especially at monthly and annual timescales. The remaining biases mainly result from the non-stationarity of the NCEP precipitation. Overall, the proposed approach is efficient for generating multi-site climate change scenarios that can be used to investigate the spatial variability of climate change impacts on hydrology.
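
    The first (spatial) stage, quantile mapping of monthly precipitation, can be sketched as follows; the function and variable names are hypothetical and the empirical-quantile implementation is one common variant, not necessarily the one used in the study.

    ```python
    import numpy as np

    def quantile_map(model_hist, obs_hist, model_future):
        """Empirical quantile mapping of monthly precipitation: map each future model
        value through the observed distribution at the quantile it occupies in the
        historical model run."""
        q = np.linspace(0.01, 0.99, 99)
        model_q = np.quantile(model_hist, q)
        obs_q = np.quantile(obs_hist, q)
        # Interpolate future model values onto the observed quantile curve
        return np.interp(model_future, model_q, obs_q)

    # Stage two (temporal disaggregation) would then adjust the monthly parameters of a
    # multi-site stochastic weather generator so that simulated daily totals are consistent
    # with the downscaled monthly values while preserving inter-station correlation.
    ```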

  16. HD-MTL: Hierarchical Deep Multi-Task Learning for Large-Scale Visual Recognition.

    PubMed

    Fan, Jianping; Zhao, Tianyi; Kuang, Zhenzhong; Zheng, Yu; Zhang, Ji; Yu, Jun; Peng, Jinye

    2017-02-09

    In this paper, a hierarchical deep multi-task learning (HD-MTL) algorithm is developed to support large-scale visual recognition (e.g., recognizing thousands or even tens of thousands of atomic object classes automatically). First, multiple sets of multi-level deep features are extracted from different layers of deep convolutional neural networks (deep CNNs), and they are used to accomplish the coarse-to-fine tasks for hierarchical visual recognition more effectively. A visual tree is then learned by assigning the visually-similar atomic object classes with similar learning complexities into the same group, which provides a good environment for determining the inter-related learning tasks automatically. By leveraging the inter-task relatedness (inter-class similarities) to learn more discriminative group-specific deep representations, our deep multi-task learning algorithm can train more discriminative node classifiers for distinguishing the visually-similar atomic object classes effectively. Our hierarchical deep multi-task learning (HD-MTL) algorithm integrates two discriminative regularization terms to control inter-level error propagation effectively, and it provides an end-to-end approach for jointly learning more representative deep CNNs (for image representation) and a more discriminative tree classifier (for large-scale visual recognition) and updating them simultaneously. Our incremental deep learning algorithms can effectively adapt both the deep CNNs and the tree classifier to new training images and new object classes. Our experimental results demonstrate that our HD-MTL algorithm achieves very competitive accuracy rates for large-scale visual recognition.
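
    A toy version of the visual-tree grouping step is sketched below, clustering per-class mean deep features with scikit-learn; the actual HD-MTL construction also uses learning complexities and multiple CNN layers, so this is only an illustration under those simplifying assumptions.

    ```python
    import numpy as np
    from sklearn.cluster import AgglomerativeClustering

    def build_visual_tree(class_features, n_groups=20):
        """Group visually-similar classes from mean deep features (toy version of
        the visual-tree construction).

        class_features : (n_classes, d) array of per-class mean deep features
        Returns an array of group labels, one per atomic object class.
        """
        # L2-normalise so Euclidean distances behave like a cosine-style dissimilarity
        normed = class_features / np.linalg.norm(class_features, axis=1, keepdims=True)
        clust = AgglomerativeClustering(n_clusters=n_groups)
        return clust.fit_predict(normed)
    ```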

  17. From large-eddy simulation to multi-UAVs sampling of shallow cumulus clouds

    NASA Astrophysics Data System (ADS)

    Lamraoui, Fayçal; Roberts, Greg; Burnet, Frédéric

    2016-04-01

    In-situ sampling of clouds that can provide simultaneous measurements at spatio-temporal resolutions sufficient to capture 3D small-scale physical processes continues to present challenges. This project (SKYSCANNER) aims at developing cloud-sampling strategies that use a swarm of unmanned aerial vehicles (UAVs) guided by large-eddy simulation (LES). Multi-UAV field campaigns with a sampling strategy tailored to individual clouds and cloud fields will significantly improve the understanding of unresolved cloud physical processes. An extensive set of LES experiments for case studies from the ARM-SGP site has been performed using the MesoNH model at resolutions down to 10 m. The simulations carried out led to a macroscopic model that quantifies the interrelationship between the micro- and macrophysical properties of shallow convective clouds. Both the geometry and the evolution of individual clouds are critical to multi-UAV cloud sampling and path planning. The preliminary findings of the current project reveal several linear relationships that associate many cloud geometric parameters with cloud-related meteorological variables. In addition, the horizontal wind speed has a proportional impact on cloud number concentration as well as on triggering and prolonging the occurrence of cumulus clouds. In the framework of the joint collaboration that involves a multidisciplinary team (including institutes specializing in aviation, robotics and atmospheric science), this model will be a reference point for multi-UAV sampling strategies and path planning.

  18. Factors Controlling the Properties of Multi-Phase Arctic Stratocumulus Clouds

    NASA Technical Reports Server (NTRS)

    Fridlind, Ann; Ackerman, Andrew; Menon, Surabi

    2005-01-01

    The 2004 Multi-Phase Arctic Cloud Experiment (M-PACE) IOP at the ARM NSA site focused on measuring the properties of autumn transition-season arctic stratus and the environmental conditions controlling them, including concentrations of heterogeneous ice nuclei. Our work aims to use a large-eddy simulation (LES) code with embedded size-resolved aerosol and cloud microphysics to identify factors controlling multi-phase arctic stratus. Our preliminary simulations of autumn transition-season clouds observed during the 1994 Beaufort and Arctic Seas Experiment (BASE) indicated that low concentrations of ice nuclei, which were not measured, may have significantly lowered liquid water content and thereby stabilized cloud evolution. However, cloud drop concentrations appeared to be virtually immune to changes in liquid water content, indicating an active Bergeron process with little effect of collection on drop number concentration. We will compare these results with preliminary simulations from October 8-13 during M-PACE. The sensitivity of cloud properties to uncertainty in other factors, such as large-scale forcings and aerosol profiles, will also be investigated. Based on the LES simulations with M-PACE data, preliminary results from the NASA GISS single-column model (SCM) will be used to examine the sensitivity of predicted cloud properties to changing cloud drop number concentrations for multi-phase arctic clouds. Present parameterizations assume fixed cloud droplet number concentrations; these will be modified using M-PACE data.

  19. Coupling Geophysical, Geotechnical and Stratigraphic Data to Interpret the Genesis of Mega-Scale-Glacial-Lineations on the Yermak Plateau, Arctic Ocean

    NASA Astrophysics Data System (ADS)

    O'Regan, M. A.; Jakobsson, M.; Kirchner, N.; Dowdeswell, J. A.; Hogan, K.

    2010-12-01

    The recent collection and analysis of multi-beam bathymetry data has revealed Mega-Scale Glacial Lineations (MSGL) in up to 600 m present water depth on the Yermak Plateau (Dowdeswell et al., 2010; Jakobsson et al., 2010). This evidence for large-scale ice grounding in the region supports previous interpretations from side-scan sonar, high-resolution subbottom and multi-channel seismic data. Detailed integration with regional subbottom data illustrates that the formation of the MSGL occurred in the late Quaternary, around MIS6. This event is distinct from a middle Quaternary ice grounding in the same region, that was first recognized by the transition into heavily overconsolidated sediments at ~20 mbsf at Ocean Drilling Program Site 910. While the middle Quaternary ice grounding left an easily recognizable imprint on the geotechnical properties of the sediments, the imprint from the late Quaternary event is far subtler, and not formerly recognized by analysis of sediments from Site 910. Furthermore, stratigraphic information indicates that neither event was associated with significant erosion, implying that the observed stress state of the sediments arose from ice-loading. Coupled with the orientation of the late Quaternary MSGL, the available evidence argues against an active ice-stream being responsible for their formation, and that they were more likely formed by a very large tabular iceberg traversing the ridge. This lends considerable support to the argument that MSGL-like features are not exclusively associated with fast flowing ice-streams. References Jakobsson, M., et al., An Arctic Ocean iceshelf during MIS 6 constrained by new geophysical and geological data. Quaternary Science Reviews (2010), doi:10.1016/j.quascirev.2010.03.015. Dowdeswell, J. A., et al., High-resolution geophysical observations of the Yermak Plateau and northern Svalbard margin: implications for ice-sheet grounding and deep-keeled icebergs. Quaternary Science Reviews (2010), doi:10.1016/j.quascirev.2010.06.002

  20. Detecting Multi-scale Structures in Chandra Images of Centaurus A

    NASA Astrophysics Data System (ADS)

    Karovska, M.; Fabbiano, G.; Elvis, M. S.; Evans, I. N.; Kim, D. W.; Prestwich, A. H.; Schwartz, D. A.; Murray, S. S.; Forman, W.; Jones, C.; Kraft, R. P.; Isobe, T.; Cui, W.; Schreier, E. J.

    1999-12-01

    Centaurus A (NGC 5128) is a giant early-type galaxy with a merger history, containing the nearest radio-bright AGN. Recent Chandra High Resolution Camera (HRC) observations of Cen A reveal X-ray multi-scale structures in this object with unprecedented detail and clarity. We show the results of an analysis of the Chandra data with smoothing and edge-enhancement techniques that allow us to enhance and quantify the multi-scale structures present in the HRC images. These techniques include an adaptive smoothing algorithm (Ebeling et al. 1999) and a multi-directional gradient detection algorithm (Karovska et al. 1994). The Ebeling et al. adaptive smoothing algorithm, which is incorporated in the CXC analysis software package, is a powerful tool for smoothing images containing complex structures at various spatial scales. The adaptively smoothed images of Centaurus A simultaneously show the high-angular-resolution bright structures at scales as small as an arcsecond and the extended faint structures as large as several arcminutes. The large-scale structures suggest complex symmetry, including a component possibly associated with the inner radio lobes (as suggested by the ROSAT HRI data, Dobereiner et al. 1996), and a separate component with an orthogonal symmetry that may be associated with the galaxy as a whole. The dust lane and the X-ray ridges are very clearly visible. The adaptively smoothed images and the edge-enhanced images also suggest several filamentary features, including a large filament-like structure extending as far as about 5 arcminutes to the north-west.

  1. The Drivers of the CH4 Seasonal Cycle in the Arctic and What Long-Term Observations of CH4 Imply About Trends in Arctic CH4 Fluxes

    NASA Astrophysics Data System (ADS)

    Sweeney, C.; Karion, A.; Bruhwiler, L.; Miller, J. B.; Wofsy, S. C.; Miller, C. E.; Chang, R. Y.; Dlugokencky, E. J.; Daube, B.; Pittman, J. V.; Dinardo, S. J.

    2012-12-01

    The large seasonal change in the atmospheric column of CH4 in the Arctic is driven by two dominant processes: transport of CH4 from low latitudes and surface emissions throughout the Arctic region. The NOAA ESRL Carbon Cycle Group Aircraft Program, along with the NASA-funded Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE), have initiated an effort to better understand the factors controlling the seasonal changes in the mole fraction of CH4 in the Arctic with a multi-scale aircraft observing network in Alaska. The backbone of this network is multi-species flask sampling from 500 to 8000 m asl that has been conducted every two weeks for the last 10 years over Poker Flat, AK. In addition to regular profiles at the interior Alaska site at Poker Flat, NOAA has teamed up with the United States Coast Guard to make profiling flights with continuous observations of CO2, CO, CH4 and ozone between Kodiak and Barrow every two weeks. More recently, CARVE has significantly added to this observational network with targeted flights focused on exploring the variability of CO2, CH4 and CO in the boundary layer, both in the interior and the North Slope regions of Alaska. Taken together with the profiling of the HIAPER Pole-to-Pole Observations (HIPPO), ground sites at Barrow and a new CARVE interior Alaska surface site just north of Fairbanks, AK, we now have the ability to investigate the full evolution of the seasonal cycle in the Arctic using both the multi-scale sampling offered by the different aircraft platforms and the multi-species sampling offered by in-situ and flask sampling. The flasks also provide a valuable tie-point between different platforms so that spatial and temporal gradients can be properly interpreted. In the context of the seasonal cycle observed by the aircraft platforms, we will examine long-term ground observations over the last 20 years to assess changes in Arctic CH4 emissions that have occurred as a result of 0.6 °C/decade changes in mean surface temperatures.

  2. Effects of Large-Scale Solar Installations on Dust Mobilization and Air Quality

    NASA Astrophysics Data System (ADS)

    Pratt, J. T.; Singh, D.; Diffenbaugh, N. S.

    2012-12-01

    Large-scale solar projects are increasingly being developed worldwide, and many of these installations are located in arid, desert regions. To examine the effects of these projects on regional dust mobilization and air quality, we analyze aerosol product data from NASA's Multi-angle Imaging Spectroradiometer (MISR) at annual and seasonal time intervals near fifteen photovoltaic and solar thermal stations ranging from 5-200 MW (12-4,942 acres) in size. The stations are distributed over eight different countries and were chosen based on size, location and installation date; most of the installations are large-scale, located in desert climates, and were installed between 2006 and 2010. We also consider air quality measurements of particulate matter between 2.5 and 10 micrometers (PM10) from Environmental Protection Agency (EPA) monitoring sites near and downwind from the project installations in the U.S. We use monthly wind data from NOAA's National Centers for Environmental Prediction (NCEP) Global Reanalysis to select the stations downwind from the installations, and then perform statistical analysis on the data to identify any significant changes in these quantities. We find that fourteen of the fifteen regions have lower aerosol product values after the start of the installations, and all six PM10 monitoring stations show lower particulate matter measurements after construction commenced. However, the results fail to show any statistically significant differences in aerosol optical index or PM10 measurements before and after the large-scale solar installations. Moreover, many of the large installations are very recent, and there is insufficient data to fully understand the long-term effects on air quality. More data and higher-resolution analysis are necessary to better understand the relationship between large-scale solar, dust and air quality.

  3. Big GABA: Edited MR spectroscopy at 24 research sites.

    PubMed

    Mikkelsen, Mark; Barker, Peter B; Bhattacharyya, Pallab K; Brix, Maiken K; Buur, Pieter F; Cecil, Kim M; Chan, Kimberly L; Chen, David Y-T; Craven, Alexander R; Cuypers, Koen; Dacko, Michael; Duncan, Niall W; Dydak, Ulrike; Edmondson, David A; Ende, Gabriele; Ersland, Lars; Gao, Fei; Greenhouse, Ian; Harris, Ashley D; He, Naying; Heba, Stefanie; Hoggard, Nigel; Hsu, Tun-Wei; Jansen, Jacobus F A; Kangarlu, Alayar; Lange, Thomas; Lebel, R Marc; Li, Yan; Lin, Chien-Yuan E; Liou, Jy-Kang; Lirng, Jiing-Feng; Liu, Feng; Ma, Ruoyun; Maes, Celine; Moreno-Ortega, Marta; Murray, Scott O; Noah, Sean; Noeske, Ralph; Noseworthy, Michael D; Oeltzschner, Georg; Prisciandaro, James J; Puts, Nicolaas A J; Roberts, Timothy P L; Sack, Markus; Sailasuta, Napapon; Saleh, Muhammad G; Schallmo, Michael-Paul; Simard, Nicholas; Swinnen, Stephan P; Tegenthoff, Martin; Truong, Peter; Wang, Guangbin; Wilkinson, Iain D; Wittsack, Hans-Jörg; Xu, Hongmin; Yan, Fuhua; Zhang, Chencheng; Zipunnikov, Vadim; Zöllner, Helge J; Edden, Richard A E

    2017-10-01

    Magnetic resonance spectroscopy (MRS) is the only biomedical imaging method that can noninvasively detect endogenous signals from the neurotransmitter γ-aminobutyric acid (GABA) in the human brain. Its increasing popularity has been aided by improvements in scanner hardware and acquisition methodology, as well as by broader access to pulse sequences that can selectively detect GABA, in particular J-difference spectral editing sequences. Nevertheless, implementations of GABA-edited MRS remain diverse across research sites, making comparisons between studies challenging. This large-scale multi-vendor, multi-site study seeks to better understand the factors that impact measurement outcomes of GABA-edited MRS. An international consortium of 24 research sites was formed. Data from 272 healthy adults were acquired on scanners from the three major MRI vendors and analyzed using the Gannet processing pipeline. MRS data were acquired in the medial parietal lobe with standard GABA+ and macromolecule- (MM-) suppressed GABA editing. The coefficient of variation across the entire cohort was 12% for GABA+ measurements and 28% for MM-suppressed GABA measurements. A multilevel analysis revealed that most of the variance (72%) in the GABA+ data was accounted for by differences between participants within-site, while site-level differences accounted for comparatively more variance (20%) than vendor-level differences (8%). For MM-suppressed GABA data, the variance was distributed equally between site- (50%) and participant-level (50%) differences. The findings show that GABA+ measurements exhibit strong agreement when implemented with a standard protocol. There is, however, increased variability for MM-suppressed GABA measurements that is attributed in part to differences in site-to-site data acquisition. This study's protocol establishes a framework for future methodological standardization of GABA-edited MRS, while the results provide valuable benchmarks for the MRS community. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Implementation of a large-scale hospital information infrastructure for multi-unit health-care services.

    PubMed

    Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul

    2008-01-01

    With the increase in demand for high quality medical services, the need for an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multi-units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate by the integrated hospital information system is about 1.8 TByte and the total quantity of data produced so far is about 60 TByte. Large scale information exchange and sharing will be particularly useful for telemedicine applications.

  5. A reduced-order modeling approach to represent subgrid-scale hydrological dynamics for land-surface simulations: application in a polygonal tundra landscape

    DOE PAGES

    Pau, G. S. H.; Bisht, G.; Riley, W. J.

    2014-09-17

    Existing land surface models (LSMs) describe physical and biological processes that occur over a wide range of spatial and temporal scales. For example, biogeochemical and hydrological processes responsible for carbon (CO2, CH4) exchanges with the atmosphere range from the molecular scale (pore-scale O2 consumption) to tens of kilometers (vegetation distribution, river networks). Additionally, many processes within LSMs are nonlinearly coupled (e.g., methane production and soil moisture dynamics), and therefore simple linear upscaling techniques can result in large prediction error. In this paper we applied a reduced-order modeling (ROM) technique known as the "proper orthogonal decomposition mapping method" that reconstructs temporally resolved fine-resolution solutions based on coarse-resolution solutions. We developed four different methods and applied them to four study sites in a polygonal tundra landscape near Barrow, Alaska. Coupled surface–subsurface isothermal simulations were performed for summer months (June–September) at fine (0.25 m) and coarse (8 m) horizontal resolutions. We used simulation results from three summer seasons (1998–2000) to build ROMs of the 4-D soil moisture field for the study sites individually (single-site) and aggregated (multi-site). The results indicate that the ROM produced a significant computational speedup (>10^3) with very small relative approximation error (<0.1%) for 2 validation years not used in training the ROM. We also demonstrate that our approach: (1) efficiently corrects for coarse-resolution model bias and (2) can be used for polygonal tundra sites not included in the training data set with relatively good accuracy (<1.7% relative error), thereby allowing for the possibility of applying these ROMs across a much larger landscape. By coupling the ROMs constructed at different scales together hierarchically, this method has the potential to efficiently increase the resolution of land models for coupled climate simulations to spatial scales consistent with mechanistic physical process representation.
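
    A minimal sketch of a POD-based coarse-to-fine mapping is given below in numpy; it is a simplified stand-in for the proper orthogonal decomposition mapping method (a linear least-squares map from coarse fields to POD coefficients), with hypothetical array shapes.

    ```python
    import numpy as np

    def train_pod_mapping(fine_snaps, coarse_snaps, n_modes=20):
        """Learn to reconstruct fine-resolution fields from coarse-resolution ones.

        fine_snaps   : (n_fine_cells, n_snapshots) training fine-resolution fields
        coarse_snaps : (n_coarse_cells, n_snapshots) matching coarse-resolution fields
        """
        # POD basis of the fine-resolution training snapshots
        U, _, _ = np.linalg.svd(fine_snaps, full_matrices=False)
        basis = U[:, :n_modes]
        # POD coefficients of each training snapshot
        coeffs = basis.T @ fine_snaps                      # (n_modes, n_snapshots)
        # Linear map from coarse fields to POD coefficients (least squares)
        M, *_ = np.linalg.lstsq(coarse_snaps.T, coeffs.T, rcond=None)
        return basis, M

    def reconstruct_fine(basis, M, coarse_field):
        """Predict the fine-resolution field for a new coarse-resolution solution."""
        return basis @ (coarse_field @ M)
    ```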

  6. Evaluating 20th Century precipitation characteristics between multi-scale atmospheric models with different land-atmosphere coupling

    NASA Astrophysics Data System (ADS)

    Phillips, M.; Denning, A. S.; Randall, D. A.; Branson, M.

    2016-12-01

    Multi-scale models of the atmosphere provide an opportunity to investigate processes that are unresolved by traditional Global Climate Models while remaining viable, in terms of computational resources, for climate-length time scales. The multi-scale modeling framework (MMF) represents a shift away from the large horizontal grid spacing of traditional GCMs, which leads to overabundant light precipitation and a lack of heavy events, toward a model where precipitation intensity is allowed to vary over a much wider range of values. Resolving atmospheric motions on the scale of 4 km makes it possible to recover features of precipitation, such as intense downpours, that were previously only obtained by computationally expensive regional simulations. These heavy precipitation events may have little impact on large-scale moisture and energy budgets, but they are outstanding in terms of interaction with the land surface and potential impact on human life. Three versions of the Community Earth System Model were used in this study: the standard CESM; the multi-scale 'Super-Parameterized' CESM (SP-CESM), where large-scale parameterizations have been replaced with a 2D cloud-permitting model; and a multi-instance land version of the SP-CESM, where each column of the 2D CRM is allowed to interact with an individual land unit. These simulations were carried out using prescribed sea surface temperatures for the period 1979-2006, with daily precipitation saved for all 28 years. Comparisons of the statistical properties of precipitation between model architectures and against observations from rain gauges were made, with specific focus on the detection and evaluation of extreme precipitation events.
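
    A simple way to compare precipitation-intensity statistics across the three model versions and gauge observations is sketched below; the wet-day threshold and bin edges are assumptions for illustration.

    ```python
    import numpy as np

    def intensity_histogram(daily_precip, bins=None):
        """Distribution of daily precipitation intensity (wet days only).

        daily_precip : 1-D array of daily totals (mm/day) pooled over grid cells or gauges
        Returns bin edges and the fraction of wet days in each intensity bin, which makes
        a 'too much drizzle, too few downpours' bias easy to see.
        """
        if bins is None:
            bins = np.array([0.1, 1, 2, 5, 10, 20, 50, 100, np.inf])
        wet = daily_precip[daily_precip >= 0.1]          # assumed wet-day threshold of 0.1 mm
        counts, edges = np.histogram(wet, bins=bins)
        return edges, counts / counts.sum()

    # Comparing intensity_histogram(cesm), intensity_histogram(sp_cesm) and
    # intensity_histogram(gauges) exposes how the multi-scale versions shift
    # probability mass from light to heavy events.
    ```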

  7. Multi-scale pixel-based image fusion using multivariate empirical mode decomposition.

    PubMed

    Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P; McDonald-Maier, Klaus D

    2015-05-08

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically-significant performance differences.
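
    Assuming the mode-aligned IMFs from a MEMD decomposition are already available (MEMD itself is not reimplemented here), a local-energy-weighted pixel-level fusion rule can be sketched as follows; the window size and weighting scheme are illustrative choices, not necessarily those of the paper.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def fuse_aligned_imfs(imfs, window=7):
        """Pixel-level fusion of mode-aligned IMFs from several input images.

        imfs : array of shape (n_images, n_scales, H, W) -- the same-indexed IMFs of
               all input images, assumed precomputed by a MEMD decomposition.
        At every scale and pixel, images are weighted by the local energy of their
        IMF, so the sharper or better-exposed source dominates locally.
        """
        fused_scales = []
        for s in range(imfs.shape[1]):
            local_energy = np.stack(
                [uniform_filter(imfs[i, s] ** 2, size=window) for i in range(imfs.shape[0])]
            )
            weights = local_energy / (local_energy.sum(axis=0, keepdims=True) + 1e-12)
            fused_scales.append((weights * imfs[:, s]).sum(axis=0))
        # The fused image is the sum of the fused IMFs over all scales
        # (any residue/trend component is omitted in this toy version).
        return np.sum(fused_scales, axis=0)
    ```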

  8. Multi-Scale Pixel-Based Image Fusion Using Multivariate Empirical Mode Decomposition

    PubMed Central

    Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P.; McDonald-Maier, Klaus D.

    2015-01-01

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically-significant performance differences. PMID:26007714

  9. Hierarchical Learning of Tree Classifiers for Large-Scale Plant Species Identification.

    PubMed

    Fan, Jianping; Zhou, Ning; Peng, Jinye; Gao, Ling

    2015-11-01

    In this paper, a hierarchical multi-task structural learning algorithm is developed to support large-scale plant species identification, where a visual tree is constructed for organizing large numbers of plant species in a coarse-to-fine fashion and determining the inter-related learning tasks automatically. A given parent node on the visual tree contains a set of sibling coarse-grained categories of plant species or sibling fine-grained plant species, and a multi-task structural learning algorithm is developed to train their inter-related classifiers jointly to enhance their discrimination power. The inter-level relationship constraint, e.g., that a plant image must first be assigned to a parent node (high-level non-leaf node) correctly if it is to be further assigned to the most relevant child node (low-level non-leaf node or leaf node) on the visual tree, is formally defined and leveraged to learn more discriminative tree classifiers over the visual tree. Our experimental results have demonstrated the effectiveness of our hierarchical multi-task structural learning algorithm in training more discriminative tree classifiers for large-scale plant species identification.

  10. Phage-bacteria infection networks: From nestedness to modularity

    NASA Astrophysics Data System (ADS)

    Flores, Cesar O.; Valverde, Sergi; Weitz, Joshua S.

    2013-03-01

    Bacteriophages (viruses that infect bacteria) are the most abundant biological life-forms on Earth. However, very little is known regarding the structure of phage-bacteria infections. In a recent study we re-evaluated 38 prior studies and demonstrated that phage-bacteria infection networks tend to be statistically nested in small-scale communities (Flores et al. 2011). Nestedness is consistent with a hierarchy of infection and resistance within phages and bacteria, respectively. However, we predicted that at large scales, phage-bacteria infection networks should be typified by a modular structure. We evaluate and confirm this hypothesis using the most extensive study of phage-bacteria infections (Moebus and Nattkemper 1981). In that study, cross-infections were evaluated between 215 marine phages and 286 marine bacteria. We develop a novel multi-scale network analysis and find that the Moebus and Nattkemper (1981) dataset is highly modular (at the whole-network scale), yet also exhibits nestedness and modularity at the within-module scale. We examine the role of geography in driving these modular patterns and find evidence that phage-bacteria interactions can exhibit strong similarity despite large distances between sites. CFG acknowledges the support of the CONACyT Foundation. JSW holds a Career Award at the Scientific Interface from the Burroughs Wellcome Fund and acknowledges the support of the James S. McDonnell Foundation.
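
    The whole-network modularity computation can be sketched with networkx as below; phage and bacterium identifiers are assumed to be distinct strings, and the greedy community detector is one convenient choice rather than the specific algorithm used in the study.

    ```python
    import networkx as nx
    from networkx.algorithms import community

    def infection_network_modularity(interactions):
        """Modularity of a phage-bacteria infection network (toy multi-scale check).

        interactions : iterable of (phage_id, bacterium_id) pairs for observed
                       infections (a bipartite cross-infection matrix flattened to
                       an edge list; phage and bacterium IDs must not collide).
        Returns the detected modules and the modularity score Q; the same call can
        be repeated on the subgraph of each module to probe within-module structure.
        """
        G = nx.Graph()
        G.add_edges_from(interactions)
        modules = community.greedy_modularity_communities(G)
        Q = community.modularity(G, modules)
        return modules, Q
    ```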

  11. Large-scale Instability during Gravitational Collapse with Neutrino Transport and a Core-Collapse Supernova

    NASA Astrophysics Data System (ADS)

    Aksenov, A. G.; Chechetkin, V. M.

    2018-04-01

    Most of the energy released in the gravitational collapse of the cores of massive stars is carried away by neutrinos. Neutrinos play a pivotal role in explaining core-collapse supernovae. Currently, mathematical models of gravitational collapse are based on multi-dimensional gas dynamics and thermonuclear reactions, while neutrino transport is treated in a simplified way. Multi-dimensional gas dynamics is used with neutrino transport in the flux-limited diffusion approximation to study the role of multi-dimensional effects. The possibility of large-scale convection is discussed, which is interesting both for explaining Type II supernovae and for setting up observations to register possible high-energy (≳10 MeV) neutrinos from the supernova. A new multi-dimensional, multi-temperature gas dynamics method with neutrino transport is presented.

  12. Large-scale automated image analysis for computational profiling of brain tissue surrounding implanted neuroprosthetic devices using Python.

    PubMed

    Rey-Villamizar, Nicolas; Somasundar, Vinay; Megjhani, Murad; Xu, Yan; Lu, Yanbin; Padmanabhan, Raghav; Trett, Kristen; Shain, William; Roysam, Badri

    2014-01-01

    In this article, we describe the use of Python for large-scale automated server-based bio-image analysis in FARSIGHT, a free and open-source toolkit of image analysis methods for quantitative studies of complex and dynamic tissue microenvironments imaged by modern optical microscopes, including confocal, multi-spectral, multi-photon, and time-lapse systems. The core FARSIGHT modules for image segmentation, feature extraction, tracking, and machine learning are written in C++, leveraging widely used libraries including ITK, VTK, Boost, and Qt. For solving complex image analysis tasks, these modules must be combined into scripts using Python. As a concrete example, we consider the problem of analyzing 3-D multi-spectral images of brain tissue surrounding implanted neuroprosthetic devices, acquired using high-throughput multi-spectral spinning disk step-and-repeat confocal microscopy. The resulting images typically contain 5 fluorescent channels. Each channel consists of 6000 × 10,000 × 500 voxels with 16 bits/voxel, implying image sizes exceeding 250 GB. These images must be mosaicked, pre-processed to overcome imaging artifacts, and segmented to enable cellular-scale feature extraction. The features are used to identify cell types, and perform large-scale analysis for identifying spatial distributions of specific cell types relative to the device. Python was used to build a server-based script (Dell 910 PowerEdge servers with 4 sockets/server with 10 cores each, 2 threads per core and 1TB of RAM running on Red Hat Enterprise Linux linked to a RAID 5 SAN) capable of routinely handling image datasets at this scale and performing all these processing steps in a collaborative multi-user multi-platform environment. Our Python script enables efficient data storage and movement between computers and storage servers, logs all the processing steps, and performs full multi-threaded execution of all codes, including open and closed-source third party libraries.
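
    A skeletal version of such a server-based batch driver is sketched below using only the Python standard library; process_volume is a hypothetical placeholder for the per-volume FARSIGHT pipeline calls, and the file pattern and worker count are assumptions.

    ```python
    import logging
    from concurrent.futures import ProcessPoolExecutor, as_completed
    from pathlib import Path

    logging.basicConfig(filename="pipeline.log", level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(message)s")

    def process_volume(path):
        """Placeholder for one per-image pipeline run (mosaicking, artifact
        correction, segmentation, feature extraction) on a single 3-D volume."""
        logging.info("started %s", path)
        # ... invoke the compiled C++ modules / command-line tools here ...
        logging.info("finished %s", path)
        return path

    def run_batch(image_dir, workers=8):
        """Dispatch every image volume in a directory across worker processes,
        logging each step so a server-based run can be audited afterwards."""
        paths = sorted(Path(image_dir).glob("*.tif"))
        with ProcessPoolExecutor(max_workers=workers) as pool:
            futures = {pool.submit(process_volume, p): p for p in paths}
            for fut in as_completed(futures):
                logging.info("completed %s", fut.result())
    ```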

  13. Implementation of a multi-threaded framework for large-scale scientific applications

    DOE PAGES

    Sexton-Kennedy, E.; Gartung, Patrick; Jones, C. D.; ...

    2015-05-22

    The CMS experiment has recently completed the development of a multi-threaded capable application framework. In this paper, we will discuss the design, implementation and application of this framework to production applications in CMS. For the 2015 LHC run, this functionality is particularly critical for both our online and offline production applications, which depend on faster turn-around times and a reduced memory footprint relative to before. These applications are complex codes, each including a large number of physics-driven algorithms. While the framework is capable of running a mix of thread-safe and 'legacy' modules, algorithms running in our production applications need to be thread-safe for optimal use of this multi-threaded framework at a large scale. Towards this end, we discuss the types of changes that were necessary for our algorithms to achieve good performance of our multi-threaded applications in a full-scale application. Lastly, performance numbers for what has been achieved for the 2015 run are presented.

  14. A Large-Scale Multi-Hop Localization Algorithm Based on Regularized Extreme Learning for Wireless Networks.

    PubMed

    Zheng, Wei; Yan, Xiaoyong; Zhao, Wei; Qian, Chengshan

    2017-12-20

    A novel large-scale multi-hop localization algorithm based on regularized extreme learning is proposed in this paper. The large-scale multi-hop localization problem is formulated as a learning problem. Unlike other similar localization algorithms, the proposed algorithm overcomes the shortcoming of traditional algorithms that are only applicable to isotropic networks, and therefore it adapts well to complex deployment environments. The proposed algorithm is composed of three stages: data acquisition, modeling and location estimation. In the data acquisition stage, the training information between nodes of the given network is collected. In the modeling stage, the model relating hop counts to the physical distances between nodes is constructed using regularized extreme learning. In the location estimation stage, each node finds its specific location in a distributed manner. Theoretical analysis and several experiments show that the proposed algorithm can adapt to different topological environments with low computational cost. Furthermore, high accuracy can be achieved by this method without setting complex parameters.
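
    A compact numpy sketch of a regularized extreme learning machine mapping hop-count vectors to coordinates is given below; the hidden-layer size, activation function and regularization weight are illustrative assumptions.

    ```python
    import numpy as np

    def train_relm(hops, coords, n_hidden=200, reg=1e-2, seed=0):
        """Regularized extreme learning machine mapping hop-count vectors of nodes
        with known positions to their physical coordinates.

        hops   : (n_nodes, n_anchors) hop counts to each anchor node
        coords : (n_nodes, 2) known coordinates of the training nodes
        """
        rng = np.random.default_rng(seed)
        W = rng.standard_normal((hops.shape[1], n_hidden))   # random input weights (fixed)
        b = rng.standard_normal(n_hidden)                    # random biases (fixed)
        H = np.tanh(hops @ W + b)                            # hidden-layer outputs
        # Ridge-regularized least squares for the output weights (closed form)
        beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ coords)
        return W, b, beta

    def locate(hops, W, b, beta):
        """Estimate coordinates of unknown nodes from their hop-count vectors."""
        return np.tanh(hops @ W + b) @ beta
    ```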

  15. ASSESSING ECOLOGICAL RISKS AT LARGE SPATIAL SCALES

    EPA Science Inventory

    The history of environmental management and regulation in the United States has been one of initial focus on localized, end-of-the-pipe problems to increasing attention to multi-scalar, multi-stressor, and multi- resource issues. Concomitant with this reorientation is the need fo...

  16. CO 2 leakage impacts on shallow groundwater. Field-scale reactive-transport simulations informed by observations at a natural analog site

    DOE PAGES

    Keating, Elizabeth H.; Hakala, J. Alexandra; Viswanathan, Hari; ...

    2013-03-01

    It is challenging to predict the degree to which shallow groundwater might be affected by leaks from a CO2 sequestration reservoir, particularly over long time scales and large spatial scales. In this study, observations at a CO2-enriched shallow-aquifer natural analog were used to develop a predictive model, which is then used to simulate leakage scenarios. This natural analog provides the opportunity to make direct field observations of groundwater chemistry in the presence of elevated CO2, to collect aquifer samples and expose them to CO2 under controlled conditions in the laboratory, and to test the ability of multiphase reactive transport models to reproduce measured geochemical trends at the field scale. The field observations suggest that brackish waters entrained with the upwelling CO2 are a more significant source of trace metals than in situ mobilization of metals due to exposure to CO2. The study focuses on a single trace metal of concern at this site: U. Experimental results indicate that cation exchange/adsorption and dissolution/precipitation of calcite containing trace amounts of U are important reactions controlling U in groundwater at this site, and that the amount of U associated with calcite is fairly well constrained. Simulations incorporating these results into a 3-D multi-phase reactive transport model are able to reproduce the measured ranges and trends among pH, pCO2, Ca, total C, U and Cl- at the field site. Although the true fluxes at the natural analog site are unknown, the cumulative CO2 flux inferred from these simulations is approximately 37.8E-3 MT, roughly corresponding to a 0.001% leak rate for injection at a large (750 MW) power plant. The leakage scenario simulations suggest that if the leak persists only for a short time, the volume of aquifer contaminated by CO2-induced mobilization of U will be relatively small, yet persistent over 100 a.

  17. Trans-oceanic Remote Power Hardware-in-the-Loop: Multi-site Hardware, Integrated Controller, and Electric Network Co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, Blake R.; Palmintier, Bryan S.; Rowe, Daniel

    Electric system operators are increasingly concerned with the potential system-wide impacts of the large-scale integration of distributed energy resources (DERs) including voltage control, protection coordination, and equipment wear. This prompts a need for new simulation techniques that can simultaneously capture all the components of these large integrated smart grid systems. This paper describes a novel platform that combines three emerging research areas: power systems co-simulation, power hardware in the loop (PHIL) simulation, and lab-lab links. The platform is distributed, real-time capable, allows for easy internet-based connection from geographically-dispersed participants, and is software platform agnostic. We demonstrate its utility by studying real-time PHIL co-simulation of coordinated solar PV firming control of two inverters connected in multiple electric distribution network models, prototypical of U.S. and Australian systems. Here, the novel trans-pacific closed-loop system simulation was conducted in real-time using a power network simulator and physical PV/battery inverter at power at the National Renewable Energy Laboratory in Golden, CO, USA and a physical PV inverter at power at the Commonwealth Scientific and Industrial Research Organisation's Energy Centre in Newcastle, NSW, Australia. This capability enables smart grid researchers throughout the world to leverage their unique simulation capabilities for multi-site collaborations that can effectively simulate and validate emerging smart grid technology solutions.

  18. Trans-oceanic Remote Power Hardware-in-the-Loop: Multi-site Hardware, Integrated Controller, and Electric Network Co-simulation

    DOE PAGES

    Lundstrom, Blake R.; Palmintier, Bryan S.; Rowe, Daniel; ...

    2017-07-24

    Electric system operators are increasingly concerned with the potential system-wide impacts of the large-scale integration of distributed energy resources (DERs) including voltage control, protection coordination, and equipment wear. This prompts a need for new simulation techniques that can simultaneously capture all the components of these large integrated smart grid systems. This paper describes a novel platform that combines three emerging research areas: power systems co-simulation, power hardware in the loop (PHIL) simulation, and lab-lab links. The platform is distributed, real-time capable, allows for easy internet-based connection from geographically-dispersed participants, and is software platform agnostic. We demonstrate its utility by studying real-time PHIL co-simulation of coordinated solar PV firming control of two inverters connected in multiple electric distribution network models, prototypical of U.S. and Australian systems. Here, the novel trans-pacific closed-loop system simulation was conducted in real-time using a power network simulator and physical PV/battery inverter at power at the National Renewable Energy Laboratory in Golden, CO, USA and a physical PV inverter at power at the Commonwealth Scientific and Industrial Research Organisation's Energy Centre in Newcastle, NSW, Australia. This capability enables smart grid researchers throughout the world to leverage their unique simulation capabilities for multi-site collaborations that can effectively simulate and validate emerging smart grid technology solutions.

  19. Experiences with a Decade of Wireless Sensor Networks in Mountain Cryosphere Research

    NASA Astrophysics Data System (ADS)

    Beutel, Jan

    2017-04-01

    Research in geoscience depends on high-quality measurements over long periods of time in order to understand processes and to create and validate models. The promise of wireless sensor networks to monitor autonomously at unprecedented spatial and temporal scale motivated the use of this novel technology for studying mountain permafrost in the mid-2000s. Starting from a first experimental deployment in 2006 to investigate the thermal properties of steep bedrock permafrost on the Jungfraujoch, Switzerland, at 3500 m a.s.l. using prototype wireless sensors, the PermaSense project has evolved into a multi-site and multi-discipline initiative. We develop, deploy and operate wireless sensing systems customized for long-term autonomous operation in high-mountain environments. Around this central element, we develop concepts, methods and tools to investigate and quantify the connection between climate, cryosphere (permafrost, glaciers, snow) and geomorphodynamics. In this presentation, we describe the concepts and system architecture used both for the wireless sensor network and for data management and processing. Furthermore, we discuss the experience gained in over a decade of planning, installing and operating large deployments on field sites spread across much of the Swiss and French Alps, with applications ranging from academic experimental research campaigns to long-term monitoring and natural hazard warning in collaboration with government authorities and local industry partners. Reference: http://www.permasense.ch; Online Open Data Access: http://data.permasense.ch

  20. Building a multi-scaled geospatial temporal ecology database from disparate data sources: fostering open science and data reuse.

    PubMed

    Soranno, Patricia A; Bissell, Edward G; Cheruvelil, Kendra S; Christel, Samuel T; Collins, Sarah M; Fergus, C Emi; Filstrup, Christopher T; Lapierre, Jean-Francois; Lottig, Noah R; Oliver, Samantha K; Scott, Caren E; Smith, Nicole J; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A; Gries, Corinna; Henry, Emily N; Skaff, Nick K; Stanley, Emily H; Stow, Craig A; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km(2)). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated database reproducible and extensible, allowing users to ask new research questions with the existing database or through the addition of new data. The largest challenge of this task was the heterogeneity of the data, formats, and metadata. Many steps of data integration need manual input from experts in diverse fields, requiring close collaboration.

  1. Building a multi-scaled geospatial temporal ecology database from disparate data sources: Fostering open science through data reuse

    USGS Publications Warehouse

    Soranno, Patricia A.; Bissell, E.G.; Cheruvelil, Kendra S.; Christel, Samuel T.; Collins, Sarah M.; Fergus, C. Emi; Filstrup, Christopher T.; Lapierre, Jean-Francois; Lotting, Noah R.; Oliver, Samantha K.; Scott, Caren E.; Smith, Nicole J.; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A.; Gries, Corinna; Henry, Emily N.; Skaff, Nick K.; Stanley, Emily H.; Stow, Craig A.; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E.

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km2). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated database reproducible and extensible, allowing users to ask new research questions with the existing database or through the addition of new data. The largest challenge of this task was the heterogeneity of the data, formats, and metadata. Many steps of data integration need manual input from experts in diverse fields, requiring close collaboration.

  2. Scale-dependent spatial variability in peatland lead pollution in the southern Pennines, UK.

    PubMed

    Rothwell, James J; Evans, Martin G; Lindsay, John B; Allott, Timothy E H

    2007-01-01

    Increasingly, within-site and regional comparisons of peatland lead pollution have been undertaken using the inventory approach. The peatlands of the Peak District, southern Pennines, UK, have received significant atmospheric inputs of lead over the last few hundred years. A multi-core study at three peatland sites in the Peak District demonstrates significant within-site spatial variability in industrial lead pollution. Stochastic simulations reveal that 15 peat cores are required to calculate reliable lead inventories at the within-site and within-region scale for this highly polluted area of the southern Pennines. Within-site variability in lead pollution is dominant at the within-region scale. The study demonstrates that significant errors may be associated with peatland lead inventories at sites where only a single peat core has been used to calculate an inventory. Meaningful comparisons of lead inventories at the regional or global scale can only be made if the within-site variability of lead pollution has been quantified reliably.

  3. PTMscape: an open source tool to predict generic post-translational modifications and map modification crosstalk in protein domains and biological processes.

    PubMed

    Li, Ginny X H; Vogel, Christine; Choi, Hyungwon

    2018-06-07

    While tandem mass spectrometry can detect post-translational modifications (PTM) at the proteome scale, reported PTM sites are often incomplete and include false positives. Computational approaches can complement these datasets by additional predictions, but most available tools use prediction models pre-trained for single PTM type by the developers and it remains a difficult task to perform large-scale batch prediction for multiple PTMs with flexible user control, including the choice of training data. We developed an R package called PTMscape which predicts PTM sites across the proteome based on a unified and comprehensive set of descriptors of the physico-chemical microenvironment of modified sites, with additional downstream analysis modules to test enrichment of individual or pairs of PTMs in protein domains. PTMscape is flexible in the ability to process any major modifications, such as phosphorylation and ubiquitination, while achieving the sensitivity and specificity comparable to single-PTM methods and outperforming other multi-PTM tools. Applying this framework, we expanded proteome-wide coverage of five major PTMs affecting different residues by prediction, especially for lysine and arginine modifications. Using a combination of experimentally acquired sites (PSP) and newly predicted sites, we discovered that the crosstalk among multiple PTMs occur more frequently than by random chance in key protein domains such as histone, protein kinase, and RNA recognition motifs, spanning various biological processes such as RNA processing, DNA damage response, signal transduction, and regulation of cell cycle. These results provide a proteome-scale analysis of crosstalk among major PTMs and can be easily extended to other types of PTM.
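
    As a rough illustration of the kind of sequence-window descriptors a proteome-wide PTM predictor can use (the abstract mentions a unified set of physico-chemical microenvironment descriptors), the sketch below extracts fixed-width windows around candidate lysine sites and encodes a simple hydrophobicity profile as features. The window size, helper names, and feature choice are assumptions for illustration only, not PTMscape's actual descriptor set.

```python
# A minimal sketch: fixed-width sequence windows around candidate lysines,
# encoded with the standard Kyte-Doolittle hydrophobicity scale.
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5, "E": -3.5,
      "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9, "M": 1.9, "F": 2.8,
      "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9, "Y": -1.3, "V": 4.2}

def site_windows(sequence, residue="K", flank=7):
    """Yield (position, window) pairs for each candidate residue, padding
    the protein ends with '-' so every window has the same width."""
    padded = "-" * flank + sequence + "-" * flank
    for i, aa in enumerate(sequence):
        if aa == residue:
            yield i, padded[i:i + 2 * flank + 1]

def window_features(window):
    """Encode a window as a hydrophobicity profile (0.0 for padding)."""
    return [KD.get(aa, 0.0) for aa in window]

if __name__ == "__main__":
    seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # hypothetical example sequence
    for pos, win in site_windows(seq):
        print(pos, win, window_features(win))
```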

  4. Eurasian continental background and regionally polluted levels of ozone and CO observed in northeast Asia

    NASA Astrophysics Data System (ADS)

    Pochanart, Pakpong; Kato, Shungo; Katsuno, Takao; Akimoto, Hajime

    The roles of Eurasian/Siberian continental air mass transport and the impact of large-scale East Asian anthropogenic emissions on tropospheric ozone and carbon monoxide levels in northeast Asia were investigated. Seasonal behaviors of O3 and CO mixing ratios in background continental (BC) air masses and regionally polluted continental (RPC) air masses were identified using trajectory analyses of Eurasian continental air masses and multi-year O3 and CO data observed at Happo, a mountain site in Japan. RPC air masses show significantly higher O3 and CO mixing ratios (annual averages of 53.9±6.0 and 200±41 ppb, respectively) than BC air masses (44.4±3.6 and 167±17 ppb, respectively). Large-scale anthropogenic emissions in East Asia are suggested to contribute about 10 ppb of photochemical O3 and 32 ppb of CO at Happo. A comparative study of O3 and CO observed at other sites, i.e., Oki Islands and Mondy in northeast Asia, showed similarities suggesting that O3 mixing ratios in BC air masses at Happo could be representative of remote northeast Asia. However, CO mixing ratios in BC air masses at Happo are higher than the background level in Siberia. The overestimate is probably related to an increase in the CO baseline gradient between Siberia and the East Asia Pacific rim, and to perturbations by sub-grid scale pollution transport and regional-scale boreal forest fires in Siberia when the background continental air masses are transported to Japan.

  5. Sub-seasonal-to-seasonal Reservoir Inflow Forecast using Bayesian Hierarchical Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, S.; Arumugam, S.

    2017-12-01

    Sub-seasonal-to-seasonal (S2S) (15-90 days) streamflow forecasting is an emerging area of research that provides seamless information for reservoir operation from weather time scales to seasonal time scales. From an operational perspective, sub-seasonal inflow forecasts are highly valuable because they enable water managers to decide short-term releases (15-30 days), while holding water for seasonal needs (e.g., irrigation and municipal supply) and meeting end-of-season target storage at a desired level. We propose a Bayesian Hierarchical Hidden Markov Model (BHHMM) to develop S2S inflow forecasts for the Tennessee Valley Area (TVA) reservoir system. Here, the hidden states are predicted by relevant indices that influence the inflows at the S2S time scale. The hidden Markov model also captures both the spatial and temporal hierarchy in predictors that operate at the S2S time scale, with model parameters estimated as a posterior distribution using a Bayesian framework. We present our work in two steps, namely a single-site model and a multi-site model. For proof of concept, we consider inflows to Douglas Dam, Tennessee, in the single-site model. For the multi-site model, we consider reservoirs in the upper Tennessee valley. Streamflow forecasts are issued and updated continuously every day at the S2S time scale. We considered precipitation forecasts obtained from the NOAA Climate Forecast System (CFSv2) GCM as predictors for developing S2S streamflow forecasts, along with relevant indices for predicting hidden states. Spatial dependence of the inflow series of the reservoirs is also preserved in the multi-site model. To circumvent the non-normality of the data, we consider the HMM in a Generalized Linear Model setting. Skill of the proposed approach is tested using split-sample validation against a traditional multi-site canonical correlation model developed using the same set of predictors. From the posterior distribution of the inflow forecasts, we also highlight different system behavior under varied global- and local-scale climatic influences captured by the developed BHHMM.
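
    As a toy illustration of the hidden-Markov structure underlying such a forecast (and not the authors' Bayesian hierarchical formulation), the sketch below simulates daily inflows whose hidden wet/dry state evolves by a Markov chain and whose emissions are lognormal; all transition probabilities and emission parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-state ("dry", "wet") hidden Markov model for daily inflow.
TRANSITION = np.array([[0.9, 0.1],    # P(next state | current = dry)
                       [0.3, 0.7]])   # P(next state | current = wet)
LOG_MEAN = np.array([3.0, 4.5])       # log-mean inflow in each hidden state
LOG_SD = np.array([0.3, 0.5])         # log-sd inflow in each hidden state

def simulate_inflow(n_days=30, state=0):
    """Generate one synthetic S2S inflow trace (arbitrary units)."""
    flows = np.empty(n_days)
    for t in range(n_days):
        state = rng.choice(2, p=TRANSITION[state])
        flows[t] = rng.lognormal(LOG_MEAN[state], LOG_SD[state])
    return flows

# An ensemble of traces stands in for the posterior predictive spread of forecasts.
ensemble = np.array([simulate_inflow() for _ in range(500)])
print("median 30-day total:", np.median(ensemble.sum(axis=1)))
print("10-90% range:", np.percentile(ensemble.sum(axis=1), [10, 90]))
```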

  6. Multi-scale streamflow variability responses to precipitation over the headwater catchments in southern China

    NASA Astrophysics Data System (ADS)

    Niu, Jun; Chen, Ji; Wang, Keyi; Sivakumar, Bellie

    2017-08-01

    This paper examines the multi-scale streamflow variability responses to precipitation over 16 headwater catchments in the Pearl River basin, South China. The long-term daily streamflow data (1952-2000), obtained using a macro-scale hydrological model, the Variable Infiltration Capacity (VIC) model, and a routing scheme, are studied. Temporal features of streamflow variability at 10 different timescales, ranging from 6 days to 8.4 years, are revealed with the Haar wavelet transform. Principal component analysis (PCA) is performed to categorize the headwater catchments by coherent modes of their multi-scale wavelet spectra. The results indicate that three distinct modes, with different variability distributions at small timescales and seasonal scales, can explain 95% of the streamflow variability. A large majority of the catchments (i.e. 12 out of 16) exhibit a consistent mode of multi-scale variability throughout three sub-periods (1952-1968, 1969-1984, and 1985-2000). The multi-scale streamflow variability responses to precipitation are identified to be associated with the regional flood and drought tendency over the headwater catchments in southern China.
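
    To make the multi-timescale decomposition concrete, here is a small numpy sketch of a generic Haar-wavelet variance calculation on a daily streamflow series at dyadic scales (an assumption about the general approach, not the authors' code or their exact set of 10 timescales).

```python
import numpy as np

def haar_wavelet_variance(series, max_level=6):
    """Variance of Haar wavelet coefficients at dyadic scales 2, 4, ..., 2**max_level days.

    At scale s, the Haar coefficient is proportional to the difference between
    the means of two adjacent blocks of length s/2.
    """
    x = np.asarray(series, dtype=float)
    variances = {}
    for level in range(1, max_level + 1):
        half = 2 ** (level - 1)                   # half-window length
        n_blocks = len(x) // (2 * half)
        blocks = x[: n_blocks * 2 * half].reshape(n_blocks, 2, half)
        coeffs = blocks[:, 1, :].mean(axis=1) - blocks[:, 0, :].mean(axis=1)
        variances[2 * half] = coeffs.var()
    return variances

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.arange(4 * 365)
    # synthetic daily flow: seasonal cycle plus noise (illustrative only)
    flow = 100 + 40 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 10, t.size)
    for scale, var in haar_wavelet_variance(flow).items():
        print(f"scale {scale:3d} days: wavelet variance {var:8.2f}")
```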

  7. Tracking Virus Particles in Fluorescence Microscopy Images Using Multi-Scale Detection and Multi-Frame Association.

    PubMed

    Jaiswal, Astha; Godinez, William J; Eils, Roland; Lehmann, Maik Jorg; Rohr, Karl

    2015-11-01

    Automatic fluorescent particle tracking is an essential task to study the dynamics of a large number of biological structures at a sub-cellular level. We have developed a probabilistic particle tracking approach based on multi-scale detection and two-step multi-frame association. The multi-scale detection scheme allows coping with particles in close proximity. For finding associations, we have developed a two-step multi-frame algorithm, which is based on a temporally semiglobal formulation as well as spatially local and global optimization. In the first step, reliable associations are determined for each particle individually in local neighborhoods. In the second step, the global spatial information over multiple frames is exploited jointly to determine optimal associations. The multi-scale detection scheme and the multi-frame association finding algorithm have been combined with a probabilistic tracking approach based on the Kalman filter. We have successfully applied our probabilistic tracking approach to synthetic as well as real microscopy image sequences of virus particles and quantified the performance. We found that the proposed approach outperforms previous approaches.
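
    The tracking approach combines detection and association with Kalman filtering. As a generic illustration of the Kalman predict/update cycle for a single particle under a constant-velocity motion model (not the authors' full multi-scale detection or multi-frame association scheme), a minimal numpy sketch follows; the noise covariances are chosen arbitrarily.

```python
import numpy as np

# Constant-velocity model in 2-D: state = [x, y, vx, vy].
DT = 1.0
F = np.array([[1, 0, DT, 0],
              [0, 1, 0, DT],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)          # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)          # only position is observed
Q = 0.01 * np.eye(4)                               # process noise (assumed)
R = 0.5 * np.eye(2)                                # measurement noise (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle given previous state x, covariance P, measurement z."""
    # predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # update
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

if __name__ == "__main__":
    x, P = np.zeros(4), np.eye(4)
    for z in [np.array([1.0, 0.9]), np.array([2.1, 2.0]), np.array([2.9, 3.1])]:
        x, P = kalman_step(x, P, z)
        print("estimated position:", np.round(x[:2], 2))
```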

  8. A Multidisciplinary Approach to Assessing the Causal Components of Climate Change

    NASA Astrophysics Data System (ADS)

    Gosnold, W. D.; Todhunter, P. E.; Dong, X.; Rundquist, B.; Majorowicz, J.; Blackwell, D. D.

    2004-05-01

    Separation of climate forcing by anthropogenic greenhouse gases from natural radiative climate forcing is difficult because the composite temperature signal in the meteorological and multi-proxy temperature records cannot be resolved directly into radiative forcing components. To address this problem, we have initiated a large-scale, multidisciplinary project to test coherence between ground surface temperatures (GST) reconstructed from borehole T-z profiles, surface air temperatures (SAT), soil temperatures, and solar radiation. Our hypothesis is that radiative heating and heat exchange between the ground and the air directly control the ground surface temperature. Consequently, borehole T-z measurements at multi-year intervals spanning time periods when solar radiation, soil and air temperatures have been recorded should enable comparison of the thermal energy stored in the ground to these quantities. If coherence between energy storage, solar radiation, GST, SAT and multi-proxy temperature data can be discerned for a one or two decade scale, synthesis of GST and multi-proxy data over the past several centuries may enable us to separately determine the anthropogenic and natural forcings of climate change. The data we are acquiring include: (1) New T-z measurements in boreholes previously used in paleoclimate and heat flow research in Canada and the United States from the 1970's to the present. (2) Meteorological data from the US Historical Climatology Network and the Automated Weather Data Network of the High Plains Regional Climate Center, and Environment Canada. (3) Direct and remotely sensed data on land use, environment, and soil properties at selected borehole and meteorological sites for the periods between borehole observations. The project addresses three related questions: What is the coherence between the GST, SAT, soil temperatures and solar radiation? Have microclimate changes at borehole sites and climate stations affected temperature trends? If good coherence is obtained, can the coherence between thermal energy stored in the ground and radiative forcing during the time between T-z measurements be extended several centuries into the past?
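
    The link between ground surface temperature history and borehole T-z profiles rests on conductive diffusion of surface perturbations into the subsurface. The standard half-space relation below is a textbook result quoted for context (not a formula taken from this abstract): a step change ΔT₀ at the surface, applied a time t before the measurement, perturbs the temperature at depth z for thermal diffusivity κ as

```latex
\Delta T(z, t) \;=\; \Delta T_0 \,\operatorname{erfc}\!\left(\frac{z}{2\sqrt{\kappa t}}\right)
```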

  9. Mechanical Stability of Fractured Rift Basin Mudstones: from lab to basin scale

    NASA Astrophysics Data System (ADS)

    Zakharova, N. V.; Goldberg, D.; Collins, D.; Swager, L.; Payne, W. G.

    2016-12-01

    Understanding petrophysical and mechanical properties of caprock mudstones is essential for ensuring good containment and mechanical formation stability at potential CO2 storage sites. Natural heterogeneity and presence of fractures, however, create challenges for accurate prediction of mudstone behavior under injection conditions and at reservoir scale. In this study, we present a multi-scale geomechanical analysis for Mesozoic mudstones from the Newark Rift basin, integrating petrophysical core and borehole data, in situ stress measurements, and caprock stability modeling. The project, funded by the U.S. Department of Energy's National Energy Technology Laboratory (NETL), focuses on the Newark basin as a representative locality for a series of Mesozoic rift basins in eastern North America considered as potential CO2 storage sites. An extensive core characterization program, which included laboratory CT scans, XRD, SEM, MICP, porosity, permeability, acoustic velocity measurements, and geomechanical testing under a range of confining pressures, revealed large variability and heterogeneity in both petrophysical and mechanical properties. Estimates of unconfined compressive strength for these predominantly lacustrine mudstones range from 5,000 to 50,000 psi, with only a weak correlation to clay content. Thinly bedded intervals exhibit up to 30% strength anisotropy. Mineralized fractures, abundant in most formations, are characterized by compressive strength as low as 10% of matrix strength. Upscaling these observations from core to reservoir scale is challenging. No simple one-to-one correlation between mechanical and petrophysical properties exists, and therefore, we develop multivariate empirical relationships among these properties. A large suite of geophysical logs, including new measurements of the in situ stress field, is used to extrapolate these relationships to a basin-scale geomechanical model and predict mudstone behavior under injection conditions.

  10. Multi-Scale Transport Properties of Fine-Grained Rocks: A Case Study of the Kirtland Formation, San Juan Basin, USA

    NASA Astrophysics Data System (ADS)

    Heath, J. E.; Dewers, T. A.; McPherson, B. J.; Wilson, T. H.; Flach, T.

    2009-12-01

    Understanding and characterizing transport properties of fine-grained rocks is critical in development of shale gas plays or assessing retention of CO2 at geologic storage sites. Difficulties arise in that both small scale (i.e., ~ nm) properties of the rock matrix and much larger scale fractures, faults, and sedimentological architecture govern migration of multiphase fluids. We present a multi-scale investigation of sealing and transport properties of the Kirtland Formation, which is a regional aquitard and reservoir seal in the San Juan Basin, USA. Sub-micron dual FIB/SEM imaging and reconstruction of 3D pore networks in core samples reveal a variety of pore types, including slit-shaped pores that are co-located with sedimentary structures and variations in mineralogy. Micron-scale chemical analysis and XRD reveal a mixture of mixed-layer smectite/illite, chlorite, quartz, and feldspar with little organic matter. Analysis of sub-micron digital reconstructions, mercury capillary injection pressure, and gas breakthrough measurements indicate a high quality sealing matrix. Natural full and partially mineralized fractures observed in core and in FMI logs include those formed from early soil-forming processes, differential compaction, and tectonic events. The potential impact of both fracture and matrix properties on large-scale transport is investigated through an analysis of natural helium from core samples, 3D seismic data and poro-elastic modeling. While seismic interpretations suggest considerable fracturing of the Kirtland, large continuous fracture zones and faults extending through the seal to the surface cannot be inferred from the data. Observed Kirtland Formation multi-scale transport properties are included as part of a risk assessment methodology for CO2 storage. Acknowledgements: The authors gratefully acknowledge the U.S. Department of Energy's (DOE) National Energy Technology Laboratory for sponsoring this project. The DOE's Basic Energy Science Office funded the dual FIB/SEM analysis. The Kirtland Formation overlies the coal seams of the Fruitland into which CO2 has been injected as a Phase II demonstration of the Southwest Regional Partnership on Carbon Sequestration. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the U.S. Department of Energy under contract DE-AC04-94AL85000.

  11. A simplified gross primary production and evapotranspiration model for boreal coniferous forests - is a generic calibration sufficient?

    NASA Astrophysics Data System (ADS)

    Minunno, F.; Peltoniemi, M.; Launiainen, S.; Aurela, M.; Lindroth, A.; Lohila, A.; Mammarella, I.; Minkkinen, K.; Mäkelä, A.

    2015-07-01

    The problem of model complexity has been the subject of lively debate in environmental sciences as well as in the forest modelling community. Simple models are less input demanding and their calibration involves a lower number of parameters, but they might be suitable only at local scale. In this work we calibrated a simplified ecosystem process model (PRELES) to data from multiple sites and tested whether PRELES can be used at regional scale to estimate the carbon and water fluxes of boreal conifer forests. We compared a multi-site (M-S) calibration with site-specific (S-S) calibrations. Model calibrations and evaluations were carried out by means of Bayesian methods; Bayesian calibration (BC) and Bayesian model comparison (BMC) were used to quantify the uncertainty in model parameters and model structure. To evaluate model performance, BMC results were combined with a more classical analysis of model-data mismatch (M-DM). Evapotranspiration (ET) and gross primary production (GPP) measurements collected at 10 sites in Finland and Sweden were used in the study. Calibration results showed that similar estimates were obtained for the parameters to which model outputs are most sensitive. No significant differences were encountered in the predictions of the multi-site and site-specific versions of PRELES, with the exception of a site with agricultural history (Alkkia). Although PRELES predicted GPP better than evapotranspiration, we concluded that the model can be reliably used at regional scale to simulate the carbon and water fluxes of boreal forests. Our analyses also underlined the importance of using long and carefully collected flux datasets in model calibration. In fact, even a single site can provide model calibrations that can be applied at a wider spatial scale, since it covers a wide range of variability in climatic conditions.
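
    As a schematic of Bayesian calibration against flux observations, the sketch below runs a generic Metropolis sampler on a toy one-parameter light-use-efficiency model with synthetic GPP data; the toy model, prior bounds, and noise level are all assumptions and have nothing to do with PRELES or the authors' actual setup.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "observations": GPP proportional to absorbed radiation with noise.
radiation = rng.uniform(5, 30, size=100)        # arbitrary driver values
true_lue = 1.2                                  # "true" parameter used to fake data
gpp_obs = true_lue * radiation + rng.normal(0, 2.0, radiation.size)

def log_posterior(lue, sigma=2.0):
    """Gaussian likelihood for the toy linear model plus a flat prior on (0, 5)."""
    if not 0.0 < lue < 5.0:
        return -np.inf
    resid = gpp_obs - lue * radiation
    return -0.5 * np.sum((resid / sigma) ** 2)

def metropolis(n_iter=5000, step=0.05):
    samples, current = [], 1.0
    logp = log_posterior(current)
    for _ in range(n_iter):
        proposal = current + rng.normal(0, step)
        logp_prop = log_posterior(proposal)
        if np.log(rng.uniform()) < logp_prop - logp:   # accept/reject
            current, logp = proposal, logp_prop
        samples.append(current)
    return np.array(samples[1000:])                    # discard burn-in

posterior = metropolis()
print(f"posterior mean LUE = {posterior.mean():.3f} +/- {posterior.std():.3f}")
```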

  12. Compact Multimedia Systems in Multi-chip Module Technology

    NASA Technical Reports Server (NTRS)

    Fang, Wai-Chi; Alkalaj, Leon

    1995-01-01

    This tutorial paper shows advanced multimedia system designs based on multi-chip module (MCM) technologies that provide essential computing, compression, communication, and storage capabilities for various large-scale information highway applications.

  13. Determining the Influence of Groundwater Composition on the Performance of Arsenic Adsorption Columns Using Rapid Small-Scale Column Tests

    NASA Astrophysics Data System (ADS)

    Aragon, A. R.; Siegel, M.

    2004-12-01

    The USEPA has established a more stringent drinking water standard for arsenic, reducing the maximum contaminant level (MCL) from 50 μg/L to 10 μg/L. This will affect many small communities in the US that lack the appropriate treatment infrastructure and funding to reduce arsenic to such levels. For such communities, adsorption systems are the preferred technology based on ease of operation and relatively lower costs. The performance of adsorption media for the removal of arsenic from drinking water is dependent on site-specific water quality. At certain concentrations, co-occurring solutes will compete effectively with arsenic for sorption sites, potentially reducing the sorption capacity of the media. Due to the site-specific nature of water quality and variations in media properties, pilot scale studies are typically carried out to ensure that a proposed treatment technique is cost effective before installation of a full-scale system. Sandia National Laboratories is currently developing an approach to utilize rapid small-scale columns in lieu of pilot columns to test innovative technologies that could significantly reduce the cost of treatment in small communities. Rapid small-scale column tests (RSSCTs) were developed to predict full-scale treatment of organic contaminants by adsorption onto granular activated carbon (GAC). This process greatly reduced the time and costs required to verify performance of GAC adsorption columns. In this study, the RSSCT methodology is used to predict the removal of inorganic arsenic using mixed metal oxyhydroxide adsorption media. The media are engineered and synthesized from materials that control arsenic behavior in natural and disturbed systems. We describe the underlying theory and application of RSSCTs for the performance evaluation of novel media in several groundwater compositions. Results of small-scale laboratory columns are being used to predict the performance of pilot-scale systems and ultimately to design full-scale systems. RSSCTs will be performed on a suite of water compositions representing the variety of water supplies in the United States that are affected by the new drinking water standard. Ultimately, this approach will be used to carry out inexpensive short-term pilot studies at a large number of sites where large-scale pilots are not economically feasible. Sandia National Laboratories is a multi-program laboratory operated by Sandia Corporation, a Lockheed Martin company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.

  14. A Multi-Scale Settlement Matching Algorithm Based on ARG

    NASA Astrophysics Data System (ADS)

    Yue, Han; Zhu, Xinyan; Chen, Di; Liu, Lingjia

    2016-06-01

    Homonymous entity matching is an important part of multi-source spatial data integration, automatic updating and change detection. Considering the low accuracy of existing methods in matching multi-scale settlement data, an algorithm based on the Attributed Relational Graph (ARG) is proposed. The algorithm first divides two settlement scenes at different scales into blocks using the small-scale road network and constructs local ARGs in each block. It then ascertains candidate sets by merging procedures and obtains the optimal matching pairs by iteratively comparing the similarity of the ARGs. Finally, the corresponding relations between settlements at large and small scales are identified. At the end of this article, a demonstration is presented and the results indicate that the proposed algorithm is capable of handling sophisticated cases.
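
    To illustrate the flavor of attribute-based similarity matching between settlement features at two scales, here is a toy greedy matcher that scores candidate pairs by name and area similarity and keeps the best non-conflicting matches. The attributes, weights, and example records are invented; this is not the paper's iterative ARG algorithm.

```python
from difflib import SequenceMatcher

# Toy settlement records at two map scales: (id, name, area in km^2).
large_scale = [("L1", "Qingshan", 12.4), ("L2", "Hekou", 3.1), ("L3", "Baisha", 7.8)]
small_scale = [("S1", "Qing Shan", 11.9), ("S2", "Baisha Town", 8.2), ("S3", "Hekou", 2.9)]

def similarity(a, b, w_name=0.6, w_area=0.4):
    """Weighted combination of name similarity and relative area agreement."""
    name_sim = SequenceMatcher(None, a[1].lower(), b[1].lower()).ratio()
    area_sim = 1.0 - abs(a[2] - b[2]) / max(a[2], b[2])
    return w_name * name_sim + w_area * area_sim

def greedy_match(set_a, set_b, threshold=0.6):
    """Keep the highest-scoring pairs above the threshold, one match per entity."""
    scored = sorted(((similarity(a, b), a[0], b[0]) for a in set_a for b in set_b),
                    reverse=True)
    used_a, used_b, matches = set(), set(), []
    for score, ida, idb in scored:
        if score >= threshold and ida not in used_a and idb not in used_b:
            matches.append((ida, idb, round(score, 2)))
            used_a.add(ida)
            used_b.add(idb)
    return matches

print(greedy_match(large_scale, small_scale))
```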

  15. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    NASA Astrophysics Data System (ADS)

    Fonseca, R. A.; Vieira, J.; Fiuza, F.; Davidson, A.; Tsung, F. S.; Mori, W. B.; Silva, L. O.

    2013-12-01

    A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-Class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ~10^6 cores and sustained performance over ~2 PFlops is demonstrated, opening the way for large scale modelling of LWFA scenarios.
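
    Dynamic load balancing in a particle code amounts to repartitioning cells among processes so that each holds roughly the same number of particles. The sketch below shows the idea in one dimension with a greedy prefix-sum partition; it is a conceptual illustration only and is unrelated to OSIRIS's actual multi-dimensional scheme.

```python
import numpy as np

def rebalance_1d(particles_per_cell, n_ranks):
    """Assign contiguous cell ranges to ranks so particle counts are roughly equal.

    Returns a list of (first_cell, last_cell_exclusive) tuples, one per rank.
    """
    counts = np.asarray(particles_per_cell, dtype=float)
    target = counts.sum() / n_ranks
    boundaries, acc, start = [], 0.0, 0
    for i, c in enumerate(counts):
        acc += c
        # close the current rank's slab once it holds roughly its share,
        # keeping at least one cell in reserve for each remaining rank
        if (acc >= target and len(boundaries) < n_ranks - 1
                and len(counts) - i - 1 >= n_ranks - len(boundaries) - 1):
            boundaries.append((start, i + 1))
            start, acc = i + 1, 0.0
    boundaries.append((start, len(counts)))
    return boundaries

rng = np.random.default_rng(3)
cells = rng.integers(0, 1000, size=64)              # uneven particle distribution
for rank, (a, b) in enumerate(rebalance_1d(cells, 4)):
    print(f"rank {rank}: cells [{a}, {b}), particles = {cells[a:b].sum()}")
```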

  16. Multi-scale Modeling of Energy Balance Fluxes in a Dense Tamarisk Riparian Forest

    NASA Astrophysics Data System (ADS)

    Neale, C. M.; Santos, C. A.; Watts, D.; Osterberg, J.; Hipps, L. E.; Sritharan, S. I.

    2008-12-01

    Remote sensing of energy balance fluxes has become operationally more viable over the last 10 years with the development of more robust multi-layer models and the availability of quasi-real time satellite imagery from most sensors. Riparian corridors in semi-arid and arid areas present a challenge to satellite based techniques for estimating evapotranspiration due to issues of scale and pixel resolution, especially when using the thermal infrared bands. This paper will present energy balance measurement and modeling results over a Salt Cedar (Tamarix Ramosissima) forest in the Cibola National Wildlife Refuge along the Colorado River south of Blythe, CA. The research site encompasses a 600 hectare area populated by mostly Tamarisk stands of varying density. Three Bowen ratio systems are installed on tall towers within varying densities of forest cover in the upwind footprint and growing under varying depths to the water table. An additional eddy covariance tower is installed alongside a Bowen ratio system on one of the towers. Flux data has been gathered continuously since early 2007. In the summer of 2007, a Scintec large aperture scintillometer was installed between two of the towers over 1 km apart and has been working continuously along with the flux towers. Two intensive field campaigns were organized in June 2007 and May 2008 to coincide with LANDSAT TM5, MODIS and ASTER overpasses. High resolution multispectral and thermal imagery was acquired at the same time with the USU airborne system to provide information for the up- scaling of the energy balance fluxes from tower to satellite scales. The paper will present comparisons between the different energy balance measuring techniques under the highly advective conditions of the experimental site, concentrating on the scintillometer data. Preliminary results of remotely sensed modeling of the fluxes at different scales and model complexity will also be presented.
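
    The Bowen ratio method referenced above partitions available energy between sensible and latent heat. For context, the standard energy-balance closure it relies on is shown below (textbook relations, not equations taken from this abstract), where Rn is net radiation, G soil heat flux, H sensible heat flux, LE latent heat flux, and β the Bowen ratio:

```latex
R_n - G \;=\; H + LE, \qquad \beta \;=\; \frac{H}{LE}
\quad\Longrightarrow\quad
LE \;=\; \frac{R_n - G}{1 + \beta}, \qquad H \;=\; \frac{\beta\,(R_n - G)}{1 + \beta}
```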

  17. What is the benefit of driving a hydrological model with data from a multi-site weather generator compared to data from a simple delta change approach?

    NASA Astrophysics Data System (ADS)

    Rössler, Ole; Keller, Denise; Fischer, Andreas

    2016-04-01

    In 2011, the Swiss national consortium C2SM provided new climate change scenarios for Switzerland, comprising a comprehensive data set of temperature and precipitation changes under climate change conditions for a large network of meteorological stations as well as for regions across Switzerland. These climate change signals were generated for three emission scenarios and three future time-periods and were designed to be used as delta change factors. This data set proved very successful, as researchers, private companies, and societal users alike were able to use and interpret it, and the range of applications based on the same climate data set enabled a comparable view on climate change impacts across several disciplines. The main limitation of, and criticism levelled at, this data set was the use of the delta change approach for downscaling, which underestimates changes in extreme values and neglects changes in variability, be it year-to-year or day-to-day, as well as changes in temporal sequences. One way to overcome the latter limitation is the use of stochastic weather generators in a downscaling context. A common limitation of most weather generators, however, is the absence of spatial correlation in the generated daily time-series, so that areal means over several stations are often low-biased. A realistic representation of the inter-station correlation in the downscaled time-series is of particular importance for hydrological impact studies. Recently, a multi-site weather generator was developed and tested for downscaling purposes over Switzerland (Keller et al. 2014). The weather generator is of Richardson type and is run with spatially correlated random number streams to ensure spatial consistency. As a downside, multi-site weather generators are much more complex to develop, but they are a very promising alternative downscaling technique. In this study, we tested this multi-site weather generator against "standard" delta-change-derived data in a hydrological impact assessment focused on runoff in the meso-scale catchment of the river Thur. Two hydrological models of different complexity were run with both data sets under present (1980-2009) and future (2070-2099) conditions, assuming the SRES A1B emission scenario. Eight meteorological stations were used to interpolate the meteorological fields that served as input to calibrate and validate the two hydrological models against runoff. The downscaling intercomparison was done for 10 GCM-RCM simulations of the ENSEMBLES project for which change factors for each station (delta change approach) were available; in the case of the weather generator, which allows for multiple synthetic realizations, we generated 25 realizations of multi-site weather with each climate model projection.
    Results show that the delta-change-driven data constitute only one realization within the bandwidth of runoff projections yielded by the multi-site weather generator data. On average, differences between the two approaches are small. Runoff values at both extremes (low and high flows) are, however, better reproduced with the weather-generator-driven data set; the stochastic representation of multi-day rainfall events is considered the main reason. Hence, there is a clear yet small added value over the delta change approach, which in turn performs rather well. Although these small but notable differences might call into question the considerable effort needed to construct a multi-site weather generator, the potential for further development of the multi-site weather generator is undoubted.
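
    To contrast the two downscaling strategies concretely, the sketch below applies (i) a delta change perturbation to an observed precipitation and temperature series and (ii) a crude Richardson-type multi-site generator that draws occurrence and amounts from spatially correlated Gaussian streams. All parameter values, station correlations, and change factors are invented for illustration and are not those of Keller et al. (2014).

```python
import numpy as np

rng = np.random.default_rng(4)
n_days, n_sites = 90, 3

# (i) Delta change: scale observed precipitation and shift observed temperature.
precip_obs = rng.gamma(shape=0.6, scale=5.0, size=n_days)     # stand-in observations
temp_obs = 10 + 8 * np.sin(np.linspace(0, np.pi, n_days)) + rng.normal(0, 2, n_days)
precip_future = precip_obs * 1.10                             # +10% change factor (assumed)
temp_future = temp_obs + 2.5                                  # +2.5 K change factor (assumed)

# (ii) Richardson-type multi-site generator: spatially correlated random streams
# drive wet/dry occurrence and precipitation amounts at all sites jointly.
site_corr = np.array([[1.0, 0.8, 0.6],
                      [0.8, 1.0, 0.7],
                      [0.6, 0.7, 1.0]])                       # assumed inter-station correlation
wet_prob, mean_wet_amount = 0.4, 6.0
z = rng.multivariate_normal(np.zeros(n_sites), site_corr, size=n_days)
wet = z > np.quantile(z, 1 - wet_prob)                        # correlated occurrence
amounts = np.where(wet, -mean_wet_amount * np.log(rng.uniform(size=z.shape)), 0.0)

print("delta change: mean daily precip", precip_obs.mean(), "->", precip_future.mean())
print("generator: inter-site correlation of daily totals\n",
      np.round(np.corrcoef(amounts.T), 2))
```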

  18. NASA Exhibits

    NASA Technical Reports Server (NTRS)

    Deardorff, Glenn; Djomehri, M. Jahed; Freeman, Ken; Gambrel, Dave; Green, Bryan; Henze, Chris; Hinke, Thomas; Hood, Robert; Kiris, Cetin; Moran, Patrick

    2001-01-01

    A series of NASA presentations for the Supercomputing 2001 conference is summarized. The topics include: (1) Mars Surveyor Landing Sites "Collaboratory"; (2) Parallel and Distributed CFD for Unsteady Flows with Moving Overset Grids; (3) IP Multicast for Seamless Support of Remote Science; (4) Consolidated Supercomputing Management Office; (5) Growler: A Component-Based Framework for Distributed/Collaborative Scientific Visualization and Computational Steering; (6) Data Mining on the Information Power Grid (IPG); (7) Debugging on the IPG; (8) Debakey Heart Assist Device; (9) Unsteady Turbopump for Reusable Launch Vehicle; (10) Exploratory Computing Environments Component Framework; (11) OVERSET Computational Fluid Dynamics Tools; (12) Control and Observation in Distributed Environments; (13) Multi-Level Parallelism Scaling on NASA's Origin 1024 CPU System; (14) Computing, Information, & Communications Technology; (15) NAS Grid Benchmarks; (16) IPG: A Large-Scale Distributed Computing and Data Management System; and (17) ILab: Parameter Study Creation and Submission on the IPG.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pau, G. S. H.; Bisht, G.; Riley, W. J.

    Existing land surface models (LSMs) describe physical and biological processes that occur over a wide range of spatial and temporal scales. For example, biogeochemical and hydrological processes responsible for carbon (CO2, CH4) exchanges with the atmosphere range from the molecular scale (pore-scale O2 consumption) to tens of kilometers (vegetation distribution, river networks). Additionally, many processes within LSMs are nonlinearly coupled (e.g., methane production and soil moisture dynamics), and therefore simple linear upscaling techniques can result in large prediction error. In this paper we applied a reduced-order modeling (ROM) technique known as the "proper orthogonal decomposition mapping method" that reconstructs temporally resolved fine-resolution solutions based on coarse-resolution solutions. We developed four different methods and applied them to four study sites in a polygonal tundra landscape near Barrow, Alaska. Coupled surface–subsurface isothermal simulations were performed for summer months (June–September) at fine (0.25 m) and coarse (8 m) horizontal resolutions. We used simulation results from three summer seasons (1998–2000) to build ROMs of the 4-D soil moisture field for the study sites individually (single-site) and aggregated (multi-site). The results indicate that the ROM produced a significant computational speedup (> 10^3) with very small relative approximation error (< 0.1%) for 2 validation years not used in training the ROM. We also demonstrate that our approach: (1) efficiently corrects for coarse-resolution model bias and (2) can be used for polygonal tundra sites not included in the training data set with relatively good accuracy (< 1.7% relative error), thereby allowing for the possibility of applying these ROMs across a much larger landscape. By coupling the ROMs constructed at different scales together hierarchically, this method has the potential to efficiently increase the resolution of land models for coupled climate simulations to spatial scales consistent with mechanistic physical process representation.
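
    A minimal numpy sketch of the proper-orthogonal-decomposition idea follows: build a POD basis from fine-resolution training snapshots via SVD, then map a new coarse-resolution solution to fine resolution by least-squares fitting the basis coefficients. It is a generic illustration under simplified assumptions (a fixed block-average coarsening operator, synthetic 1-D fields, an arbitrary number of retained modes), not the authors' four methods.

```python
import numpy as np

rng = np.random.default_rng(5)
n_fine, n_coarse, n_snapshots = 256, 32, 60

def coarsen(field_fine):
    """Block-average a fine field (length 256) onto the coarse grid (length 32)."""
    return field_fine.reshape(n_coarse, -1).mean(axis=1)

# Synthetic training snapshots: smooth random fields standing in for soil moisture.
x = np.linspace(0, 1, n_fine)
train = np.array([np.sin(2 * np.pi * (k % 5 + 1) * x) + 0.1 * rng.normal(size=n_fine)
                  for k in range(n_snapshots)])

# POD basis from the snapshot matrix (rows = snapshots).
mean_field = train.mean(axis=0)
U, s, Vt = np.linalg.svd(train - mean_field, full_matrices=False)
basis = Vt[:8]                                  # keep the 8 leading modes (assumed)

def reconstruct_fine(coarse_solution):
    """Fit basis coefficients so the coarsened reconstruction matches the coarse
    solution, then evaluate the basis at fine resolution."""
    A = np.array([coarsen(mode) for mode in basis]).T        # (n_coarse, n_modes)
    coeffs, *_ = np.linalg.lstsq(A, coarse_solution - coarsen(mean_field), rcond=None)
    return mean_field + coeffs @ basis

truth = np.sin(2 * np.pi * 3 * x) + 0.1 * rng.normal(size=n_fine)
approx = reconstruct_fine(coarsen(truth))
print("relative error:", np.linalg.norm(approx - truth) / np.linalg.norm(truth))
```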

  20. Comparative Evaluation of Registration Algorithms in Different Brain Databases With Varying Difficulty: Results and Insights

    PubMed Central

    Akbari, Hamed; Bilello, Michel; Da, Xiao; Davatzikos, Christos

    2015-01-01

    Evaluating various algorithms for the inter-subject registration of brain magnetic resonance images (MRI) is a necessary topic receiving growing attention. Existing studies evaluated image registration algorithms in specific tasks or using specific databases (e.g., only for skull-stripped images, only for single-site images, etc.). Consequently, the choice of registration algorithms seems task- and usage/parameter-dependent. Nevertheless, recent large-scale, often multi-institutional imaging-related studies create the need and raise the question whether some registration algorithms can 1) generally apply to various tasks/databases posing various challenges; 2) perform consistently well, and while doing so, 3) require minimal or ideally no parameter tuning. In seeking answers to this question, we evaluated 12 general-purpose registration algorithms, for their generality, accuracy and robustness. We fixed their parameters at values suggested by algorithm developers as reported in the literature. We tested them in 7 databases/tasks, which present one or more of 4 commonly-encountered challenges: 1) inter-subject anatomical variability in skull-stripped images; 2) intensity homogeneity, noise and large structural differences in raw images; 3) imaging protocol and field-of-view (FOV) differences in multi-site data; and 4) missing correspondences in pathology-bearing images. Totally 7,562 registrations were performed. Registration accuracies were measured by (multi-)expert-annotated landmarks or regions of interest (ROIs). To ensure reproducibility, we used public software tools, public databases (whenever possible), and we fully disclose the parameter settings. We show evaluation results, and discuss the performances in light of algorithms’ similarity metrics, transformation models and optimization strategies. We also discuss future directions for the algorithm development and evaluations. PMID:24951685

  1. An i2b2-based, generalizable, open source, self-scaling chronic disease registry

    PubMed Central

    Quan, Justin; Ortiz, David M; Bousvaros, Athos; Ilowite, Norman T; Inman, Christi J; Marsolo, Keith; McMurry, Andrew J; Sandborg, Christy I; Schanberg, Laura E; Wallace, Carol A; Warren, Robert W; Weber, Griffin M; Mandl, Kenneth D

    2013-01-01

    Objective Registries are a well-established mechanism for obtaining high quality, disease-specific data, but are often highly project-specific in their design, implementation, and policies for data use. In contrast to the conventional model of centralized data contribution, warehousing, and control, we design a self-scaling registry technology for collaborative data sharing, based upon the widely adopted Integrating Biology & the Bedside (i2b2) data warehousing framework and the Shared Health Research Information Network (SHRINE) peer-to-peer networking software. Materials and methods Focusing our design around creation of a scalable solution for collaboration within multi-site disease registries, we leverage the i2b2 and SHRINE open source software to create a modular, ontology-based, federated infrastructure that provides research investigators full ownership and access to their contributed data while supporting permissioned yet robust data sharing. We accomplish these objectives via web services supporting peer-group overlays, group-aware data aggregation, and administrative functions. Results The 56-site Childhood Arthritis & Rheumatology Research Alliance (CARRA) Registry and 3-site Harvard Inflammatory Bowel Diseases Longitudinal Data Repository now utilize i2b2 self-scaling registry technology (i2b2-SSR). This platform, extensible to federation of multiple projects within and between research networks, encompasses >6000 subjects at sites throughout the USA. Discussion We utilize the i2b2-SSR platform to minimize technical barriers to collaboration while enabling fine-grained control over data sharing. Conclusions The implementation of i2b2-SSR for the multi-site, multi-stakeholder CARRA Registry has established a digital infrastructure for community-driven research data sharing in pediatric rheumatology in the USA. We envision i2b2-SSR as a scalable, reusable solution facilitating interdisciplinary research across diseases. PMID:22733975

  2. An i2b2-based, generalizable, open source, self-scaling chronic disease registry.

    PubMed

    Natter, Marc D; Quan, Justin; Ortiz, David M; Bousvaros, Athos; Ilowite, Norman T; Inman, Christi J; Marsolo, Keith; McMurry, Andrew J; Sandborg, Christy I; Schanberg, Laura E; Wallace, Carol A; Warren, Robert W; Weber, Griffin M; Mandl, Kenneth D

    2013-01-01

    Registries are a well-established mechanism for obtaining high quality, disease-specific data, but are often highly project-specific in their design, implementation, and policies for data use. In contrast to the conventional model of centralized data contribution, warehousing, and control, we design a self-scaling registry technology for collaborative data sharing, based upon the widely adopted Integrating Biology & the Bedside (i2b2) data warehousing framework and the Shared Health Research Information Network (SHRINE) peer-to-peer networking software. Focusing our design around creation of a scalable solution for collaboration within multi-site disease registries, we leverage the i2b2 and SHRINE open source software to create a modular, ontology-based, federated infrastructure that provides research investigators full ownership and access to their contributed data while supporting permissioned yet robust data sharing. We accomplish these objectives via web services supporting peer-group overlays, group-aware data aggregation, and administrative functions. The 56-site Childhood Arthritis & Rheumatology Research Alliance (CARRA) Registry and 3-site Harvard Inflammatory Bowel Diseases Longitudinal Data Repository now utilize i2b2 self-scaling registry technology (i2b2-SSR). This platform, extensible to federation of multiple projects within and between research networks, encompasses >6000 subjects at sites throughout the USA. We utilize the i2b2-SSR platform to minimize technical barriers to collaboration while enabling fine-grained control over data sharing. The implementation of i2b2-SSR for the multi-site, multi-stakeholder CARRA Registry has established a digital infrastructure for community-driven research data sharing in pediatric rheumatology in the USA. We envision i2b2-SSR as a scalable, reusable solution facilitating interdisciplinary research across diseases.

  3. Users matter : multi-agent systems model of high performance computing cluster users.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, M. J.; Hood, C. S.; Decision and Information Sciences

    2005-01-01

    High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.
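
    As a toy illustration of user-level agent modeling (not the study's actual multi-agent system), the sketch below has user agents that adapt how many jobs they submit based on how long their previous jobs waited behind a shared backlog; all behavioral rules and parameters are invented.

```python
import random

random.seed(7)

class UserAgent:
    """A cluster user who submits fewer jobs after experiencing long waits."""
    def __init__(self, name):
        self.name = name
        self.jobs_per_day = 5
        self.last_wait = 0.0

    def adapt(self):
        # simple adaptive rule: back off when waits grow, ramp up when they shrink
        if self.last_wait > 2.0:
            self.jobs_per_day = max(1, self.jobs_per_day - 1)
        else:
            self.jobs_per_day += 1

def simulate(n_days=10, n_nodes=20, job_hours=4.0):
    users = [UserAgent(f"user{i}") for i in range(8)]
    for day in range(n_days):
        submitted = sum(u.jobs_per_day for u in users)
        # crude queueing proxy: waits grow with demand beyond daily node-hours
        backlog_hours = max(0.0, submitted * job_hours - n_nodes * 24)
        for u in users:
            u.last_wait = backlog_hours / n_nodes + random.uniform(0, 0.5)
            u.adapt()
        print(f"day {day}: {submitted} jobs submitted, backlog {backlog_hours:5.1f} node-hours")

simulate()
```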

  4. Disappearance of Anisotropic Intermittency in Large-amplitude MHD Turbulence and Its Comparison with Small-amplitude MHD Turbulence

    NASA Astrophysics Data System (ADS)

    Yang, Liping; Zhang, Lei; He, Jiansen; Tu, Chuanyi; Li, Shengtai; Wang, Xin; Wang, Linghua

    2018-03-01

    Multi-order structure functions in the solar wind are reported to display a monofractal scaling when sampled parallel to the local magnetic field and a multifractal scaling when measured perpendicularly. Whether and to what extent will the scaling anisotropy be weakened by the enhancement of turbulence amplitude relative to the background magnetic strength? In this study, based on two runs of the magnetohydrodynamic (MHD) turbulence simulation with different relative levels of turbulence amplitude, we investigate and compare the scaling of multi-order magnetic structure functions and magnetic probability distribution functions (PDFs) as well as their dependence on the direction of the local field. The numerical results show that for the case of large-amplitude MHD turbulence, the multi-order structure functions display a multifractal scaling at all angles to the local magnetic field, with PDFs deviating significantly from the Gaussian distribution and a flatness larger than 3 at all angles. In contrast, for the case of small-amplitude MHD turbulence, the multi-order structure functions and PDFs have different features in the quasi-parallel and quasi-perpendicular directions: a monofractal scaling and Gaussian-like distribution in the former, and a conversion of a monofractal scaling and Gaussian-like distribution into a multifractal scaling and non-Gaussian tail distribution in the latter. These results hint that when intermittencies are abundant and intense, the multifractal scaling in the structure functions can appear even if it is in the quasi-parallel direction; otherwise, the monofractal scaling in the structure functions remains even if it is in the quasi-perpendicular direction.
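
    For reference, multi-order structure functions and the flatness of a fluctuating field can be estimated directly from increments, as in the generic numpy sketch below; the synthetic Brownian-type signal and the lag range are assumptions, not the simulation data analyzed in the paper.

```python
import numpy as np

def structure_functions(field, lags, orders=(1, 2, 3, 4)):
    """S_q(l) = <|field(x + l) - field(x)|^q> for each lag l and order q."""
    sf = {q: [] for q in orders}
    for lag in lags:
        increments = np.abs(field[lag:] - field[:-lag])
        for q in orders:
            sf[q].append(np.mean(increments ** q))
    return {q: np.array(v) for q, v in sf.items()}

rng = np.random.default_rng(6)
# synthetic 1-D magnetic-field-like signal: Brownian-type fluctuations
b = np.cumsum(rng.normal(size=2 ** 14))
lags = np.unique(np.logspace(0, 3, 20).astype(int))
sf = structure_functions(b, lags)

# flatness = S_4 / S_2^2; values near 3 indicate Gaussian-like increments
flatness = sf[4] / sf[2] ** 2
for lag, f in zip(lags, flatness):
    print(f"lag {lag:4d}: flatness {f:5.2f}")
```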

  5. Productivity benefits of warming at regional scale could be offset by detrimental impacts on site level hydrology.

    PubMed

    Zeng, Qing; Zhang, Yamian; Wen, Li; Li, Zhaxijie; Duo, Hairui; Lei, Guangchun

    2017-11-09

    Climate change affects the distribution and persistence of wildlife. Broad-scale studies have demonstrated that climate change shifts the geographic ranges and phenology of species. These findings are influential for making high-level strategies but are not practical enough to guide site-specific management. In this study, we explored the environmental factors affecting the population of the Bar-headed Goose in the key breeding site of Qinghai using a generalized additive mixed model (GAMM). Our results showed that 1) there were significant increasing trends in climate variables and river flows to Qinghai Lake; 2) NDVI at the sites decreased significantly despite the regional positive trend induced by the warmer and wetter climate; 3) NDVI at the site scale was negatively correlated with lake water level; and 4) the abundance of the Bar-headed Goose decreased significantly at all sites. While abundance was positively related to NDVI at breeding sites, the GAMM revealed an opposite relationship at foraging areas. Our findings demonstrated the multi-faceted effects of climate change on population dynamics; the effect at the global/regional scale could be complicated by site-level factors.

  6. Large-scale recording of thalamocortical circuits: in vivo electrophysiology with the two-dimensional electronic depth control silicon probe

    PubMed Central

    Fiáth, Richárd; Beregszászi, Patrícia; Horváth, Domonkos; Wittner, Lucia; Aarts, Arno A. A.; Ruther, Patrick; Neves, Hercules P.; Bokor, Hajnalka; Acsády, László

    2016-01-01

    Recording simultaneous activity of a large number of neurons in distributed neuronal networks is crucial to understand higher order brain functions. We demonstrate the in vivo performance of a recently developed electrophysiological recording system comprising a two-dimensional, multi-shank, high-density silicon probe with integrated complementary metal-oxide semiconductor electronics. The system implements the concept of electronic depth control (EDC), which enables the electronic selection of a limited number of recording sites on each of the probe shafts. This innovative feature of the system permits simultaneous recording of local field potentials (LFP) and single- and multiple-unit activity (SUA and MUA, respectively) from multiple brain sites with high quality and without the actual physical movement of the probe. To evaluate the in vivo recording capabilities of the EDC probe, we recorded LFP, MUA, and SUA in acute experiments from cortical and thalamic brain areas of anesthetized rats and mice. The advantages of large-scale recording with the EDC probe are illustrated by investigating the spatiotemporal dynamics of pharmacologically induced thalamocortical slow-wave activity in rats and by the two-dimensional tonotopic mapping of the auditory thalamus. In mice, spatial distribution of thalamic responses to optogenetic stimulation of the neocortex was examined. Utilizing the benefits of the EDC system may result in a higher yield of useful data from a single experiment compared with traditional passive multielectrode arrays, and thus in the reduction of animals needed for a research study. PMID:27535370

  7. Ground-truthing electrical resistivity methods in support of submarine groundwater discharge studies: Examples from Hawaii, Washington, and California

    USGS Publications Warehouse

    Johnson, Cordell; Swarzenski, Peter W.; Richardson, Christina M.; Smith, Christopher G.; Kroeger, Kevin D.; Ganguli, Priya M.

    2015-01-01

    Rigorous ground-truthing at each field site showed that multi-channel electrical resistivity techniques can reproduce the scales and dynamics of a seepage field when such data are correctly collected and when the model inversions are tuned to field-site characteristics. Such information can provide a unique perspective on the scales and dynamics of exchange processes within a coastal aquifer, information that is essential to scientists and resource managers alike.

  8. Multi-Chambered Treatment Train (MCTT) For Treating Stormwater Runoff From Highly Polluted Source Areas

    EPA Science Inventory

    A full-scale Multi-Chambered Treatment Train (MCTT) stormwater treatment system was tested in Taiwan during the spring and summer of 2007. The MCTT was installed in a parking lot in Ping-Lin, Northern Taiwan. The site is 85% impervious and has a drainage area to the MCTT unit of...

  9. Value-focused framework for defining landscape-scale conservation targets

    USGS Publications Warehouse

    Romañach, Stephanie; Benscoter, Allison M.; Brandt, Laura A.

    2016-01-01

    Conservation of natural resources can be challenging in a rapidly changing world and requires collaborative efforts for success. Conservation planning is the process of deciding how to protect, conserve, and enhance, or minimize loss of, natural and cultural resources. Establishing conservation targets (also called indicators or endpoints), the measurable expressions of desired resource conditions, can help with conservation planning from site-specific up to landscape scales. Using conservation targets and tracking them through time can deliver benefits such as insight into ecosystem health and early warnings about undesirable trends. We describe an approach using value-focused thinking to develop statewide conservation targets for Florida. Using such an approach allowed us to first identify stakeholder objectives and then define conservation targets to meet those objectives. Stakeholders were able to see how their shared efforts fit into the broader conservation context and to anticipate the benefits of multi-agency and multi-organization collaboration. We developed an iterative process for large-scale conservation planning that included defining a shared framework for the process, defining the conservation targets themselves, and developing management and monitoring strategies to evaluate their effectiveness. The process we describe is applicable to other geographies where multiple parties are seeking to implement collaborative, large-scale biological planning.

  10. Application of large-scale, multi-resolution watershed modeling framework using the Hydrologic and Water Quality System (HAWQS)

    USDA-ARS?s Scientific Manuscript database

    In recent years, large-scale watershed modeling has been implemented broadly in the field of water resources planning and management. Complex hydrological, sediment, and nutrient processes can be simulated by sophisticated watershed simulation models for important issues such as water resources all...

  11. Assessing the weighted multi-objective adaptive surrogate model optimization to derive large-scale reservoir operating rules with sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Jingwen; Wang, Xu; Liu, Pan; Lei, Xiaohui; Li, Zejun; Gong, Wei; Duan, Qingyun; Wang, Hao

    2017-01-01

    The optimization of a large-scale reservoir system is time-consuming because of its intrinsic characteristics of non-commensurable objectives and high dimensionality. One way to address the problem is to employ an efficient multi-objective optimization algorithm in the derivation of large-scale reservoir operating rules. In this study, the Weighted Multi-Objective Adaptive Surrogate Model Optimization (WMO-ASMO) algorithm is used. It consists of three steps: (1) simplifying the large-scale reservoir operating rules with an aggregation-decomposition model, (2) identifying the most sensitive parameters through multivariate adaptive regression splines (MARS) for dimensionality reduction, and (3) reducing computational cost and speeding up the search process with WMO-ASMO, embedded with the weighted non-dominated sorting genetic algorithm II (WNSGAII). An intercomparison of the non-dominated sorting genetic algorithm (NSGAII), WNSGAII and WMO-ASMO is conducted for the large-scale reservoir system of the Xijiang river basin in China. Results indicate that: (1) WNSGAII surpasses NSGAII in the median annual power generation, increased by 1.03% (from 523.29 to 528.67 billion kW h), and in the median ecological index, improved by 3.87% (from 1.879 to 1.809) with 500 simulations, because of the weighted crowding distance, and (2) WMO-ASMO outperforms NSGAII and WNSGAII in terms of better solutions (annual power generation of 530.032 billion kW h and ecological index of 1.675) with 1000 simulations, with computational time reduced by 25% (from 10 h to 8 h) with 500 simulations. The proposed method is therefore shown to be more efficient and to provide a better Pareto frontier.
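
    The WMO-ASMO, MARS screening, and surrogate components are not reproduced in this record; as a small illustration of the Pareto machinery that NSGAII-type algorithms build on, the sketch below performs fast non-dominated sorting for a minimization problem. The objective values are invented and only stand in for quantities such as negative power generation and ecological index.

```python
import numpy as np

def non_dominated_sort(F):
    """Sort points into Pareto fronts for minimization.

    F : (n_points, n_objectives) array of objective values.
    Returns a list of fronts, each a list of point indices (front 0 is the Pareto set).
    """
    n = len(F)
    dominated_by = [set() for _ in range(n)]   # points that i dominates
    dom_count = np.zeros(n, dtype=int)         # how many points dominate i
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if np.all(F[i] <= F[j]) and np.any(F[i] < F[j]):
                dominated_by[i].add(j)
            elif np.all(F[j] <= F[i]) and np.any(F[j] < F[i]):
                dom_count[i] += 1
    fronts = [[i for i in range(n) if dom_count[i] == 0]]
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

# Toy usage: minimize (negative power generation, ecological index); values are hypothetical.
F = np.array([[-530.0, 1.68], [-528.7, 1.81], [-523.3, 1.88], [-525.0, 1.70]])
print(non_dominated_sort(F))   # -> [[0], [1, 3], [2]]
```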

  12. Multi-Site Diagnostic Classification of Schizophrenia Using Discriminant Deep Learning with Functional Connectivity MRI.

    PubMed

    Zeng, Ling-Li; Wang, Huaning; Hu, Panpan; Yang, Bo; Pu, Weidan; Shen, Hui; Chen, Xingui; Liu, Zhening; Yin, Hong; Tan, Qingrong; Wang, Kai; Hu, Dewen

    2018-04-01

    A lack of a sufficiently large sample at single sites causes poor generalizability in automatic diagnosis classification of heterogeneous psychiatric disorders such as schizophrenia based on brain imaging scans. Advanced deep learning methods may be capable of learning subtle hidden patterns from high dimensional imaging data, overcome potential site-related variation, and achieve reproducible cross-site classification. However, deep learning-based cross-site transfer classification, despite less imaging site-specificity and more generalizability of diagnostic models, has not been investigated in schizophrenia. A large multi-site functional MRI sample (n = 734, including 357 schizophrenic patients from seven imaging resources) was collected, and a deep discriminant autoencoder network, aimed at learning imaging site-shared functional connectivity features, was developed to discriminate schizophrenic individuals from healthy controls. Accuracies of approximately 85·0% and 81·0% were obtained in multi-site pooling classification and leave-site-out transfer classification, respectively. The learned functional connectivity features revealed dysregulation of the cortical-striatal-cerebellar circuit in schizophrenia, and the most discriminating functional connections were primarily located within and across the default, salience, and control networks. The findings imply that dysfunctional integration of the cortical-striatal-cerebellar circuit across the default, salience, and control networks may play an important role in the "disconnectivity" model underlying the pathophysiology of schizophrenia. The proposed discriminant deep learning method may be capable of learning reliable connectome patterns and help in understanding the pathophysiology and achieving accurate prediction of schizophrenia across multiple independent imaging sites. Copyright © 2018 German Center for Neurodegenerative Diseases (DZNE). Published by Elsevier B.V. All rights reserved.
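
    The exact architecture of the deep discriminant autoencoder is not restated in this record; the sketch below only illustrates the general idea of an autoencoder whose bottleneck also feeds a classifier, trained with a joint reconstruction and discrimination loss, assuming vectorized functional-connectivity features as input. The layer sizes, loss weighting, and example feature dimension are assumptions; the sketch is written with PyTorch.

```python
import torch
from torch import nn

class DiscriminantAutoencoder(nn.Module):
    """Autoencoder with a classification head attached to the bottleneck features."""
    def __init__(self, n_features, n_hidden=200, n_latent=50, n_classes=2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_features, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_latent), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(n_latent, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_features),
        )
        self.classifier = nn.Linear(n_latent, n_classes)

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), self.classifier(z)

def joint_loss(x, y, x_hat, logits, alpha=0.5):
    # Reconstruction term plus a discrimination term; the trade-off weight is a free choice.
    return nn.functional.mse_loss(x_hat, x) + alpha * nn.functional.cross_entropy(logits, y)

# Toy usage with random stand-ins for vectorized connectivity features.
model = DiscriminantAutoencoder(n_features=6670)   # e.g. upper triangle of a 116x116 FC matrix (assumed)
x = torch.randn(8, 6670)
y = torch.randint(0, 2, (8,))
x_hat, logits = model(x)
loss = joint_loss(x, y, x_hat, logits)
loss.backward()
```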

  13. Multi-scale structural community organisation of the human genome.

    PubMed

    Boulos, Rasha E; Tremblay, Nicolas; Arneodo, Alain; Borgnat, Pierre; Audit, Benjamin

    2017-04-11

    Structural interaction frequency matrices between all genome loci are now experimentally achievable thanks to high-throughput chromosome conformation capture technologies. This gives rise to a new methodological challenge for computational biology, which consists in objectively extracting from these data the structural motifs characteristic of genome organisation. We deployed a fast multi-scale community mining algorithm based on spectral graph wavelets to characterise the networks of intra-chromosomal interactions in human cell lines. We observed that structural domains exist at all sizes up to chromosome length and demonstrated that the set of structural communities forms a hierarchy of chromosome segments. Hence, at all scales, chromosome folding predominantly involves interactions between neighbouring sites rather than the formation of links between distant loci. Multi-scale structural decomposition of human chromosomes provides an original framework for questioning structural organisation and its relationship to functional regulation across scales. By construction, the proposed methodology is independent of the precise assembly of the reference genome and is thus directly applicable to genomes whose assembly is not fully determined.

  14. 3D ion-scale dynamics of BBFs and their associated emissions in Earth's magnetotail using 3D hybrid simulations and MMS multi-spacecraft observations

    NASA Astrophysics Data System (ADS)

    Breuillard, H.; Aunai, N.; Le Contel, O.; Catapano, F.; Alexandrova, A.; Retino, A.; Cozzani, G.; Gershman, D. J.; Giles, B. L.; Khotyaintsev, Y. V.; Lindqvist, P. A.; Ergun, R.; Strangeway, R. J.; Russell, C. T.; Magnes, W.; Plaschke, F.; Nakamura, R.; Fuselier, S. A.; Turner, D. L.; Schwartz, S. J.; Torbert, R. B.; Burch, J.

    2017-12-01

    Transient and localized jets of hot plasma, also known as Bursty Bulk Flows (BBFs), play a crucial role in Earth's magnetotail dynamics because the energy input from the solar wind is partly dissipated in their vicinity, notably in their embedded dipolarization front (DF). This dissipation takes the form of strong low-frequency waves that can heat and accelerate energetic particles up to the high-latitude plasma sheet. The ion-scale dynamics of BBFs have been revealed by the Cluster and THEMIS multi-spacecraft missions. However, the dynamics of BBF propagation in the magnetotail are still under debate because of instrumental limitations and spacecraft separation distances, as well as simulation limitations. The NASA/MMS fleet, which features unprecedented high-time-resolution instruments and four spacecraft separated by kinetic-scale distances, has also shown recently that the DF normal dynamics and its associated emissions are below the ion gyroradius scale in this region. Large variations in the dawn-dusk direction were also observed. However, most large-scale simulations use the MHD approach and assume a 2D geometry in the XZ plane. Thus, in this study we take advantage of both multi-spacecraft observations by MMS and large-scale 3D hybrid simulations to investigate the 3D ion-scale dynamics of BBFs and their associated emissions in Earth's magnetotail, and their impact on particle heating and acceleration.

  15. Using Formative Research to Develop Environmental and Ecological Interventions to Address Overweight and Obesity

    PubMed Central

    Wilson, Mark G.; Goetzel, Ron Z.; Ozminkowski, Ronald J.; DeJoy, Dave M.; Della, Lindsay; Roemer, Enid Chung; Schneider, Jennifer; Tully, Karen J.; White, John M.; Baase, Catherine M.

    2010-01-01

    Objective: This paper presents the formative research phase of a large multi-site intervention study conducted to assess the feasibility of introducing environmental and ecological interventions. Methods: Using mixed methods that included an environmental assessment, a climate survey, leadership focus groups and interviews, and archival data, information was collected on employee health and job factors, the physical environment, the social-organizational environment, and current health programs. Results: 83% of employees at the study sites were overweight or obese. Leadership was very supportive of health initiatives and felt that integrating the strategies into organizational operations would increase their likelihood of success. Environmental assessment scores ranged from 19 to 47 on a 100-point scale. Health services personnel tended to view the organizational climate for health more positively than site leadership (means of 3.6 vs 3.0, respectively). Conclusions: Intervention strategies chosen included increasing healthy food choices in vending machines, cafeterias, and company meetings; providing a walking path; targeting messages; developing site goals; training leaders; and establishing leaders at the work-group level. PMID:18073340

  16. Multi-decadal analysis reveals contrasting patterns of resilience and decline in coral assemblages

    NASA Astrophysics Data System (ADS)

    Tanner, Jason E.

    2017-12-01

    A 50-yr study of coral dynamics at Heron Island, on Australia's Great Barrier Reef, shows that community change on a single reef is highly variable and that while some areas of the reef are in decline, others are recovering well 40 yr after a major cyclonic disturbance that eliminated all corals in the study plot. At one site, the genus-level composition in 2012 was identical to that before the cyclone, although it took around 30 yr to recover, and there were still differences at the species level. The colony size structure of some species at this site, notably the dominant Acropora digitifera, which had over 40% cover in 2012, also recovered after 30 yr, although sub-dominant species still lacked large colonies even after 40 yr. Given the small scale of the individual study plots (1 m2), this shows an unexpected degree of determinism in assemblage structure. At a second site, however, both composition and size structure changed dramatically over the last 40 yr of the study as both external and internal factors altered local environmental conditions. At both sites, major changes in composition appear to be related to drying out of the reef crest due to changes in flow regimes and/or natural accretion. At the site that has recovered, erosion has reversed this drying out, whereas no such erosion has occurred at the second site. If such erosion occurs, or sea levels increase due to global warming, then the second site may also prove to be resilient over decadal time scales.

  17. 4D Reconstruction and Visualization of Cultural Heritage: Analyzing Our Legacy Through Time

    NASA Astrophysics Data System (ADS)

    Rodríguez-Gonzálvez, P.; Muñoz-Nieto, A. L.; del Pozo, S.; Sanchez-Aparicio, L. J.; Gonzalez-Aguilera, D.; Micoli, L.; Gonizzi Barsanti, S.; Guidi, G.; Mills, J.; Fieber, K.; Haynes, I.; Hejmanowska, B.

    2017-02-01

    Temporal analyses and multi-temporal 3D reconstruction are fundamental for the preservation and maintenance of all forms of Cultural Heritage (CH) and are the basis for decisions related to interventions and promotion. Introducing the fourth dimension of time into three-dimensional geometric modelling of real data allows the creation of a multi-temporal representation of a site. In this way, scholars from various disciplines (surveyors, geologists, archaeologists, architects, philologists, etc.) are provided with a new set of tools and working methods to support the study of the evolution of heritage sites, both to develop hypotheses about the past and to model likely future developments. The capacity to "see" the dynamic evolution of CH assets across different spatial scales (e.g. building, site, city or territory), compressed into a diachronic model, affords the possibility of better understanding the present status of CH in light of its history. However, there are numerous challenges in carrying out 4D modelling and the requisite multi-source data integration. It is necessary to identify the specifications, needs and requirements of the CH community to understand the required levels of 4D model information. In this way, it is possible to determine the optimum materials and technologies to be utilised at different CH scales, as well as the data management and visualization requirements. This manuscript aims to provide a comprehensive approach for CH time-varying representations, analysis and visualization across different working scales and environments: rural landscape, urban landscape and architectural scales. Within this aim, the different available metric data sources are systematized and evaluated in terms of their suitability.

  18. Accelerating three-dimensional FDTD calculations on GPU clusters for electromagnetic field simulation.

    PubMed

    Nagaoka, Tomoaki; Watanabe, Soichi

    2012-01-01

    Electromagnetic simulation with anatomically realistic computational human models using the finite-difference time-domain (FDTD) method has recently been performed in a number of fields in biomedical engineering. To improve the method's calculation speed and enable large-scale computing with the computational human model, we adapt three-dimensional FDTD code to a multi-GPU cluster environment using the Compute Unified Device Architecture (CUDA) and the Message Passing Interface (MPI). Our multi-GPU cluster system consists of three nodes, with seven GPU boards (NVIDIA Tesla C2070) mounted on each node. We examined the performance of the FDTD calculation in the multi-GPU cluster environment. We confirmed that the FDTD calculation on the multi-GPU cluster is faster than on a single multi-GPU workstation, and we also found that the GPU cluster system calculates faster than a vector supercomputer. In addition, our GPU cluster system allowed us to perform large-scale FDTD calculations because we were able to use over 100 GB of GPU memory.
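
    The 3D CUDA/MPI implementation itself is not shown in this record; as a minimal reminder of the update scheme that such codes parallelize, the sketch below runs a 1D Yee-type FDTD loop on the CPU. The grid size, time steps, and Gaussian source are illustrative choices, not the anatomical human-model setup.

```python
import numpy as np

# 1D FDTD (Yee scheme) in vacuum, normalized units with Courant number S = c*dt/dx = 1.
nx, nt = 400, 600
ez = np.zeros(nx)          # electric field on integer grid points
hy = np.zeros(nx - 1)      # magnetic field, staggered by half a cell

for n in range(nt):
    hy += ez[1:] - ez[:-1]                             # update H from the curl of E
    ez[1:-1] += hy[1:] - hy[:-1]                       # update E from the curl of H
    ez[nx // 2] += np.exp(-((n - 40) / 12.0) ** 2)     # soft Gaussian source in the middle

print(f"max |Ez| after {nt} steps: {np.abs(ez).max():.3f}")
```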

  19. CLASS: The Cosmology Large Angular Scale Surveyor

    NASA Technical Reports Server (NTRS)

    Essinger-Hileman, Thomas; Ali, Aamir; Amiri, Mandana; Appel, John W.; Araujo, Derek; Bennett, Charles L.; Boone, Fletcher; Chan, Manwei; Cho, Hsiao-Mei; Chuss, David T.; hide

    2014-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is an experiment to measure the signature of a gravitational wave background from inflation in the polarization of the cosmic microwave background (CMB). CLASS is a multi-frequency array of four telescopes operating from a high-altitude site in the Atacama Desert in Chile. CLASS will survey 70% of the sky in four frequency bands centered at 38, 93, 148, and 217 GHz, which are chosen to straddle the Galactic-foreground minimum while avoiding strong atmospheric emission lines. This broad frequency coverage ensures that CLASS can distinguish Galactic emission from the CMB. The sky fraction of the CLASS survey will allow the full shape of the primordial B-mode power spectrum to be characterized, including the signal from reionization at low multipoles. Its unique combination of large sky coverage, control of systematic errors, and high sensitivity will allow CLASS to measure or place upper limits on the tensor-to-scalar ratio at a level of r = 0.01 and make a cosmic-variance-limited measurement of the optical depth to the surface of last scattering, tau. (c) (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.

  20. Multi-scale heterogeneity of the 2011 Great Tohoku-oki Earthquake from dynamic simulations

    NASA Astrophysics Data System (ADS)

    Aochi, H.; Ide, S.

    2011-12-01

    In order to explain the scaling issues of earthquakes of different sizes, a multi-scale heterogeneity concept is necessary to characterize earthquake faulting properties (Ide and Aochi, JGR, 2005; Aochi and Ide, JGR, 2009). The 2011 Great Tohoku-oki earthquake (M9) is characterized by a slow initial phase of about M7, an M8-class deep rupture, and an M9 main rupture with quite large slip near the trench (e.g. Ide et al., Science, 2011), as well as the presence of foreshocks. We dynamically model these features based on the multi-scale concept. We assume a significantly large fracture energy (corresponding to a slip-weakening distance of 3.2 m) over most of the fault dimension to represent the M9 rupture. However, we introduce local heterogeneity through relatively small circular patches of smaller fracture energy, assuming a linear scaling relation between patch radius and fracture energy. The calculation is carried out using a 3D Boundary Integral Equation Method. We first begin only with the mainshock (Aochi and Ide, EPS, 2011), but later find it important to take into account the series of foreshocks that began on 9 March (M7.4). The smaller patches, including the foreshock area, are necessary to launch the M9 rupture area of large fracture energy. We then simulate the ground motion at low frequencies using a Finite Difference Method. Qualitatively, the observed tendency is consistent with our simulations, in the sense of the transition from the central part to the southern part at low frequencies (10-20 s). At higher frequencies (1-10 s), further small asperities are inferred from the observed signals, and this feature matches well with our multi-scale concept.
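
    For reference, the fracture energy in such simulations is usually defined through a linear slip-weakening friction law, with the multi-scale hierarchy entering through a patch-size-dependent slip-weakening distance. The standard form is written below; the specific parameter values of these simulations are not restated here.

```latex
\[
  \tau(\Delta u) =
  \begin{cases}
    \tau_p - (\tau_p - \tau_r)\,\frac{\Delta u}{D_c}, & \Delta u < D_c,\\
    \tau_r, & \Delta u \ge D_c,
  \end{cases}
  \qquad
  G_c = \tfrac{1}{2}(\tau_p - \tau_r)\,D_c ,
\]
where $\tau_p$ and $\tau_r$ are the peak and residual strengths, $\Delta u$ is slip,
$D_c$ is the slip-weakening distance, and $G_c$ is the fracture energy. The multi-scale
hierarchy assumes $D_c \propto r$ (and hence $G_c \propto r$) for a patch of radius $r$.
```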

  1. Space Technology 5 Multi-point Observations of Field-aligned Currents: Temporal Variability of Meso-Scale Structures

    NASA Technical Reports Server (NTRS)

    Le, Guan; Wang, Yongli; Slavin, James A.; Strangeway, Robert J.

    2007-01-01

    Space Technology 5 (ST5) is a three micro-satellite constellation deployed into a 300 x 4500 km, dawn-dusk, sun-synchronous polar orbit from March 22 to June 21, 2006, for technology validation. In this paper, we present a study of the temporal variability of field-aligned currents using multi-point magnetic field measurements from ST5. The data demonstrate that meso-scale current structures are commonly embedded within large-scale field-aligned current sheets. The meso-scale current structures are very dynamic, with highly variable current density and/or polarity on time scales of ~10 min. They exhibit large temporal variations during both quiet and disturbed times on such time scales. On the other hand, the data also show that the time scales over which the currents remain relatively stable are approximately 1 min for meso-scale currents and approximately 10 min for large-scale current sheets. These temporal features are evidently associated with dynamic variations of their particle carriers (mainly electrons) as they respond to variations of the parallel electric field in the auroral acceleration region. The characteristic time scales for the temporal variability of meso-scale field-aligned currents are found to be consistent with those of the auroral parallel electric field.

  2. Multi-fidelity methods for uncertainty quantification in transport problems

    NASA Astrophysics Data System (ADS)

    Tartakovsky, G.; Yang, X.; Tartakovsky, A. M.; Barajas-Solano, D. A.; Scheibe, T. D.; Dai, H.; Chen, X.

    2016-12-01

    We compare several multi-fidelity approaches for uncertainty quantification in flow and transport simulations that have a lower computational cost than the standard Monte Carlo method. The cost reduction is achieved by combining a small number of high-resolution (high-fidelity) simulations with a large number of low-resolution (low-fidelity) simulations. We propose a new method, the re-scaled Multi Level Monte Carlo (rMLMC) method. rMLMC is based on the idea that the statistics of the quantities of interest depend on scale/resolution. We compare rMLMC with existing multi-fidelity methods such as Multi Level Monte Carlo (MLMC) and reduced basis methods, and discuss the advantages of each approach.
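
    For context, the standard MLMC estimator that rMLMC builds on is the telescoping sum over resolution levels, written below in its usual textbook form; the rescaling step itself is not reproduced here.

```latex
\[
  \mathbb{E}[Q_L] \;=\; \mathbb{E}[Q_0] \;+\; \sum_{\ell=1}^{L} \mathbb{E}\!\left[Q_\ell - Q_{\ell-1}\right],
  \qquad
  \widehat{Q}^{\mathrm{MLMC}} \;=\; \sum_{\ell=0}^{L} \frac{1}{N_\ell}\sum_{i=1}^{N_\ell}
  \left(Q_\ell^{(i)} - Q_{\ell-1}^{(i)}\right), \quad Q_{-1}\equiv 0,
\]
where $Q_\ell$ is the quantity of interest computed at resolution level $\ell$ and $N_\ell$
is the number of samples on level $\ell$; most samples are taken on the cheap, low-fidelity
levels and only a few on the expensive, high-fidelity ones.
```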

  3. Large-scale modeling on the fate and transport of polycyclic aromatic hydrocarbons (PAHs) in multimedia over China

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Liu, M.; Wada, Y.; He, X.; Sun, X.

    2017-12-01

    In recent decades, with rapid economic growth, industrial development and urbanization, expanding pollution of polycyclic aromatic hydrocarbons (PAHs) has become a diversified and complicated phenomenon in China. However, monitoring of PAHs across multiple environmental compartments and of the corresponding multi-interface migration processes is still limited, especially over large geographic areas. In this study, we couple a Multimedia Fate Model (MFM) to the Community Multi-Scale Air Quality (CMAQ) model in order to account for fugacity and transient contamination processes. This coupled dynamic contaminant model can evaluate the detailed local variations and mass fluxes of PAHs in different environmental media (e.g., air, surface film, soil, sediment, water and vegetation) across spatial (county to country) and temporal (days to years) scales. The model has been applied to a large geographical domain of China at a 36 km by 36 km grid resolution and considers the response characteristics of typical environmental media to complex underlying surfaces. Results suggest that direct emission is the main pathway by which PAHs enter the atmosphere, while advection is the main outward flow of pollutants from the environment. In addition, both soil and sediment act as the main sinks of PAHs and have the longest retention times. Importantly, the highest PAH loadings are found in urbanized and densely populated regions of China, such as the Yangtze River Delta and the Pearl River Delta. This model can provide a good scientific basis for a better understanding of the large-scale dynamics of environmental pollutants for land conservation and sustainable development. In a next step, the dynamic contaminant model will be integrated with a continental-scale hydrological and water resources model (i.e., the Community Water Model, CWatM) to more accurately represent the feedbacks between the hydrological cycle and water quality over even larger geographical domains. Keywords: PAHs; Community multi-scale air quality model; Multimedia fate model; Land use

  4. Sea-level rise and archaeological site destruction: An example from the southeastern United States using DINAA (Digital Index of North American Archaeology).

    PubMed

    Anderson, David G; Bissett, Thaddeus G; Yerka, Stephen J; Wells, Joshua J; Kansa, Eric C; Kansa, Sarah W; Myers, Kelsey Noack; DeMuth, R Carl; White, Devin A

    2017-01-01

    The impact of changing climate on terrestrial and underwater archaeological sites, historic buildings, and cultural landscapes can be examined through quantitatively-based analyses encompassing large data samples and broad geographic and temporal scales. The Digital Index of North American Archaeology (DINAA) is a multi-institutional collaboration that allows researchers online access to linked heritage data from multiple sources and data sets. The effects of sea-level rise and concomitant human population relocation are examined using a sample from nine states encompassing much of the Gulf and Atlantic coasts of the southeastern United States. A 1 m rise in sea level will result in the loss of more than 13,000 recorded historic and prehistoric archaeological sites, as well as over 1000 locations currently eligible for inclusion on the National Register of Historic Places (NRHP), encompassing archaeological sites, standing structures, and other cultural properties. These numbers increase substantially with each additional 1 m rise in sea level, with more than 32,000 archaeological sites and more than 2400 NRHP properties lost should a 5 m rise occur. Many more unrecorded archaeological and historic sites will also be lost as large areas of the landscape are flooded. The displacement of millions of people due to rising seas will cause additional impacts where these populations resettle. Sea-level rise will thus result in the loss of much of the record of human habitation of the coastal margin in the Southeast within the next one to two centuries, and the numbers indicate the magnitude of the impact on the archaeological record globally. Construction of large linked data sets is essential to developing procedures for sampling, triage, and mitigation of these impacts.

  5. Sea-level rise and archaeological site destruction: An example from the southeastern United States using DINAA (Digital Index of North American Archaeology)

    PubMed Central

    Wells, Joshua J.; Kansa, Eric C.; Kansa, Sarah W.; Myers, Kelsey Noack; DeMuth, R. Carl; White, Devin A.

    2017-01-01

    The impact of changing climate on terrestrial and underwater archaeological sites, historic buildings, and cultural landscapes can be examined through quantitatively-based analyses encompassing large data samples and broad geographic and temporal scales. The Digital Index of North American Archaeology (DINAA) is a multi-institutional collaboration that allows researchers online access to linked heritage data from multiple sources and data sets. The effects of sea-level rise and concomitant human population relocation are examined using a sample from nine states encompassing much of the Gulf and Atlantic coasts of the southeastern United States. A 1 m rise in sea level will result in the loss of more than 13,000 recorded historic and prehistoric archaeological sites, as well as over 1000 locations currently eligible for inclusion on the National Register of Historic Places (NRHP), encompassing archaeological sites, standing structures, and other cultural properties. These numbers increase substantially with each additional 1 m rise in sea level, with more than 32,000 archaeological sites and more than 2400 NRHP properties lost should a 5 m rise occur. Many more unrecorded archaeological and historic sites will also be lost as large areas of the landscape are flooded. The displacement of millions of people due to rising seas will cause additional impacts where these populations resettle. Sea-level rise will thus result in the loss of much of the record of human habitation of the coastal margin in the Southeast within the next one to two centuries, and the numbers indicate the magnitude of the impact on the archaeological record globally. Construction of large linked data sets is essential to developing procedures for sampling, triage, and mitigation of these impacts. PMID:29186200

  6. Multi-scale drivers of spatial variation in old-growth forest carbon density disentangled with Lidar and an individual-based landscape model

    Treesearch

    Rupert Seidl; Thomas A. Spies; Werner Rammer; E. Ashley Steel; Robert J. Pabst; Keith. Olsen

    2012-01-01

    Forest ecosystems are the most important terrestrial carbon (C) storage globally, and presently mitigate anthropogenic climate change by acting as a large and persistent sink for atmospheric CO2. Yet, forest C density varies greatly in space, both globally and at stand and landscape levels. Understanding the multi-scale drivers of this variation...

  7. Comprehensive assay of kinase catalytic activity reveals features of kinase inhibitor selectivity

    PubMed Central

    Anastassiadis, Theonie; Deacon, Sean W.; Devarajan, Karthik; Ma, Haiching; Peterson, Jeffrey R.

    2011-01-01

    Small-molecule protein kinase inhibitors are central tools for elucidating cellular signaling pathways and are promising therapeutic agents. Due to evolutionary conservation of the ATP-binding site, most kinase inhibitors that target this site promiscuously inhibit multiple kinases. Interpretation of experiments utilizing these compounds is confounded by a lack of data on the comprehensive kinase selectivity of most inhibitors. Here we profiled the activity of 178 commercially available kinase inhibitors against a panel of 300 recombinant protein kinases using a functional assay. Quantitative analysis revealed complex and often unexpected kinase-inhibitor interactions, with a wide spectrum of promiscuity. Many off-target interactions occur with seemingly unrelated kinases, revealing how large-scale profiling can be used to identify multi-targeted inhibitors of specific, diverse kinases. The results have significant implications for drug development and provide a resource for selecting compounds to elucidate kinase function and for interpreting the results of experiments that use them. PMID:22037377

  8. Coastal Wind and Turbulence Observations during the Morning and Evening Transitions over Tropical Terrain

    DOE PAGES

    Jensen, Derek D.; Price, Timothy A.; Nadeau, Daniel F.; ...

    2017-12-15

    Data collected during a multiyear wind-resource assessment over a multi-land-use coastal environment in Belize are used to study the development and decay of wind and turbulence through the morning and evening transitions. Observations were made on three tall masts, forming an inland transect of approximately 5 km. The wind distribution is found to be bimodal and governed by synoptic scales, with onshore and offshore flow regimes. The behavior between the coastal and inland sites is found to be very similar when the flow is directed offshore; for onshore flow, stark differences occur. The mean wind speed at the coastal site is approximately 20% greater than at the most inland site and is nearly constant throughout the diurnal cycle. Furthermore, for both flow regimes, the influence of the land–sea breeze circulation is inconsequential relative to the large-scale synoptic forcing. Composite time series are used to study the evolution of sensible heat flux and turbulence kinetic energy (TKE) throughout the morning and evening transitions. The TKE budget reveals that at the coastal site mechanical production of TKE is much more important than buoyant production. This allows for the unexpected case in which TKE increases through the evening transition despite the decrease of buoyant TKE production. Multiresolution flux decomposition is used to further study this phenomenon as well as the evolution of the sensible heat flux at differing time scales. We present an idealized schematic to illustrate the timing and structure of the morning and evening transitions for an inland site and a coastal site that are subjected to similar synoptic forcing.

  9. Coastal Wind and Turbulence Observations during the Morning and Evening Transitions over Tropical Terrain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, Derek D.; Price, Timothy A.; Nadeau, Daniel F.

    Data collected during a multiyear wind-resource assessment over a multi-land-use coastal environment in Belize are used to study the development and decay of wind and turbulence through the morning and evening transitions. Observations were made on three tall masts, forming an inland transect of approximately 5 km. The wind distribution is found to be bimodal and governed by synoptic scales, with onshore and offshore flow regimes. The behavior between the coastal and inland sites is found to be very similar when the flow is directed offshore; for onshore flow, stark differences occur. The mean wind speed at the coastal site is approximately 20% greater than at the most inland site and is nearly constant throughout the diurnal cycle. Furthermore, for both flow regimes, the influence of the land–sea breeze circulation is inconsequential relative to the large-scale synoptic forcing. Composite time series are used to study the evolution of sensible heat flux and turbulence kinetic energy (TKE) throughout the morning and evening transitions. The TKE budget reveals that at the coastal site mechanical production of TKE is much more important than buoyant production. This allows for the unexpected case in which TKE increases through the evening transition despite the decrease of buoyant TKE production. Multiresolution flux decomposition is used to further study this phenomenon as well as the evolution of the sensible heat flux at differing time scales. We present an idealized schematic to illustrate the timing and structure of the morning and evening transitions for an inland site and a coastal site that are subjected to similar synoptic forcing.

  10. A large-scale forest landscape model incorporating multi-scale processes and utilizing forest inventory data

    Treesearch

    Wen J. Wang; Hong S. He; Martin A. Spetich; Stephen R. Shifley; Frank R. Thompson III; David R. Larsen; Jacob S. Fraser; Jian Yang

    2013-01-01

    Two challenges confronting forest landscape models (FLMs) are how to simulate fine, stand-scale processes while making large-scale (i.e., >10^7 ha) simulation possible, and how to take advantage of extensive forest inventory data such as U.S. Forest Inventory and Analysis (FIA) data to initialize and constrain model parameters. We present the LANDIS PRO model that...

  11. An objective and reproducible landform and topography description approach based on digital terrain analysis used for soil profile site characteristics

    NASA Astrophysics Data System (ADS)

    Gruber, Fabian E.; Baruck, Jasmin; Hastik, Richard; Geitner, Clemens

    2015-04-01

    All major soil description and classification systems, including the World Reference Base (WRB) and the German soil description guidelines (KA5), require the characterization of landform and topography for soil profile sites. This is commonly done at more than one scale, for instance at the macro-, meso- and micro-scale. However, as is inherent when humans perform such a task, different surveyors will reach different conclusions due to their subjective perception of landscape structure, based on their individual mental model of soil-landscape structure, emphasizing different aspects and scales of the landscape. In this study we apply a workflow using the GRASS GIS extension module r.geomorphon to make use of high-resolution digital elevation models (DEMs) to characterize the landform elements and topography of soil profile sites at different scales, and compare the results with a large number of soil profile site descriptions performed during the course of forestry surveys in South and North Tyrol (Italy and Austria, respectively). The r.geomorphon extension module for the open-source geographic information system GRASS GIS applies a pattern recognition algorithm to delineate landform elements based on an input DEM. For each raster cell it computes and characterizes the visible neighborhood using line-of-sight calculations and then applies a lookup table to classify the raster cell into one of ten landform elements (flat, peak, ridge, shoulder, slope, spur, hollow, footslope, valley and pit). The input parameter search radius (L) represents the maximum number of pixels for the line-of-sight calculation, so that landforms larger than L are split into landform components. The use of these visibility calculations makes this landform delineation approach suitable for comparison with the landform descriptions of soil surveyors, as their spatial perception of the landscape surrounding a soil profile site certainly influences their classification of the landform on which the profile is situated (aided by additional information such as topographic maps and aerial images). Varying the L-value furthermore presents the opportunity to mimic the different scales at which surveyors describe soil profile locations. We first illustrate the use of r.geomorphon for site descriptions using exemplary artificial elevation profiles resembling typical catenas at different scales (L-values). We then compare the results of a landform element map computed with r.geomorphon to the relief descriptions in the test dataset and link the surveyors' landform classifications to the computed landform elements. Using a multi-scale approach we characterize raster cell locations in a way similar to the micro-, meso- and macro-scale definitions used in soil survey, resulting in so-called geomorphon signatures, such as "pit (meso-scale) located on a ridge (macro-scale)". We investigate which ranges of L-values best represent the different observation scales noted by soil surveyors and discuss the implications of using a large dataset of profile location descriptions performed by different surveyors. Issues that arise include possible individual differences in the perception of landscape structure, as well as questions regarding positional accuracy and the resulting topographic measurements in soil profile site descriptions.
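
    As an illustration of the kind of multi-scale workflow described, the sketch below runs r.geomorphon at several search radii from Python and samples the resulting landform rasters at a profile location. It assumes an active GRASS GIS session, a DEM raster named "dem", and option names matching recent GRASS versions; the coordinates and radii are placeholders rather than values from the study.

```python
# Minimal sketch (assumptions: a GRASS session is already running, a raster "dem" exists,
# and the r.geomorphon options match the installed GRASS version).
import grass.script as gs

search_radii = [5, 15, 50]            # pixels; stand-ins for micro-, meso- and macro-scale

for L in search_radii:
    gs.run_command(
        "r.geomorphon",
        elevation="dem",
        forms=f"landforms_L{L}",      # output raster with the ten landform classes
        search=L,                     # line-of-sight search radius in cells
        overwrite=True,
    )

# Sample the multi-scale "geomorphon signature" at a soil profile site (coordinates are placeholders).
for L in search_radii:
    out = gs.read_command("r.what", map=f"landforms_L{L}", coordinates="642300,5195800")
    print(f"L={L}:", out.strip())
```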

  12. Multi-criteria analysis for the detection of the most critical European UNESCO Heritage sites

    NASA Astrophysics Data System (ADS)

    Valagussa, Andrea; Frattini, Paolo; Berta, Nadia; Spizzichino, Daniele; Leoni, Gabriele; Margottini, Claudio; Battista Crosta, Giovanni

    2017-04-01

    A GIS-based multi-criteria analysis has been implemented to identify and rank the most critical UNESCO Heritage sites at the European scale in the context of the PROTHEGO JPI project. Two multi-criteria methods have been tested and applied to more than 300 European UNESCO sites. First, the Analytic Hierarchy Process (AHP) was applied to the data of the UNESCO Periodic Reports, in relation to 13 natural hazards that have affected or can potentially affect the Heritage sites. According to these reports, 22% of sites have no documented hazard and 70% of the sites have at least one hazard affecting the site. The most important hazards at the European scale are: fire (wildfire), storm, flooding, earthquake and erosion. For each UNESCO site, the potential risk was calculated as a weighted sum of the hazards that affect the site. The weights of the 13 hazards were obtained by the AHP procedure, which is a technique for multi-attribute decision making that enables the decomposition of a problem into a hierarchy, based on the opinion of different experts about the dominance of risks. The weights are obtained by rescaling between 0 and 1 the eigenvector associated with the maximum eigenvalue of the matrix of coefficients. The internal coherence of the experts' attributions is assessed through the calculation of the consistency ratio (Saaty, 1990). The result of the AHP method is a map of the UNESCO sites ranked according to potential risk, in which the site most at risk turns out to be the Geirangerfjord and Nærøyfjord in Norway. However, the quality of these results depends on the reliability of the Periodic Reports, which are produced by different experts with unknown levels of scientific background. To test the reliability of these results, the information in the Periodic Reports was compared with available high-quality datasets (earthquake, volcano and landslide) at the Italian scale. Sites properly classified by the Periodic Reports range from 65% (earthquake hazard) to 98% (volcano hazard), with a strong underestimation of landslide hazard. Because of this high level of uncertainty, we developed a new methodology to identify and rank the most critical UNESCO Heritage sites on the basis of three natural hazards (landslide, earthquake, and volcano) for which reliable European-scale hazard maps are available. For each UNESCO site, a potential risk was calculated as the product of hazard (from the available maps) and potential vulnerability. The latter is obtained by considering the typology of the site (e.g. monument, cultural landscape, or cultural road), the presence or absence of residents and/or tourists, and the position of the site (underground or above ground). Through this methodology, a new ranking of the European UNESCO sites has been obtained, in which the historic centre of Naples turns out to be the most endangered site on the European continent.
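
    The AHP weighting step described above (principal eigenvector of a pairwise comparison matrix plus Saaty's consistency ratio) can be sketched as follows; the 4x4 comparison matrix is invented for illustration and is not the matrix used for the 13 hazards.

```python
import numpy as np

def ahp_weights(A):
    """Weights and consistency ratio from a pairwise comparison matrix (Saaty's AHP)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                     # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                    # normalize weights (conventions for rescaling vary)
    ci = (eigvals[k].real - n) / (n - 1)            # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}[n]
    cr = ci / ri if ri > 0 else 0.0                 # consistency ratio (commonly accepted if < 0.1)
    return w, cr

# Invented 4x4 example (e.g. fire, storm, flooding, earthquake); not the study's matrix.
A = [[1,   3,   5,   7],
     [1/3, 1,   3,   5],
     [1/5, 1/3, 1,   3],
     [1/7, 1/5, 1/3, 1]]
w, cr = ahp_weights(A)
print("weights:", np.round(w, 3), "CR:", round(cr, 3))
```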

  13. Multi-atlas learner fusion: An efficient segmentation approach for large-scale data.

    PubMed

    Asman, Andrew J; Huo, Yuankai; Plassard, Andrew J; Landman, Bennett A

    2015-12-01

    We propose multi-atlas learner fusion (MLF), a framework for rapidly and accurately replicating the highly accurate, yet computationally expensive, multi-atlas segmentation framework based on fusing local learners. In the largest whole-brain multi-atlas study yet reported, multi-atlas segmentations are estimated for a training set of 3464 MR brain images. Using these multi-atlas estimates we (1) estimate a low-dimensional representation for selecting locally appropriate example images, and (2) build AdaBoost learners that map a weak initial segmentation to the multi-atlas segmentation result. Thus, to segment a new target image we project the image into the low-dimensional space, construct a weak initial segmentation, and fuse the trained, locally selected, learners. The MLF framework cuts the runtime on a modern computer from 36 h down to 3-8 min, a 270× speedup, by completely bypassing the need for deformable atlas-target registrations. Additionally, we (1) describe a technique for optimizing the weak initial segmentation and the AdaBoost learning parameters, (2) quantify the ability to replicate the multi-atlas result with mean accuracies approaching the multi-atlas intra-subject reproducibility on a testing set of 380 images, (3) demonstrate significant increases in the reproducibility of intra-subject segmentations when compared to a state-of-the-art multi-atlas framework on a separate reproducibility dataset, (4) show that under the MLF framework the large-scale data model significantly improves the segmentation over the small-scale model, and (5) indicate that the MLF framework has comparable performance to state-of-the-art multi-atlas segmentation algorithms without using non-local information. Copyright © 2015 Elsevier B.V. All rights reserved.
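
    The MLF implementation is not included in this record; the sketch below only mimics its two ingredients with generic scikit-learn stand-ins, a low-dimensional representation for selecting locally appropriate example images and a boosted learner that maps weak-segmentation features toward the multi-atlas label. All array shapes and data are placeholders, not the study's data or pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)

# Placeholder data: per-image feature vectors for atlas selection, and per-voxel
# features/labels for the learner that refines a weak segmentation.
image_features = rng.normal(size=(3464, 500))     # one row per training image (shape assumed)
voxel_features = rng.normal(size=(20000, 10))     # e.g. intensity + weak-segmentation context
voxel_labels = rng.integers(0, 2, size=20000)     # multi-atlas label to be reproduced

# (1) Low-dimensional representation used to pick locally appropriate example images.
pca = PCA(n_components=20).fit(image_features)
target_coords = pca.transform(rng.normal(size=(1, 500)))    # a new target image
dists = np.linalg.norm(pca.transform(image_features) - target_coords, axis=1)
selected = np.argsort(dists)[:15]                           # nearest training images
print("selected example images:", selected)

# (2) Boosted learner mapping weak-segmentation features to the multi-atlas label.
learner = AdaBoostClassifier(n_estimators=50).fit(voxel_features, voxel_labels)
print("refined labels for first voxels:", learner.predict(voxel_features[:5]))
```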

  14. Space Technology 5 Multi-Point Observations of Temporal Variability of Field-Aligned Currents

    NASA Technical Reports Server (NTRS)

    Le, Guan; Wang, Yongli; Slavin, James A.; Strangeway, Robert J.

    2008-01-01

    Space Technology 5 (ST5) is a three micro-satellite constellation deployed into a 300 x 4500 km, dawn-dusk, sun-synchronous polar orbit from March 22 to June 21, 2006, for technology validation. In this paper, we present a study of the temporal variability of field-aligned currents using multi-point magnetic field measurements from ST5. The data demonstrate that meso-scale current structures are commonly embedded within large-scale field-aligned current sheets. The meso-scale current structures are very dynamic, with highly variable current density and/or polarity on time scales of approximately 10 min. They exhibit large temporal variations during both quiet and disturbed times on such time scales. On the other hand, the data also show that the time scales over which the currents remain relatively stable are approximately 1 min for meso-scale currents and approximately 10 min for large-scale current sheets. These temporal features are evidently associated with dynamic variations of their particle carriers (mainly electrons) as they respond to variations of the parallel electric field in the auroral acceleration region. The characteristic time scales for the temporal variability of meso-scale field-aligned currents are found to be consistent with those of the auroral parallel electric field.

  15. Telecommunications model for continuing education of health professionals: the Royal Brompton case.

    PubMed

    Kotis, Takis

    2003-01-01

    Telemedicine is said to be helpful to both patients and providers, but we need real-world examples to demonstrate its effectiveness. This paper presents such an example. Royal Brompton, under the Teleremedy Program of EC Telecom, conducted a project with the Children's Hospital of Athens, Greece, to provide remote diagnosis, management and continuing education for heart disease, using European ISDN technology. Preliminary results showed that, when carried out in a large-scale multi-site environment, the Teleremedy program significantly reduced geographic and socio-economic isolation for patients and professional isolation for physicians. Comparison of original versus transmitted data revealed no significant differences, with a diagnostic accuracy of 100%.

  16. The Large-Scale Biosphere-Atmosphere Experiment in Amazonia: Analyzing Regional Land Use Change Effects.

    Treesearch

    Michael Keller; Maria Assunção Silva-Dias; Daniel C. Nepstad; Meinrat O. Andreae

    2004-01-01

    The Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) is a multi-disciplinary, multinational scientific project led by Brazil. LBA researchers seek to understand Amazonia in its global context, especially with regard to regional and global climate. Current development activities in Amazonia, including deforestation, logging, cattle ranching, and agriculture...

  17. AIDS chief says nonoxynol-9 not effective against HIV.

    PubMed

    According to Dr. Peter Piot, executive director of the Joint UN Programme on HIV/AIDS (UNAIDS), the international multi-site trial of the spermicide nonoxynol-9 in gel form has shown that it is not effective in protecting women from HIV infection. This large-scale Phase III efficacy trial was conducted among female sex workers in Benin, Cote d'Ivoire, South Africa, and Thailand. Apart from receiving the trial microbicide or a placebo, participants also received classical HIV prevention support, such as free condoms, free treatment for sexually transmitted infections, counseling, and peer support. One positive outcome of the trial is that fewer of the sex workers who participated became infected with HIV, compared with the sex workers who did not participate at all in the study. However, Piot states that even if the results of the trials are disappointing, the search for an effective microbicide continues. To this effect, at least 36 compounds are at the preclinical testing stage, while 20 are ready for early safety trials in human volunteers and three additional compounds are being considered for large-scale trials.

  18. Scaling and criticality in a stochastic multi-agent model of a financial market

    NASA Astrophysics Data System (ADS)

    Lux, Thomas; Marchesi, Michele

    1999-02-01

    Financial prices have been found to exhibit some universal characteristics that resemble the scaling laws characterizing physical systems in which large numbers of units interact. This raises the question of whether scaling in finance emerges in a similar way - from the interactions of a large ensemble of market participants. However, such an explanation is in contradiction to the prevalent `efficient market hypothesis' in economics, which assumes that the movements of financial prices are an immediate and unbiased reflection of incoming news about future earning prospects. Within this hypothesis, scaling in price changes would simply reflect similar scaling in the `input' signals that influence them. Here we describe a multi-agent model of financial markets which supports the idea that scaling arises from mutual interactions of participants. Although the `news arrival process' in our model lacks both power-law scaling and any temporal dependence in volatility, we find that it generates such behaviour as a result of interactions between agents.

  19. Multi-scale comparison of source parameter estimation using empirical Green's function approach

    NASA Astrophysics Data System (ADS)

    Chen, X.; Cheng, Y.

    2015-12-01

    Analysis of earthquake source parameters requires correction for path effects, site response, and instrument response. The empirical Green's function (EGF) method is one of the most effective methods for removing path effects and station responses, by taking the spectral ratio between a larger and a smaller event. The traditional EGF method requires identifying suitable event pairs and analyzing each event individually. This allows high-quality estimates for strictly selected events; however, the quantity of resolvable source parameters is limited, which challenges the interpretation of spatial-temporal coherency. On the other hand, methods that exploit the redundancy of event-station pairs have been proposed, which use stacking to obtain systematic source parameter estimates for a large quantity of events at the same time. This allows us to examine large quantities of events systematically, facilitating analysis of spatial-temporal patterns and scaling relationships. However, it is unclear how much resolution is sacrificed in this process. In addition to the empirical Green's function calculation, the choice of model parameters and fitting methods also leads to biases. Here, using two regionally focused arrays, the OBS array in the Mendocino region and the borehole array in the Salton Sea geothermal field, I compare results from large-scale stacking analysis, small-scale cluster analysis, and single event-pair analysis with different fitting methods within completely different tectonic environments, in order to quantify the consistency and inconsistency in source parameter estimates and the associated problems.
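
    The spectral-ratio fit at the core of the EGF approach is commonly written with an omega-square source model, as below; this particular parameterization is a common choice rather than necessarily the one used in each of the analyses being compared.

```latex
\[
  R(f) \;=\; \frac{\Omega_1(f)}{\Omega_2(f)}
        \;=\; \frac{M_{0,1}}{M_{0,2}}\,
              \frac{1 + \left(f/f_{c,2}\right)^2}{1 + \left(f/f_{c,1}\right)^2},
\]
where $\Omega_i(f)$ are the displacement amplitude spectra of the larger (1) and smaller (2)
event recorded at the same station, $M_{0,i}$ are seismic moments, and $f_{c,i}$ are corner
frequencies; path and site terms cancel in the ratio, and fitting $R(f)$ yields the relative
moment and corner frequencies from which stress drop can be estimated.
```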

  20. Low Frequency Turbulence as the Source of High Frequency Waves in Multi-Component Space Plasmas

    NASA Technical Reports Server (NTRS)

    Khazanov, George V.; Krivorutsky, Emmanuel N.; Uritsky, Vadim M.

    2011-01-01

    Space plasmas support a wide variety of waves, and wave-particle interactions as well as wave-wave interactions are of crucial importance to magnetospheric and ionospheric plasma behavior. High-frequency wave turbulence generation by low-frequency (LF) turbulence is restricted by two interconnected requirements: the turbulence should be strong enough and/or the coherent wave trains should have the appropriate length. These requirements are strongly relaxed in multi-component plasmas, due to the large drift velocity of heavy ions in the field of the LF wave. The excitation of lower hybrid waves (LHWs), in particular, is a widely discussed mechanism of interaction between plasma species in space and is one of the unresolved questions of magnetospheric multi-ion plasmas. It is demonstrated that large-amplitude Alfven waves, in particular those associated with LF turbulence, may generate LHWs in the auroral zone and ring current region, and in some cases (particularly in the inner magnetosphere) this serves as the Alfven wave saturation mechanism. We also argue that the described scenario can play a vital role in various parts of the outer magnetosphere featuring strong LF turbulence accompanied by LHW activity. Using data from the THEMIS spacecraft, we validate the conditions for such cross-scale coupling in the near-Earth "flow-braking" magnetotail region during the passage of sharp injection/dipolarization fronts, as well as in the turbulent outflow region of the midtail reconnection site.

  1. The XMM Large Scale Structure Survey

    NASA Astrophysics Data System (ADS)

    Pierre, Marguerite

    2005-10-01

    We propose to complete, by an additional 5 deg2, the XMM-LSS Survey region overlying the Spitzer/SWIRE field. This field already has CFHTLS and Integral coverage, and will encompass about 10 deg2. The resulting multi-wavelength medium-depth survey, which complements XMM and Chandra deep surveys, will provide a unique view of large-scale structure over a wide range of redshift, and will show active galaxies in the full range of environments. The complete coverage by optical and IR surveys provides high-quality photometric redshifts, so that cosmological results can quickly be extracted. In the spirit of a Legacy survey, we will make the raw X-ray data immediately public. Multi-band catalogues and images will also be made available on short time scales.

  2. Multi-Genetic Marker Approach and Spatio-Temporal Analysis Suggest There Is a Single Panmictic Population of Swordfish Xiphias gladius in the Indian Ocean

    PubMed Central

    Muths, Delphine; Le Couls, Sarah; Evano, Hugues; Grewe, Peter; Bourjea, Jerome

    2013-01-01

    Genetic population structure of swordfish Xiphias gladius was examined based on 2231 individual samples, collected mainly between 2009 and 2010, among three major sampling areas within the Indian Ocean (IO; twelve distinct sites), Atlantic (two sites) and Pacific (one site) Oceans, using analysis of nineteen microsatellite loci (n = 2146) and mitochondrial ND2 sequences (n = 2001). Sample collection was stratified in time and space in order to investigate the stability of the observed genetic structure, with a special focus on the South West Indian Ocean. Significant AMOVA variance was observed for both markers, indicating genetic population subdivision between oceans. Overall F-statistic values for ND2 sequences confirmed that Atlantic and Indian Ocean swordfish represent two distinct genetic stocks. Indo-Pacific differentiation was also significant but lower than that observed between the Atlantic and Indian Oceans. However, microsatellite F-statistics failed to reveal structure even at the inter-oceanic scale, indicating that the resolving power of our microsatellite loci was insufficient for detecting population subdivision. At the scale of the Indian Ocean, results obtained from both markers are consistent with swordfish belonging to a single panmictic population. Analyses partitioned by sampling area, season, or sex also failed to identify any clear structure within this ocean. Such large spatial and temporal homogeneity of genetic structure, observed for such a large, highly mobile pelagic species, suggests that it is satisfactory to consider swordfish as a single panmictic population in the Indian Ocean. PMID:23717447

  3. Evaluation of multi-scale mineralized collagen-polycaprolactone composites for bone tissue engineering.

    PubMed

    Weisgerber, D W; Erning, K; Flanagan, C L; Hollister, S J; Harley, B A C

    2016-08-01

    A particular challenge in biomaterial development for treating orthopedic injuries stems from the need to balance bioactive design criteria with the mechanical and geometric constraints governed by the physiological wound environment. Such trade-offs are of particular importance in large craniofacial bone defects, which arise from both acute trauma and chronic conditions. Ongoing efforts in our laboratory have demonstrated a mineralized collagen biomaterial that can promote human mesenchymal stem cell osteogenesis in the absence of osteogenic media but that possesses suboptimal mechanical properties with regard to use in loaded wound sites. Here we demonstrate a multi-scale composite consisting of a highly bioactive mineralized collagen-glycosaminoglycan scaffold with micron-scale porosity and a polycaprolactone (PCL) support frame with millimeter-scale porosity. Fabrication of the composite was performed by impregnating the PCL support frame with the mineral scaffold precursor suspension prior to lyophilization. Here we evaluate the mechanical properties, permeability, and bioactivity of the resulting composite. Results indicated that the PCL support frame dominates the bulk mechanical response of the composite, resulting in a 6000-fold increase in modulus compared to the mineral scaffold alone. Similarly, the incorporation of the mineral scaffold matrix into the composite resulted in a higher specific surface area compared to the PCL frame alone. The increased specific surface area in the collagen-PCL composite promoted increased initial attachment of porcine adipose-derived stem cells versus the PCL construct. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Multisite longitudinal reliability of tract-based spatial statistics in diffusion tensor imaging of healthy elderly subjects.

    PubMed

    Jovicich, Jorge; Marizzoni, Moira; Bosch, Beatriz; Bartrés-Faz, David; Arnold, Jennifer; Benninghoff, Jens; Wiltfang, Jens; Roccatagliata, Luca; Picco, Agnese; Nobili, Flavio; Blin, Oliver; Bombois, Stephanie; Lopes, Renaud; Bordet, Régis; Chanoine, Valérie; Ranjeva, Jean-Philippe; Didic, Mira; Gros-Dagnac, Hélène; Payoux, Pierre; Zoccatelli, Giada; Alessandrini, Franco; Beltramello, Alberto; Bargalló, Núria; Ferretti, Antonio; Caulo, Massimo; Aiello, Marco; Ragucci, Monica; Soricelli, Andrea; Salvadori, Nicola; Tarducci, Roberto; Floridi, Piero; Tsolaki, Magda; Constantinidis, Manos; Drevelegas, Antonios; Rossini, Paolo Maria; Marra, Camillo; Otto, Josephin; Reiss-Zimmermann, Martin; Hoffmann, Karl-Titus; Galluzzi, Samantha; Frisoni, Giovanni B

    2014-11-01

    Large-scale longitudinal neuroimaging studies with diffusion imaging techniques are necessary to test and validate models of white matter neurophysiological processes that change in time, both in healthy and diseased brains. The predictive power of such longitudinal models will always be limited by the reproducibility of repeated measures acquired during different sessions. At present, there is limited quantitative knowledge about the across-session reproducibility of standard diffusion metrics in 3T multi-centric studies on subjects in stable conditions, in particular when using tract-based spatial statistics and with elderly people. In this study we implemented a multi-site brain diffusion protocol in 10 clinical 3T MRI sites distributed across 4 countries in Europe (Italy, Germany, France and Greece) using vendor-provided sequences from Siemens (Allegra, Trio Tim, Verio, Skyra, Biograph mMR), Philips (Achieva) and GE (HDxt) scanners. We acquired DTI data (2 × 2 × 2 mm³, b = 700 s/mm², 5 b0 and 30 diffusion-weighted volumes) of a group of healthy stable elderly subjects (5 subjects per site) in two separate sessions at least a week apart. For each subject and session four scalar diffusion metrics were considered: fractional anisotropy (FA), mean diffusivity (MD), radial diffusivity (RD) and axial diffusivity (AD). The diffusion metrics from multiple subjects and sessions at each site were aligned to their common white matter skeleton using tract-based spatial statistics. The reproducibility at each MRI site was examined by looking at group averages of absolute changes relative to the mean (%) on various parameters: i) reproducibility of the signal-to-noise ratio (SNR) of the b0 images in centrum semiovale, ii) full brain test-retest differences of the diffusion metric maps on the white matter skeleton, iii) reproducibility of the diffusion metrics on atlas-based white matter ROIs on the white matter skeleton. Despite the differences of MRI scanner configurations across sites (vendors, models, RF coils and acquisition sequences) we found good and consistent test-retest reproducibility. White matter b0 SNR reproducibility was on average 7 ± 1% with no significant MRI site effects. Whole brain analysis resulted in no significant test-retest differences at any of the sites with any of the DTI metrics. The atlas-based ROI analysis showed that the mean reproducibility errors largely remained in the 2-4% range for FA and AD and 2-6% for MD and RD, averaged across ROIs. Our results show reproducibility values comparable to those reported in studies using a smaller number of MRI scanners, slightly different DTI protocols and mostly younger populations. We therefore show that the acquisition and analysis protocols used are appropriate for multi-site experimental scenarios. Copyright © 2014 Elsevier Inc. All rights reserved.
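
    The reproducibility measure used above (group average of the absolute test-retest change relative to the session mean, in %) is simple to reproduce; the sketch below illustrates it with invented FA values for a few ROIs, not data from the study.

```python
import numpy as np

def test_retest_error_percent(session1, session2):
    """Absolute test-retest difference relative to the two-session mean, in %."""
    s1 = np.asarray(session1, dtype=float)
    s2 = np.asarray(session2, dtype=float)
    mean = 0.5 * (s1 + s2)
    return 100.0 * np.abs(s1 - s2) / mean

# Hypothetical FA values for four atlas ROIs measured in two sessions.
fa_session1 = np.array([0.52, 0.61, 0.48, 0.55])
fa_session2 = np.array([0.53, 0.59, 0.49, 0.56])
errors = test_retest_error_percent(fa_session1, fa_session2)
print("per-ROI error (%):", np.round(errors, 2))
print("mean error (%):", round(errors.mean(), 2))
```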

  5. Visualizing Phenology and Climate Data at the National Scale

    NASA Astrophysics Data System (ADS)

    Rosemartin, A.; Marsh, L.

    2013-12-01

    Nature's Notebook is the USA National Phenology Network's national-scale plant and animal phenology observation program, designed to address the challenges posed by global change and its impacts on ecosystems and human health. Since its inception in 2009, 2,500 participants in Nature's Notebook have submitted 2.3 million records on the phenology of 17,000 organisms across the United States. An information architecture has been developed to facilitate collaboration and participatory data collection and digitization. Browser-based and mobile applications support data submission, and a MySQL/Drupal multi-site infrastructure enables data storage, access and discovery. Web services are available for both input and export of data resources. In this presentation we will focus on a tool for visualizing phenology data at the national scale. Effective data exploration for this multi-dimensional dataset requires the ability to plot sites, select species and phenophases, graph organismal phenology through time, and view integrated precipitation and temperature data. We will demonstrate the existing tool's capacity, discuss future directions and solicit feedback from the community.

  6. Multi-scale calculation based on dual domain material point method combined with molecular dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dhakal, Tilak Raj

    This dissertation combines the dual domain material point method (DDMP) with molecular dynamics (MD) in an attempt to create a multi-scale numerical method to simulate materials undergoing large deformations at high strain rates. In these types of problems, the material is often in a thermodynamically non-equilibrium state, and conventional constitutive relations are often not available. In this method, the closure quantities, such as stress, at each material point are calculated from an MD simulation of a group of atoms surrounding the material point. Rather than restricting the multi-scale simulation to a small spatial region, such as phase interfaces or crack tips, this multi-scale method can be used to consider non-equilibrium thermodynamic effects in a macroscopic domain. The method takes advantage of the fact that material points only communicate with mesh nodes, not among themselves; therefore MD simulations for material points can be performed independently in parallel. First, using a one-dimensional shock problem as an example, the numerical properties of the original material point method (MPM), the generalized interpolation material point (GIMP) method, the convected particle domain interpolation (CPDI) method, and the DDMP method are investigated. Among these methods, only the DDMP method converges as the number of particles increases, but the large number of particles needed for convergence makes the method very expensive, especially in this multi-scale method where the stress at each material point is calculated using an MD simulation. To improve DDMP, the sub-point method is introduced, which provides high-quality numerical solutions with a very small number of particles. The multi-scale method based on DDMP with sub-points is successfully implemented for a one-dimensional problem of shock wave propagation in a cerium crystal. The MD simulations that calculate the stress at each material point are performed on GPUs using CUDA to accelerate the computation. The numerical properties of the multi-scale method are investigated, and the results of the multi-scale calculation are compared with direct MD simulation results to demonstrate the feasibility of the method. Finally, the multi-scale method is applied to a two-dimensional problem of jet formation around a copper notch under strong impact.
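
    The key structural point, that each material point's closure (stress) can be evaluated independently because points communicate only with mesh nodes, can be pictured with the skeleton below. It is only a schematic sketch: the per-point "MD" call is replaced by a linear-elastic stub so the example runs, and all names and values are illustrative, not the dissertation's implementation.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def stress_from_md(strain):
    """Placeholder for the per-material-point MD call.

    In the actual method this would run an MD simulation of a group of atoms
    surrounding the material point (on a GPU via CUDA in the dissertation);
    here a 1D linear-elastic stub stands in so the skeleton is runnable.
    """
    youngs_modulus = 1.0e9  # Pa, illustrative value only
    return youngs_modulus * strain

def closure_step(strains):
    """Evaluate the stress at every material point independently, in parallel."""
    with ProcessPoolExecutor() as pool:
        return np.array(list(pool.map(stress_from_md, strains)))

if __name__ == "__main__":
    # Hypothetical 1D strains carried by a handful of material points.
    point_strains = np.linspace(0.0, 1.0e-3, 8)
    point_stresses = closure_step(point_strains)
    # In DDMP these stresses would next be mapped to grid nodes to update momentum.
    print(point_stresses)
```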

  7. A multi-scale spatial approach to address environmental effects of small hydropower development.

    PubMed

    McManamay, Ryan A; Samu, Nicole; Kao, Shih-Chieh; Bevelhimer, Mark S; Hetrick, Shelaine C

    2015-01-01

    Hydropower development continues to grow worldwide in developed and developing countries. While the ecological and physical responses to dam construction have been well documented, translating this information into planning for hydropower development is extremely difficult. Very few studies have conducted environmental assessments to guide site-specific or widespread hydropower development. Herein, we propose a spatial approach for estimating environmental effects of hydropower development at multiple scales, as opposed to individual site-by-site assessments (e.g., environmental impact assessment). Because the complex, process-driven effects of future hydropower development may be uncertain or, at best, limited by available information, we invested considerable effort in describing novel approaches to represent environmental concerns using spatial data and in developing the spatial footprint of hydropower infrastructure. We then use two case studies in the US, one at the scale of the conterminous US and another within two adjoining river basins, to examine how environmental concerns can be identified and related to areas of varying energy capacity. We use combinations of reserve-design planning and multi-metric ranking to visualize tradeoffs among environmental concerns and potential energy capacity. Spatial frameworks, like the one presented, are not meant to replace more in-depth environmental assessments, but to identify information gaps and measure the sustainability of multi-development scenarios so as to inform policy decisions at the basin or national level. Most importantly, the approach should foster discussions among environmental scientists and stakeholders regarding solutions to optimize energy development and environmental sustainability.
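
    A generic multi-metric ranking step of the kind described above can be sketched as: normalize each environmental concern, apply stakeholder weights, and rank candidate sites against potential energy capacity. The metric names, weights and trade-off index below are invented for illustration and are not the ranking used in the study.

```python
import numpy as np

# Hypothetical candidate sites: rows are sites, columns are environmental metrics
# (e.g. fish-habitat value, protected-area overlap, sediment risk); higher = more concern.
concerns = np.array([
    [0.2, 0.1, 0.4],
    [0.7, 0.3, 0.2],
    [0.5, 0.9, 0.6],
])
capacity_mw = np.array([12.0, 30.0, 8.0])   # potential energy capacity per site
weights = np.array([0.5, 0.3, 0.2])         # illustrative stakeholder weights

# Min-max normalize each concern column to [0, 1], then combine with weights.
norm = (concerns - concerns.min(axis=0)) / (np.ptp(concerns, axis=0) + 1e-12)
concern_score = norm @ weights

# Simple trade-off index: capacity per unit of weighted environmental concern.
tradeoff = capacity_mw / (concern_score + 1e-12)
ranking = np.argsort(-tradeoff)
print("site ranking (best first):", ranking)
```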

  8. Evaluating Unmanned Aerial Platforms for Cultural Heritage Large Scale Mapping

    NASA Astrophysics Data System (ADS)

    Georgopoulos, A.; Oikonomou, C.; Adamopoulos, E.; Stathopoulou, E. K.

    2016-06-01

    Large-scale mapping of limited areas, especially cultural heritage sites, poses particular challenges. Optical and non-optical sensors, such as LiDAR units, are now built at sizes and weights that can be lifted by unmanned aerial platforms. At the same time there is increasing emphasis on solutions that give users access to 3D information faster and more cheaply. Considering the multitude of platforms and cameras, and the advancement of algorithms in conjunction with the increase in available computing power, this challenge should be, and indeed is being, further investigated. In this paper a short review of today's UAS technologies is attempted. A discussion follows on their applicability and advantages, depending on their specifications, which vary immensely. The available on-board cameras are also compared and evaluated for large-scale mapping. Furthermore, a thorough analysis, review and experimentation with different software implementations of Structure from Motion and Multiple View Stereo algorithms, able to process such dense and mostly unordered sequences of digital images, is conducted and presented. As a test data set, we use a rich optical and thermal data set acquired with different cameras from both fixed-wing and multi-rotor platforms over an archaeological excavation with adverse height variations. Dense 3D point clouds, digital terrain models and orthophotos have been produced and evaluated for their radiometric as well as metric qualities.
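
    The Structure-from-Motion front end that such software packages implement can be illustrated at its smallest scale, two overlapping frames: feature detection, ratio-test matching, and relative pose from the essential matrix. This is a toy OpenCV sketch only; the image file names and the camera matrix are placeholders, and a full SfM/MVS pipeline adds bundle adjustment and dense matching.

```python
import cv2
import numpy as np

# Placeholder inputs: two overlapping aerial frames and an approximate camera matrix.
img1 = cv2.imread("frame_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_002.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[3000.0, 0.0, 2000.0],
              [0.0, 3000.0, 1500.0],
              [0.0, 0.0, 1.0]])

# Detect and describe features, then keep matches passing Lowe's ratio test.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2) if m.distance < 0.75 * n.distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Relative orientation of the second frame with respect to the first.
E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
print("matches:", len(good))
print("rotation:\n", R, "\ntranslation direction:", t.ravel())
```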

  9. AMMA-CATCH a Hydrological, Meteorological and Ecological Long Term Observatory on West Africa : Some Recent Results

    NASA Astrophysics Data System (ADS)

    Galle, S.; Grippa, M.; Peugeot, C.; Bouzou Moussa, I.; Cappelaere, B.; Demarty, J.; Mougin, E.; Lebel, T.; Chaffard, V.

    2015-12-01

    AMMA-CATCH is a multi-scale observation system dedicated to long-term monitoring of the water cycle, the vegetation dynamics and their interaction with climate and water resources in West Africa. In the context of global change, long-term observations are required to i) gain understanding of eco-hydrological processes over this highly contrasted region, ii) help their representation in Earth System Models, and iii) detect trends and infer their impacts on water resources and living conditions. The system comprises three meso-scale sites (~1°×1°) in Mali, Niger and Benin, extending along the West African eco-climatic gradient. Within this regional window (5° by 9°), each of the three sites includes a multi-scale set-up which helps document the components of the hydrologic budget and the evolution of surface conditions over a range of time scales: rain gauges, piezometers, river discharge stations, soil moisture and temperature profiles, turbulent flux measurements, and LAI/biomass monitoring. This observation system has been continuously generating coherent datasets for 10 to 25 years, depending on the dataset. It is jointly operated by French and African (Mali, Niger and Benin) research institutions. The database is available to the community through the website (www.amma-catch.org). AMMA-CATCH is a member of the French critical zone observatory network "Réseau des Bassins Versants" (RBV). AMMA-CATCH participates in several global or regional observation networks, such as FluxNet, CarboAfrica and the International Soil Moisture Network (ISMN), and in calibration/validation campaigns for satellite missions such as SMOS (CNES, ESA), MEGHA-TROPIQUES (France/India) and SMAP (NASA). AMMA-CATCH fills a gap over a region, West Africa, where environmental data are largely lacking, and can thus usefully contribute to the international networking effort for environmental monitoring and research. Recent results on the regional evolution of land cover and rainfall intensity and their consequences for eco-hydrological processes and hydrosystems will be presented.

  10. A cloud-based framework for large-scale traditional Chinese medical record retrieval.

    PubMed

    Liu, Lijun; Liu, Li; Fu, Xiaodong; Huang, Qingsong; Zhang, Xianwen; Zhang, Yin

    2018-01-01

    Electronic medical records are increasingly common in medical practice, and their secondary use has become increasingly important. Such use relies on the ability to retrieve complete information about desired patient populations, and retrieving relevant records effectively and accurately from large-scale medical big data is a major challenge. We therefore propose an efficient and robust cloud-based framework for large-scale Traditional Chinese Medical Record (TCMR) retrieval. First, we propose a parallel index-building method and build a distributed search cluster: the former improves the performance of index building, and the latter provides highly concurrent online TCMR retrieval. Then, a real-time multi-indexing model is proposed to ensure the latest relevant TCMRs are indexed and retrieved in real time, and a semantics-based query expansion method and a multi-factor ranking model are proposed to improve retrieval quality. Third, we implement a template-based visualization method for displaying medical reports. The proposed parallel indexing method and distributed search cluster improve the performance of index building and provide highly concurrent online TCMR retrieval; the multi-indexing model ensures the latest relevant TCMRs are indexed and retrieved in real time; the semantic expansion method and the multi-factor ranking model enhance retrieval quality; and the template-based visualization method enhances availability and universality by displaying medical reports via a friendly web interface. In conclusion, compared with current medical record retrieval systems, our system offers advantages that improve the secondary use of large-scale traditional Chinese medical records in a cloud environment, and it can be more easily integrated with existing clinical systems and used in various scenarios. Copyright © 2017. Published by Elsevier Inc.
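
    The general shape of semantics-based query expansion plus multi-factor ranking can be sketched generically: expand query terms through a synonym table, then score each record by weighted factors such as term overlap and recency. The synonym table, factor weights and record fields below are illustrative assumptions, not the models used in the paper.

```python
from datetime import date

# Hypothetical domain synonym table used for query expansion.
SYNONYMS = {"cough": ["ke sou"], "ginseng": ["ren shen"]}

def expand_query(terms):
    """Return the original query terms plus their listed synonyms."""
    expanded = set(terms)
    for t in terms:
        expanded.update(SYNONYMS.get(t, []))
    return expanded

def score(record, terms, today=date(2018, 1, 1), w_match=0.7, w_recency=0.3):
    """Weighted combination of term overlap and recency (illustrative factors only)."""
    overlap = len(terms & set(record["tokens"])) / max(len(terms), 1)
    age_days = (today - record["date"]).days
    recency = 1.0 / (1.0 + age_days / 365.0)
    return w_match * overlap + w_recency * recency

records = [
    {"id": 1, "tokens": {"cough", "ren shen"}, "date": date(2017, 6, 1)},
    {"id": 2, "tokens": {"ke sou", "fever"}, "date": date(2016, 1, 1)},
]
q = expand_query(["cough", "ginseng"])
ranked = sorted(records, key=lambda r: score(r, q), reverse=True)
print([r["id"] for r in ranked])
```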

  11. Probing Inflation Using Galaxy Clustering On Ultra-Large Scales

    NASA Astrophysics Data System (ADS)

    Dalal, Roohi; de Putter, Roland; Dore, Olivier

    2018-01-01

    A detailed understanding of curvature perturbations in the universe is necessary to constrain theories of inflation. In particular, measurements of the local non-Gaussianity parameter, f_NL^loc, enable us to distinguish between two broad classes of inflationary theories, single-field and multi-field inflation. While most single-field theories predict f_NL^loc ≈ -(5/12)(n_s - 1), in multi-field theories f_NL^loc is not constrained to this value and is allowed to be observably large. Achieving σ(f_NL^loc) = 1 would give us discovery potential for detecting multi-field inflation, while finding f_NL^loc = 0 would rule out a good fraction of interesting multi-field models. We study the use of galaxy clustering on ultra-large scales to achieve this level of constraint on f_NL^loc. Upcoming surveys such as Euclid and LSST will give us galaxy catalogs from which we can construct the galaxy power spectrum and hence infer a value of f_NL^loc. We consider two possible methods of determining the galaxy power spectrum from a catalog of galaxy positions: the traditional Feldman-Kaiser-Peacock (FKP) power spectrum estimator, and an Optimal Quadratic Estimator (OQE). We implemented and tested each method using mock galaxy catalogs, and compared the resulting constraints on f_NL^loc. We find that the FKP estimator can measure f_NL^loc in an unbiased way, but there remains room for improvement in its precision. We also find that the OQE is not computationally fast, but remains a promising option due to its ability to isolate the power spectrum at large scales. We plan to extend this research to study alternative methods, such as pixel-based likelihood functions. We also plan to study the impact of general relativistic effects at these scales on our ability to measure f_NL^loc.
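
    For orientation, the simplest power spectrum measurement behind such estimators is the periodic-box FFT estimate sketched below. This is not the FKP estimator or the OQE: it ignores the survey window, FKP weighting and shot-noise subtraction, and the box size, grid and Gaussian test field are arbitrary choices for the example.

```python
import numpy as np

def power_spectrum(delta, box_size, n_bins=20):
    """Minimal periodic-box P(k) from a gridded overdensity field (toy version)."""
    n = delta.shape[0]
    v_cell = (box_size / n) ** 3
    volume = box_size ** 3
    delta_k = np.fft.fftn(delta) * v_cell           # discrete approximation to the FT
    pk_grid = (np.abs(delta_k) ** 2 / volume).ravel()

    k_axis = 2.0 * np.pi * np.fft.fftfreq(n, d=box_size / n)
    kx, ky, kz = np.meshgrid(k_axis, k_axis, k_axis, indexing="ij")
    k_mag = np.sqrt(kx**2 + ky**2 + kz**2).ravel()

    keep = k_mag > 0                                # drop the k = 0 (mean) mode
    bins = np.linspace(0.0, k_mag.max(), n_bins + 1)
    counts, _ = np.histogram(k_mag[keep], bins)
    k_sum, _ = np.histogram(k_mag[keep], bins, weights=k_mag[keep])
    p_sum, _ = np.histogram(k_mag[keep], bins, weights=pk_grid[keep])
    valid = counts > 0
    return k_sum[valid] / counts[valid], p_sum[valid] / counts[valid]

# Gaussian noise field on a 64^3 grid in a 512 (Mpc/h) box as a stand-in for a galaxy overdensity grid.
rng = np.random.default_rng(0)
k, p = power_spectrum(rng.standard_normal((64, 64, 64)), box_size=512.0)
print(k[:5], p[:5])
```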

  12. The OMIV Observatory on landslides - Observing with Multi-parameters the Instability of Versants

    NASA Astrophysics Data System (ADS)

    Grasso, J.-R.; Garambois, S.; Jongmans, D.; Helmstetter, A.; Lebourg, T.; Malet, J.-P.; Berolo, W.; Bethoux, R.; Daras, L.; Ulrich, P.

    2010-05-01

    The OMIV Observatory on landslides (Observatoire Multi-disciplinaire des Instabilités de Versants, i.e. Multi-disciplinary Observatory on Slope Instabilities) is a French research initiative clustering five research institutes in earth sciences (GéoAzur in Nice, EOST-IPGS in Strasbourg, LETG in Caen, LGIT in Grenoble, LST in Lyon) under the auspices of INSU (Institut National des Sciences de l'Univers) since 2007. The primary objectives of OMIV are (1) to deploy and maintain permanent instrumental networks in order (2) to provide robust, long-lasting, multi-parameter, open datasets to the international geoscience community. Such continuous monitoring of ongoing landslides has been missing, and the resulting datasets will provide constraints on the processes that lead to slope instabilities. Worldwide, landslides are among the most important natural hazards in mountainous and rocky coastal areas. The variability in time and space of slope structures and their susceptibility to external forcing (weathering, earthquakes, climatic triggers) limits our ability to simulate and forecast slope instabilities. Four large active landslides are monitored by the OMIV observatory group; these sites have been chosen according to their past history of monitoring, the risk they may create and the scientific challenges they raise. The four studied landslides are: the Avignonet landslide (30 km south of Grenoble) and the Super-Sauze landslide (5 km south of Barcelonnette), which are soft-rock slides developed in clays for which the susceptibility to rainfall and earthquakes is the main open question; and the La Clapière landslide (100 km north of Nice) and the Séchilienne landslide (25 km east of Grenoble), which are typical mature and immature large-scale rock-mass gravitational instabilities, respectively. On these four pilot sites, the OMIV research group is continuously monitoring three types of observations: landslide kinematics (deformation and displacements), landslide seismic activity (through passive seismic auscultation), and landslide slope hydrology (hydrodynamics and hydro-geochemistry). These observables are open datasets available through the OMIV website (for the four sites, http://www-lgit.obs.ujf-grenoble.fr/observations/omiv/donnees.html and for the Super-Sauze landslide also at http://eost.u-strasbg.fr/omiv). While kinematics, hydrology and seismic activity are the main observables for many monitored landslides worldwide, only a few sites combine the three types of observables at relevant spatial and temporal scales. It is hypothesized by the OMIV observatory group that the combination of these three measurements will give access to better knowledge of the physical processes controlling landslide behavior, such as the generation of brittle damage in the landslide material during sliding, the recognition and characterization of slip surface(s), and the characterization of the hydrological behavior of the slope before and after failure. It opens possible routes toward characterizing the macro-scale rheology of the systems (e.g. brittle-plastic transition for hard-rock slopes, slide-to-flow transition for soft-rock landslides). The cross-analysis of the monitoring data will bring new insights into the kinematics and dynamics of unstable slopes. In this study, we present (i) the technical organization of the multi-parameter monitoring datasets, and (ii) preliminary results from the ongoing monitoring.

  13. Multi-scale curvature for automated identification of glaciated mountain landscapes

    NASA Astrophysics Data System (ADS)

    Prasicek, Günther; Otto, Jan-Christoph; Montgomery, David; Schrott, Lothar

    2014-05-01

    Automated morphometric interpretation of digital terrain data based on impartial rule sets holds substantial promise for large dataset processing and objective landscape classification. However, the geomorphological realm presents tremendous complexity in the translation of qualitative descriptions into geomorphometric semantics. Here, the simple, conventional distinction of V-shaped fluvial and U-shaped glacial valleys is analyzed quantitatively using the relation of multi-scale curvature and drainage area. Glacial and fluvial erosion shapes mountain landscapes in a long-recognized and characteristic way. Valleys incised by fluvial processes typically have V-shaped cross-sections with uniform and moderately steep slopes, whereas glacial valleys tend to have U-shaped profiles and topographic gradients steepening with distance from valley floor. On a DEM, thalweg cells are determined by a drainage area cutoff and multiple moving window sizes are used to derive per-cell curvature over a variety of scales ranging from the vicinity of the flow path at the valley bottom to catchment sections fully including valley sides. The relation of the curvatures calculated for the user-defined minimum scale and the automatically detected maximum scale is presented as a novel morphometric variable termed Difference of Minimum Curvature (DMC). DMC thresholds determined from typical glacial and fluvial sample catchments are employed to identify quadrats of glaciated and non-glaciated mountain landscapes and the distinctions are validated by field-based geological and geomorphological maps. A first test of the novel algorithm at three study sites in the western United States and a subsequent application to Europe and western Asia demonstrate the transferability of the approach.
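
    The Difference of Minimum Curvature idea can be pictured as: compute a curvature measure of the DEM at a small and at a large moving-window scale, and take their difference per cell. The sketch below uses a Laplacian of a window-smoothed DEM as a curvature proxy, which differs from the exact curvature definition in the paper; the synthetic valley, window sizes and threshold step are illustrative only.

```python
import numpy as np
from scipy.ndimage import uniform_filter, laplace

def curvature_at_scale(dem, window):
    """Curvature proxy: Laplacian of the DEM smoothed over a square window."""
    return laplace(uniform_filter(dem, size=window))

def difference_of_minimum_curvature(dem, small_window=3, large_window=31):
    """DMC-style variable: small-scale minus large-scale curvature per cell."""
    return curvature_at_scale(dem, small_window) - curvature_at_scale(dem, large_window)

# Synthetic U-shaped (parabolic) valley cross-section plus noise as a stand-in DEM.
x = np.linspace(-1.0, 1.0, 200)
dem = 100.0 * (x[None, :] ** 2) + np.random.default_rng(1).normal(0.0, 0.5, (200, 200))
dmc = difference_of_minimum_curvature(dem)
# A threshold on dmc (calibrated on sample catchments in the paper) would then
# separate glacially from fluvially shaped quadrats.
print(round(float(dmc.mean()), 4), round(float(dmc.std()), 4))
```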

  14. 77 FR 52754 - Draft Midwest Wind Energy Multi-Species Habitat Conservation Plan Within Eight-State Planning Area

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ... include new and existing small-scale wind energy facilities, such as single-turbine demonstration projects, as well as large, multi-turbine commercial wind facilities. Covered Species: The planning partners are...

  15. Boise Hydrogeophysical Research Site: Control Volume/Test Cell and Community Research Asset

    NASA Astrophysics Data System (ADS)

    Barrash, W.; Bradford, J.; Malama, B.

    2008-12-01

    The Boise Hydrogeophysical Research Site (BHRS) is a research wellfield or field-scale test facility developed in a shallow, coarse, fluvial aquifer with the objectives of supporting: (a) development of cost-effective, non- or minimally-invasive quantitative characterization and imaging methods in heterogeneous aquifers using hydrologic and geophysical techniques; (b) examination of fundamental relationships and processes at multiple scales; (c) testing theories and models for groundwater flow and solute transport; and (d) education and training of students in multidisciplinary subsurface science and engineering. The design of the wells and the wellfield supports modular use and reoccupation of wells for a wide range of single-well, cross-hole, multiwell and multilevel hydrologic, geophysical, and combined hydrologic-geophysical experiments. Efforts to date by Boise State researchers and collaborators have been largely focused on: (a) establishing the 3D distributions of geologic, hydrologic, and geophysical parameters which can then be used as the basis for jointly inverting hard and soft data to return the 3D K distribution and (b) developing subsurface measurement and imaging methods, including tomographic characterization and imaging methods. At this point the hydrostratigraphic framework of the BHRS is known to be a hierarchical multi-scale system which includes layers and lenses that are recognized with geologic, hydrologic, radar, seismic, and EM methods; details are now emerging which may allow 3D deterministic characterization of zones and/or material variations at the meter scale in the central wellfield. Also, the site design and subsurface framework have supported a variety of testing configurations for joint hydrologic and geophysical experiments. Going forward we recognize the opportunity to increase the R&D returns from use of the BHRS with additional infrastructure (especially for monitoring the vadose zone and surface water-groundwater interactions), more collaborative activity, and greater access to site data. Our broader goal of becoming more available as a research asset for the scientific community also supports the long-term business plan of increasing funding opportunities to maintain and operate the site.

  16. A Deep Underground Science and Engineering Laboratory (DUSEL) at Kimballton

    NASA Astrophysics Data System (ADS)

    Vogelaar, R. Bruce

    2004-11-01

    The National Academy of Sciences, as well as several long-range plans from the physics communities, has endorsed the need to create a large, multi-disciplinary underground laboratory in the US. Several potential sites have been identified, and the National Science Foundation has begun a solicitation process to help formulate the science program as well as to identify and develop candidate sites. The only site on the East Coast is at Kimballton, near Blacksburg, in western Virginia. Of all the sites, it is the only one located in sedimentary rocks. This makes it an ideal and unique location for physics, geoscience, and engineering studies. Kimballton is also only half an hour from Virginia Tech, the largest university in the state of Virginia. A multi-institution group has been developing this possibility, and will be competing on the national scale to have DUSEL located at Kimballton. One of the assets of this location is a large limestone mine, already at a depth of 2300 ft (1850 mwe), with true drive-in access and extremely large caverns. The DUSEL facility at this location will try to take advantage of the existing infrastructure, while at the same time developing complementary and adjacent facilities down to 7000 ft (6000 mwe) to allow independent operation of the future facility. Since 2003, Virginia Tech and the Naval Research Laboratory have also been working to develop a general low-level facility at this location. The initial program is to help develop extremely low-background germanium and gas proportional counters, and a single super-module of the Low-Energy Neutrino Spectroscopy (LENS) detector -- designed to measure the real-time low-energy neutrino spectrum from the Sun, including the pp-flux. Progress in this program (including seismic imaging), and the proposed overall extensive science program (Phys, Geo, Eng, Bio) which can be addressed at Kimballton, will be presented. For further information, see our webpage http://www.phys.vt.edu/ kimballton/ Clearly, if such a national facility were located in the southeast, it would be a tremendous resource to regional universities and laboratories. New partners and collaborators are very welcome.

  17. Validation of Mismatch Negativity and P3a for Use in Multi-Site Studies of Schizophrenia: Characterization of Demographic, Clinical, Cognitive, and Functional Correlates in COGS-2

    PubMed Central

    Light, Gregory A.; Swerdlow, Neal R.; Thomas, Michael L.; Calkins, Monica E.; Green, Michael F.; Greenwood, Tiffany A.; Gur, Raquel E.; Gur, Ruben C.; Lazzeroni, Laura C.; Nuechterlein, Keith H.; Pela, Marlena; Radant, Allen D.; Seidman, Larry J.; Sharp, Richard F.; Siever, Larry J.; Silverman, Jeremy M.; Sprock, Joyce; Stone, William S.; Sugar, Catherine A.; Tsuang, Debby W.; Tsuang, Ming T.; Braff, David L.; Turetsky, Bruce I.

    2014-01-01

    Mismatch negativity (MMN) and P3a are auditory event-related potential (ERP) components that show robust deficits in schizophrenia (SZ) patients and exhibit qualities of endophenotypes, including substantial heritability, test-retest reliability, and trait-like stability. These measures also fulfill criteria for use as cognition and function-linked biomarkers in outcome studies, but have not yet been validated for use in large-scale multi-site clinical studies. This study tested the feasibility of adding MMN and P3a to the ongoing Consortium on the Genetics of Schizophrenia (COGS) study. The extent to which demographic, clinical, cognitive, and functional characteristics contribute to variability in MMN and P3a amplitudes was also examined. Participants (HCS n=824, SZ n=966) underwent testing at 5 geographically distributed COGS laboratories. Valid ERP data was obtained from 91% of HCS and 91% of SZ patients. Highly significant MMN (d=0.96) and P3a (d=0.93) amplitude reductions were observed in SZ patients, comparable in magnitude to those observed in single-lab studies with no appreciable differences across laboratories. Demographic characteristics accounted for 26% and 18% of the variance in MMN and P3a amplitudes, respectively. Significant relationships were observed among demographically-adjusted MMN and P3a measures and medication status as well as several clinical, cognitive, and functional characteristics of the SZ patients. This study demonstrates that MMN and P3a ERP biomarkers can be feasibly used in multi-site clinical studies. As with many clinical tests of brain function, demographic factors contribute to MMN and P3a amplitudes and should be carefully considered in future biomarker-informed clinical studies. PMID:25449710
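
    The amplitude reductions quoted above are standardized effect sizes (Cohen's d). For reference, a minimal pooled-standard-deviation computation looks like the sketch below; the group values are invented, not COGS data.

```python
import numpy as np

def cohens_d(group_a, group_b):
    """Cohen's d using the pooled standard deviation of the two groups."""
    a, b = np.asarray(group_a, float), np.asarray(group_b, float)
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Invented MMN amplitudes in microvolts (more negative = larger MMN).
hcs = np.random.default_rng(2).normal(-3.0, 1.2, 200)
sz = np.random.default_rng(3).normal(-2.0, 1.2, 200)
print(round(cohens_d(hcs, sz), 2))
```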

  18. Validation of mismatch negativity and P3a for use in multi-site studies of schizophrenia: characterization of demographic, clinical, cognitive, and functional correlates in COGS-2.

    PubMed

    Light, Gregory A; Swerdlow, Neal R; Thomas, Michael L; Calkins, Monica E; Green, Michael F; Greenwood, Tiffany A; Gur, Raquel E; Gur, Ruben C; Lazzeroni, Laura C; Nuechterlein, Keith H; Pela, Marlena; Radant, Allen D; Seidman, Larry J; Sharp, Richard F; Siever, Larry J; Silverman, Jeremy M; Sprock, Joyce; Stone, William S; Sugar, Catherine A; Tsuang, Debby W; Tsuang, Ming T; Braff, David L; Turetsky, Bruce I

    2015-04-01

    Mismatch negativity (MMN) and P3a are auditory event-related potential (ERP) components that show robust deficits in schizophrenia (SZ) patients and exhibit qualities of endophenotypes, including substantial heritability, test-retest reliability, and trait-like stability. These measures also fulfill criteria for use as cognition and function-linked biomarkers in outcome studies, but have not yet been validated for use in large-scale multi-site clinical studies. This study tested the feasibility of adding MMN and P3a to the ongoing Consortium on the Genetics of Schizophrenia (COGS) study. The extent to which demographic, clinical, cognitive, and functional characteristics contribute to variability in MMN and P3a amplitudes was also examined. Participants (HCS n=824, SZ n=966) underwent testing at 5 geographically distributed COGS laboratories. Valid ERP recordings were obtained from 91% of HCS and 91% of SZ patients. Highly significant MMN (d=0.96) and P3a (d=0.93) amplitude reductions were observed in SZ patients, comparable in magnitude to those observed in single-lab studies with no appreciable differences across laboratories. Demographic characteristics accounted for 26% and 18% of the variance in MMN and P3a amplitudes, respectively. Significant relationships were observed among demographically-adjusted MMN and P3a measures and medication status as well as several clinical, cognitive, and functional characteristics of the SZ patients. This study demonstrates that MMN and P3a ERP biomarkers can be feasibly used in multi-site clinical studies. As with many clinical tests of brain function, demographic factors contribute to MMN and P3a amplitudes and should be carefully considered in future biomarker-informed clinical studies. Published by Elsevier B.V.

  19. SHRINE: Enabling Nationally Scalable Multi-Site Disease Studies

    PubMed Central

    McMurry, Andrew J.; Murphy, Shawn N.; MacFadden, Douglas; Weber, Griffin; Simons, William W.; Orechia, John; Bickel, Jonathan; Wattanasin, Nich; Gilbert, Clint; Trevvett, Philip; Churchill, Susanne; Kohane, Isaac S.

    2013-01-01

    Results of medical research studies are often contradictory or cannot be reproduced. One reason is that there may not be enough patient subjects available for observation for a long enough time period. Another reason is that patient populations may vary considerably with respect to geographic and demographic boundaries thus limiting how broadly the results apply. Even when similar patient populations are pooled together from multiple locations, differences in medical treatment and record systems can limit which outcome measures can be commonly analyzed. In total, these differences in medical research settings can lead to differing conclusions or can even prevent some studies from starting. We thus sought to create a patient research system that could aggregate as many patient observations as possible from a large number of hospitals in a uniform way. We call this system the ‘Shared Health Research Information Network’, with the following properties: (1) reuse electronic health data from everyday clinical care for research purposes, (2) respect patient privacy and hospital autonomy, (3) aggregate patient populations across many hospitals to achieve statistically significant sample sizes that can be validated independently of a single research setting, (4) harmonize the observation facts recorded at each institution such that queries can be made across many hospitals in parallel, (5) scale to regional and national collaborations. The purpose of this report is to provide open source software for multi-site clinical studies and to report on early uses of this application. At this time SHRINE implementations have been used for multi-site studies of autism co-morbidity, juvenile idiopathic arthritis, peripartum cardiomyopathy, colorectal cancer, diabetes, and others. The wide range of study objectives and growing adoption suggest that SHRINE may be applicable beyond the research uses and participating hospitals named in this report. PMID:23533569
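
    The aggregation model described here, in which every hospital answers the same harmonized query locally and only counts leave each site, can be pictured with a minimal sketch. The per-site record stores and query are hypothetical; the real system additionally handles ontology mapping, authentication and count obfuscation.

```python
# Each site runs the harmonized query against its own records and returns only a count.
def site_count(local_records, predicate):
    return sum(1 for r in local_records if predicate(r))

# Hypothetical local record stores at three hospitals (never pooled centrally).
site_a = [{"dx": "autism", "age": 9}, {"dx": "jia", "age": 12}]
site_b = [{"dx": "autism", "age": 7}]
site_c = [{"dx": "diabetes", "age": 15}, {"dx": "autism", "age": 11}]

query = lambda r: r["dx"] == "autism" and r["age"] < 12
per_site = {name: site_count(records, query)
            for name, records in [("A", site_a), ("B", site_b), ("C", site_c)]}
print(per_site, "total:", sum(per_site.values()))
```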

  20. Large-scale multi-stage constructed wetlands for secondary effluents treatment in northern China: Carbon dynamics.

    PubMed

    Wu, Haiming; Fan, Jinlin; Zhang, Jian; Ngo, Huu Hao; Guo, Wenshan

    2018-02-01

    Multi-stage constructed wetlands (CWs) have proved to be a cost-effective alternative to conventional single-stage CWs for improving the treatment performance of various wastewaters. However, few long-term, full-scale multi-stage CWs have been operated and evaluated for polishing effluents from domestic wastewater treatment plants (WWTPs). This study investigated the seasonal and spatial dynamics of carbon and the effects of the key factors (input loading and temperature) in the large-scale, seven-stage Wu River CW polishing domestic WWTP effluents in northern China. The results indicated a significant improvement in water quality. Significant seasonal and spatial variations of organics removal were observed in the Wu River CW, with a higher COD removal efficiency of 64-66% in summer and fall. Obvious seasonal and spatial variations of CH₄ and CO₂ emissions were also found, with average CH₄ and CO₂ emission rates of 3.78-35.54 mg m⁻² d⁻¹ and 610.78-8992.71 mg m⁻² d⁻¹, respectively, while the higher CH₄ and CO₂ emission fluxes were obtained in spring and summer. Seasonal air temperatures and inflow COD loading rates significantly affected organics removal and CH₄ emission, but they appeared to have a weak influence on CO₂ emission. Overall, this study suggested that the large-scale Wu River CW might be a potential source of greenhouse gases (GHG), but, considering the sustainability of the multi-stage CW, an inflow COD loading rate of 1.8-2.0 g m⁻² d⁻¹ and a temperature of 15-20 °C may be suitable conditions for achieving higher organics removal efficiency and lower GHG emissions when polishing domestic WWTP effluent. The knowledge obtained on carbon dynamics in the large-scale Wu River CW will be helpful for understanding the carbon cycle, and can also provide useful field experience for the design, operation and management of multi-stage CW treatments. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Up-scaling of multi-variable flood loss models from objects to land use units at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Schröter, Kai; Merz, Bruno

    2016-05-01

    Flood risk management increasingly relies on risk analyses, including loss modelling. Most flood loss models applied in standard practice have in common that complex damaging processes are described by simple approaches like stage-damage functions. Novel multi-variable models significantly improve loss estimation at the micro-scale and may also be advantageous for large-scale applications. However, more input parameters also introduce additional uncertainty, even more so in up-scaling procedures for meso-scale applications, where the parameters need to be estimated on a regional, area-wide basis. To gain more knowledge about the challenges associated with the up-scaling of multi-variable flood loss models, the following approach is applied: single- and multi-variable micro-scale flood loss models are up-scaled and applied at the meso-scale, namely on the basis of ATKIS land-use units. Application and validation are undertaken in 19 municipalities that were affected by the 2002 flood of the River Mulde in Saxony, Germany, by comparison with official loss data provided by the Saxon Relief Bank (SAB). In the meso-scale, case-study-based model validation, most multi-variable models show smaller errors than the uni-variable stage-damage functions. The results show the suitability of the up-scaling approach and, in accordance with micro-scale validation studies, that multi-variable models are an improvement in flood loss modelling on the meso-scale as well. However, uncertainties remain high, stressing the importance of uncertainty quantification. Thus, the development of probabilistic loss models, like the BT-FLEMO model used in this study, which inherently provide uncertainty information, is the way forward.
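
    The contrast between a uni-variable stage-damage function and a multi-variable loss model can be made concrete with a toy sketch. The functional forms, coefficients and predictor names below are invented for illustration and are not FLEMO or BT-FLEMO.

```python
import numpy as np

def stage_damage(depth_m):
    """Uni-variable stage-damage function: loss ratio from water depth only."""
    return np.clip(0.25 * np.sqrt(np.maximum(depth_m, 0.0)), 0.0, 1.0)

def multi_variable_loss(depth_m, building_quality, precaution_score):
    """Toy multi-variable model: depth-based loss modified by two further predictors."""
    base = stage_damage(depth_m)
    modifier = 1.0 + 0.3 * (1.0 - building_quality) - 0.2 * precaution_score
    return np.clip(base * modifier, 0.0, 1.0)

depth = np.array([0.5, 1.5, 3.0])  # water depths in metres
print("stage-damage loss ratios:  ", np.round(stage_damage(depth), 3))
print("multi-variable loss ratios:", np.round(
    multi_variable_loss(depth, building_quality=0.8, precaution_score=0.5), 3))
```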

  2. PedsQL™ Multidimensional Fatigue Scale in sickle cell disease: feasibility, reliability, and validity.

    PubMed

    Panepinto, Julie A; Torres, Sylvia; Bendo, Cristiane B; McCavit, Timothy L; Dinu, Bogdan; Sherman-Bien, Sandra; Bemrich-Stolz, Christy; Varni, James W

    2014-01-01

    Sickle cell disease (SCD) is an inherited blood disorder characterized by a chronic hemolytic anemia that can contribute to fatigue and global cognitive impairment in patients. The study objective was to report on the feasibility, reliability, and validity of the PedsQL™ Multidimensional Fatigue Scale in SCD for pediatric patient self-report ages 5-18 years and parent proxy-report for ages 2-18 years. This was a cross-sectional multi-site study whereby 240 pediatric patients with SCD and 303 parents completed the 18-item PedsQL™ Multidimensional Fatigue Scale. Participants also completed the PedsQL™ 4.0 Generic Core Scales. The PedsQL™ Multidimensional Fatigue Scale evidenced excellent feasibility, excellent reliability for the Total Scale Scores (patient self-report α = 0.90; parent proxy-report α = 0.95), and acceptable reliability for the three individual scales (patient self-report α = 0.77-0.84; parent proxy-report α = 0.90-0.97). Intercorrelations of the PedsQL™ Multidimensional Fatigue Scale with the PedsQL™ Generic Core Scales were predominantly in the large (≥0.50) range, supporting construct validity. PedsQL™ Multidimensional Fatigue Scale Scores were significantly worse with large effect sizes (≥0.80) for patients with SCD than for a comparison sample of healthy children, supporting known-groups discriminant validity. Confirmatory factor analysis demonstrated an acceptable to excellent model fit in SCD. The PedsQL™ Multidimensional Fatigue Scale demonstrated acceptable to excellent measurement properties in SCD. The results demonstrate the relative severity of fatigue symptoms in pediatric patients with SCD, indicating the potential clinical utility of multidimensional assessment of fatigue in patients with SCD in clinical research and practice. © 2013 Wiley Periodicals, Inc.
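
    The internal-consistency reliabilities quoted above are Cronbach's alpha coefficients; the coefficient itself is straightforward to compute from an items-by-respondents score matrix, as in this sketch with invented item scores (not PedsQL™ data).

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

# Invented responses of 6 children to a 4-item fatigue scale (scored 0-4 each).
scores = np.array([
    [1, 2, 1, 2],
    [3, 3, 4, 3],
    [0, 1, 0, 1],
    [2, 2, 3, 2],
    [4, 3, 4, 4],
    [1, 1, 2, 1],
])
print(round(cronbach_alpha(scores), 2))
```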

  3. PedsQL™ Multidimensional Fatigue Scale in Sickle Cell Disease: Feasibility, Reliability and Validity

    PubMed Central

    Panepinto, Julie A.; Torres, Sylvia; Bendo, Cristiane B.; McCavit, Timothy L.; Dinu, Bogdan; Sherman-Bien, Sandra; Bemrich-Stolz, Christy; Varni, James W.

    2013-01-01

    Background Sickle cell disease (SCD) is an inherited blood disorder characterized by a chronic hemolytic anemia that can contribute to fatigue and global cognitive impairment in patients. The study objective was to report on the feasibility, reliability, and validity of the PedsQL™ Multidimensional Fatigue Scale in SCD for pediatric patient self-report ages 5–18 years and parent proxy-report for ages 2–18 years. Procedure This was a cross-sectional multi-site study whereby 240 pediatric patients with SCD and 303 parents completed the 18-item PedsQL™ Multidimensional Fatigue Scale. Participants also completed the PedsQL™ 4.0 Generic Core Scales. Results The PedsQL™ Multidimensional Fatigue Scale evidenced excellent feasibility, excellent reliability for the Total Scale Scores (patient self-report α = 0.90; parent proxy-report α = 0.95), and acceptable reliability for the three individual scales (patient self-report α = 0.77–0.84; parent proxy-report α = 0.90–0.97). Intercorrelations of the PedsQL™ Multidimensional Fatigue Scale with the PedsQL™ Generic Core Scales were predominantly in the large (≥ 0.50) range, supporting construct validity. PedsQL™ Multidimensional Fatigue Scale Scores were significantly worse with large effect sizes (≥0.80) for patients with SCD than for a comparison sample of healthy children, supporting known-groups discriminant validity. Confirmatory factor analysis demonstrated an acceptable to excellent model fit in SCD. Conclusions The PedsQL™ Multidimensional Fatigue Scale demonstrated acceptable to excellent measurement properties in SCD. The results demonstrate the relative severity of fatigue symptoms in pediatric patients with SCD, indicating the potential clinical utility of multidimensional assessment of fatigue in patients with SCD in clinical research and practice. PMID:24038960

  4. Validating Remotely Sensed Land Surface Evapotranspiration Based on Multi-scale Field Measurements

    NASA Astrophysics Data System (ADS)

    Jia, Z.; Liu, S.; Xu, Z.; Liang, S.

    2012-12-01

    The land surface evapotranspiration plays an important role in the surface energy balance and the water cycle, and there have been significant technical and theoretical advances in our knowledge of evapotranspiration over the past two decades. Acquisition of the temporally and spatially continuous distribution of evapotranspiration using remote sensing technology has attracted widespread attention from researchers and managers. However, remote sensing technology still carries many uncertainties, arising from model mechanisms, model inputs, parameterization schemes, and scaling issues in regional estimation. Achieving remotely sensed evapotranspiration (RS_ET) with sufficient certainty is required but difficult. As a result, it is indispensable to develop validation methods to quantitatively assess the accuracy and error sources of regional RS_ET estimates. This study proposes an innovative validation method based on multi-scale evapotranspiration acquired from field measurements, with the validation results including accuracy assessment, error source analysis, and uncertainty analysis of the validation process. It is a potentially useful approach to evaluate the accuracy and analyze the spatio-temporal properties of RS_ET at both the basin and local scales, and is appropriate for validating RS_ET at diverse resolutions and different time scales. An independent RS_ET validation using this method was carried out over the Hai River Basin, China, for 2002-2009 as a case study. Validation at the basin scale showed good agreement between the 1 km annual RS_ET and validation data such as water-balance evapotranspiration, MODIS evapotranspiration products, precipitation, and land-use types. Validation at the local scale also showed good results for monthly and daily RS_ET at 30 m and 1 km resolutions, compared with multi-scale evapotranspiration measurements from eddy covariance (EC) and large aperture scintillometer (LAS) systems, respectively, using a footprint model over three typical landscapes. Although some validation experiments demonstrated that the models yield accurate estimates at flux measurement sites, the question remains whether they perform well over the broader landscape. Moreover, a large number of RS_ET products have been released in recent years. Thus, we also consider the cross-validation of RS_ET derived from multi-source models. "The Multi-scale Observation Experiment on Evapotranspiration over Heterogeneous Land Surfaces: Flux Observation Matrix" campaign was carried out in the middle reaches of the Heihe River Basin, China, in 2012. Flux measurements from an observation matrix composed of 22 EC systems and 4 LAS instruments were acquired to investigate the cross-validation of multi-source models over different landscapes. In this case, six remote sensing models, including an empirical statistical model, one-source and two-source models, a Penman-Monteith-based model, a Priestley-Taylor-based model, and a complementary-relationship-based model, are used to perform an intercomparison. All the results from the two cases of RS_ET validation show that the proposed validation methods are reasonable and feasible.
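
    The basin-scale "water-balance evapotranspiration" used as validation data is presumably the residual of the catchment water balance; in standard notation (an assumption on my part, not an equation quoted from the study) it reads:

```latex
\mathrm{ET} \approx P - R - \Delta S
```

    where P is precipitation, R is runoff and ΔS is the change in basin water storage over the averaging period, often taken as negligible for annual or multi-year totals.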

  5. Large-scale recording of thalamocortical circuits: in vivo electrophysiology with the two-dimensional electronic depth control silicon probe.

    PubMed

    Fiáth, Richárd; Beregszászi, Patrícia; Horváth, Domonkos; Wittner, Lucia; Aarts, Arno A A; Ruther, Patrick; Neves, Hercules P; Bokor, Hajnalka; Acsády, László; Ulbert, István

    2016-11-01

    Recording simultaneous activity of a large number of neurons in distributed neuronal networks is crucial to understand higher order brain functions. We demonstrate the in vivo performance of a recently developed electrophysiological recording system comprising a two-dimensional, multi-shank, high-density silicon probe with integrated complementary metal-oxide semiconductor electronics. The system implements the concept of electronic depth control (EDC), which enables the electronic selection of a limited number of recording sites on each of the probe shafts. This innovative feature of the system permits simultaneous recording of local field potentials (LFP) and single- and multiple-unit activity (SUA and MUA, respectively) from multiple brain sites with high quality and without the actual physical movement of the probe. To evaluate the in vivo recording capabilities of the EDC probe, we recorded LFP, MUA, and SUA in acute experiments from cortical and thalamic brain areas of anesthetized rats and mice. The advantages of large-scale recording with the EDC probe are illustrated by investigating the spatiotemporal dynamics of pharmacologically induced thalamocortical slow-wave activity in rats and by the two-dimensional tonotopic mapping of the auditory thalamus. In mice, spatial distribution of thalamic responses to optogenetic stimulation of the neocortex was examined. Utilizing the benefits of the EDC system may result in a higher yield of useful data from a single experiment compared with traditional passive multielectrode arrays, and thus in the reduction of animals needed for a research study. Copyright © 2016 the American Physiological Society.

  6. Multi-Scale Three-Dimensional Variational Data Assimilation System for Coastal Ocean Prediction

    NASA Technical Reports Server (NTRS)

    Li, Zhijin; Chao, Yi; Li, P. Peggy

    2012-01-01

    A multi-scale three-dimensional variational data assimilation system (MS-3DVAR) has been formulated, and the associated software system has been developed, for improving high-resolution coastal ocean prediction. This system helps improve coastal ocean prediction skill, and has been used in support of operational coastal ocean forecasting systems and field experiments. The system has been developed to improve the capability of data assimilation for assimilating, simultaneously and effectively, sparse vertical profiles and high-resolution remote sensing surface measurements into coastal ocean models, as well as constraining model biases. In this system, the cost function is decomposed into two separate units for the large- and small-scale components, respectively. As such, data assimilation is implemented sequentially from large to small scales, the background error covariance is constructed to be scale-dependent, and a scale-dependent dynamic balance is incorporated. This scheme then allows the large scales and model bias to be effectively constrained by assimilating sparse vertical profiles, and the small scales by assimilating high-resolution surface measurements. This MS-3DVAR enhances the capability of the traditional 3DVAR for assimilating highly heterogeneously distributed observations, such as along-track satellite altimetry data, and particularly for maximizing the extraction of information from limited numbers of vertical profile observations.
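
    Schematically, and in generic incremental 3DVAR notation (my notation, not necessarily the paper's), the two sequentially minimized cost-function units can be written as:

```latex
J_L(\delta x_L) = \tfrac{1}{2}\,\delta x_L^{\top} B_L^{-1}\,\delta x_L
  + \tfrac{1}{2}\,(H_L\,\delta x_L - d_L)^{\top} R_L^{-1} (H_L\,\delta x_L - d_L),
\qquad
J_S(\delta x_S) = \tfrac{1}{2}\,\delta x_S^{\top} B_S^{-1}\,\delta x_S
  + \tfrac{1}{2}\,(H_S\,\delta x_S - d_S)^{\top} R_S^{-1} (H_S\,\delta x_S - d_S),
```

    where the large-scale increment δx_L is found first, the small-scale innovation d_S is recomputed after applying it, and B_L, B_S are the scale-dependent background error covariances.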

  7. Geomorphic analysis of large alluvial rivers

    NASA Astrophysics Data System (ADS)

    Thorne, Colin R.

    2002-05-01

    Geomorphic analysis of a large river presents particular challenges and requires a systematic and organised approach because of the spatial scale and system complexity involved. This paper presents a framework and blueprint for geomorphic studies of large rivers developed in the course of basic, strategic and project-related investigations of a number of large rivers. The framework demonstrates the need to begin geomorphic studies early in the pre-feasibility stage of a river project and carry them through to implementation and post-project appraisal. The blueprint breaks down the multi-layered and multi-scaled complexity of a comprehensive geomorphic study into a number of well-defined and semi-independent topics, each of which can be performed separately to produce a clearly defined, deliverable product. Geomorphology increasingly plays a central role in multi-disciplinary river research and the importance of effective quality assurance makes it essential that audit trails and quality checks are hard-wired into study design. The structured approach presented here provides output products and production trails that can be rigorously audited, ensuring that the results of a geomorphic study can stand up to the closest scrutiny.

  8. Initial assessment of multi-scale measures of CO2 and H2O flux in the Siberian taiga

    Treesearch

    D.Y. Hollinger; F.M. Kelliher; E.-D. Schulze; N.N. Vygodskaya; A. Varlagin; I. Milukova; J.N. Byers; A. Sogachov; J.E. Hunt; T.M. McSeveny; K.I. Kobak; G. Bauer; A. Arneth

    1995-01-01

    We measured CO2 and H2O fluxes between undisturbed Larix gmelinii forest and the atmosphere at a remote Eastern Siberian site in July 1993. Scaled-up leaf-level porometer measurements agreed with those derived from the eddy correlation technique for the canopy fluxes of CO2 and H...

  9. A multi-scalar approach to theorizing socio-ecological dynamics of urban residential landscapes

    Treesearch

    Rinku Roy Chowdhury; Kelli Larson; Morgan Grove; Colin Polsky; Elizabeth Cook; Jeffrey Onsted; Laura Ogden

    2011-01-01

    Urban residential expansion increasingly drives land use, land cover and ecological changes worldwide, yet social science theories explaining such change remain under-developed. Existing theories often focus on processes occurring at one scale, while ignoring other scales. Emerging evidence from four linked U.S. research sites suggests it is essential to examine...

  10. Fast dimension reduction and integrative clustering of multi-omics data using low-rank approximation: application to cancer molecular classification.

    PubMed

    Wu, Dingming; Wang, Dongfang; Zhang, Michael Q; Gu, Jin

    2015-12-01

    One major goal of large-scale cancer omics studies is to identify molecular subtypes for more accurate cancer diagnoses and treatments. To deal with high-dimensional cancer multi-omics data, a promising strategy is to find an effective low-dimensional subspace of the original data and then cluster cancer samples in the reduced subspace. However, due to data-type diversity and big data volume, few methods can integratively and efficiently find the principal low-dimensional manifold of high-dimensional cancer multi-omics data. In this study, we proposed a novel low-rank approximation based integrative probabilistic model to quickly find the shared principal subspace across multiple data types: the convexity of the low-rank regularized likelihood function of the probabilistic model ensures efficient and stable model fitting. Candidate molecular subtypes can be identified by unsupervised clustering of hundreds of cancer samples in the reduced low-dimensional subspace. On testing datasets, our method LRAcluster (low-rank approximation based multi-omics data clustering) runs much faster with better clustering performance than the existing method. We then applied LRAcluster to large-scale cancer multi-omics data from TCGA. The pan-cancer analysis results show that cancers of different tissue origins are generally grouped as independent clusters, except for squamous-like carcinomas, while the single-cancer-type analyses suggest that the omics data have different subtyping abilities for different cancer types. LRAcluster is a very useful method for fast dimension reduction and unsupervised clustering of large-scale multi-omics data. LRAcluster is implemented in R and freely available at http://bioinfo.au.tsinghua.edu.cn/software/lracluster/.
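
    As a rough illustration of the reduce-then-cluster strategy described above (not LRAcluster's probabilistic model, which is distributed as an R package at the URL given), the following sketch standardizes and concatenates toy data types, projects samples onto a low-rank SVD subspace, and clusters them there; all data and settings are hypothetical.

      import numpy as np
      from sklearn.cluster import KMeans

      def lowrank_cluster(omics_blocks, rank=5, n_clusters=4, seed=0):
          """Concatenate standardized data types, project samples onto the top
          singular vectors, and cluster in the reduced subspace."""
          standardized = []
          for X in omics_blocks:                       # each X: samples x features
              mu, sd = X.mean(axis=0), X.std(axis=0) + 1e-9
              standardized.append((X - mu) / sd)
          joint = np.hstack(standardized)
          U, s, _ = np.linalg.svd(joint, full_matrices=False)
          reduced = U[:, :rank] * s[:rank]             # low-dimensional sample coordinates
          labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(reduced)
          return reduced, labels

      # Toy multi-omics data: 120 samples, two data types with different feature counts.
      rng = np.random.default_rng(1)
      expr = rng.standard_normal((120, 500))
      meth = rng.standard_normal((120, 300))
      coords, labels = lowrank_cluster([expr, meth], rank=5, n_clusters=3)
      print(coords.shape, np.bincount(labels))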

  11. A new hybrid meta-heuristic algorithm for optimal design of large-scale dome structures

    NASA Astrophysics Data System (ADS)

    Kaveh, A.; Ilchi Ghazaan, M.

    2018-02-01

    In this article a hybrid algorithm based on a vibrating particles system (VPS) algorithm, multi-design variable configuration (Multi-DVC) cascade optimization, and an upper bound strategy (UBS) is presented for global optimization of large-scale dome truss structures. The new algorithm is called MDVC-UVPS in which the VPS algorithm acts as the main engine of the algorithm. The VPS algorithm is one of the most recent multi-agent meta-heuristic algorithms mimicking the mechanisms of damped free vibration of single degree of freedom systems. In order to handle a large number of variables, cascade sizing optimization utilizing a series of DVCs is used. Moreover, the UBS is utilized to reduce the computational time. Various dome truss examples are studied to demonstrate the effectiveness and robustness of the proposed method, as compared to some existing structural optimization techniques. The results indicate that the MDVC-UVPS technique is a powerful search and optimization method for optimizing structural engineering problems.
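
    The upper bound strategy mentioned above can be illustrated independently of the VPS mechanics: a candidate design whose weight already exceeds the best feasible weight found so far is discarded before the expensive structural analysis is run. The sketch below embeds this idea in a plain random search with a hypothetical weight function and a stand-in feasibility check; it is not the MDVC-UVPS algorithm.

      import numpy as np

      rng = np.random.default_rng(2)
      n_vars = 20
      coef = rng.uniform(0.5, 2.0, n_vars)       # per-member weight coefficients (hypothetical)
      lower, upper = 0.1, 10.0

      def weight(x):
          """Cheap objective: total structural weight."""
          return float(coef @ x)

      def feasible(x):
          """Stand-in for an expensive structural analysis; here, designs lighter than
          an arbitrary limit are treated as violating stress/displacement constraints."""
          return weight(x) >= 60.0

      best_w, analyses = np.inf, 0
      for _ in range(20000):
          x = rng.uniform(lower, upper, n_vars)
          if weight(x) >= best_w:                # upper bound strategy: skip the costly analysis
              continue
          analyses += 1
          if feasible(x):
              best_w = weight(x)

      print(f"best feasible weight: {best_w:.2f} after {analyses} full analyses")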

  12. Next-Generation MDAC Discrimination Procedure Using Multi-Dimensional Spectral Analyses

    DTIC Science & Technology

    2007-09-01

    explosions near the Lop Nor, Novaya Zemlya, Semipalatinsk, Nevada, and Indian test sites. We have computed regional phase spectra and are correcting... test sites as mainly due to differences in explosion P and S corner frequencies. Fisk (2007) used source model fits to estimate Pn, Pg, and Lg corner... frequencies for Nevada Test Site (NTS) explosions and found that Lg corner frequencies exhibit similar scaling with source size as for Pn and Pg

  13. Structural Dynamics of Tropical Moist Forest Gaps

    PubMed Central

    Hunter, Maria O.; Keller, Michael; Morton, Douglas; Cook, Bruce; Lefsky, Michael; Ducey, Mark; Saleska, Scott; de Oliveira, Raimundo Cosme; Schietti, Juliana

    2015-01-01

    Gap phase dynamics are the dominant mode of forest turnover in tropical forests. However, gap processes are infrequently studied at the landscape scale. Airborne lidar data offer detailed information on three-dimensional forest structure, providing a means to characterize fine-scale (1 m) processes in tropical forests over large areas. Lidar-based estimates of forest structure (top down) differ from traditional field measurements (bottom up), and necessitate clear-cut definitions unencumbered by the wisdom of a field observer. We offer a new definition of a forest gap that is driven by forest dynamics and consistent with precise ranging measurements from airborne lidar data and tall, multi-layered tropical forest structure. We used 1000 ha of multi-temporal lidar data (2008, 2012) at two sites, the Tapajos National Forest and Ducke Reserve, to study gap dynamics in the Brazilian Amazon. Here, we identified dynamic gaps as contiguous areas of significant growth, that correspond to areas > 10 m2, with height <10 m. Applying the dynamic definition at both sites, we found over twice as much area in gap at Tapajos National Forest (4.8 %) as compared to Ducke Reserve (2.0 %). On average, gaps were smaller at Ducke Reserve and closed slightly more rapidly, with estimated height gains of 1.2 m y-1 versus 1.1 m y-1 at Tapajos. At the Tapajos site, height growth in gap centers was greater than the average height gain in gaps (1.3 m y-1 versus 1.1 m y-1). Rates of height growth between lidar acquisitions reflect the interplay between gap edge mortality, horizontal ingrowth and gap size at the two sites. We estimated that approximately 10 % of gap area closed via horizontal ingrowth at Ducke Reserve as opposed to 6 % at Tapajos National Forest. Height loss (interpreted as repeat damage and/or mortality) and horizontal ingrowth accounted for similar proportions of gap area at Ducke Reserve (13 % and 10 %, respectively). At Tapajos, height loss had a much stronger signal (23 % versus 6 %) within gaps. Both sites demonstrate limited gap contagiousness defined by an increase in the likelihood of mortality in the immediate vicinity (~6 m) of existing gaps. PMID:26168242

  14. Structural Dynamics of Tropical Moist Forest Gaps.

    PubMed

    Hunter, Maria O; Keller, Michael; Morton, Douglas; Cook, Bruce; Lefsky, Michael; Ducey, Mark; Saleska, Scott; de Oliveira, Raimundo Cosme; Schietti, Juliana

    2015-01-01

    Gap phase dynamics are the dominant mode of forest turnover in tropical forests. However, gap processes are infrequently studied at the landscape scale. Airborne lidar data offer detailed information on three-dimensional forest structure, providing a means to characterize fine-scale (1 m) processes in tropical forests over large areas. Lidar-based estimates of forest structure (top down) differ from traditional field measurements (bottom up), and necessitate clear-cut definitions unencumbered by the wisdom of a field observer. We offer a new definition of a forest gap that is driven by forest dynamics and consistent with precise ranging measurements from airborne lidar data and tall, multi-layered tropical forest structure. We used 1000 ha of multi-temporal lidar data (2008, 2012) at two sites, the Tapajos National Forest and Ducke Reserve, to study gap dynamics in the Brazilian Amazon. Here, we identified dynamic gaps as contiguous areas of significant growth, that correspond to areas > 10 m2, with height <10 m. Applying the dynamic definition at both sites, we found over twice as much area in gap at Tapajos National Forest (4.8%) as compared to Ducke Reserve (2.0%). On average, gaps were smaller at Ducke Reserve and closed slightly more rapidly, with estimated height gains of 1.2 m y-1 versus 1.1 m y-1 at Tapajos. At the Tapajos site, height growth in gap centers was greater than the average height gain in gaps (1.3 m y-1 versus 1.1 m y-1). Rates of height growth between lidar acquisitions reflect the interplay between gap edge mortality, horizontal ingrowth and gap size at the two sites. We estimated that approximately 10% of gap area closed via horizontal ingrowth at Ducke Reserve as opposed to 6% at Tapajos National Forest. Height loss (interpreted as repeat damage and/or mortality) and horizontal ingrowth accounted for similar proportions of gap area at Ducke Reserve (13% and 10%, respectively). At Tapajos, height loss had a much stronger signal (23% versus 6%) within gaps. Both sites demonstrate limited gap contagiousness defined by an increase in the likelihood of mortality in the immediate vicinity (~6 m) of existing gaps.
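
    A simplified, single-epoch version of the height/area gap criteria quoted above (contiguous canopy lower than 10 m over more than 10 m2) can be implemented with connected-component labelling of a canopy height model; the sketch below uses a synthetic raster and does not reproduce the multi-temporal, growth-based part of the definition.

      import numpy as np
      from scipy import ndimage

      def find_gaps(chm, pixel_area=1.0, height_thresh=10.0, min_area=10.0):
          """Label contiguous low-canopy patches in a canopy height model (CHM)
          and keep those larger than a minimum area."""
          low = chm < height_thresh
          labels, n = ndimage.label(low)                     # 4-connectivity by default
          sizes = ndimage.sum(low, labels, index=np.arange(1, n + 1)) * pixel_area
          keep = np.flatnonzero(sizes >= min_area) + 1       # label ids passing the area test
          gap_mask = np.isin(labels, keep)
          return gap_mask, sizes[keep - 1]

      # Toy 1-m resolution CHM: tall forest containing two low patches.
      rng = np.random.default_rng(3)
      chm = 25.0 + 5.0 * rng.random((100, 100))
      chm[10:14, 10:20] = 2.0       # 40 m^2 patch -> counted as a gap
      chm[50:52, 50:52] = 3.0       # 4 m^2 patch  -> below the 10 m^2 area threshold
      mask, areas = find_gaps(chm)
      print("gap fraction:", mask.mean(), "gap areas (m^2):", areas)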

  15. Cloud Size Distributions from Multi-sensor Observations of Shallow Cumulus Clouds

    NASA Astrophysics Data System (ADS)

    Kleiss, J.; Riley, E.; Kassianov, E.; Long, C. N.; Riihimaki, L.; Berg, L. K.

    2017-12-01

    Combined radar-lidar observations have been used for almost two decades to document temporal changes of shallow cumulus clouds at the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Facility's Southern Great Plains (SGP) site in Oklahoma, USA. Since the ARM zenith-pointing radars and lidars have a narrow field-of-view (FOV), the documented cloud statistics, such as distributions of cloud chord length (or horizontal length scale), represent only a slice along the wind direction of a region surrounding the SGP site, and thus may not be representative of this region. To investigate this potential sampling limitation, we compare cloud statistics obtained from wide-FOV sky images collected by ground-based observations at the SGP site to those from the narrow-FOV active sensors. The main wide-FOV cloud statistics considered are cloud area distributions of shallow cumulus clouds, which are frequently required to evaluate model performance, such as the routine large eddy simulations (LES) currently being conducted by the ARM LASSO (LES ARM Symbiotic Simulation and Observation) project. We obtain complementary macrophysical properties of shallow cumulus clouds, such as cloud chord length, base height and thickness, from the combined radar-lidar observations. To better understand the broader observational context in which these narrow-FOV cloud statistics occur, we compare them to collocated and coincident cloud area distributions from wide-FOV sky images and high-resolution satellite images. We discuss the comparison results and illustrate the possibility of generating a long-term climatology of cloud size distributions from multi-sensor observations at the SGP site.
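
    For reference, converting a narrow-FOV cloud/no-cloud time series into chord lengths is essentially a run-length calculation under a frozen-advection assumption; the sketch below is illustrative only, with hypothetical wind speed and sampling interval.

      import numpy as np

      def chord_lengths(cloud_mask, dt_s, wind_ms):
          """Convert a zenith-pointing cloud/no-cloud time series into chord lengths
          (duration of each cloudy run times the advection speed)."""
          mask = np.asarray(cloud_mask, dtype=bool).astype(int)
          edges = np.diff(np.concatenate(([0], mask, [0])))
          starts = np.flatnonzero(edges == 1)
          ends = np.flatnonzero(edges == -1)
          durations = (ends - starts) * dt_s
          return durations * wind_ms

      # Toy example: 1-s sampling, 8 m/s wind, two cloud passes of 30 s and 75 s.
      mask = [0] * 50 + [1] * 30 + [0] * 100 + [1] * 75 + [0] * 20
      print(chord_lengths(mask, dt_s=1.0, wind_ms=8.0))   # -> [240. 600.]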

  16. Quantifying climate changes of the Common Era for Finland

    NASA Astrophysics Data System (ADS)

    Luoto, Tomi P.; Nevalainen, Liisa

    2017-10-01

    In this study, we aim to quantify summer air temperatures from sediment records from Southern, Central and Northern Finland over the past 2000 years. We use lake sediment archives to estimate paleotemperatures by applying fossil Chironomidae assemblages and the transfer function approach. The enhanced Chironomidae-based temperature calibration set used here was validated against instrumentally measured temperatures in a 70-year high-resolution sediment record. Since the inferred and observed temperatures were closely correlated, we deduced that the new calibration model is reliable for reconstructions beyond the monitoring records. The 700-year temperature reconstructions from three sites at multi-decadal temporal resolution showed similar trends, although they differed in the timing of the cold Little Ice Age (LIA) and the initiation of recent warming. The 2000-year multi-centennial reconstructions from three different sites resembled one another, with clear signals of the Medieval Climate Anomaly (MCA) and the LIA, but differed in their timing. The influence of external forcing on the climate of the southern and central sites appeared to be complex at the decadal scale, whereas the North Atlantic Oscillation (NAO) was closely linked to the temperature development of the northern site. Solar activity appears to be synchronous with the temperature fluctuations at the multi-centennial scale at all sites. The present study provides new insights into centennial and decadal variability in air temperature dynamics in Northern Europe and into the external forcing behind these trends. These results are particularly useful for comparing regional responses and lags of temperature trends between different parts of Scandinavia.

  17. Characterising an intense PM pollution episode in March 2015 in France from multi-site approach and near real time data: Climatology, variabilities, geographical origins and model evaluation

    NASA Astrophysics Data System (ADS)

    Petit, J.-E.; Amodeo, T.; Meleux, F.; Bessagnet, B.; Menut, L.; Grenier, D.; Pellan, Y.; Ockler, A.; Rocq, B.; Gros, V.; Sciare, J.; Favez, O.

    2017-04-01

    During March 2015, a severe and large-scale particulate matter (PM) pollution episode occurred in France. Near real-time measurements of the major chemical components at four urban background sites across the country (Paris, Creil, Metz and Lyon) allowed the investigation of spatiotemporal variability during this episode. A climatological approach showed that all sites experienced a clear and unusual rain shortage, a pattern also found on longer timescales, highlighting the role of synoptic conditions over Western Europe. The episode was characterized by a strong predominance of secondary pollution, and more particularly of ammonium nitrate, which accounted for more than 50% of submicron aerosols at all sites during the most intense period of the episode. Pollution advection is illustrated by similar variability in Paris and Creil (around 100 km apart), as well as by trajectory analyses applied to nitrate and sulphate. Local sources, especially wood burning, are nevertheless found to contribute to local/regional sub-episodes, notably in Metz. Finally, concentrations simulated by the chemistry-transport model CHIMERE were compared to the observations. The results highlighted different patterns depending on the chemical component and the measuring site, reinforcing the need for such exercises over other pollution episodes and sites.

  18. Predicting Species Distributions Using Record Centre Data: Multi-Scale Modelling of Habitat Suitability for Bat Roosts.

    PubMed

    Bellamy, Chloe; Altringham, John

    2015-01-01

    Conservation increasingly operates at the landscape scale. For this to be effective, we need landscape scale information on species distributions and the environmental factors that underpin them. Species records are becoming increasingly available via data centres and online portals, but they are often patchy and biased. We demonstrate how such data can yield useful habitat suitability models, using bat roost records as an example. We analysed the effects of environmental variables at eight spatial scales (500 m - 6 km) on roost selection by eight bat species (Pipistrellus pipistrellus, P. pygmaeus, Nyctalus noctula, Myotis mystacinus, M. brandtii, M. nattereri, M. daubentonii, and Plecotus auritus) using the presence-only modelling software MaxEnt. Modelling was carried out on a selection of 418 data centre roost records from the Lake District National Park, UK. Target group pseudoabsences were selected to reduce the impact of sampling bias. Multi-scale models, combining variables measured at their best performing spatial scales, were used to predict roosting habitat suitability, yielding models with useful predictive abilities. Small areas of deciduous woodland consistently increased roosting habitat suitability, but other habitat associations varied between species and scales. Pipistrellus were positively related to built environments at small scales, and depended on large-scale woodland availability. The other, more specialist, species were highly sensitive to human-altered landscapes, avoiding even small rural towns. The strength of many relationships at large scales suggests that bats are sensitive to habitat modifications far from the roost itself. The fine resolution, large extent maps will aid targeted decision-making by conservationists and planners. We have made available an ArcGIS toolbox that automates the production of multi-scale variables, to facilitate the application of our methods to other taxa and locations. Habitat suitability modelling has the potential to become a standard tool for supporting landscape-scale decision-making as relevant data and open source, user-friendly, and peer-reviewed software become widely available.
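
    A rough sketch of the multi-scale variable idea is given below: focal means of a habitat layer are computed at several (square-window) radii and attached to presence and background points. A logistic regression stands in for MaxEnt here, and the raster, points and radii are all synthetic assumptions.

      import numpy as np
      from scipy.ndimage import uniform_filter
      from sklearn.linear_model import LogisticRegression

      def multi_scale_stack(layer, radii_px):
          """Focal means of a habitat layer at several radii (square windows here;
          the study used circular buffers from 500 m to 6 km)."""
          return np.stack([uniform_filter(layer, size=2 * r + 1) for r in radii_px])

      rng = np.random.default_rng(4)
      woodland = (rng.random((200, 200)) < 0.2).astype(float)    # toy woodland raster
      stack = multi_scale_stack(woodland, radii_px=[2, 8, 32])   # three nested scales

      # Presence cells biased toward woodland-rich areas (toy signal), plus random background.
      score = stack[2]                                           # largest-scale focal cover
      flat = rng.choice(score.size, size=150, replace=False, p=(score / score.sum()).ravel())
      pres = np.column_stack(np.unravel_index(flat, score.shape))
      back = rng.integers(0, 200, size=(1000, 2))

      X = np.vstack([stack[:, pres[:, 0], pres[:, 1]].T, stack[:, back[:, 0], back[:, 1]].T])
      y = np.concatenate([np.ones(len(pres)), np.zeros(len(back))])
      model = LogisticRegression(max_iter=1000).fit(X, y)
      print("per-scale coefficients:", model.coef_.round(2))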

  19. Standardization of fluorine-18 manufacturing processes: new scientific challenges for PET.

    PubMed

    Hjelstuen, Ole K; Svadberg, Anders; Olberg, Dag E; Rosser, Mark

    2011-08-01

    In [(18)F]fluoride chemistry, the minute amounts of radioactivity taking part in a radiolabeling reaction are easily outnumbered by other reactants. Surface areas become comparably larger and more influential than in standard fluorine chemistry, while leachables, extractables, and other components that normally are considered small impurities can have a considerable influence on the efficiency of the reaction. A number of techniques exist to give sufficient (18)F-tracer for a study in a pre-clinical or clinical system, but the chemical and pharmaceutical understanding has significant gaps when it comes to scaling up or making the reaction more efficient. Automation and standardization of [(18)F]fluoride PET tracers is a prerequisite for reproducible manufacturing across multiple PET centers. So far, large-scale, multi-site manufacture has been established only for [(18)F]FDG, but several new tracers are emerging. In general terms, this transition from small- to large-scale production has disclosed several scientific challenges that need to be addressed. There are still areas of limited knowledge in the fundamental [(18)F]fluoride chemistry. The role of pharmaceutical factors that could influence the (18)F-radiosynthesis and the gaps in precise chemistry knowledge are discussed in this review based on a normal synthesis pattern. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. Multi-filter spectrophotometry simulations

    NASA Technical Reports Server (NTRS)

    Callaghan, Kim A. S.; Gibson, Brad K.; Hickson, Paul

    1993-01-01

    To complement both the multi-filter observations of quasar environments described in these proceedings, as well as the proposed UBC 2.7 m Liquid Mirror Telescope (LMT) redshift survey, we have initiated a program of simulated multi-filter spectrophotometry. The goal of this work, still very much in progress, is a better quantitative assessment of the multiband technique as a viable mechanism for obtaining useful redshift and morphological class information from large scale multi-filter surveys.

  1. Brain white matter changes associated with urological chronic pelvic pain syndrome: Multi-site neuroimaging from a MAPP case-control study

    PubMed Central

    Huang, Lejian; Kutch, Jason J.; Ellingson, Benjamin M.; Martucci, Katherine T.; Harris, Richard E.; Clauw, Daniel J.; Mackey, Sean; Mayer, Emeran A.; Schaeffer, Anthony J.; Apkarian, A. Vania; Farmer, Melissa A.

    2016-01-01

    Clinical phenotyping of urological chronic pelvic pain syndromes (UCPPS) in men and women has focused on end-organ abnormalities to identify putative clinical subtypes. Initial evidence of abnormal brain function and structure in male pelvic pain has necessitated large-scale, multi-site investigations into potential UCPPS brain biomarkers. We present the first evidence of regional white matter (axonal) abnormalities in men and women with UCPPS, compared to positive controls (irritable bowel syndrome, IBS) and healthy controls. Epidemiological and neuroimaging data were collected from participants with UCPPS (n=52), IBS (n=39), and healthy, sex- and age-matched controls (n=61). White matter microstructure, measured as fractional anisotropy (FA), was examined with diffusion tensor imaging (DTI). Group differences in regional FA positively correlated with pain severity, including segments of the right corticospinal tract and right anterior thalamic radiation. Increased corticospinal FA was specific and sensitive to UCPPS, positively correlated with pain severity, and reflected sensory (not affective) features of pain. Reduced anterior thalamic radiation FA distinguished IBS from UCPPS patients and controls, suggesting greater microstructural divergence from normal tract organization. Findings confirm that regional white matter abnormalities characterize UCPPS and can distinguish between visceral diagnoses, suggesting that regional axonal microstructure is either altered with ongoing pain or predisposes its development. PMID:27842046

  2. The geologic setting of the Luna 16 landing site

    USGS Publications Warehouse

    McCauley, J.F.; Scott, D.H.

    1972-01-01

    The Luna 16 landing site is similar in its geologic setting to those of Apollos 11 and 12. All three sites are located on basaltic mare fill which occurs mostly within multi-ring basins formed by impact earlier in the moon's history. A regolith developed by impact bombardment is present at each of these sites. The regolith is composed mostly of locally derived volcanic material, but also contains exotic fine fragments that have been ballistically transported into the landing sites by large impact events which formed craters such as Langrenus and Copernicus. These exotic fragments probably consist mostly of earlier reworked multi-ring basin debris and, although not directly traceable to individual sources, they do represent a good statistical sample of the composition of most of the premare terra regions. © 1972.

  3. Observed Benefits to On-site Medical Services during an Annual 5-day Electronic Dance Music Event with Harm Reduction Services.

    PubMed

    Munn, Matthew Brendan; Lund, Adam; Golby, Riley; Turris, Sheila A

    2016-04-01

    With increasing attendance and media attention, large-scale electronic dance music events (EDMEs) are a subset of mass gatherings that have a unique risk profile for attendees and promoters. Shambhala Music Festival (Canada) is a multi-day event in a rural setting with a recognized history of providing harm reduction (HR) services alongside medical care. Study/Objective: This manuscript describes the medical response at a multi-day electronic music festival where on-site HR interventions and dedicated medical care are delivered as parallel public health measures. This study was a descriptive case report. Medical encounters and event-related data were documented prospectively using an established event registry database. In 2014, Shambhala Music Festival had 67,120 cumulative attendees over a 7-day period, with a peak daily attendance of 15,380 people. There were 1,393 patient encounters and the patient presentation rate (PPR) was 20.8 per one thousand. The majority of these (90.9%) were for non-urgent complaints. The ambulance transfer rate (ATR) was 0.194 per one thousand and 0.93% of patient encounters were transferred by ambulance. No patients required intubation and there were no fatalities. Harm reduction services included mobile outreach teams, distribution of educational materials, pill checking facilities, a dedicated women's space, and a "Sanctuary" area that provided non-medical peer support for overwhelmed guests. More than 10,000 encounters were recorded by mobile and booth-based preventive and educational services, and 2,786 pills were checked on-site with a seven percent discard rate. Dedicated medical and HR services represent two complementary public health strategies to minimize risk at a multi-day electronic music festival. The specific extent to which HR strategies reduce the need for medical care is not well understood. Incorporation of HR practices when planning on-site medical care has the potential to inform patient management, reduce presentation rates and acuity, and decrease utilization and cost for local, community-based health services.

  4. PKI security in large-scale healthcare networks.

    PubMed

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many Public Key Infrastructures (PKIs) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face numerous challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI addresses the trust issues that arise in a large-scale healthcare network, including multi-domain PKI deployments.

  5. Multi-scale occupancy estimation and modelling using multiple detection methods

    USGS Publications Warehouse

    Nichols, James D.; Bailey, Larissa L.; O'Connell, Allan F.; Talancy, Neil W.; Grant, Evan H. Campbell; Gilbert, Andrew T.; Annand, Elizabeth M.; Husband, Thomas P.; Hines, James E.

    2008-01-01

    Occupancy estimation and modelling based on detection–nondetection data provide an effective way of exploring change in a species’ distribution across time and space in cases where the species is not always detected with certainty. Today, many monitoring programmes target multiple species, or life stages within a species, requiring the use of multiple detection methods. When multiple methods or devices are used at the same sample sites, animals can be detected by more than one method. We develop occupancy models for multiple detection methods that permit simultaneous use of data from all methods for inference about method-specific detection probabilities. Moreover, the approach permits estimation of occupancy at two spatial scales: the larger scale corresponds to species’ use of a sample unit, whereas the smaller scale corresponds to presence of the species at the local sample station or site. We apply the models to data collected on two different vertebrate species: striped skunks Mephitis mephitis and red salamanders Pseudotriton ruber. For striped skunks, large-scale occupancy estimates were consistent between two sampling seasons. Small-scale occupancy probabilities were slightly lower in the late winter/spring when skunks tend to conserve energy, and movements are limited to males in search of females for breeding. There was strong evidence of method-specific detection probabilities for skunks. As anticipated, large- and small-scale occupancy areas completely overlapped for red salamanders. The analyses provided weak evidence of method-specific detection probabilities for this species. Synthesis and applications: Increasingly, many studies are utilizing multiple detection methods at sampling locations. The modelling approach presented here makes efficient use of detections from multiple methods to estimate occupancy probabilities at two spatial scales and to compare detection probabilities associated with different detection methods. The models can be viewed as another variation of Pollock's robust design and may be applicable to a wide variety of scenarios where species occur in an area but are not always near the sampled locations. The estimation approach is likely to be especially useful in multispecies conservation programmes by providing efficient estimates using multiple detection devices and by providing device-specific detection probability estimates for use in survey design.
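
    A stripped-down version of the two-scale, multi-method likelihood described above (constant parameters, no covariates, independent methods) can be written down and fitted directly; the sketch below simulates data and recovers the large-scale occupancy, small-scale presence, and method-specific detection probabilities. It is a simplification under those assumptions, not the full robust-design formulation of the paper.

      import numpy as np
      from scipy.optimize import minimize

      def neg_log_lik(params, y):
          """y: detections with shape (sites, occasions, methods), 0/1.
          psi   = large-scale occupancy (use of the sample unit)
          theta = small-scale presence at the station during an occasion
          p[m]  = method-specific detection probability."""
          z = 1.0 / (1.0 + np.exp(-np.asarray(params)))          # logit -> probability
          psi, theta, p = z[0], z[1], z[2:]
          det = np.where(y == 1, p, 1.0 - p).prod(axis=2)         # per-occasion detection prob
          none = (y.sum(axis=2) == 0)
          occ_term = theta * det + (1.0 - theta) * none           # per occasion, given use
          site_lik = psi * occ_term.prod(axis=1) + (1.0 - psi) * none.all(axis=1)
          return -np.log(site_lik).sum()

      # Simulate data and recover the parameters.
      rng = np.random.default_rng(5)
      S, T, M = 300, 3, 2
      psi0, theta0, p0 = 0.6, 0.7, np.array([0.5, 0.3])
      use = rng.random(S) < psi0
      there = (rng.random((S, T)) < theta0) & use[:, None]
      y = (rng.random((S, T, M)) < p0) & there[:, :, None]
      res = minimize(neg_log_lik, np.zeros(2 + M), args=(y.astype(int),), method="Nelder-Mead")
      print(1.0 / (1.0 + np.exp(-res.x)))   # roughly [psi, theta, p1, p2]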

  6. A general Bayesian framework for calibrating and evaluating stochastic models of annual multi-site hydrological data

    NASA Astrophysics Data System (ADS)

    Frost, Andrew J.; Thyer, Mark A.; Srikanthan, R.; Kuczera, George

    2007-07-01

    Multi-site simulation of hydrological data is required for drought risk assessment of large multi-reservoir water supply systems. In this paper, a general Bayesian framework is presented for the calibration and evaluation of stochastic models of multi-site hydrological data at annual timescales. Models included within this framework are the hidden Markov model (HMM) and the widely used lag-1 autoregressive (AR(1)) model. These models are extended by the inclusion of a Box-Cox transformation and a spatial correlation function in a multi-site setting. Parameter uncertainty is evaluated using Markov chain Monte Carlo techniques. Models are evaluated by their ability to reproduce a range of important extreme statistics and are compared using Bayesian model selection techniques which evaluate model probabilities. The case study, using multi-site annual rainfall data from catchments which contribute to Sydney's main water supply, provided the following results. Firstly, in terms of model probabilities and diagnostics, the inclusion of the Box-Cox transformation was preferred. Secondly, the AR(1) and HMM performed similarly, while some other proposed AR(1)/HMM models with regionally pooled parameters had greater posterior probability than these two models. The practical significance of parameter and model uncertainty was illustrated using a case study involving drought security analysis for urban water supply. It was shown that ignoring parameter uncertainty resulted in a significant overestimate of reservoir yield and an underestimation of system vulnerability to severe drought.
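
    The generating structure being calibrated (lag-1 autoregression in Box-Cox space with spatially correlated innovations) can be sketched as a forward simulation; the example below uses illustrative parameter values and says nothing about the Bayesian (MCMC) calibration itself.

      import numpy as np

      def simulate_multisite_ar1(n_years, mu, rho, lam, site_corr, seed=0):
          """Lag-1 autoregressive simulation in Box-Cox space with spatially
          correlated innovations; back-transform to the data scale (lam != 0)."""
          rng = np.random.default_rng(seed)
          L = np.linalg.cholesky(site_corr)
          n_sites = site_corr.shape[0]
          z = np.zeros((n_years, n_sites))
          z[0] = L @ rng.standard_normal(n_sites)
          for t in range(1, n_years):
              innov = L @ rng.standard_normal(n_sites) * np.sqrt(1.0 - rho ** 2)
              z[t] = rho * z[t - 1] + innov
          y = mu + z                                  # values on the Box-Cox (transformed) scale
          return (lam * y + 1.0) ** (1.0 / lam)       # inverse Box-Cox transform

      corr = np.array([[1.0, 0.8, 0.6],
                       [0.8, 1.0, 0.7],
                       [0.6, 0.7, 1.0]])
      sim = simulate_multisite_ar1(1000, mu=5.0, rho=0.3, lam=0.2, site_corr=corr)
      print(sim.shape, np.corrcoef(sim.T).round(2))   # lag-0 cross-site correlation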

  7. First Large-Scale Proteogenomic Study of Breast Cancer Provides Insight into Potential Therapeutic Targets | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    News Release: May 25, 2016 — Building on data from The Cancer Genome Atlas (TCGA) project, a multi-institutional team of scientists has completed the first large-scale “proteogenomic” study of breast cancer, linking DNA mutations to protein signaling and helping pinpoint the genes that drive cancer.

  8. Multi-Scale Computational Enzymology: Enhancing Our Understanding of Enzymatic Catalysis

    PubMed Central

    Gherib, Rami; Dokainish, Hisham M.; Gauld, James W.

    2014-01-01

    Elucidating the origin of enzymatic catalysis stands as one of the great challenges of contemporary biochemistry and biophysics. The recent emergence of computational enzymology has enhanced our atomistic-level description of biocatalysis as well as of the kinetic and thermodynamic properties of enzymatic mechanisms. There exists a diversity of computational methods allowing the investigation of specific enzymatic properties. Small or large density functional theory models allow the comparison of a plethora of mechanistic reactive species and divergent catalytic pathways. Molecular docking can model different substrate conformations embedded within enzyme active sites and determine those with optimal binding affinities. Molecular dynamics simulations provide insights into the dynamics and roles of active site components as well as the interactions between substrates and enzymes. Hybrid quantum mechanical/molecular mechanical (QM/MM) methods can model reactions in active sites while considering steric and electrostatic contributions provided by the surrounding environment. Using previous studies done within our group on the OvoA, EgtB, ThrRS, LuxS and MsrA enzymatic systems, we review how these methods can be used either independently or cooperatively to gain insights into enzymatic catalysis. PMID:24384841

  9. Microphysics in the Multi-Scale Modeling Systems with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lang, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2011-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic, physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional scale model (the NASA unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-surface interactive processes are applied throughout this multi-scale modeling system. The modeling system has been coupled with a multi-satellite simulator so that NASA high-resolution satellite data can be used to identify the strengths and weaknesses of the cloud and precipitation processes simulated by the model. In this talk, the microphysics developments of the multi-scale modeling system will be presented; in particular, results from using the multi-scale modeling system to study heavy precipitation processes will be shown.

  10. Epidemiology and Molecular Biology of Head and Neck Cancer.

    PubMed

    Jou, Adriana; Hess, Jochen

    2017-01-01

    Head and neck cancer is a common and aggressive malignancy with a high morbidity and mortality profile. Although the large majority of cases resemble head and neck squamous cell carcinoma (HNSCC), the current classification based on anatomic site and tumor stage fails to capture the high level of biologic heterogeneity, and appropriate clinical management remains a major challenge. Hence, a better understanding of the molecular biology of HNSCC is urgently needed to support biomarker development and personalized care for patients. This review focuses on recent findings based on integrative genomics analysis and multi-scale modeling approaches and how they are beginning to provide more sophisticated clues as to the biological and clinical diversity of HNSCC. © 2017 S. Karger GmbH, Freiburg.

  11. A Feature-based Approach to Big Data Analysis of Medical Images

    PubMed Central

    Toews, Matthew; Wachinger, Christian; Estepar, Raul San Jose; Wells, William M.

    2015-01-01

    This paper proposes an inference method well-suited to large sets of medical images. The method is based upon a framework where distinctive 3D scale-invariant features are indexed efficiently to identify approximate nearest-neighbor (NN) feature matches in O(log N) computational complexity in the number of images N. It thus scales well to large data sets, in contrast to methods based on pair-wise image registration or feature matching requiring O(N) complexity. Our theoretical contribution is a density estimator based on a generative model that generalizes kernel density estimation and K-nearest neighbor (KNN) methods. The estimator can be used for on-the-fly queries, without requiring explicit parametric models or an off-line training phase. The method is validated on a large multi-site data set of 95,000,000 features extracted from 19,000 lung CT scans. Subject-level classification identifies all images of the same subjects across the entire data set despite deformation due to breathing state, including unintentional duplicate scans. State-of-the-art performance is achieved in predicting chronic pulmonary obstructive disorder (COPD) severity across the 5-category GOLD clinical rating, with an accuracy of 89% if both exact and one-off predictions are considered correct. PMID:26221685

  12. A Feature-Based Approach to Big Data Analysis of Medical Images.

    PubMed

    Toews, Matthew; Wachinger, Christian; Estepar, Raul San Jose; Wells, William M

    2015-01-01

    This paper proposes an inference method well-suited to large sets of medical images. The method is based upon a framework where distinctive 3D scale-invariant features are indexed efficiently to identify approximate nearest-neighbor (NN) feature matches in O(log N) computational complexity in the number of images N. It thus scales well to large data sets, in contrast to methods based on pair-wise image registration or feature matching requiring O(N) complexity. Our theoretical contribution is a density estimator based on a generative model that generalizes kernel density estimation and K-nearest neighbor (KNN) methods. The estimator can be used for on-the-fly queries, without requiring explicit parametric models or an off-line training phase. The method is validated on a large multi-site data set of 95,000,000 features extracted from 19,000 lung CT scans. Subject-level classification identifies all images of the same subjects across the entire data set despite deformation due to breathing state, including unintentional duplicate scans. State-of-the-art performance is achieved in predicting chronic pulmonary obstructive disorder (COPD) severity across the 5-category GOLD clinical rating, with an accuracy of 89% if both exact and one-off predictions are considered correct.
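
    The indexing-plus-kNN-density idea can be approximated with an off-the-shelf KD-tree, as in the sketch below; this is a generic stand-in for the paper's feature-indexing scheme, the data are random, and the density estimate is written only up to a dimension-dependent constant.

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(6)
      db_features = rng.standard_normal((100_000, 64))     # descriptors pooled from many images
      query = rng.standard_normal((500, 64))               # descriptors from a new scan

      tree = cKDTree(db_features)                          # index for fast NN queries
      k = 10
      dist, idx = tree.query(query, k=k)                   # k nearest neighbours per feature

      # Simple kNN density estimate: density ~ k / (n * volume of the kth-NN ball),
      # written here up to the dimension-dependent ball-volume constant.
      d = db_features.shape[1]
      log_density = np.log(k) - np.log(len(db_features)) - d * np.log(dist[:, -1] + 1e-12)
      print(idx.shape, log_density[:5].round(2))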

  13. Field Exploration and Life Detection Sampling Through Planetary Analogue Sampling (FELDSPAR).

    NASA Technical Reports Server (NTRS)

    Stockton, A.; Amador, E. S.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z.; Gentry, D. M.; Kirby, J.; Jacobsen, M.

    2017-01-01

    Exploration missions to Mars rely on rovers to perform analyses over small sampling areas; however, landing sites for these missions are selected based on large-scale, low-resolution remote data. The use of Earth analogue environments to estimate the multi-scale spatial distributions of key signatures of habitability can help ensure mission science goals are met. A main goal of FELDSPAR is to conduct field operations analogous to Mars sample return in its science, operations, and technology, from landing site selection to in-field sampling location selection, remote or stand-off analysis, in situ analysis, and home laboratory analysis. Lava fields and volcanic regions are relevant analogues to Martian landscapes due to desiccation, low nutrient availability, and temperature extremes. Operationally, many Icelandic lava fields are remote enough to require that field expeditions address several sampling constraints that are experienced in robotic exploration, including in situ and sample return missions. The Fimmvörðuháls lava field was formed by a basaltic effusive eruption associated with the 2010 Eyjafjallajökull eruption. Mælifellssandur is a recently deglaciated plain to the north of the Mýrdalsjökull glacier. Holuhraun was formed by a 2014 fissure eruption just north of the large Vatnajökull glacier. Dyngjusandur is an alluvial plain apparently kept barren by repeated mechanical weathering. Informed by our 2013 expedition, we collected samples in nested triangular grids at every decade of scale from 10 cm to 1 km (as permitted by the size of the site). Satellite imagery is available for older sites, and for Mælifellssandur, Holuhraun, and Dyngjusandur we obtained overhead imagery at 1 m to 200 m elevation. PanCam-style photographs were taken in the field by sampling personnel. In-field reflectance spectroscopy was also obtained with an ASD spectrometer at Dyngjusandur. All sites chosen were 'homogeneous' in apparent color, morphology, moisture, grain size, and reflectance spectra at all scales greater than 10 cm. Field lab assays were conducted to monitor microbial habitation, including ATP quantification, qPCR for fungal, bacterial, and archaeal DNA, and direct cell imaging using fluorescence microscopy. Home laboratory analyses include Raman spectroscopy and community sequencing. ATP appeared to be significantly more sensitive to small changes in sampling location than qPCR or fluorescence microscopy. Bacterial and archaeal DNA content were more consistent at the smaller scales, but similarly variable across more distant sites. Conversely, cell counts and fungal DNA content have significant local variation but appear relatively homogeneous over scales of 1 km. ATP, bacterial DNA, and archaeal DNA content were relatively well correlated at many spatial scales. While we have observed spatial variation at various scales and are beginning to observe how that variation fluctuates over time as biodiversity recovers after an eruption, we do not yet fully understand what parameters lead to the observed spatial variation. Home laboratory analyses will help us further understand the elemental and structural composition of the basaltic matrices, but further field analyses are vital for understanding how temperature, moisture, incident radiation, and so forth influence the habitability of a microclimate.

  14. Large-N Seismic Deployment at the Source Physics Experiment (SPE) Site

    NASA Astrophysics Data System (ADS)

    Chen, T.; Snelson, C. M.; Mellors, R. J.; Pitarka, A.

    2015-12-01

    The Source Physics Experiment (SPE) is a multi-institutional and multi-disciplinary project that consists of a series of chemical explosion experiments at the Nevada National Security Site. The goal of SPE is to understand the complicated effect of earth structures on source energy partitioning and seismic wave propagation, to develop and validate physics-based monitoring, and ultimately to better discriminate low-yield nuclear explosions from background seismicity. Deployment of a large number of seismic sensors is planned for SPE to image the full 3-D wavefield with about 500 three-component sensors and 500 vertical-component sensors. This large-N seismic deployment will operate near the site of the SPE-5 shot for about one month, recording the SPE-5 shot, ambient noise, and additional controlled sources. This presentation focuses on the design of the large-N seismic deployment. We show how we optimized the sensor layout, given a limited number of sensors, based on the geological structure and experiment goals. In addition, we show some preliminary record sections from the deployment. This work was conducted under Contract No. DE-AC52-06NA25946 with the U.S. Department of Energy.

  15. The Growth of Multi-Site Fatigue Damage in Fuselage Lap Joints

    NASA Technical Reports Server (NTRS)

    Piascik, Robert S.; Willard, Scott A.

    1999-01-01

    Destructive examinations were performed to document the progression of multi-site damage (MSD) in three lap joint panels removed from a full-scale fuselage test article that had been tested to 60,000 full pressurization cycles. Similar fatigue crack growth characteristics were observed for small cracks (50 microns to 10 mm) emanating from counterbore rivets, straight-shank rivets, and 100 deg countersink rivets. Good correlation between the fatigue crack growth database obtained in this study and FASTRAN code predictions shows that the growth of MSD in fuselage lap joint structure can be predicted by fracture-mechanics-based methods.

  16. Action detection by double hierarchical multi-structure space-time statistical matching model

    NASA Astrophysics Data System (ADS)

    Han, Jing; Zhu, Junwei; Cui, Yiyin; Bai, Lianfa; Yue, Jiang

    2018-03-01

    To address the complex information in videos and the low efficiency of existing detectors, an action detection model based on a neighboring Gaussian structure and 3D LARK features is put forward. We exploit a double hierarchical multi-structure space-time statistical matching model (DMSM) for temporal action localization. First, a neighboring Gaussian structure is presented to describe the multi-scale structural relationship. Then, a space-time statistical matching method is proposed to obtain two similarity matrices, on large and small scales, combining double hierarchical structural constraints in the model through both the neighboring Gaussian structure and the 3D LARK local structure. Finally, the double hierarchical similarity is fused and analyzed to detect actions. In addition, a multi-scale composite template extends the model to multi-view applications. Experimental results of DMSM on the complex visual tracker benchmark data sets and the THUMOS 2014 data sets show promising performance. Compared with other state-of-the-art algorithms, DMSM achieves superior performance.

  17. Action detection by double hierarchical multi-structure space–time statistical matching model

    NASA Astrophysics Data System (ADS)

    Han, Jing; Zhu, Junwei; Cui, Yiyin; Bai, Lianfa; Yue, Jiang

    2018-06-01

    To address the complex information in videos and the low efficiency of existing detectors, an action detection model based on a neighboring Gaussian structure and 3D LARK features is put forward. We exploit a double hierarchical multi-structure space-time statistical matching model (DMSM) for temporal action localization. First, a neighboring Gaussian structure is presented to describe the multi-scale structural relationship. Then, a space-time statistical matching method is proposed to obtain two similarity matrices, on large and small scales, combining double hierarchical structural constraints in the model through both the neighboring Gaussian structure and the 3D LARK local structure. Finally, the double hierarchical similarity is fused and analyzed to detect actions. In addition, a multi-scale composite template extends the model to multi-view applications. Experimental results of DMSM on the complex visual tracker benchmark data sets and the THUMOS 2014 data sets show promising performance. Compared with other state-of-the-art algorithms, DMSM achieves superior performance.

  18. Of cattle and feasts: Multi-isotope investigation of animal husbandry and communal feasting at Neolithic Makriyalos, northern Greece.

    PubMed

    Vaiglova, Petra; Halstead, Paul; Pappa, Maria; Triantaphyllou, Sevi; Valamoti, Soultana M; Evans, Jane; Fraser, Rebecca; Karkanas, Panagiotis; Kay, Andrea; Lee-Thorp, Julia; Bogaard, Amy

    2018-01-01

    The aim of this study is to investigate livestock husbandry and its relationship to the mobilization of domestic animals for slaughter at large communal feasting events, in Late Neolithic Makriyalos, northern Greece. A multi-isotope approach is built that integrates analysis of: (1) δ13C and δ15N values of human and animal bone collagen, for understanding long-term dietary behavior; (2) incremental δ13C and δ18O values of domestic animal tooth enamel carbonate, for assessing seasonal patterns in grazing habits and mobility; and (3) 87Sr/86Sr ratios of cattle tooth enamel, for examining the possibility that some of the animals consumed at the site were born outside the local environment. The findings indicate that cattle had isotopically more variable diets than sheep, which may reflect grazing over a wider catchment area in the local landscape. Cattle products did not make a significant contribution to the long-term dietary protein intake of the humans, which may indicate that they were primarily consumed during episodic feasting events. There is no indication that pasturing of livestock was pre-determined by their eventual context of slaughter (i.e. large-scale feasting vs. more routine consumption events). Two non-local cattle identified among those deposited in a feasting context may have been brought to the site as contributions to these feasts. The evidence presented provides a more detailed insight into local land use and into the role of livestock and feasting in forging social relationships within the regional human population.

  19. Assembling Large, Multi-Sensor Climate Datasets Using the SciFlo Grid Workflow System

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Fetzer, E.

    2008-12-01

    NASA's Earth Observing System (EOS) is the world's most ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the A-Train platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the cloud scenes from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover & access multiple datasets from remote sites, find the space/time matchups between instruments swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, and assemble merged datasets for further scientific and statistical analysis. To meet these large-scale challenges, we are utilizing a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data query, access, subsetting, co-registration, mining, fusion, and advanced statistical analysis. SciFlo is a semantically-enabled ("smart") Grid Workflow system that ties together a peer-to-peer network of computers into an efficient engine for distributed computation. The SciFlo workflow engine enables scientists to do multi-instrument Earth Science by assembling remotely-invokable Web Services (SOAP or http GET URLs), native executables, command-line scripts, and Python codes into a distributed computing flow. A scientist visually authors the graph of operation in the VizFlow GUI, or uses a text editor to modify the simple XML workflow documents. The SciFlo client & server engines optimize the execution of such distributed workflows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The engine transparently moves data to the operators, and moves operators to the data (on the dozen trusted SciFlo nodes). SciFlo also deploys a variety of Data Grid services to: query datasets in space and time, locate & retrieve on-line data granules, provide on-the-fly variable and spatial subsetting, and perform pairwise instrument matchups for A-Train datasets. These services are combined into efficient workflows to assemble the desired large-scale, merged climate datasets. SciFlo is currently being applied in several large climate studies: comparisons of aerosol optical depth between MODIS, MISR, AERONET ground network, and U. Michigan's IMPACT aerosol transport model; characterization of long-term biases in microwave and infrared instruments (AIRS, MLS) by comparisons to GPS temperature retrievals accurate to 0.1 degrees Kelvin; and construction of a decade-long, multi-sensor water vapor climatology stratified by classified cloud scene by bringing together datasets from AIRS/AMSU, AMSR-E, MLS, MODIS, and CloudSat (NASA MEASUREs grant, Fetzer PI). The presentation will discuss the SciFlo technologies, their application in these distributed workflows, and the many challenges encountered in assembling and analyzing these massive datasets.

  20. Assembling Large, Multi-Sensor Climate Datasets Using the SciFlo Grid Workflow System

    NASA Astrophysics Data System (ADS)

    Wilson, B.; Manipon, G.; Xing, Z.; Fetzer, E.

    2009-04-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover & access multiple datasets from remote sites, find the space/time "matchups" between instruments swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, assemble merged datasets, and compute fused products for further scientific and statistical analysis. To meet these large-scale challenges, we are utilizing a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data query, access, subsetting, co-registration, mining, fusion, and advanced statistical analysis. SciFlo is a semantically-enabled ("smart") Grid Workflow system that ties together a peer-to-peer network of computers into an efficient engine for distributed computation. The SciFlo workflow engine enables scientists to do multi-instrument Earth Science by assembling remotely-invokable Web Services (SOAP or http GET URLs), native executables, command-line scripts, and Python codes into a distributed computing flow. A scientist visually authors the graph of operation in the VizFlow GUI, or uses a text editor to modify the simple XML workflow documents. The SciFlo client & server engines optimize the execution of such distributed workflows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The engine transparently moves data to the operators, and moves operators to the data (on the dozen trusted SciFlo nodes). SciFlo also deploys a variety of Data Grid services to: query datasets in space and time, locate & retrieve on-line data granules, provide on-the-fly variable and spatial subsetting, perform pairwise instrument matchups for A-Train datasets, and compute fused products. These services are combined into efficient workflows to assemble the desired large-scale, merged climate datasets. SciFlo is currently being applied in several large climate studies: comparisons of aerosol optical depth between MODIS, MISR, AERONET ground network, and U. Michigan's IMPACT aerosol transport model; characterization of long-term biases in microwave and infrared instruments (AIRS, MLS) by comparisons to GPS temperature retrievals accurate to 0.1 degrees Kelvin; and construction of a decade-long, multi-sensor water vapor climatology stratified by classified cloud scene by bringing together datasets from AIRS/AMSU, AMSR-E, MLS, MODIS, and CloudSat (NASA MEASUREs grant, Fetzer PI). 
The presentation will discuss the SciFlo technologies, their application in these distributed workflows, and the many challenges encountered in assembling and analyzing these massive datasets.
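
    One of the core operations mentioned in both abstracts, the space/time matchup between two instruments' observations, can be sketched with a KD-tree over crudely scaled latitude, longitude and time coordinates; the scaling, thresholds and synthetic tracks below are illustrative assumptions, not SciFlo's matchup service.

      import numpy as np
      from scipy.spatial import cKDTree

      def matchup(track_a, track_b, max_km=50.0, max_minutes=30.0):
          """Pair observations from two instruments when they fall within given
          space and time windows. Columns: lat (deg), lon (deg), time (minutes)."""
          km_per_deg = 111.0
          def scaled(t):
              # Equirectangular approximation; time scaled so max_minutes maps to max_km.
              return np.column_stack([t[:, 0] * km_per_deg,
                                      t[:, 1] * km_per_deg * np.cos(np.radians(t[:, 0])),
                                      t[:, 2] * (max_km / max_minutes)])
          tree = cKDTree(scaled(track_b))
          dist, idx = tree.query(scaled(track_a), distance_upper_bound=max_km)
          hits = np.flatnonzero(np.isfinite(dist))           # unmatched points return inf
          return np.column_stack([hits, idx[hits]])

      rng = np.random.default_rng(7)
      a = np.column_stack([rng.uniform(-60, 60, 2000), rng.uniform(-180, 180, 2000),
                           rng.uniform(0, 1440, 2000)])
      b = a + rng.normal(0, [0.1, 0.1, 5.0], a.shape)        # second instrument, nearby in space/time
      print(len(matchup(a, b)), "matched pairs")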

  1. Utilizing High-Performance Computing to Investigate Parameter Sensitivity of an Inversion Model for Vadose Zone Flow and Transport

    NASA Astrophysics Data System (ADS)

    Fang, Z.; Ward, A. L.; Fang, Y.; Yabusaki, S.

    2011-12-01

    High-resolution geologic models have proven effective in improving the accuracy of subsurface flow and transport predictions. However, many of the parameters in subsurface flow and transport models cannot be determined directly at the scale of interest and must be estimated through inverse modeling. A major challenge, particularly in vadose zone flow and transport, is the inversion of the highly nonlinear, high-dimensional problem, as current methods are not readily scalable for large-scale, multi-process models. In this paper we describe the implementation of a fully automated approach for addressing complex parameter optimization and sensitivity issues on massively parallel multi- and many-core systems. The approach is based on the integration of PNNL's extreme scale Subsurface Transport Over Multiple Phases (eSTOMP) simulator, which uses the Global Array toolkit, with the Beowulf-cluster-inspired parallel nonlinear parameter estimation software BeoPEST, run in MPI mode. In the eSTOMP/BeoPEST implementation, a pre-processor generates all of the PEST input files based on the eSTOMP input file. Simulation results for comparison with observations are extracted automatically at each time step, eliminating the need for post-processing data extraction. The inversion framework was tested with three different experimental data sets: one-dimensional water flow at the Hanford Grass Site; an irrigation and infiltration experiment at the Andelfingen Site; and a three-dimensional injection experiment at Hanford's Sisson and Lu Site. Good agreement between observations and simulations is achieved in all three applications, in both the parameter estimates and the reproduction of water dynamics. Results show that the eSTOMP/BeoPEST approach is highly scalable and can be run efficiently with hundreds or thousands of processors. BeoPEST is fault-tolerant, and nodes can be dynamically added and removed. A major advantage of this approach is the ability to use high-resolution geologic models to preserve the spatial structure in the inverse model, which leads to better parameter estimates and improved predictions when using the inverse-conditioned realizations of parameter fields.
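
    The serial core of such an inversion workflow, repeatedly running a forward model and minimizing its misfit to observations, can be sketched with a toy forward model; the example below is a generic least-squares calibration, not eSTOMP/BeoPEST, which additionally parallelizes the forward runs across workers.

      import numpy as np
      from scipy.optimize import least_squares

      def forward_model(params, times):
          """Toy stand-in for a vadose-zone forward simulation: exponential drainage
          with an unknown rate constant and initial storage."""
          k, s0 = params
          return s0 * np.exp(-k * times)

      def residuals(params, times, observed):
          return forward_model(params, times) - observed

      # Synthetic "observations" generated from known parameters plus noise.
      rng = np.random.default_rng(8)
      times = np.linspace(0.0, 10.0, 40)
      truth = np.array([0.35, 12.0])
      obs = forward_model(truth, times) + 0.2 * rng.standard_normal(times.size)

      fit = least_squares(residuals, x0=[0.1, 5.0], args=(times, obs),
                          bounds=([0.01, 1.0], [2.0, 50.0]))
      print("estimated parameters:", fit.x.round(3))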

  2. Multi-Scale Multi-Domain Model | Transportation Research | NREL

    Science.gov Websites

    NREL's Multi-Scale Multi-Domain (MSMD) model provides a framework that quantifies the impacts of the electrical/thermal pathway. Macroscopic design factors and highly dynamic environmental conditions significantly influence the design of affordable, long-lasting, high-performing, and safe large battery systems.

  3. Corresponding Functional Dynamics across the Hsp90 Chaperone Family: Insights from a Multiscale Analysis of MD Simulations

    PubMed Central

    Morra, Giulia; Potestio, Raffaello; Micheletti, Cristian; Colombo, Giorgio

    2012-01-01

    Understanding how local protein modifications, such as binding small-molecule ligands, can trigger and regulate large-scale motions of large protein domains is a major open issue in molecular biology. We address various aspects of this problem by analyzing and comparing atomistic simulations of Hsp90 family representatives for which crystal structures of the full length protein are available: mammalian Grp94, yeast Hsp90 and E.coli HtpG. These chaperones are studied in complex with the natural ligands ATP, ADP and in the Apo state. Common key aspects of their functional dynamics are elucidated with a novel multi-scale comparison of their internal dynamics. Starting from the atomic resolution investigation of internal fluctuations and geometric strain patterns, a novel analysis of domain dynamics is developed. The results reveal that the ligand-dependent structural modulations mostly consist of relative rigid-like movements of a limited number of quasi-rigid domains, shared by the three proteins. Two common primary hinges for such movements are identified. The first hinge, whose functional role has been demonstrated by several experimental approaches, is located at the boundary between the N-terminal and Middle-domains. The second hinge is located at the end of a three-helix bundle in the Middle-domain and unfolds/unpacks going from the ATP- to the ADP-state. This latter site could represent a promising novel druggable allosteric site common to all chaperones. PMID:22457611

  4. Multi-scale structures of turbulent magnetic reconnection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakamura, T. K. M., E-mail: takuma.nakamura@oeaw.ac.at; Nakamura, R.; Narita, Y.

    2016-05-15

    We have analyzed data from a series of 3D fully kinetic simulations of turbulent magnetic reconnection with a guide field. A new concept of the guide field reconnection process has recently been proposed, in which the secondary tearing instability and the resulting formation of oblique, small scale flux ropes largely disturb the structure of the primary reconnection layer and lead to 3D turbulent features [W. Daughton et al., Nat. Phys. 7, 539 (2011)]. In this paper, we further investigate the multi-scale physics in this turbulent, guide field reconnection process by introducing a wave number band-pass filter (k-BPF) technique in which modes for the small scale (less than ion scale) fluctuations and the background large scale (more than ion scale) variations are separately reconstructed from the wave number domain to the spatial domain in the inverse Fourier transform process. Combining with the Fourier based analyses in the wave number domain, we successfully identify spatial and temporal development of the multi-scale structures in the turbulent reconnection process. When considering a strong guide field, the small scale tearing mode and the resulting flux ropes develop over a specific range of oblique angles mainly along the edge of the primary ion scale flux ropes and reconnection separatrix. The rapid merging of these small scale modes leads to a smooth energy spectrum connecting ion and electron scales. When the guide field is sufficiently weak, the background current sheet is strongly kinked and oblique angles for the small scale modes are widely scattered at the kinked regions. Similar approaches handling both the wave number and spatial domains will be applicable to the data from multipoint, high-resolution spacecraft observations such as the NASA magnetospheric multiscale (MMS) mission.
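    The wave number band-pass filter (k-BPF) idea, reconstructing the sub-ion-scale fluctuations and the larger-scale background separately by masking the Fourier spectrum before inverse transforming, can be sketched in a few lines. This is a generic illustration rather than the authors' code; the ion-scale cutoff k_ion is an assumed parameter.

      # Generic wave number band-pass filter (k-BPF) sketch for a 2D field:
      # split into a large-scale part (|k| <= k_ion) and a small-scale part
      # (|k| > k_ion); k_ion is an assumed ion-scale cutoff wavenumber.
      import numpy as np

      def k_bpf(field, dx, k_ion):
          kx = 2 * np.pi * np.fft.fftfreq(field.shape[0], d=dx)
          ky = 2 * np.pi * np.fft.fftfreq(field.shape[1], d=dx)
          kmag = np.sqrt(kx[:, None]**2 + ky[None, :]**2)
          spec = np.fft.fft2(field)
          large = np.fft.ifft2(np.where(kmag <= k_ion, spec, 0)).real
          small = np.fft.ifft2(np.where(kmag > k_ion, spec, 0)).real
          return large, small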

  5. Multi-scale structures of turbulent magnetic reconnection

    NASA Astrophysics Data System (ADS)

    Nakamura, T. K. M.; Nakamura, R.; Narita, Y.; Baumjohann, W.; Daughton, W.

    2016-05-01

    We have analyzed data from a series of 3D fully kinetic simulations of turbulent magnetic reconnection with a guide field. A new concept of the guide field reconnection process has recently been proposed, in which the secondary tearing instability and the resulting formation of oblique, small scale flux ropes largely disturb the structure of the primary reconnection layer and lead to 3D turbulent features [W. Daughton et al., Nat. Phys. 7, 539 (2011)]. In this paper, we further investigate the multi-scale physics in this turbulent, guide field reconnection process by introducing a wave number band-pass filter (k-BPF) technique in which modes for the small scale (less than ion scale) fluctuations and the background large scale (more than ion scale) variations are separately reconstructed from the wave number domain to the spatial domain in the inverse Fourier transform process. Combining with the Fourier based analyses in the wave number domain, we successfully identify spatial and temporal development of the multi-scale structures in the turbulent reconnection process. When considering a strong guide field, the small scale tearing mode and the resulting flux ropes develop over a specific range of oblique angles mainly along the edge of the primary ion scale flux ropes and reconnection separatrix. The rapid merging of these small scale modes leads to a smooth energy spectrum connecting ion and electron scales. When the guide field is sufficiently weak, the background current sheet is strongly kinked and oblique angles for the small scale modes are widely scattered at the kinked regions. Similar approaches handling both the wave number and spatial domains will be applicable to the data from multipoint, high-resolution spacecraft observations such as the NASA magnetospheric multiscale (MMS) mission.

  6. Scale-invariance underlying the logistic equation and its social applications

    NASA Astrophysics Data System (ADS)

    Hernando, A.; Plastino, A.

    2013-01-01

    On the basis of dynamical principles we i) advance a derivation of the Logistic Equation (LE), widely employed (among multiple applications) in the simulation of population growth, and ii) demonstrate that scale-invariance and a mean-value constraint are sufficient and necessary conditions for obtaining it. We also generalize the LE to multi-component systems and show that the above dynamical mechanisms underlie a large number of scale-free processes. Examples are presented regarding city-populations, diffusion in complex networks, and popularity of technological products, all of them obeying the multi-component logistic equation in an either stochastic or deterministic way.
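    For reference, the single-species logistic equation that the authors derive and generalize has the familiar form below, with N the population, r the growth rate and K the carrying capacity; the multi-component and scale-invariant generalizations are as given in the paper and are not reproduced here.

      \frac{dN}{dt} = r\,N\left(1 - \frac{N}{K}\right)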

  7. Association between a Brief Alcohol Craving Measure and Drinking in the Following Week

    PubMed Central

    McHugh, R. Kathryn; Fitzmaurice, Garrett M.; Griffin, Margaret L.; Anton, Raymond F.; Weiss, Roger D.

    2016-01-01

    Background and aims Craving for alcohol is thought to be a predictor of alcohol use, particularly in the near future. The assessment of craving in clinical practice requires brief, simple measures that can be implemented routinely. This study tested whether greater alcohol craving was associated with a higher likelihood of alcohol use in the subsequent week. Design The COMBINE Study was a large, multi-site clinical trial of treatment for alcohol dependence. Participants were randomized (stratified by site) to 1 of 9 treatment conditions involving combinations of pharmacotherapy and psychotherapy. Craving was assessed every other week throughout the treatment period. Setting Substance use disorder treatment settings at 11 academic sites across the United States. Participants Participants from the COMBINE Study (N=1,370) with available craving data. Measurements Craving was assessed using the 3-item self-report Craving Scale. Drinking was assessed using the Timeline Followback method, and was defined as alcohol use in each study week. Findings There was an average of 5.8 (of a possible 7) observation pairs per participant. Craving was strongly associated with alcohol use in the following week (B=0.27, SEB=.06, Wald Chi-Square=43.34, OR=1.31, 95% CI=1.16, 1.47, p<.001). For each 1-unit increase in the Craving Scale, the likelihood of drinking in the next week was 31% higher. Conclusions Craving for alcohol is strongly associated with alcohol use in the following week. Clinicians can measure alcohol craving effectively using a brief self-report craving scale. PMID:26780476
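    The reported odds ratio follows directly from the logistic-regression coefficient; as a back-of-envelope check,

      \mathrm{OR} = e^{B} = e^{0.27} \approx 1.31,

    i.e., each 1-unit increase on the Craving Scale corresponds to roughly 31% higher odds of drinking in the following week, consistent with the reported 95% CI of 1.16-1.47.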

  8. Upscaling of nanoparticle transport in porous media under unfavorable conditions: Pore scale to Darcy scale

    NASA Astrophysics Data System (ADS)

    Seetha, N.; Raoof, Amir; Mohan Kumar, M. S.; Majid Hassanizadeh, S.

    2017-05-01

    Transport and deposition of nanoparticles in porous media is a multi-scale problem governed by several pore-scale processes, and hence, it is critical to link the processes at pore scale to the Darcy-scale behavior. In this study, using pore network modeling, we develop correlation equations for deposition rate coefficients for nanoparticle transport under unfavorable conditions at the Darcy scale based on pore-scale mechanisms. The upscaling tool is a multi-directional pore-network model consisting of an interconnected network of pores with variable connectivities. Correlation equations describing the pore-averaged deposition rate coefficients under unfavorable conditions in a cylindrical pore, developed in our earlier studies, are employed for each pore element. Pore-network simulations are performed for a wide range of parameter values to obtain the breakthrough curves of nanoparticle concentration. The latter is fitted with macroscopic 1-D advection-dispersion equation with a two-site linear reversible deposition accounting for both equilibrium and kinetic sorption. This leads to the estimation of three Darcy-scale deposition coefficients: distribution coefficient, kinetic rate constant, and the fraction of equilibrium sites. The correlation equations for the Darcy-scale deposition coefficients, under unfavorable conditions, are provided as a function of measurable Darcy-scale parameters, including: porosity, mean pore throat radius, mean pore water velocity, nanoparticle radius, ionic strength, dielectric constant, viscosity, temperature, and surface potentials of the particle and grain surfaces. The correlation equations are found to be consistent with the available experimental results, and in qualitative agreement with Colloid Filtration Theory for all parameters, except for the mean pore water velocity and nanoparticle radius.
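    A commonly used form of the macroscopic two-site model referred to above is sketched here for orientation; the exact formulation fitted by the authors may differ in notation. Here C is the aqueous nanoparticle concentration, S_e and S_k the equilibrium and kinetic solid-phase concentrations, f the fraction of equilibrium sites, K_d the distribution coefficient, k the kinetic rate constant, \rho_b the bulk density, \theta the porosity, v the mean pore water velocity and D the dispersion coefficient.

      \frac{\partial C}{\partial t}
        + \frac{\rho_b}{\theta}\,\frac{\partial (S_e + S_k)}{\partial t}
        = D\,\frac{\partial^2 C}{\partial x^2} - v\,\frac{\partial C}{\partial x},
      \qquad
      S_e = f K_d C,
      \qquad
      \frac{\partial S_k}{\partial t} = k\left[(1-f)K_d C - S_k\right].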

  9. Cluster galaxy dynamics and the effects of large-scale environment

    NASA Astrophysics Data System (ADS)

    White, Martin; Cohn, J. D.; Smit, Renske

    2010-11-01

    Advances in observational capabilities have ushered in a new era of multi-wavelength, multi-physics probes of galaxy clusters and ambitious surveys are compiling large samples of cluster candidates selected in different ways. We use a high-resolution N-body simulation to study how the influence of large-scale structure in and around clusters causes correlated signals in different physical probes and discuss some implications this has for multi-physics probes of clusters (e.g. richness, lensing, Compton distortion and velocity dispersion). We pay particular attention to velocity dispersions, matching galaxies to subhaloes which are explicitly tracked in the simulation. We find that not only do haloes persist as subhaloes when they fall into a larger host, but groups of subhaloes retain their identity for long periods within larger host haloes. The highly anisotropic nature of infall into massive clusters, and their triaxiality, translates into an anisotropic velocity ellipsoid: line-of-sight galaxy velocity dispersions for any individual halo show large variance depending on viewing angle. The orientation of the velocity ellipsoid is correlated with the large-scale structure, and thus velocity outliers correlate with outliers caused by projection in other probes. We quantify this orientation uncertainty and give illustrative examples. Such a large variance suggests that velocity dispersion estimators will work better in an ensemble sense than for any individual cluster, which may inform strategies for obtaining redshifts of cluster members. We similarly find that the ability of substructure indicators to find kinematic substructures is highly viewing angle dependent. While groups of subhaloes which merge with a larger host halo can retain their identity for many Gyr, they are only sporadically picked up by substructure indicators. We discuss the effects of correlated scatter on scaling relations estimated through stacking, both analytically and in the simulations, showing that the strong correlation of measures with mass and the large scatter in mass at fixed observable mitigate line-of-sight projections.
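    The viewing-angle dependence quantified above reduces to the line-of-sight velocity dispersion being the projected spread of member velocities; a minimal sketch for a chosen viewing direction is given below (illustrative only, not the analysis code used in the paper).

      # LOS velocity dispersion of cluster members for a viewing direction n_hat.
      import numpy as np

      def sigma_los(velocities, n_hat):
          """velocities: (N, 3) array of galaxy velocities; n_hat: viewing vector."""
          v_los = velocities @ (np.asarray(n_hat) / np.linalg.norm(n_hat))
          return np.std(v_los)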

  10. Conducting a large, multi-site survey about patients' views on broad consent: challenges and solutions.

    PubMed

    Smith, Maureen E; Sanderson, Saskia C; Brothers, Kyle B; Myers, Melanie F; McCormick, Jennifer; Aufox, Sharon; Shrubsole, Martha J; Garrison, Nanibaá A; Mercaldo, Nathaniel D; Schildcrout, Jonathan S; Clayton, Ellen Wright; Antommaria, Armand H Matheny; Basford, Melissa; Brilliant, Murray; Connolly, John J; Fullerton, Stephanie M; Horowitz, Carol R; Jarvik, Gail P; Kaufman, Dave; Kitchner, Terri; Li, Rongling; Ludman, Evette J; McCarty, Catherine; McManus, Valerie; Stallings, Sarah; Williams, Janet L; Holm, Ingrid A

    2016-11-24

    As biobanks play an increasing role in the genomic research that will lead to precision medicine, input from diverse and large populations of patients in a variety of health care settings will be important in order to successfully carry out such studies. One important topic is participants' views towards consent and data sharing, especially since the 2011 Advanced Notice of Proposed Rulemaking (ANPRM), and subsequently the 2015 Notice of Proposed Rulemaking (NPRM), were issued by the Department of Health and Human Services (HHS) and Office of Science and Technology Policy (OSTP). These notices required that participants consent to research uses of their de-identified tissue samples and most clinical data, and allowed such consent to be obtained in a one-time, open-ended or "broad" fashion. Conducting a survey across multiple sites provides clear advantages over either a single-site survey or a large online database, and is a potentially powerful way of understanding the views of diverse populations on this topic. A workgroup of the Electronic Medical Records and Genomics (eMERGE) Network, a national consortium of 9 sites (13 separate institutions, 11 clinical centers) supported by the National Human Genome Research Institute (NHGRI) that combines DNA biorepositories with electronic medical record (EMR) systems for large-scale genetic research, conducted a survey to understand patients' views on consent, sample and data sharing for future research, biobank governance, data protection, and return of research results. Working across 9 sites to design and conduct a national survey presented challenges in organization, meeting human subjects guidelines at each institution, and survey development and implementation. The challenges were met through a committee structure to address each aspect of the project with representatives from all sites. Each committee's output was integrated into the overall survey plan. A number of site-specific issues were successfully managed, allowing the survey to be developed and implemented uniformly across 11 clinical centers. Conducting a survey across a number of institutions with different cultures and practices is a methodological and logistical challenge. With a clear infrastructure, collaborative attitudes, excellent lines of communication, and the right expertise, this can be accomplished successfully.

  11. Urban seismology - Northridge aftershocks recorded by multi-scale arrays of portable digital seismographs

    USGS Publications Warehouse

    Meremonte, M.; Frankel, A.; Cranswick, E.; Carver, D.; Worley, D.

    1996-01-01

    We deployed portable digital seismographs in the San Fernando Valley (SFV), the Los Angeles basin (LAB), and surrounding hills to record aftershocks of the 17 January 1994 Northridge California earthquake. The purpose of the deployment was to investigate factors relevant to seismic zonation in urban areas, such as site amplification, sedimentary basin effects, and the variability of ground motion over short baselines. We placed seismographs at 47 sites (not all concurrently) and recorded about 290 earthquakes with magnitudes up to 5.1 at five stations or more. We deployed widely spaced stations for profiles across the San Fernando Valley, as well as five dense arrays (apertures of 200 to 500 m) in areas of high damage, such as the collapsed Interstate 10 overpass, Sherman Oaks, and the collapsed parking garage at CalState Northridge. Aftershock data analysis indicates a correlation of site amplification with mainshock damage. We found several cases where the site amplification depended on the azimuth of the aftershock, possibly indicating focusing from basin structures. For the parking garage array, we found large ground-motion variabilities (a factor of 2) over 200-m distances for sites on the same mapped soil unit. Array analysis of the aftershock seismograms demonstrates that sizable arrivals after the direct S waves consist of surface waves traveling from the same azimuth as that of the epicenter. These surface waves increase the duration of motions and can have frequencies as high as about 4 Hz. For the events studied here, we do not observe large arrivals reflected from the southern edge of the San Fernando Valley.

  12. Bayesian approach for three-dimensional aquifer characterization at the Hanford 300 Area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murakami, Haruko; Chen, X.; Hahn, Melanie S.

    2010-10-21

    This study presents a stochastic, three-dimensional characterization of a heterogeneous hydraulic conductivity field within DOE's Hanford 300 Area site, Washington, by assimilating large-scale, constant-rate injection test data with small-scale, three-dimensional electromagnetic borehole flowmeter (EBF) measurement data. We first inverted the injection test data to estimate the transmissivity field, using zeroth-order temporal moments of pressure buildup curves. We applied a newly developed Bayesian geostatistical inversion framework, the method of anchored distributions (MAD), to obtain a joint posterior distribution of geostatistical parameters and local log-transmissivities at multiple locations. The unique aspects of MAD that make it suitable for this purpose are its ability to integrate multi-scale, multi-type data within a Bayesian framework and to compute a nonparametric posterior distribution. After we combined the distribution of transmissivities with the depth-discrete relative-conductivity profile from EBF data, we inferred the three-dimensional geostatistical parameters of the log-conductivity field, using Bayesian model-based geostatistics. Such consistent use of the Bayesian approach throughout the procedure enabled us to systematically incorporate data uncertainty into the final posterior distribution. The method was tested in a synthetic study and validated using actual data that were not part of the estimation. Results showed broader and skewed posterior distributions of geostatistical parameters except for the mean, which suggests the importance of inferring the entire distribution to quantify the parameter uncertainty.
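    The zeroth-order temporal moment used to invert the injection-test data is simply the time integral of the pressure-buildup curve; a minimal numerical sketch is shown below (an illustration of the quantity, not the MAD implementation).

      # Zeroth-order temporal moment of a pressure buildup record s(t):
      #   m0 = integral of s(t) dt over the monitored period.
      import numpy as np

      def zeroth_moment(t, s):
          return np.trapz(s, t)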

  13. Spatial processes decouple management from objectives in a heterogeneous landscape: predator control as a case study.

    PubMed

    Mahoney, Peter J; Young, Julie K; Hersey, Kent R; Larsen, Randy T; McMillan, Brock R; Stoner, David C

    2018-04-01

    Predator control is often implemented with the intent of disrupting top-down regulation in sensitive prey populations. However, ambiguity surrounding the efficacy of predator management, as well as the strength of top-down effects of predators in general, is often exacerbated by the spatially implicit analytical approaches used in assessing data with explicit spatial structure. Here, we highlight the importance of considering spatial context in the case of a predator control study in south-central Utah. We assessed the spatial match between aerial removal risk in coyotes (Canis latrans) and mule deer (Odocoileus hemionus) resource selection during parturition using a spatially explicit, multi-level Bayesian model. With our model, we were able to evaluate spatial congruence between management action (i.e., coyote removal) and objective (i.e., parturient deer site selection) at two distinct scales: the level of the management unit and the individual coyote removal. In the case of the former, our results indicated substantial spatial heterogeneity in expected congruence between removal risk and parturient deer site selection across large areas, and is a reflection of logistical constraints acting on the management strategy and differences in space use between the two species. At the level of the individual removal, we demonstrated that the potential management benefits of a removed coyote were highly variable across all individuals removed and in many cases, spatially distinct from parturient deer resource selection. Our methods and results provide a means of evaluating where we might anticipate an impact of predator control, while emphasizing the need to weight individual removals based on spatial proximity to management objectives in any assessment of large-scale predator control. Although we highlight the importance of spatial context in assessments of predator control strategy, we believe our methods are readily generalizable in any management or large-scale experimental framework where spatial context is likely an important driver of outcomes. © 2018 by the Ecological Society of America.

  14. Closed Large Cell Clouds

    Atmospheric Science Data Center

    2013-04-19

    Article title: Closed Large Cell Clouds in the South Pacific. These images from the Multi-angle Imaging SpectroRadiometer (MISR) provide an example of very large scale closed cells, and can be contrasted with the ... The MISR data were obtained from the NASA Langley Research Center Atmospheric Science Data Center in Hampton, VA.

  15. CLIP-seq analysis of multi-mapped reads discovers novel functional RNA regulatory sites in the human transcriptome.

    PubMed

    Zhang, Zijun; Xing, Yi

    2017-09-19

    Crosslinking or RNA immunoprecipitation followed by sequencing (CLIP-seq or RIP-seq) allows transcriptome-wide discovery of RNA regulatory sites. As CLIP-seq/RIP-seq reads are short, existing computational tools focus on uniquely mapped reads, while reads mapped to multiple loci are discarded. We present CLAM (CLIP-seq Analysis of Multi-mapped reads). CLAM uses an expectation-maximization algorithm to assign multi-mapped reads and calls peaks combining uniquely and multi-mapped reads. To demonstrate the utility of CLAM, we applied it to a wide range of public CLIP-seq/RIP-seq datasets involving numerous splicing factors, microRNAs and m6A RNA methylation. CLAM recovered a large number of novel RNA regulatory sites inaccessible by uniquely mapped reads. The functional significance of these sites was demonstrated by consensus motif patterns and association with alternative splicing (splicing factors), transcript abundance (AGO2) and mRNA half-life (m6A). CLAM provides a useful tool to discover novel protein-RNA interactions and RNA modification sites from CLIP-seq and RIP-seq data, and reveals the significant contribution of repetitive elements to the RNA regulatory landscape of the human transcriptome. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
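    The expectation-maximization idea behind CLAM can be sketched generically: multi-mapped reads are fractionally assigned to their candidate loci in proportion to current locus weights (E-step), and the weights are then re-estimated from the assigned coverage (M-step). The toy sketch below illustrates the scheme only; it is not CLAM's implementation.

      # Toy EM re-assignment of multi-mapped reads (illustrative, not CLAM code).
      # reads_to_loci[r] lists the candidate loci of read r (unique per read).
      import numpy as np

      def em_assign(reads_to_loci, n_loci, n_iter=50):
          w = np.ones(n_loci) / n_loci                  # initial locus weights
          for _ in range(n_iter):
              counts = np.zeros(n_loci)
              for loci in reads_to_loci:                # E-step: fractional assignment
                  p = w[loci] / w[loci].sum()
                  counts[loci] += p
              w = counts / counts.sum()                 # M-step: update weights
          return w

      # Example: reads 0 and 1 map uniquely; read 2 maps to both loci.
      weights = em_assign([[0], [1], [0, 1]], n_loci=2)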

  16. Magnetic measurements with fluxgate 3-components magnetometers in archaeology. Multi-sensor device and associated potential field operators for large scale to centimetre investigations on the 1st millennium BC site of Qasr ʿAllam in the western desert of Egypt

    NASA Astrophysics Data System (ADS)

    Gavazzi, Bruno; Alkhatib-Alkontar, Rozan; Munschy, Marc; Colin, Frédéric; Duvette, Catherine

    2016-04-01

    Fluxgate 3-component magnetometers allow vector measurements of the magnetic field. Moreover, among magnetometers that measure the intensity of the magnetic field, they have the lightest weight and the lowest power consumption. Vector measurements make them the only kind of magnetometer allowing compensation of magnetic perturbations due to the equipment carried with the magnetometer. Fluxgate 3-component magnetometers are common in space magnetometry and in aero-geophysics but are never used in archaeology due to the difficulty of calibrating them. This problem is overcome by the use of a simple calibration and compensation procedure in the field, developed initially for space research (after calibration and compensation, the rms noise is less than 1 nT). It is therefore possible to build a multi-sensor (up to 8 sensors), georeferenced device for investigations at different scales down to the centimetre: because the locus of magnetic measurements is less than a cubic centimetre, magnetic profiling or mapping can be performed a few centimetres outside magnetized bodies. Such equipment is used in a context of heavy sediment coverage and uneven topography on the 1st millennium BC site of Qasr ʿAllam in the western desert of Egypt. Magnetic measurements with a line spacing of 0.5 m allow a magnetic grid to be computed. Interpretation using potential field operators such as double reduction to the pole and fractional vertical derivatives reveals a widespread irrigation system and a vast cultic facility. In some areas, magnetic profiling with a 0.1 m line spacing and at 0.1 m above the ground is performed. The interpretation results provide the local authorities with sufficient evidence to enlarge the protection of the site against the threatening progression of agricultural fields.
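    One of the potential field operators mentioned above, the (fractional) vertical derivative, has a simple wavenumber-domain form: the Fourier spectrum of the grid is multiplied by |k|^p. The sketch below is a generic illustration of that operator on a regularly gridded anomaly map, not the processing chain used by the authors; reduction to the pole additionally requires the field inclination and declination and is omitted.

      # Generic fractional vertical derivative of a gridded magnetic anomaly:
      # multiply the 2D Fourier spectrum by |k|**p (p = 1 gives the first
      # vertical derivative). Illustrative sketch only.
      import numpy as np

      def vertical_derivative(grid, dx, dy, p=0.5):
          kx = 2 * np.pi * np.fft.fftfreq(grid.shape[0], d=dx)
          ky = 2 * np.pi * np.fft.fftfreq(grid.shape[1], d=dy)
          kmag = np.sqrt(kx[:, None]**2 + ky[None, :]**2)
          return np.fft.ifft2(np.fft.fft2(grid) * kmag**p).real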

  17. Grizzly Bear Noninvasive Genetic Tagging Surveys: Estimating the Magnitude of Missed Detections.

    PubMed

    Fisher, Jason T; Heim, Nicole; Code, Sandra; Paczkowski, John

    2016-01-01

    Sound wildlife conservation decisions require sound information, and scientists increasingly rely on remotely collected data over large spatial scales, such as noninvasive genetic tagging (NGT). Grizzly bears (Ursus arctos), for example, are difficult to study at population scales except with noninvasive data, and NGT via hair trapping informs management over much of grizzly bears' range. Considerable statistical effort has gone into estimating sources of heterogeneity, but detection error-arising when a visiting bear fails to leave a hair sample-has not been independently estimated. We used camera traps to survey grizzly bear occurrence at fixed hair traps and multi-method hierarchical occupancy models to estimate the probability that a visiting bear actually leaves a hair sample with viable DNA. We surveyed grizzly bears via hair trapping and camera trapping for 8 monthly surveys at 50 (2012) and 76 (2013) sites in the Rocky Mountains of Alberta, Canada. We used multi-method occupancy models to estimate site occupancy, probability of detection, and conditional occupancy at a hair trap. We tested the prediction that detection error in NGT studies could be induced by temporal variability within season, leading to underestimation of occupancy. NGT via hair trapping consistently underestimated grizzly bear occupancy at a site when compared to camera trapping. At best occupancy was underestimated by 50%; at worst, by 95%. Probability of false absence was reduced through successive surveys, but this mainly accounts for error imparted by movement among repeated surveys, not necessarily missed detections by extant bears. The implications of missed detections and biased occupancy estimates for density estimation-which form the crux of management plans-require consideration. We suggest hair-trap NGT studies should estimate and correct detection error using independent survey methods such as cameras, to ensure the reliability of the data upon which species management and conservation actions are based.
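    The reduction of the probability of false absence with repeated surveys follows the standard occupancy-model relation below, where p is the per-survey detection probability and K the number of surveys; as the authors note, this accounts for repeated sampling but not for systematic failures of visiting bears to leave hair.

      P(\text{false absence}) = (1 - p)^{K}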

  18. Grizzly Bear Noninvasive Genetic Tagging Surveys: Estimating the Magnitude of Missed Detections

    PubMed Central

    Fisher, Jason T.; Heim, Nicole; Code, Sandra; Paczkowski, John

    2016-01-01

    Sound wildlife conservation decisions require sound information, and scientists increasingly rely on remotely collected data over large spatial scales, such as noninvasive genetic tagging (NGT). Grizzly bears (Ursus arctos), for example, are difficult to study at population scales except with noninvasive data, and NGT via hair trapping informs management over much of grizzly bears’ range. Considerable statistical effort has gone into estimating sources of heterogeneity, but detection error–arising when a visiting bear fails to leave a hair sample–has not been independently estimated. We used camera traps to survey grizzly bear occurrence at fixed hair traps and multi-method hierarchical occupancy models to estimate the probability that a visiting bear actually leaves a hair sample with viable DNA. We surveyed grizzly bears via hair trapping and camera trapping for 8 monthly surveys at 50 (2012) and 76 (2013) sites in the Rocky Mountains of Alberta, Canada. We used multi-method occupancy models to estimate site occupancy, probability of detection, and conditional occupancy at a hair trap. We tested the prediction that detection error in NGT studies could be induced by temporal variability within season, leading to underestimation of occupancy. NGT via hair trapping consistently underestimated grizzly bear occupancy at a site when compared to camera trapping. At best occupancy was underestimated by 50%; at worst, by 95%. Probability of false absence was reduced through successive surveys, but this mainly accounts for error imparted by movement among repeated surveys, not necessarily missed detections by extant bears. The implications of missed detections and biased occupancy estimates for density estimation–which form the crux of management plans–require consideration. We suggest hair-trap NGT studies should estimate and correct detection error using independent survey methods such as cameras, to ensure the reliability of the data upon which species management and conservation actions are based. PMID:27603134

  19. SmallTool - a toolkit for realizing shared virtual environments on the Internet

    NASA Astrophysics Data System (ADS)

    Broll, Wolfgang

    1998-09-01

    With increasing graphics capabilities of computers and higher network communication speed, networked virtual environments have become available to a large number of people. While the virtual reality modelling language (VRML) provides users with the ability to exchange 3D data, there is still a lack of appropriate support to realize large-scale multi-user applications on the Internet. In this paper we will present SmallTool, a toolkit to support shared virtual environments on the Internet. The toolkit consists of a VRML-based parsing and rendering library, a device library, and a network library. This paper will focus on the networking architecture, provided by the network library - the distributed worlds transfer and communication protocol (DWTP). DWTP provides an application-independent network architecture to support large-scale multi-user environments on the Internet.

  20. A Multi-scale Modeling System with Unified Physics to Study Precipitation Processes

    NASA Astrophysics Data System (ADS)

    Tao, W. K.

    2017-12-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two-dimensions, and 1,000 x 1,000 km2 in three-dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified weather research and forecast, WRF), and (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF). The same microphysical processes, long and short wave radiative transfer and land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the results from using the multi-scale modeling system to study precipitation processes and their sensitivity to model resolution and microphysics schemes will be presented. Also, how to use the multi-satellite simulator to improve the simulation of precipitation processes will be discussed.

  1. Using Multi-Scale Modeling Systems and Satellite Data to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lamg, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2011-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two-dimensions, and 1,000 x 1,000 km2 in three-dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified weather research and forecast, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long and short wave radiative transfer and land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, the recent developments and applications of the multi-scale modeling system will be presented. In particular, the results from using the multi-scale modeling system to study precipitating systems and hurricanes/typhoons will be presented. The high-resolution spatial and temporal visualization will be utilized to show the evolution of precipitation processes. Also, how to use the multi-satellite simulator to improve the simulation of precipitation processes will be discussed.

  2. Using Multi-Scale Modeling Systems and Satellite Data to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei--Kuo; Chern, J.; Lamg, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2010-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two-dimensions, and 1,000 x 1,000 sq km in three-dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified weather research and forecast, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long and short wave radiative transfer and land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the results from using the multi-scale modeling system to study the interactions between clouds, precipitation, and aerosols will be presented. Also, how to use the multi-satellite simulator to improve the simulation of precipitation processes will be discussed.

  3. Using Multi-Scale Modeling Systems to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2010-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two-dimensions, and 1,000 x 1,000 km2 in three-dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified weather research and forecast, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long and short wave radiative transfer and land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the results from using the multi-scale modeling system to study the interactions between clouds, precipitation, and aerosols will be presented. Also, how to use the multi-satellite simulator to improve the simulation of precipitation processes will be discussed.

  4. A Multi-Time Scale Morphable Software Milieu for Polymorphous Computing Architectures (PCA) - Composable, Scalable Systems

    DTIC Science & Technology

    2004-10-01

    "Scalable Parallel Libraries for Large-Scale Concurrent Applications," Technical Report UCRL-JC-109251, Lawrence Livermore National Laboratory.

  5. Consistency of clinical biomechanical measures between three different institutions: implications for multi-center biomechanical and epidemiological research.

    PubMed

    Myer, Gregory D; Wordeman, Samuel C; Sugimoto, Dai; Bates, Nathaniel A; Roewer, Benjamin D; Medina McKeon, Jennifer M; DiCesare, Christopher A; Di Stasi, Stephanie L; Barber Foss, Kim D; Thomas, Staci M; Hewett, Timothy E

    2014-05-01

    Multi-center collaborations provide a powerful alternative to overcome the inherent limitations of single-center investigations. Specifically, multi-center projects can support large-scale prospective, longitudinal studies that investigate relatively uncommon outcomes, such as anterior cruciate ligament injury. This project was conceived to assess within- and between-center reliability of an affordable, clinical nomogram utilizing two-dimensional video methods to screen for risk of knee injury. The authors hypothesized that the two-dimensional screening methods would provide good-to-excellent reliability within and between institutions for assessment of frontal and sagittal plane biomechanics. Nineteen female high school athletes participated. Two-dimensional video kinematics of the lower extremity during a drop vertical jump task were collected on all 19 study participants at each of the three facilities. Within-center and between-center reliability were assessed with intra- and inter-class correlation coefficients. Within-center reliability of the clinical nomogram variables was consistently excellent, but between-center reliability was fair-to-good. The within-center intra-class correlation coefficient for all nomogram variables combined was 0.98, while the combined between-center inter-class correlation coefficient was 0.63. Injury risk screening protocols were reliable within and repeatable between centers. These results demonstrate the feasibility of multi-site biomechanical studies and establish a framework for further dissemination of injury risk screening algorithms. Specifically, multi-center studies may allow for further validation and optimization of two-dimensional video screening tools. Level of evidence: 2b.
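    For orientation, a one-way random-effects intra-class correlation coefficient can be computed from an ANOVA decomposition as sketched below; the study applied specific within- and between-center ICC forms, so this is a generic illustration rather than the exact estimator used.

      # Generic one-way random-effects ICC(1,1) from a subjects-by-measurements
      # matrix; illustrative only, the study's ICC forms may differ.
      import numpy as np

      def icc_oneway(x):
          """x: 2D array, rows = subjects, columns = repeated measurements."""
          n, k = x.shape
          grand = x.mean()
          row_means = x.mean(axis=1)
          msb = k * np.sum((row_means - grand)**2) / (n - 1)         # between subjects
          msw = np.sum((x - row_means[:, None])**2) / (n * (k - 1))  # within subjects
          return (msb - msw) / (msb + (k - 1) * msw)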

  6. Multi-scale approach to the environmental factors effects on spatio-temporal variability of Chironomus salinarius (Diptera: Chironomidae) in a French coastal lagoon

    NASA Astrophysics Data System (ADS)

    Cartier, V.; Claret, C.; Garnier, R.; Fayolle, S.; Franquet, E.

    2010-03-01

    The complexity of the relationships between environmental factors and organisms can be revealed by sampling designs which consider the contribution to variability of different temporal and spatial scales, compared to total variability. From a management perspective, a multi-scale approach can lead to time savings. Identifying environmental patterns that help maintain patchy distribution is fundamental in studying coastal lagoons, transition zones between continental and marine waters characterised by great environmental variability on spatial and temporal scales. They often present organic enrichment, inducing decreased species richness and increased densities of opportunist species like Chironomus salinarius, a common species that tends to swarm and thus constitutes a nuisance for human populations. This species is dominant in the Bolmon lagoon, a French Mediterranean coastal lagoon under eutrophication. Our objective was to quantify variability due to both spatial and temporal scales and identify the contribution of different environmental factors to this variability. The population of C. salinarius was sampled from June 2007 to June 2008 every two months at 12 sites located in two areas of the Bolmon lagoon, at two different depths, with three sites per area-depth combination. Environmental factors (temperature, dissolved oxygen both in sediment and under the water surface, sediment organic matter content and grain size) and microbial activities (i.e. hydrolase activities) were also considered as explanatory factors of chironomid densities and distribution. ANOVA analysis reveals significant spatial differences regarding the distribution of chironomid larvae for the area and the depth scales and their interaction. The spatial effect is also revealed for dissolved oxygen (water), salinity and fine particles (area scale), and for water column depth. All factors but water column depth show a temporal effect. Spearman's correlations highlight the seasonal effect (temperature, dissolved oxygen in sediment and water) as well as the effect of microbial activities on chironomid larvae. Our results show that a multi-scale approach identifies patchy distribution, even when there is relative environmental homogeneity.

  7. Progress in centralised ethics review processes: Implications for multi-site health evaluations.

    PubMed

    Prosser, Brenton; Davey, Rachel; Gibson, Diane

    2015-04-01

    Increasingly, public sector programmes respond to complex social problems that intersect specific fields and individual disciplines. Such responses result in multi-site initiatives that can span nations, jurisdictions, sectors and organisations. The rigorous evaluation of public sector programmes is now a baseline expectation. For evaluations of large and complex multi-site programme initiatives, the processes of ethics review can present a significant challenge. However in recent years, there have been new developments in centralised ethics review processes in many nations. This paper provides the case study of an evaluation of a national, inter-jurisdictional, cross-sector, aged care health initiative and its encounters with Australian centralised ethics review processes. Specifically, the paper considers progress against the key themes of a previous five-year, five nation study (Fitzgerald and Phillips, 2006), which found that centralised ethics review processes would save time, money and effort, as well as contribute to more equitable workloads for researchers and evaluators. The paper concludes with insights for those charged with refining centralised ethics review processes, as well as recommendations for future evaluators of complex multi-site programme initiatives. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Uranium plume persistence impacted by hydrologic and geochemical heterogeneity in the groundwater and river water interaction zone of Hanford site

    NASA Astrophysics Data System (ADS)

    Chen, X.; Zachara, J. M.; Vermeul, V. R.; Freshley, M.; Hammond, G. E.

    2015-12-01

    The behavior of a persistent uranium plume in an extended groundwater- river water (GW-SW) interaction zone at the DOE Hanford site is dominantly controlled by river stage fluctuations in the adjacent Columbia River. The plume behavior is further complicated by substantial heterogeneity in physical and geochemical properties of the host aquifer sediments. Multi-scale field and laboratory experiments and reactive transport modeling were integrated to understand the complex plume behavior influenced by highly variable hydrologic and geochemical conditions in time and space. In this presentation we (1) describe multiple data sets from field-scale uranium adsorption and desorption experiments performed at our experimental well-field, (2) develop a reactive transport model that incorporates hydrologic and geochemical heterogeneities characterized from multi-scale and multi-type datasets and a surface complexation reaction network based on laboratory studies, and (3) compare the modeling and observation results to provide insights on how to refine the conceptual model and reduce prediction uncertainties. The experimental results revealed significant spatial variability in uranium adsorption/desorption behavior, while modeling demonstrated that ambient hydrologic and geochemical conditions and heterogeneities in sediment physical and chemical properties both contributed to complex plume behavior and its persistence. Our analysis provides important insights into the characterization, understanding, modeling, and remediation of groundwater contaminant plumes influenced by surface water and groundwater interactions.

  9. Heat Source Characterization In A TREAT Fuel Particle Using Coupled Neutronics Binary Collision Monte-Carlo Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schunert, Sebastian; Schwen, Daniel; Ghassemi, Pedram

    This work presents a multi-physics, multi-scale approach to modeling the Transient Test Reactor (TREAT) currently being prepared for restart at the Idaho National Laboratory. TREAT fuel is made up of microscopic fuel grains (r ˜ 20µm) dispersed in a graphite matrix. The novelty of this work is in coupling a binary collision Monte-Carlo (BCMC) model to the Finite Element based code Moose for solving a microscopic heat-conduction problem whose driving source is provided by the BCMC model tracking fission fragment energy deposition. This microscopic model is driven by a transient, engineering scale neutronics model coupled to an adiabatic heating model. The macroscopic model provides local power densities and neutron energy spectra to the microscopic model. Currently, no feedback from the microscopic to the macroscopic model is considered. TREAT transient 15 is used to exemplify the capabilities of the multi-physics, multi-scale model, and it is found that the average fuel grain temperature differs from the average graphite temperature by 80 K despite the low-power transient. The large temperature difference has strong implications for the Doppler feedback a potential LEU TREAT core would see, and it underpins the need for multi-physics, multi-scale modeling of a TREAT LEU core.

  10. Scale Dependence of Oak Woodland Historical Fire Intervals: Contrasting the Barrens of Tennessee and Cross Timbers of Oklahoma, USA

    Treesearch

    Michael C. Stambaugh; Richard P. Guyette; Joseph M. Marschall; Daniel C. Dey

    2016-01-01

    Characterization of scale dependence of fire intervals could inform interpretations of fire history and improve fire prescriptions that aim to mimic historical fire regime conditions. We quantified the temporal variability in fire regimes and described the spatial dependence of fire intervals through the analysis of multi-century fire scar records (8 study sites, 332...

  11. Multi-Hazard Analysis for the Estimation of Ground Motion Induced by Landslides and Tectonics

    NASA Astrophysics Data System (ADS)

    Iglesias, Rubén; Koudogbo, Fifame; Ardizzone, Francesca; Mondini, Alessandro; Bignami, Christian

    2016-04-01

    Space-borne synthetic aperture radar (SAR) sensors allow obtaining all-day, all-weather terrain complex reflectivity images which can be processed by means of Persistent Scatterer Interferometry (PSI) for the monitoring of displacement episodes with extremely high accuracy. In the work presented, different PSI strategies to measure ground surface displacements for multi-scale multi-hazard mapping are proposed in the context of landslide and tectonic applications. This work is developed in the framework of the ESA General Studies Programme (GSP). The present project, called Multi Scale and Multi Hazard Mapping Space based Solutions (MEMpHIS), investigates new Earth Observation (EO) methods and new Information and Communications Technology (ICT) solutions to improve the understanding and management of disasters, with special focus on Disaster Risk Reduction rather than Rapid Mapping. In this paper, the results of the investigation on the key processing steps for measuring large-scale ground surface displacements (such as those originated by plate tectonics or active faults) as well as local displacements at high resolution (such as those related to active slopes) will be presented. The core of the proposed approaches is based on the Stable Point Network (SPN) algorithm, which is the advanced PSI processing chain developed by ALTAMIRA INFORMATION. Regarding tectonic applications, the accurate estimation of displacement over large-scale areas characterized by low magnitude motion gradients (3-5 mm/year), such as those induced by inter-seismic or Earth tidal effects, still remains an open issue. In this context, a low-resolution approach based on the integration of differential phase increments of velocity and topographic error (obtained through the fitting of a linear model adjustment function to the data) will be evaluated. Data from the default mode of Sentinel-1, the Interferometric Wide Swath Mode, will be considered for this application. Regarding landslide applications, which typically occur over vegetated scenarios largely affected by temporal and geometrical phenomena, the number of persistent scatterers (PSs) available is crucial. The better the density and reliability of PSs, the better the delineation and characterization of landslides. In this context, an advanced high-resolution processing based on the use of Non-Local Interferometric SAR (NL-InSAR) filtering will be evaluated. Finally, since SAR systems are only sensitive to the detection of displacements in the line-of-sight (LOS) direction, the importance of projecting final PSI displacement products along the steepest gradient of the terrain slope will be put forward. The high-resolution COSMO-SkyMed sensor will be used for this application. The test site selected to evaluate the performance of the proposed techniques corresponds to the region of the Northern Apennines (Italy), which is affected by both landslide and tectonic displacement phenomena. Sentinel-1 (for tectonics) and COSMO-SkyMed (for landslides) SAR data will be employed for the monitoring of activity within the area of interest. Users from the DRM (Disaster Risk Management) community have been associated with the project in order to further evaluate the proposed solution on selected trial cases once the algorithms have been validated.
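    The final projection step mentioned above, mapping line-of-sight (LOS) PSI displacements onto the steepest slope direction, amounts to dividing the LOS measurement by the scalar product of the LOS and downslope unit vectors (assuming motion occurs along the slope). The sketch below is a generic geometric illustration, not part of the SPN processing chain, and masks geometries where the two directions are nearly perpendicular.

      # Project a LOS displacement onto the steepest-descent (downslope) direction:
      #   d_slope = d_los / (los_hat . slope_hat).  Generic sketch, not SPN code.
      import numpy as np

      def los_to_slope(d_los, los_hat, slope_hat, min_cos=0.2):
          """los_hat, slope_hat: unit vectors (e.g., ENU components).
          Returns NaN where the projection is poorly conditioned."""
          c = float(np.dot(los_hat, slope_hat))
          return d_los / c if abs(c) >= min_cos else float("nan")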

  12. Automated, on-board terrain analysis for precision landings

    NASA Technical Reports Server (NTRS)

    Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.; Hines, Glenn D.

    2006-01-01

    Advances in space robotics technology hinge to a large extent upon the development and deployment of sophisticated new vision-based methods for automated in-space mission operations and scientific survey. To this end, we have developed a new concept for automated terrain analysis that is based upon a generic image enhancement platform: multi-scale retinex (MSR) and visual servo (VS) processing. This pre-conditioning with the MSR and the VS produces a "canonical" visual representation that is largely independent of lighting variations and exposure errors. Enhanced imagery is then processed with a biologically inspired two-channel edge detection process, followed by a smoothness-based criterion for image segmentation. Landing sites can be automatically determined by examining the results of the smoothness-based segmentation, which shows those areas in the image that surpass a minimum degree of smoothness. Though the MSR has proven to be a very strong enhancement engine, the other elements of the approach (the VS, terrain map generation, and smoothness-based segmentation) are in early stages of development. Experimental results on data from the Mars Global Surveyor show that the imagery can be processed to automatically obtain smooth landing sites. In this paper, we describe the method used to obtain these landing sites, and also examine the smoothness criteria in terms of the imager and scene characteristics. Several examples of applying this method to simulated and real imagery are shown.
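    A minimal version of the smoothness-based criterion described above can be written as a local-variance threshold: compute the variance of image intensity in a sliding window and flag regions where it stays below a chosen limit as candidate smooth (landing-site) areas. This sketch only illustrates the idea, with hypothetical window size and threshold; it is not the authors' segmentation pipeline.

      # Local-variance "smoothness" map: low variance -> candidate smooth terrain.
      # Window size and threshold are hypothetical illustration parameters.
      import numpy as np
      from scipy.ndimage import uniform_filter

      def smooth_mask(image, window=15, var_threshold=25.0):
          img = np.asarray(image, dtype=float)
          mean = uniform_filter(img, size=window)
          mean_sq = uniform_filter(img**2, size=window)
          local_var = mean_sq - mean**2          # Var[x] = E[x^2] - (E[x])^2
          return local_var < var_threshold       # True where terrain looks smooth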

  13. Multi-GPU implementation of a VMAT treatment plan optimization algorithm.

    PubMed

    Tian, Zhen; Peng, Fei; Folkerts, Michael; Tan, Jun; Jia, Xun; Jiang, Steve B

    2015-06-01

    Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, a GPU's relatively small memory cannot hold cases with a large dose-deposition coefficient (DDC) matrix, e.g., those with a large target size, multiple targets, multiple arcs, and/or a small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors' group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported; hence, another purpose is to present the detailed techniques employed for the GPU implementation. The authors also use this particular problem as an example to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors' method, the sparse DDC matrix is first stored on a CPU in coordinate list (COO) format. On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row (CSR) format. Computation of the beamlet price, the first step in the PP, is accomplished using multiple GPUs. A fast inter-GPU data transfer scheme is accomplished using peer-to-peer access. The remaining steps of the PP and the MP are implemented on the CPU or a single GPU due to their modest problem scale and computational load. The Barzilai-Borwein algorithm with a subspace step scheme is adopted to solve the MP. A head and neck (H&N) cancer case is then used to validate the authors' method. The authors also compare their multi-GPU implementation with three different single-GPU implementation strategies, i.e., truncating the DDC matrix (S1), repeatedly transferring the DDC matrix between CPU and GPU (S2), and porting computations involving the DDC matrix to the CPU (S3), in terms of both plan quality and computational efficiency. Two more H&N patient cases and three prostate cases are used to demonstrate the advantages of the authors' method. The multi-GPU implementation can finish the optimization process within ∼1 min for the H&N patient case. S1 leads to an inferior plan quality, although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23-46 s. Conversely, to obtain clinically comparable or acceptable plans for all six of these VMAT cases, the optimization time needed in a commercial TPS system running on CPU was found to be on the order of several minutes. The results demonstrate that the multi-GPU implementation of the authors' column-generation-based VMAT optimization can handle large-scale VMAT optimization problems efficiently without sacrificing plan quality. The authors' study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
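
    The sketch below illustrates only the data-partitioning idea described in the abstract (COO matrix split into per-beam-angle column blocks stored in CSR, with a block-wise beamlet-price computation), using CPU-side SciPy so no GPU API is assumed. The problem sizes and the angle-to-beamlet mapping are invented; on a real multi-GPU system each block and its matrix-vector product would live on its own device.

```python
# Partition a sparse DDC matrix by beam-angle group and compute a beamlet "price"
# block by block (illustrative CPU stand-in for the paper's per-GPU layout).
import numpy as np
import scipy.sparse as sp

n_voxels, n_beamlets = 20000, 720                  # assumed problem size
rng = np.random.default_rng(1)
ddc_coo = sp.random(n_voxels, n_beamlets, density=0.01, format="coo", random_state=1)

# Assign each beamlet column to one of four beam-angle groups (one group per GPU)
angle_group = rng.integers(0, 4, size=n_beamlets)
per_gpu_csr = []
for g in range(4):
    cols = np.flatnonzero(angle_group == g)
    per_gpu_csr.append(sp.csr_matrix(ddc_coo.tocsc()[:, cols]))

# Beamlet price (a gradient-like quantity) computed block-wise, then concatenated;
# each block's product would run on its own device in the multi-GPU setting.
residual = rng.normal(size=n_voxels)
prices = np.concatenate([block.T @ residual for block in per_gpu_csr])
print(prices.shape)
```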

  14. Site Selection in Experiments: An Assessment of Site Recruitment and Generalizability in Two Scale-Up Studies

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Fellers, Lauren; Caverly, Sarah; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Ruiz de Castilla, Veronica

    2016-01-01

    Recently, statisticians have begun developing methods to improve the generalizability of results from large-scale experiments in education. This work has included the development of methods for improved site selection when random sampling is infeasible, including the use of stratification and targeted recruitment strategies. This article provides…

  15. Hum-mPLoc: an ensemble classifier for large-scale human protein subcellular location prediction by incorporating samples with multiple sites.

    PubMed

    Shen, Hong-Bin; Chou, Kuo-Chen

    2007-04-20

    Proteins may simultaneously exist at, or move between, two or more different subcellular locations. Proteins with multiple locations, or dynamic features of this kind, are particularly interesting because they may have some very special biological functions intriguing to investigators in both basic research and drug discovery. For instance, among the 6408 human protein entries that have experimentally observed subcellular location annotations in the Swiss-Prot database (version 50.7, released 19-Sept-2006), 973 (approximately 15%) have multiple location sites. The number of total human protein entries (except those annotated with "fragment" or those with fewer than 50 amino acids) in the same database is 14,370, meaning a gap of 14,370 - 6408 = 7962 entries for which no knowledge is available about their subcellular locations. Although one can use a computational approach to predict the desired information for this gap, so far all the existing methods for predicting human protein subcellular localization are limited to the case of a single location site only. To overcome this barrier, a new ensemble classifier, named Hum-mPLoc, was developed that can also deal with the case of multiple location sites. Hum-mPLoc is freely accessible to the public as a web server at http://202.120.37.186/bioinf/hum-multi. Meanwhile, for the convenience of people working in the relevant areas, Hum-mPLoc has been used to identify all human protein entries in the Swiss-Prot database that do not have subcellular location annotations or are annotated as being uncertain. The large-scale results thus obtained have been deposited in a downloadable file prepared with Microsoft Excel and named "Tab_Hum-mPLoc.xls". This file is available at the same website and will be updated twice a year to include new entries of human proteins and reflect the continuous development of Hum-mPLoc.

  16. Marital happiness and sleep disturbances in a multi-ethnic sample of middle-aged women.

    PubMed

    Troxel, Wendy M; Buysse, Daniel J; Hall, Martica; Matthews, Karen A

    2009-01-01

    Previous research suggests that divorced individuals, particularly women, have higher rates of sleep disturbances as compared to married individuals. Among the married, however, little is known about the association between relationship quality and sleep. The present study examined the association between marital happiness and self-reported sleep disturbances in a sample of midlife women drawn from the Study of Women's Health Across the Nation (SWAN), a multi-site, multi-ethnic, community-based study (N = 2,148). Marital happiness was measured using a single item from the Dyadic Adjustment Scale, and sleep disturbance was assessed using 4 items from the Women's Health Initiative Insomnia Rating Scale (WHIIRS). After controlling for relevant covariates, maritally happy women reported fewer sleep disturbances, with the association evident among Caucasian women and to a lesser extent among African American women.

  17. A fast learning method for large scale and multi-class samples of SVM

    NASA Astrophysics Data System (ADS)

    Fan, Yu; Guo, Huiming

    2017-06-01

    A fast learning method for multi-class support vector machine (SVM) classification based on a binary tree is presented to address the low learning efficiency of SVM when processing large-scale multi-class samples. A bottom-up method is adopted to build the binary tree hierarchy, and according to this hierarchy each node's sub-classifier learns from the corresponding samples. During learning, several class clusters are generated after a first clustering of the training samples. First, central points are extracted from the class clusters that contain only one type of sample. For clusters that contain two types of samples, the numbers of clusters for their positive and negative samples are set according to their degree of mixture, a secondary clustering is performed, and central points are then extracted from the resulting sub-class clusters. The sub-classifiers are obtained by learning from the reduced sample set formed by integrating the extracted central points. Simulation experiments show that this fast learning method, which is based on multi-level clustering, can guarantee high classification accuracy, greatly reduce the number of samples, and effectively improve learning efficiency.
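
    A minimal sketch of the sample-reduction idea only, not the paper's binary-tree method: each class's training samples are replaced by k-means cluster centres and the SVM is trained on the much smaller reduced set. The dataset, cluster counts and kernel settings are arbitrary choices, and scikit-learn is assumed to be available.

```python
# Cluster-centre sample reduction before SVM training (illustrative sketch).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=20000, n_features=20, n_informative=10,
                           n_classes=4, n_clusters_per_class=2, random_state=0)

X_red, y_red = [], []
for label in np.unique(y):
    pts = X[y == label]
    centres = KMeans(n_clusters=50, n_init=5, random_state=0).fit(pts).cluster_centers_
    X_red.append(centres)
    y_red.append(np.full(len(centres), label))
X_red, y_red = np.vstack(X_red), np.concatenate(y_red)

clf = SVC(kernel="rbf", gamma="scale").fit(X_red, y_red)   # trains on 200 points, not 20000
print("accuracy on full set:", clf.score(X, y))
```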

  18. [Research on non-rigid registration of multi-modal medical image based on Demons algorithm].

    PubMed

    Hao, Peibo; Chen, Zhen; Jiang, Shaofeng; Wang, Yang

    2014-02-01

    Non-rigid medical image registration is a popular research topic in medical imaging and has important clinical value. In this paper we put forward an improved Demons algorithm that combines a gray-level conservation model with a local structure tensor conservation model to construct a new energy function for the multi-modal registration problem. We then applied the L-BFGS algorithm to optimize the energy function and solve the resulting complex three-dimensional optimization problem. Finally, we used a multi-scale hierarchical refinement scheme to handle large-deformation registration. The experimental results showed that the proposed algorithm performs well for large-deformation and multi-modal three-dimensional medical image registration.
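
    A heavily simplified sketch of the "energy function plus L-BFGS" pattern, not the paper's non-rigid multi-modal energy: a known 2-D translation is recovered by minimizing a sum-of-squared-differences energy with SciPy's L-BFGS-B optimizer. The images are synthetic and the similarity measure is a stand-in for the gray-level/structure-tensor terms.

```python
# Recover a 2-D translation by minimizing an image-difference energy with L-BFGS-B.
import numpy as np
from scipy.ndimage import gaussian_filter, shift
from scipy.optimize import minimize

rng = np.random.default_rng(0)
fixed = gaussian_filter(rng.normal(size=(64, 64)), 3)   # synthetic "fixed" image
moving = shift(fixed, (2.5, -1.5), order=1)             # known misalignment

def energy(t):
    warped = shift(moving, t, order=1)
    return float(np.sum((warped - fixed) ** 2))          # sum-of-squared-differences

res = minimize(energy, x0=np.zeros(2), method="L-BFGS-B")
print("recovered translation:", res.x)                   # should be close to (-2.5, 1.5)
```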

  19. On the relationship between large-scale climate modes and regional synoptic patterns that drive Victorian rainfall

    NASA Astrophysics Data System (ADS)

    Verdon-Kidd, D. C.; Kiem, A. S.

    2009-04-01

    In this paper regional (synoptic) and large-scale climate drivers of rainfall are investigated for Victoria, Australia. A non-linear classification methodology known as self-organizing maps (SOM) is used to identify 20 key regional synoptic patterns, which are shown to capture a range of significant synoptic features known to influence the climate of the region. Rainfall distributions are assigned to each of the 20 patterns for nine rainfall stations located across Victoria, resulting in a clear distinction between wet and dry synoptic types at each station. The influence of large-scale climate modes on the frequency and timing of the regional synoptic patterns is also investigated. This analysis revealed that phase changes in the El Niño Southern Oscillation (ENSO), the Indian Ocean Dipole (IOD) and/or the Southern Annular Mode (SAM) are associated with a shift in the relative frequency of wet and dry synoptic types on an annual to inter-annual timescale. In addition, the relative frequency of synoptic types is shown to vary on a multi-decadal timescale, associated with changes in the Inter-decadal Pacific Oscillation (IPO). Importantly, these results highlight the potential to utilise the link between the regional synoptic patterns derived in this study and large-scale climate modes to improve rainfall forecasting for Victoria, both in the short- (i.e. seasonal) and long-term (i.e. decadal/multi-decadal scale). In addition, the regional and large-scale climate drivers identified in this study provide a benchmark by which the performance of Global Climate Models (GCMs) may be assessed.
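
    A toy self-organizing map written from scratch for illustration only (the study used a 20-node SOM on daily synoptic fields): random vectors stand in for flattened daily pressure patterns, and each day is assigned to its best-matching node, i.e. its "synoptic type". The map size, learning schedule and data are assumptions.

```python
# Minimal SOM training loop and synoptic-type assignment (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(3650, 30))            # ten years of daily fields, flattened
rows, cols, n_iter = 4, 5, 20000
weights = rng.normal(size=(rows, cols, data.shape[1]))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

for it in range(n_iter):
    lr = 0.5 * (1 - it / n_iter)              # decaying learning rate
    sigma = 2.0 * (1 - it / n_iter) + 0.5     # decaying neighbourhood radius
    x = data[rng.integers(len(data))]
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)     # best-matching unit
    influence = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
    weights += lr * influence[..., None] * (x - weights)

# Assign every day to its closest node ("synoptic type") and count frequencies
types = np.array([np.argmin(np.linalg.norm(weights.reshape(-1, 30) - d, axis=1)) for d in data])
print(np.bincount(types, minlength=rows * cols))
```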

  20. Counting on β-Diversity to Safeguard the Resilience of Estuaries

    PubMed Central

    de Juan, Silvia; Thrush, Simon F.; Hewitt, Judi E.

    2013-01-01

    Coastal ecosystems are often stressed by non-point source and cumulative effects that can lead to local-scale community homogenisation and a concomitant loss of large-scale ecological connectivity. Here we investigate the use of β-diversity as a measure of both community heterogeneity and ecological connectivity. To understand the consequences of different environmental scenarios on heterogeneity and connectivity, it is necessary to understand the scale at which different environmental factors affect β-diversity. We sampled macrofauna from intertidal sites in nine estuaries from New Zealand’s North Island that represented different degrees of stress derived from land-use. We used multiple regression models to identify relationships between β-diversity and local sediment variables, factors related to the estuarine and catchment hydrodynamics and morphology and land-based stressors. At local scales, we found higher β-diversity at sites with a relatively high total richness. At larger scales, β-diversity was positively related to γ-diversity, suggesting that a large regional species pool was linked with large-scale heterogeneity in these systems. Local environmental heterogeneity influenced β-diversity at both local and regional scales, although variables at the estuarine and catchment scales were both needed to explain large scale connectivity. The estuaries expected a priori to be the most stressed exhibited higher variance in community dissimilarity between sites and connectivity to the estuary species pool. This suggests that connectivity and heterogeneity metrics could be used to generate early warning signals of cumulative stress. PMID:23755252
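
    As a sketch of the analysis pattern only, the snippet below regresses a site-level beta-diversity metric on a few local and catchment-scale predictors by ordinary least squares. The variable names, coefficients and data are entirely synthetic and are not the study's variables.

```python
# Ordinary least-squares regression of beta-diversity on multi-scale predictors.
import numpy as np

rng = np.random.default_rng(0)
n_sites = 90
mud_content = rng.uniform(0, 40, n_sites)          # local sediment variable (%)
catchment_area = rng.lognormal(3, 1, n_sites)      # catchment-scale variable (km^2)
gamma_div = rng.uniform(20, 80, n_sites)           # regional species pool size

# Synthetic response with a positive gamma-diversity effect and a negative mud effect
beta_div = 0.4 + 0.01 * gamma_div - 0.005 * mud_content + rng.normal(0, 0.05, n_sites)

X = np.column_stack([np.ones(n_sites), mud_content, np.log(catchment_area), gamma_div])
coefs, *_ = np.linalg.lstsq(X, beta_div, rcond=None)
print(dict(zip(["intercept", "mud", "log_area", "gamma"], np.round(coefs, 4))))
```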

  1. Application of ground-penetrating radar to investigation of near-surface fault properties in the San Francisco Bay region

    USGS Publications Warehouse

    Cai, J.; McMechan, G.A.; Fisher, M.A.

    1996-01-01

    In many geologic environments, ground-penetrating radar (GPR) provides high-resolution images of near-surface Earth structure. GPR data collection is nondestructive and very economical. The scale of features detected by GPR lies between those imaged by high-resolution seismic reflection surveys and those exposed in trenches and is therefore potentially complementary to traditional techniques for fault location and mapping. Sixty-two GPR profiles were collected at 12 sites in the San Francisco Bay region. Results show that GPR data correlate with large-scale features in existing trench observations, can be used to locate faults where they are buried or where their positions are not well known, and can identify previously unknown fault segments. The best data acquired were on a profile across the San Andreas fault, traversing Pleistocene terrace deposits south of Olema in Marin County; this profile shows a complicated multi-branched fault system from the ground surface down to about 40 m, the maximum depth for which data were recorded.

  2. Reliability and Validity of the Dyadic Observed Communication Scale (DOCS).

    PubMed

    Hadley, Wendy; Stewart, Angela; Hunter, Heather L; Affleck, Katelyn; Donenberg, Geri; Diclemente, Ralph; Brown, Larry K

    2013-02-01

    We evaluated the reliability and validity of the Dyadic Observed Communication Scale (DOCS) coding scheme, which was developed to capture a range of communication components between parents and adolescents. Adolescents and their caregivers were recruited from mental health facilities for participation in a large, multi-site family-based HIV prevention intervention study. Seventy-one dyads were randomly selected from the larger study sample and coded using the DOCS at baseline. Preliminary validity and reliability of the DOCS was examined using various methods, such as comparing results to self-report measures and examining interrater reliability. Results suggest that the DOCS is a reliable and valid measure of observed communication among parent-adolescent dyads that captures both verbal and nonverbal communication behaviors that are typical intervention targets. The DOCS is a viable coding scheme for use by researchers and clinicians examining parent-adolescent communication. Coders can be trained to reliably capture individual and dyadic components of communication for parents and adolescents and this complex information can be obtained relatively quickly.

  3. LARGE-SCALE PREDICTIONS OF MOBILE SOURCE CONTRIBUTIONS TO CONCENTRATIONS OF TOXIC AIR POLLUTANTS

    EPA Science Inventory

    This presentation shows concentrations and deposition of toxic air pollutants predicted by a 3-D air quality model, the Community Multi Scale Air Quality (CMAQ) modeling system. Contributions from both on-road and non-road mobile sources are analyzed.

  4. NASA: Assessments of Selected Large-Scale Projects

    DTIC Science & Technology

    2011-03-01

    This report assesses selected large-scale NASA projects, ranging from probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the Earth, to telescopes intended to explore the universe.

  5. Large-Scale, Parallel, Multi-Sensor Data Fusion in the Cloud

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Hua, H.

    2012-12-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover & access multiple datasets from remote sites, find the space/time "matchups" between instruments swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, assemble merged datasets, and compute fused products for further scientific and statistical analysis. To efficiently assemble such decade-scale datasets in a timely manner, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. "SciReduce" is a Hadoop-like parallel analysis system, programmed in parallel python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in the Cloud. Unlike Hadoop, in which simple tuples (keys & values) are passed between the map and reduce functions, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Thus, SciReduce uses the native datatypes (geolocated grids, swaths, and points) that geo-scientists are familiar with. We are deploying within SciReduce a versatile set of python operators for data lookup, access, subsetting, co-registration, mining, fusion, and statistical analysis. All operators take in sets of geo-located arrays and generate more arrays. Large, multi-year satellite and model datasets are automatically "sharded" by time and space across a cluster of nodes so that years of data (millions of granules) can be compared or fused in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP or webification URLs, thereby minimizing the size of the stored input and intermediate datasets. A typical map function might assemble and quality control AIRS Level-2 water vapor profiles for a year of data in parallel, then a reduce function would average the profiles in lat/lon bins (again, in parallel), and a final reduce would aggregate the climatology and write it to output files. We are using SciReduce to automate the production of multiple versions of a multi-year water vapor climatology (AIRS & MODIS), stratified by Cloudsat cloud classification, and compare it to models (ECMWF & MERRA reanalysis). We will present the architecture of SciReduce, describe the achieved "clock time" speedups in fusing huge datasets on our own nodes and in the Amazon Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer.
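
    The snippet below is not SciReduce itself but a small stand-in for the map/reduce pattern the abstract describes: map tasks quality-screen simulated granules of water-vapour retrievals in parallel, and a reduce step averages them into 2-degree latitude/longitude bins. Python's multiprocessing pool replaces the cloud cluster, and the granule contents and quality flag are invented.

```python
# Map/reduce-style binned climatology from simulated granules (illustrative sketch).
import numpy as np
from multiprocessing import Pool

def map_granule(seed):
    """Simulate reading one granule and screening bad retrievals (map step)."""
    rng = np.random.default_rng(seed)
    lat = rng.uniform(-90, 90, 5000)
    lon = rng.uniform(-180, 180, 5000)
    wv = rng.gamma(2.0, 10.0, 5000)              # fake water-vapour values
    ok = wv < 80.0                               # crude quality flag
    return lat[ok], lon[ok], wv[ok]

def reduce_to_bins(results, res_deg=2.0):
    """Accumulate sums and counts per lat/lon bin, then average (reduce step)."""
    nlat, nlon = int(180 / res_deg), int(360 / res_deg)
    total, count = np.zeros((nlat, nlon)), np.zeros((nlat, nlon))
    for lat, lon, wv in results:
        i = np.clip(((lat + 90) / res_deg).astype(int), 0, nlat - 1)
        j = np.clip(((lon + 180) / res_deg).astype(int), 0, nlon - 1)
        np.add.at(total, (i, j), wv)
        np.add.at(count, (i, j), 1)
    return np.where(count > 0, total / np.maximum(count, 1), np.nan)

if __name__ == "__main__":
    with Pool(4) as pool:
        granules = pool.map(map_granule, range(365))   # one "granule" per day
    climatology = reduce_to_bins(granules)
    print(np.nanmean(climatology))
```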

  6. Large-Scale, Parallel, Multi-Sensor Data Fusion in the Cloud

    NASA Astrophysics Data System (ADS)

    Wilson, B.; Manipon, G.; Hua, H.

    2012-04-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover & access multiple datasets from remote sites, find the space/time "matchups" between instruments swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, assemble merged datasets, and compute fused products for further scientific and statistical analysis. To efficiently assemble such decade-scale datasets in a timely manner, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. "SciReduce" is a Hadoop-like parallel analysis system, programmed in parallel python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in the Cloud. Unlike Hadoop, in which simple tuples (keys & values) are passed between the map and reduce functions, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Thus, SciReduce uses the native datatypes (geolocated grids, swaths, and points) that geo-scientists are familiar with. We are deploying within SciReduce a versatile set of python operators for data lookup, access, subsetting, co-registration, mining, fusion, and statistical analysis. All operators take in sets of geo-arrays and generate more arrays. Large, multi-year satellite and model datasets are automatically "sharded" by time and space across a cluster of nodes so that years of data (millions of granules) can be compared or fused in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP or webification URLs, thereby minimizing the size of the stored input and intermediate datasets. A typical map function might assemble and quality control AIRS Level-2 water vapor profiles for a year of data in parallel, then a reduce function would average the profiles in bins (again, in parallel), and a final reduce would aggregate the climatology and write it to output files. We are using SciReduce to automate the production of multiple versions of a multi-year water vapor climatology (AIRS & MODIS), stratified by Cloudsat cloud classification, and compare it to models (ECMWF & MERRA reanalysis). We will present the architecture of SciReduce, describe the achieved "clock time" speedups in fusing huge datasets on our own nodes and in the Amazon Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer.

  7. The Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey

    NASA Astrophysics Data System (ADS)

    Squires, Gordon K.; Lubin, L. M.; Gal, R. R.

    2007-05-01

    We present the motivation, design, and latest results from the Observations of Redshift Evolution in Large Scale Environments (ORELSE) Survey, a systematic search for structure on scales greater than 10 Mpc around 20 known galaxy clusters at z > 0.6. When complete, the survey will cover nearly 5 square degrees, all targeted at high-density regions, making it complementary and comparable to field surveys such as DEEP2, GOODS, and COSMOS. For the survey, we are using the Large Format Camera on the Palomar 5-m and SuPRIME-Cam on the Subaru 8-m to obtain optical/near-infrared imaging of an approximately 30 arcmin region around previously studied high-redshift clusters. Colors are used to identify likely member galaxies which are targeted for follow-up spectroscopy with the DEep Imaging Multi-Object Spectrograph on the Keck 10-m. This technique has been used to identify successfully the Cl 1604 supercluster at z = 0.9, a large scale structure containing at least eight clusters (Gal & Lubin 2004; Gal, Lubin & Squires 2005). We present the most recent structures to be photometrically and spectroscopically confirmed through this program, discuss the properties of the member galaxies as a function of environment, and describe our planned multi-wavelength (radio, mid-IR, and X-ray) observations of these systems. The goal of this survey is to identify and examine a statistical sample of large scale structures during an active period in the assembly history of the most massive clusters. With such a sample, we can begin to constrain large scale cluster dynamics and determine the effect of the larger environment on galaxy evolution.

  8. Comparison of Multi-Scale Digital Elevation Models for Defining Waterways and Catchments Over Large Areas

    NASA Astrophysics Data System (ADS)

    Harris, B.; McDougall, K.; Barry, M.

    2012-07-01

    Digital Elevation Models (DEMs) allow for the efficient and consistent creation of waterways and catchment boundaries over large areas. Studies of waterway delineation from DEMs are usually undertaken over small or single catchment areas due to the nature of the problems being investigated. Improvements in Geographic Information Systems (GIS) techniques, software, hardware and data allow for analysis of larger data sets and also facilitate a consistent tool for the creation and analysis of waterways over extensive areas. However, rarely are they developed over large regional areas because of the lack of available raw data sets and the amount of work required to create the underlying DEMs. This paper examines definition of waterways and catchments over an area of approximately 25,000 km2 to establish the optimal DEM scale required for waterway delineation over large regional projects. The comparative study analysed multi-scale DEMs over two test areas (Wivenhoe catchment, 543 km2 and a detailed 13 km2 within the Wivenhoe catchment) including various data types, scales, quality, and variable catchment input parameters. Historic and available DEM data was compared to high resolution Lidar based DEMs to assess variations in the formation of stream networks. The results identified that, particularly in areas of high elevation change, DEMs at 20 m cell size created from broad scale 1:25,000 data (combined with more detailed data or manual delineation in flat areas) are adequate for the creation of waterways and catchments at a regional scale.
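
    For context, the sketch below implements the classic D8 flow-direction rule, one common building block of DEM-based waterway delineation; it is not the GIS workflow used in the paper, and the cell size and test surface are assumptions.

```python
# D8 flow direction: each cell drains to the neighbour with the steepest downhill slope.
import numpy as np

def d8_flow_direction(dem, cell=20.0):
    """Return, for every interior cell, the index 0-7 of its steepest-descent neighbour
    (or -1 for pits). Neighbours are ordered E, NE, N, NW, W, SW, S, SE."""
    offsets = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]
    rows, cols = dem.shape
    direction = np.full((rows, cols), -1, dtype=int)
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            best_slope, best_k = 0.0, -1
            for k, (di, dj) in enumerate(offsets):
                dist = cell * (2 ** 0.5 if di and dj else 1.0)
                slope = (dem[i, j] - dem[i + di, j + dj]) / dist
                if slope > best_slope:
                    best_slope, best_k = slope, k
            direction[i, j] = best_k
    return direction

dem = np.add.outer(np.arange(50), np.arange(50)).astype(float)   # a simple tilted plane
print(np.unique(d8_flow_direction(dem)[1:-1, 1:-1]))             # everything drains NW (index 3)
```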

  9. Algorithm and Application of Gcp-Independent Block Adjustment for Super Large-Scale Domestic High Resolution Optical Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Sun, Y. S.; Zhang, L.; Xu, B.; Zhang, Y.

    2018-04-01

    The accurate positioning of optical satellite imagery without ground control is a precondition for remote sensing applications and for small/medium-scale mapping over large areas abroad or with large volumes of imagery. In this paper, considering the geometric characteristics of optical satellite imagery and building on a widely used optimization method for constrained problems, the Alternating Direction Method of Multipliers (ADMM), together with RFM least-squares block adjustment, we propose a GCP-independent block adjustment method for large-scale domestic high-resolution optical satellite imagery, GISIBA (GCP-Independent Satellite Imagery Block Adjustment), which is easy to parallelize and highly efficient. In this method, virtual "average" control points are built to solve the rank-deficiency problem and to support qualitative and quantitative analysis in block adjustment without ground control. The test results show that the horizontal and vertical accuracies of multi-covered and multi-temporal satellite images are better than 10 m and 6 m, respectively. Meanwhile, the mosaicking problem between adjacent areas in large-area DOM production can be solved if public geographic information data are introduced as horizontal and vertical constraints in the block adjustment process. Finally, through experiments using GF-1 and ZY-3 satellite images over several typical test areas, the reliability, accuracy and performance of the developed procedure are presented and studied.

  10. Multi-scale Material Appearance

    NASA Astrophysics Data System (ADS)

    Wu, Hongzhi

    Modeling and rendering the appearance of materials is important for a diverse range of applications of computer graphics - from automobile design to movies and cultural heritage. The appearance of materials varies considerably at different scales, posing significant challenges due to the sheer complexity of the data, as well as the need to maintain inter-scale consistency constraints. This thesis presents a series of studies around the modeling, rendering and editing of multi-scale material appearance. To efficiently render material appearance at multiple scales, we develop an object-space precomputed adaptive sampling method, which precomputes a hierarchy of view-independent points that preserve multi-level appearance. To support bi-scale material appearance design, we propose a novel reflectance filtering algorithm, which rapidly computes the large-scale appearance from small-scale details by exploiting the low-rank structures of Bidirectional Visible Normal Distribution Functions and pre-rotated Bidirectional Reflectance Distribution Functions in the matrix formulation of the rendering algorithm. This approach can guide the physical realization of appearance, as well as the modeling of real-world materials using very sparse measurements. Finally, we present a bi-scale-inspired high-quality general representation for material appearance described by Bidirectional Texture Functions. Our representation is at once compact, easily editable, and amenable to efficient rendering.

  11. Multi-GNSS PPP-RTK: From Large- to Small-Scale Networks

    PubMed Central

    Nadarajah, Nandakumaran; Wang, Kan; Choudhury, Mazher

    2018-01-01

    Precise point positioning (PPP) and its integer ambiguity resolution-enabled variant, PPP-RTK (real-time kinematic), can benefit enormously from the integration of multiple global navigation satellite systems (GNSS). In such a multi-GNSS landscape, the positioning convergence time is expected to be reduced considerably as compared to the one obtained by a single-GNSS setup. It is therefore the goal of the present contribution to provide numerical insights into the role taken by the multi-GNSS integration in delivering fast and high-precision positioning solutions (sub-decimeter and centimeter levels) using PPP-RTK. To that end, we employ the Curtin PPP-RTK platform and process data-sets of GPS, BeiDou Navigation Satellite System (BDS) and Galileo in stand-alone and combined forms. The data-sets are collected by various receiver types, ranging from high-end multi-frequency geodetic receivers to low-cost single-frequency mass-market receivers. The corresponding stations form a large-scale (Australia-wide) network as well as a small-scale network with inter-station distances less than 30 km. In case of the Australia-wide GPS-only ambiguity-float setup, 90% of the horizontal positioning errors (kinematic mode) are shown to become less than five centimeters after 103 min. The stated required time is reduced to 66 min for the corresponding GPS + BDS + Galileo setup. The time is further reduced to 15 min by applying single-receiver ambiguity resolution. The outcomes are supported by the positioning results of the small-scale network. PMID:29614040

  12. Multi-GNSS PPP-RTK: From Large- to Small-Scale Networks.

    PubMed

    Nadarajah, Nandakumaran; Khodabandeh, Amir; Wang, Kan; Choudhury, Mazher; Teunissen, Peter J G

    2018-04-03

    Precise point positioning (PPP) and its integer ambiguity resolution-enabled variant, PPP-RTK (real-time kinematic), can benefit enormously from the integration of multiple global navigation satellite systems (GNSS). In such a multi-GNSS landscape, the positioning convergence time is expected to be reduced considerably as compared to the one obtained by a single-GNSS setup. It is therefore the goal of the present contribution to provide numerical insights into the role taken by the multi-GNSS integration in delivering fast and high-precision positioning solutions (sub-decimeter and centimeter levels) using PPP-RTK. To that end, we employ the Curtin PPP-RTK platform and process data-sets of GPS, BeiDou Navigation Satellite System (BDS) and Galileo in stand-alone and combined forms. The data-sets are collected by various receiver types, ranging from high-end multi-frequency geodetic receivers to low-cost single-frequency mass-market receivers. The corresponding stations form a large-scale (Australia-wide) network as well as a small-scale network with inter-station distances less than 30 km. In case of the Australia-wide GPS-only ambiguity-float setup, 90% of the horizontal positioning errors (kinematic mode) are shown to become less than five centimeters after 103 min. The stated required time is reduced to 66 min for the corresponding GPS + BDS + Galileo setup. The time is further reduced to 15 min by applying single-receiver ambiguity resolution. The outcomes are supported by the positioning results of the small-scale network.
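
    The convergence metric quoted above can be computed as sketched below, assuming horizontal-error time series from many kinematic sessions: the convergence time is taken as the first epoch at which the 90th percentile of the errors drops below 5 cm. The error series here are synthetic and do not reproduce the paper's numbers, and this is not the Curtin platform's software.

```python
# 90th-percentile convergence time from synthetic per-session error series.
import numpy as np

rng = np.random.default_rng(0)
n_sessions, n_epochs = 200, 180                       # 180 one-minute epochs
t = np.arange(1, n_epochs + 1)
# Synthetic errors that decay with time, with session-to-session spread
errors = rng.lognormal(mean=0.0, sigma=0.4, size=(n_sessions, 1)) * 0.3 / np.sqrt(t)

p90 = np.percentile(errors, 90, axis=0)               # 90th percentile per epoch [m]
converged = np.flatnonzero(p90 < 0.05)
print("90% convergence time [min]:", t[converged[0]] if converged.size else None)
```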

  13. Overview of Opportunities for Co-Location of Solar Energy Technologies and Vegetation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macknick, Jordan; Beatty, Brenda; Hill, Graham

    2013-12-01

    Large-scale solar facilities have the potential to contribute significantly to national electricity production. Many solar installations are large-scale or utility-scale, with a capacity over 1 MW and connected directly to the electric grid. Large-scale solar facilities offer an opportunity to achieve economies of scale in solar deployment, yet there have been concerns about the amount of land required for solar projects and the impact of solar projects on local habitat. During the site preparation phase for utility-scale solar facilities, developers often grade land and remove all vegetation to minimize installation and operational costs, prevent plants from shading panels, and minimize potential fire or wildlife risks. However, the common site preparation practice of removing vegetation can be avoided in certain circumstances, and there have been successful examples where solar facilities have been co-located with agricultural operations or have native vegetation growing beneath the panels. In this study we outline some of the impacts that large-scale solar facilities can have on the local environment, provide examples of installations where impacts have been minimized through co-location with vegetation, characterize the types of co-location, and give an overview of the potential benefits from co-location of solar energy projects and vegetation. The varieties of co-location can be replicated or modified for site-specific use at other solar energy installations around the world. We conclude with opportunities to improve upon our understanding of ways to reduce the environmental impacts of large-scale solar installations.

  14. Projected changes in precipitation intensity and frequency over complex topography: a multi-model perspective

    NASA Astrophysics Data System (ADS)

    Fischer, Andreas; Keller, Denise; Liniger, Mark; Rajczak, Jan; Schär, Christoph; Appenzeller, Christof

    2014-05-01

    Fundamental changes in the hydrological cycle are expected in a future warmer climate. This is of particular relevance for the Alpine region, as a source and reservoir of several major rivers in Europe and being prone to extreme events such as floodings. For this region, climate change assessments based on the ENSEMBLES regional climate models (RCMs) project a significant decrease in summer mean precipitation under the A1B emission scenario by the mid-to-end of this century, while winter mean precipitation is expected to slightly rise. From an impact perspective, projected changes in seasonal means, however, are often insufficient to adequately address the multifaceted challenges of climate change adaptation. In this study, we revisit the full matrix of the ENSEMBLES RCM projections regarding changes in frequency and intensity, precipitation-type (convective versus stratiform) and temporal structure (wet/dry spells and transition probabilities) over Switzerland and surroundings. As proxies for raintype changes, we rely on the model parameterized convective and large-scale precipitation components. Part of the analysis involves a Bayesian multi-model combination algorithm to infer changes from the multi-model ensemble. The analysis suggests a summer drying that evolves altitude-specific: over low-land regions it is associated with wet-day frequency decreases of convective and large-scale precipitation, while over elevated regions it is primarily associated with a decline in large-scale precipitation only. As a consequence, almost all the models project an increase in the convective fraction at elevated Alpine altitudes. The decrease in the number of wet days during summer is accompanied by decreases (increases) in multi-day wet (dry) spells. This shift in multi-day episodes also lowers the likelihood of short dry spell occurrence in all of the models. For spring and autumn the combined multi-model projections indicate higher mean precipitation intensity north of the Alps, while a similar tendency is expected for the winter season over most of Switzerland.

  15. Advances in Multi-Sensor Scanning and Visualization of Complex Plants: the Utmost Case of a Reactor Building

    NASA Astrophysics Data System (ADS)

    Hullo, J.-F.; Thibault, G.; Boucheny, C.

    2015-02-01

    In a context of increased maintenance operations and workers generational renewal, a nuclear owner and operator like Electricité de France (EDF) is interested in the scaling up of tools and methods of "as-built virtual reality" for larger buildings and wider audiences. However, acquisition and sharing of as-built data on a large scale (large and complex multi-floored buildings) challenge current scientific and technical capacities. In this paper, we first present a state of the art of scanning tools and methods for industrial plants with very complex architecture. Then, we introduce the inner characteristics of the multi-sensor scanning and visualization of the interior of the most complex building of a power plant: a nuclear reactor building. We introduce several developments that made possible a first complete survey of such a large building, from acquisition, processing and fusion of multiple data sources (3D laser scans, total-station survey, RGB panoramic, 2D floor plans, 3D CAD as-built models). In addition, we present the concepts of a smart application developed for the painless exploration of the whole dataset. The goal of this application is to help professionals, unfamiliar with the manipulation of such datasets, to take into account spatial constraints induced by the building complexity while preparing maintenance operations. Finally, we discuss the main feedbacks of this large experiment, the remaining issues for the generalization of such large scale surveys and the future technical and scientific challenges in the field of industrial "virtual reality".

  16. Generating large-scale estimates from sparse, in-situ networks: multi-scale soil moisture modeling at ARS watersheds for NASA’s soil moisture active passive (SMAP) calibration/validation mission

    USDA-ARS?s Scientific Manuscript database

    NASA’s SMAP satellite, launched in November of 2014, produces estimates of average volumetric soil moisture at 3, 9, and 36-kilometer scales. The calibration and validation process of these estimates requires the generation of an identically-scaled soil moisture product from existing in-situ networ...

  17. Does remote sensing help translating local SGD investigation to large spatial scales?

    NASA Astrophysics Data System (ADS)

    Moosdorf, N.; Mallast, U.; Hennig, H.; Schubert, M.; Knoeller, K.; Neehaul, Y.

    2016-02-01

    Within the last 20 years, studies on submarine groundwater discharge (SGD) have revealed numerous processes, temporal behaviors and quantitative estimates, as well as best-practice and localization methods. This plethora of information is valuable for understanding the magnitude and effects of SGD at the respective location. Yet, since local conditions vary, translating local understanding, magnitudes and effects to a regional or global scale is not trivial. In contrast, modeling approaches (e.g. the 228Ra budget) tackling SGD on a global scale do provide quantitative global estimates but have not been related to local investigations. This gap between the two approaches, local and global, and the combination and/or translation of either one to the other represents one of the major challenges the SGD community currently faces. But what if remote sensing can provide certain information that may be used as a translation between the two, similar to transfer functions in many other disciplines, allowing an extrapolation from in-situ investigated and quantified SGD (discrete information) to regional scales or beyond? Admittedly, the sketched future is ambitious and we will certainly not be able to present a solution to the raised question. Nonetheless, we will show a remote-sensing-based approach that is already able to identify potential SGD sites independent of location or hydrogeological conditions. Based on multi-temporal thermal information of the water surface as the core of the approach, SGD-influenced sites display a smaller thermal variation (thermal anomalies) than surrounding uninfluenced areas. Despite its apparent simplicity, the automated approach has helped to localize several sites that could be validated with proven in-situ methods. At the same time it carries the risk of identifying false positives, which can only be avoided if we can 'calibrate' the thermal anomalies so obtained against in-situ data. We will present the pros and cons of our approach with the intention of contributing to the solution of translating SGD investigations to larger scales.

  18. Velocity Model Using the Large-N Seismic Array from the Source Physics Experiment (SPE)

    NASA Astrophysics Data System (ADS)

    Chen, T.; Snelson, C. M.

    2016-12-01

    The Source Physics Experiment (SPE) is a multi-institutional, multi-disciplinary project that consists of a series of chemical explosions conducted at the Nevada National Security Site (NNSS). The goal of SPE is to understand the complicated effect of geological structures on seismic wave propagation and source energy partitioning, develop and validate physics-based modeling, and ultimately better monitor low-yield nuclear explosions. A Large-N seismic array was deployed at the SPE site to image the full 3D wavefield from the most recent SPE-5 explosion on April 26, 2016. The Large-N seismic array consists of 996 geophones (half three-component and half vertical-component sensors), and operated for one month, recording the SPE-5 shot, ambient noise, and additional controlled-sources (a large hammer). This study uses Large-N array recordings of the SPE-5 chemical explosion to develop high resolution images of local geologic structures. We analyze different phases of recorded seismic data and construct a velocity model based on arrival times. The results of this study will be incorporated into the large modeling and simulation efforts as ground-truth further validating the models.

  19. Analysis of Multi-Criteria Evaluation Method of Landfill Site Selection for Municipal Solid Waste Management

    NASA Astrophysics Data System (ADS)

    Mohammed, Habiba Ibrahim; Majid, Zulkepli; Yusof, Norhakim Bin; Bello Yamusa, Yamusa

    2018-03-01

    Landfilling remains the most common systematic technique of solid waste disposal in most developed and developing countries. Finding a suitable site for a landfill is a very challenging task. The landfill site selection process aims to identify suitable areas that will protect the environment and public health from pollution and hazards. Therefore, various factors such as environmental, physical, socio-economic, and geological criteria must be considered before siting any landfill. This makes the site selection process rigorous and tedious, because it involves the processing of large amounts of spatial data, rules and regulations from different agencies, and also policies from decision makers. It allows the incorporation of conflicting objectives and decision-maker preferences into spatial decision models. This paper analyzes the multi-criteria evaluation (MCE) method of landfill site selection for solid waste management by means of literature reviews and surveys. The study will help decision makers and waste management authorities to choose the most effective method when considering landfill site selection.
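
    One common MCE technique is the weighted linear combination, sketched below on invented criteria, weights and candidate-site scores; it is offered only as a generic illustration, not as the specific method recommended by the paper.

```python
# Weighted-linear-combination scoring of candidate landfill sites (illustrative sketch).
import numpy as np

criteria = ["dist_to_water", "dist_to_residential", "slope", "soil_permeability"]
weights = np.array([0.35, 0.30, 0.20, 0.15])          # must sum to 1

# Candidate sites scored 0-1 per criterion after standardisation (1 = most suitable)
scores = np.array([
    [0.9, 0.7, 0.8, 0.6],   # site A
    [0.4, 0.9, 0.6, 0.8],   # site B
    [0.7, 0.5, 0.9, 0.9],   # site C
])

suitability = scores @ weights
for name, s in zip("ABC", suitability):
    print(f"site {name}: suitability = {s:.2f}")
print("preferred site:", "ABC"[int(np.argmax(suitability))])
```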

  20. Infrastructure to support learning health systems: are we there yet? Innovative solutions and lessons learned from American Recovery and Reinvestment Act CER investments.

    PubMed

    Holve, Erin; Segal, Courtney

    2014-11-01

    The 11 big health data networks participating in the AcademyHealth Electronic Data Methods Forum represent cutting-edge efforts to harness the power of big health data for research and quality improvement. This paper is a comparative case study based on site visits conducted with a subset of these large infrastructure grants funded through the Recovery Act, in which four key issues emerge that can inform the evolution of learning health systems, including the importance of acknowledging the challenges of scaling specialized expertise needed to manage and run CER networks; the delicate balance between privacy protections and the utility of distributed networks; emerging community engagement strategies; and the complexities of developing a robust business model for multi-use networks.

  1. Dynamic occupancy models for explicit colonization processes

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Johnson, Devin S.; Altwegg, Res; Conquest, Loveday

    2016-01-01

    The dynamic, multi-season occupancy model framework has become a popular tool for modeling open populations with occupancies that change over time through local colonizations and extinctions. However, few versions of the model relate these probabilities to the occupancies of neighboring sites or patches. We present a modeling framework that incorporates this information and is capable of describing a wide variety of spatiotemporal colonization and extinction processes. A key feature of the model is that it is based on a simple set of small-scale rules describing how the process evolves. The result is a dynamic process that can account for complicated large-scale features. In our model, a site is more likely to be colonized if more of its neighbors were previously occupied and if it provides more appealing environmental characteristics than its neighboring sites. Additionally, a site without occupied neighbors may also become colonized through the inclusion of a long-distance dispersal process. Although similar model specifications have been developed for epidemiological applications, ours formally accounts for detectability using the well-known occupancy modeling framework. After demonstrating the viability and potential of this new form of dynamic occupancy model in a simulation study, we use it to obtain inference for the ongoing Common Myna (Acridotheres tristis) invasion in South Africa. Our results suggest that the Common Myna continues to enlarge its distribution and its spread via short distance movement, rather than long-distance dispersal. Overall, this new modeling framework provides a powerful tool for managers examining the drivers of colonization including short- vs. long-distance dispersal, habitat quality, and distance from source populations.
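
    A toy grid simulation of the colonization rule described above: a site's colonization probability grows with the number of occupied neighbours, a small long-distance term allows isolated colonization, and detection is imperfect. All parameter values, the grid size, and the periodic boundaries are invented for illustration and are not the fitted model.

```python
# Neighbour-dependent colonization/extinction dynamics with imperfect detection.
import numpy as np

rng = np.random.default_rng(0)
n, seasons = 30, 20                       # 30 x 30 grid of sites, 20 seasons
occ = np.zeros((n, n), dtype=bool)
occ[n // 2, n // 2] = True                # single source population
p_detect, p_extinct, p_long = 0.7, 0.05, 0.001

for _ in range(seasons):
    # Count occupied rook neighbours (periodic edges for simplicity)
    o = occ.astype(int)
    nb = np.roll(o, 1, 0) + np.roll(o, -1, 0) + np.roll(o, 1, 1) + np.roll(o, -1, 1)
    p_col = 1 - (1 - 0.2) ** nb + p_long  # more occupied neighbours, higher probability
    colonise = (~occ) & (rng.random((n, n)) < p_col)
    go_extinct = occ & (rng.random((n, n)) < p_extinct)
    occ = (occ | colonise) & ~go_extinct

detections = occ & (rng.random((n, n)) < p_detect)   # what a single survey would record
print("true occupancy:", occ.mean(), "naive (detected) occupancy:", detections.mean())
```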

  2. Nest-site selection and reproductive success of greater sage-grouse in a fire-affected habitat of northwestern Nevada

    USGS Publications Warehouse

    Lockyer, Zachary B.; Coates, Peter S.; Casazza, Michael L.; Espinosa, Shawn; Delehanty, David J.

    2015-01-01

    Identifying links between micro-habitat selection and wildlife reproduction is imperative to population persistence and recovery. This information is particularly important for landscape species such as greater sage-grouse (Centrocercus urophasianus; sage-grouse). Although this species has been widely studied, because environmental factors can affect sage-grouse populations, local and regional studies are crucial for developing viable conservation strategies. We studied the habitat-use patterns of 71 radio-marked sage-grouse inhabiting an area affected by wildfire in the Virginia Mountains of northwestern Nevada during 2009–2011 to determine the effect of micro-habitat attributes on reproductive success. We measured standard vegetation parameters at nest and random sites using a multi-scale approach (range = 0.01–15,527 ha). We used an information-theoretic modeling approach to identify environmental factors influencing nest-site selection and survival, and determine whether nest survival was a function of resource selection. Sage-grouse selected micro-sites with greater shrub canopy cover and less cheatgrass (Bromus tectorum) cover than random sites. Total shrub canopy, including sagebrush (Artemisia spp.) and other shrub species, at small spatial scales (0.8 ha and 3.1 ha) was the single contributing selection factor to higher nest survival. These results indicate that reducing the risk of wildfire to maintain important sagebrush habitats could be emphasized in sage-grouse conservation strategies in Nevada. Managers may seek to mitigate the influx of annual grass invasion by preserving large intact sagebrush-dominated stands with a mixture of other shrub species. For this area of Nevada, the results suggest that ≥40% total shrub canopy cover in sage-grouse nesting areas could yield improved reproductive success. 

  3. Multi-scale approaches for high-speed imaging and analysis of large neural populations

    PubMed Central

    Ahrens, Misha B.; Yuste, Rafael; Peterka, Darcy S.; Paninski, Liam

    2017-01-01

    Progress in modern neuroscience critically depends on our ability to observe the activity of large neuronal populations with cellular spatial and high temporal resolution. However, two bottlenecks constrain efforts towards fast imaging of large populations. First, the resulting large video data is challenging to analyze. Second, there is an explicit tradeoff between imaging speed, signal-to-noise, and field of view: with current recording technology we cannot image very large neuronal populations with simultaneously high spatial and temporal resolution. Here we describe multi-scale approaches for alleviating both of these bottlenecks. First, we show that spatial and temporal decimation techniques based on simple local averaging provide order-of-magnitude speedups in spatiotemporally demixing calcium video data into estimates of single-cell neural activity. Second, once the shapes of individual neurons have been identified at fine scale (e.g., after an initial phase of conventional imaging with standard temporal and spatial resolution), we find that the spatial/temporal resolution tradeoff shifts dramatically: after demixing we can accurately recover denoised fluorescence traces and deconvolved neural activity of each individual neuron from coarse scale data that has been spatially decimated by an order of magnitude. This offers a cheap method for compressing this large video data, and also implies that it is possible to either speed up imaging significantly, or to “zoom out” by a corresponding factor to image order-of-magnitude larger neuronal populations with minimal loss in accuracy or temporal resolution. PMID:28771570
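
    A minimal sketch of the decimation idea, assuming a (time, y, x) movie array: the data are down-sampled by simple block averaging in time and space before demixing. The array sizes and decimation factors are arbitrary and this is not the authors' full pipeline.

```python
# Spatio-temporal block-averaging decimation of a calcium-imaging movie.
import numpy as np

def decimate(movie, t_fac=4, s_fac=2):
    """Block-average a (time, y, x) movie by the given temporal/spatial factors."""
    t, y, x = movie.shape
    t2, y2, x2 = t // t_fac, y // s_fac, x // s_fac
    clipped = movie[:t2 * t_fac, :y2 * s_fac, :x2 * s_fac]
    return clipped.reshape(t2, t_fac, y2, s_fac, x2, s_fac).mean(axis=(1, 3, 5))

movie = np.random.default_rng(0).normal(size=(1000, 128, 128)).astype(np.float32)
small = decimate(movie)
print(movie.shape, "->", small.shape)   # (1000, 128, 128) -> (250, 64, 64)
```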

  4. Tracing Multi-Scale Climate Change at Low Latitude from Glacier Shrinkage

    NASA Astrophysics Data System (ADS)

    Moelg, T.; Cullen, N. J.; Hardy, D. R.; Kaser, G.

    2009-12-01

    Significant shrinkage of glaciers on top of Africa's highest mountain (Kilimanjaro, 5895 m a.s.l.) has been observed between the late 19th century and the present. Multi-year data from our automatic weather station on the largest remaining slope glacier at 5873 m allow us to force and verify a process-based distributed glacier mass balance model. This generates insights into energy and mass fluxes at the glacier-atmosphere interface, their feedbacks, and how they are linked to atmospheric conditions. By means of numerical atmospheric modeling and global climate model simulations, we explore the linkages of the local climate in Kilimanjaro's summit zone to larger-scale climate dynamics - which suggests a causal connection between Indian Ocean dynamics, mesoscale mountain circulation, and glacier mass balance. Based on this knowledge, the verified mass balance model is used for backward modeling of the steady-state glacier extent observed in the 19th century, which yields the characteristics of local climate change between that time and the present (30-45% less precipitation, 0.1-0.3 hPa less water vapor pressure, 2-4 percentage units less cloud cover at present). Our multi-scale approach provides an important contribution, from a cryospheric viewpoint, to the understanding of how large-scale climate change propagates to the tropical free troposphere. Ongoing work in this context targets the millennium-scale relation between large-scale climate and glacier behavior (by downscaling precipitation), and the possible effects of regional anthropogenic activities (land use change) on glacier mass balance.

  5. Uncertainties of Large-Scale Forcing Caused by Surface Turbulence Flux Measurements and the Impacts on Cloud Simulations at the ARM SGP Site

    NASA Astrophysics Data System (ADS)

    Tang, S.; Xie, S.; Tang, Q.; Zhang, Y.

    2017-12-01

    Two types of instruments, the eddy correlation flux measurement system (ECOR) and the energy balance Bowen ratio system (EBBR), are used at the Atmospheric Radiation Measurement (ARM) program Southern Great Plains (SGP) site to measure surface latent and sensible fluxes. ECOR and EBBR typically sample different land surface types, and the domain-mean surface fluxes derived from ECOR and EBBR are not always consistent. The uncertainties of the surface fluxes will have impacts on the derived large-scale forcing data and further affect the simulations of single-column models (SCM), cloud-resolving models (CRM) and large-eddy simulation models (LES), especially for the shallow-cumulus clouds which are mainly driven by surface forcing. This study aims to quantify the uncertainties of the large-scale forcing caused by surface turbulence flux measurements and investigate the impacts on cloud simulations using long-term observations from the ARM SGP site.

  6. Process, pattern and scale: hydrogeomorphology and plant diversity in forested wetlands across multiple spatial scales

    NASA Astrophysics Data System (ADS)

    Alexander, L.; Hupp, C. R.; Forman, R. T.

    2002-12-01

    Many geodisturbances occur across large spatial scales, spanning entire landscapes and creating ecological phenomena in their wake. Ecological study at large scales poses special problems: (1) large-scale studies require large-scale resources, and (2) sampling is not always feasible at the appropriate scale, so researchers rely on data collected at smaller scales to interpret patterns across broad regions. A criticism of landscape ecology is that findings at small spatial scales are "scaled up" and applied indiscriminately across larger spatial scales. In this research, landscape scaling is addressed through process-pattern relationships between hydrogeomorphic processes and patterns of plant diversity in forested wetlands. The research addresses: (1) whether patterns and relationships between hydrogeomorphic, vegetation, and spatial variables can transcend scale; and (2) whether data collected at small spatial scales can be used to describe patterns and relationships across larger spatial scales. Field measurements of hydrologic, geomorphic, spatial, and vegetation data were collected or calculated for fifteen 1-ha sites on forested floodplains of six Chesapeake Bay Coastal Plain streams over a total area of about 20,000 km2. Hydroperiod (day/yr), floodplain surface elevation range (m), discharge (m3/s), stream power (kg-m/s2), sediment deposition (mm/yr), relative position downstream and other variables were used in multivariate analyses to explain differences in species richness, tree diversity (Shannon-Wiener Diversity Index H'), and plant community composition at four spatial scales. Data collected at the plot (400-m2) and site (c. 1-ha) scales are applied to and tested at the river watershed and regional spatial scales. Results indicate that plant species richness and tree diversity (Shannon-Wiener diversity index H') can be described by hydrogeomorphic conditions at all scales, but are best described at the site scale. Data collected at plot and site scales are tested for spatial heterogeneity across the Chesapeake Bay Coastal Plain using a geostatistical variogram, and multiple regression analysis is used to relate plant diversity, spatial, and hydrogeomorphic variables across Coastal Plain regions and hydrologic regimes. Results indicate that relationships between hydrogeomorphic processes and patterns of plant diversity at finer scales can proxy relationships at coarser scales in some, but not all, cases. Findings also suggest that data collected at small scales can be used to describe trends across broader scales under limited conditions.
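
    The Shannon-Wiener index used for the tree-diversity response is a standard formula; the short sketch below computes it from hypothetical stem counts for a single plot (the counts are invented for illustration and are not the study's data).

    ```python
    # Minimal example of the Shannon-Wiener diversity index H' = -sum(p_i * ln p_i),
    # computed from hypothetical stem counts per tree species on one plot.
    import math

    def shannon_wiener(counts):
        """Return H' from a list of per-species abundances."""
        total = sum(counts)
        return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

    plot_counts = [12, 7, 3, 1]   # hypothetical stems of four species on a 400-m2 plot
    print(round(shannon_wiener(plot_counts), 3))
    ```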

  7. Systems Engineering Challenges for GSFC Space Science Mission Operations

    NASA Technical Reports Server (NTRS)

    Thienel, Julie; Harman, Richard R.

    2017-01-01

    The NASA Goddard Space Flight Center Space Science Mission Operations (SSMO) project currently manages 19 missions for the NASA Science Mission Directorate, within the Planetary, Astrophysics, and Heliophysics Divisions. The mission lifespans range from just a few months to more than 20 years. The WIND spacecraft, the oldest SSMO mission, was launched in 1994. SSMO spacecraft reside in low Earth, geosynchronous, highly elliptical, libration point, lunar, heliocentric, and Martian orbits. SSMO spacecraft range in size from 125 kg (Aeronomy of Ice in the Mesosphere (AIM)) to over 4000 kg (Fermi Gamma-Ray Space Telescope (Fermi)). The attitude modes include both spin and three-axis stabilized, with varying requirements on pointing accuracy. The spacecraft are operated from control centers at Goddard and off-site control centers; the Lunar Reconnaissance Orbiter (LRO), the Solar Dynamics Observatory (SDO) and Magnetospheric Multiscale (MMS) missions were built at Goddard. The Advanced Composition Explorer (ACE) and Wind are operated out of a multi-mission operations center, which will also host several SSMO-managed cubesats in 2017. This paper focuses on the systems engineering challenges for such a large and varied fleet of spacecraft.

  8. Optimal Multi-scale Demand-side Management for Continuous Power-Intensive Processes

    NASA Astrophysics Data System (ADS)

    Mitra, Sumit

    With the advent of deregulation in electricity markets and an increasing share of intermittent power generation sources, the profitability of industrial consumers that operate power-intensive processes has become directly linked to the variability in energy prices. Thus, for industrial consumers that are able to adjust to the fluctuations, time-sensitive electricity prices (as part of so-called Demand-Side Management (DSM) in the smart grid) offer potential economic incentives. In this thesis, we introduce optimization models and decomposition strategies for the multi-scale Demand-Side Management of continuous power-intensive processes. On an operational level, we derive a mode formulation for scheduling under time-sensitive electricity prices. The formulation is applied to air separation plants and cement plants to minimize the operating cost. We also describe how a mode formulation can be used for industrial combined heat and power plants that are co-located at integrated chemical sites to increase operating profit by adjusting their steam and electricity production according to their inherent flexibility. Furthermore, a robust optimization formulation is developed to address the uncertainty in electricity prices by accounting for correlations and multiple ranges in the realization of the random variables. On a strategic level, we introduce a multi-scale model that provides an understanding of the value of flexibility of the current plant configuration and the value of additional flexibility in terms of retrofits for Demand-Side Management under product demand uncertainty. The integration of multiple time scales leads to large-scale two-stage stochastic programming problems, for which we need to apply decomposition strategies in order to obtain a good solution within a reasonable amount of time. Hence, we describe two decomposition schemes that can be applied to solve two-stage stochastic programming problems: First, a hybrid bi-level decomposition scheme with novel Lagrangean-type and subset-type cuts to strengthen the relaxation. Second, an enhanced cross-decomposition scheme that integrates Benders decomposition and Lagrangean decomposition on a scenario basis. To demonstrate the effectiveness of our developed methodology, we provide several industrial case studies throughout the thesis.
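
    The operational idea of shifting power-intensive production into low-price periods can be illustrated with a toy linear program; this is a minimal sketch with made-up prices, capacities and a made-up daily requirement, not the thesis' mode formulation or its decomposition schemes.

    ```python
    # Toy load-shifting LP: choose per-period power levels that meet a daily production
    # requirement at minimum electricity cost. All numbers are hypothetical.
    import numpy as np
    from scipy.optimize import linprog

    prices = np.array([30, 25, 20, 22, 45, 60, 55, 40], dtype=float)   # $/MWh per period
    hours = len(prices)
    daily_requirement = 300.0        # total MWh that must be consumed over the horizon
    p_min, p_max = 20.0, 60.0        # plant operating range per period, MW

    res = linprog(
        c=prices,                                            # minimize sum_t price_t * p_t
        A_eq=np.ones((1, hours)), b_eq=[daily_requirement],  # meet the total requirement
        bounds=[(p_min, p_max)] * hours,                     # per-period capacity limits
    )
    print(res.x.round(1), round(res.fun, 1))   # cheap periods run at p_max, expensive ones at p_min
    ```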

  9. Preliminary results on the fracture analysis of multi-site cracking of lap joints in aircraft skins

    NASA Astrophysics Data System (ADS)

    Beuth, J. L., Jr.; Hutchinson, John W.

    1992-07-01

    Results of a fracture mechanics analysis relevant to fatigue crack growth at rivets in lap joints of aircraft skins are presented. Multi-site damage (MSD) is receiving increased attention within the context of problems of aging aircraft. Fracture analyses previously carried out include small-scale modeling of rivet/skin interactions, larger-scale two-dimensional models of lap joints similar to that developed here, and full scale three-dimensional models of large portions of the aircraft fuselage. Fatigue testing efforts have included flat coupon specimens, two-dimensional lap joint tests, and full scale tests on specimens designed to closely duplicate aircraft sections. Most of this work is documented in the proceedings of previous symposia on the aging aircraft problem. The effect MSD has on the ability of skin stiffeners to arrest the growth of long skin cracks is a particularly important topic that remains to be addressed. One of the most striking features of MSD observed in joints of some test sections and in the joints of some of the older aircraft fuselages is the relative uniformity of the fatigue cracks from rivet to rivet along an extended row of rivets. This regularity suggests that nucleation of the cracks must not be overly difficult. Moreover, it indicates that there is some mechanism which keeps longer cracks from running away from shorter ones, or, equivalently, a mechanism for shorter cracks to catch up with longer cracks. This basic mechanism has not been identified, and one of the objectives of the work is to see to what extent the mechanism is revealed by a fracture analysis of the MSD cracks. Another related aim is to present accurate stress intensity factor variations with crack length which can be used to estimate fatigue crack growth lifetimes once cracks have been initiated. Results are presented which illustrate the influence of load shedding from rivets with long cracks to neighboring rivets with shorter cracks. Results are also included for the effect of residual stress due to the riveting process itself.

  10. Preliminary results on the fracture analysis of multi-site cracking of lap joints in aircraft skins

    NASA Technical Reports Server (NTRS)

    Beuth, J. L., Jr.; Hutchinson, John W.

    1992-01-01

    Results of a fracture mechanics analysis relevant to fatigue crack growth at rivets in lap joints of aircraft skins are presented. Multi-site damage (MSD) is receiving increased attention within the context of problems of aging aircraft. Fracture analyses previously carried out include small-scale modeling of rivet/skin interactions, larger-scale two-dimensional models of lap joints similar to that developed here, and full scale three-dimensional models of large portions of the aircraft fuselage. Fatigue testing efforts have included flat coupon specimens, two-dimensional lap joint tests, and full scale tests on specimens designed to closely duplicate aircraft sections. Most of this work is documented in the proceedings of previous symposia on the aging aircraft problem. The effect MSD has on the ability of skin stiffeners to arrest the growth of long skin cracks is a particularly important topic that remains to be addressed. One of the most striking features of MSD observed in joints of some test sections and in the joints of some of the older aircraft fuselages is the relative uniformity of the fatigue cracks from rivet to rivet along an extended row of rivets. This regularity suggests that nucleation of the cracks must not be overly difficult. Moreover, it indicates that there is some mechanism which keeps longer cracks from running away from shorter ones, or, equivalently, a mechanism for shorter cracks to catch up with longer cracks. This basic mechanism has not been identified, and one of the objectives of the work is to see to what extent the mechanism is revealed by a fracture analysis of the MSD cracks. Another related aim is to present accurate stress intensity factor variations with crack length which can be used to estimate fatigue crack growth lifetimes once cracks have been initiated. Results are presented which illustrate the influence of load shedding from rivets with long cracks to neighboring rivets with shorter cracks. Results are also included for the effect of residual stress due to the riveting process itself.
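
    To connect stress intensity factors to crack-growth lifetime estimates in the way the abstract describes, one can integrate a Paris-type growth law; the sketch below does this for a single idealized through-crack using the textbook relation K = σ√(πa) and made-up material constants, ignoring the rivet-row geometry and load-shedding effects that are the actual subject of the paper.

    ```python
    # Illustrative only: fatigue growth of one through-crack with K = sigma*sqrt(pi*a)
    # and the Paris law da/dN = C * (dK)^m. C and m are hypothetical values.
    import math

    def cycles_to_grow(a0, af, dsigma, C=1e-11, m=3.0, dN=1000):
        """Integrate Paris-law growth from crack length a0 to af (m) under stress range dsigma (MPa)."""
        a, N = a0, 0
        while a < af:
            dK = dsigma * math.sqrt(math.pi * a)   # stress intensity range, MPa*sqrt(m)
            a += C * dK ** m * dN                  # crack extension over a block of dN cycles
            N += dN
        return N

    print(cycles_to_grow(a0=0.002, af=0.02, dsigma=100.0))   # rough cycle count, toy numbers
    ```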

  11. Silviculture and multi-resource management case studies for southwestern pinyon-juniper woodlands

    Treesearch

    Gerald J. Gottfried

    2008-01-01

    Southwestern pinyon-juniper and juniper woodlands cover large areas of the Western United States. The woodlands are heterogeneous, consisting of numerous combinations of tree, shrub, and herbaceous species and stand densities that are representative of the wide range of sites and habitat types they occupy. Silvicultural methods can be employed on better sites to meet...

  12. Multi-focal multiphoton lithography.

    PubMed

    Ritschdorff, Eric T; Nielson, Rex; Shear, Jason B

    2012-03-07

    Multiphoton lithography (MPL) provides unparalleled capabilities for creating high-resolution, three-dimensional (3D) materials from a broad spectrum of building blocks and with few limitations on geometry, qualities that have been key to the design of chemically, mechanically, and biologically functional microforms. Unfortunately, the reliance of MPL on laser scanning limits the speed at which fabrication can be performed, making it impractical in many instances to produce large-scale, high-resolution objects such as complex micromachines, 3D microfluidics, etc. Previously, others have demonstrated the possibility of using multiple laser foci to simultaneously perform MPL at numerous sites in parallel, but use of a stage-scanning system to specify fabrication coordinates resulted in the production of identical features at each focal position. As a more general solution to the bottleneck problem, we demonstrate here the feasibility for performing multi-focal MPL using a dynamic mask to differentially modulate foci, an approach that enables each fabrication site to create independent (uncorrelated) features within a larger, integrated microform. In this proof-of-concept study, two simultaneously scanned foci produced the expected two-fold decrease in fabrication time, and this approach could be readily extended to many scanning foci by using a more powerful laser. Finally, we show that use of multiple foci in MPL can be exploited to assign heterogeneous properties (such as differential swelling) to micromaterials at distinct positions within a fabrication zone.

  13. Multi-format all-optical processing based on a large-scale, hybridly integrated photonic circuit.

    PubMed

    Bougioukos, M; Kouloumentas, Ch; Spyropoulou, M; Giannoulis, G; Kalavrouziotis, D; Maziotis, A; Bakopoulos, P; Harmon, R; Rogers, D; Harrison, J; Poustie, A; Maxwell, G; Avramopoulos, H

    2011-06-06

    We investigate through numerical studies and experiments the performance of a large scale, silica-on-silicon photonic integrated circuit for multi-format regeneration and wavelength-conversion. The circuit encompasses a monolithically integrated array of four SOAs inside two parallel Mach-Zehnder structures, four delay interferometers and a large number of silica waveguides and couplers. Exploiting phase-incoherent techniques, the circuit is capable of processing OOK signals at variable bit rates, DPSK signals at 22 or 44 Gb/s and DQPSK signals at 44 Gbaud. Simulation studies reveal the wavelength-conversion potential of the circuit with enhanced regenerative capabilities for OOK and DPSK modulation formats and acceptable quality degradation for DQPSK format. Regeneration of 22 Gb/s OOK signals with amplified spontaneous emission (ASE) noise and DPSK data signals degraded with amplitude, phase and ASE noise is experimentally validated demonstrating a power penalty improvement up to 1.5 dB.

  14. ACCURATE CHEMICAL MASTER EQUATION SOLUTION USING MULTI-FINITE BUFFERS

    PubMed Central

    Cao, Youfang; Terebus, Anna; Liang, Jie

    2016-01-01

    The discrete chemical master equation (dCME) provides a fundamental framework for studying stochasticity in mesoscopic networks. Because of the multi-scale nature of many networks where reaction rates have large disparity, directly solving dCMEs is intractable due to the exploding size of the state space. It is important to truncate the state space effectively with quantified errors, so accurate solutions can be computed. It is also important to know if all major probabilistic peaks have been computed. Here we introduce the Accurate CME (ACME) algorithm for obtaining direct solutions to dCMEs. With multi-finite buffers for reducing the state space by O(n!), exact steady-state and time-evolving network probability landscapes can be computed. We further describe a theoretical framework of aggregating microstates into a smaller number of macrostates by decomposing a network into independent aggregated birth and death processes, and give an a priori method for rapidly determining steady-state truncation errors. The maximal sizes of the finite buffers for a given error tolerance can also be pre-computed without costly trial solutions of dCMEs. We show exactly computed probability landscapes of three multi-scale networks, namely, a 6-node toggle switch, 11-node phage-lambda epigenetic circuit, and 16-node MAPK cascade network, the latter two with no known solutions. We also show how probabilities of rare events can be computed from first-passage times, another class of unsolved problems challenging for simulation-based techniques due to large separations in time scales. Overall, the ACME method enables accurate and efficient solutions of the dCME for a large class of networks. PMID:27761104
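
    The core idea of truncating the dCME state space to a finite buffer can be seen on a toy birth-death network; the sketch below builds the truncated rate matrix and solves for its steady state directly. It illustrates the concept only and is not the ACME algorithm; the rate constants and buffer size are assumptions.

    ```python
    # Toy illustration of finite-buffer CME truncation: a single birth-death network
    # (0 -> X at rate k, X -> 0 at rate g*n) truncated at buffer size N, with the
    # steady state obtained from the truncated rate matrix.
    import numpy as np

    def birth_death_steady_state(k=10.0, g=1.0, N=50):
        A = np.zeros((N + 1, N + 1))          # generator on states n = 0..N (columns = source states)
        for n in range(N + 1):
            if n < N:
                A[n + 1, n] += k              # birth n -> n+1
                A[n, n] -= k
            if n > 0:
                A[n - 1, n] += g * n          # death n -> n-1
                A[n, n] -= g * n
        # Steady state: A p = 0 with sum(p) = 1, solved as an augmented least-squares system.
        M = np.vstack([A, np.ones(N + 1)])
        b = np.zeros(N + 2)
        b[-1] = 1.0
        p, *_ = np.linalg.lstsq(M, b, rcond=None)
        return p

    p = birth_death_steady_state()
    print(p.argmax())   # mode sits near k/g = 10; truncation error is negligible for N = 50
    ```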

  15. Global calibration of multi-cameras with non-overlapping fields of view based on photogrammetry and reconfigurable target

    NASA Astrophysics Data System (ADS)

    Xia, Renbo; Hu, Maobang; Zhao, Jibin; Chen, Songlin; Chen, Yueling

    2018-06-01

    Multi-camera vision systems are often needed to achieve large-scale and high-precision measurement because these systems have larger fields of view (FOV) than a single camera. Multiple cameras may have no or narrow overlapping FOVs in many applications, which pose a huge challenge to global calibration. This paper presents a global calibration method for multi-cameras without overlapping FOVs based on photogrammetry technology and a reconfigurable target. Firstly, two planar targets are fixed together and made into a long target according to the distance between the two cameras to be calibrated. The relative positions of the two planar targets can be obtained by photogrammetric methods and used as invariant constraints in global calibration. Then, the reprojection errors of target feature points in the two cameras’ coordinate systems are calculated at the same time and optimized by the Levenberg–Marquardt algorithm to find the optimal solution of the transformation matrix between the two cameras. Finally, all the camera coordinate systems are converted to the reference coordinate system in order to achieve global calibration. Experiments show that the proposed method has the advantages of high accuracy (the RMS error is 0.04 mm) and low cost and is especially suitable for on-site calibration.
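
    The heart of this kind of global calibration is estimating the rigid transform between two camera coordinate frames from shared target points. As a simplified stand-in for the paper's Levenberg-Marquardt reprojection-error refinement, the sketch below uses the closed-form SVD (Kabsch) solution on synthetic, noise-free correspondences.

    ```python
    # Simplified sketch: estimate the rigid transform camera1 -> camera2 from the same
    # target feature points expressed in both frames (synthetic data, closed-form solution).
    import numpy as np

    def rigid_transform(P, Q):
        """Return R, t minimizing ||R @ p + t - q|| over corresponding Nx3 point sets P -> Q."""
        cP, cQ = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cP).T @ (Q - cQ)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
        R = Vt.T @ D @ U.T
        t = cQ - R @ cP
        return R, t

    rng = np.random.default_rng(0)
    P = rng.uniform(-1, 1, size=(20, 3))                   # target points in the camera 1 frame
    R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
    Q = (R_true @ P.T).T + np.array([0.5, 2.0, -0.1])      # same points in the camera 2 frame
    R_est, t_est = rigid_transform(P, Q)
    print(np.allclose(R_est, R_true), np.round(t_est, 3))
    ```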

  16. ESRI applications of GIS technology: Mineral resource development

    NASA Technical Reports Server (NTRS)

    Derrenbacher, W.

    1981-01-01

    The application of geographic information systems technology to large scale regional assessment related to mineral resource development, identifying candidate sites for related industry, and evaluating sites for waste disposal is discussed. Efforts to develop data bases were conducted at scales ranging from 1:3,000,000 to 1:25,000. In several instances, broad screening was conducted for large areas at a very general scale with more detailed studies subsequently undertaken in promising areas windowed out of the generalized data base. Increasingly, the systems which are developed are structured as the spatial framework for the long-term collection, storage, referencing, and retrieval of vast amounts of data about large regions. Typically, the reconnaissance data base for a large region is structured at 1:250,000 scale, data bases for smaller areas being structured at 1:25,000, 1:50,000 or 1:63,360. An integrated data base for the coterminous US was implemented at a scale of 1:3,000,000 for two separate efforts.

  17. Large-scale standardized phenotyping of strawberry in RosBREED

    USDA-ARS?s Scientific Manuscript database

    A large, multi-institutional, international, research project with the goal of bringing genomicists and plant breeders together was funded by USDA-NIFA Specialty Crop Research Initiative. Apple, cherry, peach, and strawberry are the Rosaceous crops included in the project. Many (900+) strawberry g...

  18. Evaluating a community-based program to improve healthcare quality: research design for the Aligning Forces for Quality initiative.

    PubMed

    Scanlon, Dennis P; Alexander, Jeffrey A; Beich, Jeff; Christianson, Jon B; Hasnain-Wynia, Romana; McHugh, Megan C; Mittler, Jessica N; Shi, Yunfeng; Bodenschatz, Laura J

    2012-09-01

    The Aligning Forces for Quality (AF4Q) initiative is the Robert Wood Johnson Foundation's (RWJF's) signature effort to increase the overall quality of healthcare in targeted communities throughout the country. In addition to sponsoring this 16-site, complex program, the RWJF funds an independent scientific evaluation to support objective research on the initiative's effectiveness and contributions to basic knowledge in 5 core programmatic areas. The research design, data, and challenges faced in the evaluation of this 10-year initiative are discussed. A descriptive overview of the evaluation research design for a multi-site, community based, healthcare quality improvement initiative is provided. The multiphase research design employed by the evaluation team is discussed. Evaluation provides formative feedback to the RWJF, participants, and other interested audiences in real time; develops approaches to assess innovative and under-studied interventions; furthers the analysis and understanding of effective community-based collaborative work in healthcare; and helps to differentiate the various facilitators, barriers, and contextual dimensions that affect the implementation and outcomes of community-based health interventions. The AF4Q initiative is arguably the largest community-level healthcare improvement demonstration in the United States to date; it is being implemented at a time of rapid change in national healthcare policy. The implementation of large-scale, multi-site initiatives is becoming an increasingly common approach for addressing problems in healthcare. The evaluation research design for the AF4Q initiative, and the lessons learned from its approach, may be valuable to others tasked with evaluating similar community-based initiatives.

  19. Multi-resource and multi-scale approaches for meeting the challenge of managing multiple species

    Treesearch

    Frank R. Thompson; Deborah M. Finch; John R. Probst; Glen D. Gaines; David S. Dobkin

    1999-01-01

    The large number of Neotropical migratory bird (NTMB) species and their diverse habitat requirements create conflicts and difficulties for land managers and conservationists. We provide examples of assessments or conservation efforts that attempt to address the problem of managing for multiple NTMB species. We advocate approaches at a variety of spatial and geographic...

  20. Optical interconnect for large-scale systems

    NASA Astrophysics Data System (ADS)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  1. Connecting Smartphone and Wearable Fitness Tracker Data with a Nationally Used Electronic Health Record System for Diabetes Education to Facilitate Behavioral Goal Monitoring in Diabetes Care: Protocol for a Pragmatic Multi-Site Randomized Trial

    PubMed Central

    Coleman, Deidra Carroll; Kanter, Justin; Ummer, Brad; Siminerio, Linda

    2018-01-01

    Background Mobile and wearable technology have been shown to be effective in improving diabetes self-management; however, integrating data from these technologies into clinical diabetes care to facilitate behavioral goal monitoring has not been explored. Objective The objective of this paper is to report on a study protocol for a pragmatic multi-site trial along with the intervention components, including the detailed connected health interface. This interface was developed to integrate patient self-monitoring data collected from a wearable fitness tracker and its companion smartphone app to an electronic health record system for diabetes self-management education and support (DSMES) to facilitate behavioral goal monitoring. Methods A 3-month multi-site pragmatic clinical trial was conducted with eligible patients with diabetes mellitus from DSMES programs. The Chronicle Diabetes system is currently freely available to diabetes educators through American Diabetes Association–recognized DSMES programs to set patient nutrition and physical activity goals. To integrate the goal-setting and self-monitoring intervention into the DSMES process, a connected interface in the Chronicle Diabetes system was developed. With the connected interface, patient self-monitoring information collected from smartphones and wearable fitness trackers can facilitate educators’ monitoring of patients’ adherence to their goals. Feasibility outcomes of the 3-month trial included hemoglobin A1c levels, weight, and the usability of the connected system. Results An interface designed to connect data from a wearable fitness tracker with a companion smartphone app for nutrition and physical activity self-monitoring into a diabetes education electronic health record system was successfully developed to enable diabetes educators to facilitate goal setting and monitoring. A total of 60 eligible patients with type 2 diabetes mellitus were randomized into either group 1) standard diabetes education or 2) standard education enhanced with the connected system. Data collection for the 3-month pragmatic trial is completed. Data analysis is in progress. Conclusions If results of the pragmatic multi-site clinical trial show preliminary efficacy and usability of the connected system, a large-scale implementation trial will be conducted. Trial Registration ClinicalTrials.gov NCT02664233; https://clinicaltrials.gov/ct2/show/NCT02664233 (Archived by WebCite at http://www.webcitation.org/6yDEwXHo5) PMID:29610111

  2. Seasonal and Diel Vocalization Patterns of Antarctic Blue Whale (Balaenoptera musculus intermedia) in the Southern Indian Ocean: A Multi-Year and Multi-Site Study.

    PubMed

    Leroy, Emmanuelle C; Samaran, Flore; Bonnel, Julien; Royer, Jean-Yves

    2016-01-01

    Passive acoustic monitoring is an efficient way to provide insights on the ecology of large whales. This approach allows for long-term and species-specific monitoring over large areas. In this study, we examined six years (2010 to 2015) of continuous acoustic recordings at up to seven different locations in the Central and Southern Indian Basin to assess the peak periods of presence, seasonality and migration movements of Antarctic blue whales (Balaenoptera musculus intermedia). An automated method is used to detect the Antarctic blue whale stereotyped call, known as Z-call. Detection results are analyzed in terms of distribution, seasonal presence and diel pattern of emission at each site. Z-calls are detected year-round at each site, except for one located in the equatorial Indian Ocean, and display highly seasonal distribution. This seasonality is stable across years for every site, but varies between sites. Z-calls are mainly detected during autumn and spring at the subantarctic locations, suggesting that these sites are on the Antarctic blue whale migration routes, and mostly during winter at the subtropical sites. In addition to these seasonal trends, there is a significant diel pattern in Z-call emission, with more Z-calls in daytime than in nighttime. This diel pattern may be related to the blue whale feeding ecology.

  3. Seasonal and Diel Vocalization Patterns of Antarctic Blue Whale (Balaenoptera musculus intermedia) in the Southern Indian Ocean: A Multi-Year and Multi-Site Study

    PubMed Central

    Leroy, Emmanuelle C.; Samaran, Flore; Bonnel, Julien; Royer, Jean-Yves

    2016-01-01

    Passive acoustic monitoring is an efficient way to provide insights on the ecology of large whales. This approach allows for long-term and species-specific monitoring over large areas. In this study, we examined six years (2010 to 2015) of continuous acoustic recordings at up to seven different locations in the Central and Southern Indian Basin to assess the peak periods of presence, seasonality and migration movements of Antarctic blue whales (Balaenoptera musculus intermedia). An automated method is used to detect the Antarctic blue whale stereotyped call, known as Z-call. Detection results are analyzed in terms of distribution, seasonal presence and diel pattern of emission at each site. Z-calls are detected year-round at each site, except for one located in the equatorial Indian Ocean, and display highly seasonal distribution. This seasonality is stable across years for every site, but varies between sites. Z-calls are mainly detected during autumn and spring at the subantarctic locations, suggesting that these sites are on the Antarctic blue whale migration routes, and mostly during winter at the subtropical sites. In addition to these seasonal trends, there is a significant diel pattern in Z-call emission, with more Z-calls in daytime than in nighttime. This diel pattern may be related to the blue whale feeding ecology. PMID:27828976
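
    A generic way to detect a stereotyped call such as the Z-call is to cross-correlate the recording with a call template and threshold the result; the sketch below does this on synthetic data and is not the detector used in the study (the template, sampling rate and threshold are placeholders).

    ```python
    # Generic template-based call detection via normalized cross-correlation (toy example).
    import numpy as np
    from scipy.signal import correlate

    def detect_calls(recording, template, threshold=0.6):
        """Return sample indices where the normalized cross-correlation exceeds threshold."""
        template = (template - template.mean()) / template.std()
        rec = (recording - recording.mean()) / recording.std()
        corr = correlate(rec, template, mode="valid") / len(template)
        return np.where(corr > threshold)[0]

    fs = 250                                              # Hz, plausible for low-frequency recordings
    t = np.arange(0, 2, 1 / fs)
    template = np.sin(2 * np.pi * 26 * t)                 # crude stand-in for a ~26 Hz call unit
    recording = np.random.randn(fs * 60) * 0.5            # one minute of noise...
    recording[fs * 20: fs * 20 + len(template)] += template   # ...with one embedded call
    print(detect_calls(recording, template)[:10])         # detections cluster around 20 s
    ```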

  4. Spreaders and Sponges define metastasis in lung cancer: A Markov chain Monte Carlo Mathematical Model

    PubMed Central

    Newton, Paul K.; Mason, Jeremy; Bethel, Kelly; Bazhenova, Lyudmila; Nieva, Jorge; Norton, Larry; Kuhn, Peter

    2013-01-01

    The classic view of metastatic cancer progression is that it is a unidirectional process initiated at the primary tumor site, progressing to variably distant metastatic sites in a fairly predictable, though not perfectly understood, fashion. A Markov chain Monte Carlo mathematical approach can determine a pathway diagram that classifies metastatic tumors as ‘spreaders’ or ‘sponges’ and orders the timescales of progression from site to site. In light of recent experimental evidence highlighting the potential significance of self-seeding of primary tumors, we use a Markov chain Monte Carlo (MCMC) approach, based on large autopsy data sets, to quantify the stochastic, systemic, and often multi-directional aspects of cancer progression. We quantify three types of multi-directional mechanisms of progression: (i) self-seeding of the primary tumor; (ii) re-seeding of the primary tumor from a metastatic site (primary re-seeding); and (iii) re-seeding of metastatic tumors (metastasis re-seeding). The model shows that the combined characteristics of the primary and the first metastatic site to which it spreads largely determine the future pathways and timescales of systemic disease. For lung cancer, the main ‘spreaders’ of systemic disease are the adrenal gland and kidney, whereas the main ‘sponges’ are regional lymph nodes, liver, and bone. Lung is a significant self-seeder, although it is a ‘sponge’ site with respect to progression characteristics. PMID:23447576
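
    The 'spreader'/'sponge' picture can be illustrated with a toy Markov chain over a few anatomical sites; the transition probabilities below are invented for illustration and are not the paper's autopsy-derived estimates.

    ```python
    # Toy Markov-chain sketch of metastatic progression: rows are "from" sites, row sums are 1.
    import numpy as np

    sites = ["lung", "adrenal", "kidney", "lymph", "liver", "bone"]
    T = np.array([
        [0.30, 0.15, 0.10, 0.20, 0.15, 0.10],   # lung primary (also self-seeds)
        [0.20, 0.20, 0.10, 0.20, 0.20, 0.10],   # adrenal ("spreader": passes disease on)
        [0.20, 0.10, 0.20, 0.20, 0.20, 0.10],   # kidney ("spreader")
        [0.05, 0.05, 0.05, 0.70, 0.10, 0.05],   # lymph nodes ("sponge": mostly retains)
        [0.05, 0.05, 0.05, 0.10, 0.70, 0.05],   # liver ("sponge")
        [0.05, 0.05, 0.05, 0.10, 0.05, 0.70],   # bone ("sponge")
    ])
    p = np.array([1.0, 0, 0, 0, 0, 0])           # disease starts at the lung primary
    for _ in range(5):                           # propagate five transition steps
        p = p @ T
    print(dict(zip(sites, p.round(3))))          # probability mass accumulates in the "sponge" sites
    ```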

  5. A Systematic Multi-Time Scale Solution for Regional Power Grid Operation

    NASA Astrophysics Data System (ADS)

    Zhu, W. J.; Liu, Z. G.; Cheng, T.; Hu, B. Q.; Liu, X. Z.; Zhou, Y. F.

    2017-10-01

    Many aspects need to be taken into consideration when making scheduling plans for a regional grid. In this paper, a systematic multi-time-scale solution for regional power grid operation is proposed that considers large-scale renewable energy integration and Ultra High Voltage (UHV) power transmission. In terms of time scales, the problem spans monthly, weekly, day-ahead, within-day and day-behind horizons, and the system also handles multiple generator types, including thermal units, hydro plants, wind turbines and pumped-storage stations. The nine subsystems of the scheduling system are described, and their functions and relationships are elaborated. The proposed system has been constructed in a provincial power grid in Central China, and the operation results further verified the effectiveness of the system.

  6. Profitability and sustainability of small - medium scale palm biodiesel plant

    NASA Astrophysics Data System (ADS)

    Solikhah, Maharani Dewi; Kismanto, Agus; Raksodewanto, Agus; Peryoga, Yoga

    2017-06-01

    The mandatory blending of biodiesel at 20% (B20) has been in effect since January 2016. It creates a huge market for the biodiesel industry. Building a large-scale biodiesel plant (> 100,000 tons/year) is most favorable for biodiesel producers, since it gives a lower production cost. This cost becomes a challenge for small to medium scale biodiesel plants. However, current biodiesel plants in Indonesia are located mainly in Java and Sumatra and distribute biodiesel around Indonesia, so there is an additional cost for transportation from area to area. This factor becomes an opportunity for small to medium scale biodiesel plants to compete with the large ones. This paper discusses the profitability of small to medium scale biodiesel plants, based on a study conducted for a capacity of 50 tons/day using CPO and its derivatives. The study was carried out by performing economic analyses of scenarios in which the biodiesel plant uses stearin, PFAD, or multiple feedstocks as raw material. The feasibility of the scenarios was also compared with respect to the effects of transportation cost and selling price. The economic assessment shows that profitability is highly affected by raw material price, so it is important to secure the source of raw materials and to consider a multi-feedstock design for small to medium scale biodiesel plants to become sustainable. It was concluded that small to medium scale biodiesel plants will be profitable and sustainable if they are connected to a palm oil mill, have a captive market, and are located at least 200 km from other biodiesel plants. The use of multiple feedstocks could increase the IRR from 18.68% to 56.52%.
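
    The IRR figures quoted above come from a discounted cash-flow analysis; the sketch below shows the standard internal-rate-of-return calculation on purely hypothetical cash flows (the investment and revenue numbers are illustrative, not the study's).

    ```python
    # Minimal internal-rate-of-return (IRR) calculation by bisection on the NPV.
    def npv(rate, cashflows):
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

    def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-6):
        """Bisection on NPV(rate) = 0; assumes a single sign change in the cash-flow series."""
        while hi - lo > tol:
            mid = (lo + hi) / 2.0
            if npv(mid, cashflows) > 0:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2.0

    flows = [-5_000_000] + [1_200_000] * 10     # year-0 investment, then 10 years of net revenue
    print(f"IRR = {irr(flows):.1%}")
    ```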

  7. Centennial-scale winter climate variability over the last two millennia in the northern Gulf of Mexico based on paired δ18O and Mg/Ca in Globorotalia truncatulinoides

    NASA Astrophysics Data System (ADS)

    Fortiz, V.; Thirumalai, K.; Richey, J. N.; Quinn, T. M.

    2014-12-01

    We present a replicated record of paired foraminiferal δ18O and Mg/Ca variations in multi-cores collected from the Garrison Basin (26°43'N, 93°55'W) in the northern Gulf of Mexico (GOM). Using δ18O (sea surface temperature, SST; sea surface salinity, SSS proxy) and Mg/Ca (SST proxy) variations in the non-encrusted planktic foraminifer Globorotalia truncatulinoides, we produce time series spanning the last two millennia that are characterized by centennial-scale climate variability. We interpret geochemical variations in G. truncatulinoides to reflect winter climate variability because data from a sediment trap, located ~350 km east of the core site, reveal that the annual flux of G. truncatulinoides is heavily weighted towards winter (peak production in January-February; Spear et al., 2011). Similar centennial-scale variability is also observed in the foraminiferal geochemistry of Globigerinoides ruber in the same multi-cores, which likely reflects mean annual climate variations. Our replicated results and comparisons to other SST reconstructions from the region lend confidence that the northern GOM surface ocean underwent large, centennial-scale variability, most likely dominated by changes in winter climate. This variability occurred in a time period when climate forcing was small and background conditions were similar to pre-industrial times. References: Spear, J.W., Poore, R.Z., and Quinn, T.M., 2011. Globorotalia truncatulinoides (dextral) Mg/Ca as a proxy for Gulf of Mexico winter mixed-layer temperature: evidence from a sediment trap in the northern Gulf of Mexico. Marine Micropaleontology, 80, 53-61.
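
    Mg/Ca-based SST proxies generally rest on an exponential calibration of the form Mg/Ca = B * exp(A * T); the sketch below inverts that generic relation with placeholder coefficients, which are not the species-specific calibration used in this study.

    ```python
    # Sketch of the generic exponential Mg/Ca-temperature calibration, inverted to estimate
    # calcification temperature. A and B below are illustrative placeholders only.
    import math

    def mgca_to_temperature(mgca, A=0.09, B=0.38):
        """Invert Mg/Ca = B * exp(A * T): T = ln(Mg/Ca / B) / A (T in deg C, Mg/Ca in mmol/mol)."""
        return math.log(mgca / B) / A

    for ratio in (2.0, 2.5, 3.0):
        print(ratio, round(mgca_to_temperature(ratio), 1))
    ```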

  8. Ice911 Research: Preserving and Rebuilding Multi-Year Ice

    NASA Astrophysics Data System (ADS)

    Field, L. A.; Chetty, S.; Manzara, A.

    2013-12-01

    A localized surface albedo modification technique is being developed that shows promise as a method to increase multi-year ice using reflective floating materials, chosen so as to have low subsidiary environmental impact. Multi-year ice has diminished rapidly in the Arctic over the past 3 decades (Riihela et al, Nature Climate Change, August 4, 2013) and this plays a part in the continuing rapid decrease of summer-time ice. As summer-time ice disappears, the Arctic is losing its ability to act as the earth's refrigeration system, and this has widespread climatic effects, as well as a direct effect on sea level rise, as oceans heat, and once-land-based ice melts into the sea. We have tested the albedo modification technique on a small scale over five Winter/Spring seasons at sites including California's Sierra Nevada Mountains, a Canadian lake, and a small man-made lake in Minnesota, using various materials and an evolving array of instrumentation. The materials can float and can be made to minimize effects on marine habitat and species. The instrumentation is designed to be deployed in harsh and remote locations. Localized snow and ice preservation, and reductions in water heating, have been quantified in small-scale testing. Climate modeling is underway to analyze the effects of this method of surface albedo modification in key areas on the rate of oceanic and atmospheric temperature rise. We are also evaluating the effects of snow and ice preservation for protection of infrastructure and habitat stabilization. This paper will also discuss a possible reduction of sea level rise with an eye to quantification of cost/benefit. The most recent season's experimentation on a man-made private lake in Minnesota saw further evolution in the material and deployment approach. The materials were successfully deployed to shield underlying snow and ice from melting; applications of granular materials remained stable in the face of local wind and storms. Localized albedo modification options such as the one being studied in this work may act to preserve ice, glaciers, permafrost and seasonal snow areas, and perhaps aid natural ice formation processes. If this method could be deployed on a large enough scale, it could conceivably bring about a reduction in the Ice-Albedo Feedback Effect, possibly slowing one of the key effects and factors in climate change. (Figure: test site at a man-made lake in Minnesota, 2013.)

  9. Predicting agricultural impacts of large-scale drought: 2012 and the case for better modeling

    USDA-ARS?s Scientific Manuscript database

    We present an example of a simulation-based forecast for the 2012 U.S. maize growing season produced as part of a high-resolution, multi-scale, predictive mechanistic modeling study designed for decision support, risk management, and counterfactual analysis. The simulations undertaken for this analy...

  10. Large- and small-scale environmental factors drive distributions of cool-adapted plants in karstic microrefugia

    PubMed Central

    Vojtkó, András; Farkas, Tünde; Szabó, Anna; Havadtői, Krisztina; Vojtkó, Anna E.; Tölgyesi, Csaba; Cseh, Viktória; Erdős, László; Maák, István Elek; Keppel, Gunnar

    2017-01-01

    Background and aims Dolines are small- to large-sized bowl-shaped depressions of karst surfaces. They may constitute important microrefugia, as thermal inversion often maintains cooler conditions within them. This study aimed to identify the effects of large- (macroclimate) and small-scale (slope aspect and vegetation type) environmental factors on cool-adapted plants in karst dolines of East-Central Europe. We also evaluated the potential of these dolines to be microrefugia that mitigate the effects of climate change on cool-adapted plants in both forest and grassland ecosystems. Methods We compared surveys of plant species composition that were made between 2007 and 2015 in 21 dolines distributed across four mountain ranges (sites) in Hungary and Romania. We examined the effects of environmental factors on the distribution and number of cool-adapted plants on three scales: (1) regional (all sites); (2) within sites and; (3) within dolines. Generalized linear models and non-parametric tests were used for the analyses. Key Results Macroclimate, vegetation type and aspect were all significant predictors of the diversity of cool-adapted plants. More cool-adapted plants were recorded in the coolest site, with only few found in the warmest site. At the warmest site, the distribution of cool-adapted plants was restricted to the deepest parts of dolines. Within sites of intermediate temperature and humidity, the effect of vegetation type and aspect on the diversity of cool-adapted plants was often significant, with more taxa being found in grasslands (versus forests) and on north-facing slopes (versus south-facing slopes). Conclusions There is large variation in the number and spatial distribution of cool-adapted plants in karst dolines, which is related to large- and small-scale environmental factors. Both macro- and microrefugia are therefore likely to play important roles in facilitating the persistence of cool-adapted plants under global warming. PMID:28025290

  11. Large- and small-scale environmental factors drive distributions of cool-adapted plants in karstic microrefugia.

    PubMed

    Bátori, Zoltán; Vojtkó, András; Farkas, Tünde; Szabó, Anna; Havadtői, Krisztina; Vojtkó, Anna E; Tölgyesi, Csaba; Cseh, Viktória; Erdős, László; Maák, István Elek; Keppel, Gunnar

    2017-01-01

    Dolines are small- to large-sized bowl-shaped depressions of karst surfaces. They may constitute important microrefugia, as thermal inversion often maintains cooler conditions within them. This study aimed to identify the effects of large- (macroclimate) and small-scale (slope aspect and vegetation type) environmental factors on cool-adapted plants in karst dolines of East-Central Europe. We also evaluated the potential of these dolines to be microrefugia that mitigate the effects of climate change on cool-adapted plants in both forest and grassland ecosystems. We compared surveys of plant species composition that were made between 2007 and 2015 in 21 dolines distributed across four mountain ranges (sites) in Hungary and Romania. We examined the effects of environmental factors on the distribution and number of cool-adapted plants on three scales: (1) regional (all sites); (2) within sites and; (3) within dolines. Generalized linear models and non-parametric tests were used for the analyses. Macroclimate, vegetation type and aspect were all significant predictors of the diversity of cool-adapted plants. More cool-adapted plants were recorded in the coolest site, with only few found in the warmest site. At the warmest site, the distribution of cool-adapted plants was restricted to the deepest parts of dolines. Within sites of intermediate temperature and humidity, the effect of vegetation type and aspect on the diversity of cool-adapted plants was often significant, with more taxa being found in grasslands (versus forests) and on north-facing slopes (versus south-facing slopes). There is large variation in the number and spatial distribution of cool-adapted plants in karst dolines, which is related to large- and small-scale environmental factors. Both macro- and microrefugia are therefore likely to play important roles in facilitating the persistence of cool-adapted plants under global warming.

  12. Use of large-scale, multi-species surveys to monitor gyrfalcon and ptarmigan populations

    USGS Publications Warehouse

    Bart, Jonathan; Fuller, Mark; Smith, Paul; Dunn, Leah; Watson, Richard T.; Cade, Tom J.; Fuller, Mark; Hunt, Grainger; Potapov, Eugene

    2011-01-01

    We evaluated the ability of three large-scale, multi-species surveys in the Arctic to provide information on abundance and habitat relationships of Gyrfalcons (Falco rusticolus) and ptarmigan. The Program for Regional and International Shorebird Monitoring (PRISM) has surveyed birds widely across the arctic regions of Canada and Alaska since 2001. The Arctic Coastal Plain survey has collected abundance information on the North Slope of Alaska using fixed-wing aircraft since 1992. The Northwest Territories-Nunavut Bird Checklist has collected presence-absence information from little-known locations in northern Canada since 1995. All three surveys provide extensive information on Willow Ptarmigan (Lagopus lagopus) and Rock Ptarmigan (L. muta). For example, they show that ptarmigan are most abundant in western Alaska, next most abundant in northern Alaska and northwest Canada, and least abundant in the Canadian Archipelago. PRISM surveys were less successful in detecting Gyrfalcons, and the Arctic Coastal Plain Survey is largely outside the Gyrfalcon's breeding range. The Checklist Survey, however, reflects the expansive Gyrfalcon range in Canada. We suggest that collaboration by Gyrfalcon and ptarmigan biologists with the organizers of large-scale surveys like the ones we investigated provides an opportunity for obtaining useful information on these species and their environment across large areas.

  13. Asynchronous adaptive time step in quantitative cellular automata modeling

    PubMed Central

    Zhu, Hao; Pang, Peter YH; Sun, Yan; Dhar, Pawan

    2004-01-01

    Background The behaviors of cells in metazoans are context dependent, so large-scale multi-cellular modeling is often necessary, for which cellular automata are natural candidates. Two related issues are involved in cellular-automata-based multi-cellular modeling: how to introduce differential-equation-based quantitative computing to precisely describe cellular activity, and, building on that, how to address the heavy time consumption of simulation. Results Based on a modified, language-based cellular automata system that we extended to allow ordinary differential equations in models, we introduce a method implementing an asynchronous adaptive time step in simulation that can considerably improve efficiency without a significant sacrifice of accuracy. An average speedup of 4-5x is achieved in the given example. Conclusions Strategies for reducing time consumption in simulation are indispensable for large-scale, quantitative multi-cellular models, because even a small 100 × 100 × 100 tissue slab contains one million cells. A distributed and adaptive time step is a practical solution in a cellular automata environment. PMID:15222901
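
    The asynchronous adaptive time step idea can be sketched for a single cell's ODE: the step size grows when a step-doubling error estimate is small and shrinks otherwise, and in the full scheme each cell carries its own step. The code below is a toy illustration of that concept, not the authors' cellular-automata framework; the ODE, tolerance and step factors are assumptions.

    ```python
    # Toy adaptive Euler integrator with step-doubling error control for one cell's ODE.
    def adaptive_euler(f, y0, t_end, dt=0.1, tol=1e-4):
        t, y, steps = 0.0, y0, 0
        while t < t_end:
            dt = min(dt, t_end - t)
            full = y + dt * f(t, y)                              # one full Euler step
            half = y + 0.5 * dt * f(t, y)
            two_half = half + 0.5 * dt * f(t + 0.5 * dt, half)   # two half steps
            err = abs(two_half - full)                           # step-doubling error estimate
            if err < tol:
                t, y = t + dt, two_half
                steps += 1
                dt *= 1.5                                        # quiet cell: take larger steps
            else:
                dt *= 0.5                                        # active cell: refine the step
        return y, steps

    # Example: relaxation toward a signal, dy/dt = -2*(y - 1); far from equilibrium the step shrinks.
    y, n = adaptive_euler(lambda t, y: -2.0 * (y - 1.0), y0=5.0, t_end=5.0)
    print(round(y, 4), n)
    ```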

  14. Towards Personalized Cardiology: Multi-Scale Modeling of the Failing Heart

    PubMed Central

    Amr, Ali; Neumann, Dominik; Georgescu, Bogdan; Seegerer, Philipp; Kamen, Ali; Haas, Jan; Frese, Karen S.; Irawati, Maria; Wirsz, Emil; King, Vanessa; Buss, Sebastian; Mereles, Derliz; Zitron, Edgar; Keller, Andreas; Katus, Hugo A.; Comaniciu, Dorin; Meder, Benjamin

    2015-01-01

    Background Despite modern pharmacotherapy and advanced implantable cardiac devices, overall prognosis and quality of life of HF patients remain poor. This is in part due to insufficient patient stratification and lack of individualized therapy planning, resulting in less effective treatments and a significant number of non-responders. Methods and Results State-of-the-art clinical phenotyping was acquired, including magnetic resonance imaging (MRI) and biomarker assessment. An individualized, multi-scale model of heart function covering cardiac anatomy, electrophysiology, biomechanics and hemodynamics was estimated using a robust framework. The model was computed on n=46 HF patients, showing for the first time that advanced multi-scale models can be fitted consistently on large cohorts. Novel multi-scale parameters derived from the model of all cases were analyzed and compared against clinical parameters, cardiac imaging, lab tests and survival scores to evaluate the explicative power of the model and its potential for better patient stratification. Model validation was pursued by comparing clinical parameters that were not used in the fitting process against model parameters. Conclusion This paper illustrates how advanced multi-scale models can complement cardiovascular imaging and how they could be applied in patient care. Based on obtained results, it becomes conceivable that, after thorough validation, such heart failure models could be applied for patient management and therapy planning in the future, as we illustrate in one patient of our cohort who received CRT-D implantation. PMID:26230546

  15. Cost estimate for a proposed GDF Suez LNG testing program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanchat, Thomas K.; Brady, Patrick Dennis; Jernigan, Dann A.

    2014-02-01

    At the request of GDF Suez, a Rough Order of Magnitude (ROM) cost estimate was prepared for the design, construction, testing, and data analysis for an experimental series of large-scale Liquefied Natural Gas (LNG) spills on land and water that would result in the largest pool fires and vapor dispersion events ever conducted. Due to the expected cost of this large, multi-year program, the authors utilized Sandia's structured cost estimating methodology. This methodology ensures that the efforts identified can be performed for the cost proposed at a plus or minus 30 percent confidence. The scale of the LNG spill, fire, and vapor dispersion tests proposed by GDF could produce hazard distances and testing safety issues that need to be fully explored. Based on our evaluations, Sandia can utilize much of our existing fire testing infrastructure for the large fire tests and some small dispersion tests (with some modifications) in Albuquerque, but we propose to develop a new dispersion testing site at our remote test area in Nevada because of the large hazard distances. While this might impact some testing logistics, the safety aspects warrant this approach. In addition, we have included a proposal to study cryogenic liquid spills on water and subsequent vaporization in the presence of waves. Sandia is working with DOE on applications that provide infrastructure pertinent to wave production. We present an approach to conduct repeatable wave/spill interaction testing that could utilize such infrastructure.

  16. Multi-scale properties of large eddy simulations: correlations between resolved-scale velocity-field increments and subgrid-scale quantities

    NASA Astrophysics Data System (ADS)

    Linkmann, Moritz; Buzzicotti, Michele; Biferale, Luca

    2018-06-01

    We provide analytical and numerical results concerning multi-scale correlations between the resolved velocity field and the subgrid-scale (SGS) stress-tensor in large eddy simulations (LES). Following previous studies for Navier-Stokes equations, we derive the exact hierarchy of LES equations governing the spatio-temporal evolution of velocity structure functions of any order. The aim is to assess the influence of the subgrid model on the inertial range intermittency. We provide a series of predictions, within the multifractal theory, for the scaling of correlation involving the SGS stress and we compare them against numerical results from high-resolution Smagorinsky LES and from a-priori filtered data generated from direct numerical simulations (DNS). We find that LES data generally agree very well with filtered DNS results and with the multifractal prediction for all leading terms in the balance equations. Discrepancies are measured for some of the sub-leading terms involving cross-correlation between resolved velocity increments and the SGS tensor or the SGS energy transfer, suggesting that there must be room to improve the SGS modelisation to further extend the inertial range properties for any fixed LES resolution.
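
    The building blocks of the structure-function hierarchy discussed here are moments of velocity increments; the sketch below estimates S_p(r) = ⟨|u(x + r) - u(x)|^p⟩ from a synthetic 1D signal. It shows only the basic diagnostic, not the SGS-correlation analysis of the paper, and the signal is a random walk rather than turbulence data.

    ```python
    # Estimate longitudinal structure functions S_p(r) from a 1D velocity sample.
    import numpy as np

    def structure_function(u, r, p):
        """S_p(r) = <|u(x + r) - u(x)|^p> at integer separation r (no periodicity assumed)."""
        du = u[r:] - u[:-r]                      # velocity increments over separation r
        return np.mean(np.abs(du) ** p)

    rng = np.random.default_rng(1)
    u = np.cumsum(rng.standard_normal(4096))     # synthetic rough signal (Brownian-like, not turbulence)
    for r in (1, 4, 16, 64):
        print(r, round(structure_function(u, r, p=2), 2))   # S_2(r) grows roughly linearly in r here
    ```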

  17. High Fidelity Modeling of Turbulent Mixing and Chemical Kinetics Interactions in a Post-Detonation Flow Field

    NASA Astrophysics Data System (ADS)

    Sinha, Neeraj; Zambon, Andrea; Ott, James; Demagistris, Michael

    2015-06-01

    Driven by the continuing rapid advances in high-performance computing, multi-dimensional high-fidelity modeling is an increasingly reliable predictive tool capable of providing valuable physical insight into complex post-detonation reacting flow fields. Utilizing a series of test cases featuring blast waves interacting with combustible dispersed clouds in a small-scale test setup under well-controlled conditions, the predictive capabilities of a state-of-the-art code are demonstrated and validated. Leveraging physics-based, first principle models and solving large system of equations on highly-resolved grids, the combined effects of finite-rate/multi-phase chemical processes (including thermal ignition), turbulent mixing and shock interactions are captured across the spectrum of relevant time-scales and length scales. Since many scales of motion are generated in a post-detonation environment, even if the initial ambient conditions are quiescent, turbulent mixing plays a major role in the fireball afterburning as well as in dispersion, mixing, ignition and burn-out of combustible clouds in its vicinity. Validating these capabilities at the small scale is critical to establish a reliable predictive tool applicable to more complex and large-scale geometries of practical interest.

  18. Understanding CO2 Plume Behavior and Basin-Scale Pressure Changes during Sequestration Projects through the use of Reservoir Fluid Modeling

    USGS Publications Warehouse

    Leetaru, H.E.; Frailey, S.M.; Damico, J.; Mehnert, E.; Birkholzer, J.; Zhou, Q.; Jordan, P.D.

    2009-01-01

    Large scale geologic sequestration tests are in the planning stages around the world. The liability and safety issues of the migration of CO2 away from the primary injection site and/or reservoir are of significant concern for these sequestration tests. Reservoir models for simulating single or multi-phase fluid flow are used to understand the migration of CO2 in the subsurface. These models can also help evaluate concerns related to brine migration and basin-scale pressure increases that occur due to the injection of additional fluid volumes into the subsurface. The current paper presents different modeling examples addressing these issues, ranging from simple geometric models to more complex reservoir fluid models with single-site and basin-scale applications. Simple geometric models assuming a homogeneous geologic reservoir and piston-like displacement have been used for understanding pressure changes and fluid migration around each CO2 storage site. These geometric models are useful only as broad approximations because they do not account for the variation in porosity, permeability, asymmetry of the reservoir, and dip of the beds. In addition, these simple models are not capable of predicting the interference between different injection sites within the same reservoir. A more realistic model of CO2 plume behavior can be produced using reservoir fluid models. Reservoir simulations of natural gas storage reservoirs in the Illinois Basin Cambrian-age Mt. Simon Sandstone suggest that reservoir heterogeneity will be an important factor for evaluating storage capacity. The Mt. Simon Sandstone is a thick sandstone that underlies many significant coal-fired power plants (emitting at least 1 million tonnes per year) in the midwestern United States, including the states of Illinois, Indiana, Kentucky, Michigan, and Ohio. The initial commercial sequestration sites are expected to inject 1 to 2 million tonnes of CO2 per year. Depending on the geologic structure and permeability anisotropy, the CO2 injected into the Mt. Simon is expected to migrate less than 3 km. After 30 years of continuous injection followed by 100 years of shut-in, the plume from a 1 million tonnes a year injection rate is expected to migrate 1.6 km for a 0 degree dip reservoir and over 3 km for a 5 degree dip reservoir. The region where reservoir pressure increases in response to CO2 injection is typically much larger than the CO2 plume. It can thus be anticipated that there will be basin-wide interactions between different CO2 injection sources if multiple, large volume sites are developed. This interaction will result in asymmetric plume migration that may be contrary to reservoir dip. A basin-scale simulation model is being developed to predict CO2 plume migration, brine displacement, and pressure buildup for a possible future sequestration scenario featuring multiple CO2 storage sites within the Illinois Basin Mt. Simon Sandstone. Interactions between different sites will be evaluated with respect to impacts on pressure and CO2 plume migration patterns.
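
    For orientation, the "simple geometric model" mentioned above amounts to a volumetric, piston-displacement estimate of plume size; the sketch below implements that rough calculation with hypothetical reservoir properties. It reproduces the order of magnitude of the quoted migration distances but captures none of the heterogeneity, dip, or multi-site interference effects the passage emphasizes.

    ```python
    # Piston-displacement estimate: injected CO2 fills a uniform cylinder of reservoir rock.
    # Thickness, porosity, density and saturation values below are hypothetical.
    import math

    def plume_radius_km(mass_tonnes, thickness_m=100.0, porosity=0.15,
                        co2_density=700.0, co2_saturation=0.5):
        """Radius (km) of a cylindrical plume holding mass_tonnes of CO2."""
        volume = mass_tonnes * 1000.0 / co2_density                   # m^3 of CO2 at reservoir conditions
        pore_volume_per_m2 = thickness_m * porosity * co2_saturation  # m^3 of CO2 stored per m^2 of area
        radius_m = math.sqrt(volume / (math.pi * pore_volume_per_m2))
        return radius_m / 1000.0

    # 30 years of injection at 1 million tonnes per year:
    print(round(plume_radius_km(30e6), 2), "km")   # roughly 1-2 km for these assumed properties
    ```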

  19. Marital Happiness and Sleep Disturbances in a Multi-Ethnic Sample of Middle-Aged Women

    PubMed Central

    Troxel, Wendy M.; Buysse, Daniel J.; Hall, Martica; Matthews, Karen A.

    2009-01-01

    Previous research suggests that divorced individuals, particularly women, have higher rates of sleep disturbances as compared to married individuals. Among the married, however, little is known about the association between relationship quality and sleep. The present study examined the association between marital happiness and self-reported sleep disturbances in a sample of midlife women drawn from the Study of Women’s Health Across the Nation (SWAN), a multi-site, multi-ethnic, community-based study (N=2,148). Marital happiness was measured using a single item from the Dyadic Adjustment Scale, and sleep disturbance was assessed using four items from the Women’s Health Initiative Insomnia Rating Scale (WHIIRS). After controlling for relevant covariates, maritally happy women reported fewer sleep disturbances, with the association evident among Caucasian women and, to a lesser extent, among African American women. PMID:19116797

  20. Development and Demonstration of an Aerial Imagery Assessment Method to Monitor Changes in Restored Stream Condition

    NASA Astrophysics Data System (ADS)

    Fong, L. S.; Ambrose, R. F.

    2017-12-01

    Remote sensing is an excellent way to assess the changing condition of streams and wetlands. Several studies have measured large-scale changes in riparian condition indicators, but few have remotely applied multi-metric assessments on a finer scale to measure changes, such as those caused by restoration, in the condition of small riparian areas. We developed an aerial imagery assessment method (AIAM) that combines landscape, hydrology, and vegetation observations into one index describing overall ecological condition of non-confined streams. Verification of AIAM demonstrated that sites in good condition (as assessed on-site by the California Rapid Assessment Method) received high AIAM scores. (AIAM was not verified with poor condition sites.) Spearman rank correlation tests comparing AIAM and the field-based California Rapid Assessment Method (CRAM) results revealed that some components of the two methods were highly correlated. The application of AIAM is illustrated with time-series restoration trajectories of three southern California stream restoration projects aged 15 to 21 years. The trajectories indicate that the projects improved in condition in years following their restoration, with vegetation showing the most dynamic change over time. AIAM restoration trajectories also overlapped to different degrees with CRAM chronosequence restoration performance curves that demonstrate the hypothetical development of high-performing projects. AIAM has high potential as a remote ecological assessment method and effective tool to determine restoration trajectories. Ultimately, this tool could be used to further improve stream and wetland restoration management.
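
    The Spearman rank correlation used above to compare AIAM and CRAM scores can be computed directly with SciPy; the sketch below uses invented placeholder scores, not the study's data.

    ```python
    # Comparing a remotely derived index against field scores with a Spearman
    # rank correlation (scores below are invented placeholders, not study data).
    from scipy.stats import spearmanr

    aiam_scores = [62, 71, 55, 80, 68, 74, 59, 83]   # hypothetical AIAM site scores
    cram_scores = [58, 75, 52, 85, 70, 72, 61, 88]   # hypothetical CRAM site scores

    rho, p_value = spearmanr(aiam_scores, cram_scores)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
    ```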

  1. Airborne Lidar Measurements of Surface Topography and Structure in Arctic-Boreal Ecosystems

    NASA Astrophysics Data System (ADS)

    Hofton, M. A.; Blair, J. B.; Rabine, D.; Cornejo, H.; Story, S.

    2017-12-01

    In June-July 2017, NASA's Land, Vegetation and Ice Sensor (LVIS) Facility was deployed to sites in northern Canada and Alaska as part of NASA's Arctic-Boreal Vulnerability Experiment (ABoVE) 2017 airborne campaign. ABoVE is a large-scale, multi-year study of environmental change and its implications for social-ecological systems, and involves multiple airborne sensors flying over both field-based and larger-scale sampling sites. During the 4-week deployment of LVIS-F, a total of 15 flights were flown over diverse science targets based out of multiple airports in Canada and Alaska. LVIS-F is NASA's high-altitude airborne lidar sensor, collecting a nominal 2 km wide swath of data from 10 km altitude above the ground. Footprints are contiguous both along and across track and, for ABoVE operations, were 6 m in diameter. Full waveform data are collected for every footprint and georeferenced to provide a true three-dimensional view of the overflown terrain. Along with precise positioning and pointing information, the LVIS laser range and waveform data are processed to provide high-quality measurements of surface structure, including ground elevation, canopy height and canopy volume metrics. Information on data coverage and examples of Level 1B and Level 2 data products at science target sites will be shown along with initial results for data precision and accuracy. All ABoVE LVIS data products will be available to investigators via a NASA DAAC.

  2. Explanatory Power of Multi-scale Physical Descriptors in Modeling Benthic Indices Across Nested Ecoregions of the Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Holburn, E. R.; Bledsoe, B. P.; Poff, N. L.; Cuhaciyan, C. O.

    2005-05-01

    Using over 300 R/EMAP sites in OR and WA, we examine the relative explanatory power of watershed, valley, and reach scale descriptors in modeling variation in benthic macroinvertebrate indices. Innovative metrics describing flow regime, geomorphic processes, and hydrologic-distance weighted watershed and valley characteristics are used in multiple regression and regression tree modeling to predict EPT richness, % EPT, EPT/C, and % Plecoptera. A nested design using seven ecoregions is employed to evaluate the influence of geographic scale and environmental heterogeneity on the explanatory power of individual and combined scales. Regression tree models are constructed to explain variability while identifying threshold responses and interactions. Cross-validated models demonstrate differences in the explanatory power associated with single-scale and multi-scale models as environmental heterogeneity is varied. Models explaining the greatest variability in biological indices result from multi-scale combinations of physical descriptors. Results also indicate that substantial variation in benthic macroinvertebrate response can be explained with process-based watershed and valley scale metrics derived exclusively from common geospatial data. This study outlines a general framework for identifying key processes driving macroinvertebrate assemblages across a range of scales and establishing the geographic extent at which various levels of physical description best explain biological variability. Such information can guide process-based stratification to avoid spurious comparison of dissimilar stream types in bioassessments and ensure that key environmental gradients are adequately represented in sampling designs.
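
    As a rough sketch of the regression-tree modeling step described above, the snippet below fits a cross-validated decision tree to synthetic multi-scale descriptors and a synthetic benthic index; the variables and their relationship are invented and stand in for the study's R/EMAP data.

    ```python
    # Sketch (not the study's code): cross-validated regression tree relating
    # multi-scale physical descriptors to a benthic index such as EPT richness.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_sites = 300
    # Hypothetical descriptors: watershed area, valley slope, reach embeddedness.
    X = rng.normal(size=(n_sites, 3))
    # Synthetic response with a threshold interaction, loosely mimicking EPT richness.
    y = 10 + 5 * (X[:, 0] > 0) * X[:, 1] - 2 * X[:, 2] + rng.normal(scale=1.0, size=n_sites)

    tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=20)
    scores = cross_val_score(tree, X, y, cv=5, scoring="r2")
    print(f"cross-validated R^2: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```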

  3. Scalable Triadic Analysis of Large-Scale Graphs: Multi-Core vs. Multi-Processor vs. Multi-Threaded Shared Memory Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Marquez, Andres; Choudhury, Sutanay

    2012-09-01

    Triadic analysis encompasses a useful set of graph mining methods that is centered on the concept of a triad, which is a subgraph of three nodes and the configuration of directed edges across the nodes. Such methods are often applied in the social sciences as well as many other diverse fields. Triadic methods commonly operate on a triad census that counts the number of triads of every possible edge configuration in a graph. Like other graph algorithms, triadic census algorithms do not scale well when graphs reach tens of millions to billions of nodes. To enable the triadic analysis of large-scale graphs, we developed and optimized a triad census algorithm to efficiently execute on shared memory architectures. We will retrace the development and evolution of a parallel triad census algorithm. Over the course of several versions, we continually adapted the code’s data structures and program logic to expose more opportunities to exploit parallelism on shared memory that would translate into improved computational performance. We will recall the critical steps and modifications that occurred during code development and optimization. Furthermore, we will compare the performances of triad census algorithm versions on three specific systems: Cray XMT, HP Superdome, and AMD multi-core NUMA machine. These three systems have shared memory architectures but with markedly different hardware capabilities to manage parallelism.
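
    For orientation, a triad census on a small directed graph can be computed with networkx as below; this is only a serial, small-scale illustration of the quantity that the paper's optimized shared-memory algorithm computes at billion-node scale.

    ```python
    # Small-scale illustration of a directed triad census using networkx
    # (the paper's optimized shared-memory implementation is not shown here).
    import networkx as nx

    G = nx.gnp_random_graph(200, 0.02, seed=42, directed=True)
    census = nx.triadic_census(G)   # counts of all 16 directed triad types

    # Print the non-empty triad classes, e.g. '003' (empty), '030T' (transitive), '300' (complete).
    for triad_type, count in sorted(census.items()):
        if count:
            print(f"{triad_type:>4}: {count}")
    ```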

  4. Identifying strategic sites for Green-Infrastructures (GI) to manage stormwater in a miscellaneous use urban African watershed

    NASA Astrophysics Data System (ADS)

    Selker, J. S.; Kahsai, S. K.

    2017-12-01

    Green infrastructure (GI), or low-impact development (LID), is a land-use planning and design approach intended to mitigate the environmental impacts of land development, and it is increasingly looked to as a way to lessen runoff and pollutant loading to receiving water bodies. Broad-scale approaches for siting GI/LID have been developed for agricultural watersheds, but are rare for urban watersheds, largely due to greater land use complexity. Siting is even more challenging in urban Africa because of the combination of poor data quality, rapid and unplanned development, and civic institutions that cannot reliably carry out regular maintenance. We present a spatio-temporal, simulation-based approach to identify an optimal prioritization of sites for GI/LID based on a DEM, land use and land cover. A multi-objective optimization tool was coupled with the urban Storm Water Management Model (SWMM) to identify the most cost-effective combination of GI/LID practices. The approach was demonstrated in a case study of the Lubigi Catchment, a mixed-use urban watershed in NW Kampala, Uganda, that is notorious for flooding heavily every year.
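
    A core step in this kind of multi-objective screening is keeping only non-dominated (Pareto-optimal) combinations of cost and runoff reduction. The sketch below illustrates that filter with invented candidate values; in the study itself, the objective values would come from SWMM simulations driven by a dedicated optimizer.

    ```python
    # Hedged sketch: keep non-dominated (Pareto-optimal) GI/LID site options
    # trading off cost against simulated runoff reduction. Values are invented;
    # in practice these objectives would come from SWMM model runs.
    candidates = [
        {"site": "A", "cost_usd": 120_000, "runoff_reduction_m3": 4500},
        {"site": "B", "cost_usd": 80_000,  "runoff_reduction_m3": 3000},
        {"site": "C", "cost_usd": 150_000, "runoff_reduction_m3": 4400},
        {"site": "D", "cost_usd": 60_000,  "runoff_reduction_m3": 3100},
    ]

    def dominates(a, b):
        """a dominates b if it is no worse in both objectives and strictly better in one."""
        return (a["cost_usd"] <= b["cost_usd"]
                and a["runoff_reduction_m3"] >= b["runoff_reduction_m3"]
                and (a["cost_usd"] < b["cost_usd"]
                     or a["runoff_reduction_m3"] > b["runoff_reduction_m3"]))

    pareto = [c for c in candidates
              if not any(dominates(other, c) for other in candidates if other is not c)]
    print([c["site"] for c in pareto])   # e.g. ['A', 'D']
    ```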

  5. Lessons learned from cross-border medical response to the terrorist bombings in Tabba and Ras-el-Satan, Egypt, on 07 October 2004.

    PubMed

    Leiba, Adi; Blumenfeld, Amir; Hourvitz, Ariel; Weiss, Gali; Peres, Michal; Laor, Dani; Schwartz, Dagan; Arad, Jacob; Goldberg, Avishay; Levi, Yeheskel; Bar-Dayan, Yaron

    2005-01-01

    Large-scale terrorist attacks can happen in peripheral areas, which are located close to a country's borders and far from its main medical facilities, and can involve multi-national casualties and responders. The objective of this study was to analyze the terrorist suicide bombings that occurred on 07 October 2004, near the Israeli-Egyptian border, as representative of such a complex scenario. Data from formal debriefings after the event were processed in order to learn about victim outcomes, resource utilization, critical events, and the time course of the emergency response. A total of 185 injured survivors were repatriated: four were severely wounded, 13 were moderately injured, and 168 were mildly injured. Thirty-eight people died. A forward medical team landed at the border town's airport, which provided reinforcement in the field and in the local hospital. Israeli and Egyptian search and rescue teams collaborated at the destruction site. One-hundred sixty-eight injured patients arrived at the small border hospital that rapidly organized itself for the mass-casualty incident, operating as an evacuation "staging hospital". Twenty-three casualties were secondarily distributed to two major trauma centers in the south and the center of Israel, respectively, either by ambulance or by helicopter. Large-scale terrorist attacks in a peripheral border zone can be handled by international collaboration, reinforcement of medical teams at the site itself and at the peripheral neighboring hospital, rapid rearrangement of an "evacuation hospital", and efficient transport to trauma centers by ambulances, helicopters, and other aircraft.

  6. Integrated methodology for assessing the HCH groundwater pollution at the multi-source contaminated mega-site Bitterfeld/Wolfen.

    PubMed

    Wycisk, Peter; Stollberg, Reiner; Neumann, Christian; Gossel, Wolfgang; Weiss, Holger; Weber, Roland

    2013-04-01

    A large-scale groundwater contamination characterises the Pleistocene groundwater system of the former industrial and abandoned mining region Bitterfeld/Wolfen, Eastern Germany. For more than a century, local chemical production and extensive lignite mining caused a complex contaminant release from local production areas and related dump sites. Today, organic pollutants (mainly organochlorines) are present in all compartments of the environment at high concentration levels. An integrated methodology for characterising the current situation of pollution as well as the future fate of hazardous substances is urgently needed to decide on further management and remediation strategies. Data analyses were performed on roughly 10 years of regional groundwater monitoring data, comprising approximately 3,500 samples and up to 180 individual organic parameters from almost 250 observation wells. Run-off measurements as well as water samples were taken biweekly from local creeks during a period of 18 months. A kriging interpolation procedure was applied to the groundwater analytical data to generate continuous distribution patterns from the point-based contaminant samples. High-resolution geological 3-D modelling serves as a database for a regional 3-D groundwater flow model. Simulation results support the assessment of the future fate of contaminants. A first conceptual model of the contamination has been developed to characterise the contamination in regional surface waters and groundwater. A reliable explanation of the varying hexachlorocyclohexane (HCH) occurrence within the two local aquifer systems has been derived from the regionalised distribution patterns. Simulation results from groundwater flow modelling provide a better understanding of the future pollutant migration paths and support the overall site characterisation. The presented case study indicates that an integrated assessment of large-scale groundwater contamination often needs more data than local groundwater monitoring alone can provide. The developed methodology is appropriate for assessing POP-contaminated mega-sites including, for example, HCH deposits. Although HCH isomers are relevant groundwater pollutants at this site, further organochlorine pollutants are present at considerably higher levels. The study demonstrates that an effective evaluation of the current situation of contamination, as well as of the related future fate, requires detailed information on the entire observed system.
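
    As an illustration of the kriging interpolation step, the sketch below interpolates synthetic point concentrations onto a grid, assuming the PyKrige package; the well coordinates, concentration values and variogram settings are placeholders, not data from the study.

    ```python
    # Illustrative ordinary kriging of point concentration data onto a grid,
    # assuming the PyKrige package; observation values are invented.
    import numpy as np
    from pykrige.ok import OrdinaryKriging

    rng = np.random.default_rng(1)
    x = rng.uniform(0, 5000, 60)            # easting of monitoring wells [m]
    y = rng.uniform(0, 5000, 60)            # northing [m]
    conc = np.exp(-((x - 2500)**2 + (y - 2500)**2) / 2e6) * 100 + rng.normal(0, 2, 60)

    ok = OrdinaryKriging(x, y, conc, variogram_model="spherical")
    gridx = np.linspace(0, 5000, 50)
    gridy = np.linspace(0, 5000, 50)
    z_interp, variance = ok.execute("grid", gridx, gridy)   # interpolated field + kriging variance
    print(z_interp.shape, float(z_interp.max()))
    ```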

  7. Beyond RGB: Very high resolution urban remote sensing with multimodal deep networks

    NASA Astrophysics Data System (ADS)

    Audebert, Nicolas; Le Saux, Bertrand; Lefèvre, Sébastien

    2018-06-01

    In this work, we investigate various methods to deal with semantic labeling of very high resolution multi-modal remote sensing data. In particular, we study how deep fully convolutional networks can be adapted to deal with multi-modal and multi-scale remote sensing data for semantic labeling. Our contributions are threefold: (a) we present an efficient multi-scale approach to leverage both a large spatial context and the high resolution data, (b) we investigate early and late fusion of Lidar and multispectral data, and (c) we validate our methods on two public datasets with state-of-the-art results. Our results indicate that late fusion makes it possible to recover from errors stemming from ambiguous data, while early fusion allows for better joint feature learning, but at the cost of higher sensitivity to missing data.
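
    A minimal PyTorch sketch of the late-fusion idea is given below: one small convolutional branch per modality, with per-class score maps averaged at the end. It illustrates the concept under assumed channel counts (4-band optical, 1-band Lidar nDSM); it is not the authors' network.

    ```python
    # Minimal late-fusion sketch in PyTorch (not the paper's architecture):
    # one small convolutional branch per modality, predictions averaged at the end.
    import torch
    import torch.nn as nn

    class TinySegmenter(nn.Module):
        def __init__(self, in_channels, n_classes):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, n_classes, 1),
            )
        def forward(self, x):
            return self.net(x)

    class LateFusion(nn.Module):
        def __init__(self, n_classes=6):
            super().__init__()
            self.optical_branch = TinySegmenter(in_channels=4, n_classes=n_classes)  # e.g. IR-R-G-B
            self.lidar_branch = TinySegmenter(in_channels=1, n_classes=n_classes)    # e.g. nDSM
        def forward(self, optical, lidar):
            # Average the per-class score maps from the two modality branches.
            return 0.5 * (self.optical_branch(optical) + self.lidar_branch(lidar))

    model = LateFusion()
    scores = model(torch.randn(1, 4, 128, 128), torch.randn(1, 1, 128, 128))
    print(scores.shape)   # torch.Size([1, 6, 128, 128])
    ```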

  8. Viscous decay of nonlinear oscillations of a spherical bubble at large Reynolds number

    NASA Astrophysics Data System (ADS)

    Smith, W. R.; Wang, Q. X.

    2017-08-01

    The long-time viscous decay of large-amplitude bubble oscillations is considered in an incompressible Newtonian fluid, based on the Rayleigh-Plesset equation. At large Reynolds numbers, this is a multi-scaled problem with a short time scale associated with inertial oscillation and a long time scale associated with viscous damping. A multi-scaled perturbation method is thus employed to solve the problem. The leading-order analytical solution of the bubble radius history is obtained to the Rayleigh-Plesset equation in a closed form including both viscous and surface tension effects. Some important formulae are derived including the following: the average energy loss rate of the bubble system during each cycle of oscillation, an explicit formula for the dependence of the oscillation frequency on the energy, and an implicit formula for the amplitude envelope of the bubble radius as a function of the energy. Our theory shows that the energy of the bubble system and the frequency of oscillation do not change on the inertial time scale at leading order, the energy loss rate on the long viscous time scale being inversely proportional to the Reynolds number. These asymptotic predictions remain valid during each cycle of oscillation whether or not compressibility effects are significant. A systematic parametric analysis is carried out using the above formula for the energy of the bubble system, frequency of oscillation, and minimum/maximum bubble radii in terms of the Reynolds number, the dimensionless initial pressure of the bubble gases, and the Weber number. Our results show that the frequency and the decay rate have substantial variations over the lifetime of a decaying oscillation. The results also reveal that large-amplitude bubble oscillations are very sensitive to small changes in the initial conditions through large changes in the phase shift.
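
    For reference, the incompressible Rayleigh-Plesset equation underlying this analysis is commonly written as follows (generic notation, not necessarily that of the paper):

    ```latex
    % Incompressible Rayleigh-Plesset equation (generic notation):
    % R(t) bubble radius, rho liquid density, mu dynamic viscosity,
    % sigma surface tension, p_B bubble pressure, p_infty far-field pressure.
    \[
      \rho\left( R\ddot{R} + \tfrac{3}{2}\dot{R}^{2} \right)
      = p_{B}(t) - p_{\infty}
      - \frac{4\mu \dot{R}}{R}
      - \frac{2\sigma}{R}
    \]
    ```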

  9. Using MCDA and GIS for hazardous waste landfill siting considering land scarcity for waste disposal.

    PubMed

    De Feo, Giovanni; De Gisi, Sabino

    2014-11-01

    The main aim of this study was to develop a procedure that minimizes the wasting of space for the siting of hazardous waste landfills as part of a solid waste management system. We wanted to tackle the shortage of land for waste disposal, which is a serious and growing problem in most large urban regions. The procedure combines a multi-criteria decision analysis (MCDA) approach with a geographical information system (GIS). The GIS was utilised to obtain an initial screening in order to eliminate unsuitable areas, whereas the MCDA was developed to select the most suitable sites. The novelty of the proposed siting procedure is the introduction of a new screening phase before the macro-siting step, aimed at producing a "land use map of potentially suitable areas" for the siting of solid waste facilities which simultaneously takes into consideration all plant types. The issue of obtaining site evaluations for a specific facility was coupled with the issue of not wasting land suitable for other types of waste management options. In the developed case study, the use of an innovative criteria weighting tool (the "Priority Scale") in combination with the Analytic Hierarchy Process made it easier to define the priorities of the evaluation criteria than classic methods such as the Paired Comparison Technique in combination with the Simple Additive Weighting method. Copyright © 2014 Elsevier Ltd. All rights reserved.
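
    The Simple Additive Weighting method mentioned above amounts to normalizing each criterion and taking a weighted sum. The sketch below illustrates this with invented site scores, criteria and weights; the study's actual weights came from the Priority Scale and the Analytic Hierarchy Process.

    ```python
    # Hedged sketch of Simple Additive Weighting (SAW) for landfill site ranking.
    # Criteria values and weights are invented placeholders.
    import numpy as np

    # Rows: candidate sites; columns: criteria (distance to settlements [km],
    # land cost [relative], groundwater depth [m]).
    scores = np.array([
        [3.0, 0.6, 40.0],
        [5.0, 0.9, 25.0],
        [2.0, 0.4, 55.0],
    ])
    benefit = np.array([True, False, True])     # True = larger is better
    weights = np.array([0.5, 0.2, 0.3])         # must sum to 1

    # Normalize each criterion to [0, 1], inverting cost criteria.
    norm = np.where(benefit,
                    scores / scores.max(axis=0),
                    scores.min(axis=0) / scores)
    ranking = norm @ weights
    print("SAW scores per site:", np.round(ranking, 3))
    ```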

  10. Megatux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-25

    The Megatux platform enables the emulation of large scale (multi-million node) distributed systems. In particular, it allows for the emulation of large-scale networks interconnecting a very large number of emulated computer systems. It does this by leveraging virtualization and associated technologies to allow hundreds of virtual computers to be hosted on a single moderately sized server or workstation. Virtualization technology provided by modern processors allows for multiple guest OSs to run at the same time, sharing the hardware resources. The Megatux platform can be deployed on a single PC, a small cluster of a few boxes or a large cluster of computers. With a modest cluster, the Megatux platform can emulate complex organizational networks. By using virtualization, we emulate the hardware, but run actual software enabling large scale without sacrificing fidelity.

  11. Large-Scale Power Production Potential on U.S. Department of Energy Lands

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kandt, Alicen J.; Elgqvist, Emma M.; Gagne, Douglas A.

    This report summarizes the potential for independent power producers to generate large-scale power on U.S. Department of Energy (DOE) lands and export that power into a larger power market, rather than serving on-site DOE loads. The report focuses primarily on the analysis of renewable energy (RE) technologies that are commercially viable at utility scale, including photovoltaics (PV), concentrating solar power (CSP), wind, biomass, landfill gas (LFG), waste to energy (WTE), and geothermal technologies. The report also summarizes the availability of fossil fuel, uranium, or thorium resources at 55 DOE sites.

  12. Biodiversity conservation in Swedish forests: ways forward for a 30-year-old multi-scaled approach.

    PubMed

    Gustafsson, Lena; Perhans, Karin

    2010-12-01

    A multi-scaled model for biodiversity conservation in forests was introduced in Sweden 30 years ago, which makes it a pioneer example of an integrated ecosystem approach. Trees are set aside for biodiversity purposes at multiple scale levels varying from individual trees to areas of thousands of hectares, with landowner responsibility at the lowest level and with increasing state involvement at higher levels. Ecological theory supports the multi-scaled approach, and retention efforts at every harvest occasion stimulate landowners' interest in conservation. We argue that the model has large advantages but that in a future with intensified forestry and global warming, development based on more progressive thinking is necessary to maintain and increase biodiversity. Suggestions for the future include joint planning for several forest owners, consideration of cost-effectiveness, accepting opportunistic work models, adjusting retention levels to stand and landscape composition, introduction of temporary reserves, creation of "receiver habitats" for species escaping climate change, and protection of young forests.

  13. Ground observations and remote sensing data for integrated modelisation of water budget in the Merguellil catchment, Tunisia

    NASA Astrophysics Data System (ADS)

    Mougenot, Bernard

    2016-04-01

    The Mediterranean region is affected by water scarcity. Some countries, such as Tunisia, have reached the limit of 550 m3/year/capita due to overexploitation of scarce water resources for irrigation, domestic uses and industry. Many programs aim to evaluate strategies to improve water consumption at the regional level. In central Tunisia, on the Merguellil catchment, we develop integrated water resources models based on social investigations, ground observations and remote sensing data. The main objective is to close the water budget at the regional level and to estimate irrigation and water pumping in order to test scenarios with end-users. Our work benefits from French, bilateral and European projects (ANR, MISTRALS/SICMed, FP6, FP7…, GMES/GEOLAND-ESA) and also from network projects such as JECAM and AERONET, where the Merguellil site is a reference. This site has specific characteristics, combining irrigated and rainfed crops that mix cereals, market gardening and orchards, and will be proposed as a new environmental observing system connected to the OMERE, TENSIFT and OSR systems in Tunisia, Morocco and France, respectively. We show here an original and large set of ground and remote sensing data, mainly acquired from 2008 to the present, to be used for calibration/validation of water budget processes and integrated models for present conditions and scenarios: - Ground data: meteorological stations; water budget at local scale: flux tower, soil fluxes, soil and surface temperature, soil moisture, drainage, flow, water level in lakes, aquifer; vegetation parameters on selected fields per month (LAI, height, biomass, yield); land cover: 3 times/year; bare soil roughness; irrigation and pumping estimations; soil texture. - Remote sensing data: remote sensing products from multi-platform (MODIS, SPOT, LANDSAT, ASTER, PLEIADES, ASAR, COSMO-SkyMed, TerraSAR X…), multi-wavelength (solar, micro-wave and thermal) and multi-resolution (0.5 meters to 1 km) sources. Ground observations are used (1) to calibrate soil-vegetation-atmosphere models at the field scale for different compartments and for irrigated and rainfed land over limited periods (seasons or sets of dry and wet years), and (2) to calibrate and validate, in particular, evapotranspiration derived from multi-wavelength satellite data at the watershed level in relation to aquifer conditions (pumping and recharge rates). We will point out some examples.

  14. Multi-scale Material Parameter Identification Using LS-DYNA® and LS-OPT®

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stander, Nielen; Basudhar, Anirban; Basu, Ushnish

    2015-09-14

    Ever-tightening regulations on fuel economy, and the likely future regulation of carbon emissions, demand persistent innovation in vehicle design to reduce vehicle mass. Classical methods for computational mass reduction include sizing, shape and topology optimization. One of the few remaining options for weight reduction can be found in materials engineering and material design optimization. Apart from considering different types of materials, by adding material diversity and composite materials, an appealing option in automotive design is to engineer steel alloys for the purpose of reducing plate thickness while retaining sufficient strength and ductility required for durability and safety. A project to develop computational material models for advanced high strength steel is currently being executed under the auspices of the United States Automotive Materials Partnership (USAMP), funded by the US Department of Energy. Under this program, new Third Generation Advanced High Strength Steels (3GAHSS) are being designed, tested and integrated with the remaining design variables of a benchmark vehicle finite element model. The objectives of the project are to integrate atomistic, microstructural, forming and performance models to create an integrated computational materials engineering (ICME) toolkit for 3GAHSS. The mechanical properties of Advanced High Strength Steels (AHSS) are controlled by many factors, including phase composition and distribution in the overall microstructure, volume fraction, size and morphology of phase constituents, as well as the stability of the metastable retained austenite phase. The complex phase transformation and deformation mechanisms in these steels make the well-established traditional techniques obsolete, and a multi-scale microstructure-based modeling approach following the ICME strategy was therefore chosen in this project. Multi-scale modeling as a major area of research and development is an outgrowth of the Comprehensive Test Ban Treaty of 1996, which banned surface testing of nuclear devices [1]. This had the effect that experimental work was reduced from large scale tests to multiscale experiments to provide material models with validation at different length scales. In the subsequent years industry realized that multi-scale modeling and simulation-based design were transferable to the design optimization of any structural system. Horstemeyer [1] lists a number of advantages of the use of multiscale modeling. Among these are: the reduction of product development time by alleviating costly trial-and-error iterations, as well as the reduction of product costs through innovations in material, product and process designs. Multi-scale modeling can reduce the number of costly large scale experiments and can increase product quality by providing more accurate predictions. Research tends to be focused on each particular length scale, which enhances accuracy in the long term. This paper serves as an introduction to the LS-OPT and LS-DYNA methodology for multi-scale modeling. It mainly focuses on an approach to integrate material identification using material models of different length scales. As an example, a multi-scale material identification strategy, consisting of a Crystal Plasticity (CP) material model and a homogenized State Variable (SV) model, is discussed, and the parameter identification of the individual material models of different length scales is demonstrated. The paper concludes with thoughts on integrating the multi-scale methodology into the overall vehicle design.
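
    As a greatly simplified illustration of the parameter-identification step, the sketch below fits the parameters of a Voce-type hardening law (standing in here for the homogenized state-variable model) to a synthetic stress-strain curve with SciPy. In the actual workflow, LS-OPT drives LS-DYNA simulations rather than evaluating a closed-form expression, and the parameter values below are invented.

    ```python
    # Greatly simplified parameter identification sketch: fit a Voce-type
    # hardening law to a synthetic stress-strain curve by least squares.
    import numpy as np
    from scipy.optimize import least_squares

    def voce_stress(params, strain):
        sigma_0, sigma_sat, theta = params   # initial yield, saturation stress, hardening rate
        return sigma_sat - (sigma_sat - sigma_0) * np.exp(-theta * strain / (sigma_sat - sigma_0))

    # Synthetic "experimental" data generated with known parameters plus noise.
    true_params = np.array([400.0, 900.0, 8000.0])       # MPa, MPa, MPa
    strain = np.linspace(0.0, 0.2, 50)
    stress_exp = voce_stress(true_params, strain) + np.random.default_rng(0).normal(0, 5, strain.size)

    def residuals(params):
        return voce_stress(params, strain) - stress_exp

    fit = least_squares(residuals, x0=[300.0, 800.0, 5000.0],
                        bounds=([0, 0, 0], [2000, 2000, 1e5]))
    print("identified parameters [MPa]:", np.round(fit.x, 1))
    ```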

  15. Outbreaks associated to large open air festivals, including music festivals, 1980 to 2012.

    PubMed

    Botelho-Nevers, E; Gautret, P

    2013-03-14

    In the minds of many, large scale open air festivals have become associated with spring and summer, attracting many people and, in the case of music festivals, thousands of music fans. These festivals share the usual health risks associated with large mass gatherings, including transmission of communicable diseases and risk of outbreaks. Large scale open air festivals have, however, specific characteristics, including outdoor settings, on-site housing and food supply and the generally young age of the participants. Outbreaks at large scale open air festivals have been caused by Cryptosporidium parvum, Campylobacter spp., Escherichia coli, Salmonella enterica, Shigella sonnei, Staphylococcus aureus, hepatitis A virus, influenza virus, measles virus, mumps virus and norovirus. Faecal-oral and respiratory transmission of pathogens results from non-compliance with hygiene rules, inadequate sanitation and insufficient vaccination coverage. Sexual transmission of infectious diseases may also occur and is likely to be underestimated and underreported. Enhanced surveillance during and after festivals is essential. Preventive measures such as immunisation of participants and advice on-site and via social networks should be considered to reduce outbreaks at these large scale open air festivals.

  16. Intelligent Facades for High Performance Green Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyson, Anna

    Progress Towards Net-Zero and Net-Positive-Energy Commercial Buildings and Urban Districts Through Intelligent Building Envelope Strategies. Previous research and development of intelligent facade systems has been limited in its contribution towards national goals for achieving on-site net-zero buildings, because this R&D has failed to couple the many qualitative requirements of building envelopes, such as the provision of daylighting, access to exterior views, and the satisfaction of aesthetic and cultural characteristics, with the quantitative metrics of energy harvesting, storage and redistribution. To achieve energy self-sufficiency from on-site solar resources, building envelopes can and must address this gamut of concerns simultaneously. With this project, we have undertaken a high-performance building-integrated combined-heat-and-power concentrating photovoltaic system with high temperature thermal capture, storage and transport towards multiple applications (BICPV/T). The critical contribution we are offering with the Integrated Concentrating Solar Façade (ICSF) is conceived to improve daylighting quality for improved health of occupants and to mitigate solar heat gain while maximally capturing and transferring on-site solar energy. The ICSF accomplishes this multi-functionality by intercepting only the direct-normal component of solar energy (which is responsible for elevated cooling loads), thereby transforming a previously problematic source of energy into a high quality resource that can be applied to building demands such as heating, cooling, dehumidification, domestic hot water, and possible further augmentation of electrical generation through organic Rankine cycles. With the ICSF technology, our team is addressing the global challenge of transitioning commercial and residential building stock towards on-site clean energy self-sufficiency, by fully integrating innovative environmental control systems strategies within an intelligent and responsively dynamic building envelope. The advantage of being able to use the entire solar spectrum for active and passive benefits, along with the potential savings of avoiding transmission losses through direct current (DC) transfer to all building systems directly from the site of solar conversion, gives the system a compounded economic viability within the commercial and institutional building markets. With a team that spans multiple stakeholders across disparate industries, from CPV to A&E partners responsible for the design and development of district- and regional-scale urban development, this project demonstrates that integrating utility-scale high efficiency CPV installations with urban and suburban environments is both viable and desirable within the marketplace. The historical schism between utility-scale CPV and BIPV has been one of differing scale and cultures. There is no technical reason why utility-scale CPV cannot be located within urban embedded district-scale sites of energy harvesting. New models for leasing large areas of district-scale roofs and facades are emerging, such that the model for utility-scale energy harvesting can be reconciled to commercial and public-scale building sites and campuses. This consortium is designed to unite utility-scale solar harvesting into building applications for smart grid development.

  17. Carbon diffusion in bulk hcp zirconium: A multi-scale approach

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Roques, J.; Domain, C.; Simoni, E.

    2016-05-01

    In the framework of the geological repository of the used fuel claddings of pressurized water reactor, carbon behavior in bulk zirconium is studied by periodic Density Functional Theory calculations. The C interstitial sites were investigated and it was found that there are two possible carbon interstitial sites: a distorted basal tetragonal site and an octahedral site. There are four types of possible atomic jumps between them. After calculating the migration energies, the attempt frequencies and the jump probabilities for each possible migration path, kinetic Monte Carlo (KMC) simulations were performed to simulate carbon diffusion at the macroscopic scale. The results show that carbon diffusion in pure Zr bulk is extremely limited at the storage temperature (50 °C). Since there are defects in Zr bulk, in a second step, the effect of atomic vacancy was studied and it was proved that vacancies cannot increase carbon diffusion.
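
    The jump rates feeding such a kinetic Monte Carlo simulation follow an Arrhenius form, rate = nu * exp(-E_m / k_B T). The sketch below evaluates placeholder rates and performs a single KMC event selection; the migration energies and attempt frequencies are invented stand-ins, not the DFT values of the study.

    ```python
    # Hedged KMC sketch: Arrhenius jump rates from migration energies and attempt
    # frequencies, plus one kinetic Monte Carlo event selection. Numerical values
    # are placeholders, not the DFT results of the study.
    import math
    import random

    K_B = 8.617e-5          # Boltzmann constant [eV/K]
    T = 323.15              # storage temperature ~50 C [K]

    # Hypothetical jump paths between interstitial sites:
    # (migration energy [eV], attempt frequency [1/s]).
    paths = {
        "basal_tetra -> octa": (0.95, 1.0e13),
        "octa -> basal_tetra": (1.05, 1.0e13),
        "octa -> octa":        (1.20, 1.0e13),
        "basal_tetra -> basal_tetra": (1.10, 1.0e13),
    }

    rates = {name: nu * math.exp(-e_m / (K_B * T)) for name, (e_m, nu) in paths.items()}
    total_rate = sum(rates.values())

    # One KMC step: pick an event with probability proportional to its rate,
    # then advance the clock by an exponentially distributed residence time.
    r = random.random() * total_rate
    cumulative = 0.0
    for name, rate in rates.items():
        cumulative += rate
        if r <= cumulative:
            chosen = name
            break
    dt = -math.log(1.0 - random.random()) / total_rate
    print(f"chosen jump: {chosen}, residence time ~ {dt:.3g} s")
    ```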

  18. 3MRA: A MULTI-MEDIA HUMAN AND ECOLOGICAL MODELING SYSTEM FOR SITE-SPECIFIC TO NATIONAL SCALE REGULATORY APPLICATIONS

    EPA Science Inventory

    3MRA provides a technology that fully integrates the full dimensionality of human and ecological exposure and risk assessment, thus allowing regulatory decisions a more complete expression of potential adverse health effects related to the disposal and reuse of contaminated waste...

  19. DEVELOP MULTI-STRESSOR, OPEN ARCHITECTURE MODELING FRAMEWORK FOR ECOLOGICAL EXPOSURE FROM SITE TO WATERSHED SCALE

    EPA Science Inventory

    A number of multimedia modeling frameworks are currently being developed. The Multimedia Integrated Modeling System (MIMS) is one of these frameworks. A framework should be seen as more of a multimedia modeling infrastructure than a single software system. This infrastructure do...

  20. Nitrogen mineralization in riparian soils along a river continuum within a multi-landuse basin

    EPA Science Inventory

    Nitrogen dynamics in riparian systems are often addressed within one landuse type and are rarely studied on watershed scales involving multiple land uses. This study tested for both temporal trends and watershed-wide spatial patterns in N mineralization and identified site fact...

  1. Animal movement data: GPS telemetry, autocorrelation and the need for path-level analysis [chapter 7

    Treesearch

    Samuel A. Cushman

    2010-01-01

    In the previous chapter we presented the idea of a multi-layer, multi-scale, spatially referenced data-cube as the foundation for monitoring and for implementing flexible modeling of ecological pattern-process relationships in particulate, in context and to integrate these across large spatial extents at the grain of the strongest linkage between response and driving...

  2. A multi-scale approach of fluvial biogeomorphic dynamics using photogrammetry.

    PubMed

    Hortobágyi, Borbála; Corenblit, Dov; Vautier, Franck; Steiger, Johannes; Roussel, Erwan; Burkart, Andreas; Peiry, Jean-Luc

    2017-11-01

    Over the last twenty years, significant technical advances turned photogrammetry into a relevant tool for the integrated analysis of biogeomorphic cross-scale interactions within vegetated fluvial corridors, which will largely contribute to the development and improvement of self-sustainable river restoration efforts. Here, we propose a cost-effective, easily reproducible approach based on stereophotogrammetry and Structure from Motion (SfM) technique to study feedbacks between fluvial geomorphology and riparian vegetation at different nested spatiotemporal scales. We combined different photogrammetric methods and thus were able to investigate biogeomorphic feedbacks at all three spatial scales (i.e., corridor, alluvial bar and micro-site) and at three different temporal scales, i.e., present, recent past and long term evolution on a diversified riparian landscape mosaic. We evaluate the performance and the limits of photogrammetric methods by targeting a set of fundamental parameters necessary to study biogeomorphic feedbacks at each of the three nested spatial scales and, when possible, propose appropriate solutions. The RMSE varies between 0.01 and 2 m depending on spatial scale and photogrammetric methods. Despite some remaining difficulties to properly apply them with current technologies under all circumstances in fluvial biogeomorphic studies, e.g. the detection of vegetation density or landform topography under a dense vegetation canopy, we suggest that photogrammetry is a promising instrument for the quantification of biogeomorphic feedbacks at nested spatial scales within river systems and for developing appropriate river management tools and strategies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Large-scale drivers of malaria and priority areas for prevention and control in the Brazilian Amazon region using a novel multi-pathogen geospatial model.

    PubMed

    Valle, Denis; Lima, Joanna M Tucker

    2014-11-20

    Most of the malaria burden in the Americas is concentrated in the Brazilian Amazon, but a detailed spatial characterization of malaria risk has yet to be undertaken. Utilizing 2004-2008 malaria incidence data collected from six Brazilian Amazon states, large-scale spatial patterns of malaria risk were characterized with a novel Bayesian multi-pathogen geospatial model. Data included 2.4 million malaria cases spread across 3.6 million sq km. Remotely sensed variables (deforestation rate, forest cover, rainfall, dry season length, and proximity to large water bodies), socio-economic variables (rural population size, income, literacy rate, mortality rate for children under age five, and migration patterns), and GIS variables (proximity to roads, hydro-electric dams and gold mining operations) were incorporated as covariates. Borrowing information across pathogens allowed for better spatial predictions of malaria caused by Plasmodium falciparum, as evidenced by a ten-fold cross-validation. Malaria incidence for both Plasmodium vivax and P. falciparum tended to be higher in areas with greater forest cover. Proximity to gold mining operations was another important risk factor, corroborated by a positive association between migration rates and malaria incidence. Finally, areas with a longer dry season and areas with higher average rural income tended to have higher malaria risk. Risk maps reveal striking spatial heterogeneity in malaria risk across the region, yet these mean disease risk surface maps can be misleading if uncertainty is ignored. By combining mean spatial predictions with their associated uncertainty, several sites were consistently classified as hotspots, suggesting their importance as priority areas for malaria prevention and control. This article provides several contributions. From a methodological perspective, the benefits of jointly modelling multiple pathogens for spatial predictions were illustrated. In addition, maps of mean disease risk were contrasted with maps of statistically significant disease clusters, highlighting the critical importance of uncertainty in determining disease hotspots. From an epidemiological perspective, forest cover and proximity to gold mining operations were important large-scale drivers of disease risk in the region. Finally, the hotspot in western Acre was identified as the area that should receive highest priority from the Brazilian national malaria prevention and control programme.

  4. Comparison Analysis among Large Amount of SNS Sites

    NASA Astrophysics Data System (ADS)

    Toriumi, Fujio; Yamamoto, Hitoshi; Suwa, Hirohiko; Okada, Isamu; Izumi, Kiyoshi; Hashimoto, Yasuhiro

    In recent years, Social Networking Services (SNS) and blogs have grown as new communication tools on the Internet. Several large-scale SNS sites are prospering; meanwhile, many sites of relatively small scale are offering services. Such small-scale SNSs support isolated, small-group communication of a kind that neither mixi nor MySpace provides. However, most studies of SNSs concern particular large-scale sites and cannot establish whether their findings reflect general features or characteristics specific to those sites. From the standpoint of comparative analysis, comparing just a handful of sites cannot reach a statistically significant level. We therefore analyze many SNS sites with the aim of classifying them using several approaches. This paper classifies 50,000 small-scale SNS sites and characterizes them in terms of network structure, patterns of communication, and growth rate. The analysis of network structure shows that many SNS sites have the small-world attributes of short path lengths and high clustering coefficients. The degree distributions of the SNS sites are close to a power law, which indicates that small-scale SNS sites have a higher percentage of users with many friends than mixi. According to the analysis of assortativity coefficients, these SNS sites have negative assortativity, meaning that users with high degree tend to connect to users with small degree. Next, we analyze the patterns of user communication. A friend network on an SNS is explicit, whereas users' communication behaviors define an implicit network. What kind of relationship do these networks have? To address this question, we derive characteristics of users' communication structure and activation patterns on the SNS sites. Using two new indexes, the friend aggregation rate and the friend coverage rate, we show that SNS sites with a high friend coverage rate have active diary postings and comments. Moreover, sites with a high friend aggregation rate and a high friend coverage rate become activated when hub users with high degree are not very active, whereas sites with a low friend aggregation rate and a high friend coverage rate become activated when hub users are active. Finally, we observe SNS sites whose user numbers are increasing considerably, from the viewpoint of network structure, and extract the characteristics of high-growth SNS sites. Discrimination based on decision tree analysis recognizes high-growth SNS sites with a high degree of accuracy, and this approach suggests that mixi and the other small-scale SNS sites have different character traits.
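
    The structural statistics discussed above (clustering, path length, degree assortativity) can be computed with networkx; the sketch below uses a synthetic small-world graph as a stand-in for one SNS friend network.

    ```python
    # Illustrative computation of the network statistics discussed above on a
    # synthetic small-world graph standing in for one SNS friend network.
    import networkx as nx

    G = nx.connected_watts_strogatz_graph(n=2000, k=10, p=0.05, seed=7)

    print("average clustering coefficient:", round(nx.average_clustering(G), 3))
    print("average shortest path length:", round(nx.average_shortest_path_length(G), 2))
    print("degree assortativity:", round(nx.degree_assortativity_coefficient(G), 3))
    ```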

  5. Investigating In-Situ Mass Transfer Processes in a Groundwater U Plume Influenced by Groundwater-River Hydrologic and Geochemical Coupling (Invited)

    NASA Astrophysics Data System (ADS)

    Zachara, J. M.

    2009-12-01

    The Hanford Integrated Field Research Challenge (IFRC) site is a DOE/BER-supported experimental and monitoring facility focused on multi-scale mass transfer processes (hanfordifc@pnl.gov). It is located within the footprint of a historic uranium (U) waste disposal pond that overlies a contaminated vadose zone and a 1 km+ groundwater U plume. The plume is under a regulatory clean-up mandate. The site is in hydraulic connectivity with the Columbia River that is located approximately 300 m distant. Dramatic seasonal variations in Columbia River stage cause 2m+ variations in water table and associated changes in groundwater flow directions and composition that are believed to recharge contaminant U to the plume through lower vadose zone pumping. The 60 m triangular shaped facility contains 37 monitoring wells equipped with down-hole electrical resistance tomography electrode and thermistor arrays, pressure transducers for continual water level monitoring, and specific conductance electrodes. Well spacings allow cross-hole geophysical interrogation and dynamic plume monitoring. Various geophysical and hydrologic field characterizations were performed during and after well installation, and retrieved sediments are being subjected to a hierarchal laboratory characterization process to support geostatistical models of hydrologic properties, U(VI) distribution and speciation, and equilibrium and kinetic reaction parameters for robust but tractable field-scale reactive transport calculations. Three large scale (10,000 gal+), non-reactive tracer experiments have been performed to evaluate groundwater flowpaths and velocities, facies scale mass transfer, and subsurface heterogeneity effects under different hydrologic conditions (e.g., flow vectors toward or away from the river). A passive monitoring experiment was completed during spring and summer of 2009 that documents spatially variable U(VI) release and plume recharge from the contaminated lower vadose zone during oscillating rising and falling water table events. A large scale injection experiment to evaluate in situ U(VI) desorption kinetics controlled by mass transfer is planned for the fall of 2009. The presentation will summarize key results from these different activities, and discuss their implications to improved plume forecasting and development of an effective groundwater remedy.

  6. Homogenization of Large-Scale Movement Models in Ecology

    USGS Publications Warehouse

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
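
    For reference, the distinction between Fickian and ecological diffusion invoked above can be written as follows, with u the population density and μ(x) the spatially varying motility (generic notation):

    ```latex
    % Fickian diffusion (flux follows gradients of density):
    \[
      \frac{\partial u}{\partial t} = \nabla \cdot \bigl( \mu(\mathbf{x}) \, \nabla u \bigr)
    \]
    % Ecological diffusion (movement depends only on local habitat type):
    \[
      \frac{\partial u}{\partial t} = \nabla^{2} \bigl( \mu(\mathbf{x}) \, u \bigr)
    \]
    ```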

  7. Numerical Upscaling of Solute Transport in Fractured Porous Media Based on Flow Aligned Blocks

    NASA Astrophysics Data System (ADS)

    Leube, P.; Nowak, W.; Sanchez-Vila, X.

    2013-12-01

    High-contrast or fractured-porous media (FPM) pose one of the largest unresolved challenges for simulating large hydrogeological systems. The high contrast in advective transport between fast conduits and the low-permeability rock matrix, including complex mass transfer processes, leads to the typical complex characteristics of early bulk arrivals and long tails. Adequate direct representation of FPM requires enormous numerical resolution. For large scales, e.g. the catchment scale, and when allowing for uncertainty in the fracture network architecture or in matrix properties, computational costs quickly reach an intractable level. In such cases, multi-scale simulation techniques have become useful tools. They allow decreasing the complexity of models by aggregating and transferring their parameters to coarser scales and so drastically reduce the computational costs. However, these advantages come at a loss of detail and accuracy. In this work, we develop and test a new multi-scale or upscaled modeling approach based on block upscaling. The novelty is that individual blocks are defined by and aligned with the local flow coordinates. We choose a multi-rate mass transfer (MRMT) model to represent the remaining sub-block non-Fickian behavior within these blocks on the coarse scale. To make the scale transition simple and to save computational costs, we capture sub-block features by temporal moments (TM) of block-wise particle arrival times to be matched with the MRMT model. When predicting spatial mass distributions of injected tracers in a synthetic test scenario, our coarse-scale solution matches reasonably well with the corresponding fine-scale reference solution. For higher TM orders (such as arrival time and effective dispersion), the prediction accuracy steadily decreases. This is compensated to some extent by the MRMT model. If the MRMT model becomes too complex, it loses its effect. We also found that prediction accuracy is sensitive to the choice of the effective dispersion coefficients and to the block resolution. A key advantage of the flow-aligned blocks is that the small-scale velocity field is reproduced quite accurately on the block scale through their flow alignment. Thus, the block-scale transverse dispersivities remain of similar magnitude to local ones, and they do not have to represent macroscopic uncertainty. Also, the flow-aligned blocks minimize numerical dispersion when solving the large-scale transport problem.
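
    The temporal moments used for the scale transition are simple statistics of block-wise particle arrival times. The sketch below computes the zeroth, first, and second central moments of a synthetic arrival-time sample; the distribution is invented only to mimic an early peak with a long tail.

    ```python
    # Hedged sketch: temporal moments of block-wise particle arrival times, the
    # quantities matched against the multi-rate mass transfer (MRMT) model in the
    # upscaling approach described above. Arrival times here are synthetic.
    import numpy as np

    rng = np.random.default_rng(3)
    # Synthetic arrival times [days]: fast preferential-flow arrivals plus a long tail.
    arrivals = np.concatenate([rng.gamma(2.0, 5.0, 800), rng.gamma(2.0, 60.0, 200)])

    m0 = arrivals.size        # zeroth moment ~ recovered mass (particle count)
    m1 = arrivals.mean()      # first moment ~ mean arrival time
    m2c = arrivals.var()      # second central moment ~ spread / effective dispersion

    print(f"m0 = {m0}, mean arrival = {m1:.1f} d, central second moment = {m2c:.1f} d^2")
    ```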

  8. Frustration-guided motion planning reveals conformational transitions in proteins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budday, Dominik; Fonseca, Rasmus; Leyendecker, Sigrid

    Proteins exist as conformational ensembles, exchanging between substates to perform their function. Advances in experimental techniques yield unprecedented access to structural snapshots of their conformational landscape. However, computationally modeling how proteins use collective motions to transition between substates is challenging owing to a rugged landscape and large energy barriers. Here, we present a new, robotics-inspired motion planning procedure called dCC-RRT that navigates the rugged landscape between substates by introducing dynamic, interatomic constraints to modulate frustration. The constraints balance non-native contacts and flexibility, and instantaneously redirect the motion towards sterically favorable conformations. On a test set of eight proteins determined in two conformations separated by, on average, 7.5 Å root mean square deviation (RMSD), our pathways reduced the Cα atom RMSD to the goal conformation by 78%, outperforming peer methods. We then applied dCC-RRT to examine how collective, small-scale motions of four side-chains in the active site of cyclophilin A propagate through the protein. dCC-RRT uncovered a spatially contiguous network of residues linked by steric interactions and collective motion connecting the active site to a recently proposed, non-canonical capsid binding site 25 Å away, rationalizing NMR and multi-temperature crystallography experiments. In all, dCC-RRT can reveal detailed, all-atom molecular mechanisms for small and large amplitude motions. Source code and binaries are freely available at https://github.com/ExcitedStates/KGS/.

  9. Frustration-guided motion planning reveals conformational transitions in proteins.

    PubMed

    Budday, Dominik; Fonseca, Rasmus; Leyendecker, Sigrid; van den Bedem, Henry

    2017-10-01

    Proteins exist as conformational ensembles, exchanging between substates to perform their function. Advances in experimental techniques yield unprecedented access to structural snapshots of their conformational landscape. However, computationally modeling how proteins use collective motions to transition between substates is challenging owing to a rugged landscape and large energy barriers. Here, we present a new, robotics-inspired motion planning procedure called dCC-RRT that navigates the rugged landscape between substates by introducing dynamic, interatomic constraints to modulate frustration. The constraints balance non-native contacts and flexibility, and instantaneously redirect the motion towards sterically favorable conformations. On a test set of eight proteins determined in two conformations separated by, on average, 7.5 Å root mean square deviation (RMSD), our pathways reduced the Cα atom RMSD to the goal conformation by 78%, outperforming peer methods. We then applied dCC-RRT to examine how collective, small-scale motions of four side-chains in the active site of cyclophilin A propagate through the protein. dCC-RRT uncovered a spatially contiguous network of residues linked by steric interactions and collective motion connecting the active site to a recently proposed, non-canonical capsid binding site 25 Å away, rationalizing NMR and multi-temperature crystallography experiments. In all, dCC-RRT can reveal detailed, all-atom molecular mechanisms for small and large amplitude motions. Source code and binaries are freely available at https://github.com/ExcitedStates/KGS/. © 2017 Wiley Periodicals, Inc.
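
    As a loose illustration of the rapidly-exploring random tree idea that dCC-RRT builds on, the sketch below grows a plain RRT in a 2-D toy configuration space; the all-atom kinematics and the dynamic clash-avoiding constraints of the actual method are not represented.

    ```python
    # Minimal 2-D RRT sketch illustrating the tree-growing idea behind dCC-RRT.
    # This only shows the basic rapidly-exploring random tree, not the method itself.
    import math
    import random

    random.seed(0)
    start, goal = (0.0, 0.0), (9.0, 9.0)
    step, goal_tol = 0.5, 0.5
    nodes = [start]
    parent = {start: None}

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    for _ in range(5000):
        # Sample the goal occasionally (goal bias), otherwise a random configuration.
        sample = goal if random.random() < 0.05 else (random.uniform(0, 10), random.uniform(0, 10))
        nearest = min(nodes, key=lambda n: dist(n, sample))
        d = dist(nearest, sample)
        if d == 0.0:
            continue
        # Extend the tree a fixed step from the nearest node toward the sample.
        new = (nearest[0] + step * (sample[0] - nearest[0]) / d,
               nearest[1] + step * (sample[1] - nearest[1]) / d)
        nodes.append(new)
        parent[new] = nearest
        if dist(new, goal) < goal_tol:
            print(f"reached goal region after {len(nodes)} nodes")
            break
    ```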

  10. Frustration-guided motion planning reveals conformational transitions in proteins

    DOE PAGES

    Budday, Dominik; Fonseca, Rasmus; Leyendecker, Sigrid; ...

    2017-07-12

    Proteins exist as conformational ensembles, exchanging between substates to perform their function. Advances in experimental techniques yield unprecedented access to structural snapshots of their conformational landscape. However, computationally modeling how proteins use collective motions to transition between substates is challenging owing to a rugged landscape and large energy barriers. Here, we present a new, robotics-inspired motion planning procedure called dCC-RRT that navigates the rugged landscape between substates by introducing dynamic, interatomic constraints to modulate frustration. The constraints balance non-native contacts and flexibility, and instantaneously redirect the motion towards sterically favorable conformations. On a test set of eight proteins determined in two conformations separated by, on average, 7.5 Å root mean square deviation (RMSD), our pathways reduced the Cα atom RMSD to the goal conformation by 78%, outperforming peer methods. We then applied dCC-RRT to examine how collective, small-scale motions of four side-chains in the active site of cyclophilin A propagate through the protein. dCC-RRT uncovered a spatially contiguous network of residues linked by steric interactions and collective motion connecting the active site to a recently proposed, non-canonical capsid binding site 25 Å away, rationalizing NMR and multi-temperature crystallography experiments. In all, dCC-RRT can reveal detailed, all-atom molecular mechanisms for small and large amplitude motions. Source code and binaries are freely available at https://github.com/ExcitedStates/KGS/.

  11. Progression marker of Parkinson's disease: a 4-year multi-site imaging study.

    PubMed

    Burciu, Roxana G; Ofori, Edward; Archer, Derek B; Wu, Samuel S; Pasternak, Ofer; McFarland, Nikolaus R; Okun, Michael S; Vaillancourt, David E

    2017-08-01

    Progression markers of Parkinson's disease are crucial for successful therapeutic development. Recently, a diffusion magnetic resonance imaging analysis technique using a bitensor model was introduced allowing the estimation of the fractional volume of free water within a voxel, which is expected to increase in neurodegenerative disorders such as Parkinson's disease. Prior work demonstrated that free water in the posterior substantia nigra was elevated in Parkinson's disease compared to controls across single- and multi-site cohorts, and increased over 1 year in Parkinson's disease but not in controls at a single site. Here, the goal was to validate free water in the posterior substantia nigra as a progression marker in Parkinson's disease, and describe the pattern of progression of free water in patients with a 4-year follow-up tested in a multicentre international longitudinal study of de novo Parkinson's disease (http://www.ppmi-info.org/). The analyses examined: (i) 1-year changes in free water in 103 de novo patients with Parkinson's disease and 49 controls; (ii) 2- and 4-year changes in free water in a subset of 46 patients with Parkinson's disease imaged at baseline, 12, 24, and 48 months; (iii) whether 1- and 2-year changes in free water predict 4-year changes in the Hoehn and Yahr scale; and (iv) the relationship between 4-year changes in free water and striatal binding ratio in a subgroup of Parkinson's disease who had undergone both diffusion and dopamine transporter imaging. Results demonstrated that: (i) free water level in the posterior substantia nigra increased over 1 year in de novo Parkinson's disease but not in controls; (ii) free water kept increasing over 4 years in Parkinson's disease; (iii) sex and baseline free water predicted 4-year changes in free water; (iv) free water increases over 1 and 2 years were related to worsening on the Hoehn and Yahr scale over 4 years; and (v) the 4-year increase in free water was associated with the 4-year decrease in striatal binding ratio in the putamen. Importantly, all longitudinal results were consistent across sites. In summary, this study demonstrates an increase over 1 year in free water in the posterior substantia nigra in a large cohort of de novo patients with Parkinson's disease from a multi-site cohort study and no change in healthy controls, and further demonstrates an increase of free water in Parkinson's disease over the course of 4 years. A key finding was that results are consistent across sites and the 1-year and 2-year increase in free water in the posterior substantia nigra predicts subsequent long-term progression on the Hoehn and Yahr staging system. Collectively, these findings demonstrate that free water in the posterior substantia nigra is a valid, progression imaging marker of Parkinson's disease, which may be used in clinical trials of disease-modifying therapies. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain.
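    A hedged illustration of the kind of prediction question posed above (do short-term free-water changes predict long-term progression?) is sketched below on purely synthetic data. The cohort size is taken from the abstract, but the variable scales, effect sizes, and the simple linear model are assumptions, not the study's statistical analysis.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic illustration (not PPMI data): does the 1-year change in nigral
# free water "predict" the 4-year change in Hoehn & Yahr stage?
rng = np.random.default_rng(0)
n = 103                                       # cohort size quoted in the abstract
baseline_fw = rng.normal(0.15, 0.03, n)       # fractional free-water volume (invented scale)
delta_fw_1y = rng.normal(0.01, 0.005, n)      # 1-year change (invented scale)
# Assume a positive association plus noise, purely for demonstration.
delta_hy_4y = 20.0 * delta_fw_1y + 2.0 * baseline_fw + rng.normal(0, 0.3, n)

X = np.column_stack([delta_fw_1y, baseline_fw])
model = LinearRegression().fit(X, delta_hy_4y)
print("R^2 on synthetic data:", round(model.score(X, delta_hy_4y), 3))
print("coefficients (dFW_1y, baseline FW):", model.coef_.round(2))
```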

  12. Follow YOUR Heart: development of an evidence-based campaign empowering older women with HIV to participate in a large-scale cardiovascular disease prevention trial.

    PubMed

    Zanni, Markella V; Fitch, Kathleen; Rivard, Corinne; Sanchez, Laura; Douglas, Pamela S; Grinspoon, Steven; Smeaton, Laura; Currier, Judith S; Looby, Sara E

    2017-03-01

    Women's under-representation in HIV and cardiovascular disease (CVD) research suggests a need for novel strategies to ensure robust representation of women in HIV-associated CVD research. To elicit perspectives on CVD research participation among a community-sample of women with or at risk for HIV, and to apply acquired insights toward the development of an evidence-based campaign empowering older women with HIV to participate in a large-scale CVD prevention trial. In a community-based setting, we surveyed 40 women with or at risk for HIV about factors which might facilitate or impede engagement in CVD research. We applied insights derived from these surveys into the development of the Follow YOUR Heart campaign, educating women about HIV-associated CVD and empowering them to learn more about a multi-site HIV-associated CVD prevention trial: REPRIEVE. Endorsed best methods for learning about a CVD research study included peer-to-peer communication (54%), provider communication (46%) and video-based communication (39%). Top endorsed non-monetary reasons for participating in research related to gaining information (63%) and helping others (47%). Top endorsed reasons for not participating related to lack of knowledge about studies (29%) and lack of request to participate (29%). Based on survey results, the REPRIEVE Follow YOUR Heart campaign was developed. Interwoven campaign components (print materials, video, web presence) offer provider-based information/knowledge, peer-to-peer communication, and empowerment to learn more. Campaign components reflect women's self-identified motivations for research participation - education and altruism. Investigation of factors influencing women's participation in HIV-associated CVD research may be usefully applied to develop evidence-based strategies for enhancing women's enrollment in disease-specific large-scale trials. If proven efficacious, such strategies may enhance conduct of large-scale research studies across disciplines.

  13. Multi scales based sparse matrix spectral clustering image segmentation

    NASA Astrophysics Data System (ADS)

    Liu, Zhongmin; Chen, Zhicai; Li, Zhanming; Hu, Wenjin

    2018-04-01

In image segmentation, spectral clustering algorithms must adopt an appropriate scaling parameter to calculate the similarity matrix between pixels, which can have a great impact on the clustering result. Moreover, when the number of data instances is large, the computational complexity and memory use of the algorithm increase greatly. To solve these two problems, we propose a new spectral clustering image segmentation algorithm based on multiple scales and a sparse matrix. We first devise a new feature extraction method, then extract image features at different scales, and finally use the feature information to construct a sparse similarity matrix, which improves operational efficiency. Compared with the traditional spectral clustering algorithm, image segmentation experiments show that our algorithm achieves better accuracy and robustness.
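    The sketch below illustrates the general idea of sparse spectral clustering for segmentation with scikit-learn: a pixel-adjacency graph keeps the similarity matrix sparse, and a scaling parameter beta sets the similarity decay. It is a minimal stand-in, not the authors' multi-scale feature extraction method; the toy image and beta value are assumptions.

```python
import numpy as np
from sklearn.feature_extraction.image import img_to_graph
from sklearn.cluster import spectral_clustering

# Toy image: a bright square on a dark background with mild noise.
rng = np.random.default_rng(1)
img = np.zeros((40, 40))
img[10:30, 10:30] = 1.0
img += rng.normal(0, 0.05, img.shape)

# Sparse graph: only neighboring pixels are connected, so the similarity
# matrix stays sparse rather than dense all-pairs.
graph = img_to_graph(img)
beta = 10.0                                   # scaling parameter (assumed)
graph.data = np.exp(-beta * graph.data / graph.data.std()) + 1e-6

labels = spectral_clustering(graph, n_clusters=2, eigen_solver="arpack",
                             random_state=0)
segmentation = labels.reshape(img.shape)
print("pixels per segment:", np.bincount(labels))
```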

  14. Geospatial optimization of siting large-scale solar projects

    USGS Publications Warehouse

    Macknick, Jordan; Quinby, Ted; Caulfield, Emmet; Gerritsen, Margot; Diffendorfer, James E.; Haines, Seth S.

    2014-01-01

    guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.

  15. A Hybrid, Large-Scale Wireless Sensor Network for Real-Time Acquisition and Tracking

    DTIC Science & Technology

    2007-06-01

multicolor, Quantum Well Infrared Photodetector (QWIP), step-stare, large-format Focal Plane Array (FPA) is proposed and evaluated through performance analysis. The thesis proposes...

  16. Spatial and Temporal Dynamics of Pacific Oyster Hemolymph Microbiota across Multiple Scales

    PubMed Central

    Lokmer, Ana; Goedknegt, M. Anouk; Thieltges, David W.; Fiorentino, Dario; Kuenzel, Sven; Baines, John F.; Wegner, K. Mathias

    2016-01-01

    Unveiling the factors and processes that shape the dynamics of host associated microbial communities (microbiota) under natural conditions is an important part of understanding and predicting an organism's response to a changing environment. The microbiota is shaped by host (i.e., genetic) factors as well as by the biotic and abiotic environment. Studying natural variation of microbial community composition in multiple host genetic backgrounds across spatial as well as temporal scales represents a means to untangle this complex interplay. Here, we combined a spatially-stratified with a longitudinal sampling scheme within differentiated host genetic backgrounds by reciprocally transplanting Pacific oysters between two sites in the Wadden Sea (Sylt and Texel). To further differentiate contingent site from host genetic effects, we repeatedly sampled the same individuals over a summer season to examine structure, diversity and dynamics of individual hemolymph microbiota following experimental removal of resident microbiota by antibiotic treatment. While a large proportion of microbiome variation could be attributed to immediate environmental conditions, we observed persistent effects of antibiotic treatment and translocation suggesting that hemolymph microbial community dynamics is subject to within-microbiome interactions and host population specific factors. In addition, the analysis of spatial variation revealed that the within-site microenvironmental heterogeneity resulted in high small-scale variability, as opposed to large-scale (between-site) stability. Similarly, considerable within-individual temporal variability was in contrast with the overall temporal stability at the site level. Overall, our longitudinal, spatially-stratified sampling design revealed that variation in hemolymph microbiota is strongly influenced by site and immediate environmental conditions, whereas internal microbiome dynamics and oyster-related factors add to their long-term stability. The combination of small and large scale resolution of spatial and temporal observations therefore represents a crucial but underused tool to study host-associated microbiome dynamics. PMID:27630625

  17. Accelerating large-scale simulation of seismic wave propagation by multi-GPUs and three-dimensional domain decomposition

    NASA Astrophysics Data System (ADS)

    Okamoto, Taro; Takenaka, Hiroshi; Nakamura, Takeshi; Aoki, Takayuki

    2010-12-01

We adopted the GPU (graphics processing unit) to accelerate the large-scale finite-difference simulation of seismic wave propagation. The simulation can benefit from the high memory bandwidth of the GPU because it is a "memory intensive" problem. In a single-GPU case we achieved a performance of about 56 GFlops, which was about 45-fold faster than that achieved by a single core of the host central processing unit (CPU). We confirmed that the optimized use of fast shared memory and registers was essential for performance. In the multi-GPU case with three-dimensional domain decomposition, the non-contiguous memory alignment in the ghost zones was found to impose a long time on data transfer between the GPU and the host node. This problem was solved by using contiguous memory buffers for the ghost zones. We achieved a performance of about 2.2 TFlops by using 120 GPUs and 330 GB of total memory: nearly (or more than) 2200 cores of host CPUs would be required to achieve the same performance. The weak scaling was nearly proportional to the number of GPUs. We therefore conclude that GPU computing for large-scale simulation of seismic wave propagation is a promising approach, as a faster simulation is possible with reduced computational resources compared to CPUs.
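    The contiguous-buffer idea mentioned above can be illustrated on the host side with NumPy: boundary (ghost-zone) slabs of a 3D subdomain are copied into contiguous buffers so that each face can be moved in a single transfer. Array sizes and halo width below are assumptions, and the actual GPU/MPI transfer is omitted.

```python
import numpy as np

# Host-side sketch: copy the ghost (halo) slabs of a 3D subdomain into
# contiguous buffers before transfer. In a C-ordered array the x-faces are
# already contiguous slabs, but the y- and z-faces are strided views;
# np.ascontiguousarray packs each into one dense block so a single transfer
# (e.g., one cudaMemcpy or MPI message) suffices.
nx, ny, nz, halo = 64, 64, 64, 2              # illustrative sizes
field = np.arange(nx * ny * nz, dtype=np.float32).reshape(nx, ny, nz)

def pack_halos(u, h):
    """Return contiguous send buffers for the six faces of a 3D block."""
    faces = {
        "x_lo": u[:h, :, :], "x_hi": u[-h:, :, :],
        "y_lo": u[:, :h, :], "y_hi": u[:, -h:, :],
        "z_lo": u[:, :, :h], "z_hi": u[:, :, -h:],
    }
    return {name: np.ascontiguousarray(face) for name, face in faces.items()}

buffers = pack_halos(field, halo)
for name, buf in buffers.items():
    print(name, buf.shape, buf.flags["C_CONTIGUOUS"])
```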

  18. Large-scale Eucalyptus energy farms and power cogeneration

    Treesearch

    Robert C. Noroña

    1983-01-01

A thorough evaluation of all factors possibly affecting a large-scale planting of eucalyptus is foremost in determining the cost effectiveness of the planned operation. Seven basic areas of concern must be analyzed: 1. Species Selection 2. Site Preparation 3. Planting 4. Weed Control 5....

  19. A System for Information Management in BioMedical Studies—SIMBioMS

    PubMed Central

    Krestyaninova, Maria; Zarins, Andris; Viksna, Juris; Kurbatova, Natalja; Rucevskis, Peteris; Neogi, Sudeshna Guha; Gostev, Mike; Perheentupa, Teemu; Knuuttila, Juha; Barrett, Amy; Lappalainen, Ilkka; Rung, Johan; Podnieks, Karlis; Sarkans, Ugis; McCarthy, Mark I; Brazma, Alvis

    2009-01-01

    Summary: SIMBioMS is a web-based open source software system for managing data and information in biomedical studies. It provides a solution for the collection, storage, management and retrieval of information about research subjects and biomedical samples, as well as experimental data obtained using a range of high-throughput technologies, including gene expression, genotyping, proteomics and metabonomics. The system can easily be customized and has proven to be successful in several large-scale multi-site collaborative projects. It is compatible with emerging functional genomics data standards and provides data import and export in accepted standard formats. Protocols for transferring data to durable archives at the European Bioinformatics Institute have been implemented. Availability: The source code, documentation and initialization scripts are available at http://simbioms.org. Contact: support@simbioms.org; mariak@ebi.ac.uk PMID:19633095

  20. The research of medical equipment on-line detection system based on Android smartphone

    NASA Astrophysics Data System (ADS)

    Jiang, Junjie; Dong, Xinyu; Zhang, Hongjie; Liu, Mengjun

    2017-06-01

With the continual improvement of medical care and the expanding scale of medical institutions, medical equipment has become an important tool for disease diagnosis, treatment and prevention at all levels of medical institutions. The quality and accuracy of medical equipment play a key role in the doctor's diagnosis and the treatment outcome, and medical metrology, as an important technical foundation, ensures that equipment, technology and material components are accurate and that their application is safe and reliable. Medical equipment is varied, numerous, long in service life, expensive and spread across multiple sites, which brings great difficulty to maintenance, equipment management and verification. Therefore, how to integrate medical metrology deeply with advanced internet technology, information technology and new measuring methods, so as to achieve real-time monitoring, tracking, positioning and querying of medical equipment, is particularly important.

  1. Understanding Hydraulic Fracturing: A Multi-Scale Problem

    DOE PAGES

    Hyman, Jeffrey De'Haven; Gimenez Martinez, Joaquin; Viswanathan, Hari S.; ...

    2016-09-05

Despite the impact that hydraulic fracturing has had on the energy sector, the physical mechanisms that control its efficiency and environmental impacts remain poorly understood in part because the length scales involved range from nano-meters to kilo-meters. We characterize flow and transport in shale formations across and between these scales using integrated computational, theoretical, and experimental efforts. At the field scale, we use discrete fracture network modeling to simulate production at a well site whose fracture network is based on a site characterization of a shale formation. At the core scale, we use triaxial fracture experiments and a finite-element discrete-element fracture propagation model with a coupled fluid solver to study dynamic crack propagation in low permeability shale. We use lattice Boltzmann pore-scale simulations and microfluidic experiments in both synthetic and real micromodels to study pore-scale flow phenomena such as multiphase flow and mixing. A mechanistic description and integration of these multiple scales is required for accurate predictions of production and the eventual optimization of hydrocarbon extraction from unconventional reservoirs.

  2. REACH-ER: a tool to evaluate river basin remediation measures for contaminants at the catchment scale

    NASA Astrophysics Data System (ADS)

    van Griensven, Ann; Haest, Pieter Jan; Broekx, Steven; Seuntjens, Piet; Campling, Paul; Ducos, Geraldine; Blaha, Ludek; Slobodnik, Jaroslav

    2010-05-01

The European Union (EU) adopted the Water Framework Directive (WFD) in 2000, ensuring that all aquatic ecosystems meet 'good status' by 2015. However, it is a major challenge for river basin managers to meet this requirement in river basins with a high population density as well as intensive agricultural and industrial activities. The EU-financed AQUAREHAB project (FP7) specifically examines the ecological and economic impact of innovative rehabilitation technologies for multi-pressured degraded water bodies. For this purpose, a generic collaborative management tool 'REACH-ER' is being developed that can be used by stakeholders, citizens and water managers to evaluate the ecological and economic effects of different remedial actions on water bodies. The tool is built using databases from large-scale models simulating the hydrological dynamics of the river basin and sub-basins, the costs of the measures and the effectiveness of the measures in terms of ecological impact. Knowledge rules are used to describe the relationships between these data in order to compute the flux concentrations or to compute the effectiveness of measures. The management tool specifically addresses nitrate pollution and pollution by organic micropollutants. Detailed models are also used to predict the effectiveness of site remedial technologies using readily available global data. Rules describing ecological impacts are derived from ecotoxicological data for (mixtures of) specific contaminants (msPAF) and ecological indices relating effects to the presence of certain contaminants. Rules describing the cost-effectiveness of measures are derived from linear programming models identifying the least-cost combination of abatement measures to satisfy multi-pollutant reduction targets and from multi-criteria analysis.

  3. Investigating the Role of Large-Scale Domain Dynamics in Protein-Protein Interactions.

    PubMed

    Delaforge, Elise; Milles, Sigrid; Huang, Jie-Rong; Bouvier, Denis; Jensen, Malene Ringkjøbing; Sattler, Michael; Hart, Darren J; Blackledge, Martin

    2016-01-01

    Intrinsically disordered linkers provide multi-domain proteins with degrees of conformational freedom that are often essential for function. These highly dynamic assemblies represent a significant fraction of all proteomes, and deciphering the physical basis of their interactions represents a considerable challenge. Here we describe the difficulties associated with mapping the large-scale domain dynamics and describe two recent examples where solution state methods, in particular NMR spectroscopy, are used to investigate conformational exchange on very different timescales.

  4. Investigating the Role of Large-Scale Domain Dynamics in Protein-Protein Interactions

    PubMed Central

    Delaforge, Elise; Milles, Sigrid; Huang, Jie-rong; Bouvier, Denis; Jensen, Malene Ringkjøbing; Sattler, Michael; Hart, Darren J.; Blackledge, Martin

    2016-01-01

    Intrinsically disordered linkers provide multi-domain proteins with degrees of conformational freedom that are often essential for function. These highly dynamic assemblies represent a significant fraction of all proteomes, and deciphering the physical basis of their interactions represents a considerable challenge. Here we describe the difficulties associated with mapping the large-scale domain dynamics and describe two recent examples where solution state methods, in particular NMR spectroscopy, are used to investigate conformational exchange on very different timescales. PMID:27679800

  5. Three-dimensional multi-scale model of deformable platelets adhesion to vessel wall in blood flow

    PubMed Central

    Wu, Ziheng; Xu, Zhiliang; Kim, Oleg; Alber, Mark

    2014-01-01

When a blood vessel ruptures or gets inflamed, the human body responds by rapidly forming a clot to restrict the loss of blood. Platelet aggregation at the injury site of the blood vessel, occurring via platelet–platelet adhesion, tethering and rolling on the injured endothelium, is a critical initial step in blood clot formation. A novel three-dimensional multi-scale model is introduced and used in this paper to simulate receptor-mediated adhesion of deformable platelets at the site of vascular injury under different shear rates of blood flow. The novelty of the model is based on a new approach of coupling submodels at three biological scales crucial for the early clot formation: a novel hybrid cell membrane submodel to represent physiological elastic properties of a platelet, a stochastic receptor–ligand binding submodel to describe cell adhesion kinetics and a lattice Boltzmann submodel for simulating blood flow. The model implementation on a GPU cluster significantly improved simulation performance. Predictive model simulations revealed that platelet deformation, interactions between platelets in the vicinity of the vessel wall as well as the number of functional GPIbα platelet receptors played significant roles in platelet adhesion to the injury site. Variation of the number of functional GPIbα platelet receptors as well as changes of platelet stiffness can represent effects of specific drugs reducing or enhancing platelet activity. Therefore, predictive simulations can improve the search for new drug targets and help to make treatment of thrombosis patient-specific. PMID:24982253
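    A minimal sketch of stochastic, force-dependent receptor-ligand kinetics (a Bell-type off-rate) is given below as a stand-in for the adhesion submodel described above; all rate constants, the shared load, and the time step are invented for illustration.

```python
import numpy as np

# Minimal stochastic sketch of receptor-ligand bond kinetics with a
# Bell-type, force-dependent off-rate; all parameters are illustrative.
rng = np.random.default_rng(2)
n_receptors = 100
k_on = 5.0        # 1/s, bond formation rate per free receptor (assumed)
k_off0 = 2.0      # 1/s, unstressed rupture rate (assumed)
f_total = 200.0   # pN, shear-imposed load shared equally by closed bonds (assumed)
f_scale = 20.0    # pN, Bell-model force scale (assumed)
dt, t_end = 1e-3, 2.0

bound = np.zeros(n_receptors, dtype=bool)
history = []
for _ in range(int(t_end / dt)):
    n_bound = int(bound.sum())
    force_per_bond = f_total / n_bound if n_bound > 0 else 0.0
    k_off = k_off0 * np.exp(force_per_bond / f_scale)   # Bell model
    p_on = 1.0 - np.exp(-k_on * dt)
    p_off = 1.0 - np.exp(-k_off * dt)
    r = rng.random(n_receptors)
    bound = np.where(bound, r > p_off, r < p_on)         # rupture or form bonds
    history.append(int(bound.sum()))

print("mean closed bonds over the last 0.2 s:",
      round(float(np.mean(history[-200:])), 1))
```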

  6. Honeycomb: Visual Analysis of Large Scale Social Networks

    NASA Astrophysics Data System (ADS)

    van Ham, Frank; Schulz, Hans-Jörg; Dimicco, Joan M.

The rise in the use of social network sites allows us to collect large amounts of user-reported data on social structures, and analysis of this data could provide useful insights for many of the social sciences. This analysis is typically the domain of Social Network Analysis, and visualization of these structures often proves invaluable in understanding them. However, currently available visual analysis tools are not very well suited to handle the massive scale of this network data, and often resort to displaying small ego networks or heavily abstracted networks. In this paper, we present Honeycomb, a visualization tool that is able to deal with much larger scale data (with millions of connections), which we illustrate by using a large scale corporate social networking site as an example. Additionally, we introduce a new probability-based network metric to guide users to potentially interesting or anomalous patterns and discuss lessons learned during design and implementation.

  7. Large-scale Estimates of Leaf Area Index from Active Remote Sensing Laser Altimetry

    NASA Astrophysics Data System (ADS)

    Hopkinson, C.; Mahoney, C.

    2016-12-01

Leaf area index (LAI) is a key parameter that describes the spatial distribution of foliage within forest canopies, which in turn controls numerous relationships between the ground, canopy, and atmosphere. The retrieval of LAI has demonstrated success by in-situ (digital) hemispherical photography (DHP) and airborne laser scanning (ALS) data; however, field and ALS acquisitions are often spatially limited (100s of km²) and costly. Large-scale (>1000s of km²) retrievals have been demonstrated by optical sensors; however, accuracies remain uncertain due to the sensor's inability to penetrate the canopy. The spaceborne Geoscience Laser Altimeter System (GLAS) provides a possible solution in retrieving large-scale derivations whilst simultaneously penetrating the canopy. LAI retrieved by multiple DHP from 6 Australian sites, representing a cross-section of Australian ecosystems, were employed to model ALS LAI, which in turn were used to infer LAI from GLAS data at 5 other sites. An optimally filtered GLAS dataset was then employed in conjunction with a host of supplementary data to build a Random Forest (RF) model to infer predictions (and uncertainties) of LAI at a 250 m resolution across the forested regions of Australia. Predictions were validated against ALS-based LAI from 20 sites (R² = 0.64, RMSE = 1.1 m² m⁻²); MODIS-based LAI were also assessed against these sites (R² = 0.30, RMSE = 1.78 m² m⁻²) to demonstrate the strength of GLAS-based predictions. The large-scale nature of current predictions was also leveraged to demonstrate large-scale relationships of LAI with other environmental characteristics, such as canopy height, elevation, and slope. The need for such wide-scale quantification of LAI is key in the assessment and modification of forest management strategies across Australia. Such work also assists Australia's Terrestrial Ecosystem Research Network in fulfilling its government-issued mandates.
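    The sketch below shows the general shape of such a Random Forest LAI model in scikit-learn on synthetic predictors; the feature names echo the kind of supplementary variables mentioned above, but the data, effect sizes, and scores are illustrative only.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

# Synthetic Random Forest LAI model; feature names and relationships are
# invented for illustration, not taken from the GLAS/ALS analysis above.
rng = np.random.default_rng(3)
n = 2000
canopy_height = rng.uniform(2, 40, n)        # m
elevation = rng.uniform(0, 1500, n)          # m
slope = rng.uniform(0, 30, n)                # degrees
energy_ratio = rng.uniform(0, 1, n)          # canopy/total return energy (assumed)
lai = 0.12 * canopy_height + 2.5 * energy_ratio + rng.normal(0, 0.4, n)

X = np.column_stack([canopy_height, elevation, slope, energy_ratio])
X_tr, X_te, y_tr, y_te = train_test_split(X, lai, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
pred = rf.predict(X_te)
print("R2:", round(r2_score(y_te, pred), 2),
      "RMSE:", round(mean_squared_error(y_te, pred) ** 0.5, 2), "m2/m2")
```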

  8. Multi-scale controls on spatial variability in river biogeochemical cycling

    NASA Astrophysics Data System (ADS)

    Blaen, Phillip; Kurz, Marie; Knapp, Julia; Mendoza-Lera, Clara; Lee-Cullin, Joe; Klaar, Megan; Drummond, Jennifer; Jaeger, Anna; Zarnetske, Jay; Lewandowski, Joerg; Marti, Eugenia; Ward, Adam; Fleckenstein, Jan; Datry, Thibault; Larned, Scott; Krause, Stefan

    2016-04-01

Excessive nutrient concentrations are common in surface waters and groundwaters in agricultural catchments worldwide. Increasing geomorphological heterogeneity in river channels may help to attenuate nutrient pollution by facilitating water exchange fluxes with the hyporheic zone, a site of intense microbial activity where biogeochemical cycling rates can be high. However, the controls on spatial variability in biogeochemical cycling, particularly at scales relevant for river managers, are largely unknown. Here, we aimed to assess: 1) how differences in river geomorphological heterogeneity control solute transport and rates of biogeochemical cycling at sub-reach scales (10² m); and 2) the relative magnitude of these differences versus those relating to reach scale substrate variability (10³ m). We used the reactive tracer resazurin (Raz), a weakly fluorescent dye that transforms to highly fluorescent resorufin (Rru) under mildly reducing conditions, as a proxy to assess rates of biogeochemical cycling in a lowland river in southern England. Solute tracer tests were conducted in two reaches with contrasting substrates: one sand-dominated and the other gravel-dominated. Each reach was divided into sub-reaches that varied in geomorphic complexity (e.g. by the presence of pool-riffle sequences or the abundance of large woody debris). Slug injections of Raz and the conservative tracer fluorescein were conducted in each reach during baseflow conditions (Q ≈ 80 L/s) and breakthrough curves monitored using in-situ fluorometers. Preliminary results indicate overall Raz:Rru transformation rates in the gravel-dominated reach were more than 50% higher than those in the sand-dominated reach. However, high sub-reach variability in Raz:Rru transformation rates and conservative solute transport parameters suggests small scale targeted management interventions to alter geomorphic heterogeneity may be effective in creating hotspots of river biogeochemical cycling and nutrient load attenuation.

  9. Detecting recurrent gene mutation in interaction network context using multi-scale graph diffusion.

    PubMed

    Babaei, Sepideh; Hulsman, Marc; Reinders, Marcel; de Ridder, Jeroen

    2013-01-23

    Delineating the molecular drivers of cancer, i.e. determining cancer genes and the pathways which they deregulate, is an important challenge in cancer research. In this study, we aim to identify pathways of frequently mutated genes by exploiting their network neighborhood encoded in the protein-protein interaction network. To this end, we introduce a multi-scale diffusion kernel and apply it to a large collection of murine retroviral insertional mutagenesis data. The diffusion strength plays the role of scale parameter, determining the size of the network neighborhood that is taken into account. As a result, in addition to detecting genes with frequent mutations in their genomic vicinity, we find genes that harbor frequent mutations in their interaction network context. We identify densely connected components of known and putatively novel cancer genes and demonstrate that they are strongly enriched for cancer related pathways across the diffusion scales. Moreover, the mutations in the clusters exhibit a significant pattern of mutual exclusion, supporting the conjecture that such genes are functionally linked. Using multi-scale diffusion kernel, various infrequently mutated genes are found to harbor significant numbers of mutations in their interaction network neighborhood. Many of them are well-known cancer genes. The results demonstrate the importance of defining recurrent mutations while taking into account the interaction network context. Importantly, the putative cancer genes and networks detected in this study are found to be significant at different diffusion scales, confirming the necessity of a multi-scale analysis.
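    The core construction, a diffusion kernel K(β) = exp(−βL) applied at several diffusion strengths, can be sketched as below on a toy graph; the network and mutation counts are stand-ins, not the murine insertional mutagenesis data analyzed in the study.

```python
import numpy as np
import networkx as nx
from scipy.linalg import expm

# Sketch of a multi-scale graph diffusion kernel K(beta) = exp(-beta * L):
# node-level mutation counts are smoothed over an interaction network at
# several diffusion strengths (scales). Toy graph and counts are invented.
G = nx.karate_club_graph()                     # stand-in for a PPI network
L = nx.laplacian_matrix(G).toarray().astype(float)

rng = np.random.default_rng(4)
mutations = rng.poisson(0.3, G.number_of_nodes()).astype(float)

for beta in (0.1, 0.5, 2.0):                   # small -> large network neighborhood
    K = expm(-beta * L)                        # diffusion kernel at this scale
    smoothed = K @ mutations
    top = np.argsort(smoothed)[::-1][:3]
    print(f"beta={beta}: top nodes by diffused mutation score -> {top.tolist()}")
```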

  10. Constructing Optimal Coarse-Grained Sites of Huge Biomolecules by Fluctuation Maximization.

    PubMed

    Li, Min; Zhang, John Zenghui; Xia, Fei

    2016-04-12

    Coarse-grained (CG) models are valuable tools for the study of functions of large biomolecules on large length and time scales. The definition of CG representations for huge biomolecules is always a formidable challenge. In this work, we propose a new method called fluctuation maximization coarse-graining (FM-CG) to construct the CG sites of biomolecules. The defined residual in FM-CG converges to a maximal value as the number of CG sites increases, allowing an optimal CG model to be rigorously defined on the basis of the maximum. More importantly, we developed a robust algorithm called stepwise local iterative optimization (SLIO) to accelerate the process of coarse-graining large biomolecules. By means of the efficient SLIO algorithm, the computational cost of coarse-graining large biomolecules is reduced to within the time scale of seconds, which is far lower than that of conventional simulated annealing. The coarse-graining of two huge systems, chaperonin GroEL and lengsin, indicates that our new methods can coarse-grain huge biomolecular systems with up to 10,000 residues within the time scale of minutes. The further parametrization of CG sites derived from FM-CG allows us to construct the corresponding CG models for studies of the functions of huge biomolecular systems.

  11. Robust scalable stabilisability conditions for large-scale heterogeneous multi-agent systems with uncertain nonlinear interactions: towards a distributed computing architecture

    NASA Astrophysics Data System (ADS)

    Manfredi, Sabato

    2016-06-01

Large-scale dynamic systems are becoming highly pervasive, with applications ranging from systems biology and environment monitoring to sensor networks and power systems. They are characterised by high dimensionality, complexity, and uncertainty in the node dynamics/interactions, and require increasingly computationally demanding methods for their analysis and control design as the network size and node system/interaction complexity increase. Therefore, it is a challenging problem to find scalable computational methods for distributed control design of large-scale networks. In this paper, we investigate the robust distributed stabilisation problem of large-scale nonlinear multi-agent systems (briefly, MASs) composed of non-identical (heterogeneous) linear dynamical systems coupled by uncertain nonlinear time-varying interconnections. By employing Lyapunov stability theory and the linear matrix inequality (LMI) technique, new conditions are given for the distributed control design of large-scale MASs that can be easily solved with the MATLAB toolbox. The stabilisability of each node dynamic is a sufficient assumption to design a globally stabilising distributed control. The proposed approach improves some of the existing LMI-based results on MASs by both overcoming their computational limits and extending the applicative scenario to large-scale nonlinear heterogeneous MASs. Additionally, the proposed LMI conditions are further reduced in terms of computational requirements in the case of weakly heterogeneous MASs, which is a common scenario in real applications where the network nodes and links are affected by parameter uncertainties. One of the main advantages of the proposed approach is that it allows a move from a centralised towards a distributed computing architecture, so that the expensive computational workload spent solving LMIs may be shared among processors located at the networked nodes, thus increasing the scalability of the approach with the network size. Finally, a numerical example shows the applicability of the proposed method and its advantage in terms of computational complexity when compared with existing approaches.
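    For readers unfamiliar with LMI-based design, the sketch below solves a single-system Lyapunov LMI feasibility problem with CVXPY; it only illustrates the type of condition involved and is not the paper's scalable distributed stabilisability condition. The test matrix and tolerances are assumptions.

```python
import numpy as np
import cvxpy as cp

# Minimal LMI feasibility sketch: find P > 0 with A^T P + P A < 0,
# i.e., a Lyapunov certificate of stability for a single linear system.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])                   # a stable test matrix (assumed)
n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6                                     # strictness margin (assumed)
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
prob.solve(solver=cp.SCS)
print("status:", prob.status)
print("P =\n", np.round(P.value, 3))
```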

  12. Scalable Methods for Uncertainty Quantification, Data Assimilation and Target Accuracy Assessment for Multi-Physics Advanced Simulation of Light Water Reactors

    NASA Astrophysics Data System (ADS)

    Khuwaileh, Bassam

    High fidelity simulation of nuclear reactors entails large scale applications characterized with high dimensionality and tremendous complexity where various physics models are integrated in the form of coupled models (e.g. neutronic with thermal-hydraulic feedback). Each of the coupled modules represents a high fidelity formulation of the first principles governing the physics of interest. Therefore, new developments in high fidelity multi-physics simulation and the corresponding sensitivity/uncertainty quantification analysis are paramount to the development and competitiveness of reactors achieved through enhanced understanding of the design and safety margins. Accordingly, this dissertation introduces efficient and scalable algorithms for performing efficient Uncertainty Quantification (UQ), Data Assimilation (DA) and Target Accuracy Assessment (TAA) for large scale, multi-physics reactor design and safety problems. This dissertation builds upon previous efforts for adaptive core simulation and reduced order modeling algorithms and extends these efforts towards coupled multi-physics models with feedback. The core idea is to recast the reactor physics analysis in terms of reduced order models. This can be achieved via identifying the important/influential degrees of freedom (DoF) via the subspace analysis, such that the required analysis can be recast by considering the important DoF only. In this dissertation, efficient algorithms for lower dimensional subspace construction have been developed for single physics and multi-physics applications with feedback. Then the reduced subspace is used to solve realistic, large scale forward (UQ) and inverse problems (DA and TAA). Once the elite set of DoF is determined, the uncertainty/sensitivity/target accuracy assessment and data assimilation analysis can be performed accurately and efficiently for large scale, high dimensional multi-physics nuclear engineering applications. Hence, in this work a Karhunen-Loeve (KL) based algorithm previously developed to quantify the uncertainty for single physics models is extended for large scale multi-physics coupled problems with feedback effect. Moreover, a non-linear surrogate based UQ approach is developed, used and compared to performance of the KL approach and brute force Monte Carlo (MC) approach. On the other hand, an efficient Data Assimilation (DA) algorithm is developed to assess information about model's parameters: nuclear data cross-sections and thermal-hydraulics parameters. Two improvements are introduced in order to perform DA on the high dimensional problems. First, a goal-oriented surrogate model can be used to replace the original models in the depletion sequence (MPACT -- COBRA-TF - ORIGEN). Second, approximating the complex and high dimensional solution space with a lower dimensional subspace makes the sampling process necessary for DA possible for high dimensional problems. Moreover, safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. Accordingly, an inverse problem can be defined and solved to assess the contributions from sources of uncertainty; and experimental effort can be subsequently directed to further improve the uncertainty associated with these sources. 
In this dissertation, a subspace-based, gradient-free, nonlinear algorithm for inverse uncertainty quantification, namely the Target Accuracy Assessment (TAA), has been developed and tested. The ideas proposed in this dissertation were first validated using lattice physics applications simulated with the SCALE6.1 package (Pressurized Water Reactor (PWR) and Boiling Water Reactor (BWR) lattice models). Ultimately, the algorithms proposed here were applied to perform UQ and DA for assembly-level (CASL progression problem number 6) and core-wide problems representing Watts Bar Nuclear 1 (WBN1) for cycle 1 of depletion (CASL Progression Problem Number 9), modeled and simulated using VERA-CS, which consists of several coupled multi-physics models. The analysis and algorithms developed in this dissertation were encoded and implemented in a newly developed toolkit, the Reduced Order Modeling based Uncertainty/Sensitivity Estimator (ROMUSE).
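    A generic sketch of the subspace-plus-surrogate idea (project a high-dimensional parameter vector onto a few directions, then fit a cheap surrogate of a scalar response) is shown below on synthetic data; it is not the dissertation's KL-based or SLIO algorithms, and all sizes, names, and noise levels are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Generic reduced-order surrogate sketch: a high-dimensional parameter vector
# (e.g., perturbed cross-sections) is projected onto a few principal
# directions, then a cheap surrogate predicts a scalar response (e.g., a
# reactivity-like quantity). Data are synthetic.
rng = np.random.default_rng(5)
n_samples, n_params = 200, 500
X = rng.normal(size=(n_samples, n_params))          # sampled parameter perturbations
w = np.zeros(n_params)
w[:5] = rng.normal(size=5)                          # only a few influential DoF
y = X @ w + 0.01 * rng.normal(size=n_samples)       # scalar response with noise

surrogate = make_pipeline(PCA(n_components=10), Ridge(alpha=1e-3)).fit(X, y)
print("surrogate R^2 on training samples:", round(surrogate.score(X, y), 3))
```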

  13. A scalable multi-photon coincidence detector based on superconducting nanowires.

    PubMed

    Zhu, Di; Zhao, Qing-Yuan; Choi, Hyeongrak; Lu, Tsung-Ju; Dane, Andrew E; Englund, Dirk; Berggren, Karl K

    2018-06-04

    Coincidence detection of single photons is crucial in numerous quantum technologies and usually requires multiple time-resolved single-photon detectors. However, the electronic readout becomes a major challenge when the measurement basis scales to large numbers of spatial modes. Here, we address this problem by introducing a two-terminal coincidence detector that enables scalable readout of an array of detector segments based on superconducting nanowire microstrip transmission line. Exploiting timing logic, we demonstrate a sixteen-element detector that resolves all 136 possible single-photon and two-photon coincidence events. We further explore the pulse shapes of the detector output and resolve up to four-photon events in a four-element device, giving the detector photon-number-resolving capability. This new detector architecture and operating scheme will be particularly useful for multi-photon coincidence detection in large-scale photonic integrated circuits.
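    The 136 events quoted for the sixteen-element device follow from simple counting: 16 single-photon outcomes plus C(16, 2) = 120 two-photon coincidences, as the snippet below verifies.

```python
from math import comb

# 16 single-photon outcomes + C(16, 2) two-photon coincidences = 136 events.
n_segments = 16
print(n_segments + comb(n_segments, 2))   # -> 136
```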

  14. Seismic Rate Changes Associated with Seasonal, Annual, and Decadal Changes in the Cryosphere

    NASA Technical Reports Server (NTRS)

    Sauber-Rosenberg, Jeanne

    2012-01-01

Near the Bering Glacier Global Fiducial site in southern Alaska, large cryospheric fluctuations occur in a region of upper crustal faulting and folding associated with collision and accretion of the Yakutat terrane. In this study we report constraints on seasonal, annual and decadal cryospheric changes estimated over the last decade from field, aircraft and satellite measurements, and we evaluate the influence of cryospheric changes on the background seismic rate. Multi-year images from the Bering Glacier global fiducial site have been available since mid-2003 to constrain changes in the extent of the Bering Glacier and to discern feature changes in the glacial surface. Starting around the same time, satellite gravimetric measurements from the Gravity Recovery and Climate Experiment (GRACE) commenced. Large spatial-scale mass changes calculated from the GRACE 1° × 1° mascon solution of Luthcke et al. [2012] indicate a general trend of annual ice mass loss for southern Alaska but with large, variable seasonal mass fluctuations. Since 2007, the station position of a continuous GPS site near Cape Yakataga (Alaska EarthScope PBO site, AB35) has been available as well. In addition to changes in the geodetic position due to tectonic motion, this GPS station shows large seasonal excursions in the detrended vertical and horizontal position components consistent with snow loading in the fall and winter and melt onset/mass decrease in the spring/summer. To better understand the timing of processes responsible for the onset of cryospheric mass loss documented in the GRACE data, we examined changes in the snow cover extent and the onset of melt in the spring. We calculated the surface displacements of the solid Earth and theoretical earthquake failure criteria associated with these annual and seasonal ice and snow changes using a layered elastic half-space model. Additionally, we compared the seismic rate (M > 1.8) from a reference background time period against other time periods with variable ice or tectonic change characteristics to test the significance of seismic rate changes. Our earlier results suggest statistically significant changes in the background seismic rate associated with large seasonal mass changes.

  15. Relations between rainfall–runoff-induced erosion and aeolian deposition at archaeological sites in a semi-arid dam-controlled river corridor

    USGS Publications Warehouse

    Collins, Brian D.; Bedford, David; Corbett, Skye C.; Fairley, Helen C.; Cronkite-Ratcliff, Collin

    2016-01-01

    Process dynamics in fluvial-based dryland environments are highly complex with fluvial, aeolian, and alluvial processes all contributing to landscape change. When anthropogenic activities such as dam-building affect fluvial processes, the complexity in local response can be further increased by flood- and sediment-limiting flows. Understanding these complexities is key to predicting landscape behavior in drylands and has important scientific and management implications, including for studies related to paleoclimatology, landscape ecology evolution, and archaeological site context and preservation. Here we use multi-temporal LiDAR surveys, local weather data, and geomorphological observations to identify trends in site change throughout the 446-km-long semi-arid Colorado River corridor in Grand Canyon, Arizona, USA, where archaeological site degradation related to the effects of upstream dam operation is a concern. Using several site case studies, we show the range of landscape responses that might be expected from concomitant occurrence of dam-controlled fluvial sand bar deposition, aeolian sand transport, and rainfall-induced erosion. Empirical rainfall-erosion threshold analyses coupled with a numerical rainfall–runoff–soil erosion model indicate that infiltration-excess overland flow and gullying govern large-scale (centimeter- to decimeter-scale) landscape changes, but that aeolian deposition can in some cases mitigate gully erosion. Whereas threshold analyses identify the normalized rainfall intensity (defined as the ratio of rainfall intensity to hydraulic conductivity) as the primary factor governing hydrologic-driven erosion, assessment of false positives and false negatives in the dataset highlight topographic slope as the next most important parameter governing site response. Analysis of 4+ years of high resolution (four-minute) weather data and 75+ years of low resolution (daily) climate records indicates that dryland erosion is dependent on short-term, storm-driven rainfall intensity rather than cumulative rainfall, and that erosion can occur outside of wet seasons and even wet years. These results can apply to other similar semi-arid landscapes where process complexity may not be fully understood.

  16. Controllable 3D architectures of aligned carbon nanotube arrays by multi-step processes

    NASA Astrophysics Data System (ADS)

    Huang, Shaoming

    2003-06-01

    An effective way to fabricate large area three-dimensional (3D) aligned CNTs pattern based on pyrolysis of iron(II) phthalocyanine (FePc) by two-step processes is reported. The controllable generation of different lengths and selective growth of the aligned CNT arrays on metal-patterned (e.g., Ag and Au) substrate are the bases for generating such 3D aligned CNTs architectures. By controlling experimental conditions 3D aligned CNT arrays with different lengths/densities and morphologies/structures as well as multi-layered architectures can be fabricated in large scale by multi-step pyrolysis of FePc. These 3D architectures could have interesting properties and be applied for developing novel nanotube-based devices.

  17. Multi-GPU implementation of a VMAT treatment plan optimization algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Zhen, E-mail: Zhen.Tian@UTSouthwestern.edu, E-mail: Xun.Jia@UTSouthwestern.edu, E-mail: Steve.Jiang@UTSouthwestern.edu; Folkerts, Michael; Tan, Jun

Purpose: Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, a GPU's relatively small memory cannot handle cases with a large dose-deposition coefficient (DDC) matrix, e.g., those with a large target size, multiple targets, multiple arcs, and/or a small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors' group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present detailed techniques employed for GPU implementation. The authors also would like to utilize this particular problem as an example problem to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. Methods: The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors' method, the sparse DDC matrix is first stored on a CPU in coordinate list format (COO). On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row format. Computation of beamlet price, the first step in PP, is accomplished using multiple GPUs. A fast inter-GPU data transfer scheme is accomplished using peer-to-peer access. The remaining steps of the PP and MP problems are implemented on CPU or a single GPU due to their modest problem scale and computational loads. The Barzilai–Borwein algorithm with a subspace step scheme is adopted here to solve the MP problem. A head and neck (H and N) cancer case is then used to validate the authors' method. The authors also compare their multi-GPU implementation with three different single GPU implementation strategies, i.e., truncating the DDC matrix (S1), repeatedly transferring the DDC matrix between CPU and GPU (S2), and porting computations involving the DDC matrix to the CPU (S3), in terms of both plan quality and computational efficiency. Two more H and N patient cases and three prostate cases are used to demonstrate the advantages of the authors' method. Results: The authors' multi-GPU implementation can finish the optimization process within ∼1 min for the H and N patient case. S1 leads to an inferior plan quality although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23–46 s. Conversely, to obtain clinically comparable or acceptable plans for all six of these VMAT cases that the authors have tested in this paper, the optimization time needed in a commercial TPS system on CPU was found to be on the order of several minutes. Conclusions: The results demonstrate that the multi-GPU implementation of the authors' column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality.
The authors' study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
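    The data layout described above (a sparse matrix kept in COO format on the host and split into per-GPU CSR blocks by beam angle) can be sketched with SciPy as below; the matrix sizes, density, and angle grouping are illustrative assumptions, and the actual GPU transfer is omitted.

```python
import numpy as np
from scipy import sparse

# Sketch of the layout idea: keep a sparse dose-deposition-style matrix in
# COO on the host, group its columns (beamlets) by beam angle, and convert
# each block to CSR for per-device use. All sizes and groupings are invented.
rng = np.random.default_rng(6)
n_voxels, n_beamlets, n_groups = 10_000, 4_000, 4
ddc = sparse.random(n_voxels, n_beamlets, density=1e-3,
                    format="coo", random_state=0)

# Assign each beamlet (column) to one of four beam-angle groups.
group_of_beamlet = rng.integers(0, n_groups, n_beamlets)
blocks = []
for g in range(n_groups):
    cols = np.flatnonzero(group_of_beamlet == g)
    block = ddc.tocsc()[:, cols].tocsr()      # CSR block destined for "GPU g"
    blocks.append(block)
    print(f"group {g}: {block.shape[1]} beamlets, {block.nnz} nonzeros")
```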

  18. Snow depth spatial structure from hillslope to basin scale

    NASA Astrophysics Data System (ADS)

    Deems, J. S.

    2017-12-01

Knowledge of spatial patterns of snow accumulation is required for understanding the hydrology, climatology, and ecology of mountain regions. Spatial structure in snow accumulation patterns changes with the scale of observation, a feature that has been characterized using fractal dimensions calculated from lidar-derived snow depth maps: fractal scaling structure at short length scales, with a 'scale break' transition to more stochastic patterns at longer separation distances. Previous work has shown that this fractal structure of snow depth distributions differs between sites with different vegetation and terrain characteristics. Forested areas showed a transition to a nearly random spatial distribution at a much shorter lag distance than do unforested sites, enabling a statistical characterization. Alpine areas, however, showed strong spatial structure for a much wider scale range, and were the source of the dominant spatial pattern observable over a wider area. These spatial structure characteristics suggest that the choice of measurement or model resolution (satellite sensor, DEM, field survey point spacing, etc.) will strongly affect the estimates of snow volume or mass, as well as the magnitude of spatial variability. These prior efforts used data sets that were high resolution (~1 m laser point spacing) but of limited extent (~1 km²), constraining detection of scale features such as fractal dimension or scale breaks to areas of relatively similar characteristics and to lag distances of under 500 m. New datasets available from the NASA JPL Airborne Snow Observatory (ASO) provide similar resolution but over large areas, enabling assessment of snow spatial structure across an entire watershed, or in similar vegetation or physiography but in different parts of the basin. Additionally, the multi-year ASO time series allows an investigation into the temporal stability of these scale characteristics, within a single snow season and between seasons of strongly varying accumulation totals and patterns. This presentation will explore initial results from this study, using data from the Tuolumne River Basin in California, USA. Fractal scaling characteristics derived from ASO lidar snow depth measurements are examined at the basin scale, as well as in varying topographic and forest cover environments.
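    A minimal sketch of estimating fractal scaling from a depth transect is shown below: the variogram γ(h) ∝ h^(2H) is fit on a log-log scale and the profile fractal dimension taken as D = 2 − H. The synthetic Brownian transect is a stand-in for lidar snow-depth data, and the scale-break detection used in the work above is not included.

```python
import numpy as np

# Estimate a fractal dimension of a 1D "snow-depth" transect from the
# log-log slope of its variogram: gamma(h) ~ h^(2H), D = 2 - H for a profile.
rng = np.random.default_rng(7)
n = 4096
depth = np.cumsum(rng.normal(size=n))          # Brownian profile, expect H ~ 0.5

lags = np.unique(np.logspace(0, 2.3, 25).astype(int))      # lags from 1 to ~200 samples
gamma = np.array([0.5 * np.mean((depth[h:] - depth[:-h]) ** 2) for h in lags])

slope, _ = np.polyfit(np.log(lags), np.log(gamma), 1)       # log-log fit
H = slope / 2.0
print(f"estimated H = {H:.2f}, fractal dimension D = {2 - H:.2f}")
```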

  19. Repurposing of open data through large scale hydrological modelling - hypeweb.smhi.se

    NASA Astrophysics Data System (ADS)

    Strömbäck, Lena; Andersson, Jafet; Donnelly, Chantal; Gustafsson, David; Isberg, Kristina; Pechlivanidis, Ilias; Strömqvist, Johan; Arheimer, Berit

    2015-04-01

    Hydrological modelling demands large amounts of spatial data, such as soil properties, land use, topography, lakes and reservoirs, ice and snow coverage, water management (e.g. irrigation patterns and regulations), meteorological data and observed water discharge in rivers. By using such data, the hydrological model will in turn provide new data that can be used for new purposes (i.e. re-purposing). This presentation will give an example of how readily available open data from public portals have been re-purposed by using the Hydrological Predictions for the Environment (HYPE) model in a number of large-scale model applications covering numerous subbasins and rivers. HYPE is a dynamic, semi-distributed, process-based, and integrated catchment model. The model output is launched as new Open Data at the web site www.hypeweb.smhi.se to be used for (i) Climate change impact assessments on water resources and dynamics; (ii) The European Water Framework Directive (WFD) for characterization and development of measure programs to improve the ecological status of water bodies; (iii) Design variables for infrastructure constructions; (iv) Spatial water-resource mapping; (v) Operational forecasts (1-10 days and seasonal) on floods and droughts; (vi) Input to oceanographic models for operational forecasts and marine status assessments; (vii) Research. The following regional domains have been modelled so far with different resolutions (number of subbasins within brackets): Sweden (37 000), Europe (35 000), Arctic basin (30 000), La Plata River (6 000), Niger River (800), Middle-East North-Africa (31 000), and the Indian subcontinent (6 000). The Hype web site provides several interactive web applications for exploring results from the models. The user can explore an overview of various water variables for historical and future conditions. Moreover the user can explore and download historical time series of discharge for each basin and explore the performance of the model towards observed river flow. The presentation will describe the Open Data sources used, show the functionality of the web site and discuss model performance and experience from this world-wide hydrological modelling of multi-basins using open data.

  20. Integrating scales of seagrass monitoring to meet conservation needs

    USGS Publications Warehouse

    Neckles, Hilary A.; Kopp, Blaine S.; Peterson, Bradley J.; Pooler, Penelope S.

    2012-01-01

    We evaluated a hierarchical framework for seagrass monitoring in two estuaries in the northeastern USA: Little Pleasant Bay, Massachusetts, and Great South Bay/Moriches Bay, New York. This approach includes three tiers of monitoring that are integrated across spatial scales and sampling intensities. We identified monitoring attributes for determining attainment of conservation objectives to protect seagrass ecosystems from estuarine nutrient enrichment. Existing mapping programs provided large-scale information on seagrass distribution and bed sizes (tier 1 monitoring). We supplemented this with bay-wide, quadrat-based assessments of seagrass percent cover and canopy height at permanent sampling stations following a spatially distributed random design (tier 2 monitoring). Resampling simulations showed that four observations per station were sufficient to minimize bias in estimating mean percent cover on a bay-wide scale, and sample sizes of 55 stations in a 624-ha system and 198 stations in a 9,220-ha system were sufficient to detect absolute temporal increases in seagrass abundance from 25% to 49% cover and from 4% to 12% cover, respectively. We made high-resolution measurements of seagrass condition (percent cover, canopy height, total and reproductive shoot density, biomass, and seagrass depth limit) at a representative index site in each system (tier 3 monitoring). Tier 3 data helped explain system-wide changes. Our results suggest tiered monitoring as an efficient and feasible way to detect and predict changes in seagrass systems relative to multi-scale conservation objectives.
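    In the spirit of the resampling simulations mentioned above, the sketch below uses synthetic cover values to show how the spread of the bay-wide mean percent-cover estimate tightens as the number of quadrat observations per station grows; it examines precision rather than the study's bias analysis, and all values are invented.

```python
import numpy as np

# Resampling-style sketch: how does the spread of the bay-wide mean
# percent-cover estimate change with quadrat observations per station?
rng = np.random.default_rng(8)
n_stations = 55                                   # tier-2 sample size from the abstract
station_cover = rng.beta(2, 5, n_stations) * 100  # synthetic "true" cover per station

for n_obs in (1, 2, 4, 8):
    means = []
    for _ in range(2000):
        # Each observation is the station cover plus within-station noise.
        obs = station_cover[:, None] + rng.normal(0, 15, (n_stations, n_obs))
        means.append(np.clip(obs, 0, 100).mean())
    print(f"{n_obs} obs/station: SD of bay-wide mean = {np.std(means):.2f} % cover")
```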

  1. Scaling Fiber Lasers to Large Mode Area: An Investigation of Passive Mode-Locking Using a Multi-Mode Fiber

    PubMed Central

    Ding, Edwin; Lefrancois, Simon; Kutz, Jose Nathan; Wise, Frank W.

    2011-01-01

    The mode-locking of dissipative soliton fiber lasers using large mode area fiber supporting multiple transverse modes is studied experimentally and theoretically. The averaged mode-locking dynamics in a multi-mode fiber are studied using a distributed model. The co-propagation of multiple transverse modes is governed by a system of coupled Ginzburg–Landau equations. Simulations show that stable and robust mode-locked pulses can be produced. However, the mode-locking can be destabilized by excessive higher-order mode content. Experiments using large core step-index fiber, photonic crystal fiber, and chirally-coupled core fiber show that mode-locking can be significantly disturbed in the presence of higher-order modes, resulting in lower maximum single-pulse energies. In practice, spatial mode content must be carefully controlled to achieve full pulse energy scaling. This paper demonstrates that mode-locking performance is very sensitive to the presence of multiple waveguide modes when compared to systems such as amplifiers and continuous-wave lasers. PMID:21731106

  2. Scaling Fiber Lasers to Large Mode Area: An Investigation of Passive Mode-Locking Using a Multi-Mode Fiber.

    PubMed

    Ding, Edwin; Lefrancois, Simon; Kutz, Jose Nathan; Wise, Frank W

    2011-01-01

    The mode-locking of dissipative soliton fiber lasers using large mode area fiber supporting multiple transverse modes is studied experimentally and theoretically. The averaged mode-locking dynamics in a multi-mode fiber are studied using a distributed model. The co-propagation of multiple transverse modes is governed by a system of coupled Ginzburg-Landau equations. Simulations show that stable and robust mode-locked pulses can be produced. However, the mode-locking can be destabilized by excessive higher-order mode content. Experiments using large core step-index fiber, photonic crystal fiber, and chirally-coupled core fiber show that mode-locking can be significantly disturbed in the presence of higher-order modes, resulting in lower maximum single-pulse energies. In practice, spatial mode content must be carefully controlled to achieve full pulse energy scaling. This paper demonstrates that mode-locking performance is very sensitive to the presence of multiple waveguide modes when compared to systems such as amplifiers and continuous-wave lasers.

  3. OMERO and Bio-Formats 5: flexible access to large bioimaging datasets at scale

    NASA Astrophysics Data System (ADS)

    Moore, Josh; Linkert, Melissa; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Li, Simon; Lindner, Dominik; Moore, William J.; Patterson, Andrew J.; Pindelski, Blazej; Ramalingam, Balaji; Rozbicki, Emil; Tarkowska, Aleksandra; Walczysko, Petr; Allan, Chris; Burel, Jean-Marie; Swedlow, Jason

    2015-03-01

    The Open Microscopy Environment (OME) has built and released Bio-Formats, a Java-based tool for converting proprietary file formats, and OMERO, an enterprise data management platform, both under open-source licenses. In this report, we describe new versions of Bio-Formats and OMERO that are specifically designed to support the large, multi-gigabyte or terabyte scale datasets that are routinely collected across most domains of biological and biomedical research. Bio-Formats reads image data directly from native proprietary formats, bypassing the need for conversion into a standard format. It implements the concept of a file set, a container that defines the contents of multi-dimensional data comprised of many files. OMERO uses Bio-Formats to read files natively, and provides a flexible access mechanism that supports several different storage and access strategies. These new capabilities of OMERO and Bio-Formats make them especially useful in imaging applications like digital pathology, high content screening and light sheet microscopy that routinely create large datasets that must be managed and analyzed.

  4. Development of the US3D Code for Advanced Compressible and Reacting Flow Simulations

    NASA Technical Reports Server (NTRS)

    Candler, Graham V.; Johnson, Heath B.; Nompelis, Ioannis; Subbareddy, Pramod K.; Drayna, Travis W.; Gidzak, Vladimyr; Barnhardt, Michael D.

    2015-01-01

    Aerothermodynamics and hypersonic flows involve complex multi-disciplinary physics, including finite-rate gas-phase kinetics, finite-rate internal energy relaxation, gas-surface interactions with finite-rate oxidation and sublimation, transition to turbulence, large-scale unsteadiness, shock-boundary layer interactions, fluid-structure interactions, and thermal protection system ablation and thermal response. Many of the flows have a large range of length and time scales, requiring large computational grids, implicit time integration, and long solution run times. The University of Minnesota NASA US3D code was designed for the simulation of these complex, highly-coupled flows. It has many of the features of the well-established DPLR code, but uses unstructured grids and has many advanced numerical capabilities and physical models for multi-physics problems. The main capabilities of the code are described, the physical modeling approaches are discussed, the different types of numerical flux functions and time integration approaches are outlined, and the parallelization strategy is overviewed. Comparisons between US3D and the NASA DPLR code are presented, and several advanced simulations are presented to illustrate some of the novel features of the code.

  5. Multi-site calibration, validation, and sensitivity analysis of the MIKE SHE Model for a large watershed in northern China

    Treesearch

    S. Wang; Z. Zhang; G. Sun; P. Strauss; J. Guo; Y. Tang; A. Yao

    2012-01-01

    Model calibration is essential for hydrologic modeling of large watersheds in a heterogeneous mountain environment. Little guidance is available for model calibration protocols for distributed models that aim at capturing the spatial variability of hydrologic processes. This study used the physically-based distributed hydrologic model, MIKE SHE, to contrast a lumped...

  6. Chasing Perfection: Should We Reduce Model Uncertainty in Carbon Cycle-Climate Feedbacks

    NASA Astrophysics Data System (ADS)

    Bonan, G. B.; Lombardozzi, D.; Wieder, W. R.; Lindsay, K. T.; Thomas, R. Q.

    2015-12-01

    Earth system model simulations of the terrestrial carbon (C) cycle show large multi-model spread in the carbon-concentration and carbon-climate feedback parameters. Large differences among models are also seen in their simulation of global vegetation and soil C stocks and other aspects of the C cycle, prompting concern about model uncertainty and our ability to faithfully represent fundamental aspects of the terrestrial C cycle in Earth system models. Benchmarking analyses that compare model simulations with common datasets have been proposed as a means to assess model fidelity with observations, and various model-data fusion techniques have been used to reduce model biases. While such efforts will reduce multi-model spread, they may not help reduce uncertainty (and increase confidence) in projections of the C cycle over the twenty-first century. Many ecological and biogeochemical processes represented in Earth system models are poorly understood at both the site scale and across large regions, where biotic and edaphic heterogeneity are important. Our experience with the Community Land Model (CLM) suggests that large uncertainty in the terrestrial C cycle and its feedback with climate change is an inherent property of biological systems. The challenge of representing life in Earth system models, with the rich diversity of lifeforms and complexity of biological systems, may necessitate a multitude of modeling approaches to capture the range of possible outcomes. Such models should encompass a range of plausible model structures. We distinguish between model parameter uncertainty and model structural uncertainty. Focusing on improved parameter estimates may, in fact, limit progress in assessing model structural uncertainty associated with realistically representing biological processes. Moreover, higher confidence may be achieved through better process representation, but this does not necessarily reduce uncertainty.

  7. Multi-scale nest-site selection by black-backed woodpeckers in outbreaks of mountain pine beetles

    Treesearch

    Thomas W. Bonnot; Joshua J. Millspaugh; Mark A. Rumble

    2009-01-01

    Areas of mountain pine beetle (Dendroctonus ponderosae Hopkins) outbreaks in the Black Hills can provide habitat for black-backed woodpeckers (Picoides arcticus), a U.S. Forest Service, Region 2 Sensitive Species. These outbreaks are managed through removal of trees infested with mountain pine beetles to control mountain pine...

  8. MEETING IN TUCSON: 3MRA: A MULTI-MEDIA HUMAN AND ECOLOGICAL MODELING SYSTEM FOR SITE-SPECIFIC TO NATIONAL SCALE REGULATORY APPLICATIONS

    EPA Science Inventory

    3MRA provides a technology that integrates the full dimensionality of human and ecological exposure and risk assessment, thus allowing regulatory decisions to reflect a more complete expression of potential adverse health effects related to the disposal and reuse of contaminated waste...

  9. Pattern-based, multi-scale segmentation and regionalization of EOSD land cover

    NASA Astrophysics Data System (ADS)

    Niesterowicz, Jacek; Stepinski, Tomasz F.

    2017-10-01

    The Earth Observation for Sustainable Development of Forests (EOSD) map is a 25 m resolution thematic map of Canadian forests. Because of its large spatial extent and relatively high resolution, the EOSD is difficult to analyze using standard GIS methods. In this paper we propose multi-scale segmentation and regionalization of the EOSD as new methods for analyzing it on large spatial scales. Segments, which we refer to as forest land units (FLUs), are delineated as tracts of forest characterized by cohesive patterns of EOSD categories; we delineated from 727 to 91,885 FLUs within the spatial extent of the EOSD, depending on the selected scale of a pattern. The pattern of EOSD categories within each FLU is described by 1037 landscape metrics. A shapefile containing the boundaries of all FLUs, together with an attribute table listing the landscape metrics, makes up an SQL-searchable spatial database providing detailed information on the composition and pattern of land cover types in Canadian forests. The shapefile format and the extensive attribute table pertaining to the entire legend of the EOSD are designed to facilitate a broad range of investigations in which assessment of the composition and pattern of forest over large areas is needed. We calculated four such databases using different spatial scales of pattern. We illustrate the use of the FLU database for producing forest regionalization maps of two Canadian provinces, Quebec and Ontario. Such maps capture the broad-scale variability of forest at the spatial scale of the entire province. We also demonstrate how the FLU database can be used to map the variability of landscape metrics, and thus the character of the landscape, over all of Canada.
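
    As a rough illustration of how such an FLU database could be queried, the hedged Python sketch below loads a hypothetical FLU shapefile with geopandas and filters it by one landscape-metric attribute. The file name, bounding box and the column name "LPI" are illustrative assumptions, not the actual EOSD/FLU schema.

```python
import geopandas as gpd

# Hypothetical file and column names; the real FLU shapefiles carry 1037
# landscape-metric attributes whose names are defined by the EOSD database.
flu = gpd.read_file("flu_scale2.shp")

# Example query: forest land units with a high "largest patch index"
# (single-class-dominated pattern) inside a rough Quebec bounding box.
subset = flu.cx[-80.0:-57.0, 45.0:53.0]      # spatial slice by lon/lat bounds
subset = subset[subset["LPI"] > 60.0]         # hypothetical metric column

print(len(subset), "FLUs selected")
print(subset["LPI"].describe())
```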

  10. Superconductor bearings, flywheels and transportation

    NASA Astrophysics Data System (ADS)

    Werfel, F. N.; Floegel-Delor, U.; Rothfeld, R.; Riedel, T.; Goebel, B.; Wippich, D.; Schirrmeister, P.

    2012-01-01

    This paper describes the present status of high temperature superconductors (HTS) and of bulk superconducting magnet devices, their use in bearings, in flywheel energy storage systems (FESS) and linear transport magnetic levitation (Maglev) systems. We report and review the concepts of multi-seeded REBCO bulk superconductor fabrication. The multi-grain bulks increase the averaged trapped magnetic flux density by up to 40% compared to single-grain assembly in large-scale applications. HTS magnetic bearings with permanent magnet (PM) excitation were studied and scaled up to maximum forces of 10 kN axially and 4.5 kN radially. We examine the technology of the high-gradient magnetic bearing concept and verify it experimentally. A large HTS bearing is tested for stabilizing a 600 kg rotor of a 5 kWh/250 kW flywheel system. The flywheel rotor tests show the requirement for additional damping. Our compact flywheel system is compared with similar HTS-FESS projects. A small-scale compact YBCO bearing with an in situ Stirling cryocooler is constructed and investigated for mobile applications. Next we show a successfully developed modular linear Maglev system for magnetic train operation. Each module levitates 0.25 t at 10 mm distance during one-day operation without refilling LN2. More than 30 vacuum cryostats containing multi-seeded YBCO blocks have been fabricated and are now being tested in Germany, China and Brazil.

  11. Developing and Testing a Robust, Multi-Scale Framework for the Recovery of Longleaf Pine Understory Communities

    DTIC Science & Technology

    2015-05-01

    Excerpt from the report's list of tables: model averaging for species richness on post-agricultural sites (1000 m2) with a landscape radius of 150 m; Table 3.4.8, model selection for species richness on post-agricultural sites (1000 m2) with a landscape radius of 150 m; Table 3.4.9, model averaging for proportion of reference species on ...; direct, indirect, and total standardized effects on species richness; Table 4.1.1, species and number of seeds added to the experimental plots at ...

  12. Integrating multi-criteria evaluation techniques with geographic information systems for landfill site selection: a case study using ordered weighted average.

    PubMed

    Gorsevski, Pece V; Donevska, Katerina R; Mitrovski, Cvetko D; Frizado, Joseph P

    2012-02-01

    This paper presents a GIS-based multi-criteria decision analysis approach for evaluating the suitability for landfill site selection in the Polog Region, Macedonia. The multi-criteria decision framework considers environmental and economic factors which are standardized by fuzzy membership functions and combined by integration of analytical hierarchy process (AHP) and ordered weighted average (OWA) techniques. The AHP is used for the elicitation of attribute weights while the OWA operator function is used to generate a wide range of decision alternatives for addressing uncertainty associated with interaction between multiple criteria. The usefulness of the approach is illustrated by different OWA scenarios that report landfill suitability on a scale between 0 and 1. The OWA scenarios are intended to quantify the level of risk taking (i.e., optimistic, pessimistic, and neutral) and to facilitate a better understanding of patterns that emerge from decision alternatives involved in the decision making process. Copyright © 2011 Elsevier Ltd. All rights reserved.
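
    To make the ordered weighted average (OWA) step concrete, here is a minimal, hedged Python sketch of OWA aggregation for one candidate location. The criterion scores and order-weight vectors are invented for illustration, and the AHP-derived criterion weights used in the paper's full AHP-OWA procedure are omitted for brevity.

```python
import numpy as np

def owa(values, order_weights):
    """Ordered weighted average: sort criterion scores in descending order
    and combine them with position (order) weights that sum to 1."""
    v = np.sort(np.asarray(values, dtype=float))[::-1]
    w = np.asarray(order_weights, dtype=float)
    return float(np.dot(v, w))

# Hypothetical standardized suitability scores (0-1) for one candidate cell,
# e.g. distance to groundwater, roads, settlements, and slope.
scores = [0.9, 0.6, 0.4, 0.8]

# Order weights control the level of risk taking in the decision rule:
pessimistic = [0.0, 0.1, 0.2, 0.7]     # emphasis on the worst scores ("AND"-like)
neutral     = [0.25, 0.25, 0.25, 0.25]  # equivalent to a simple average
optimistic  = [0.7, 0.2, 0.1, 0.0]     # emphasis on the best scores ("OR"-like)

for name, w in [("pessimistic", pessimistic), ("neutral", neutral), ("optimistic", optimistic)]:
    print(name, round(owa(scores, w), 3))
```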

  13. Spatial Temporal Mathematics at Scale: An Innovative and Fully Developed Paradigm to Boost Math Achievement among All Learners

    ERIC Educational Resources Information Center

    Rutherford, Teomara; Kibrick, Melissa; Burchinal, Margaret; Richland, Lindsey; Conley, AnneMarie; Osborne, Keara; Schneider, Stephanie; Duran, Lauren; Coulson, Andrew; Antenore, Fran; Daniels, Abby; Martinez, Michael E.

    2010-01-01

    This paper describes the background, methodology, preliminary findings, and anticipated future directions of a large-scale multi-year randomized field experiment addressing the efficacy of ST Math [Spatial-Temporal Math], a fully-developed math curriculum that uses interactive animated software. ST Math's unique approach minimizes the use of…

  14. A multi-scale assessment of population connectivity in African lions (Panthera leo) in response to landscape change

    Treesearch

    Samuel A. Cushman; Nicholas B. Elliot; David W. Macdonald; Andrew J. Loveridge

    2015-01-01

    Habitat loss and fragmentation are among the major drivers of population declines and extinction, particularly in large carnivores. Connectivity models provide practical tools for assessing fragmentation effects and developing mitigation or conservation responses. To be useful to conservation practitioners, connectivity models need to incorporate multiple scales and...

  15. Prediction accuracies for growth and wood attributes of interior spruce in space using genotyping-by-sequencing.

    PubMed

    Gamal El-Dien, Omnia; Ratcliffe, Blaise; Klápště, Jaroslav; Chen, Charles; Porth, Ilga; El-Kassaby, Yousry A

    2015-05-09

    Genomic selection (GS) in forestry can substantially reduce the length of the breeding cycle and increase gain per unit time through early selection and greater selection intensity, particularly for traits of low heritability and late expression. Affordable next-generation sequencing technologies have made it possible to genotype large numbers of trees at a reasonable cost. Genotyping-by-sequencing was used to genotype 1,126 Interior spruce trees representing 25 open-pollinated families planted over three sites in British Columbia, Canada. Four imputation algorithms were compared (mean value (MI), singular value decomposition (SVD), expectation maximization (EM), and a newly derived, family-based k-nearest neighbor (kNN-Fam)). Trees were phenotyped for several yield and wood attributes. Single- and multi-site GS prediction models were developed using the Ridge Regression Best Linear Unbiased Predictor (RR-BLUP) and the Generalized Ridge Regression (GRR) to test different assumptions about trait architecture. Finally, using PCA, multi-trait GS prediction models were developed. The EM and kNN-Fam imputation methods were superior for 30 and 60% missing data, respectively. The RR-BLUP GS prediction model produced better accuracies than the GRR, indicating that the genetic architecture for these traits is complex. GS prediction accuracies for multi-site models were high and better than those of single-site models, while cross-site prediction (using a model trained on one site to predict another) produced the lowest accuracies, reflecting type-b genetic correlations, and was deemed unreliable. The incorporation of genomic information in quantitative genetics analyses produced more realistic heritability estimates, as the half-sib pedigree tended to inflate the additive genetic variance and subsequently both heritability and gain estimates. Principal component scores as representatives of multi-trait GS prediction models produced surprising results where negatively correlated traits could be concurrently selected for using PCA2 and PCA3. The application of GS to open-pollinated family testing, the simplest form of tree improvement evaluation, proved to be effective. Prediction accuracies obtained for all traits greatly support the integration of GS in tree breeding. While the within-site GS prediction accuracies were high, the results clearly indicate that the ability of single-site GS models to predict other sites is unreliable, supporting the use of the multi-site approach. Principal component scores provided an opportunity for the concurrent selection of traits with different phenotypic optima.
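
    As a hedged sketch of the ridge-regression flavour of genomic prediction mentioned above, the Python example below fits a penalized regression of a simulated phenotype on simulated SNP genotypes and reports cross-validated predictive skill. The simulated data, the choice of shrinkage parameter and the use of cross-validated R^2 (rather than the predicted-observed correlation typically reported as GS accuracy) are illustrative assumptions, not the study's pipeline.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated stand-in for a genotype matrix (trees x SNP markers, coded 0/1/2)
# and a polygenic phenotype; a real analysis would use the imputed GBS
# genotypes and the measured yield/wood traits.
n_trees, n_snps = 300, 2000
X = rng.integers(0, 3, size=(n_trees, n_snps)).astype(float)
true_effects = rng.normal(0.0, 0.05, size=n_snps)
y = X @ true_effects + rng.normal(0.0, 1.0, size=n_trees)

# Ridge regression is the penalized-regression analogue of RR-BLUP:
# every marker receives a shrunken, non-zero effect (complex architecture).
model = Ridge(alpha=n_snps * 0.5)   # alpha plays the role of the shrinkage parameter
scores = cross_val_score(model, X, y, cv=10, scoring="r2")
print("mean cross-validated R^2:", scores.mean().round(3))
```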

  16. Multi-Product Microalgae Biorefineries: From Concept Towards Reality.

    PubMed

    't Lam, G P; Vermuë, M H; Eppink, M H M; Wijffels, R H; van den Berg, C

    2018-02-01

    Although microalgae are a promising biobased feedstock, industrial scale production is still far off. To enhance the economic viability of large-scale microalgae processes, all biomass components need to be valorized, requiring a multi-product biorefinery. However, this concept is still too expensive. Typically, downstream processing of industrial biotechnological bulk products accounts for 20-40% of the total production costs, while for a microalgae multi-product biorefinery the costs are substantially higher (50-60%). These costs are high due to the lack of appropriate and mild technologies to access the different product fractions such as proteins, carbohydrates, and lipids. To reduce the costs, simplified processes need to be developed for the main unit operations including harvesting, cell disruption, extraction, and possibly fractionation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Small-scale multi-axial hybrid simulation of a shear-critical reinforced concrete frame

    NASA Astrophysics Data System (ADS)

    Sadeghian, Vahid; Kwon, Oh-Sung; Vecchio, Frank

    2017-10-01

    This study presents a numerical multi-scale simulation framework which is extended to accommodate hybrid simulation (numerical-experimental integration). The framework is enhanced with a standardized data exchange format and connected to a generalized controller interface program which facilitates communication with various types of laboratory equipment and testing configurations. A small-scale experimental program was conducted using six degree-of-freedom hydraulic testing equipment to verify the proposed framework and provide additional data for small-scale testing of shear-critical reinforced concrete structures. The specimens were tested in a multi-axial hybrid simulation manner under a reversed cyclic loading condition simulating earthquake forces. The physical models were 1/3.23-scale representations of a beam and two columns. A mixed-type modelling technique was employed to analyze the remainder of the structures. The hybrid simulation results were compared against those obtained from a large-scale test and finite element analyses. The study found that if precautions are taken in preparing model materials and if the shear-related mechanisms are accurately considered in the numerical model, small-scale hybrid simulations can adequately simulate the behaviour of shear-critical structures. Although the findings of the study are promising, additional test data are required to draw general conclusions.

  18. Nicholas Metropolis Award for Outstanding Doctoral Thesis Work in Computational Physics Talk: Understanding Nano-scale Electronic Systems via Large-scale Computation

    NASA Astrophysics Data System (ADS)

    Cao, Chao

    2009-03-01

    Nano-scale physical phenomena and processes, especially those in electronics, have drawn great attention in the past decade. Experiments have shown that electronic and transport properties of functionalized carbon nanotubes are sensitive to adsorption of gas molecules such as H2, NO2, and NH3. Similar measurements have also been performed to study adsorption of proteins on other semiconductor nano-wires. These experiments suggest that nano-scale systems can be useful for making future chemical and biological sensors. Aiming to understand the physical mechanisms underlying and governing property changes at the nano-scale, we start off by investigating, via first-principles methods, the electronic structure of Pd-CNT before and after hydrogen adsorption, and continue with coherent electronic transport using non-equilibrium Green’s function techniques combined with density functional theory. Once our results are fully analyzed they can be used to interpret and understand experimental data, with a few difficult issues still to be addressed. Finally, we discuss a newly developed multi-scale computing architecture, OPAL, that coordinates simultaneous execution of multiple codes. Inspired by the capabilities of this computing framework, we present a scenario of future modeling and simulation of multi-scale, multi-physical processes.

  19. Monitoring and validating spatio-temporal continuously daily evapotranspiration and its components at river basin scale

    NASA Astrophysics Data System (ADS)

    Song, L.; Liu, S.; Kustas, W. P.; Nieto, H.

    2017-12-01

    Operational estimation of spatio-temporally continuous daily evapotranspiration (ET), and its components evaporation (E) and transpiration (T), at the watershed scale is very useful for developing a sustainable water resource strategy in semi-arid and arid areas. In this study, multi-year all-weather daily ET, E and T were estimated using the MODIS-based Dual Temperature Difference (DTD) model under different land covers in the Heihe watershed, China. The remotely sensed ET was validated using ground measurements from large aperture scintillometer (LAS) systems, with a source area of several kilometers, under grassland, cropland and riparian shrub-forest. The results showed that the remotely sensed ET produced mean absolute percent deviation (MAPD) errors of about 30% during the growing season for all-weather conditions, but the model performed better under clear sky conditions. However, uncertainty in the interpolated MODIS land surface temperature input to the DTD model under cloudy conditions, and the representativeness of the LAS measurements for the heterogeneous land surfaces, contribute to the discrepancies between the modeled and ground-measured surface heat fluxes, especially for the more humid grassland and heterogeneous shrub-forest sites.
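
    For reference, the error metric quoted above is, in its conventional form (the paper may define it with minor variations), the mean absolute percent deviation between modeled and measured fluxes:

```latex
% Conventional definition of the mean absolute percent deviation (MAPD)
\mathrm{MAPD} \;=\; \frac{100\%}{N}\sum_{i=1}^{N}
  \left|\frac{\widehat{ET}_i - ET_i}{ET_i}\right|
```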

  20. Evaluation of Cartosat-1 Multi-Scale Digital Surface Modelling Over France

    PubMed Central

    Gianinetto, Marco

    2009-01-01

    On 5 May 2005, the Indian Space Research Organization launched Cartosat-1, the eleventh satellite of its constellation, from the Satish Dhawan Space Centre (India); the satellite is dedicated to stereo viewing of the Earth's surface for terrain modeling and large-scale mapping. In early 2006, the Indian Space Research Organization started the Cartosat-1 Scientific Assessment Programme, jointly established with the International Society for Photogrammetry and Remote Sensing. Within this framework, this study evaluated the capabilities of digital surface modeling from Cartosat-1 stereo data for the French test sites of Mausanne les Alpilles and Salon de Provence. The investigation pointed out that for hilly territories it is possible to produce high-resolution digital surface models with a root mean square error of less than 7.1 m and a linear error at the 90% confidence level of less than 9.5 m. The accuracy of the generated digital surface models also fulfilled the requirements of the French Reference 3D®, so Cartosat-1 data may be used to produce or update such kinds of products. PMID:22412311

  1. Psychometric Evaluation of the Fear of Positive Evaluation Scale in Patients With Social Anxiety Disorder

    PubMed Central

    Weeks, Justin W.; Heimberg, Richard G.; Rodebaugh, Thomas L.; Goldin, Philippe R.; Gross, James J.

    2014-01-01

    The Fear of Positive Evaluation Scale (FPES; Weeks, Heimberg, & Rodebaugh, 2008) was designed to assess fear of positive evaluation, a proposed cognitive component of social anxiety. Although previous findings on the psychometric properties of the FPES have been highly encouraging, only one previous study has examined the psychometric profile of the FPES in a sample of patients with social anxiety disorder (Fergus et al., 2009). The primary purpose of the present study was to conduct a large multi-site examination of the psychometric profile of the FPES among patients with a principal diagnosis of social anxiety disorder (n = 226; generalized subtype: 97.8%). Responses of non-anxious control participants (n = 42) were also examined. The factorial validity, internal consistency, test-retest reliability, construct validity, and treatment sensitivity of the FPES were strongly supported by our findings. Furthermore, an FPES cutoff score was identified for distinguishing levels of fear of positive evaluation characteristic of patients with social anxiety disorder from those characteristic of the control group. Results provide additional support for the psychometric properties of the FPES in clinical samples. PMID:21966932

  2. SChloro: directing Viridiplantae proteins to six chloroplastic sub-compartments.

    PubMed

    Savojardo, Castrense; Martelli, Pier Luigi; Fariselli, Piero; Casadio, Rita

    2017-02-01

    Chloroplasts are organelles found in plants and involved in several important cell processes. Similarly to other compartments in the cell, chloroplasts have an internal structure comprising several sub-compartments, where different proteins are targeted to perform their functions. Given the relation between protein function and localization, the availability of effective computational tools to predict protein sub-organelle localization is crucial for large-scale functional studies. In this paper we present SChloro, a novel machine-learning approach to predict protein sub-chloroplastic localization, based on targeting signal detection and membrane protein information. The proposed approach performs multi-label predictions discriminating six chloroplastic sub-compartments that include inner membrane, outer membrane, stroma, thylakoid lumen, plastoglobule and thylakoid membrane. In comparative benchmarks, the proposed method outperforms current state-of-the-art methods in both single- and multi-compartment predictions, with an overall multi-label accuracy of 74%. The results demonstrate the relevance of the approach, which is a good candidate for integration into more general large-scale annotation pipelines of protein subcellular localization. The method is available as a web server at http://schloro.biocomp.unibo.it. Contact: gigi@biocomp.unibo.it.

  3. SNAVA-A real-time multi-FPGA multi-model spiking neural network simulation architecture.

    PubMed

    Sripad, Athul; Sanchez, Giovanny; Zapata, Mireya; Pirrone, Vito; Dorta, Taho; Cambria, Salvatore; Marti, Albert; Krishnamourthy, Karthikeyan; Madrenas, Jordi

    2018-01-01

    The Spiking Neural Networks (SNN) for Versatile Applications (SNAVA) simulation platform is a scalable and programmable parallel architecture that supports real-time, large-scale, multi-model SNN computation. This parallel architecture is implemented in modern Field-Programmable Gate Array (FPGA) devices to provide high-performance execution and the flexibility to support large-scale SNN models. Flexibility is defined in terms of programmability, which allows easy synapse and neuron implementation. This has been achieved by using special-purpose Processing Elements (PEs) for computing SNNs, and by analyzing and customizing the instruction set according to the processing needs to achieve maximum performance with minimum resources. The parallel architecture is interfaced with customized Graphical User Interfaces (GUIs) to configure the SNN's connectivity, to compile the neuron-synapse model and to monitor SNN activity. Our contribution is a tool that allows SNNs to be prototyped faster than on CPU/GPU architectures and significantly more cheaply than fabricating a customized neuromorphic chip. This could be potentially valuable to the computational neuroscience and neuromorphic engineering communities. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Active phase locking of thirty fiber channels using multilevel phase dithering method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Zhimeng; Luo, Yongquan, E-mail: yongquan-l@sina.com; Liu, Cangli

    2016-03-15

    Active phase locking of a large-scale fiber array with thirty channels has been demonstrated experimentally. In the experiment, a first group of thirty phase controllers is used to compensate the phase noise between the elements, and a second group of thirty phase modulators is used to impose additional phase disturbances that mimic the phase noise in high power fiber amplifiers. A multi-level phase dithering algorithm using dual-level rectangular-wave phase modulation and time division multiplexing can achieve the same phase control as the single/multi-frequency dithering technique, but without a coherent demodulation circuit. The phase locking efficiency of the 30 fiber channels is about 98.68%, 97.82%, and 96.50% with no additional phase distortion, modulated phase distortion I (±1 rad), and phase distortion II (±2 rad), corresponding to phase errors of λ/54, λ/43, and λ/34 rms, respectively. The contrast of the coherently combined beam profile is about 89%. Experimental results reveal that the multi-level phase dithering technique has great potential for scaling to a large number of laser beams.

  5. Problems and Solutions in Evaluating Child Outcomes of Large-Scale Educational Programs.

    ERIC Educational Resources Information Center

    Abrams, Allan S.; And Others

    1979-01-01

    Evaluation of large-scale programs is problematical because of inherent bias in assignment of treatment and control groups, resulting in serious regression artifacts even with the use of analysis of covariance designs. Nonuniformity of program implementation across sites and classrooms is also a problem. (Author/GSK)

  6. Fine-scale spatial distribution of the common lugworm Arenicola marina, and effects of intertidal clam fishing

    NASA Astrophysics Data System (ADS)

    Boldina, Inna; Beninger, Peter G.

    2014-04-01

    Despite its ubiquity and its role as an ecosystem engineer on temperate intertidal mudflats, little is known of the spatial ecology of the lugworm Arenicola marina. We estimated lugworm densities and analyzed the spatial distribution of A. marina on a French Atlantic mudflat subjected to long-term clam digging activities, and compared these to a nearby pristine reference mudflat, using a combination of geostatistical techniques: point-pattern analysis, autocorrelation, and wavelet analysis. Lugworm densities were an order of magnitude greater at the reference site. Although A. marina showed an aggregative spatial distribution at both sites, the characteristics and intensity of aggregation differed markedly between sites. The reference site showed an inhibition process (regular distribution) at distances <7.5 cm, whereas the impacted site showed a random distribution at this scale. At distances from 15 cm to several tens of meters, the spatial distribution of A. marina was clearly aggregated at both sites; however, the autocorrelation strength was much weaker at the impacted site. In addition, the non-impacted site presented multi-scale spatial distribution, which was not evident at the impacted site. The differences observed between the spatial distributions of the fishing-impacted vs. the non-impacted site reflect similar findings for other components of these two mudflat ecosystems, suggesting common community-level responses to prolonged mechanical perturbation: a decrease in naturally-occurring aggregation. This change may have consequences for basic biological characteristics such as reproduction, recruitment, growth, and feeding.

  7. A Composite Network Approach for Assessing Multi-Species Connectivity: An Application to Road Defragmentation Prioritisation

    PubMed Central

    Saura, Santiago; Rondinini, Carlo

    2016-01-01

    One of the biggest challenges in large-scale conservation is quantifying connectivity at broad geographic scales and for a large set of species. Because connectivity analyses can be computationally intensive, and the planning process quite complex when multiple taxa are involved, assessing connectivity at large spatial extents for many species often turns out to be intractable. This limitation means that the assessments that are conducted are often partial, focusing on a few key species only, or generic, considering a range of dispersal distances and a fixed set of areas to connect that are not directly linked to the actual spatial distribution or mobility of particular species. By using a graph theory framework, here we propose an approach to reduce computational effort and effectively consider large assemblages of species in obtaining multi-species connectivity priorities. We demonstrate the potential of the approach by identifying defragmentation priorities in the Italian road network, focusing on medium and large terrestrial mammals. We show that by combining probabilistic species graphs prior to conducting the network analysis (i) it is possible to analyse connectivity once for all species simultaneously, obtaining conservation or restoration priorities that apply to the entire species assemblage; and (ii) those priorities are well aligned with the ones that would be obtained by aggregating the results of separate connectivity analyses for each of the individual species. This approach offers great opportunities to extend connectivity assessments to large assemblages of species and broad geographic scales. PMID:27768718
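
    To illustrate the general idea of combining per-species probabilistic graphs before running a single network analysis, here is a hedged Python sketch using networkx. The patch names, dispersal probabilities, mean-probability aggregation and the use of edge betweenness as a crude priority score are all illustrative assumptions, not the paper's actual formulation.

```python
import networkx as nx

# Hypothetical per-species graphs: nodes are habitat patches, edge weights are
# dispersal probabilities for that species (values are illustrative only).
def species_graph(edges):
    g = nx.Graph()
    g.add_weighted_edges_from(edges, weight="p")
    return g

wolf = species_graph([("A", "B", 0.8), ("B", "C", 0.3), ("C", "D", 0.6)])
lynx = species_graph([("A", "B", 0.5), ("B", "C", 0.7), ("B", "D", 0.2)])

# Combine the species graphs before analysis: here the composite edge
# probability is simply the mean over species that use the edge.
composite = nx.Graph()
for g in (wolf, lynx):
    for u, v, d in g.edges(data=True):
        if composite.has_edge(u, v):
            composite[u][v]["plist"].append(d["p"])
        else:
            composite.add_edge(u, v, plist=[d["p"]])
for u, v, d in composite.edges(data=True):
    d["p"] = sum(d["plist"]) / len(d["plist"])

# A single network analysis then yields priorities for the whole assemblage.
priority = nx.edge_betweenness_centrality(composite)
print(sorted(priority.items(), key=lambda kv: -kv[1]))
```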

  8. Application of the multi-scale finite element method to wave propagation problems in damaged structures

    NASA Astrophysics Data System (ADS)

    Casadei, F.; Ruzzene, M.

    2011-04-01

    This work illustrates the possibility of extending the field of application of the Multi-Scale Finite Element Method (MsFEM) to structural mechanics problems that involve localized geometrical discontinuities like cracks or notches. The main idea is to construct finite elements with an arbitrary number of edge nodes that describe the actual geometry of the damage, with shape functions that are defined as local solutions of the differential operator of the specific problem according to the MsFEM approach. The small-scale information is then brought to the large-scale model through the coupling of the global system matrices, which are assembled using classical finite element procedures. The efficiency of the method is demonstrated through selected numerical examples that constitute classical problems of great interest to the structural health monitoring community.
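
    For orientation, the generic MsFEM basis construction alluded to above is sketched below; the paper's elements use an arbitrary number of edge nodes and a problem-specific operator, so this is only the schematic form, with the elliptic model operator given as an example.

```latex
% Generic MsFEM basis construction on a coarse element K (schematic form):
\mathcal{L}\,\phi_i^{K} = 0 \quad \text{in } K,
\qquad
\phi_i^{K} = \varphi_i \quad \text{on } \partial K,
\qquad \text{e.g. } \mathcal{L}u = -\nabla\!\cdot\!\big(a(\mathbf{x})\,\nabla u\big).
% The coarse (global) stiffness entries are then assembled as usual from
K_{ij} = \int_{K} \nabla\phi_j^{K}\cdot a(\mathbf{x})\,\nabla\phi_i^{K}\,\mathrm{d}\mathbf{x},
% which is how the small-scale information enters the large-scale system.
```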

  9. Stochastic partial differential fluid equations as a diffusive limit of deterministic Lagrangian multi-time dynamics.

    PubMed

    Cotter, C J; Gottwald, G A; Holm, D D

    2017-09-01

    In Holm (2015, Proc. R. Soc. A 471, 20140963; doi:10.1098/rspa.2014.0963), stochastic fluid equations were derived by employing a variational principle with an assumed stochastic Lagrangian particle dynamics. Here we show that the same stochastic Lagrangian dynamics naturally arises in a multi-scale decomposition of the deterministic Lagrangian flow map into a slow large-scale mean and a rapidly fluctuating small-scale map. We employ homogenization theory to derive effective slow stochastic particle dynamics for the resolved mean part, thereby obtaining stochastic partial differential fluid equations in the Eulerian formulation. To justify the application of rigorous homogenization theory, we assume mildly chaotic fast small-scale dynamics, as well as a centring condition. The latter requires that the mean of the fluctuating deviations is small when pulled back to the mean flow.
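
    For readers unfamiliar with this setting, the stochastic Lagrangian particle dynamics referred to above is commonly written in the following schematic form (the notation is generic and may differ from the papers'): the particle position is advected by a slow mean velocity plus stochastic transport along prescribed spatial correlation modes.

```latex
% Stochastic Lagrangian trajectory: slow drift plus Stratonovich (\circ)
% stochastic transport along prescribed spatial modes \xi_i, with W_t^i
% independent Brownian motions.
\mathrm{d}\mathbf{x}_t \;=\; \mathbf{u}(\mathbf{x}_t, t)\,\mathrm{d}t
\;+\; \sum_{i} \boldsymbol{\xi}_i(\mathbf{x}_t)\circ \mathrm{d}W_t^{\,i}
```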

  10. Glacial-Interglacial, Orbital and Millennial-Scale Climate Variability for the Last Glacial Cycle at Shackleton Site U1385 based on Dinoflagellate Cysts

    NASA Astrophysics Data System (ADS)

    Datema, M.

    2015-12-01

    The Shackleton Site (IODP Expedition 339 Site U1385), located off the West-Portuguese Margin, preserves a continuous high-fidelity record of millennial-scale climate variability for the last several glacial cycles (~1.4 Myr) that can be correlated precisely to patterns observed in polar ice cores. In addition, rapid delivery of terrestrial material to the deep-sea environment allows the correlation of these marine records to European terrestrial climate records. This unique marine-ice-terrestrial linkage makes the Shackleton Site the ideal reference section for studying Quaternary abrupt climate change. The main objective of studying Site U1385 is to establish a marine reference section of Pleistocene climate change. We generated (sub)millennial-scale (~600 year interval) dinoflagellate cyst (dinocyst) assemblage records from Shackleton Site U1385 (IODP Expedition 339) to reconstruct sea surface temperature (SST) and productivity/upwelling over the last 152 kyrs. In addition, our approach allows for detailed land-sea correlations, because we also counted assemblages of pollen and spores from higher plants. Dinocyst SST and upwelling proxies, as well as warm/cold pollen proxies from Site U1385 show glacial-interglacial, orbital and stadial-interstadial climate variability and correlate very well to Uk'37, planktic foraminifer δ18O and Ca/Ti proxies of previously drilled Shackleton Sites and Greenland Ice Core δ18O. The palynological proxies capture (almost) all Dansgaard-Oeschger events of the last glacial cycle, also before ~70 ka, where millennial-scale variability is overprinted by precession. We compare the performance and results of the palynology of Site U1385 to proxies of previously drilled Shackleton Sites and conclude that palynology strengthens the potential of this site to form a multi-proxy reference section for millennial scale climate variability across the Pleistocene-Holocene. Finally, we will present a long-term paleoceanographic perspective down to ~150 ka.

  11. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    NASA Astrophysics Data System (ADS)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis, (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration, and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.

  12. A GIS-based multi-source and multi-box modeling approach (GMSMB) for air pollution assessment--a North American case study.

    PubMed

    Wang, Bao-Zhen; Chen, Zhi

    2013-01-01

    This article presents a GIS-based multi-source and multi-box modeling approach (GMSMB) to predict the spatial concentration distributions of airborne pollutants on local and regional scales. In this method, an extended multi-box model combined with a multi-source and multi-grid Gaussian model is developed within the GIS framework to examine the contributions from both point- and area-source emissions. By using GIS, a large amount of data required for air quality modeling, including emission sources, air quality monitoring, meteorological data, and spatial location information, is brought into an integrated modeling environment. This allows spatial variation in source distribution and meteorological conditions to be analyzed quantitatively and in greater detail. The developed modeling approach has been used to predict the spatial concentration distribution of four air pollutants (CO, NO(2), SO(2) and PM(2.5)) for the State of California. The modeling results are compared with the monitoring data. Good agreement is obtained, which demonstrates that the developed modeling approach can deliver an effective air pollution assessment on both regional and local scales to support air pollution control and management planning.
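
    For context on the point-source component, here is a hedged Python sketch of the textbook steady-state Gaussian plume kernel with ground reflection. The emission rate, wind speed, effective height and dispersion parameters are illustrative numbers only, and the sketch does not reproduce the paper's multi-source, multi-grid or multi-box formulation.

```python
import numpy as np

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration (g/m^3) from a point source.

    q: emission rate (g/s), u: wind speed (m/s), h: effective stack height (m),
    y, z: crosswind and vertical receptor coordinates (m).
    sigma_y, sigma_z: dispersion parameters (m) evaluated at the receptor's
    downwind distance (normally taken from stability-class curves).
    """
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2))
                + np.exp(-(z + h)**2 / (2 * sigma_z**2)))  # ground reflection term
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative numbers only.
c = gaussian_plume(q=50.0, u=3.0, y=50.0, z=1.5, h=30.0, sigma_y=80.0, sigma_z=40.0)
print(f"concentration ~ {c:.2e} g/m^3")
```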

  13. Truly sedentary? The multi-range tactic as a response to resource heterogeneity and unpredictability in a large herbivore.

    PubMed

    Couriot, Ophélie; Hewison, A J Mark; Saïd, Sonia; Cagnacci, Francesca; Chamaillé-Jammes, Simon; Linnell, John D C; Mysterud, Atle; Peters, Wibke; Urbano, Ferdinando; Heurich, Marco; Kjellander, Petter; Nicoloso, Sandro; Berger, Anne; Sustr, Pavel; Kroeschel, Max; Soennichsen, Leif; Sandfort, Robin; Gehr, Benedikt; Morellet, Nicolas

    2018-05-01

    Much research on large herbivore movement has focused on the annual scale to distinguish between resident and migratory tactics, commonly assuming that individuals are sedentary at the within-season scale. However, apparently sedentary animals may occupy a number of sub-seasonal functional home ranges (sfHR), particularly when the environment is spatially heterogeneous and/or temporally unpredictable. The roe deer (Capreolus capreolus) experiences sharply contrasting environmental conditions due to its widespread distribution, but appears markedly sedentary over much of its range. Using GPS monitoring from 15 populations across Europe, we evaluated the propensity of this large herbivore to be truly sedentary at the seasonal scale in relation to variation in environmental conditions. We studied movement using net square displacement to identify the possible use of sfHR. We expected that roe deer should be less sedentary within seasons in heterogeneous and unpredictable environments, while migratory individuals should be seasonally more sedentary than residents. Our analyses revealed that, across the 15 populations, all individuals adopted a multi-range tactic, occupying between two and nine sfHR during a given season. In addition, we showed that (i) the number of sfHR was only marginally influenced by variation in resource distribution, but decreased with increasing sfHR size; and (ii) the distance between sfHR increased with increasing heterogeneity and predictability in resource distribution, as well as with increasing sfHR size. We suggest that the multi-range tactic is likely widespread among large herbivores, allowing animals to track spatio-temporal variation in resource distribution and, thereby, to cope with changes in their local environment.

  14. Upscaling Ameriflux observations to assess drought impacts on gross primary productivity across the Southwest

    NASA Astrophysics Data System (ADS)

    Barnes, M.; Moore, D. J.; Scott, R. L.; MacBean, N.; Ponce-Campos, G. E.; Breshears, D. D.

    2017-12-01

    Both satellite observations and eddy covariance estimates provide crucial information about the Earth's carbon, water and energy cycles. Continuous measurements from flux towers facilitate exploration of the exchange of carbon dioxide, water and energy between the land surface and the atmosphere at fine temporal and spatial scales, while satellite observations can fill in the large spatial gaps of in-situ measurements and provide long-term temporal continuity. The Southwest (southwestern United States and northwestern Mexico) and other semi-arid regions represent a key uncertainty in interannual variability in carbon uptake. Comparisons of existing global upscaled gross primary production (GPP) products with flux tower data at sites across the Southwest show widespread mischaracterization of seasonality in vegetation carbon uptake, resulting in large (up to 200%) errors in annual carbon uptake estimates. Here, remotely sensed and distributed meteorological inputs are used to upscale GPP estimates from 25 Ameriflux towers across the Southwest to the regional scale using a machine learning approach. Our random forest model incorporates two novel features that improve the spatial and temporal variability in GPP. First, we incorporate a multi-scalar drought index at multiple timescales to account for differential seasonality between ecosystem types. Second, our machine learning algorithm was trained on twenty-five ecologically diverse sites to optimize both the monthly variability in and the seasonal cycle of GPP. The product and its components will be used to examine drought impacts on terrestrial carbon cycling across the Southwest, including the effects of drought seasonality on carbon uptake. Our spatially and temporally continuous upscaled GPP product, drawing from both ground and satellite data over the Southwest region, helps us understand linkages between the carbon and water cycles in semi-arid ecosystems and informs predictions of vegetation response to future climate conditions.
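
    A hedged Python sketch of the general upscaling workflow is shown below; the predictor set, the simulated tower data and the grouped cross-validation scheme are illustrative assumptions standing in for the actual Ameriflux training data and model configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(1)

# Simulated stand-in for tower-month training data: predictors such as NDVI,
# air temperature, precipitation and a multi-scalar drought index at 1-, 6-
# and 12-month timescales; the target is monthly tower GPP.
n = 1200
ndvi   = rng.uniform(0.1, 0.9, n)
tair   = rng.uniform(0.0, 35.0, n)
precip = rng.uniform(0.0, 150.0, n)
spei_1, spei_6, spei_12 = rng.normal(0.0, 1.0, (3, n))
X = np.column_stack([ndvi, tair, precip, spei_1, spei_6, spei_12])
gpp = 8 * ndvi + 0.05 * tair + 0.5 * spei_6 + rng.normal(0.0, 0.5, n)

# Leave-sites-out style validation guards against over-optimistic skill.
site = rng.integers(0, 25, n)
rf = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(rf, X, gpp, groups=site, cv=GroupKFold(n_splits=5), scoring="r2")
print("grouped cross-validated R^2:", scores.mean().round(2))

# A trained model would then be applied to gridded predictor maps to produce
# the upscaled, spatially continuous GPP product.
```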

  15. Evaluation of various LandFlux evapotranspiration algorithms using the LandFlux-EVAL synthesis benchmark products and observational data

    NASA Astrophysics Data System (ADS)

    Michel, Dominik; Hirschi, Martin; Jimenez, Carlos; McCabe, Mathew; Miralles, Diego; Wood, Eric; Seneviratne, Sonia

    2014-05-01

    Research on climate variations and the development of predictive capabilities largely rely on globally available reference data series of the different components of the energy and water cycles. Several efforts have aimed at producing large-scale and long-term reference data sets of these components, e.g. based on in situ observations and remote sensing, in order to allow for diagnostic analyses of the drivers of temporal variations in the climate system. Evapotranspiration (ET) is an essential component of the energy and water cycle which cannot be monitored directly on a global scale by remote sensing techniques. In recent years, several global multi-year ET data sets have been derived from remote sensing-based estimates, observation-driven land surface model simulations or atmospheric reanalyses. The LandFlux-EVAL initiative presented an ensemble evaluation of these data sets over the time periods 1989-1995 and 1989-2005 (Mueller et al. 2013). Currently, a multi-decadal global reference heat flux data set for ET at the land surface is being developed within the LandFlux initiative of the Global Energy and Water Cycle Experiment (GEWEX). This LandFlux v0 ET data set comprises four ET algorithms forced with common radiation and surface meteorology. In order to estimate the agreement of the LandFlux v0 ET data with existing data sets, it is compared to the recently available LandFlux-EVAL synthesis benchmark product. Additional evaluation of the LandFlux v0 ET data set is based on a comparison to in situ observations from a weighing lysimeter at the hydrological research site Rietholzbach in Switzerland. These analyses serve as a test bed for similar evaluation procedures that are envisaged for ESA's WACMOS-ET initiative (http://wacmoset.estellus.eu). Reference: Mueller, B., Hirschi, M., Jimenez, C., Ciais, P., Dirmeyer, P. A., Dolman, A. J., Fisher, J. B., Jung, M., Ludwig, F., Maignan, F., Miralles, D. G., McCabe, M. F., Reichstein, M., Sheffield, J., Wang, K., Wood, E. F., Zhang, Y., and Seneviratne, S. I. (2013). Benchmark products for land evapotranspiration: LandFlux-EVAL multi-data set synthesis. Hydrology and Earth System Sciences, 17(10): 3707-3720.

  16. Dynamical optical imaging monocytes/macrophages migration and activation in contact hypersensitivity (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Zhang, Zhihong

    2017-02-01

    Inflammatory monocytes/macrophages (Mon/Mφ) play an important role in cutaneous allergic inflammation. However, their migration and activation in dermatitis, and how they accelerate the inflammatory reaction, are largely unknown. Optical molecular imaging is the most promising tool for investigating the function and motility of immune cells in vivo. We have developed a multi-scale optical imaging approach to evaluate the spatio-temporal dynamic behavior and properties of immune cells, from the whole field of organs down to the cellular level, at the inflammatory site in the delayed-type hypersensitivity reaction. Here, we developed multi-color labeling mouse models based on endogenous labeling with fluorescent proteins and exogenous labeling with fluorescent dyes. We investigated the cell movement, cell interactions and function of immunocytes (e.g. Mon/Mφ, DC, T cells and neutrophils) in skin allergic inflammation (e.g., contact hypersensitivity) by using intravital microscopy. The long-term imaging data showed that, after transendothelial migration into the dermis, inflammatory Mon/Mφ migrate through the interstitial space of the dermis. Depletion of blood monocytes with clodronate liposomes markedly reduced the inflammatory reaction. Our findings provide further insight into how inflammatory Mon/Mφ mediate the inflammatory cascade through functional migration in allergic contact dermatitis.

  17. SUMMARY OF SOLIDIFICATION/STABILIZATION SITE DEMONSTRATIONS AT UNCONTROLLED HAZARDOUS WASTE SITES

    EPA Science Inventory

    Four large-scale solidification/stabilization demonstrations have occurred under EPA's SITE program. In general, physical testing results have been acceptable. Reduction in metal leachability, as determined by the TCLP test, has been observed. Reduction in organic leachability ha...

  18. Evaluation of Advanced Reactive Surface Area Estimates for Improved Prediction of Mineral Reaction Rates in Porous Media

    NASA Astrophysics Data System (ADS)

    Beckingham, L. E.; Mitnick, E. H.; Zhang, S.; Voltolini, M.; Yang, L.; Steefel, C. I.; Swift, A.; Cole, D. R.; Sheets, J.; Kneafsey, T. J.; Landrot, G.; Anovitz, L. M.; Mito, S.; Xue, Z.; Ajo Franklin, J. B.; DePaolo, D.

    2015-12-01

    CO2 sequestration in deep sedimentary formations is a promising means of reducing atmospheric CO2 emissions, but the rate and extent of mineral trapping remain difficult to predict. Reactive transport models provide predictions of mineral trapping based on laboratory mineral reaction rates, which have been shown to have large discrepancies with field rates. This, in part, may be due to poor quantification of mineral reactive surface area in natural porous media. Common estimates of mineral reactive surface area are ad hoc and typically based on grain size, adjusted by several orders of magnitude to account for surface roughness and reactivity. This results in orders-of-magnitude discrepancies in estimated surface areas that directly translate into orders-of-magnitude discrepancies in model predictions. Additionally, natural systems can be highly heterogeneous and contain abundant nano- and micro-porosity, which can limit connected porosity and access to mineral surfaces. In this study, mineral-specific accessible surface areas are computed for a sample from the reservoir formation at the Nagaoka pilot CO2 injection site (Japan). Accessible mineral surface areas are determined from a multi-scale image analysis including X-ray microCT, SEM QEMSCAN, XRD, SANS, and SEM-FIB. Powder and flow-through column laboratory experiments are performed and the evolution of solutes in the aqueous phase is tracked. Continuum-scale reactive transport models are used to evaluate the impact of reactive surface area on predictions of experimental reaction rates. Evaluated reactive surface areas include geometric and specific surface areas (e.g. BET) in addition to their reactive-site-weighted counterparts. The most accurate predictions of observed powder mineral dissolution rates were obtained through the use of grain-size-based specific surface areas computed from a BET-based correlation. Effectively, this surface area reflects the grain-fluid contact area, or accessible surface area, in the powder dissolution experiment. In the model of the flow-through column experiment, the accessible mineral surface area computed from the multi-scale image analysis is evaluated in addition to the traditional surface area estimates.

  19. A global framework to model spatial ecosystems exposure to home and personal care chemicals in Asia.

    PubMed

    Wannaz, Cedric; Franco, Antonio; Kilgallon, John; Hodges, Juliet; Jolliet, Olivier

    2018-05-01

    This paper analyzes ecosystem exposure to home and personal care (HPC) chemicals spatially, accounting for market data and environmental processes in hydrological water networks, including multi-media fate and transport. We present a global modeling framework built on ScenAT (spatial scenarios of emission), SimpleTreat (sludge treatment plants), and Pangea (spatial multi-scale multimedia fate and transport of chemicals), which we apply across Asia to four chemicals selected to cover a variety of applications, volumes of production and emission, and physico-chemical and environmental fate properties: the anionic surfactant linear alkylbenzene sulphonate (LAS), the antimicrobial triclosan (TCS), the personal care preservative methyl paraben (MeP), and the emollient decamethylcyclopentasiloxane (D5). We present maps of predicted environmental concentrations (PECs) and compare them with monitored values. LAS emission levels and PECs are two to three orders of magnitude greater than for the other substances, yet the literature on monitored levels of LAS in Asia is very limited. We observe a good agreement for TCS in freshwater (Pearson r=0.82, for 253 monitored values covering 12 streams), a moderate agreement in general, and a significant model underestimation for MeP in sediments. While most differences could be explained by uncertainty in both chemical/hydrological parameters (DT50 in water, DT50 in sediments, Koc, foc, TSS) and monitoring sites (e.g. spatial/temporal design), the underestimation of MeP concentrations in sediments may involve potential natural sources. We illustrate the relevance of local evaluations for short-lived substances in fresh water (LAS, MeP), and their inadequacy for substances with longer half-lives (TCS, D5). This framework constitutes a milestone towards higher-tier exposure modeling approaches for identifying areas of higher chemical concentration, and for linking large-scale fate modeling with (sub)catchment-scale ecological scenarios; a major limitation in model accuracy comes from the discrepancy between streams routed on a gridded, 0.5°×0.5° global hydrological network and the actual locations of streams and monitoring sites. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Sea ice algae chlorophyll a concentrations derived from under-ice spectral radiation profiling platforms

    NASA Astrophysics Data System (ADS)

    Lange, Benjamin A.; Katlein, Christian; Nicolaus, Marcel; Peeken, Ilka; Flores, Hauke

    2016-12-01

    Multiscale sea ice algae observations are fundamentally important for projecting changes to sea ice ecosystems, as the physical environment continues to change. In this study, we built upon previously established methodologies for deriving sea ice-algal chlorophyll a concentrations (chl a) from spectral radiation measurements, and applied these to larger-scale spectral surveys. We conducted four different under-ice spectral measurements: irradiance, radiance, transmittance, and transflectance, and applied three statistical approaches: Empirical Orthogonal Functions (EOF), Normalized Difference Indices (NDI), and multi-NDI. We developed models based on ice core chl a and coincident spectral irradiance/transmittance (N = 49) and radiance/transflectance (N = 50) measurements conducted during two cruises to the central Arctic Ocean in 2011 and 2012. These reference models were ranked based on two criteria: mean robustness (R2) and true prediction error estimates. For estimating the biomass of a large-scale data set, the EOF approach performed better than the NDI, due to its ability to account for the high variability of environmental properties experienced over large areas. Based on robustness and true prediction error, the three most reliable models, EOF-transmittance, EOF-transflectance, and NDI-transmittance, were applied to two remotely operated vehicle (ROV) and two Surface and Under-Ice Trawl (SUIT) spectral radiation surveys. In these larger-scale chl a estimates, EOF-transmittance showed the best fit to ice core chl a. Application of our most reliable model, EOF-transmittance, to an 85 m horizontal ROV transect revealed large differences compared to published biomass estimates from the same site, with important implications for projections of Arctic-wide ice-algal biomass and primary production.
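
    A minimal sketch of the normalized difference index (NDI) idea used above: an index built from two transmittance bands, regressed against ice-core chl a. The band values, core data, and log-linear fit are illustrative assumptions, not the band pair or regression form selected in the study.

        # NDI from two transmittance wavelengths, regressed against ice-core chl a.
        import numpy as np

        def ndi(t_lambda1, t_lambda2):
            """Normalized difference index of two spectral transmittance bands."""
            return (t_lambda1 - t_lambda2) / (t_lambda1 + t_lambda2)

        # Hypothetical transmittance at two bands and coincident core chl a (mg m-2)
        t1 = np.array([0.12, 0.08, 0.15, 0.05])
        t2 = np.array([0.10, 0.05, 0.14, 0.02])
        chl_a = np.array([1.2, 4.5, 0.8, 9.1])

        x = ndi(t1, t2)
        slope, intercept = np.polyfit(x, np.log10(chl_a), 1)   # log-linear fit (illustrative)
        print(f"chl a estimate: 10**({slope:.2f} * NDI + {intercept:.2f})")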

  1. Students' Attitudes towards Edmodo, a Social Learning Network: A Scale Development Study

    ERIC Educational Resources Information Center

    Yunkul, Eyup; Cankaya, Serkan

    2017-01-01

    Social Learning Networks (SLNs) are the developed forms of Social Network Sites (SNSs) adapted to educational environments, and they are used by quite a large population throughout the world. In addition, in related literature, there is no scale for the measurement of students' attitudes towards such sites. The purpose of this study was to develop…

  2. 75 FR 58395 - Agency Information Collection Activities: Proposed Collection: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-24

    ... Technology Planning Grants, Electronic Health Record Implementation Health Center Controlled Networks, Health... Records Implementation for Health Center Controlled Networks and Large Multi Site Health Centers. In order... DEPARTMENT OF HEALTH AND HUMAN SERVICES Health Resources and Services Administration Agency...

  3. Using 3D Geologic Models to Synthesize Large and Disparate Datasets for Site Characterization and Verification Purposes

    NASA Astrophysics Data System (ADS)

    Hillesheim, M. B.; Rautman, C. A.; Johnson, P. B.; Powers, D. W.

    2008-12-01

    As we are all aware, increases in computing power and efficiency have allowed for the development of many modeling codes capable of processing large and sometimes disparate datasets (e.g., geological, hydrological, geochemical, etc.). Because people sometimes have difficulty visualizing in three dimensions (3D) or understanding how multiple figures of various geologic features relate as a whole, 3D geologic models can be excellent tools to illustrate key concepts and findings, especially to lay persons, such as stakeholders, customers, and other concerned parties. In this presentation, we show examples of 3D geologic modeling efforts using data collected during site characterization and verification work at the Waste Isolation Pilot Plant (WIPP). The WIPP is a U.S. Department of Energy (DOE) facility located in southeastern New Mexico, designed for the safe disposal of transuranic wastes resulting from U.S. defense programs. The 3D geologic modeling efforts focused on refining our understanding of the WIPP site by integrating a variety of geologic data. Examples include: overlaying isopach surfaces of unit thickness and overburden thickness, a map of geologic facies changes, and a transmissivity field onto a 3D structural map of a geologic unit of interest. In addition, we present a 4D hydrogeologic model of the effects of a large-scale pumping test on water levels. All these efforts have provided additional insights into the controls on transmissivity and flow in the WIPP vicinity. Ultimately, by combining these various types of data we have increased our understanding of the WIPP site's hydrogeologic system, which is a key aspect of continued certification. Sandia is a multi-program laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy.

  4. Multi-contamination (heavy metals, polychlorinated biphenyls and polycyclic aromatic hydrocarbons) of littoral sediments and the associated ecological risk assessment in a large lake in France (Lake Bourget).

    PubMed

    Lécrivain, Nathalie; Aurenche, Vincent; Cottin, Nathalie; Frossard, Victor; Clément, Bernard

    2018-04-01

    The lake littoral sediment is exposed to a large array of contaminants that can exhibit significant spatial variability and challenge our ability to assess contamination at the lake scale. In this study, littoral sediment contamination was characterized at ten different sites in a large peri-alpine lake (Lake Bourget) for three groups of contaminants: 6 heavy metals, 15 polycyclic aromatic hydrocarbons and 7 polychlorinated biphenyls. The contamination profiles varied significantly among sites and differed from those previously reported for the deepest zone of the lake. An integrative approach including chemical and biological analyses was conducted to relate site contamination to ecological risk. The chemical approach consisted of calculating mean PEC quotients (the average of the ratios of contaminant concentrations to their corresponding Probable Effect Concentration values) and revealed a low and heterogeneous toxicity of the contaminant mixture along the littoral. Biological analysis, including both laboratory (microcosm assays) and in situ (acetylcholinesterase (AChE) and glutathione S-transferase (GST) activity measurements) experiments, highlighted significant differences among sites both in the field and in laboratory assays, suggesting a spatial variation of the biota response to contamination. Linear regressions were performed between mean PEC quotients and biological results to assess whether littoral ecological risk was explained by the contamination profiles. The results depended strongly on the benthic or pelagic compartment studied. For autochthonous Corbicula fluminea, no significant relationship between mean PEC quotients and biomarker activity was found, while a significant increase in AChE was observed in autochthonous chironomids, suggesting different stress among benthic organisms. Both AChE and GST in caged pelagic Daphnia magna showed a significant positive relationship with mean PEC quotients. This study underlines the importance of accounting for spatial variations in lake littoral sediment contamination and the need for an integrative approach coupling chemical, field and laboratory analyses to assess the ecological risk. Copyright © 2017 Elsevier B.V. All rights reserved.
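
    A minimal sketch of the mean PEC quotient defined above: the average of each contaminant's measured concentration divided by its Probable Effect Concentration. The concentrations and PEC values below are illustrative placeholders, not the Lake Bourget data.

        # Mean PEC quotient for one site: average of concentration / PEC ratios.
        def mean_pec_quotient(concentrations, pec_values):
            quotients = [c / pec for c, pec in zip(concentrations, pec_values)]
            return sum(quotients) / len(quotients)

        site_conc = {"Cd": 0.8, "Pb": 45.0, "Zn": 210.0}   # mg/kg dry weight (illustrative)
        pec = {"Cd": 4.98, "Pb": 128.0, "Zn": 459.0}       # PEC values (illustrative)
        q = mean_pec_quotient([site_conc[m] for m in site_conc], [pec[m] for m in site_conc])
        print(f"mean PEC quotient = {q:.2f}")  # low values are often read as low expected toxicity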

  5. Stochastic injection-strategy optimization for the preliminary assessment of candidate geological storage sites

    NASA Astrophysics Data System (ADS)

    Cody, Brent M.; Baù, Domenico; González-Nicolás, Ana

    2015-09-01

    Geological carbon sequestration (GCS) has been identified as having the potential to reduce increasing atmospheric concentrations of carbon dioxide (CO2). However, a global impact will only be achieved if GCS is cost-effectively and safely implemented on a massive scale. This work presents a computationally efficient methodology for identifying optimal injection strategies at candidate GCS sites having uncertainty associated with caprock permeability, effective compressibility, and aquifer permeability. A multi-objective evolutionary optimization algorithm is used to heuristically determine non-dominated solutions with respect to the following two competing objectives: (1) maximize the mass of CO2 sequestered and (2) minimize project cost. A semi-analytical algorithm is used to estimate CO2 leakage mass rather than a numerical model, enabling the study of GCS sites having vastly different domain characteristics. The stochastic optimization framework presented herein is applied to a feasibility study of GCS in a brine aquifer in the Michigan Basin (MB), USA. Eight optimization test cases are performed to investigate the impact of decision-maker (DM) preferences on Pareto-optimal objective-function values and carbon-injection strategies. This analysis shows that the feasibility of GCS at the MB test site is highly dependent upon the DM's risk-aversion preference and the degree of uncertainty associated with caprock integrity. Finally, large gains in computational efficiency achieved using parallel processing and archiving are discussed.
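
    A minimal sketch of the non-dominated (Pareto) filtering implied by the two objectives named above: maximize the mass of CO2 stored while minimizing project cost. The candidate strategies and units are hypothetical, and this brute-force filter stands in for the evolutionary algorithm used in the study.

        # Flag non-dominated injection strategies for (mass stored, project cost).
        def is_dominated(a, b):
            """True if strategy b dominates a: stores at least as much CO2 at no greater
            cost, and is strictly better in at least one objective."""
            return (b[0] >= a[0] and b[1] <= a[1]) and (b[0] > a[0] or b[1] < a[1])

        # (mass stored in Mt, cost in M$) for hypothetical candidate strategies
        candidates = [(10.0, 120.0), (13.0, 150.0), (9.0, 140.0), (12.0, 145.0)]
        pareto = [a for a in candidates
                  if not any(is_dominated(a, b) for b in candidates if b is not a)]
        print(pareto)  # (9.0, 140.0) is dominated by (10.0, 120.0); the rest form the front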

  6. The Human Blood Metabolome-Transcriptome Interface

    PubMed Central

    Schramm, Katharina; Adamski, Jerzy; Gieger, Christian; Herder, Christian; Carstensen, Maren; Peters, Annette; Rathmann, Wolfgang; Roden, Michael; Strauch, Konstantin; Suhre, Karsten; Kastenmüller, Gabi; Prokisch, Holger; Theis, Fabian J.

    2015-01-01

    Biological systems consist of multiple organizational levels all densely interacting with each other to ensure function and flexibility of the system. Simultaneous analysis of cross-sectional multi-omics data from large population studies is a powerful tool to comprehensively characterize the underlying molecular mechanisms on a physiological scale. In this study, we systematically analyzed the relationship between fasting serum metabolomics and whole blood transcriptomics data from 712 individuals of the German KORA F4 cohort. Correlation-based analysis identified 1,109 significant associations between 522 transcripts and 114 metabolites summarized in an integrated network, the ‘human blood metabolome-transcriptome interface’ (BMTI). Bidirectional causality analysis using Mendelian randomization did not yield any statistically significant causal associations between transcripts and metabolites. A knowledge-based interpretation and integration with a genome-scale human metabolic reconstruction revealed systematic signatures of signaling, transport and metabolic processes, i.e. metabolic reactions mainly belonging to lipid, energy and amino acid metabolism. Moreover, the construction of a network based on functional categories illustrated the cross-talk between the biological layers at a pathway level. Using a transcription factor binding site enrichment analysis, this pathway cross-talk was further confirmed at a regulatory level. Finally, we demonstrated how the constructed networks can be used to gain novel insights into molecular mechanisms associated with intermediate clinical traits. Overall, our results demonstrate the utility of a multi-omics integrative approach to understand the molecular mechanisms underlying both normal physiology and disease. PMID:26086077
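
    A minimal sketch of a correlation-based transcript-metabolite screen with multiple-testing correction, in the spirit of the analysis described above. The random data, cohort size, and FDR procedure are illustrative assumptions; the study's actual preprocessing and significance criteria are not reproduced here.

        # Pairwise transcript-metabolite correlations with Benjamini-Hochberg FDR control.
        import numpy as np
        from scipy.stats import pearsonr
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(0)
        n_samples, n_transcripts, n_metabolites = 100, 20, 10
        transcripts = rng.normal(size=(n_samples, n_transcripts))
        metabolites = rng.normal(size=(n_samples, n_metabolites))

        pvals, pairs = [], []
        for i in range(n_transcripts):
            for j in range(n_metabolites):
                r, p = pearsonr(transcripts[:, i], metabolites[:, j])
                pvals.append(p)
                pairs.append((i, j, r))

        rejected, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
        significant = [pair for pair, keep in zip(pairs, rejected) if keep]
        print(f"{len(significant)} transcript-metabolite associations pass FDR correction")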

  7. Habitat structure mediates biodiversity effects on ecosystem properties

    PubMed Central

    Godbold, J. A.; Bulling, M. T.; Solan, M.

    2011-01-01

    Much of what we know about the role of biodiversity in mediating ecosystem processes and function stems from manipulative experiments, which have largely been performed in isolated, homogeneous environments that do not incorporate habitat structure or allow natural community dynamics to develop. Here, we use a range of habitat configurations in a model marine benthic system to investigate the effects of species composition, resource heterogeneity and patch connectivity on ecosystem properties at both the patch (bioturbation intensity) and multi-patch (nutrient concentration) scale. We show that allowing fauna to move and preferentially select patches alters local species composition and density distributions, which has negative effects on ecosystem processes (bioturbation intensity) at the patch scale, but overall positive effects on ecosystem functioning (nutrient concentration) at the multi-patch scale. Our findings provide important evidence that community dynamics alter in response to localized resource heterogeneity and that these small-scale variations in habitat structure influence species contributions to ecosystem properties at larger scales. We conclude that habitat complexity forms an important buffer against disturbance and that contemporary estimates of the level of biodiversity required for maintaining future multi-functional systems may need to be revised. PMID:21227969

  8. Habitat structure mediates biodiversity effects on ecosystem properties.

    PubMed

    Godbold, J A; Bulling, M T; Solan, M

    2011-08-22

    Much of what we know about the role of biodiversity in mediating ecosystem processes and function stems from manipulative experiments, which have largely been performed in isolated, homogeneous environments that do not incorporate habitat structure or allow natural community dynamics to develop. Here, we use a range of habitat configurations in a model marine benthic system to investigate the effects of species composition, resource heterogeneity and patch connectivity on ecosystem properties at both the patch (bioturbation intensity) and multi-patch (nutrient concentration) scale. We show that allowing fauna to move and preferentially select patches alters local species composition and density distributions, which has negative effects on ecosystem processes (bioturbation intensity) at the patch scale, but overall positive effects on ecosystem functioning (nutrient concentration) at the multi-patch scale. Our findings provide important evidence that community dynamics alter in response to localized resource heterogeneity and that these small-scale variations in habitat structure influence species contributions to ecosystem properties at larger scales. We conclude that habitat complexity forms an important buffer against disturbance and that contemporary estimates of the level of biodiversity required for maintaining future multi-functional systems may need to be revised.

  9. Detection of Neuron Membranes in Electron Microscopy Images Using Multi-scale Context and Radon-Like Features

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seyedhosseini, Mojtaba; Kumar, Ritwik; Jurrus, Elizabeth R.

    2011-10-01

    Automated neural circuit reconstruction through electron microscopy (EM) images is a challenging problem. In this paper, we present a novel method that exploits multi-scale contextual information together with Radon-like features (RLF) to learn a series of discriminative models. The main idea is to build a framework which is capable of extracting information about cell membranes from a large contextual area of an EM image in a computationally efficient way. Toward this goal, we extract RLF that can be computed efficiently from the input image and generate a scale-space representation of the context images that are obtained at the output of each discriminative model in the series. Compared to a single-scale model, the use of a multi-scale representation of the context image gives the subsequent classifiers access to a larger contextual area in an effective way. Our strategy is general and independent of the classifier and has the potential to be used in any context based framework. We demonstrate that our method outperforms the state-of-the-art algorithms in detection of neuron membranes in EM images.
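
    A minimal sketch of the multi-scale context idea described above: the per-pixel output of a previous classifier stage is represented at several scales so that the next stage sees a larger contextual area. The Gaussian scale-space, the map size, and the stacking into per-pixel features are illustrative assumptions; the RLF features and the discriminative models themselves are omitted.

        # Scale-space representation of a context map for the next classifier stage.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        def context_scale_space(context_map, n_scales=3):
            """Return the context map smoothed at progressively coarser scales."""
            return [gaussian_filter(context_map, sigma=2 ** s) for s in range(n_scales)]

        context = np.random.rand(256, 256)     # hypothetical membrane-probability map
        pyramid = context_scale_space(context)
        features = np.stack(pyramid, axis=-1)  # per-pixel multi-scale context features
        print(features.shape)                  # (256, 256, 3)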

  10. Multi-season climate synchronized historical fires in dry forests (1650-1900), northern Rockies, U.S.A.

    PubMed

    Heyerdahl, Emily K; Morgan, Penelope; Riser, James P

    2008-03-01

    Our objective was to infer the climate drivers of regionally synchronous fire years in dry forests of the U.S. northern Rockies in Idaho and western Montana. During our analysis period (1650-1900), we reconstructed fires from 9245 fire scars on 576 trees (mostly ponderosa pine, Pinus ponderosa P. & C. Lawson) at 21 sites and compared them to existing tree-ring reconstructions of climate (temperature and the Palmer Drought Severity Index [PDSI]) and large-scale climate patterns that affect modern spring climate in this region (El Niño Southern Oscillation [ENSO] and the Pacific Decadal Oscillation [PDO]). We identified 32 regional-fire years as those with five or more sites with fire. Fires were remarkably widespread during such years, including one year (1748) in which fires were recorded at 10 sites across what are today seven national forests plus one site on state land. During regional-fire years, spring-summers were significantly warm and summers were significantly warm-dry whereas the opposite conditions prevailed during the 99 years when no fires were recorded at any of our sites (no-fire years). Climate in prior years was not significantly associated with regional- or no-fire years. Years when fire was recorded at only a few of our sites occurred under a broad range of climate conditions, highlighting the fact that the regional climate drivers of fire are most evident when fires are synchronized across a large area. No-fire years tended to occur during La Niña years, which tend to have anomalously deep snowpacks in this region. However, ENSO was not a significant driver of regional-fire years, consistent with the greater influence of La Niña than El Niño conditions on the spring climate of this region. PDO was not a significant driver of past fire, despite being a strong driver of modern spring climate and modern regional-fire years in the northern Rockies.
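
    A minimal sketch of the regional-fire-year classification described above: years with fire recorded at five or more sites are regional-fire years, and years with no site recording fire are no-fire years. The fire-scar records below are hypothetical placeholders, not the 21-site dataset.

        # Classify years by the number of sites recording fire.
        site_fire_years = {
            "site_A": {1748, 1756, 1800},
            "site_B": {1748, 1783},
            "site_C": {1748, 1800, 1812},
            "site_D": {1748, 1783, 1800},
            "site_E": {1748, 1812},
        }
        counts = {yr: sum(yr in fires for fires in site_fire_years.values())
                  for yr in range(1650, 1901)}
        regional_fire_years = [yr for yr, n in counts.items() if n >= 5]
        no_fire_years = [yr for yr, n in counts.items() if n == 0]
        print(regional_fire_years)   # [1748] with these placeholder records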

  11. The effects of workplace health promotion on absenteeism and employment costs in a large industrial population.

    PubMed Central

    Bertera, R L

    1990-01-01

    We evaluated the impact of a comprehensive workplace health promotion program on absences among full-time employees in a large, multi-location, diversified industrial company. A pretest-posttest control group design was used to study 41 intervention sites and 19 control sites with 29,315 and 14,573 hourly employees, respectively. Blue-collar employees at intervention sites experienced a 14.0 percent decline in disability days over two years versus a 5.8 percent decline at control sites. This resulted in a net difference of 11,726 fewer disability days over two years at program sites compared with non-program sites. Savings due to lower disability costs at intervention sites offset program costs in the first year, and provided a return of $2.05 for every dollar invested in the program by the end of the second year. These results suggest that comprehensive workplace health promotion programs can reduce disability days among blue-collar employees and provide a good return on investment. PMID:2382748

  12. How well do the GCMs/RCMs capture the multi-scale temporal variability of precipitation in the Southwestern United States?

    NASA Astrophysics Data System (ADS)

    Jiang, Peng; Gautam, Mahesh R.; Zhu, Jianting; Yu, Zhongbo

    2013-02-01

    Multi-scale temporal variability of precipitation has an established relationship with floods and droughts. In this paper, we present diagnostics on the ability of 16 General Circulation Models (GCMs) from Bias Corrected and Downscaled (BCSD) World Climate Research Program's (WCRP's) Coupled Model Inter-comparison Project Phase 3 (CMIP3) projections and 10 Regional Climate Models (RCMs) that participated in the North American Regional Climate Change Assessment Program (NARCCAP) to represent multi-scale temporal variability determined from the observed station data. Four regions (Los Angeles, Las Vegas, Tucson, and Cimarron) in the Southwest United States are selected as they represent four different precipitation regions classified by a clustering method. We investigate how storm properties and seasonal, inter-annual, and decadal precipitation variabilities differed between GCMs/RCMs and observed records in these regions. We find that current GCMs/RCMs tend to simulate longer storm duration and lower storm intensity compared to those from observed records. Most GCMs/RCMs fail to produce the high-intensity summer storms caused by local convective heat transport associated with the summer monsoon. Both inter-annual and decadal bands are present in the GCM/RCM-simulated precipitation time series; however, these do not line up with the patterns of large-scale ocean oscillations such as the El Niño/La Niña Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO). Our results show that the studied GCMs/RCMs can capture the long-term monthly mean, as the examined data is bias-corrected and downscaled, but fail to simulate the multi-scale precipitation variability including flood-generating extreme events, which suggests their inadequacy for studies on floods and droughts that are strongly associated with multi-scale temporal precipitation variability.

  13. Climate insensitivity of treeline in the Canadian Rocky Mountains

    NASA Astrophysics Data System (ADS)

    Johnson, E. A.; Macias Fauria, M.

    2011-12-01

    Successful modelling efforts demonstrate that tree presence over a ~ 200 km2 alpine/subalpine area in the Front Ranges of the Canadian Rocky Mountains results from a multi-scale spatiotemporal process competition involving not only growing season temperatures but also topographical shelter, water availability, and substrate stability and availability. The study area was selected to represent the diversity of substrates and geomorphologic processes found in the Canadian Rockies, and ranges in elevation from 1400 to > 2800 meters above sea level. Tree presence was mapped at 10m resolution using a combination of remote sensing imagery (taken in 2008) and intensive ground truthing, and modelled with an ensemble of state-of-the-art environmental envelope models. Explanatory variables chosen represented not only temperature and moisture availability (computed over 1971-2000 climate normals), but also substrate diversity, slope angle and type, geomorphologic features, modelled regolith depth, and concavity/convexity of the terrain. Such variables were meant to serve as proxies for known convergent and divergent processes that occur on steep landscapes and that have profound influence on tree establishment and survival. Model performance was very high and revealed substrate and geomorphology to be the most important explanatory variables for tree presence in the area. Available high-resolution imagery for 1954 enabled the mapping of tree presence over most of the study area and the identification of changes in the distribution of trees over the last nearly six decades. Overall, the only major observed changes were related to post-fire stand recovery, and areas with treeline advance were insignificant at the landscape scale. Tree suitable sites were projected onto high resolution grids of late 21st century climatic conditions predicted by regional climate models driven by atmosphere-ocean general circulation models. Emissions scenario was A2 (as defined in the Special Report on Emissions Scenarios used by the Intergovernmental Panel on Climate Change), at the higher end of emissions scenarios, and thus at the higher end of forecasted temperature increases. Projected changes in tree site availability were minimal at the landscape scale, as the presence of trees in the uppermost part of these forests largely depends on the existence of suitable sites largely linked to topography. Such places are the result of geomorphologic processes acting on a framework set by the structural geology of the region, and thus the appearance of new sites suitable for tree growth does not depend on short (i.e. yearly to decadal) time scales but on longer ones (i.e. centuries to millennia). This work has the strength of studying treeline over a whole area, thus avoiding potential biases in the regional representativity of local study sites, and warns against careless upscaling of site-based studies. Moreover, we suggest that the term 'treeline' is weak at a high-resolution landscape scale in our study area (i.e. young glaciated terrain) because the distribution of trees over the landscape is spatially irregular and most of the processes enabling or preventing tree presence occur over its whole elevational range.

  14. Commissioning MMS

    NASA Technical Reports Server (NTRS)

    Wood, Paul; Gramling, Cheryl; Stone, John; Smith, Patrick; Reiter, Jenifer

    2016-01-01

    This paper discusses commissioning of NASA's Magnetospheric Multiscale (MMS) Mission. The mission includes four identical spacecraft with a large, complex set of instrumentation. The planning for and execution of commissioning for this mission is described. The paper concludes by discussing lessons learned.

  15. Connecting Smartphone and Wearable Fitness Tracker Data with a Nationally Used Electronic Health Record System for Diabetes Education to Facilitate Behavioral Goal Monitoring in Diabetes Care: Protocol for a Pragmatic Multi-Site Randomized Trial.

    PubMed

    Wang, Jing; Coleman, Deidra Carroll; Kanter, Justin; Ummer, Brad; Siminerio, Linda

    2018-04-02

    Mobile and wearable technology have been shown to be effective in improving diabetes self-management; however, integrating data from these technologies into clinical diabetes care to facilitate behavioral goal monitoring has not been explored. The objective of this paper is to report on a study protocol for a pragmatic multi-site trial along with the intervention components, including the detailed connected health interface. This interface was developed to integrate patient self-monitoring data collected from a wearable fitness tracker and its companion smartphone app into an electronic health record system for diabetes self-management education and support (DSMES) to facilitate behavioral goal monitoring. A 3-month multi-site pragmatic clinical trial was conducted with eligible patients with diabetes mellitus from DSMES programs. The Chronicle Diabetes system is currently freely available to diabetes educators through American Diabetes Association-recognized DSMES programs to set patient nutrition and physical activity goals. To integrate the goal-setting and self-monitoring intervention into the DSMES process, a connected interface in the Chronicle Diabetes system was developed. With the connected interface, patient self-monitoring information collected from smartphones and wearable fitness trackers can facilitate educators' monitoring of patients' adherence to their goals. Feasibility outcomes of the 3-month trial included hemoglobin A1c levels, weight, and the usability of the connected system. An interface designed to connect data from a wearable fitness tracker with a companion smartphone app for nutrition and physical activity self-monitoring into a diabetes education electronic health record system was successfully developed to enable diabetes educators to facilitate goal setting and monitoring. A total of 60 eligible patients with type 2 diabetes mellitus were randomized into either 1) standard diabetes education or 2) standard education enhanced with the connected system. Data collection for the 3-month pragmatic trial is completed. Data analysis is in progress. If the results of the pragmatic multi-site clinical trial show preliminary efficacy and usability of the connected system, a large-scale implementation trial will be conducted. ClinicalTrials.gov NCT02664233; https://clinicaltrials.gov/ct2/show/NCT02664233 (Archived by WebCite at http://www.webcitation.org/6yDEwXHo5). ©Jing Wang, Deidra Carroll Coleman, Justin Kanter, Brad Ummer, Linda Siminerio. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 02.04.2018.

  16. Generation of large scale GHZ states with the interactions of photons and quantum-dot spins

    NASA Astrophysics Data System (ADS)

    Miao, Chun; Fang, Shu-Dong; Dong, Ping; Yang, Ming; Cao, Zhuo-Liang

    2018-03-01

    We present a deterministic scheme for generating large-scale GHZ states in a cavity-quantum-dot system. A singly charged quantum dot is embedded in a double-sided optical microcavity with partially reflective top and bottom mirrors. The GHZ-type Bell spin state can be created, and two n-spin GHZ states can be perfectly fused into a 2n-spin GHZ state with the help of n ancilla single-photon pulses. The implementation of the current scheme depends only on photon detection and does not require multi-qubit gates or multi-qubit measurements. Discussions of the effects of cavity loss, side leakage, and exciton-cavity coupling strength on the fidelity of the generated states show that the fidelity can remain high enough by controlling the system parameters. So the current scheme is simple and feasible in experiment.

  17. Advances in multi-scale modeling of solidification and casting processes

    NASA Astrophysics Data System (ADS)

    Liu, Baicheng; Xu, Qingyan; Jing, Tao; Shen, Houfa; Han, Zhiqiang

    2011-04-01

    The development of the aviation, energy and automobile industries requires advanced, integrated product/process R&D systems that can optimize the product and the process design as well. Integrated computational materials engineering (ICME) is a promising approach to fulfill this requirement and make product and process development efficient, economic, and environmentally friendly. Advances in multi-scale modeling of solidification and casting processes, including mathematical models as well as engineering applications, are presented in the paper. Dendrite morphology of magnesium and aluminum alloys during solidification, simulated using phase field and cellular automaton methods, mathematical models of segregation in large steel ingots, and microstructure models of unidirectionally solidified turbine blade castings are studied and discussed. In addition, some engineering case studies, including microstructure simulation of aluminum castings for the automobile industry, segregation of large steel ingots for the energy industry, and microstructure simulation of unidirectionally solidified turbine blade castings for the aviation industry, are discussed.

  18. Fault-tolerant Control of a Cyber-physical System

    NASA Astrophysics Data System (ADS)

    Roxana, Rusu-Both; Eva-Henrietta, Dulf

    2017-10-01

    Cyber-physical systems represent a new emerging field in automatic control. Fault-tolerant control is a key component, because modern, large-scale processes must meet high standards of performance, reliability and safety. Fault propagation in large-scale chemical processes can lead to loss of production, energy and raw materials, and even to environmental hazard. The present paper develops a multi-agent fault-tolerant control architecture using robust fractional-order controllers for a (13C) cryogenic separation column cascade. The JADE (Java Agent DEvelopment Framework) platform was used to implement the multi-agent fault-tolerant control system, while the operational model of the process was implemented in the Matlab/SIMULINK environment. The MACSimJX (Multiagent Control Using Simulink with Jade Extension) toolbox was used to link the control system and the process model. In order to verify the performance and to prove the feasibility of the proposed control architecture, several fault simulation scenarios were performed.

  19. Portable chamber measurements of evapotranspiration at the Amargosa Desert Research Site near Beatty, Nye County, Nevada, 2003-06

    USGS Publications Warehouse

    Garcia, C. Amanda; Johnson, Michael J.; Andraski, Brian J.; Halford, Keith J.; Mayers, C. Justin

    2008-01-01

    Portable chamber measurements of evapotranspiration (ET) were made at the U.S. Geological Survey's Amargosa Desert Research Site in southern Nevada to help quantify component- and landscape-scale contributions to ET in an arid environment. Evapotranspiration data were collected approximately every 3 months from 2003 to 2006. Chamber measurements of ET were partitioned into bare-soil evaporation and mixed-species transpiration components. The component-scale ET fluxes from native shrubs typically surpassed those from bare soil by as much as a factor of four. Component-scale ET fluxes were extrapolated to landscape-scale ET using a one-layer, multi-component canopy model. Landscape-scale ET fluxes predominantly were controlled by bare-soil evaporation. Bare soil covered 94 percent of the landscape on average and contributed about 70 percent of the landscape-scale vapor flux. Creosote bush, an evergreen shrub, accounted for about 90 percent of transpiration on average due to its dominance across the landscape (80 percent of the 6 percent shrub cover) and evergreen character.
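
    A minimal sketch of extrapolating component-scale ET fluxes to a landscape-scale flux as a cover-weighted sum, in the spirit of the one-layer, multi-component canopy model mentioned above. The component fluxes, cover fractions, and component names are illustrative assumptions, not the published Amargosa Desert values.

        # Landscape ET as a cover-weighted sum of component-scale fluxes.
        component_flux = {"bare_soil": 0.25, "creosote_bush": 1.00, "other_shrubs": 0.80}  # mm/day
        cover_fraction = {"bare_soil": 0.94, "creosote_bush": 0.048, "other_shrubs": 0.012}

        landscape_et = sum(component_flux[c] * cover_fraction[c] for c in component_flux)
        contributions = {c: component_flux[c] * cover_fraction[c] / landscape_et
                         for c in component_flux}
        print(f"landscape ET = {landscape_et:.3f} mm/day")
        print({c: f"{f:.0%}" for c, f in contributions.items()})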

  20. Large scale modulation of high frequency acoustic waves in periodic porous media.

    PubMed

    Boutin, Claude; Rallu, Antoine; Hans, Stephane

    2012-12-01

    This paper deals with the description of the modulation at large scale of high-frequency acoustic waves in gas-saturated periodic porous media. High frequencies mean local dynamics at the pore scale and therefore an absence of scale separation in the usual sense of homogenization. However, although the pressure is spatially varying in the pores (according to periodic eigenmodes), the mode amplitude can present a large-scale modulation, thereby introducing another type of scale separation to which the asymptotic multi-scale procedure applies. The approach is first presented on a periodic network of inter-connected Helmholtz resonators. The equations governing the modulations carried by periodic eigenmodes, at frequencies close to their eigenfrequency, are derived. The number of cells on which the carrying periodic mode is defined is therefore a parameter of the modeling. In the second part, the asymptotic approach is developed for periodic porous media saturated by a perfect gas. Using the "multicells" periodic condition, one obtains the family of equations governing the amplitude modulation at large scale of high-frequency waves. The significant differences between modulations of simple and multiple modes are evidenced and discussed. The features of the modulation (anisotropy, width of frequency band) are also analyzed.
