Sample records for amazon elastic compute

  1. Integrating Marine Observatories into a System-of-Systems: Messaging in the US Ocean Observatories Initiative

    DTIC Science & Technology

    2010-06-01

    Woods Hole, MA 02543, USA 3 Raytheon Intelligence and Information Systems, Aurora, CO 80011, USA 4 Scripps Institution of Oceanography, La Jolla...Amazon.com, Amazon Web Services for the Amazon Elastic Compute Cloud (Amazon EC2). http://aws.amazon.com/ec2/. [4] M. Arrott, B. Demchak, V. Ermagan, C

  2. Processing Shotgun Proteomics Data on the Amazon Cloud with the Trans-Proteomic Pipeline*

    PubMed Central

    Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W.; Moritz, Robert L.

    2015-01-01

    Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of cloud enabled Trans-Proteomic Pipeline by performing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. PMID:25418363

  3. Processing shotgun proteomics data on the Amazon cloud with the trans-proteomic pipeline.

    PubMed

    Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W; Moritz, Robert L

    2015-02-01

    Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of cloud enabled Trans-Proteomic Pipeline by performing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  4. Elastic Cloud Computing Infrastructures in the Open Cirrus Testbed Implemented via Eucalyptus

    NASA Astrophysics Data System (ADS)

    Baun, Christian; Kunze, Marcel

    Cloud computing realizes the advantages and overcomes some restrictions of the grid computing paradigm. Elastic infrastructures can easily be created and managed by cloud users. In order to accelerate research on data center management and cloud services, the OpenCirrus™ research testbed has been started by HP, Intel and Yahoo!. Although commercial cloud offerings are proprietary, Open Source solutions exist in the field of IaaS with Eucalyptus, PaaS with AppScale and at the applications layer with Hadoop MapReduce. This paper examines the I/O performance of cloud computing infrastructures implemented with Eucalyptus in contrast to Amazon S3.

  5. HPC: Rent or Buy

    ERIC Educational Resources Information Center

    Fredette, Michelle

    2012-01-01

    "Rent or buy?" is a question people ask about everything from housing to textbooks. It is also a question universities must consider when it comes to high-performance computing (HPC). With the advent of Amazon's Elastic Compute Cloud (EC2), Microsoft Windows HPC Server, Rackspace's OpenStack, and other cloud-based services, researchers now have…

  6. ATLAS@AWS

    NASA Astrophysics Data System (ADS)

    Gehrcke, Jan-Philip; Kluth, Stefan; Stonjek, Stefan

    2010-04-01

    We show how the ATLAS offline software is ported to the Amazon Elastic Compute Cloud (EC2). We prepare an Amazon Machine Image (AMI) on the basis of the standard ATLAS platform Scientific Linux 4 (SL4). Then an instance of the SLC4 AMI is started on EC2 and we install and validate a recent release of the ATLAS offline software distribution kit. The installed software is archived as an image on the Amazon Simple Storage Service (S3) and can be quickly retrieved and connected to new SL4 AMI instances using the Amazon Elastic Block Store (EBS). ATLAS jobs can then configure against the release kit using the ATLAS configuration management tool (cmt) in the standard way. The output of jobs is exported to S3 before the SL4 AMI is terminated. Job status information is transferred to the Amazon SimpleDB service. The whole process of launching instances of our AMI, starting, monitoring and stopping jobs and retrieving job output from S3 is controlled from a client machine using python scripts implementing the Amazon EC2/S3 API via the boto library working together with small scripts embedded in the SL4 AMI. We report our experience with setting up and operating the system using standard ATLAS job transforms.
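
    The record above describes controlling the launch/monitor/archive cycle from a client machine with Python scripts built on the boto library. The following is a minimal sketch of that pattern, not the ATLAS scripts themselves, written against the modern boto3 library; the AMI ID, key pair, bucket, and file names are hypothetical placeholders.

    import time
    import boto3

    ec2 = boto3.resource("ec2", region_name="us-east-1")
    s3 = boto3.client("s3", region_name="us-east-1")

    # Launch one instance of a prepared machine image (placeholder AMI ID).
    instances = ec2.create_instances(
        ImageId="ami-0123456789abcdef0",   # hypothetical, stands in for an SL4-style AMI
        InstanceType="m5.large",
        MinCount=1,
        MaxCount=1,
        KeyName="my-keypair",              # hypothetical key pair
    )
    instance = instances[0]
    instance.wait_until_running()
    instance.reload()
    print("worker running:", instance.id, instance.public_dns_name)

    # ... the job runs on the instance and writes its output locally ...

    # Export job output to S3 before terminating, as in the record above.
    s3.upload_file("job_output.tar.gz", "my-results-bucket", "run42/job_output.tar.gz")

    instance.terminate()
    instance.wait_until_terminated()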

  7. Bootstrapping and Maintaining Trust in the Cloud

    DTIC Science & Technology

    2016-12-01

    proliferation and popularity of infrastructure-as-a- service (IaaS) cloud computing services such as Amazon Web Services and Google Compute Engine means...IaaS trusted computing system: • Secure Bootstrapping – the system should enable the tenant to securely install an initial root secret into each cloud ...elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features, but none achieve all. Excalibur [31] sup

  8. Bootstrapping and Maintaining Trust in the Cloud

    DTIC Science & Technology

    2016-12-01

    simultaneous cloud nodes. 1. INTRODUCTION The proliferation and popularity of infrastructure-as-a- service (IaaS) cloud computing services such as...Amazon Web Services and Google Compute Engine means more cloud tenants are hosting sensitive, private, and business critical data and applications in the...thousands of IaaS resources as they are elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features

  9. Cloud computing for comparative genomics

    PubMed Central

    2010-01-01

    Background Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems. PMID:20482786

  10. Cloud computing for comparative genomics.

    PubMed

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
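
    The two records above describe farming several hundred thousand RSD processes onto 100 Elastic MapReduce nodes. A minimal sketch of submitting such a batch as a streaming job with boto3 follows; it is a modern stand-in for the 2010-era tooling, and the bucket names, scripts, and instance types are placeholders, not the authors' configuration.

    import boto3

    emr = boto3.client("emr", region_name="us-east-1")

    response = emr.run_job_flow(
        Name="rsd-like-ortholog-run",
        ReleaseLabel="emr-6.10.0",
        LogUri="s3://my-bucket/emr-logs/",
        Instances={
            "MasterInstanceType": "m5.xlarge",
            "SlaveInstanceType": "m5.xlarge",
            "InstanceCount": 100,                 # "100 high capacity compute nodes"
            "KeepJobFlowAliveWhenNoSteps": False,
        },
        Steps=[{
            "Name": "streaming-step",
            "ActionOnFailure": "TERMINATE_CLUSTER",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": [
                    "hadoop-streaming",
                    "-files", "s3://my-bucket/code/mapper.py",
                    "-mapper", "mapper.py",
                    "-reducer", "NONE",           # map-only: each mapper runs one comparison
                    "-input", "s3://my-bucket/genome-pairs/",
                    "-output", "s3://my-bucket/orthologs-out/",
                ],
            },
        }],
        JobFlowRole="EMR_EC2_DefaultRole",
        ServiceRole="EMR_DefaultRole",
    )
    print("cluster id:", response["JobFlowId"])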

  11. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    NASA Astrophysics Data System (ADS)

    Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.

    2011-12-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost-structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand" as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and conclude that it is most cost effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.

  12. Menu-driven cloud computing and resource sharing for R and Bioconductor.

    PubMed

    Bolouri, Hamid; Dulepet, Rajiv; Angerman, Michael

    2011-08-15

    We report CRdata.org, a cloud-based, free, open-source web server for running analyses and sharing data and R scripts with others. In addition to using the free, public service, CRdata users can launch their own private Amazon Elastic Computing Cloud (EC2) nodes and store private data and scripts on Amazon's Simple Storage Service (S3) with user-controlled access rights. All CRdata services are provided via point-and-click menus. CRdata is open-source and free under the permissive MIT License (opensource.org/licenses/mit-license.php). The source code is in Ruby (ruby-lang.org/en/) and available at: github.com/seerdata/crdata. hbolouri@fhcrc.org.
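
    The CRdata record above notes that private data and scripts are stored on Amazon S3 with user-controlled access rights. A minimal sketch of that idea with boto3 follows, assuming hypothetical bucket and key names: the object is kept private and shared selectively through a time-limited presigned link.

    import boto3

    s3 = boto3.client("s3")

    # Upload an R script as a private object (owner-only access).
    with open("analysis.R", "rb") as fh:
        s3.put_object(
            Bucket="my-crdata-like-bucket",
            Key="scripts/analysis.R",
            Body=fh,
            ACL="private",
        )

    # Share it selectively with a presigned URL that expires after one hour.
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "my-crdata-like-bucket", "Key": "scripts/analysis.R"},
        ExpiresIn=3600,
    )
    print(url)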

  13. Uncover the Cloud for Geospatial Sciences and Applications to Adopt Cloud Computing

    NASA Astrophysics Data System (ADS)

    Yang, C.; Huang, Q.; Xia, J.; Liu, K.; Li, J.; Xu, C.; Sun, M.; Bambacus, M.; Xu, Y.; Fay, D.

    2012-12-01

    Cloud computing is emerging as the future infrastructure for providing computing resources to support and enable scientific research, engineering development, and application construction, as well as workforce education. On the other hand, there is a lot of doubt about the readiness of cloud computing to support a variety of scientific research, development, and education. This research is a project funded by NASA SMD to investigate, through holistic studies, how ready cloud computing is to support geosciences. Four applications with different computing characteristics, including data, computing, concurrency, and spatiotemporal intensities, are used to test the readiness of cloud computing to support geosciences. Three popular and representative cloud platforms, Amazon EC2, Microsoft Azure, and NASA Nebula, as well as a traditional cluster, are utilized in the study. Results illustrate that the cloud is ready to some degree, but more research needs to be done to fully realize the cloud benefits as advertised by many vendors and defined by NIST. Specifically, 1) most cloud platforms can stand up new computing instances, in effect a new computer, in a few minutes as envisioned, and are therefore ready to support most computing needs in an on-demand fashion; 2) load balancing and elasticity, a defining characteristic, are ready in some cloud platforms, such as Amazon EC2, to support bigger jobs, e.g., those needing a response in minutes, while others are not ready to support elasticity and load balancing well, and all cloud platforms need further research and development to support real-time applications at the subminute level; 3) the user interface and functionality of cloud platforms vary a lot; some are very professional and well supported/documented, such as Amazon EC2, while others need significant improvement before the general public can adopt cloud computing without professional training or knowledge about computing infrastructure; 4) security is a big concern in cloud computing platforms; given the sharing spirit of cloud computing, it is very hard to ensure a higher level of security unless a private cloud is built for a specific organization without public access, and public cloud platforms do not yet support the FISMA medium level and may never be able to support the FISMA high level; 5) the HPC needs of cloud computing users are not well supported, and only Amazon EC2 supports them well. The research is being used by NASA and other agencies to consider cloud computing adoption. We hope the publication of this research will also help the public to adopt cloud computing.

  14. HEPCloud, a New Paradigm for HEP Facilities: CMS Amazon Web Services Investigation

    DOE PAGES

    Holzman, Burt; Bauerdick, Lothar A. T.; Bockelman, Brian; ...

    2017-09-29

    Historically, high energy physics computing has been performed on large purpose-built computing systems. These began as single-site compute facilities, but have evolved into the distributed computing grids used today. Recently, there has been an exponential increase in the capacity and capability of commercial clouds. Cloud resources are highly virtualized and intended to be able to be flexibly deployed for a variety of computing tasks. There is a growing interest among the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this paper, we discuss results from the CMS experiment using the Fermilab HEPCloud facility, which utilized both local Fermilab resources and virtual machines in the Amazon Web Services Elastic Compute Cloud. We discuss the planning, technical challenges, and lessons learned involved in performing physics workflows on a large-scale set of virtualized resources. Additionally, we will discuss the economics and operational efficiencies when executing workflows both in the cloud and on dedicated resources.

  15. Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud.

    PubMed

    Cianfrocco, Michael A; Leschziner, Andres E

    2015-05-08

    The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available 'off-the-shelf' computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16-480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM.

  16. HEPCloud, a New Paradigm for HEP Facilities: CMS Amazon Web Services Investigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holzman, Burt; Bauerdick, Lothar A. T.; Bockelman, Brian

    Historically, high energy physics computing has been performed on large purpose-built computing systems. These began as single-site compute facilities, but have evolved into the distributed computing grids used today. Recently, there has been an exponential increase in the capacity and capability of commercial clouds. Cloud resources are highly virtualized and intended to be able to be flexibly deployed for a variety of computing tasks. There is a growing interest among the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this paper, we discuss results from the CMS experiment using the Fermilab HEPCloud facility, which utilized both local Fermilab resources and virtual machines in the Amazon Web Services Elastic Compute Cloud. We discuss the planning, technical challenges, and lessons learned involved in performing physics workflows on a large-scale set of virtualized resources. Additionally, we will discuss the economics and operational efficiencies when executing workflows both in the cloud and on dedicated resources.

  17. Menu-driven cloud computing and resource sharing for R and Bioconductor

    PubMed Central

    Bolouri, Hamid; Angerman, Michael

    2011-01-01

    Summary: We report CRdata.org, a cloud-based, free, open-source web server for running analyses and sharing data and R scripts with others. In addition to using the free, public service, CRdata users can launch their own private Amazon Elastic Computing Cloud (EC2) nodes and store private data and scripts on Amazon's Simple Storage Service (S3) with user-controlled access rights. All CRdata services are provided via point-and-click menus. Availability and Implementation: CRdata is open-source and free under the permissive MIT License (opensource.org/licenses/mit-license.php). The source code is in Ruby (ruby-lang.org/en/) and available at: github.com/seerdata/crdata. Contact: hbolouri@fhcrc.org PMID:21685055

  18. Secure and Resilient Cloud Computing for the Department of Defense

    DTIC Science & Technology

    2015-11-16

    platform as a service (PaaS), and software as a service (SaaS)—that target system administrators, developers, and end-users respectively (see Table 2...interfaces (API) and services Medium Amazon Elastic MapReduce, MathWorks Cloud, Red Hat OpenShift SaaS Full-fledged applications Low Google gMail

  19. Using Python to generate AHPS-based precipitation simulations over CONUS using Amazon distributed computing

    NASA Astrophysics Data System (ADS)

    Machalek, P.; Kim, S. M.; Berry, R. D.; Liang, A.; Small, T.; Brevdo, E.; Kuznetsova, A.

    2012-12-01

    We describe how the Climate Corporation uses Python and Clojure, a language implemented on top of Java, to generate climatological forecasts for precipitation based on the Advanced Hydrologic Prediction Service (AHPS) radar-based daily precipitation measurements. A 2-year-long forecast is generated on each of the ~650,000 CONUS land-based 4-km AHPS grids by constructing 10,000 ensembles sampled from a 30-year reconstructed AHPS history for each grid. The spatial and temporal correlations between neighboring AHPS grids and the sampling of the analogues are handled by Python. The parallelization for all 650,000 CONUS stations is further achieved by utilizing the MapReduce framework (http://code.google.com/edu/parallel/mapreduce-tutorial.html). Each full-scale computational run requires hundreds of nodes with up to 8 processors each on the Amazon Elastic MapReduce (http://aws.amazon.com/elasticmapreduce/) distributed computing service, resulting in 3-terabyte datasets. We further describe how we have productionized a monthly run of the simulation process at the full scale of the 4-km AHPS grids and how the resultant terabyte-sized datasets are handled.
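
    As an illustration of the MapReduce parallelization over ~650,000 AHPS grids described in the record above, here is a minimal Hadoop-streaming-style sketch in Python (not the Climate Corporation code): records are keyed by grid cell and reduced to a per-grid mean. The CSV input format (grid_id,date,precip_mm) is a hypothetical example.

    import sys

    def mapper():
        # Key each record by grid cell so one grid's history lands on one reducer.
        for line in sys.stdin:
            grid_id, _date, precip_mm = line.strip().split(",")
            print(f"{grid_id}\t{precip_mm}")

    def reducer():
        # Streaming reducers receive key-sorted input; accumulate a mean per grid.
        current, total, count = None, 0.0, 0
        for line in sys.stdin:
            grid_id, precip_mm = line.strip().split("\t")
            if grid_id != current:
                if current is not None:
                    print(f"{current}\t{total / count:.2f}")
                current, total, count = grid_id, 0.0, 0
            total += float(precip_mm)
            count += 1
        if current is not None:
            print(f"{current}\t{total / count:.2f}")

    if __name__ == "__main__":
        # Invoke as "python grid_stats.py map" or "python grid_stats.py reduce"
        # from the streaming -mapper / -reducer arguments.
        mapper() if sys.argv[1] == "map" else reducer()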

  20. Automating NEURON Simulation Deployment in Cloud Resources.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-01-01

    Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Cloud Computing, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.

  1. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    PubMed

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
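
    The Roundup record above describes estimating each comparison's runtime from genome size and complexity and choosing the submission order in advance to minimize idle cluster time. A minimal sketch of that idea follows: jobs are ordered longest-first and assigned greedily to the least-loaded worker (a longest-processing-time heuristic). The runtime model here is a made-up placeholder, not the authors' model.

    import heapq

    def estimated_hours(genome_a_size, genome_b_size):
        # Hypothetical stand-in for the paper's size/complexity runtime model.
        return 1e-18 * genome_a_size * genome_b_size

    def schedule(jobs, n_workers):
        """jobs: list of (name, size_a, size_b). Returns per-worker assignments."""
        ordered = sorted(jobs, key=lambda j: estimated_hours(j[1], j[2]), reverse=True)
        workers = [(0.0, i, []) for i in range(n_workers)]   # (load_hours, id, assigned)
        heapq.heapify(workers)
        for name, a, b in ordered:
            load, wid, assigned = heapq.heappop(workers)
            assigned.append(name)
            heapq.heappush(workers, (load + estimated_hours(a, b), wid, assigned))
        return sorted(workers, key=lambda w: w[1])

    if __name__ == "__main__":
        demo = [("ecoli-vs-yeast", 4.6e6, 1.2e7),
                ("human-vs-mouse", 3.1e9, 2.7e9),
                ("fly-vs-worm", 1.4e8, 1.0e8)]
        for load, wid, assigned in schedule(demo, 2):
            print(f"worker {wid}: {assigned} (~{load:.2f} h)")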

  2. Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud

    PubMed Central

    Cianfrocco, Michael A; Leschziner, Andres E

    2015-01-01

    The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available ‘off-the-shelf’ computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16–480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM. DOI: http://dx.doi.org/10.7554/eLife.06664.001 PMID:25955969

  3. Globus | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Globus software services provide secure cancer research data transfer, synchronization, and sharing in distributed environments at large scale. These services can be integrated into applications and research data gateways, leveraging Globus identity management, single sign-on, search, and authorization capabilities. Globus Genomics integrates Globus with the Galaxy genomics workflow engine and Amazon Web Services to enable cancer genomics analysis that can elastically scale compute resources with demand.

  4. Experiences Building Globus Genomics: A Next-Generation Sequencing Analysis Service using Galaxy, Globus, and Amazon Web Services

    PubMed Central

    Madduri, Ravi K.; Sulakhe, Dinanath; Lacinski, Lukasz; Liu, Bo; Rodriguez, Alex; Chard, Kyle; Dave, Utpal J.; Foster, Ian T.

    2014-01-01

    We describe Globus Genomics, a system that we have developed for rapid analysis of large quantities of next-generation sequencing (NGS) genomic data. This system achieves a high degree of end-to-end automation that encompasses every stage of data analysis including initial data retrieval from remote sequencing centers or storage (via the Globus file transfer system); specification, configuration, and reuse of multi-step processing pipelines (via the Galaxy workflow system); creation of custom Amazon Machine Images and on-demand resource acquisition via a specialized elastic provisioner (on Amazon EC2); and efficient scheduling of these pipelines over many processors (via the HTCondor scheduler). The system allows biomedical researchers to perform rapid analysis of large NGS datasets in a fully automated manner, without software installation or a need for any local computing infrastructure. We report performance and cost results for some representative workloads. PMID:25342933

  5. Experiences Building Globus Genomics: A Next-Generation Sequencing Analysis Service using Galaxy, Globus, and Amazon Web Services.

    PubMed

    Madduri, Ravi K; Sulakhe, Dinanath; Lacinski, Lukasz; Liu, Bo; Rodriguez, Alex; Chard, Kyle; Dave, Utpal J; Foster, Ian T

    2014-09-10

    We describe Globus Genomics, a system that we have developed for rapid analysis of large quantities of next-generation sequencing (NGS) genomic data. This system achieves a high degree of end-to-end automation that encompasses every stage of data analysis including initial data retrieval from remote sequencing centers or storage (via the Globus file transfer system); specification, configuration, and reuse of multi-step processing pipelines (via the Galaxy workflow system); creation of custom Amazon Machine Images and on-demand resource acquisition via a specialized elastic provisioner (on Amazon EC2); and efficient scheduling of these pipelines over many processors (via the HTCondor scheduler). The system allows biomedical researchers to perform rapid analysis of large NGS datasets in a fully automated manner, without software installation or a need for any local computing infrastructure. We report performance and cost results for some representative workloads.

  6. Cloud Computing: An Overview

    NASA Astrophysics Data System (ADS)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with the minimum resources, Internet service providers invented cloud computing. Within a few years, the emerging cloud computing became the hottest technology. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to public services, from cost-saving tools to revenue generators, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing, as well as the value chain and standardization efforts.

  7. cloudPEST - A python module for cloud-computing deployment of PEST, a program for parameter estimation

    USGS Publications Warehouse

    Fienen, Michael N.; Kunicki, Thomas C.; Kester, Daniel E.

    2011-01-01

    This report documents cloudPEST, a Python module with functions to facilitate deployment of the model-independent parameter estimation code PEST in a cloud-computing environment. cloudPEST makes use of low-level, freely available command-line tools that interface with the Amazon Elastic Compute Cloud (EC2™) and that are unlikely to change dramatically. This report describes the preliminary setup for both Python and EC2 tools and subsequently describes the functions themselves. The code and guidelines have been tested primarily on the Windows® operating system but are extensible to Linux®.
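
    The cloudPEST record above describes Python functions that wrap command-line tools for EC2. A minimal sketch of that wrapping pattern follows; it is not the cloudPEST code, and it calls the modern "aws" CLI rather than the older EC2 API tools the report used. The AMI ID and counts are hypothetical placeholders.

    import json
    import subprocess

    def run(cmd):
        """Run a CLI command and return its parsed JSON output."""
        out = subprocess.run(cmd, check=True, capture_output=True, text=True)
        return json.loads(out.stdout)

    def launch_workers(ami_id, count, instance_type="m5.large"):
        result = run(["aws", "ec2", "run-instances",
                      "--image-id", ami_id,
                      "--count", str(count),
                      "--instance-type", instance_type,
                      "--output", "json"])
        return [i["InstanceId"] for i in result["Instances"]]

    def terminate_workers(instance_ids):
        run(["aws", "ec2", "terminate-instances",
             "--instance-ids", *instance_ids, "--output", "json"])

    if __name__ == "__main__":
        ids = launch_workers("ami-0123456789abcdef0", count=4)  # hypothetical AMI
        print("launched:", ids)
        terminate_workers(ids)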

  8. Exploiting parallel R in the cloud with SPRINT.

    PubMed

    Piotrowski, M; McGilvary, G A; Sloan, T M; Mewissen, M; Lloyd, A D; Forster, T; Mitchell, L; Ghazal, P; Hill, J

    2013-01-01

    Advances in DNA Microarray devices and next-generation massively parallel DNA sequencing platforms have led to an exponential growth in data availability but the arising opportunities require adequate computing resources. High Performance Computing (HPC) in the Cloud offers an affordable way of meeting this need. Bioconductor, a popular tool for high-throughput genomic data analysis, is distributed as add-on modules for the R statistical programming language but R has no native capabilities for exploiting multi-processor architectures. SPRINT is an R package that enables easy access to HPC for genomics researchers. This paper investigates: setting up and running SPRINT-enabled genomic analyses on Amazon's Elastic Compute Cloud (EC2), the advantages of submitting applications to EC2 from different parts of the world and, if resource underutilization can improve application performance. The SPRINT parallel implementations of correlation, permutation testing, partitioning around medoids and the multi-purpose papply have been benchmarked on data sets of various size on Amazon EC2. Jobs have been submitted from both the UK and Thailand to investigate monetary differences. It is possible to obtain good, scalable performance but the level of improvement is dependent upon the nature of the algorithm. Resource underutilization can further improve the time to result. The end-user's location affects costs due to factors such as local taxation. Although not designed to satisfy HPC requirements, Amazon EC2 and cloud computing in general provide an interesting alternative and new possibilities for smaller organisations with limited funds.

  9. Signal and image processing algorithm performance in a virtual and elastic computing environment

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly W.; Robertson, James

    2013-05-01

    The U.S. Army Research Laboratory (ARL) supports the development of classification, detection, tracking, and localization algorithms using multiple sensing modalities including acoustic, seismic, E-field, magnetic field, PIR, and visual and IR imaging. Multimodal sensors collect large amounts of data in support of algorithm development. The resulting large amount of data, and their associated high-performance computing needs, increases and challenges existing computing infrastructures. Purchasing computer power as a commodity using a Cloud service offers low-cost, pay-as-you-go pricing models, scalability, and elasticity that may provide solutions to develop and optimize algorithms without having to procure additional hardware and resources. This paper provides a detailed look at using a commercial cloud service provider, such as Amazon Web Services (AWS), to develop and deploy simple signal and image processing algorithms in a cloud and run the algorithms on a large set of data archived in the ARL Multimodal Signatures Database (MMSDB). Analytical results will provide performance comparisons with existing infrastructure. A discussion on using cloud computing with government data will discuss best security practices that exist within cloud services, such as AWS.

  10. Evaluating the Efficacy of the Cloud for Cluster Computation

    NASA Technical Reports Server (NTRS)

    Knight, David; Shams, Khawaja; Chang, George; Soderstrom, Tom

    2012-01-01

    Computing requirements vary by industry, and it follows that NASA and other research organizations have computing demands that fall outside the mainstream. While cloud computing made rapid inroads for tasks such as powering web applications, performance issues on highly distributed tasks hindered early adoption for scientific computation. One venture to address this problem is Nebula, NASA's homegrown cloud project tasked with delivering science-quality cloud computing resources. However, another industry development is Amazon's high-performance computing (HPC) instances on Elastic Cloud Compute (EC2) that promise improved performance for cluster computation. This paper presents results from a series of benchmarks run on Amazon EC2 and discusses the efficacy of current commercial cloud technology for running scientific applications across a cluster. In particular, a 240-core cluster of cloud instances achieved 2 TFLOPS on High-Performance Linpack (HPL) at 70% of theoretical computational performance. The cluster's local network also demonstrated sub-100 μs inter-process latency with sustained inter-node throughput in excess of 8 Gbps. Beyond HPL, a real-world Hadoop image processing task from NASA's Lunar Mapping and Modeling Project (LMMP) was run on a 29-instance cluster to process lunar and Martian surface images with sizes on the order of tens of gigapixels. These results demonstrate that while not a rival of dedicated supercomputing clusters, commercial cloud technology is now a feasible option for moderately demanding scientific workloads.

  11. Genotyping in the cloud with Crossbow.

    PubMed

    Gurtowski, James; Schatz, Michael C; Langmead, Ben

    2012-09-01

    Crossbow is a scalable, portable, and automatic cloud computing tool for identifying SNPs from high-coverage, short-read resequencing data. It is built on Apache Hadoop, an implementation of the MapReduce software framework. Hadoop allows Crossbow to distribute read alignment and SNP calling subtasks over a cluster of commodity computers. Two robust tools, Bowtie and SOAPsnp, implement the fundamental alignment and variant calling operations respectively, and have demonstrated capabilities within Crossbow of analyzing approximately one billion short reads per hour on a commodity Hadoop cluster with 320 cores. Through protocol examples, this unit will demonstrate the use of Crossbow for identifying variations in three different operating modes: on a Hadoop cluster, on a single computer, and on the Amazon Elastic MapReduce cloud computing service.

  12. Cloud-Coffee: implementation of a parallel consistency-based multiple alignment algorithm in the T-Coffee package and its benchmarking on the Amazon Elastic-Cloud.

    PubMed

    Di Tommaso, Paolo; Orobitg, Miquel; Guirado, Fernando; Cores, Fernado; Espinosa, Toni; Notredame, Cedric

    2010-08-01

    We present the first parallel implementation of the T-Coffee consistency-based multiple aligner. We benchmark it on the Amazon Elastic Cloud (EC2) and show that the parallelization procedure is reasonably effective. We also conclude that for a web server with moderate usage (10K hits/month) the cloud provides a cost-effective alternative to in-house deployment. T-Coffee is a freeware open source package available from http://www.tcoffee.org/homepage.html

  13. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    PubMed

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  14. Benchmarking Undedicated Cloud Computing Providers for Analysis of Genomic Datasets

    PubMed Central

    Yazar, Seyhan; Gooden, George E. C.; Mackey, David A.; Hewitt, Alex W.

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5–78.2) for E.coli and 53.5% (95% CI: 34.4–72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5–303.1) and 173.9% (95% CI: 134.6–213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE. PMID:25247298

  15. A high performance scientific cloud computing environment for materials simulations

    NASA Astrophysics Data System (ADS)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.

  16. Hybrid Pluggable Processing Pipeline (HyP3): A cloud-based infrastructure for generic processing of SAR data

    NASA Astrophysics Data System (ADS)

    Hogenson, K.; Arko, S. A.; Buechler, B.; Hogenson, R.; Herrmann, J.; Geiger, A.

    2016-12-01

    A problem often faced by Earth science researchers is how to scale algorithms that were developed against few datasets and take them to regional or global scales. One significant hurdle can be the processing and storage resources available for such a task, not to mention the administration of those resources. As a processing environment, the cloud offers nearly unlimited potential for compute and storage, with limited administration required. The goal of the Hybrid Pluggable Processing Pipeline (HyP3) project was to demonstrate the utility of the Amazon cloud to process large amounts of data quickly and cost effectively, while remaining generic enough to incorporate new algorithms with limited administration time or expense. Principally built by three undergraduate students at the ASF DAAC, the HyP3 system relies on core Amazon services such as Lambda, the Simple Notification Service (SNS), Relational Database Service (RDS), Elastic Compute Cloud (EC2), Simple Storage Service (S3), and Elastic Beanstalk. The HyP3 user interface was written using Elastic Beanstalk, and the system uses SNS and Lambda to handle creating, instantiating, executing, and terminating EC2 instances automatically. Data are sent to S3 for delivery to customers and removed using standard data lifecycle management rules. In HyP3 all data processing is ephemeral; there are no persistent processes taking compute and storage resources or generating added cost. When complete, HyP3 will leverage the automatic scaling up and down of EC2 compute power to respond to event-driven demand surges correlated with natural disasters or reprocessing efforts. Massive simultaneous processing within EC2 will be able to match the demand spike in ways conventional physical computing power never could, and then tail off, incurring no costs when not needed. This presentation will focus on the development techniques and technologies that were used in developing the HyP3 system. Data and process flow will be shown, highlighting the benefits of the cloud for each step. Finally, the steps for integrating a new processing algorithm will be demonstrated. This is the true power of HyP3: allowing people to upload their own algorithms and execute them at archive-level scales.
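
    The HyP3 record above describes using SNS and Lambda to create, execute, and terminate EC2 instances automatically, with all processing kept ephemeral. Below is a minimal sketch of that SNS-to-Lambda-to-EC2 pattern (not the ASF HyP3 code): an SNS message triggers the handler, which launches a short-lived EC2 worker that processes one granule and then shuts itself down. The AMI, bucket, message fields, and the process_granule command are hypothetical placeholders.

    import json
    import boto3

    ec2 = boto3.client("ec2")

    def handler(event, context):
        # SNS delivers the processing request as a JSON message body.
        request = json.loads(event["Records"][0]["Sns"]["Message"])
        granule = request["granule"]

        user_data = (
            "#!/bin/bash\n"
            # hypothetical processing command writing its product to S3
            f"process_granule {granule} s3://my-hyp3-like-bucket/products/\n"
            "shutdown -h now\n"                    # ephemeral: no persistent workers
        )
        ec2.run_instances(
            ImageId="ami-0123456789abcdef0",       # hypothetical processing AMI
            InstanceType="c5.xlarge",
            MinCount=1,
            MaxCount=1,
            UserData=user_data,
            InstanceInitiatedShutdownBehavior="terminate",
        )
        return {"status": "started", "granule": granule}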

  17. The HEPiX Virtualisation Working Group: Towards a Grid of Clouds

    NASA Astrophysics Data System (ADS)

    Cass, Tony

    2012-12-01

    The use of virtual machine images, as for example with Cloud services such as Amazon's Elastic Compute Cloud, is attractive for users as they have a guaranteed execution environment, something that cannot today be provided across sites participating in computing grids such as the Worldwide LHC Computing Grid. However, Grid sites often operate within computer security frameworks which preclude the use of remotely generated images. The HEPiX Virtualisation Working Group was setup with the objective to enable use of remotely generated virtual machine images at Grid sites and, to this end, has introduced the idea of trusted virtual machine images which are guaranteed to be secure and configurable by sites such that security policy commitments can be met. This paper describes the requirements and details of these trusted virtual machine images and presents a model for their use to facilitate the integration of Grid- and Cloud-based computing environments for High Energy Physics.

  18. Leveraging Cloud Technology to Provide a Responsive, Reliable and Scalable Backend for the Virtual Ice Sheet Laboratory Using the Ice Sheet System Model and Amazon's Elastic Compute Cloud

    NASA Astrophysics Data System (ADS)

    Perez, G. L.; Larour, E. Y.; Halkides, D. J.; Cheng, D. L. C.

    2015-12-01

    The Virtual Ice Sheet Laboratory (VISL) is a Cryosphere outreach effort by scientists at the Jet Propulsion Laboratory (JPL) in Pasadena, CA, Earth and Space Research (ESR) in Seattle, WA, and the University of California at Irvine (UCI), with the goal of providing interactive lessons for K-12 and college level students, while conforming to STEM guidelines. At the core of VISL is the Ice Sheet System Model (ISSM), an open-source project developed jointly at JPL and UCI whose main purpose is to model the evolution of the polar ice caps in Greenland and Antarctica. By using ISSM, VISL students have access to state-of-the-art modeling software that is being used to conduct scientific research by users all over the world. However, providing this functionality is by no means simple. The modeling of ice sheets in response to sea and atmospheric temperatures, among many other possible parameters, requires significant computational resources. Furthermore, this service needs to be responsive and capable of handling burst requests produced by classrooms of students. Cloud computing providers represent a burgeoning industry. With major investments by tech giants like Amazon, Google and Microsoft, it has never been easier or more affordable to deploy computational elements on-demand. This is exactly what VISL needs and ISSM is capable of. Moreover, this is a promising alternative to investing in expensive and rapidly devaluing hardware.

  19. Resources and costs for microbial sequence analysis evaluated using virtual machines and cloud computing.

    PubMed

    Angiuoli, Samuel V; White, James R; Matalka, Malcolm; White, Owen; Fricke, W Florian

    2011-01-01

    The widespread popularity of genomic applications is threatened by the "bioinformatics bottleneck" resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S rRNA amplicon sequencing, microbial single-genome and metagenomics WGS projects can achieve cost-efficient bioinformatics support using CloVR in combination with Amazon EC2 as an alternative to local computing centers.

  20. Resources and Costs for Microbial Sequence Analysis Evaluated Using Virtual Machines and Cloud Computing

    PubMed Central

    Angiuoli, Samuel V.; White, James R.; Matalka, Malcolm; White, Owen; Fricke, W. Florian

    2011-01-01

    Background The widespread popularity of genomic applications is threatened by the “bioinformatics bottleneck” resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. Results We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Conclusions Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S rRNA amplicon sequencing, microbial single-genome and metagenomics WGS projects can achieve cost-efficient bioinformatics support using CloVR in combination with Amazon EC2 as an alternative to local computing centers. PMID:22028928

  1. Cloud-based Predictive Modeling System and its Application to Asthma Readmission Prediction

    PubMed Central

    Chen, Robert; Su, Hang; Khalilia, Mohammed; Lin, Sizhe; Peng, Yue; Davis, Tod; Hirsh, Daniel A; Searles, Elizabeth; Tejedor-Sojo, Javier; Thompson, Michael; Sun, Jimeng

    2015-01-01

    The predictive modeling process is time consuming and requires clinical researchers to handle complex electronic health record (EHR) data in restricted computational environments. To address this problem, we implemented a cloud-based predictive modeling system via a hybrid setup combining a secure private server with the Amazon Web Services (AWS) Elastic MapReduce platform. EHR data is preprocessed on a private server and the resulting de-identified event sequences are hosted on AWS. Based on user-specified modeling configurations, an on-demand web service launches a cluster of Elastic Compute 2 (EC2) instances on AWS to perform feature selection and classification algorithms in a distributed fashion. Afterwards, the secure private server aggregates results and displays them via interactive visualization. We tested the system on a pediatric asthma readmission task on a de-identified EHR dataset of 2,967 patients. We conduct a larger scale experiment on the CMS Linkable 2008–2010 Medicare Data Entrepreneurs’ Synthetic Public Use File dataset of 2 million patients, which achieves over 25-fold speedup compared to sequential execution. PMID:26958172

  2. Hybrid Pluggable Processing Pipeline (HyP3): Programmatic Access to Cloud-Based Processing of SAR Data

    NASA Astrophysics Data System (ADS)

    Weeden, R.; Horn, W. B.; Dimarchi, H.; Arko, S. A.; Hogenson, K.

    2017-12-01

    A problem often faced by Earth science researchers is the question of how to scale algorithms that were developed against few datasets and take them to regional or global scales. This problem only gets worse as we look to a future with larger and larger datasets becoming available. One significant hurdle can be having the processing and storage resources available for such a task, not to mention the administration of those resources. As a processing environment, the cloud offers nearly unlimited potential for compute and storage, with limited administration required. The goal of the Hybrid Pluggable Processing Pipeline (HyP3) project was to demonstrate the utility of the Amazon cloud to process large amounts of data quickly and cost effectively. Principally built by three undergraduate students at the ASF DAAC, the HyP3 system relies on core Amazon cloud services such as Lambda, Relational Database Service (RDS), Elastic Compute Cloud (EC2), Simple Storage Service (S3), and Elastic Beanstalk. HyP3 provides an Application Programming Interface (API) through which users can programmatically interface with the HyP3 system, allowing them to monitor and control processing jobs running in HyP3, and retrieve the generated HyP3 products when completed. This presentation will focus on the development techniques and enabling technologies that were used in developing the HyP3 system. Data and process flow, from new subscription through to order completion, will be shown, highlighting the benefits of the cloud for each step. Because the HyP3 system can be accessed directly from a user's Python scripts, powerful applications leveraging SAR products can be put together fairly easily. This is the true power of HyP3: allowing people to programmatically leverage the power of the cloud.
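
    The record above notes that HyP3 can be driven directly from a user's Python scripts through its API. Below is a minimal sketch of that submit-poll-retrieve pattern using the requests library; the base URL, endpoints, field names, and api_key parameter are hypothetical illustrations, not the actual HyP3 API.

    import time
    import requests

    BASE = "https://hyp3-like-api.example.org"    # placeholder, not the real service
    API_KEY = "my-api-key"                         # placeholder credential

    # Submit a job, poll until it finishes, then print the product location.
    job = requests.post(f"{BASE}/jobs",
                        json={"granule": "S1A_EXAMPLE_GRANULE", "process": "rtc"},
                        params={"api_key": API_KEY}).json()

    while True:
        status = requests.get(f"{BASE}/jobs/{job['id']}",
                              params={"api_key": API_KEY}).json()
        if status["state"] in ("SUCCEEDED", "FAILED"):
            break
        time.sleep(30)

    print(status["state"], status.get("product_url"))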

  3. Exploiting Parallel R in the Cloud with SPRINT

    PubMed Central

    Piotrowski, M.; McGilvary, G.A.; Sloan, T. M.; Mewissen, M.; Lloyd, A.D.; Forster, T.; Mitchell, L.; Ghazal, P.; Hill, J.

    2012-01-01

    Background Advances in DNA Microarray devices and next-generation massively parallel DNA sequencing platforms have led to an exponential growth in data availability but the arising opportunities require adequate computing resources. High Performance Computing (HPC) in the Cloud offers an affordable way of meeting this need. Objectives Bioconductor, a popular tool for high-throughput genomic data analysis, is distributed as add-on modules for the R statistical programming language but R has no native capabilities for exploiting multi-processor architectures. SPRINT is an R package that enables easy access to HPC for genomics researchers. This paper investigates: setting up and running SPRINT-enabled genomic analyses on Amazon’s Elastic Compute Cloud (EC2), the advantages of submitting applications to EC2 from different parts of the world, and whether resource underutilization can improve application performance. Methods The SPRINT parallel implementations of correlation, permutation testing, partitioning around medoids and the multi-purpose papply have been benchmarked on data sets of various sizes on Amazon EC2. Jobs have been submitted from both the UK and Thailand to investigate monetary differences. Results It is possible to obtain good, scalable performance but the level of improvement is dependent upon the nature of the algorithm. Resource underutilization can further improve the time to result. The end-user’s location impacts costs due to factors such as local taxation. Conclusions Although not designed to satisfy HPC requirements, Amazon EC2 and cloud computing in general provide an interesting alternative and new possibilities for smaller organisations with limited funds. PMID:23223611

  4. Automating NEURON Simulation Deployment in Cloud Resources

    PubMed Central

    Santamaria, Fidel

    2016-01-01

    Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and the Amazon Elastic Compute Cloud, based on Amazon’s proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model. PMID:27655341

  5. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    DOE PAGES

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; ...

    2015-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research.

  6. A Cloud-Based Simulation Architecture for Pandemic Influenza Simulation

    PubMed Central

    Eriksson, Henrik; Raciti, Massimiliano; Basile, Maurizio; Cunsolo, Alessandro; Fröberg, Anders; Leifler, Ola; Ekberg, Joakim; Timpka, Toomas

    2011-01-01

    High-fidelity simulations of pandemic outbreaks are resource-consuming. Cluster-based solutions have been suggested for executing such complex computations. We present a cloud-based simulation architecture that utilizes computing resources both locally available and dynamically rented online. The approach uses the Condor framework for job distribution and management of the Amazon Elastic Compute Cloud (EC2) as well as local resources. The architecture has a web-based user interface that allows users to monitor and control simulation execution. In a benchmark test, the best cost-adjusted performance was recorded for the EC2 High-CPU Medium instance, while a field trial showed that the job configuration had significant influence on the execution time and that the network capacity of the master node could become a bottleneck. We conclude that it is possible to develop a scalable simulation environment that uses cloud-based solutions, while providing an easy-to-use graphical user interface. PMID:22195089

  7. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    PubMed Central

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; Rodriguez, Alex; Madduri, Ravi; Dave, Utpal; Lacinski, Lukasz; Foster, Ian; Gusev, Yuriy; Madhavan, Subha

    2014-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research. PMID:26925205

  8. GATE Monte Carlo simulation of dose distribution using MapReduce in a cloud computing environment.

    PubMed

    Liu, Yangchuan; Tang, Yuguo; Gao, Xin

    2017-12-01

    The GATE Monte Carlo simulation platform has good application prospects in treatment planning and quality assurance. However, accurate dose calculation using GATE is time-consuming. The purpose of this study is to implement a novel cloud computing method for accurate GATE Monte Carlo simulation of dose distribution using MapReduce. An Amazon Machine Image installed with Hadoop and GATE is created to set up Hadoop clusters on Amazon Elastic Compute Cloud (EC2). Macros, the input files for GATE, are split into a number of self-contained sub-macros. Through Hadoop Streaming, the sub-macros are executed by GATE in Map tasks and the sub-results are aggregated into final outputs in Reduce tasks. As an evaluation, GATE simulations were performed in a cubical water phantom for X-ray photons of 6 and 18 MeV. The parallel simulation on the cloud computing platform is as accurate as the single-threaded simulation on a local server. The cloud-based simulation time is approximately inversely proportional to the number of worker nodes. For the simulation of 10 million photons on a cluster with 64 worker nodes, time decreases of 41× and 32× were achieved compared to the single worker node case and the single-threaded case, respectively. The test of Hadoop's fault tolerance showed that the simulation correctness was not affected by the failure of some worker nodes. The results verify that the proposed method provides a feasible cloud computing solution for GATE.
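
    The mapper half of that Hadoop Streaming arrangement can be pictured with a short sketch. This is not the study's code: the GATE command-line invocation, the file naming convention and the single aggregation key are assumptions made for illustration. A companion reducer would read the emitted paths and sum the partial dose distributions into the final output.

```python
#!/usr/bin/env python
# Hadoop Streaming mapper sketch: each input line names one self-contained
# GATE sub-macro; run GATE on it and emit a key/value pair pointing at the
# partial result so a single reducer can aggregate all partial doses.
import subprocess
import sys

for line in sys.stdin:
    macro_path = line.strip()
    if not macro_path:
        continue
    # Run the GATE simulation for this sub-macro (command name is assumed).
    subprocess.check_call(["Gate", macro_path])
    result_path = macro_path.replace(".mac", "_dose.root")  # assumed layout
    print("dose\t%s" % result_path)
```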

  9. An Architecture for Cross-Cloud System Management

    NASA Astrophysics Data System (ADS)

    Dodda, Ravi Teja; Smith, Chris; van Moorsel, Aad

    The emergence of the cloud computing paradigm promises flexibility and adaptability through on-demand provisioning of compute resources. As the utilization of cloud resources extends beyond a single provider, for business as well as technical reasons, the issue of effectively managing such resources comes to the fore. Different providers expose different interfaces to their compute resources utilizing varied architectures and implementation technologies. This heterogeneity poses a significant system management problem, and can limit the extent to which the benefits of cross-cloud resource utilization can be realized. We address this problem through the definition of an architecture to facilitate the management of compute resources from different cloud providers in a homogeneous manner. This preserves the flexibility and adaptability promised by the cloud computing paradigm, whilst enabling the benefits of cross-cloud resource utilization to be realized. The practical efficacy of the architecture is demonstrated through an implementation utilizing compute resources managed through different interfaces on the Amazon Elastic Compute Cloud (EC2) service. Additionally, we provide empirical results highlighting the performance differential of these different interfaces, and discuss the impact of this performance differential on efficiency and profitability.
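
    A minimal sketch of the homogeneous-interface idea follows, assuming a simple launch/terminate/status contract; the class and method names are invented for illustration rather than taken from the paper's architecture. The EC2 adapter uses standard boto3 calls, and adapters for other providers would implement the same methods against their own APIs.

```python
# Sketch: one provider-neutral interface, per-provider adapters behind it.
from abc import ABC, abstractmethod

import boto3

class ComputeProvider(ABC):
    """Uniform operations a cross-cloud manager could rely on."""

    @abstractmethod
    def launch(self, image_id: str, instance_type: str) -> str:
        """Start an instance and return a provider-neutral handle."""

    @abstractmethod
    def terminate(self, handle: str) -> None:
        """Stop the instance identified by the handle."""

    @abstractmethod
    def status(self, handle: str) -> str:
        """Return a normalized state such as 'pending' or 'running'."""

class EC2Provider(ComputeProvider):
    """Adapter translating the uniform interface into EC2 API calls."""

    def __init__(self, region: str = "us-east-1"):
        self.ec2 = boto3.resource("ec2", region_name=region)

    def launch(self, image_id, instance_type):
        inst = self.ec2.create_instances(ImageId=image_id,
                                         InstanceType=instance_type,
                                         MinCount=1, MaxCount=1)[0]
        return inst.id

    def terminate(self, handle):
        self.ec2.Instance(handle).terminate()

    def status(self, handle):
        return self.ec2.Instance(handle).state["Name"]
```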

  10. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services

    PubMed Central

    Castaño-Díez, Daniel

    2017-01-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and longterm software maintenance. PMID:28580909

  11. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services.

    PubMed

    Castaño-Díez, Daniel

    2017-06-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and longterm software maintenance.

  12. MR-Tandem: parallel X!Tandem using Hadoop MapReduce on Amazon Web Services.

    PubMed

    Pratt, Brian; Howbert, J Jeffry; Tasman, Natalie I; Nilsson, Erik J

    2012-01-01

    MR-Tandem adapts the popular X!Tandem peptide search engine to work with Hadoop MapReduce for reliable parallel execution of large searches. MR-Tandem runs on any Hadoop cluster but offers special support for Amazon Web Services for creating inexpensive on-demand Hadoop clusters, enabling search volumes that might not otherwise be feasible with the compute resources a researcher has at hand. MR-Tandem is designed to drop in wherever X!Tandem is already in use and requires no modification to existing X!Tandem parameter files, and only minimal modification to X!Tandem-based workflows. MR-Tandem is implemented as a lightly modified X!Tandem C++ executable and a Python script that drives Hadoop clusters including Amazon Web Services (AWS) Elastic Map Reduce (EMR), using the modified X!Tandem program as a Hadoop Streaming mapper and reducer. The modified X!Tandem C++ source code is Artistic licensed, supports pluggable scoring, and is available as part of the Sashimi project at http://sashimi.svn.sourceforge.net/viewvc/sashimi/trunk/trans_proteomic_pipeline/extern/xtandem/. The MR-Tandem Python script is Apache licensed and available as part of the Insilicos Cloud Army project at http://ica.svn.sourceforge.net/viewvc/ica/trunk/mr-tandem/. Full documentation and a windows installer that configures MR-Tandem, Python and all necessary packages are available at this same URL. brian.pratt@insilicos.com

  13. Scaling predictive modeling in drug development with cloud computing.

    PubMed

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets with increased time for analysis is hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compare with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.

  14. BioPig: a Hadoop-based analytic toolkit for large-scale sequence data.

    PubMed

    Nordberg, Henrik; Bhatia, Karan; Wang, Kai; Wang, Zhong

    2013-12-01

    The recent revolution in sequencing technologies has led to an exponential growth of sequence data. As a result, most of the current bioinformatics tools become obsolete as they fail to scale with data. To tackle this 'data deluge', here we introduce the BioPig sequence analysis toolkit as one of the solutions that scale to data and computation. We built BioPig on Apache's Hadoop MapReduce system and the Pig data flow language. Compared with traditional serial and MPI-based algorithms, BioPig has three major advantages: first, BioPig's programmability greatly reduces development time for parallel bioinformatics applications; second, testing BioPig with up to 500 Gb sequences demonstrates that it scales automatically with the size of data; and finally, BioPig can be ported without modification on many Hadoop infrastructures, as tested with the Magellan system at the National Energy Research Scientific Computing Center and the Amazon Elastic Compute Cloud. In summary, BioPig represents a novel program framework with the potential to greatly accelerate data-intensive bioinformatics analysis.

  15. Distributed MRI reconstruction using Gadgetron-based cloud computing.

    PubMed

    Xue, Hui; Inati, Souheil; Sørensen, Thomas Sangild; Kellman, Peter; Hansen, Michael S

    2015-03-01

    To expand the open source Gadgetron reconstruction framework to support distributed computing and to demonstrate that a multinode version of the Gadgetron can be used to provide nonlinear reconstruction with clinically acceptable latency. The Gadgetron framework was extended with new software components that enable an arbitrary number of Gadgetron instances to collaborate on a reconstruction task. This cloud-enabled version of the Gadgetron was deployed on three different distributed computing platforms ranging from a heterogeneous collection of commodity computers to the commercial Amazon Elastic Compute Cloud. The Gadgetron cloud was used to provide nonlinear, compressed sensing reconstruction on a clinical scanner with low reconstruction latency (e.g., cardiac and neuroimaging applications). The proposed setup was able to handle acquisition and l1-SPIRiT reconstruction of nine high temporal resolution real-time, cardiac short axis cine acquisitions, covering the ventricles for functional evaluation, in under 1 min. A three-dimensional high-resolution brain acquisition with 1 mm³ isotropic pixel size was acquired and reconstructed with nonlinear reconstruction in less than 5 min. A distributed computing enabled Gadgetron provides a scalable way to improve reconstruction performance using commodity cluster computing. Nonlinear, compressed sensing reconstruction can be deployed clinically with low image reconstruction latency. © 2014 Wiley Periodicals, Inc.

  16. Cloudbus Toolkit for Market-Oriented Cloud Computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian

    This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.

  17. MR-Tandem: parallel X!Tandem using Hadoop MapReduce on Amazon Web Services

    PubMed Central

    Pratt, Brian; Howbert, J. Jeffry; Tasman, Natalie I.; Nilsson, Erik J.

    2012-01-01

    Summary: MR-Tandem adapts the popular X!Tandem peptide search engine to work with Hadoop MapReduce for reliable parallel execution of large searches. MR-Tandem runs on any Hadoop cluster but offers special support for Amazon Web Services for creating inexpensive on-demand Hadoop clusters, enabling search volumes that might not otherwise be feasible with the compute resources a researcher has at hand. MR-Tandem is designed to drop in wherever X!Tandem is already in use and requires no modification to existing X!Tandem parameter files, and only minimal modification to X!Tandem-based workflows. Availability and implementation: MR-Tandem is implemented as a lightly modified X!Tandem C++ executable and a Python script that drives Hadoop clusters including Amazon Web Services (AWS) Elastic Map Reduce (EMR), using the modified X!Tandem program as a Hadoop Streaming mapper and reducer. The modified X!Tandem C++ source code is Artistic licensed, supports pluggable scoring, and is available as part of the Sashimi project at http://sashimi.svn.sourceforge.net/viewvc/sashimi/trunk/trans_proteomic_pipeline/extern/xtandem/. The MR-Tandem Python script is Apache licensed and available as part of the Insilicos Cloud Army project at http://ica.svn.sourceforge.net/viewvc/ica/trunk/mr-tandem/. Full documentation and a windows installer that configures MR-Tandem, Python and all necessary packages are available at this same URL. Contact: brian.pratt@insilicos.com PMID:22072385

  18. A New User Interface for On-Demand Customizable Data Products for Sensors in a SensorWeb

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Sullivan, Don

    2011-01-01

    A SensorWeb is a set of sensors, which can consist of ground, airborne and space-based sensors interoperating in an automated or autonomous collaborative manner. The NASA SensorWeb toolbox, developed at NASA/GSFC in collaboration with NASA/JPL, NASA/Ames and other partners, is a set of software and standards that (1) enables users to create virtual private networks of sensors over open networks; (2) provides the capability to orchestrate their actions; (3) provides the capability to customize the output data products; and (4) enables automated delivery of the data products to the user's desktop. A recent addition to the SensorWeb Toolbox is a new user interface, together with web services co-resident with the sensors, to enable rapid creation, loading and execution of new algorithms for processing sensor data. The web service along with the user interface follows the Open Geospatial Consortium (OGC) standard called Web Coverage Processing Service (WCPS). This presentation will detail the prototype that was built and how the WCPS was tested against a HyspIRI flight testbed and an elastic computation cloud on the ground with EO-1 data. HyspIRI is a future NASA decadal mission. The elastic computation cloud stores EO-1 data and runs software similar to Amazon online shopping.

  19. The HEPCloud Facility: elastic computing for High Energy Physics - The NOvA Use Case

    NASA Astrophysics Data System (ADS)

    Fuess, S.; Garzoglio, G.; Holzman, B.; Kennedy, R.; Norman, A.; Timm, S.; Tiradani, A.

    2017-10-01

    The need for computing in the HEP community follows cycles of peaks and valleys mainly driven by conference dates, accelerator shutdowns, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high performance computers, and community and commercial Clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project has demonstrated the use of the “elastic” provisioning model offered by commercial clouds, such as Amazon Web Services. In this model, resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores from 150,000 cores - a 38 percent increase - in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the locally allocated resources and utilizing the local AWS S3 storage to optimize data handling operations and costs. NOvA was using the same familiar services used for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims at minimizing cost and job interruption. This paper describes the Fermilab HEPCloud Facility and the challenges overcome for the CMS and NOvA communities.
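
    The "rent only when needed" burst pattern described above can be illustrated with a minimal Spot-market request. This is a generic boto3 sketch, not HEPCloud's Decision Engine; the AMI, instance type, bid price and instance count are placeholders.

```python
# Hedged sketch: request EC2 Spot capacity for a burst, then release it.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

req = ec2.request_spot_instances(
    SpotPrice="0.10",                 # maximum hourly bid in USD (placeholder)
    InstanceCount=100,                # size of the burst (placeholder)
    LaunchSpecification={
        "ImageId": "ami-0123456789abcdef0",
        "InstanceType": "c5.xlarge",
    },
)
request_ids = [r["SpotInstanceRequestId"] for r in req["SpotInstanceRequests"]]

# ... run the burst workload, then cancel the requests (any instances still
# running would additionally need to be terminated).
ec2.cancel_spot_instance_requests(SpotInstanceRequestIds=request_ids)
```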

  20. Accelerating statistical image reconstruction algorithms for fan-beam x-ray CT using cloud computing

    NASA Astrophysics Data System (ADS)

    Srivastava, Somesh; Rao, A. Ravishankar; Sheinin, Vadim

    2011-03-01

    Statistical image reconstruction algorithms potentially offer many advantages to x-ray computed tomography (CT), e.g. lower radiation dose. But, their adoption in practical CT scanners requires extra computation power, which is traditionally provided by incorporating additional computing hardware (e.g. CPU-clusters, GPUs, FPGAs etc.) into a scanner. An alternative solution is to access the required computation power over the internet from a cloud computing service, which is orders-of-magnitude more cost-effective. This is because users only pay a small pay-as-you-go fee for the computation resources used (i.e. CPU time, storage etc.), and completely avoid purchase, maintenance and upgrade costs. In this paper, we investigate the benefits and shortcomings of using cloud computing for statistical image reconstruction. We parallelized the most time-consuming parts of our application, the forward and back projectors, using MapReduce, the standard parallelization library on clouds. From preliminary investigations, we found that a large speedup is possible at a very low cost. But, communication overheads inside MapReduce can limit the maximum speedup, and a better MapReduce implementation might become necessary in the future. All the experiments for this paper, including development and testing, were completed on the Amazon Elastic Compute Cloud (EC2) for less than $20.
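
    The map/reduce decomposition of the projectors can be pictured with a toy stand-in: each map task processes a subset of the image and the reduce step sums the partial results, which is exactly the structure MapReduce distributes across a cluster. The projector below is a trivial placeholder, not the statistical reconstruction code used in the study.

```python
# Toy illustration of map (partial forward projection) + reduce (summation).
from functools import reduce
from multiprocessing import Pool

import numpy as np

N_ANGLES, N_DET = 180, 256

def forward_project(voxel_chunk):
    """Map task: project one chunk of voxels into a partial sinogram.
    A trivial stand-in for a real fan-beam projector."""
    partial = np.zeros((N_ANGLES, N_DET))
    for x, y, value in voxel_chunk:
        partial[x % N_ANGLES, y % N_DET] += value
    return partial

def combine(a, b):
    """Reduce step: partial sinograms add linearly."""
    return a + b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    voxels = list(zip(rng.integers(0, 512, 10000).tolist(),
                      rng.integers(0, 512, 10000).tolist(),
                      rng.random(10000).tolist()))
    chunks = [voxels[k::8] for k in range(8)]        # 8 map tasks
    with Pool(8) as pool:
        sinogram = reduce(combine, pool.map(forward_project, chunks))
    print(sinogram.shape)
```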

  1. Rail-dbGaP: analyzing dbGaP-protected data in the cloud with Amazon Elastic MapReduce.

    PubMed

    Nellore, Abhinav; Wilks, Christopher; Hansen, Kasper D; Leek, Jeffrey T; Langmead, Ben

    2016-08-15

    Public archives contain thousands of trillions of bases of valuable sequencing data. More than 40% of the Sequence Read Archive is human data protected by provisions such as dbGaP. To analyse dbGaP-protected data, researchers must typically work with IT administrators and signing officials to ensure all levels of security are implemented at their institution. This is a major obstacle, impeding reproducibility and reducing the utility of archived data. We present a protocol and software tool for analyzing protected data in a commercial cloud. The protocol, Rail-dbGaP, is applicable to any tool running on Amazon Web Services Elastic MapReduce. The tool, Rail-RNA v0.2, is a spliced aligner for RNA-seq data, which we demonstrate by running on 9662 samples from the dbGaP-protected GTEx consortium dataset. The Rail-dbGaP protocol makes explicit for the first time the steps an investigator must take to develop Elastic MapReduce pipelines that analyse dbGaP-protected data in a manner compliant with NIH guidelines. Rail-RNA automates implementation of the protocol, making it easy for typical biomedical investigators to study protected RNA-seq data, regardless of their local IT resources or expertise. Rail-RNA is available from http://rail.bio. Technical details on the Rail-dbGaP protocol as well as an implementation walkthrough are available at https://github.com/nellore/rail-dbgap. Detailed instructions on running Rail-RNA on dbGaP-protected data using Amazon Web Services are available at http://docs.rail.bio/dbgap/. Contact: anellore@gmail.com or langmea@cs.jhu.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  2. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing.

    PubMed

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance; Messina, Thomas; Fan, Hongtao; Jaeger, Edward; Stephens, Susan

    2013-06-27

    Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log the running metrics in data processing and monitoring multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box. Rainbow is available for third-party implementation and use, and can be downloaded from http://s3.amazonaws.com/jnj_rainbow/index.html.
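
    Improvement (2) above, splitting large sequence files so that downstream alignment jobs are load-balanced, can be sketched in a few lines. This is a generic illustration, not Rainbow's implementation; the chunk size and file names are arbitrary, and 4-line FASTQ records are kept intact.

```python
# Sketch: split a large FASTQ file into fixed-size chunks for parallel jobs.
import itertools

def split_fastq(path, reads_per_chunk=1_000_000, prefix="chunk"):
    with open(path) as fin:
        for chunk_id in itertools.count():
            # Take the next block of complete 4-line records.
            records = list(itertools.islice(fin, 4 * reads_per_chunk))
            if not records:
                break
            with open(f"{prefix}_{chunk_id:04d}.fastq", "w") as fout:
                fout.writelines(records)

# Example: split_fastq("sample_R1.fastq", reads_per_chunk=500_000)
```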

  3. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing

    PubMed Central

    2013-01-01

    Background Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Results Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log the running metrics in data processing and monitoring multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Conclusions Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box. Rainbow is available for third-party implementation and use, and can be downloaded from http://s3.amazonaws.com/jnj_rainbow/index.html. PMID:23802613

  4. High Performance Computing (HPC) Innovation Service Portal Pilots Cloud Computing (HPC-ISP Pilot Cloud Computing)

    DTIC Science & Technology

    2011-08-01

    [Report excerpt; the extracted text is garbled and only figure captions are recoverable: Figure 4, an architectural diagram of running Blender on Amazon EC2 through Nimbis, and a figure on classification of streaming data showing example input images and digit prototypes (cluster centers) sized proportionally to frequency.]

  5. Inferring Large-Scale Terrestrial Water Storage Through GRACE and GPS Data Fusion in Cloud Computing Environments

    NASA Astrophysics Data System (ADS)

    Rude, C. M.; Li, J. D.; Gowanlock, M.; Herring, T.; Pankratius, V.

    2016-12-01

    Surface subsidence due to depletion of groundwater can lead to permanent compaction of aquifers and damaged infrastructure. However, studies of such effects on a large scale are challenging and compute intensive because they involve fusing a variety of data sets beyond direct measurements from groundwater wells, such as gravity change measurements from the Gravity Recovery and Climate Experiment (GRACE) or surface displacements measured by GPS receivers. Our work therefore leverages Amazon cloud computing to enable these types of analyses spanning the entire continental US. Changes in groundwater storage are inferred from surface displacements measured by GPS receivers stationed throughout the country. Receivers located on bedrock are anti-correlated with changes in water levels from elastic deformation due to loading, while stations on aquifers correlate with groundwater changes due to poroelastic expansion and compaction. Correlating linearly detrended equivalent water thickness measurements from GRACE with linearly detrended and Kalman filtered vertical displacements of GPS stations located throughout the United States helps compensate for the spatial and temporal limitations of GRACE. Our results show that the majority of GPS stations are negatively correlated with GRACE in a statistically relevant way, as most GPS stations are located on bedrock in order to provide stable reference locations and measure geophysical processes such as tectonic deformations. Additionally, stations located on the Central Valley California aquifer show statistically significant positive correlations. Through the identification of positive and negative correlations, deformation phenomena can be classified as loading or poroelastic expansion due to changes in groundwater. This method facilitates further studies of terrestrial water storage on a global scale. This work is supported by NASA AIST-NNX15AG84G (PI: V. Pankratius) and Amazon.
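
    The correlation step described above can be sketched as follows. This is a conceptual illustration only: the project's pipeline also Kalman-filters the GPS series, which is omitted here, and the synthetic series exist purely to exercise the function.

```python
# Sketch: linearly detrend co-located GRACE equivalent-water-thickness and
# GPS vertical-displacement series, then test their correlation.
import numpy as np
from scipy.signal import detrend
from scipy.stats import pearsonr

def grace_gps_correlation(ewt, gps_up):
    """ewt and gps_up are equally sampled, co-located monthly series."""
    r, p_value = pearsonr(detrend(ewt), detrend(gps_up))
    return r, p_value

# Toy example: a loading-driven (anti-correlated) synthetic station.
t = np.arange(120)
ewt = np.sin(2 * np.pi * t / 12) + 0.01 * t
gps = -0.8 * np.sin(2 * np.pi * t / 12) + 0.02 * t + 0.1 * np.random.randn(120)
print(grace_gps_correlation(ewt, gps))
```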

  6. A Cost-Benefit Study of Doing Astrophysics On The Cloud: Production of Image Mosaics

    NASA Astrophysics Data System (ADS)

    Berriman, G. B.; Good, J. C.; Deelman, E.; Singh, G.; Livny, M.

    2009-09-01

    Utility grids such as the Amazon EC2 and Amazon S3 clouds offer computational and storage resources that can be used on-demand for a fee by compute- and data-intensive applications. The cost of running an application on such a cloud depends on the compute, storage and communication resources it will provision and consume. Different execution plans of the same application may result in significantly different costs. We studied via simulation the cost performance trade-offs of different execution and resource provisioning plans by creating, under the Amazon cloud fee structure, mosaics with the Montage image mosaic engine, a widely used data- and compute-intensive application. Specifically, we studied the cost of building mosaics of 2MASS data that have sizes of 1, 2 and 4 square degrees, and a 2MASS all-sky mosaic. These are examples of mosaics commonly generated by astronomers. We also study these trade-offs in the context of the storage and communication fees of Amazon S3 when used for long-term application data archiving. Our results show that by provisioning the right amount of storage and compute resources cost can be significantly reduced with no significant impact on application performance.
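
    The kind of trade-off examined in that study can be illustrated with a toy cost model: the same mosaic built under two execution plans that consume different mixes of compute, storage and transfer. The rates and plan numbers below are placeholders, not Amazon's actual fee schedule or the paper's simulated results.

```python
# Toy pay-per-use cost model for comparing execution plans.
def run_cost(cpu_hours, gb_stored_month, gb_transferred_out,
             cpu_rate=0.10, storage_rate=0.10, egress_rate=0.10):
    """Dollar cost of one execution plan (all rates are placeholders)."""
    return (cpu_hours * cpu_rate
            + gb_stored_month * storage_rate
            + gb_transferred_out * egress_rate)

# Plan A: more instances, shorter runtime, little intermediate storage.
# Plan B: fewer instances, intermediate data kept around longer.
print(run_cost(cpu_hours=40, gb_stored_month=5, gb_transferred_out=8))
print(run_cost(cpu_hours=25, gb_stored_month=20, gb_transferred_out=8))
```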

  7. Cost Optimal Elastic Auto-Scaling in Cloud Infrastructure

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, S.; Sidhanta, S.; Ganguly, S.; Nemani, R. R.

    2014-12-01

    Today, elastic scaling is a critical part of leveraging the cloud. Elastic scaling refers to adding resources only when they are needed and deleting resources when they are not in use, ensuring that compute/server resources are not overprovisioned. Amazon and Windows Azure are currently the only two platform providers that allow auto-scaling of cloud resources, where servers are automatically added and deleted. However, these solutions fall short on two key features: (a) they require explicit policy definitions, such as thresholds on server load, and therefore lack any predictive intelligence to make optimal decisions; and (b) they do not decide on the right size of resource and thereby do not produce a cost-optimal resource pool. In a typical cloud deployment model, we consider two types of application scenario: (a) batch-processing jobs (the Hadoop/Big Data case) and (b) transactional applications (any application that processes continuous request/response transactions). With reference to the classical queueing model, we model a scenario in which servers have a price and a capacity (size) and the system can add or delete servers to maintain a certain queue length. Classical queueing models apply to scenarios where the number of servers is constant, so stationary-system analysis cannot be applied in this case. We investigate the following questions: (1) Can we define a job queue, and use its metrics, to predict the resource requirement in a quasi-stationary way, and can we map that into an optimal sizing problem? (2) Do we need server-level load information (CPU/data) to characterize the size requirement, and how do we learn that based on job type?
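
    A toy sketch of the quasi-stationary sizing question posed above: given an observed job arrival rate, pick the cheapest instance type and count whose aggregate service rate keeps utilization under a target. The instance catalogue, rates and prices are invented for illustration, and there is no predictive element here.

```python
# Toy sizing: cheapest (type, count) meeting a utilization target.
import math

# Hypothetical catalogue: type -> (jobs/hour one server can process, $/hour).
INSTANCE_TYPES = {"small": (10, 0.05), "medium": (22, 0.10), "large": (48, 0.20)}

def cheapest_plan(arrival_rate, target_utilization=0.7):
    best = None
    for name, (mu, price) in INSTANCE_TYPES.items():
        n = math.ceil(arrival_rate / (mu * target_utilization))
        cost = n * price
        if best is None or cost < best[2]:
            best = (name, n, cost)
    return best  # (instance type, server count, $/hour)

print(cheapest_plan(arrival_rate=300))  # e.g. 300 jobs/hour observed in the queue
```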

  8. Cloud Computing Technologies Facilitate Earth Research

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Under a Space Act Agreement, NASA partnered with Seattle-based Amazon Web Services to make the agency's climate and Earth science satellite data publicly available on the company's servers. Users can access the data for free, but they can also pay to use Amazon's computing services to analyze and visualize information using the same software available to NASA researchers.

  9. Sentinel-1 Archive and Processing in the Cloud using the Hybrid Pluggable Processing Pipeline (HyP3) at the ASF DAAC

    NASA Astrophysics Data System (ADS)

    Arko, S. A.; Hogenson, R.; Geiger, A.; Herrmann, J.; Buechler, B.; Hogenson, K.

    2016-12-01

    In the coming years there will be an unprecedented amount of SAR data available on a free and open basis to research and operational users around the globe. The Alaska Satellite Facility (ASF) DAAC hosts, through an international agreement, data from the Sentinel-1 spacecraft and will be hosting data from the upcoming NASA-ISRO SAR (NISAR) mission. To more effectively manage and exploit these vast datasets, ASF DAAC has begun moving portions of the archive to the cloud and utilizing cloud services to provide higher-level processing on the data. The Hybrid Pluggable Processing Pipeline (HyP3) project is designed to support higher-level data processing in the cloud and extend the capabilities of researchers to larger scales. Built upon a set of core Amazon cloud services, the HyP3 system allows users to request data processing using a number of canned algorithms or their own algorithms once they have been uploaded to the cloud. The HyP3 system automatically accesses the ASF cloud-based archive through the DAAC RESTful application programming interface and processes the data on Amazon's Elastic Compute Cloud (EC2). Final products are distributed through Amazon's Simple Storage Service (S3) and are available for user download. This presentation will provide an overview of ASF DAAC's activities moving the Sentinel-1 archive into the cloud and developing the integrated HyP3 system, covering both the benefits and difficulties of working in the cloud. Additionally, we will focus on the utilization of HyP3 for higher-level processing of SAR data. Two example algorithms, for sea-ice tracking and change detection, will be discussed as well as the mechanism for integrating new algorithms into the pipeline for community use.

  10. Low Cost, Scalable Proteomics Data Analysis Using Amazon's Cloud Computing Services and Open Source Search Algorithms

    PubMed Central

    Halligan, Brian D.; Geiger, Joey F.; Vallejos, Andrew K.; Greene, Andrew S.; Twigger, Simon N.

    2009-01-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step by step instructions on how to implement the virtual proteomics analysis clusters as well as a list of current available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center website (http://proteomics.mcw.edu/vipdac). PMID:19358578

  11. Low cost, scalable proteomics data analysis using Amazon's cloud computing services and open source search algorithms.

    PubMed

    Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N

    2009-06-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of current available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center Web site ( http://proteomics.mcw.edu/vipdac ).

  12. The HEPCloud Facility: elastic computing for High Energy Physics – The NOvA Use Case

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuess, S.; Garzoglio, G.; Holzman, B.

    The need for computing in the HEP community follows cycles of peaks and valleys mainly driven by conference dates, accelerator shutdowns, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high performance computers, and community and commercial Clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project has demonstrated the use of the “elastic” provisioning model offered by commercial clouds, such as Amazon Web Services. In this model, resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores from 150,000 cores - a 38 percent increase - in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the locally allocated resources and utilizing the local AWS S3 storage to optimize data handling operations and costs. NOvA was using the same familiar services used for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims at minimizing cost and job interruption. This paper describes the Fermilab HEPCloud Facility and the challenges overcome for the CMS and NOvA communities.

  13. Assessing the Amazon Cloud Suitability for CLARREO's Computational Needs

    NASA Technical Reports Server (NTRS)

    Goldin, Daniel; Vakhnin, Andrei A.; Currey, Jon C.

    2015-01-01

    In this document we compare the performance of the Amazon Web Services (AWS), also known as Amazon Cloud, with the CLARREO (Climate Absolute Radiance and Refractivity Observatory) cluster and assess its suitability for the computational needs of the CLARREO mission. A benchmark executable to process one month and one year of PARASOL (Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a Lidar) data was used. With the optimal AWS configuration, adequate data-processing times, comparable to the CLARREO cluster, were found. The assessment of alternatives to the CLARREO cluster continues and several options, such as a NASA-based cluster, are being considered.

  14. Reducing and Analyzing the PHAT Survey with the Cloud

    NASA Astrophysics Data System (ADS)

    Williams, Benjamin F.; Olsen, Knut; Khan, Rubab; Pirone, Daniel; Rosema, Keith

    2018-05-01

    We discuss the technical challenges we faced and the techniques we used to overcome them when reducing the Panchromatic Hubble Andromeda Treasury (PHAT) photometric data set on the Amazon Elastic Compute Cloud (EC2). We first describe the architecture of our photometry pipeline, which we found particularly efficient for reducing the data in multiple ways for different purposes. We then describe the features of EC2 that make this architecture both efficient to use and challenging to implement. We describe the techniques we adopted to process our data, and suggest ways these techniques may be improved for those interested in trying such reductions in the future. Finally, we summarize the output photometry data products, which are now hosted publicly in two places in two formats. They are available as simple FITS tables in the high-level science products on MAST, and in a queryable database available through the NOAO Data Lab.

  15. Performance Management of High Performance Computing for Medical Image Processing in Amazon Web Services.

    PubMed

    Bao, Shunxing; Damon, Stephen M; Landman, Bennett A; Gokhale, Aniruddha

    2016-02-27

    Adopting high performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provide reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck for not only lab computers, but also possibly some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configurations are two key challenges in this research. Using a simple unified control panel, users have the ability to set the numbers of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and memory storage. Intuitively, we configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances. Hence, S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all necessary packages to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe the theoretical cost/benefit formulae to decide between local serial execution versus cloud computing and apply this analysis to an empirical diffusion tensor imaging pipeline.
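
    The local-versus-cloud decision described above can be illustrated with a simplified cost/benefit comparison. These are not the authors' formulae; the node count, hourly rate, data volume and egress price are placeholder values chosen only to show the shape of the trade-off.

```python
# Toy comparison: serial local execution vs. parallel rented cloud nodes.
import math

def local_serial(n_jobs, hours_per_job):
    """Wall-clock hours and dollar cost of running all jobs serially on a
    local workstation (hardware assumed already paid for)."""
    return n_jobs * hours_per_job, 0.0

def cloud_parallel(n_jobs, hours_per_job, n_nodes,
                   node_rate=0.20, data_gb=50, egress_rate=0.09):
    """Same workload spread across n_nodes rented instances."""
    wall_clock = math.ceil(n_jobs / n_nodes) * hours_per_job
    dollars = n_nodes * wall_clock * node_rate + data_gb * egress_rate
    return wall_clock, dollars

print(local_serial(200, 1.5))        # (300.0, 0.0)
print(cloud_parallel(200, 1.5, 50))  # (6.0, 64.5)
```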

  16. Performance management of high performance computing for medical image processing in Amazon Web Services

    NASA Astrophysics Data System (ADS)

    Bao, Shunxing; Damon, Stephen M.; Landman, Bennett A.; Gokhale, Aniruddha

    2016-03-01

    Adopting high performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provide reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck for not only lab computers, but also possibly some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configurations are two key challenges in this research. Using a simple unified control panel, users have the ability to set the numbers of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and memory storage. Intuitively, we configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances. Hence, S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all necessary packages to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe the theoretical cost/benefit formulae to decide between local serial execution versus cloud computing and apply this analysis to an empirical diffusion tensor imaging pipeline.

  17. Performance Management of High Performance Computing for Medical Image Processing in Amazon Web Services

    PubMed Central

    Bao, Shunxing; Damon, Stephen M.; Landman, Bennett A.; Gokhale, Aniruddha

    2016-01-01

    Adopting high performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provide reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical-Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck for not only lab computers, but also possibly some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configurations are two key challenges in this research. Using a simple unified control panel, users have the ability to set the numbers of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and memory storage. Intuitively, we configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances. Hence, S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all necessary packages to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe the theoretical cost/benefit formulae to decide between local serial execution versus cloud computing and apply this analysis to an empirical diffusion tensor imaging pipeline. PMID:27127335

  18. Concentration of Access to Information and Communication Technologies in the Municipalities of the Brazilian Legal Amazon.

    PubMed

    de Brito, Silvana Rossy; da Silva, Aleksandra do Socorro; Cruz, Adejard Gaia; Monteiro, Maurílio de Abreu; Vijaykumar, Nandamudi Lankalapalli; da Silva, Marcelino Silva; Costa, João Crisóstomo Weyl Albuquerque; Francês, Carlos Renato Lisboa

    2016-01-01

    This study fills demand for data on access and use of information and communication technologies (ICT) in the Brazilian legal Amazon, a region of localities with identical economic, political, and social problems. We use the 2010 Brazilian Demographic Census to compile data on urban and rural households (i) with computers and Internet access, (ii) with mobile phones, and (iii) with fixed phones. To compare the concentration of access to ICT in the municipalities of the Brazilian Amazon with other regions of Brazil, we use a concentration index to quantify the concentration of households in the following classes: with computers and Internet access, with mobile phones, with fixed phones, and no access. These data are analyzed along with municipal indicators on income, education, electricity, and population size. The results show that for urban households, the average concentration in the municipalities of the Amazon for computers and Internet access and for fixed phones is lower than in other regions of the country; meanwhile, that for no access and mobile phones is higher than in any other region. For rural households, the average concentration in the municipalities of the Amazon for computers and Internet access, mobile phones, and fixed phones is lower than in any other region of the country; meanwhile, that for no access is higher than in any other region. In addition, the study shows that education and income are determinants of inequality in accessing ICT in Brazilian municipalities and that the existence of electricity in rural households is directly associated with the ownership of ICT resources.

  19. Concentration of Access to Information and Communication Technologies in the Municipalities of the Brazilian Legal Amazon

    PubMed Central

    de Brito, Silvana Rossy; da Silva, Aleksandra do Socorro; Cruz, Adejard Gaia; Monteiro, Maurílio de Abreu; Vijaykumar, Nandamudi Lankalapalli; da Silva, Marcelino Silva; Costa, João Crisóstomo Weyl Albuquerque; Francês, Carlos Renato Lisboa

    2016-01-01

    This study fills demand for data on access and use of information and communication technologies (ICT) in the Brazilian legal Amazon, a region of localities with identical economic, political, and social problems. We use the 2010 Brazilian Demographic Census to compile data on urban and rural households (i) with computers and Internet access, (ii) with mobile phones, and (iii) with fixed phones. To compare the concentration of access to ICT in the municipalities of the Brazilian Amazon with other regions of Brazil, we use a concentration index to quantify the concentration of households in the following classes: with computers and Internet access, with mobile phones, with fixed phones, and no access. These data are analyzed along with municipal indicators on income, education, electricity, and population size. The results show that for urban households, the average concentration in the municipalities of the Amazon for computers and Internet access and for fixed phones is lower than in other regions of the country; meanwhile, that for no access and mobile phones is higher than in any other region. For rural households, the average concentration in the municipalities of the Amazon for computers and Internet access, mobile phones, and fixed phones is lower than in any other region of the country; meanwhile, that for no access is higher than in any other region. In addition, the study shows that education and income are determinants of inequality in accessing ICT in Brazilian municipalities and that the existence of electricity in rural households is directly associated with the ownership of ICT resources. PMID:27035577

  20. Leveraging the Cloud for Robust and Efficient Lunar Image Processing

    NASA Technical Reports Server (NTRS)

    Chang, George; Malhotra, Shan; Wolgast, Paul

    2011-01-01

    The Lunar Mapping and Modeling Project (LMMP) is tasked to aggregate lunar data, from the Apollo era to the latest instruments on the LRO spacecraft, into a central repository accessible by scientists and the general public. A critical function of this task is to provide users with the best solution for browsing the vast amounts of imagery available. The image files LMMP manages range from a few gigabytes to hundreds of gigabytes in size with new data arriving every day. Despite this ever-increasing amount of data, LMMP must make the data readily available in a timely manner for users to view and analyze. This is accomplished by tiling large images into smaller images using Hadoop, a distributed computing software platform implementation of the MapReduce framework, running on a small cluster of machines locally. Additionally, the software is implemented to use Amazon's Elastic Compute Cloud (EC2) facility. We also developed a hybrid solution to serve images to users by leveraging cloud storage using Amazon's Simple Storage Service (S3) for public data while keeping private information on our own data servers. By using Cloud Computing, we improve upon our local solution by reducing the need to manage our own hardware and computing infrastructure, thereby reducing costs. Further, by using a hybrid of local and cloud storage, we are able to provide data to our users more efficiently and securely. This paper examines the use of a distributed approach with Hadoop to tile images, an approach that provides significant improvements in image processing time, from hours to minutes. This paper describes the constraints imposed on the solution and the resulting techniques developed for the hybrid solution of a customized Hadoop infrastructure over local and cloud resources in managing this ever-growing data set. It examines the performance trade-offs of using the more plentiful resources of the cloud, such as those provided by S3, against the bandwidth limitations such use encounters with remote resources. As part of this discussion this paper will outline some of the technologies employed, the reasons for their selection, the resulting performance metrics and the direction the project is headed based upon the demonstrated capabilities thus far.
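
    The hybrid public/private storage split can be illustrated with a short, hedged sketch; the bucket name, local path, and tile-routing rule below are hypothetical, and the boto3 calls are simply the standard S3 upload operations, not code from the LMMP system:

      # Hedged sketch of a hybrid storage layout: publicly releasable tiles go to
      # Amazon S3, private tiles stay on local storage. Names/paths are hypothetical.
      import os
      import boto3

      s3 = boto3.client("s3")
      PUBLIC_BUCKET = "lmmp-public-tiles-example"   # hypothetical bucket name
      LOCAL_PRIVATE_DIR = "/data/lmmp/private"      # hypothetical local path

      def publish_tile(tile_path, is_public):
          """Send public tiles to S3; keep restricted tiles on the local server."""
          name = os.path.basename(tile_path)
          if is_public:
              s3.upload_file(tile_path, PUBLIC_BUCKET, f"tiles/{name}")
              return f"s3://{PUBLIC_BUCKET}/tiles/{name}"
          dest = os.path.join(LOCAL_PRIVATE_DIR, name)
          os.replace(tile_path, dest)
          return dest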

  1. SCIMITAR: Scalable Stream-Processing for Sensor Information Brokering

    DTIC Science & Technology

    2013-11-01

    IaaS) cloud frameworks including Amazon Web Services and Eucalyptus . For load testing, we used The Grinder [9], a Java load testing framework that...internal Eucalyptus cluster which we could not scale as large as the Amazon environment due to a lack of computation resources. We recreated our

  2. Challenges and opportunities in understanding microbial communities with metagenome assembly (accompanied by IPython Notebook tutorial)

    DOE PAGES

    Howe, Adina; Chain, Patrick S. G.

    2015-07-09

    Metagenomic investigations hold great promise for informing the genetics, physiology, and ecology of environmental microorganisms. Current challenges for metagenomic analysis are related to our ability to connect the dots between sequencing reads, their population of origin, and their encoding functions. Assembly-based methods reduce dataset size by extending overlapping reads into larger contiguous sequences (contigs), providing contextual information for genetic sequences that does not rely on existing references. These methods, however, tend to be computationally intensive and are again challenged by sequencing errors as well as by genomic repeats. While numerous tools have been developed based on these methodological concepts, they present confounding choices and training requirements to metagenomic investigators. To help with accessibility to assembly tools, this review also includes an IPython Notebook metagenomic assembly tutorial. This tutorial has instructions for execution on any operating system using Amazon Elastic Compute Cloud and guides users through downloading, assembly, and mapping reads to contigs of a mock microbiome metagenome. Despite its challenges, metagenomic analysis has already revealed novel insights into many environments on Earth. As software, training, and data continue to emerge, metagenomic data access and its discoveries will continue to grow.
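
    For readers who want to reproduce a similar setup, a minimal sketch of launching a single Amazon EC2 instance with boto3 is shown below; the AMI ID, key pair, and instance type are placeholders, and this is not the tutorial's own code:

      # Minimal, hedged sketch of launching one EC2 instance for an assembly
      # exercise. The AMI ID, key pair, and instance type are placeholders.
      import boto3

      ec2 = boto3.resource("ec2", region_name="us-east-1")

      instances = ec2.create_instances(
          ImageId="ami-0123456789abcdef0",   # placeholder AMI with assembly tools installed
          InstanceType="m5.2xlarge",         # example memory-heavy type for assembly
          KeyName="my-keypair",              # placeholder key pair name
          MinCount=1,
          MaxCount=1,
      )
      instance = instances[0]
      instance.wait_until_running()
      instance.reload()
      # The login user name depends on the AMI; "ubuntu" is only an example.
      print("Connect with: ssh -i my-keypair.pem ubuntu@" + instance.public_ip_address)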

  3. Challenges and opportunities in understanding microbial communities with metagenome assembly (accompanied by IPython Notebook tutorial)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howe, Adina; Chain, Patrick S. G.

    Metagenomic investigations hold great promise for informing the genetics, physiology, and ecology of environmental microorganisms. Current challenges for metagenomic analysis are related to our ability to connect the dots between sequencing reads, their population of origin, and their encoding functions. Assembly-based methods reduce dataset size by extending overlapping reads into larger contiguous sequences (contigs), providing contextual information for genetic sequences that does not rely on existing references. These methods, however, tend to be computationally intensive and are again challenged by sequencing errors as well as by genomic repeats. While numerous tools have been developed based on these methodological concepts, they present confounding choices and training requirements to metagenomic investigators. To help with accessibility to assembly tools, this review also includes an IPython Notebook metagenomic assembly tutorial. This tutorial has instructions for execution on any operating system using Amazon Elastic Compute Cloud and guides users through downloading, assembly, and mapping reads to contigs of a mock microbiome metagenome. Despite its challenges, metagenomic analysis has already revealed novel insights into many environments on Earth. As software, training, and data continue to emerge, metagenomic data access and its discoveries will continue to grow.

  4. The Virtual Climate Data Server (vCDS): An iRODS-Based Data Management Software Appliance Supporting Climate Data Services and Virtualization-as-a-Service in the NASA Center for Climate Simulation

    NASA Technical Reports Server (NTRS)

    Schnase, John L.; Tamkin, Glenn S.; Ripley, W. David III; Stong, Savannah; Gill, Roger; Duffy, Daniel Q.

    2012-01-01

    Scientific data services are becoming an important part of the NASA Center for Climate Simulation's mission. Our technological response to this expanding role is built around the concept of a Virtual Climate Data Server (vCDS), repetitive provisioning, image-based deployment and distribution, and virtualization-as-a-service. The vCDS is an iRODS-based data server specialized to the needs of a particular data-centric application. We use RPM scripts to build vCDS images in our local computing environment, our local Virtual Machine Environment, NASA's Nebula Cloud Services, and Amazon's Elastic Compute Cloud. Once provisioned into one or more of these virtualized resource classes, vCDSs can use iRODS's federation capabilities to create an integrated ecosystem of managed collections that is scalable and adaptable to changing resource requirements. This approach enables platform- or software-as-a-service deployment of vCDS and allows the NCCS to offer virtualization-as-a-service: a capacity to respond in an agile way to new customer requests for data services.

  5. Towards a Multi-Mission, Airborne Science Data System Environment

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Hardman, S.; Law, E.; Freeborn, D.; Kay-Im, E.; Lau, G.; Oswald, J.

    2011-12-01

    NASA earth science instruments are increasingly relying on airborne missions. However, traditionally, there has been limited common infrastructure support available to principal investigators in the area of science data systems. As a result, each investigator has been required to develop their own computing infrastructures for the science data system. Typically there is little software reuse and many projects lack sufficient resources to provide a robust infrastructure to capture, process, distribute and archive the observations acquired from airborne flights. At NASA's Jet Propulsion Laboratory (JPL), we have been developing a multi-mission data system infrastructure for airborne instruments called the Airborne Cloud Computing Environment (ACCE). ACCE encompasses the end-to-end lifecycle covering planning, provisioning of data system capabilities, and support for scientific analysis in order to improve the quality, cost effectiveness, and capabilities to enable new scientific discovery and research in earth observation. This includes improving data system interoperability across each instrument. A principal characteristic is being able to provide an agile infrastructure that is architected to allow for a variety of configurations of the infrastructure from locally installed compute and storage services to provisioning those services via the "cloud" from cloud computer vendors such as Amazon.com. Investigators often have different needs that require a flexible configuration. The data system infrastructure is built on the Apache's Object Oriented Data Technology (OODT) suite of components which has been used for a number of spaceborne missions and provides a rich set of open source software components and services for constructing science processing and data management systems. In 2010, a partnership was formed between the ACCE team and the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to support the data processing and data management needs. A principal goal is to provide support for the Fourier Transform Spectrometer (FTS) instrument which will produce over 700,000 soundings over the life of their three-year mission. The cost to purchase and operate a cluster-based system in order to generate Level 2 Full Physics products from this data was prohibitive. Through an evaluation of cloud computing solutions, Amazon's Elastic Compute Cloud (EC2) was selected for the CARVE deployment. As the ACCE infrastructure is developed and extended to form an infrastructure for airborne missions, the experience of working with CARVE has provided a number of lessons learned and has proven to be important in reinforcing the unique aspects of airborne missions and the importance of the ACCE infrastructure in developing a cost effective, flexible multi-mission capability that leverages emerging capabilities in cloud computing, workflow management, and distributed computing.

  6. Parallel implementation of D-Phylo algorithm for maximum likelihood clusters.

    PubMed

    Malik, Shamita; Sharma, Dolly; Khatri, Sunil Kumar

    2017-03-01

    This study explains a newly developed parallel algorithm for phylogenetic analysis of DNA sequences. The newly designed D-Phylo is a more advanced algorithm for phylogenetic analysis using the maximum likelihood approach. D-Phylo exploits the search capacity of k-means while avoiding its main limitation of getting stuck at locally conserved motifs. The authors have tested the behaviour of D-Phylo on an Amazon Linux Amazon Machine Image (Hardware Virtual Machine) i2.4xlarge instance (six central processing units, 122 GiB memory, 8 × 800 solid-state drive Elastic Block Store volumes, high network performance) with up to 15 processors for several real-life datasets. Distributing the clusters evenly across all the processors yields a near-linear speedup when a large number of processors is used.
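
    The general idea of spreading per-cluster work evenly across processors can be illustrated with a generic sketch using Python's multiprocessing; this is not D-Phylo's implementation, and the scoring function below is a stand-in workload:

      # Generic illustration (not D-Phylo's code) of distributing per-cluster
      # computations across processors with a process pool.
      from multiprocessing import Pool

      def score_cluster(cluster):
          """Placeholder for a per-cluster maximum-likelihood computation."""
          return sum(hash(seq) % 97 for seq in cluster)   # stand-in workload

      if __name__ == "__main__":
          clusters = [["ACGT", "ACGA"], ["TTGA", "TTGC"], ["GGCA", "GGCT"]]
          with Pool(processes=3) as pool:                  # one worker per processor
              scores = pool.map(score_cluster, clusters)   # clusters spread over workers
          print(scores)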

  7. GATE Monte Carlo simulation in a cloud computing environment

    NASA Astrophysics Data System (ADS)

    Rowedder, Blake Austin

    The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform, which provides a single code library allowing the simulation of specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be significantly reduced to clinically feasible levels without the sizable investment of a local high performance cluster. This study investigated a reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data was initially broken up on the local computer, then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by the cluster size and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53-minute-long simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high-performance computing continuing to drop in price and grow in accessibility, implementing Monte Carlo techniques with cloud computing for clinical applications will continue to become more attractive.
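
    An inverse power relationship between runtime and cluster size can be fitted as sketched below; only the 1-node and 20-node values echo the figures quoted above, and the intermediate points are invented for illustration:

      # Hedged sketch of fitting an inverse power model, t(n) = a * n**(-b), to
      # runtime measurements versus cluster size.
      import numpy as np
      from scipy.optimize import curve_fit

      def inverse_power(n, a, b):
          return a * n ** (-b)

      nodes = np.array([1, 2, 5, 10, 20])
      # 1- and 20-node values taken from the abstract; the others are made up.
      runtime_min = np.array([53.0, 27.5, 11.8, 6.2, 3.11])

      (a, b), _ = curve_fit(inverse_power, nodes, runtime_min)
      print(f"t(n) ~ {a:.1f} * n^(-{b:.2f})")
      print("predicted 20-node runtime:", inverse_power(20, a, b), "min")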

  8. A Visual Database System for Image Analysis on Parallel Computers and its Application to the EOS Amazon Project

    NASA Technical Reports Server (NTRS)

    Shapiro, Linda G.; Tanimoto, Steven L.; Ahrens, James P.

    1996-01-01

    The goal of this task was to create a design and prototype implementation of a database environment that is particularly suited for handling the image, vision and scientific data associated with NASA's EOS Amazon project. The focus was on a data model and query facilities that are designed to execute efficiently on parallel computers. A key feature of the environment is an interface which allows a scientist to specify high-level directives about how query execution should occur.

  9. NGScloud: RNA-seq analysis of non-model species using cloud computing.

    PubMed

    Mora-Márquez, Fernando; Vázquez-Poletti, José Luis; López de Heredia, Unai

    2018-05-03

    RNA-seq analysis usually requires large computing infrastructures. NGScloud is a bioinformatic system developed to analyze RNA-seq data using the cloud computing services of Amazon that permit the access to ad hoc computing infrastructure scaled according to the complexity of the experiment, so its costs and times can be optimized. The application provides a user-friendly front-end to operate Amazon's hardware resources, and to control a workflow of RNA-seq analysis oriented to non-model species, incorporating the cluster concept, which allows parallel runs of common RNA-seq analysis programs in several virtual machines for faster analysis. NGScloud is freely available at https://github.com/GGFHF/NGScloud/. A manual detailing installation and how-to-use instructions is available with the distribution. unai.lopezdeheredia@upm.es.

  10. Running climate model on a commercial cloud computing environment: A case study using Community Earth System Model (CESM) on Amazon AWS

    NASA Astrophysics Data System (ADS)

    Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock

    2017-01-01

    The suites of numerical models used for simulating the climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e. carrying out climate model simulations on a commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Services (AWS) EC2, the cloud computing environment by Amazon.com, Inc. StarCluster is used to create a virtual computing cluster on AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup becomes nearly unchanged.
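
    The scaling behaviour described above is usually summarized as speedup and parallel efficiency; the short sketch below shows the arithmetic with invented wall-clock times, not the paper's measurements:

      # Hedged sketch: speedup and parallel efficiency from wall-clock times at
      # two core counts. The timings here are illustrative only.
      def speedup(t_base, t_n, cores_base, cores_n):
          s = t_base / t_n                      # how much faster the larger run is
          efficiency = s / (cores_n / cores_base)  # 1.0 would be perfect linear scaling
          return s, efficiency

      t16, t64 = 10.0, 4.6   # hours of wall-clock per simulated year (made up)
      s, eff = speedup(t16, t64, 16, 64)
      print(f"16 -> 64 cores: speedup {s:.2f}x, efficiency {eff:.2f}")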

  11. Redundancy and Replication Help Make Your Systems Stress-Free

    ERIC Educational Resources Information Center

    Mitchell, Erik

    2011-01-01

    In mid-April, Amazon EC2 services had a small problem. Apparently, a large swath of its cloud computing environment had such substantial trouble that a number of customers had server issues. A number of high-profile sites, including Reddit, Evite, and Foursquare, went down when Amazon experienced issues in their US East 1a region (Justinb 2011).…

  12. Large-Scale Image Analytics Using Deep Learning

    NASA Astrophysics Data System (ADS)

    Ganguly, S.; Nemani, R. R.; Basu, S.; Mukhopadhyay, S.; Michaelis, A.; Votava, P.

    2014-12-01

    High resolution land cover classification maps are needed to increase the accuracy of current land ecosystem and climate model outputs. Limited studies are in place that demonstrate the state-of-the-art in deriving very high resolution (VHR) land cover products. In addition, most methods heavily rely on commercial software that is difficult to scale given the region of study (e.g. continents to globe). Complexities in present approaches relate to (a) scalability of the algorithm, (b) large image data processing (compute and memory intensive), (c) computational cost, (d) massively parallel architecture, and (e) machine learning automation. In addition, VHR satellite datasets are of the order of terabytes and features extracted from these datasets are of the order of petabytes. In our present study, we have acquired the National Agricultural Imaging Program (NAIP) dataset for the Continental United States at a spatial resolution of 1-m. This data comes as image tiles (a total of a quarter million image scenes with ~60 million pixels) and has a total size of ~100 terabytes for a single acquisition. Features extracted from the entire dataset would amount to ~8-10 petabytes. In our proposed approach, we have implemented a novel semi-automated machine learning algorithm rooted in the principles of "deep learning" to delineate the percentage of tree cover. In order to perform image analytics in such a granular system, it is mandatory to devise an intelligent archiving and query system for image retrieval, file structuring, metadata processing and filtering of all available image scenes. Using the Open NASA Earth Exchange (NEX) initiative, which is a partnership with Amazon Web Services (AWS), we have developed an end-to-end architecture for designing the database and the deep belief network (following the DistBelief computing model) to solve a grand challenge of scaling this process across a quarter million NAIP tiles that cover the entire Continental United States. The AWS core components that we use to solve this problem are DynamoDB along with S3 for database query and storage, ElastiCache shared memory architecture for image segmentation, Elastic Map Reduce (EMR) for image feature extraction, and the memory optimized Elastic Compute Cloud (EC2) for the learning algorithm.
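
    As a hedged illustration of the kind of scene indexing and filtering such an archive requires (the table name, key schema, and secondary index below are hypothetical and not the actual NEX/AWS design):

      # Hedged sketch of registering and querying image-tile metadata in DynamoDB
      # so scenes can be filtered before feature extraction. Schema is hypothetical.
      import boto3
      from boto3.dynamodb.conditions import Key

      dynamodb = boto3.resource("dynamodb", region_name="us-west-2")
      table = dynamodb.Table("naip-tile-index-example")   # hypothetical table

      def register_tile(tile_id, state, year, s3_key):
          """Store one tile's metadata; the S3 key points at the image object."""
          table.put_item(Item={"tile_id": tile_id, "state": state,
                               "year": year, "s3_key": s3_key})

      def tiles_for_state(state):
          """Assumes a global secondary index named "state-index" (hypothetical)."""
          resp = table.query(IndexName="state-index",
                             KeyConditionExpression=Key("state").eq(state))
          return resp["Items"]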

  13. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, J

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used in dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon Elastic Compute Cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in Monte Carlo simulations. Conclusion: The issue of long computing time in the 4D treatment plan, requiring Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimized number of compute nodes selected in simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant.

  14. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead, it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore NASA science computing is a candidate use case for cloud computing where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS desires to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate must be evaluated by integrating it with real-world operational needs across NASA, together with the associated maturity that would come with that integration. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications have been demonstrated during the previous testing. The Langley Office of the Chief Information Officer in partnership with the Atmospheric Sciences Data Center (ASDC) has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level, pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective by specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.

  15. SUPERFAMILY 1.75 including a domain-centric gene ontology method.

    PubMed

    de Lima Morais, David A; Fang, Hai; Rackham, Owen J L; Wilson, Derek; Pethica, Ralph; Chothia, Cyrus; Gough, Julian

    2011-01-01

    The SUPERFAMILY resource provides protein domain assignments at the Structural Classification of Proteins (SCOP) superfamily level for over 1400 completely sequenced genomes, over 120 metagenomes and other gene collections such as UniProt. All models and assignments are available to browse and download at http://supfam.org. A new hidden Markov model library based on SCOP 1.75 has been created and a previously ignored class of SCOP, coiled coils, is now included. Our scoring component now uses HMMER3, which is orders of magnitude faster and produces superior results. A cloud-based pipeline was implemented and is publicly available on the Amazon Web Services Elastic Compute Cloud. The SUPERFAMILY reference tree of life has been improved, allowing the user to highlight a chosen superfamily, family or domain architecture on the tree of life. The most significant advance in SUPERFAMILY is that it now contains a domain-based gene ontology (GO) at the superfamily and family levels. A new methodology was developed to ensure a high quality GO annotation. The new methodology is general purpose and has been used to produce domain-based phenotypic ontologies in addition to GO.

  16. Large-Scale, Multi-Sensor Atmospheric Data Fusion Using Hybrid Cloud Computing

    NASA Astrophysics Data System (ADS)

    Wilson, Brian; Manipon, Gerald; Hua, Hook; Fetzer, Eric

    2014-05-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration analyses of important climate variables presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over 10 years of data. To efficiently assemble such datasets, we are utilizing Elastic Computing in the Cloud and parallel map-reduce-based algorithms. However, these are data-intensive computing problems, so data transfer times and storage costs (for caching) are key issues. SciReduce is a Hadoop-like parallel analysis system, programmed in parallel Python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in a hybrid Cloud (private Eucalyptus & public Amazon). Unlike Hadoop, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Multi-year datasets are automatically "sharded" by time and space across a cluster of nodes so that years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the cached input and intermediate datasets. We are using SciReduce to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a NASA MEASURES grant. We will present the architecture of SciReduce, describe the achieved "clock time" speedups in fusing datasets on our own nodes and in the Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer. We will also present a concept and prototype for staging NASA's A-Train Atmospheric datasets (Levels 2 & 3) in the Amazon Cloud so that any number of compute jobs can be executed "near" the multi-sensor data. Given such a system, multi-sensor climate studies over 10-20 years of data could be performed.
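
    The time-based "sharding" idea can be sketched generically as follows; this is an illustration of the concept only, not SciReduce code, and the granule names are invented:

      # Hedged sketch of sharding a multi-year granule list by year across worker
      # nodes, in the spirit of the time/space partitioning described above.
      from collections import defaultdict

      def shard_by_year(granules, n_nodes):
          """granules: iterable of (year, path). Returns node_id -> list of paths."""
          by_year = defaultdict(list)
          for year, path in granules:
              by_year[year].append(path)
          shards = defaultdict(list)
          for i, year in enumerate(sorted(by_year)):
              shards[i % n_nodes].extend(by_year[year])   # round-robin years over nodes
          return dict(shards)

      granules = [(2003, "airs_2003_001.h5"), (2004, "airs_2004_001.h5"),
                  (2005, "airs_2005_001.h5"), (2006, "airs_2006_001.h5")]
      print(shard_by_year(granules, 2))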

  17. Performance of parallel computation using CUDA for solving the one-dimensional elasticity equations

    NASA Astrophysics Data System (ADS)

    Darmawan, J. B. B.; Mungkasi, S.

    2017-01-01

    In this paper, we investigate the performance of parallel computation in solving the one-dimensional elasticity equations. Elasticity equations are widely used in engineering science, and solving them quickly and efficiently is desirable. Therefore, we propose the use of parallel computation. Our parallel computation uses NVIDIA's CUDA. Our research results show that parallel computation using CUDA has a great advantage and is powerful when the computation is of large scale.

  18. Cloud computing for comparative genomics with windows azure platform.

    PubMed

    Kim, Insik; Jung, Jae-Yoon; Deluca, Todd F; Nelson, Tristan H; Wall, Dennis P

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative for cluster systems as the number of genomes and required computation power to analyze them increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services.

  19. Cloud Computing for Comparative Genomics with Windows Azure Platform

    PubMed Central

    Kim, Insik; Jung, Jae-Yoon; DeLuca, Todd F.; Nelson, Tristan H.; Wall, Dennis P.

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative for cluster systems as the number of genomes and required computation power to analyze them increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services. PMID:23032609

  20. A Tale Of 160 Scientists, Three Applications, a Workshop and a Cloud

    NASA Astrophysics Data System (ADS)

    Berriman, G. B.; Brinkworth, C.; Gelino, D.; Wittman, D. K.; Deelman, E.; Juve, G.; Rynge, M.; Kinney, J.

    2013-10-01

    The NASA Exoplanet Science Institute (NExScI) hosts the annual Sagan Workshops, thematic meetings aimed at introducing researchers to the latest tools and methodologies in exoplanet research. The theme of the Summer 2012 workshop, held from July 23 to July 27 at Caltech, was to explore the use of exoplanet light curves to study planetary system architectures and atmospheres. A major part of the workshop was to use hands-on sessions to instruct attendees in the use of three open source tools for the analysis of light curves, especially from the Kepler mission. Each hands-on session involved the 160 attendees using their laptops to follow step-by-step tutorials given by experts. One of the applications, PyKE, is a suite of Python tools designed to reduce and analyze Kepler light curves; these tools can be invoked from the Unix command line or a GUI in PyRAF. The Transit Analysis Package (TAP) uses Markov Chain Monte Carlo (MCMC) techniques to fit light curves under the Interactive Data Language (IDL) environment, and Transit Timing Variations (TTV) uses IDL tools and Java-based GUIs to confirm and detect exoplanets from timing variations in light curve fitting. Rather than attempt to run these diverse applications on the inevitable wide range of environments on attendees' laptops, they were run instead on the Amazon Elastic Compute Cloud (EC2). The cloud offers features ideal for this type of short term need: computing and storage services are made available on demand for as long as needed, and a processing environment can be customized and replicated as needed. The cloud environment included an NFS file server virtual machine (VM), 20 client VMs for use by attendees, and a VM to enable ftp downloads of the attendees' results. The file server was configured with a 1 TB Elastic Block Storage (EBS) volume (network-attached storage mounted as a device) containing the application software and attendees' home directories. The clients were configured to mount the applications and home directories from the server via NFS. All VMs were built with CentOS version 5.8. Attendees connected their laptops to one of the client VMs using the Virtual Network Computing (VNC) protocol, which enabled them to interact with a remote desktop GUI during the hands-on sessions. We will describe the mechanisms for handling security, failovers, and licensing of commercial software. In particular, IDL licenses were managed through a server at Caltech, connected to the IDL instances running on Amazon EC2 via a Secure Shell (ssh) tunnel. The system operated flawlessly during the workshop.

  1. SU-E-T-628: A Cloud Computing Based Multi-Objective Optimization Method for Inverse Treatment Planning.

    PubMed

    Na, Y; Suh, T; Xing, L

    2012-06-01

    Multi-objective (MO) plan optimization entails generation of an enormous number of IMRT or VMAT plans constituting the Pareto surface, which presents a computationally challenging task. The purpose of this work is to overcome the hurdle by developing an efficient MO method using an emerging cloud computing platform. As a backbone of cloud computing for optimizing inverse treatment planning, the Amazon Elastic Compute Cloud with a master node (17.1 GB memory, 2 virtual cores, 420 GB instance storage, 64-bit platform) is used. The master node is able to scale seamlessly a number of working group instances, called workers, based on the user-defined settings accounting for MO functions in the clinical setting. Each worker solved the objective function with an efficient sparse decomposition method. The workers are automatically terminated when their tasks are finished. The optimized plans are archived to the master node to generate the Pareto solution set. Three clinical cases have been planned using the developed MO IMRT and VMAT planning tools to demonstrate the advantages of the proposed method. The target dose coverage and critical structure sparing of plans obtained using the cloud computing platform are identical to those obtained using a desktop PC (Intel Xeon® CPU 2.33GHz, 8GB memory). It is found that the MO planning speeds up the processing of obtaining the Pareto set substantially for both types of plans. The speedup scales approximately linearly with the number of nodes used for computing. With the use of N nodes, the computational time is reduced according to the fitting model 0.2 + 2.3/N, with r^2 > 0.99 on average over the cases, making real-time MO planning possible. A cloud computing infrastructure is developed for MO optimization. The algorithm substantially improves the speed of inverse plan optimization. The platform is valuable for both MO planning and future off- or on-line adaptive re-planning. © 2012 American Association of Physicists in Medicine.

  2. Ab initio calculations of the lattice parameter and elastic stiffness coefficients of bcc Fe with solutes

    DOE PAGES

    Fellinger, Michael R.; Hector, Louis G.; Trinkle, Dallas R.

    2016-10-28

    Here, we present an efficient methodology for computing solute-induced changes in lattice parameters and elastic stiffness coefficients Cij of single crystals using density functional theory. We also introduce a solute strain misfit tensor that quantifies how solutes change lattice parameters due to the stress they induce in the host crystal. Solutes modify the elastic stiffness coefficients through volumetric changes and by altering chemical bonds. We compute each of these contributions to the elastic stiffness coefficients separately, and verify that their sum agrees with changes in the elastic stiffness coefficients computed directly using fully optimized supercells containing solutes. Computing the two elastic stiffness contributions separately is more computationally efficient and provides more information on solute effects than the direct calculations. We compute the solute dependence of polycrystalline averaged shear and Young's moduli from the solute dependence of the single-crystal Cij. We then apply this methodology to substitutional Al, B, Cu, Mn, Si solutes and octahedral interstitial C and N solutes in bcc Fe. Comparison with experimental data indicates that our approach accurately predicts solute-induced changes in the lattice parameter and elastic coefficients. The computed data can be used to quantify solute-induced changes in mechanical properties such as strength and ductility, and can be incorporated into mesoscale models to improve their predictive capabilities.

  3. Three-Dimensional Computer-Assisted Two-Layer Elastic Models of the Face.

    PubMed

    Ueda, Koichi; Shigemura, Yuka; Otsuki, Yuki; Fuse, Asuka; Mitsuno, Daisuke

    2017-11-01

    To make three-dimensional computer-assisted elastic models for the face, we decided on five requirements: (1) an elastic texture like skin and subcutaneous tissue; (2) the ability to take pen marking for incisions; (3) the ability to be cut with a surgical knife; (4) the ability to keep stitches in place for a long time; and (5) a layered structure. After testing many elastic solvents, we have made realistic three-dimensional computer-assisted two-layer elastic models of the face and cleft lip from the computed tomographic and magnetic resonance imaging stereolithographic data. The surface layer is made of polyurethane and the inner layer is silicone. Using this elastic model, we taught residents and young doctors how to make several typical local flaps and to perform cheiloplasty. They could experience realistic simulated surgery and understand three-dimensional movement of the flaps.

  4. Atlas2 Cloud: a framework for personal genome analysis in the cloud

    PubMed Central

    2012-01-01

    Background: Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation remains a serious bottleneck. To this end, the cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. Results: We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. Conclusions: We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms. PMID:23134663

  5. Atlas2 Cloud: a framework for personal genome analysis in the cloud.

    PubMed

    Evani, Uday S; Challis, Danny; Yu, Jin; Jackson, Andrew R; Paithankar, Sameer; Bainbridge, Matthew N; Jakkamsetti, Adinarayana; Pham, Peter; Coarfa, Cristian; Milosavljevic, Aleksandar; Yu, Fuli

    2012-01-01

    Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation remains a serious bottleneck. To this end, the cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms.

  6. Reduced precipitation over large water bodies in the Brazilian Amazon shown from TRMM data

    NASA Astrophysics Data System (ADS)

    Paiva, Rodrigo Cauduro Dias; Buarque, Diogo Costa; Clarke, Robin T.; Collischonn, Walter; Allasia, Daniel Gustavo

    2011-02-01

    Tropical Rainfall Measurement Mission (TRMM) data show lower rainfall over large water bodies in the Brazilian Amazon. Mean annual rainfall (P), number of wet days (rainfall > 2 mm) (W) and annual rainfall accumulated over 3-hour time intervals (P3hr) were computed from TRMM 3B42 data for 1998-2009. Reduced rainfall was marked over the Rio Solimões/Amazon, along most Amazon tributaries and over the Balbina reservoir. In a smaller test area, a heuristic argument showed that P and W were reduced by 5% and 6.5% respectively. Allowing for TRMM 3B42 spatial resolution, the reduction may be locally greater. Analyses of diurnal rainfall patterns showed that rainfall is lowest over large rivers during the afternoon, when most rainfall is convective, but at night and early morning the opposite occurs, with increased rainfall over rivers, although this pattern is less marked. Rainfall patterns reported from studies of smaller Amazonian regions therefore exist more widely.

  7. NASA Enterprise Managed Cloud Computing (EMCC): Delivering an Initial Operating Capability (IOC) for NASA use of Commercial Infrastructure-as-a-Service (IaaS)

    NASA Technical Reports Server (NTRS)

    O'Brien, Raymond

    2017-01-01

    In 2016, Ames supported the NASA CIO in delivering an initial operating capability for Agency use of commercial cloud computing. This presentation provides an overview of the project, the services approach followed, and the major components of the capability that was delivered. The presentation is being given at the request of Amazon Web Services to a contingent representing the Brazilian Federal Government and Defense Organization that is interested in the use of Amazon Web Services (AWS). NASA is currently a customer of AWS and delivered the Initial Operating Capability using AWS as its first commercial cloud provider. The IOC, however, was designed to also support other cloud providers in the future.

  8. Tree rings and rainfall in the equatorial Amazon

    NASA Astrophysics Data System (ADS)

    Granato-Souza, Daniela; Stahle, David W.; Barbosa, Ana Carolina; Feng, Song; Torbenson, Max C. A.; de Assis Pereira, Gabriel; Schöngart, Jochen; Barbosa, Joao Paulo; Griffin, Daniel

    2018-05-01

    The Amazon basin is a global center of hydroclimatic variability and biodiversity, but there are only eight instrumental rainfall stations with continuous records longer than 80 years in the entire basin, an area nearly the size of the coterminous US. The first long moisture-sensitive tree-ring chronology has been developed in the eastern equatorial Amazon of Brazil based on dendrochronological analysis of Cedrela cross sections cut during sustainable logging operations near the Rio Paru. The Rio Paru chronology dates from 1786 to 2016 and is significantly correlated with instrumental precipitation observations from 1939 to 2016. The strength and spatial scale of the precipitation signal vary during the instrumental period, but the Rio Paru chronology has been used to develop a preliminary reconstruction of February to November rainfall totals from 1786 to 2016. The reconstruction is related to SSTs in the Atlantic and especially the tropical Pacific, similar to the stronger pattern of association computed for the instrumental rainfall data from the eastern Amazon. The tree-ring data estimate extended drought and wet episodes in the mid- to late-nineteenth century, providing a valuable, long-term perspective on the moisture changes expected to emerge over the Amazon in the coming century due to deforestation and anthropogenic climate change.

  9. Tree-Ring Reconstruction of Wet Season Rainfall Totals in the Amazon

    NASA Astrophysics Data System (ADS)

    Stahle, D. W.; Lopez, L.; Granato-Souza, D.; Barbosa, A. C. M. C.; Torbenson, M.; Villalba, R.; Pereira, G. D. A.; Feng, S.; Schongart, J.; Cook, E. R.

    2017-12-01

    The Amazon Basin is a globally important center of deep atmospheric convection, energy balance, and biodiversity, but only a handful of weather stations in this vast Basin have recorded rainfall measurements for at least 50 years. The available rainfall and river level observations suggest that the hydrologic cycle in the Amazon may have become amplified in the last 40-years, with more extreme rainfall and streamflow seasonality, deeper droughts, and more severe flooding. These changes in the largest hydrological system on earth may be early evidence of the expected consequences of anthropogenic climate change and deforestation in the coming century. Placing these observed and simulated changes in the context of natural climate variability during the late Holocene is a significant challenge for high-resolution paleoclimatology. We have developed exactly dated and well-replicated annual tree-ring chronologies from two native Amazonian tree species (Cedrela sp and Centrolobium microchaete). These moisture sensitive chronologies have been used to compute two reconstructions of wet season rainfall totals, one in the southern Amazon based on Centrolobium and another in the eastern equatorial Amazon using Cedrela. Both reconstructions are over 200-years long and extend the available instrumental observations in each region by over 150-years. These reconstructions are well correlated with the same regional and large-scale climate dynamics that govern the inter-annual variability of the instrumental wet season rainfall totals. Increased multi-decadal variability is reconstructed after 1950 with the Centrolobium chronologies in the southern Amazon. The Cedrela reconstruction from the eastern Amazon exhibits changes in the spatial pattern of correlation with regional rainfall stations and the large-scale sea surface temperature field after 1990 that may be consistent with recent changes in the mean position of the Inter-Tropical Convergence Zone in March over the western Atlantic and South American sector.

  10. Efficiently approximating the Pareto frontier: Hydropower dam placement in the Amazon basin

    USGS Publications Warehouse

    Wu, Xiaojian; Gomes-Selman, Jonathan; Shi, Qinru; Xue, Yexiang; Garcia-Villacorta, Roosevelt; Anderson, Elizabeth; Sethi, Suresh; Steinschneider, Scott; Flecker, Alexander; Gomes, Carla P.

    2018-01-01

    Real-world problems are often not fully characterized by a single optimal solution, as they frequently involve multiple competing objectives; it is therefore important to identify the so-called Pareto frontier, which captures solution trade-offs. We propose a fully polynomial-time approximation scheme based on Dynamic Programming (DP) for computing a polynomially succinct curve that approximates the Pareto frontier to within an arbitrarily small ε > 0 on tree-structured networks. Given a set of objectives, our approximation scheme runs in time polynomial in the size of the instance and 1/ε. We also propose a Mixed Integer Programming (MIP) scheme to approximate the Pareto frontier. The DP and MIP Pareto frontier approaches have complementary strengths and are surprisingly effective. We provide empirical results showing that our methods outperform other approaches in efficiency and accuracy. Our work is motivated by a problem in computational sustainability concerning the proliferation of hydropower dams throughout the Amazon basin. Our goal is to support decision-makers in evaluating impacted ecosystem services on the full scale of the Amazon basin. Our work is general and can be applied to approximate the Pareto frontier of a variety of multiobjective problems on tree-structured networks.
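
    For readers unfamiliar with the concept, a Pareto frontier for a tiny candidate set can be found by brute-force dominance filtering, as in the hedged sketch below; this only illustrates the definition and is not the paper's DP or MIP scheme, and the objective values are invented:

      # Illustration of a Pareto frontier: keep only solutions not dominated in every
      # objective. Brute force; all objectives are assumed to be maximized.
      def pareto_frontier(solutions):
          """solutions: list of tuples of objective values."""
          frontier = []
          for s in solutions:
              dominated = any(all(o >= v for o, v in zip(other, s)) and other != s
                              for other in solutions)
              if not dominated:
                  frontier.append(s)
          return frontier

      # (energy produced, river connectivity) for hypothetical dam portfolios
      candidates = [(10, 0.9), (12, 0.7), (8, 0.95), (11, 0.7), (9, 0.6)]
      print(pareto_frontier(candidates))   # -> [(10, 0.9), (12, 0.7), (8, 0.95)]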

  11. Application of microarray analysis on computer cluster and cloud platforms.

    PubMed

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.

  12. Biomass burning losses of carbon estimated from ecosystem modeling and satellite data analysis for the Brazilian Amazon region

    NASA Astrophysics Data System (ADS)

    Potter, Christopher; Brooks Genovese, Vanessa; Klooster, Steven; Bobo, Matthew; Torregrosa, Alicia

    To produce a new daily record of gross carbon emissions from biomass burning events and post-burning decomposition fluxes in the states of the Brazilian Legal Amazon (Instituto Brasileiro de Geografia e Estatistica (IBGE), 1991. Anuario Estatistico do Brasil, Vol. 51. Rio de Janeiro, Brazil, pp. 1-1024), we have used vegetation greenness estimates from satellite images as inputs to a terrestrial ecosystem production model. This carbon allocation model generates new estimates of regional aboveground vegetation biomass at 8-km resolution. The modeled biomass product is then combined for the first time with fire pixel counts from the advanced very high-resolution radiometer (AVHRR) to overlay regional burning activities in the Amazon. Results from our analysis indicate that carbon emission estimates from annual region-wide sources of deforestation and biomass burning in the early 1990s are apparently three to five times higher than reported in previous studies for the Brazilian Legal Amazon (Houghton et al., 2000. Nature 403, 301-304; Fearnside, 1997. Climatic Change 35, 321-360), i.e., studies which implied that the Legal Amazon region tends toward a net-zero annual source of terrestrial carbon. In contrast, our analysis implies that the total source fluxes over the entire Legal Amazon region range from 0.2 to 1.2 Pg C yr-1, depending strongly on annual rainfall patterns. The reasons for our higher burning emission estimates are (1) use of combustion fractions typically measured during Amazon forest burning events for computing carbon losses, (2) more detailed geographic distribution of vegetation biomass and daily fire activity for the region, and (3) inclusion of fire effects in extensive areas of the Legal Amazon covered by open woodland, secondary forests, savanna, and pasture vegetation. The total area of rainforest estimated annually to be deforested did not differ substantially among the previous analyses cited and our own.

  13. Adventures in Private Cloud: Balancing Cost and Capability at the CloudSat Data Processing Center

    NASA Astrophysics Data System (ADS)

    Partain, P.; Finley, S.; Fluke, J.; Haynes, J. M.; Cronk, H. Q.; Miller, S. D.

    2016-12-01

    Since the beginning of the CloudSat Mission in 2006, the CloudSat Data Processing Center (DPC) at the Cooperative Institute for Research in the Atmosphere (CIRA) has been ingesting data from the satellite and other A-Train sensors, producing data products, and distributing them to researchers around the world. The computing infrastructure was specifically designed to fulfill the requirements as specified at the beginning of what nominally was a two-year mission. The environment consisted of servers dedicated to specific processing tasks in a rigid workflow to generate the required products. To the benefit of science and with credit to the mission engineers, CloudSat has lasted well beyond its planned lifetime and is still collecting data ten years later. Over that period, requirements of the data processing system have greatly expanded and opportunities for providing value-added services have presented themselves. But while demands on the system have increased, the initial design allowed for very little expansion in terms of scalability and flexibility. The design did change to include virtual machine processing nodes and distributed workflows, but infrastructure management was still a time-consuming task when system modification was required to run new tests or implement new processes. To address the scalability, flexibility, and manageability of the system, cloud computing methods and technologies are now being employed. The use of a public cloud like Amazon Elastic Compute Cloud or Google Compute Engine was considered but, among other issues, data transfer and storage costs become a problem, especially when demand fluctuates as a result of reprocessing and the introduction of new products and services. Instead, the existing system was converted to an on-premises private cloud using the OpenStack computing platform and Ceph software-defined storage to reap the benefits of the cloud computing paradigm. This work details the decisions that were made, the benefits that have been realized, the difficulties that were encountered, and the issues that still exist.

  14. Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic

    PubMed Central

    Sanduja, S; Jewell, P; Aron, E; Pharai, N

    2015-01-01

    Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources available to expedite analysis and reporting. Cloud-based computing environments are available at a fraction of the time and effort when compared to traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic. PMID:26451333
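
    As an illustration of the kind of provisioning step such a tutorial automates, the sketch below starts a single worker with the boto3 AWS SDK; the AMI ID, key pair, and instance type are placeholders, and the tutorial's own stack (NONMEM, PsN, Grid Engine, Sonic) would be layered on top of instances like this one.

        # Minimal sketch: launch one EC2 worker with the boto3 SDK.
        # The AMI ID, instance type, and key pair below are hypothetical placeholders.
        import boto3

        ec2 = boto3.client("ec2", region_name="us-east-1")
        response = ec2.run_instances(
            ImageId="ami-0123456789abcdef0",   # placeholder image with the analysis stack preinstalled
            InstanceType="c5.xlarge",
            MinCount=1,
            MaxCount=1,
            KeyName="my-keypair",              # placeholder key pair name
        )
        print("started", response["Instances"][0]["InstanceId"])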

  15. Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic.

    PubMed

    Sanduja, S; Jewell, P; Aron, E; Pharai, N

    2015-09-01

    Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources available to expedite analysis and reporting. Cloud-based computing environments are available at a fraction of the time and effort when compared to traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic.

  16. Libraries in the Cloud: Making a Case for Google and Amazon

    ERIC Educational Resources Information Center

    Buck, Stephanie

    2009-01-01

    As news outlets create headlines such as "A Cloud & A Prayer," "The Cloud Is the Computer," and "Leveraging Clouds to Make You More Efficient," many readers have been left with cloud confusion. Many definitions exist for cloud computing, and a uniform definition is hard to find. In its most basic form, cloud…

  17. Computer program ETC improves computation of elastic transfer matrices of Legendre polynomials P0 and P1

    NASA Technical Reports Server (NTRS)

    Gibson, G.; Miller, M.

    1967-01-01

    Computer program ETC improves computation of elastic transfer matrices of Legendre polynomials P0 and P1. Rather than carrying out a double integration numerically, one of the integrations is accomplished analytically and the numerical integration need only be carried out over one variable.
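
    The saving described, performing one integration analytically so that only a one-dimensional quadrature remains, can be illustrated with a toy integrand (not the transfer-matrix kernel ETC actually evaluates): the inner integral of x*exp(x*y) over y in [0, 1] is (exp(x) - 1) in closed form, so the double integral collapses to a single numerical integration.

        # Toy example: replace a numerical double integral by a single one.
        # The inner integral of x*exp(x*y) over y in [0, 1] equals exp(x) - 1 analytically.
        import numpy as np
        from scipy import integrate

        double, _ = integrate.dblquad(lambda y, x: x * np.exp(x * y), 0, 1, 0, 1)
        single, _ = integrate.quad(lambda x: np.exp(x) - 1, 0, 1)
        print(double, single)   # both approximate e - 2; the second needs far fewer evaluations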

  18. Cloud Computing Security Issue: Survey

    NASA Astrophysics Data System (ADS)

    Kamal, Shailza; Kaur, Rajpreet

    2011-12-01

    Cloud computing has been a growing field in the IT industry since 2007, when it was proposed by IBM. Other companies such as Google, Amazon, and Microsoft provide further cloud computing products. Cloud computing is Internet-based computing that shares resources and information on demand. It provides services such as SaaS, IaaS, and PaaS. The services and resources are shared through virtualization, which runs multiple applications on the cloud. This discussion surveys the challenges and security issues in cloud computing and describes some standards and protocols that show how security can be managed.

  19. Operating Dedicated Data Centers - Is It Cost-Effective?

    NASA Astrophysics Data System (ADS)

    Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.

    2014-06-01

    The advent of cloud computing centres such as Amazon's EC2 and Google's Compute Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.
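
    The comparison described reduces, at its simplest, to a cost-per-core-hour break-even estimate. The sketch below uses made-up figures purely to show the arithmetic; it does not reproduce the RACF or Amazon prices analyzed in the presentation.

        # Illustrative break-even estimate between a dedicated cluster and rented cloud capacity.
        # All prices and utilization figures are assumed values for this sketch only.
        dedicated_annual_cost = 500_000.0                 # hardware amortization + power + staff (assumed)
        dedicated_core_hours = 2_000 * 8_760 * 0.85       # 2000 cores at 85% average utilization
        cloud_price_per_core_hour = 0.03                  # assumed effective on-demand/spot rate

        dedicated_rate = dedicated_annual_cost / dedicated_core_hours
        print(f"dedicated: ${dedicated_rate:.4f} per core-hour")
        print(f"cloud:     ${cloud_price_per_core_hour:.4f} per core-hour")
        # If utilization drops, the dedicated rate rises and renting becomes more attractive.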

  20. Computer program for investigating effects of nonlinear suspension-system elastic properties on parachute inflation loads and motions

    NASA Technical Reports Server (NTRS)

    Poole, L. R.

    1972-01-01

    A computer program is presented by which the effects of nonlinear suspension-system elastic characteristics on parachute inflation loads and motions can be investigated. A mathematical elastic model of suspension-system geometry is coupled to the planar equations of motion of a general vehicle and canopy. Canopy geometry and aerodynamic drag characteristics and suspension-system elastic properties are tabular inputs. The equations of motion are numerically integrated by use of an equivalent fifth-order Runge-Kutta technique.
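
    As a sketch of the integration step such a program performs (using a classical fourth-order Runge-Kutta step and a toy damped oscillator standing in for the coupled vehicle/canopy equations and the equivalent fifth-order scheme of the report):

        # Classical RK4 step applied to a toy second-order system (illustrative stand-in only).
        import numpy as np

        def deriv(t, y):
            x, v = y
            return np.array([v, -0.5 * v - 4.0 * x])   # assumed damping and stiffness

        def rk4_step(f, t, y, h):
            k1 = f(t, y)
            k2 = f(t + h / 2, y + h / 2 * k1)
            k3 = f(t + h / 2, y + h / 2 * k2)
            k4 = f(t + h, y + h * k3)
            return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

        y, t, h = np.array([1.0, 0.0]), 0.0, 0.01
        for _ in range(1000):
            y = rk4_step(deriv, t, y, h)
            t += h
        print(t, y)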

  1. Computer Simulation of the Elastic Properties of Titanium Alloys for Medical Applications

    NASA Astrophysics Data System (ADS)

    Estevez, Elsa Paz; Burganova, R. M.; Lysogorskii, Yu. V.

    2016-09-01

    Results of a computer simulation of the elastic properties of α+β- and β-titanium alloys, used for medical purposes, within the framework of the molecular-dynamics method are presented. It is shown that β-titanium alloys are best suited for use as bone implants because of their low elastic moduli. The suitability of the molecular-dynamics method for studying the elastic properties of titanium alloys serving as bone implants is demonstrated.

  2. Adopting Cloud Computing in the Pakistan Navy

    DTIC Science & Technology

    2015-06-01

    administrative aspect is required to operate optimally, provide synchronized delivery of cloud services, and integrate multi-provider cloud environment...AND ABBREVIATIONS ANSI American National Standards Institute AWS Amazon web services CIA Confidentiality Integrity Availability CIO Chief...also adopted cloud computing as an integral component of military operations conducted either locally or remotely. With the use of 2 cloud services

  3. Elastic-plastic finite-element analyses of thermally cycled single-edge wedge specimens

    NASA Technical Reports Server (NTRS)

    Kaufman, A.

    1982-01-01

    Elastic-plastic stress-strain analyses were performed for single-edge wedge specimens subjected to thermal cycling in fluidized beds. Three cases (NASA TAZ-8A alloy under one cycling condition and 316 stainless steel alloy under two cycling conditions) were analyzed by using the MARC nonlinear, finite-element computer program. Elastic solutions from MARC showed good agreement with previously reported solutions that used the NASTRAN and ISO3DQ computer programs. The NASA TAZ-8A case exhibited no plastic strains, and the elastic and elastic-plastic analyses gave identical results. Elastic-plastic analyses of the 316 stainless steel alloy showed plastic strain reversal with a shift of the mean stresses in the compressive direction. The maximum equivalent total strain ranges for these cases were 13 to 22 percent greater than those calculated from elastic analyses.

  4. Blade Displacement Predictions for the Full-Scale UH-60A Airloads Rotor

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Lee-Rausch, Elizabeth M.

    2014-01-01

    An unsteady Reynolds-Averaged Navier-Stokes solver for unstructured grids is loosely coupled to a rotorcraft comprehensive code and used to simulate two different test conditions from a wind-tunnel test of a full-scale UH-60A rotor. Performance data and sectional airloads from the simulation are compared with corresponding tunnel data to assess the level of fidelity of the aerodynamic aspects of the simulation. The focus then turns to a comparison of the blade displacements, both rigid (blade root) and elastic. Comparisons of computed root motions are made with data from three independent measurement systems. Finally, comparisons are made between computed elastic bending and elastic twist, and the corresponding measurements obtained from a photogrammetry system. Overall the correlation between computed and measured displacements was good, especially for the root pitch and lag motions and the elastic bending deformation. The correlation of root lead-lag motion and elastic twist deformation was less favorable.

  5. A computer program to trace seismic ray distribution in complex two-dimensional geological models

    USGS Publications Warehouse

    Yacoub, Nazieh K.; Scott, James H.

    1970-01-01

    A computer program has been developed to trace seismic rays and their amplitudes and energies through complex two-dimensional geological models, for which boundaries between elastic units are defined by a series of digitized X-, Y-coordinate values. Input data for the program include problem identification, control parameters, model coordinates and elastic parameters for the elastic units. The program evaluates the partitioning of ray amplitude and energy at elastic boundaries, and computes the total travel time, total travel distance and other parameters for rays arriving at the earth's surface. Instructions are given for punching program control cards and data cards, and for arranging input card decks. An example of printer output for a simple problem is presented. The program is written in FORTRAN IV language. The listing of the program is shown in the Appendix, with an example output from a CDC-6600 computer.
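
    The bookkeeping such a ray tracer performs, refracting the ray at each boundary via Snell's law while accumulating travel time and distance, can be sketched for a stack of flat layers; the velocities and thicknesses below are illustrative, and the actual program handles arbitrary digitized two-dimensional boundaries and also partitions amplitude and energy.

        # Sketch: travel time and horizontal offset of a ray through flat layers (illustrative values).
        import math

        velocities = [1500.0, 2500.0, 4000.0]   # m/s, assumed elastic units
        thicknesses = [500.0, 800.0, 1200.0]    # m
        p = math.sin(math.radians(20.0)) / velocities[0]   # ray parameter from a 20-degree takeoff angle

        t_total, x_total = 0.0, 0.0
        for v, h in zip(velocities, thicknesses):
            sin_theta = p * v
            if sin_theta >= 1.0:                # beyond the critical angle: no transmitted ray
                break
            cos_theta = math.sqrt(1.0 - sin_theta ** 2)
            t_total += h / (v * cos_theta)      # time spent crossing this layer
            x_total += h * sin_theta / cos_theta
        print(t_total, x_total)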

  6. Atomistic calculations of interface elastic properties in noncoherent metallic bilayers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mi Changwen; Jun, Sukky; Kouris, Demitris A.

    2008-02-15

    The paper describes theoretical and computational studies associated with the interface elastic properties of noncoherent metallic bicrystals. Analytical forms of interface energy, interface stresses, and interface elastic constants are derived in terms of interatomic potential functions. Embedded-atom method potentials are then incorporated into the model to compute these excess thermodynamics variables, using energy minimization in a parallel computing environment. The proposed model is validated by calculating surface thermodynamic variables and comparing them with preexisting data. Next, the interface elastic properties of several fcc-fcc bicrystals are computed. The excess energies and stresses of interfaces are smaller than those on free surfaces of the same crystal orientations. In addition, no negative values of interface stresses are observed. Current results can be applied to various heterogeneous materials where interfaces assume a prominent role in the systems' mechanical behavior.

  7. Passive and active ventricular elastances of the left ventricle

    PubMed Central

    Zhong, Liang; Ghista, Dhanjoo N; Ng, Eddie YK; Lim, Soo T

    2005-01-01

    Background: Description of the heart as a pump has been dominated by models based on elastance and compliance. Here, we present a somewhat new concept of time-varying passive and active elastance. The mathematical basis of time-varying elastance of the ventricle is presented. We have defined elastance in terms of the relationship between ventricular pressure and volume, as: dP = EdV + VdE, where E includes passive (Ep) and active (Ea) elastance. By incorporating this concept in left ventricular (LV) models to simulate filling and systolic phases, we have obtained the time-varying expression for Ea and the LV-volume dependent expression for Ep. Methods and Results: Using the patient's catheterization-ventriculogram data, the values of passive and active elastance are computed. Ea is expressed as a time-varying function, and Ep as a function of LV volume (the explicit formulae are not reproduced in this record). Ea is deemed to represent a measure of LV contractility. Hence, peak dP/dt and ejection fraction (EF) are computed from the monitored data and used as the traditional measures of LV contractility. When our computed peak active elastance (Ea,max) is compared against these traditional indices by linear regression, a high degree of correlation is obtained. As regards Ep, it constitutes a volume-dependent stiffness property of the LV, and is deemed to represent resistance-to-filling. Conclusions: Passive and active ventricular elastance formulae can be evaluated from single-beat P-V data by means of a simple-to-apply LV model. The active elastance (Ea) can be used to characterize the ventricle's contractile state, while passive elastance (Ep) can represent a measure of resistance-to-filling. PMID:15707494
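
    Written out, the decomposition quoted above is as follows; this only restates the relation given in the abstract, since the detailed time-varying and volume-dependent expressions derived in the paper are not reproduced in this record.

        % Elastance decomposition as stated in the abstract; the detailed forms of
        % E_a(t) and E_p(V) derived in the paper are not given here.
        \[
          dP = E\,dV + V\,dE, \qquad E = E_p + E_a,
        \]
        \[
          E_a = E_a(t) \quad \text{(active, a measure of contractility)}, \qquad
          E_p = E_p(V) \quad \text{(passive, resistance to filling)}.
        \]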

  8. Galaxy CloudMan: delivering cloud compute clusters.

    PubMed

    Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James

    2010-12-21

    Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on-demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. The knowledge and effort required to deploy a compute cluster in the Amazon EC2 cloud are not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.

  9. BlueSky Cloud - rapid infrastructure capacity using Amazon's Cloud for wildfire emergency response

    NASA Astrophysics Data System (ADS)

    Haderman, M.; Larkin, N. K.; Beach, M.; Cavallaro, A. M.; Stilley, J. C.; DeWinter, J. L.; Craig, K. J.; Raffuse, S. M.

    2013-12-01

    During peak fire season in the United States, many large wildfires often burn simultaneously across the country. Smoke from these fires can produce air quality emergencies. It is vital that incident commanders, air quality agencies, and public health officials have smoke impact information at their fingertips for evaluating where fires and smoke are and where the smoke will go next. To address the need for this kind of information, the U.S. Forest Service AirFire Team created the BlueSky Framework, a modeling system that predicts concentrations of particle pollution from wildfires. During emergency response, decision makers use BlueSky predictions to make public outreach and evacuation decisions. The models used in BlueSky predictions are computationally intensive, and the peak fire season requires significantly more computer resources than off-peak times. Purchasing enough hardware to run the number of BlueSky Framework runs that are needed during fire season is expensive and leaves idle servers running the majority of the year. The AirFire Team and STI developed BlueSky Cloud to take advantage of Amazon's virtual servers hosted in the cloud. With BlueSky Cloud, as demand increases and decreases, servers can be easily spun up and spun down at a minimal cost. Moving standard BlueSky Framework runs into the Amazon Cloud made it possible for the AirFire Team to rapidly increase the number of BlueSky Framework instances that could be run simultaneously without the costs associated with purchasing and managing servers. In this presentation, we provide an overview of the features of BlueSky Cloud, describe how the system uses Amazon Cloud, and discuss the costs and benefits of moving from privately hosted servers to a cloud-based infrastructure.

  10. Elastic-plastic finite-element analyses of thermally cycled double-edge wedge specimens

    NASA Technical Reports Server (NTRS)

    Kaufman, A.; Hunt, L. E.

    1982-01-01

    Elastic-plastic stress-strain analyses were performed for double-edge wedge specimens subjected to thermal cycling in fluidized beds at 316 and 1088 C. Four cases involving different nickel-base alloys (IN 100, Mar M-200, NASA TAZ-8A, and Rene 80) were analyzed by using the MARC nonlinear, finite element computer program. Elastic solutions from MARC showed good agreement with previously reported solutions obtained by using the NASTRAN and ISO3DQ computer programs. Equivalent total strain ranges at the critical locations calculated by elastic analyses agreed within 3 percent with those calculated from elastic-plastic analyses. The elastic analyses always resulted in compressive mean stresses at the critical locations. However, elastic-plastic analyses showed tensile mean stresses for two of the four alloys and an increase in the compressive mean stress for the highest plastic strain case.

  11. Bayesian inferences suggest that Amazon Yunga Natives diverged from Andeans less than 5000 ybp: implications for South American prehistory.

    PubMed

    Scliar, Marilia O; Gouveia, Mateus H; Benazzo, Andrea; Ghirotto, Silvia; Fagundes, Nelson J R; Leal, Thiago P; Magalhães, Wagner C S; Pereira, Latife; Rodrigues, Maira R; Soares-Souza, Giordano B; Cabrera, Lilia; Berg, Douglas E; Gilman, Robert H; Bertorelle, Giorgio; Tarazona-Santos, Eduardo

    2014-09-30

    Archaeology reports millenary cultural contacts between Peruvian Coast-Andes and the Amazon Yunga, a rainforest transitional region between Andes and Lower Amazonia. To clarify the relationships between cultural and biological evolution of these populations, in particular between Amazon Yungas and Andeans, we used DNA-sequence data, a model-based Bayesian approach and several statistical validations to infer a set of demographic parameters. We found that the genetic diversity of the Shimaa (an Amazon Yunga population) is a subset of that of Quechuas from Central-Andes. Using the Isolation-with-Migration population genetics model, we inferred that the Shimaa ancestors were a small subgroup that split less than 5300 years ago (after the development of complex societies) from an ancestral Andean population. After the split, the most plausible scenario compatible with our results is that the ancestors of Shimaas moved toward the Peruvian Amazon Yunga and incorporated the culture and language of some of their neighbors, but not a substantial amount of their genes. We validated our results using Approximate Bayesian Computations, posterior predictive tests and the analysis of pseudo-observed datasets. We presented a case study in which model-based Bayesian approaches, combined with necessary statistical validations, shed light on the prehistoric demographic relationship between Andeans and a population from the Amazon Yunga. Our results offer a testable model for the peopling of this large transitional environmental region between the Andes and the Lower Amazonia. However, studies on larger samples and involving more populations of these regions are necessary to confirm whether the predominant Andean biological origin of the Shimaas is the rule, and not the exception.

  12. Towards Cloud-based Asynchronous Elasticity for Iterative HPC Applications

    NASA Astrophysics Data System (ADS)

    da Rosa Righi, Rodrigo; Facco Rodrigues, Vinicius; André da Costa, Cristiano; Kreutz, Diego; Heiss, Hans-Ulrich

    2015-10-01

    Elasticity is one of the key features of cloud computing. It allows applications to dynamically scale computing and storage resources, avoiding over- and under-provisioning. In high performance computing (HPC), initiatives are normally modeled to handle bag-of-tasks or key-value applications through a load balancer and a loosely-coupled set of virtual machine (VM) instances. In the joint field of Message Passing Interface (MPI) and tightly-coupled HPC applications, we observe the need for rewriting source code, prior knowledge of the application, and/or stop-reconfigure-and-go approaches to address cloud elasticity. Besides, there are problems related to how to profit from this new feature in the HPC scope, since in MPI 2.0 applications the programmers need to handle communicators by themselves, and a sudden consolidation of a VM, together with a process, can compromise the entire execution. To address these issues, we propose a PaaS-based elasticity model, named AutoElastic. It acts as a middleware that allows iterative HPC applications to take advantage of dynamic resource provisioning of cloud infrastructures without any major modification. AutoElastic provides a new concept denoted here as asynchronous elasticity, i.e., it provides a framework to allow applications to either increase or decrease their computing resources without blocking the current execution. The feasibility of AutoElastic is demonstrated through a prototype that runs a CPU-bound numerical integration application on top of the OpenNebula middleware. The results showed a saving of about 3 min at each scaling-out operation, emphasizing the contribution of the new concept in contexts where seconds are precious.

  13. Highly parameterized model calibration with cloud computing: an example of regional flow model calibration in northeast Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Hayley, Kevin; Schumacher, J.; MacMillan, G. J.; Boutin, L. C.

    2014-05-01

    Expanding groundwater datasets collected by automated sensors, and improved groundwater databases, have caused a rapid increase in calibration data available for groundwater modeling projects. Improved methods of subsurface characterization have increased the need for model complexity to represent geological and hydrogeological interpretations. The larger calibration datasets and the need for meaningful predictive uncertainty analysis have both increased the degree of parameterization necessary during model calibration. Due to these competing demands, modern groundwater modeling efforts require a massive degree of parallelization in order to remain computationally tractable. A methodology for the calibration of highly parameterized, computationally expensive models using the Amazon EC2 cloud computing service is presented. The calibration of a regional-scale model of groundwater flow in Alberta, Canada, is provided as an example. The model covers a 30,865-km2 domain and includes 28 hydrostratigraphic units. Aquifer properties were calibrated to more than 1,500 static hydraulic head measurements and 10 years of measurements during industrial groundwater use. Three regionally extensive aquifers were parameterized (with spatially variable hydraulic conductivity fields), as was the aerial recharge boundary condition, leading to 450 adjustable parameters in total. The PEST-based model calibration was parallelized on up to 250 computing nodes located on Amazon's EC2 servers.

  14. Effects of Convective Transport on the Budget of Amazonian Aerosol under Background Conditions

    NASA Astrophysics Data System (ADS)

    Wang, J.; Krejci, R.; Giangrande, S. E.; Kuang, C.; Barbosa, H. M.; Brito, J.; Carbone, S.; Chi, X.; Comstock, J. M.; Ditas, F.; Lavric, J. V.; Manninen, H. E.; Mei, F.; Moran, D.; Pöhlker, C.; Pöhlker, M. L.; Saturno, J.; Schmid, B.; Souza, R. A. F. D.; Springston, S. R.; Tomlinson, J. M.; Toto, T.; Walter, D.; Wimmer, D.; Smith, J. N.; Machado, L.; Artaxo, P.; Andreae, M. O.; Martin, S. T.

    2016-12-01

    Aerosol particles can strongly influence the radiative properties of clouds, and they represent one of the largest uncertainties in computer simulations of climate change. The large uncertainty is in large part due to a poor understanding of processes under natural conditions, which serves as the baseline to measure change against. Understanding the processes under natural conditions is critical for a reliable assessment and quantification of ongoing and future climate change. The Amazon rainforest is one of the few continental regions where aerosol particles and their precursors can be studied under near-natural conditions. Here we examine the aerosol number and CCN budget under background conditions in the Amazon basin using data collected during the Observations and Modeling of the Green Ocean Amazon (GoAmazon 2014/5) campaign, which took place from January 2014 to December 2015 near Manaus, Brazil. The aerosol size spectrum was observed at the Amazon Tall Tower Observatory (ATTO), 150 km upwind of Manaus, and its variation with convection and precipitation during the wet season is presented. Air masses arriving at the ATTO during the wet season are typically brought by the northeasterly trade winds and travel across at least 1000 km of undeveloped tropical rainforest, therefore are generally clean. Also shown are vertical profiles of aerosol observed onboard the DOE Gulfstream-1 research aircraft. The impact of convective transport on the budget of boundary layer aerosol and CCN under the background conditions is discussed.

  15. Characterizing Temporal and Spatial Changes in Land Surface Temperature across the Amazon Basin using Thermal and Infrared Satellite Data

    NASA Astrophysics Data System (ADS)

    Cak, A. D.

    2017-12-01

    The Amazon Basin has faced innumerable pressures in recent years, including logging, mining and resource extraction, agricultural expansion, road building, and urbanization. These changes have drastically altered the landscape, transforming a predominantly forested environment into a mosaic of different types of land cover. The resulting fragmentation has caused dramatic and negative impacts on its structure and function, including on biodiversity and the transfer of water and energy to and from soil, vegetation, and the atmosphere (e.g., evapotranspiration). Because evapotranspiration from forested areas, which is affected by factors including temperature and water availability, plays a significant role in water dynamics in the Amazon Basin, measuring land surface temperature (LST) across the region can provide a dynamic assessment of hydrological, vegetation, and land use and land cover changes. It can also help to identify widespread urban development, which often has a higher LST signal relative to surrounding vegetation. Here, we discuss results from work to measure and identify drivers of change in LST across the entire Amazon Basin through analysis of past and current thermal and infrared satellite imagery. We leverage cloud computing resources in new ways to allow for more efficient analysis of imagery over the Amazon Basin across multiple years and multiple sensors. We also assess potential drivers of change in LST using spatial and multivariate statistical analyses with additional data sources of land cover, urban development, and demographics.

  16. Multiscale approach including microfibril scale to assess elastic constants of cortical bone based on neural network computation and homogenization method.

    PubMed

    Barkaoui, Abdelwahed; Chamekh, Abdessalem; Merzouki, Tarek; Hambli, Ridha; Mkaddem, Ali

    2014-03-01

    The complexity and heterogeneity of bone tissue require multiscale modeling to understand its mechanical behavior and its remodeling mechanisms. In this paper, a novel multiscale hierarchical approach including the microfibril scale, based on hybrid neural network (NN) computation and homogenization equations, was developed to link nanoscopic and macroscopic scales to estimate the elastic properties of human cortical bone. The multiscale model is divided into three main phases: (i) in step 0, the elastic constants of collagen-water and mineral-water composites are calculated by averaging the upper and lower Hill bounds; (ii) in step 1, the elastic properties of the collagen microfibril are computed using a trained NN simulation. Finite element calculation is performed at nanoscopic levels to provide a database to train an in-house NN program; and (iii) in steps 2-10, from fibril to continuum cortical bone tissue, homogenization equations are used to perform the computation at the higher scales. The NN outputs (elastic properties of the microfibril) are used as inputs for the homogenization computation to determine the properties of the mineralized collagen fibril. The mechanical and geometrical properties of bone constituents (mineral, collagen, and cross-links) as well as the porosity were taken into consideration. This paper aims to predict analytically the effective elastic constants of cortical bone by modeling its elastic response at these different scales, ranging from the nanostructural to mesostructural levels. The outputs of the lowest scale integrate well with the higher levels and serve as inputs for the modeling at the next higher scale. Good agreement was obtained between our predicted results and literature data. Copyright © 2013 John Wiley & Sons, Ltd.
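
    Step (i), averaging the upper (Voigt) and lower (Reuss) Hill bounds for a two-phase mixture, can be sketched as follows; the phase moduli and volume fractions are illustrative placeholders, not the collagen-water or mineral-water values used in the paper.

        # Sketch of a Voigt-Reuss-Hill estimate for the modulus of a two-phase composite.
        # Phase moduli (GPa) and volume fractions are illustrative assumptions only.
        def hill_average(moduli, fractions):
            voigt = sum(f * m for f, m in zip(fractions, moduli))          # upper bound
            reuss = 1.0 / sum(f / m for f, m in zip(fractions, moduli))    # lower bound
            return 0.5 * (voigt + reuss)

        mineral_water = hill_average([114.0, 2.3], [0.9, 0.1])    # hypothetical mineral-water mix
        collagen_water = hill_average([2.9, 2.3], [0.8, 0.2])     # hypothetical collagen-water mix
        print(mineral_water, collagen_water)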

  17. Importance of elastic finite-size effects: Neutral defects in ionic compounds

    DOE PAGES

    Burr, P. A.; Cooper, M. W. D.

    2017-09-15

    Small system sizes are a well known source of error in DFT calculations, yet computational constraints frequently dictate the use of small supercells, often as small as 96 atoms in oxides and compound semiconductors. In ionic compounds, electrostatic finite size effects have been well characterised, but self-interaction of charge neutral defects is often discounted or assumed to follow an asymptotic behaviour and thus easily corrected with linear elastic theory. Here we show that elastic effects are also important in the description of defects in ionic compounds and can lead to qualitatively incorrect conclusions if inadequately small supercells are used; moreover, the spurious self-interaction does not follow the behaviour predicted by linear elastic theory. Considering the exemplar cases of metal oxides with fluorite structure, we show that numerous previous studies, employing 96-atom supercells, misidentify the ground state structure of (charge neutral) Schottky defects. We show that the error is eliminated by employing larger cells (324, 768 and 1500 atoms), and careful analysis determines that elastic effects, not electrostatic, are responsible. The spurious self-interaction was also observed in non-oxide ionic compounds irrespective of the computational method used, thereby resolving long standing discrepancies between DFT and force-field methods, previously attributed to the level of theory. The surprising magnitude of the elastic effects is a cautionary tale for defect calculations in ionic materials, particularly when employing computationally expensive methods (e.g. hybrid functionals) or when modelling large defect clusters. We propose two computationally practicable methods to test the magnitude of the elastic self-interaction in any ionic system. In commonly studied oxides, where electrostatic effects would be expected to be dominant, it is the elastic effects that dictate the need for larger supercells: greater than 96 atoms.

  18. Importance of elastic finite-size effects: Neutral defects in ionic compounds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burr, P. A.; Cooper, M. W. D.

    Small system sizes are a well known source of error in DFT calculations, yet computational constraints frequently dictate the use of small supercells, often as small as 96 atoms in oxides and compound semiconductors. In ionic compounds, electrostatic finite size effects have been well characterised, but self-interaction of charge neutral defects is often discounted or assumed to follow an asymptotic behaviour and thus easily corrected with linear elastic theory. Here we show that elastic effects are also important in the description of defects in ionic compounds and can lead to qualitatively incorrect conclusions if inadequately small supercells are used; moreover, the spurious self-interaction does not follow the behaviour predicted by linear elastic theory. Considering the exemplar cases of metal oxides with fluorite structure, we show that numerous previous studies, employing 96-atom supercells, misidentify the ground state structure of (charge neutral) Schottky defects. We show that the error is eliminated by employing larger cells (324, 768 and 1500 atoms), and careful analysis determines that elastic effects, not electrostatic, are responsible. The spurious self-interaction was also observed in non-oxide ionic compounds irrespective of the computational method used, thereby resolving long standing discrepancies between DFT and force-field methods, previously attributed to the level of theory. The surprising magnitude of the elastic effects is a cautionary tale for defect calculations in ionic materials, particularly when employing computationally expensive methods (e.g. hybrid functionals) or when modelling large defect clusters. We propose two computationally practicable methods to test the magnitude of the elastic self-interaction in any ionic system. In commonly studied oxides, where electrostatic effects would be expected to be dominant, it is the elastic effects that dictate the need for larger supercells: greater than 96 atoms.

  19. Importance of elastic finite-size effects: Neutral defects in ionic compounds

    NASA Astrophysics Data System (ADS)

    Burr, P. A.; Cooper, M. W. D.

    2017-09-01

    Small system sizes are a well-known source of error in density functional theory (DFT) calculations, yet computational constraints frequently dictate the use of small supercells, often as small as 96 atoms in oxides and compound semiconductors. In ionic compounds, electrostatic finite-size effects have been well characterized, but self-interaction of charge-neutral defects is often discounted or assumed to follow an asymptotic behavior and thus easily corrected with linear elastic theory. Here we show that elastic effects are also important in the description of defects in ionic compounds and can lead to qualitatively incorrect conclusions if inadequately small supercells are used; moreover, the spurious self-interaction does not follow the behavior predicted by linear elastic theory. Considering the exemplar cases of metal oxides with fluorite structure, we show that numerous previous studies, employing 96-atom supercells, misidentify the ground-state structure of (charge-neutral) Schottky defects. We show that the error is eliminated by employing larger cells (324, 768, and 1500 atoms), and careful analysis determines that elastic, not electrostatic, effects are responsible. The spurious self-interaction was also observed in nonoxide ionic compounds irrespective of the computational method used, thereby resolving long-standing discrepancies between DFT and force-field methods, previously attributed to the level of theory. The surprising magnitude of the elastic effects is a cautionary tale for defect calculations in ionic materials, particularly when employing computationally expensive methods (e.g., hybrid functionals) or when modeling large defect clusters. We propose two computationally practicable methods to test the magnitude of the elastic self-interaction in any ionic system. In commonly studied oxides, where electrostatic effects would be expected to be dominant, it is the elastic effects that dictate the need for larger supercells: greater than 96 atoms.

  20. A virtual surgical training system that simulates cutting of soft tissue using a modified pre-computed elastic model.

    PubMed

    Toe, Kyaw Kyar; Huang, Weimin; Yang, Tao; Duan, Yuping; Zhou, Jiayin; Su, Yi; Teo, Soo-Kng; Kumar, Selvaraj Senthil; Lim, Calvin Chi-Wan; Chui, Chee Kong; Chang, Stephen

    2015-08-01

    This work presents a surgical training system that incorporates cutting operation of soft tissue simulated based on a modified pre-computed linear elastic model in the Simulation Open Framework Architecture (SOFA) environment. A precomputed linear elastic model used for the simulation of soft tissue deformation involves computing the compliance matrix a priori based on the topological information of the mesh. While this process may require a few minutes to several hours, based on the number of vertices in the mesh, it needs only to be computed once and allows real-time computation of the subsequent soft tissue deformation. However, as the compliance matrix is based on the initial topology of the mesh, it does not allow any topological changes during simulation, such as cutting or tearing of the mesh. This work proposes a way to modify the pre-computed data by correcting the topological connectivity in the compliance matrix, without re-computing the compliance matrix which is computationally expensive.
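
    The role of the pre-computed compliance matrix, turning each deformation update into a single matrix-vector product, can be sketched as below; the small random stiffness matrix stands in for an assembled finite-element stiffness, and the connectivity correction applied after a cut is specific to the paper and is not reproduced here.

        # Sketch: real-time deformation with a pre-computed compliance matrix C = inverse(K).
        # The stiffness K here is a random SPD stand-in for an assembled FE stiffness matrix.
        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.normal(size=(30, 30))
        K = A @ A.T + 30 * np.eye(30)        # symmetric positive definite stand-in stiffness
        C = np.linalg.inv(K)                 # pre-computed once (minutes to hours for real meshes)

        f = np.zeros(30)
        f[5] = 1.0                           # unit load on one degree of freedom
        u = C @ f                            # per-frame displacement update is just a mat-vec
        print(u[:5])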

  1. Deploying Crowd-Sourced Formal Verification Systems in a DoD Network

    DTIC Science & Technology

    2013-09-01

    In 2014 cyber attacks on critical infrastructure are expected to increase...CSFV systems on the Internet‒‒possibly using cloud infrastructure (Dean, 2013). By using Amazon Compute Cloud (EC2) systems, DARPA will use ordinary...through standard access methods. Those clients could be mobile phones, laptops, netbooks, tablet computers or personal digital assistants (PDAs) (Smoot

  2. Comparative Study of Nondestructive Pavement Testing, MacDill Air Force Base, Florida

    DTIC Science & Technology

    1987-07-01

    a deflection-basin fitting program that predicts moduli of the pavement layers and subgrade. A layered-elastic program AIRPOD is used in a fatigue...Layered-elastic Limiting stress in PCC; of pavement layers (AIRPOD) limiting strain in AC from deflection basin (ELSYM-5) (BASFIT) AFESC Elastic moduli of...need not be made. 121. Pavement evaluation computations were next accomplished using a series of computer programs referred to as ELSYM-5 and AIRPOD

  3. Computing the Dynamic Response of a Stratified Elastic Half Space Using Diffuse Field Theory

    NASA Astrophysics Data System (ADS)

    Sanchez-Sesma, F. J.; Perton, M.; Molina Villegas, J. C.

    2015-12-01

    The analytical solution for the dynamic response of an elastic half-space to a normal point load at the free surface is due to Lamb (1904). For a tangential force, we have Chao's (1960) formulae. For an arbitrary load at any depth within a stratified elastic half-space, the resulting elastic field can be given in the same fashion, by using an integral representation in the radial wavenumber domain. Typically, computations use the discrete wave number (DWN) formalism, and Fourier analysis allows for a solution in the space and time domain. Experimentally, these elastic Green's functions might be retrieved from ambient vibration correlations when assuming a diffuse field. In practice, the field may not be totally diffuse, and only parts of the Green's functions, associated with surface or body waves, are retrieved. In this communication, we explore the computation of Green's functions for a layered medium on top of a half-space using a set of equipartitioned elastic plane waves. Our formalism includes body and surface waves (Rayleigh and Love waves). The latter correspond to the classical representations in terms of normal modes in the asymptotic case of large separation distance between source and receiver. This approach allows computing Green's functions faster than DWN and separating the surface and body wave contributions in order to better represent the experimentally retrieved Green's functions.

  4. Pharmacokinetics of nalbuphine hydrochloride after intravenous and intramuscular administration to Hispaniolan Amazon parrots (Amazona ventralis).

    PubMed

    Keller, Dominique L; Sanchez-Migallon Guzman, David; Klauer, Julia M; KuKanich, Butch; Barker, Steven A; Rodríguez-Ramos Fernández, Julia; Paul-Murphy, Joanne R

    2011-06-01

    To assess the pharmacokinetics of nalbuphine HCl after IV and IM administration to Hispaniolan Amazon parrots (Amazona ventralis). 8 healthy adult Hispaniolan Amazon parrots of unknown sex. Nalbuphine HCl (12.5 mg/kg) was administered IV and IM to all birds in a complete randomized crossover study design; there was a washout period of 21 days between subsequent administrations. Plasma samples were obtained from blood collected at predetermined time points for measurement of nalbuphine concentration by use of liquid chromatography-tandem mass spectrometry. Pharmacokinetic parameters were estimated by use of computer software. Nalbuphine was rapidly eliminated with a terminal half-life of 0.33 hours and clearance of 69.95 mL/min/kg after IV administration and a half-life of 0.35 hours after IM administration. Volume of distribution was 2.01 L/kg after IV administration. The fraction of the dose absorbed was high (1.03) after IM administration. No adverse effects were detected in the parrots during the study. In Hispaniolan Amazon parrots, nalbuphine appeared to have good bioavailability after IM administration and was rapidly cleared after IV and IM administration. Safety and analgesic efficacy of various nalbuphine treatment regimens in this species require further investigation to determine the potential for clinical palliation of signs of pain in psittacine species.

  5. Unidata's Vision for Transforming Geoscience by Moving Data Services and Software to the Cloud

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Fisher, W.; Yoksas, T.

    2014-12-01

    Universities are facing many challenges: shrinking budgets, rapidly evolving information technologies, exploding data volumes, multidisciplinary science requirements, and high student expectations. These changes are upending traditional approaches to accessing and using data and software. It is clear that Unidata's products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is moving to augment its products, services, data delivery mechanisms and applications to align with the cloud-computing paradigm. Specifically, Unidata is working toward establishing a community-based development environment that supports the creation and use of software services to build end-to-end data workflows. The design encourages the creation of services that can be broken into small, independent chunks that provide simple capabilities. Chunks could be used individually to perform a task, or chained into simple or elaborate workflows. The services will also be portable, allowing their use in researchers' own cloud-based computing environments. In this talk, we present a vision for Unidata's future in cloud-enabled data services and discuss our initial efforts to deploy a subset of Unidata data services and tools in the Amazon EC2 and Microsoft Azure cloud environments, including the transfer of real-time meteorological data into its cloud instances, product generation using those data, and the deployment of TDS, McIDAS ADDE and AWIPS II data servers and the Integrated Data Viewer (IDV) visualization tool.

  6. PhyloBot: A Web Portal for Automated Phylogenetics, Ancestral Sequence Reconstruction, and Exploration of Mutational Trajectories.

    PubMed

    Hanson-Smith, Victor; Johnson, Alexander

    2016-07-01

    The method of phylogenetic ancestral sequence reconstruction is a powerful approach for studying evolutionary relationships among protein sequence, structure, and function. In particular, this approach allows investigators to (1) reconstruct and "resurrect" (that is, synthesize in vivo or in vitro) extinct proteins to study how they differ from modern proteins, (2) identify key amino acid changes that, over evolutionary timescales, have altered the function of the protein, and (3) order historical events in the evolution of protein function. Widespread use of this approach has been slow among molecular biologists, in part because the methods require significant computational expertise. Here we present PhyloBot, a web-based software tool that makes ancestral sequence reconstruction easy. Designed for non-experts, it integrates all the necessary software into a single user interface. Additionally, PhyloBot provides interactive tools to explore evolutionary trajectories between ancestors, enabling the rapid generation of hypotheses that can be tested using genetic or biochemical approaches. Early versions of this software were used in previous studies to discover genetic mechanisms underlying the functions of diverse protein families, including V-ATPase ion pumps, DNA-binding transcription regulators, and serine/threonine protein kinases. PhyloBot runs in a web browser, and is available at the following URL: http://www.phylobot.com. The software is implemented in Python using the Django web framework, and runs on elastic cloud computing resources from Amazon Web Services. Users can create and submit jobs on our free server (at the URL listed above), or use our open-source code to launch their own PhyloBot server.

  7. PhyloBot: A Web Portal for Automated Phylogenetics, Ancestral Sequence Reconstruction, and Exploration of Mutational Trajectories

    PubMed Central

    Hanson-Smith, Victor; Johnson, Alexander

    2016-01-01

    The method of phylogenetic ancestral sequence reconstruction is a powerful approach for studying evolutionary relationships among protein sequence, structure, and function. In particular, this approach allows investigators to (1) reconstruct and “resurrect” (that is, synthesize in vivo or in vitro) extinct proteins to study how they differ from modern proteins, (2) identify key amino acid changes that, over evolutionary timescales, have altered the function of the protein, and (3) order historical events in the evolution of protein function. Widespread use of this approach has been slow among molecular biologists, in part because the methods require significant computational expertise. Here we present PhyloBot, a web-based software tool that makes ancestral sequence reconstruction easy. Designed for non-experts, it integrates all the necessary software into a single user interface. Additionally, PhyloBot provides interactive tools to explore evolutionary trajectories between ancestors, enabling the rapid generation of hypotheses that can be tested using genetic or biochemical approaches. Early versions of this software were used in previous studies to discover genetic mechanisms underlying the functions of diverse protein families, including V-ATPase ion pumps, DNA-binding transcription regulators, and serine/threonine protein kinases. PhyloBot runs in a web browser, and is available at the following URL: http://www.phylobot.com. The software is implemented in Python using the Django web framework, and runs on elastic cloud computing resources from Amazon Web Services. Users can create and submit jobs on our free server (at the URL listed above), or use our open-source code to launch their own PhyloBot server. PMID:27472806

  8. Producing an Infrared Multiwavelength Galactic Plane Atlas Using Montage, Pegasus, and Amazon Web Services

    NASA Astrophysics Data System (ADS)

    Rynge, M.; Juve, G.; Kinney, J.; Good, J.; Berriman, B.; Merrihew, A.; Deelman, E.

    2014-05-01

    In this paper, we describe how to leverage cloud resources to generate large-scale mosaics of the galactic plane in multiple wavelengths. Our goal is to generate a 16-wavelength infrared Atlas of the Galactic Plane at a common spatial sampling of 1 arcsec, processed so that they appear to have been measured with a single instrument. This will be achieved by using the Montage image mosaic engine to process observations from the 2MASS, GLIMPSE, MIPSGAL, MSX and WISE datasets, over a wavelength range of 1 μm to 24 μm, and by using the Pegasus Workflow Management System for managing the workload. When complete, the Atlas will be made available to the community as a data product. We are generating images that cover ±180° in Galactic longitude and ±20° in Galactic latitude, to the extent permitted by the spatial coverage of each dataset. Each image will be 5°x5° in size (including an overlap of 1° with neighboring tiles), resulting in an atlas of 1,001 images. The final size will be about 50 TB. This paper will focus on the computational challenges, solutions, and lessons learned in producing the Atlas. To manage the computation we are using the Pegasus Workflow Management System, a mature, highly fault-tolerant system now in release 4.2.2 that has found wide applicability across many science disciplines. A scientific workflow describes the dependencies between the tasks, and in most cases the workflow is described as a directed acyclic graph, where the nodes are tasks and the edges denote the task dependencies. A defining property of a scientific workflow is that it manages data flow between tasks. Applied to the galactic plane project, each 5°x5° mosaic is a Pegasus workflow. Pegasus is used to fetch the source images, execute the image mosaicking steps of Montage, and store the final outputs in a storage system. As these workflows are very I/O intensive, care has to be taken when choosing what infrastructure to execute the workflow on. In our setup, we chose to use dynamically provisioned compute clusters running on the Amazon Elastic Compute Cloud (EC2). All our instances use the same base image, which is configured to come up as a master node by default. The master node is a central instance from which the workflow can be managed. Additional worker instances are provisioned and configured to accept work assignments from the master node. The system allows for adding/removing workers in an ad hoc fashion, and could be run in large configurations. To date we have performed 245,000 CPU hours of computing and generated 7,029 images totaling 30 TB. With the current setup, our runtime would be 340,000 CPU hours for the whole project. Using spot m2.4xlarge instances, the cost would be approximately $5,950. Using faster AWS instances, such as cc2.8xlarge, could potentially decrease the total CPU hours and further reduce the compute costs. The paper will explore these tradeoffs.
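
    The workflow structure described, tasks as nodes and data dependencies as edges of a directed acyclic graph, can be illustrated with a toy dependency-ordered run; the task names only loosely echo the mosaic steps, and the real runs are orchestrated by Pegasus rather than by this sketch.

        # Toy DAG executed in dependency order (a stand-in for what Pegasus manages at scale).
        from graphlib import TopologicalSorter

        dag = {
            "fetch_images": [],
            "reproject": ["fetch_images"],
            "background_match": ["reproject"],
            "coadd_tile": ["background_match"],
            "store_output": ["coadd_tile"],
        }

        for task in TopologicalSorter(dag).static_order():
            print("run", task)   # a real workflow manager dispatches each task to a worker node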

  9. Toward a web-based real-time radiation treatment planning system in a cloud computing environment.

    PubMed

    Na, Yong Hum; Suh, Tae-Suk; Kapp, Daniel S; Xing, Lei

    2013-09-21

    To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (named m2.xlarge containing 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm(2)) from the Varian TrueBeam(TM) STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are identical to PC-based IMRT and VMAT plans, confirming the reliability of the cloud computing platform. This cloud computing infrastructure has been established for radiation treatment planning. It substantially improves the speed of inverse planning and makes future on-treatment adaptive re-planning possible.

  10. Toward a web-based real-time radiation treatment planning system in a cloud computing environment

    NASA Astrophysics Data System (ADS)

    Na, Yong Hum; Suh, Tae-Suk; Kapp, Daniel S.; Xing, Lei

    2013-09-01

    To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (named m2.xlarge containing 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an ‘on-demand’ basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer’s constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm2) from the Varian TrueBeamTM STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are identical to PC-based IMRT and VMAT plans, confirming the reliability of the cloud computing platform. This cloud computing infrastructure has been established for radiation treatment planning. It substantially improves the speed of inverse planning and makes future on-treatment adaptive re-planning possible.

  11. Elasticity of crystalline molecular explosives

    DOE PAGES

    Hooks, Daniel E.; Ramos, Kyle J.; Bolme, C. A.; ...

    2015-04-14

    Crystalline molecular explosives are key components of engineered explosive formulations. In precision applications a high degree of consistency and predictability is desired under a range of conditions to a variety of stimuli. Prediction of behaviors from mechanical response and failure to detonation initiation and detonation performance of the material is linked to accurate knowledge of the material structure and first stage of deformation: elasticity. The elastic response of pentaerythritol tetranitrate (PETN), cyclotrimethylene trinitramine (RDX), and cyclotetramethylene tetranitramine (HMX), including aspects of material and measurement variability, and computational methods are described in detail. Experimental determinations of elastic tensors are compared, and an evaluation of sources of error is presented. Furthermore, computed elastic constants are also compared for these materials and for triaminotrinitrobenzene (TATB), for which there are no measurements.

  12. Fast Computation of High Energy Elastic Collision Scattering Angle for Electric Propulsion Plume Simulation (Conference Paper with Briefing Charts)

    DTIC Science & Technology

    2016-07-10

    Report fragment (DTIC documentation-page residue): "...atom needs to be sampled; however, it is confirmed that initial target atom velocity does not play a significant role in typical electric propulsion..." (Samuel J. Araki).

  13. AELAS: Automatic ELAStic property derivations via high-throughput first-principles computation

    NASA Astrophysics Data System (ADS)

    Zhang, S. H.; Zhang, R. F.

    2017-11-01

    The elastic properties are fundamental and important for crystalline materials as they relate to other mechanical properties, various thermodynamic quantities as well as some critical physical properties. However, a complete set of experimentally determined elastic properties is only available for a small subset of known materials, and an automatic scheme for the derivation of elastic properties that is adapted to high-throughput computation is highly desirable. In this paper, we present the AELAS code, an automated program for calculating second-order elastic constants of both two-dimensional and three-dimensional single crystal materials with any symmetry, which is designed mainly for high-throughput first-principles computation. Other derivations of general elastic properties such as Young's, bulk and shear moduli as well as Poisson's ratio of polycrystal materials, Pugh ratio, Cauchy pressure, elastic anisotropy and elastic stability criterion, are also implemented in this code. The implementation of the code has been critically validated by extensive evaluations and tests on a broad class of materials including two-dimensional and three-dimensional materials, demonstrating its efficiency and capability for high-throughput screening of specific materials with targeted mechanical properties. Program Files doi:http://dx.doi.org/10.17632/f8fwg4j9tw.1 Licensing provisions: BSD 3-Clause Programming language: Fortran Nature of problem: To automate the calculations of second-order elastic constants and the derivations of other elastic properties for two-dimensional and three-dimensional materials with any symmetry via high-throughput first-principles computation. Solution method: The space-group number is first determined by the SPGLIB code [1] and the structure is then redefined to a unit cell in IEEE format [2]. Secondly, based on the determined space group number, a set of distortion modes is automatically specified and the distorted structure files are generated. Afterwards, the total energy for each distorted structure is calculated by the first-principles codes, e.g. VASP [3]. Finally, the second-order elastic constants are determined from the quadratic coefficients of the polynomial fitting of the energy vs. strain relationships and other elastic properties are accordingly derived. References [1] http://atztogo.github.io/spglib/. [2] A. Meitzler, H.F. Tiersten, A.W. Warner, D. Berlincourt, G.A. Couqin, F.S. Welsh III, IEEE standard on piezoelectricity, Society, 1988. [3] G. Kresse, J. Furthmüller, Phys. Rev. B 54 (1996) 11169.
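    The energy-strain fitting step at the heart of such workflows can be illustrated in a few lines. In the hedged sketch below the energies are synthetic stand-ins for DFT output, and the cell volume and the "true" constant are invented; it only shows how a second-order elastic constant follows from the quadratic coefficient of a polynomial fit of total energy versus strain amplitude.

```python
import numpy as np

# E(delta) ~ E0 + (V0/2) * C * delta^2 for a suitable distortion mode,
# so C follows from the quadratic coefficient of a polynomial fit.

V0 = 45.0e-30                     # equilibrium cell volume in m^3 (assumed)
eV = 1.602176634e-19              # J per eV

strains = np.linspace(-0.02, 0.02, 9)            # applied strain amplitudes delta
C_true = 250e9                                   # "true" constant, for the demo only
energies_eV = (0.5 * C_true * V0 * strains**2) / eV - 123.4   # fake DFT energies

coeffs = np.polyfit(strains, energies_eV, 2)     # quadratic fit (in eV)
C_fit = 2.0 * coeffs[0] * eV / V0                # back to Pa
print(f"fitted elastic constant: {C_fit / 1e9:.1f} GPa")
```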

  14. Cloud-based Web Services for Near-Real-Time Web access to NPP Satellite Imagery and other Data

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Valente, E. G.

    2010-12-01

    We are building a scalable, cloud computing-based infrastructure for Web access to near-real-time data products synthesized from the U.S. National Polar-Orbiting Environmental Satellite System (NPOESS) Preparatory Project (NPP) and other geospatial and meteorological data. Given recent and ongoing changes in the NPP and NPOESS programs (now Joint Polar Satellite System), the need for timely delivery of NPP data is urgent. We propose an alternative to a traditional, centralized ground segment, using distributed Direct Broadcast facilities linked to industry-standard Web services by a streamlined processing chain running in a scalable cloud computing environment. Our processing chain, currently implemented on Amazon.com's Elastic Compute Cloud (EC2), retrieves raw data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) and synthesizes data products such as Sea-Surface Temperature, Vegetation Indices, etc. The cloud computing approach lets us grow and shrink computing resources to meet large and rapid fluctuations (twice daily) in both end-user demand and data availability from polar-orbiting sensors. Early prototypes have delivered various data products to end-users with latencies between 6 and 32 minutes. We have begun to replicate machine instances in the cloud, so as to reduce latency and maintain near-real time data access regardless of increased data input rates or user demand -- all at quite moderate monthly costs. Our service-based approach (in which users invoke software processes on a Web-accessible server) facilitates access into datasets of arbitrary size and resolution, and allows users to request and receive tailored and composite (e.g., false-color multiband) products on demand. To facilitate broad impact and adoption of our technology, we have emphasized open, industry-standard software interfaces and open source software. Through our work, we envision the widespread establishment of similar, derived, or interoperable systems for processing and serving near-real-time data from NPP and other sensors. A scalable architecture based on cloud computing ensures cost-effective, real-time processing and delivery of NPP and other data. Access via standard Web services maximizes its interoperability and usefulness.

  15. OceanXtremes: Scalable Anomaly Detection in Oceanographic Time-Series

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Armstrong, E. M.; Chin, T. M.; Gill, K. M.; Greguska, F. R., III; Huang, T.; Jacob, J. C.; Quach, N.

    2016-12-01

    The oceanographic community must meet the challenge to rapidly identify features and anomalies in complex and voluminous observations to further science and improve decision support. Given this data-intensive reality, we are developing an anomaly detection system, called OceanXtremes, powered by an intelligent, elastic Cloud-based analytic service backend that enables execution of domain-specific, multi-scale anomaly and feature detection algorithms across the entire archive of 15 to 30-year ocean science datasets. Our parallel analytics engine is extending the NEXUS system and exploits multiple open-source technologies: Apache Cassandra as a distributed spatial "tile" cache, Apache Spark for in-memory parallel computation, and Apache Solr for spatial search and storing pre-computed tile statistics and other metadata. OceanXtremes provides these key capabilities: Parallel generation (Spark on a compute cluster) of 15 to 30-year Ocean Climatologies (e.g. sea surface temperature or SST) in hours or overnight, using simple pixel averages or customizable Gaussian-weighted "smoothing" over latitude, longitude, and time; Parallel pre-computation, tiling, and caching of anomaly fields (daily variables minus a chosen climatology) with pre-computed tile statistics; Parallel detection (over the time-series of tiles) of anomalies or phenomena by regional area-averages exceeding a specified threshold (e.g. high SST in El Nino or SST "blob" regions), or more complex, custom data mining algorithms; Shared discovery and exploration of ocean phenomena and anomalies (facet search using Solr), along with unexpected correlations between key measured variables; Scalable execution for all capabilities on a hybrid Cloud, using our on-premise OpenStack Cloud cluster or at Amazon. The key idea is that the parallel data-mining operations will be run "near" the ocean data archives (a local "network" hop) so that we can efficiently access the thousands of files making up a three decade time-series. The presentation will cover the architecture of OceanXtremes, parallelization of the climatology computation and anomaly detection algorithms using Spark, example results for SST and other time-series, and parallel performance metrics.
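    A single-machine toy version of the climatology / anomaly / threshold chain is sketched below. The SST cube is synthetic and the regional threshold is an assumed value; the real system performs these steps in parallel with Spark over tiled Cassandra storage, which is not shown.

```python
import numpy as np

# Build a per-day-of-year climatology, subtract it to get anomalies, and flag
# times when the regional area-average anomaly exceeds a threshold.

rng = np.random.default_rng(1)
n_years, lat, lon = 20, 32, 32
sst = (20.0
       + 3.0 * np.sin(2 * np.pi * np.arange(365) / 365)[None, :, None, None]
       + rng.normal(0.0, 0.5, size=(n_years, 365, lat, lon)))
sst[12, 100:140] += 2.0                         # implant an "El Nino-like" warm event

climatology = sst.mean(axis=0)                  # (365, lat, lon) day-of-year mean
anomaly = sst - climatology                     # broadcast over years
region_mean = anomaly.mean(axis=(2, 3))         # area-average per (year, day)

threshold = 1.0                                 # degrees C (assumed)
years, days = np.where(region_mean > threshold)
print("anomalous (year, day) pairs:", list(zip(years, days))[:5])
```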

  16. On the anisotropic elastic properties of hydroxyapatite.

    NASA Technical Reports Server (NTRS)

    Katz, J. L.; Ukraincik, K.

    1971-01-01

    Experimental measurements of the isotropic elastic moduli on polycrystalline specimens of hydroxyapatite and fluorapatite are compared with elastic constants measured directly from single crystals of fluorapatite in order to derive a set of pseudo single crystal elastic constants for hydroxyapatite. The stiffness coefficients thus derived are given. The anisotropic and isotropic elastic properties are then computed and compared with similar properties derived from experimental observations of the anisotropic behavior of bone.

  17. An expert fitness diagnosis system based on elastic cloud computing.

    PubMed

    Tseng, Kevin C; Wu, Chia-Chuan

    2014-01-01

    This paper presents an expert diagnosis system based on cloud computing. It classifies a user's fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user's physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on the Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to closely track the trend of requests arriving from the Internet and thus assign the corresponding computation resources to ensure the quality of service.
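    The elastic-allocation idea can be illustrated with a short, hedged sketch: forecast the incoming request rate with an exponential moving average (EMA) of past observations, then provision enough instances to cover a Poisson-modelled demand with high probability (here via the normal approximation to the Poisson upper tail, which is a substitution for whatever rule the paper actually uses). The per-instance capacity, smoothing factor, and safety quantile are assumed values.

```python
import math

def ema(observations, alpha=0.3):
    """Exponential moving average of past request counts."""
    s = observations[0]
    for x in observations[1:]:
        s = alpha * x + (1 - alpha) * s
    return s

def instances_needed(observations, per_instance=50, z=1.645):
    lam = ema(observations)                 # predicted request rate
    demand = lam + z * math.sqrt(lam)       # ~95th percentile of Poisson(lam), normal approx.
    return max(1, math.ceil(demand / per_instance))

print(instances_needed([120, 180, 240, 310, 280]))   # -> number of VMs to provision
```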

  18. Materials constitutive models for nonlinear analysis of thermally cycled structures

    NASA Technical Reports Server (NTRS)

    Kaufman, A.; Hunt, L. E.

    1982-01-01

    Effects of inelastic materials models on computed stress-strain solutions for thermally loaded structures were studied by performing nonlinear (elastoplastic creep) and elastic structural analyses on a prismatic, double edge wedge specimen of IN 100 alloy that was subjected to thermal cycling in fluidized beds. Four incremental plasticity creep models (isotropic, kinematic, combined isotropic kinematic, and combined plus transient creep) were exercised for the problem by using the MARC nonlinear, finite element computer program. Maximum total strain ranges computed from the elastic and nonlinear analyses agreed within 5 percent. Mean cyclic stresses, inelastic strain ranges, and inelastic work were significantly affected by the choice of inelastic constitutive model. The computing time per cycle for the nonlinear analyses was more than five times that required for the elastic analysis.

  19. Covariance Matrix of a Double-Differential Doppler-Broadened Elastic Scattering Cross Section

    NASA Astrophysics Data System (ADS)

    Arbanas, G.; Becker, B.; Dagan, R.; Dunn, M. E.; Larson, N. M.; Leal, L. C.; Williams, M. L.

    2012-05-01

    Legendre moments of a double-differential Doppler-broadened elastic neutron scattering cross section on 238U are computed near the 6.67 eV resonance at temperature T = 103 K up to angular order 14. A covariance matrix of these Legendre moments is computed as a functional of the covariance matrix of the elastic scattering cross section. A variance of double-differential Doppler-broadened elastic scattering cross section is computed from the covariance of Legendre moments. Notice: This manuscript has been authored by UT-Battelle, LLC, under contract DE-AC05-00OR22725 with the U.S. Department of Energy. The United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.
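    The propagation step can be illustrated generically: once the Legendre moments are written as a linear map A applied to pointwise cross-section values (via Gauss-Legendre quadrature), the covariance of the moments is A Σ A^T. The sketch below uses a synthetic angular cross section and an assumed 5% uncorrelated covariance, not the 238U data of the paper.

```python
import numpy as np
from numpy.polynomial import legendre

order = 14
mu, w = legendre.leggauss(32)                     # Gauss-Legendre nodes and weights

# Linear map A from pointwise cross-section values to Legendre moments:
# sigma_l = sum_j w_j P_l(mu_j) sigma(mu_j)
A = np.array([w * legendre.Legendre.basis(l)(mu) for l in range(order + 1)])

sigma = 1.0 + 0.3 * mu + 0.1 * mu**2              # stand-in sigma(mu), arbitrary units
cov_sigma = np.diag((0.05 * sigma) ** 2)          # stand-in 5% uncorrelated covariance

moments = A @ sigma                               # Legendre moments
cov_moments = A @ cov_sigma @ A.T                 # propagated covariance matrix
print(moments[:3], np.sqrt(np.diag(cov_moments))[:3])
```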

  20. MC-GenomeKey: a multicloud system for the detection and annotation of genomic variants.

    PubMed

    Elshazly, Hatem; Souilmi, Yassine; Tonellato, Peter J; Wall, Dennis P; Abouelhoda, Mohamed

    2017-01-20

    Next Generation Genome sequencing techniques became affordable for massive sequencing efforts devoted to clinical characterization of human diseases. However, the cost of providing cloud-based data analysis of the mounting datasets remains a concerning bottleneck for providing cost-effective clinical services. To address this computational problem, it is important to optimize the variant analysis workflow and the analysis tools used to reduce the overall computational processing time, and concomitantly reduce the processing cost. Furthermore, it is important to capitalize on recent developments in the cloud computing market, which has witnessed more providers competing in terms of products and prices. In this paper, we present a new package called MC-GenomeKey (Multi-Cloud GenomeKey) that efficiently executes the variant analysis workflow for detecting and annotating mutations using cloud resources from different commercial cloud providers. Our package supports Amazon, Google, and Azure clouds, as well as any other cloud platform based on OpenStack. Our package allows different scenarios of execution with different levels of sophistication, up to the one where a workflow can be executed using a cluster whose nodes come from different clouds. MC-GenomeKey also supports scenarios to exploit the spot instance model of Amazon in combination with the use of other cloud platforms to provide significant cost reduction. To the best of our knowledge, this is the first solution that optimizes the execution of the workflow using computational resources from different cloud providers. MC-GenomeKey provides an efficient multicloud based solution to detect and annotate mutations. The package can run in different commercial cloud platforms, which enables the user to seize the best offers. The package also provides a reliable means to make use of the low-cost spot instance model of Amazon, as it provides an efficient solution to the sudden termination of spot machines as a result of a sudden price increase. The package has a web interface and is available free for academic use.

  1. Cloud CPFP: a shotgun proteomics data analysis pipeline using cloud and high performance computing.

    PubMed

    Trudgian, David C; Mirzaei, Hamid

    2012-12-07

    We have extended the functionality of the Central Proteomics Facilities Pipeline (CPFP) to allow use of remote cloud and high performance computing (HPC) resources for shotgun proteomics data processing. CPFP has been modified to include modular local and remote scheduling for data processing jobs. The pipeline can now be run on a single PC or server, a local cluster, a remote HPC cluster, and/or the Amazon Web Services (AWS) cloud. We provide public images that allow easy deployment of CPFP in its entirety in the AWS cloud. This significantly reduces the effort necessary to use the software, and allows proteomics laboratories to pay for compute time ad hoc, rather than obtaining and maintaining expensive local server clusters. Alternatively the Amazon cloud can be used to increase the throughput of a local installation of CPFP as necessary. We demonstrate that cloud CPFP allows users to process data at higher speed than local installations but with similar cost and lower staff requirements. In addition to the computational improvements, the web interface to CPFP is simplified, and other functionalities are enhanced. The software is under active development at two leading institutions and continues to be released under an open-source license at http://cpfp.sourceforge.net.

  2. Managing competing elastic Grid and Cloud scientific computing applications using OpenNebula

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Berzano, D.; Lusso, S.; Masera, M.; Vallero, S.

    2015-12-01

    Elastic cloud computing applications, i.e. applications that automatically scale according to computing needs, work on the ideal assumption of infinite resources. While large public cloud infrastructures may be a reasonable approximation of this condition, scientific computing centres like WLCG Grid sites usually work in a saturated regime, in which applications compete for scarce resources through queues, priorities and scheduling policies, and keeping a fraction of the computing cores idle to allow for headroom is usually not an option. In our particular environment one of the applications (a WLCG Tier-2 Grid site) is much larger than all the others and cannot autoscale easily. Nevertheless, other smaller applications can benefit from automatic elasticity; the implementation of this property in our infrastructure, based on the OpenNebula cloud stack, will be described and the very first operational experiences with a small number of strategies for timely allocation and release of resources will be discussed.

  3. Phase field benchmark problems for dendritic growth and linear elasticity

    DOE PAGES

    Jokisaari, Andrea M.; Voorhees, P. W.; Guyer, Jonathan E.; ...

    2018-03-26

    We present the second set of benchmark problems for phase field models that are being jointly developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST) along with input from other members in the phase field community. As the integrated computational materials engineering (ICME) approach to materials design has gained traction, there is an increasing need for quantitative phase field results. New algorithms and numerical implementations increase computational capabilities, necessitating standard problems to evaluate their impact on simulated microstructure evolution as well as their computational performance. We propose one benchmark problem for solidification and dendritic growth in a single-component system, and one problem for linear elasticity via the shape evolution of an elastically constrained precipitate. We demonstrate the utility and sensitivity of the benchmark problems by comparing the results of 1) dendritic growth simulations performed with different time integrators and 2) elastically constrained precipitate simulations with different precipitate sizes, initial conditions, and elastic moduli. As a result, these numerical benchmark problems will provide a consistent basis for evaluating different algorithms, both existing and those to be developed in the future, for accuracy and computational efficiency when applied to simulate physics often incorporated in phase field models.

  4. Phase field benchmark problems for dendritic growth and linear elasticity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jokisaari, Andrea M.; Voorhees, P. W.; Guyer, Jonathan E.

    We present the second set of benchmark problems for phase field models that are being jointly developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST) along with input from other members in the phase field community. As the integrated computational materials engineering (ICME) approach to materials design has gained traction, there is an increasing need for quantitative phase field results. New algorithms and numerical implementations increase computational capabilities, necessitating standard problems to evaluate their impact on simulated microstructure evolution as well as their computational performance. We propose one benchmark problem for solidification and dendritic growth in a single-component system, and one problem for linear elasticity via the shape evolution of an elastically constrained precipitate. We demonstrate the utility and sensitivity of the benchmark problems by comparing the results of 1) dendritic growth simulations performed with different time integrators and 2) elastically constrained precipitate simulations with different precipitate sizes, initial conditions, and elastic moduli. As a result, these numerical benchmark problems will provide a consistent basis for evaluating different algorithms, both existing and those to be developed in the future, for accuracy and computational efficiency when applied to simulate physics often incorporated in phase field models.

  5. Gravity anomalies and flexure of the lithosphere at the Middle Amazon Basin, Brazil

    NASA Astrophysics Data System (ADS)

    Nunn, Jeffrey A.; Aires, Jose R.

    1988-01-01

    The Middle Amazon Basin is a large Paleozoic sedimentary basin on the Amazonian craton in South America. It contains up to 7 km of mainly shallow water sediments. A chain of Bouguer gravity highs of approximately +40 to +90 mGals transects the basin roughly coincident with the axis of maximum thickness of sediment. The gravity highs are flanked on either side by gravity lows of approximately -40 mGals. The observed gravity anomalies can be explained by a steeply sided zone of high density in the lower crust varying in width from 100 to 200 km wide. Within this region, the continental crust has been intruded/replaced by more dense material to more than half its original thickness of 45-50 km. The much wider sedimentary basin results from regional compensation of the subsurface load and the subsequent load of accumulated sediments by flexure of the lithosphere. The observed geometry of the basin is consistent with an elastic lithosphere model with a mechanical thickness of 15-20 km. Although this value is lower than expected for a stable cratonic region of Early Proterozoic age, it is within the accepted range of effective elastic thicknesses for the earth. Rapid subsidence during the late Paleozoic may be evidence of a second tectonic event or lithospheric relaxation which could lower the effective mechanical thickness of the lithosphere. The high-density zone in the lower crust, as delineated by gravity and flexural modeling, has a complex sinuous geometry which is narrow and south of the axis of maximum sediment thickness on the east and west margins and wide and offset to the north in the center of the basin. The linear trough geometry of the basin itself is a result of smoothing by regional compensation of the load in the lower crust.

  6. Research on elastic resource management for multi-queue under cloud computing environment

    NASA Astrophysics Data System (ADS)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

    As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queue of different experiments. However, this method cannot be well adapted to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system under a cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the HTCondor job queues, using dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper will present several use cases of the elastic resource management system in IHEPCloud. In practice, virtual computing resources are dynamically expanded or shrunk as computing requirements change. Additionally, the CPU utilization ratio of computing resources was significantly increased compared with traditional resource management. The system also has good performance when there are multiple HTCondor schedulers and multiple job queues.
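    A toy version of the dual-threshold decision logic is sketched below. The threshold values, queue names, and counts are illustrative only, and the actual OpenStack/HTCondor calls that would act on the decision are omitted; this is a sketch of the general idea, not the IHEP implementation.

```python
# Expand a queue's pool when pending jobs exceed a high threshold (subject to
# its quota); shrink it when idle virtual machines exceed a low threshold.

HIGH, LOW = 50, 10            # pending-job and idle-VM thresholds (assumed)

def scaling_decision(queue):
    """queue: dict with 'pending', 'idle', 'running', 'quota' counts."""
    if queue["pending"] > HIGH and queue["running"] < queue["quota"]:
        grow = min(queue["pending"] - HIGH, queue["quota"] - queue["running"])
        return ("expand", grow)
    if queue["pending"] == 0 and queue["idle"] > LOW:
        return ("shrink", queue["idle"] - LOW)
    return ("hold", 0)

queues = {   # invented snapshot of two experiment queues
    "exp_a": {"pending": 120, "idle": 2,  "running": 300, "quota": 400},
    "exp_b": {"pending": 0,   "idle": 35, "running": 80,  "quota": 200},
}
for name, q in queues.items():
    print(name, scaling_decision(q))
```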

  7. Anisotropy and temperature dependence of structural, thermodynamic, and elastic properties of crystalline cellulose Iβ: a first-principles investigation

    Treesearch

    ShunLi Shang; Louis G. Hector Jr.; Paul Saxe; Zi-Kui Liu; Robert J. Moon; Pablo D. Zavattieri

    2014-01-01

    Anisotropy and temperature dependence of structural, thermodynamic and elastic properties of crystalline cellulose Iβ were computed with first-principles density functional theory (DFT) and a semi-empirical correction for van der Waals interactions. Specifically, we report the computed temperature variation (up to 500...

  8. Scattering of Airy elastic sheets by a cylindrical cavity in a solid.

    PubMed

    Mitri, F G

    2017-11-01

    The prediction of the elastic scattering by voids (and cracks) in materials is an important process in structural health monitoring, phononic crystals, metamaterials and non-destructive evaluation/imaging to name a few examples. Earlier analytical theories and numerical computations considered the elastic scattering by voids in plane waves of infinite extent. However, current research suggesting the use of (limited-diffracting, accelerating and self-healing) Airy acoustical-sheet beams for non-destructive evaluation or imaging applications in elastic solids requires the development of an improved analytical formalism to predict the scattering efficiency used as a priori information in quantitative material characterization. Based on the definition of the time-averaged scattered power flow density, an analytical expression for the scattering efficiency of a cylindrical empty cavity (i.e., void) encased in an elastic medium is derived for compressional and normally-polarized shear-wave Airy beams. The multipole expansion method using cylindrical wave functions is utilized. Numerical computations for the scattering energy efficiency factors for compressional and shear waves illustrate the analysis with particular emphasis on the Airy beam parameters and the non-dimensional frequency, for various elastic materials surrounding the cavity. The ratio of the compressional to the shear wave speed stimulates the generation of elastic resonances, which are manifested as a series of peaks in the scattering efficiency plots. The present analysis provides an improved method for the computations of the scattering energy efficiency factors using compressional and shear-wave Airy beams in elastic materials as opposed to plane waves of infinite extent. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Modeling spatial decisions with graph theory: logging roads and forest fragmentation in the Brazilian Amazon.

    PubMed

    Walker, Robert; Arima, Eugenio; Messina, Joe; Soares-Filho, Britaldo; Perz, Stephen; Vergara, Dante; Sales, Marcio; Pereira, Ritaumaria; Castro, Williams

    2013-01-01

    This article addresses the spatial decision-making of loggers and implications for forest fragmentation in the Amazon basin. It provides a behavioral explanation for fragmentation by modeling how loggers build road networks, typically abandoned upon removal of hardwoods. Logging road networks provide access to land, and the settlers who take advantage of them clear fields and pastures that accentuate their spatial signatures. In shaping agricultural activities, these networks organize emergent patterns of forest fragmentation, even though the loggers move elsewhere. The goal of the article is to explicate how loggers shape their road networks, in order to theoretically explain an important type of forest fragmentation found in the Amazon basin, particularly in Brazil. This is accomplished by adapting graph theory to represent the spatial decision-making of loggers, and by implementing computational algorithms that build graphs interpretable as logging road networks. The economic behavior of loggers is conceptualized as a profit maximization problem, and translated into spatial decision-making by establishing a formal correspondence between mathematical graphs and road networks. New computational approaches, adapted from operations research, are used to construct graphs and simulate spatial decision-making as a function of discount rates, land tenure, and topographic constraints. The algorithms employed bracket a range of behavioral settings appropriate for areas of terras devolutas, public lands that have not been set aside for environmental protection, indigenous peoples, or colonization. The simulation target sites are located in or near so-called Terra do Meio, once a major logging frontier in the lower Amazon Basin. Simulation networks are compared to empirical ones identified by remote sensing and then used to draw inferences about factors influencing the spatial behavior of loggers. Results overall suggest that Amazonia's logging road networks induce more fragmentation than necessary to access fixed quantities of wood. The paper concludes by considering implications of the approach and findings for Brazil's move to a system of concession logging.
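    The core idea, extending a road network by connecting each timber patch to the existing network along a cheapest path whenever the discounted timber value exceeds the road-building cost, can be sketched with a standard graph library. In the hedged example below the grid, terrain costs, timber values, and discount factor are all invented, and the paper's land-tenure and operations-research refinements are omitted.

```python
import networkx as nx

n = 20
G = nx.grid_2d_graph(n, n)                                   # landscape as a grid graph
for u, v in G.edges:
    G.edges[u, v]["cost"] = 1.0 + 4.0 * ((u[0] + v[0]) % 5 == 0)   # fake terrain cost

road = {(0, j) for j in range(n)}                            # existing road along one edge
patches = [((15, 4), 120.0), ((10, 18), 60.0), ((18, 17), 25.0)]   # (site, timber value)
discount = 0.95                                              # per-unit-cost discount factor

for site, value in patches:
    # cheapest connection from this patch back to any node already on the network
    cost, path = nx.multi_source_dijkstra(G, road, target=site, weight="cost")
    profit = value * discount ** cost - cost
    if profit > 0:
        road.update(path)                                    # road is extended and re-used
        print(f"log {site}: road cost {cost:.1f}, profit {profit:.1f}")
    else:
        print(f"skip {site}: unprofitable")
```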

  10. Modelling the Effects of Sea-level, Climate Change, Geology, and Tectonism on the Morphology of the Amazon River Valley and its Floodplain

    NASA Astrophysics Data System (ADS)

    Aalto, R. E.; Cremon, E.; Dunne, T.

    2017-12-01

    How continental-scale rivers respond to climate, geology, and sea level change is not well represented in morphodynamic models. Large rivers respond to influences less apparent in the form and deposits of smaller streams, as the huge scales require long time periods for changes in form and behavior. Tectonic deformation and excavation of resistant deposits can affect low gradient continental-scale rivers, thereby changing flow pathways, channel slope and sinuosity, along-stream patterns of sediment transport capacity, channel patterns, floodplain construction, and valley topography. Nowhere are such scales of morphodynamic response grander than the Amazon River, as described in papers by L.A.K. Mertes. Field-based understanding has improved over the intervening decades, but mechanistic models are needed to simulate and synthesize key morphodynamic components relevant to the construction of large river valleys, with a focus on the Amazon. The Landscape-Linked Environmental Model (LLEM) utilizes novel massively parallel computer architectures to simulate multiple-direction flow, sediment transport, deposition, and incision for exceptionally large (30-80 million nodes per compute unit) lowland dispersal systems. LLEM represents key fluvial processes such as bed and bar deposition, lateral and vertical erosion/incision, levee and floodplain construction, floodplain hydrology, `badlands dissection' of weak sedimentary deposits during falling sea level, tectonic and glacial-isostatic deformation, and provides a 3D record of created stratigraphy and underlying bedrock. We used LLEM to simulate the development of the main valley of the Amazon over the last million years, exploring the propagation of incision waves and system dissection during glacial lowstands, followed by rapid valley filling and extreme lateral mobility of channels during interglacials. We present metrics, videos, and 3D fly-throughs characterizing how system development responds to key assumptions, comparing highly detailed model outcomes against field-documented reality.

  11. A Concept of Operations for an Unclassified Common Operational Picture in Support of Maritime Domain Awareness

    DTIC Science & Technology

    2017-03-01

    Abbreviation list and report fragment (DTIC documentation-page residue): AWS (Amazon Web Services); C2 (Command and Control); C4ISR (Command, Control, Communications, Computers and Intelligence, Surveillance and Reconnaissance); C5F/C6F/C7F (Commander Fifth/Sixth/Seventh Fleet); CAMTES (Computer-Assisted Maritime...). Scope and limitations: the scope of this study is considerable and encompasses numerous agencies and classification levels. Some...

  12. Comparative study of internet cloud and cloudlet over wireless mesh networks for real-time applications

    NASA Astrophysics Data System (ADS)

    Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos

    2014-05-01

    Mobile cloud computing is gaining worldwide momentum for ubiquitous on-demand cloud services for mobile users, provided by Amazon, Google, and others at low capital cost. However, Internet-centric clouds introduce wide area network (WAN) delays that are often intolerable for real-time applications such as video streaming. One promising approach to addressing this challenge is to deploy decentralized mini-cloud facilities known as cloudlets to enable localized cloud services. When supported by local wireless connectivity, a wireless cloudlet is expected to offer low cost and high performance cloud services for the users. In this work, we implement a realistic framework that comprises both a popular Internet cloud (Amazon Cloud) and a real-world cloudlet (based on Ubuntu Enterprise Cloud (UEC)) for mobile cloud users in a wireless mesh network. We focus on real-time video streaming over the HTTP standard and implement a typical application. We further perform a comprehensive comparative analysis and empirical evaluation of the application's performance when it is delivered over the Internet cloud and the cloudlet respectively. The study quantifies the influence of the two different cloud networking architectures on supporting real-time video streaming. We also enable movement of the users in the wireless mesh network and investigate the effect of user mobility on mobile cloud computing over the cloudlet and Amazon cloud respectively. Our experimental results demonstrate the advantages of the cloudlet paradigm over its Internet cloud counterpart in supporting the quality of service of real-time applications.

  13. Ultrasound imaging of the anterior section of the eye of five different snake species.

    PubMed

    Lauridsen, Henrik; Da Silva, Mari-Ann O; Hansen, Kasper; Jensen, Heidi M; Warming, Mads; Wang, Tobias; Pedersen, Michael

    2014-12-30

    Nineteen clinically normal snakes: six ball pythons (Python regius), six Burmese pythons (Python bivittatus), one Children's python (Antaresia childreni), four Amazon tree boas (Corallus hortulanus), and two Malagasy ground boas (Acrantophis madagascariensis) were subjected to ultrasound imaging with 21 MHz (ball python) and 50 MHz (ball python, Burmese python, Children's python, Amazon tree boa, Malagasy ground boa) transducers in order to measure the different structures of the anterior segment in clinically normal snake eyes, with the aim of reviewing baseline values for clinically important ophthalmic structures. The ultrasonographic measurements included horizontal spectacle diameter, spectacle thickness, depth of sub-spectacular space and corneal thickness. For comparative purposes, a formalin-fixed head of a Burmese python was subjected to micro computed tomography. In all snakes, the spectacle was thinner than the cornea. There was significant difference in spectacle diameter, and spectacle and corneal thickness between the Amazon tree boa and the Burmese and ball pythons. There was no difference in the depth of the sub-spectacular space. The results obtained in the Burmese python with the 50 MHz transducer were similar to the results obtained with micro computed tomography. Images acquired with the 21 MHz transducer included artifacts which may be misinterpreted as ocular structures. Our measurements of the structures in the anterior segment of the eye can serve as reference values for snakes examined for ocular diseases. In addition, we demonstrated that using a high frequency transducer minimizes the risk of misinterpreting artifacts as ocular structures.

  14. River mixing in the Amazon as a driver of concentration-discharge relationships

    NASA Astrophysics Data System (ADS)

    Moquet, Jean-Sébastien; Bouchez, Julien; Carlo Espinoza, Jhan; Martinez, Jean-Michel; Guyot, Jean-Loup; Lagane, Christelle; Filizola, Naziano; Aniceto, Keila; Noriega, Luis; Hidalgo Sanchez, Liz; Pombosa, Rodrigo; Fraizy, Pascal; Santini, William; Timouk, Franck; Vauchel, Philippe

    2017-04-01

    Large hydrological systems such as continental-scale river basins aggregate water from compositionally different tributaries. Here we explore how such aggregation can affect solute concentration-discharge (C-Q) relationships and thus obscure the message carried by these relationships in terms of weathering properties of the Critical Zone. We compute 10 day-frequency time series of Q and major solute (Si, Ca2+, Mg2+, K+, Na+, Cl-, SO42-) C and fluxes (F) for 13 gauging stations of the SNO-HYBAM Monitoring Program (Geodynamical, hydrological and Biogeochemical control of erosion/weathering and material transport in the Amazon, Orinoco and Congo basins) located throughout the Amazon basin, the largest river basin in the world. Concentration-discharge relationships vary in a systematic manner, shifting for most solutes from a nearly "chemostatic" behavior (constant C) at the Andean mountain front to a more "dilutional" pattern (negative C-Q relationship) towards the system mouth. Associated to this shift in trend is a shift in shape: C-Q hysteresis becomes more prominent at the most downstream stations. A simple model of tributary mixing allows us to identify the important parameters controlling C-Q trends and shapes in the mixture, and we show that for the Amazon case, the model results are in qualitative agreement with the observations. Altogether, this study suggests that mixing of water and solutes between different flowpaths leads to altered C-Q relationships.
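    The mixing argument can be made concrete with a two-tributary toy model: each tributary follows its own power-law C-Q relationship, and the downstream concentration is the discharge-weighted mixture. The parameters below are invented for illustration and are not fitted to the HYBAM data.

```python
import numpy as np

# C_i = a_i * Q_i**b_i for each tributary (b ~ 0 is chemostatic, b < 0 dilutional);
# the downstream concentration is the flux-weighted mixture.

rng = np.random.default_rng(2)
t = np.arange(365)
hydro = 1 + 0.8 * np.sin(2 * np.pi * t / 365)             # shared seasonality

# (mean discharge, a, b) for an "Andean" and a "lowland" tributary (assumed values)
tribs = [(5000.0, 200.0, -0.05),     # near-chemostatic, solute-rich
         (20000.0, 20.0, -0.4)]      # dilute, strongly dilutional

Q_tribs = [q0 * hydro * rng.lognormal(0, 0.1, t.size) for q0, _, _ in tribs]
C_tribs = [a * Q ** b for Q, (_, a, b) in zip(Q_tribs, tribs)]

Q_mix = sum(Q_tribs)
C_mix = sum(Q * C for Q, C in zip(Q_tribs, C_tribs)) / Q_mix    # flux-weighted mixture

# Apparent C-Q exponent of the mixture (slope in log-log space)
b_mix = np.polyfit(np.log(Q_mix), np.log(C_mix), 1)[0]
print(f"apparent downstream C-Q exponent: {b_mix:.2f}")
```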

  15. Pharmacokinetics of long-acting nalbuphine decanoate after intramuscular administration to Hispaniolan Amazon parrots (Amazona ventralis).

    PubMed

    Sanchez-Migallon Guzman, David; KuKanich, Butch; Heath, Timothy D; Krugner-Higby, Lisa A; Barker, Steven A; Brown, Carolyn S; Paul-Murphy, Joanne R

    2013-02-01

    To evaluate the pharmacokinetics of nalbuphine decanoate after IM administration to Hispaniolan Amazon parrots (Amazona ventralis). 9 healthy adult Hispaniolan Amazon parrots of unknown sex. Nalbuphine decanoate (37.5 mg/kg) was administered IM to all birds. Plasma samples were obtained from blood collected before (time 0) and 0.25, 1, 2, 3, 6, 12, 24, 48, and 96 hours after drug administration. Plasma samples were used for measurement of nalbuphine concentrations via liquid chromatography-tandem mass spectrometry. Pharmacokinetic parameters were estimated with computer software. Plasma concentrations of nalbuphine increased rapidly after IM administration, with a mean concentration of 46.1 ng/mL at 0.25 hours after administration. Plasma concentrations of nalbuphine remained > 20 ng/mL for at least 24 hours in all birds. The maximum plasma concentration was 109.4 ng/mL at 2.15 hours. The mean terminal half-life was 20.4 hours. In Hispaniolan Amazon parrots, plasma concentrations of nalbuphine were prolonged after IM administration of nalbuphine decanoate, compared with previously reported results after administration of nalbuphine hydrochloride. Plasma concentrations that could be associated with antinociception were maintained for 24 hours after IM administration of 37.5 mg of nalbuphine decanoate/kg. Safety and analgesic efficacy of nalbuphine treatments in this species require further investigation to determine the potential for clinical use in pain management in psittacine species.
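    For readers unfamiliar with the calculation, the terminal half-life reported in such studies is typically obtained from a log-linear fit of the terminal-phase concentrations. The sketch below uses invented concentration values, not the study's data, purely to show the arithmetic.

```python
import numpy as np

# Non-compartmental estimate of the terminal half-life from plasma
# concentration-time data (synthetic values for illustration only).

t = np.array([3, 6, 12, 24, 48, 96], dtype=float)    # h, terminal phase
c = np.array([100, 90, 70, 45, 20, 4.0])              # ng/mL, synthetic

slope, intercept = np.polyfit(t, np.log(c), 1)
lambda_z = -slope                      # terminal elimination rate constant (1/h)
t_half = np.log(2) / lambda_z
auc_tail = c[-1] / lambda_z            # extrapolated AUC beyond the last sample
print(f"terminal half-life ~ {t_half:.1f} h, tail AUC ~ {auc_tail:.0f} ng*h/mL")
```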

  16. Mixed-metal effects on ultra-incompressible metal diborides: Density functional computations

    NASA Astrophysics Data System (ADS)

    Lin, Fei; Wu, Kechen; He, Jiangang; Sa, Rongjian; Li, Qiaohong; Wei, Yongqin

    2010-07-01

    Mixed-metal borides are promising superhard materials (Kaner et al. (2005) [1]). In this Letter, density functional computations have been applied to the structural, electronic and elastic properties of mixed-metal diborides Re0.5Ir0.5B2, Re0.5Tc0.5B2, Os0.5W0.5B2 and Os0.5Ru0.5B2. The elastic moduli decrease from pure metal diboride ReB2 to Re0.5Ir0.5B2 and on the contrary increase from OsB2 to Os0.5W0.5B2 because boron-metal interactions are contaminated by the occupied anti-bonding orbitals. Alloying ReB2 (OsB2) with Tc (Ru) decreases the elastic moduli owing to the relativistic effects. Mixed-metal effects on elastic deformations focus on bonding strengths, which effectively tune the elastic properties of metal diborides.

  17. Cloud-Based Numerical Weather Prediction for Near Real-Time Forecasting and Disaster Response

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew; Case, Jonathan; Venners, Jason; Schroeder, Richard; Checchi, Milton; Zavodsky, Bradley; Limaye, Ashutosh; O'Brien, Raymond

    2015-01-01

    The use of cloud computing resources continues to grow within the public and private sector components of the weather enterprise as users become more familiar with cloud-computing concepts, and competition among service providers continues to reduce costs and other barriers to entry. Cloud resources can also provide capabilities similar to high-performance computing environments, supporting multi-node systems required for near real-time, regional weather predictions. Referred to as "Infrastructure as a Service", or IaaS, the use of cloud-based computing hardware in an on-demand payment system allows for rapid deployment of a modeling system in environments lacking access to a large, supercomputing infrastructure. Use of IaaS capabilities to support regional weather prediction may be of particular interest to developing countries that have not yet established large supercomputing resources, but would otherwise benefit from a regional weather forecasting capability. Recently, collaborators from NASA Marshall Space Flight Center and Ames Research Center have developed a scripted, on-demand capability for launching the NOAA/NWS Science and Training Resource Center (STRC) Environmental Modeling System (EMS), which includes pre-compiled binaries of the latest version of the Weather Research and Forecasting (WRF) model. The WRF-EMS provides scripting for downloading appropriate initial and boundary conditions from global models, along with higher-resolution vegetation, land surface, and sea surface temperature data sets provided by the NASA Short-term Prediction Research and Transition (SPoRT) Center. This presentation will provide an overview of the modeling system capabilities and benchmarks performed on the Amazon Elastic Compute Cloud (EC2) environment. In addition, the presentation will discuss future opportunities to deploy the system in support of weather prediction in developing countries supported by NASA's SERVIR Project, which provides capacity building activities in environmental monitoring and prediction across a growing number of regional hubs throughout the world. Capacity-building applications that extend numerical weather prediction to developing countries are intended to provide near real-time applications to benefit public health, safety, and economic interests, but may have a greater impact during disaster events by providing a source for local predictions of weather-related hazards, or impacts that local weather events may have during the recovery phase.
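    As a hedged illustration of the "scripted, on-demand" idea, the snippet below launches a handful of EC2 worker instances with boto3. The AMI, key pair, instance type, and bootstrap command are placeholders rather than values from the presentation, and the networking, storage, and model configuration a real deployment would need are omitted.

```python
import boto3

# Hypothetical bootstrap script: fetch initial/boundary conditions and start the run.
user_data = """#!/bin/bash
/opt/wrfems/strc/ems_autorun --domain case01   # placeholder command
"""

ec2 = boto3.client("ec2", region_name="us-east-1")
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",      # placeholder machine image with WRF-EMS installed
    InstanceType="c5.9xlarge",            # compute-optimized node (assumed)
    MinCount=1, MaxCount=4,               # small on-demand forecast cluster
    KeyName="my-forecast-key",            # placeholder key pair
    UserData=user_data,
)
ids = [i["InstanceId"] for i in resp["Instances"]]
ec2.get_waiter("instance_running").wait(InstanceIds=ids)
print("forecast nodes running:", ids)
```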

  18. The Readability of Principles of Macroeconomics Textbooks

    ERIC Educational Resources Information Center

    Tinkler, Sarah; Woods, James

    2013-01-01

    The authors evaluated principles of macroeconomics textbooks for readability using Coh-Metrix, a computational linguistics tool. Additionally, they conducted an experiment on Amazon's Mechanical Turk Web site in which participants ranked the readability of text samples. There was a wide range of scores on readability indexes both among…

  19. Personalized cloud-based bioinformatics services for research and education: use cases and the elasticHPC package

    PubMed Central

    2012-01-01

    Background Bioinformatics services have been traditionally provided in the form of a web-server that is hosted at institutional infrastructure and serves multiple users. This model, however, is not flexible enough to cope with the increasing number of users, increasing data size, and new requirements in terms of speed and availability of service. The advent of cloud computing suggests a new service model that provides an efficient solution to these problems, based on the concepts of "resources-on-demand" and "pay-as-you-go". However, cloud computing has not yet been introduced within bioinformatics servers due to the lack of usage scenarios and software layers that address the requirements of the bioinformatics domain. Results In this paper, we provide different use case scenarios for providing cloud computing based services, considering both the technical and financial aspects of the cloud computing service model. These scenarios are for individual users seeking computational power as well as bioinformatics service providers aiming at provision of personalized bioinformatics services to their users. We also present elasticHPC, a software package and a library that facilitates the use of high performance cloud computing resources in general and the implementation of the suggested bioinformatics scenarios in particular. Concrete examples that demonstrate the suggested use case scenarios with whole bioinformatics servers and major sequence analysis tools like BLAST are presented. Experimental results with large datasets are also included to show the advantages of the cloud model. Conclusions Our use case scenarios and the elasticHPC package are steps towards the provision of cloud based bioinformatics services, which would help in overcoming the data challenge of recent biological research. All resources related to elasticHPC and its web-interface are available at http://www.elasticHPC.org. PMID:23281941

  20. Personalized cloud-based bioinformatics services for research and education: use cases and the elasticHPC package.

    PubMed

    El-Kalioby, Mohamed; Abouelhoda, Mohamed; Krüger, Jan; Giegerich, Robert; Sczyrba, Alexander; Wall, Dennis P; Tonellato, Peter

    2012-01-01

    Bioinformatics services have been traditionally provided in the form of a web-server that is hosted at institutional infrastructure and serves multiple users. This model, however, is not flexible enough to cope with the increasing number of users, increasing data size, and new requirements in terms of speed and availability of service. The advent of cloud computing suggests a new service model that provides an efficient solution to these problems, based on the concepts of "resources-on-demand" and "pay-as-you-go". However, cloud computing has not yet been introduced within bioinformatics servers due to the lack of usage scenarios and software layers that address the requirements of the bioinformatics domain. In this paper, we provide different use case scenarios for providing cloud computing based services, considering both the technical and financial aspects of the cloud computing service model. These scenarios are for individual users seeking computational power as well as bioinformatics service providers aiming at provision of personalized bioinformatics services to their users. We also present elasticHPC, a software package and a library that facilitates the use of high performance cloud computing resources in general and the implementation of the suggested bioinformatics scenarios in particular. Concrete examples that demonstrate the suggested use case scenarios with whole bioinformatics servers and major sequence analysis tools like BLAST are presented. Experimental results with large datasets are also included to show the advantages of the cloud model. Our use case scenarios and the elasticHPC package are steps towards the provision of cloud based bioinformatics services, which would help in overcoming the data challenge of recent biological research. All resources related to elasticHPC and its web-interface are available at http://www.elasticHPC.org.

  1. Elastic-plastic mixed-iterative finite element analysis: Implementation and performance assessment

    NASA Technical Reports Server (NTRS)

    Sutjahjo, Edhi; Chamis, Christos C.

    1993-01-01

    An elastic-plastic algorithm based on Von Mises and associative flow criteria is implemented in MHOST, a mixed-iterative finite element analysis computer program developed by NASA Lewis Research Center. The performance of the resulting elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors of 4-node quadrilateral shell finite elements are tested for elastic-plastic performance. Generally, the membrane results are excellent, indicating the implementation of elastic-plastic mixed-iterative analysis is appropriate.

  2. Nonlinear Visco-Elastic Response of Composites via Micro-Mechanical Models

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Sridharan, Srinivasan

    2005-01-01

    Micro-mechanical models for a study of nonlinear visco-elastic response of composite laminae are developed and their performance compared. A single integral constitutive law proposed by Schapery and subsequently generalized to multi-axial states of stress is utilized in the study for the matrix material. This is used in conjunction with a computationally facile scheme in which hereditary strains are computed using a recursive relation suggested by Henriksen. Composite response is studied using two competing micro-models, viz. a simplified Square Cell Model (SSCM) and a Finite Element based self-consistent Cylindrical Model (FECM). The algorithm is developed assuming that the material response computations are carried out in a module attached to a general purpose finite element program used for composite structural analysis. It is shown that the SSCM as used in investigations of material nonlinearity can involve significant errors in the prediction of transverse Young's modulus and shear modulus. The errors in the elastic strains thus predicted are of the same order of magnitude as the creep strains accruing due to visco-elasticity. The FECM on the other hand does appear to perform better both in the prediction of elastic constants and the study of creep response.
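    The recursive treatment of hereditary strains can be illustrated for a single exponential (Prony) creep term: the convolution integral is advanced step by step from one internal variable instead of re-integrating the full stress history. The sketch below is a generic midpoint-rule recursion with invented material constants; it is not Henriksen's exact relation or Schapery's full nonlinear model.

```python
import numpy as np

# Linear viscoelastic creep with kernel dD(t) = D1*(1 - exp(-t/tau)):
# eps(t) = D0*sigma(t) + D1*sigma(t) - D1*h(t), with
# h(t) = integral_0^t exp(-(t-s)/tau) dsigma(s), updated recursively.

D0, D1, tau = 1.0e-3, 5.0e-4, 10.0     # instantaneous/transient compliance, relaxation time
dt, steps = 0.5, 200
t_arr = np.arange(steps) * dt
sigma = np.where((t_arr > 0) & (t_arr < 50.0), 10.0, 0.0)   # load applied, then removed

eps = np.zeros(steps)
h = 0.0                                 # single hereditary internal variable
decay = np.exp(-dt / tau)
for n in range(1, steps):
    dsig = sigma[n] - sigma[n - 1]
    h = decay * h + np.exp(-dt / (2 * tau)) * dsig          # midpoint-rule recursion
    eps[n] = (D0 + D1) * sigma[n] - D1 * h

print(eps[95], eps[105], eps[-1])       # strain just before / after unloading, and recovery
```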

  3. Advantages of formulating an evolution equation directly for elastic distortional deformation in finite deformation plasticity

    NASA Astrophysics Data System (ADS)

    Rubin, M. B.; Cardiff, P.

    2017-11-01

    Simo (Comput Methods Appl Mech Eng 66:199-219, 1988) proposed an evolution equation for elastic deformation together with a constitutive equation for inelastic deformation rate in plasticity. The numerical algorithm (Simo in Comput Methods Appl Mech Eng 68:1-31, 1988) for determining elastic distortional deformation was simple. However, the proposed inelastic deformation rate caused plastic compaction. The corrected formulation (Simo in Comput Methods Appl Mech Eng 99:61-112, 1992) preserves isochoric plasticity but the numerical integration algorithm is complicated and needs special methods for calculation of the exponential map of a tensor. Alternatively, an evolution equation for elastic distortional deformation can be proposed directly with a simplified constitutive equation for inelastic distortional deformation rate. This has the advantage that the physics of inelastic distortional deformation is separated from that of dilatation. The example of finite deformation J2 plasticity with linear isotropic hardening is used to demonstrate the simplicity of the numerical algorithm.
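    For orientation, the classical small-strain counterpart of such stress updates is the J2 radial-return map with linear isotropic hardening, sketched below with assumed steel-like constants. The paper's finite-deformation evolution equation for elastic distortional deformation is more involved and is not reproduced here.

```python
import numpy as np

G, K = 80e9, 170e9            # shear and bulk moduli (assumed)
sig_y0, H = 250e6, 2e9        # initial yield stress, linear hardening modulus (assumed)

def dev(t):
    """Deviatoric part of a 3x3 tensor."""
    return t - np.trace(t) / 3.0 * np.eye(3)

def radial_return(eps, eps_p, alpha):
    """One strain-driven J2 update; eps, eps_p are 3x3 tensors, alpha a scalar."""
    s_tr = 2.0 * G * dev(eps - eps_p)                     # trial deviatoric stress
    norm = np.linalg.norm(s_tr)
    f = norm - np.sqrt(2.0 / 3.0) * (sig_y0 + H * alpha)  # yield function
    dgamma = 0.0 if f <= 0.0 else f / (2.0 * G + (2.0 / 3.0) * H)
    n = s_tr / norm if norm > 0 else np.zeros((3, 3))
    s = s_tr - 2.0 * G * dgamma * n                       # return to the yield surface
    eps_p = eps_p + dgamma * n
    alpha = alpha + np.sqrt(2.0 / 3.0) * dgamma
    sigma = s + K * np.trace(eps) * np.eye(3)             # add hydrostatic part
    return sigma, eps_p, alpha

# usage example: uniaxial-strain ramp
eps_p, alpha = np.zeros((3, 3)), 0.0
for e11 in np.linspace(0.0, 0.01, 11):
    sigma, eps_p, alpha = radial_return(np.diag([e11, 0.0, 0.0]), eps_p, alpha)
print(f"sigma_11 at 1% uniaxial strain: {sigma[0, 0] / 1e6:.0f} MPa")
```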

  4. Towards a precise test for malaria diagnosis in the Brazilian Amazon: comparison among field microscopy, a rapid diagnostic test, nested PCR, and a computational expert system based on artificial neural networks

    PubMed Central

    2010-01-01

    Background Accurate malaria diagnosis is mandatory for the treatment and management of severe cases. Moreover, individuals with asymptomatic malaria are not usually screened by health care facilities, which further complicates disease control efforts. The present study compared the performances of a malaria rapid diagnosis test (RDT), the thick blood smear method and nested PCR for the diagnosis of symptomatic malaria in the Brazilian Amazon. In addition, an innovative computational approach was tested for the diagnosis of asymptomatic malaria. Methods The study was divided into two parts. For the first part, passive case detection was performed in 311 individuals with malaria-related symptoms from a recently urbanized community in the Brazilian Amazon. A cross-sectional investigation compared the diagnostic performance of the RDT Optimal-IT, nested PCR and light microscopy. The second part of the study involved active case detection of asymptomatic malaria in 380 individuals from riverine communities in Rondônia, Brazil. The performances of microscopy, nested PCR and an expert computational system based on artificial neural networks (MalDANN) using epidemiological data were compared. Results Nested PCR was shown to be the gold standard for diagnosis of both symptomatic and asymptomatic malaria because it detected the greatest number of cases and presented the maximum specificity. Surprisingly, the RDT was superior to microscopy in the diagnosis of cases with low parasitaemia. Nevertheless, RDT could not discriminate the Plasmodium species in 12 cases of mixed infections (Plasmodium vivax + Plasmodium falciparum). Moreover, microscopy presented low performance in the detection of asymptomatic cases (61.25% of correct diagnoses). The MalDANN system using epidemiological data performed worse than light microscopy (56% of correct diagnoses). However, when information regarding plasma levels of interleukin-10 and interferon-gamma was included as input, the MalDANN performance increased considerably (80% correct diagnoses). Conclusions An RDT for malaria diagnosis may find promising use in the Brazilian Amazon as part of a rational diagnostic approach. Despite the low performance of the MalDANN test using solely epidemiological data, an approach based on neural networks may be feasible in cases where simpler methods for discriminating individuals below and above threshold cytokine levels are available. PMID:20459613

  5. Towards a precise test for malaria diagnosis in the Brazilian Amazon: comparison among field microscopy, a rapid diagnostic test, nested PCR, and a computational expert system based on artificial neural networks.

    PubMed

    Andrade, Bruno B; Reis-Filho, Antonio; Barros, Austeclino M; Souza-Neto, Sebastião M; Nogueira, Lucas L; Fukutani, Kiyoshi F; Camargo, Erney P; Camargo, Luís M A; Barral, Aldina; Duarte, Angelo; Barral-Netto, Manoel

    2010-05-06

    Accurate malaria diagnosis is mandatory for the treatment and management of severe cases. Moreover, individuals with asymptomatic malaria are not usually screened by health care facilities, which further complicates disease control efforts. The present study compared the performances of a malaria rapid diagnosis test (RDT), the thick blood smear method and nested PCR for the diagnosis of symptomatic malaria in the Brazilian Amazon. In addition, an innovative computational approach was tested for the diagnosis of asymptomatic malaria. The study was divided into two parts. For the first part, passive case detection was performed in 311 individuals with malaria-related symptoms from a recently urbanized community in the Brazilian Amazon. A cross-sectional investigation compared the diagnostic performance of the RDT Optimal-IT, nested PCR and light microscopy. The second part of the study involved active case detection of asymptomatic malaria in 380 individuals from riverine communities in Rondônia, Brazil. The performances of microscopy, nested PCR and an expert computational system based on artificial neural networks (MalDANN) using epidemiological data were compared. Nested PCR was shown to be the gold standard for diagnosis of both symptomatic and asymptomatic malaria because it detected the greatest number of cases and presented the maximum specificity. Surprisingly, the RDT was superior to microscopy in the diagnosis of cases with low parasitaemia. Nevertheless, RDT could not discriminate the Plasmodium species in 12 cases of mixed infections (Plasmodium vivax + Plasmodium falciparum). Moreover, microscopy presented low performance in the detection of asymptomatic cases (61.25% of correct diagnoses). The MalDANN system using epidemiological data performed worse than light microscopy (56% of correct diagnoses). However, when information regarding plasma levels of interleukin-10 and interferon-gamma was included as input, the MalDANN performance increased considerably (80% correct diagnoses). An RDT for malaria diagnosis may find promising use in the Brazilian Amazon as part of a rational diagnostic approach. Despite the low performance of the MalDANN test using solely epidemiological data, an approach based on neural networks may be feasible in cases where simpler methods for discriminating individuals below and above threshold cytokine levels are available.
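    A minimal sketch of a MalDANN-style pipeline is given below: a small feed-forward network classifying individuals from epidemiological features plus IL-10 and IFN-gamma levels. The data are random placeholders, not the study's measurements, and the network architecture is an assumption.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
n = 380                                  # same cohort size as the active-detection arm
X = np.column_stack([
    rng.integers(5, 70, n),              # age (years), synthetic
    rng.integers(0, 2, n),               # sex, synthetic
    rng.integers(0, 30, n),              # years of residence in the endemic area, synthetic
    rng.lognormal(2.0, 0.8, n),          # IL-10 (pg/mL), synthetic
    rng.lognormal(1.5, 0.8, n),          # IFN-gamma (pg/mL), synthetic
])
y = (X[:, 3] / X[:, 4] > 2.5).astype(int)    # fake "infected" label, for the demo only

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.2f}")
```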

  6. Phynx: an open source software solution supporting data management and web-based patient-level data review for drug safety studies in the general practice research database and other health care databases.

    PubMed

    Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan

    2010-01-01

    To develop a software solution that supports management and clinical review of patient data from electronic medical records databases or claims databases for pharmacoepidemiological drug safety studies. We used open source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution, named Phynx, supports data management, Web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. The system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be set up and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common Web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases considering the individual clinical context. It can therefore make an important contribution to efficient validation of outcome assessment in drug safety database studies.

  7. Design and deployment of an elastic network test-bed in IHEP data center based on SDN

    NASA Astrophysics Data System (ADS)

    Zeng, Shan; Qi, Fazhi; Chen, Gang

    2017-10-01

    High energy physics experiments produce huge amounts of raw data, but because network resources are shared, the bandwidth available to each experiment is not guaranteed, which may cause link congestion. On the other hand, with the development of cloud computing technologies, IHEP has established a cloud platform based on OpenStack that ensures the flexibility of computing and storage resources, and more and more computing applications have been deployed on virtual machines created by OpenStack. However, under the traditional network architecture, network capacity cannot be provisioned elastically, which becomes a bottleneck that restricts the flexible use of cloud computing. To solve these problems, we propose an elastic cloud data center network architecture based on SDN, and we also design a high performance controller cluster based on OpenDaylight. Finally, we present our current test results.

  8. A FFT-based formulation for efficient mechanical fields computation in isotropic and anisotropic periodic discrete dislocation dynamics

    NASA Astrophysics Data System (ADS)

    Bertin, N.; Upadhyay, M. V.; Pradalier, C.; Capolungo, L.

    2015-09-01

    In this paper, we propose a novel full-field approach based on the fast Fourier transform (FFT) technique to compute mechanical fields in periodic discrete dislocation dynamics (DDD) simulations for anisotropic materials: the DDD-FFT approach. By coupling the FFT-based approach to the discrete continuous model, the present approach benefits from the high computational efficiency of the FFT algorithm, while allowing for a discrete representation of dislocation lines. It is demonstrated that the computational time associated with the new DDD-FFT approach is significantly lower than that of current DDD approaches when large numbers of dislocation segments are involved, for both isotropic and anisotropic elasticity. Furthermore, for fine Fourier grids, the treatment of anisotropic elasticity comes at a computational cost similar to that of an isotropic simulation. Thus, the proposed approach paves the way towards achieving scale transition from DDD to mesoscale plasticity, especially due to the method's ability to incorporate inhomogeneous elasticity.

  9. Iterative image reconstruction in elastic inhomogeneous media with application to transcranial photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Poudel, Joemini; Matthews, Thomas P.; Mitsuhashi, Kenji; Garcia-Uribe, Alejandro; Wang, Lihong V.; Anastasio, Mark A.

    2017-03-01

    Photoacoustic computed tomography (PACT) is an emerging computed imaging modality that exploits optical contrast and ultrasonic detection principles to form images of the photoacoustically induced initial pressure distribution within tissue. The PACT reconstruction problem corresponds to a time-domain inverse source problem, where the initial pressure distribution is recovered from the measurements recorded on an aperture outside the support of the source. A major challenge in transcranial PACT brain imaging is to compensate for aberrations in the measured data due to the propagation of the photoacoustic wavefields through the skull. To properly account for these effects, a wave equation-based inversion method should be employed that can model the heterogeneous elastic properties of the medium. In this study, an iterative image reconstruction method for 3D transcranial PACT is developed based on the elastic wave equation. To accomplish this, a forward model based on a finite-difference time-domain discretization of the elastic wave equation is established. Subsequently, gradient-based methods are employed for computing penalized least squares estimates of the initial source distribution that produced the measured photoacoustic data. The developed reconstruction algorithm is validated and investigated through computer-simulation studies.
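
    The reconstruction step described above can be illustrated with a toy numerical sketch in Python, where a generic linear operator stands in for the finite-difference time-domain elastic-wave forward model; the operator, simulated data, penalty weight, and step size below are illustrative assumptions rather than the authors' implementation.

        import numpy as np

        rng = np.random.default_rng(0)
        H = rng.standard_normal((200, 50))    # stand-in for the discretized forward model
        x_true = rng.standard_normal(50)      # stand-in for the initial pressure distribution
        y = H @ x_true + 0.01 * rng.standard_normal(200)   # simulated measured data

        lam = 1e-2                                 # Tikhonov penalty weight
        step = 1.0 / np.linalg.norm(H, 2) ** 2     # step size from the spectral norm of H
        x = np.zeros(50)
        for _ in range(500):
            # gradient of 0.5*||Hx - y||^2 + 0.5*lam*||x||^2
            grad = H.T @ (H @ x - y) + lam * x
            x -= step * grad

        print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))

    In the actual algorithm the matrix-vector products are replaced by runs of the elastic-wave solver and its adjoint, but the gradient update has the same structure.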

  10. Proof test of the computer program BUCKY for plasticity problems

    NASA Technical Reports Server (NTRS)

    Smith, James P.

    1994-01-01

    A theoretical equation describing the elastic-plastic deformation of a cantilever beam subject to a constant pressure is developed. The theoretical result is compared numerically to the computer program BUCKY for the case of an elastic-perfectly plastic specimen. It is shown that the theoretical and numerical results compare favorably in the plastic range. Comparisons are made to another research code to further validate the BUCKY results. This paper serves as a quality test for the computer program BUCKY developed at NASA Johnson Space Center.

  11. LSRN: A PARALLEL ITERATIVE SOLVER FOR STRONGLY OVER- OR UNDERDETERMINED SYSTEMS*

    PubMed Central

    Meng, Xiangrui; Saunders, Michael A.; Mahoney, Michael W.

    2014-01-01

    We describe a parallel iterative least squares solver named LSRN that is based on random normal projection. LSRN computes the min-length solution to min_{x∈ℝⁿ} ‖Ax − b‖₂, where A ∈ ℝ^{m×n} with m ≫ n or m ≪ n, and where A may be rank-deficient. Tikhonov regularization may also be included. Since A is involved only in matrix-matrix and matrix-vector multiplications, it can be a dense or sparse matrix or a linear operator, and LSRN automatically speeds up when A is sparse or a fast linear operator. The preconditioning phase consists of a random normal projection, which is embarrassingly parallel, and a singular value decomposition of size ⌈γ min(m, n)⌉ × min(m, n), where γ is moderately larger than 1, e.g., γ = 2. We prove that the preconditioned system is well-conditioned, with a strong concentration result on the extreme singular values, and hence that the number of iterations is fully predictable when we apply LSQR or the Chebyshev semi-iterative method. As we demonstrate, the Chebyshev method is particularly efficient for solving large problems on clusters with high communication cost. Numerical results show that on a shared-memory machine, LSRN is very competitive with LAPACK's DGELSD and a fast randomized least squares solver called Blendenpik on large dense problems, and it outperforms the least squares solver from SuiteSparseQR on sparse problems without sparsity patterns that can be exploited to reduce fill-in. Further experiments show that LSRN scales well on an Amazon Elastic Compute Cloud cluster. PMID:25419094
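
    The preconditioning strategy can be sketched in a few lines of NumPy/SciPy for the overdetermined case (m much greater than n): sketch A with a Gaussian random projection, take the SVD of the small sketch to build a right preconditioner, and run LSQR on the preconditioned system. The matrix sizes, the choice γ = 2, and the dense random test problem below are illustrative assumptions, not the authors' code.

        import numpy as np
        from scipy.sparse.linalg import lsqr

        rng = np.random.default_rng(0)
        m, n, gamma = 5000, 100, 2.0               # strongly overdetermined test problem
        A = rng.standard_normal((m, n))
        b = rng.standard_normal(m)

        # Random normal projection: sketch A down to ceil(gamma * n) rows (embarrassingly parallel).
        s = int(np.ceil(gamma * n))
        G = rng.standard_normal((s, m))
        A_sketch = G @ A                           # s x n

        # SVD of the small sketch gives the right preconditioner N = V * Sigma^{-1}.
        U, sigma, Vt = np.linalg.svd(A_sketch, full_matrices=False)
        N = Vt.T / sigma                           # n x n

        # Solve the well-conditioned problem min ||(A N) y - b||_2 with LSQR, then map back.
        y = lsqr(A @ N, b)[0]
        x = N @ y

        # Sanity check against a direct dense (LAPACK) solve.
        x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
        print(np.linalg.norm(x - x_ref) / np.linalg.norm(x_ref))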

  12. BioVLAB-MMIA: a cloud environment for microRNA and mRNA integrated analysis (MMIA) on Amazon EC2.

    PubMed

    Lee, Hyungro; Yang, Youngik; Chae, Heejoon; Nam, Seungyoon; Choi, Donghoon; Tangchaisin, Patanachai; Herath, Chathura; Marru, Suresh; Nephew, Kenneth P; Kim, Sun

    2012-09-01

    MicroRNAs, by regulating the expression of hundreds of target genes, play critical roles in developmental biology and the etiology of numerous diseases, including cancer. As a vast amount of microRNA expression profile data are now publicly available, the integration of microRNA expression data sets with gene expression profiles is a key research problem in life science research. However, genome-wide microRNA-mRNA (gene) integration currently requires sophisticated, high-end informatics tools and significant expertise in bioinformatics and computer science to carry out the complex integration analysis. In addition, increased computing infrastructure capabilities are essential in order to accommodate large data sets. In this study, we have extended the BioVLAB cloud workbench to develop an environment for the integrated analysis of microRNA and mRNA expression data, named BioVLAB-MMIA. The workbench facilitates computations on Amazon EC2 and S3 resources orchestrated by the XBaya Workflow Suite. The advantages of BioVLAB-MMIA over the web-based MMIA system include: 1) it is readily expanded as new computational tools become available; 2) it is easily modified by re-configuring graphic icons in the workflow; 3) on-demand cloud computing resources can be used on an "as needed" basis; and 4) distributed orchestration supports complex and long-running workflows asynchronously. We believe that BioVLAB-MMIA will be an easy-to-use computing environment for researchers who plan to perform genome-wide microRNA-mRNA (gene) integrated analysis tasks.

  13. Substrate stress relaxation regulates cell spreading

    NASA Astrophysics Data System (ADS)

    Chaudhuri, Ovijit; Gu, Luo; Darnell, Max; Klumpers, Darinka; Bencherif, Sidi A.; Weaver, James C.; Huebsch, Nathaniel; Mooney, David J.

    2015-02-01

    Studies of cellular mechanotransduction have converged upon the idea that cells sense extracellular matrix (ECM) elasticity by gauging resistance to the traction forces they exert on the ECM. However, these studies typically utilize purely elastic materials as substrates, whereas physiological ECMs are viscoelastic and exhibit stress relaxation, so that traction forces exerted by cells remodel the ECM. Here we investigate the influence of ECM stress relaxation on cell behaviour through computational modelling and cellular experiments. Surprisingly, both our computational model and experiments find that spreading of cells cultured on soft substrates that exhibit stress relaxation is greater than that of cells on purely elastic substrates of the same modulus, and similar to that of cells on stiffer elastic substrates. These findings challenge the current view of how cells sense and respond to the ECM.

  14. Computing elastic anisotropy to discover gum-metal-like structural alloys

    NASA Astrophysics Data System (ADS)

    Winter, I. S.; de Jong, M.; Asta, M.; Chrzan, D. C.

    2017-08-01

    The computer aided discovery of structural alloys is a burgeoning but still challenging area of research. A primary challenge in the field is to identify computable screening parameters that embody key structural alloy properties. Here, an elastic anisotropy parameter that captures a material's susceptibility to solute solution strengthening is identified. The parameter has many applications in the discovery and optimization of structural materials. As a first example, the parameter is used to identify alloys that might display the super elasticity, super strength, and high ductility of the class of TiNb alloys known as gum metals. In addition, it is noted that the parameter can be used to screen candidate alloys for shape memory response, and potentially aid in the optimization of the mechanical properties of high-entropy alloys.

  15. Unsupervised SBAS-DInSAR Processing of Space-borne SAR data for Earth Surface Displacement Time Series Generation

    NASA Astrophysics Data System (ADS)

    Casu, F.; de Luca, C.; Lanari, R.; Manunta, M.; Zinno, I.

    2016-12-01

    During the last 25 years, Differential Synthetic Aperture Radar Interferometry (DInSAR) has played an important role in understanding the Earth's surface deformation and its dynamics. In particular, the large collections of SAR data acquired by a number of space-borne missions (ERS, ENVISAT, ALOS, RADARSAT, TerraSAR-X, COSMO-SkyMed) have pushed toward the development of advanced DInSAR techniques for monitoring the temporal evolution of ground displacements with a high spatial density. Moreover, the advent of the Copernicus Sentinel-1 (S1) constellation is providing a further increase in the SAR data flow available to the Earth science community, owing to its global coverage strategy and its free and open access data policy. Therefore, managing and storing such a huge amount of data, processing it in an efficient way, and maximizing the exploitation of the available archives are becoming high-priority issues. In this work we present some recent advances in the DInSAR field for dealing with the effective exploitation of the present and future SAR data archives. In particular, an efficient parallel SBAS implementation (namely P-SBAS) that benefits from high-performance computing is proposed. Then, the P-SBAS migration to the emerging Cloud Computing paradigm is shown, together with extensive tests carried out in the Amazon Elastic Compute Cloud (EC2) infrastructure. Finally, the integration of the P-SBAS processing chain within the ESA Geohazards Exploitation Platform (GEP), for setting up operational on-demand and systematic web tools, open to every user, aimed at automatically processing stacks of SAR data for the generation of SBAS displacement time series, is also illustrated. A number of experimental results obtained by using ERS, ENVISAT and S1 data in areas characterized by volcanic, seismic and anthropogenic phenomena will be shown. This work is partially supported by: the DPC-CNR agreement, the EPOS-IP project and the ESA GEP project.

  16. Effect of Zealotry in High-dimensional Opinion Dynamics Models

    DTIC Science & Technology

    2015-02-18

    the several platforms for portable computing to choose from, such as Apple iPad, Amazon Kindle, and Samsung Galaxy Note. A third example is the choice...

  17. A multi-platform evaluation of the randomized CX low-rank matrix factorization in Spark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gittens, Alex; Kottalam, Jey; Yang, Jiyan

    We investigate the performance and scalability of the randomized CX low-rank matrix factorization and demonstrate its applicability through the analysis of a 1TB mass spectrometry imaging (MSI) dataset, using Apache Spark on an Amazon EC2 cluster, a Cray XC40 system, and an experimental Cray cluster. We implemented this factorization both as a parallelized C implementation with hand-tuned optimizations and in Scala using the Apache Spark high-level cluster computing framework. We obtained consistent performance across the three platforms: using Spark we were able to process the 1TB size dataset in under 30 minutes with 960 cores on all systems, with the fastest times obtained on the experimental Cray cluster. In comparison, the C implementation was 21X faster on the Amazon EC2 system, due to careful cache optimizations, bandwidth-friendly access of matrices and vector computation using SIMD units. We report these results and their implications on the hardware and software issues arising in supporting data-centric workloads in parallel and distributed environments.

  18. Computational Overlap Coupling Between Micropolar Linear Elastic Continuum Finite Elements and Nonlinear Elastic Spherical Discrete Elements in One Dimension

    DTIC Science & Technology

    2013-01-01

    Figure captions from the report illustrate heterogeneous materials: a composite with a metallic binder (figure 1(b)), a particulate energetic material with explosive crystalline grains in a polymeric binder (figure 1(c)), and cracking in asphalt pavement (figure 1(d)); cracking is noted within explosive HMX grains and at grain-matrix interfaces.

  19. Effective dimensional reduction algorithm for eigenvalue problems for thin elastic structures: A paradigm in three dimensions

    PubMed Central

    Ovtchinnikov, Evgueni E.; Xanthis, Leonidas S.

    2000-01-01

    We present a methodology for the efficient numerical solution of eigenvalue problems of full three-dimensional elasticity for thin elastic structures, such as shells, plates and rods of arbitrary geometry, discretized by the finite element method. Such problems are solved by iterative methods, which, however, are known to suffer from slow convergence or even convergence failure, when the thickness is small. In this paper we show an effective way of resolving this difficulty by invoking a special preconditioning technique associated with the effective dimensional reduction algorithm (EDRA). As an example, we present an algorithm for computing the minimal eigenvalue of a thin elastic plate and we show both theoretically and numerically that it is robust with respect to both the thickness and discretization parameters, i.e. the convergence does not deteriorate with diminishing thickness or mesh refinement. This robustness is sine qua non for the efficient computation of large-scale eigenvalue problems for thin elastic structures. PMID:10655469

  20. Mobile healthcare information management utilizing Cloud Computing and Android OS.

    PubMed

    Doukas, Charalampos; Pliakas, Thomas; Maglogiannis, Ilias

    2010-01-01

    Cloud Computing provides functionality for managing information data in a distributed, ubiquitous and pervasive manner, supporting several platforms, systems and applications. This work presents the implementation of a mobile system that enables electronic healthcare data storage, update and retrieval using Cloud Computing. The mobile application is developed using Google's Android operating system and provides management of patient health records and medical images (supporting DICOM format and JPEG2000 coding). The developed system has been evaluated using Amazon's S3 cloud service. This article summarizes the implementation details and presents initial results of the system in practice.

  1. Biomass burning emissions of reactive gases estimated from satellite data analysis and ecosystem modeling for the Brazilian Amazon region

    NASA Astrophysics Data System (ADS)

    Potter, Christopher; Brooks-Genovese, Vanessa; Klooster, Steven; Torregrosa, Alicia

    2002-10-01

    To produce a new daily record of trace gas emissions from biomass burning events for the Brazilian Legal Amazon, we have combined satellite advanced very high resolution radiometer (AVHRR) data on fire counts together for the first time with vegetation greenness imagery as inputs to an ecosystem biomass model at 8 km spatial resolution. This analysis goes beyond previous estimates for reactive gas emissions from Amazon fires, owing to a more detailed geographic distribution estimate of vegetation biomass, coupled with daily fire activity for the region (original 1 km resolution), and inclusion of fire effects in extensive areas of the Legal Amazon (defined as the Brazilian states of Acre, Amapá, Amazonas, Maranhão, Mato Grosso, Pará, Rondônia, Roraima, and Tocantins) covered by open woodland, secondary forests, savanna, and pasture vegetation. Results from our emissions model indicate that annual emissions from Amazon deforestation and biomass burning in the early 1990s total to 102 Tg yr-1 carbon monoxide (CO) and 3.5 Tg yr-1 nitrogen oxides (NOx). Peak daily burning emissions, which occurred in early September 1992, were estimated at slightly more than 3 Tg d-1 for CO and 0.1 Tg d-1 for NOx flux to the atmosphere. Other burning source fluxes of gases with relatively high emission factors are reported, including methane (CH4), nonmethane hydrocarbons (NMHC), and sulfur dioxide (SO2), in addition to total particulate matter (TPM). We estimate the Brazilian Amazon region to be a source of between one fifth and one third of each of these global emission fluxes to the atmosphere. The regional distribution of burning emissions appears to be highest in the Brazilian states of Maranhão and Tocantins, mainly from burning outside of moist forest areas, and in Pará and Mato Grosso, where we identify important contributions from primary forest cutting and burning. These new daily emission estimates of reactive gases from biomass burning fluxes are designed to be used as detailed spatial and temporal inputs to computer models and data analysis of tropospheric chemistry over the tropical region.
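
    The bookkeeping at the core of such emission estimates multiplies the mass of dry biomass burned by a species-specific emission factor. A minimal sketch, with placeholder numbers that are not those of the study, is:

        # Emission (Tg) = dry biomass burned (Tg) * emission factor (g of species per kg of dry matter) / 1000
        EMISSION_FACTORS_G_PER_KG = {"CO": 100.0, "CH4": 6.0, "NOx": 2.5}   # placeholder values

        def emissions_tg(biomass_burned_tg, species):
            return biomass_burned_tg * EMISSION_FACTORS_G_PER_KG[species] / 1000.0

        burned = 1000.0   # Tg of dry matter burned in a year (illustrative)
        for species in EMISSION_FACTORS_G_PER_KG:
            print(species, round(emissions_tg(burned, species), 2), "Tg/yr")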

  2. Fast computation of high energy elastic collision scattering angle for electric propulsion plume simulation

    NASA Astrophysics Data System (ADS)

    Araki, Samuel J.

    2016-11-01

    In the plumes of Hall thrusters and ion thrusters, high energy ions experience elastic collisions with slow neutral atoms. These collisions involve a process of momentum exchange, altering the initial velocity vectors of the collision pair. In addition to the momentum exchange process, ions and atoms can exchange electrons, resulting in slow charge-exchange ions and fast atoms. In these simulations, accurate computation of ion-atom elastic collisions is particularly important for determining the plume current profile and assessing the integration of spacecraft components. The existing models are capable of accurate calculation but are not fast, so the collision calculation can become a bottleneck of plume simulations. This study investigates methods to accelerate an ion-atom elastic collision calculation that includes both momentum- and charge-exchange processes. The scattering angles are pre-computed through a classical approach with an ab initio spin-orbit-free potential and are stored in a two-dimensional array as functions of impact parameter and energy. When performing a collision calculation for an ion-atom pair, the scattering angle is computed by a table lookup and multiple linear interpolations, given the relative energy and a randomly determined impact parameter. In order to further accelerate the calculations, the number of collision calculations is reduced by properly defining two cut-off cross-sections for the elastic scattering. In the MCC method, the target atom needs to be sampled; however, it is confirmed that the initial target atom velocity does not play a significant role in typical electric propulsion plume simulations, so the sampling process is unnecessary. With these implementations, the computational run-time to perform a collision calculation is reduced significantly compared to previous methods, while retaining the accuracy of the high fidelity models.
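
    The table-lookup step described above can be sketched in Python: scattering angles are pre-computed on a grid of relative energy and impact parameter, and each collision then costs only an interpolation at a randomly sampled impact parameter. The grid, the placeholder angle function, and the uniform-in-area sampling of the impact parameter are illustrative assumptions, not the reference implementation.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Pre-computed lookup table: scattering angle chi(E, b) on a 2D grid.
        # A smooth placeholder stands in for the classical ab initio calculation.
        energies = np.linspace(0.1, 100.0, 200)        # relative energy (illustrative units)
        impact_params = np.linspace(0.0, 20.0, 300)    # impact parameter (illustrative units)
        E_grid, b_grid = np.meshgrid(energies, impact_params, indexing="ij")
        chi_table = np.pi * np.exp(-b_grid * np.sqrt(E_grid) / 10.0)   # placeholder angles

        lookup = RegularGridInterpolator((energies, impact_params), chi_table)

        def sample_scattering_angle(E_rel, b_max, rng):
            """One collision: draw an impact parameter uniform in area (b = b_max*sqrt(u))
            below the cut-off b_max, then interpolate the pre-computed angle table."""
            b = b_max * np.sqrt(rng.random())
            return float(lookup([[E_rel, b]])[0])

        rng = np.random.default_rng(1)
        print(sample_scattering_angle(E_rel=5.0, b_max=10.0, rng=rng))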

  3. Numerical study on influence of single control surface on aero elastic behavior of forward-swept wing

    NASA Astrophysics Data System (ADS)

    Wang, Ning; Su, Xinbing; Ma, Binlin; Zhang, Xiaofei

    2017-10-01

    In order to study the influence of a single control surface on an elastic forward-swept wing (FSW), a computational fluid dynamics/computational structural dynamics (CFD/CSD) loosely coupled static aero-elastic numerical method was adopted for the simulations. The effects of an elastic FSW with a leading- or trailing-edge control surface on the aero-elastic characteristics were calculated and analysed under high subsonic conditions. The results show that the deflection of each single control surface can change the aero-elastic characteristics of the elastic FSW considerably. Compared with the baseline model, when the leading-edge control surface deflected up, the aerodynamic characteristics were poorer at small angles of attack, but the bending and torsional deformations decreased; at moderate angles of attack, the aerodynamic characteristics improved, but the bending and torsional deformations increased. When the leading-edge control surface deflected down, the aerodynamic characteristics improved, and the bending and torsional deformations decreased at small angles of attack and increased at moderate angles of attack. Compared with the baseline model, when the trailing-edge control surface deflected down, the aerodynamic characteristics improved; the bending and torsional deformations increased at small angles of attack, while at moderate angles of attack the bending deformation increased but the torsional deformation decreased. Therefore, for the elastic FSW, the deflection of the trailing-edge control surface plays a more important role in improving the aerodynamic and elastic deformation characteristics.

  4. A comparison between different finite elements for elastic and aero-elastic analyses.

    PubMed

    Mahran, Mohamed; ELsabbagh, Adel; Negm, Hani

    2017-11-01

    In the present paper, a comparison among five different shell finite elements, including the Linear Triangular Element, Linear Quadrilateral Element, Linear Quadrilateral Element based on deformation modes, 8-node Quadrilateral Element, and 9-Node Quadrilateral Element, is presented. The shape functions and the element equations related to each element are presented through a detailed mathematical formulation. Additionally, the Jacobian matrix for the second order derivatives was simplified and used to derive each element's strain-displacement matrix in bending. The elements were compared using carefully selected elastic and aero-elastic benchmark problems, regarding the number of elements needed to reach convergence, the resulting accuracy, and the required computation time. The most suitable element for elastic free vibration analysis was found to be the Linear Quadrilateral Element with deformation-based shape functions, whereas the most suitable element for stress analysis was the 8-Node Quadrilateral Element, and the most suitable element for aero-elastic analysis was the 9-Node Quadrilateral Element. Although the linear triangular element was the last choice for modal and stress analyses, it produced more accurate results in aero-elastic analyses, albeit with much longer computation time. Additionally, the nine-node quadrilateral element was found to be the best choice for the analysis of laminated composite plates.

  5. Computation of Temperature-Dependent Legendre Moments of a Double-Differential Elastic Cross Section

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arbanas, Goran; Dunn, Michael E; Larson, Nancy M

    2011-01-01

    A general expression for temperature-dependent Legendre moments of a double-differential elastic scattering cross section was derived by Ouisloumen and Sanchez [Nucl. Sci. Eng. 107, 189-200 (1991)]. Attempts to compute this expression are hindered by the three-fold nested integral, limiting its practical application to just the zeroth Legendre moment of isotropic scattering. It is shown that the two innermost integrals can be evaluated analytically to all orders of Legendre moments, and for anisotropic scattering, by a recursive application of the integration-by-parts method. For this method to work, the anisotropic angular distribution in the center of mass is expressed as an expansion in Legendre polynomials. The first several Legendre moments of elastic scattering of neutrons on U-238 are computed at T = 1000 K at an incoming energy of 6.5 eV for isotropic scattering in the center of mass frame. Legendre moments of the anisotropic angular distribution given via Blatt-Biedenharn coefficients are computed at ~1 keV. The results are in agreement with those computed by the Monte Carlo method.

  6. Computer program for the load and trajectory analysis of two DOF bodies connected by an elastic tether: Users manual

    NASA Technical Reports Server (NTRS)

    Doyle, G. R., Jr.; Burbick, J. W.

    1973-01-01

    The derivation of the differential equations of motion of a 3-degree-of-freedom body joined to another 3-degree-of-freedom body by an elastic tether is presented. The tether is represented by a spring and dashpot in parallel. A computer program which integrates the equations of motion is also described. Although the derivation of the equations of motion is for a general system, the computer program is written for defining loads in large boosters recovered by parachutes.
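
    For illustration only, a one-dimensional Python analogue of the spring-and-dashpot-in-parallel tether model is given below; the masses, stiffness, damping, and initial stretch are made-up values, and the original program handles full 3-degree-of-freedom bodies and parachute-recovery loads.

        import numpy as np
        from scipy.integrate import solve_ivp

        m1, m2 = 1000.0, 200.0    # kg, illustrative body masses
        k, c = 5.0e4, 2.0e3       # N/m and N*s/m, spring and dashpot in parallel
        L0 = 10.0                 # m, unstretched tether length

        def rhs(t, y):
            x1, v1, x2, v2 = y
            stretch = (x2 - x1) - L0
            rate = v2 - v1
            # The tether carries load only in tension; a slack tether transmits no force.
            f = k * stretch + c * rate if stretch > 0.0 else 0.0
            return [v1, f / m1, v2, -f / m2]

        # Initial state: bodies 12 m apart (2 m of stretch), both at rest.
        sol = solve_ivp(rhs, (0.0, 5.0), [0.0, 0.0, 12.0, 0.0], max_step=1e-3)

        stretch = (sol.y[2] - sol.y[0]) - L0
        load = np.where(stretch > 0.0, k * stretch + c * (sol.y[3] - sol.y[1]), 0.0)
        print("peak tether load [N]:", load.max())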

  7. Estimation of local stresses and elastic properties of a mortar sample by FFT computation of fields on a 3D image

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Escoda, J.; Departement Materiaux et Mecanique des Composants, Electricite de France, Moret-sur-Loing; Willot, F., E-mail: francois.willot@ensmp.f

    2011-05-15

    This study concerns the prediction of the elastic properties of a 3D mortar image, obtained by micro-tomography, using a combined image segmentation and numerical homogenization approach. The microstructure is obtained by segmentation of the 3D image into aggregates, voids and cement paste. Full-field computations of the elastic response of mortar are undertaken using the Fast Fourier Transform method. Emphasis is placed on highly contrasted properties between aggregates and matrix, to anticipate needs for creep or damage computation. The representative volume element, i.e. the volume size necessary to compute the effective properties with a prescribed accuracy, is given. Overall, the volumes used in this work were sufficient to estimate the effective response of mortar with a precision of 5%, 6% and 10% for contrast ratios of 100, 1000 and 10,000, respectively. Finally, a statistical and local characterization of the component of the stress field parallel to the applied loading is carried out.

  8. Branchial cysts in two Amazon parrots (Amazona species).

    PubMed

    Beaufrère, Hugues; Castillo-Alcala, Fernanda; Holmberg, David L; Boston, Sarah; Smith, Dale A; Taylor, W Michael

    2010-03-01

    A 37-year-old yellow-crowned Amazon parrot (Amazona ochrocephala) and a 20-year-old red-lored Amazon parrot (Amazona autumnalis) each presented with a large mass localized on the lateral neck. In the first bird, there was no evidence of pain or discomfort, and the bird prehended and swallowed food normally. The second bird showed signs of mild upper-gastrointestinal discomfort. Results of an ultrasound examination and aspiration of the mass on each bird revealed a cystic structure. Computed tomography performed on the second bird revealed a large polycystic mass connected to the pharynx by a lateral tract. During surgical resection, both masses were found to originate from the subpharyngeal area. Based on topography and the histopathologic and immunohistochemical results, the masses were determined to be a second branchial cleft cyst in the first case and a second branchial pouch cyst in the second case. In addition, a carcinoma in situ was present within the epithelium of case 1, and the cyst in case 2 was secondarily infected. Branchial cysts are uncommonly diagnosed in veterinary and human medicine. These 2 cases are the first documented in parrots and appear similar to second branchial cysts reported in adult humans.

  9. Delineation of First-Order Elastic Property Closures for Hexagonal Metals Using Fast Fourier Transforms

    PubMed Central

    Landry, Nicholas W.; Knezevic, Marko

    2015-01-01

    Property closures are envelopes representing the complete set of theoretically feasible macroscopic property combinations for a given material system. In this paper, we present a computational procedure based on fast Fourier transforms (FFTs) for delineation of elastic property closures for hexagonal close packed (HCP) metals. The procedure consists of building a database of non-zero Fourier transforms for each component of the elastic stiffness tensor, calculating the Fourier transforms of orientation distribution functions (ODFs), and calculating the ODF-to-elastic property bounds in the Fourier space. In earlier studies, HCP closures were computed using the generalized spherical harmonics (GSH) representation and an assumption of orthotropic sample symmetry; here, the FFT approach allowed us to successfully calculate the closures for a range of HCP metals without invoking any sample symmetry assumption. The methodology presented here facilitates, for the first time, computation of property closures involving normal-shear coupling stiffness coefficients. We found that the representation of these property linkages using FFTs needs more terms than GSH representations. However, the use of FFT representations reduces the computational time involved in producing the property closures due to the use of fast FFT algorithms. Moreover, FFT algorithms are readily available, as opposed to GSH codes. PMID:28793566

  10. Biomedical cloud computing with Amazon Web Services.

    PubMed

    Fusaro, Vincent A; Patil, Prasad; Gafni, Erik; Wall, Dennis P; Tonellato, Peter J

    2011-08-01

    In this overview to biomedical computing in the cloud, we discussed two primary ways to use the cloud (a single instance or cluster), provided a detailed example using NGS mapping, and highlighted the associated costs. While many users new to the cloud may assume that entry is as straightforward as uploading an application and selecting an instance type and storage options, we illustrated that there is substantial up-front effort required before an application can make full use of the cloud's vast resources. Our intention was to provide a set of best practices and to illustrate how those apply to a typical application pipeline for biomedical informatics, but also general enough for extrapolation to other types of computational problems. Our mapping example was intended to illustrate how to develop a scalable project and not to compare and contrast alignment algorithms for read mapping and genome assembly. Indeed, with a newer aligner such as Bowtie, it is possible to map the entire African genome using one m2.2xlarge instance in 48 hours for a total cost of approximately $48 in computation time. In our example, we were not concerned with data transfer rates, which are heavily influenced by the amount of available bandwidth, connection latency, and network availability. When transferring large amounts of data to the cloud, bandwidth limitations can be a major bottleneck, and in some cases it is more efficient to simply mail a storage device containing the data to AWS (http://aws.amazon.com/importexport/). More information about cloud computing, detailed cost analysis, and security can be found in references.
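
    A back-of-the-envelope cost estimate of the kind quoted above (instance-hours plus storage and transfer) is easy to script; the rates below are placeholders, not current AWS prices.

        def estimate_run_cost(hours, hourly_rate, storage_gb=0.0, storage_rate=0.0,
                              transfer_gb=0.0, transfer_rate=0.0):
            """Rough cloud-cost estimate in USD: compute + storage + data transfer."""
            return hours * hourly_rate + storage_gb * storage_rate + transfer_gb * transfer_rate

        # Roughly the scenario above: one instance for 48 hours at an assumed ~$1/hour,
        # plus 200 GB of temporary storage billed at an assumed $0.10/GB-month.
        print(estimate_run_cost(hours=48, hourly_rate=1.0, storage_gb=200, storage_rate=0.10))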

  11. GT-WGS: an efficient and economic tool for large-scale WGS analyses based on the AWS cloud service.

    PubMed

    Wang, Yiqi; Li, Gen; Ma, Mark; He, Fazhong; Song, Zhuo; Zhang, Wei; Wu, Chengkun

    2018-01-19

    Whole-genome sequencing (WGS) plays an increasingly important role in clinical practice and public health. Due to the large data size, WGS data analysis is usually compute-intensive and IO-intensive. Currently it usually takes 30 to 40 h to finish a 50× WGS analysis task, which is far from the ideal speed required by the industry. Furthermore, the high-end infrastructure required by WGS computing is costly in terms of time and money. In this paper, we aim to improve the time efficiency of WGS analysis and minimize the cost by elastic cloud computing. We developed a distributed system, GT-WGS, for large-scale WGS analyses utilizing Amazon Web Services (AWS). Our system won the first prize in the Wind and Cloud challenge held by the Genomics and Cloud Technology Alliance conference (GCTA) committee. The system makes full use of the dynamic pricing mechanism of AWS. We evaluated the performance of GT-WGS with a 55× WGS dataset (400GB fastq) provided by the GCTA 2017 competition. In the best case, it took only 18.4 min to finish the analysis, and the AWS cost of the whole process was only 16.5 US dollars. The accuracy of GT-WGS is 99.9% consistent with that of the Genome Analysis Toolkit (GATK) best practice. We also evaluated the performance of GT-WGS on a real-world dataset provided by the XiangYa hospital, which consists of 5× whole-genome data from 500 samples; on average, GT-WGS finished one 5× WGS analysis task in 2.4 min at a cost of $3.6. WGS is already playing an important role in guiding therapeutic intervention. However, its application is limited by the time cost and computing cost. GT-WGS excelled as an efficient and affordable WGS analysis tool to address this problem. The demo video and supplementary materials of GT-WGS can be accessed at https://github.com/Genetalks/wgs_analysis_demo.
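
    The "dynamic pricing mechanism" refers to EC2 spot instances, whose prices fluctuate with demand. A hedged sketch of how recent spot prices for a candidate instance type might be inspected before submitting an analysis, assuming the boto3 library and configured AWS credentials; the region and instance type are examples only.

        import boto3

        ec2 = boto3.client("ec2", region_name="us-east-1")   # example region
        response = ec2.describe_spot_price_history(
            InstanceTypes=["c4.8xlarge"],                    # example instance type
            ProductDescriptions=["Linux/UNIX"],
            MaxResults=10,
        )
        for record in response["SpotPriceHistory"]:
            print(record["AvailabilityZone"], record["SpotPrice"], record["Timestamp"])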

  12. Unidata Cyberinfrastructure in the Cloud

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Young, J. W.

    2016-12-01

    Data services, software, and user support are critical components of the geosciences cyberinfrastructure that helps researchers advance science. With the maturity of and significant advances in cloud computing, it has recently emerged as an alternative paradigm for developing and delivering a broad array of services over the Internet. Cloud computing is now mature enough in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Given the enormous potential of cloud-based services, Unidata has been moving to augment its software, services, and data delivery mechanisms to align with the cloud-computing paradigm. To realize the above vision, Unidata has worked toward: * Providing access to many types of data from the cloud (e.g., via the THREDDS Data Server, RAMADDA and EDEX servers); * Deploying data-proximate tools to easily process, analyze, and visualize those data in a cloud environment for consumption by anyone, on any device, from anywhere, at any time; * Developing and providing a range of pre-configured and well-integrated tools and services that can be deployed by any university in their own private or public cloud settings. Specifically, Unidata has developed Docker containers for its applications, making them easy to deploy. Docker helps to create "disposable" installs and eliminates many configuration challenges. Containerized applications include tools for data transport, access, analysis, and visualization: THREDDS Data Server, Integrated Data Viewer, GEMPAK, Local Data Manager, RAMADDA Data Server, and Python tools; * Leveraging Jupyter as a central platform and hub, with its powerful set of interlinking tools to connect interactively data servers, Python scientific libraries, scripts, and workflows; * Exploring end-to-end modeling and prediction capabilities in the cloud; * Partnering with NOAA and public cloud vendors (e.g., Amazon and OCC) on the NOAA Big Data Project to harness their capabilities and resources for the benefit of the academic community.
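
    As a small example of the data-proximate access pattern mentioned above, Unidata's siphon package can browse a THREDDS Data Server catalog from Python; the catalog URL below points at Unidata's public demo server and is given only as an illustration.

        from siphon.catalog import TDSCatalog

        # Open a THREDDS Data Server catalog and list a few of its top-level references.
        cat = TDSCatalog("https://thredds.ucar.edu/thredds/catalog.xml")
        print(cat.catalog_name)
        for name in list(cat.catalog_refs)[:5]:
            print(" ", name)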

  13. Unidata's Vision for Transforming Geoscience by Moving Data Services and Software to the Cloud

    NASA Astrophysics Data System (ADS)

    Ramamurthy, Mohan; Fisher, Ward; Yoksas, Tom

    2015-04-01

    Universities are facing many challenges: shrinking budgets, rapidly evolving information technologies, exploding data volumes, multidisciplinary science requirements, and high expectations from students who have grown up with smartphones and tablets. These changes are upending traditional approaches to accessing and using data and software. Unidata recognizes that its products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is moving to augment its products, services, data delivery mechanisms, and applications to align with the cloud-computing paradigm. Specifically, Unidata is working toward establishing a community-based development environment that supports the creation and use of software services to build end-to-end data workflows. The design encourages the creation of services that can be broken into small, independent chunks that provide simple capabilities. Chunks could be used individually to perform a task, or chained into simple or elaborate workflows. The services will also be portable in the form of downloadable Unidata-in-a-box virtual images, allowing their use in researchers' own cloud-based computing environments. In this talk, we present a vision for Unidata's future in cloud-enabled data services and discuss our ongoing efforts to deploy a suite of Unidata data services and tools in the Amazon EC2 and Microsoft Azure cloud environments, including the transfer of real-time meteorological data into its cloud instances, product generation using those data, and the deployment of the TDS, McIDAS ADDE and AWIPS II data servers and the Integrated Data Viewer visualization tool.

  14. A PACS archive architecture supported on cloud services.

    PubMed

    Silva, Luís A Bastião; Costa, Carlos; Oliveira, José Luis

    2012-05-01

    Diagnostic imaging procedures have continuously increased over the last decade and this trend may continue in coming years, creating a great impact on the storage and retrieval capabilities of current PACS. Moreover, many smaller centers do not have the financial resources or requirements that justify the acquisition of a traditional infrastructure. Alternative solutions, such as cloud computing, may help address this emerging need. A tremendous amount of ubiquitous computational power, such as that provided by Google and Amazon, is used every day as a normal commodity. Taking advantage of this new paradigm, an architecture for a Cloud-based PACS archive that provides data privacy, integrity, and availability is proposed. The solution is independent of the cloud provider, and the core modules were successfully instantiated in examples of two cloud computing providers. Operational metrics for several medical imaging modalities were tabulated and compared for Google Storage, Amazon S3, and a LAN PACS. A PACS-as-a-Service archive that provides storage of medical studies using the Cloud was developed. The results show that the solution is robust and that it is possible to store, query, and retrieve all desired studies in a similar way as in a local PACS approach. Cloud computing is an emerging solution that promises high scalability of infrastructures, software, and applications, according to a "pay-as-you-go" business model. The presented architecture uses the cloud to set up medical data repositories and can have a significant impact on healthcare institutions by reducing the required IT infrastructure.
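
    For the storage layer alone, pushing and retrieving an imaging object against Amazon S3 takes only a few boto3 calls; the bucket and key names below are invented, and the proposed architecture adds encryption, anonymization, and provider-independent adapters on top of calls like these.

        import boto3

        s3 = boto3.client("s3")   # assumes AWS credentials are configured

        bucket = "example-pacs-archive"                      # hypothetical bucket name
        key = "studies/patient-0001/ct/slice-001.dcm"        # hypothetical object key

        # Store a DICOM file and retrieve it again; a real archive would encrypt and
        # anonymize the data before upload and index it for later query.
        s3.upload_file("slice-001.dcm", bucket, key)
        s3.download_file(bucket, key, "slice-001-copy.dcm")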

  15. The Start of a Tech Revolution

    ERIC Educational Resources Information Center

    Dyrli, Kurt O.

    2009-01-01

    We are at the start of a revolution in the use of computers, one that analysts predict will rival the development of the PC in its significance. Companies such as Google, HP, Amazon, Sun Microsystems, Sony, IBM, and Apple are orienting their entire business models toward this change, and software maker SAS has announced plans for a $70 million…

  16. Development of computer-aided design system of elastic sensitive elements of automatic metering devices

    NASA Astrophysics Data System (ADS)

    Kalinkina, M. E.; Kozlov, A. S.; Labkovskaia, R. I.; Pirozhnikova, O. I.; Tkalich, V. L.; Shmakov, N. A.

    2018-05-01

    The object of this research is the component base of control and automation system devices, including annular elastic sensitive elements, methods for their modeling, calculation algorithms, and software packages for automating their design. The article is devoted to the development of a computer-aided design system for elastic sensitive elements used in weight- and force-measuring automation devices. Based on mathematical modeling of the deformation processes in a solid, together with the results of static and dynamic analyses, the elastic elements are calculated using the capabilities of modern numerical-simulation software systems. In the simulation, the model was divided into a hexagonal grid of finite elements with a maximum size not exceeding 2.5 mm. The results of the modal and dynamic analyses are presented in this article.

  17. Alignment theory of parallel-beam computed tomography image reconstruction for elastic-type objects using virtual focusing method.

    PubMed

    Jun, Kyungtaek; Kim, Dongwook

    2018-01-01

    X-ray computed tomography has been studied in various fields. Considerable effort has been focused on reconstructing projection image sets from rigid specimens. However, reconstruction of images projected from an object undergoing elastic motion has received minimal attention. In this paper, a mathematical solution is proposed for reconstructing the projection image set obtained from an object with specific elastic motions, namely specimens that expand or contract periodically, regularly, and elliptically. To reconstruct the projection image set from expanded or contracted specimens, methods are presented for detection of the sample's motion modes, mathematical rescaling of pixel values, and conversion of the projection angle for a common layer.

  18. First-principles study of crystal structure, elastic stiffness constants, piezoelectric constants, and spontaneous polarization of orthorhombic Pna21-M2O3 (M = Al, Ga, In, Sc, Y)

    NASA Astrophysics Data System (ADS)

    Shimada, Kazuhiro

    2018-03-01

    We perform first-principles calculations to investigate the crystal structure, elastic and piezoelectric properties, and spontaneous polarization of orthorhombic M2O3 (M = Al, Ga, In, Sc, Y) with Pna21 space group based on density functional theory. The lattice parameters, full elastic stiffness constants, piezoelectric stress and strain constants, and spontaneous polarization are successfully predicted. Comparison with available experimental and computational results indicates the validity of our computational results. Detailed analysis of the results clarifies the difference in the bonding character and the origin of the strong piezoelectric response and large spontaneous polarization.

  19. Simplified computational methods for elastic and elastic-plastic fracture problems

    NASA Technical Reports Server (NTRS)

    Atluri, Satya N.

    1992-01-01

    An overview is given of some of the recent (1984-1991) developments in computational/analytical methods in the mechanics of fractures. Topics covered include analytical solutions for elliptical or circular cracks embedded in isotropic or transversely isotropic solids, with crack faces being subjected to arbitrary tractions; finite element or boundary element alternating methods for two or three dimensional crack problems; a 'direct stiffness' method for stiffened panels with flexible fasteners and with multiple cracks; multiple site damage near a row of fastener holes; an analysis of cracks with bonded repair patches; methods for the generation of weight functions for two and three dimensional crack problems; and domain-integral methods for elastic-plastic or inelastic crack mechanics.

  20. Elastic strain relaxation in interfacial dislocation patterns: I. A parametric energy-based framework

    NASA Astrophysics Data System (ADS)

    Vattré, A.

    2017-08-01

    A parametric energy-based framework is developed to describe the elastic strain relaxation of interface dislocations. By means of the Stroh sextic formalism with a Fourier series technique, the proposed approach couples the classical anisotropic elasticity theory with surface/interface stress and elasticity properties in heterogeneous interface-dominated materials. For any semicoherent interface of interest, the strain energy landscape is computed using the persistent elastic fields produced by infinitely periodic hexagonal-shaped dislocation configurations with planar three-fold nodes. A finite element based procedure combined with the conjugate gradient and nudged elastic band methods is applied to determine the minimum-energy paths along which the pre-computed energy landscapes yield elastically favorable dislocation reactions. Several applications to Au/Cu heterosystems are given. The simple and limiting case of a single set of infinitely periodic dislocations is introduced to determine exact closed-form expressions for stresses. The second limiting case, pure (010) Au/Cu heterophase interfaces containing two crossing sets of straight dislocations, is used to investigate the effects of the non-classical boundary conditions on the stress distributions, including separate and appropriate constitutive relations at semicoherent interfaces and free surfaces. Using the quantized Frank-Bilby equation, it is shown that the elastic strain landscape exhibits intrinsic dislocation configurations for which the junction formation is energetically unfavorable. On the other hand, the mismatched (111) Au/Cu system gives rise to the existence of a minimum-energy path where the fully strain-relaxed equilibrium and non-regular intrinsic hexagonal-shaped dislocation rearrangement is accompanied by a significant removal of the short-range elastic energy.

  1. Boverhof's App Earns Honorable Mention in Amazon's Web Services

    Science.gov Websites

    Boverhof's app earned an honorable mention in a competition held by Amazon Web Services (AWS). Amazon officially announced the winners of its EC2 Spotathon on Monday

  2. Plasma osmolality reference values in African grey parrots (Psittacus erithacus erithacus), Hispaniolan Amazon parrots (Amazona ventralis), and red-fronted macaws (Ara rubrogenys).

    PubMed

    Beaufrère, Hugues; Acierno, Mark; Mitchell, Mark; Guzman, David Sanchez-Migallon; Bryant, Heather; Tully, Thomas N

    2011-06-01

    Birds are routinely presented to veterinarians for dehydration. Success with these cases ultimately depends on providing replacement fluids and re-establishing fluid homeostasis. Few studies have been done to determine reference ranges for plasma osmolality in birds. The goals of this study were to determine reference values for plasma osmolality in 3 species of parrots and to provide recommendations on fluid selection for replacement therapy in these species. Blood samples were collected from 21 adult Hispaniolan Amazon parrots (Amazona ventralis), 21 Congo African grey parrots (Psittacus erithacus erithacus), and 9 red-fronted macaws (Ara rubrogenys), and were placed into lithium heparin containers. Plasma osmolality was measured in duplicate with a freezing point depression osmometer. Summary statistics were computed from the average values. Reference ranges, calculated by using the robust method, were 288-324, 308-345, and 223-369 mOsm/kg in African grey parrots, Hispaniolan Amazon parrots, and red-fronted macaws, respectively. The mean +/- SD values were 306 +/- 7, 327 +/- 7, and 304 +/- 18 mOsm/kg in African grey parrots, Hispaniolan Amazon parrots, and red-fronted macaws, respectively. Comparisons with osmolality values in mammals and values previously reported for psittacine bird species suggest that plasma osmolality is slightly higher in parrots than in mammals, species-specific differences exist, and differences between reported values occur. Overall, fluids with an osmolarity close to 300-320 mOsm/L, such as Normosol-R, Plasmalyte-R, Plasmalyte-A, and NaCl 0.9%, can be recommended in parrots for fluid replacement therapy when isotonic fluids are required.

  3. Determination of stresses in gas-turbine disks subjected to plastic flow and creep

    NASA Technical Reports Server (NTRS)

    Millenson, M B; Manson, S S

    1948-01-01

    A finite-difference method previously presented for computing elastic stresses in rotating disks is extended to include the computation of the disk stresses when plastic flow and creep are considered. A finite-difference method is employed to eliminate numerical integration and to permit nontechnical personnel to make the calculations with a minimum of engineering supervision. Illustrative examples are included to facilitate explanation of the procedure by carrying out the computations on a typical gas-turbine disk through a complete running cycle. The results of the numerical examples presented indicate that plastic flow markedly alters the elastic-stress distribution.

  4. Amazon Forests Response to Droughts: A Perspective from the MAIAC Product

    NASA Technical Reports Server (NTRS)

    Bi, Jian; Myneni, Ranga; Lyapustin, Alexei; Wang, Yujie; Park, Taejin; Chi, Chen; Yan, Kai; Knyazikhin, Yuri

    2016-01-01

    Amazon forests experienced two severe droughts at the beginning of the 21st century: one in 2005 and the other in 2010. How Amazon forests responded to these droughts is critical to the future of the Earth's climate system. Assessing the response of Amazon forests to the droughts over large areas is only possible through satellite remote sensing. Here, we used the Multi-Angle Implementation of Atmospheric Correction (MAIAC) Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation index (VI) data to assess the response of Amazon forests to droughts, and compared the results with those from the standard (Collection 5 and Collection 6) MODIS VI data. Overall, the MAIAC data reveal more realistic inter-annual greenness dynamics of Amazon forests than the standard MODIS data. Our results from the MAIAC data suggest that: (1) the droughts decreased the greenness (i.e., photosynthetic activity) of Amazon forests; (2) the reduction in Amazon wet season precipitation induced by El Niño events could also lead to reduced photosynthetic activity of Amazon forests; and (3) in the year following the water stresses, the greenness of Amazon forests recovered from the preceding decreases. However, as previous research shows that droughts cause Amazon forests to reduce investment in tissue maintenance and defense, it is not clear whether the photosynthesis of Amazon forests will continue to recover after future water stresses, because of the accumulated damage caused by the droughts.

  5. Confluence of the Amazon and Tapajos Rivers, Brazil, South America

    NASA Image and Video Library

    1991-08-11

    This view shows the confluence of the Amazon and the Tapajos Rivers at Santarem, Brazil (2.0S, 55.0W). The Amazon flows from lower left to upper right of the photo. Below the river juncture of the Amazon and Tapajos, there is considerable deforestation activity along the Trans-Amazon Highway.

  6. Biopolymer Chain Elasticity: a novel concept and a least deformation energy principle predicts backbone and overall folding of DNA TTT hairpins in agreement with NMR distances

    PubMed Central

    Pakleza, Christophe; Cognet, Jean A. H.

    2003-01-01

    A new molecular modelling methodology is presented and shown to apply to all published solution structures of DNA hairpins with TTT in the loop. It is based on the theory of elasticity of thin rods and on the assumption that single-stranded B-DNA behaves as a continuous, unshearable, unstretchable and flexible thin rod. It requires four construction steps: (i) computation of the tri-dimensional trajectory of the elastic line, (ii) global deformation of single-stranded helical DNA onto the elastic line, (iii) optimisation of the nucleoside rotations about the elastic line, (iv) energy minimisation to restore backbone bond lengths and bond angles. This theoretical approach called ‘Biopolymer Chain Elasticity’ (BCE) is capable of reproducing the tri-dimensional course of the sugar–phosphate chain and, using NMR-derived distances, of reproducing models close to published solution structures. This is shown by computing three different types of distance criteria. The natural description provided by the elastic line and by the new parameter, Ω, which corresponds to the rotation angles of nucleosides about the elastic line, offers a considerable simplification of molecular modelling of hairpin loops. They can be varied independently from each other, since the global shape of the hairpin loop is preserved in all cases. PMID:12560506

  7. The influence of medium elasticity on the prediction of histotripsy-induced bubble expansion and erythrocyte viability

    NASA Astrophysics Data System (ADS)

    Bader, Kenneth B.

    2018-05-01

    Histotripsy is a form of therapeutic ultrasound that liquefies tissue mechanically via acoustic cavitation. Bubble expansion is paramount in the efficacy of histotripsy therapy, and the cavitation dynamics are strongly influenced by the medium elasticity. In this study, an analytic model to predict histotripsy-induced bubble expansion in a fluid was extended to include the effects of medium elasticity. Good agreement was observed between the predictions of the analytic model and numerical computations utilizing highly nonlinear excitations (shock-scattering histotripsy) and purely tensile pulses (microtripsy). No bubble expansion was computed for either form of histotripsy when the elastic modulus was greater than 20 MPa and the peak negative pressure was less than 50 MPa. Strain in the medium due to the expansion of a single bubble was also tabulated. The viability of red blood cells was calculated as a function of distance from the bubble wall based on empirical data of impulsive stretching of erythrocytes. Red blood cells remained viable at distances further than 44 µm from the bubble wall. As the medium elasticity increased, the distance over which bubble expansion-induced strain influenced red blood cells was found to decrease sigmoidally. These results highlight the relationship between tissue elasticity and the efficacy of histotripsy. In addition, an upper medium elasticity limit was identified, above which histotripsy may not be effective for tissue liquefaction.

  8. Hilbert complexes of nonlinear elasticity

    NASA Astrophysics Data System (ADS)

    Angoshtari, Arzhang; Yavari, Arash

    2016-12-01

    We introduce some Hilbert complexes involving second-order tensors on flat compact manifolds with boundary that describe the kinematics and the kinetics of motion in nonlinear elasticity. We then use the general framework of Hilbert complexes to write Hodge-type and Helmholtz-type orthogonal decompositions for second-order tensors. As some applications of these decompositions in nonlinear elasticity, we study the strain compatibility equations of linear and nonlinear elasticity in the presence of Dirichlet boundary conditions and the existence of stress functions on non-contractible bodies. As an application of these Hilbert complexes in computational mechanics, we briefly discuss the derivation of a new class of mixed finite element methods for nonlinear elasticity.

  9. Classifying proteins into functional groups based on all-versus-all BLAST of 10 million proteins.

    PubMed

    Kolker, Natali; Higdon, Roger; Broomall, William; Stanberry, Larissa; Welch, Dean; Lu, Wei; Haynes, Winston; Barga, Roger; Kolker, Eugene

    2011-01-01

    To address the monumental challenge of assigning function to millions of sequenced proteins, we completed a first-of-its-kind all-versus-all sequence alignment using BLAST for 9.9 million proteins in the UniRef100 database. Microsoft Windows Azure produced over 3 billion filtered records in 6 days using 475 eight-core virtual machines. Protein classification into functional groups was then performed using Hive and custom jars implemented on top of Apache Hadoop utilizing the MapReduce paradigm. First, using the Clusters of Orthologous Genes (COG) database, a length-normalized bit score (LNBS) was determined to be the best similarity measure for classification of proteins. LNBS achieved sensitivity and specificity of 98% each. Second, out of 5.1 million bacterial proteins, about two-thirds were assigned to significantly extended COG groups, encompassing 30 times more assigned proteins. Third, the remaining proteins were classified into protein functional groups using an innovative implementation of a single-linkage algorithm on an in-house Hadoop compute cluster. This implementation significantly reduces the run time for nonindexed queries and optimizes efficient clustering on a large scale. The performance was also verified on Amazon Elastic MapReduce. This clustering assigned nearly 2 million proteins to approximately half a million different functional groups. A similar approach was applied to classify 2.8 million eukaryotic sequences, resulting in over 1 million proteins being assigned to existing KOG groups and the remainder being clustered into 100,000 functional groups.
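
    The single-linkage grouping step described above can be illustrated with a standard union-find (disjoint-set) structure: any pair of proteins whose similarity exceeds a chosen threshold is merged into the same functional group. The sketch below is a minimal, non-distributed illustration of that idea; the record layout, the LNBS threshold value, and the toy hits are hypothetical stand-ins for the Hadoop/Hive implementation described in the abstract.

      # Minimal single-linkage grouping over pairwise similarity records (illustrative only).
      # Each record is (protein_a, protein_b, lnbs); the lnbs values and 0.9 threshold are hypothetical.

      def find(parent, x):
          # Path-compressing find for the disjoint-set forest.
          while parent[x] != x:
              parent[x] = parent[parent[x]]
              x = parent[x]
          return x

      def single_linkage(pairs, threshold=0.9):
          parent = {}
          for a, b, lnbs in pairs:
              parent.setdefault(a, a)
              parent.setdefault(b, b)
              if lnbs >= threshold:                 # single linkage: one strong hit joins the groups
                  ra, rb = find(parent, a), find(parent, b)
                  if ra != rb:
                      parent[ra] = rb
          groups = {}
          for p in parent:
              groups.setdefault(find(parent, p), []).append(p)
          return list(groups.values())

      if __name__ == "__main__":
          hits = [("P1", "P2", 0.95), ("P2", "P3", 0.92), ("P4", "P5", 0.40)]
          print(single_linkage(hits))   # [['P1', 'P2', 'P3'], ['P4'], ['P5']]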

  10. Dynamics of elastic systems

    NASA Astrophysics Data System (ADS)

    Sankovich, Vladimir

    1998-12-01

    The goal of this paper is to build a consistent physical theory of the dynamics of the bat-ball interaction. This requires creating realistic models for both the softball bat and the softball. Some of the features of these models are known phenomenologically, from experiments conducted in our laboratory; others will be introduced and computed from first principles here for the first time. Both interacting objects are treated from the viewpoint of the theory of elasticity, and it is shown how a computer can be used to accurately calculate all the relevant characteristics of bat-ball collisions. It is also shown how the major elastic parameters of the material constituting the interior of a softball can be determined using the existing experimental data. These parameters, such as the Young's modulus, the Poisson ratio and the damping coefficient, are vital for the accurate description of the ball's dynamics. We demonstrate how the existing theories of the elastic behavior of solid bars and hollow shells can be augmented to simplify the resulting equations and make the subsequent computer analysis feasible. The standard system of fourth-order PDEs is reduced to a system of the second order because of the inclusion of the usually ignored effects of the shear forces in the bat.

  11. AceCloud: Molecular Dynamics Simulations in the Cloud.

    PubMed

    Harvey, M J; De Fabritiis, G

    2015-05-26

    We present AceCloud, an on-demand service for molecular dynamics simulations. AceCloud is designed to facilitate the secure execution of large ensembles of simulations on an external cloud computing service (currently Amazon Web Services). The AceCloud client, integrated into the ACEMD molecular dynamics package, provides an easy-to-use interface that abstracts all aspects of interaction with the cloud services. This gives the user the experience that all simulations are running on their local machine, minimizing the learning curve typically associated with the transition to using high performance computing services.

  12. Hybrid cloud and cluster computing paradigms for life science applications

    PubMed Central

    2010-01-01

    Background: Clouds and MapReduce have shown themselves to be a broadly useful approach to scientific computing, especially for parallel, data-intensive applications. However, they have limited applicability to some areas such as data mining, because MapReduce has poor performance on problems with an iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI, leading to a hybrid cloud and cluster environment. This motivates the design and implementation of an open source Iterative MapReduce system, Twister. Results: Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability in several important non-iterative cases. These are linked to MPI applications for final stages of the data analysis. Further, we have released the open source Twister Iterative MapReduce and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life sciences applications. Conclusions: The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment, while Twister promises a uniform programming environment for many Life Sciences applications. Methods: We used commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data-intensive computing. Several applications were developed in MPI, MapReduce and Twister in these different environments. PMID:21210982

  13. Hybrid cloud and cluster computing paradigms for life science applications.

    PubMed

    Qiu, Judy; Ekanayake, Jaliya; Gunarathne, Thilina; Choi, Jong Youl; Bae, Seung-Hee; Li, Hui; Zhang, Bingjing; Wu, Tak-Lon; Ruan, Yang; Ekanayake, Saliya; Hughes, Adam; Fox, Geoffrey

    2010-12-21

    Clouds and MapReduce have shown themselves to be a broadly useful approach to scientific computing, especially for parallel, data-intensive applications. However, they have limited applicability to some areas such as data mining, because MapReduce has poor performance on problems with an iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI, leading to a hybrid cloud and cluster environment. This motivates the design and implementation of an open source Iterative MapReduce system, Twister. Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability in several important non-iterative cases. These are linked to MPI applications for final stages of the data analysis. Further, we have released the open source Twister Iterative MapReduce and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life sciences applications. The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment, while Twister promises a uniform programming environment for many Life Sciences applications. We used commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data-intensive computing. Several applications were developed in MPI, MapReduce and Twister in these different environments.
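
    The iterative structure that plain MapReduce handles poorly can be illustrated with a tiny k-means-style loop: every iteration is one map (assign points to the nearest centroid) and one reduce (average each group), and the reduced output feeds the next map. The sketch below is a generic, single-machine illustration of that pattern; it is not Twister's API, and the data and iteration count are made up.

      # Generic iterative map-reduce loop (k-means flavored), illustrating why iteration
      # forces repeated job setup in plain MapReduce. Data and parameters are hypothetical.

      def map_assign(points, centroids):
          # "Map": emit (nearest-centroid-index, point) pairs.
          for p in points:
              idx = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
              yield idx, p

      def reduce_update(pairs, k):
          # "Reduce": average the points assigned to each centroid.
          sums, counts = [0.0] * k, [0] * k
          for idx, p in pairs:
              sums[idx] += p
              counts[idx] += 1
          return [sums[i] / counts[i] if counts[i] else 0.0 for i in range(k)]

      points = [1.0, 1.2, 0.8, 8.0, 8.5, 7.9]
      centroids = [0.0, 10.0]
      for _ in range(10):                       # the iterative outer loop MapReduce lacks natively
          centroids = reduce_update(map_assign(points, centroids), len(centroids))
      print(centroids)                          # converges to roughly [1.0, 8.13]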

  14. Running Neuroimaging Applications on Amazon Web Services: How, When, and at What Cost?

    PubMed

    Madhyastha, Tara M; Koh, Natalie; Day, Trevor K M; Hernández-Fernández, Moises; Kelley, Austin; Peterson, Daniel J; Rajan, Sabreena; Woelfer, Karl A; Wolf, Jonathan; Grabowski, Thomas J

    2017-01-01

    The contribution of this paper is to identify and describe current best practices for using Amazon Web Services (AWS) to execute neuroimaging workflows "in the cloud." Neuroimaging offers a vast set of techniques by which to interrogate the structure and function of the living brain. However, many of the scientists for whom neuroimaging is an extremely important tool have limited training in parallel computation. At the same time, the field is experiencing a surge in computational demands, driven by a combination of data-sharing efforts, improvements in scanner technology that allow acquisition of images with higher image resolution, and by the desire to use statistical techniques that stress processing requirements. Most neuroimaging workflows can be executed as independent parallel jobs and are therefore excellent candidates for running on AWS, but the overhead of learning to do so and determining whether it is worth the cost can be prohibitive. In this paper we describe how to identify neuroimaging workloads that are appropriate for running on AWS, how to benchmark execution time, and how to estimate cost of running on AWS. By benchmarking common neuroimaging applications, we show that cloud computing can be a viable alternative to on-premises hardware. We present guidelines that neuroimaging labs can use to provide a cluster-on-demand type of service that should be familiar to users, and scripts to estimate cost and create such a cluster.
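
    A first-order version of the cost estimate described above multiplies per-subject runtime by the hourly price of the chosen instance type and by the number of subjects, ignoring storage and data-transfer charges. The Python sketch below is only a back-of-the-envelope illustration; the hourly rate, runtimes, and job counts are placeholders, not benchmarked AWS figures.

      # Back-of-the-envelope AWS compute cost estimate for an embarrassingly parallel
      # neuroimaging workload. All numbers below are hypothetical placeholders.

      def estimate_cost(n_subjects, hours_per_subject, hourly_rate_usd, instances):
          """Total instance-hours are fixed by the workload; wall-clock time shrinks
          with more instances, but the compute bill stays roughly the same."""
          total_instance_hours = n_subjects * hours_per_subject
          wall_clock_hours = total_instance_hours / instances
          return total_instance_hours * hourly_rate_usd, wall_clock_hours

      cost, wall = estimate_cost(n_subjects=100, hours_per_subject=2.0,
                                 hourly_rate_usd=0.40,   # placeholder on-demand price
                                 instances=25)
      print(f"~${cost:.2f} of compute, ~{wall:.1f} h wall-clock")  # ~$80.00, ~8.0 h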

  15. Polycrystalline gamma plutonium's elastic moduli versus temperature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Migliori, Albert; Betts, J; Trugman, A

    2009-01-01

    Resonant ultrasound spectroscopy was used to measure the elastic properties of pure polycrystalline ²³⁹Pu in the γ phase. Shear and longitudinal elastic moduli were measured simultaneously, and the bulk modulus was computed from them. A smooth, linear, and large decrease of all elastic moduli with increasing temperature was observed. The Poisson ratio, calculated from these moduli, increases from 0.242 at 519 K to 0.252 at 571 K. These measurements on extremely well characterized pure Pu agree with other reported results where overlap occurs.
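
    For an isotropic polycrystal, the bulk modulus and Poisson ratio follow from the two moduli that resonant ultrasound measures directly: writing M for the longitudinal modulus and G for the shear modulus, K = M - 4G/3 and the Poisson ratio is (3K - 2G)/(2(3K + G)). The snippet below simply evaluates these textbook relations; the numeric inputs are placeholders, not the measured plutonium values.

      # Textbook isotropic elasticity relations used to reduce resonant-ultrasound data.
      # Inputs are placeholder values, not the measured 239Pu moduli.

      def bulk_modulus(longitudinal_M, shear_G):
          # M = K + 4G/3  ->  K = M - 4G/3
          return longitudinal_M - 4.0 * shear_G / 3.0

      def poisson_ratio(bulk_K, shear_G):
          # nu = (3K - 2G) / (2(3K + G))
          return (3.0 * bulk_K - 2.0 * shear_G) / (2.0 * (3.0 * bulk_K + shear_G))

      M, G = 100.0, 35.0            # GPa, hypothetical
      K = bulk_modulus(M, G)        # 53.3 GPa
      print(K, poisson_ratio(K, G)) # Poisson ratio ~0.23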

  16. Elastic and thermal expansion asymmetry in dense molecular materials.

    PubMed

    Burg, Joseph A; Dauskardt, Reinhold H

    2016-09-01

    The elastic modulus and coefficient of thermal expansion are fundamental properties of elastically stiff molecular materials and are assumed to be the same (symmetric) under both tension and compression loading. We show that molecular materials can have a marked asymmetric elastic modulus and coefficient of thermal expansion that are inherently related to terminal chemical groups that limit molecular network connectivity. In compression, terminal groups sterically interact to stiffen the network, whereas in tension they interact less and disconnect the network. The existence of asymmetric elastic and thermal expansion behaviour has fundamental implications for computational approaches to molecular materials modelling and practical implications on the thermomechanical strains and associated elastic stresses. We develop a design space to control the degree of elastic asymmetry in molecular materials, a vital step towards understanding their integration into device technologies.

  17. A study on Rayleigh wave dispersion in bone according to Mindlin's Form II gradient elasticity.

    PubMed

    Vavva, Maria G; Gergidis, Leonidas N; Protopappas, Vasilios C; Charalambopoulos, Antonios; Polyzos, Demosthenes; Fotiadis, Dimitrios I

    2014-05-01

    The classical elasticity cannot effectively describe bone's mechanical behavior since only homogeneous media and local stresses are assumed. Additionally, it cannot predict the dispersive nature of the Rayleigh wave which has been reported in experimental studies and was also demonstrated in a previous computational study by adopting Mindlin's Form II gradient elasticity. In this work Mindlin's theory is employed to analytically determine the dispersion of Rayleigh waves in a strain gradient elastic half-space. An isotropic semi-infinite space is considered with properties equal to those of bone and dynamic behavior suffering from microstructural effects. Microstructural effects are considered by incorporating four intrinsic parameters in the stress analysis. The results are presented in the form of group and phase velocity dispersion curves and compared with existing computational results and semi-analytical curves calculated for a simpler case of Rayleigh waves in dipolar gradient elastic half-spaces. Comparisons are also performed with the velocity of the first-order antisymmetric mode propagating in a dipolar plate so as to observe the Rayleigh asymptotic behavior. It is shown that Mindlin's Form II gradient elasticity can effectively describe the dispersive nature of Rayleigh waves. This study could be regarded as a step toward the ultrasonic characterization of bone.

  18. A crowdsourced nickel-and-dime approach to analog OBM research: A behavioral economic framework for understanding workforce attrition.

    PubMed

    Henley, Amy J; DiGennaro Reed, Florence D; Reed, Derek D; Kaplan, Brent A

    2016-09-01

    Incentives are a popular method to achieve desired employee performance; however, research on optimal incentive magnitude is lacking. Behavioral economic demand curves model persistence of responding in the face of increasing cost and may be suitable to examine the reinforcing value of incentives on work performance. The present use-inspired basic study integrated an experiential human operant task within a crowdsourcing platform to evaluate the applicability of behavioral economics for quantifying changes in workforce attrition. Participants included 88 Amazon Mechanical Turk Workers who earned either a $0.05 or $0.10 incentive for completing a progressively increasing response requirement. Analyses revealed statistically significant differences in breakpoint between the two groups. Additionally, a novel translation of the Kaplan-Meier survival-curve analyses for use within a demand curve framework allowed for examination of elasticity of workforce attrition. Results indicate greater inelastic attrition in the $0.05 group. We discuss the benefits of a behavioral economic approach to modeling employee behavior, how the metrics obtained from the elasticity of workforce attrition analyses (e.g., Pmax) may be used to set goals for employee behavior while balancing organizational costs, and how economy type may have influenced observed outcomes. © 2016 Society for the Experimental Analysis of Behavior.

  19. Leaf Pressure Volume Data in Caxiuana and Tapajos National Forest, Para, Brazil (2011)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powell, Thomas; Moorcroft, Paul

    Pressure volume curve measurements on leaves of canopy trees from the Caxiuana and Tapajos National Forests, Para, Brazil. Tapajos samples were harvested from the km 67 forested area, which is adjacent to the decommissioned throughfall exclusion drought experimental plot. Caxiuana samples were harvested from trees growing in the throughfall exclusion plots. Data were collected in 2011. Dataset includes: date of measurement, site ID, plot ID, tree ID (species, tree tag #), leaf area, fresh weight, relative weight, leaf water potential, and leaf water loss. P-V curve parameters (turgor loss point, osmotic potential, and bulk modulus of elasticity) can be found in Powell et al. (2017) Differences in xylem cavitation resistance and leaf hydraulic traits explain differences in drought tolerance among mature Amazon rainforest trees. Global Change Biology.

  20. Numerical aspects in modeling high Deborah number flow and elastic instability

    NASA Astrophysics Data System (ADS)

    Kwon, Youngdon

    2014-05-01

    Investigating highly nonlinear viscoelastic flow in a 2D domain, we explore problems and properties possibly inherent in the streamline upwinding technique (SUPG) and then present various results on elastic instability. The mathematically stable Leonov model written in tensor-logarithmic formulation is employed in the framework of the finite element method for spatial discretization of several representative problem domains. To enhance computation speed, a decoupled integration scheme is applied for shear-thinning and Boger-type fluids. In the analysis of 4:1 contraction flow at low and moderate values of the Deborah number (De), the solution with the SUPG method does not show a noticeable difference from the one obtained without upwinding. On the other hand, in the flow regime of high De, especially in the state of elastic instability, the SUPG significantly distorts the flow field and the result differs considerably from the solution acquired straightforwardly. When the strength of elastic flow, and thus the nonlinearity, further increases, the computational scheme with upwinding fails to converge and an evolutionary solution is no longer available. All these results suggest that extreme care has to be taken where upwinding is applied, and one first has to establish the validity of this algorithm in the case of high nonlinearity. On the contrary, the straightforward computation with no upwinding can efficiently model representative phenomena of elastic instability in such benchmark problems as 4:1 contraction flow, flow over a circular cylinder, and flow over an asymmetric array of cylinders. Asymmetry of the flow field occurring in the symmetric domain, enhanced spatial and temporal fluctuation of dynamic variables, and flow effects caused by extension hardening are properly described in this study.

  1. The asymptotic homogenization elasticity tensor properties for composites with material discontinuities

    NASA Astrophysics Data System (ADS)

    Penta, Raimondo; Gerisch, Alf

    2017-01-01

    The classical asymptotic homogenization approach for linear elastic composites with discontinuous material properties is considered as a starting point. The sharp length scale separation between the fine periodic structure and the whole material formally leads to anisotropic elastic-type balance equations on the coarse scale, where the arising fourth rank operator is to be computed solving single periodic cell problems on the fine scale. After revisiting the derivation of the problem, which here explicitly points out how the discontinuity in the individual constituents' elastic coefficients translates into stress jump interface conditions for the cell problems, we prove that the gradient of the cell problem solution is minor symmetric and that its cell average is zero. This property holds for perfect interfaces only (i.e., when the elastic displacement is continuous across the composite's interface) and can be used to assess the accuracy of the computed numerical solutions. These facts are further exploited, together with the individual constituents' elastic coefficients and the specific form of the cell problems, to prove a theorem that characterizes the fourth rank operator appearing in the coarse-scale elastic-type balance equations as a composite material effective elasticity tensor. We both recover known facts, such as minor and major symmetries and positive definiteness, and establish new facts concerning the Voigt and Reuss bounds. The latter are shown for the first time without assuming any equivalence between coarse and fine-scale energies ( Hill's condition), which, in contrast to the case of representative volume elements, does not identically hold in the context of asymptotic homogenization. We conclude with instructive three-dimensional numerical simulations of a soft elastic matrix with an embedded cubic stiffer inclusion to show the profile of the physically relevant elastic moduli (Young's and shear moduli) and Poisson's ratio at increasing (up to 100 %) inclusion's volume fraction, thus providing a proxy for the design of artificial elastic composites.

  2. CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment

    NASA Astrophysics Data System (ADS)

    Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.

    2017-12-01

    Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network and storage resources, which must be specifically purchased, maintained and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premise clusters, or commercial Cloud providers. In this talk, we will present one such architecture named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images, and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, either using simple EC2 instances, or advanced AWS services such as Amazon Lambda and ECS (EC2 Container Services).

  3. Pharmacokinetics of Compounded Intravenous and Oral Gabapentin in Hispaniolan Amazon Parrots (Amazona ventralis).

    PubMed

    Baine, Katherine; Jones, Michael P; Cox, Sherry; Martín-Jiménez, Tomás

    2015-09-01

    Neuropathic pain is a manifestation of chronic pain that arises with damage to the somatosensory system. Pharmacologic treatment recommendations for alleviation of neuropathic pain are often multimodal, and the few reports communicating treatment of suspected neuropathic pain in avian patients describe the use of gabapentin as part of the therapeutic regimen. To determine the pharmacokinetics of gabapentin in Hispaniolan Amazon parrots ( Amazona ventralis ), compounded gabapentin suspensions were administered at 30 mg/kg IV to 2 birds, 10 mg/kg PO to 3 birds, and 30 mg/kg PO to 3 birds. Blood samples were collected immediately before and at 9 different time points after drug administration. Plasma samples were analyzed for gabapentin concentration, and pharmacokinetic parameters were calculated with both a nonlinear mixed-effect approach and a noncompartmental analysis. The best compartmental, oral model was used to simulate the concentration-time profiles resulting from different dosing scenarios. Mild sedation was observed in both study birds after intravenous injection. Computer simulation of different dosing scenarios with the mean parameter estimates showed that 15 mg/kg every 8 hours would be a starting point for oral dosing in Hispaniolan Amazon parrots based on effective plasma concentrations reported for human patients; however, additional studies need to be performed to establish a therapeutic dose.

  4. Genetic diversity of the HLA-G coding region in Amerindian populations from the Brazilian Amazon: a possible role of natural selection.

    PubMed

    Mendes-Junior, C T; Castelli, E C; Meyer, D; Simões, A L; Donadi, E A

    2013-12-01

    HLA-G has an important role in the modulation of the maternal immune system during pregnancy, and evidence that balancing selection acts in the promoter and 3'UTR regions has been previously reported. To determine whether selection acts on the HLA-G coding region in the Amazon Rainforest, exons 2, 3 and 4 were analyzed in a sample of 142 Amerindians from nine villages of five isolated tribes that inhabit the Central Amazon. Six previously described single-nucleotide polymorphisms (SNPs) were identified, and the Expectation-Maximization (EM) and PHASE algorithms were used to computationally reconstruct SNP haplotypes (HLA-G alleles). A new HLA-G allele, which originated in Amerindian populations by a crossing-over event between two widespread HLA-G alleles, was identified in 18 individuals. Neutrality tests indicated that natural selection plays a complex role in the HLA-G coding region. Although balancing selection is the type of selection that shapes variability at a local level (Native American populations), we have also shown that purifying selection may occur on a worldwide scale. Moreover, balancing selection does not seem to act on the coding region as strongly as it acts on the flanking regulatory regions, and such a coding signature may actually reflect a hitchhiking effect.

  5. Impact Cratering Calculations

    NASA Technical Reports Server (NTRS)

    Ahrens, Thomas J.

    2001-01-01

    This research is computational/theoretical and complements the Caltech experimental program. We have developed an understanding of the basic physical processes and produced computational models and implemented these into Eulerian and Lagrangian finite element codes. The key issues we have addressed include the conditions required for: faulting (strain localization), elastic moduli weakening, dynamic weakening (layering elastic instabilities and fluidization), bulking (creation of porosity at zero pressure) and compaction of pores, frictional melting (creation of pseudotachylytes), partial and selective devolatilization of materials (e.g. CaCO3, water/ice mixtures), and debris flows.

  6. Convergence of Legendre Expansion of Doppler-Broadened Double Differential Elastic Scattering Cross Section

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arbanas, Goran; Dunn, Michael E; Larson, Nancy M

    2012-01-01

    Convergence properties of the Legendre expansion of a Doppler-broadened double-differential elastic neutron scattering cross section of ²³⁸U near the 6.67 eV resonance at a temperature of 10³ K are studied. A variance of the Legendre expansion from a reference Monte Carlo computation is used as a measure of convergence and is computed for as many as 15 terms in the Legendre expansion. When the outgoing energy equals the incoming energy, it is found that the Legendre expansion converges very slowly. Therefore, a supplementary method of computing many higher-order terms is suggested and employed for this special case.
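
    The kind of convergence being measured can be illustrated by expanding an angular distribution f(μ) on [-1, 1] in Legendre polynomials, f(μ) ≈ Σ_l (2l+1)/2 · a_l · P_l(μ) with a_l = ∫ f(μ) P_l(μ) dμ, and watching how the truncation error decays with the number of terms. The sketch below does this for a generic, sharply peaked toy function using NumPy's Legendre utilities; the toy kernel is hypothetical and is not the Doppler-broadened ²³⁸U cross section of the study.

      # Legendre expansion of a peaked angular distribution and its truncation error
      # versus expansion order (toy stand-in for the Doppler-broadened kernel).
      import numpy as np
      from numpy.polynomial import legendre as L

      def legendre_coeffs(f, lmax, nquad=200):
          # a_l = integral of f(mu) * P_l(mu) over [-1, 1], via Gauss-Legendre quadrature
          mu, w = L.leggauss(nquad)
          return np.array([np.sum(w * f(mu) * L.legval(mu, np.eye(lmax + 1)[l]))
                           for l in range(lmax + 1)])

      f = lambda mu: np.exp(-50.0 * (1.0 - mu))        # hypothetical forward-peaked kernel
      mu_test = np.linspace(-1.0, 1.0, 2001)
      ref = f(mu_test)
      a = legendre_coeffs(f, lmax=15)
      for lmax in (3, 7, 15):
          c = (2 * np.arange(lmax + 1) + 1) / 2.0 * a[:lmax + 1]   # series coefficients
          approx = L.legval(mu_test, c)
          print(lmax, np.mean((approx - ref) ** 2))    # error shrinks slowly for peaked f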

  7. Large-Scale, Multi-Sensor Atmospheric Data Fusion Using Hybrid Cloud Computing

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E. J.

    2015-12-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, MODIS, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration studies presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over 10 years of data. HySDS is a Hybrid-Cloud Science Data System that has been developed and applied under NASA AIST, MEaSUREs, and ACCESS grants. HySDS uses the SciFlow workflow engine to partition analysis workflows into parallel tasks (e.g. segmenting by time or space) that are pushed into a durable job queue. The tasks are "pulled" from the queue by worker Virtual Machines (VMs) and executed in an on-premise Cloud (Eucalyptus or OpenStack) or at Amazon in the public Cloud or govCloud. In this way, years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the transferred data. We are using HySDS to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a MEaSUREs grant. We will present the architecture of HySDS, describe the achieved "clock time" speedups in fusing datasets on our own nodes and in the Amazon Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer. Our system demonstrates how one can pull A-Train variables (Levels 2 & 3) on-demand into the Amazon Cloud, and cache only those variables that are heavily used, so that any number of compute jobs can be executed "near" the multi-sensor data. Decade-long, multi-sensor studies can be performed without pre-staging data, with the researcher paying only his own Cloud compute bill.
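
    The push-to-queue, pull-by-worker pattern described above can be sketched with a plain in-process queue: the workflow engine partitions the analysis into independent tasks (for example, one per day of data), pushes them onto a queue, and any number of workers pull and execute them until the queue drains. The Python sketch below is a generic illustration of that pattern, not the actual SciFlow/HySDS code; the task contents and worker count are made up.

      # Generic push/pull work-queue pattern (stand-in for a durable cloud job queue).
      # Tasks and worker count are hypothetical; real workers would be cloud VMs.
      import queue, threading

      def worker(tasks, results):
          while True:
              try:
                  task = tasks.get_nowait()          # "pull" the next granule/time slice
              except queue.Empty:
                  return
              results.append(f"processed {task}")    # stand-in for the real science job
              tasks.task_done()

      tasks, results = queue.Queue(), []
      for day in range(1, 31):                       # e.g., partition one month by day
          tasks.put(f"2010-07-{day:02d}")

      workers = [threading.Thread(target=worker, args=(tasks, results)) for _ in range(4)]
      for w in workers: w.start()
      for w in workers: w.join()
      print(len(results), "tasks completed")         # 30 tasks completed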

  8. Determination of elastic stresses in gas-turbine disks

    NASA Technical Reports Server (NTRS)

    Manson, S S

    1947-01-01

    A method is presented for the calculation of elastic stresses in symmetrical disks typical of those of a high-temperature gas turbine. The method is essentially a finite-difference solution of the equilibrium and compatibility equations for elastic stresses in a symmetrical disk. Account can be taken of point-to-point variations in disk thickness, in temperature, in elastic modulus, in coefficient of thermal expansion, in material density, and in Poisson's ratio. No numerical integration or trial-and-error procedures are involved and the computations can be performed in rapid and routine fashion by nontechnical computers with little engineering supervision. Checks on problems for which exact mathematical solutions are known indicate that the method yields results of high accuracy. Illustrative examples are presented to show the manner of treating solid disks, disks with central holes, and disks constructed either of a single material or two or more welded materials. The effect of shrink fitting is taken into account by a very simple device.
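
    For the special case of a solid disk of uniform thickness and uniform temperature, the elastic stresses have a classical closed-form solution that such finite-difference calculations are often checked against: σ_r = (3+ν)/8 · ρω²(b² − r²) and σ_θ = ρω²/8 · [(3+ν)b² − (1+3ν)r²], where b is the rim radius. The snippet below evaluates this textbook check case only; it is not the NACA finite-difference procedure itself, and the material numbers are placeholders.

      # Closed-form elastic stresses in a uniform solid rotating disk (check case only;
      # not the finite-difference method of the report). Inputs are placeholders.
      import numpy as np

      def uniform_disk_stresses(r, b, rho, omega, nu):
          """Radial and tangential stress (Pa) at radii r for a solid disk of rim radius b."""
          sigma_r = (3.0 + nu) / 8.0 * rho * omega**2 * (b**2 - r**2)
          sigma_t = rho * omega**2 / 8.0 * ((3.0 + nu) * b**2 - (1.0 + 3.0 * nu) * r**2)
          return sigma_r, sigma_t

      r = np.linspace(0.0, 0.3, 7)                    # m
      sr, st = uniform_disk_stresses(r, b=0.3, rho=8000.0, omega=1000.0, nu=0.3)
      print(sr.max() / 1e6, st.max() / 1e6)           # both peak at the center, ~297 MPa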

  9. Elastic constants from microscopic strain fluctuations

    PubMed

    Sengupta; Nielaba; Rao; Binder

    2000-02-01

    Fluctuations of the instantaneous local Lagrangian strain ε_ij(r, t), measured with respect to a static "reference" lattice, are used to obtain accurate estimates of the elastic constants of model solids from atomistic computer simulations. The measured strains are systematically coarse-grained by averaging them within subsystems (of size L_b) of a system (of total size L) in the canonical ensemble. Using a simple finite size scaling theory we predict the behavior of the fluctuations as a function of L_b/L and extract elastic constants of the system in the thermodynamic limit at nonzero temperature. Our method is simple to implement, efficient, and general enough to be able to handle a wide class of model systems, including those with singular potentials without any essential modification. We illustrate the technique by computing isothermal elastic constants of "hard" and "soft" disk triangular solids in two dimensions from Monte Carlo and molecular dynamics simulations. We compare our results with those from earlier simulations and theory.
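
    In the strain-fluctuation approach sketched above, the compliances follow from the covariance of the sampled strains, S = (V / k_B T) ⟨ε ε⟩ (fluctuations taken about the mean), and the elastic constants are obtained by inverting that matrix, C = S⁻¹. The snippet below shows this reduction in Voigt notation for a stream of sampled strain vectors; it is a schematic post-processing step with synthetic inputs, not the finite-size-scaling analysis of the paper.

      # Elastic constants from strain fluctuations (Voigt notation), schematically:
      #   S = (V / kB T) * Cov(strain),   C = inv(S).
      # The sampled strains below are synthetic; a real run would use MC/MD snapshots.
      import numpy as np

      def elastic_constants_from_strains(strains, volume, kB_T):
          """strains: (n_samples, 6) Voigt strain vectors measured vs. a reference lattice."""
          cov = np.cov(strains, rowvar=False)        # <eps_a eps_b> - <eps_a><eps_b>
          compliance = volume / kB_T * cov           # S_ab
          return np.linalg.inv(compliance)           # C_ab

      rng = np.random.default_rng(0)
      fake_strains = 1e-3 * rng.standard_normal((5000, 6))   # synthetic, uncorrelated
      C = elastic_constants_from_strains(fake_strains, volume=1.0e3, kB_T=1.0)
      print(np.diag(C))                               # order of kB_T / (V * var) ~ 1e3 here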

  10. Two modified symplectic partitioned Runge-Kutta methods for solving the elastic wave equation

    NASA Astrophysics Data System (ADS)

    Su, Bo; Tuo, Xianguo; Xu, Ling

    2017-08-01

    Based on a modified strategy, two modified symplectic partitioned Runge-Kutta (PRK) methods are proposed for the temporal discretization of the elastic wave equation. The two symplectic schemes are similar in form but are different in nature. After the spatial discretization of the elastic wave equation, the ordinary Hamiltonian formulation for the elastic wave equation is presented. The PRK scheme is then applied for time integration. An additional term associated with spatial discretization is inserted into the different stages of the PRK scheme. Theoretical analyses are conducted to evaluate the numerical dispersion and stability of the two novel PRK methods. A finite difference method is used to approximate the spatial derivatives since the two schemes are independent of the spatial discretization technique used. The numerical solutions computed by the two new schemes are compared with those computed by a conventional symplectic PRK. The numerical results, which verify the new method, are superior to those generated by conventional methods in seismic wave modeling.
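
    The simplest member of the symplectic partitioned Runge-Kutta family is the Störmer-Verlet (leapfrog) scheme, which updates displacement and velocity in interleaved half steps and can be applied directly to a semi-discretized wave equation u_tt = c² ∂²u/∂x². The sketch below applies that basic scheme to a 1D finite-difference discretization; it illustrates the PRK time-stepping structure only, not the two modified schemes of the paper, and all parameters are illustrative.

      # Stormer-Verlet (the simplest symplectic PRK) time stepping for a 1D wave equation
      # semi-discretized with second-order finite differences. Parameters are illustrative.
      import numpy as np

      nx, c, dx = 200, 1.0, 1.0 / 200
      dt = 0.5 * dx / c                                    # satisfies the CFL stability bound
      x = np.linspace(0.0, 1.0, nx)
      u = np.exp(-300.0 * (x - 0.5) ** 2)                  # initial displacement pulse
      v = np.zeros(nx)                                     # initial velocity

      def accel(u):
          a = np.zeros_like(u)
          a[1:-1] = c**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2   # fixed ends (u = 0)
          return a

      for _ in range(400):
          v += 0.5 * dt * accel(u)      # half kick
          u += dt * v                   # full drift
          v += 0.5 * dt * accel(u)      # half kick

      print(float(np.max(np.abs(u))))   # pulse has split and reflected; amplitude stays bounded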

  11. Quasi-Static Viscoelastic Finite Element Model of an Aircraft Tire

    NASA Technical Reports Server (NTRS)

    Johnson, Arthur R.; Tanner, John A.; Mason, Angela J.

    1999-01-01

    An elastic large displacement thick-shell mixed finite element is modified to allow for the calculation of viscoelastic stresses. Internal strain variables are introduced at the element's stress nodes and are employed to construct a viscous material model. First order ordinary differential equations relate the internal strain variables to the corresponding elastic strains at the stress nodes. The viscous stresses are computed from the internal strain variables using viscous moduli which are a fraction of the elastic moduli. The energy dissipated by the action of the viscous stresses is included in the mixed variational functional. The nonlinear quasi-static viscous equilibrium equations are then obtained. Previously developed Taylor expansions of the nonlinear elastic equilibrium equations are modified to include the viscous terms. A predictor-corrector time marching solution algorithm is employed to solve the algebraic-differential equations. The viscous shell element is employed to computationally simulate a stair-step loading and unloading of an aircraft tire in contact with a frictionless surface.
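
    The internal-strain-variable construction can be illustrated with a one-dimensional standard-linear-solid analogue: an internal strain q relaxes toward the elastic strain through a first-order ODE, dq/dt = (ε − q)/τ, and the viscous stress is a viscous modulus (a fraction of the elastic one) times (ε − q). The sketch below integrates this scalar model with explicit time stepping for a stair-step strain history; it is a schematic analogue of the formulation, not the mixed shell element itself, and all parameters are made up.

      # One-dimensional internal-strain-variable viscoelasticity (schematic analogue):
      #   stress = E*eps + E_v*(eps - q),   dq/dt = (eps - q)/tau,   E_v a fraction of E.
      # Parameters and the stair-step strain history are hypothetical.
      import numpy as np

      E, frac, tau = 10.0e6, 0.3, 0.5          # elastic modulus (Pa), viscous fraction, relaxation time (s)
      E_v = frac * E
      dt, t_end = 1.0e-3, 4.0
      times = np.arange(0.0, t_end, dt)
      eps = np.where(times < 2.0, 0.02, 0.0)   # stair-step: load for 2 s, then unload

      q, stress = 0.0, []
      for e in eps:
          q += dt * (e - q) / tau              # explicit update of the internal strain ODE
          stress.append(E * e + E_v * (e - q)) # elastic part plus decaying viscous overstress

      print(stress[0], stress[int(1.9 / dt)])  # overstress at the first step vs. nearly relaxed later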

  12. Study of phonon modes and elastic properties of Sc36Al24Co20Y20 and Gd36Al24Co20Y20 rare-earth bulk metallic glasses

    NASA Astrophysics Data System (ADS)

    Suthar, P. H.; Gajjar, P. N.; Thakore, B. Y.; Jani, A. R.

    2013-04-01

    The phonon modes and elastic properties of two different rare-earth-based bulk metallic glasses, Sc36Al24Co20Y20 and Gd36Al24Co20Y20, are computed using the Hubbard-Beeby approach and our well-established model potential. The local field correlation functions due to Hartree (H), Taylor (T), Ichimaru and Utsumi (IU), Farid et al (F) and Sarkar Sen et al (S) are employed to investigate the influence of screening effects on the vibrational dynamics of the Sc36Al24Co20Y20 and Gd36Al24Co20Y20 bulk metallic glasses. The results for the bulk modulus BT, modulus of rigidity G, Poisson's ratio ξ, Young's modulus Y, Debye temperature ΘD, propagation velocity of elastic waves, and dispersion curves are reported. The computed elastic properties are found to be in good agreement with experimental and other available data.

  13. Metagenome sequencing of the microbial community of two Brazilian anthropogenic Amazon dark earth sites, Brazil.

    PubMed

    Lemos, Leandro Nascimento; de Souza, Rosineide Cardoso; de Souza Cannavan, Fabiana; Patricio, André; Pylro, Victor Satler; Hanada, Rogério Eiji; Mui, Tsai Siu

    2016-12-01

    The Anthropogenic Amazon Dark Earth soil is considered one of the world's most fertile soils. These soils differ from conventional Amazon soils because of their higher organic matter content. Here we describe the metagenome sequencing of the microbial communities of two sites of Anthropogenic Amazon Dark Earth soils from the Amazon Rainforest, Brazil. The raw sequence data are stored under Short Read Accession number PRJNA344917.

  14. Automatic real time evaluation of red blood cell elasticity by optical tweezers

    NASA Astrophysics Data System (ADS)

    Moura, Diógenes S.; Silva, Diego C. N.; Williams, Ajoke J.; Bezerra, Marcos A. C.; Fontes, Adriana; de Araujo, Renato E.

    2015-05-01

    Optical tweezers have been used to trap, manipulate, and measure individual cell properties. In this work, we show that the association of a computer-controlled optical tweezers system with image processing techniques allows rapid and reproducible evaluation of cell deformability. In particular, the deformability of red blood cells (RBCs) plays a key role in the transport of oxygen through the blood microcirculation. The automatic measurement process consisted of three steps: acquisition, segmentation of images, and measurement of the elasticity of the cells. An optical tweezers system was set up on an upright microscope equipped with a CCD camera and a motorized XYZ stage, computer controlled by a Labview platform. On the optical tweezers setup, the deformation of the captured RBC was obtained by moving the motorized stage. The automatic real-time homemade system was evaluated by measuring RBC elasticity from normal donors and patients with sickle cell anemia. Approximately 150 erythrocytes were examined, and the elasticity values obtained by using the developed system were compared to the values measured by two experts. With the automatic system, there was a significant time reduction (60×) in the erythrocyte elasticity evaluation. The automated system can help to expand the applications of optical tweezers in hematology and hemotherapy.

  15. The role of dedicated data computing centers in the age of cloud computing

    NASA Astrophysics Data System (ADS)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  16. The Application of Simulation Method in Isothermal Elastic Natural Gas Pipeline

    NASA Astrophysics Data System (ADS)

    Xing, Chunlei; Guan, Shiming; Zhao, Yue; Cao, Jinggang; Chu, Yanji

    2018-02-01

    The elastic pipeline mathematical model is of crucial importance in natural gas pipeline simulation because of its compliance with practical industrial cases. The numerical model of an elastic pipeline introduces nonlinear complexity into the discretized equations; hence, the Newton-Raphson method cannot achieve fast convergence on this kind of problem. Therefore, a new Newton-based method with the Powell-Wolfe condition for simulating isothermal elastic pipeline flow is presented. The results obtained by the new method are given based on the defined boundary conditions. It is shown that the method converges in all cases and significantly reduces the computational cost.
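
    A generic way to stabilize Newton iterations on such nonlinear discretized equations is to damp each step with a line search that enforces a sufficient-decrease condition on the residual norm. The sketch below shows a damped Newton solver with an Armijo-style backtracking check on a small toy system; it illustrates the general idea only and is not the specific Powell-Wolfe formulation used in the paper, and the 2x2 system is hypothetical.

      # Damped Newton iteration with a backtracking sufficient-decrease (Armijo-style)
      # check on the residual norm; the 2x2 toy system below is hypothetical.
      import numpy as np

      def F(x):                                   # toy nonlinear residual
          return np.array([x[0]**2 + x[1]**2 - 4.0, np.exp(x[0]) + x[1] - 1.0])

      def J(x):                                   # its Jacobian
          return np.array([[2.0 * x[0], 2.0 * x[1]], [np.exp(x[0]), 1.0]])

      def damped_newton(x, tol=1e-10, max_iter=50, c1=1e-4):
          for _ in range(max_iter):
              r = F(x)
              if np.linalg.norm(r) < tol:
                  break
              dx = np.linalg.solve(J(x), -r)      # full Newton step
              alpha = 1.0
              while np.linalg.norm(F(x + alpha * dx)) > (1.0 - c1 * alpha) * np.linalg.norm(r):
                  alpha *= 0.5                    # backtrack until sufficient decrease holds
                  if alpha < 1e-8:
                      break
              x = x + alpha * dx
          return x

      print(damped_newton(np.array([1.0, 1.0])))  # converges to a root of F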

  17. Gesellschaft fuer angewandte Mathematik und Mechanik, Annual Scientific Meeting, Universitaet Regensburg, Regensburg, West Germany, April 16-19, 1984, Proceedings

    NASA Astrophysics Data System (ADS)

    Problems in applied mathematics and mechanics are addressed in reviews and reports. Areas covered are vibration and stability, elastic and plastic mechanics, fluid mechanics, the numerical treatment of differential equations (general theory and finite-element methods in particular), optimization, decision theory, stochastics, actuarial mathematics, applied analysis and mathematical physics, and numerical analysis. Included are major lectures on separated flows, the transition regime of rarefied-gas dynamics, recent results in nonlinear elasticity, fluid-elastic vibration, the new computer arithmetic, and unsteady wave propagation in layered elastic bodies.

  18. Cloud-Based Numerical Weather Prediction for Near Real-Time Forecasting and Disaster Response

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew; Case, Jonathan; Venner, Jason; Schroeder, Richard; Checchi, Milton; Zavodsky, Bradley; O'Brien, Raymond

    2015-01-01

    Cloud computing capabilities have rapidly expanded within the private sector, offering new opportunities for meteorological applications. Collaborations between NASA Marshall, NASA Ames, and contractor partners led to evaluations of private (NASA) and public (Amazon) resources for executing short-term NWP systems. Activities helped the Marshall team further understand cloud capabilities, and benchmark use of cloud resources for NWP and other applications

  19. Data Intensive Computing on Amazon Web Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magana-Zook, S. A.

    The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as “big data” tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and many more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster (“Cluster A”) was set up as a collaboration between GMP and Livermore Computing (LC).

  20. Monitoring performance of a highly distributed and complex computing infrastructure in LHCb

    NASA Astrophysics Data System (ADS)

    Mathe, Z.; Haen, C.; Stagni, F.

    2017-10-01

    In order to ensure an optimal performance of the LHCb Distributed Computing, based on LHCbDIRAC, it is necessary to be able to inspect the behavior over time of many components: firstly the agents and services on which the infrastructure is built, but also all the computing tasks and data transfers that are managed by this infrastructure. This consists of recording and then analyzing time series of a large number of observables, for which the usage of SQL relational databases is far from optimal. Therefore, within DIRAC we have been studying novel possibilities based on NoSQL databases (ElasticSearch, OpenTSDB and InfluxDB); as a result of this study, we developed a new monitoring system based on ElasticSearch. It has been deployed on the LHCb Distributed Computing infrastructure, for which it collects data from all the components (agents, services, jobs) and allows creating reports through Kibana and a web user interface, which is based on the DIRAC web framework. In this paper we describe this new implementation of the DIRAC monitoring system. We give details on the ElasticSearch implementation within the DIRAC general framework, as well as an overview of the advantages of the pipeline aggregation used for creating a dynamic bucketing of the time series. We present the advantages of using the ElasticSearch DSL high-level library for creating and running queries. Finally, we present the performance of this system.

  1. Table-top earthquakes; a demonstration of seismology for teachers and students that can be used to augment lessons in earth science, physics, math, social studies, geography

    USGS Publications Warehouse

    Lahr, J.C.

    1998-01-01

    The apparatus consists of a heavy object that is dragged steadily with an elastic cord. Although pulled with a constant velocity, the heavy object repeatedly slides and then stops. A small vibration sensor, attached to a computer display, graphically monitors this intermittent motion. This intermittent sliding motion mimics the intermittent fault slippage that characterizes earthquake fault zones. In tectonically active regions, the Earth's outer brittle shell, which is about 50 km thick, is slowly deformed elastically along active faults. As the deformation increases, stress also increases, until fault slippage releases the stored elastic energy. This process is called elastic rebound. Detailed instructions are given for assembly and construction of this demonstration. Included are suggested sources for the vibration sensor (geophone) and the computer interface. Exclusive of the personal computer, the total cost is between $125 and $150. I gave a talk at the Geological Society of America's Cordilleran Section Centennial meeting on June 2, 1999. The slides show how this table-top demonstration can be used to help meet many of the K-12 teaching goals described in Benchmarks for Science Literacy (American Association for the Advancement of Science, 1993).

  2. Multi-model analysis of the Atlantic influence on Southern Amazon rainfall

    DOE PAGES

    Yoon, Jin -Ho

    2015-12-07

    Amazon rainfall is subject to year-to-year fluctuations, resulting in droughts and floods of various intensities. A major climatic driver of the interannual variation of Amazon rainfall is the El Niño/Southern Oscillation. Sea surface temperature over the Atlantic Ocean has also been identified as an important climatic driver of the Amazon water cycle. Previously, observational datasets were used to support the Atlantic influence on Amazon rainfall. Furthermore, multiple global climate models are found to reproduce the Atlantic-Amazon link robustly. However, there are differences in the rainfall response, which depend primarily on the climatological rainfall amount.

  3. Brazil-U.S. Relations

    DTIC Science & Technology

    2009-06-03

    the Amazon falls within Brazilian borders, making Brazil home to 40% of the world's remaining tropical forests. The Brazilian Amazon was largely... Amazon Conservation... Domestic Efforts... independence in 1822, Brazil occupies almost half of the continent of South America and boasts immense biodiversity, including the vast Amazon

  4. Stress-stress fluctuation formula for elastic constants in the NPT ensemble

    NASA Astrophysics Data System (ADS)

    Lips, Dominik; Maass, Philipp

    2018-05-01

    Several fluctuation formulas are available for calculating elastic constants from equilibrium correlation functions in computer simulations, but the ones available for simulations at constant pressure exhibit slow convergence properties and cannot be used for the determination of local elastic constants. To overcome these drawbacks, we derive a stress-stress fluctuation formula in the NPT ensemble based on known expressions in the NVT ensemble. We validate the formula in the NPT ensemble by calculating elastic constants for the simple nearest-neighbor Lennard-Jones crystal and by comparing the results with those obtained in the NVT ensemble. For both local and bulk elastic constants we find an excellent agreement between the simulated data in the two ensembles. To demonstrate the usefulness of the formula, we apply it to determine the elastic constants of a simulated lipid bilayer.
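
    Schematically, the constant-volume stress-fluctuation expressions this work starts from combine a Born (affine) term, a stress-fluctuation term, and an ideal-gas kinetic term; a LaTeX rendering of that generic structure is given below. The precise form of each term, and the corresponding NPT expression derived in the paper, should be taken from the paper itself; this is only the schematic layout, not the new formula.

      % Generic structure of a constant-volume (NVT) stress-fluctuation expression for
      % the isothermal elastic constants; term-by-term details follow the cited literature.
      \begin{equation}
        C_{ijkl} \;=\; \underbrace{\langle C^{\mathrm{Born}}_{ijkl} \rangle}_{\text{affine response}}
        \;-\; \underbrace{\frac{V}{k_{\mathrm{B}} T}
              \left( \langle \sigma_{ij}\sigma_{kl} \rangle
                   - \langle \sigma_{ij} \rangle \langle \sigma_{kl} \rangle \right)}_{\text{stress fluctuations}}
        \;+\; \underbrace{C^{\mathrm{kin}}_{ijkl}}_{\text{ideal-gas kinetic term}}
      \end{equation}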

  5. Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing

    NASA Astrophysics Data System (ADS)

    Shi, X.

    2017-10-01

    Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may behave differently on different computing infrastructures and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful for certain kinds of spatiotemporal computation. The same situation arises in utilizing a cluster of Intel's many-integrated-core (MIC) or Xeon Phi processors, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy-efficiency requirement in general computation, Field Programmable Gate Arrays (FPGAs) may be a better solution when their computational performance is similar to or better than that of GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  6. Business as Usual: Amazon.com and the Academic Library

    ERIC Educational Resources Information Center

    Van Ullen, Mary K.; Germain, Carol Anne

    2002-01-01

    In 1999, Steve Coffman proposed that libraries form a single interlibrary loan based entity patterned after Amazon.com. This study examined the suitability of Amazon.com's Web interface and record enhancements for academic libraries. Amazon.com could not deliver circulating monographs in the University at Albany Libraries' collection quickly…

  7. Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing

    NASA Astrophysics Data System (ADS)

    Chen, A.; Pham, L.; Kempler, S.; Theobald, M.; Esfandiari, A.; Campino, J.; Vollmer, B.; Lynnes, C.

    2011-12-01

    Cloud Computing technology has been used to offer high-performance and low-cost computing and storage resources for both scientific problems and business services. Several cloud computing services have been implemented in the commercial arena, e.g. Amazon's EC2 & S3, Microsoft's Azure, and Google App Engine. There are also some research and application programs being launched in academia and governments to utilize Cloud Computing. NASA launched the Nebula Cloud Computing platform in 2008, which is an Infrastructure as a Service (IaaS) to deliver on-demand distributed virtual computers. Nebula users can receive required computing resources as a fully outsourced service. NASA Goddard Earth Science Data and Information Service Center (GES DISC) migrated several GES DISC applications to the Nebula as a proof of concept, including: a) The Simple, Scalable, Script-based Science Processor for Measurements (S4PM) for processing scientific data; b) the Atmospheric Infrared Sounder (AIRS) data process workflow for processing AIRS raw data; and c) the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (GIOVANNI) for online access to, analysis, and visualization of Earth science data. This work aims to evaluate the practicability and adaptability of the Nebula. The initial work focused on the AIRS data process workflow to evaluate the Nebula. The AIRS data process workflow consists of a series of algorithms being used to process raw AIRS level 0 data and output AIRS level 2 geophysical retrievals. Migrating the entire workflow to the Nebula platform is challenging, but practicable. After installing several supporting libraries and the processing code itself, the workflow is able to process AIRS data in a similar fashion to its current (non-cloud) configuration. We compared the performance of processing 2 days of AIRS level 0 data through level 2 using a Nebula virtual computer and a local Linux computer. The result shows that Nebula has significantly better performance than the local machine. Much of the difference was due to newer equipment in the Nebula than the legacy computer, which is suggestive of a potential economic advantage beyond elastic power, i.e., access to up-to-date hardware vs. legacy hardware that must be maintained past its prime to amortize the cost. In addition to a trade study of advantages and challenges of porting complex processing to the cloud, a tutorial was developed to enable further progress in utilizing the Nebula for Earth Science applications and understanding better the potential for Cloud Computing in further data- and computing-intensive Earth Science research. In particular, highly bursty computing such as that experienced in the user-demand-driven Giovanni system may become more tractable in a Cloud environment. Our future work will continue to focus on migrating more GES DISC applications/instances, e.g. Giovanni instances, to the Nebula platform and putting mature migrated applications into operation on the Nebula.

  8. A forward-adjoint operator pair based on the elastic wave equation for use in transcranial photoacoustic computed tomography

    PubMed Central

    Mitsuhashi, Kenji; Poudel, Joemini; Matthews, Thomas P.; Garcia-Uribe, Alejandro; Wang, Lihong V.; Anastasio, Mark A.

    2017-01-01

    Photoacoustic computed tomography (PACT) is an emerging imaging modality that exploits optical contrast and ultrasonic detection principles to form images of the photoacoustically induced initial pressure distribution within tissue. The PACT reconstruction problem corresponds to an inverse source problem in which the initial pressure distribution is recovered from measurements of the radiated wavefield. A major challenge in transcranial PACT brain imaging is compensation for aberrations in the measured data due to the presence of the skull. Ultrasonic waves undergo absorption, scattering and longitudinal-to-shear wave mode conversion as they propagate through the skull. To properly account for these effects, a wave-equation-based inversion method should be employed that can model the heterogeneous elastic properties of the skull. In this work, a forward model based on a finite-difference time-domain discretization of the three-dimensional elastic wave equation is established and a procedure for computing the corresponding adjoint of the forward operator is presented. Massively parallel implementations of these operators employing multiple graphics processing units (GPUs) are also developed. The developed numerical framework is validated and investigated in computer-simulation and experimental phantom studies whose designs are motivated by transcranial PACT applications. PMID:29387291

  9. Running Neuroimaging Applications on Amazon Web Services: How, When, and at What Cost?

    PubMed Central

    Madhyastha, Tara M.; Koh, Natalie; Day, Trevor K. M.; Hernández-Fernández, Moises; Kelley, Austin; Peterson, Daniel J.; Rajan, Sabreena; Woelfer, Karl A.; Wolf, Jonathan; Grabowski, Thomas J.

    2017-01-01

    The contribution of this paper is to identify and describe current best practices for using Amazon Web Services (AWS) to execute neuroimaging workflows “in the cloud.” Neuroimaging offers a vast set of techniques by which to interrogate the structure and function of the living brain. However, many of the scientists for whom neuroimaging is an extremely important tool have limited training in parallel computation. At the same time, the field is experiencing a surge in computational demands, driven by a combination of data-sharing efforts, improvements in scanner technology that allow acquisition of images with higher image resolution, and by the desire to use statistical techniques that stress processing requirements. Most neuroimaging workflows can be executed as independent parallel jobs and are therefore excellent candidates for running on AWS, but the overhead of learning to do so and determining whether it is worth the cost can be prohibitive. In this paper we describe how to identify neuroimaging workloads that are appropriate for running on AWS, how to benchmark execution time, and how to estimate cost of running on AWS. By benchmarking common neuroimaging applications, we show that cloud computing can be a viable alternative to on-premises hardware. We present guidelines that neuroimaging labs can use to provide a cluster-on-demand type of service that should be familiar to users, and scripts to estimate cost and create such a cluster. PMID:29163119
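
    As a back-of-the-envelope illustration of the cost estimation described above, the short Python sketch below folds a benchmarked per-job runtime, an instance count, and on-demand pricing into a total cost figure. It is a minimal sketch under assumed placeholder prices and runtimes, not a tool or numbers from the paper.

    # Minimal sketch of the cost arithmetic described above.
    # All prices and runtimes below are placeholder assumptions, not measured values.
    import math

    def estimate_cost(n_jobs, job_hours, jobs_per_instance, hourly_rate_usd,
                      storage_gb=0.0, storage_rate_gb_month=0.023, months=1.0):
        """Estimate the on-demand cost of running n_jobs independent jobs.

        n_jobs            -- number of independent jobs (e.g., subjects to process)
        job_hours         -- benchmarked wall-clock hours per job
        jobs_per_instance -- how many jobs one instance runs concurrently
        hourly_rate_usd   -- on-demand price of the chosen instance type
        storage_gb        -- data kept on cloud storage during the run
        """
        instance_hours = math.ceil(n_jobs / jobs_per_instance) * job_hours
        compute_cost = instance_hours * hourly_rate_usd
        storage_cost = storage_gb * storage_rate_gb_month * months
        return compute_cost + storage_cost

    if __name__ == "__main__":
        # Hypothetical example: 200 subjects, 3 h per subject, 8 subjects per instance.
        cost = estimate_cost(n_jobs=200, job_hours=3.0, jobs_per_instance=8,
                             hourly_rate_usd=0.68, storage_gb=500)
        print(f"Estimated cost: ${cost:.2f}")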

  10. Unidata cyberinfrastructure in the cloud: A progress report

    NASA Astrophysics Data System (ADS)

    Ramamurthy, Mohan

    2016-04-01

    Data services, software, and committed support are critical components of geosciences cyber-infrastructure that can help scientists address problems of unprecedented complexity, scale, and scope. Unidata is currently working on innovative ideas, new paradigms, and novel techniques to complement and extend its offerings. Our goal is to empower users so that they can tackle major, heretofore difficult problems. Unidata recognizes that its products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is moving to augment its products, services, data delivery mechanisms and applications to align with the cloud-computing paradigm. To realize the above vision, Unidata is working toward: * Providing access to many types of data from a cloud (e.g., TDS, RAMADDA and EDEX); * Deploying data-proximate tools to easily process, analyze and visualize those data in a cloud environment for consumption by anyone, on any device, from anywhere, at any time; * Developing and providing a range of pre-configured and well-integrated tools and services that can be deployed by any university in their own private or public cloud settings. Specifically, Unidata has adopted Docker for "containerized applications", making them easy to deploy. Docker helps to create "disposable" installs and eliminates many configuration challenges. Containerized applications include tools for data transport, access, analysis, and visualization: THREDDS Data Server, Integrated Data Viewer, GEMPAK, Local Data Manager, RAMADDA Data Server, and Python tools; * Fostering partnerships with NOAA and public cloud vendors (e.g., Amazon) to harness their capabilities and resources for the benefit of the academic community.
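
    As a rough illustration of the containerized-deployment idea (and not an official Unidata recipe), the Docker SDK for Python can launch a data-server container in a few lines; the image tag, port mapping, and container name below are illustrative assumptions.

    # Illustrative sketch only: launching a containerized data server with the
    # Docker SDK for Python. The image tag, port mapping, and container name are
    # assumptions for illustration, not an official deployment recipe.
    import docker

    client = docker.from_env()

    container = client.containers.run(
        "unidata/thredds-docker:latest",   # hypothetical/illustrative image tag
        detach=True,                       # run in the background ("disposable" install)
        ports={"8080/tcp": 8080},          # expose the server's web interface
        name="thredds-demo",
    )

    print(container.status)  # e.g. 'created', then 'running' once started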

  11. Understanding the Performance and Potential of Cloud Computing for Scientific Applications

    DOE PAGES

    Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin; ...

    2015-02-19

    Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources; however, not all scientists have access to sufficient high-end computing systems, many of which can be found in the Top500 list. Cloud Computing has gained the attention of scientists as a competitive resource to run HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with a reasonable performance per money spent. This work studies the performance of public clouds and places this performance in context to price. We evaluate the raw performance of different services of the AWS cloud in terms of the basic resources, such as compute, memory, network and I/O. We also evaluate the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well, as well as to evaluate the cost of the cloud for running scientific applications. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud. We evaluated EC2, S3, EBS and DynamoDB among the many Amazon AWS services. We evaluated the memory sub-system performance with CacheBench, the network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper will be a recipe cookbook for scientists to help them decide where to deploy and run their scientific applications between public clouds, private clouds, or hybrid clouds.
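
    The cost-performance comparison described above can be condensed into a single figure of merit, for example benchmark units delivered per dollar of instance time. The Python sketch below illustrates that arithmetic with placeholder instance names, scores, and prices; it does not reproduce the paper's measurements.

    # Sketch: combine a raw benchmark score with an hourly price into a
    # performance-per-dollar metric. All figures are illustrative placeholders,
    # not results from the paper.

    def perf_per_dollar(benchmark_score, hourly_price_usd):
        """Benchmark units delivered per dollar of instance time per hour."""
        return benchmark_score / hourly_price_usd

    instances = {
        # name: (HPL GFLOPS, on-demand $/hour) -- hypothetical values
        "cc2.8xlarge": (240.0, 2.00),
        "m1.large":    (  9.0, 0.24),
    }

    for name, (gflops, price) in instances.items():
        print(f"{name:12s} {perf_per_dollar(gflops, price):8.1f} GFLOPS per $/h")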

  12. Understanding the Performance and Potential of Cloud Computing for Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin

    Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources; however, not all scientists have access to sufficient high-end computing systems, many of which can be found in the Top500 list. Cloud Computing has gained the attention of scientists as a competitive resource to run HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with a reasonable performance per money spent. This work studies the performance of public clouds and places this performance in context to price. We evaluate the raw performance of different services of the AWS cloud in terms of the basic resources, such as compute, memory, network and I/O. We also evaluate the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well, as well as to evaluate the cost of the cloud for running scientific applications. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud. We evaluated EC2, S3, EBS and DynamoDB among the many Amazon AWS services. We evaluated the memory sub-system performance with CacheBench, the network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper will be a recipe cookbook for scientists to help them decide where to deploy and run their scientific applications between public clouds, private clouds, or hybrid clouds.

  13. Performance, Agility and Cost of Cloud Computing Services for NASA GES DISC Giovanni Application

    NASA Astrophysics Data System (ADS)

    Pham, L.; Chen, A.; Wharton, S.; Winter, E. L.; Lynnes, C.

    2013-12-01

    The NASA Goddard Earth Science Data and Information Services Center (GES DISC) is investigating the performance, agility and cost of Cloud computing for GES DISC applications. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), one of the core applications at the GES DISC for online climate-related Earth science data access, subsetting, analysis, visualization, and downloading, was used to evaluate the feasibility and effort of porting an application to the Amazon Cloud Services platform. The performance and the cost of running Giovanni on the Amazon Cloud were compared to similar parameters for the GES DISC local operational system. A Giovanni Time-Series analysis of aerosol absorption optical depth (388 nm) from OMI (Ozone Monitoring Instrument)/Aura was selected for these comparisons. All required data were pre-cached in both the Cloud and local system to avoid data transfer delays. Analyses over 3-, 6-, 12-, and 24-month periods were run on both the Cloud and the local system, and the processing times for the analyses were used to evaluate system performance. To investigate application agility, Giovanni was installed and tested on multiple Cloud platforms. The cost of using a Cloud computing platform mainly consists of computing, storage, data requests, and data transfer in/out. The Cloud computing cost is calculated based on an hourly rate, and the storage cost is calculated based on a rate per Gigabyte per month. Incoming data transfer is free, while outgoing data transfer is charged per Gigabyte. The costs for a local server system consist of buying hardware/software, system maintenance/updating, and operating costs. The results showed that the Cloud platform had 38% better performance and cost 36% less than the local system. This investigation shows the potential of cloud computing to increase system performance and lower the overall cost of system management.
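
    The cost model outlined above (hourly compute, per-Gigabyte-month storage, free ingress, per-Gigabyte egress, versus amortized local hardware plus operations) reduces to simple arithmetic. The Python sketch below illustrates it with placeholder rates only; none of the values are taken from the study.

    # Rough sketch of the cost model described in the abstract. All rates and
    # quantities below are placeholder assumptions, not the study's figures.

    def cloud_monthly_cost(compute_hours, hourly_rate, storage_gb, storage_rate,
                           egress_gb, egress_rate):
        return (compute_hours * hourly_rate
                + storage_gb * storage_rate
                + egress_gb * egress_rate)     # incoming data transfer assumed free

    def local_monthly_cost(hardware_usd, amortization_months, monthly_ops_usd):
        return hardware_usd / amortization_months + monthly_ops_usd

    cloud = cloud_monthly_cost(compute_hours=720, hourly_rate=0.25,
                               storage_gb=2000, storage_rate=0.10,
                               egress_gb=300, egress_rate=0.09)
    local = local_monthly_cost(hardware_usd=15000, amortization_months=36,
                               monthly_ops_usd=250)
    print(f"cloud ${cloud:.0f}/month vs local ${local:.0f}/month")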

  14. What We Can Learn from Amazon for Clinical Decision Support Systems.

    PubMed

    Abid, Sidra; Keshavjee, Karim; Karim, Arsalan; Guergachi, Aziz

    2017-01-01

    Health care continues to lag behind other industries, such as retail and financial services, in the use of decision-support-like tools. Amazon is particularly prolific in its use of advanced predictive and prescriptive analytics to help its customers purchase more, while increasing satisfaction, retention, repeat purchases and loyalty. How can we do the same in health care? In this paper, we explore various elements of the Amazon website and Amazon's data science and big data practices to gather inspiration for re-designing clinical decision support in the health care sector. For each Amazon element we identified, we present one or more clinical applications to help us better understand where Amazon's.

  15. An Open-Source Toolbox for Surrogate Modeling of Joint Contact Mechanics

    PubMed Central

    Eskinazi, Ilan

    2016-01-01

    Goal: Incorporation of elastic joint contact models into simulations of human movement could facilitate studying the interactions between muscles, ligaments, and bones. Unfortunately, elastic joint contact models are often too expensive computationally to be used within iterative simulation frameworks. This limitation can be overcome by using fast and accurate surrogate contact models that fit or interpolate input-output data sampled from existing elastic contact models. However, construction of surrogate contact models remains an arduous task. The aim of this paper is to introduce an open-source program called Surrogate Contact Modeling Toolbox (SCMT) that facilitates surrogate contact model creation, evaluation, and use. Methods: SCMT interacts with the third party software FEBio to perform elastic contact analyses of finite element models and uses Matlab to train neural networks that fit the input-output contact data. SCMT features sample point generation for multiple domains, automated sampling, sample point filtering, and surrogate model training and testing. Results: An overview of the software is presented along with two example applications. The first example demonstrates creation of surrogate contact models of artificial tibiofemoral and patellofemoral joints and evaluates their computational speed and accuracy, while the second demonstrates the use of surrogate contact models in a forward dynamic simulation of an open-chain leg extension-flexion motion. Conclusion: SCMT facilitates the creation of computationally fast and accurate surrogate contact models. Additionally, it serves as a bridge between FEBio and OpenSim musculoskeletal modeling software. Significance: Researchers may now create and deploy surrogate models of elastic joint contact with minimal effort. PMID:26186761

  16. Fabrication of a biomimetic elastic intervertebral disk scaffold using additive manufacturing.

    PubMed

    Whatley, Benjamin R; Kuo, Jonathan; Shuai, Cijun; Damon, Brooke J; Wen, Xuejun

    2011-03-01

    A custom-designed three-dimensional additive manufacturing device was developed to fabricate scaffolds for intervertebral disk (IVD) regeneration. This technique integrated a computer with a device capable of 3D movement allowing for precise motion and control over the polymer scaffold resolution. IVD scaffold structures were designed using computer-aided design to resemble the natural IVD structure. Degradable polyurethane (PU) was used as an elastic scaffold construct to mimic the elastic nature of the native IVD tissue and was deposited at a controlled rate using ultra-fine micropipettes connected to a syringe pump. The elastic PU was extruded directly onto a collecting substrate placed on a freezing stage. The three-dimensional movement of the computer-controlled device combined with the freezing stage enabled precise control of polymer deposition using extrusion. The addition of the freezing stage increased the polymer solution viscosity and hardened the polymer solution as it was extruded out of the micropipette tip. This technique created scaffolds with excellent control over macro- and micro-structure to influence cell behavior, specifically for cell adhesion, proliferation, and alignment. Concentric lamellae were printed at a high resolution to mimic the native shape and structure of the IVD. Seeded cells aligned along the concentric lamellae and acquired cell morphology similar to native tissue in the outer portion of the IVD. The fabricated scaffolds exhibited elastic behavior during compressive and shear testing, proving that the scaffolds could support loads with proper fatigue resistance without permanent deformation. Additionally, the mechanical properties of the scaffolds were comparable to those of native IVD tissue.

  17. The design, analysis and experimental evaluation of an elastic model wing

    NASA Technical Reports Server (NTRS)

    Cavin, R. K., III; Thisayakorn, C.

    1974-01-01

    An elastic orbiter model was developed to evaluate the effectiveness of aeroelasticity computer programs. The elasticity properties were introduced by constructing beam-like straight wings for the wind tunnel model. A standard influence coefficient mathematical model was used to estimate aeroelastic effects analytically. In general good agreement was obtained between the empirical and analytical estimates of the deformed shape. However, in the static aeroelasticity case, it was found that the physical wing exhibited less bending and more twist than was predicted by theory.

  18. Ab-initio study of electronic structure and elastic properties of ZrC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mund, H. S., E-mail: hmoond@gmail.com; Ahuja, B. L.

    2016-05-23

    The electronic and elastic properties of ZrC have been investigated using the linear combination of atomic orbitals method within the framework of density functional theory. Different exchange-correlation functionals are taken into account within the generalized gradient approximation. We have computed energy bands, density of states, elastic constants, bulk modulus, shear modulus, Young's modulus, Poisson's ratio, lattice parameters and the pressure derivative of the bulk modulus by calculating the ground state energy of rock-salt-structure ZrC.

  19. Vulnerability of Amazon forests to storm-driven tree mortality

    NASA Astrophysics Data System (ADS)

    Negrón-Juárez, Robinson I.; Holm, Jennifer A.; Magnabosco Marra, Daniel; Rifai, Sami W.; Riley, William J.; Chambers, Jeffrey Q.; Koven, Charles D.; Knox, Ryan G.; McGroddy, Megan E.; Di Vittorio, Alan V.; Urquiza-Muñoz, Jose; Tello-Espinoza, Rodil; Alegria Muñoz, Waldemar; Ribeiro, Gabriel H. P. M.; Higuchi, Niro

    2018-05-01

    Tree mortality is a key driver of forest community composition and carbon dynamics. Strong winds associated with severe convective storms are dominant natural drivers of tree mortality in the Amazon. Why forests vary with respect to their vulnerability to wind events and how the predicted increase in storm events might affect forest ecosystems within the Amazon are not well understood. We found that windthrows are common in the Amazon region extending from northwest (Peru, Colombia, Venezuela, and west Brazil) to central Brazil, with the highest occurrence of windthrows in the northwest Amazon. More frequent winds, produced by more frequent severe convective systems, in combination with well-known processes that limit the anchoring of trees in the soil, help to explain the higher vulnerability of the northwest Amazon forests to winds. Projected increases in the frequency and intensity of convective storms in the Amazon have the potential to increase wind-related tree mortality. A forest demographic model calibrated for the northwestern and the central Amazon showed that northwestern forests are more resilient to increased wind-related tree mortality than forests in the central Amazon. Our study emphasizes the importance of including wind-related tree mortality in model simulations for reliable predictions of the future of tropical forests and their effects on the Earth system.

  20. iRODS-Based Climate Data Services and Virtualization-as-a-Service in the NASA Center for Climate Simulation

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D. Q.; Tamkin, G. S.; Strong, S.; Ripley, D.; Gill, R.; Sinno, S. S.; Shen, Y.; Carriere, L. E.; Brieger, L.; Moore, R.; Rajasekar, A.; Schroeder, W.; Wan, M.

    2011-12-01

    Scientific data services are becoming an important part of the NASA Center for Climate Simulation's mission. Our technological response to this expanding role is built around the concept of specialized virtual climate data servers, repetitive cloud provisioning, image-based deployment and distribution, and virtualization-as-a-service. A virtual climate data server is an OAIS-compliant, iRODS-based data server designed to support a particular type of scientific data collection. iRODS is data grid middleware that provides policy-based control over collection-building, managing, querying, accessing, and preserving large scientific data sets. We have developed prototype vCDSs to manage NetCDF, HDF, and GeoTIFF data products. We use RPM scripts to build vCDS images in our local computing environment, our local Virtual Machine Environment, NASA's Nebula Cloud Services, and Amazon's Elastic Compute Cloud. Once provisioned into these virtualized resources, multiple vCDSs can use iRODS's federation and realized object capabilities to create an integrated ecosystem of data servers that can scale and adapt to changing requirements. This approach enables platform- or software-as-a-service deployment of the vCDSs and allows the NCCS to offer virtualization-as-a-service, a capacity to respond in an agile way to new customer requests for data services, and a path for migrating existing services into the cloud. We have registered MODIS Atmosphere data products in a vCDS that contains 54 million registered files, 630 TB of data, and over 300 million metadata values. We are now assembling IPCC AR5 data into a production vCDS that will provide the platform upon which NCCS's Earth System Grid (ESG) node publishes to the extended science community. In this talk, we describe our approach, experiences, lessons learned, and plans for the future.

  1. Pubcast and Genecast: Browsing and Exploring Publications and Associated Curated Content in Biology Through Mobile Devices.

    PubMed

    Goldweber, Scott; Theodore, Jamal; Torcivia-Rodriguez, John; Simonyan, Vahan; Mazumder, Raja

    2017-01-01

    Services such as Facebook, Amazon, and eBay were once solely accessed from stationary computers. These web services are now being used increasingly on mobile devices. We acknowledge this new reality by providing users a way to access publications and a curated cancer mutation database on their mobile device with daily automated updates. http://hive.biochemistry.gwu.edu/tools/HivePubcast.

  2. Collecting Response Times using Amazon Mechanical Turk and Adobe Flash

    PubMed Central

    Simcox, Travis; Fiez, Julie A.

    2017-01-01

    Crowdsourcing systems like Amazon's Mechanical Turk (AMT) allow data to be collected from a large sample of people in a short amount of time. This use has garnered considerable interest from behavioral scientists. So far, most experiments conducted on AMT have focused on survey-type instruments because of difficulties inherent in running many experimental paradigms over the Internet. This article investigated the viability of presenting stimuli and collecting response times using Adobe Flash to run ActionScript 3 code in conjunction with AMT. First, the timing properties of Adobe Flash were investigated using a phototransistor and two desktop computers running under several conditions mimicking those that may be present in research using AMT. This experiment revealed some strengths and weaknesses of the timing capabilities of this method. Next, a flanker task and a lexical decision task implemented in Adobe Flash were administered to participants recruited with AMT. The expected effects in these tasks were replicated. Power analyses were conducted to describe the number of participants needed to replicate these effects. A questionnaire was used to investigate previously undescribed computer use habits of 100 participants on AMT. We conclude that a Flash program in conjunction with AMT can be successfully used for running many experimental paradigms that rely on response times, although experimenters must understand the limitations of the method. PMID:23670340

  3. Recent advances in engineering science; Proceedings of the A. Cemal Eringen Symposium, University of California, Berkeley, June 20-22, 1988

    NASA Technical Reports Server (NTRS)

    Koh, Severino L. (Editor); Speziale, Charles G. (Editor)

    1989-01-01

    Various papers on recent advances in engineering science are presented. Some individual topics addressed include: advances in adaptive methods in computational fluid mechanics, mixtures of two micromorphic materials, computer tests of rubber elasticity, shear bands in isotropic micropolar elastic materials, nonlinear surface wave and resonator effects in magnetostrictive crystals, simulation of electrically enhanced fibrous filtration, plasticity theory of granular materials, dynamics of viscoelastic media with internal oscillators, postcritical behavior of a cantilever bar, boundary value problems in nonlocal elasticity, stability of flexible structures with random parameters, electromagnetic tornadoes in earth's ionosphere and magnetosphere, helicity fluctuations and the energy cascade in turbulence, mechanics of interfacial zones in bonded materials, propagation of a normal shock in a varying area duct, analytical mechanics of fracture and fatigue.

  4. Differences in xylem and leaf hydraulic traits explain differences in drought tolerance among mature Amazon rainforest trees.

    PubMed

    Powell, Thomas L; Wheeler, James K; de Oliveira, Alex A R; da Costa, Antonio Carlos Lola; Saleska, Scott R; Meir, Patrick; Moorcroft, Paul R

    2017-10-01

    Considerable uncertainty surrounds the impacts of anthropogenic climate change on the composition and structure of Amazon forests. Building upon results from two large-scale ecosystem drought experiments in the eastern Brazilian Amazon that observed increases in mortality rates among some tree species but not others, in this study we investigate the physiological traits underpinning these differential demographic responses. Xylem pressure at 50% conductivity (xylem-P50), leaf turgor loss point (TLP), cellular osmotic potential (πo), and cellular bulk modulus of elasticity (ε), all traits mechanistically linked to drought tolerance, were measured on upper canopy branches and leaves of mature trees from selected species growing at the two drought experiment sites. Each species was placed a priori into one of four plant functional type (PFT) categories: drought-tolerant versus drought-intolerant based on observed mortality rates, and subdivided into early- versus late-successional based on wood density. We tested the hypotheses that the measured traits would be significantly different between the four PFTs and that they would be spatially conserved across the two experimental sites. Xylem-P50, TLP, and πo, but not ε, occurred at significantly higher water potentials for the drought-intolerant PFT compared to the drought-tolerant PFT; however, there were no significant differences between the early- and late-successional PFTs. These results suggest that these three traits are important for determining drought tolerance, and are largely independent of wood density, a trait commonly associated with successional status. Differences in these physiological traits that occurred between the drought-tolerant and drought-intolerant PFTs were conserved between the two research sites, even though they had different soil types and dry-season lengths. This more detailed understanding of how xylem and leaf hydraulic traits vary between co-occurring drought-tolerant and drought-intolerant tropical tree species promises to facilitate a much-needed improvement in the representation of plant hydraulics within terrestrial ecosystem and biosphere models, which will enhance our ability to make robust predictions of how future changes in climate will affect tropical forests. © 2017 John Wiley & Sons Ltd.

  5. Singularity-free dislocation dynamics with strain gradient elasticity

    NASA Astrophysics Data System (ADS)

    Po, Giacomo; Lazar, Markus; Seif, Dariush; Ghoniem, Nasr

    2014-08-01

    The singular nature of the elastic fields produced by dislocations presents conceptual challenges and computational difficulties in the implementation of discrete dislocation-based models of plasticity. In the context of classical elasticity, attempts to regularize the elastic fields of discrete dislocations encounter intrinsic difficulties. On the other hand, in gradient elasticity, the issue of singularity can be removed at the outset and smooth elastic fields of dislocations are available. In this work we consider theoretical and numerical aspects of the non-singular theory of discrete dislocation loops in gradient elasticity of Helmholtz type, with interest in its applications to three dimensional dislocation dynamics (DD) simulations. The gradient solution is developed and compared to its singular and non-singular counterparts in classical elasticity using the unified framework of eigenstrain theory. The fundamental equations of curved dislocation theory are given as non-singular line integrals suitable for numerical implementation using fast one-dimensional quadrature. These include expressions for the interaction energy between two dislocation loops and the line integral form of the generalized solid angle associated with dislocations having a spread core. The single characteristic length scale of Helmholtz elasticity is determined from independent molecular statics (MS) calculations. The gradient solution is implemented numerically within our variational formulation of DD, with several examples illustrating the viability of the non-singular solution. The displacement field around a dislocation loop is shown to be smooth, and the loop self-energy non-divergent, as expected from atomic configurations of crystalline materials. The loop nucleation energy barrier and its dependence on the applied shear stress are computed and shown to be in good agreement with atomistic calculations. DD simulations of Lomer-Cottrell junctions in Al show that the strength of the junction and its configuration are easily obtained, without ad-hoc regularization of the singular fields. Numerical convergence studies related to the implementation of the non-singular theory in DD are presented.
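
    The regularization at the core of this approach can be stated compactly. The relation below is a schematic summary of gradient elasticity of Helmholtz type as it is commonly written, with ℓ the characteristic length; it is our paraphrase, not an equation quoted from the paper.

    \[
      \left(1 - \ell^{2}\,\nabla^{2}\right) f(\mathbf{r}) \;=\; f^{0}(\mathbf{r}),
    \]

    so that the gradient-elasticity field f reduces to the classical singular field f^0 far from the dislocation line while remaining finite as the field point approaches the dislocation core.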

  6. Biomechanical implications of cortical elastic properties of the macaque mandible.

    PubMed

    Dechow, Paul C; Panagiotopoulou, Olga; Gharpure, Poorva

    2017-10-01

    Knowledge of the variation in the elastic properties of mandibular cortical bone is essential for modeling bone function. Our aim was to characterize the elastic properties of rhesus macaque mandibular cortical bone and compare these to the elastic properties from mandibles of dentate humans and baboons. Thirty cylindrical samples were harvested from each of six adult female rhesus monkey mandibles. Assuming orthotropy, axes of maximum stiffness in the plane of the cortical plate were derived from ultrasound velocity measurements. Further velocity measurements with longitudinal and transverse ultrasonic transducers along with measurements of bone density were used to compute three-dimensional cortical elastic properties using equations based on Hooke's law. Results showed regional variations in the elastic properties of macaque mandibular cortical bone that have both similarities and differences with that of humans and baboons. So far, the biological and structural basis of these differences is poorly understood. Copyright © 2017 Elsevier GmbH. All rights reserved.

  7. Molecular characterization of an earliest cacao (Theobroma cacao L.) collection from Peruvian Amazon using microsatellite DNA markers

    USDA-ARS?s Scientific Manuscript database

    Cacao (Theobroma cacao L.) is indigenous to the Amazon region of South America. The Peruvian Amazon harbors a large number of diverse cacao populations. Since the 1930s, several populations have been collected from the Peruvian Amazon and maintained as ex situ germplasm repositories in ...

  8. Community Cloud Computing

    NASA Astrophysics Data System (ADS)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Jin -Ho

    Amazon rainfall is subject to year-to-year fluctuations, resulting in droughts and floods of varying intensity. A major climatic driver of the interannual variation of Amazon rainfall is the El Niño/Southern Oscillation. The Sea Surface Temperature over the Atlantic Ocean has also been identified as an important climatic driver of the Amazon water cycle. Previously, observational datasets were used to support the Atlantic influence on Amazon rainfall. Furthermore, it is found that multiple global climate models do reproduce the Atlantic-Amazon link robustly. However, there exist differences in the rainfall response, which primarily depend on the climatological rainfall amount.

  10. Reconstruction of the Amazon Basin effective moisture availability over the past 14,000 years.

    PubMed

    Maslin, M A; Burns, S J

    2000-12-22

    Quantifying the moisture history of the Amazon Basin is essential for understanding the cause of rain forest diversity and its potential as a methane source. We reconstructed the Amazon River outflow history for the past 14,000 years to provide a moisture budget for the river drainage basin. The oxygen isotopic composition of planktonic foraminifera recovered from a marine sediment core in a region of Amazon River discharge shows that the Amazon Basin was extremely dry during the Younger Dryas, with the discharge reduced by at least 40% as compared with that of today. After the Younger Dryas, a meltwater-driven discharge event was followed by a steady increase in the Amazon Basin effective moisture throughout the Holocene.

  11. A numerical approximation to the elastic properties of sphere-reinforced composites

    NASA Astrophysics Data System (ADS)

    Segurado, J.; Llorca, J.

    2002-10-01

    Three-dimensional cubic unit cells containing 30 non-overlapping identical spheres randomly distributed were generated using a new, modified random sequential adsorption algorithm suitable for particle volume fractions of up to 50%. The elastic constants of the ensemble of spheres embedded in a continuous and isotropic elastic matrix were computed through the finite element analysis of the three-dimensional periodic unit cells, whose size was chosen as a compromise between the minimum size required to obtain accurate results in the statistical sense and the maximum one imposed by the computational cost. Three types of materials were studied: rigid spheres and spherical voids in an elastic matrix and a typical composite made up of glass spheres in an epoxy resin. The moduli obtained for different unit cells showed very little scatter, and the average values obtained from the analysis of four unit cells could be considered very close to the "exact" solution to the problem, in agreement with the results of Drugan and Willis (J. Mech. Phys. Solids 44 (1996) 497) referring to the size of the representative volume element for elastic composites. They were used to assess the accuracy of three classical analytical models: the Mori-Tanaka mean-field analysis, the generalized self-consistent method, and Torquato's third-order approximation.
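
    The microstructure generation step rests on random sequential adsorption (RSA). The Python sketch below illustrates plain RSA placement of identical non-overlapping spheres in a periodic unit cube with minimum-image overlap checks; it is not the modified algorithm of the paper (which reaches volume fractions up to 50%), and all parameters are illustrative.

    # Minimal random sequential adsorption (RSA) sketch: place non-overlapping,
    # identical spheres in a periodic unit cube. This is the plain RSA idea, not
    # the paper's modified algorithm; parameters are illustrative placeholders.
    import numpy as np

    def rsa_spheres(n_spheres, volume_fraction, seed=None, max_tries=200000):
        rng = np.random.default_rng(seed)
        # Sphere radius implied by the target particle volume fraction in a unit cube.
        radius = (3.0 * volume_fraction / (4.0 * np.pi * n_spheres)) ** (1.0 / 3.0)
        centers = []
        for _ in range(max_tries):
            trial = rng.random(3)
            ok = True
            for c in centers:
                d = trial - c
                d -= np.rint(d)                     # minimum-image convention (periodic box)
                if np.dot(d, d) < (2.0 * radius) ** 2:
                    ok = False
                    break
            if ok:
                centers.append(trial)
                if len(centers) == n_spheres:
                    return np.array(centers), radius
        raise RuntimeError("RSA failed to place all spheres; lower the volume fraction")

    centers, r = rsa_spheres(n_spheres=30, volume_fraction=0.25, seed=0)
    print(len(centers), "spheres of radius", round(r, 4))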

  12. Sea surface salinity fronts in the Tropical Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    Ruiz-Etcheverry, L.; Maximenko, N. A.; Melnichenko, O.

    2016-12-01

    Marine fronts are narrow boundaries that separate water masses of different properties. These fronts are caused by various forcings and are believed to be an important component of the coupled ocean-atmosphere system, particularly in the tropical oceans. In this study, we use sea surface salinity (SSS) observations from the Aquarius satellite to investigate the spatial structure and temporal variability of SSS fronts in the tropical Atlantic. A number of frontal features have been identified. The mean magnitude of the SSS gradient is maximum near the mouth of the Congo River (0.3-0.4 psu/100 km). Relative maxima are also observed in the Inter Tropical Convergence Zone (ITCZ), the Gulf of Guinea, and the mouth of the Amazon River. The pattern of the magnitude of the SSS anomaly gradient revealed that the interaction between river plumes and saltier interior water is complex and highly variable during the three-year observation period. The variability of the magnitude of the density anomaly gradient computed from Aquarius SSS and Reynolds SST is also discussed. Images of the ocean color are utilized to trace the movement of the Congo and Amazon River plumes and compare them with the magnitude of the SSS gradient. Additionally, we analyze the circulation associated with the Amazon plume using altimetry data, and the vertical structure and its changes in time through Argo profiles.

  13. Use of several Cloud Computing approaches for climate modelling: performance, costs and opportunities

    NASA Astrophysics Data System (ADS)

    Perez Montes, Diego A.; Añel Cabanelas, Juan A.; Wallom, David C. H.; Arribas, Alberto; Uhe, Peter; Caderno, Pablo V.; Pena, Tomas F.

    2017-04-01

    Cloud Computing is a technological option that offers great possibilities for modelling in geosciences. We have studied how two different climate models, HadAM3P-HadRM3P and CESM-WACCM, can be adapted in two different ways to run on Cloud Computing Environments from three different vendors: Amazon, Google and Microsoft. Also, we have evaluated qualitatively how the use of Cloud Computing can affect the allocation of resources by funding bodies and issues related to computing security, including scientific reproducibility. Our first experiments were developed using the well-known ClimatePrediction.net (CPDN), which uses BOINC, over the infrastructure from two cloud providers, namely Microsoft Azure and Amazon Web Services (hereafter AWS). For this comparison we ran a set of thirteen-month climate simulations for CPDN in Azure and AWS using a range of different virtual machines (VMs) for HadRM3P (50 km resolution over the South America CORDEX region) nested in the global atmosphere-only model HadAM3P. These simulations were run on a single processor and took between 3 and 5 days to compute depending on the VM type. The last part of our simulation experiments was running WACCM over different VMs on the Google Compute Engine (GCE) and comparing it with the supercomputer (SC) Finisterrae1 from the Centro de Supercomputacion de Galicia. It was shown that GCE gives better performance than the SC for smaller numbers of cores/MPI tasks, but the model throughput shows clearly how the SC performance is better beyond approximately 100 cores (related to network speed and latency differences). From a cost point of view, Cloud Computing moves researchers from a traditional approach, where experiments were limited by the available hardware resources, to one limited by monetary resources (how many resources can be afforded). As there is an increasing movement and recommendation for budgeting HPC projects on this technology (budgets can be calculated in a more realistic way), we could see a shift in the trends over the next years to consolidate the Cloud as the preferred solution.

  14. Ice Engineering - study of Related Properties of Floating Sea-Ice Sheets and Summary of Elastic and Viscoelastic Analyses

    DTIC Science & Technology

    1977-12-01

    Ice Plate Example. To demonstrate the capability of the viscoelastic finite-element computer code (5), the structural response of an infinite sea-ice plate on a fluid foundation is investigated for a simulated aircraft loading condition and, using relaxation functions, is determined

  15. Numerical study of interfacial solitary waves propagating under an elastic sheet

    PubMed Central

    Wang, Zhan; Părău, Emilian I.; Milewski, Paul A.; Vanden-Broeck, Jean-Marc

    2014-01-01

    Steady solitary and generalized solitary waves of a two-fluid problem where the upper layer is under a flexible elastic sheet are considered as a model for internal waves under an ice-covered ocean. The fluid consists of two layers of constant densities, separated by an interface. The elastic sheet resists bending forces and is mathematically described by a fully nonlinear thin shell model. Fully localized solitary waves are computed via a boundary integral method. Progression along the various branches of solutions shows that barotropic (i.e. surface mode) wave-packet solitary wave branches end with the free surface approaching the interface. On the other hand, the limiting configurations of long baroclinic (i.e. internal) solitary waves are characterized by an infinite broadening in the horizontal direction. Baroclinic wave-packet modes also exist for a large range of amplitudes, and generalized solitary waves are computed in a case of a long internal mode in resonance with surface modes. In contrast to the pure gravity case (i.e. without an elastic cover), these generalized solitary waves exhibit new Wilton-ripple-like periodic trains in the far field. PMID:25104909

  16. Approximate non-linear multiparameter inversion for multicomponent single and double P-wave scattering in isotropic elastic media

    NASA Astrophysics Data System (ADS)

    Ouyang, Wei; Mao, Weijian

    2018-03-01

    An asymptotic quadratic true-amplitude inversion method for isotropic elastic P waves is proposed to invert medium parameters. The multicomponent P-wave scattered wavefield is computed based on a forward relationship using the second-order Born approximation and corresponding high-frequency ray theoretical methods. Within the local double scattering mechanism, the P-wave transmission factors are elaborately calculated, which results in the radiation pattern for P-wave scattering being a quadratic combination of the density and Lamé's moduli perturbation parameters. We further express the elastic P-wave scattered wavefield in a form of generalized Radon transform (GRT). After introducing classical backprojection operators, we obtain an approximate solution of the inverse problem by solving a quadratic non-linear system. Numerical tests with synthetic data computed by a finite-difference scheme demonstrate that our quadratic inversion can accurately invert perturbation parameters for strong perturbations, compared with the P-wave single-scattering linear inversion method. Although our inversion strategy here is only syncretized with P-wave scattering, it can be extended to invert multicomponent elastic data containing both P-wave and S-wave information.

  17. Elastic properties of dense solid phases of hard cyclic pentamers and heptamers in two dimensions.

    PubMed

    Wojciechowski, K W; Tretiakov, K V; Kowalik, M

    2003-03-01

    Systems of model planar, nonconvex, hard-body "molecules" of fivefold and sevenfold symmetry axes are studied by constant pressure Monte Carlo simulations with variable shape of the periodic box. The molecules, referred to as pentamers (heptamers), are composed of five (seven) identical hard-disk "atoms" with centers forming regular pentagons (heptagons) of sides equal to the disk diameter. The elastic compliances of defect-free solid phases are computed by analysis of strain fluctuations, and the reference (equilibrium) state is determined within the same run in which the elastic properties are computed. Results obtained by using pseudorandom number generators based on the idea proposed by Holian and co-workers [Holian et al., Phys. Rev. E 50, 1607 (1994)] are in good agreement with the results generated by DRAND48. It is shown that singular behavior of the elastic constants near close packing is in agreement with the free volume approximation; the coefficients of the leading singularities are estimated. The simulations prove that the highest density structures of heptamers (in which the molecules cannot rotate) are auxetic, i.e., show negative Poisson ratios.
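
    The strain-fluctuation route to elastic compliances mentioned above is usually written as the constant-pressure fluctuation formula below; this is our summary of the standard relation (with k_B Boltzmann's constant, T the temperature and ⟨V⟩ the mean system volume), not a formula transcribed from the paper.

    \[
      S_{ijkl} \;=\; \frac{\langle V \rangle}{k_{B} T}
      \Bigl( \langle \varepsilon_{ij}\,\varepsilon_{kl} \rangle
            - \langle \varepsilon_{ij} \rangle \langle \varepsilon_{kl} \rangle \Bigr),
    \]

    with the strains ε measured relative to the equilibrium reference box determined within the same simulation run.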

  18. Approximate nonlinear multiparameter inversion for multicomponent single and double P-wave scattering in isotropic elastic media

    NASA Astrophysics Data System (ADS)

    Ouyang, Wei; Mao, Weijian

    2018-07-01

    An asymptotic quadratic true-amplitude inversion method for isotropic elastic P waves is proposed to invert medium parameters. The multicomponent P-wave scattered wavefield is computed based on a forward relationship using the second-order Born approximation and corresponding high-frequency ray theoretical methods. Within the local double scattering mechanism, the P-wave transmission factors are elaborately calculated, which results in the radiation pattern for P-wave scattering being a quadratic combination of the density and Lamé's moduli perturbation parameters. We further express the elastic P-wave scattered wavefield in a form of generalized Radon transform. After introducing classical backprojection operators, we obtain an approximate solution of the inverse problem by solving a quadratic nonlinear system. Numerical tests with synthetic data computed by a finite-difference scheme demonstrate that our quadratic inversion can accurately invert perturbation parameters for strong perturbations, compared with the P-wave single-scattering linear inversion method. Although our inversion strategy here is only syncretized with P-wave scattering, it can be extended to invert multicomponent elastic data containing both P- and S-wave information.

  19. A fast mass spring model solver for high-resolution elastic objects

    NASA Astrophysics Data System (ADS)

    Zheng, Mianlun; Yuan, Zhiyong; Zhu, Weixu; Zhang, Guian

    2017-03-01

    Real-time simulation of elastic objects is of great importance for computer graphics and virtual reality applications. The fast mass spring model solver can achieve visually realistic simulation in an efficient way. Unfortunately, this method suffers from resolution limitations and lack of mechanical realism for a surface geometry model, which greatly restricts its application. To tackle these problems, in this paper we propose a fast mass spring model solver for high-resolution elastic objects. First, we project the complex surface geometry model into a set of uniform grid cells as cages through a cage-based mean value coordinate method to reflect its internal structure and mechanics properties. Then, we replace the original Cholesky decomposition method in the fast mass spring model solver with a conjugate gradient method, which can make the fast mass spring model solver more efficient for detailed surface geometry models. Finally, we propose a graphics processing unit accelerated parallel algorithm for the conjugate gradient method. Experimental results show that our method can realize efficient deformation simulation of 3D elastic objects with visual reality and physical fidelity, which has a great potential for applications in computer animation.
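
    The key solver change described above replaces a prefactored Cholesky solve with conjugate gradients. The Python sketch below shows a textbook conjugate gradient routine applied to a stand-in symmetric positive-definite system of the kind an implicit mass-spring step produces; the matrix assembly and all parameters are illustrative assumptions, not the paper's cage-based formulation.

    # Textbook conjugate gradient (CG) solver of the kind that can replace a
    # direct Cholesky factorization when the implicit mass-spring system A x = b
    # is large. The matrix A built here is a stand-in (a 1D chain of springs),
    # not the cage-based system described in the paper.
    import numpy as np

    def conjugate_gradient(A, b, x0=None, tol=1e-8, max_iter=1000):
        """Solve A x = b for symmetric positive-definite A."""
        x = np.zeros_like(b) if x0 is None else x0.copy()
        r = b - A @ x
        p = r.copy()
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p
            rs_old = rs_new
        return x

    # Stand-in SPD system: mass matrix plus h^2-scaled stiffness of a spring chain.
    n, h, k, m = 100, 1e-2, 1e3, 1.0
    K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # chain Laplacian
    A = m * np.eye(n) + (h ** 2) * k * K
    b = np.full(n, -9.81 * m * h)                           # e.g. a gravity impulse
    x = conjugate_gradient(A, b)
    print("residual:", np.linalg.norm(A @ x - b))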

  20. Nuclear-relaxed elastic and piezoelectric constants of materials: Computational aspects of two quantum-mechanical approaches.

    PubMed

    Erba, Alessandro; Caglioti, Dominique; Zicovich-Wilson, Claudio Marcelo; Dovesi, Roberto

    2017-02-15

    Two alternative approaches for the quantum-mechanical calculation of the nuclear-relaxation term of elastic and piezoelectric tensors of crystalline materials are illustrated and their computational aspects discussed: (i) a numerical approach based on the geometry optimization of atomic positions at strained lattice configurations and (ii) a quasi-analytical approach based on the evaluation of the force- and displacement-response internal-strain tensors as combined with the interatomic force-constant matrix. The two schemes are compared both as regards their computational accuracy and performance. The latter approach, not being affected by the many numerical parameters and procedures of a typical quasi-Newton geometry optimizer, constitutes a more reliable and robust means for the evaluation of such properties, at a reduced computational cost for most crystalline systems. © 2016 Wiley Periodicals, Inc.
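
    In the quasi-analytical route, the nuclear-relaxation correction is commonly expressed through the internal-strain tensor and the interatomic force-constant matrix. The expression below is our schematic summary of that standard form, not an equation copied from the paper.

    \[
      C_{vu} \;=\; C_{vu}^{\mathrm{clamped}}
      \;-\; \frac{1}{V}\sum_{a i,\,b j}
            \Lambda_{a i, v}\,\bigl(H^{-1}\bigr)_{a i,\, b j}\,\Lambda_{b j, u},
    \]

    where a, b label atoms, i, j Cartesian directions, v, u strain components, Λ is the force-response internal-strain tensor, H the interatomic force-constant (Hessian) matrix and V the cell volume.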

  1. Finite difference elastic wave modeling with an irregular free surface using ADER scheme

    NASA Astrophysics Data System (ADS)

    Almuhaidib, Abdulaziz M.; Nafi Toksöz, M.

    2015-06-01

    In numerical modeling of seismic wave propagation in the earth, we encounter two important issues: the free surface and the topography of the surface (i.e. irregularities). In this study, we develop a 2D finite difference solver for the elastic wave equation that combines a 4th-order ADER scheme (Arbitrary high-order accuracy using DERivatives), which is widely used in aeroacoustics, with the characteristic variable method at the free surface boundary. The idea is to treat the free surface boundary explicitly by using ghost values of the solution for points beyond the free surface to impose the physical boundary condition. The method is based on the velocity-stress formulation. The ultimate goal is to develop a numerical solver for the elastic wave equation that is stable, accurate and computationally efficient. The solver treats smooth arbitrary-shaped boundaries as simple plane boundaries. The computational cost added by treating the topography is negligible compared to a flat free surface because only a small number of grid points near the boundary need to be computed. In the presence of topography, using 10 grid points per shortest shear-wavelength, the solver yields accurate results. Benchmark numerical tests using several complex models that are solved by our method and other independent accurate methods show an excellent agreement, confirming the validity of the method for modeling elastic waves with an irregular free surface.

  2. On the elastic–plastic decomposition of crystal deformation at the atomic scale

    DOE PAGES

    Stukowski, Alexander; Arsenlis, A.

    2012-03-02

    Given two snapshots of an atomistic system, taken at different stages of the deformation process, one can compute the incremental deformation gradient field, F, as defined by continuum mechanics theory, from the displacements of atoms. However, such a kinematic analysis of the total deformation does not reveal the respective contributions of elastic and plastic deformation. We develop a practical technique to perform the multiplicative decomposition of the deformation field, F = Fe Fp, into elastic and plastic parts for the case of crystalline materials. The described computational analysis method can be used to quantify plastic deformation in a material due to crystal slip-based mechanisms in molecular dynamics and molecular statics simulations. The knowledge of the plastic deformation field, Fp, and its variation with time can provide insight into the number, motion and localization of relevant crystal defects such as dislocations. As a result, the computed elastic field, Fe, provides information about inhomogeneous lattice strains and lattice rotations induced by the presence of defects.
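
    The kinematic input to this decomposition, the per-atom deformation gradient F, is commonly estimated by a least-squares fit over each atom's neighbor vectors in the two snapshots. The Python sketch below illustrates that fit for a single atom under an assumed deformation; it is an illustration of the general idea, not the authors' decomposition algorithm.

    # Sketch: least-squares estimate of the local deformation gradient F for one
    # atom from its neighbor vectors in a reference and a deformed snapshot.
    # This only illustrates the kinematic input to the F = Fe Fp decomposition;
    # it is not the elastic-plastic decomposition algorithm of the paper.
    import numpy as np

    def local_deformation_gradient(ref_vectors, def_vectors):
        """Fit F such that def_vectors ~ ref_vectors @ F.T in a least-squares sense.

        ref_vectors, def_vectors -- (N, 3) arrays of vectors from the central atom
                                    to its N neighbors, before and after deformation.
        """
        X = np.asarray(ref_vectors, dtype=float)
        Y = np.asarray(def_vectors, dtype=float)
        # Solve min_F || X F^T - Y ||^2, i.e. F^T = argmin over the neighbor set.
        Ft, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return Ft.T

    # Hypothetical example: a simple-shear deformation applied to six neighbors.
    ref = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                    [0, -1, 0], [0, 0, 1], [0, 0, -1]], dtype=float)
    F_true = np.array([[1.0, 0.1, 0.0],
                       [0.0, 1.0, 0.0],
                       [0.0, 0.0, 1.0]])
    deformed = ref @ F_true.T
    print(np.round(local_deformation_gradient(ref, deformed), 3))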

  3. Computational study of Drucker-Prager plasticity of rock using microtomography

    NASA Astrophysics Data System (ADS)

    Liu, J.; Sarout, J.; Zhang, M.; Dautriat, J.; Veveakis, M.; Regenauer-Lieb, K.

    2016-12-01

    Understanding the physics of rocks is essential for the mining and petroleum industries. Microtomography provides a new way to quantify the relationship between microstructure and mechanical and transport properties. Transport and elastic properties have been studied widely, while plastic properties are still poorly understood. In this study, we analyse a synthetic sandstone sample for its up-scaled plastic properties from the micro-scale. The computations are based on the representative volume element (RVE). The mechanical RVE was determined by upper- and lower-bound finite element computations of elasticity. By comparing with experimental curves, the parameters of the matrix (solid part), which consists of calcite-cemented quartz grains, were investigated and quite accurate values obtained. The analyses yielded the bulk yield stress, cohesion and angle of friction of the porous rock. Computations of a series of models with volume sizes from 240-cube to 400-cube showed almost overlapping stress-strain curves, suggesting that the mechanical RVE determined by elastic computations is valid for plastic yielding. Furthermore, a series of derivative models were created which have similar structure but different porosity values. The analyses of these models showed that yield stress, cohesion and the angle of friction decrease linearly with increasing porosity in the range from 8% to 28%. The angle of friction decreases the fastest, while cohesion is the most stable with respect to porosity.
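
    For reference, the Drucker-Prager yield criterion that the study fits can be written in its usual form; the expression below follows common convention rather than the paper's notation.

    \[
      f(\boldsymbol{\sigma}) \;=\; \sqrt{J_{2}} \;+\; \alpha\, I_{1} \;-\; k \;\le\; 0,
    \]

    where I_1 is the first stress invariant, J_2 the second deviatoric stress invariant, and α and k material constants related to the angle of friction and the cohesion.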

  4. The Amazon Region; A Vision of Sovereignty

    DTIC Science & Technology

    1998-04-06

    and SPOT remote sensing satellite images, about 90% of the Amazon jungle remains almost untouched. These 280 million hectares of vegetation hold...increasing energy needs, remain unanswered. Indian rights: Has the Indian population been jeopardized by the development of the Amazon Region...or government agency. STRATEGY RESEARCH PROJECT THE AMAZON REGION; A VISION OF SOVEREIGNTY BY LIEUTENANT COLONEL EDUARDO JOSE BARBOSA

  5. Modelling Ecuador's rainfall distribution according to geographical characteristics.

    NASA Astrophysics Data System (ADS)

    Tobar, Vladimiro; Wyseure, Guido

    2017-04-01

    It is known that rainfall is affected by terrain characteristics, and some studies have focused on its distribution over complex terrain. Ecuador's temporal and spatial rainfall distribution is affected by its location on the ITCZ, the marine currents in the Pacific, the Amazon rainforest, and the Andes mountain range. Although all these factors are important, we think that the latter one may hold a key for modelling the spatial and temporal distribution of rainfall. The study considered 30 years of monthly data from 319 rainfall stations having at least 10 years of data available. The relatively low density of stations and their location in accessible sites near main roads or rivers leave large and important areas ungauged, making it inappropriate to rely on traditional interpolation techniques to estimate regional rainfall for water balance. The aim of this research was to develop a useful model for seasonal rainfall distribution in Ecuador based on geographical characteristics to allow its spatial generalization. The target for modelling was the seasonal rainfall, characterized by nine percentiles for each of the 12 months of the year, resulting in 108 response variables, later reduced to four principal components comprising 94% of the total variability. Predictor variables for the model were: geographic coordinates, elevation, main wind effects from the Amazon and Coast, Valley and Hill indexes, and average and maximum elevation above the selected rainfall station to the east and to the west, for each of 18 directions (50-135°, by 5°), adding up to 79 predictors. A multiple linear regression model fitted by the Elastic-net algorithm with cross-validation was applied to each of the principal components as response, to select the most important of the 79 predictor variables. The Elastic-net algorithm deals well with collinearity problems, while allowing variable selection in a blended approach between Ridge and Lasso regression. The model fitting produced explained variances of 59%, 81%, 49% and 17% for PC1, PC2, PC3 and PC4, respectively, backing up the hypothesis of good correlation between geographical characteristics and seasonal rainfall patterns (comprised in the four principal components). With the obtained coefficients from the regression, the 108 rainfall percentiles for each station were back-estimated, giving very good results when compared with the original ones, with an overall 60% explained variance.
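
    A minimal sketch of the pipeline described above (responses reduced by PCA, then a cross-validated elastic-net fit per component) is given below in Python with scikit-learn; the arrays are random placeholders standing in for the station data, so the printed scores are meaningless and only the structure of the workflow is illustrated.

    # Minimal sketch of the described pipeline: reduce the 108 monthly rainfall
    # percentiles per station to a few principal components, then fit a
    # cross-validated elastic-net regression on the geographic predictors for
    # each component. The data below are random placeholders, not observations.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import ElasticNetCV
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_stations, n_predictors, n_percentiles = 319, 79, 108
    X = rng.normal(size=(n_stations, n_predictors))      # geographic predictors
    Y = rng.normal(size=(n_stations, n_percentiles))     # 9 percentiles x 12 months

    # Reduce the 108 response variables to 4 principal components.
    Y_pc = PCA(n_components=4).fit_transform(Y)
    X_std = StandardScaler().fit_transform(X)

    for k in range(Y_pc.shape[1]):
        # l1_ratio blends Ridge (0) and Lasso (1) penalties; CV selects alpha.
        model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5).fit(X_std, Y_pc[:, k])
        n_selected = int(np.sum(model.coef_ != 0))
        print(f"PC{k + 1}: {n_selected} predictors selected, "
              f"R^2 = {model.score(X_std, Y_pc[:, k]):.2f}")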

  6. Self-amplified Amazon forest loss due to vegetation-atmosphere feedbacks.

    PubMed

    Zemp, Delphine Clara; Schleussner, Carl-Friedrich; Barbosa, Henrique M J; Hirota, Marina; Montade, Vincent; Sampaio, Gilvan; Staal, Arie; Wang-Erlandsson, Lan; Rammig, Anja

    2017-03-13

    Reduced rainfall increases the risk of forest dieback, while in return forest loss might intensify regional droughts. The consequences of this vegetation-atmosphere feedback for the stability of the Amazon forest are still unclear. Here we show that the risk of self-amplified Amazon forest loss increases nonlinearly with dry-season intensification. We apply a novel complex-network approach, in which Amazon forest patches are linked by observation-based atmospheric water fluxes. Our results suggest that the risk of self-amplified forest loss is reduced with increasing heterogeneity in the response of forest patches to reduced rainfall. Under dry-season Amazonian rainfall reductions, comparable to Last Glacial Maximum conditions, additional forest loss due to self-amplified effects occurs in 10-13% of the Amazon basin. Although our findings do not indicate that the projected rainfall changes for the end of the twenty-first century will lead to complete Amazon dieback, they suggest that frequent extreme drought events have the potential to destabilize large parts of the Amazon forest.
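
    A highly simplified sketch of the complex-network idea, assuming networkx: forest patches are nodes, directed edges carry atmospheric moisture fluxes, and a cascade marks patches whose remaining inflow falls below a critical threshold once upwind patches are lost. All fluxes and thresholds are hypothetical; the actual study uses observation-based fluxes and a more detailed stability criterion.

      import networkx as nx

      # Directed moisture-recycling network: edge weight = moisture flux (placeholder units)
      G = nx.DiGraph()
      G.add_weighted_edges_from([
          ("patch_A", "patch_B", 0.6),
          ("patch_B", "patch_C", 0.5),
          ("patch_A", "patch_C", 0.2),
          ("patch_C", "patch_D", 0.7),
      ])

      CRITICAL_INFLOW = 0.4  # hypothetical threshold for forest persistence

      def cascade(graph, initially_lost):
          """Propagate forest loss: a patch dies when its remaining inflow drops below the threshold."""
          lost = set(initially_lost)
          changed = True
          while changed:
              changed = False
              for node in graph.nodes:
                  if node in lost:
                      continue
                  inflow = sum(d["weight"] for u, _, d in graph.in_edges(node, data=True)
                               if u not in lost)
                  if graph.in_degree(node) > 0 and inflow < CRITICAL_INFLOW:
                      lost.add(node)
                      changed = True
          return lost

      print(cascade(G, ["patch_A"]))  # patches lost after an initial dieback of patch_A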

  7. Self-amplified Amazon forest loss due to vegetation-atmosphere feedbacks

    NASA Astrophysics Data System (ADS)

    Zemp, Delphine Clara; Schleussner, Carl-Friedrich; Barbosa, Henrique M. J.; Hirota, Marina; Montade, Vincent; Sampaio, Gilvan; Staal, Arie; Wang-Erlandsson, Lan; Rammig, Anja

    2017-03-01

    Reduced rainfall increases the risk of forest dieback, while in return forest loss might intensify regional droughts. The consequences of this vegetation-atmosphere feedback for the stability of the Amazon forest are still unclear. Here we show that the risk of self-amplified Amazon forest loss increases nonlinearly with dry-season intensification. We apply a novel complex-network approach, in which Amazon forest patches are linked by observation-based atmospheric water fluxes. Our results suggest that the risk of self-amplified forest loss is reduced with increasing heterogeneity in the response of forest patches to reduced rainfall. Under dry-season Amazonian rainfall reductions, comparable to Last Glacial Maximum conditions, additional forest loss due to self-amplified effects occurs in 10-13% of the Amazon basin. Although our findings do not indicate that the projected rainfall changes for the end of the twenty-first century will lead to complete Amazon dieback, they suggest that frequent extreme drought events have the potential to destabilize large parts of the Amazon forest.

  8. Fragmentation of Andes-to-Amazon connectivity by hydropower dams

    PubMed Central

    Anderson, Elizabeth P.; Jenkins, Clinton N.; Heilpern, Sebastian; Maldonado-Ocampo, Javier A.; Carvajal-Vallejos, Fernando M.; Encalada, Andrea C.; Rivadeneira, Juan Francisco; Hidalgo, Max; Cañas, Carlos M.; Ortega, Hernan; Salcedo, Norma; Maldonado, Mabel; Tedesco, Pablo A.

    2018-01-01

    Andes-to-Amazon river connectivity controls numerous natural and human systems in the greater Amazon. However, it is being rapidly altered by a wave of new hydropower development, the impacts of which have been previously underestimated. We document 142 dams existing or under construction and 160 proposed dams for rivers draining the Andean headwaters of the Amazon. Existing dams have fragmented the tributary networks of six of eight major Andean Amazon river basins. Proposed dams could result in significant losses in river connectivity in river mainstems of five of eight major systems—the Napo, Marañón, Ucayali, Beni, and Mamoré. With a newly reported 671 freshwater fish species inhabiting the Andean headwaters of the Amazon (>500 m), dams threaten previously unrecognized biodiversity, particularly among endemic and migratory species. Because Andean rivers contribute most of the sediment in the mainstem Amazon, losses in river connectivity translate to drastic alteration of river channel and floodplain geomorphology and associated ecosystem services. PMID:29399629

  9. Fragmentation of Andes-to-Amazon connectivity by hydropower dams.

    PubMed

    Anderson, Elizabeth P; Jenkins, Clinton N; Heilpern, Sebastian; Maldonado-Ocampo, Javier A; Carvajal-Vallejos, Fernando M; Encalada, Andrea C; Rivadeneira, Juan Francisco; Hidalgo, Max; Cañas, Carlos M; Ortega, Hernan; Salcedo, Norma; Maldonado, Mabel; Tedesco, Pablo A

    2018-01-01

    Andes-to-Amazon river connectivity controls numerous natural and human systems in the greater Amazon. However, it is being rapidly altered by a wave of new hydropower development, the impacts of which have been previously underestimated. We document 142 dams existing or under construction and 160 proposed dams for rivers draining the Andean headwaters of the Amazon. Existing dams have fragmented the tributary networks of six of eight major Andean Amazon river basins. Proposed dams could result in significant losses in river connectivity in river mainstems of five of eight major systems-the Napo, Marañón, Ucayali, Beni, and Mamoré. With a newly reported 671 freshwater fish species inhabiting the Andean headwaters of the Amazon (>500 m), dams threaten previously unrecognized biodiversity, particularly among endemic and migratory species. Because Andean rivers contribute most of the sediment in the mainstem Amazon, losses in river connectivity translate to drastic alteration of river channel and floodplain geomorphology and associated ecosystem services.

  10. Self-amplified Amazon forest loss due to vegetation-atmosphere feedbacks

    PubMed Central

    Zemp, Delphine Clara; Schleussner, Carl-Friedrich; Barbosa, Henrique M. J.; Hirota, Marina; Montade, Vincent; Sampaio, Gilvan; Staal, Arie; Wang-Erlandsson, Lan; Rammig, Anja

    2017-01-01

    Reduced rainfall increases the risk of forest dieback, while in return forest loss might intensify regional droughts. The consequences of this vegetation–atmosphere feedback for the stability of the Amazon forest are still unclear. Here we show that the risk of self-amplified Amazon forest loss increases nonlinearly with dry-season intensification. We apply a novel complex-network approach, in which Amazon forest patches are linked by observation-based atmospheric water fluxes. Our results suggest that the risk of self-amplified forest loss is reduced with increasing heterogeneity in the response of forest patches to reduced rainfall. Under dry-season Amazonian rainfall reductions, comparable to Last Glacial Maximum conditions, additional forest loss due to self-amplified effects occurs in 10–13% of the Amazon basin. Although our findings do not indicate that the projected rainfall changes for the end of the twenty-first century will lead to complete Amazon dieback, they suggest that frequent extreme drought events have the potential to destabilize large parts of the Amazon forest. PMID:28287104

  11. A Cloud-Based Infrastructure for Near-Real-Time Processing and Dissemination of NPP Data

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Valente, E. G.; Chettri, S. S.

    2011-12-01

    We are building a scalable cloud-based infrastructure for generating and disseminating near-real-time data products from a variety of geospatial and meteorological data sources, including the new National Polar-orbiting Operational Environmental Satellite System (NPOESS) Preparatory Project (NPP). Our approach relies on linking Direct Broadcast and other data streams to a suite of scientific algorithms coordinated by NASA's International Polar-Orbiter Processing Package (IPOPP). The resulting data products are directly accessible to a wide variety of end-user applications, via industry-standard protocols such as OGC Web Services, Unidata Local Data Manager, or OPeNDAP, using open source software components. The processing chain employs on-demand computing resources from Amazon.com's Elastic Compute Cloud and NASA's Nebula cloud services. Our current prototype targets short-term weather forecasting, in collaboration with NASA's Short-term Prediction Research and Transition (SPoRT) program and the National Weather Service. Direct Broadcast is especially crucial for NPP, whose current ground segment is unlikely to deliver data quickly enough for short-term weather forecasters and other near-real-time users. Direct Broadcast also allows full local control over data handling, from the receiving antenna to end-user applications: this provides opportunities to streamline processes for data ingest, processing, and dissemination, and thus to make interpreted data products (Environmental Data Records) available to practitioners within minutes of data capture at the sensor. Cloud computing lets us grow and shrink computing resources to meet large and rapid fluctuations in data availability (twice daily for polar orbiters) - and similarly large fluctuations in demand from our target (near-real-time) users. This offers a compelling business case for cloud computing: the processing or dissemination systems can grow arbitrarily large to sustain near-real time data access despite surges in data volumes or user demand, but that computing capacity (and hourly costs) can be dropped almost instantly once the surge passes. Cloud computing also allows low-risk experimentation with a variety of machine architectures (processor types; bandwidth, memory, and storage capacities, etc.) and of system configurations (including massively parallel computing patterns). Finally, our service-based approach (in which user applications invoke software processes on a Web-accessible server) facilitates access into datasets of arbitrary size and resolution, and allows users to request and receive tailored products on demand. To maximize the usefulness and impact of our technology, we have emphasized open, industry-standard software interfaces. We are also using and developing open source software to facilitate the widespread adoption of similar, derived, or interoperable systems for processing and serving near-real-time data from NPP and other sources.
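
    The elastic scaling described above can be sketched with boto3, launching extra worker instances ahead of a polar-orbiter overpass and terminating them once the surge is processed. The AMI ID, instance type, region, and tag values are hypothetical placeholders, not the project's actual configuration, and real use requires valid AWS credentials.

      import boto3

      ec2 = boto3.client("ec2", region_name="us-east-1")

      def scale_up(n_workers):
          """Launch n_workers processing nodes before a data surge (AMI ID is a placeholder)."""
          resp = ec2.run_instances(
              ImageId="ami-0123456789abcdef0",   # hypothetical processing image with IPOPP installed
              InstanceType="c5.2xlarge",
              MinCount=n_workers,
              MaxCount=n_workers,
              TagSpecifications=[{
                  "ResourceType": "instance",
                  "Tags": [{"Key": "role", "Value": "npp-processing"}],
              }],
          )
          return [i["InstanceId"] for i in resp["Instances"]]

      def scale_down(instance_ids):
          """Release capacity (and hourly cost) once the surge has passed."""
          ec2.terminate_instances(InstanceIds=instance_ids)

      ids = scale_up(4)      # before the twice-daily overpass
      # ... run processing jobs on the new workers ...
      scale_down(ids)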

  12. Large-Scale, Parallel, Multi-Sensor Atmospheric Data Fusion Using Cloud Computing

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E. J.

    2013-12-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the 'A-Train' platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration analyses of important climate variables presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS), and to a model (MERRA), stratify the comparisons using a classification of the 'cloud scenes' from CloudSat, and repeat the entire analysis over 10 years of data. To efficiently assemble such datasets, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. However, these are data-intensive computing problems, so data transfer times and storage costs (for caching) are key issues. SciReduce is a Hadoop-like parallel analysis system, programmed in parallel python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in the Cloud. Unlike Hadoop, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Figure 1 shows the architecture of the full computational system, with SciReduce at the core. Multi-year datasets are automatically 'sharded' by time and space across a cluster of nodes so that years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the cached input and intermediate datasets. We are using SciReduce to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a NASA MEASURES grant. We will present the architecture of SciReduce, describe the achieved 'clock time' speedups in fusing datasets on our own compute nodes and in the public Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer. We will also present a concept/prototype for staging NASA's A-Train Atmospheric datasets (Levels 2 & 3) in the Amazon Cloud so that any number of compute jobs can be executed 'near' the multi-sensor data. Given such a system, multi-sensor climate studies over 10-20 years of data could be performed in an efficient way, with the researcher paying only their own Cloud compute bill.
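
    A toy sketch of the shard-by-time map/reduce pattern described above, using only numpy and the standard library; the "granules" are synthetic arrays standing in for a year of retrievals, and the map and reduce functions are hypothetical simplifications, not the SciReduce implementation.

      import numpy as np
      from multiprocessing import Pool

      def load_shard(year):
          """Stand-in for pulling one time shard (e.g. via an OPeNDAP subsetting request)."""
          rng = np.random.default_rng(year)
          return rng.normal(loc=3.0, scale=0.5, size=(365, 90, 180))  # daily water vapor, placeholder

      def map_shard(year):
          """Map step: reduce one year-long shard to an annual mean field."""
          return load_shard(year).mean(axis=0)

      def reduce_shards(fields):
          """Reduce step: combine per-year fields into a multi-year climatology."""
          return np.mean(np.stack(fields), axis=0)

      if __name__ == "__main__":
          years = range(2003, 2013)                 # a ten-year record
          with Pool() as pool:
              annual_means = pool.map(map_shard, years)
          climatology = reduce_shards(annual_means)
          print(climatology.shape)                  # (90, 180)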

  13. Computation of forces from deformed visco-elastic biological tissues

    NASA Astrophysics Data System (ADS)

    Muñoz, José J.; Amat, David; Conte, Vito

    2018-04-01

    We present a least-squares-based inverse analysis of visco-elastic biological tissues. The proposed method computes the set of contractile forces (dipoles) at the cell boundaries that induce the observed and quantified deformations. We show that the computation of these forces requires the regularisation of the problem functional for some load configurations that we study here. The functional measures the error of the dynamic problem being discretised in time with a second-order implicit time-stepping and in space with standard finite elements. We analyse the uniqueness of the inverse problem and estimate the regularisation parameter by means of an L-curve criterion. We apply the methodology to a simple toy problem and to an in vivo set of morphogenetic deformations of the Drosophila embryo.
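
    A generic sketch of the regularised least-squares step, assuming a linear forward operator A mapping dipole magnitudes to observed displacements: Tikhonov regularisation is scanned over a range of parameters and the L-curve (residual norm versus solution norm) is tabulated. The operator and data here are random placeholders, not the finite-element model of the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      A = rng.normal(size=(200, 50))                 # forward operator: dipoles -> displacements (placeholder)
      x_true = rng.normal(size=50)
      b = A @ x_true + 0.05 * rng.normal(size=200)   # noisy observed deformation

      def tikhonov(A, b, lam):
          """Solve min ||A x - b||^2 + lam * ||x||^2 via the normal equations."""
          n = A.shape[1]
          return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

      # Tabulate the L-curve: residual norm vs. solution norm for a sweep of lambda
      for lam in np.logspace(-4, 2, 7):
          x = tikhonov(A, b, lam)
          print(f"lambda={lam:8.1e}  residual={np.linalg.norm(A @ x - b):.3f}  "
                f"solution norm={np.linalg.norm(x):.3f}")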

  14. Documentation of programs that compute 1) quasi-static tilts produced by an expanding dislocation loop in an elastic and viscoelastic material, and 2) surface shear stresses, strains, and shear displacements produced by screw dislocations in a vertical slab with modulus contrast

    USGS Publications Warehouse

    McHugh, Stuart

    1976-01-01

    The material in this report can be grouped into two categories: 1) programs that compute tilts produced by a vertically oriented expanding rectangular dislocation loop in an elastic or viscoelastic material and 2) programs that compute the shear stresses, strains, and shear displacements in a three-phase half-space (i.e. a half-space containing a vertical slab). Each section describes the relevant theory, and provides a detailed guide to the operation of the programs. A series of examples is provided at the end of each section.

  15. Computing Fiber/Matrix Interfacial Effects In SiC/RBSN

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Hopkins, Dale A.

    1996-01-01

    A computational study was conducted to demonstrate the use of the boundary-element method in analyzing the effects of the fiber/matrix interface on the elastic and thermal behavior of representative laminated composite materials. In the study, the boundary-element method was implemented with the Boundary Element Solution Technology - Composite Modeling System (BEST-CMS) computer program.

  16. Visualising elastic anisotropy: theoretical background and computational implementation

    NASA Astrophysics Data System (ADS)

    Nordmann, J.; Aßmus, M.; Altenbach, H.

    2018-02-01

    In this article, we present the technical realisation for visualisations of characteristic parameters of the fourth-order elasticity tensor, which is classified by three-dimensional symmetry groups. Hereby, expressions for spatial representations of Young's modulus and bulk modulus as well as plane representations of shear modulus and Poisson's ratio are derived and transferred into a comprehensible form to computer algebra systems. Additionally, we present approaches for spatial representations of both latter parameters. These three- and two-dimensional representations are implemented into the software MATLAB (MATrix LABoratory). Exemplary representations of characteristic materials complete the present treatise.
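
    A small sketch of the kind of computation behind such visualisations: the directional Young's modulus E(n) = 1 / (n_i n_j n_k n_l S_ijkl) evaluated from a 6x6 stiffness matrix. The material values below are an isotropic placeholder, and the Voigt-to-tensor conversion factors for the compliance are the standard ones; this is an independent illustration, not the authors' code.

      import numpy as np

      # 6x6 stiffness matrix in Voigt notation (GPa); isotropic placeholder (lambda=60, mu=30)
      lam, mu = 60.0, 30.0
      C = np.array([
          [lam + 2 * mu, lam, lam, 0, 0, 0],
          [lam, lam + 2 * mu, lam, 0, 0, 0],
          [lam, lam, lam + 2 * mu, 0, 0, 0],
          [0, 0, 0, mu, 0, 0],
          [0, 0, 0, 0, mu, 0],
          [0, 0, 0, 0, 0, mu],
      ])
      S = np.linalg.inv(C)  # compliance in Voigt notation

      VOIGT = {(0, 0): 0, (1, 1): 1, (2, 2): 2,
               (1, 2): 3, (2, 1): 3, (0, 2): 4, (2, 0): 4, (0, 1): 5, (1, 0): 5}

      def S4(i, j, k, l):
          """Full fourth-order compliance from the Voigt matrix (factors 1, 1/2, 1/4 for shear pairs)."""
          p, q = VOIGT[(i, j)], VOIGT[(k, l)]
          factor = (0.5 if p > 2 else 1.0) * (0.5 if q > 2 else 1.0)
          return factor * S[p, q]

      def youngs_modulus(n):
          """Directional Young's modulus E(n) = 1 / (n_i n_j n_k n_l S_ijkl)."""
          n = np.asarray(n, dtype=float)
          n /= np.linalg.norm(n)
          s = sum(n[i] * n[j] * n[k] * n[l] * S4(i, j, k, l)
                  for i in range(3) for j in range(3) for k in range(3) for l in range(3))
          return 1.0 / s

      print(youngs_modulus([1, 0, 0]))  # for an isotropic material this is identical in every direction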

  17. [Elastic registration method to compute deformation functions for mitral valve].

    PubMed

    Yang, Jinyu; Zhang, Wan; Yin, Ran; Deng, Yuxiao; Wei, Yunfeng; Zeng, Junyi; Wen, Tong; Ding, Lu; Liu, Xiaojian; Li, Yipeng

    2014-10-01

    Mitral valve disease is one of the most common heart valve diseases. Precise localization and display of the valve characteristics are necessary for minimally invasive mitral valve repair procedures. This paper presents a multi-resolution elastic registration method to compute deformation functions constructed from cubic B-splines in three-dimensional ultrasound images, in which the objective functional to be optimized was derived by the maximum likelihood method based on the probabilistic distribution of the ultrasound speckle noise. The algorithm was then applied to register the mitral valve voxels. Numerical results demonstrated the effectiveness of the algorithm.

  18. Origin and processing of terrestrial organic carbon in the Amazon system: lignin phenols in river, shelf, and fan sediments

    NASA Astrophysics Data System (ADS)

    Sun, Shuwen; Schefuß, Enno; Mulitza, Stefan; Chiessi, Cristiano M.; Sawakuchi, André O.; Zabel, Matthias; Baker, Paul A.; Hefter, Jens; Mollenhauer, Gesine

    2017-05-01

    The Amazon River transports large amounts of terrestrial organic carbon (OCterr) from the Andean and Amazon neotropical forests to the Atlantic Ocean. In order to compare the biogeochemical characteristics of OCterr in the fluvial sediments from the Amazon drainage basin and in the adjacent marine sediments, we analysed riverbed sediments from the Amazon mainstream and its main tributaries as well as marine surface sediments from the Amazon shelf and fan for total organic carbon (TOC) content, organic carbon isotopic composition (δ13CTOC), and lignin phenol compositions. TOC and lignin content exhibit positive correlations with Al / Si ratios (indicative of the sediment grain size) implying that the grain size of sediment discharged by the Amazon River plays an important role in the preservation of TOC and leads to preferential preservation of lignin phenols in fine particles. Depleted δ13CTOC values (-26.1 to -29.9 ‰) in the main tributaries consistently correspond with the dominance of C3 vegetation. Ratios of syringyl to vanillyl (S / V) and cinnamyl to vanillyl (C / V) lignin phenols suggest that non-woody angiosperm tissues are the dominant source of lignin in the Amazon basin. Although the Amazon basin hosts a rich diversity of vascular plant types, distinct regional lignin compositions are not observed. In the marine sediments, the distribution of δ13CTOC and Λ8 (sum of eight lignin phenols in organic carbon (OC), expressed as mg/100 mg OC) values implies that OCterr discharged by the Amazon River is transported north-westward by the North Brazil Current and mostly deposited on the inner shelf. The lignin compositions in offshore sediments under the influence of the Amazon plume are consistent with the riverbed samples suggesting that processing of OCterr during offshore transport does not change the encoded source information. Therefore, the lignin compositions preserved in these offshore sediments can reliably reflect the vegetation in the Amazon River catchment. In sediments from the Amazon fan, low lignin content, relatively depleted δ13CTOC values and high (Ad / Al)V ratios indicating highly degraded lignin imply that a significant fraction of the deposited OCterr is derived from petrogenic (sourced from ancient rocks) sources.

  19. Consolidation of cloud computing in ATLAS

    NASA Astrophysics Data System (ADS)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  20. Elastic response of binary hard-sphere fluids

    NASA Astrophysics Data System (ADS)

    Rickman, J. M.; Ou-Yang, H. Daniel

    2011-07-01

    We derive expressions for the high-frequency, wave-number-dependent elastic constants of a binary hard-sphere fluid and employ Monte Carlo computer simulation to evaluate these constants in order to highlight the impact of composition and relative sphere diameter on the elastic response of this system. It is found that the elastic constant c11(k) exhibits oscillatory behavior as a function of k whereas the high-frequency shear modulus, for example, does not. This behavior is shown to be dictated by the angular dependence (in k⃗ space) of derivatives of the interatomic force at contact. The results are related to recent measurements of the compressibility of colloidal fluids in laser trapping experiments.

  1. Variation of the energy release rate as a crack approaches and passes through an elastic inclusion

    NASA Astrophysics Data System (ADS)

    Li, Rongshun; Chudnovsky, A.

    1993-02-01

    The variation of the energy release rate (ERR) at the tip of a crack penetrating an elastic inclusion is analyzed using an approach in which the random array of microcracks or other defects is modeled by an elastic inclusion with effective elastic properties. Computations are carried out using a finite element procedure. The eight-noded isoparametric serendipity element, with the midpoint shifted to the quarter-point, is used to simulate the singularity at the crack tip, and crack growth is accommodated by implementing a mesh regeneration technique. The ERR values were calculated for various crack tip positions, simulating the process of the crack approaching and penetrating the inclusion.

  2. Variation of the energy release rate as a crack approaches and passes through an elastic inclusion

    NASA Technical Reports Server (NTRS)

    Li, Rongshun; Chudnovsky, A.

    1993-01-01

    The variation of the energy release rate (ERR) at the tip of a crack penetrating an elastic inclusion is analyzed using an approach in which the random array of microcracks or other defects is modeled by an elastic inclusion with effective elastic properties. Computations are carried out using a finite element procedure. The eight-noded isoparametric serendipity element, with the midpoint shifted to the quarter-point, is used to simulate the singularity at the crack tip, and crack growth is accommodated by implementing a mesh regeneration technique. The ERR values were calculated for various crack tip positions, simulating the process of the crack approaching and penetrating the inclusion.

  3. Ultrasonic studies of aluminium-substituted Bi(Pb)-2223 superconductors

    NASA Astrophysics Data System (ADS)

    Solunke, M. B.; Sharma, P. U.; Pandya, M. P.; Lakhani, V. K.; Modi, K. B.; Venugopal Reddy, P.; Shah, S. S.

    2005-09-01

    The compositional dependence of the elastic properties of the Al^{3+}-substituted Bi(Pb)-2223 superconducting system with the general formula Bi_{1.7-x}Al_xPb_{0.3}Sr_2Ca_2Cu_3O_y (x = 0.0, 0.1, 0.2 and 0.3) has been studied by means of the ultrasonic pulse transmission (UPT) technique at 1 MHz (300 K). The elastic moduli of the specimens are computed and corrected to zero porosity. The observed variation of elastic constants with aluminium substitution has been explained on the basis of the strength of interatomic bonding. The applicability of the heterogeneous metal mixture rule for estimating elastic constants and transition temperature has been tested.

  4. A two-layered mechanical model of the rat esophagus. Experiment and theory

    PubMed Central

    Fan, Yanhua; Gregersen, Hans; Kassab, Ghassan S

    2004-01-01

    Background The function of the esophagus is to move food by peristaltic motion, which results from the interaction of the tissue forces in the esophageal wall and the hydrodynamic forces in the food bolus. The structure of the esophagus is layered. In this paper, the esophagus is treated as a two-layered structure consisting of an inner collagen-rich submucosa layer and an outer muscle layer. We developed a model and experimental setup for determination of the elastic moduli of the two layers in the circumferential direction and related the measured elastic modulus of the intact esophagus to the elastic modulus computed from the elastic moduli of the two layers. Methods Inflation experiments were done at in vivo length, and pressure-diameter relations were recorded for the rat esophagus. Furthermore, the zero-stress state was taken into consideration. Results The radius and the strain increased as a function of pressure in the intact esophagus as well as in the individual layers. At pressures higher than 1.5 cmH2O the muscle layer had a larger radius and strain than the mucosa-submucosa layer. The strain for the intact esophagus and for the muscle layer was negative at low pressures, indicating the presence of residual strains in the tissue. The stress-strain curve for the submucosa-mucosa layer was shifted to the left of the curves for the muscle layer and for the intact esophagus at strains higher than 0.3. The tangent modulus was highest in the submucosa-mucosa layer, indicating that the submucosa-mucosa has the highest stiffness. A good agreement was found between the measured elastic modulus of the intact esophagus and the elastic modulus computed from the elastic moduli of the two separated layers. PMID:15518591

  5. Elastic energy within the human plantar aponeurosis contributes to arch shortening during the push-off phase of running.

    PubMed

    Wager, Justin C; Challis, John H

    2016-03-21

    During locomotion, the lower limb tendons undergo stretch and recoil, functioning like springs that recycle energy with each step. Cadaveric testing has demonstrated that the arch of the foot operates in this capacity during simple loading, yet it remains unclear whether this function exists during locomotion. In this study, one of the arch's passive elastic tissues (the plantar aponeurosis; PA) was investigated to glean insights about it and the entire arch of the foot during running. Subject-specific computer models of the foot were driven using the kinematics of eight subjects running at 3.1 m/s using two initial contact patterns (rearfoot and non-rearfoot). These models were used to estimate PA strain, force, and elastic energy storage during the stance phase. To examine the release of stored energy, the foot joint moments, powers, and work created by the PA were computed. Mean elastic energy stored in the PA was 3.1±1.6 J, which was comparable to in situ testing values. Changes to the initial contact pattern did not change elastic energy storage or late-stance PA function, but did alter PA pre-tensioning and function during early stance. In both initial contact pattern conditions, the PA power was positive during late stance, which reveals that the release of the stored elastic energy assists with shortening of the arch during push-off. As the PA is just one of the arch's passive elastic tissues, the entire arch may store additional energy and impact the metabolic cost of running. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Uncertainty Estimation in Elastic Full Waveform Inversion by Utilising the Hessian Matrix

    NASA Astrophysics Data System (ADS)

    Hagen, V. S.; Arntsen, B.; Raknes, E. B.

    2017-12-01

    Elastic Full Waveform Inversion (EFWI) is a computationally intensive iterative method for estimating elastic model parameters. A key element of EFWI is the numerical solution of the elastic wave equation, which forms the basis for quantifying the mismatch between synthetic (modelled) and true (real) measured seismic data. The misfit between the modelled and true receiver data is used to update the parameter model to yield a better fit between the modelled and true receiver signal. A common approach to the EFWI model update problem is to use a conjugate gradient search method. In this approach the resolution and cross-coupling for the estimated parameter update can be found by computing the full Hessian matrix. Resolution of the estimated model parameters depends on the chosen parametrisation, acquisition geometry, and temporal frequency range. Although some understanding has been gained, it is still not clear which elastic parameters can be reliably estimated under which conditions. With few exceptions, previous analyses have been based on arguments using radiation pattern analysis. In our study, we use the well-known adjoint-state technique, extended to compute the action of the Hessian on a model perturbation. The Hessian is used to infer parameter resolution and cross-coupling for different selections of models, acquisition geometries, and data types, including streamer and ocean bottom seismic recordings. Information about the model uncertainty is obtained from the exact Hessian, and is essential when evaluating the quality of estimated parameters due to the strong influence of source-receiver geometry and frequency content. Investigations are carried out on both a homogeneous model and the Gullfaks model, where we illustrate the influence of offset on parameter resolution and cross-coupling as a way of estimating uncertainty.

  7. BEST3D user's manual: Boundary Element Solution Technology, 3-Dimensional Version 3.0

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The theoretical basis and programming strategy utilized in the construction of the computer program BEST3D (boundary element solution technology - three dimensional) and detailed input instructions are provided for the use of the program. An extensive set of test cases and sample problems is included in the manual and is also available for distribution with the program. The BEST3D program was developed under the 3-D Inelastic Analysis Methods for Hot Section Components contract (NAS3-23697). The overall objective of this program was the development of new computer programs allowing more accurate and efficient three-dimensional thermal and stress analysis of hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The BEST3D program allows both linear and nonlinear analysis of static and quasi-static elastic problems and transient dynamic analysis for elastic problems. Calculation of elastic natural frequencies and mode shapes is also provided.

  8. Finite-difference modeling with variable grid-size and adaptive time-step in porous media

    NASA Astrophysics Data System (ADS)

    Liu, Xinxin; Yin, Xingyao; Wu, Guochen

    2014-04-01

    Forward modeling of elastic wave propagation in porous media is of great importance for understanding and interpreting the influence of rock properties on the characteristics of the seismic wavefield. However, the finite-difference forward-modeling method is usually implemented with a global spatial grid size and time step; it incurs a large computational cost when small-scale oil/gas-bearing structures or large velocity contrasts exist underground. To overcome this handicap, this paper develops a staggered-grid finite-difference scheme for elastic wave modeling in porous media that combines variable grid size and variable time step. Variable finite-difference coefficients and wavefield interpolation were used to realize the transition of wave propagation between regions of different grid size. The accuracy and efficiency of the algorithm are demonstrated by numerical examples. The proposed method offers low computational cost in elastic wave simulation for heterogeneous oil/gas reservoirs.
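
    A one-dimensional, uniform-grid toy version of the staggered-grid velocity-stress update pattern (without the variable grid size, adaptive time step, or poroelastic terms of the paper), just to illustrate the scheme; material values, discretization, and the source wavelet are placeholders.

      import numpy as np

      nx, nt = 400, 800
      dx, dt = 5.0, 5e-4                  # m, s (placeholder discretization; Courant number ~0.2)
      rho = np.full(nx, 2200.0)           # density (kg/m^3)
      mu = np.full(nx, 9.0e9)             # shear modulus (Pa)

      v = np.zeros(nx)                    # particle velocity at integer grid points
      tau = np.zeros(nx)                  # stress at staggered half grid points

      for it in range(nt):
          # update velocity from the stress gradient
          v[1:] += dt / rho[1:] * (tau[1:] - tau[:-1]) / dx
          # update stress from the velocity gradient
          tau[:-1] += dt * mu[:-1] * (v[1:] - v[:-1]) / dx
          # point source: a Ricker-like pulse injected into the stress field
          t = it * dt
          arg = (np.pi * 25 * (t - 0.04)) ** 2
          tau[nx // 2] += (1 - 2 * arg) * np.exp(-arg)

      print(float(np.max(np.abs(v))))     # peak particle velocity after nt steps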

  9. Calculation of open and closed system elastic coefficients for multicomponent solids

    NASA Astrophysics Data System (ADS)

    Mishin, Y.

    2015-06-01

    Thermodynamic equilibrium in multicomponent solids subject to mechanical stresses is a complex nonlinear problem whose exact solution requires extensive computations. A few decades ago, Larché and Cahn proposed a linearized solution of the mechanochemical equilibrium problem by introducing the concept of open system elastic coefficients [Acta Metall. 21, 1051 (1973), 10.1016/0001-6160(73)90021-7]. Using the Ni-Al solid solution as a model system, we demonstrate that open system elastic coefficients can be readily computed by semigrand canonical Monte Carlo simulations in conjunction with the shape fluctuation approach. Such coefficients can be derived from a single simulation run, together with other thermodynamic properties needed for prediction of compositional fields in solid solutions containing defects. The proposed calculation approach enables streamlined solutions of mechanochemical equilibrium problems in complex alloys. Second order corrections to the linear theory are extended to multicomponent systems.

  10. Implementation and clinical application of a deformation method for fast simulation of biological tissue formed by fibers and fluid.

    PubMed

    Sardinha, Ana Gabriella de Oliveira; Oyama, Ceres Nunes de Resende; de Mendonça Maroja, Armando; Costa, Ivan F

    2016-01-01

    The aim of this paper is to provide a general discussion, an algorithm, and actual working programs for a deformation method enabling fast simulation of biological tissue formed by fibers and fluid. In order to demonstrate the benefit of the clinical application software, we successfully used our computational program to deform a 3D breast image acquired from patients, using a 3D scanner, in a real hospital environment. The method implements a quasi-static solution for elastic global deformations of objects. Each pair of vertices of the surface is connected and defines an elastic fiber. The set of all elastic fibers defines a mesh of smaller size than volumetric meshes, allowing simulation of complex objects with less computational effort. Behavior similar to that produced by the stress tensor is obtained through the volume conservation equation, which couples the 3D coordinates. Step by step, we show the computational implementation of this approach. As an example, a 2D rectangle formed by only 4 vertices is solved and, for this simple geometry, all intermediate results are shown. Actual implementations of these ideas in the form of working computer routines are provided for general 3D objects, including a clinical application.
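
    A very small sketch of the fiber idea for the 2D four-vertex example mentioned above: every pair of vertices is connected by an elastic fiber with a rest length, and a quasi-static configuration is approached by simple force relaxation. The stiffness, step size, and the omission of the volume-conservation coupling are simplifications of the published method, used here only to show the all-pairs fiber structure.

      import numpy as np
      from itertools import combinations

      # Four vertices of a 2D rectangle (the toy example mentioned in the abstract)
      X0 = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 1.0], [0.0, 1.0]])
      fibers = list(combinations(range(len(X0)), 2))              # every vertex pair is a fiber
      rest = {f: np.linalg.norm(X0[f[0]] - X0[f[1]]) for f in fibers}

      k, step = 1.0, 0.1                                          # stiffness and relaxation step (placeholders)
      X = X0.copy()
      X[2] += [0.3, 0.2]                                          # impose a deformation on one vertex

      for _ in range(500):                                        # quasi-static relaxation
          F = np.zeros_like(X)
          for i, j in fibers:
              d = X[j] - X[i]
              L = np.linalg.norm(d)
              f = k * (L - rest[(i, j)]) * d / L                  # linear elastic fiber force
              F[i] += f
              F[j] -= f
          F[2] = 0.0                                              # keep the displaced vertex fixed
          X += step * F

      print(X.round(3))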

  11. Pareto Joint Inversion of Love and Quasi Rayleigh's waves - synthetic study

    NASA Astrophysics Data System (ADS)

    Bogacz, Adrian; Dalton, David; Danek, Tomasz; Miernik, Katarzyna; Slawinski, Michael A.

    2017-04-01

    In this contribution, a specific application of Pareto joint inversion to a geophysical problem is presented. The Pareto criterion, combined with Particle Swarm Optimization, was used to solve geophysical inverse problems for Love and quasi-Rayleigh waves. The basic theory of the forward-problem calculation for the chosen surface waves is described. To avoid computational problems, some simplifications were made; this allowed faster and more straightforward calculation without loss of generality in the solution. Under the restrictions of the solution scheme, the model must have exactly two layers: an elastic isotropic surface layer over an elastic isotropic half-space of infinite thickness. The aim of the inversion is to obtain the elastic parameters and model geometry from dispersion data. Different cases were considered in the calculations, such as different numbers of modes for different wave types and different frequencies. The implementation uses the OpenMP standard for parallel computing, which helps reduce computation time. The results of the experimental computations are presented and discussed. This research was performed in the context of The Geomechanics Project supported by Husky Energy. Also, this research was partially supported by the Natural Sciences and Engineering Research Council of Canada, grant 238416-2013, and by the Polish National Science Center under contract No. DEC-2013/11/B/ST10/0472.
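
    A bare-bones Particle Swarm Optimization sketch for the kind of inversion described above. The "forward model" here is a stand-in quadratic misfit over a layer thickness and two shear velocities, not an actual Love or quasi-Rayleigh dispersion calculation, and all PSO constants and bounds are conventional placeholder values.

      import numpy as np

      rng = np.random.default_rng(7)
      true_model = np.array([200.0, 800.0, 1500.0])     # thickness (m), vs1, vs2 (m/s); placeholders

      def misfit(m):
          """Stand-in for the dispersion-curve misfit between observed and modelled data."""
          return np.sum(((m - true_model) / true_model) ** 2)

      lo = np.array([50.0, 400.0, 900.0])
      hi = np.array([500.0, 1200.0, 2500.0])

      n_particles, n_iter = 30, 200
      x = rng.uniform(lo, hi, size=(n_particles, 3))     # particle positions
      v = np.zeros_like(x)                               # particle velocities
      pbest = x.copy()
      pbest_val = np.array([misfit(p) for p in x])
      gbest = pbest[np.argmin(pbest_val)].copy()

      w, c1, c2 = 0.7, 1.5, 1.5                          # inertia and acceleration coefficients
      for _ in range(n_iter):
          r1, r2 = rng.random(x.shape), rng.random(x.shape)
          v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
          x = np.clip(x + v, lo, hi)
          vals = np.array([misfit(p) for p in x])
          improved = vals < pbest_val
          pbest[improved], pbest_val[improved] = x[improved], vals[improved]
          gbest = pbest[np.argmin(pbest_val)].copy()

      print(gbest.round(1))                              # should approach the true model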

  12. A spline-based parameter and state estimation technique for static models of elastic surfaces

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Daniel, P. L.; Armstrong, E. S.

    1983-01-01

    Parameter and state estimation techniques for an elliptic system arising in a developmental model for the antenna surface in the Maypole Hoop/Column antenna are discussed. A computational algorithm based on spline approximations for the state and elastic parameters is given and numerical results obtained using this algorithm are summarized.

  13. Surface Electrochemistry of Metals

    DTIC Science & Technology

    1993-04-30

    maxima along the 12 directions of open channels (which are also the interatomic directions). Elastic scattering angular distributions always contain... scatterer geometric relationships for such samples. Distributions from ordered atomic bilayers reveal that the Auger signal from the underlayer is attenuated...are developing a theoretical model and computational code which include both elastic scattering and inhomogeneous inelastic scattering. We seek

  14. Tuition Elasticity of the Demand for Higher Education among Current Students: A Pricing Model.

    ERIC Educational Resources Information Center

    Bryan, Glenn A.; Whipple, Thomas W.

    1995-01-01

    A pricing model is offered, based on retention of current students, that colleges can use to determine appropriate tuition. A computer-based model that quantifies the relationship between tuition elasticity and projected net return to the college was developed and applied to determine an appropriate tuition rate for a small, private liberal arts…
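
    A minimal sketch of the kind of pricing model described above, assuming a constant-elasticity retention response to tuition changes; the baseline enrollment, cost, and elasticity values are hypothetical, not figures from the article.

      # Constant-elasticity retention model: enrollment scales as (tuition / base_tuition) ** elasticity
      BASE_TUITION = 20_000.0      # current tuition (hypothetical)
      BASE_ENROLLMENT = 1_500      # current students (hypothetical)
      COST_PER_STUDENT = 12_000.0  # marginal cost of serving one student (hypothetical)
      ELASTICITY = -1.2            # tuition elasticity of demand among current students (hypothetical)

      def net_return(tuition):
          enrollment = BASE_ENROLLMENT * (tuition / BASE_TUITION) ** ELASTICITY
          return enrollment * (tuition - COST_PER_STUDENT)

      # Scan candidate tuition levels for the one maximizing projected net return
      candidates = range(16_000, 30_001, 1_000)
      for t in candidates:
          print(f"tuition={t:>6}  projected net return={net_return(t):>12,.0f}")
      print("best tuition:", max(candidates, key=net_return))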

  15. Using FRED Data to Teach Price Elasticity of Demand

    ERIC Educational Resources Information Center

    Méndez-Carbajo, Diego; Asarta, Carlos J.

    2017-01-01

    In this article, the authors discuss the use of Federal Reserve Economic Data (FRED) statistics to teach the concept of price elasticity of demand in an introduction to economics course. By using real data in its computation, they argue that instructors can create a value-adding context for illustrating and applying a foundational concept in…
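
    For reference, the sketch below computes price elasticity of demand with the midpoint (arc) formula that such a classroom exercise typically uses; the two price-quantity observations are invented for illustration and are not FRED series.

      def arc_elasticity(p1, q1, p2, q2):
          """Midpoint formula: percent change in quantity divided by percent change in price."""
          pct_q = (q2 - q1) / ((q1 + q2) / 2)
          pct_p = (p2 - p1) / ((p1 + p2) / 2)
          return pct_q / pct_p

      # Hypothetical observations: price rises from 2.50 to 3.00, quantity falls from 100 to 88
      print(arc_elasticity(2.50, 100, 3.00, 88))   # about -0.70, i.e. inelastic demand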

  16. A First Approach to Filament Dynamics

    ERIC Educational Resources Information Center

    Silva, P. E. S.; de Abreu, F. Vistulo; Simoes, R.; Dias, R. G.

    2010-01-01

    Modelling elastic filament dynamics is a topic of high interest due to the wide range of applications. However, it has reached a high level of complexity in the literature, making it inaccessible to a beginner. In this paper we explain the main steps involved in the computational modelling of the dynamics of an elastic filament. We first derive…

  17. ELATE: an open-source online application for analysis and visualization of elastic tensors

    NASA Astrophysics Data System (ADS)

    Gaillac, Romain; Pullumbi, Pluton; Coudert, François-Xavier

    2016-07-01

    We report on the implementation of a tool for the analysis of second-order elastic stiffness tensors, provided with both an open-source Python module and a standalone online application allowing the visualization of anisotropic mechanical properties. After describing the software features, how we compute the conventional elastic constants and how we represent them graphically, we explain our technical choices for the implementation. In particular, we focus on why a Python module is used to generate the HTML web page with embedded Javascript for dynamical plots.
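
    A short sketch of the sort of averaging such a tool performs when reporting conventional elastic constants: Voigt and Reuss bounds and the Hill average for the bulk and shear moduli, computed from a 6x6 stiffness matrix. The matrix below is a cubic placeholder, and the code is an independent illustration under standard formulas, not the ELATE source.

      import numpy as np

      # 6x6 stiffness matrix in Voigt notation (GPa); hypothetical cubic placeholder
      C = np.array([
          [200, 100, 100, 0, 0, 0],
          [100, 200, 100, 0, 0, 0],
          [100, 100, 200, 0, 0, 0],
          [0, 0, 0, 80, 0, 0],
          [0, 0, 0, 0, 80, 0],
          [0, 0, 0, 0, 0, 80],
      ], dtype=float)
      S = np.linalg.inv(C)  # compliance matrix

      K_voigt = (C[0, 0] + C[1, 1] + C[2, 2] + 2 * (C[0, 1] + C[0, 2] + C[1, 2])) / 9
      G_voigt = (C[0, 0] + C[1, 1] + C[2, 2] - (C[0, 1] + C[0, 2] + C[1, 2])
                 + 3 * (C[3, 3] + C[4, 4] + C[5, 5])) / 15
      K_reuss = 1 / (S[0, 0] + S[1, 1] + S[2, 2] + 2 * (S[0, 1] + S[0, 2] + S[1, 2]))
      G_reuss = 15 / (4 * (S[0, 0] + S[1, 1] + S[2, 2]) - 4 * (S[0, 1] + S[0, 2] + S[1, 2])
                      + 3 * (S[3, 3] + S[4, 4] + S[5, 5]))

      K_hill, G_hill = (K_voigt + K_reuss) / 2, (G_voigt + G_reuss) / 2
      E_hill = 9 * K_hill * G_hill / (3 * K_hill + G_hill)                # Young's modulus
      nu_hill = (3 * K_hill - 2 * G_hill) / (2 * (3 * K_hill + G_hill))   # Poisson's ratio
      print(round(K_hill, 1), round(G_hill, 1), round(E_hill, 1), round(nu_hill, 3))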

  18. Elastic properties of rigid fiber-reinforced composites

    NASA Astrophysics Data System (ADS)

    Chen, J.; Thorpe, M. F.; Davis, L. C.

    1995-05-01

    We study the elastic properties of rigid fiber-reinforced composites with perfect bonding between fibers and matrix, and also with sliding boundary conditions. In the dilute region, there exists an exact analytical solution. Around the rigidity threshold we find the elastic moduli and Poisson's ratio by decomposing the deformation into a compression mode and a rotation mode. For perfect bonding, both modes are important, whereas only the compression mode is operative for sliding boundary conditions. We employ the digital-image-based method and a finite element analysis to perform computer simulations which confirm our analytical predictions.

  19. Socio-ecological costs of Amazon nut and timber production at community household forests in the Bolivian Amazon.

    PubMed

    Soriano, Marlene; Mohren, Frits; Ascarrunz, Nataly; Dressler, Wolfram; Peña-Claros, Marielos

    2017-01-01

    The Bolivian Amazon holds a complex configuration of people and forested landscapes in which communities hold secure tenure rights over a rich ecosystem offering a range of livelihood income opportunities. A large share of this income is derived from Amazon nut (Bertholletia excelsa). Many communities also have long-standing experience with community timber management plans. However, livelihood needs and desires for better living conditions may continue to place these resources under considerable stress as income needs and opportunities intensify and diversify. We aim to identify the socioeconomic and biophysical factors determining the income from forests, husbandry, off-farm and two keystone forest products (i.e., Amazon nut and timber) in the Bolivian Amazon region. We used structural equation modelling tools to account for the complex inter-relationships between socioeconomic and biophysical factors in predicting each source of income. The potential exists to increase incomes from existing livelihood activities in ways that reduce dependency upon forest resources. For example, changes in off-farm income sources can act to increase or decrease forest incomes. Market accessibility, social, financial, and natural and physical assets determined the amount of income community households could derive from Amazon nut and timber. Factors related to community households' local ecological knowledge, such as the number of non-timber forest products harvested and the number of management practices applied to enhance Amazon nut production, defined the amount of income these households could derive from Amazon nut and timber, respectively. The (inter) relationships found among socioeconomic and biophysical factors over income shed light on ways to improve forest-dependent livelihoods in the Bolivian Amazon. We believe that our analysis could be applicable to other contexts throughout the tropics as well.

  20. Impact of the biomass burning on methane variability during dry years in the Amazon measured from an aircraft and the AIRS sensor.

    PubMed

    Ribeiro, Igor Oliveira; Andreoli, Rita Valéria; Kayano, Mary Toshie; de Sousa, Thaiane Rodrigues; Medeiros, Adan Sady; Guimarães, Patrícia Costa; Barbosa, Cybelli G G; Godoi, Ricardo H M; Martin, Scot T; de Souza, Rodrigo Augusto Ferreira

    2018-05-15

    The present study examines the spatiotemporal variability and interrelations of atmospheric methane (CH4), carbon monoxide (CO) and biomass burning (BB) outbreaks retrieved from satellite data over the Amazon region during the 2003-2012 period. In the climatological context, we found consistent seasonal cycles of BB outbreaks and CO in the Amazon, both variables showing a peak during the dry season. The dominant CO variability mode features the largest positive loadings in the southern Amazon, and describes the interannual CO variations related to BB outbreaks along the deforestation arc during the dry season. In line with CO variability and BB outbreaks, the results show strong correspondence with the spatiotemporal variability of CH4 in the southern Amazon during years of intense drought. Indeed, the areas with the largest positive CH4 anomalies in the southern Amazon overlap the areas with high BB outbreaks and positive CO anomalies. The analyses also showed that high (low) BB outbreaks in the southern Amazon occur during dry (wet) years. In consequence, the interannual climate variability modulates the BB outbreaks in the southern Amazon, which in turn have considerable impacts on CO and CH4 interannual variability in the region. Therefore, the BB outbreaks might play a major role in modulating the CH4 and CO variations, at least in the southern Amazon. This study also provides a comparison between satellite and aircraft measurements of CH4 over the southern Amazon, which indicates relatively small differences from the aircraft measurements in the lower troposphere, with errors ranging from 0.18% to 1.76%. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Multispecies Fisheries in the Lower Amazon River and Its Relationship with the Regional and Global Climate Variability

    PubMed Central

    Buss de Souza, Ronald; Freire, Juan; Isaac, Victoria Judith

    2016-01-01

    This paper aims to describe the spatial-temporal variability in catch of the main fishery resources of the Amazon River and floodplain lakes of the Lower Amazon, as well as to relate the Catch per Unit of Effort to anomalies of some of the Amazon River, atmosphere and Atlantic Ocean system variables, determining the influence of the environment on the Amazonian fishery resources. Finfish landings data from the towns and villages of the Lower Amazon for the fisheries of three sites (Óbidos, Santarém and Monte Alegre) were obtained for the period between January 1993 and December 2004. Analysis of variance, detrended correspondence analysis, redundancy analysis and multiple regression techniques were used for the statistical analysis of the distinct time series. Fisheries production in the Lower Amazon differs between the Amazon River and the floodplain lakes; production in the Amazon River is approximately half that of the floodplain lakes. This variability occurs both along the Lower Amazon River region (longitudinal gradient) and laterally (latitudinal gradient) for every fishing ground studied here. The distinct environmental variables, alone or in association, act differently on the fishery stocks and the success of catches in each fishery group studied here. Important variables are the flooding events, the soil and sea surface temperatures, the humidity, the wind, and the occurrence of El Niño-Southern Oscillation events. Fishery productivity presents a large difference in quantity and distribution patterns between the river and the floodplain lakes. This variability occurs in the region of the Lower Amazon as well as laterally for each fishery group studied, being dependent on the ecological characteristics and life strategies of each fish group considered here. PMID:27314951

  2. Socio-ecological costs of Amazon nut and timber production at community household forests in the Bolivian Amazon

    PubMed Central

    Mohren, Frits; Ascarrunz, Nataly; Dressler, Wolfram; Peña-Claros, Marielos

    2017-01-01

    The Bolivian Amazon holds a complex configuration of people and forested landscapes in which communities hold secure tenure rights over a rich ecosystem offering a range of livelihood income opportunities. A large share of this income is derived from Amazon nut (Bertholletia excelsa). Many communities also have long-standing experience with community timber management plans. However, livelihood needs and desires for better living conditions may continue to place these resources under considerable stress as income needs and opportunities intensify and diversify. We aim to identify the socioeconomic and biophysical factors determining the income from forests, husbandry, off-farm and two keystone forest products (i.e., Amazon nut and timber) in the Bolivian Amazon region. We used structural equation modelling tools to account for the complex inter-relationships between socioeconomic and biophysical factors in predicting each source of income. The potential exists to increase incomes from existing livelihood activities in ways that reduce dependency upon forest resources. For example, changes in off-farm income sources can act to increase or decrease forest incomes. Market accessibility, social, financial, and natural and physical assets determined the amount of income community households could derive from Amazon nut and timber. Factors related to community households’ local ecological knowledge, such as the number of non-timber forest products harvested and the number of management practices applied to enhance Amazon nut production, defined the amount of income these households could derive from Amazon nut and timber, respectively. The (inter) relationships found among socioeconomic and biophysical factors over income shed light on ways to improve forest-dependent livelihoods in the Bolivian Amazon. We believe that our analysis could be applicable to other contexts throughout the tropics as well. PMID:28235090

  3. East of the Andes: The genetic profile of the Peruvian Amazon populations.

    PubMed

    Di Corcia, T; Sanchez Mellado, C; Davila Francia, T J; Ferri, G; Sarno, S; Luiselli, D; Rickards, O

    2017-06-01

    Assuming that the differences between the Andes and the Amazon rainforest at environmental and historical levels have influenced the distribution patterns of genes, languages, and cultures, the maternal and paternal genetic reconstruction of the Peruvian Amazon populations was used to test the relationships within and between these two extreme environments. We analyzed four Peruvian Amazon communities (Ashaninka, Huambisa, Cashibo, and Shipibo) for both Y chromosome (17 STRs and 8 SNPs) and mtDNA data (control region sequences, two diagnostic sites of the coding region, and one INDEL), and we studied their variability against the rest of South America. We detected a high degree of genetic diversity in the Peruvian Amazon people, both for mtDNA and for the Y chromosome, except for the Cashibo, who seem to have had no exchanges with their neighbors, in contrast with the other communities. The genetic structure follows the divide between the Andes and the Amazon, but we found a certain degree of gene flow between these two environments, as emerged in particular from the analysis of Y chromosome descent clusters (DCs). The Peruvian Amazon is home to an array of populations with differential rates of genetic exchange with their neighbors and with the Andean people, depending on their particular demographic histories. We highlight some successful Y chromosome lineage expansions that originated in Peru during pre-Columbian history and involved both Andean and Amazon Arawak people, showing that at least a part of the Amazon rainforest did not remain isolated from those exchanges. © 2017 Wiley Periodicals, Inc.

  4. Amazon Forest Responses to Drought and Fire

    NASA Astrophysics Data System (ADS)

    Morton, D. C.

    2015-12-01

    Deforestation and agricultural land uses provide a consistent source of ignitions along the Amazon frontier during the dry season. The risk of understory fires in Amazon forests is amplified by drought conditions, when fires at the forest edge may spread for weeks before rains begin. Fire activity also impacts the regional response of intact forests to drought through diffuse light effects and nutrient redistribution, highlighting the complexity of feedbacks in this coupled human and natural system. This talk will focus on recent advances in our understanding of fire-climate feedbacks in the Amazon, building on research themes initiated under NASA's Large-scale Biosphere-Atmosphere Experiment in Amazonia (LBA). NASA's LBA program began in the wake of the 1997-1998 El Niño, a strong event that exposed the vulnerability of Amazon forests to drought and fire under current climate and projections of climate change. With forecasts of another strong El Niño event in 2015-2016, this talk will provide a multi-scale synthesis of Amazon forest responses to drought and fire based on field measurements, airborne lidar data, and satellite observations of fires, rainfall, and terrestrial water storage. These studies offer new insights into the mechanisms governing fire season severity in the southern Amazon and regional variability in carbon losses from understory fires. The contributions from remote sensing to our understanding of drought and fire in Amazon forests reflect the legacy of NASA's LBA program and the sustained commitment to interdisciplinary research across the Amazon region.

  5. Quantitative Investigation of the Technologies That Support Cloud Computing

    ERIC Educational Resources Information Center

    Hu, Wenjin

    2014-01-01

    Cloud computing is dramatically shaping modern IT infrastructure. It virtualizes computing resources, provides elastic scalability, serves as a pay-as-you-use utility, simplifies the IT administrators' daily tasks, enhances the mobility and collaboration of data, and increases user productivity. We focus on providing generalized black-box…

  6. Are Cloud Environments Ready for Scientific Applications?

    NASA Astrophysics Data System (ADS)

    Mehrotra, P.; Shackleford, K.

    2011-12-01

    Cloud computing environments are becoming widely available both in the commercial and government sectors. They provide flexibility to rapidly provision resources in order to meet dynamic and changing computational needs without the customers incurring capital expenses and/or requiring technical expertise. Clouds also provide reliable access to resources even though the end-user may not have in-house expertise for acquiring or operating such resources. Consolidation and pooling in a cloud environment allow organizations to achieve economies of scale in provisioning or procuring computing resources and services. Because of these and other benefits, many businesses and organizations are migrating their business applications (e.g., websites, social media, and business processes) to cloud environments-evidenced by the commercial success of offerings such as the Amazon EC2. In this paper, we focus on the feasibility of utilizing cloud environments for scientific workloads and workflows particularly of interest to NASA scientists and engineers. There is a wide spectrum of such technical computations. These applications range from small workstation-level computations to mid-range computing requiring small clusters to high-performance simulations requiring supercomputing systems with high bandwidth/low latency interconnects. Data-centric applications manage and manipulate large data sets such as satellite observational data and/or data previously produced by high-fidelity modeling and simulation computations. Most of the applications are run in batch mode with static resource requirements. However, there do exist situations that have dynamic demands, particularly ones with public-facing interfaces providing information to the general public, collaborators and partners, as well as to internal NASA users. In the last few months we have been studying the suitability of cloud environments for NASA's technical and scientific workloads. We have ported several applications to multiple cloud environments including NASA's Nebula environment, Amazon's EC2, Magellan at NERSC, and SGI's Cyclone system. We critically examined the performance of the applications on these systems. We also collected information on the usability of these cloud environments. In this talk we will present the results of our study focusing on the efficacy of using clouds for NASA's scientific applications.

  7. Ab initio elastic tensor of cubic Ti0.5Al0.5N alloys: Dependence of elastic constants on size and shape of the supercell model and their convergence

    NASA Astrophysics Data System (ADS)

    Tasnádi, Ferenc; Odén, M.; Abrikosov, Igor A.

    2012-04-01

    In this study we discuss the performance of the special quasirandom structure (SQS) method in predicting the elastic properties of B1 (rocksalt) Ti0.5Al0.5N alloy. We use a symmetry-based projection technique, which gives the closest cubic approximate of the elastic tensor and allows us to align the SQSs of different shapes and sizes for a comparison in modeling elastic tensors. We show that the derived closest cubic approximate of the elastic tensor converges faster with respect to SQS size than the elastic tensor itself. That establishes a less demanding computational strategy to achieve convergence for the elastic constants. We determine the cubic elastic constants (Cij) and Zener's type elastic anisotropy (A) of Ti0.5Al0.5N. Optimal supercells, which capture accurately both the configurational disorder and cubic symmetry of elastic tensor, result in C11=447 GPa, C12=158 GPa, and C44=203 GPa with 3% of error and A=1.40 with 6% of error. In addition, we establish the general importance of selecting proper SQS with symmetry arguments to reliably model elasticity of alloys. We suggest the calculation of nine elastic tensor elements: C11, C22, C33, C12, C13, C23, C44, C55, and C66, to analyze the performance of SQSs and predict elastic constants of cubic alloys. The described methodology is general enough to be extended for alloys with other symmetry at arbitrary composition.
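
    The projection described above can be illustrated with a short sketch. Assuming the nine tensor elements have already been extracted from an SQS calculation, one simple way to form a closest cubic approximate is to average the symmetry-equivalent components and then evaluate the Zener anisotropy; the averaging below is a minimal stand-in for the authors' symmetry-based projection, and the numbers are the cubic constants quoted in the abstract.

```python
import numpy as np

# Nine elastic tensor elements of an SQS supercell (GPa); the values here are
# illustrative, taken from the cubic constants quoted in the abstract.
C11, C22, C33 = 447.0, 447.0, 447.0
C12, C13, C23 = 158.0, 158.0, 158.0
C44, C55, C66 = 203.0, 203.0, 203.0

# Closest cubic approximate: average the symmetry-equivalent components.
c11 = np.mean([C11, C22, C33])
c12 = np.mean([C12, C13, C23])
c44 = np.mean([C44, C55, C66])

# Zener elastic anisotropy of the cubic approximate.
A = 2.0 * c44 / (c11 - c12)

print(f"cubic C11={c11:.0f} GPa, C12={c12:.0f} GPa, C44={c44:.0f} GPa, A={A:.2f}")
```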

  8. Morphology and growth pattern of Amazon deep-sea fan: a computer-processed GLORIA side-scan mosaic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flood, R.D.; Damuth, J.E.

    1984-04-01

Deep-sea fans have become increasingly important targets for exploration because of their favorable facies associations. A better understanding of deep-sea fans is needed to successfully exploit these complex sediment bodies. Recent studies of the Amazon fan, using long-range side-scan sonar (GLORIA) and single-channel seismic data, provide an overall view of channel patterns of this fan and demonstrate the relationship between successive channel/levee systems. The digitally collected GLORIA data have been computer processed to produce a mosaic of the fan. Computer processing has corrected the records for slant range and ship navigation, and targets have been enhanced. Many features of the modern fan system are readily apparent on the sonar mosaic. The 1.5 to 0.5-km (5000 to 1600-ft) wide channels meander intensely across the fan with sinuosities up to 2.5. Because of these meanders, the channel gradients decrease regularly across the fan despite changes in regional slope. Other channel-related targets include cutoff meanders, overbank deposits (especially small debris flows), and channel branchings. Other debris flows cover large areas of the fan and override channel/levee systems. Air-gun records show that this fan is built of a series of channel/levee systems that overlay one another. Channels from at least 6 of these systems are visible at the surface now, but apparently only one channel at a time has been active. The length of time needed to build a single channel/levee system is not known, but it appears to be rapid.

  9. A Proposed Method for the Computer-aided Discovery and Design of High-strength, Ductile Metals

    NASA Astrophysics Data System (ADS)

    Winter, Ian Stewart

Gum Metal, a class of Ti-Nb alloys, has generated a great deal of interest in the metallurgical community since its development in 2003. These alloys display numerous novel and anomalous properties, many of which only occur after severe plastic deformation has been imposed on the material. Such properties include super-elasticity, super cold-workability, Invar and Elinvar behavior, high ductility, as well as high strength. The high strength of gum metal has generated particular enthusiasm as it is on the order of the predicted ideal strength of the material. Many of the properties of gum metal appear to be a direct result of tuning the composition to be near an elastic instability, resulting in a high degree of elastic anisotropy. This presents an opportunity for the computer-aided discovery and design of structural materials, as the ideal strength and elastic anisotropy can be approximated from the elastic constants. Two approaches are described for searching for this high anisotropy. In the first, the possibility of forming gum metal in Mg is explored by tuning the material to be near the BCC-HCP transition either by pressure or by alloying with Li. The second makes use of the Materials Project's elastic constants database, which contains thousands of ordered compounds, in order to screen for gum metal candidates. By defining an elastic anisotropy parameter consistent with the behavior of gum metal and calculating it for all cubic materials in the elastic constants database, several gum metal candidates are found. In order to better assess their candidacy, information on the intrinsic ductility of these materials is necessary. A method is proposed for calculating the ideal strength and deformation mode of a solid solution from first principles. In order to validate this method, the intrinsic ductile-to-brittle transition composition of Ti-V systems is calculated. It is further shown that this method can be applied to the calculation of an ideal tensile yield surface.

  10. Fully nonlocal inelastic scattering computations for spectroscopical transmission electron microscopy methods

    NASA Astrophysics Data System (ADS)

    Rusz, Ján; Lubk, Axel; Spiegelberg, Jakob; Tyutyunnikov, Dmitry

    2017-12-01

    The complex interplay of elastic and inelastic scattering amenable to different levels of approximation constitutes the major challenge for the computation and hence interpretation of TEM-based spectroscopical methods. The two major approaches to calculate inelastic scattering cross sections of fast electrons on crystals—Yoshioka-equations-based forward propagation and the reciprocal wave method—are founded in two conceptually differing schemes—a numerical forward integration of each inelastically scattered wave function, yielding the exit density matrix, and a computation of inelastic scattering matrix elements using elastically scattered initial and final states (double channeling). Here, we compare both approaches and show that the latter is computationally competitive to the former by exploiting analytical integration schemes over multiple excited states. Moreover, we show how to include full nonlocality of the inelastic scattering event, neglected in the forward propagation approaches, at no additional computing costs in the reciprocal wave method. Detailed simulations show in some cases significant errors due to the z -locality approximation and hence pitfalls in the interpretation of spectroscopical TEM results.

  11. Direct measurement of 3D elastic anisotropy on rocks from the Ivrea zone (Southern Alps, NW Italy)

    NASA Astrophysics Data System (ADS)

    Pros, Z.; Lokajíček, T.; Přikryl, R.; Klíma, K.

    2003-07-01

Lower crustal and upper mantle rocks exposed at the earth's surface offer a direct opportunity to measure physical properties that must otherwise be interpreted using indirect methods. The results of such direct measurements can then be used to correct models based on the indirect data. Elastic properties are among the most important parameters studied in geophysics and are employed in many fields of earth sciences. In the laboratory, dynamic elastic properties are commonly tested in three mutually perpendicular directions. The spatial distribution of P- and S-wave velocities is then computed using textural data, modal composition, density and elastic constants. During such computation, it is virtually impossible to include all microfabric parameters such as different types of microcracking, micropores, mineral alteration or the quality of grain boundaries. In this study, complete 3D ultrasonic transmission through spherical samples in 132 independent directions, at several levels of confining pressure up to 400 MPa, has been employed to study selected mafic and ultrabasic rocks sampled in and near the Balmuccia ultrabasic massif (Ivrea zone, Southern Alps, NW Italy). This method revealed a large directional variance of maximum P-wave velocity and different symmetries (orthorhombic vs. transversely isotropic) of the 3D elastic wave distribution that had not been recorded on these rocks before. Moreover, one dunite sample exhibits a P-wave velocity approaching that of an olivine single crystal, interpreted as the influence of crystallographic preferred orientation (CPO).

  12. Security of the Brazilian Amazon Area

    DTIC Science & Technology

    1992-04-01

effect in Amazonia". Brazil’s Institute for Space Research. São Paulo, April 1991: 5-6. Thompson, Dick. "A Global Agenda for the Amazon." Time, 18...to be overcome as Brazil pursues settlement and development of the Amazon. The natural ecologic systems of the Amazon must be defended with...agricultural techniques appropriate to the region and developed within the context of a comprehensive, responsible program that meets Brazil’s needs for

  13. Cloud characteristics, thermodynamic controls and radiative impacts during the Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giangrande, Scott E.; Feng, Zhe; Jensen, Michael P.

Routine cloud, precipitation and thermodynamic observations collected by the Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF) and Aerial Facility (AAF) during the 2-year US Department of Energy (DOE) ARM Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) campaign are summarized. These observations quantify the diurnal to large-scale thermodynamic regime controls on the clouds and precipitation over the undersampled, climatically important Amazon basin region. The extended ground deployment of cloud-profiling instrumentation enabled a unique look at multiple cloud regimes at high temporal and vertical resolution. This longer-term ground deployment, coupled with two short-term aircraft intensive observing periods, allowed new opportunities to better characterize cloud and thermodynamic observational constraints as well as cloud radiative impacts for modeling efforts within typical Amazon wet and dry seasons.

  14. An explicit GIS-based river basin framework for aquatic ecosystem conservation in the Amazon

    NASA Astrophysics Data System (ADS)

    Venticinque, Eduardo; Forsberg, Bruce; Barthem, Ronaldo; Petry, Paulo; Hess, Laura; Mercado, Armando; Cañas, Carlos; Montoya, Mariana; Durigan, Carlos; Goulding, Michael

    2016-11-01

    Despite large-scale infrastructure development, deforestation, mining and petroleum exploration in the Amazon Basin, relatively little attention has been paid to the management scale required for the protection of wetlands, fisheries and other aspects of aquatic ecosystems. This is due, in part, to the enormous size, multinational composition and interconnected nature of the Amazon River system, as well as to the absence of an adequate spatial model for integrating data across the entire Amazon Basin. In this data article we present a spatially uniform multi-scale GIS framework that was developed especially for the analysis, management and monitoring of various aspects of aquatic systems in the Amazon Basin. The Amazon GIS-Based River Basin Framework is accessible as an ESRI geodatabase at doi:10.5063/F1BG2KX8.

  15. Deforestation effects on Amazon forest resilience

    NASA Astrophysics Data System (ADS)

    Zemp, D. C.; Schleussner, C.-F.; Barbosa, H. M. J.; Rammig, A.

    2017-06-01

Through vegetation-atmosphere feedbacks, rainfall reductions as a result of Amazon deforestation could reduce the resilience of the remaining forest to perturbations and potentially lead to large-scale Amazon forest loss. We track observation-based water fluxes from sources (evapotranspiration) to sinks (rainfall) to assess the effect of deforestation on continental rainfall. By studying 21st century deforestation scenarios, we show that deforestation can reduce dry season rainfall by up to 20% far from the deforested area, namely, over the western Amazon basin and the La Plata basin. As a consequence, forest resilience is systematically eroded in the southwestern region covering a quarter of the current Amazon forest. Our findings suggest that the climatological effects of deforestation can lead to permanent forest loss in this region. We identify hot spot regions where forest loss should be avoided to maintain the ecological integrity of the Amazon forest.

  16. Cloud characteristics, thermodynamic controls and radiative impacts during the Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) experiment

    DOE PAGES

    Giangrande, Scott E.; Feng, Zhe; Jensen, Michael P.; ...

    2017-12-06

Routine cloud, precipitation and thermodynamic observations collected by the Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF) and Aerial Facility (AAF) during the 2-year US Department of Energy (DOE) ARM Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) campaign are summarized. These observations quantify the diurnal to large-scale thermodynamic regime controls on the clouds and precipitation over the undersampled, climatically important Amazon basin region. The extended ground deployment of cloud-profiling instrumentation enabled a unique look at multiple cloud regimes at high temporal and vertical resolution. This longer-term ground deployment, coupled with two short-term aircraft intensive observing periods, allowed new opportunities to better characterize cloud and thermodynamic observational constraints as well as cloud radiative impacts for modeling efforts within typical Amazon wet and dry seasons.

  17. Towards an Aero-Propulso-Servo-Elasticity Analysis of a Commercial Supersonic Transport

    NASA Technical Reports Server (NTRS)

    Connolly, Joseph W.; Kopasakis, George; Chwalowski, Pawel; Sanetrik, Mark D.; Carlson, Jan-Renee; Silva, Walt A.; McNamara, Jack

    2016-01-01

This paper covers the development of an aero-propulso-servo-elastic (APSE) model using computational fluid dynamics (CFD) and linear structural deformations. The APSE model provides the integration of the following two previously developed nonlinear dynamic simulations: a variable cycle turbofan engine and an elastic supersonic commercial transport vehicle. The primary focus of this study is to provide a means to include relevant dynamics of a turbomachinery propulsion system into the aeroelastic studies conducted during a vehicle design, which have historically neglected propulsion effects. A high fidelity CFD tool is used here for the integration platform. The elastic vehicle neglecting the propulsion system serves as a comparison of traditional approaches to the APSE results. An overview of the methodology is presented for integrating the propulsion system and elastic vehicle. Static aeroelastic analysis comparisons between the traditional and developed APSE models for wing tip deflection indicate that the propulsion system impact on the vehicle elastic response could increase the deflection by approximately ten percent.

  18. Finite-Temperature Behavior of PdH x Elastic Constants Computed by Direct Molecular Dynamics

    DOE PAGES

    Zhou, X. W.; Heo, T. W.; Wood, B. C.; ...

    2017-05-30

In this paper, robust time-averaged molecular dynamics has been developed to calculate finite-temperature elastic constants of a single crystal. We find that when the averaging time exceeds a certain threshold, the statistical errors in the calculated elastic constants become very small. We applied this method to compare the elastic constants of Pd and PdH0.6 at representative low (10 K) and high (500 K) temperatures. The values predicted for Pd match reasonably well with ultrasonic experimental data at both temperatures. In contrast, the predicted elastic constants for PdH0.6 only match well with ultrasonic data at 10 K; whereas, at 500 K, the predicted values are significantly lower. We hypothesize that at 500 K, the facile hydrogen diffusion in PdH0.6 alters the speed of sound, resulting in significantly reduced values of predicted elastic constants as compared to the ultrasonic experimental data. Finally, literature mechanical testing experiments seem to support this hypothesis.
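
    The time-averaging idea can be sketched generically: estimate an elastic constant from the difference between the time-averaged stress of a slightly strained cell and that of the unstrained reference cell. The function name, the synthetic stress series, and the numerical values below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def elastic_constant_from_stress(stress_strained, stress_reference, strain):
    """Estimate an elastic constant (GPa) from time-averaged MD stresses.

    stress_strained, stress_reference: instantaneous stress component (GPa)
    sampled along MD runs of the strained and unstrained cells.
    strain: small applied strain (dimensionless), e.g. 0.005.
    """
    d_sigma = np.mean(stress_strained) - np.mean(stress_reference)
    return d_sigma / strain

# Example with synthetic, noisy stress time series: the estimate stabilizes
# once the averaging window is long enough to beat thermal fluctuations.
rng = np.random.default_rng(0)
true_c11, eps = 220.0, 0.005            # GPa, applied strain (hypothetical)
sigma0 = rng.normal(0.0, 0.5, 200000)   # unstrained-cell stress fluctuations
sigma1 = rng.normal(true_c11 * eps, 0.5, 200000)
print(elastic_constant_from_stress(sigma1, sigma0, eps))
```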

  19. The performance of low-cost commercial cloud computing as an alternative in computational chemistry.

    PubMed

    Thackston, Russell; Fortenberry, Ryan C

    2015-05-05

The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon Web Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical, "in-house" physical machines. Further considerations are made for the number of cores or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best utilized by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services. © 2015 Wiley Periodicals, Inc.
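
    A per-job cost comparison of the kind described reduces to simple arithmetic: cost equals hourly rate times wall time. The instance names, rates, and timings below are placeholders, not figures from the paper or current AWS pricing.

```python
# Hypothetical instance types, on-demand hourly rates (USD), and measured
# wall times (hours) for one single-point energy job; all numbers are
# placeholders used only to illustrate the cost comparison.
instances = {
    "small_vcpu2":  {"rate": 0.10, "hours": 6.0},
    "medium_vcpu8": {"rate": 0.40, "hours": 1.8},
    "large_vcpu16": {"rate": 0.80, "hours": 1.1},
}

for name, d in instances.items():
    cost = d["rate"] * d["hours"]
    print(f"{name}: ${cost:.2f} per job, {d['hours']:.1f} h wall time")
```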

  20. Study of the elastic behavior of synthetic lightweight aggregates (SLAs)

    NASA Astrophysics Data System (ADS)

    Jin, Na

Synthetic lightweight aggregates (SLAs), composed of coal fly ash and recycled plastics, represent a resilient construction material that could be a key aspect of future sustainable development. This research focuses on predicting the elastic modulus of SLA, assumed to be a homogeneous and isotropic composite of high carbon fly ash (HCFA) particulates in a matrix of plastics (HDPE, LDPE, PS and mixtures of plastics), with emphasis on SLAs made of HCFA and PS. The elastic moduli of SLA with variable fly ash volume fractions are predicted from finite element analyses (FEA) performed using the computer programs ABAQUS and PLAXIS. The effects of interface friction (roughness) between phases and of other computational parameters, e.g., loading strain, component stiffness, element type and boundary conditions, are included in these analyses. Analytical models and laboratory tests provide a baseline for comparison. Overall, results indicate that ABAQUS generates elastic moduli closer to those predicted by well-established analytical models than the moduli predicted by PLAXIS, especially for SLAs with lower fly ash content. In addition, increases in roughness and loading strain indicated an increase in SLA stiffness, especially as fly ash content increases. The elastic moduli obtained from unconfined compression were generally lower than those obtained from the analytical and ABAQUS 3D predictions. This may be caused by the possible existence of pre-failure surfaces in the specimens and by direct interaction between HCFA particles. Recommendations for future work include laboratory measurements of SLA moduli and FEM modeling that considers various sizes and random distributions of HCFA particles in SLAs.

  1. Macroscopic elastic properties of textured ZrN-AlN polycrystalline aggregates: From ab initio calculations to grain-scale interactions

    NASA Astrophysics Data System (ADS)

    Holec, D.; Tasnádi, F.; Wagner, P.; Friák, M.; Neugebauer, J.; Mayrhofer, P. H.; Keckes, J.

    2014-11-01

Despite the fast development of computational material modeling, the theoretical description of macroscopic elastic properties of textured polycrystalline aggregates starting from basic principles remains a challenging task. In this study we use a supercell-based approach to obtain the elastic properties of a random solid solution cubic Zr1-xAlxN system as a function of the metallic sublattice composition and texture descriptors. The employed special quasirandom structures are optimized not only with respect to short-range-order parameters, but also to make the three cubic directions [100], [010], and [001] as similar as possible. In this way, only a small spread of elastic constant tensor components is achieved and an optimum trade-off between modeling of chemical disorder and computational limits regarding the supercell size and calculation time is proposed. The single-crystal elastic constants are shown to vary smoothly with composition, yielding, at x ≈ 0.5, an alloy constitution with an almost isotropic response. Consequently, polycrystals with this composition are suggested to have a Young's modulus independent of the actual microstructure. This is indeed confirmed by explicit calculations of polycrystal elastic properties, both within the isotropic aggregate limit and with fiber textures of various orientations and sharpness. It turns out that for low AlN mole fractions, the spread of the possible Young's modulus data caused by the texture variation can be larger than 100 GPa. Consequently, our discussion of the Young's modulus data of cubic Zr1-xAlxN also contains an evaluation of the texture typical of thin films.
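
    For the isotropic aggregate limit mentioned above, one standard route from single-crystal cubic constants to a polycrystalline Young's modulus is the Voigt-Reuss-Hill average; the sketch below uses that textbook scheme with hypothetical constants and does not reproduce the texture-dependent calculations of the paper.

```python
def vrh_young_modulus_cubic(c11, c12, c44):
    """Voigt-Reuss-Hill estimate of the isotropic polycrystal Young's
    modulus (same units as the inputs) from cubic single-crystal constants."""
    bulk = (c11 + 2.0 * c12) / 3.0                  # identical in Voigt and Reuss
    g_voigt = (c11 - c12 + 3.0 * c44) / 5.0
    g_reuss = 5.0 * (c11 - c12) * c44 / (4.0 * c44 + 3.0 * (c11 - c12))
    g = 0.5 * (g_voigt + g_reuss)                   # Hill average of the shear modulus
    return 9.0 * bulk * g / (3.0 * bulk + g)

# Illustrative call with hypothetical cubic constants in GPa.
print(vrh_young_modulus_cubic(450.0, 160.0, 170.0))
```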

  2. Computational upscaling of Drucker-Prager plasticity from micro-CT images of synthetic porous rock

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Sarout, Joel; Zhang, Minchao; Dautriat, Jeremie; Veveakis, Emmanouil; Regenauer-Lieb, Klaus

    2018-01-01

Quantifying rock physical properties is essential for the mining and petroleum industry. Microtomography provides a new way to quantify the relationship between the microstructure and the mechanical and transport properties of a rock. Studies reporting the use of microtomographic images to derive the permeability and elastic moduli of rocks are common; only rarely have studies been devoted to yield and failure parameters using this technique. In this study, we simulate the macroscale plastic properties of a synthetic sandstone sample made of calcite-cemented quartz grains using the microscale information obtained from microtomography. The computations rely on the concept of representative volume elements (RVEs). The mechanical RVE is determined using the upper and lower bounds of finite-element computations for elasticity. We present computational upscaling methods from microphysical processes to extract the plasticity parameters of the RVE and compare results to experimental data. The yield stress, cohesion and internal friction angle of the matrix (solid part) of the rock were obtained with reasonable accuracy. Computations of plasticity for a series of models of different volume sizes showed almost overlapping stress-strain curves, suggesting that the mechanical RVE determined by elastic computations is also valid for plastic yielding. Furthermore, a series of models was created by self-similarly inflating/deflating the porous models, that is, keeping a similar structure while achieving different porosity values. The analysis of these models showed that yield stress, cohesion and internal friction angle linearly decrease with increasing porosity in the porosity range between 8 and 28 per cent. The internal friction angle decreases the most significantly, while cohesion remains stable.

  3. Large-Scale, Parallel, Multi-Sensor Atmospheric Data Fusion Using Cloud Computing

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E.

    2013-05-01

NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration analyses of important climate variables presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over 10 years of data. To efficiently assemble such datasets, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. However, these are data-intensive computing problems, so data transfer times and storage costs (for caching) are key issues. SciReduce is a Hadoop-like parallel analysis system, programmed in parallel python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in the Cloud. Unlike Hadoop, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Figure 1 shows the architecture of the full computational system, with SciReduce at the core. Multi-year datasets are automatically "sharded" by time and space across a cluster of nodes so that years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the cached input and intermediate datasets. We are using SciReduce to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a NASA MEASURES grant. We will present the architecture of SciReduce, describe the achieved "clock time" speedups in fusing datasets on our own nodes and in the Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer. We will also present a concept/prototype for staging NASA's A-Train Atmospheric datasets (Levels 2 & 3) in the Amazon Cloud so that any number of compute jobs can be executed "near" the multi-sensor data. Given such a system, multi-sensor climate studies over 10-20 years of data could be performed in an efficient way, with the researcher paying only his own Cloud compute bill.

  4. Research on Key Technologies of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Zhang, Shufen; Yan, Hongcan; Chen, Xuebin

With the development of multi-core processors, virtualization, distributed storage, broadband Internet and automatic management, a new type of computing mode named cloud computing has emerged. It distributes computation tasks over a resource pool consisting of massive numbers of computers, so that application systems can obtain computing power, storage space and software services according to their demand. It concentrates all the computing resources and manages them automatically through software, without human intervention. This frees application providers from tedious details and lets them concentrate on their business, which is advantageous for innovation and reduces cost. The ultimate goal of cloud computing is to provide calculation, services and applications as a public facility, so that people can use computer resources just like water, electricity, gas and telephone service. Currently, the understanding of cloud computing is still developing and changing, and cloud computing has no unanimous definition yet. This paper describes the three main service forms of cloud computing, SaaS, PaaS and IaaS; compares the definitions of cloud computing given by Google, Amazon, IBM and other companies; summarizes the basic characteristics of cloud computing; and discusses key technologies such as data storage, data management, virtualization and the programming model.

  5. Determination of the Stresses Produced by the Landing Impact in the Bulkheads of a Seaplane Bottom

    NASA Technical Reports Server (NTRS)

    Darevsky, V. M.

    1944-01-01

    The present report deals with the determination of the impact stresses in the bulkhead floors of a seaplane bottom. The dynamic problem is solved on the assumption of a certain elastic system, the floor being assumed as a weightless elastic beam with concentrated masses at the ends (due to the mass of the float) and with a spring which replaces the elastic action of the keel in the center. The distributed load on the floor is that due to the hydrodynamic force acting over a certain portion of the bottom. The pressure distribution over the width of the float is assumed to follow the Wagner law. The formulas given for the maximum bending moment are derived on the assumption that the keel is relatively elastic, in which case it can be shown that at each instant of time the maximum bending moment is at the point of juncture of the floor with the keel. The bending moment at this point is a function of the half width of the wetted surface c and reaches its maximum value when c is approximately equal to b/2 where b is the half width of the float. In general, however, for computing the bending moment the values of the bending moment at the keel for certain values of c are determined and a curve is drawn. The illustrative sample computation gave for the stresses a result approximately equal to that obtained by the conventional factory computation.

  6. A Simple Experiment for Determining the Elastic Constant of a Fine Wire

    ERIC Educational Resources Information Center

    Freeman, W. Larry; Freda, Ronald F.

    2007-01-01

    Many general physics laboratories involve the use of springs to demonstrate Hooke's law, and much ado is made about how this can be used as a model for describing the elastic characteristics of materials at the molecular or atomic level. In recent years, the proliferation of computers, and appropriate sensors, have made it possible to demonstrate…

  7. Improved Optics For Quasi-Elastic Light Scattering

    NASA Technical Reports Server (NTRS)

    Cheung, Harry Michael

    1995-01-01

    Improved optical train devised for use in light-scattering measurements of quasi-elastic light scattering (QELS) and laser spectroscopy. Measurements performed on solutions, microemulsions, micellular solutions, and colloidal dispersions. Simultaneous measurements of total intensity and fluctuations in total intensity of light scattered from sample at various angles provides data used, in conjunction with diffusion coefficients, to compute sizes of particles in sample.

  8. Privacy authentication using key attribute-based encryption in mobile cloud computing

    NASA Astrophysics Data System (ADS)

    Mohan Kumar, M.; Vijayan, R.

    2017-11-01

Mobile cloud computing is becoming more popular nowadays as the number of smartphone users increases, so the security level of cloud computing has to be raised. Privacy authentication using key-attribute-based encryption helps users share data with an organization through the cloud in a secure manner, supporting business development. In privacy authentication, the sender of the data grants access to a chosen list of receivers; for all others, access is denied. In the sender application, the user chooses the file to be sent to the receivers, and the data is encrypted using key-attribute-based encryption with the AES algorithm. The resulting ciphertext is stored in the Amazon cloud along with the key value and the receiver list.
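
    Only the symmetric AES step of such a scheme is easy to sketch; the attribute-based key management is not reproduced here. The snippet below, written against the Python cryptography package's AES-GCM interface, simply encrypts a file so that only the ciphertext (plus nonce) would be uploaded to cloud storage; how the key is wrapped and shared with receivers is left out.

```python
# Minimal sketch of the AES step only: encrypt a file before uploading the
# ciphertext to cloud storage. The attribute-based key management described
# in the abstract is not reproduced here.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_file(path):
    key = AESGCM.generate_key(bit_length=256)   # would be wrapped/shared per receiver
    nonce = os.urandom(12)                      # unique per encryption
    plaintext = open(path, "rb").read()
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return key, nonce, ciphertext               # ciphertext + nonce go to the cloud store

def decrypt_file(key, nonce, ciphertext):
    return AESGCM(key).decrypt(nonce, ciphertext, None)
```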

  9. Constancy in the vegetation of the Amazon Basin during the late Pleistocene: Evidence from the organic matter composition of Amazon deep sea fan sediments

    NASA Astrophysics Data System (ADS)

    Kastner, Thomas P.; Goñi, Miguel A.

    2003-04-01

Analyses of more than 60 sediment samples from the Amazon deep sea fan show remarkably constant terrigenous biomarkers (lignin phenols and cutin acids) and stable carbon isotopic compositions of organic matter (δ13COM) deposited from 10 to 70 ka. Sediments from the nine Amazon deep sea fan channel-levee systems investigated in this study yielded relatively narrow ranges for diagnostic parameters such as organic carbon (OC) normalized total lignin yields (Λ = 3.1 ± 1.1 mg/100 mg OC), syringyl:vanillyl phenol ratios (S/V = 0.84 ± 0.06), cinnamyl:vanillyl phenol ratios (C/V = 0.08 ± 0.02), isomeric abundances of cutin-derived dihydroxyhexadecanoic acid (f10,16-OH = 0.65 ± 0.02), and δ13COM (-27.6 ± 0.6‰). Our measurements support the hypothesis that the vegetation of the Amazon Basin did not change significantly during the late Pleistocene, even during the Last Glacial Maximum. Moreover, the compositions obtained from the Amazon deep sea fan are similar to those of modern Amazon River suspended sediments. Such results strongly indicate that the current tropical rainforest vegetation has been a permanent and dominant feature of the Amazon River watershed over the past 70 k.y. Specifically, we found no evidence for the development of large savannas that had been previously postulated as indicators of increased glacial aridity in Amazonia. Climate models need to be modified to account for the uninterrupted input of moisture to the tropical Amazon region over the late Pleistocene-Holocene period.

  10. Hydrodynamic mobility of a sphere moving on the centerline of an elastic tube

    NASA Astrophysics Data System (ADS)

    Daddi-Moussa-Ider, Abdallah; Lisicki, Maciej; Gekle, Stephan

    2017-11-01

    Elastic channels are an important component of many soft matter systems, in which hydrodynamic interactions with confining membranes determine the behavior of particles in flow. In this work, we derive analytical expressions for Green's functions associated with a point-force (Stokeslet) directed parallel or perpendicular to the axis of an elastic cylindrical channel exhibiting resistance against shear and bending. We then compute the leading order self- and pair mobility functions of particles on the cylinder axis, finding that the mobilities are primarily determined by membrane shear and that bending does not play a significant role. In the quasi-steady limit of vanishing frequency, the particle self- and pair mobilities near a no-slip hard cylinder are recovered only if the membrane possesses a non-vanishing shear rigidity. We further compute the membrane deformation, finding that deformation is generally more pronounced in the axial (radial) directions, for the motion along (perpendicular to) the cylinder centerline, respectively. Our analytical calculations for Green's functions in an elastic cylinder can serve as a fundamental building block for future studies and are verified by fully resolved boundary integral simulations where very good agreement is obtained.

  11. Implementation of perfectly matched layers in an arbitrary geometrical boundary for elastic wave modelling

    NASA Astrophysics Data System (ADS)

    Gao, Hongwei; Zhang, Jianfeng

    2008-09-01

The perfectly matched layer (PML) absorbing boundary condition is incorporated into an irregular-grid elastic-wave modelling scheme, thus resulting in an irregular-grid PML method. We develop the irregular-grid PML method using the local coordinate system based PML splitting equations and an integral formulation of the PML equations. The irregular-grid PML method is implemented under a discretization of triangular grid cells, which has the ability to absorb incident waves in arbitrary directions. This allows the PML absorbing layer to be imposed along arbitrary geometrical boundaries. As a result, the computational domain can be constructed with fewer nodes, for instance, by representing the 2-D half-space by a semi-circle rather than a rectangle. By using a smooth artificial boundary, the irregular-grid PML method also avoids the special treatment of corners, which leads to complex computer implementations in the conventional PML method. We implement the irregular-grid PML method in both 2-D elastic isotropic and anisotropic media. The numerical simulations of a VTI Lamb's problem, wave propagation in an isotropic elastic medium with a curved surface and in a TTI medium demonstrate the good behaviour of the irregular-grid PML method.
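
    Inside any PML layer a damping profile must be chosen; a commonly used quadratic profile (which may differ from the authors' exact implementation) is:

```latex
% Quadratic PML damping profile over a layer of thickness L, with x the
% distance into the layer, v_p the P-wave speed, and R the target
% theoretical reflection coefficient (e.g. R = 10^{-3}).
d(x) = d_0 \left(\frac{x}{L}\right)^{2}, \qquad
d_0 = \frac{3\, v_p}{2L}\,\ln\!\frac{1}{R}
```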

  12. Analytic Intermodel Consistent Modeling of Volumetric Human Lung Dynamics.

    PubMed

    Ilegbusi, Olusegun; Seyfi, Behnaz; Neylon, John; Santhanam, Anand P

    2015-10-01

    Human lung undergoes breathing-induced deformation in the form of inhalation and exhalation. Modeling the dynamics is numerically complicated by the lack of information on lung elastic behavior and fluid-structure interactions between air and the tissue. A mathematical method is developed to integrate deformation results from a deformable image registration (DIR) and physics-based modeling approaches in order to represent consistent volumetric lung dynamics. The computational fluid dynamics (CFD) simulation assumes the lung is a poro-elastic medium with spatially distributed elastic property. Simulation is performed on a 3D lung geometry reconstructed from four-dimensional computed tomography (4DCT) dataset of a human subject. The heterogeneous Young's modulus (YM) is estimated from a linear elastic deformation model with the same lung geometry and 4D lung DIR. The deformation obtained from the CFD is then coupled with the displacement obtained from the 4D lung DIR by means of the Tikhonov regularization (TR) algorithm. The numerical results include 4DCT registration, CFD, and optimal displacement data which collectively provide consistent estimate of the volumetric lung dynamics. The fusion method is validated by comparing the optimal displacement with the results obtained from the 4DCT registration.
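
    The Tikhonov regularization step can be sketched in its generic form, minimizing a data misfit plus a penalty on the solution norm; the toy operators, displacement values, and weight below are hypothetical and only illustrate the algebra, not the paper's actual coupling of CFD and DIR fields.

```python
import numpy as np

def tikhonov(A, b, lam):
    """Solve min_x ||A x - b||^2 + lam ||x||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Toy usage: blend a model-predicted displacement with a registration-derived
# one by stacking both as constraints; operators and weight are hypothetical.
A = np.vstack([np.eye(3), np.eye(3)])            # both constraints act on the same x
b = np.concatenate([np.array([1.0, 2.0, 0.5]),   # CFD-predicted displacement (mm)
                    np.array([1.1, 1.8, 0.6])])  # DIR-measured displacement (mm)
print(tikhonov(A, b, lam=0.1))
```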

  13. 77 FR 40084 - Certain Portable Communication Devices; Determination Not To Review Initial Determinations...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-06

    ...''); Amazon.com , Inc. of Seattle, Washington (``Amazon''); Nokia Corporation of Espoo, Finland, Nokia Inc. of..., Motorola, Samsung, Sony, Amazon, and Pantech filed a joint motion under Commission Rule 210.21(a)(2) to...

  14. Measuring Water Storage in the Amazon

    NASA Image and Video Library

    2010-07-07

    This image is from data taken by NASA Gravity Recovery and Climate Experiment showing the Amazon basin in South America. The amount of water stored in the Amazon basin varies from month to month. Animations are available at the Photojournal.

  15. How Do Tropical Sea Surface Temperatures Influence the Seasonal Distribution of Precipitation in the Equatorial Amazon?.

    NASA Astrophysics Data System (ADS)

    Fu, Rong; Dickinson, Robert E.; Chen, Mingxuan; Wang, Hui

    2001-10-01

Although the correlation between precipitation over tropical South America and sea surface temperatures (SSTs) over the Pacific and Atlantic has been documented since the early twentieth century, the impact of each ocean on the timing and intensity of the wet season over tropical South America and the underlying mechanisms have remained unclear. Numerical experiments have been conducted using the National Center for Atmospheric Research Community Climate Model Version 3 to explore these impacts. The results suggest the following. (1) Seasonality of SSTs in the tropical Pacific and Atlantic has an important influence on precipitation in the eastern Amazon during the equinox seasons. The eastern side of the Amazon is influenced both by the direct thermal circulation of the Atlantic intertropical convergence zone (ITCZ) and by Rossby waves. These processes are enhanced by the seasonal cycles of SSTs in the tropical Atlantic and Pacific. SSTs affect Amazon precipitation much less during the solstice seasons and in the western Amazon. (2) The seasonality of SSTs in the Atlantic more strongly affects Amazon rainfall than does that of the Pacific. Without the former, austral spring in the eastern equatorial Amazon would be a wet season, rather than the observed dry season. As a consequence of the lag at that time of the southward seasonal migration of the Atlantic SSTs behind that of the insolation, the Atlantic ITCZ centers itself near 10°N, instead of at the equator, imposing subsidence and low-level anticyclonic flow over the eastern equatorial Amazon, thus drying the air above the planetary boundary layer and reducing the low-level moisture convergence. Consequently, convection in the eastern Amazon is suppressed despite strong surface heating. (3) Seasonality of the SSTs in the tropical Pacific also tends to reduce precipitation in the eastern Amazon during both spring and fall. In spring, subsidence is enhanced not only through a zonal direct circulation, but also through Rossby waves propagating from the extratropical South Pacific to subtropical South America. This teleconnection strengthens the South Atlantic convergence zone (SACZ) and the Nordeste low, in both cases reducing precipitation in the eastern Amazon. A direct thermal response to the Pacific SSTs enhances lower-level divergence and reduces precipitation from the northern tropical Atlantic to the northeastern Amazon.

  16. Mapping and spatiotemporal characterization of degraded forests in the Brazilian Amazon through remote sensing

    NASA Astrophysics Data System (ADS)

    de Souza, Carlos Moreira, Jr.

Large forested areas have recently been impoverished by degradation caused by selective logging, forest fires and fragmentation in the Amazon region, causing partial change of the original forest structure and composition. In contrast to deforestation, which has been monitored with Landsat images since the late 1970s, degraded forests have not been monitored in the Amazon region. In this dissertation, remote sensing techniques for unambiguously identifying and mapping degraded forests with Landsat images are proposed. The test area was the region of Sinop, located in the state of Mato Grosso, Brazil. This region was selected because a gradient of degraded forest environments exists there and a robust time series of Landsat images and forest transect data were available. First, statistical analyses were applied to identify the best set of spectral information extracted from Landsat images to detect several types of degraded forest environments. Fraction images derived from Spectral Mixture Analysis (SMA) were the best type of information for that purpose. A new spectral index based on fraction images, the Normalized Difference Fraction Index (NDFI), was proposed to enhance the detection of canopy-damaged areas in degraded forests. Second, a contextual classification algorithm was implemented to separate unambiguously forest degradation caused by anthropogenic activities from natural forest disturbances. These techniques were validated using forest transects and high resolution aerial videography images and proved to be highly accurate. Next, these techniques were applied to a time-series data set of Landsat images encompassing 20 years to evaluate the relationship between forest degradation and deforestation. The most important finding of the forest change detection analysis was that forest degradation and deforestation are independent events in the study area, worsening the current forest impacts in the Amazon region. Finally, the techniques developed and tested in the Sinop region were successfully applied to forty Landsat images covering other regions of the Brazilian Amazon. Standard fraction and NDFI images were computed for these other regions and both physically and spatially consistent results were obtained. An automated decision tree classification using a genetic algorithm was implemented successfully to classify land cover types and sub-classes of degraded forests. The remote sensing techniques proposed in this dissertation are fully automated and have the potential to be used in tropical forest monitoring programs.
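
    A minimal sketch of an NDFI computation from SMA fraction images is given below, using the form commonly cited in the forest-degradation mapping literature (shade-normalized green vegetation against NPV and soil); the exact definition used in the dissertation may differ, and the pixel values are purely illustrative.

```python
import numpy as np

def ndfi(gv, npv, soil, shade):
    """Normalized Difference Fraction Index from SMA fraction images.

    Inputs are per-pixel fractions in [0, 1] of green vegetation (gv),
    non-photosynthetic vegetation (npv), soil, and shade. Intact forest
    yields values near +1; canopy damage lowers the index.
    """
    gv_shade = gv / np.clip(1.0 - shade, 1e-6, None)  # shade-normalized GV
    return (gv_shade - (npv + soil)) / (gv_shade + npv + soil)

# Example: an intact-forest pixel vs. a logged/burned pixel (illustrative values).
print(ndfi(np.array([0.55, 0.30]), np.array([0.02, 0.20]),
           np.array([0.01, 0.15]), np.array([0.40, 0.30])))
```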

  17. Pathosphere.org: Pathogen Detection and Characterization Through a Web-based, Open-source Informatics Platform

    DTIC Science & Technology

    2015-12-29

    human), Homo sapiens chromosome (human), Mus_musculus ( rodent ), Sus scrofa (pig), mitochondrion genome, and Xenopus laevis (frog) . The taxonomy... Amazon Web Services. PLoS Comput Biol 2011, 7:e1002147. 10. Briese T, Paweska JT, McMullan LK, Hutchison SK, Street C, Palacios G, Khristova ML...human enterovirus C genotypes found in respiratory samples from Peru . J Gen Virol 2013, 94(Pt 1):120–7. 54. Jacob ST, Crozier I, Schieffelin JS

  18. Shake, Rattle and Roles: Lessons from Experimental Earthquake Engineering for Incorporating Remote Users in Large-Scale E-Science Experiments

    DTIC Science & Technology

    2007-01-01

    Mechanical Turk: Artificial Artificial Intelligence . Retrieved May 15, 2006 from http://www.mturk.com/ mturk/welcome Atkins, D. E., Droegemeier, K. K...Turk (Amazon, 2006) site goes beyond volunteers and pays people to do Human Intelligence Tasks, those that are difficult for computers but relatively...geographically distributed scientific collaboration, and the use of videogame technology for training. Address: U.S. Army Research Institute, 2511 Jefferson

  19. DoD Global Emerging Infections System Annual Report, Fiscal Year 2000

    DTIC Science & Technology

    2000-01-01

    partners in Peru , Bolivia, and Suriname and was conducted largely under the supervision of a CDC officer assigned to NMRCD. Surveillance in the Amazon ...pyrimethamine on the north coast of Peru . In the Amazon basin, resistance to both chloroquine and sulfadoxine-pyrimethamine is noted in the central Amazon ...the north coast and in the Amazon region, respectively.As a result of this work, Peru now has better and more up-to-date information on antimalarial

  20. River Mixing in the Amazon as a Driver of Concentration-Discharge Relationships

    NASA Astrophysics Data System (ADS)

    Bouchez, Julien; Moquet, Jean-Sébastien; Espinoza, Jhan Carlo; Martinez, Jean-Michel; Guyot, Jean-Loup; Lagane, Christelle; Filizola, Naziano; Noriega, Luis; Hidalgo Sanchez, Liz; Pombosa, Rodrigo

    2017-11-01

Large hydrological systems aggregate compositionally different waters derived from a variety of pathways. In the case of continental-scale rivers, such aggregation occurs noticeably at confluences between tributaries. Here we explore how such aggregation can affect solute concentration-discharge (C-Q) relationships and thus obscure the message carried by these relationships in terms of weathering properties of the Critical Zone. We build a simple model of tributary mixing to predict the behavior of C-Q relationships during aggregation. We test a set of predictions made in the context of the world's largest river, the Amazon. In particular, we predict that the C-Q relationships of rivers draining heterogeneous catchments should be the most "dilutional" and should display the widest hysteresis loops. To check these predictions, we compute 10-day-periodicity time series of Q and of major solute (Si, Ca2+, Mg2+, K+, Na+, Cl-, SO42-) concentrations (C) and fluxes (F) for 13 gauging stations located throughout the Amazon basin. In agreement with the model predictions, the C-Q relationships of most solutes shift from a fairly "chemostatic" behavior (nearly constant C) at the Andean mountain front and in pure lowland areas, to more "dilutional" patterns (negative C-Q relationship) toward the system mouth. More prominent C-Q hysteresis loops are also observed at the most downstream stations. Altogether, this study suggests that mixing of water and solutes between different flowpaths exerts a strong control on the C-Q relationships of large-scale hydrological systems.
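
    A standard way to summarize C-Q behavior is to fit a power law C = aQ^b in log-log space, where b near zero indicates chemostatic behavior and increasingly negative b indicates dilution. The sketch below uses synthetic data, not the gauging-station records analyzed in the paper.

```python
import numpy as np

def cq_exponent(discharge, concentration):
    """Fit log10(C) = log10(a) + b*log10(Q) and return the exponent b.

    b ~ 0 indicates chemostatic behavior (nearly constant concentration);
    b < 0 indicates dilution; b = -1 would correspond to a constant solute flux.
    """
    b, _ = np.polyfit(np.log10(discharge), np.log10(concentration), 1)
    return b

# Synthetic example only (not data from the paper): a dilutional solute.
q = np.logspace(3, 5, 50)            # discharge, m^3/s
c = 8.0 * q ** -0.35                 # concentration, mg/L
print(cq_exponent(q, c))             # -> approximately -0.35
```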

  1. Living Rivers: Importance of Andes-Amazon Connectivity and Consequences of Hydropower Development

    NASA Astrophysics Data System (ADS)

    Anderson, E.

    2016-12-01

The inherent dynamism of rivers along elevational and longitudinal gradients underpins freshwater biodiversity, ecosystem function, and ecosystem services in the Andean-Amazon. While this region covers only a small part of the entire Amazon Basin, its influences on downstream ecology, biogeochemistry, and human wellbeing are disproportionate to its relatively small size. Seasonal flow pulses from Andean rivers maintain habitat, signal migratory fishes, and export sediment, nutrients, and organic matter to distant ecosystems, such as lowland Amazonia and the Atlantic coast of Brazil. Rivers are key transportation routes, and freshwater fisheries are a primary protein source for the >30 million people that inhabit the Amazon Basin. Numerous cultural traditions depend on free-flowing Andean rivers; examples include Kukama beliefs in the underwater cities of the Marañon River, where people who have drowned in rivers and whose bodies are not recovered go to live, or the pre-dawn bathing rituals of the Peruvian Shawi, who gain energy and connect with ancestors in cold, fast-flowing Andean waters. Transformations in the Andean-Amazon landscape, in particular from dams, threaten to compromise flows critical for human and ecosystem wellbeing. Presently, at least 250 hydropower dams are in operation, under construction, or proposed for Andean-Amazon rivers. This presentation will discuss regional trends in hydropower development, quantify the effects of existing and proposed dams on Andean-Amazon connectivity, and examine the social and cultural importance of free-flowing Andean-Amazon rivers.

  2. Computational Modeling of Micro-Crack Induced Attenuation in CFRP Composites

    NASA Technical Reports Server (NTRS)

    Roberts, R. A.; Leckey, C. A. C.

    2012-01-01

    A computational study is performed to determine the contribution to ultrasound attenuation in carbon fiber reinforced polymer composite laminates of linear elastic scattering by matrix micro-cracking. Multiple scattering approximations are benchmarked against exact computational approaches. Results support linear scattering as the source of observed increased attenuation in the presence of micro-cracking.

  3. Elastic constants of hcp 4He: Path-integral Monte Carlo results versus experiment

    NASA Astrophysics Data System (ADS)

    Ardila, Luis Aldemar Peña; Vitiello, Silvio A.; de Koning, Maurice

    2011-09-01

    The elastic constants of hcp 4He are computed using the path-integral Monte Carlo (PIMC) method. The stiffness coefficients are obtained by imposing different distortions to a periodic cell containing 180 atoms, followed by measurement of the elements of the corresponding stress tensor. For this purpose an appropriate path-integral expression for the stress tensor observable is derived and implemented into the pimc++ package. In addition to allowing the determination of the elastic stiffness constants, this development also opens the way to an explicit atomistic determination of the Peierls stress for dislocation motion using the PIMC technique. A comparison of the results to available experimental data shows an overall good agreement of the density dependence of the elastic constants, with the single exception of C13. Additional calculations for the bcc phase, on the other hand, show good agreement for all elastic constants.

  4. Acceleration for 2D time-domain elastic full waveform inversion using a single GPU card

    NASA Astrophysics Data System (ADS)

    Jiang, Jinpeng; Zhu, Peimin

    2018-05-01

Full waveform inversion (FWI) is a challenging procedure due to the high computational cost related to the modeling, especially for the elastic case. The graphics processing unit (GPU) has become a popular device for high-performance computing (HPC). To reduce the long computation time, we design and implement a GPU-based 2D elastic FWI (EFWI) in the time domain using a single GPU card. We parallelize the forward modeling and gradient calculations using the CUDA programming language. To overcome the limitation of the relatively small global memory on the GPU, a boundary saving strategy is exploited to reconstruct the forward wavefield. Moreover, the L-BFGS optimization method used in the inversion accelerates the convergence of the misfit function. A multiscale inversion strategy is employed in the workflow to obtain accurate inversion results. In our tests, the GPU-based implementations using a single GPU device achieve >15 times speedup in forward modeling, and about 12 times speedup in gradient calculation, compared with eight-core CPU implementations optimized by OpenMP. The test results from the GPU implementations are verified to have sufficient accuracy by comparing them with the results obtained from the CPU implementations.
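
    The overall inversion workflow (a multiscale loop wrapped around an L-BFGS optimizer) can be sketched generically. The driver below assumes a user-supplied routine that returns the misfit and its gradient, standing in for the GPU forward modeling and adjoint computation, and it uses SciPy's L-BFGS-B rather than the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def run_multiscale_fwi(model0, observed_data, frequency_bands,
                       misfit_and_gradient):
    """Multiscale FWI driver: invert low frequencies first, then refine.

    misfit_and_gradient(model, observed_data, fmax) must return
    (misfit_value, gradient_array); here it is a placeholder for the
    GPU-based forward modeling and gradient kernels described in the paper.
    """
    model = np.asarray(model0, dtype=float).copy()
    for fmax in frequency_bands:                      # e.g. [3.0, 6.0, 12.0] Hz
        result = minimize(misfit_and_gradient, model, jac=True,
                          args=(observed_data, fmax), method="L-BFGS-B",
                          options={"maxiter": 30})
        model = result.x                              # warm-start the next band
    return model
```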

  5. Security Risks of Cloud Computing and Its Emergence as 5th Utility Service

    NASA Astrophysics Data System (ADS)

    Ahmad, Mushtaq

Cloud computing is being promoted by the major cloud service provider IT companies, such as IBM, Google, Yahoo, Amazon and others, as the fifth utility, where clients will have access to processing for applications and software projects that require very high processing speed for compute-intensive work and huge data capacity for scientific and engineering research problems, as well as for e-business and data content network applications. These services are provided to different types of clients under DASM (Direct Access Service Management), based on virtualization of hardware and software and very high bandwidth Internet (Web 2.0) communication. The paper reviews these developments in cloud computing and the hardware/software configuration of the cloud paradigm. The paper also examines the vital aspects of security risks identified by IT industry experts and cloud clients, and highlights the cloud providers' response to these security risks.

  6. Cointegration of output, capital, labor, and energy

    NASA Astrophysics Data System (ADS)

Stresing, R.; Lindenberger, D.; Kümmel, R.

    2008-11-01

Cointegration analysis is applied to linear combinations of the time series of (the logarithms of) output, capital, labor, and energy for Germany, Japan, and the USA since 1960. The computed cointegration vectors represent the output elasticities of the aggregate energy-dependent Cobb-Douglas function. The output elasticities give the economic weights of the production factors capital, labor, and energy. We find that the output elasticity for labor is much smaller, and that for energy much larger, than the corresponding cost shares of these factors. In standard economic theory, output elasticities equal cost shares. Our heterodox findings support results obtained with LINEX production functions.
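
    Written out, the energy-dependent Cobb-Douglas form behind the cointegration vectors is (symbols as commonly used; normalization of the cointegration vector is assumed):

```latex
% Energy-dependent Cobb-Douglas production function and its log-linear form.
% Y: output, K: capital, L: labor, E: energy; alpha, beta, gamma are the
% output elasticities recovered (up to normalization) from the
% cointegration vector of (ln Y, ln K, ln L, ln E).
Y = Y_0 \, K^{\alpha} L^{\beta} E^{\gamma}
\quad\Longrightarrow\quad
\ln Y = \ln Y_0 + \alpha \ln K + \beta \ln L + \gamma \ln E
```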

  7. Morton et al. Reply

    NASA Technical Reports Server (NTRS)

    Morton, Douglas C.; Nagol, Jyoteshwar; Carabajal, Claudia C.; Rosette, Jacqueline; Palace, Michael; Cook, Bruce D.; Vermote, Eric F.; Harding, David J.; North, Peter R. J.

    2016-01-01

    Multiple mechanisms could lead to up-regulation of dry-season photosynthesis in Amazon forests, including canopy phenology and illumination geometry. We specifically tested two mechanisms for phenology-driven changes in Amazon forests during dry-season months, and the combined evidence from passive optical and lidar satellite data was incompatible with large net changes in canopy leaf area or leaf reflectance suggested by previous studies. We therefore hypothesized that seasonal changes in the fraction of sunlit and shaded canopies, one aspect of bidirectional reflectance effects in Moderate Resolution Imaging Spectroradiometer (MODIS) data, could alter light availability for dry-season photosynthesis and the photosynthetic capacity of Amazon forests without large net changes in canopy composition. Subsequent work supports the hypothesis that seasonal changes in illumination geometry and diffuse light regulate light saturation in Amazon forests. These studies clarify the physical mechanisms that govern light availability in Amazon forests from seasonal variability in direct and diffuse illumination. Previously, in the debate over light limitation of Amazon forest productivity, seasonal changes in the distribution of light within complex Amazon forest canopies were confounded with dry-season increases in total incoming photosynthetically active radiation. In the accompanying Comment, Saleska et al. do not fully account for this confounding effect of forest structure on photosynthetic capacity.

  8. Mouths of the Amazon River, Brazil, South America

    NASA Technical Reports Server (NTRS)

    1992-01-01

    In this view of the Amazon River Mouth (0.0, 51.0W), a large sediment plume can be seen expanding outward into the Atlantic Ocean. The sediment plume can be seen hugging the coast north of the delta as a result of the northwest flowing coastal Guyana Current. In recent years, the flow of the Amazon has become heavily laden with sediment as soil runoff from the denuded landscape of the interior enters the Amazon River (and other rivers) drainage system.

  9. Epidemiology of Spotted Fever Group and Typhus Group Rickettsial Infection in the Amazon Basin of Peru

    DTIC Science & Technology

    2010-01-01

    Approximately 4% of acute febrile episodes detected in Iquitos, a city located in the Amazon region of northeastern Peru, could be attributed to spotted fever group rickettsiae (SFGR).

  10. DoD Global Emerging Infections System Annual Report, Fiscal Year 1999

    DTIC Science & Technology

    1999-01-01

    Findings provided the impetus for the government of Peru to change national drug policy regarding treatment of malaria in the Peruvian Amazon. During the last 10-15 years (1992-1997, Department of Loreto), malaria has emerged as a major public health problem in the Amazon basin of South America, including the Peruvian Amazon, from which cases are reported annually.

  11. Act No. 24994 of 19 January 1989. Basic Law on the Rural Development of the Peruvian Amazon Region.

    PubMed

    1989-01-01

    This Act sets forth the government's policy on rural development of the Peruvian Amazon region. Major objectives of the Act include the promotion of new rural settlements in the Amazon region, the promotion of migration from the Andes to the Amazon region, and the stimulation of agriculture, livestock, and forestry activities in the Amazon region. The following are the means that the government will use, among others, to attain these goals: 1) the development of Population Displacement Programmes, which will give individual persons and families economic and logistic support in moving; 2) the establishment of Civic Colonizing Services, temporary mobile units, which will offer settlers health services, education services, technical assistance with respect to agriculture and livestock, and promotional credits; 3) the creation of the Council for Amazon River Transport to coordinate and recommend activities to improve river transport; 4) the granting to settlers of land, free education for their children, medical care, technical training and assistance with respect to agriculture, and a supply of seeds; 5) the exemption of certain investors from payment of income taxes; and 6) the granting of a wide range of incentives for agricultural production. The Act also creates a Council for Planning and Development in the Amazon Region to draw up and approve a Plan for the Development of the Amazon Region. It calls for the rational use of the natural resources of the Amazon Region in the framework of preserving the ecosystem and preventing its ruin and delegates to the regional governments the authority to enter into contracts on the use of forest materials and to undertake reforestation programs. Finally, the Act provides various guarantees for the native population, including guarantees with respect to land and preservation of ethnic and social identity.

  12. The influence of anisotropy on the core structure of Shockley partial dislocations within FCC materials

    NASA Astrophysics Data System (ADS)

    Szajewski, B. A.; Hunter, A.; Luscher, D. J.; Beyerlein, I. J.

    2018-01-01

    Both theoretical and numerical models of dislocations often necessitate the assumption of elastic isotropy to retain analytical tractability in addition to reducing computational load. As dislocation based models evolve towards physically realistic material descriptions, the assumption of elastic isotropy becomes increasingly worthy of examination. We present an analytical dislocation model for calculating the full dissociated core structure of dislocations within anisotropic face centered cubic (FCC) crystals as a function of the degree of material elastic anisotropy, two misfit energy densities on the γ-surface (γ_isf, γ_usf), and the remaining elastic constants. Our solution is independent of any additional features of the γ-surface. Towards this pursuit, we first demonstrate that the dependence of the anisotropic elasticity tensor on the orientation of the dislocation line within the FCC crystalline lattice is small and may be reasonably neglected for typical materials. With this approximation, explicit analytic solutions for the anisotropic elasticity tensor B for both nominally edge and screw dislocations within an FCC crystalline lattice are devised, and employed towards defining a set of effective isotropic elastic constants which reproduce the fully anisotropic results but do not retain the bulk modulus. Conversely, Hill averaged elastic constants, which both retain the bulk modulus and reasonably approximate the dislocation core structure, are employed within subsequent numerical calculations. We examine a wide range of materials within this study, and the features of each partial dislocation core are sufficiently localized that application of discrete linear elasticity accurately describes the separation of each partial dislocation core. In addition, the local features (the partial dislocation core distribution) are well described by a Peierls-Nabarro dislocation model. We develop a model for the displacement profile which depends upon two disparate dislocation length scales that describe the core structure: (i) the equilibrium stacking fault width between two Shockley partial dislocations, R_eq, and (ii) the maximum slip gradient, χ, of each Shockley partial dislocation. We demonstrate excellent agreement between our own analytic predictions, numerical calculations, and R_eq computed directly by both ab-initio and molecular statics methods found elsewhere within the literature. The results suggest that understanding of various plastic mechanisms, e.g., cross-slip and nucleation, may be augmented with the inclusion of elastic anisotropy.
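    The Hill averaging mentioned above can be made concrete with a short sketch. Assuming cubic single-crystal constants C11, C12, and C44 (the numerical values below are illustrative, not taken from the paper), the Voigt and Reuss bounds on the shear modulus and their Hill average are:

```python
def hill_average_cubic(c11, c12, c44):
    """Voigt-Reuss-Hill averaged bulk and shear moduli for a cubic crystal.
    Inputs and outputs share the same units (e.g., GPa)."""
    # The bulk modulus is identical in the Voigt and Reuss bounds for cubic symmetry.
    k = (c11 + 2.0 * c12) / 3.0
    # Voigt (upper) and Reuss (lower) bounds on the shear modulus.
    g_voigt = (c11 - c12 + 3.0 * c44) / 5.0
    g_reuss = 5.0 * (c11 - c12) * c44 / (4.0 * c44 + 3.0 * (c11 - c12))
    g_hill = 0.5 * (g_voigt + g_reuss)
    return k, g_hill

# Illustrative values only (roughly Cu-like, in GPa).
print(hill_average_cubic(168.0, 121.0, 75.0))
```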

  13. Tropical North Atlantic ocean-atmosphere interactions synchronize forest carbon losses from hurricanes and Amazon fires

    NASA Astrophysics Data System (ADS)

    Chen, Yang; Randerson, James T.; Morton, Douglas C.

    2015-08-01

    We describe a climate mode synchronizing forest carbon losses from North and South America by analyzing time series of tropical North Atlantic sea surface temperatures (SSTs), landfall hurricanes and tropical storms, and Amazon fires during 1995-2013. Years with anomalously high tropical North Atlantic SSTs during March-June were often followed by a more active hurricane season and a larger number of satellite-detected fires in the southern Amazon during June-November. The relationship between North Atlantic tropical cyclones and southern Amazon fires (r = 0.61, p < 0.003) was stronger than links between SSTs and either cyclones or fires alone, suggesting that fires and tropical cyclones were directly coupled to the same underlying atmospheric dynamics governing tropical moisture redistribution. These relationships help explain why seasonal outlook forecasts for hurricanes and Amazon fires both failed in 2013 and may enable the design of improved early warning systems for drought and fire in Amazon forests.
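    A minimal sketch of the kind of correlation analysis reported above (a Pearson r between annual cyclone counts and Amazon fire counts) is shown below; the series are synthetic stand-ins, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Synthetic annual series standing in for 1995-2013 observations:
# landfall tropical cyclone counts and satellite-detected fire counts,
# both loosely tied to a common SST anomaly as in the hypothesized mechanism.
rng = np.random.default_rng(1)
sst_anomaly = rng.normal(0.0, 0.5, 19)              # Mar-Jun North Atlantic SST anomaly
cyclones = 8 + 4 * sst_anomaly + rng.normal(0, 1.5, 19)
fires = 1e4 + 6e3 * sst_anomaly + rng.normal(0, 2e3, 19)

r, p = pearsonr(cyclones, fires)
print(f"r = {r:.2f}, p = {p:.3f}")
```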

  14. Crop damage of Eriotheca gracilipes (Bombacaceae) by the Blue-Fronted Amazon (Amazona aestiva, Psittacidae), in the Brazilian Cerrado.

    PubMed

    Ragusa-Netto, J

    2014-11-01

    Seed predation has major effects on the reproductive success of individuals, spatial patterns of populations, genetic variability, interspecific interactions and, ultimately, the diversity of tree communities. At a Brazilian savanna, I evaluated the proportional crop loss of Eriotheca gracilipes due to the Blue-Fronted Amazon (Amazona aestiva) during a fruiting period. I also analyzed the relationship between proportional crop loss to Amazons and both fruit crop size and the distance from the nearest damaged conspecific. Trees produced from 1 to 109 fruits, and Amazons foraged more often on trees bearing larger fruit crops, while seldom visiting less productive trees. Moreover, the relationship between fruit crop size and the number of depredated fruits was significant. However, when only damaged trees were assessed, I found a negative and significant relation between fruit crop size and proportional crop loss to Blue-Fronted Amazons. Taking this as a measure more directly related to the probability of seed survival, a negative density-dependent effect emerged. Also, Amazons damaged the fruit crops of close and distant neighboring damaged trees to a similar degree. Hence, although Blue-Fronted Amazons searched for E. gracilipes bearing large fruit crops, they were swamped by the presence of more fruits than they could eat. Moderate seed predation by Blue-Fronted Amazons, either at trees with large fruit crops or in areas where fruiting trees were aggregated, implies an enhanced probability of E. gracilipes seed survival and consequent regeneration success.

  15. Elastic Multi-scale Mechanisms: Computation and Biological Evolution.

    PubMed

    Diaz Ochoa, Juan G

    2018-01-01

    Explanations based on low-level interacting elements are valuable and powerful since they contribute to identifying the key mechanisms of biological functions. However, many dynamic systems based on low-level interacting elements with unambiguous, finite, and complete information about initial states generate future states that cannot be predicted, implying an increase of complexity and open-ended evolution. Such systems are like Turing machines that overlap with dynamical systems that cannot halt. We argue that organisms find halting conditions by distorting these mechanisms, creating conditions for a constant creativity that drives evolution. We introduce a modulus of elasticity to measure the changes in these mechanisms in response to changes in the computed environment. We test this concept in a population of predator and prey cells with chemotactic mechanisms and demonstrate how the selection of a given mechanism depends on the entire population. We finally explore this concept in different frameworks and postulate that the identification of predictive mechanisms is only successful when the elasticity modulus is small.

  16. Thermo-elastic wave model of the photothermal and photoacoustic signal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meja, P.; Steiger, B.; Delsanto, P.P.

    1996-12-31

    By means of the thermo-elastic wave equation, the dynamical propagation of mechanical stress and temperature can be described and applied to model the photothermal and photoacoustic signal. Analytical solutions exist only in particular cases. Using massively parallel computers it is possible to simulate the photothermal and photoacoustic signal efficiently. In this paper the method of the local interaction simulation approach (LISA) is presented and selected examples of its application are given. The advantages of this method, which is particularly suitable for parallel processing, are reduced computation time and a simple description of the photoacoustic signal in optical materials. The present contribution introduces the authors' model, the formalism, and some results for the 1D case for homogeneous nonattenuative materials. The photoacoustic wave can be understood as a wave with locally limited displacement. This displacement corresponds to a temperature variation. Both variables are usually measured in photoacoustic and photothermal measurements. Therefore the dependence of temperature and displacement on optical, elastic and thermal constants is analysed.
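    To make the wave-propagation part concrete, the sketch below advances a 1D elastic wave on a staggered grid with an explicit finite-difference update, which is the flavor of local-interaction computation the abstract alludes to. It is not the authors' LISA formalism, and the material parameters, grid, and source pulse are illustrative assumptions.

```python
import numpy as np

# 1D elastic wave: rho * dv/dt = ds/dx,  ds/dt = E * dv/dx  (v = particle velocity, s = stress).
nx, nt = 400, 800
dx = 1.0e-3                      # grid spacing in m (illustrative)
rho, E = 2700.0, 70e9            # density (kg/m^3) and modulus (Pa), illustrative values
c = np.sqrt(E / rho)             # elastic wave speed
dt = 0.5 * dx / c                # CFL-stable time step

v = np.zeros(nx)                 # particle velocity at cell centers
s = np.zeros(nx + 1)             # stress on a staggered grid

for n in range(nt):
    # Short source pulse near the left edge, mimicking a thermo-elastic excitation.
    if n < 30:
        s[1] += np.exp(-((n - 15) / 5.0) ** 2)
    v += dt / (rho * dx) * (s[1:] - s[:-1])      # velocity update from the stress gradient
    s[1:-1] += dt * E / dx * (v[1:] - v[:-1])    # stress update from the velocity gradient
```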

  17. Direct method of design and stress analysis of rotating disks with temperature gradient

    NASA Technical Reports Server (NTRS)

    Manson, S S

    1950-01-01

    A method is presented for the determination of the contour of disks, typified by those of aircraft gas turbines, to incorporate arbitrary elastic-stress distributions resulting from either centrifugal or combined centrifugal and thermal effects. The specified stress may be radial, tangential, or any combination of the two. Use is made of the finite-difference approach in solving the stress equations, the amount of computation necessary in the evolution of a design being greatly reduced by the judicious selection of point stations by the aid of a design chart. Use of the charts and of a preselected schedule of point stations is also applied to the direct problem of finding the elastic and plastic stress distribution in disks of a given design, thereby effecting a great reduction in the amount of calculation. Illustrative examples are presented to show computational procedures in the determination of a new design and in analyzing an existing design for elastic stress and for stresses resulting from plastic flow.

  18. A parametric analysis of waves propagating in a porous solid saturated by a three-phase fluid.

    PubMed

    Santos, Juan E; Savioli, Gabriela B

    2015-11-01

    This paper presents an analysis of a model for the propagation of waves in a poroelastic solid saturated by a three-phase viscous, compressible fluid. The constitutive relations and the equations of motion are stated first. Then a plane wave analysis determines the phase velocities and attenuation coefficients of the four compressional waves and one shear wave that propagate in this type of medium. A procedure to compute the elastic constants in the constitutive relations is defined next. Assuming the knowledge of the shear modulus of the dry matrix, the other elastic constants in the stress-strain relations are determined by employing ideal gedanken experiments generalizing those of Biot's theory for single-phase fluids. These experiments yield expressions for the elastic constants in terms of the properties of the individual solid and fluids phases. Finally the phase velocities and attenuation coefficients of all waves are computed for a sample of Berea sandstone saturated by oil, gas, and water.

  19. Numerical simulations of SHPB experiments for the dynamic compressive strength and failure of ceramics

    NASA Astrophysics Data System (ADS)

    Anderson, Charles E., Jr.; O'Donoghue, Padraic E.; Lankford, James; Walker, James D.

    1992-06-01

    Complementary to a study of the compressive strength of ceramic as a function of strain rate and confinement, numerical simulations of the split-Hopkinson pressure bar (SHPB) experiments have been performed using the two-dimensional wave propagation computer program HEMP. The numerical effort had two main thrusts. Firstly, the interpretation of the experimental data relies on several assumptions. The numerical simulations were used to investigate the validity of these assumptions. The second part of the effort focused on computing the idealized constitutive response of a ceramic within the SHPB experiment. These numerical results were then compared against experimental data. Idealized models examined included a perfectly elastic material, an elastic-perfectly plastic material, and an elastic material with failure. Post-failure material was modeled as having either no strength, or a strength proportional to the mean stress. The effects of confinement were also studied. Conclusions concerning the dynamic behavior of a ceramic up to and after failure are drawn from the numerical study.

  20. A mass weighted chemical elastic network model elucidates closed form domain motions in proteins

    PubMed Central

    Kim, Min Hyeok; Seo, Sangjae; Jeong, Jay Il; Kim, Bum Joon; Liu, Wing Kam; Lim, Byeong Soo; Choi, Jae Boong; Kim, Moon Ki

    2013-01-01

    An elastic network model (ENM), usually a Cα coarse-grained one, has been widely used to study protein dynamics as an alternative to classical molecular dynamics simulation. This simple approach dramatically reduces the computational cost, but sometimes fails to describe a feasible conformational change due to unrealistically excessive spring connections. To overcome this limitation, we propose a mass-weighted chemical elastic network model (MWCENM) in which the total mass of each residue is assumed to be concentrated on the representative alpha carbon atom and various stiffness values are precisely assigned according to the types of chemical interactions. We test MWCENM on several well-known proteins for which both closed and open conformations are available, as well as three α-helix-rich proteins. Their normal mode analysis reveals that MWCENM not only generates more plausible conformational changes, especially for closed forms of proteins, but also preserves protein secondary structures, thus distinguishing MWCENM from traditional ENMs. In addition, MWCENM reduces the computational burden by using a sparser stiffness matrix. PMID:23456820
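    A much-simplified sketch of the general idea (a mass-weighted elastic network whose normal modes come from an eigendecomposition) is given below. It uses a uniform distance cutoff and a single spring constant rather than the chemistry-specific stiffness assignment of MWCENM, and all parameter values are placeholders.

```python
import numpy as np

def mass_weighted_gnm_modes(coords, masses, cutoff=7.0, k_spring=1.0):
    """Normal modes of a simple mass-weighted Gaussian network model.
    coords: (N, 3) C-alpha positions in angstroms; masses: (N,) residue masses."""
    n = len(coords)
    kirchhoff = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(coords[i] - coords[j]) <= cutoff:
                kirchhoff[i, j] = kirchhoff[j, i] = -k_spring
    np.fill_diagonal(kirchhoff, -kirchhoff.sum(axis=1))
    # Mass weighting: K' = M^(-1/2) K M^(-1/2)
    inv_sqrt_m = 1.0 / np.sqrt(masses)
    weighted = kirchhoff * np.outer(inv_sqrt_m, inv_sqrt_m)
    eigvals, eigvecs = np.linalg.eigh(weighted)
    return eigvals, eigvecs  # the lowest nonzero modes describe collective motions
```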

  1. Systematic study of the elastic, optoelectronic, and thermoelectric behavior of MRh2O4 (M = Zn, Cd) based on first principles calculations

    NASA Astrophysics Data System (ADS)

    Abbas, Syed Adeel; Rashid, Muhammad; Faridi, Muhammad Ayub; Saddique, Muhammad Bilal; Mahmood, Asif; Ramay, Shahid Muhammad

    2018-02-01

    In the present study, we performed first-principles total energy calculations to explore the electronic, elastic, optical, and thermoelectric behavior of MRh2O4 (M = Zn, Cd) spinel oxides. We employed the Perdew-Burke-Ernzerhof-sol functional as well as the modified Becke and Johnson potential to compute the elastic, optoelectronic, and thermoelectric behavior of MRh2O4 (M = Zn, Cd). The optical behavior was investigated by calculating the complex dielectric constant, refractive index, optical reflectivity, absorption coefficient, and optical conductivity. All of the optical parameters indicated a shift to lower energies as the atomic size increased from Zn to Cd, suggesting potential applications of the spinel oxides in optoelectronic devices. Moreover, the thermoelectric properties of MRh2O4 (M = Zn, Cd) spinel oxides were computed in terms of the electrical conductivity (σ), Seebeck coefficient (S), thermal conductivity (k), and power factor (σS²) using the BoltzTraP code.

  2. A Computer Code for Dynamic Stress Analysis of Media-Structure Problems with Nonlinearities (SAMSON). Volume III. User’s Manual.

    DTIC Science & Technology

    Descriptors: nonlinear systems, linear systems, subroutines, soil mechanics, interfaces, dynamics, loads (forces), force (mechanics), damping, acceleration, elastic properties, plastic properties, cracks, reinforcing materials, composite materials, failure (mechanics), mechanical properties, instruction manuals, digital computers, stresses, computer programs, structures, data processing, structural properties, soils, strain (mechanics), mathematical models.

  3. Effect of ripples on the finite temperature elastic properties of hexagonal boron nitride using strain-fluctuation method

    NASA Astrophysics Data System (ADS)

    Thomas, Siby; Ajith, K. M.; Valsakumar, M. C.

    2017-11-01

    This work presents the results of a classical molecular dynamics study investigating, for the first time, the temperature-dependent elastic constants of monolayer hexagonal boron nitride (h-BN) between 100 and 1000 K using the strain-fluctuation method. The temperature dependence of out-of-plane fluctuations (ripples) is quantified and explained using the continuum theory of membranes. At low temperatures, negative in-plane thermal expansion is observed, and at high temperatures a transition to positive thermal expansion occurs due to the presence of thermally excited ripples. The decrease of Young's modulus, bulk modulus, shear modulus and Poisson's ratio with increasing temperature has been analyzed. The thermal rippling in h-BN leads to strongly anharmonic behaviour that causes large deviations from isotropic elasticity. A detailed study shows that the strong thermal rippling in large systems is also responsible for the softening of the elastic constants in h-BN. From the determined values of the elastic constants and elastic moduli, it is elucidated that 2D h-BN sheets meet Born's mechanical stability criterion in the investigated temperature range. The variation of the longitudinal and shear velocities with temperature is also calculated from the computed values of the elastic constants and elastic moduli.
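    The strain-fluctuation method itself can be summarized in a few lines: in an ensemble whose cell shape is allowed to fluctuate, the stiffness matrix follows from the inverse covariance of the instantaneous strains, C = kB T / V times the inverse of the strain covariance (Voigt notation). The sketch below assumes the strain samples have already been extracted from a molecular dynamics trajectory; the function name and inputs are placeholders.

```python
import numpy as np

def elastic_constants_from_strain_fluctuations(strains_voigt, volume, temperature):
    """Strain-fluctuation estimate of the isothermal elastic constants.
    strains_voigt: (n_samples, 6) instantaneous strains in Voigt notation,
    sampled from a shape-fluctuating ensemble; volume in m^3; temperature in K.
    Returns the 6x6 stiffness matrix in Pa: C = kB*T/V * inv(<eps eps>)."""
    k_b = 1.380649e-23
    fluct = strains_voigt - strains_voigt.mean(axis=0)
    covariance = fluct.T @ fluct / len(strains_voigt)   # <eps_i eps_j>
    return k_b * temperature / volume * np.linalg.inv(covariance)
```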

  4. Numerical solution of acoustic scattering by finite perforated elastic plates

    PubMed Central

    2016-01-01

    We present a numerical method to compute the acoustic field scattered by finite perforated elastic plates. A boundary element method is developed to solve the Helmholtz equation subjected to boundary conditions related to the plate vibration. These boundary conditions are recast in terms of the vibration modes of the plate and its porosity, which enables a direct solution procedure. A parametric study is performed for a two-dimensional problem whereby a cantilevered perforated elastic plate scatters sound from a point quadrupole near the free edge. Both elasticity and porosity tend to diminish the scattered sound, in agreement with previous work considering semi-infinite plates. Finite elastic plates are shown to reduce acoustic scattering when excited at high Helmholtz numbers k0 based on the plate length. However, at low k0, finite elastic plates produce only modest reductions or, in cases related to structural resonance, an increase to the scattered sound level relative to the rigid case. Porosity, on the other hand, is shown to be more effective in reducing the radiated sound for low k0. The combined beneficial effects of elasticity and porosity are shown to be effective in reducing the scattered sound for a broader range of k0 for perforated elastic plates. PMID:27274685

  5. Numerical solution of acoustic scattering by finite perforated elastic plates.

    PubMed

    Cavalieri, A V G; Wolf, W R; Jaworski, J W

    2016-04-01

    We present a numerical method to compute the acoustic field scattered by finite perforated elastic plates. A boundary element method is developed to solve the Helmholtz equation subjected to boundary conditions related to the plate vibration. These boundary conditions are recast in terms of the vibration modes of the plate and its porosity, which enables a direct solution procedure. A parametric study is performed for a two-dimensional problem whereby a cantilevered perforated elastic plate scatters sound from a point quadrupole near the free edge. Both elasticity and porosity tend to diminish the scattered sound, in agreement with previous work considering semi-infinite plates. Finite elastic plates are shown to reduce acoustic scattering when excited at high Helmholtz numbers k0 based on the plate length. However, at low k0, finite elastic plates produce only modest reductions or, in cases related to structural resonance, an increase to the scattered sound level relative to the rigid case. Porosity, on the other hand, is shown to be more effective in reducing the radiated sound for low k0. The combined beneficial effects of elasticity and porosity are shown to be effective in reducing the scattered sound for a broader range of k0 for perforated elastic plates.

  6. Assessing exchange-correlation functionals for elasticity and thermodynamics of α -ZrW2O8 : A density functional perturbation theory study

    NASA Astrophysics Data System (ADS)

    Weck, Philippe F.; Kim, Eunja; Greathouse, Jeffery A.; Gordon, Margaret E.; Bryan, Charles R.

    2018-04-01

    Elastic and thermodynamic properties of negative thermal expansion (NTE) α -ZrW2O8 have been calculated using PBEsol and PBE exchange-correlation functionals within the framework of density functional perturbation theory (DFPT). Measured elastic constants are reproduced within ∼ 2 % with PBEsol and ∼ 6 % with PBE. The thermal evolution of the Grüneisen parameter computed within the quasi-harmonic approximation exhibits negative values below the Debye temperature, consistent with observation. The standard molar heat capacity is predicted to be CP0 = 192.2 and 193.8 J mol-1K-1 with PBEsol and PBE, respectively. These results suggest superior accuracy of DFPT/PBEsol for studying the lattice dynamics, elasticity and thermodynamics of NTE materials.

  7. Ab initio study of single-crystalline and polycrystalline elastic properties of Mg-substituted calcite crystals.

    PubMed

    Zhu, L-F; Friák, M; Lymperakis, L; Titrian, H; Aydin, U; Janus, A M; Fabritius, H-O; Ziegler, A; Nikolov, S; Hemzalová, P; Raabe, D; Neugebauer, J

    2013-04-01

    We employ ab initio calculations to investigate the single-crystalline elastic properties of (Ca,Mg)CO3 crystals covering the whole range of concentrations from pure calcite CaCO3 to pure magnesite MgCO3. Studying different distributions of Ca and Mg atoms within 30-atom supercells, we find that the energetically most favorable configurations are characterized by elastic constants that increase nearly monotonously with the Mg content. Based on the first-principles-derived single-crystalline elastic anisotropy, the integral elastic response of (Ca,Mg)CO3 polycrystals is determined employing a mean-field self-consistent homogenization method. As in the case of the single-crystalline elastic properties, the computed polycrystalline elastic parameters depend sensitively on the chemical composition and show a significant stiffening impact of Mg atoms on calcite crystals, in agreement with the experimental findings. Our analysis also shows that it is not advantageous to use a higher-scale two-phase mix of stoichiometric calcite and magnesite instead of substituting Ca atoms by Mg ones on the atomic scale. Such two-phase composites are not significantly thermodynamically favorable and do not provide any strong additional stiffening effect. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Floppy swimming: Viscous locomotion of actuated elastica

    NASA Astrophysics Data System (ADS)

    Lauga, Eric

    2007-04-01

    Actuating periodically an elastic filament in a viscous liquid generally breaks the constraints of Purcell’s scallop theorem, resulting in the generation of a net propulsive force. This observation suggests a method to design simple swimming devices—which we call “elastic swimmers”—where the actuation mechanism is embedded in a solid body and the resulting swimmer is free to move. In this paper, we study theoretically the kinematics of elastic swimming. After discussing the basic physical picture of the phenomenon and the expected scaling relationships, we derive analytically the elastic swimming velocities in the limit of small actuation amplitude. The emphasis is on the coupling between the two unknowns of the problems—namely the shape of the elastic filament and the swimming kinematics—which have to be solved simultaneously. We then compute the performance of the resulting swimming device and its dependence on geometry. The optimal actuation frequency and body shapes are derived and a discussion of filament shapes and internal torques is presented. Swimming using multiple elastic filaments is discussed, and simple strategies are presented which result in straight swimming trajectories. Finally, we compare the performance of elastic swimming with that of swimming micro-organisms.

  9. Cloud Computing Value Chains: Understanding Businesses and Value Creation in the Cloud

    NASA Astrophysics Data System (ADS)

    Mohammed, Ashraf Bany; Altmann, Jörn; Hwang, Junseok

    Based on the promising developments in Cloud Computing technologies in recent years, commercial computing resource services (e.g. Amazon EC2) and software-as-a-service offerings (e.g. Salesforce.com) came into existence. However, the relatively weak business exploitation, participation, and adoption of other Cloud Computing services remain the main challenges. The vague value structures seem to be hindering business adoption and the creation of sustainable business models around its technology. Using an extensive analysis of existing Cloud business models, Cloud services, stakeholder relations, market configurations and value structures, this chapter develops a reference model for value chains in the Cloud. Although this model is theoretically based on Porter's value chain theory, the proposed Cloud value chain model is upgraded to fit the diversity of business service scenarios in the Cloud computing markets. Using this model, different service scenarios are explained. Our findings suggest new services, business opportunities, and policy practices for realizing more adoption and value creation paths in the Cloud.

  10. Cloud-based MOTIFSIM: Detecting Similarity in Large DNA Motif Data Sets.

    PubMed

    Tran, Ngoc Tam L; Huang, Chun-Hsi

    2017-05-01

    We developed the cloud-based MOTIFSIM on the Amazon Web Services (AWS) cloud. The tool is an extended version of our web-based tool version 2.0, which was developed based on a novel algorithm for detecting similarity in multiple DNA motif data sets. This cloud-based version further allows researchers to exploit the computing resources available from AWS to detect similarity in multiple large-scale DNA motif data sets resulting from next-generation sequencing technology. The tool is highly scalable with the expandable computing resources of AWS.

  11. Virtual machine provisioning, code management, and data movement design for the Fermilab HEPCloud Facility

    NASA Astrophysics Data System (ADS)

    Timm, S.; Cooper, G.; Fuess, S.; Garzoglio, G.; Holzman, B.; Kennedy, R.; Grassano, D.; Tiradani, A.; Krishnamurthy, R.; Vinayagam, S.; Raicu, I.; Wu, H.; Ren, S.; Noh, S.-Y.

    2017-10-01

    The Fermilab HEPCloud Facility Project has as its goal to extend the current Fermilab facility interface to provide transparent access to disparate resources including commercial and community clouds, grid federations, and HPC centers. This facility enables experiments to perform the full spectrum of computing tasks, including data-intensive simulation and reconstruction. We have evaluated the use of the commercial cloud to provide elasticity to respond to peaks of demand without overprovisioning local resources. Full scale data-intensive workflows have been successfully completed on Amazon Web Services for two High Energy Physics Experiments, CMS and NOνA, at the scale of 58000 simultaneous cores. This paper describes the significant improvements that were made to the virtual machine provisioning system, code caching system, and data movement system to accomplish this work. The virtual image provisioning and contextualization service was extended to multiple AWS regions, and to support experiment-specific data configurations. A prototype Decision Engine was written to determine the optimal availability zone and instance type to run on, minimizing cost and job interruptions. We have deployed a scalable on-demand caching service to deliver code and database information to jobs running on the commercial cloud. It uses the frontiersquid server and CERN VM File System (CVMFS) clients on EC2 instances and utilizes various services provided by AWS to build the infrastructure (stack). We discuss the architecture and load testing benchmarks on the squid servers. We also describe various approaches that were evaluated to transport experimental data to and from the cloud, and the optimal solutions that were used for the bulk of the data transport. Finally, we summarize lessons learned from this scale test, and our future plans to expand and improve the Fermilab HEP Cloud Facility.
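    As a toy illustration of the kind of decision the prototype Decision Engine makes (choosing where and on what to run so as to minimize cost), the sketch below picks the cheapest offer per core from a small price table. The zones, instance types, prices, and core counts are invented placeholders, not real AWS data and not the HEPCloud implementation.

```python
# Hypothetical spot-price table: (availability zone, instance type) -> ($/hour, cores).
# Values are illustrative only.
offers = {
    ("us-east-1a", "c4.8xlarge"): (0.60, 36),
    ("us-east-1b", "c4.8xlarge"): (0.75, 36),
    ("us-west-2a", "m4.10xlarge"): (0.80, 40),
}

def cheapest_offer(offers, min_cores):
    """Return the (zone, instance_type) with the lowest price per core
    among offers providing at least min_cores per instance."""
    eligible = {k: v for k, v in offers.items() if v[1] >= min_cores}
    return min(eligible, key=lambda k: eligible[k][0] / eligible[k][1])

print(cheapest_offer(offers, min_cores=36))
```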

  12. Virtual Machine Provisioning, Code Management, and Data Movement Design for the Fermilab HEPCloud Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timm, S.; Cooper, G.; Fuess, S.

    The Fermilab HEPCloud Facility Project has as its goal to extend the current Fermilab facility interface to provide transparent access to disparate resources including commercial and community clouds, grid federations, and HPC centers. This facility enables experiments to perform the full spectrum of computing tasks, including data-intensive simulation and reconstruction. We have evaluated the use of the commercial cloud to provide elasticity to respond to peaks of demand without overprovisioning local resources. Full scale data-intensive workflows have been successfully completed on Amazon Web Services for two High Energy Physics Experiments, CMS and NOνA, at the scale of 58000 simultaneous cores. This paper describes the significant improvements that were made to the virtual machine provisioning system, code caching system, and data movement system to accomplish this work. The virtual image provisioning and contextualization service was extended to multiple AWS regions, and to support experiment-specific data configurations. A prototype Decision Engine was written to determine the optimal availability zone and instance type to run on, minimizing cost and job interruptions. We have deployed a scalable on-demand caching service to deliver code and database information to jobs running on the commercial cloud. It uses the frontiersquid server and CERN VM File System (CVMFS) clients on EC2 instances and utilizes various services provided by AWS to build the infrastructure (stack). We discuss the architecture and load testing benchmarks on the squid servers. We also describe various approaches that were evaluated to transport experimental data to and from the cloud, and the optimal solutions that were used for the bulk of the data transport. Finally, we summarize lessons learned from this scale test, and our future plans to expand and improve the Fermilab HEP Cloud Facility.

  13. Daily Planet Imagery: GIBS MODIS Products on ArcGIS Online

    NASA Astrophysics Data System (ADS)

    Plesea, L.

    2015-12-01

    The NASA EOSDIS Global Imagery Browse Services (GIBS) is rapidly becoming an invaluable GIS resource for the science community and for the public at large. Reliable, fast access to historical as well as near-real-time, georeferenced images forms a solid basis on which many innovative applications and projects can be built. Esri has recognized the value of this effort and is a GIBS user and collaborator. To enable the use of GIBS services within the ArcGIS ecosystem, Esri has built a GIBS reflector server at http://modis.arcgis.com, a server which offers the facilities of a time-enabled Mosaic Service on top of the GIBS-provided images. Currently the MODIS reflectance products are supported by this mosaic service, and possibilities for handling other GIBS products are being explored. This reflector service is deployed on the Amazon Elastic Compute Cloud platform and is freely available to end users. Owing to the excellent response time from GIBS, image tiles do not have to be stored by the Esri mosaic server; all needed data are retrieved directly from GIBS when needed, continuously reflecting the state of GIBS and greatly simplifying the maintenance of this service. The remote data access is achieved by using the Geospatial Data Abstraction Library (GDAL) Tiled Web Map Server (TWMS) driver. Response latency is usually under one second, making it easy to interact with the data. The MODIS imagery has proven to be one of the most popular on the ArcGIS Online platform, where it is frequently used to provide temporal context to maps or, by itself, to tell a compelling story.

  14. Measuring Tree Properties and Responses Using Low-Cost Accelerometers

    DOE PAGES

    van Emmerik, Tim; Steele-Dunne, Susan; Hut, Rolf; ...

    2017-05-11

    Trees play a crucial role in the water, carbon and nitrogen cycle on local, regional and global scales. Understanding the exchange of momentum, heat, water, and CO2 between trees and the atmosphere is important to assess the impact of drought, deforestation and climate change. Unfortunately, ground measurements of tree properties such as mass and canopy interception of precipitation are often expensive or difficult due to challenging environments. This paper aims to demonstrate the concept of using robust and affordable accelerometers to measure tree properties and responses. Tree sway is dependent on mass, canopy structure, drag coefficient, and wind forcing. By measuring tree acceleration, we can relate the tree motion to external forcing (e.g., wind, precipitation and related canopy interception) and tree physical properties (e.g., mass, elasticity). Using five months of acceleration data of 19 trees in the Brazilian Amazon, we show that the frequency spectrum of tree sway is related to mass, canopy interception of precipitation, and canopy–atmosphere turbulent exchange.
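    A minimal sketch of turning an acceleration record into a sway spectrum is given below, using Welch's method; the sampling rate, signal, and window length are illustrative assumptions, not the study's instrumentation or processing chain.

```python
import numpy as np
from scipy.signal import welch

# Placeholder acceleration record: a slowly decaying ~0.3 Hz sway plus noise,
# sampled at 10 Hz for one hour (illustrative values only).
fs = 10.0
t = np.arange(0, 3600, 1.0 / fs)
accel = 0.05 * np.sin(2 * np.pi * 0.3 * t) * np.exp(-t / 1800) + 0.01 * np.random.randn(t.size)

# Welch power spectral density; the dominant peak tracks the tree's sway
# frequency, which shifts with mass (e.g., intercepted precipitation).
freqs, psd = welch(accel, fs=fs, nperseg=4096)
print("dominant sway frequency [Hz]:", freqs[np.argmax(psd)])
```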

  15. Measuring Tree Properties and Responses Using Low-Cost Accelerometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Emmerik, Tim; Steele-Dunne, Susan; Hut, Rolf

    Trees play a crucial role in the water, carbon and nitrogen cycle on local, regional and global scales. Understanding the exchange of momentum, heat, water, and CO2 between trees and the atmosphere is important to assess the impact of drought, deforestation and climate change. Unfortunately, ground measurements of tree properties such as mass and canopy interception of precipitation are often expensive or difficult due to challenging environments. This paper aims to demonstrate the concept of using robust and affordable accelerometers to measure tree properties and responses. Tree sway is dependent on mass, canopy structure, drag coefficient, and wind forcing. By measuring tree acceleration, we can relate the tree motion to external forcing (e.g., wind, precipitation and related canopy interception) and tree physical properties (e.g., mass, elasticity). Using five months of acceleration data of 19 trees in the Brazilian Amazon, we show that the frequency spectrum of tree sway is related to mass, canopy interception of precipitation, and canopy–atmosphere turbulent exchange.

  16. Measuring Tree Properties and Responses Using Low-Cost Accelerometers

    PubMed Central

    van Emmerik, Tim; Steele-Dunne, Susan; Hut, Rolf; Gentine, Pierre; Guerin, Marceau; Oliveira, Rafael S.; Wagner, Jim; Selker, John; van de Giesen, Nick

    2017-01-01

    Trees play a crucial role in the water, carbon and nitrogen cycle on local, regional and global scales. Understanding the exchange of momentum, heat, water, and CO2 between trees and the atmosphere is important to assess the impact of drought, deforestation and climate change. Unfortunately, ground measurements of tree properties such as mass and canopy interception of precipitation are often expensive or difficult due to challenging environments. This paper aims to demonstrate the concept of using robust and affordable accelerometers to measure tree properties and responses. Tree sway is dependent on mass, canopy structure, drag coefficient, and wind forcing. By measuring tree acceleration, we can relate the tree motion to external forcing (e.g., wind, precipitation and related canopy interception) and tree physical properties (e.g., mass, elasticity). Using five months of acceleration data of 19 trees in the Brazilian Amazon, we show that the frequency spectrum of tree sway is related to mass, canopy interception of precipitation, and canopy–atmosphere turbulent exchange. PMID:28492477

  17. Seroprevalence of Toxoplasma gondii in free-living Amazon river dolphins (Inia geoffrensis) from central Amazon, Brazil

    USDA-ARS?s Scientific Manuscript database

    Toxoplasma gondii is an important pathogen in aquatic mammals and its presence in these animals may indicate water contamination of aquatic environment by oocysts. Serum samples from 95 dolphins from free-living Amazon River dolphins (Inia geoffrensis) from Sustainable Development Reserve Mamirauá (...

  18. Seroprevalence of Toxoplasma gondii in free-living amazon river dolphins (Inia geoffrensis) from central Amazon, Brazil

    USDA-ARS?s Scientific Manuscript database

    Toxoplasma gondii is an important pathogen in aquatic mammals and its presence in these animals may indicate water contamination of aquatic environment by oocysts. Serum samples from 95 dolphins from free-living Amazon River dolphins (Inia geoffrensis) from Sustainable Development Reserve Mamirauá (...

  19. Amazon River investigations, reconnaissance measurements of July 1963

    USGS Publications Warehouse

    Oltman, Roy Edwin; Sternberg, H. O'R.; Ames, F.C.; Davis, L.C.

    1964-01-01

    The first measurements of the flow of the Amazon River were made in July 1963 as a joint project of the University of Brazil, the Brazilian Navy, and the U.S. Geological Survey. The discharge of the Amazon River at Obidos was 7,640,000 cfs at an annual flood stage somewhat lower than the average. For comparison the maximum known discharge of the Mississippi River at Vicksburg is about 2,300,000 cfs. Dissolved-solids concentrations and sediment loads of the Amazon River and of several major tributaries were found to be low.

  20. Spectrometry of Pasture Condition and Biogeochemistry in the Central Amazon

    NASA Technical Reports Server (NTRS)

    Asner, Gregory P.; Townsend, Alan R.; Bustamante, Mercedes M. C.

    1999-01-01

    Regional analyses of Amazon cattle pasture biogeochemistry are difficult due to the complexity of human, edaphic, biotic and climatic factors and persistent cloud cover in satellite observations. We developed a method to estimate key biophysical properties of Amazon pastures using hyperspectral reflectance data and photon transport inverse modeling. Remote estimates of live and senescent biomass were strongly correlated with plant-available forms of soil phosphorus and calcium. These results provide a basis for monitoring pasture condition and biogeochemistry in the Amazon Basin using spaceborne hyperspectral sensors.

  1. Forest-rainfall cascades buffer against drought across the Amazon

    NASA Astrophysics Data System (ADS)

    Staal, Arie; Tuinenburg, Obbe A.; Bosmans, Joyce H. C.; Holmgren, Milena; van Nes, Egbert H.; Scheffer, Marten; Zemp, Delphine Clara; Dekker, Stefan C.

    2018-06-01

    Tree transpiration in the Amazon may enhance rainfall for downwind forests. Until now it has been unclear how this cascading effect plays out across the basin. Here, we calculate local forest transpiration and the subsequent trajectories of transpired water through the atmosphere in high spatial and temporal detail. We estimate that one-third of Amazon rainfall originates within its own basin, of which two-thirds has been transpired. Forests in the southern half of the basin contribute most to the stability of other forests in this way, whereas forests in the south-western Amazon are particularly dependent on transpired-water subsidies. These forest-rainfall cascades buffer the effects of drought and reveal a mechanism by which deforestation can compromise the resilience of the Amazon forest system in the face of future climatic extremes.

  2. Surveillance, health promotion and control of Chagas disease in the Amazon Region - Medical attention in the Brazilian Amazon Region: a proposal

    PubMed Central

    Coura, José Rodrigues; Junqueira, Angela CV

    2015-01-01

    We refer to Oswaldo Cruz's reports dating from 1913 about the necessities of a healthcare system for the Brazilian Amazon Region and about the journey of Carlos Chagas to 27 locations in this region and the measures that would need to be adopted. We discuss the risks of endemicity of Chagas disease in the Amazon Region. We recommend that epidemiological surveillance of Chagas disease in the Brazilian Amazon Region and Pan-Amazon region should be implemented through continuous monitoring of the human population that lives in the area, their housing, the environment and the presence of triatomines. The monitoring should be performed with periodic seroepidemiological surveys, semi-annual visits to homes by health agents and the training of malaria microscopists and healthcare technicians to identify Trypanosoma cruzi from patients' samples and T. cruzi infection rates among the triatomines caught. We recommend health promotion and control of Chagas disease through public health policies, especially through sanitary education regarding the risk factors for Chagas disease. Finally, we propose a healthcare system through base hospitals, intermediate-level units in the areas of the Brazilian Amazon Region and air transportation, considering the distances to be covered for medical care. PMID:26560976

  3. Separating the Effects of Tropical Atlantic and Pacific SST-driven Climate Variability on Amazon Carbon Exchange

    NASA Astrophysics Data System (ADS)

    Liptak, J.; Keppel-Aleks, G.

    2016-12-01

    Amazon forests store an estimated 25% of global terrestrial carbon per year [1,2], but the response of Amazon carbon uptake to climate change is highly uncertain. One source of this uncertainty is tropical sea surface temperature variability driven by teleconnections. El Nino-Southern Oscillation (ENSO) is a key driver of year-to-year Amazon carbon exchange, with associated temperature and precipitation changes favoring net carbon storage in La Nina years and net carbon release during El Nino years [3]. To determine how Amazon climate and terrestrial carbon fluxes react to ENSO alone and in concert with other SST-driven teleconnections such as the Atlantic Multidecadal Oscillation (AMO), we force the atmosphere (CAM5) and land (CLM4) components of the CESM(BGC) with prescribed monthly SSTs over the period 1950-2014 in a Historical control simulation. We then run an experiment (PAC) with time-varying SSTs applied only to the tropical equatorial Pacific Ocean, and repeating SST seasonal cycle climatologies elsewhere. Limiting SST variability to the equatorial Pacific indicates that other processes enhance ENSO-driven Amazon climate anomalies. Compared to the Historical control simulation, warming, drying and terrestrial carbon loss over the Amazon during El Nino periods are lower in the PAC simulation, especially prior to 1990 during the cool phase of the AMO. Cooling, moistening, and net carbon uptake during La Nina periods are also reduced in the PAC simulation, but differences are greater after 1990 during the warm phase of the AMO. By quantifying the relationships among climate drivers and carbon fluxes in the Historical and PAC simulations, we both assess the sensitivity of these relationships to the magnitude of ENSO forcing and quantify how other teleconnections affect ENSO-driven Amazon climate feedbacks. We expect that these results will help us improve hypotheses for how Atlantic and Pacific climate trends will affect future Amazon carbon cycling.
    References: [1] Pan, Y. et al. A large and persistent carbon sink in the world's forests. Science 333, 988-993 (2011). [2] Brienen, R. J. W. et al. Long-term decline of the Amazon carbon sink. Nature 519, 344-348 (2015). [3] Botta, A. et al. Long-term variations of climate and carbon fluxes over the Amazon basin. Geophys. Res. Lett. 29 (2002).

  4. Repeat-Pass Multi-Temporal Interferometric SAR Coherence Variations with Amazon Floodplain and Lake Habitats

    NASA Astrophysics Data System (ADS)

    Jung, H.; Alsdorf, D.

    2006-12-01

    Monitoring discharge in the main channels of rivers and upland tributaries as well as storage changes in floodplain lakes is necessary for understanding flooding hazards, methane production, sediment transport, and nutrient exchange. Interferometric processing of synthetic aperture radar (SAR) data may enable hydrologists to detect environmental and ecological changes in hydrological systems over space and time. An aim of our experiments is to characterize interferometric SAR coherence variations that occur in Amazon aquatic habitats. We analyze coherence variations in JERS-1 data at three central Amazon sites: Lake Balbina, the Cabaliana floodplain, and the confluence of the Purus and Amazon rivers. Because radar pulse interactions with inundated vegetation typically follow a double-bounce travel path which returns energy to the antenna, coherence will vary with vegetation type, physical baseline, and temporal baseline. Balbina's vegetation consists mostly of forest and inundated trunks of dead, leafless trees, as opposed to Cabaliana and Amazon-Purus (dominated by flooded forests), thus it serves to isolate the vegetation signal. Coherence variations with baselines were determined from 253 interferograms at Balbina, 210 at Cabaliana, and 153 at Purus. The average temporal and perpendicular baselines (mean ± std.) are 574 ± 394 days and 1708 ± 1159 m at Balbina, 637 ± 435 days and 1381 ± 981 m at Cabaliana, and 587 ± 425 days and 1430 ± 964 m at Purus. Balbina has stronger coherence than either Cabaliana or Amazon-Purus. Mann-Whitney statistical tests show that at Balbina the terra firme and flooded coherence values plotted against perpendicular baseline differ, whereas Cabaliana and Amazon-Purus do not show this difference. Balbina shows a linearly decreasing trend in coherence plotted against temporal baseline, whereas Cabaliana and Amazon-Purus show a steep, non-linear drop-off. A strong annual periodicity is evident in the power spectra of the coherence values for Cabaliana and Amazon-Purus, but not at Balbina, and is likely an indicator of the annual Amazon flood wave. Each ecological habitat is delineated in the Balbina coherence values plotted against temporal baseline, but such delineation is visible in the Cabaliana and Amazon-Purus regions only during high water and for time periods of less than 2 years. Taken together, these observations suggest that terra firme does not have a seasonal variation whereas flooded areas vary with the season.
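    The coherence quantity analyzed above is commonly estimated from two co-registered complex SAR images with a windowed estimator, gamma = |<s1 s2*>| / sqrt(<|s1|^2><|s2|^2>). A minimal sketch is given below; the window size and input arrays are placeholders.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def coherence(s1, s2, window=5):
    """Sample coherence of two co-registered complex SAR images (2D arrays).
    gamma = |<s1 s2*>| / sqrt(<|s1|^2> <|s2|^2>), averaged over a local window."""
    cross = s1 * np.conj(s2)
    num = uniform_filter(np.real(cross), window) + 1j * uniform_filter(np.imag(cross), window)
    den = np.sqrt(uniform_filter(np.abs(s1) ** 2, window)
                  * uniform_filter(np.abs(s2) ** 2, window))
    return np.abs(num) / np.maximum(den, 1e-12)
```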

  5. Software architecture and design of the web services facilitating climate model diagnostic analysis

    NASA Astrophysics Data System (ADS)

    Pan, L.; Lee, S.; Zhang, J.; Tang, B.; Zhai, C.; Jiang, J. H.; Wang, W.; Bao, Q.; Qi, M.; Kubar, T. L.; Teixeira, J.

    2015-12-01

    Climate model diagnostic analysis is a computationally- and data-intensive task because it involves multiple numerical model outputs and satellite observation data that can both be high resolution. We have built an online tool that facilitates this process. The tool is called Climate Model Diagnostic Analyzer (CMDA). It employs web service technology and provides a web-based user interface. The benefits of these choices include: (1) no installation of any software other than a browser, hence platform compatibility; (2) co-location of computation and big data on the server side, with only small results and plots downloaded on the client side, hence high data efficiency; (3) a multi-threaded implementation to achieve parallel performance on multi-core servers; and (4) cloud deployment, so each user has a dedicated virtual machine. In this presentation, we will focus on the computer science aspects of this tool, namely the architectural design, the infrastructure of the web services, the implementation of the web-based user interface, the mechanism of provenance collection, the approach to virtualization, and the Amazon Cloud deployment. As an example, we will describe our methodology for transforming an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). Another example is the use of Docker, a light-weight virtualization container, to distribute and deploy CMDA onto an Amazon EC2 instance. Our CMDA tool has been successfully used in the 2014 Summer School hosted by the JPL Center for Climate Science. Students gave positive feedback in general, and we will report their comments. An enhanced version of CMDA with several new features, some requested by the 2014 students, will be used in the 2015 Summer School soon.
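    A minimal sketch of wrapping an existing analysis routine as a web service with Flask (one of the frameworks named above) is shown below. The route, query parameters, and the run_diagnostic placeholder are hypothetical and not CMDA's actual interface.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def run_diagnostic(variable, start, end):
    # Placeholder for an existing analysis routine; in a CMDA-like setup this
    # would call into the wrapped science application code.
    return {"variable": variable, "start": start, "end": end, "status": "ok"}

@app.route("/diagnostic", methods=["GET"])
def diagnostic():
    result = run_diagnostic(
        request.args.get("variable", "cloud_fraction"),
        request.args.get("start", "2000-01"),
        request.args.get("end", "2010-12"),
    )
    return jsonify(result)

if __name__ == "__main__":
    # In production this app would typically be served by Gunicorn or Tornado.
    app.run(host="0.0.0.0", port=8080)
```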

  6. A nominally second-order cell-centered Lagrangian scheme for simulating elastic-plastic flows on two-dimensional unstructured grids

    NASA Astrophysics Data System (ADS)

    Maire, Pierre-Henri; Abgrall, Rémi; Breil, Jérôme; Loubère, Raphaël; Rebourcet, Bernard

    2013-02-01

    In this paper, we describe a cell-centered Lagrangian scheme devoted to the numerical simulation of solid dynamics on two-dimensional unstructured grids in planar geometry. This numerical method utilizes the classical elastic-perfectly plastic material model initially proposed by Wilkins [M.L. Wilkins, Calculation of elastic-plastic flow, Meth. Comput. Phys. (1964)]. In this model, the Cauchy stress tensor is decomposed into the sum of its deviatoric part and the thermodynamic pressure which is defined by means of an equation of state. Regarding the deviatoric stress, its time evolution is governed by a classical constitutive law for isotropic material. The plasticity model employs the von Mises yield criterion and is implemented by means of the radial return algorithm. The numerical scheme relies on a finite volume cell-centered method wherein numerical fluxes are expressed in terms of sub-cell force. The generic form of the sub-cell force is obtained by requiring the scheme to satisfy a semi-discrete dissipation inequality. Sub-cell force and nodal velocity to move the grid are computed consistently with cell volume variation by means of a node-centered solver, which results from total energy conservation. The nominally second-order extension is achieved by developing a two-dimensional extension in the Lagrangian framework of the Generalized Riemann Problem methodology, introduced by Ben-Artzi and Falcovitz [M. Ben-Artzi, J. Falcovitz, Generalized Riemann Problems in Computational Fluid Dynamics, Cambridge Monogr. Appl. Comput. Math. (2003)]. Finally, the robustness and the accuracy of the numerical scheme are assessed through the computation of several test cases.
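    The radial-return step referenced above can be sketched compactly: after an elastic predictor, the trial deviatoric stress is scaled back onto the von Mises yield surface if it lies outside it. The function below assumes perfect plasticity and illustrative inputs; it is a generic sketch of the algorithm, not the scheme's actual implementation.

```python
import numpy as np

def radial_return(dev_stress_trial, yield_stress):
    """Radial return for von Mises perfect plasticity.
    dev_stress_trial: 3x3 trial deviatoric stress tensor after an elastic predictor step.
    Returns the deviatoric stress, scaled back onto the yield surface if needed."""
    norm = np.sqrt(np.tensordot(dev_stress_trial, dev_stress_trial))
    radius = np.sqrt(2.0 / 3.0) * yield_stress   # yield-surface radius in deviatoric stress space
    if norm <= radius:
        return dev_stress_trial                  # elastic step: the trial state is admissible
    return dev_stress_trial * (radius / norm)    # plastic step: scale back radially
```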

  7. Equivalent orthotropic elastic moduli identification method for laminated electrical steel sheets

    NASA Astrophysics Data System (ADS)

    Saito, Akira; Nishikawa, Yasunari; Yamasaki, Shintaro; Fujita, Kikuo; Kawamoto, Atsushi; Kuroishi, Masakatsu; Nakai, Hideo

    2016-05-01

    In this paper, a combined numerical-experimental methodology for the identification of elastic moduli of orthotropic media is presented. Special attention is given to the laminated electrical steel sheets, which are modeled as orthotropic media with nine independent engineering elastic moduli. The elastic moduli are determined specifically for use with finite element vibration analyses. We propose a three-step methodology based on a conventional nonlinear least squares fit between measured and computed natural frequencies. The methodology consists of: (1) successive augmentations of the objective function by increasing the number of modes, (2) initial condition updates, and (3) appropriate selection of the natural frequencies based on their sensitivities on the elastic moduli. Using the results of numerical experiments, it is shown that the proposed method achieves more accurate converged solution than a conventional approach. Finally, the proposed method is applied to measured natural frequencies and mode shapes of the laminated electrical steel sheets. It is shown that the method can successfully identify the orthotropic elastic moduli that can reproduce the measured natural frequencies and frequency response functions by using finite element analyses with a reasonable accuracy.
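    The least-squares step described above (matching measured to computed natural frequencies) can be sketched as follows; the forward model here is a stand-in placeholder rather than the paper's finite element modal analysis, and all numbers are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

measured_freqs = np.array([210.0, 540.0, 980.0])       # Hz, illustrative measurements

def model_freqs(moduli):
    # Placeholder forward model: in the actual method this would be a finite
    # element modal analysis returning natural frequencies for the given moduli.
    e1, e2, g12 = moduli
    return np.array([0.1 * np.sqrt(e1), 0.16 * np.sqrt(e2), 0.2 * np.sqrt(g12)])

def residuals(moduli):
    return model_freqs(moduli) - measured_freqs

initial_guess = np.array([4.0e6, 1.0e7, 2.0e7])
fit = least_squares(residuals, initial_guess, bounds=(1.0e5, 1.0e9))
print("identified moduli:", fit.x)
```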

  8. Micromechanics and effective elastoplastic behavior of two-phase metal matrix composites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ju, J.W.; Chen, T.M.

    A micromechanical framework is presented to predict the effective (overall) elasto-(visco-)plastic behavior of two-phase particle-reinforced metal matrix composites (PRMMC). In particular, the inclusion phase (particle) is assumed to be elastic and the matrix material is elasto-(visco-)plastic. Emanating from Ju and Chen's (1994a,b) work on effective elastic properties of composites containing many randomly dispersed inhomogeneities, effective elastoplastic deformations and responses of PRMMC are estimated by means of the "effective yield criterion" derived micromechanically by considering effects due to elastic particles embedded in the elastoplastic matrix. The matrix material is elastic or plastic, depending on local stress and deformation, and obeys a general plastic flow rule and hardening law. Arbitrary (general) loadings and unloadings are permitted in the framework through the elastic predictor-plastic corrector two-step operator splitting methodology. The proposed combined micromechanical and computational approach allows one to estimate the overall elastoplastic responses of PRMMCs by accounting for microstructural information (such as the spatial distribution and micro-geometry of particles), the elastic properties of the constituent phases, and the plastic behavior of the matrix-only material.

  9. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research--GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  10. Exploration of cloud computing late start LDRD #149630 : Raincoat. v. 2.1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Echeverria, Victor T.; Metral, Michael David; Leger, Michelle A.

    This report contains documentation from an interoperability study conducted under the Late Start LDRD 149630, Exploration of Cloud Computing. A small late-start LDRD from last year resulted in a study (Raincoat) on using Virtual Private Networks (VPNs) to enhance security in a hybrid cloud environment. Raincoat initially explored the use of OpenVPN on IPv4 and demonstrated that it is possible to secure the communication channel between two small 'test' clouds (a few nodes each) at New Mexico Tech and Sandia. We extended the Raincoat study to add IPSec support via Vyatta routers, to interface with a public cloud (Amazon Elastic Compute Cloud (EC2)), and to be significantly more scalable than the previous iteration. The study contributed to our understanding of interoperability in a hybrid cloud.

  11. Modeling Elastic Wave Propagation from an Underground Chemical Explosion Using Higher Order Finite Difference Approximation: Theory, Validation and Application to SPE

    NASA Astrophysics Data System (ADS)

    Hirakawa, E. T.; Ezzedine, S. M.; Petersson, A.; Sjogreen, B.; Vorobiev, O.; Pitarka, A.; Antoun, T.; Walter, W. R.

    2016-12-01

    Motions from underground explosions are governed by the non-linear hydrodynamic response of the material. However, the numerical calculation of this non-linear constitutive behavior is computationally intensive in contrast to elastic and acoustic linear wave propagation solvers. Here, we develop a hybrid modeling approach with one-way hydrodynamic-to-elastic coupling in three dimensions in order to propagate explosion generated ground motions from the non-linear near-source region to the far-field. Near source motions are computed using GEODYN-L, a Lagrangian hydrodynamics code for high-energy loading of earth materials. Motions on a dense grid of points sampled on two nested shells located beyond the non-linear damaged zone are saved, and then passed to SW4, an anelastic anisotropic fourth order finite difference code for seismic wave modeling. Our coupling strategy is based on the decomposition and uniqueness theorems where motions are introduced into SW4 as a boundary source and continue to propagate as elastic waves at a much lower computational cost than by using GEODYN-L to cover the entire near- and far-field domain. The accuracy of the numerical calculations and the coupling strategy is demonstrated in cases with a purely elastic medium as well as a non-linear medium. Our hybrid modeling approach is applied to SPE-4' and SPE-5, which are the most recent underground chemical explosions conducted at the Nevada National Security Site (NNSS) where the Source Physics Experiments (SPE) are performed. Our strategy by design is capable of incorporating complex non-linear effects near the source as well as volumetric and topographic material heterogeneity along the propagation path to the receiver, and provides new prospects for modeling and understanding explosion generated seismic waveforms. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-698608.

  12. Comparative analysis of different variants of the Uzawa algorithm in problems of the theory of elasticity for incompressible materials.

    PubMed

    Styopin, Nikita E; Vershinin, Anatoly V; Zingerman, Konstantin M; Levin, Vladimir A

    2016-09-01

    Different variants of the Uzawa algorithm are compared with one another. The comparison is performed for the case in which this algorithm is applied to large-scale systems of linear algebraic equations. These systems arise in the finite-element solution of the problems of elasticity theory for incompressible materials. A modification of the Uzawa algorithm is proposed. Computational experiments show that this modification improves the convergence of the Uzawa algorithm for the problems of solid mechanics. The results of computational experiments show that each variant of the Uzawa algorithm considered has its advantages and disadvantages and may be convenient in one case or another.
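
    For reference, the basic Uzawa iteration the compared variants build on solves the saddle-point system arising from mixed displacement-pressure discretizations of incompressible elasticity. A minimal dense-matrix sketch (the 3-by-3 system is illustrative only):

    ```python
    import numpy as np

    def uzawa(A, B, f, g, tau=1.0, tol=1e-10, max_iter=500):
        """Classical Uzawa iteration for the saddle-point system
            [A  B^T] [u]   [f]
            [B  0  ] [p] = [g].
        tau is the pressure relaxation step (0 < tau < 2 / ||B A^-1 B^T||)."""
        p = np.zeros(B.shape[0])
        for k in range(max_iter):
            u = np.linalg.solve(A, f - B.T @ p)   # displacement update
            r = B @ u - g                         # incompressibility residual
            p = p + tau * r                       # pressure (Lagrange multiplier) update
            if np.linalg.norm(r) < tol:
                break
        return u, p, k

    A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])
    B = np.array([[1.0, 1.0, 1.0]])
    u, p, iters = uzawa(A, B, f=np.array([1.0, 0.0, 0.0]), g=np.array([0.0]))
    ```

    In large-scale finite element practice A is sparse and the inner solve is typically itself iterative, which is where the computational trade-offs between variants arise.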

  13. Computer program: Jet 3 to calculate the large elastic plastic dynamically induced deformations of free and restrained, partial and/or complete structural rings

    NASA Technical Reports Server (NTRS)

    Wu, R. W.; Witmer, E. A.

    1972-01-01

    A user-oriented FORTRAN 4 computer program, called JET 3, is presented. The JET 3 program, which employs the spatial finite-element and timewise finite-difference method, can be used to predict the large two-dimensional elastic-plastic transient Kirchhoff-type deformations of a complete or partial structural ring, with various support conditions and restraints, subjected to a variety of initial velocity distributions and externally-applied transient forcing functions. The geometric shapes of the structural ring can be circular or arbitrarily curved and with variable thickness. Strain-hardening and strain-rate effects of the material are taken into account.

  14. Phonon Dispersion in Amorphous Ni-Alloys

    NASA Astrophysics Data System (ADS)

    Vora, A. M.

    2007-06-01

    The well-known model potential is used to investigate the longitudinal and transverse phonon dispersion curves for six Ni-based binary amorphous alloys, viz. Ni31Dy69, Ni33Y67, Ni36Zr64, Ni50Zr50, Ni60Nb40, and Ni81B19. The thermodynamic and elastic properties are also computed from the elastic limits of the phonon dispersion curves. The theoretical approach given by Hubbard-Beeby is used in the present study to compute the phonon dispersion curves. Five local field correction functions proposed by Hartree, Taylor, Ichimaru-Utsumi, Farid et al. and Sarkar et al. are employed to see the effect of exchange and correlation in the aforesaid properties.

  15. A Conforming Multigrid Method for the Pure Traction Problem of Linear Elasticity: Mixed Formulation

    NASA Technical Reports Server (NTRS)

    Lee, Chang-Ock

    1996-01-01

    A multigrid method using conforming P-1 finite element is developed for the two-dimensional pure traction boundary value problem of linear elasticity. The convergence is uniform even as the material becomes nearly incompressible. A heuristic argument for acceleration of the multigrid method is discussed as well. Numerical results with and without this acceleration as well as performance estimates on a parallel computer are included.

  16. An Information-Based Machine Learning Approach to Elasticity Imaging

    PubMed Central

    Hoerig, Cameron; Ghaboussi, Jamshid; Insana, Michael F.

    2016-01-01

    An information-based technique is described for applications in mechanical-property imaging of soft biological media under quasi-static loads. We adapted the Autoprogressive method that was originally developed for civil engineering applications for this purpose. The Autoprogressive method is a computational technique that combines knowledge of object shape and a sparse distribution of force and displacement measurements with finite-element analyses and artificial neural networks to estimate a complete set of stress and strain vectors. Elasticity imaging parameters are then computed from estimated stresses and strains. We introduce the technique using ultrasonic pulse-echo measurements in simple gelatin imaging phantoms having linear-elastic properties so that conventional finite-element modeling can be used to validate results. The Autoprogressive algorithm does not require any assumptions about the material properties and can, in principle, be used to image media with arbitrary properties. We show that by selecting a few well-chosen force-displacement measurements that are appropriately applied during training to establish convergence, we can estimate all nontrivial stress and strain vectors throughout an object and accurately estimate an elastic modulus at high spatial resolution. This new method of modeling the mechanical properties of tissue-like materials introduces a unique method of solving the inverse problem and is the first technique for imaging stress without assuming the underlying constitutive model. PMID:27858175
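
    As a very reduced illustration of the constitutive-learning ingredient (not the full Autoprogressive loop described above), a small neural network can be fitted to synthetic stress-strain pairs and then probed for an apparent modulus. All values below are made up for the sketch:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Synthetic 1-D training pairs: stress = E * strain for a gel-like material (kPa).
    rng = np.random.default_rng(0)
    strain = rng.uniform(-0.05, 0.05, size=(2000, 1))
    E_true = 25.0                                   # kPa, illustrative
    stress = E_true * strain + rng.normal(0.0, 0.01, size=strain.shape)

    nn = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
    nn.fit(strain, stress.ravel())

    # Probe the learned "material model" for an apparent modulus by central difference.
    eps = 0.01
    E_est = (nn.predict([[eps]])[0] - nn.predict([[-eps]])[0]) / (2 * eps)
    print(E_est)                                    # should land near E_true
    ```

    In the method described above, the training pairs instead come from finite-element analyses constrained by the measured forces and displacements, so no constitutive model is assumed a priori.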

  17. Elastic and failure response of imperfect three-dimensional metallic lattices: the role of geometric defects induced by Selective Laser Melting

    NASA Astrophysics Data System (ADS)

    Liu, Lu; Kamm, Paul; García-Moreno, Francisco; Banhart, John; Pasini, Damiano

    2017-10-01

    This paper examines three-dimensional metallic lattices with regular octet and rhombicuboctahedron units fabricated with geometric imperfections via Selective Laser Melting. We use X-ray computed tomography to capture morphology, location, and distribution of process-induced defects with the aim of studying their role in the elastic response, damage initiation, and failure evolution under quasi-static compression. Testing results from in-situ compression tomography show that each lattice exhibits a distinct failure mechanism that is governed not only by cell topology but also by geometric defects induced by additive manufacturing. Extracted from X-ray tomography images, the statistical distributions of three sets of defects, namely strut waviness, strut thickness variation, and strut oversizing, are used to develop numerical models of statistically representative lattices with imperfect geometry. Elastic and failure responses are predicted within 10% agreement from the experimental data. In addition, a computational study is presented to shed light on the relationship between the amplitude of selected defects and the reduction of elastic properties compared to their nominal values. The evolution of failure mechanisms is also explained with respect to strut oversizing, a parameter that can critically cause failure mode transitions that are not visible in defect-free lattices.

  18. Documentation of a computer program to simulate aquifer-system compaction using the modular finite-difference ground-water flow model

    USGS Publications Warehouse

    Leake, S.A.; Prudic, David E.

    1988-01-01

    The process of permanent compaction is not routinely included in simulations of groundwater flow. To simulate storage changes from both elastic and inelastic compaction, a computer program was written for use with the U. S. Geological Survey modular finite-difference groundwater flow model. The new program is called the Interbed-Storage Package. In the Interbed-Storage Package, elastic compaction or expansion is assumed to be proportional to change in head. The constant of proportionality is the product of skeletal component of elastic specific storage and thickness of the sediments. Similarly, inelastic compaction is assumed to be proportional to decline in head. The constant of proportionality is the product of the skeletal component of inelastic specific storage and the thickness of the sediments. Storage changes are incorporated into the groundwater flow model by adding an additional term to the flow equation. Within a model time step, the package appropriately apportions storage changes between elastic and inelastic components on the basis of the relation of simulated head to the previous minimum head. Another package that allows for a time-varying specified-head boundary is also documented. This package was written to reduce the data requirements for test simulations of the Interbed-Storage Package. (USGS)
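
    A simplified sketch of the apportioning logic described above (not the actual package code; variable names and values are illustrative) is:

    ```python
    def compaction_increment(h_new, h_old, h_min, Sske, Sskv, b):
        """Split a head change into elastic and inelastic compaction (positive = compaction).

        h_new, h_old : head at the end/start of the time step
        h_min        : lowest head previously simulated (preconsolidation head)
        Sske, Sskv   : skeletal elastic / inelastic specific storage (1/m)
        b            : interbed thickness (m)
        """
        if h_new >= h_min:
            # head stays above the previous minimum: fully elastic (recoverable)
            return Sske * b * (h_old - h_new), 0.0
        # head falls below the previous minimum: elastic down to h_min, inelastic below it
        elastic = Sske * b * (h_old - h_min) if h_old > h_min else 0.0
        inelastic = Sskv * b * (min(h_old, h_min) - h_new)
        return elastic, inelastic

    # Illustrative numbers (m and 1/m), not taken from the report
    print(compaction_increment(h_new=12.0, h_old=15.0, h_min=13.0,
                               Sske=1e-5, Sskv=1e-3, b=20.0))
    ```

    The corresponding storage change enters the groundwater flow equation as the additional term mentioned in the abstract, and the minimum head is updated whenever a new low is simulated.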

  19. Crustal motion measurements from the POLENET Antarctic Network: comparisons with glacial isostatic adjustment models

    NASA Astrophysics Data System (ADS)

    Wilson, T. J.; Konfal, S. A.; Bevis, M. G.; Spada, G.; Melini, D.; Barletta, V. R.; Kendrick, E. C.; Saddler, D.; Smalley, R., Jr.; Dalziel, I. W. D.; Willis, M. J.

    2016-12-01

    Crustal motions measured by GPS provide a unique proxy record of ice mass change, due to the elastic and viscoelastic response of the earth to removal of ice loads. The ANET/POLENET array of bedrock GPS sites spans much of the Antarctic interior, encompassing regions where glacial isostatic adjustment (GIA) models predict large crustal displacements due to LGM ice loss and including coastal West Antarctica where major modern ice mass loss is documented. To isolate the long-term GIA component of measured crustal motions, we computed and removed elastic displacements due to recent ice mass change. We used the annually resolved ice mass balance data from Martín-Español et al. (2016) derived from a statistical inversion of satellite altimetry, gravimetry, and elastic-corrected GPS data for the period 2003-2013. The Regional Elastic Rebound Calculator (REAR) [Melini et al., 2015] was used to compute elastic vertical and horizontal surface displacements. Uplift due to elastic rebound is substantial in West Antarctica, very minimal in East Antarctica, and variable across the Weddell Embayment. The ANET GPS-derived crustal motion patterns ascribed to non-elastic GIA are spatially complex and differ significantly in magnitude from model predictions. We present a systematic comparison of measured and predicted velocities within different sectors of Antarctica, in order to examine spatial patterns relative to modern ice mass changes, ice history model uncertainties, and lateral variations in earth properties. In the Weddell Embayment region most vertical velocities are lower than uplift predicted by GIA models. Several sites in the southernmost Transantarctic Mountains and the Whitmore Mountains, where small ice mass increase occurs, have vertical uplift significantly exceeding GIA model predictions. There is an intriguing spatial correlation of these fast-moving sites with a low-velocity anomaly in the upper mantle documented by analysis of teleseismic Rayleigh waves by Heeszel et al. (2016). Significant non-elastic GIA velocities occur in the Amundsen Sea Embayment sector, with high uplift flanked by subsiding regions. This pattern can be modeled as a viscoelastic response to ice loss on decadal-centennial time scales in a region with weak upper mantle, consistent with seismic results in the region.

  20. CLUSTOM-CLOUD: In-Memory Data Grid-Based Software for Clustering 16S rRNA Sequence Data in the Cloud Environment.

    PubMed

    Oh, Jeongsu; Choi, Chi-Hwan; Park, Min-Kyu; Kim, Byung Kwon; Hwang, Kyuin; Lee, Sang-Heon; Hong, Soon Gyu; Nasir, Arshan; Cho, Wan-Sup; Kim, Kyung Mo

    2016-01-01

    High-throughput sequencing can produce hundreds of thousands of 16S rRNA sequence reads corresponding to different organisms present in the environmental samples. Typically, analysis of microbial diversity in bioinformatics starts from pre-processing followed by clustering 16S rRNA reads into relatively fewer operational taxonomic units (OTUs). The OTUs are reliable indicators of microbial diversity and greatly accelerate the downstream analysis time. However, existing hierarchical clustering algorithms that are generally more accurate than greedy heuristic algorithms struggle with large sequence datasets. To keep pace with the rapid rise in sequencing data, we present CLUSTOM-CLOUD, which is the first distributed sequence clustering program based on In-Memory Data Grid (IMDG) technology, a distributed data structure to store all data in the main memory of multiple computing nodes. The IMDG technology helps CLUSTOM-CLOUD to enhance both its capability of handling larger datasets and its computational scalability better than its ancestor, CLUSTOM, while maintaining high accuracy. Clustering speed of CLUSTOM-CLOUD was evaluated on published 16S rRNA human microbiome sequence datasets using a small laboratory cluster (10 nodes) and the Amazon EC2 cloud-computing environment. Under the laboratory environment, it required only ~3 hours to process a dataset of 200 K reads regardless of the complexity of the human microbiome data. In turn, one million reads were processed in approximately 20, 14, and 11 hours when utilizing 20, 30, and 40 nodes on the Amazon EC2 cloud-computing environment. The running time evaluation indicates that CLUSTOM-CLOUD can handle much larger sequence datasets than CLUSTOM and is also a scalable distributed processing system. The comparative accuracy test using 16S rRNA pyrosequences of a mock community shows that CLUSTOM-CLOUD achieves higher accuracy than DOTUR, mothur, ESPRIT-Tree, UCLUST and Swarm. CLUSTOM-CLOUD is written in JAVA and is freely available at http://clustomcloud.kopri.re.kr.

  1. CLUSTOM-CLOUD: In-Memory Data Grid-Based Software for Clustering 16S rRNA Sequence Data in the Cloud Environment

    PubMed Central

    Park, Min-Kyu; Kim, Byung Kwon; Hwang, Kyuin; Lee, Sang-Heon; Hong, Soon Gyu; Nasir, Arshan; Cho, Wan-Sup; Kim, Kyung Mo

    2016-01-01

    High-throughput sequencing can produce hundreds of thousands of 16S rRNA sequence reads corresponding to different organisms present in the environmental samples. Typically, analysis of microbial diversity in bioinformatics starts from pre-processing followed by clustering 16S rRNA reads into relatively fewer operational taxonomic units (OTUs). The OTUs are reliable indicators of microbial diversity and greatly accelerate the downstream analysis time. However, existing hierarchical clustering algorithms that are generally more accurate than greedy heuristic algorithms struggle with large sequence datasets. To keep pace with the rapid rise in sequencing data, we present CLUSTOM-CLOUD, which is the first distributed sequence clustering program based on In-Memory Data Grid (IMDG) technology, a distributed data structure to store all data in the main memory of multiple computing nodes. The IMDG technology helps CLUSTOM-CLOUD to enhance both its capability of handling larger datasets and its computational scalability better than its ancestor, CLUSTOM, while maintaining high accuracy. Clustering speed of CLUSTOM-CLOUD was evaluated on published 16S rRNA human microbiome sequence datasets using a small laboratory cluster (10 nodes) and the Amazon EC2 cloud-computing environment. Under the laboratory environment, it required only ~3 hours to process a dataset of 200 K reads regardless of the complexity of the human microbiome data. In turn, one million reads were processed in approximately 20, 14, and 11 hours when utilizing 20, 30, and 40 nodes on the Amazon EC2 cloud-computing environment. The running time evaluation indicates that CLUSTOM-CLOUD can handle much larger sequence datasets than CLUSTOM and is also a scalable distributed processing system. The comparative accuracy test using 16S rRNA pyrosequences of a mock community shows that CLUSTOM-CLOUD achieves higher accuracy than DOTUR, mothur, ESPRIT-Tree, UCLUST and Swarm. CLUSTOM-CLOUD is written in JAVA and is freely available at http://clustomcloud.kopri.re.kr. PMID:26954507

  2. Dispersive approach to two-photon exchange in elastic electron-proton scattering

    DOE PAGES

    Blunden, P. G.; Melnitchouk, W.

    2017-06-14

    We examine the two-photon exchange corrections to elastic electron-nucleon scattering within a dispersive approach, including contributions from both nucleon and Δ intermediate states. The dispersive analysis avoids off-shell uncertainties inherent in traditional approaches based on direct evaluation of loop diagrams, and guarantees the correct unitary behavior in the high energy limit. Using empirical information on the electromagnetic nucleon elastic and NΔ transition form factors, we compute the two-photon exchange corrections both algebraically and numerically. Finally, results are compared with recent measurements of e⁺p to e⁻p cross section ratios from the CLAS, VEPP-3 and OLYMPUS experiments.

  3. Land-use and climate change risks in the Amazon and the need of a novel sustainable development paradigm

    PubMed Central

    Nobre, Carlos A.; Sampaio, Gilvan; Borma, Laura S.; Castilla-Rubio, Juan Carlos; Silva, José S.; Cardoso, Manoel

    2016-01-01

    For half a century, the process of economic integration of the Amazon has been based on intensive use of renewable and nonrenewable natural resources, which has brought significant basin-wide environmental alterations. The rural development in the Amazonia pushed the agricultural frontier swiftly, resulting in widespread land-cover change, but agriculture in the Amazon has been of low productivity and unsustainable. The loss of biodiversity and continued deforestation will lead to high risks of irreversible change of its tropical forests. It has been established by modeling studies that the Amazon may have two “tipping points,” namely, temperature increase of 4 °C or deforestation exceeding 40% of the forest area. If transgressed, large-scale “savannization” of mostly southern and eastern Amazon may take place. The region has warmed about 1 °C over the last 60 y, and total deforestation is reaching 20% of the forested area. The recent significant reductions in deforestation—80% reduction in the Brazilian Amazon in the last decade—opens up opportunities for a novel sustainable development paradigm for the future of the Amazon. We argue for a new development paradigm—away from only attempting to reconcile maximizing conservation versus intensification of traditional agriculture and expansion of hydropower capacity—in which we research, develop, and scale a high-tech innovation approach that sees the Amazon as a global public good of biological assets that can enable the creation of innovative high-value products, services, and platforms through combining advanced digital, biological, and material technologies of the Fourth Industrial Revolution in progress. PMID:27638214

  4. Land-use and climate change risks in the Amazon and the need of a novel sustainable development paradigm.

    PubMed

    Nobre, Carlos A; Sampaio, Gilvan; Borma, Laura S; Castilla-Rubio, Juan Carlos; Silva, José S; Cardoso, Manoel

    2016-09-27

    For half a century, the process of economic integration of the Amazon has been based on intensive use of renewable and nonrenewable natural resources, which has brought significant basin-wide environmental alterations. The rural development in the Amazonia pushed the agricultural frontier swiftly, resulting in widespread land-cover change, but agriculture in the Amazon has been of low productivity and unsustainable. The loss of biodiversity and continued deforestation will lead to high risks of irreversible change of its tropical forests. It has been established by modeling studies that the Amazon may have two "tipping points," namely, temperature increase of 4 °C or deforestation exceeding 40% of the forest area. If transgressed, large-scale "savannization" of mostly southern and eastern Amazon may take place. The region has warmed about 1 °C over the last 60 y, and total deforestation is reaching 20% of the forested area. The recent significant reductions in deforestation-80% reduction in the Brazilian Amazon in the last decade-opens up opportunities for a novel sustainable development paradigm for the future of the Amazon. We argue for a new development paradigm-away from only attempting to reconcile maximizing conservation versus intensification of traditional agriculture and expansion of hydropower capacity-in which we research, develop, and scale a high-tech innovation approach that sees the Amazon as a global public good of biological assets that can enable the creation of innovative high-value products, services, and platforms through combining advanced digital, biological, and material technologies of the Fourth Industrial Revolution in progress.

  5. Land-use and climate change risks in the Amazon and the need of a novel sustainable development paradigm

    NASA Astrophysics Data System (ADS)

    Nobre, Carlos A.; Sampaio, Gilvan; Borma, Laura S.; Castilla-Rubio, Juan Carlos; Silva, José S.; Cardoso, Manoel

    2016-09-01

    For half a century, the process of economic integration of the Amazon has been based on intensive use of renewable and nonrenewable natural resources, which has brought significant basin-wide environmental alterations. The rural development in the Amazonia pushed the agricultural frontier swiftly, resulting in widespread land-cover change, but agriculture in the Amazon has been of low productivity and unsustainable. The loss of biodiversity and continued deforestation will lead to high risks of irreversible change of its tropical forests. It has been established by modeling studies that the Amazon may have two “tipping points,” namely, temperature increase of 4 °C or deforestation exceeding 40% of the forest area. If transgressed, large-scale “savannization” of mostly southern and eastern Amazon may take place. The region has warmed about 1 °C over the last 60 y, and total deforestation is reaching 20% of the forested area. The recent significant reductions in deforestation—80% reduction in the Brazilian Amazon in the last decade—opens up opportunities for a novel sustainable development paradigm for the future of the Amazon. We argue for a new development paradigm—away from only attempting to reconcile maximizing conservation versus intensification of traditional agriculture and expansion of hydropower capacity—in which we research, develop, and scale a high-tech innovation approach that sees the Amazon as a global public good of biological assets that can enable the creation of innovative high-value products, services, and platforms through combining advanced digital, biological, and material technologies of the Fourth Industrial Revolution in progress.

  6. Holocene palaeoenvironmental history of the Amazonian mangrove belt

    NASA Astrophysics Data System (ADS)

    Cohen, Marcelo Cancela Lisboa; Pessenda, Luiz Carlos Ruiz; Behling, Hermann; de Fátima Rossetti, Dilce; França, Marlon Carlos; Guimarães, José Tasso Felix; Friaes, Yuri; Smith, Clarisse Beltrão

    2012-11-01

    Wetland dynamics in the northern Brazilian Amazon region during the Holocene were reviewed using palynological, carbon and nitrogen isotope records, and C/N ratios previously published. The integration of 72 radiocarbon dates recorded in 34 sediment cores sampled along the marine and fluvial littoral, and mainly influenced by the Amazon River, reveals that marine influence and mangrove vegetation were wider than today on the mouth of the Amazon River between >8990-8690 and 2300-2230 cal yr BP, forming a continuous mangrove belt along the northern Brazilian Amazon littoral. The establishment of this mangrove strip is a direct consequence of the marine incursion caused by post-glacial sea-level rise, possibly associated with tectonic subsidence during the Early and Middle Holocene. In the Late Holocene, in areas influenced by the Amazon River discharge, the mangroves were replaced by freshwater vegetation, and the coast morphology evolved from an estuarine-dominated into a rectilinear coast due to coastal progradation. Nevertheless, the marine-influenced littoral, which is currently dominated by mangroves and salt-marsh vegetation, has persistently had brackish water vegetation over tidal mud flats throughout the entire Holocene. The fragmentation of this continuous mangrove line during the Late Holocene was likely caused by the increase in river freshwater discharge associated with the change from dry to wet climates in the Late Holocene. This caused a significant decrease of tidal water salinity in areas near the mouth of the Amazon River. These changes in the Amazon discharge are probably associated with dry and wet periods in the northern Amazon region during the Holocene.

  7. Introduction: Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5)

    NASA Astrophysics Data System (ADS)

    Martin, S. T.; Artaxo, P.; Machado, L. A. T.; Manzi, A. O.; Souza, R. A. F.; Schumacher, C.; Wang, J.; Andreae, M. O.; Barbosa, H. M. J.; Fan, J.; Fisch, G.; Goldstein, A. H.; Guenther, A.; Jimenez, J. L.; Pöschl, U.; Silva Dias, M. A.; Smith, J. N.; Wendisch, M.

    2016-04-01

    The Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) Experiment was carried out in the environs of Manaus, Brazil, in the central region of the Amazon basin for 2 years from 1 January 2014 through 31 December 2015. The experiment focused on the complex interactions among vegetation, atmospheric chemistry, and aerosol production on the one hand and their connections to aerosols, clouds, and precipitation on the other. The objective was to understand and quantify these linked processes, first under natural conditions to obtain a baseline and second when altered by the effects of human activities. To this end, the pollution plume from the Manaus metropolis, superimposed on the background conditions of the central Amazon basin, served as a natural laboratory. The present paper, as the introduction to the special issue of GoAmazon2014/5, presents the context and motivation of the GoAmazon2014/5 Experiment. The nine research sites, including the characteristics and instrumentation of each site, are presented. The sites range from time point zero (T0) upwind of the pollution, to T1 in the midst of the pollution, to T2 just downwind of the pollution, to T3 furthest downwind of the pollution (70 km). In addition to the ground sites, a low-altitude G-159 Gulfstream I (G-1) observed the atmospheric boundary layer and low clouds, and a high-altitude Gulfstream G550 (HALO) operated in the free troposphere. During the 2-year experiment, two Intensive Operating Periods (IOP1 and IOP2) also took place that included additional specialized research instrumentation at the ground sites as well as flights of the two aircraft. GoAmazon2014/5 IOP1 was carried out from 1 February to 31 March 2014 in the wet season. GoAmazon2014/5 IOP2 was conducted from 15 August to 15 October 2014 in the dry season. The G-1 aircraft flew during both IOP1 and IOP2, and the HALO aircraft flew during IOP2. In the context of the Amazon basin, the two IOPs also correspond to the clean and biomass burning seasons, respectively. The Manaus plume is present year-round, and it is transported by prevailing northeasterly and easterly winds in the wet and dry seasons, respectively. This introduction also organizes information relevant to many papers in the special issue. Information is provided on the vehicle fleet, power plants, and industrial activities of Manaus. The mesoscale and synoptic meteorologies relevant to the two IOPs are presented. Regional and long-range transport of emissions during the two IOPs is discussed based on satellite observations across South America and Africa. Fire locations throughout the airshed are detailed. In conjunction with the context and motivation of GoAmazon2014/5 as presented in this introduction, research articles including thematic overview articles are anticipated in this special issue to describe the detailed results and findings of the GoAmazon2014/5 Experiment.

  8. Disease Vector Ecology Profile: Colombia

    DTIC Science & Technology

    1998-12-01

    Fragmentary index entries extracted from this profile include: Culex gnomatus as a vector of VEE (Ecuador, Peru), with many mammals and birds involved but equines as the key reservoirs; arboviruses other than dengue or yellow fever in the Amazon Basin and associated northwestern regions of South America; and antimalarial drug resistance, widespread in Amazonia, Orinoquia, the Cauca Valley, and Caribbean regions, and to mefloquine and amodiaquine in the Amazon Basin, with Plasmodium vivax resistance also noted.

  9. Selective logging in the Brazilian Amazon.

    Treesearch

    G. P. Asner; D. E. Knapp; E. N. Broadbent; P. J. C. Oliveira; M Keller; J. N. Silva

    2005-01-01

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square...

  10. The Green Ocean: Precipitation Insights from the GoAmazon2014/5 Experiment

    DOE PAGES

    Wang, Die; Giangrande, Scott E.; Bartholomew, Mary Jane; ...

    2018-02-07

    This study summarizes the precipitation properties collected during the GoAmazon2014/5 campaign near Manaus in central Amazonia, Brazil. Precipitation breakdowns, summary radar rainfall relationships and self-consistency concepts from coupled disdrometer and radar wind profiler measurements are presented. The properties of Amazon cumulus and associated stratiform precipitation are discussed, including segregations according to seasonal (Wet/Dry regime) variability, cloud echo-top height and possible aerosol influences on the apparent oceanic characteristics of the precipitation drop size distributions. Overall, we observe that the Amazon precipitation straddles behaviors found during previous U.S. Department of Energy Atmospheric Radiation Measurement (ARM) program tropical deployments, with distributions favoring higher concentrations of smaller drops than ARM continental examples. Oceanic-type precipitation characteristics are predominantly observed during the Amazon Wet seasons. Finally, an exploration of the controls on Wet season precipitation properties reveals that wind direction, as compared with other standard radiosonde thermodynamic parameters or aerosol count/regime classifications performed at the ARM site, provides a good indicator for those Wet season Amazon events having an oceanic character for their precipitation drop size distributions.

  11. The Green Ocean: Precipitation Insights from the GoAmazon2014/5 Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Die; Giangrande, Scott E.; Bartholomew, Mary Jane

    This study summarizes the precipitation properties collected during the GoAmazon2014/5 campaign near Manaus in central Amazonia, Brazil. Precipitation breakdowns, summary radar rainfall relationships and self-consistency concepts from coupled disdrometer and radar wind profiler measurements are presented. The properties of Amazon cumulus and associated stratiform precipitation are discussed, including segregations according to seasonal (Wet/Dry regime) variability, cloud echo-top height and possible aerosol influences on the apparent oceanic characteristics of the precipitation drop size distributions. Overall, we observe that the Amazon precipitation straddles behaviors found during previous U.S. Department of Energy Atmospheric Radiation Measurement (ARM) program tropical deployments, with distributions favoring higher concentrations of smaller drops than ARM continental examples. Oceanic-type precipitation characteristics are predominantly observed during the Amazon Wet seasons. Finally, an exploration of the controls on Wet season precipitation properties reveals that wind direction, as compared with other standard radiosonde thermodynamic parameters or aerosol count/regime classifications performed at the ARM site, provides a good indicator for those Wet season Amazon events having an oceanic character for their precipitation drop size distributions.

  12. Electromagnetic scattering and emission by a fixed multi-particle object in local thermal equilibrium: General formalism.

    PubMed

    Mishchenko, Michael I

    2017-10-01

    The majority of previous studies of the interaction of individual particles and multi-particle groups with electromagnetic field have focused on either elastic scattering in the presence of an external field or self-emission of electromagnetic radiation. In this paper we apply semi-classical fluctuational electrodynamics to address the ubiquitous scenario wherein a fixed particle or a fixed multi-particle group is exposed to an external quasi-polychromatic electromagnetic field as well as thermally emits its own electromagnetic radiation. We summarize the main relevant axioms of fluctuational electrodynamics, formulate in maximally rigorous mathematical terms the general scattering-emission problem for a fixed object, and derive such fundamental corollaries as the scattering-emission volume integral equation, the Lippmann-Schwinger equation for the dyadic transition operator, the multi-particle scattering-emission equations, and the far-field limit. We show that in the framework of fluctuational electrodynamics, the computation of the self-emitted component of the total field is completely separated from that of the elastically scattered field. The same is true of the computation of the emitted and elastically scattered components of quadratic/bilinear forms in the total electromagnetic field. These results pave the way to the practical computation of relevant optical observables.

  13. Aqua-planet simulations of the formation of the South Atlantic convergence zone

    NASA Technical Reports Server (NTRS)

    Nieto Ferreira, Rosana; Chao, Winston C.

    2013-01-01

    The impact of Amazon Basin convection and cold fronts on the formation and maintenance of the South Atlantic convergence zone (SACZ) is studied using aqua-planet simulations with a general circulation model. In the model, a circular patch of warm sea-surface temperature (SST) is used to mimic the effect of the Amazon Basin on South American monsoon convection. The aqua-planet simulations were designed to study the effect of the strength and latitude of Amazon Basin convection on the formation of the SACZ. The simulations indicate that the strength of the SACZ increases as the Amazon convection intensifies and is moved away from the equator. Of the two controls studied here, the latitude of the Amazon convection exerts the strongest effect on the strength of the SACZ. An analysis of the synoptic-scale variability in the simulations shows the importance of frontal systems in the formation of the aqua-planet SACZ. Composite time series of frontal systems that occurred in the simulations show that a robust SACZ occurs when fronts penetrate into the subtropics and become stationary there as they cross eastward of the longitude of the Amazon Basin. Moisture convergence associated with these frontal systems produces rainfall not only along the model SACZ region but also along a large portion of the northern model Amazon Basin. Simulations in which the warm SST patch was too weak or too close to the equator did not produce frontal systems that extended into the tropics and became stationary, and did not form a SACZ. In the model, the SACZ forms as Amazon Basin convection strengthens and migrates far enough southward to allow frontal systems to penetrate into the tropics and stall over South America. This result is in agreement with observations that the SACZ tends to form after the onset of the monsoon season in the Amazon Basin.

  14. System and method for measuring residual stress

    DOEpatents

    Prime, Michael B.

    2002-01-01

    The present invention is a method and system for determining the residual stress within an elastic object. In the method, an elastic object is cut along a path having a known configuration. The cut creates a portion of the object having a new free surface. The free surface then deforms to a contour which is different from the path. Next, the contour is measured to determine how much deformation has occurred across the new free surface. Points defining the contour are collected in an empirical data set. The portion of the object is then modeled in a computer simulator. The points in the empirical data set are entered into the computer simulator. The computer simulator then calculates the residual stress along the path which caused the points within the object to move to the positions measured in the empirical data set. The calculated residual stress is then presented in a useful format to an analyst.
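
    As a small illustration of the data-handling half of this contour-based approach, the measured contour is typically smoothed before being used as a boundary condition; a minimal sketch (synthetic measurements, arbitrary units) is:

    ```python
    import numpy as np

    # Synthetic measured heights of the new free surface along the cut (mm)
    y = np.linspace(0.0, 50.0, 26)                       # position across the cut face
    rng = np.random.default_rng(1)
    z = 0.002 * (y - 25.0) ** 2 / 25.0 + rng.normal(0.0, 0.0005, y.size)

    # Smooth the measured contour to suppress measurement noise (fit order is a user choice)
    coeffs = np.polyfit(y, z, deg=4)
    z_smooth = np.polyval(coeffs, y)

    # In the full method the negative of the smoothed contour is imposed as a displacement
    # boundary condition on a model of the cut part; the stresses required to enforce that
    # shape are the residual stresses that acted on the cut plane.
    bc_displacements = -z_smooth
    ```

    The simulation step itself requires a mesh and an elastic solver and is not reproduced here.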

  15. Computational strategies in the dynamic simulation of constrained flexible MBS

    NASA Technical Reports Server (NTRS)

    Amirouche, F. M. L.; Xie, M.

    1993-01-01

    This research focuses on the computational dynamics of flexible constrained multibody systems. At first a recursive mapping formulation of the kinematical expressions in a minimum dimension as well as the matrix representation of the equations of motion are presented. The method employs Kane's equation, FEM, and concepts of continuum mechanics. The generalized active forces are extended to include the effects of high temperature conditions, such as creep, thermal stress, and elastic-plastic deformation. The time variant constraint relations for rolling/contact conditions between two flexible bodies are also studied. The constraints for validation of MBS simulation of gear meshing contact using a modified Timoshenko beam theory are also presented. The last part deals with minimization of vibration/deformation of the elastic beam in multibody systems making use of time variant boundary conditions. The above methodologies and computational procedures developed are being implemented in a program called DYAMUS.

  16. Simulating hydrologic and hydraulic processes throughout the Amazon River Basin

    USGS Publications Warehouse

    Beighley, R.E.; Eggert, K.G.; Dunne, T.; He, Y.; Gummadi, V.; Verdin, K.L.

    2009-01-01

    Presented here is a model framework based on a land surface topography that can be represented with various degrees of resolution and capable of providing representative channel/floodplain hydraulic characteristics on a daily to hourly scale. The framework integrates two models: (1) a water balance model (WBM) for the vertical fluxes and stores of water in and through the canopy and soil layers based on the conservation of mass and energy, and (2) a routing model for the horizontal routing of surface and subsurface runoff and channel and floodplain waters based on kinematic and diffusion wave methodologies. The WBM is driven by satellite-derived precipitation (TRMM_3B42) and air temperature (MOD08_M3). The model's use of an irregular computational grid is intended to facilitate parallel processing for applications to continental and global scales. Results are presented for the Amazon Basin over the period Jan 2001 through Dec 2005. The model is shown to capture annual runoff totals, annual peaks, seasonal patterns, and daily fluctuations over a range of spatial scales (>1,000 to <4.7 million km²). For the period of study, results suggest basin-wide total water storage changes in the Amazon vary by approximately ±5 to 10 cm, and the fractional components accounting for these changes are: root zone soil moisture (20%), subsurface water being routed laterally to channels (40%) and channel/floodplain discharge (40%). Annual variability in monthly water storage changes of ±2.5 cm is likely due to 0.5 to 1 month variability in the arrival of significant rainfall periods throughout the basin.
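
    To make the two-model structure concrete, a drastically simplified single-store water balance feeding a single-reach routing step might look like the sketch below (bucket capacity, celerity, reach length and forcing values are all illustrative, not the paper's parameterization):

    ```python
    import numpy as np

    def bucket_wbm(precip, pet, smax=150.0, s0=75.0):
        """Daily single-bucket water balance (mm): returns a lateral runoff series."""
        s, runoff = s0, []
        for p, e in zip(precip, pet):
            s = s + p - min(e, s)            # add rain, remove ET limited by storage
            q = max(s - smax, 0.0)           # excess above capacity becomes runoff
            s -= q
            runoff.append(q)
        return np.array(runoff)

    def route_reach(inflow, dx=100_000.0, dt=86_400.0, c=1.0):
        """Explicit single-reach kinematic-wave-style routing with constant celerity c (m/s)."""
        cr = c * dt / dx                      # Courant number, kept <= 1 for stability
        q = np.zeros_like(inflow)
        for t in range(1, len(inflow)):
            q[t] = q[t - 1] + cr * (inflow[t - 1] - q[t - 1])
        return q

    precip = np.array([0, 20, 35, 5, 0, 60, 10, 0, 0, 15], dtype=float)   # mm/day
    pet = np.full_like(precip, 4.0)                                       # mm/day
    q_channel = route_reach(bucket_wbm(precip, pet))
    ```

    The framework described above is far richer (canopy and soil stores, energy balance, an irregular computational grid, diffusion-wave floodplain routing), but the same mass-conservation-then-routing structure applies.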

  17. Computer-animated model of accommodation and presbyopia.

    PubMed

    Goldberg, Daniel B

    2015-02-01

    To understand, demonstrate, and further research the mechanisms of accommodation and presbyopia. Private practice, Little Silver, New Jersey, USA. Experimental study. The CAMA 2.0 computer-animated model of accommodation and presbyopia was produced in collaboration with an experienced medical animator using Autodesk Maya animation software and Adobe After Effects. The computer-animated model demonstrates the configuration and synchronous movements of all accommodative elements. A new classification of the zonular apparatus based on structure and function is proposed. There are 3 divisions of zonular fibers; that is, anterior, crossing, and posterior. The crossing zonular fibers form a scaffolding to support the lens; the anterior and posterior zonular fibers work reciprocally to achieve focused vision. The model demonstrates the important support function of Weiger ligament. Dynamic movement of the ora serrata demonstrates that the forces of ciliary muscle contraction store energy for disaccommodation in the elastic choroid. The flow of aqueous and vitreous provides strong evidence for our understanding of the hydrodynamic interactions during the accommodative cycle. The interaction may result from the elastic stretch in the choroid transmitted to the vitreous rather than from vitreous pressure. The model supports the concept that presbyopia results from loss of elasticity and increasing ocular rigidity in both the lenticular and extralenticular structures. The computer-animated model demonstrates the structures of accommodation moving in synchrony and might enhance understanding of the mechanisms of accommodation and presbyopia. Dr. Goldberg is a consultant to Acevision, Inc., and Bausch & Lomb. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  18. In vitro flow assessment: from PC-MRI to computational fluid dynamics including fluid-structure interaction

    NASA Astrophysics Data System (ADS)

    Kratzke, Jonas; Rengier, Fabian; Weis, Christian; Beller, Carsten J.; Heuveline, Vincent

    2016-04-01

    Initiation and development of cardiovascular diseases can be highly correlated to specific biomechanical parameters. To examine and assess biomechanical parameters, numerical simulation of cardiovascular dynamics has the potential to complement and enhance medical measurement and imaging techniques. As such, computational fluid dynamics (CFD) has been shown to be suitable for evaluating blood velocity and pressure in scenarios where vessel wall deformation plays a minor role. However, there is a need for further validation studies and the inclusion of vessel wall elasticity for morphologies subject to large displacement. In this work, we consider a fluid-structure interaction (FSI) model including the full elasticity equation to take the deformability of aortic wall soft tissue into account. We present a numerical framework, in which either a CFD study can be performed for less deformable aortic segments or an FSI simulation for regions of large displacement such as the aortic root and arch. Both of the methods are validated by means of an aortic phantom experiment. The computational results are in good agreement with 2D phase-contrast magnetic resonance imaging (PC-MRI) velocity measurements as well as catheter-based pressure measurements. The FSI simulation shows a characteristic vessel compliance effect on the flow field induced by the elasticity of the vessel wall, which the CFD model is not capable of capturing. The in vitro validated FSI simulation framework can enable the computation of complementary biomechanical parameters such as the stress distribution within the vessel wall.

  19. Elastic membranes in confinement.

    PubMed

    Bostwick, J B; Miksis, M J; Davis, S H

    2016-07-01

    An elastic membrane stretched between two walls takes a shape defined by its length and the volume of fluid it encloses. Many biological structures, such as cells, mitochondria and coiled DNA, have fine internal structure in which a membrane (or elastic member) is geometrically 'confined' by another object. Here, the two-dimensional shape of an elastic membrane in a 'confining' box is studied by introducing a repulsive confinement pressure that prevents the membrane from intersecting the wall. The stage is set by contrasting confined and unconfined solutions. Continuation methods are then used to compute response diagrams, from which we identify the particular membrane mechanics that generate mitochondria-like shapes. Large confinement pressures yield complex response diagrams with secondary bifurcations and multiple turning points where modal identities may change. Regions in parameter space where such behaviour occurs are then mapped. © 2016 The Author(s).

  20. Assessing exchange-correlation functionals for elasticity and thermodynamics of α-ZrW2O8: A density functional perturbation theory study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weck, Philippe F.; Kim, Eunja; Greathouse, Jeffery A.

    Elastic and thermodynamic properties of negative thermal expansion (NTE) α-ZrW2O8 have been calculated using PBEsol and PBE exchange-correlation functionals within the framework of density functional perturbation theory (DFPT). Measured elastic constants are reproduced within ~2% with PBEsol and 6% with PBE. The thermal evolution of the Grüneisen parameter computed within the quasi-harmonic approximation exhibits negative values below the Debye temperature, consistent with observation. The standard molar heat capacity is predicted to be C°p = 192.2 and 193.8 J mol⁻¹ K⁻¹ with PBEsol and PBE, respectively. These results suggest superior accuracy of DFPT/PBEsol for studying the lattice dynamics, elasticity and thermodynamics of NTE materials.
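
    For context, the quantities discussed are related through the standard quasi-harmonic expressions (textbook relations, not specific to this report): the mode Grüneisen parameters and the volumetric thermal expansion coefficient are

    ```latex
    \gamma_i = -\frac{\partial \ln \omega_i}{\partial \ln V},
    \qquad
    \alpha_V = \frac{1}{B_T V}\sum_i \gamma_i \, c_{V,i}
    ```

    where ω_i are the phonon frequencies, B_T is the isothermal bulk modulus, and c_{V,i} are the mode heat capacities; predominantly negative γ_i below the Debye temperature yield the negative thermal expansion noted above.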

  1. Rayleigh-Taylor instability in soft elastic layers

    NASA Astrophysics Data System (ADS)

    Riccobelli, D.; Ciarletta, P.

    2017-04-01

    This work investigates the morphological stability of a soft body composed of two heavy elastic layers attached to a rigid surface and subjected only to the bulk gravity force. Using theoretical and computational tools, we characterize the selection of different patterns as well as their nonlinear evolution, unveiling the interplay between elastic and geometric effects for their formation. Unlike similar gravity-induced shape transitions in fluids, such as the Rayleigh-Taylor instability, we prove that the nonlinear elastic effects saturate the dynamic instability of the bifurcated solutions, displaying a rich morphological diagram where both digitations and stable wrinkling can emerge. The results of this work provide important guidelines for the design of novel soft systems with tunable shapes, with several applications in engineering sciences. This article is part of the themed issue 'Patterning through instabilities in complex media: theory and applications.'

  2. Nonstationary Deformation of an Elastic Layer with Mixed Boundary Conditions

    NASA Astrophysics Data System (ADS)

    Kubenko, V. D.

    2016-11-01

    The analytic solution to the plane problem for an elastic layer under a nonstationary surface load is found for mixed boundary conditions: normal stress and tangential displacement are specified on one side of the layer (fourth boundary-value problem of elasticity) and tangential stress and normal displacement are specified on the other side of the layer (second boundary-value problem of elasticity). The Laplace and Fourier integral transforms are applied. The inverse Laplace and Fourier transforms are found exactly using tabulated formulas and convolution theorems for various nonstationary loads. Explicit analytical expressions for stresses and displacements are derived. Loads applied to a constant surface area and to a surface area varying in a prescribed manner are considered. Computations demonstrate the dependence of the normal stress on time and spatial coordinates. Features of wave processes are analyzed.

  3. Assessing exchange-correlation functionals for elasticity and thermodynamics of α-ZrW2O8: A density functional perturbation theory study

    DOE PAGES

    Weck, Philippe F.; Kim, Eunja; Greathouse, Jeffery A.; ...

    2018-03-15

    Elastic and thermodynamic properties of negative thermal expansion (NTE) α-ZrW2O8 have been calculated using PBEsol and PBE exchange-correlation functionals within the framework of density functional perturbation theory (DFPT). Measured elastic constants are reproduced within ~2% with PBEsol and 6% with PBE. The thermal evolution of the Grüneisen parameter computed within the quasi-harmonic approximation exhibits negative values below the Debye temperature, consistent with observation. The standard molar heat capacity is predicted to be C°p = 192.2 and 193.8 J mol⁻¹ K⁻¹ with PBEsol and PBE, respectively. These results suggest superior accuracy of DFPT/PBEsol for studying the lattice dynamics, elasticity and thermodynamics of NTE materials.

  4. The economic value of the climate regulation ecosystem service provided by the Amazon rainforest

    NASA Astrophysics Data System (ADS)

    Heil Costa, Marcos; Pires, Gabrielle; Fontes, Vitor; Brumatti, Livia

    2017-04-01

    The rainy Amazon climate has allowed important activities to develop in the region, such as large rainfed agricultural areas and hydropower plants. The Amazon rainforest is an important source of moisture to the regional atmosphere and helps regulate the local climate. The replacement of forest by agricultural lands decreases the flux of water vapor into the atmosphere and changes the precipitation patterns, which may severely affect such economic activities. Assigning an economic value to this ecosystem service may emphasize the importance of preserving the Amazon rainforest. In this work, we provide a first approximation of the quantification of the climate regulation ecosystem service provided by the Amazon rainforest using the marginal production method. We use climate scenarios derived from Amazon deforestation scenarios as input to crop and runoff models to assess how land use change would affect agriculture and hydropower generation. The effects of forest removal on soybean production and on cattle beef production can both be as high as US$ 16 per year per ha deforested, and the effects on hydropower generation can be as high as US$ 8 per year per ha deforested. We consider this a conservative estimate of a permanent service provided by the rainforest. Policy makers and Amazon agriculture and energy businesses must be aware of these numbers, and consider them while planning their activities.
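
    Taking the quoted upper-bound figures at face value, the marginal-production argument reduces to simple per-hectare arithmetic; the deforested area below is purely hypothetical:

    ```python
    # Upper-bound per-hectare losses quoted above (US$ per year per deforested hectare)
    loss_soy_per_ha = 16.0
    loss_beef_per_ha = 16.0
    loss_hydro_per_ha = 8.0
    deforested_ha = 1_000_000        # a hypothetical additional 1 Mha of deforestation

    annual_loss = deforested_ha * (loss_soy_per_ha + loss_beef_per_ha + loss_hydro_per_ha)
    print(f"US$ {annual_loss:,.0f} per year")   # US$ 40,000,000 per year
    ```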

  5. Virtual Facility at Fermilab: Infrastructure and Services Expand to Public Clouds

    DOE PAGES

    Timm, Steve; Garzoglio, Gabriele; Cooper, Glenn; ...

    2016-02-18

    In preparation for its new Virtual Facility Project, Fermilab has launched a program of work to determine the requirements for running a computation facility on-site, in public clouds, or a combination of both. This program builds on the work we have done to successfully run experimental workflows of 1000-VM scale both on an on-site private cloud and on Amazon AWS. To do this at scale we deployed dynamically launched and discovered caching services on the cloud. We are now testing the deployment of more complicated services on Amazon AWS using native load balancing and auto scaling features they provide. The Virtual Facility Project will design and develop a facility including infrastructure and services that can live on the site of Fermilab, off-site, or a combination of both. We expect to need this capacity to meet the peak computing requirements in the future. The Virtual Facility is intended to provision resources on the public cloud on behalf of the facility as a whole instead of having each experiment or Virtual Organization do it on their own. We will describe the policy aspects of a distributed Virtual Facility, the requirements, and plans to make a detailed comparison of the relative cost of the public and private clouds. Furthermore, this talk will present the details of the technical mechanisms we have developed to date, and the plans currently taking shape for a Virtual Facility at Fermilab.

  6. Open NASA Earth Exchange (OpenNEX): Strategies for enabling cross organization collaboration in the earth sciences

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Ganguly, S.; Nemani, R. R.; Votava, P.; Wang, W.; Lee, T. J.; Dungan, J. L.

    2014-12-01

    Sharing community-valued codes, intermediary datasets, and results from individual efforts with others who are not part of a directly funded collaboration can be a challenge. Cross-organization collaboration is often impeded by infrastructure security constraints, rigid financial controls, bureaucracy, workforce nationalities, and similar obstacles, which can force groups to work in a segmented fashion and/or through awkward and suboptimal web services. We show how a focused community may come together and share modeling and analysis codes, computing configurations, scientific results, knowledge, and expertise on a public cloud platform, with diverse groups of researchers working together at "arm's length". Through the OpenNEX experimental workshop, users can view short technical "how-to" videos and explore encapsulated working environments. Workshop participants can easily instantiate Amazon Machine Images (AMIs) or launch full cluster and data processing configurations within minutes. Enabling users to instantiate computing environments from configuration templates on large public cloud infrastructures, such as Amazon Web Services, may provide a mechanism for groups to easily use each other's work and collaborate indirectly. Moreover, using the public cloud for this workshop allowed a single group to host a large read-only data archive, making datasets of interest widely available to the community. Other groups could connect directly to the data, reducing the cost of collaborative work by freeing individual groups from redundantly retrieving, integrating, or financing the storage of those datasets.
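
    To make the "instantiate an AMI within minutes" step concrete, the following minimal boto3 sketch launches a single EC2 instance from a hypothetical workshop AMI. It illustrates the general mechanism only and is not the actual OpenNEX workshop tooling; the AMI ID, key-pair name, and region are placeholders.

      import boto3

      ec2 = boto3.resource("ec2", region_name="us-west-2")

      # Hypothetical AMI and key-pair names standing in for a workshop image.
      instances = ec2.create_instances(
          ImageId="ami-0123456789abcdef0",   # placeholder workshop AMI
          InstanceType="m5.xlarge",
          MinCount=1,
          MaxCount=1,
          KeyName="workshop-key",
          TagSpecifications=[{
              "ResourceType": "instance",
              "Tags": [{"Key": "Name", "Value": "opennex-workshop"}],
          }],
      )

      instance = instances[0]
      instance.wait_until_running()   # block until the VM is running
      instance.reload()
      print("Launched", instance.id, "at", instance.public_dns_name)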

  7. Virtual Facility at Fermilab: Infrastructure and Services Expand to Public Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timm, Steve; Garzoglio, Gabriele; Cooper, Glenn

    In preparation for its new Virtual Facility Project, Fermilab has launched a program of work to determine the requirements for running a computation facility on-site, in public clouds, or a combination of both. This program builds on the work we have done to successfully run experimental workflows of 1000-VM scale both on an on-site private cloud and on Amazon AWS. To do this at scale we deployed dynamically launched and discovered caching services on the cloud. We are now testing the deployment of more complicated services on Amazon AWS using native load balancing and auto scaling features they provide. The Virtual Facility Project will design and develop a facility including infrastructure and services that can live on the site of Fermilab, off-site, or a combination of both. We expect to need this capacity to meet the peak computing requirements in the future. The Virtual Facility is intended to provision resources on the public cloud on behalf of the facility as a whole instead of having each experiment or Virtual Organization do it on their own. We will describe the policy aspects of a distributed Virtual Facility, the requirements, and plans to make a detailed comparison of the relative cost of the public and private clouds. Furthermore, this talk will present the details of the technical mechanisms we have developed to date, and the plans currently taking shape for a Virtual Facility at Fermilab.

  8. Shaping through buckling in elastic gridshells: from camping tents to architectural roofs

    NASA Astrophysics Data System (ADS)

    Reis, Pedro

    Elastic gridshells comprise an initially planar network of elastic rods that is actuated into a 3D shell-like structure by loading its extremities. This shaping results from elastic buckling and the subsequent geometrically nonlinear deformation of the grid structure. Architectural elastic gridshells first appeared in the 1970s. However, to date, only a limited number of examples have been constructed around the world, primarily due to the challenges involved in their structural design. Yet, elastic gridshells are highly appealing: they can cover wide spans with low self-weight, they allow for aesthetically pleasing shapes, and their construction is typically simple and rapid. We study the mechanics of elastic gridshells by combining precision model experiments, which explore their scale invariance, with computer simulations that employ the Discrete Elastic Rods method. Excellent agreement is found between the two. Upon validation, the numerics are then used to systematically explore parameter space and identify general design principles for specific target final shapes. Our findings are rationalized using the theory of discrete Chebyshev nets, together with group theory for crystals. Higher buckling modes occur for some configurations due to geometric incompatibility at the boundary and result in symmetry breaking. Along with the systematic classification of the various possible modes of deformation, we provide a reduced model that rationalizes form-finding in elastic gridshells. This work was done in collaboration with Changyeob Baek, Khalid Jawed and Andrew Sageman-Furnas. We are grateful to the NSF for funding (CAREER, CMMI-1351449).

  9. Metagenomics Analysis of Microorganisms in Freshwater Lakes of the Amazon Basin.

    PubMed

    Toyama, Danyelle; Kishi, Luciano Takeshi; Santos-Júnior, Célio Dias; Soares-Costa, Andrea; de Oliveira, Tereza Cristina Souza; de Miranda, Fernando Pellon; Henrique-Silva, Flávio

    2016-12-22

    The Amazon Basin is the largest hydrographic basin on the planet, and the dynamics of its aquatic microorganisms strongly impact global biogeochemical cycles. However, it remains poorly studied. This metagenome project was performed to obtain a snapshot of prokaryotic microbiota from four important lakes in the Amazon Basin. Copyright © 2016 Toyama et al.

  10. Simulation of Blast Loading on an Ultrastructurally-based Computational Model of the Ocular Lens

    DTIC Science & Technology

    2016-12-01

    organelles. Additionally, the cell membranes demonstrated the classic ball-and-socket loops. For the SEM images, they were placed in two fixatives and mounted...considered (fibrous network and matrix), both components are modelled using a hyper-elastic framework, and the resulting constitutive model is embedded in a...within the framework of hyper-elasticity). Full details on the linearization procedures that were adopted in these previous models or the convergence

  11. Elastic plate spallation

    NASA Technical Reports Server (NTRS)

    Oline, L.; Medaglia, J.

    1972-01-01

    The dynamic finite element method was used to investigate elastic stress waves in a plate. Strain-displacement and stress-strain relations are discussed along with the stiffness and mass matrices. The results of studying a point load and distributed loads over small, intermediate, and large radii are reported. The derivation of finite element matrices, and the derivation of lumped and consistent matrices for one-dimensional problems with Laplace transform solutions, are included. The computer program JMMSPALL is also included.

  12. Asymptotic analysis of hierarchical martensitic microstructure

    NASA Astrophysics Data System (ADS)

    Cesana, Pierluigi; Porta, Marcel; Lookman, Turab

    2014-12-01

    We consider a hierarchical nested microstructure, which also contains a point of singularity (disclination) at the origin, observed in lead orthovanadate. We show how to exactly compute the energy cost and associated displacement field within linearized elasticity by enforcing geometric compatibility of strains across interfaces of the three-phase mixture of distortions (variants) in the microstructure. We prove that the mechanical deformation is purely elastic and discuss the behavior of the system close to the origin.

  13. Analyzing and modeling gravity and magnetic anomalies using the SPHERE program and Magsat data

    NASA Technical Reports Server (NTRS)

    Braile, L. W.; Hinze, W. J.; Vonfrese, R. R. B. (Principal Investigator)

    1981-01-01

    Computer codes were completed, tested, and documented for analyzing magnetic anomaly vector components by equivalent point dipole inversion. The codes are intended for use in inverting the magnetic anomaly due to a spherical prism in a horizontal geomagnetic field and for recomputing the anomaly in a vertical geomagnetic field. Modeling of potential fields at satellite elevations that are derived from three dimensional sources by program SPHERE was made significantly more efficient by improving the input routines. A preliminary model of the Andean subduction zone was used to compute the anomaly at satellite elevations using both actual geomagnetic parameters and vertical polarization. Program SPHERE is also being used to calculate satellite level magnetic and gravity anomalies from the Amazon River Aulacogen.

  14. Oil and gas projects in the Western Amazon: threats to wilderness, biodiversity, and indigenous peoples.

    PubMed

    Finer, Matt; Jenkins, Clinton N; Pimm, Stuart L; Keane, Brian; Ross, Carl

    2008-08-13

    The western Amazon is the most biologically rich part of the Amazon basin and is home to a great diversity of indigenous ethnic groups, including some of the world's last uncontacted peoples living in voluntary isolation. Unlike the eastern Brazilian Amazon, it is still a largely intact ecosystem. Underlying this landscape are large reserves of oil and gas, many yet untapped. The growing global demand is leading to unprecedented exploration and development in the region. We synthesized information from government sources to quantify the status of oil development in the western Amazon. National governments delimit specific geographic areas or "blocks" that are zoned for hydrocarbon activities, which they may lease to state and multinational energy companies for exploration and production. About 180 oil and gas blocks now cover approximately 688,000 km² of the western Amazon. These blocks overlap the most species-rich part of the Amazon. We also found that many of the blocks overlap indigenous territories, both titled lands and areas utilized by peoples in voluntary isolation. In Ecuador and Peru, oil and gas blocks now cover more than two-thirds of the Amazon. In Bolivia and western Brazil, major exploration activities are set to increase rapidly. Without improved policies, the increasing scope and magnitude of planned extraction means that environmental and social impacts are likely to intensify. We review the most pressing oil- and gas-related conservation policy issues confronting the region. These include the need for regional Strategic Environmental Impact Assessments and the adoption of roadless extraction techniques. We also consider the conflicts where the blocks overlap indigenous peoples' territories.

  15. On Dams in the Amazon Basin, Teleconnected Impacts, and Neighbors Unaware of the Damage to their Natural Resources and Assets.

    NASA Astrophysics Data System (ADS)

    Latrubesse, E. M.; Park, E.

    2017-12-01

    In a recent study, Latrubesse et al. (2017) demonstrated that the accumulated negative environmental effects of more than one hundred existing dams and at least 288 proposed dams, if constructed, will trigger massive hydrophysical and biotic disturbances that will affect the Amazon basin's floodplains, estuary and sediment plume. The authors introduced a Dam Environmental Vulnerability Index (DEVI) to quantify the current and potential impacts of dams in the basin. The current and potential vulnerabilities of different regions of the Amazon basin were assessed, and the results highlighted the need for a more efficient and integrative legal framework involving all nine countries of the basin in an anticipatory assessment to minimize the negative socio-environmental and biotic impacts of hydropower developments. Here we present expanded information on the potential impacts of dams in the lower Amazon and the northeast Atlantic coast of South America, and revisit our proposed integrative strategies for basin management, which are based on the adaptation and functionality of the institutional and legal framework already existing in the Amazon countries. Participative strategies involving members from the Amazon Cooperation Treaty Organization (ACTO) countries and additional members (for example, France), such as the creation of a basin committee (as defined by the Brazilian Law of Waters) and the creation of an Amazon Basin Panel allowing the participation of scientists, who could have a policy-relevant role but should not be policy-prescriptive, are also discussed. References: Latrubesse, E., Arima, E., Dunne, T., Park, E., Baker, V., Horta, F., Wight, C., Wittmann, F., Zuanon, J., Baker, P., Ribas, C., Norgaard, R., Filizola, N., Ansar, A., Flyvbjerg, B., Stevaux, J. 2017. Damming the rivers of the Amazon basin. Nature, 546, 363-369.

  16. Oil and Gas Projects in the Western Amazon: Threats to Wilderness, Biodiversity, and Indigenous Peoples

    PubMed Central

    Finer, Matt; Jenkins, Clinton N.; Pimm, Stuart L.; Keane, Brian; Ross, Carl

    2008-01-01

    Background The western Amazon is the most biologically rich part of the Amazon basin and is home to a great diversity of indigenous ethnic groups, including some of the world's last uncontacted peoples living in voluntary isolation. Unlike the eastern Brazilian Amazon, it is still a largely intact ecosystem. Underlying this landscape are large reserves of oil and gas, many yet untapped. The growing global demand is leading to unprecedented exploration and development in the region. Methodology/Principal Findings We synthesized information from government sources to quantify the status of oil development in the western Amazon. National governments delimit specific geographic areas or “blocks” that are zoned for hydrocarbon activities, which they may lease to state and multinational energy companies for exploration and production. About 180 oil and gas blocks now cover ∼688,000 km² of the western Amazon. These blocks overlap the most species-rich part of the Amazon. We also found that many of the blocks overlap indigenous territories, both titled lands and areas utilized by peoples in voluntary isolation. In Ecuador and Peru, oil and gas blocks now cover more than two-thirds of the Amazon. In Bolivia and western Brazil, major exploration activities are set to increase rapidly. Conclusions/Significance Without improved policies, the increasing scope and magnitude of planned extraction means that environmental and social impacts are likely to intensify. We review the most pressing oil- and gas-related conservation policy issues confronting the region. These include the need for regional Strategic Environmental Impact Assessments and the adoption of roadless extraction techniques. We also consider the conflicts where the blocks overlap indigenous peoples' territories. PMID:18716679

  17. Earth Observations taken by the Expedition 17 Crew

    NASA Image and Video Library

    2008-08-19

    ISS017-E-013856 (19 Aug. 2008) --- Amazon River, Brazil is featured in this image photographed by an Expedition 17 crewmember on the International Space Station. This image shows the huge sunglint zone, common to oblique views from space, of the setting sun shining off the Amazon River and numerous lakes on its floodplain. About 150 kilometers of the sinuous Amazon course is shown here, as it appears about 1,000 kilometers from the Atlantic Ocean. The Uatuma River enters on the north side of the Amazon (top). A small side channel of the very large Madeira River enters the view from the left. Tupinambarama Island occupies the swampy wetlands between the Amazon and Madeira rivers. Sunglint images reveal great detail in waterbodies -- in this case the marked difference between the smooth outline of the Amazon and the jagged shoreline of the Uatuma River. The jagged shoreline results from valley sides being eroded in relatively hard rocks. The Uatuma River has since been dammed up by the sediment mass of the Amazon floodplain. Because the Amazon flows in its own soft sediment, its huge water discharge smooths the banks. Another dammed valley (known as a ria) is visible beneath the cirrus cloud of a storm (bottom). Although no smoke plumes from forest fires are visible in the view, two kinds of evidence show that there is smoke in the atmosphere. The coppery color of the sunglint is typically produced by smoke particles and other aerosols scattering yellow and red light. Second, a small patch of cloud (top right) casts a distinct shadow. The shadow, say scientists, is visible because so many particles in the surrounding sunlit parts of the atmosphere reflect light to the camera.

  18. The Climate Effects of Deforestation the Amazon Rainforest under Global Warming Conditions

    NASA Astrophysics Data System (ADS)

    Werth, D.; Avissar, R.

    2006-12-01

    Replacement of tropical rainforests has been observed to have a strong drying effect in Amazon simulations, with effects reaching high into the atmospheric column and into the midlatitudes. The drying effects of deforestation, however, can be moderated by the effects of global warming, which should accelerate the hydrologic cycle of the Amazon. The effects of a prescribed, time-varying Amazon deforestation applied in conjunction with a steady, moderate increase in CO2 concentrations are determined using a climate model. The model agrees with previous studies when each forcing is applied individually - compared to a control run, Amazon deforestation decreases the local precipitation and global warming increases it. When both are applied, however, the precipitation and other hydrologic variables decrease, but to a lesser extent than when deforestation alone was applied. In effect, the two forcings act in opposition to one another and bring the simulated climate closer to that of the control.

  19. Amazon Molly, Poecilia formosa, as a model for studies of the effects of ionizing radiation. [Radiosensitivity of Poecilia formosa, Carassius auratus, Ictalurus punctatus, and Oncorhynchus tshawytscha in laboratory environments]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodhead, A.D.; Setlow, R.B.; Hart, R.W.

    1977-01-01

    It is suggested that the viviparous teleost Poecilia formosa (Amazon molly) may have wide potential use for aquatic radiation studies. The Amazon molly is a naturally occurring gynogenetic species, in which the eggs are activated after mating with the males of closely related species, without the subsequent genetic contribution from the male. The offspring of a single original female constitute a clone, having identical genotypes. Clones of the genetically homogeneous Amazon molly may prove to be equally as valuable to aquatic radiobiologists as the inbred rodent lines have been to mammalian studies. In many other respects the Amazon molly is a satisfactory laboratory animal. It is robust, easy to rear, and has large broods of young when fully grown. Maintenance costs are low. Details are given of the conditions under which colonies reproduce.

  20. In-plane time-harmonic elastic wave motion and resonance phenomena in a layered phononic crystal with periodic cracks.

    PubMed

    Golub, Mikhail V; Zhang, Chuanzeng

    2015-01-01

    This paper presents an elastodynamic analysis of two-dimensional time-harmonic elastic wave propagation in periodically multilayered elastic composites, which are also frequently referred to as one-dimensional phononic crystals, with a periodic array of strip-like interior or interface cracks. The transfer matrix method and the boundary integral equation method, in conjunction with the Bloch-Floquet theorem, are applied to compute the elastic wave fields in the layered periodic composites. The effects of the crack size, spacing, and location, as well as the incidence angle and the type of incident elastic waves, on the wave propagation characteristics in the composite structure are investigated in detail. In particular, the band gaps, the localization and the resonances of elastic waves are revealed by numerical examples. To better understand the wave propagation phenomena in layered phononic crystals with distributed cracks, the energy flow vector of Umov and the corresponding energy streamlines are visualized and analyzed. The numerical results demonstrate that large energy vortices obstruct elastic wave propagation in layered phononic crystals at resonance frequencies. They occur in front of the cracks, reflecting most of the energy carried by the incoming wave, and disappear when the problem parameters are shifted away from the resonant values.
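
    As a much-simplified companion to the transfer-matrix machinery described above, the sketch below computes the normal-incidence transmission spectrum of a finite one-dimensional bilayer stack for scalar acoustic waves, which is enough to exhibit Bragg band gaps. The material parameters are arbitrary placeholders, and the oblique-incidence, fully elastodynamic, cracked case treated in the paper is considerably richer.

      import numpy as np

      def layer_matrix(omega, rho, c, d):
          """2x2 transfer matrix relating (pressure, velocity) across one layer."""
          k = omega / c
          Z = rho * c
          return np.array([[np.cos(k * d), -1j * Z * np.sin(k * d)],
                           [-1j * np.sin(k * d) / Z, np.cos(k * d)]])

      def transmittance(omega, layers, Z_in, Z_out):
          """Energy transmission through a layer stack between two half-spaces."""
          M = np.eye(2, dtype=complex)
          for rho, c, d in layers:
              M = M @ layer_matrix(omega, rho, c, d)
          A, B, C, D = M[0, 0], M[0, 1], M[1, 0], M[1, 1]
          t = 2.0 / (A + B / Z_out + Z_in * C + Z_in * D / Z_out)
          return (Z_in / Z_out) * abs(t) ** 2

      # Hypothetical bilayer unit cell repeated 8 times between identical half-spaces.
      cell = [(1000.0, 1500.0, 0.01), (2700.0, 6300.0, 0.01)]  # (rho, c, thickness)
      stack = cell * 8
      Z_half = 1000.0 * 1500.0
      freqs = np.linspace(1e3, 4e5, 1000)
      T = [transmittance(2 * np.pi * f, stack, Z_half, Z_half) for f in freqs]
      # Dips of T toward zero over finite frequency intervals mark the band gaps.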

  1. Parallel Program Systems for the Analysis of Wave Processes in Elastic-Plastic, Granular, Porous and Multi-Blocky Media

    NASA Astrophysics Data System (ADS)

    Sadovskaya, Oxana; Sadovskii, Vladimir

    2017-04-01

    When modeling wave propagation processes in geomaterials (granular and porous media, soils and rocks), it is necessary to take into account the structural inhomogeneity of these materials. Parallel program systems have been developed for the numerical solution of 2D and 3D problems in the dynamics of deformable media with constitutive relationships of rather general form, on the basis of a universal mathematical model describing small strains of elastic, elastic-plastic, granular and porous materials. In the case of an elastic material, the model reduces to a system of equations, hyperbolic in the sense of Friedrichs, written in terms of velocities and stresses in a symmetric form. In the case of an elastic-plastic material, the model is a special formulation of the Prandtl-Reuss theory in the form of a variational inequality with one-sided constraints on the stress tensor. Generalization of the model to describe granularity and the collapse of pores is obtained by means of a rheological approach that takes into account the different resistance of materials to tension and compression. Rotational motion of particles in the material microstructure is considered within the framework of a mathematical model of the Cosserat continuum. The computational domain may have a blocky structure, composed of an arbitrary number of layers, strips in a layer, and blocks in a strip made of different materials with self-consistent curvilinear interfaces. At the external boundaries of the computational domain, the main types of dissipative boundary conditions can be imposed in terms of velocities, stresses, or mixed conditions. A shock-capturing algorithm is proposed for implementation of the model on supercomputers with cluster architecture. It is based on the two-cyclic splitting method with respect to the spatial variables and on special stress-correction procedures that account for plasticity, granularity or porosity of a material. An explicit monotone ENO scheme is applied for solving the one-dimensional systems of equations at the stages of the splitting method. Computations are parallelized using the MPI library and the SPMD approach. Data exchange between processors occurs at the predictor step of the finite-difference scheme. The program systems allow simulation of the propagation of waves produced by external mechanical loads in a medium assembled from an arbitrary number of heterogeneous blocks. Some computations of dynamic problems, with and without taking into account the moment properties of a material, were performed on clusters of ICM SB RAS (Krasnoyarsk) and JSCC RAS (Moscow). The parallel program systems 2Dyn_Granular, 3Dyn_Granular, 2Dyn_Cosserat, 3Dyn_Cosserat and 2Dyn_Blocks_MPI, for the numerical solution of 2D and 3D elastic-plastic problems of the dynamics of granular media and problems of the Cosserat elasticity theory, as well as for modeling dynamic processes in multi-blocky media with pliant viscoelastic, porous and fluid-saturated interlayers on cluster systems, have been registered with Rospatent.
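
    For readers unfamiliar with the SPMD/MPI pattern mentioned above, the following mpi4py sketch shows the kind of neighbour-to-neighbour ghost-cell exchange performed before a predictor step in a one-dimensional domain decomposition; it is a generic illustration with hypothetical array sizes, not the registered program systems themselves.

      from mpi4py import MPI
      import numpy as np

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      # Each process owns an interior slab plus one ghost cell on each side.
      n_local = 100
      u = np.zeros(n_local + 2)
      u[1:-1] = rank  # placeholder field values

      left = rank - 1 if rank > 0 else MPI.PROC_NULL
      right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

      # Exchange ghost cells with neighbours (send own edge, receive neighbour's edge).
      recv_left = np.empty(1)
      recv_right = np.empty(1)
      comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=recv_left, source=left)
      comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=recv_right, source=right)
      if left != MPI.PROC_NULL:
          u[0] = recv_left[0]
      if right != MPI.PROC_NULL:
          u[-1] = recv_right[0]
      # The predictor stage of the finite-difference update can now use u[0] and u[-1].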

  2. Introduction: Observations and modeling of the Green Ocean Amazon (GoAmazon2014/5)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, S. T.; Artaxo, P.; Machado, L. A. T.

    The Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) Experiment was carried out in the environs of Manaus, Brazil, in the central region of the Amazon basin for 2 years from 1 January 2014 through 31 December 2015. The experiment focused on the complex interactions among vegetation, atmospheric chemistry, and aerosol production on the one hand and their connections to aerosols, clouds, and precipitation on the other. The objective was to understand and quantify these linked processes, first under natural conditions to obtain a baseline and second when altered by the effects of human activities. To this end, the pollution plume from the Manaus metropolis, superimposed on the background conditions of the central Amazon basin, served as a natural laboratory. The present paper, as the introduction to the special issue of GoAmazon2014/5, presents the context and motivation of the GoAmazon2014/5 Experiment. The nine research sites, including the characteristics and instrumentation of each site, are presented. The sites range from time point zero (T0) upwind of the pollution, to T1 in the midst of the pollution, to T2 just downwind of the pollution, to T3 furthest downwind of the pollution (70 km). In addition to the ground sites, a low-altitude G-159 Gulfstream I (G-1) observed the atmospheric boundary layer and low clouds, and a high-altitude Gulfstream G550 (HALO) operated in the free troposphere. During the 2-year experiment, two Intensive Operating Periods (IOP1 and IOP2) also took place that included additional specialized research instrumentation at the ground sites as well as flights of the two aircraft. GoAmazon2014/5 IOP1 was carried out from 1 February to 31 March 2014 in the wet season. GoAmazon2014/5 IOP2 was conducted from 15 August to 15 October 2014 in the dry season. In addition, the G-1 aircraft flew during both IOP1 and IOP2, and the HALO aircraft flew during IOP2. In the context of the Amazon basin, the two IOPs also correspond to the clean and biomass burning seasons, respectively. The Manaus plume is present year-round, and it is transported by prevailing northeasterly and easterly winds in the wet and dry seasons, respectively. This introduction also organizes information relevant to many papers in the special issue. Information is provided on the vehicle fleet, power plants, and industrial activities of Manaus. The mesoscale and synoptic meteorologies relevant to the two IOPs are presented. Regional and long-range transport of emissions during the two IOPs is discussed based on satellite observations across South America and Africa. Fire locations throughout the airshed are detailed. In conjunction with the context and motivation of GoAmazon2014/5 as presented in this introduction, research articles including thematic overview articles are anticipated in this special issue to describe the detailed results and findings of the GoAmazon2014/5 Experiment.

  3. Introduction: Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5)

    NASA Astrophysics Data System (ADS)

    Martin, S. T.; Artaxo, P.; Machado, L. A. T.; Manzi, A. O.; Souza, R. A. F.; Schumacher, C.; Wang, J.; Andreae, M. O.; Barbosa, H. M. J.; Fan, J.; Fisch, G.; Goldstein, A. H.; Guenther, A.; Jimenez, J. L.; Pöschl, U.; Silva Dias, M. A.; Smith, J. N.; Wendisch, M.

    2015-11-01

    The Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) Experiment was carried out in the environs of Manaus, Brazil, in the central region of the Amazon basin during two years from 1 January 2014 through 31 December 2015. The experiment focused on the complex interactions among vegetation, atmospheric chemistry, and aerosol production on the one hand and their connections to aerosols, clouds, and precipitation on the other. The objective was to understand and quantify these linked processes, first under natural conditions to obtain a baseline and second when altered by the effects of human activities. To this end, the pollution plume from the Manaus metropolis, superimposed on the background conditions of the central Amazon basin, served as a natural laboratory. The present paper, as the Introduction to the GoAmazon2014/5 Special Issue, presents the context and motivation of the GoAmazon2014/5 Experiment. The nine research sites, including the characteristics and instrumentation of each site, are presented. The sites range from time point zero (T0) upwind of the pollution, to T1 in the midst of the pollution, to T2 just downwind of the pollution, to T3 furthest downwind of the pollution (70 km). In addition to the ground sites, a low-altitude G-159 Gulfstream I (G1) observed the atmospheric boundary layer and low clouds, and a high-altitude Gulfstream G550 (HALO) operated in the free troposphere. During the two-year experiment, two Intensive Operating Periods (IOP1 and IOP2) also took place that included additional specialized research instrumentation at the ground sites as well as flights of the two aircraft. GoAmazon2014/5 IOP1 was carried out from 1 February to 31 March 2014 in the wet season. GoAmazon2014/5 IOP2 was conducted from 15 August to 15 October 2014 in the dry season. The G1 aircraft flew during both IOP1 and IOP2, and the HALO aircraft flew during IOP2. In the context of the Amazon basin, the two IOPs also correspond to the clean and biomass burning seasons, respectively. The Manaus plume is present year round, and it is transported by prevailing northeasterly and easterly winds in the wet and dry seasons, respectively. This Introduction also organizes information relevant to many papers in the Special Issue. Information is provided on the vehicle fleet, power plants, and industrial activities of Manaus. The mesoscale and synoptic meteorologies relevant to the two IOPs are presented. Regional and long-range transport of emissions during the two IOPs is discussed based on satellite observations across South America and Africa. Fire locations throughout the airshed are detailed. In conjunction with the context and motivation of GoAmazon2014/5, as presented herein in this Introduction, research articles published in this Special Issue are anticipated in the near future to describe the detailed results and findings of the GoAmazon2014/5 Experiment.

  4. Introduction: Observations and modeling of the Green Ocean Amazon (GoAmazon2014/5)

    DOE PAGES

    Martin, S. T.; Artaxo, P.; Machado, L. A. T.; ...

    2016-04-19

    The Observations and Modeling of the Green Ocean Amazon (GoAmazon2014/5) Experiment was carried out in the environs of Manaus, Brazil, in the central region of the Amazon basin for 2 years from 1 January 2014 through 31 December 2015. The experiment focused on the complex interactions among vegetation, atmospheric chemistry, and aerosol production on the one hand and their connections to aerosols, clouds, and precipitation on the other. The objective was to understand and quantify these linked processes, first under natural conditions to obtain a baseline and second when altered by the effects of human activities. To this end, the pollution plume from the Manaus metropolis, superimposed on the background conditions of the central Amazon basin, served as a natural laboratory. The present paper, as the introduction to the special issue of GoAmazon2014/5, presents the context and motivation of the GoAmazon2014/5 Experiment. The nine research sites, including the characteristics and instrumentation of each site, are presented. The sites range from time point zero (T0) upwind of the pollution, to T1 in the midst of the pollution, to T2 just downwind of the pollution, to T3 furthest downwind of the pollution (70 km). In addition to the ground sites, a low-altitude G-159 Gulfstream I (G-1) observed the atmospheric boundary layer and low clouds, and a high-altitude Gulfstream G550 (HALO) operated in the free troposphere. During the 2-year experiment, two Intensive Operating Periods (IOP1 and IOP2) also took place that included additional specialized research instrumentation at the ground sites as well as flights of the two aircraft. GoAmazon2014/5 IOP1 was carried out from 1 February to 31 March 2014 in the wet season. GoAmazon2014/5 IOP2 was conducted from 15 August to 15 October 2014 in the dry season. In addition, the G-1 aircraft flew during both IOP1 and IOP2, and the HALO aircraft flew during IOP2. In the context of the Amazon basin, the two IOPs also correspond to the clean and biomass burning seasons, respectively. The Manaus plume is present year-round, and it is transported by prevailing northeasterly and easterly winds in the wet and dry seasons, respectively. This introduction also organizes information relevant to many papers in the special issue. Information is provided on the vehicle fleet, power plants, and industrial activities of Manaus. The mesoscale and synoptic meteorologies relevant to the two IOPs are presented. Regional and long-range transport of emissions during the two IOPs is discussed based on satellite observations across South America and Africa. Fire locations throughout the airshed are detailed. In conjunction with the context and motivation of GoAmazon2014/5 as presented in this introduction, research articles including thematic overview articles are anticipated in this special issue to describe the detailed results and findings of the GoAmazon2014/5 Experiment.

  5. Free Oscillations of a Fluid-filled Cavity in an Infinite Elastic Medium

    NASA Astrophysics Data System (ADS)

    Sakuraba, A.

    2016-12-01

    Volcanic low-frequency earthquakes and tremor have been widely recognized as good indicators of hidden volcanic activity. It is likely that the existence or movement of underground magma and geothermal fluids plays a crucial role in their generation mechanisms, but there are still many unknowns. This presentation aims to make a fundamental contribution to understanding and interpreting volcanic low-frequency seismic events. The problem we consider is to compute the eigenmodes of free oscillations of a fluid-filled cavity surrounded by an infinite, linearly elastic medium. A standard boundary element method is used to solve the fluid and elastic motion around a cavity of arbitrary shape. The nonlinear advection term is neglected, but viscosity is generally considered in the fluid medium. Of great importance is finding not only the characteristic frequencies but also the attenuation properties of the oscillations, the latter being determined by both viscous dissipation in the fluid cavity and elastic wave radiation to infinity. One of the simplest cases is the resonance of a fluid-filled crack, which has been studied numerically (Chouet, JGR 1986; Yamamoto and Kawakatsu, GJI 2008) and analytically (Maeda and Kumagai, GRL 2013). In the present study, we consider a general three-dimensional cavity, with emphasis on treating the crack model and other simple models, such as spherical and cylindrical resonators, as extreme cases. To reduce computational costs, we assume symmetries about three orthogonal planes and calculate the eigenmodes separately for each symmetry. At present, the computational code has been checked through comparison with the eigenmodes of a spherical inviscid cavity (Sakuraba et al., EPS 2002), and a further comparison with the resonance of a fluid-filled crack is under way.

  6. Mechanical properties of additively manufactured octagonal honeycombs.

    PubMed

    Hedayati, R; Sadighi, M; Mohammadi-Aghdam, M; Zadpoor, A A

    2016-12-01

    Honeycomb structures have found numerous applications as structural and biomedical materials due to their favourable properties such as low weight, high stiffness, and porosity. Application of additive manufacturing and 3D printing techniques allows for manufacturing of honeycombs with arbitrary shape and wall thickness, opening the way for optimizing the mechanical and physical properties for specific applications. In this study, the mechanical properties of honeycomb structures with a new geometry, called octagonal honeycomb, were investigated using analytical, numerical, and experimental approaches. An additive manufacturing technique, namely fused deposition modelling, was used to fabricate the honeycomb from polylactic acid (PLA). The honeycomb structures were then mechanically tested under compression and the mechanical properties of the structures were determined. In addition, the Euler-Bernoulli and Timoshenko beam theories were used for deriving analytical relationships for elastic modulus, yield stress, Poisson's ratio, and buckling stress of this new design of honeycomb structures. Finite element models were also created to analyse the mechanical behaviour of the honeycombs computationally. The analytical solutions obtained using Timoshenko beam theory were close to computational results in terms of elastic modulus, Poisson's ratio and yield stress, especially for relative densities smaller than 25%. The analytical solutions based on Timoshenko beam theory and the computational results were in good agreement with experimental observations. Finally, the elastic properties of the proposed honeycomb structure were compared to those of other honeycomb structures such as square, triangular, hexagonal, mixed, diamond, and Kagome. The octagonal honeycomb showed yield stress and elastic modulus values very close to those of regular hexagonal honeycombs and lower than the other considered honeycombs. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. An accurate boundary element method for the exterior elastic scattering problem in two dimensions

    NASA Astrophysics Data System (ADS)

    Bao, Gang; Xu, Liwei; Yin, Tao

    2017-11-01

    This paper is concerned with a Galerkin boundary element method for solving the two-dimensional exterior elastic wave scattering problem. The original problem is first reduced to the so-called Burton-Miller [1] boundary integral formulation, and essential mathematical features of its variational form are discussed. In numerical implementations, a newly derived and analytically accurate regularization formula [2] is employed for the numerical evaluation of the hyper-singular boundary integral operator. A new computational approach based on series expansions of Hankel functions is employed for the computation of weakly singular boundary integral operators during the reduction of the corresponding Galerkin equations into a discrete linear system. The effectiveness of the proposed numerical methods is demonstrated using several numerical examples.

  8. Computer modeling of the mechanical behavior of composites -- Interfacial cracks in fiber-reinforced materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmauder, S.; Haake, S.; Mueller, W.H.

    Computer modeling of materials, and especially modeling the mechanical behavior of composites, has become increasingly popular in the past few years. Among the examples are micromechanical modeling of real structures as well as idealized model structures of linear elastic and elasto-plastic material response. In this paper, Erdogan's Integral Equation Method (IEM) is chosen as an example of a powerful method providing principal insight into elastic fracture mechanical situations. IEM or, alternatively, complex function techniques sometimes even allow for deriving analytical solutions, such as in the case of a circumferential crack along a fiber/matrix interface. The analytical formulae of this interface crack will be analyzed numerically and typical results will be presented graphically.

  9. A Fourier-based total-field/scattered-field technique for three-dimensional broadband simulations of elastic targets near a water-sand interface.

    PubMed

    Shao, Yu; Wang, Shumin

    2016-12-01

    The numerical simulation of acoustic scattering from elastic objects near a water-sand interface is critical to underwater target identification. Frequency-domain methods are computationally expensive, especially for large-scale broadband problems. A numerical technique is proposed to enable the efficient use of finite-difference time-domain method for broadband simulations. By incorporating a total-field/scattered-field boundary, the simulation domain is restricted inside a tightly bounded region. The incident field is further synthesized by the Fourier transform for both subcritical and supercritical incidences. Finally, the scattered far field is computed using a half-space Green's function. Numerical examples are further provided to demonstrate the accuracy and efficiency of the proposed technique.

  10. Computer-assisted assessment of ultrasound real-time elastography: initial experience in 145 breast lesions.

    PubMed

    Zhang, Xue; Xiao, Yang; Zeng, Jie; Qiu, Weibao; Qian, Ming; Wang, Congzhi; Zheng, Rongqin; Zheng, Hairong

    2014-01-01

    The aim was to develop and evaluate a computer-assisted method of quantifying the five-point elasticity scoring system based on ultrasound real-time elastography (RTE) for classifying benign and malignant breast lesions, with pathologic results as the reference standard. Conventional ultrasonography (US) and RTE were performed on 145 breast lesions (67 malignant, 78 benign) in this study. Each lesion was automatically contoured on the B-mode image by the level set method and mapped onto the RTE image. The relative elasticity value of each pixel was reconstructed and classified as hard or soft by the fuzzy c-means clustering method. According to the degree of hardness inside the lesion and its surrounding tissue, the elasticity score of the RTE image was computed automatically. Visual assessments by the radiologists were used to compare the diagnostic performance. Histopathologic examination was used as the reference standard. Student's t test and receiver operating characteristic (ROC) curve analysis were performed for statistical analysis. Considering a score of 4 or higher as test positive for malignancy, the diagnostic accuracy, sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were 93.8% (136/145), 92.5% (62/67), 94.9% (74/78), 93.9% (62/66), and 93.7% (74/79) for the computer-assisted scheme, and 89.7% (130/145), 85.1% (57/67), 93.6% (73/78), 92.0% (57/62), and 88.0% (73/83) for manual assessment. The area under the ROC curve (Az value) for the proposed method was higher than that for visual assessment (0.96 vs. 0.93). Computer-assisted quantification of the classical five-point scoring system can substantially reduce interobserver variability and thereby improve diagnostic confidence in classifying breast lesions to avoid unnecessary biopsy. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
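
    The clustering step described above, partitioning reconstructed elasticity values into "hard" and "soft", can be illustrated with a bare-bones two-cluster fuzzy c-means in NumPy; this is a generic sketch of the standard algorithm on synthetic data, not the authors' implementation.

      import numpy as np

      def fuzzy_cmeans_1d(x, n_clusters=2, m=2.0, n_iter=100, seed=0):
          """Bare-bones fuzzy c-means on 1D data; returns cluster centres and memberships."""
          rng = np.random.default_rng(seed)
          u = rng.random((n_clusters, x.size))
          u /= u.sum(axis=0)                      # memberships sum to 1 per sample
          for _ in range(n_iter):
              w = u ** m
              centres = (w @ x) / w.sum(axis=1)   # membership-weighted means
              d = np.abs(x[None, :] - centres[:, None]) + 1e-12
              inv = d ** (-2.0 / (m - 1.0))
              u = inv / inv.sum(axis=0)           # standard membership update
          return centres, u

      # Synthetic relative-elasticity values: a softer and a stiffer population.
      values = np.concatenate([np.random.default_rng(1).normal(0.3, 0.05, 200),
                               np.random.default_rng(2).normal(0.8, 0.05, 100)])
      centres, u = fuzzy_cmeans_1d(values)
      hard = int(np.argmax(centres))              # index of the "hard" cluster
      labels = (np.argmax(u, axis=0) == hard)     # True where a pixel is classified as hard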

  11. A Slippery Slope: Children's Perceptions of Their Role in Environmental Preservation in the Peruvian Amazon

    ERIC Educational Resources Information Center

    Galeano, Rebecca

    2013-01-01

    Despite international attention and attempts to preserve the environmental diversity of the Amazon, it is an accepted fact that those who inhabit the forest must be the ones who preserve it. This article presents an analysis of how children in small rural riverine communities along the Amazon understand the importance of environmental preservation…

  12. Comparative Phylogeography of Neotropical Birds

    DTIC Science & Technology

    2009-05-01

    of lowland Neotropical rainforest birds that have populations isolated on either side of the Andes, Amazon River, and Madeira River. I found widely...Unlike canopy species, understory birds were structured at smaller spatial scales, particularly across riverine barriers of the Amazon basin...expansive complementary forest of the Amazon Basin. This divide is relatively young as the northern Andes were only half their present elevation

  13. Surveying the area of deforestation of the Amazon by LANDSAT satellite imagery. [Mato Grosso, Goias and Para, Brazil]

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Tardin, A. T.; Dossantos, A. P.; Lee, D. C. L.; Soaresmaia, F. C.; Mendonca, F. J.; Assuncao, G. V.; Rodrigues, J. E.; Demouraabdon, M.; Novaes, R. A.

    1979-01-01

    LANDSAT imagery was used to determine the amount of deforestation in a study area comprising 55 million hectares of the Amazon region. Results show that more than 4 million hectares were deforested. Maps and pictures of the deforested area in relation to the total area of the Amazon are included.

  14. Sources, Properties, Aging, and Anthropogenic Influences on OA and SOA over the Southeast US and the Amazon during SOAS, DC3, SEAC4RS, and GoAmazon

    EPA Science Inventory

    The SE US and the Amazon have large sources of biogenic VOCs, varying anthropogenic pollution impacts, and often poor organic aerosol (OA) model performance. Recent results on the sources, properties, aging, and impact of anthropogenic pollution on OA and secondary OA (SOA) over ...

  15. Damming the rivers of the Amazon basin

    NASA Astrophysics Data System (ADS)

    Latrubesse, Edgardo M.; Arima, Eugenio Y.; Dunne, Thomas; Park, Edward; Baker, Victor R.; D'Horta, Fernando M.; Wight, Charles; Wittmann, Florian; Zuanon, Jansen; Baker, Paul A.; Ribas, Camila C.; Norgaard, Richard B.; Filizola, Naziano; Ansar, Atif; Flyvbjerg, Bent; Stevaux, Jose C.

    2017-06-01

    More than a hundred hydropower dams have already been built in the Amazon basin and numerous proposals for further dam constructions are under consideration. The accumulated negative environmental effects of existing dams and proposed dams, if constructed, will trigger massive hydrophysical and biotic disturbances that will affect the Amazon basin’s floodplains, estuary and sediment plume. We introduce a Dam Environmental Vulnerability Index to quantify the current and potential impacts of dams in the basin. The scale of foreseeable environmental degradation indicates the need for collective action among nations and states to avoid cumulative, far-reaching impacts. We suggest institutional innovations to assess and avoid the likely impoverishment of Amazon rivers.

  16. Damming the rivers of the Amazon basin.

    PubMed

    Latrubesse, Edgardo M; Arima, Eugenio Y; Dunne, Thomas; Park, Edward; Baker, Victor R; d'Horta, Fernando M; Wight, Charles; Wittmann, Florian; Zuanon, Jansen; Baker, Paul A; Ribas, Camila C; Norgaard, Richard B; Filizola, Naziano; Ansar, Atif; Flyvbjerg, Bent; Stevaux, Jose C

    2017-06-14

    More than a hundred hydropower dams have already been built in the Amazon basin and numerous proposals for further dam constructions are under consideration. The accumulated negative environmental effects of existing dams and proposed dams, if constructed, will trigger massive hydrophysical and biotic disturbances that will affect the Amazon basin's floodplains, estuary and sediment plume. We introduce a Dam Environmental Vulnerability Index to quantify the current and potential impacts of dams in the basin. The scale of foreseeable environmental degradation indicates the need for collective action among nations and states to avoid cumulative, far-reaching impacts. We suggest institutional innovations to assess and avoid the likely impoverishment of Amazon rivers.

  17. Modelling conservation in the Amazon basin.

    PubMed

    Soares-Filho, Britaldo Silveira; Nepstad, Daniel Curtis; Curran, Lisa M; Cerqueira, Gustavo Coutinho; Garcia, Ricardo Alexandrino; Ramos, Claudia Azevedo; Voll, Eliane; McDonald, Alice; Lefebvre, Paul; Schlesinger, Peter

    2006-03-23

    Expansion of the cattle and soy industries in the Amazon basin has increased deforestation rates and will soon push all-weather highways into the region's core. In the face of this growing pressure, a comprehensive conservation strategy for the Amazon basin should protect its watersheds, the full range of species and ecosystem diversity, and the stability of regional climates. Here we report that protected areas in the Amazon basin--the central feature of prevailing conservation approaches--are an important but insufficient component of this strategy, based on policy-sensitive simulations of future deforestation. By 2050, current trends in agricultural expansion will eliminate a total of 40% of Amazon forests, including at least two-thirds of the forest cover of six major watersheds and 12 ecoregions, releasing 32 +/- 8 Pg of carbon to the atmosphere. One-quarter of the 382 mammalian species examined will lose more than 40% of the forest within their Amazon ranges. Although an expanded and enforced network of protected areas could avoid as much as one-third of this projected forest loss, conservation on private lands is also essential. Expanding market pressures for sound land management and prevention of forest clearing on lands unsuitable for agriculture are critical ingredients of a strategy for comprehensive conservation.

  18. Hydrologic resilience and Amazon productivity.

    PubMed

    Ahlström, Anders; Canadell, Josep G; Schurgers, Guy; Wu, Minchao; Berry, Joseph A; Guan, Kaiyu; Jackson, Robert B

    2017-08-30

    The Amazon rainforest is disproportionately important for global carbon storage and biodiversity. The system couples the atmosphere and land, with moist forest that depends on convection to sustain gross primary productivity and growth. Earth system models that estimate future climate and vegetation show little agreement in Amazon simulations. Here we show that biases in internally generated climate, primarily precipitation, explain most of the uncertainty in Earth system model results; models, empirical data and theory converge when precipitation biases are accounted for. Gross primary productivity, above-ground biomass and tree cover align on a hydrological relationship with a breakpoint at ~2000 mm annual precipitation, where the system transitions between water and radiation limitation of evapotranspiration. The breakpoint appears to be fairly stable in the future, suggesting resilience of the Amazon to climate change. Changes in precipitation and land use are therefore more likely to govern biomass and vegetation structure in Amazonia. Earth system model simulations of future climate in the Amazon show little agreement. Here, the authors show that biases in internally generated climate explain most of this uncertainty and that the balance between water-saturated and water-limited evapotranspiration controls the Amazon resilience to climate change.

  19. Carbon Emissions from Deforestation in the Brazilian Amazon Region

    NASA Technical Reports Server (NTRS)

    Potter, C.; Klooster, S.; Genovese, V.

    2009-01-01

    A simulation model based on satellite observations of monthly vegetation greenness from the Moderate Resolution Imaging Spectroradiometer (MODIS) was used to estimate monthly carbon fluxes in terrestrial ecosystems of the Brazilian Amazon and Cerrado regions over the period 2000-2002. The NASA-CASA (Carnegie Ames Stanford Approach) model estimates of annual forest production were used for the first time as the basis to generate a prediction for the standing pool of carbon in above-ground biomass (AGB; g C/m²) for forested areas of the Brazilian Amazon region. Plot-level measurements of the residence time of carbon in wood in Amazon forest from Malhi et al. (2006) were interpolated by inverse distance weighting algorithms and used with CASA to generate a new regional map of AGB. Data from the Brazilian PRODES (Estimativa do Desflorestamento da Amazonia) project were used to map deforested areas. Results show that net primary production (NPP) sinks for carbon varied between 4.25 and 4.34 Pg C/yr (1 Pg = 10¹⁵ g) for the region and were highest across the eastern and northern Amazon areas, whereas deforestation sources of CO2 flux from decomposition of residual woody debris were higher and less seasonal in the central Amazon than in the eastern and southern areas. Increased woody debris from past deforestation events was predicted to alter the net ecosystem carbon balance of the Amazon region to generate annual CO2 source fluxes at least two times higher than previously predicted by CASA modeling studies. Variations in climate, land cover, and forest burning were predicted to release carbon at rates of 0.5 to 1 Pg C/yr from the Brazilian Amazon. When direct deforestation emissions of CO2 from forest burning of between 0.2 and 0.6 Pg C/yr in the Legal Amazon are overlooked in regional budgets, the year-to-year variations in this net biome flux may appear to be large, whereas our model results imply that net biome fluxes were actually relatively consistent from year to year during the period 2000-2002. This is the first study to use MODIS data to model all carbon pools (wood, leaf, root) dynamically in simulations of Amazon forest deforestation from clearing and burning of all kinds.
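
    The inverse distance weighting interpolation mentioned above is a simple spatial estimator; the minimal sketch below, with made-up plot coordinates, residence-time values, and a standard power-2 weighting, shows the idea without implying it matches the gridding choices of the study.

      import numpy as np

      def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
          """Inverse distance weighting: estimate values at query points from scattered data."""
          d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
          w = 1.0 / (d ** power + eps)          # closer points get larger weights
          return (w * values[None, :]).sum(axis=1) / w.sum(axis=1)

      # Hypothetical plot locations (lon, lat) and wood-carbon residence times (years).
      plots = np.array([[-60.0, -3.0], [-55.0, -5.0], [-67.0, -10.0], [-52.0, -1.5]])
      tau = np.array([55.0, 48.0, 70.0, 40.0])
      grid = np.array([[-58.0, -4.0], [-63.0, -7.0]])   # two example grid cells
      print(idw(plots, tau, grid))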

  20. Estimation of the fractional coverage of rainfall in climate models

    NASA Technical Reports Server (NTRS)

    Eltahir, E. A. B.; Bras, R. L.

    1993-01-01

    The fraction of the grid cell area covered by rainfall, mu, is an essential parameter in descriptions of land surface hydrology in climate models. A simple procedure is presented for estimating this fraction, based on extensive observations of storm areas and rainfall volumes. Storm area and rainfall volume are often linearly related; this relation can be used to compute the storm area from the volume of rainfall simulated by a climate model. A formula is developed for computing mu, which describes the dependence of the fractional coverage of rainfall on the season of the year, the geographical region, rainfall volume, and the spatial and temporal resolution of the model. The new formula is applied in computing mu over the Amazon region. Significant temporal variability in the fractional coverage of rainfall is demonstrated. The implications of this variability for the modeling of land surface hydrology in climate models are discussed.
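
    A toy version of the mu calculation outlined above, assuming a linear storm-area versus rain-volume relation with placeholder coefficients, might look like the following; the actual regression constants and their dependence on season, region, and model resolution come from the observational analysis described in the abstract.

      def fractional_rain_coverage(rain_volume_km3, grid_cell_area_km2,
                                   slope_km2_per_km3=2.0e4, intercept_km2=0.0):
          """Estimate mu, the fraction of a grid cell covered by rain, from the
          simulated rainfall volume, assuming storm area grows linearly with volume.
          The slope and intercept here are placeholder values, not fitted constants."""
          storm_area_km2 = slope_km2_per_km3 * rain_volume_km3 + intercept_km2
          return min(1.0, max(0.0, storm_area_km2 / grid_cell_area_km2))

      # Example: 0.5 km^3 of rain in a 2.5 deg x 2.5 deg cell near the equator (~77,000 km^2).
      print(fractional_rain_coverage(0.5, 77_000.0))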
