Sample records for models additional open

  1. OpenIPSL: Open-Instance Power System Library - Update 1.5 to "iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations"

    NASA Astrophysics Data System (ADS)

    Baudette, Maxime; Castro, Marcelo; Rabuzin, Tin; Lavenius, Jan; Bogodorova, Tetiana; Vanfretti, Luigi

    2018-01-01

    This paper presents the latest improvements implemented in the Open-Instance Power System Library (OpenIPSL). The OpenIPSL is a fork of the original iTesla Power Systems Library (iPSL) by some of the original developers of the iPSL. The fork is motivated by the authors' aim to further develop the library with additional features tailored to research and teaching purposes. The enhancements include improvements to existing models, the addition of a new package of three-phase models, and the implementation of automated tests through continuous integration.
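
    A minimal sketch of the kind of regression check that continuous-integration testing of a simulation library typically automates: re-run a reference case and compare the new trace against a stored baseline. The file names and tolerance below are hypothetical, and this is not OpenIPSL's actual test harness.

    ```python
    # Hedged sketch of a CI-style regression test: compare a freshly simulated
    # trajectory against a stored reference trace. File names and the tolerance
    # are hypothetical placeholders.
    import csv

    def load_trace(path):
        """Read a two-column (time, value) CSV and return the values as floats."""
        with open(path, newline="") as f:
            return [float(row[1]) for row in csv.reader(f)]

    def max_abs_error(reference, simulated):
        """Largest pointwise deviation between two equally sampled traces."""
        return max(abs(r - s) for r, s in zip(reference, simulated))

    if __name__ == "__main__":
        ref = load_trace("reference_bus_voltage.csv")   # hypothetical baseline
        sim = load_trace("simulated_bus_voltage.csv")   # hypothetical new run
        assert max_abs_error(ref, sim) < 1e-3, "simulation drifted from reference"
    ```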

  2. Simplified cost models for prefeasibility mineral evaluations

    USGS Publications Warehouse

    Camm, Thomas W.

    1991-01-01

    This report contains 2 open pit models, 6 underground mine models, 11 mill models, and cost equations for access roads, power lines, and tailings ponds. In addition, adjustment factors for variation in haulage distances are provided for open pit models and variation in mining depths for underground models.
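
    For context, cost equations in prefeasibility models of this kind are commonly fitted as capacity-scaled power laws; a generic illustrative form (not the report's specific equations or coefficients) is

    $$ C = A\,X^{B}, $$

    where C is the estimated capital or daily operating cost, X is the capacity (e.g., tonnes of ore or mill feed per day), and A and B are regression coefficients fitted separately for each mine or mill type.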

  3. Behavioral and locomotor measurements using an open field activity monitoring system for skeletal muscle diseases.

    PubMed

    Tatem, Kathleen S; Quinn, James L; Phadke, Aditi; Yu, Qing; Gordish-Dressman, Heather; Nagaraju, Kanneboyina

    2014-09-29

    The open field activity monitoring system comprehensively assesses locomotor and behavioral activity levels of mice. It is a useful tool for assessing locomotive impairment in animal models of neuromuscular disease and efficacy of therapeutic drugs that may improve locomotion and/or muscle function. The open field activity measurement provides a different measure than muscle strength, which is commonly assessed by grip strength measurements. It can also show how drugs may affect other body systems as well when used with additional outcome measures. In addition, measures such as total distance traveled mirror the 6 min walk test, a clinical trial outcome measure. However, open field activity monitoring is also associated with significant challenges: Open field activity measurements vary according to animal strain, age, sex, and circadian rhythm. In addition, room temperature, humidity, lighting, noise, and even odor can affect assessment outcomes. Overall, this manuscript provides a well-tested and standardized open field activity SOP for preclinical trials in animal models of neuromuscular diseases. We provide a discussion of important considerations, typical results, data analysis, and detail the strengths and weaknesses of open field testing. In addition, we provide recommendations for optimal study design when using open field activity in a preclinical trial.
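
    As a purely illustrative aside, the "total distance traveled" measure reported by open field systems is essentially the summed displacement between consecutive tracked positions; the coordinates below are fabricated sample data, not the output of any particular monitoring system.

    ```python
    # Hedged sketch: total distance traveled from tracked (x, y) positions.
    # The coordinates are fabricated sample data.
    import numpy as np

    positions_cm = np.array([[0.0, 0.0], [1.2, 0.5], [2.0, 1.9], [2.1, 3.0]])  # one row per frame
    steps = np.diff(positions_cm, axis=0)                     # frame-to-frame displacements
    total_distance_cm = np.sum(np.linalg.norm(steps, axis=1))
    print(f"total distance traveled: {total_distance_cm:.2f} cm")
    ```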

  4. High-Fidelity Thermal Radiation Models and Measurements for High-Pressure Reacting Laminar and Turbulent Flows

    DTIC Science & Technology

    2013-06-26

    flow code used (OpenFOAM) to include differential diffusion and cell-based stochastic RTE solvers. The models were validated by simulation of laminar...wavenumber selection is improved by about a factor of 10. (5) OpenFOAM Improvements for Laminar Flames A laminar-diffusion combustion solver, taking into...account the effects of differential diffusion, was developed within the open source CFD package OpenFOAM [18]. In addition, OpenFOAM was augmented to take

  5. Global review of open access risk assessment software packages valid for global or continental scale analysis

    NASA Astrophysics Data System (ADS)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, bringing the compendium of risk software tools to more than 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability. Without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or to use them as checks on the sensitivities in the analysis. There is a potential for valuable synergy between existing software. A number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has simply attempted to provide a platform for dialogue between all open source and open access software packages and to hopefully inspire collaboration between developers, given the great work done by all open access and open source developers.
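
    A toy weighted-sum scoring makes the multi-criteria comparison concrete; the package names, criteria, weights, and scores are invented for illustration, and the actual review used over 100 criteria per package.

    ```python
    # Hedged sketch of a weighted-sum multi-criteria comparison of software
    # packages. All names, weights, and scores are invented.
    weights = {"open_access": 0.40, "global_coverage": 0.35, "documentation": 0.25}
    scores = {  # criterion scores on a 0-1 scale (hypothetical)
        "PackageA": {"open_access": 1.0, "global_coverage": 0.6, "documentation": 0.8},
        "PackageB": {"open_access": 0.5, "global_coverage": 0.9, "documentation": 0.4},
    }
    ranking = sorted(
        ((sum(weights[c] * s[c] for c in weights), name) for name, s in scores.items()),
        reverse=True,
    )
    for total, name in ranking:
        print(f"{name}: weighted score {total:.2f}")
    ```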

  6. Implementation of the Realized Genomic Relationship Matrix to Open-Pollinated White Spruce Family Testing for Disentangling Additive from Nonadditive Genetic Effects

    PubMed Central

    Gamal El-Dien, Omnia; Ratcliffe, Blaise; Klápště, Jaroslav; Porth, Ilga; Chen, Charles; El-Kassaby, Yousry A.

    2016-01-01

    The open-pollinated (OP) family testing combines the simplest known progeny evaluation and quantitative genetics analyses as candidates’ offspring are assumed to represent independent half-sib families. The accuracy of genetic parameter estimates is often questioned as the assumption of “half-sibling” in OP families may often be violated. We compared the pedigree- vs. marker-based genetic models by analysing 22-yr height and 30-yr wood density for 214 white spruce [Picea glauca (Moench) Voss] OP families represented by 1694 individuals growing on one site in Quebec, Canada. Assuming half-sibling, the pedigree-based model was limited to estimating the additive genetic variances which, in turn, were grossly overestimated as they were confounded by very minor dominance and major additive-by-additive epistatic genetic variances. In contrast, the implemented genomic pairwise realized relationship models allowed the disentanglement of additive from all nonadditive factors through genetic variance decomposition. The marker-based models produced more realistic narrow-sense heritability estimates and, for the first time, allowed estimating the dominance and epistatic genetic variances from OP testing. In addition, the genomic models showed better prediction accuracies compared to pedigree models and were able to predict individual breeding values for new individuals from untested families, which was not possible using the pedigree-based model. Clearly, the use of marker-based relationship approach is effective in estimating the quantitative genetic parameters of complex traits even under simple and shallow pedigree structure. PMID:26801647
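
    The realized genomic relationships referred to here are estimated from SNP markers; a widely used estimator (VanRaden's first method, shown for orientation rather than as the paper's exact choice) is

    $$ \mathbf{G} = \frac{\mathbf{Z}\mathbf{Z}^{\top}}{2\sum_{i} p_i\,(1 - p_i)}, $$

    where Z is the centered genotype matrix and p_i is the allele frequency at marker i. Replacing the pedigree-based relationship matrix with G (together with analogous marker-based dominance and epistatic matrices) underlies the variance decomposition described above.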

  7. Cardiac sodium channel Markov model with temperature dependence and recovery from inactivation.

    PubMed Central

    Irvine, L A; Jafri, M S; Winslow, R L

    1999-01-01

    A Markov model of the cardiac sodium channel is presented. The model is similar to the CA1 hippocampal neuron sodium channel model developed by Kuo and Bean (1994. Neuron. 12:819-829) with the following modifications: 1) an additional open state is added; 2) open-inactivated transitions are made voltage-dependent; and 3) channel rate constants are exponential functions of enthalpy, entropy, and voltage and have explicit temperature dependence. Model parameters are determined using a simulated annealing algorithm to minimize the error between model responses and various experimental data sets. The model reproduces a wide range of experimental data including ionic currents, gating currents, tail currents, steady-state inactivation, recovery from inactivation, and open time distributions over a temperature range of 10 degrees C to 25 degrees C. The model also predicts measures of single channel activity such as first latency, probability of a null sweep, and probability of reopening. PMID:10096885
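
    The temperature-dependent rate constants described are typically written in an Eyring-type form; a generic version (for orientation, not the paper's fitted parameterization) is

    $$ k(V,T) = \frac{k_B T}{h}\,\exp\!\left(\frac{\Delta S}{R}\right)\exp\!\left(-\frac{\Delta H}{R T}\right)\exp\!\left(\frac{z F V}{R T}\right), $$

    where \Delta H and \Delta S are the transition enthalpy and entropy, z is an effective charge valence governing the voltage dependence, and k_B, h, R, and F have their usual meanings.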

  8. Leveraging the Value of Human Relationships to Improve Health Outcomes. Lessons learned from the OpenMRS Electronic Health Record System.

    PubMed

    Kasthurirathne, Suranga N; Mamlin, Burke W; Cullen, Theresa

    2017-02-01

    Despite significant awareness on the value of leveraging patient relationships across the healthcare continuum, there is no research on the potential of using Electronic Health Record (EHR) systems to store structured patient relationship data, or its impact on enabling better healthcare. We sought to identify which EHR systems supported effective patient relationship data collection, and for systems that do, what types of relationship data is collected, how this data is used, and the perceived value of doing so. We performed a literature search to identify EHR systems that supported patient relationship data collection. Based on our results, we defined attributes of an effective patient relationship model. The Open Medical Record System (OpenMRS), an open source medical record platform for underserved settings met our eligibility criteria for effective patient relationship collection. We performed a survey to understand how the OpenMRS patient relationship model was used, and how it brought value to implementers. The OpenMRS patient relationship model has won widespread adoption across many implementations and is perceived to be valuable in enabling better health care delivery. Patient relationship information is widely used for community health programs and enabling chronic care. Additionally, many OpenMRS implementers were using this feature to collect custom relationship types for implementation specific needs. We believe that flexible patient relationship data collection is critical for better healthcare, and can inform community care and chronic care initiatives across the world. Additionally, patient relationship data could also be leveraged for many other initiatives such as patient centric care and in the field of precision medicine.

  9. Open-Source Integrated Design-Analysis Environment For Nuclear Energy Advanced Modeling & Simulation Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick

    The framework created through the Open-Source Integrated Design-Analysis Environment (IDAE) for Nuclear Energy Advanced Modeling & Simulation grant has simplified and democratized advanced modeling and simulation in the nuclear energy industry across a range of nuclear engineering applications. It leverages millions of investment dollars from the Department of Energy's Office of Nuclear Energy for modeling and simulation of light water reactors and the Office of Nuclear Energy's research and development. The IDAE framework enhanced Kitware’s Computational Model Builder (CMB) while leveraging existing open-source toolkits and creating a graphical end-to-end umbrella guiding end-users and developers through the nuclear energy advanced modeling and simulation lifecycle. In addition, the work delivered strategic advancements in meshing and visualization for ensembles.

  10. Methodology to evaluate the performance of simulation models for alternative compiler and operating system configurations

    USDA-ARS?s Scientific Manuscript database

    Simulation modelers increasingly require greater flexibility for model implementation on diverse operating systems, and they demand high computational speed for efficient iterative simulations. Additionally, model users may differ in preference for proprietary versus open-source software environment...

  11. The importance of surface recombination and energy-bandgap narrowing in p-n-junction silicon solar cells

    NASA Technical Reports Server (NTRS)

    Fossum, J. G.; Lindholm, F. A.; Shibib, M. A.

    1979-01-01

    Experimental data demonstrating the sensitivity of open-circuit voltage to front-surface conditions are presented for a variety of p-n-junction silicon solar cells. Analytical models accounting for the data are defined and supported by additional experiments. The models and the data imply that a) surface recombination significantly limits the open-circuit voltage (and the short-circuit current) of typical silicon cells, and b) energy-bandgap narrowing is important in the manifestation of these limitations. The models suggest modifications in both the structural design and the fabrication processing of the cells that would result in substantial improvements in cell performance. The benefits of one such modification - the addition of a thin thermal silicon-dioxide layer on the front surface - are indicated experimentally.
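
    For context, the sensitivity of open-circuit voltage to recombination follows from the ideal-diode relation (a textbook approximation, not the authors' detailed model):

    $$ V_{oc} \approx \frac{k T}{q}\,\ln\!\left(\frac{J_{sc}}{J_0} + 1\right), $$

    so any mechanism that raises the saturation current density J_0, such as front-surface recombination or bandgap narrowing in heavily doped regions, directly lowers V_oc; this is consistent with the benefit of the thin thermal oxide noted above, which reduces the front-surface recombination contribution to J_0.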

  12. Incubator weaning in preterm infants and associated practice variation.

    PubMed

    Schneiderman, R; Kirkby, S; Turenne, W; Greenspan, J

    2009-08-01

    To evaluate the relationship of weight of preterm infants when first placed into an open crib with days to full oral feedings, growth velocity and length of stay (LOS), and to identify unwarranted variation in incubator weaning after adjusting for severity indices. A retrospective study using the ParadigmHealth neonatal database from 2003 to 2006 reviewed incubator weaning to an open crib in appropriate-for-gestational-age (AGA) infants from 22 to weeks gestation. Primary outcome measurements included days to full oral (PO) feeding, weight gain from open crib to discharge and length of stay. Models were severity adjusted. To understand hospital practice variation, we also used a regression model to estimate the weight at open crib for the top 10 volume hospitals. In all 2908 infants met the inclusion criteria for the study. Their mean weight at open crib was 1850 g. On average every additional 100 g an infant weighed at the open crib was associated with increased time to full PO feeding by 0.8 days, decreased weight gained per day by 1 gram and increased LOS by 0.9 days. For the top 10 volume hospitals, severity variables alone accounted for 9% of the variation in weight at open crib, whereas the hospital in which the baby was treated accounted for an additional 19% of the variation. Even after controlling for severity, significant practice variation exists in weaning to an open crib, leading to potential delays in achieving full-volume oral feeds, decreased growth velocity and prolonged LOS.

  13. Interpreting Musculoskeletal Models and Dynamic Simulations: Causes and Effects of Differences Between Models.

    PubMed

    Roelker, Sarah A; Caruthers, Elena J; Baker, Rachel K; Pelz, Nicholas C; Chaudhari, Ajit M W; Siston, Robert A

    2017-11-01

    With more than 29,000 OpenSim users, several musculoskeletal models with varying levels of complexity are available to study human gait. However, how different model parameters affect estimated joint and muscle function between models is not fully understood. The purpose of this study is to determine the effects of four OpenSim models (Gait2392, Lower Limb Model 2010, Full-Body OpenSim Model, and Full Body Model 2016) on gait mechanics and estimates of muscle forces and activations. Using OpenSim 3.1 and the same experimental data for all models, six young adults were scaled in each model, gait kinematics were reproduced, and static optimization estimated muscle function. Simulated measures differed between models by up to 6.5° knee range of motion, 0.012 Nm/Nm peak knee flexion moment, 0.49 peak rectus femoris activation, and 462 N peak rectus femoris force. Differences in coordinate system definitions between models altered joint kinematics, influencing joint moments. Muscle parameter and joint moment discrepancies altered muscle activations and forces. Additional model complexity yielded greater error between experimental and simulated measures; therefore, this study suggests Gait2392 is a sufficient model for studying walking in healthy young adults. Future research is needed to determine which model(s) is best for tasks with more complex motion.
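
    For orientation, the static optimization step used in these comparisons resolves muscle redundancy frame by frame with a constrained minimization of the general form

    $$ \min_{a}\;\sum_{m=1}^{n} a_m^{\,p} \quad \text{s.t.} \quad \sum_{m=1}^{n} a_m\,F_m^{\max}\,r_{m,j} = \tau_j \;\;\forall j, \qquad 0 \le a_m \le 1, $$

    where a_m are muscle activations, F_m^max the maximum isometric forces, r_{m,j} the moment arms about coordinate j, and \tau_j the net joint moments from inverse dynamics; p = 2 is a common choice. Differences in F_m^max, moment arms, and coordinate definitions between the four models feed directly into this problem, which is why the activation and force estimates diverge.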

  14. Qualitative investigation of booster recovery in open sea

    NASA Technical Reports Server (NTRS)

    Beck, P. E.

    1973-01-01

    Limited tests were conducted using a 1/27 scale model of a Titan 3C booster plus 1/32.9 and 1/15.6 scale models of a solid rocket booster case to establish some of the characteristics that will affect recovery operations in open seas. This preliminary effort was designed to provide additional background information for conceptual development of a waterborne recovery system for space shuttle boosters, pending initiation of comprehensive studies. The models were not instrumented; therefore, all data are qualitative (approximations) and are based on observations plus photographic coverage.

  15. Epitaxial lateral overgrowth of InP on Si from nano-openings: Theoretical and experimental indication for defect filtering throughout the grown layer

    NASA Astrophysics Data System (ADS)

    Olsson, F.; Xie, M.; Lourdudoss, S.; Prieto, I.; Postigo, P. A.

    2008-11-01

    We present a model for the filtration of dislocations inside the seed window in epitaxial lateral overgrowth (ELO). We found that, when the additive effects of image and gliding forces exceed the defect line tension force, filtering can occur even in the openings. The model is applied to ELO of InP on Si where the opening size and the thermal stress arising due to the mask and the grown material are taken into account and analyzed. Further, we have also designed the mask patterns in net structures, where the tilting angles of the openings in the nets are chosen in order to take advantage of the filtering in the openings more effectively, and to minimize new defects due to coalescence in the ELO. Photoluminescence intensities of ELO InP on Si and on InP are compared and found to be in qualitative agreement with the model.

  16. BPS States, Crystals, and Matrices

    DOE PAGES

    Sułkowski, Piotr

    2011-01-01

    We review free fermion, melting crystal, and matrix model representations of wall-crossing phenomena on local, toric Calabi-Yau manifolds. We consider both unrefined and refined BPS counting of closed BPS states involving D2- and D0-branes bound to a D6-brane, as well as open BPS states involving open D2-branes ending on an additional D4-brane. An appropriate limit of these constructions provides, among other things, a matrix model representation of refined and unrefined topological string amplitudes.

  17. Bayesian Model Development for Analysis of Open Source Information to Support the Assessment of Nuclear Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe N.; Whitney, Paul D.; White, Amanda M.

    2013-07-15

    Pacific Northwest National Laboratory has spent several years researching, developing, and validating large Bayesian network models to support integration of open source data sets for nuclear proliferation research. Our current work focuses on generating a set of interrelated models for multi-source assessment of nuclear programs, as opposed to a single comprehensive model. By using this approach, we can break down the models to cover logical sub-problems that can utilize different expertise and data sources. This approach allows researchers to utilize the models individually or in combination to detect and characterize a nuclear program and identify data gaps. The models operate at various levels of granularity, covering a combination of state-level assessments with more detailed models of site or facility characteristics. This paper will describe the current open source-driven, nuclear nonproliferation models under development, the pros and cons of the analytical approach, and areas for additional research.
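
    A toy illustration of the underlying idea of combining independent open-source indicators in a Bayesian update; the indicator names, likelihood values, and prior are invented, and the laboratory's actual networks are far larger and interrelated rather than naive-Bayes.

    ```python
    # Hedged, toy Bayes update combining two hypothetical indicators. All numbers
    # are invented; this is not the PNNL model structure.
    prior = 0.10  # P(activity of interest) before evidence (hypothetical)
    likelihoods = {  # P(indicator | activity), P(indicator | no activity) -- hypothetical
        "procurement_report": (0.70, 0.20),
        "trade_data_anomaly": (0.50, 0.30),
    }
    odds = prior / (1.0 - prior)
    for p_if_active, p_if_inactive in likelihoods.values():
        odds *= p_if_active / p_if_inactive   # naive (independence) update
    posterior = odds / (1.0 + odds)
    print(f"posterior probability: {posterior:.2f}")
    ```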

  18. ORBDA: An openEHR benchmark dataset for performance assessment of electronic health record servers.

    PubMed

    Teodoro, Douglas; Sundvall, Erik; João Junior, Mario; Ruch, Patrick; Miranda Freire, Sergio

    2018-01-01

    The openEHR specifications are designed to support implementation of flexible and interoperable Electronic Health Record (EHR) systems. Despite the increasing number of solutions based on the openEHR specifications, it is difficult to find publicly available healthcare datasets in the openEHR format that can be used to test, compare and validate different data persistence mechanisms for openEHR. To foster research on openEHR servers, we present the openEHR Benchmark Dataset, ORBDA, a very large healthcare benchmark dataset encoded using the openEHR formalism. To construct ORBDA, we extracted and cleaned a de-identified dataset from the Brazilian National Healthcare System (SUS) containing hospitalisation and high complexity procedures information and formalised it using a set of openEHR archetypes and templates. Then, we implemented a tool to enrich the raw relational data and convert it into the openEHR model using the openEHR Java reference model library. The ORBDA dataset is available in composition, versioned composition and EHR openEHR representations in XML and JSON formats. In total, the dataset contains more than 150 million composition records. We describe the dataset and provide means to access it. Additionally, we demonstrate the usage of ORBDA for evaluating inserting throughput and query latency performances of some NoSQL database management systems. We believe that ORBDA is a valuable asset for assessing storage models for openEHR-based information systems during the software engineering process. It may also be a suitable component in future standardised benchmarking of available openEHR storage platforms.
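
    The throughput and latency measurements described can be pictured with a sketch like the following; the `store` object and its `insert`/`query` methods are hypothetical placeholders, not the ORBDA tooling or any specific NoSQL client API.

    ```python
    # Hedged sketch of insert-throughput and query-latency measurement. The
    # `store` interface is a hypothetical placeholder.
    import time

    def insert_throughput(store, compositions):
        """Documents inserted per second for one batch."""
        start = time.perf_counter()
        for doc in compositions:
            store.insert(doc)                  # placeholder call
        return len(compositions) / (time.perf_counter() - start)

    def mean_query_latency(store, query, repeats=100):
        """Mean seconds per query over `repeats` runs."""
        start = time.perf_counter()
        for _ in range(repeats):
            store.query(query)                 # placeholder call
        return (time.perf_counter() - start) / repeats
    ```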

  19. ORBDA: An openEHR benchmark dataset for performance assessment of electronic health record servers

    PubMed Central

    Sundvall, Erik; João Junior, Mario; Ruch, Patrick; Miranda Freire, Sergio

    2018-01-01

    The openEHR specifications are designed to support implementation of flexible and interoperable Electronic Health Record (EHR) systems. Despite the increasing number of solutions based on the openEHR specifications, it is difficult to find publicly available healthcare datasets in the openEHR format that can be used to test, compare and validate different data persistence mechanisms for openEHR. To foster research on openEHR servers, we present the openEHR Benchmark Dataset, ORBDA, a very large healthcare benchmark dataset encoded using the openEHR formalism. To construct ORBDA, we extracted and cleaned a de-identified dataset from the Brazilian National Healthcare System (SUS) containing hospitalisation and high complexity procedures information and formalised it using a set of openEHR archetypes and templates. Then, we implemented a tool to enrich the raw relational data and convert it into the openEHR model using the openEHR Java reference model library. The ORBDA dataset is available in composition, versioned composition and EHR openEHR representations in XML and JSON formats. In total, the dataset contains more than 150 million composition records. We describe the dataset and provide means to access it. Additionally, we demonstrate the usage of ORBDA for evaluating inserting throughput and query latency performances of some NoSQL database management systems. We believe that ORBDA is a valuable asset for assessing storage models for openEHR-based information systems during the software engineering process. It may also be a suitable component in future standardised benchmarking of available openEHR storage platforms. PMID:29293556

  20. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools suited for Space Physics post processing. Building on the work from the Center For Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The tool-kit plugin currently provides tools for reading LFM and Enlil data sets, and provides automated tools for data comparison with NASA's CDAweb database. As work progresses, many additional tools will be added and through open-source collaboration, we hope to add readers for additional model types, as well as any additional tools deemed necessary by the scientific public. The ultimate end goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  1. 78 FR 29139 - Medicare Program; Bundled Payments for Care Improvement Model 1 Open Period

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-17

    ... initiative. DATES: Model 1 of the Bundled Payments for Care Improvement Deadline: Interested organizations... initiative. For additional information on this initiative go to the CMS Center for Medicare and Medicaid Innovation Web site at http://innovation.cms.gov/initiatives/BPCI-Model-1/index.html . SUPPLEMENTARY...

  2. Parallel implementation of approximate atomistic models of the AMOEBA polarizable model

    NASA Astrophysics Data System (ADS)

    Demerdash, Omar; Head-Gordon, Teresa

    2016-11-01

    In this work we present a replicated data hybrid OpenMP/MPI implementation of a hierarchical progression of approximate classical polarizable models that yields speedups of up to ∼10 compared to the standard OpenMP implementation of the exact parent AMOEBA polarizable model. In addition, our parallel implementation exhibits reasonable weak and strong scaling. The resulting parallel software will prove useful for those who are interested in how molecular properties converge in the condensed phase with respect to the MBE, it provides a fruitful test bed for exploring different electrostatic embedding schemes, and offers an interesting possibility for future exascale computing paradigms.
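
    The replicated-data parallelization pattern can be illustrated with an MPI-only sketch in Python (mpi4py): every rank holds the full system, independent energy terms (for instance fragments of the many-body expansion) are divided round-robin, and the total is recovered with an allreduce. This shows the pattern only and is not the authors' OpenMP/MPI AMOEBA code.

    ```python
    # Hedged sketch of a replicated-data MPI work division over independent
    # energy terms, with the total recovered by an allreduce. Not the authors'
    # implementation; fragment energies here are dummy values.
    from mpi4py import MPI

    def fragment_energy(index):
        """Placeholder for an expensive per-fragment energy evaluation."""
        return 0.001 * index  # dummy value

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    n_fragments = 1000  # hypothetical count; every rank holds the replicated data

    local_sum = sum(fragment_energy(i) for i in range(rank, n_fragments, size))
    total_energy = comm.allreduce(local_sum, op=MPI.SUM)
    if rank == 0:
        print(f"total energy: {total_energy:.3f}")
    ```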

  3. Coastal Processes with Improved Tidal Opening in Chilika Lagoon (east Coast of India)

    NASA Astrophysics Data System (ADS)

    Jayaraman, Girija; Dube, Anumeha

    Chilika Lagoon (19°28′-19°54′N and 85°06′-85°36′E) is the largest brackish water lagoon with estuarine character. Interest in detailed analysis of the ecology of the lagoon and the various factors affecting it is due to the opening of the new mouth on September 23, 2000 to resolve the threat to its environment from various factors - Eutrophication, weed proliferation, siltation, industrial pollution, and depletion of bioresources. The opening of the new mouth has changed the lagoon environment significantly with better socio-economic implications. There is a serious concern if the significant improvement in the biological productivity of the lagoon post-mouth opening is indeed sustainable. The present study focuses on the changes in the coastal processes as a result of the additional opening of a new mouth. Our results based on mathematical modeling and numerical simulation compare the dynamics, nutrient, and plankton distribution before and after the new mouth opening. The model could confirm the significant increase (14-66% depending on the sector) in the salinity after the new mouth opening, the maximum change being observed in the channel which connects the lagoon to the sea. The constriction in the lagoon which blocks the tidal effects entering the lagoon must be responsible for maintaining the main body of the lagoon with low salinity. The ecological model is first tested for different sectors individually before a complete model, including the entire lagoon area, is included incorporating their distinct characteristics. The model is validated with available observations of plankton and nutrients made before the opening of the new mouth. It predicts the annual distribution of plankton in all the sectors of the lagoon for post-mouth opening which is to be verified when the data will be forthcoming.

  4. Coastal Processes with Improved Tidal Opening in Chilika Lagoon (east Coast of India)

    NASA Astrophysics Data System (ADS)

    Jayaraman, Girija; Dube, Anumeha

    Chilika Lagoon (19°28-19°54'N and 85°06-85°36'E) is the largest brackish water lagoon with estuarine character. Interest in detailed analysis of the ecology of the lagoon and the various factors affecting it is due to the opening of the new mouth on September 23, 2000 to resolve the threat to its environment from various factors — Eutrophication, weed proliferation, siltation, industrial pollution, and depletion of bioresources. The opening of the new mouth has changed the lagoon environment significantly with better socio-economic implications. There is a serious concern if the significant improvement in the biological productivity of the lagoon post-mouth opening is indeed sustainable. The present study focuses on the changes in the coastal processes as a result of the additional opening of a new mouth. Our results based on mathematical modeling and numerical simulation compare the dynamics, nutrient, and plankton distribution before and after the new mouth opening. The model could confirm the significant increase (14-66% depending on the sector) in the salinity after the new mouth opening, the maximum change being observed in the channel which connects the lagoon to the sea. The constriction in the lagoon which blocks the tidal effects entering the lagoon must be responsible for maintaining the main body of the lagoon with low salinity. The ecological model is first tested for different sectors individually before a complete model, including the entire lagoon area, is included incorporating their distinct characteristics. The model is validated with available observations of plankton and nutrients made before the opening of the new mouth. It predicts the annual distribution of plankton in all the sectors of the lagoon for post-mouth opening which is to be verified when the data will be forthcoming.

  5. Architectural Improvements and New Processing Tools for the Open XAL Online Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, Christopher K; Pelaia II, Tom; Freed, Jonathan M

    The online model is the component of Open XAL providing accelerator modeling, simulation, and dynamic synchronization to live hardware. Significant architectural changes and feature additions have been recently made in two separate areas: 1) the managing and processing of simulation data, and 2) the modeling of RF cavities. Simulation data and data processing have been completely decoupled. A single class manages all simulation data while standard tools were developed for processing the simulation results. RF accelerating cavities are now modeled as composite structures where parameter and dynamics computations are distributed. The beam and hardware models both maintain their relative phase information, which allows for dynamic phase slip and elapsed time computation.

  6. Computational toxicology using the OpenTox application programming interface and Bioclipse

    PubMed Central

    2011-01-01

    Background: Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings: This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions: A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173
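
    A sketch of what calling such a REST prediction service looks like from a script; the base URL, path, and payload field below are placeholders and do not document the real OpenTox API.

    ```python
    # Hedged sketch of posting a query structure to a REST prediction service of
    # the kind the OpenTox framework exposes. Endpoint and payload are placeholders.
    import requests

    SERVICE_URL = "https://example.org/opentox-like/model/1"  # placeholder endpoint

    def predict(smiles):
        """POST a structure to a hypothetical model resource and return the raw reply."""
        response = requests.post(SERVICE_URL, data={"compound": smiles}, timeout=30)
        response.raise_for_status()
        return response.text

    print(predict("c1ccccc1"))  # benzene as an example query structure
    ```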

  7. Experimental Studies on the Mechanical Behaviour of Rock Joints with Various Openings

    NASA Astrophysics Data System (ADS)

    Li, Y.; Oh, J.; Mitra, R.; Hebblewhite, B.

    2016-03-01

    The mechanical behaviour of rough joints is markedly affected by the degree of joint opening. A systematic experimental study was conducted to investigate the effect of the initial opening on both normal and shear deformations of rock joints. Two types of joints with triangular asperities were produced in the laboratory and subjected to compression tests and direct shear tests with different initial opening values. The results showed that opened rock joints allow much greater normal closure and result in much lower normal stiffness. A semi-logarithmic law incorporating the degree of interlocking is proposed to describe the normal deformation of opened rock joints. The proposed equation agrees well with the experimental results. Additionally, the results of direct shear tests demonstrated that shear strength and dilation are reduced because of reduced involvement of and increased damage to asperities in the process of shearing. The results indicate that constitutive models of rock joints that consider the true asperity contact area can be used to predict shear resistance along opened rock joints. Because rock masses are loosened and rock joints become open after excavation, the model suggested in this study can be incorporated into numerical procedures such as finite-element or discrete-element methods. Use of the model could then increase the accuracy and reliability of stability predictions for rock masses under excavation.
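
    For orientation, a semi-logarithmic normal closure relation generically takes the form

    $$ \delta_n = C_1 + C_2\,\ln\!\left(\frac{\sigma_n}{\sigma_{n0}}\right), $$

    where \delta_n is the joint closure under normal stress \sigma_n, \sigma_{n0} is a reference stress, and C_1 and C_2 are fitted constants; in the formulation proposed in the paper the constants additionally depend on the degree of interlocking of the opened joint. This is a generic illustrative form, not the authors' exact equation.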

  8. a Framework for AN Open Source Geospatial Certification Model

    NASA Astrophysics Data System (ADS)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arena as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge concepts, i.e., the NCGIA Core Curriculum, URISA Body Of Knowledge, USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey on the relevance of Open Source was performed and evaluated, with 105 respondents worldwide. Fifteen interviews (face-to-face or by telephone) with experts in different countries provided additional insights into Open Source usage and certification. The findings led to the development of a certification framework of three main categories with eleven sub-categories in total: "Certified Open Source Geospatial Data Associate / Professional", "Certified Open Source Geospatial Analyst Remote Sensing & GIS", "Certified Open Source Geospatial Cartographer", "Certified Open Source Geospatial Expert", "Certified Open Source Geospatial Associate Developer / Professional Developer", "Certified Open Source Geospatial Architect". Each certification is described by pre-conditions, scope and objectives, course content, recommended software packages, target group, expected benefits, and the methods of examination. Examinations can be complemented by proof of professional career paths and achievements, which requires a peer qualification evaluation. After a couple of years, recertification is required. The concept seeks accreditation by the OSGeo Foundation (and other bodies) and international support from a group of geospatial scientific institutions to achieve wide international acceptance for this Open Source geospatial certification model. A business case for Open Source certification and a corresponding SWOT model are examined to support the goals of the Geo-For-All initiative of the ICA-OSGeo pact.

  9. Continuum Damage Mechanics Models for the Analysis of Progressive Failure in Open-Hole Tension Laminates

    NASA Technical Reports Server (NTRS)

    Song, Kyonchan; Li, Yingyong; Rose, Cheryl A.

    2011-01-01

    The performance of a state-of-the-art continuum damage mechanics model for interlaminar damage, coupled with a cohesive zone model for delamination is examined for failure prediction of quasi-isotropic open-hole tension laminates. Limitations of continuum representations of intra-ply damage and the effect of mesh orientation on the analysis predictions are discussed. It is shown that accurate prediction of matrix crack paths and stress redistribution after cracking requires a mesh aligned with the fiber orientation. Based on these results, an aligned mesh is proposed for analysis of the open-hole tension specimens consisting of different meshes within the individual plies, such that the element edges are aligned with the ply fiber direction. The modeling approach is assessed by comparison of analysis predictions to experimental data for specimen configurations in which failure is dominated by complex interactions between matrix cracks and delaminations. It is shown that the different failure mechanisms observed in the tests are well predicted. In addition, the modeling approach is demonstrated to predict proper trends in the effect of scaling on strength and failure mechanisms of quasi-isotropic open-hole tension laminates.
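
    For context, delamination in such analyses is commonly represented by a bilinear cohesive traction-separation law (shown generically; the paper's specific cohesive formulation may differ):

    $$ t(\delta) = \begin{cases} K\,\delta, & \delta \le \delta_0,\\ (1-d)\,K\,\delta, & \delta_0 < \delta < \delta_f, \end{cases} \qquad d = \frac{\delta_f\,(\delta - \delta_0)}{\delta\,(\delta_f - \delta_0)}, $$

    where K is the initial penalty stiffness, \delta_0 the separation at damage onset, and \delta_f the separation at complete failure; the area under the curve equals the fracture toughness G_c.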

  10. Comparison of Open-Hole Compression Strength and Compression After Impact Strength on Carbon Fiber/Epoxy Laminates for the Ares I Composite Interstage

    NASA Technical Reports Server (NTRS)

    Hodge, Andrew J.; Nettles, Alan T.; Jackson, Justin R.

    2011-01-01

    Notched (open hole) composite laminates were tested in compression. The effect on strength of various sizes of through holes was examined. Results were compared to the average stress criterion model. Additionally, laminated sandwich structures were damaged from low-velocity impact with various impact energy levels and different impactor geometries. The compression strength relative to damage size was compared to the notched compression result strength. Open-hole compression strength was found to provide a reasonable bound on compression after impact.
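
    The average stress criterion referenced here, in its common form for an infinite plate with a circular hole of radius R (shown for context; the study's implementation may differ in detail), predicts the notched-to-unnotched strength ratio from a characteristic averaging distance a_0 ahead of the hole:

    $$ \frac{\sigma_N}{\sigma_0} = \frac{2\,(1-\xi)}{2 - \xi^{2} - \xi^{4}}, \qquad \xi = \frac{R}{R + a_0}. $$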

  11. Collaborative development of predictive toxicology applications

    PubMed Central

    2010-01-01

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way. PMID:20807436

  12. Collaborative development of predictive toxicology applications.

    PubMed

    Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia

    2010-08-31

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.

  13. From Learning Object to Learning Cell: A Resource Organization Model for Ubiquitous Learning

    ERIC Educational Resources Information Center

    Yu, Shengquan; Yang, Xianmin; Cheng, Gang; Wang, Minjuan

    2015-01-01

    This paper presents a new model for organizing learning resources: Learning Cell. This model is open, evolving, cohesive, social, and context-aware. By introducing a time dimension into the organization of learning resources, Learning Cell supports the dynamic evolution of learning resources while they are being used. In addition, by introducing a…

  14. Modelling of additive manufacturing processes: a review and classification

    NASA Astrophysics Data System (ADS)

    Stavropoulos, Panagiotis; Foteinopoulos, Panagis

    2018-03-01

    Additive manufacturing (AM) is a very promising technology; however, there are a number of open issues related to the different AM processes. The literature on modelling the existing AM processes is reviewed and classified. A categorization of the different AM processes in process groups, according to the process mechanism, has been conducted and the most important issues are stated. Suggestions are made as to which approach is more appropriate according to the key performance indicator desired to be modelled and a discussion is included as to the way that future modelling work can better contribute to improving today's AM process understanding.

  15. Multispectral open-air intraoperative fluorescence imaging.

    PubMed

    Behrooz, Ali; Waterman, Peter; Vasquez, Kristine O; Meganck, Jeff; Peterson, Jeffrey D; Faqir, Ilias; Kempner, Joshua

    2017-08-01

    Intraoperative fluorescence imaging informs decisions regarding surgical margins by detecting and localizing signals from fluorescent reporters, labeling targets such as malignant tissues. This guidance reduces the likelihood of undetected malignant tissue remaining after resection, eliminating the need for additional treatment or surgery. The primary challenges in performing open-air intraoperative fluorescence imaging come from the weak intensity of the fluorescence signal in the presence of strong surgical and ambient illumination, and the auto-fluorescence of non-target components, such as tissue, especially in the visible spectral window (400-650 nm). In this work, a multispectral open-air fluorescence imaging system is presented for translational image-guided intraoperative applications, which overcomes these challenges. The system is capable of imaging weak fluorescence signals with nanomolar sensitivity in the presence of surgical illumination. This is done using synchronized fluorescence excitation and image acquisition with real-time background subtraction. Additionally, the system uses a liquid crystal tunable filter for acquisition of multispectral images that are used to spectrally unmix target fluorescence from non-target auto-fluorescence. Results are validated by preclinical studies on murine models and translational canine oncology models.
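
    The spectral unmixing step can be illustrated with a linear mixing model solved by nonnegative least squares; the endmember spectra and the mixed pixel below are fabricated, and this is not the instrument's actual processing chain.

    ```python
    # Hedged sketch of linear spectral unmixing: a measured multispectral pixel is
    # modeled as a nonnegative mix of known endmember spectra (target fluorophore
    # vs. tissue autofluorescence). All spectra are fabricated.
    import numpy as np
    from scipy.optimize import nnls

    target_spectrum    = np.array([0.05, 0.20, 0.60, 0.90, 0.40, 0.10])
    autofluor_spectrum = np.array([0.80, 0.60, 0.40, 0.25, 0.15, 0.10])
    endmembers = np.column_stack([target_spectrum, autofluor_spectrum])  # bands x endmembers

    measured_pixel = 0.3 * target_spectrum + 1.2 * autofluor_spectrum    # synthetic mixture
    abundances, _residual = nnls(endmembers, measured_pixel)             # nonnegative fit
    print("estimated target / autofluorescence abundances:", abundances)
    ```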

  16. Attitudes toward Face-to-Face and Online Counseling: Roles of Self-Concealment, Openness to Experience, Loss of Face, Stigma, and Disclosure Expectations among Korean College Students

    ERIC Educational Resources Information Center

    Bathje, Geoff J.; Kim, Eunha; Rau, Ellen; Bassiouny, Muhammad Adam; Kim, Taehoon

    2014-01-01

    This study examined attitudes toward face-to-face (f2f) and online counseling among 228 Korean college students. In addition, it tested a hypothesized model proposing that general propensities (i.e., self-concealment, openness to experience, and loss of face) would influence counseling-specific expectations (i.e., self-stigma and disclosure…

  17. A Programming Model Performance Study Using the NAS Parallel Benchmarks

    DOE PAGES

    Shan, Hongzhang; Blagojević, Filip; Min, Seung-Jai; ...

    2010-01-01

    Harnessing the power of multicore platforms is challenging due to the additional levels of parallelism present. In this paper we use the NAS Parallel Benchmarks to study three programming models, MPI, OpenMP and PGAS, to understand their performance and memory usage characteristics on current multicore architectures. To understand these characteristics we use the Integrated Performance Monitoring tool and other ways to measure communication versus computation time, as well as the fraction of the run time spent in OpenMP. The benchmarks are run on two different Cray XT5 systems and an Infiniband cluster. Our results show that in general the three programming models exhibit very similar performance characteristics. In a few cases, OpenMP is significantly faster because it explicitly avoids communication. For these particular cases, we were able to re-write the UPC versions and achieve equal performance to OpenMP. Using OpenMP was also the most advantageous in terms of memory usage. We also compare performance differences between the two Cray systems, which have quad-core and hex-core processors. We show that at scale the performance is almost always slower on the hex-core system because of increased contention for network resources.

  18. Application of crowd-sourced data to multi-scale evolutionary exposure and vulnerability models

    NASA Astrophysics Data System (ADS)

    Pittore, Massimiliano

    2016-04-01

    Seismic exposure, defined as the assets (population, buildings, infrastructure) exposed to earthquake hazard and susceptible to damage, is a critical, but often neglected, component of seismic risk assessment. This partly stems from the burden associated with the compilation of a useful and reliable model over wide spatial areas. While detailed engineering data still have to be collected in order to constrain exposure and vulnerability models, the availability of increasingly large crowd-sourced datasets (e.g., OpenStreetMap) opens up the exciting possibility of generating incrementally evolving models. Integrating crowd-sourced and authoritative data using statistical learning methodologies can reduce model uncertainties and also provide additional drive and motivation for volunteered geoinformation collection. A case study in Central Asia will be presented and discussed.

  19. Nonlinear Structured Growth Mixture Models in Mplus and OpenMx

    PubMed Central

    Grimm, Kevin J.; Ram, Nilam; Estabrook, Ryne

    2014-01-01

    Growth mixture models (GMMs; Muthén & Muthén, 2000; Muthén & Shedden, 1999) are a combination of latent curve models (LCMs) and finite mixture models to examine the existence of latent classes that follow distinct developmental patterns. GMMs are often fit with linear, latent basis, multiphase, or polynomial change models because of their common use, flexibility in modeling many types of change patterns, the availability of statistical programs to fit such models, and the ease of programming. In this paper, we present additional ways of modeling nonlinear change patterns with GMMs. Specifically, we show how LCMs that follow specific nonlinear functions can be extended to examine the presence of multiple latent classes using the Mplus and OpenMx computer programs. These models are fit to longitudinal reading data from the Early Childhood Longitudinal Study-Kindergarten Cohort to illustrate their use. PMID:25419006
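
    As an illustration of the kind of nonlinear target function meant here, an exponential latent growth curve with class-specific parameters can be written (one common choice, not the only form the paper covers) as

    $$ y_{ti} = \beta_{0c} + \beta_{1c}\,\bigl(1 - e^{-\beta_{2c}\,t}\bigr) + e_{ti}, \qquad c = 1,\dots,K, $$

    where y_{ti} is the reading score of child i at time t, c indexes the latent class, and the mixture model estimates the class proportions together with the class-specific growth parameters \beta_{0c}, \beta_{1c}, and \beta_{2c}.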

  20. Recent Updates to the System Advisor Model (SAM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiOrio, Nicholas A

    The System Advisor Model (SAM) is a mature suite of techno-economic models for many renewable energy technologies that can be downloaded for free as a desktop application or software development kit. SAM is used for system-level modeling, including generating performance predictions and cost-of-energy estimates. Recent additions covered here include the release of the code as an open source project on GitHub, the ability to download data directly into SAM from the National Solar Radiation Database (NSRDB), and updates to a user-interface macro that assists with PV system sizing. A brief update on SAM's battery model and its integration with the detailed photovoltaic model will also be discussed. Finally, an outline of planned work for the next year will be presented, including the addition of a bifacial model, support for multiple MPPT inputs for detailed inverter modeling, and the addition of a model for inverter thermal behavior.

  1. Improving daylight in mosques using domes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alturki, I.; Schiler, M.; Boyajian, Y.

    1996-10-01

    This paper studies the possibilities for improving daylight in mosques by measuring the illumination level under various domes in an old mosque, the Mosque of Guzelce Hasan Bey in Hayrabolu, using an architectural physical model. The illumination level under the domes was tested for three different cases: a dome without openings (the original building), a dome with a central opening, and a dome with openings around the base. It was found that a dome with openings around the base provides evenly distributed light over the whole prayer hall during the critical hours of 12:00 p.m. and 3:00 p.m. In addition, it improves the quality and quantity of light.

  2. Additional considerations to the model of musical empathic engagement: Empathy facets, preferences, and openness. Comment on "Music, empathy, and cultural understanding" by E. Clarke et al.

    NASA Astrophysics Data System (ADS)

    Greenberg, David M.

    2015-12-01

    Recent research has shown that empathy plays an important role in musical experience including perception, preference, and performance [9,11,13,16,17]. Clarke, DeNora, and Vuoskoski's [4] timely review extends this work by establishing a framework for how "music empathic engagement" can facilitate cultural understanding. In this commentary I draw attention to some additional factors that may be at play in their model.

  3. An Open-Source Toolbox for Surrogate Modeling of Joint Contact Mechanics

    PubMed Central

    Eskinazi, Ilan

    2016-01-01

    Goal Incorporation of elastic joint contact models into simulations of human movement could facilitate studying the interactions between muscles, ligaments, and bones. Unfortunately, elastic joint contact models are often too expensive computationally to be used within iterative simulation frameworks. This limitation can be overcome by using fast and accurate surrogate contact models that fit or interpolate input-output data sampled from existing elastic contact models. However, construction of surrogate contact models remains an arduous task. The aim of this paper is to introduce an open-source program called Surrogate Contact Modeling Toolbox (SCMT) that facilitates surrogate contact model creation, evaluation, and use. Methods SCMT interacts with the third party software FEBio to perform elastic contact analyses of finite element models and uses Matlab to train neural networks that fit the input-output contact data. SCMT features sample point generation for multiple domains, automated sampling, sample point filtering, and surrogate model training and testing. Results An overview of the software is presented along with two example applications. The first example demonstrates creation of surrogate contact models of artificial tibiofemoral and patellofemoral joints and evaluates their computational speed and accuracy, while the second demonstrates the use of surrogate contact models in a forward dynamic simulation of an open-chain leg extension-flexion motion. Conclusion SCMT facilitates the creation of computationally fast and accurate surrogate contact models. Additionally, it serves as a bridge between FEBio and OpenSim musculoskeletal modeling software. Significance Researchers may now create and deploy surrogate models of elastic joint contact with minimal effort. PMID:26186761
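
    SCMT itself couples FEBio contact analyses with Matlab-trained neural networks; purely to illustrate the surrogate idea (sample an expensive contact model over its input domain, then fit a fast regressor and test it on held-out samples), a minimal Python sketch might look like the following, where expensive_contact_model is a stand-in for an elastic finite-element contact analysis:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    def expensive_contact_model(pose):
        # placeholder for an elastic FE contact analysis (e.g., a FEBio run) that maps
        # a joint pose (flexion angle, two translations) to contact load outputs
        flex, tx, ty = pose
        return np.array([np.sin(flex) + 0.1 * tx, np.cos(flex) * ty])

    # 1. sample the input domain (SCMT offers several sampling and filtering schemes)
    rng = np.random.default_rng(1)
    X = rng.uniform([-1.0, -5.0, -5.0], [1.0, 5.0, 5.0], size=(2000, 3))
    Y = np.array([expensive_contact_model(x) for x in X])

    # 2. train the fast surrogate and check its accuracy on held-out samples
    X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)
    surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000).fit(X_tr, Y_tr)
    print("surrogate R^2 on held-out samples:", surrogate.score(X_te, Y_te))
    ```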

  4. Offset-Free Model Predictive Control of Open Water Channel Based on Moving Horizon Estimation

    NASA Astrophysics Data System (ADS)

    Ekin Aydin, Boran; Rutten, Martine

    2016-04-01

    Model predictive control (MPC) is a powerful control option which is increasingly used by operational water managers for managing water systems. The explicit consideration of constraints and multi-objective management are important features of MPC. However, due to water loss in open water systems through seepage, leakage and evaporation, a mismatch between the model and the real system will be created. This mismatch affects the performance of MPC and creates an offset from the reference set point of the water level. We present model predictive control based on moving horizon estimation (MHE-MPC) to achieve offset-free control of the water level in open water canals. MHE-MPC uses the past predictions of the model and the past measurements of the system to estimate unknown disturbances, so that the offset in the controlled water level is systematically removed. We numerically tested MHE-MPC on an accurate hydro-dynamic model of the laboratory canal UPC-PAC located in Barcelona. In addition, we also applied the well-known disturbance-modeling offset-free control scheme to the same test case. Simulation experiments on a single canal reach show that MHE-MPC outperforms the disturbance-modeling offset-free control scheme.
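
    The offset-removal idea (use past predictions and past measurements to estimate the unmodelled loss term, then fold that estimate into the controller's prediction) can be sketched on a toy single-pool canal model as follows; the storage-area model, parameters, and the one-step controller are illustrative only, not the UPC-PAC model or the authors' MHE formulation:

    ```python
    # toy canal pool: water level h [m], controlled inflow u [m^3/s], unknown losses d [m^3/s]
    A, Ts = 800.0, 60.0          # storage area [m^2], sample time [s]
    h_ref, d_true = 2.0, 0.015   # level set point, true (unmodelled) seepage/evaporation loss

    h, d_hat, u = 1.8, 0.0, 0.0
    for k in range(200):
        # "real" system: the nominal model ignores d_true, which creates the mismatch
        h_next = h + Ts / A * (u - d_true)

        # estimate the disturbance from the one-step prediction error (a crude stand-in for MHE)
        h_pred = h + Ts / A * (u - d_hat)
        d_hat += 0.5 * A / Ts * (h_pred - h_next)

        # one-step controller: pick u so the disturbance-corrected prediction hits the set point
        h = h_next
        u = max(0.0, A / Ts * (h_ref - h) + d_hat)

    print(f"final level {h:.3f} m (set point {h_ref} m), estimated loss {d_hat:.4f} m^3/s")
    ```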

  5. Effects of the window openings on the micro-environmental condition in a school bus

    NASA Astrophysics Data System (ADS)

    Li, Fei; Lee, Eon S.; Zhou, Bin; Liu, Junjie; Zhu, Yifang

    2017-10-01

    The school bus is an important micro-environment for children's health because the level of in-cabin air pollution can increase due to the bus's own exhaust in addition to on-road traffic emissions. However, it has been challenging to understand the in-cabin air quality that is associated with complex airflow patterns inside and outside a school bus. This study conducted Computational Fluid Dynamics (CFD) modeling analyses to determine the effects of window openings on the self-pollution of a school bus. Infiltration through the window gaps is modeled by applying variable numbers of active computational cells as a function of the effective area ratio of the opening. The experimental data on ventilation rates from the literature were used to validate the model. Ultrafine particles (UFPs) and black carbon (BC) concentrations were monitored in "real world" field campaigns using school buses. This modeling study examined the airflow pattern inside the school bus under four different types of side-window openings at 20, 40, and 60 mph (i.e., a total of 12 cases). We found that opening the driver's window could allow the infiltration of exhaust through window/door gaps in the back of the school bus, whereas opening windows in the middle of the school bus could mitigate this phenomenon. We also found that an increased driving speed (from 20 mph to 60 mph) could result in a higher ventilation rate (up to 3.4 times) and a lower mean age of air (down to 0.29 times) inside the bus.

  6. Prototyping an online wetland ecosystem services model using open model sharing standards

    USGS Publications Warehouse

    Feng, M.; Liu, S.; Euliss, N.H.; Young, Caitlin; Mushet, D.M.

    2011-01-01

    Great interest currently exists for developing ecosystem models to forecast how ecosystem services may change under alternative land use and climate futures. Ecosystem services are diverse and include supporting services or functions (e.g., primary production, nutrient cycling), provisioning services (e.g., wildlife, groundwater), regulating services (e.g., water purification, floodwater retention), and even cultural services (e.g., ecotourism, cultural heritage). Hence, the knowledge base necessary to quantify ecosystem services is broad and derived from many diverse scientific disciplines. Building the required interdisciplinary models is especially challenging as modelers from different locations and times may develop the disciplinary models needed for ecosystem simulations, and these models must be identified and made accessible to the interdisciplinary simulation. Additional difficulties include inconsistent data structures, formats, and metadata required by geospatial models as well as limitations on computing, storage, and connectivity. Traditional standalone and closed network systems cannot fully support sharing and integrating interdisciplinary geospatial models from variant sources. To address this need, we developed an approach to openly share and access geospatial computational models using distributed Geographic Information System (GIS) techniques and open geospatial standards. We included a means to share computational models compliant with Open Geospatial Consortium (OGC) Web Processing Services (WPS) standard to ensure modelers have an efficient and simplified means to publish new models. To demonstrate our approach, we developed five disciplinary models that can be integrated and shared to simulate a few of the ecosystem services (e.g., water storage, waterfowl breeding) that are provided by wetlands in the Prairie Pothole Region (PPR) of North America.
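
    As an illustration of how models published through OGC WPS are discovered by a client, a minimal Python request against a hypothetical endpoint (the URL is a placeholder, not the actual service described here) could look like this:

    ```python
    import requests
    import xml.etree.ElementTree as ET

    # hypothetical WPS endpoint publishing the wetland ecosystem-service models
    WPS_URL = "https://example.org/wps"

    resp = requests.get(WPS_URL, params={
        "service": "WPS", "version": "1.0.0", "request": "GetCapabilities"})
    resp.raise_for_status()

    # list the process identifiers and titles offered by the service
    ns = {"wps": "http://www.opengis.net/wps/1.0.0", "ows": "http://www.opengis.net/ows/1.1"}
    root = ET.fromstring(resp.content)
    for proc in root.findall(".//wps:Process", ns):
        ident = proc.find("ows:Identifier", ns)
        title = proc.find("ows:Title", ns)
        print(ident.text, "-", title.text if title is not None else "")
    ```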

  7. A Solution Strategy to Include the Opening of the Opercular Slits in Moving-Mesh CFD Models of Suction Feeding.

    PubMed

    Van Wassenbergh, Sam

    2015-07-01

    The gill cover of fish and pre-metamorphic salamanders has a key role in suction feeding by acting as a one-way valve. It initially closes and avoids an inflow of water through the gill slits, after which it opens to allow outflow of the water that was sucked through the mouth into the expanded buccopharyngeal cavity. However, due to the inability of analytical models (relying on the continuity principle) to calculate the flow of fluid through a cavity with two openings and that was changing in shape and size, stringent boundary conditions had to be used in previously developed mathematical models after the moment of the valve's opening. By solving additionally for the conservation of momentum, computational fluid dynamics (CFD) has the capacity to dynamically simulate these flows, but this technique also faces complications in modeling a transition from closed to open valves. Here, I present a relatively simple solution strategy to incorporate the opening of the valves, exemplified in an axisymmetrical model of a suction-feeding sunfish in ANSYS Fluent software. By controlling viscosity of a separately defined fluid entity in the region of the opercular cavity, early inflow can be blocked (high viscosity assigned) and later outflow can be allowed (changing viscosity to that of water). Finally, by analyzing the CFD solution obtained for the sunfish model, a few new insights into the biomechanics of suction feeding are gained. © The Author 2015. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.
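
    Stripped of any particular CFD package, the essence of the strategy is a time-controlled material property assigned to the separately defined opercular-cavity zone; conceptually the rule is as simple as the sketch below, where the opening time and the "blocking" viscosity are illustrative values rather than those used in the Fluent model:

    ```python
    MU_WATER = 1.0e-3   # Pa*s, ordinary water
    MU_BLOCK = 1.0e3    # Pa*s, artificially high viscosity that effectively closes the valve
    T_OPEN = 0.015      # s, instant at which the opercular valve is allowed to open (illustrative)

    def opercular_zone_viscosity(t):
        """Viscosity assigned to the opercular-cavity fluid zone at simulation time t."""
        # before the valve opens: block inflow through the gill slits with a very viscous zone
        # after the valve opens: restore the viscosity of water so outflow is permitted
        return MU_BLOCK if t < T_OPEN else MU_WATER
    ```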

  8. Reconfigurable Model Execution in the OpenMDAO Framework

    NASA Technical Reports Server (NTRS)

    Hwang, John T.

    2017-01-01

    NASA's OpenMDAO framework facilitates constructing complex models and computing their derivatives for multidisciplinary design optimization. Decomposing a model into components that follow a prescribed interface enables OpenMDAO to assemble multidisciplinary derivatives from the component derivatives using what amounts to the adjoint method, direct method, chain rule, global sensitivity equations, or any combination thereof, using the MAUD architecture. OpenMDAO also handles the distribution of processors among the disciplines by hierarchically grouping the components, and it automates the data transfer between components that are on different processors. These features have made OpenMDAO useful for applications in aircraft design, satellite design, wind turbine design, and aircraft engine design, among others. This paper presents new algorithms for OpenMDAO that enable reconfigurable model execution. This concept refers to dynamically changing, during execution, one or more of: the variable sizes, solution algorithm, parallel load balancing, or set of variables-i.e., adding and removing components, perhaps to switch to a higher-fidelity sub-model. Any component can reconfigure at any point, even when running in parallel with other components, and the reconfiguration algorithm presented here performs the synchronized updates to all other components that are affected. A reconfigurable software framework for multidisciplinary design optimization enables new adaptive solvers, adaptive parallelization, and new applications such as gradient-based optimization with overset flow solvers and adaptive mesh refinement. Benchmarking results demonstrate the time savings for reconfiguration compared to setting up the model again from scratch, which can be significant in large-scale problems. Additionally, the new reconfigurability feature is applied to a mission profile optimization problem for commercial aircraft where both the parametrization of the mission profile and the time discretization are adaptively refined, resulting in computational savings of roughly 10% and the elimination of oscillations in the optimized altitude profile.
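
    For readers unfamiliar with the framework, the component interface from which OpenMDAO assembles multidisciplinary derivatives looks roughly like the standard paraboloid example below; this sketches the ordinary ExplicitComponent API, not the new reconfiguration machinery described in the paper:

    ```python
    import openmdao.api as om

    class Paraboloid(om.ExplicitComponent):
        """f(x, y) = (x - 3)^2 + x*y + (y + 4)^2 - 3, with analytic partials."""

        def setup(self):
            self.add_input("x", val=0.0)
            self.add_input("y", val=0.0)
            self.add_output("f", val=0.0)
            self.declare_partials("f", ["x", "y"])

        def compute(self, inputs, outputs):
            x, y = inputs["x"], inputs["y"]
            outputs["f"] = (x - 3.0) ** 2 + x * y + (y + 4.0) ** 2 - 3.0

        def compute_partials(self, inputs, partials):
            x, y = inputs["x"], inputs["y"]
            partials["f", "x"] = 2.0 * (x - 3.0) + y
            partials["f", "y"] = x + 2.0 * (y + 4.0)

    prob = om.Problem()
    prob.model.add_subsystem("parab", Paraboloid(), promotes=["*"])
    prob.setup()
    prob.set_val("x", 5.0)
    prob.set_val("y", -2.0)
    prob.run_model()
    print(prob.get_val("f"), prob.compute_totals(of=["f"], wrt=["x", "y"]))
    ```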

  9. Java Web Simulation (JWS); a web based database of kinetic models.

    PubMed

    Snoep, J L; Olivier, B G

    2002-01-01

    Software to make a database of kinetic models accessible via the internet has been developed and a core database has been set up at http://jjj.biochem.sun.ac.za/. This repository of models, available to everyone with internet access, opens a whole new way in which we can make our models public. Via the database, a user can change enzyme parameters and run time simulations or steady state analyses. The interface is user friendly and no additional software is necessary. The database currently contains 10 models, but since the generation of the program code to include new models has largely been automated the addition of new models is straightforward and people are invited to submit their models to be included in the database.

  10. nmsBuilder: Freeware to create subject-specific musculoskeletal models for OpenSim.

    PubMed

    Valente, Giordano; Crimi, Gianluigi; Vanella, Nicola; Schileo, Enrico; Taddei, Fulvia

    2017-12-01

    Musculoskeletal modeling and simulations of movement have been increasingly used in orthopedic and neurological scenarios, with increased attention to subject-specific applications. In general, musculoskeletal modeling applications have been facilitated by the development of dedicated software tools; however, subject-specific studies have been limited also by time-consuming modeling workflows and high skilled expertise required. In addition, no reference tools exist to standardize the process of musculoskeletal model creation and make it more efficient. Here we present a freely available software application, nmsBuilder 2.0, to create musculoskeletal models in the file format of OpenSim, a widely-used open-source platform for musculoskeletal modeling and simulation. nmsBuilder 2.0 is the result of a major refactoring of a previous implementation that moved a first step toward an efficient workflow for subject-specific model creation. nmsBuilder includes a graphical user interface that provides access to all functionalities, based on a framework for computer-aided medicine written in C++. The operations implemented can be used in a workflow to create OpenSim musculoskeletal models from 3D surfaces. A first step includes data processing to create supporting objects necessary to create models, e.g. surfaces, anatomical landmarks, reference systems; and a second step includes the creation of OpenSim objects, e.g. bodies, joints, muscles, and the corresponding model. We present a case study using nmsBuilder 2.0: the creation of an MRI-based musculoskeletal model of the lower limb. The model included four rigid bodies, five degrees of freedom and 43 musculotendon actuators, and was created from 3D surfaces of the segmented images of a healthy subject through the modeling workflow implemented in the software application. We have presented nmsBuilder 2.0 for the creation of musculoskeletal OpenSim models from image-based data, and made it freely available via nmsbuilder.org. This application provides an efficient workflow for model creation and helps standardize the process. We hope this would help promote personalized applications in musculoskeletal biomechanics, including larger sample size studies, and might also represent a basis for future developments for specific applications. Copyright © 2017 Elsevier B.V. All rights reserved.
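
    nmsBuilder writes standard OpenSim model files; as a rough illustration of the kinds of objects it assembles (bodies, joints, and so on), OpenSim's own Python bindings can build a toy model as below. This is a sketch of the OpenSim scripting API rather than of nmsBuilder's workflow, and the masses and geometry are arbitrary:

    ```python
    import opensim as osim

    model = osim.Model()
    model.setName("toy_lower_limb")

    # a single rigid body standing in for a segmented femur surface
    femur = osim.Body("femur", 8.0, osim.Vec3(0), osim.Inertia(0.1, 0.1, 0.02))
    model.addBody(femur)

    # one rotational degree of freedom connecting the femur to ground
    hip = osim.PinJoint("hip",
                        model.getGround(), osim.Vec3(0), osim.Vec3(0),
                        femur, osim.Vec3(0, 0.25, 0), osim.Vec3(0))
    model.addJoint(hip)

    model.finalizeConnections()
    model.printToXML("toy_lower_limb.osim")
    ```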

  11. Modeling Spoken Word Recognition Performance by Pediatric Cochlear Implant Users using Feature Identification

    PubMed Central

    Frisch, Stefan A.; Pisoni, David B.

    2012-01-01

    Objective Computational simulations were carried out to evaluate the appropriateness of several psycholinguistic theories of spoken word recognition for children who use cochlear implants. These models also investigate the interrelations of commonly used measures of closed-set and open-set tests of speech perception. Design A software simulation of phoneme recognition performance was developed that uses feature identification scores as input. Two simulations of lexical access were developed. In one, early phoneme decisions are used in a lexical search to find the best matching candidate. In the second, phoneme decisions are made only when lexical access occurs. Simulated phoneme and word identification performance was then applied to behavioral data from the Phonetically Balanced Kindergarten test and Lexical Neighborhood Test of open-set word recognition. Simulations of performance were evaluated for children with prelingual sensorineural hearing loss who use cochlear implants with the MPEAK or SPEAK coding strategies. Results Open-set word recognition performance can be successfully predicted using feature identification scores. In addition, we observed no qualitative differences in performance between children using MPEAK and SPEAK, suggesting that both groups of children process spoken words similarly despite differences in input. Word recognition ability was best predicted in the model in which phoneme decisions were delayed until lexical access. Conclusions Closed-set feature identification and open-set word recognition focus on different, but related, levels of language processing. Additional insight for clinical intervention may be achieved by collecting both types of data. The most successful model of performance is consistent with current psycholinguistic theories of spoken word recognition. Thus it appears that the cognitive process of spoken word recognition is fundamentally the same for pediatric cochlear implant users and children and adults with normal hearing. PMID:11132784

  12. OpenMP parallelization of a gridded SWAT (SWATG)

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Hou, Jinliang; Cao, Yongpan; Gu, Juan; Huang, Chunlin

    2017-12-01

    Large-scale, long-term and high spatial resolution simulation is a common challenge in environmental modeling. A Gridded Hydrologic Response Unit (HRU)-based Soil and Water Assessment Tool (SWATG), which integrates a grid modeling scheme with different spatial representations, also faces this problem. The long run times limit applications of very high resolution, large-scale watershed modeling. The OpenMP (Open Multi-Processing) application programming interface is integrated with SWATG (the result is called SWATGP) to accelerate grid modeling at the HRU level. Such a parallel implementation takes better advantage of the computational power of a shared memory computer system. We conducted two experiments at multiple temporal and spatial scales of hydrological modeling using SWATG and SWATGP on a high-end server. At 500-m resolution, SWATGP was found to be up to nine times faster than SWATG in modeling a roughly 2000 km2 watershed with one CPU and a 15-thread configuration. The study results demonstrate that parallel models save considerable time relative to traditional sequential simulation runs. Parallel computations of environmental models are beneficial for model applications, especially at large spatial and temporal scales and at high resolutions. The proposed SWATGP model is thus a promising tool for large-scale and high-resolution water resources research and management, in addition to offering data fusion and model coupling ability.
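
    SWATG/SWATGP itself is Fortran code parallelized with OpenMP directives; the underlying idea, a shared-memory parallel loop over independent gridded HRUs, can be sketched in Python with numba's prange as follows (the water-balance arithmetic is a toy placeholder, not SWAT's process equations):

    ```python
    import numpy as np
    from numba import njit, prange

    @njit(parallel=True)
    def hru_water_balance(precip, et, storage, cap):
        """Toy per-HRU water balance; HRUs are independent, so iterations run on separate threads."""
        n = precip.shape[0]
        runoff = np.empty(n)
        for i in prange(n):                       # analogous to an OpenMP parallel loop over HRUs
            s = storage[i] + precip[i] - et[i]    # update storage with forcing for this time step
            runoff[i] = max(0.0, s - cap[i])      # spill anything above capacity
            storage[i] = min(s, cap[i])
        return runoff

    n_hru = 2_000_000
    rng = np.random.default_rng(0)
    runoff = hru_water_balance(rng.random(n_hru), 0.3 * rng.random(n_hru),
                               rng.random(n_hru), np.full(n_hru, 0.8))
    ```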

  13. OpenDanubia - An integrated, modular simulation system to support regional water resource management

    NASA Astrophysics Data System (ADS)

    Muerth, M.; Waldmann, D.; Heinzeller, C.; Hennicker, R.; Mauser, W.

    2012-04-01

    The already completed, multi-disciplinary research project GLOWA-Danube has developed a regional scale, integrated modeling system, which was successfully applied on the 77,000 km2 Upper Danube basin to investigate the impact of Global Change on both the natural and anthropogenic water cycle. At the end of the last project phase, the integrated modeling system was transferred into the open source project OpenDanubia, which now provides both the core system as well as all major model components to the general public. First, this will enable decision makers from government, business and management to use OpenDanubia as a tool for proactive management of water resources in the context of global change. Secondly, the model framework to support integrated simulations and all simulation models developed for OpenDanubia in the scope of GLOWA-Danube are further available for future developments and research questions. OpenDanubia allows for the investigation of water-related scenarios considering different ecological and economic aspects to support both scientists and policy makers to design policies for sustainable environmental management. OpenDanubia is designed as a framework-based, distributed system. The model system couples spatially distributed physical and socio-economic process during run-time, taking into account their mutual influence. To simulate the potential future impacts of Global Change on agriculture, industrial production, water supply, households and tourism businesses, so-called deep actor models are implemented in OpenDanubia. All important water-related fluxes and storages in the natural environment are implemented in OpenDanubia as spatially explicit, process-based modules. This includes the land surface water and energy balance, dynamic plant water uptake, ground water recharge and flow as well as river routing and reservoirs. Although the complete system is relatively demanding on data requirements and hardware requirements, the modular structure and the generic core system (Core Framework, Actor Framework) allows the application in new regions and the selection of a reduced number of modules for simulation. As part of the Open Source Initiative in GLOWA-Danube (opendanubia.glowa-danube.de) a comprehensive documentation for the system installation was created and both the program code of the framework and of all major components is licensed under the GNU General Public License. In addition, some helpful programs and scripts necessary for the operation and processing of input and result data sets are provided.

  14. The city model as a tool for participatory urban planning - a case study: The Bilotti open air museum of Cosenza

    NASA Astrophysics Data System (ADS)

    Artese, S.

    2014-05-01

    The paper describes the implementation of the 3D city model of the pedestrian area of Cosenza, which in recent years has become the Bilotti Open Air Museum (MAB). For this purpose, both the available data (regional technical map, city maps, orthophotos) and data acquired through several surveys of buildings and "Corso Mazzini" street (photos, topographic measurements, laser scanner point clouds) were used. In addition to the urban scale model, a survey of the statues of the MAB was carried out. By means of data processing, models of the statues were created that can be used as objects within the city model. The 3D model of the MAB open air museum has been used to implement a Web-GIS allowing citizens' participation, understanding and suggestions. The 3D city model is intended as a new tool for urban planning; it has therefore been used both for representing the current situation of the MAB and for design purposes, by acknowledging suggestions regarding a possible different location of the statues and a new way to enjoy the museum.

  15. New Insights into the Electroreduction of Ethylene Sulfite as Electrolyte Additive for Facilitating Solid Electrolyte Interphase of Lithium Ion Battery

    PubMed Central

    Sun, Youmin; Wang, Yixuan

    2017-01-01

    To help understand the solid electrolyte interphase (SEI) formation facilitated by electrolyte additives of lithium-ion batteries (LIB) the supermolecular clusters [(ES)Li+(PC)m](PC)n (m=1–2; n=0, 6, and 9) were used to investigate the electroreductive decompositions of the electrolyte additive, ethylene sulfite (ES), as well as the solvent, propylene carbonate (PC) with density functional theory. The results show that ES can be reduced prior to PC, resulting in a reduction precursor that will then undergo a ring opening decomposition to yield a radical anion. A new concerted pathway (path B) was located for the ring opening of the reduced ES, which has much lower energy barrier than the previously reported stepwise pathway (path A). The transition state for the ring opening of PC induced by the reduced ES (path C, indirect path) is closer to that of path A than path B in energy. The direct ring opening of the reduced PC (path D) has lower energy barrier than those of paths A, B and C, yet it is less favorable than the latter paths in terms of thermodynamics (vertical electron affinity or the reduction potential dissociation energy). The overall rate constant including the initial reduction and the subsequent ring opening for path B is the largest among the four paths, followed by paths A>C>D, which further signifies the importance of the concerted new path in facilitating the SEI. The hybrid models, the supermolecular cluster augmented by polarized continuum model, PCM-[(ES)Li+(PC)2](PC)n (n=0,6, and 9), were used to further estimate the reduction potential by taking into account both explicit and implicit solvent effects. The second solvation shell of Li+ in [(ES)Li+(PC)2](PC)n (n=6, and 9) partially compensates the overestimation of solvent effects arising from the PCM model for the naked (ES)Li+(PC)2, and the theoretical reduction potential with PCM-[(ES)Li+(PC)2](PC)6 (1.90–1.93V) agrees very well with the experimental one (1.8–2.0V). PMID:28220165

  16. Sucrose breakdown within guard cells provides substrates for glycolysis and glutamine biosynthesis during light-induced stomatal opening.

    PubMed

    Medeiros, David B; Perez Souza, Leonardo; Antunes, Werner C; Araújo, Wagner L; Daloso, Danilo M; Fernie, Alisdair R

    2018-05-01

    Sucrose has long been thought to play an osmolytic role in stomatal opening. However, recent evidence supports the idea that the role of sucrose in this process is primarily energetic. Here we used a combination of stomatal aperture assays and kinetic [U-13C]-sucrose isotope labelling experiments to confirm that sucrose is degraded during light-induced stomatal opening and to define the fate of the C released from sucrose breakdown. We additionally show that addition of sucrose to the medium did not enhance light-induced stomatal opening. The isotope experiment showed a consistent 13C enrichment in fructose and glucose, indicating that during light-induced stomatal opening sucrose is indeed degraded. We also observed a clear 13C enrichment in glutamate and glutamine (Gln), suggesting a concerted activation of sucrose degradation, glycolysis and the tricarboxylic acid cycle. This is in contrast to the situation for Gln biosynthesis in leaves under light, which has been demonstrated to rely on previously stored C. Our results thus collectively allow us to redraw current models concerning the influence of sucrose during light-induced stomatal opening, in which, instead of being accumulated, sucrose is degraded providing C skeletons for Gln biosynthesis. © 2018 The Authors The Plant Journal © 2018 John Wiley & Sons Ltd.

  17. A public hedonic analysis of environmental attributes in an open space preservation program

    NASA Astrophysics Data System (ADS)

    Nordman, Erik E.

    The Town of Brookhaven, on Long Island, NY, has implemented an open space preservation program to protect natural areas, and the ecosystem services they provide, from suburban growth. I used a public hedonic model of Brookhaven's open space purchases to estimate implicit prices for various environmental attributes, locational variables and spatial metrics. I also measured the correlation between cost per acre and non-monetary environmental benefit scores and tested whether including cost data, as opposed to non-monetary environmental benefit score alone, would change the prioritization ranks of acquired properties. The mean acquisition cost per acre was $82,501. I identified the key on-site environmental and locational variables using stepwise regression for four functional forms. The log-log specification performed best (adjusted R² = 0.727). I performed a second stepwise regression (log-log form) which included spatial metrics, calculated from a high-resolution land cover classification, in addition to the environmental and locational variables. This markedly improved the model's performance (adjusted R² = 0.866). Statistically significant variables included the property size, location in the Pine Barrens Compatible Growth Area, location in a FEMA flood zone, adjacency to public land, and several other environmental dummy variables. The single significant spatial metric, the fractal dimension of the tree cover class, had the largest elasticity of any variable. Of the dummy variables, location within the Compatible Growth Area had the largest implicit price ($298,792 per acre). The priority ranks produced by the two methods, non-monetary environmental benefit score alone and the ratio of non-monetary environmental benefit score to acquisition cost, were significantly positively correlated. This suggests that, despite the lack of cost data in their ranking method, Brookhaven does not suffer from efficiency losses. The economics literature encourages using both environmental benefits and acquisition costs to ensure cost-effective conservation programs. I recommend that Brookhaven consider acquisition costs in addition to environmental benefits to avert potential efficiency losses in future open space purchases. This dissertation shows that the addition of spatial metrics can enhance the performance of hedonic models. It also provides a baseline valuation for the environmental attributes of Brookhaven's open spaces and shows that location is critical when dealing with open space preservation programs.
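
    In generic form, the log-log hedonic specification reported here is an ordinary least-squares regression of log price per acre on logged continuous attributes plus dummy variables; a minimal statsmodels sketch, with hypothetical column names rather than the Brookhaven dataset, is:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # hypothetical parcel data: price per acre, size, dummies, and one spatial metric
    parcels = pd.read_csv("parcels.csv")

    fit = smf.ols(
        "np.log(price_per_acre) ~ np.log(acres) + cga + flood_zone + adj_public_land"
        " + np.log(tree_fractal_dim)",
        data=parcels,
    ).fit()
    print(fit.summary())

    # approximate implicit premium (in percent) for a dummy variable in a semi-log model
    print(100 * (np.exp(fit.params["cga"]) - 1))
    ```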

  18. Rapid Fabrication of Flat Plate Cavity Phosphor Thermography Test Models for Shuttle Return-to-Flight Aero-Heating

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.; Powers, Michael A.; Nevins, Stephen C.; Griffith, Mark S.; Wainwright, Gary A.

    2006-01-01

    Methods, materials and equipment are documented for fabricating flat plate test models at NASA Langley Research Center for Shuttle return-to-flight aeroheating experiments simulating open and closed cavity interactions in Langley's hypersonic 20-Inch Mach 6 air wind tunnel. Approximately 96 silica ceramic flat plate cavity phosphor thermography test models have been fabricated using these methods. On one model, an additional slot is machined through the back of the plate and into the cavity and vented into an evacuated plenum chamber to simulate a further opening in the cavity. After sintering the ceramic to 2150 °F, and mounting support hardware, a ceramic-based two-color thermographic phosphor coating is applied for global temperature and heat transfer measurements, with fiducial markings for image registration.

  19. Using open source computational tools for predicting human metabolic stability and additional absorption, distribution, metabolism, excretion, and toxicity properties.

    PubMed

    Gupta, Rishi R; Gifford, Eric M; Liston, Ted; Waller, Chris L; Hohman, Moses; Bunin, Barry A; Ekins, Sean

    2010-11-01

    Ligand-based computational models could be more readily shared between researchers and organizations if they were generated with open source molecular descriptors [e.g., chemistry development kit (CDK)] and modeling algorithms, because this would negate the requirement for proprietary commercial software. We initially evaluated open source descriptors and model building algorithms using a training set of approximately 50,000 molecules and a test set of approximately 25,000 molecules with human liver microsomal metabolic stability data. A C5.0 decision tree model demonstrated that CDK descriptors together with a set of Smiles Arbitrary Target Specification (SMARTS) keys had good statistics [κ = 0.43, sensitivity = 0.57, specificity = 0.91, and positive predicted value (PPV) = 0.64], equivalent to those of models built with commercial Molecular Operating Environment 2D (MOE2D) and the same set of SMARTS keys (κ = 0.43, sensitivity = 0.58, specificity = 0.91, and PPV = 0.63). Extending the dataset to ∼193,000 molecules and generating a continuous model using Cubist with a combination of CDK and SMARTS keys or MOE2D and SMARTS keys confirmed this observation. When the continuous predictions and actual values were binned to get a categorical score we observed a similar κ statistic (0.42). The same combination of descriptor set and modeling method was applied to passive permeability and P-glycoprotein efflux data with similar model testing statistics. In summary, open source tools demonstrated predictive results comparable to those of commercial software with attendant cost savings. We discuss the advantages and disadvantages of open source descriptors and the opportunity for their use as a tool for organizations to share data precompetitively, avoiding repetition and assisting drug discovery.
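
    The study used CDK descriptors, SMARTS keys, and the C5.0/Cubist algorithms; as a rough open-source analogue of the same workflow (descriptors computed from SMILES, a decision-tree classifier, and agreement scored with κ), one might write the following, with RDKit and scikit-learn standing in for the tools named in the paper and a hypothetical input file:

    ```python
    import pandas as pd
    from rdkit import Chem
    from rdkit.Chem import Descriptors
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import cohen_kappa_score

    # hypothetical input: SMILES strings plus a binary "metabolically stable" label
    data = pd.read_csv("microsomal_stability.csv")   # columns: smiles, stable

    def featurize(smiles):
        mol = Chem.MolFromSmiles(smiles)
        return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
                Descriptors.TPSA(mol), Descriptors.NumRotatableBonds(mol)]

    X = [featurize(s) for s in data["smiles"]]
    y = data["stable"]

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33, random_state=0)
    clf = DecisionTreeClassifier(max_depth=8).fit(X_tr, y_tr)
    print("kappa on the test set:", cohen_kappa_score(y_te, clf.predict(X_te)))
    ```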

  20. A generic open-source software framework supporting scenario simulations in bioterrorist crises.

    PubMed

    Falenski, Alexander; Filter, Matthias; Thöns, Christian; Weiser, Armin A; Wigger, Jan-Frederik; Davis, Matthew; Douglas, Judith V; Edlund, Stefan; Hu, Kun; Kaufman, James H; Appel, Bernd; Käsbohrer, Annemarie

    2013-09-01

    Since the 2001 anthrax attack in the United States, awareness of threats originating from bioterrorism has grown. This led internationally to increased research efforts to improve knowledge of and approaches to protecting human and animal populations against the threat from such attacks. A collaborative effort in this context is the extension of the open-source Spatiotemporal Epidemiological Modeler (STEM) simulation and modeling software for agro- or bioterrorist crisis scenarios. STEM, originally designed to enable community-driven public health disease models and simulations, was extended with new features that enable integration of proprietary data as well as visualization of agent spread along supply and production chains. STEM now provides a fully developed open-source software infrastructure supporting critical modeling tasks such as ad hoc model generation, parameter estimation, simulation of scenario evolution, estimation of effects of mitigation or management measures, and documentation. This open-source software resource can be used free of charge. Additionally, STEM provides critical features like built-in worldwide data on administrative boundaries, transportation networks, or environmental conditions (eg, rainfall, temperature, elevation, vegetation). Users can easily combine their own confidential data with built-in public data to create customized models of desired resolution. STEM also supports collaborative and joint efforts in crisis situations by extended import and export functionalities. In this article we demonstrate specifically those new software features implemented to accomplish STEM application in agro- or bioterrorist crisis scenarios.

  1. Openness to experience is related to better memory ability in older adults with questionable dementia.

    PubMed

    Terry, Douglas P; Puente, Antonio N; Brown, Courtney L; Faraco, Carlos C; Miller, L Stephen

    2013-01-01

    The personality traits Openness to experience and Neuroticism of the five-factor model have previously been associated with memory performance in nondemented older adults, but this relationship has not been investigated in samples with memory impairment. Our examination of 50 community-dwelling older adults (29 cognitively intact; 21 with questionable dementia as determined by the Clinical Dementia Rating Scale) showed that demographic variables (age, years of education, gender, and estimated premorbid IQ) and current depressive symptoms explained a significant amount of variance of Repeatable Battery of Neuropsychological Status Delayed Memory (adjusted R² = 0.23). After controlling for these variables, a measure of global cognitive status further explained a significant portion of variance in memory performance (ΔR² = 0.13; adjusted R² = 0.36; p < .01). Finally, adding Openness to this hierarchical linear regression model explained a significant additional portion of variance (ΔR² = 0.08; adjusted R² = 0.44; p < .01) but adding Neuroticism did not explain any additional variance. This significant relationship between Openness and better memory performance above and beyond one's cognitive status and demographic variables may suggest that a lifelong pattern of involvement in new cognitive activities could be preserved in old age or protect from memory decline. This study suggests that personality may be a powerful predictor of memory ability and clinically useful in this heterogeneous population.
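
    The hierarchical (blockwise) regression reported here amounts to fitting nested ordinary least-squares models and taking the change in R² between them; schematically, with hypothetical column names:

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    d = pd.read_csv("memory_study.csv")  # hypothetical columns

    f1 = "delayed_memory ~ age + education + gender + premorbid_iq + depression"
    f2 = f1 + " + global_cognition"
    f3 = f2 + " + openness"
    m1, m2, m3 = (smf.ols(f, d).fit() for f in (f1, f2, f3))

    print("delta R^2 for global cognition:", m2.rsquared - m1.rsquared)
    print("delta R^2 for Openness:", m3.rsquared - m2.rsquared)
    ```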

  2. 2005-2008

    ERIC Educational Resources Information Center

    Council of Chief State School Officers, 2009

    2009-01-01

    The purpose of this document is to describe the US ED growth model pilot program from its inception to the point at which the program was no longer a pilot and was opened up to all states. In addition, the paper is designed to help states in the process of planning to submit a growth model proposal. The information in this document is mostly…

  3. Models of Voltage-Dependent Conformational Changes in NaChBac Channels

    PubMed Central

    Shafrir, Yinon; Durell, Stewart R.; Guy, H. Robert

    2008-01-01

    Models of the transmembrane region of the NaChBac channel were developed in two open/inactivated and several closed conformations. Homology models of NaChBac were developed using crystal structures of Kv1.2 and a Kv1.2/2.1 chimera as templates for open conformations, and MlotiK and KcsA channels as templates for closed conformations. Multiple molecular-dynamics simulations were performed to refine and evaluate these models. A striking difference between the S4 structures of the Kv1.2-like open models and MlotiK-like closed models is the secondary structure. In the open model, the first part of S4 forms an α-helix, and the last part forms a 3₁₀ helix, whereas in the closed model, the first part of S4 forms a 3₁₀ helix, and the last part forms an α-helix. A conformational change that involves this type of transition in secondary structure should be voltage-dependent. However, this transition alone is not sufficient to account for the large gating charge movement reported for NaChBac channels and for experimental results in other voltage-gated channels. To increase the magnitude of the motion of S4, we developed another model of an open/inactivated conformation, in which S4 is displaced farther outward, and a number of closed models in which S4 is displaced farther inward. A helical screw motion for the α-helical part of S4 and a simple axial translation for the 3₁₀ portion were used to develop models of these additional conformations. In our models, four positively charged residues of S4 moved outwardly during activation, across a transition barrier formed by highly conserved hydrophobic residues on S1, S2, and S3. The S4 movement was coupled to an opening of the activation gate formed by S6 through interactions with the segment linking S4 to S5. Consistencies of our models with experimental studies of NaChBac and Kv channels are discussed. PMID:18641074

  4. Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators

    PubMed Central

    Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew

    2014-01-01

    Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case-study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than speedup above the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least faster than the sequential implementation and faster than a parallelized OpenMP implementation. An implementation of OpenMP on Intel MIC coprocessor provided speedups of with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in multi-dimensional media. PMID:24497950

  5. Acoustic performance of inlet suppressors on an engine generating a single mode

    NASA Technical Reports Server (NTRS)

    Heidelberg, L. J.; Rice, E. J.; Homyak, L.

    1981-01-01

    Three single degree of freedom liners with different open area ratio face sheets were designed for a single spinning mode in order to evaluate an inlet suppressor design method based on mode cutoff ratio. This mode was generated by placing 41 rods in front of the 28 blade fan of a JT15D turbofan engine. At the liner design this near cutoff mode has a theoretical maximum attenuation of nearly 200 dB per L/D. The data show even higher attenuations at the design condition than predicted by the theory for dissipation of a single mode within the liner. This additional attenuation is large for high open area ratios and should be accounted for in the theory. The data show the additional attenuation to be inversely proportional to acoustic resistance. It was thought that the additional attenuation could be caused by reflection and modal scattering at the hard to soft wall interface. A reflection model was developed, and then modified to fit the data. This model was checked against independent (multiple pure tone) data with good agreement.

  6. Functional Additive Mixed Models

    PubMed Central

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2014-01-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach. PMID:26347592
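
    In generic notation, the model class described here can be written roughly as follows (an illustrative reduced form; the full framework allows multiple nested or crossed functional random effects and functional covariates):

    ```latex
    % additive mixed model for a functional response y_i(t): smooth functional intercept,
    % covariate effects that may vary over t, a functional random effect for grouping level g(i),
    % and residual error
    y_i(t) \;=\; \beta_0(t) \;+\; \sum_{j} f_j\bigl(x_{ij}, t\bigr) \;+\; b_{g(i)}(t) \;+\; \varepsilon_i(t)
    ```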

  7. Functional Additive Mixed Models.

    PubMed

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2015-04-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach.

  8. Development of an Integrated Hydrologic Modeling System for Rainfall-Runoff Simulation

    NASA Astrophysics Data System (ADS)

    Lu, B.; Piasecki, M.

    2008-12-01

    This paper aims to present the development of an integrated hydrological model which involves functionalities of digital watershed processing, online data retrieval, hydrologic simulation and post-event analysis. The proposed system is intended to work as a back end to the CUAHSI HIS cyberinfrastructure developments. As a first step into developing this system, a physics-based distributed hydrologic model PIHM (Penn State Integrated Hydrologic Model) is wrapped into OpenMI(Open Modeling Interface and Environment ) environment so as to seamlessly interact with OpenMI compliant meteorological models. The graphical user interface is being developed from the openGIS application called MapWindows which permits functionality expansion through the addition of plug-ins. . Modules required to set up through the GUI workboard include those for retrieving meteorological data from existing database or meteorological prediction models, obtaining geospatial data from the output of digital watershed processing, and importing initial condition and boundary condition. They are connected to the OpenMI compliant PIHM to simulate rainfall-runoff processes and includes a module for automatically displaying output after the simulation. Online databases are accessed through the WaterOneFlow web services, and the retrieved data are either stored in an observation database(OD) following the schema of Observation Data Model(ODM) in case for time series support, or a grid based storage facility which may be a format like netCDF or a grid-based-data database schema . Specific development steps include the creation of a bridge to overcome interoperability issue between PIHM and the ODM, as well as the embedding of TauDEM (Terrain Analysis Using Digital Elevation Models) into the model. This module is responsible for developing watershed and stream network using digital elevation models. Visualizing and editing geospatial data is achieved by the usage of MapWinGIS, an ActiveX control developed by MapWindow team. After applying to the practical watershed, the performance of the model can be tested by the post-event analysis module.

  9. Urban Climate Resilience - Connecting climate models with decision support cyberinfrastructure using open standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Idol, T. A.

    2015-12-01

    Experts in climate modeling, remote sensing of the Earth, and cyber infrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to the predicted sea level rise. In a Policy Fact Sheet "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by White House Office of Science and Technology (OSTP) in December 2014, OGC committed to increase access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open standards based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if-scenarios" included in-situ sensor webs and crowdsourcing. A scenario was in two locations: San Francisco Bay Area and Mozambique. The scenarios demonstrated interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resultant High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included identification of standards and best practices that help researchers and cities deal with climate-related issues. Results of the testbeds will now be deployed in pilot applications. The testbed also identified areas of additional development needed to help identify scientific investments and cyberinfrastructure approaches needed to improve the application of climate science research results to urban climate resilence.

  10. Isolated Open Rotor Noise Prediction Assessment Using the F31A31 Historical Blade Set

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Jones, William T.; Boyd, D. Douglas, Jr.; Zawodny, Nikolas S.

    2016-01-01

    In an effort to mitigate next-generation fuel efficiency and environmental impact concerns for aviation, open rotor propulsion systems have received renewed interest. However, maintaining the high propulsive efficiency while simultaneously meeting noise goals has been one of the challenges in making open rotor propulsion a viable option. Improvements in prediction tools and design methodologies have opened the design space for next generation open rotor designs that satisfy these challenging objectives. As such, validation of aerodynamic and acoustic prediction tools has been an important aspect of open rotor research efforts. This paper describes validation efforts of a combined computational fluid dynamics and Ffowcs Williams and Hawkings equation methodology for open rotor aeroacoustic modeling. Performance and acoustic predictions were made for a benchmark open rotor blade set and compared with measurements over a range of rotor speeds and observer angles. Overall, the results indicate that the computational approach is acceptable for assessing low-noise open rotor designs. Additionally, this approach may be used to provide realistic incident source fields for acoustic shielding/scattering studies on various aircraft configurations.

  11. Semi-Markov Approach to the Shipping Safety Modelling

    NASA Astrophysics Data System (ADS)

    Guze, Sambor; Smolarek, Leszek

    2012-02-01

    In this paper, the navigational safety model of a ship in an open area is studied under conditions of incomplete information. Moreover, the structure of semi-Markov processes is used to analyse the stochastic ship safety according to the subjective acceptance of risk by the navigator. In addition, the navigator's behaviour can be analysed by using numerical simulation to estimate the probability of collision in the safety model.
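
    A toy Monte Carlo sketch of such a semi-Markov safety model is given below, with three illustrative states and a non-exponential sojourn time in the close-quarters state; the transition probabilities and time scales are arbitrary, not the paper's calibration:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    SAFE, CLOSE_QUARTERS, COLLISION = 0, 1, 2
    P = np.array([[0.0, 1.0, 0.0],    # from SAFE the next event is a close-quarters situation
                  [0.97, 0.0, 0.03],  # most encounters are resolved; a few end in collision
                  [0.0, 0.0, 1.0]])   # collision is absorbing

    def sojourn(state):
        # semi-Markov: holding times need not be exponential
        if state == SAFE:
            return rng.exponential(6.0)   # hours between encounters
        return 0.2 * rng.weibull(1.8)     # duration of a close-quarters situation

    def voyage_ends_in_collision(horizon=48.0):
        state, t = SAFE, 0.0
        while t < horizon and state != COLLISION:
            t += sojourn(state)
            state = rng.choice(3, p=P[state])
        return state == COLLISION

    n = 50_000
    print("estimated collision probability:",
          sum(voyage_ends_in_collision() for _ in range(n)) / n)
    ```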

  12. Increasing the object recognition distance of compact open air on board vision system

    NASA Astrophysics Data System (ADS)

    Kirillov, Sergey; Kostkin, Ivan; Strotov, Valery; Dmitriev, Vladimir; Berdnikov, Vadim; Akopov, Eduard; Elyutin, Aleksey

    2016-10-01

    The aim of this work was to develop an algorithm that eliminates atmospheric distortion and improves image quality. The proposed algorithm is implemented entirely in software, without additional photographic hardware, and does not require preliminary calibration. It works equally effectively with images obtained at distances from 1 to 500 meters. The algorithm is designed for Raspberry Pi Model B on-board vision systems operating in the open air. The results of an experimental examination are given.

  13. Improved Speech Coding Based on Open-Loop Parameter Estimation

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan; Chen, Ya-Chin; Longman, Richard W.

    2000-01-01

    A nonlinear optimization algorithm for linear predictive speech coding was developed earlier that not only optimizes the linear model coefficients for the open-loop predictor, but also performs the optimization including the effects of quantization of the transmitted residual. It also simultaneously optimizes the quantization levels used for each speech segment. In this paper, we present an improved method for initialization of this nonlinear algorithm, and demonstrate substantial improvements in performance. In addition, the new procedure produces monotonically improving speech quality with increasing numbers of bits used in the transmitted error residual. Examples of speech encoding and decoding are given for 8 speech segments, and signal-to-noise levels as high as 47 dB are produced. As in typical linear predictive coding, the optimization is done on the open-loop speech analysis model. Here we demonstrate that minimizing the error of the closed-loop speech reconstruction, instead of the simpler open-loop optimization, is likely to produce negligible improvement in speech quality. The examples suggest that the algorithm here is close to giving the best performance obtainable from a linear model, for the chosen order with the chosen number of bits for the codebook.
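
    The conventional open-loop analysis step that the paper builds on, fitting linear predictor coefficients per segment and quantizing the residual, can be sketched as follows; this is plain autocorrelation LPC with a fixed uniform quantizer, not the paper's joint nonlinear optimization of coefficients and quantization levels:

    ```python
    import numpy as np

    def lpc_coefficients(frame, order=10):
        """Autocorrelation-method LPC: solve the Toeplitz normal equations R a = r."""
        r = np.correlate(frame, frame, mode="full")[len(frame) - 1:len(frame) + order]
        R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
        return np.linalg.solve(R, r[1:order + 1])

    def open_loop_encode(frame, order=10, nbits=4):
        a = lpc_coefficients(frame, order)
        pred = np.zeros_like(frame)
        for n in range(order, len(frame)):
            pred[n] = a @ frame[n - order:n][::-1]   # predict sample n from the previous samples
        resid = frame - pred
        step = (resid.max() - resid.min()) / (2 ** nbits - 1)   # fixed uniform quantizer
        codes = np.round((resid - resid.min()) / step).astype(int)
        return a, codes, resid.min(), step

    # toy segment: a noisy decaying sinusoid standing in for an 8 kHz voiced speech frame
    rng = np.random.default_rng(0)
    t = np.arange(240) / 8000.0
    frame = np.exp(-20 * t) * np.sin(2 * np.pi * 180 * t) + 0.01 * rng.standard_normal(t.size)
    a, codes, offset, step = open_loop_encode(frame)
    print("LPC coefficients:", np.round(a, 3))
    ```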

  14. Computational Fluid Dynamics and Additive Manufacturing to Diagnose and Treat Cardiovascular Disease.

    PubMed

    Randles, Amanda; Frakes, David H; Leopold, Jane A

    2017-11-01

    Noninvasive engineering models are now being used for diagnosing and planning the treatment of cardiovascular disease. Techniques in computational modeling and additive manufacturing have matured concurrently, and results from simulations can inform and enable the design and optimization of therapeutic devices and treatment strategies. The emerging synergy between large-scale simulations and 3D printing is having a two-fold benefit: first, 3D printing can be used to validate the complex simulations, and second, the flow models can be used to improve treatment planning for cardiovascular disease. In this review, we summarize and discuss recent methods and findings for leveraging advances in both additive manufacturing and patient-specific computational modeling, with an emphasis on new directions in these fields and remaining open questions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. The OpenMI - its Transformation From a Research Output to a Global Standard for the Integrated Modelling Community

    NASA Astrophysics Data System (ADS)

    Moore, R.

    2008-12-01

    The pressure to take a more integrated approach both to science and to management increases by the day. At almost any scale from local to global, it is no longer possible to consider issues in isolation; to do so runs a high risk of creating more problems than are solved. The consequence of this situation is that there is strong encouragement in the scientific world not just to understand and to be able to predict the response of individual processes but also to predict how those processes will interact. The manager is similarly encouraged to think in the widest terms about the likely impact of any policy before it is implemented. A new reservoir may solve a water supply problem but will it adversely affect the fishing and hence the tourist trade? How will climate change impact biodiversity? Will the drugs for treating a flu pandemic adversely affect river water quality? One approach to predicting such impacts would be to create new models simulating more and more processes. This, however, is neither feasible nor useful and makes poor use of the huge investment in existing models. A better approach, with many additional benefits, would be to find a way of linking existing models and modelling components such as databases or visualisation systems. Against this background, the European Commission, as part of its research programme to facilitate the introduction of integrated water management, commissioned a community project to find a generic solution to the linking of simulation models at run time. The outcome of this work was the Open Modelling Interface (OpenMI) standard and the creation of the OpenMI Association, an open, non-proprietary, not-for-profit, international organisation for its support. The work has received widespread recognition and encouragement from across the world, especially in the USA. A second phase is now building a community to continue the OpenMI's development and promote its use. The community's vision, mission and implementation strategy can be summarised as follows: Vision. The OpenMI Association believes that integrated management in some form or another is the only option for the future management of our resources. Although not yet widely accepted outside the modelling world, because of the inherent complexities, it is foreseeable that managers will demand decision support systems, i.e. predictive models. As the need to understand the wider impacts of decisions increases, so the models will have to take account of more and more interacting processes. The OpenMI Association, therefore, foresees a future where the concept of integrated modelling becomes widely accepted, and the need for standards such as the OpenMI becomes greater. Mission. The attainment of the vision will require the collective energy and resources of developers, modellers and users. Within this context, the mission that the OpenMI Association has set itself, is a) to promote integrated modelling as a means of achieving better management and b) to develop and support the OpenMI Standard. Implementation Strategy. To achieve its mission, the OpenMI Association will focus on the following key actions. 
They are a) creating a culture that facilitates the take up and use of integrated modelling and the OpenMI, b) ensuring that the OpenMI remains relevant, easy to use, of high quality and available under acceptable conditions, c) supporting the community of OpenMI users and providing a compliancy service, d) disseminating information, e) enabling the community to participate in the development of the OpenMI, and f) securing the necessary resources. The session will present and invite debate on this strategy.

  16. Modeling association among demographic parameters in analysis of open population capture-recapture data.

    PubMed

    Link, William A; Barker, Richard J

    2005-03-01

    We present a hierarchical extension of the Cormack-Jolly-Seber (CJS) model for open population capture-recapture data. In addition to recaptures of marked animals, we model first captures of animals and losses on capture. The parameter set includes capture probabilities, survival rates, and birth rates. The survival rates and birth rates are treated as a random sample from a bivariate distribution, thus the model explicitly incorporates correlation in these demographic rates. A key feature of the model is that the likelihood function, which includes a CJS model factor, is expressed entirely in terms of identifiable parameters; losses on capture can be factored out of the model. Since the computational complexity of classical likelihood methods is prohibitive, we use Markov chain Monte Carlo in a Bayesian analysis. We describe an efficient candidate-generation scheme for Metropolis-Hastings sampling of CJS models and extensions. The procedure is illustrated using mark-recapture data for the moth Gonodontis bidentata.
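
    As an illustration of the hierarchical structure described above, the short Python sketch below draws yearly survival and per-capita birth rates from a bivariate normal distribution on the logit and log scales, which is one way such correlated demographic rates can be generated. The hyperparameter values are hypothetical, and the sketch omits the capture-recapture likelihood and the Metropolis-Hastings sampler used in the paper.

        import numpy as np

        def draw_demographic_rates(n_years, mu, cov, rng):
            # (logit survival, log birth rate) drawn jointly so the two rates covary.
            z = rng.multivariate_normal(mu, cov, size=n_years)
            survival = 1.0 / (1.0 + np.exp(-z[:, 0]))   # phi_t on (0, 1)
            births = np.exp(z[:, 1])                    # f_t > 0
            return survival, births

        # Hypothetical hyperparameters: mean logit-survival 1.0, mean log-birth-rate -0.7,
        # with a positive covariance linking good survival years to good recruitment years.
        mu = np.array([1.0, -0.7])
        cov = np.array([[0.30, 0.12],
                        [0.12, 0.25]])
        phi, f = draw_demographic_rates(10, mu, cov, np.random.default_rng(1))
        print(phi, f)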

  17. Modeling association among demographic parameters in analysis of open population capture-recapture data

    USGS Publications Warehouse

    Link, William A.; Barker, Richard J.

    2005-01-01

    We present a hierarchical extension of the Cormack–Jolly–Seber (CJS) model for open population capture–recapture data. In addition to recaptures of marked animals, we model first captures of animals and losses on capture. The parameter set includes capture probabilities, survival rates, and birth rates. The survival rates and birth rates are treated as a random sample from a bivariate distribution, thus the model explicitly incorporates correlation in these demographic rates. A key feature of the model is that the likelihood function, which includes a CJS model factor, is expressed entirely in terms of identifiable parameters; losses on capture can be factored out of the model. Since the computational complexity of classical likelihood methods is prohibitive, we use Markov chain Monte Carlo in a Bayesian analysis. We describe an efficient candidate-generation scheme for Metropolis–Hastings sampling of CJS models and extensions. The procedure is illustrated using mark-recapture data for the moth Gonodontis bidentata.

  18. Geothermal Case Studies

    DOE Data Explorer

    Young, Katherine

    2014-09-30

    database.) In fiscal year 2015, NREL is working with universities to populate additional case studies on OpenEI. The goal is to provide a large enough dataset to start conducting analyses of exploration programs to identify correlations between successful exploration plans for areas with similar geologic occurrence models.

  19. On trans-parenchymal transport after blood brain barrier opening: pump-diffuse-pump hypothesis

    NASA Astrophysics Data System (ADS)

    Postnov, D. E.; Postnikov, E. B.; Karavaev, A. S.; Glushkovskaya-Semyachkina, O. V.

    2018-04-01

    Transparenchymal transport has attracted the attention of many research groups since the discovery of the glymphatic mechanism for brain drainage in 2012. While the main facts of rapid transport of substances across the parenchyma are well established experimentally, the specific mechanisms that drive this drainage have only been hypothesized, not proved. Moreover, a number of modeling studies show that the pulse-wave-powered mechanism is unlikely to be able to perform the pumping as suggested. Thus, the problem is still open. In addition, new data obtained under conditions of an intentionally opened blood-brain barrier show the presence of equally fast transport in the opposite direction. In our study we investigate possible physical mechanisms for the rapid transport of substances after the opening of the blood-brain barrier under conditions of zero net flow.

  20. First order sea-level cycles and supercontinent break up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heller, P.L.; Angevine, C.L.

    1985-01-01

    The authors have developed a model that successfully predicts the approximate magnitude and timing of long term sea-level change without relying on short term increases in global spreading rates. The model involves the following key assumptions. (1) Ocean basins have two types of area/age distributions; Pacific ocean basins are rimmed by subduction zones and have triangular distributions; and Atlantic ocean basins which open at constant rates, have no subduction, and so have rectangular distributions. (2) The total area of the global ocean is constant so that the Pacific basin must close as the Atlantic opens. These assumptions approximate modern global ocean basin conditions. The model begins with supercontinent break up. As the Atlantic begins to open, the mean age of the global ocean decreases, the mean depth of the sea floor shallows, and sea level, therefore, rises. Once the Atlantic occupies more than 8 to 10% of the global ocean area, the mean age and depth of the ocean floor increases resulting in a sea-level fall. The model can be applied to the mid-Cretaceous sea-level high stand which followed break up of Pangea by 80 to 100 Ma. Based on average Atlantic opening rates, sea level rises to a peak of 44 m at 80 Ma after opening began and then falls by 84 m to the present. Thus the model is capable of explaining approximately half of the total magnitude of the post-mid-Cretaceous eustatic fall without invoking short-term changes in global spreading rates. In addition, the model predicts the observed time lag between supercontinent break up and sea-level high stand for both Mesozoic as well as early Paleozoic time.
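
    The mechanism in the model can be illustrated numerically. The Python sketch below compares the area-weighted mean seafloor depth of a triangular (Pacific-type) and a rectangular (Atlantic-type) area/age distribution, assuming the common half-space-cooling depth-age relation d(t) = 2500 m + 350 m * sqrt(t in Ma) and a 180 Ma maximum crustal age; both the relation and the maximum age are illustrative assumptions, not values taken from the paper.

        import numpy as np

        def mean_depth(ages_ma, area_weights):
            # Half-space cooling approximation: seafloor depth (m) vs. crustal age (Ma).
            depth = 2500.0 + 350.0 * np.sqrt(ages_ma)
            return np.average(depth, weights=area_weights)

        ages = np.linspace(0.0, 180.0, 1001)
        triangular = ages.max() - ages      # subduction removes old crust (Pacific-type)
        rectangular = np.ones_like(ages)    # constant-rate opening, no subduction (Atlantic-type)

        # A younger (triangular) sea floor is shallower on average, so it displaces
        # more water and stands sea level higher than an older (rectangular) one.
        print(mean_depth(ages, triangular), mean_depth(ages, rectangular))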

  1. Blue straggler stars: lessons from open clusters.

    NASA Astrophysics Data System (ADS)

    Geller, Aaron M.

    Open clusters enable a deep dive into blue straggler characteristics. Recent work shows that the binary properties (frequency, orbital elements and companion masses and evolutionary states) of the blue stragglers are the most important diagnostic for determining their origins. To date the multi-epoch radial-velocity observations necessary for characterizing these blue straggler binaries have only been carried out in open clusters. In this paper, I highlight recent results in the open clusters NGC 188, NGC 2682 (M67) and NGC 6819. The characteristics of many of the blue stragglers in these open clusters point directly to origins through mass transfer from an evolved donor star. Additionally, a handful of blue stragglers show clear signatures of past dynamical encounters. These comprehensive, diverse and detailed observations also reveal important challenges for blue straggler formation models (and particularly the mass-transfer channel), which we must overcome to fully understand the origins of blue straggler stars and other mass-transfer products.

  2. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the community land model, CLM, part of the open-source community earth system model (CESM) for climate. In this presentation, the advantages and disadvantages of open source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. it's transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.

  3. Microscale Obstacle Resolving Air Quality Model Evaluation with the Michelstadt Case

    PubMed Central

    Rakai, Anikó; Kristóf, Gergely

    2013-01-01

    Modelling pollutant dispersion in cities is challenging for air quality models as the urban obstacles have an important effect on the flow field and thus the dispersion. Computational Fluid Dynamics (CFD) models with an additional scalar dispersion transport equation are a possible way to resolve the flowfield in the urban canopy and model dispersion taking into consideration the effect of the buildings explicitly. These models need detailed evaluation with the method of verification and validation to gain confidence in their reliability and use them as a regulatory purpose tool in complex urban geometries. This paper shows the performance of an open source general purpose CFD code, OpenFOAM for a complex urban geometry, Michelstadt, which has both flow field and dispersion measurement data. Continuous release dispersion results are discussed to show the strengths and weaknesses of the modelling approach, focusing on the value of the turbulent Schmidt number, which was found to give best statistical metric results with a value of 0.7. PMID:24027450

  4. Microscale obstacle resolving air quality model evaluation with the Michelstadt case.

    PubMed

    Rakai, Anikó; Kristóf, Gergely

    2013-01-01

    Modelling pollutant dispersion in cities is challenging for air quality models as the urban obstacles have an important effect on the flow field and thus the dispersion. Computational Fluid Dynamics (CFD) models with an additional scalar dispersion transport equation are a possible way to resolve the flowfield in the urban canopy and model dispersion taking into consideration the effect of the buildings explicitly. These models need detailed evaluation with the method of verification and validation to gain confidence in their reliability and use them as a regulatory purpose tool in complex urban geometries. This paper shows the performance of an open source general purpose CFD code, OpenFOAM for a complex urban geometry, Michelstadt, which has both flow field and dispersion measurement data. Continuous release dispersion results are discussed to show the strengths and weaknesses of the modelling approach, focusing on the value of the turbulent Schmidt number, which was found to give best statistical metric results with a value of 0.7.

  5. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based and designed to work on a single computer, which is a major limitation in many ways: limited processing and storage capacity, accessibility, availability, etc. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. It provides a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is a collaborative geospatial platform: multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multiuser collaboration platform, uses interoperable programming languages and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services: 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on two VMs that communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application across additional VMs and testing the scalability and availability of services.

  6. Individualized Special Education with Cognitive Skill Assessment.

    ERIC Educational Resources Information Center

    Kurhila, Jaakko; Laine, Tei

    2000-01-01

    Describes AHMED (Adaptive and Assistive Hypermedia in Education), a computer learning environment which supports the evaluation of disabled children's cognitive skills in addition to supporting openness in learning materials and adaptivity in learning events. Discusses cognitive modeling and compares it to previous intelligent tutoring systems.…

  7. 75 FR 30863 - Nixon Presidential Historical Materials: Opening of Materials

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-02

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Nixon Presidential Historical Materials: Opening of Materials AGENCY: National Archives and Records Administration. ACTION: Notice of opening of additional materials. SUMMARY: This notice announces the opening of additional Nixon Presidential Historical Materials...

  8. OpenQuake, a platform for collaborative seismic hazard and risk assessment

    NASA Astrophysics Data System (ADS)

    Henshaw, Paul; Burton, Christopher; Butler, Lars; Crowley, Helen; Danciu, Laurentiu; Nastasi, Matteo; Monelli, Damiano; Pagani, Marco; Panzeri, Luigi; Simionato, Michele; Silva, Vitor; Vallarelli, Giuseppe; Weatherill, Graeme; Wyss, Ben

    2013-04-01

    Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, tools and models for global seismic hazard and risk assessment, within the context of the Global Earthquake Model (GEM). Guided by the needs and experiences of governments, companies and international organisations, all contributions are being integrated into OpenQuake: a web-based platform that - together with other resources - will become accessible in 2014. With OpenQuake, stakeholders worldwide will be able to calculate, visualize and investigate earthquake hazard and risk, capture new data and share findings for joint learning. The platform is envisaged as a collaborative hub for earthquake risk assessment, used at global and local scales, around which an active network of users has formed. OpenQuake will comprise both online and offline tools, many of which can also be used independently. One of the first steps in OpenQuake development was the creation of open-source software for advanced seismic hazard and risk calculations at any scale, the OpenQuake Engine. Although in continuous development, a command-line version of the software is already being test-driven and used by hundreds worldwide; from non-profits in Central Asia, seismologists in sub-Saharan Africa and companies in South Asia to the European seismic hazard harmonization programme (SHARE). In addition, several technical trainings were organized with scientists from different regions of the world (sub-Saharan Africa, Central Asia, Asia-Pacific) to introduce the engine and other OpenQuake tools to the community, something that will continue to happen over the coming years. Other tools that are being developed of direct interest to the hazard community are: • OpenQuake Modeller; fundamental instruments for the creation of seismogenic input models for seismic hazard assessment, a critical input to the OpenQuake Engine. OpenQuake Modeller will consist of a suite of tools (Hazard Modellers Toolkit) for characterizing the seismogenic sources of earthquakes and their models of earthquakes recurrence. An earthquake catalogue homogenization tool, for integration, statistical comparison and user-defined harmonization of multiple catalogues of earthquakes is also included in the OpenQuake modeling tools. • A data capture tool for active faults; a tool that allows geologists to draw (new) fault discoveries on a map in an intuitive GIS-environment and add details on the fault through the tool. This data, once quality checked, can then be integrated with the global active faults database, which will increase in value with every new fault insertion. Building on many ongoing efforts and the knowledge of scientists worldwide, GEM will for the first time integrate state-of-the-art data, models, results and open-source tools into a single platform. The platform will continue to increase in value, in particular for use in local contexts, through contributions from and collaborations with scientists and organisations worldwide. This presentation will showcase the OpenQuake Platform, focusing on the IT solutions that have been adopted as well as the added value that the platform will bring to scientists worldwide.

  9. Application of the Transtheoretical Model to Exercise Behavior and Physical Activity in Patients after Open Heart Surgery.

    PubMed

    Huang, Hsin-Yi; Lin, Yu-Shan; Chuang, Yi-Cheng; Lin, Wei-Hsuan; Kuo, Li Ying; Chen, Jui Chun; Hsu, Ching Ling; Chen, Bo Yan; Tsai, Hui Yu; Cheng, Fei Hsin; Tsai, Mei-Wun

    2015-05-01

    To assess exercise behavior and physical activity levels after open heart surgery. This prospective cohort study included 130 patients (70.8% male, aged 61.0 ± 12.2 years, 53.8% coronary bypass grafting) who underwent open heart surgery. The exercise behavior and physical activity of these patients were assessed at the 3- and 6-month follow-up appointments. Additional interviews were also conducted to further assess exercise behavior. Physical activity duration and metabolic equivalents were calculated from self-reported questionnaire responses. Moreover, possible related demographic factors, clinical features, participation in cardiac rehabilitation programs, and physical activity levels were additionally evaluated. Six months after hospital discharge, most patients were in the action (39.2%) and maintenance (37.7%) stages. Other subjects were in the precontemplation (11.5%), contemplation (5.4%), and preparation (6.2%) stages. The average physical activity level was 332.6 ± 377.1 min/week and 1198.1 ± 1396.9 kJ/week. Subjects in the action and maintenance stages exercised an average of 399.4 ± 397.6 min/week, significantly longer than those in other stages (116.2 ± 176.2 min/week, p = 0.02). Subjects who participated in outpatient cardiac rehabilitation programs after discharge may have better exercise habits. Gender had no significant effect on exercise behavior 6 months after hospital discharge. Most subjects following open heart surgery may maintain regular exercise behavior at 6 months after hospital discharge. Physical activity levels sufficient for cardiac health were achieved by subjects in the action and maintenance stages. Outpatient cardiac rehabilitation programs are valuable for encouraging exercise behavior after heart surgery. Keywords: Exercise behavior; Open heart surgery; Physical activity; Transtheoretical model.

  10. `Dhara': An Open Framework for Critical Zone Modeling

    NASA Astrophysics Data System (ADS)

    Le, P. V.; Kumar, P.

    2016-12-01

    Processes in the Critical Zone, which sustain terrestrial life, are tightly coupled across hydrological, physical, biological, chemical, pedological, geomorphological and ecological domains over both short and long timescales. Observations and quantification of the Earth's surface across these domains using emerging high resolution measurement technologies such as light detection and ranging (lidar) and hyperspectral remote sensing are enabling us to characterize fine scale landscape attributes over large spatial areas. This presents a unique opportunity to develop novel approaches to model the Critical Zone that can capture fine scale intricate dependencies across the different processes in 3D. The development of interdisciplinary tools that transcend individual disciplines and capture new levels of complexity and emergent properties is at the core of Critical Zone science. Here we introduce an open framework for high-performance computing model (`Dhara') for modeling complex processes in the Critical Zone. The framework is designed to be modular in structure with the aim to create uniform and efficient tools to facilitate and leverage process modeling. It also provides flexibility to maintain, collaborate, and co-develop additional components by the scientific community. We show the essential framework that simulates ecohydrologic dynamics, and surface - sub-surface coupling in 3D using hybrid parallel CPU-GPU. We demonstrate that the open framework in Dhara is feasible for detailed, multi-processes, and large-scale modeling of the Critical Zone, which opens up exciting possibilities. We will also present outcomes from a Modeling Summer Institute led by Intensively Managed Critical Zone Observatory (IMLCZO) with representation from several CZOs and international representatives.

  11. The Structure and Dynamics of the Corona - Heliosphere Connection

    NASA Technical Reports Server (NTRS)

    Antiochos, Spiro K.; Linker, Jon A.; Lionello, Roberto; Mikic, Zoran; Titov, Viacheslav; Zurbuchen, Thomas H.

    2011-01-01

    Determining the source at the Sun of the slow solar wind is one of the major unsolved problems in solar and heliospheric physics. First, we review the existing theories for the slow wind and argue that they have difficulty accounting for both the observed composition of the wind and its large angular extent. A new theory in which the slow wind originates from the continuous opening and closing of narrow open field corridors, the S-Web model, is described. Support for the S-Web model is derived from MHD solutions for the quasisteady corona and wind during the time of the August 1, 2008 eclipse. Additionally, we perform fully dynamic numerical simulations of the corona and heliosphere in order to test the S-Web model as well as the interchange model proposed by Fisk and co-workers. We discuss the implications of our simulations for the competing theories and for understanding the corona - heliosphere connection, in general.

  12. The Structure and Dynamics of the Corona - Heliosphere Connection

    NASA Technical Reports Server (NTRS)

    Antiochos, Spiro K.; Linker, Jon A.; Lionello, Roberto; Mikic, Zoran; Titov, Viacheslav; Zurbuchen, Thomas H.

    2010-01-01

    Determining the source at the Sun of the slow solar wind is one of the major unsolved problems in solar and heliospheric physics. First, we review the existing theories for the slow wind and argue that they have difficulty accounting for both the observed composition of the wind and its large angular extent. A new theory in which the slow wind originates from the continuous opening and closing of narrow open field corridors, the S-Web model, is described. Support for the S-Web model is derived from MHD solutions for the quasisteady corona and wind during the time of the August 1, 2008 eclipse. Additionally, we perform fully dynamic numerical simulations of the corona and heliosphere in order to test the S-Web model as well as the interchange model proposed by Fisk and co-workers. We discuss the implications of our simulations for the competing theories and for understanding the corona - heliosphere connection, in general.

  13. Gyrokinetic continuum simulation of turbulence in a straight open-field-line plasma

    DOE PAGES

    Shi, E. L.; Hammett, G. W.; Stoltzfus-Dueck, T.; ...

    2017-05-29

    Here, five-dimensional gyrokinetic continuum simulations of electrostatic plasma turbulence in a straight, open-field-line geometry have been performed using a full-f discontinuous-Galerkin approach implemented in the Gkeyll code. While various simplifications have been used for now, such as long-wavelength approximations in the gyrokinetic Poisson equation and the Hamiltonian, these simulations include the basic elements of a fusion-device scrape-off layer: localised sources to model plasma outflow from the core, cross-field turbulent transport, parallel flow along magnetic field lines, and parallel losses at the limiter or divertor with sheath-model boundary conditions. The set of sheath-model boundary conditions used in the model allows currents to flow through the walls. In addition to details of the numerical approach, results from numerical simulations of turbulence in the Large Plasma Device, a linear device featuring straight magnetic field lines, are presented.

  14. The JBEI quantitative metabolic modeling library (jQMM): a python library for modeling microbial metabolism.

    PubMed

    Birkel, Garrett W; Ghosh, Amit; Kumar, Vinay S; Weaver, Daniel; Ando, David; Backman, Tyler W H; Arkin, Adam P; Keasling, Jay D; Martín, Héctor García

    2017-04-05

    Modeling of microbial metabolism is a topic of growing importance in biotechnology. Mathematical modeling helps provide a mechanistic understanding for the studied process, separating the main drivers from the circumstantial ones, bounding the outcomes of experiments and guiding engineering approaches. Among different modeling schemes, the quantification of intracellular metabolic fluxes (i.e. the rate of each reaction in cellular metabolism) is of particular interest for metabolic engineering because it describes how carbon and energy flow throughout the cell. In addition to flux analysis, new methods for the effective use of the ever more readily available and abundant -omics data (i.e. transcriptomics, proteomics and metabolomics) are urgently needed. The jQMM library presented here provides an open-source, Python-based framework for modeling internal metabolic fluxes and leveraging other -omics data for the scientific study of cellular metabolism and bioengineering purposes. Firstly, it presents a complete toolbox for simultaneously performing two different types of flux analysis that are typically disjoint: Flux Balance Analysis and 13C Metabolic Flux Analysis. Moreover, it introduces the capability to use 13C labeling experimental data to constrain comprehensive genome-scale models through a technique called two-scale 13C Metabolic Flux Analysis (2S-13C MFA). In addition, the library includes a demonstration of a method that uses proteomics data to produce actionable insights to increase biofuel production. Finally, the use of the jQMM library is illustrated through the addition of several Jupyter notebook demonstration files that enhance reproducibility and provide the capability to be adapted to the user's specific needs. jQMM will facilitate the design and metabolic engineering of organisms for biofuels and other chemicals, as well as investigations of cellular metabolism and leveraging -omics data. As an open source software project, we hope it will attract additions from the community and grow with the rapidly changing field of metabolic engineering.
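
    The flux balance analysis half of such a toolbox can be illustrated with a toy example. The Python sketch below maximizes a biomass flux subject to steady-state mass balance (S v = 0) and flux bounds using a generic linear-programming solver; the stoichiometric matrix and bounds are invented for illustration and are unrelated to the jQMM library's own interfaces.

        import numpy as np
        from scipy.optimize import linprog

        # Toy network, rows = metabolites (A, B), columns = reactions:
        # r0: uptake -> A, r1: A -> B, r2: B -> biomass, r3: B -> (excretion)
        S = np.array([[1.0, -1.0,  0.0,  0.0],
                      [0.0,  1.0, -1.0, -1.0]])
        bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]   # uptake capped at 10
        c = np.array([0.0, 0.0, -1.0, 0.0])                    # maximize biomass flux r2

        result = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print(result.x)   # optimal flux distribution; r2 is limited by the uptake bound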

  15. The JBEI quantitative metabolic modeling library (jQMM): a python library for modeling microbial metabolism

    DOE PAGES

    Birkel, Garrett W.; Ghosh, Amit; Kumar, Vinay S.; ...

    2017-04-05

    Modeling of microbial metabolism is a topic of growing importance in biotechnology. Mathematical modeling helps provide a mechanistic understanding for the studied process, separating the main drivers from the circumstantial ones, bounding the outcomes of experiments and guiding engineering approaches. Among different modeling schemes, the quantification of intracellular metabolic fluxes (i.e. the rate of each reaction in cellular metabolism) is of particular interest for metabolic engineering because it describes how carbon and energy flow throughout the cell. In addition to flux analysis, new methods for the effective use of the ever more readily available and abundant -omics data (i.e. transcriptomics, proteomics and metabolomics) are urgently needed. The jQMM library presented here provides an open-source, Python-based framework for modeling internal metabolic fluxes and leveraging other -omics data for the scientific study of cellular metabolism and bioengineering purposes. Firstly, it presents a complete toolbox for simultaneously performing two different types of flux analysis that are typically disjoint: Flux Balance Analysis and 13C Metabolic Flux Analysis. Moreover, it introduces the capability to use 13C labeling experimental data to constrain comprehensive genome-scale models through a technique called two-scale 13C Metabolic Flux Analysis (2S-13C MFA). In addition, the library includes a demonstration of a method that uses proteomics data to produce actionable insights to increase biofuel production. Finally, the use of the jQMM library is illustrated through the addition of several Jupyter notebook demonstration files that enhance reproducibility and provide the capability to be adapted to the user's specific needs. jQMM will facilitate the design and metabolic engineering of organisms for biofuels and other chemicals, as well as investigations of cellular metabolism and leveraging -omics data. As an open source software project, we hope it will attract additions from the community and grow with the rapidly changing field of metabolic engineering.

  16. The JBEI quantitative metabolic modeling library (jQMM): a python library for modeling microbial metabolism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birkel, Garrett W.; Ghosh, Amit; Kumar, Vinay S.

    Modeling of microbial metabolism is a topic of growing importance in biotechnology. Mathematical modeling helps provide a mechanistic understanding for the studied process, separating the main drivers from the circumstantial ones, bounding the outcomes of experiments and guiding engineering approaches. Among different modeling schemes, the quantification of intracellular metabolic fluxes (i.e. the rate of each reaction in cellular metabolism) is of particular interest for metabolic engineering because it describes how carbon and energy flow throughout the cell. In addition to flux analysis, new methods for the effective use of the ever more readily available and abundant -omics data (i.e. transcriptomics, proteomics and metabolomics) are urgently needed. The jQMM library presented here provides an open-source, Python-based framework for modeling internal metabolic fluxes and leveraging other -omics data for the scientific study of cellular metabolism and bioengineering purposes. Firstly, it presents a complete toolbox for simultaneously performing two different types of flux analysis that are typically disjoint: Flux Balance Analysis and 13C Metabolic Flux Analysis. Moreover, it introduces the capability to use 13C labeling experimental data to constrain comprehensive genome-scale models through a technique called two-scale 13C Metabolic Flux Analysis (2S-13C MFA). In addition, the library includes a demonstration of a method that uses proteomics data to produce actionable insights to increase biofuel production. Finally, the use of the jQMM library is illustrated through the addition of several Jupyter notebook demonstration files that enhance reproducibility and provide the capability to be adapted to the user's specific needs. jQMM will facilitate the design and metabolic engineering of organisms for biofuels and other chemicals, as well as investigations of cellular metabolism and leveraging -omics data. As an open source software project, we hope it will attract additions from the community and grow with the rapidly changing field of metabolic engineering.

  17. Application of a hybrid MPI/OpenMP approach for parallel groundwater model calibration using multi-core computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Guoping; D'Azevedo, Ed F; Zhang, Fan

    2010-01-01

    Calibration of groundwater models involves hundreds to thousands of forward solutions, each of which may solve many transient coupled nonlinear partial differential equations, resulting in a computationally intensive problem. We describe a hybrid MPI/OpenMP approach to exploit two levels of parallelism in software and hardware to reduce calibration time on multi-core computers. HydroGeoChem 5.0 (HGC5) is parallelized using OpenMP for direct solutions for a reactive transport model application, and a field-scale coupled flow and transport model application. In the reactive transport model, a single parallelizable loop is identified to account for over 97% of the total computational time using GPROF. Addition of a few lines of OpenMP compiler directives to the loop yields a speedup of about 10 on a 16-core compute node. For the field-scale model, parallelizable loops in 14 of 174 HGC5 subroutines that require 99% of the execution time are identified. As these loops are parallelized incrementally, the scalability is found to be limited by a loop where Cray PAT detects over 90% cache miss rates. With this loop rewritten, a speedup similar to the first application is achieved. The OpenMP-parallelized code can be run efficiently on multiple workstations in a network or multiple compute nodes on a cluster as slaves using parallel PEST to speed up model calibration. To run calibration on clusters as a single task, the Levenberg-Marquardt algorithm is added to HGC5 with the Jacobian calculation and lambda search parallelized using MPI. With this hybrid approach, 100-200 compute cores are used to reduce the calibration time from weeks to a few hours for these two applications. This approach is applicable to most of the existing groundwater model codes for many applications.

  18. Modeling of additive manufacturing processes for metals: Challenges and opportunities

    DOE PAGES

    Francois, Marianne M.; Sun, Amy; King, Wayne E.; ...

    2017-01-09

    Here, with the technology being developed to manufacture metallic parts using increasingly advanced additive manufacturing processes, a new era has opened up for designing novel structural materials, from designing shapes and complex geometries to controlling the microstructure (alloy composition and morphology). The material properties used within specific structural components are also designable in order to meet specific performance requirements that are not imaginable with traditional metal forming and machining (subtractive) techniques.

  19. Multi-Disciplinary Analysis and Optimization Frameworks

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia Gutierrez

    2009-01-01

    Since July 2008, the Multidisciplinary Analysis & Optimization Working Group (MDAO WG) of the Systems Analysis Design & Optimization (SAD&O) discipline in the Fundamental Aeronautics Program's Subsonic Fixed Wing (SFW) project completed one major milestone, Define Architecture & Interfaces for Next Generation Open Source MDAO Framework Milestone (9/30/08), and is completing the Generation 1 Framework validation milestone, which is due December 2008. Included in the presentation are: details of progress on developing the Open MDAO framework, modeling and testing the Generation 1 Framework, progress toward establishing partnerships with external parties, and discussion of additional potential collaborations.

  20. 3D geospatial visualizations: Animation and motion effects on spatial objects

    NASA Astrophysics Data System (ADS)

    Evangelidis, Konstantinos; Papadopoulos, Theofilos; Papatheodorou, Konstantinos; Mastorokostas, Paris; Hilas, Constantinos

    2018-02-01

    Digital Elevation Models (DEMs), in combination with high quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an impressive navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial mark-up languages (e.g. KML) and open programming libraries (Javascript) also makes it possible to create 3D spatial objects and convey on them the sensation of any type of texture by utilizing open 3D representation models (e.g. Collada). One step beyond, by employing WebGL frameworks (e.g. Cesium.js, three.js), animation and motion effects can be applied to 3D models. However, major GIS-based functionalities combined with the above-mentioned visualization capabilities, such as animation effects on selected areas of the terrain texture (e.g. sea waves) or motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in an ocean), are not widely supported, at least by open geospatial applications or development frameworks. Towards this, we developed and made available to the research community an open geospatial software application prototype that provides high-level capabilities for dynamically creating user-defined virtual geospatial worlds populated by selected animated and moving 3D models on user-specified locations, paths and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations, with the geospatial aspect of a virtual world.

  1. Personality trait development at the end of life: Antecedents and correlates of mean-level trajectories.

    PubMed

    Wagner, Jenny; Ram, Nilam; Smith, Jacqui; Gerstorf, Denis

    2016-09-01

    Empirical evidence over the past 20 years has documented that key aspects of personality traits change during adulthood. However, it is essentially an open question whether and how traits change at the very end of life and what role health, cognitive performance, perceived control, and social factors play in those changes. To examine these questions, we applied growth models to 13-year longitudinal data obtained from now-deceased participants in the Berlin Aging Study (N = 463; age at baseline M = 85.9 years, SD = 8.4; 51% men). Results revealed that neuroticism, on average, increases (about 0.3 SD in the last 10 years) and that this increase becomes even steeper at the end of life. In contrast, extraversion and openness decline rather steadily at the end of life (about -0.5 SD in the last 10 years). Additionally, poor health manifested as a risk factor for declines in extraversion and openness late in life but not neuroticism. Similar to earlier phases of life, better cognitive performance related to more openness. More loneliness was associated with higher neuroticism, whereas more social activity was associated with higher levels of extraversion and openness. Intriguing additional insights indicated that more personal control was associated with higher levels of extraversion and openness, whereas the feeling that one's life is controlled by others was associated with higher neuroticism but also with higher openness closer to death. We discuss potential pathways by which health, cognitive performance, control, and social inclusion resources and risk factors affect personality development late in life.

  2. A survey on hysteresis modeling, identification and control

    NASA Astrophysics Data System (ADS)

    Hassani, Vahid; Tjahjowidodo, Tegoeh; Do, Thanh Nho

    2014-12-01

    The various mathematical models for hysteresis such as Preisach, Krasnosel'skii-Pokrovskii (KP), Prandtl-Ishlinskii (PI), Maxwell-Slip, Bouc-Wen and Duhem are surveyed in terms of their applications in modeling, control and identification of dynamical systems. In the first step, the classical formalisms of the models are presented to the reader, and more broadly, the utilization of the classical models is considered for development of more comprehensive models and appropriate controllers for corresponding systems. In addition, the authors attempt to encourage the reader to follow the existing mathematical models of hysteresis to resolve the open problems.
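
    As a concrete example of one of the surveyed formalisms, the Python sketch below integrates the Bouc-Wen hysteresis equation for a sinusoidally imposed displacement; the parameter values are illustrative only and are not tied to any particular system from the survey.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Bouc-Wen hysteretic variable z driven by an imposed displacement x(t):
        #   dz/dt = A*dx/dt - beta*|dx/dt|*|z|**(n-1)*z - gamma*(dx/dt)*|z|**n
        A, beta, gamma, n = 1.0, 0.5, 0.5, 1.0
        displacement = lambda t: np.sin(t)
        velocity = lambda t: np.cos(t)

        def bouc_wen(t, z):
            v = velocity(t)
            return [A * v
                    - beta * abs(v) * abs(z[0]) ** (n - 1) * z[0]
                    - gamma * v * abs(z[0]) ** n]

        sol = solve_ivp(bouc_wen, (0.0, 20.0), [0.0], max_step=0.01)
        # Plotting sol.y[0] against displacement(sol.t) traces the hysteresis loop.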

  3. Applying an Archetype-Based Approach to Electroencephalography/Event-Related Potential Experiments in the EEGBase Resource.

    PubMed

    Papež, Václav; Mouček, Roman

    2017-01-01

    The purpose of this study is to investigate the feasibility of applying openEHR (an archetype-based approach for electronic health records representation) to modeling data stored in EEGBase, a portal for experimental electroencephalography/event-related potential (EEG/ERP) data management. The study evaluates re-usage of existing openEHR archetypes and proposes a set of new archetypes together with the openEHR templates covering the domain. The main goals of the study are to (i) link existing EEGBase data/metadata and openEHR archetype structures and (ii) propose a new openEHR archetype set describing the EEG/ERP domain since this set of archetypes currently does not exist in public repositories. The main methodology is based on the determination of the concepts obtained from EEGBase experimental data and metadata that are expressible structurally by the openEHR reference model and semantically by openEHR archetypes. In addition, templates as the third openEHR resource allow us to define constraints over archetypes. Clinical Knowledge Manager (CKM), a public openEHR archetype repository, was searched for the archetypes matching the determined concepts. According to the search results, the archetypes already existing in CKM were applied and the archetypes not existing in the CKM were newly developed. openEHR archetypes support linkage to external terminologies. To increase semantic interoperability of the new archetypes, binding with the existing odML electrophysiological terminology was assured. Further, to increase structural interoperability, also other current solutions besides EEGBase were considered during the development phase. Finally, a set of templates using the selected archetypes was created to meet EEGBase requirements. A set of eleven archetypes that encompassed the domain of experimental EEG/ERP measurements were identified. Of these, six were reused without changes, one was extended, and four were newly created. All archetypes were arranged in the templates reflecting the EEGBase metadata structure. A mechanism of odML terminology referencing was proposed to assure semantic interoperability of the archetypes. The openEHR approach was found to be useful not only for clinical purposes but also for experimental data modeling.

  4. Operational aspects of asynchronous filtering for improved flood forecasting

    NASA Astrophysics Data System (ADS)

    Rakovec, Oldrich; Weerts, Albrecht; Sumihar, Julius; Uijlenhoet, Remko

    2014-05-01

    Hydrological forecasts can be made more reliable and less uncertain by recursively improving initial conditions. A common way of improving the initial conditions is to make use of data assimilation (DA), a feedback mechanism or update methodology which merges model estimates with available real world observations. The traditional implementation of the Ensemble Kalman Filter (EnKF; e.g. Evensen, 2009) is synchronous, commonly named a three dimensional (3-D) assimilation, which means that all assimilated observations correspond to the time of update. Asynchronous DA, also called four dimensional (4-D) assimilation, refers to an updating methodology, in which observations being assimilated into the model originate from times different to the time of update (Evensen, 2009; Sakov 2010). This study investigates how the capabilities of the DA procedure can be improved by applying alternative Kalman-type methods, e.g., the Asynchronous Ensemble Kalman Filter (AEnKF). The AEnKF assimilates observations with smaller computational costs than the original EnKF, which is beneficial for operational purposes. The results of discharge assimilation into a grid-based hydrological model for the Upper Ourthe catchment in Belgian Ardennes show that including past predictions and observations in the AEnKF improves the model forecasts as compared to the traditional EnKF. Additionally we show that elimination of the strongly non-linear relation between the soil moisture storage and assimilated discharge observations from the model update becomes beneficial for an improved operational forecasting, which is evaluated using several validation measures. In the current study we employed the HBV-96 model built within a recently developed open source modelling environment OpenStreams (2013). The advantage of using OpenStreams (2013) is that it enables direct communication with OpenDA (2013), an open source data assimilation toolbox. OpenDA provides a number of algorithms for model calibration and assimilation and is suitable to be connected to any kind of environmental model. This setup is embedded in the Delft Flood Early Warning System (Delft-FEWS, Werner et al., 2013) for making all simulations and forecast runs and handling of all hydrological and meteorological data. References: Evensen, G. (2009), Data Assimilation: The Ensemble Kalman Filter, Springer, doi:10.1007/978-3-642-03711-5. OpenDA (2013), The OpenDA data-assimilation toolbox, www.openda.org, (last access: 1 November 2013). OpenStreams (2013), OpenStreams, www.openstreams.nl, (last access: 1 November 2013). Sakov, P., G. Evensen, and L. Bertino (2010), Asynchronous data assimilation with the EnKF, Tellus, Series A: Dynamic Meteorology and Oceanography, 62(1), 24-29, doi:10.1111/j.1600-0870.2009.00417.x. Werner, M., J. Schellekens, P. Gijsbers, M. van Dijk, O. van den Akker, and K. Heynert (2013), The Delft-FEWS flow forecasting system, Environ. Mod. & Soft., 40(0), 65-77, doi: http://dx.doi.org/10.1016/j.envsoft.2012.07.010.
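
    For reference, a single stochastic ensemble Kalman analysis step can be written in a few lines of Python, as sketched below. The AEnKF used in the study differs mainly in that the state vector is augmented with model-predicted observations from earlier times, after which the same update is applied; the sketch is a generic illustration and is not taken from OpenDA.

        import numpy as np

        def enkf_update(X, y, H, R, rng):
            # X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observations;
            # H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs-error covariance.
            n_ens = X.shape[1]
            A = X - X.mean(axis=1, keepdims=True)
            HX = H @ X
            HA = HX - HX.mean(axis=1, keepdims=True)
            P_yy = HA @ HA.T / (n_ens - 1) + R           # innovation covariance
            P_xy = A @ HA.T / (n_ens - 1)                # state-observation covariance
            K = P_xy @ np.linalg.inv(P_yy)               # Kalman gain
            Y = y[:, None] + rng.multivariate_normal(np.zeros_like(y), R, n_ens).T
            return X + K @ (Y - HX)                      # updated (analysis) ensemble

        # Tiny usage example: a 3-variable state observed at its first component.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(3, 50))
        H = np.array([[1.0, 0.0, 0.0]])
        R = np.array([[0.1]])
        Xa = enkf_update(X, np.array([0.5]), H, R, rng)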

  5. New insights into the electroreduction of ethylene sulfite as an electrolyte additive for facilitating solid electrolyte interphase formation in lithium ion batteries.

    PubMed

    Sun, Youmin; Wang, Yixuan

    2017-03-01

    To help understand the solid electrolyte interphase (SEI) formation facilitated by electrolyte additives of lithium-ion batteries (LIBs) the supermolecular clusters [(ES)Li+(PC)m](PC)n (m = 1-2; n = 0, 6 and 9) were used to investigate the electroreductive decompositions of the electrolyte additive ethylene sulfite (ES) as well as the solvent propylene carbonate (PC) with density functional theory. The results show that ES can be reduced prior to PC, resulting in a reduction precursor that will then undergo a ring opening decomposition to yield a radical anion. A new concerted pathway (path B) was located for the ring opening of the reduced ES, which has a much lower energy barrier than the previously reported stepwise pathway (path A). The transition state for the ring opening of PC induced by the reduced ES (path C, indirect path) is closer to that of path A than path B in energy. The direct ring opening of the reduced PC (path D) has a lower energy barrier than paths A, B and C, yet it is less favorable than the latter paths in terms of thermodynamics (vertical electron affinity or reduction potential and dissociation energy). The overall rate constant including the initial reduction and the subsequent ring opening for path B is the largest among the four paths, followed by paths A > C > D, which further signifies the importance of the concerted new path in facilitating the SEI formation. The hybrid models, the supermolecular clusters augmented by a polarized continuum model, PCM-[(ES)Li+(PC)2](PC)n (n = 0, 6 and 9), were used to further estimate the reduction potential by taking into account both explicit and implicit solvent effects. The second solvation shell of Li+ in [(ES)Li+(PC)2](PC)n (n = 6 and 9) partially compensates the overestimation of solvent effects arising from the PCM for the naked (ES)Li+(PC)2, and the theoretical reduction potential of PCM-[(ES)Li+(PC)2](PC)6 (1.90-1.93 V) agrees very well with the experimental one (1.8-2.0 V).
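
    The link between the computed energy barriers and the reported rate-constant ordering can be made explicit with transition-state theory. The Python sketch below evaluates the Eyring expression k = (k_B T / h) exp(-dG / (R T)) for two hypothetical barrier heights; the numerical barriers are placeholders for illustration, not values from the paper.

        import numpy as np

        KB = 1.380649e-23          # Boltzmann constant, J/K
        H_PLANCK = 6.62607015e-34  # Planck constant, J*s
        R_GAS = 8.314462618        # gas constant, J/(mol*K)

        def eyring_rate(barrier_kcal_per_mol, temperature=298.15):
            # Transition-state-theory rate constant from a free-energy barrier.
            barrier_j_per_mol = barrier_kcal_per_mol * 4184.0
            return (KB * temperature / H_PLANCK) * np.exp(
                -barrier_j_per_mol / (R_GAS * temperature))

        # Placeholder barriers: a lower barrier (concerted path) vs. a higher one (stepwise path);
        # the lower barrier yields the larger rate constant.
        print(eyring_rate(10.0), eyring_rate(15.0))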

  6. An open and extensible framework for spatially explicit land use change modelling: the lulcc R package

    NASA Astrophysics Data System (ADS)

    Moulds, S.; Buytaert, W.; Mijic, A.

    2015-10-01

    We present the lulcc software package, an object-oriented framework for land use change modelling written in the R programming language. The contribution of the work is to resolve the following limitations associated with the current land use change modelling paradigm: (1) the source code for model implementations is frequently unavailable, severely compromising the reproducibility of scientific results and making it impossible for members of the community to improve or adapt models for their own purposes; (2) ensemble experiments to capture model structural uncertainty are difficult because of fundamental differences between implementations of alternative models; and (3) additional software is required because existing applications frequently perform only the spatial allocation of change. The package includes a stochastic ordered allocation procedure as well as an implementation of the CLUE-S algorithm. We demonstrate its functionality by simulating land use change at the Plum Island Ecosystems site, using a data set included with the package. It is envisaged that lulcc will enable future model development and comparison within an open environment.

  7. Applying 1D Sediment Models to Reservoir Flushing Studies: Measuring, Monitoring, and Modeling the Spencer Dam Sediment Flush with HEC-RAS

    DTIC Science & Technology

    2016-07-01

    Approximately 5 hours (hr) after opening the main gates, multiple channels eroded (Figure 5), moving sediment through the dam throughout the first day; additional sediment evacuation was observed over the next 4 weeks. The work applies a physically-based channel-modeling framework integrating HEC-RAS sediment transport capabilities and the USDA-ARS Bank-Stability and Toe-Erosion model.

  8. mdFoam+: Advanced molecular dynamics in OpenFOAM

    NASA Astrophysics Data System (ADS)

    Longshaw, S. M.; Borg, M. K.; Ramisetti, S. B.; Zhang, J.; Lockerby, D. A.; Emerson, D. R.; Reese, J. M.

    2018-03-01

    This paper introduces mdFoam+, which is an MPI parallelised molecular dynamics (MD) solver implemented entirely within the OpenFOAM software framework. It is open-source and released under the same GNU General Public License (GPL) as OpenFOAM. The source code is released as a publicly open software repository that includes detailed documentation and tutorial cases. Since mdFoam+ is designed entirely within the OpenFOAM C++ object-oriented framework, it inherits a number of key features. The code is designed for extensibility and flexibility, so it is aimed first and foremost as an MD research tool, in which new models and test cases can be developed and tested rapidly. Implementing mdFoam+ in OpenFOAM also enables easier development of hybrid methods that couple MD with continuum-based solvers. Setting up MD cases follows the standard OpenFOAM format, as mdFoam+ also relies upon the OpenFOAM dictionary-based directory structure. This ensures that useful pre- and post-processing capabilities provided by OpenFOAM remain available even though the fully Lagrangian nature of an MD simulation is not typical of most OpenFOAM applications. Results show that mdFoam+ compares well to another well-known MD code (e.g. LAMMPS) in terms of benchmark problems, although it also has additional functionality that does not exist in other open-source MD codes.
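
    mdFoam+ is written in C++ inside OpenFOAM; the sketch below is only a minimal, generic illustration in Python of the kind of computation an MD solver performs (Lennard-Jones pair forces plus velocity-Verlet time stepping), not mdFoam+ code or its API.

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces for a few particles (no cutoff, no periodic box)."""
    n = len(pos)
    forces = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = np.dot(r, r)
            sr6 = (sigma**2 / d2) ** 3
            fmag = 24.0 * eps * (2.0 * sr6**2 - sr6) / d2   # force magnitude divided by |r|
            forces[i] += fmag * r
            forces[j] -= fmag * r
    return forces

def velocity_verlet(pos, vel, dt=0.005, steps=100, mass=1.0):
    """Minimal velocity-Verlet integrator."""
    f = lj_forces(pos)
    for _ in range(steps):
        pos = pos + vel * dt + 0.5 * (f / mass) * dt**2
        f_new = lj_forces(pos)
        vel = vel + 0.5 * (f + f_new) / mass * dt
        f = f_new
    return pos, vel

pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [0.0, 1.6, 0.0]])
vel = np.zeros_like(pos)
print(velocity_verlet(pos, vel)[0])
```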

  9. A ROSE-based OpenMP 3.0 Research Compiler Supporting Multiple Runtime Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, C; Quinlan, D; Panas, T

    2010-01-25

    OpenMP is a popular and evolving programming model for shared-memory platforms. It relies on compilers for optimal performance and to target modern hardware architectures. A variety of extensible and robust research compilers are key to OpenMP's sustainable success in the future. In this paper, we present our efforts to build an OpenMP 3.0 research compiler for C, C++, and Fortran using the ROSE source-to-source compiler framework. Our goal is to support OpenMP research for ourselves and others. We have extended ROSE's internal representation to handle all of the OpenMP 3.0 constructs and facilitate their manipulation. Since OpenMP research is often complicated by the tight coupling of the compiler translations and the runtime system, we present a set of rules to define a common OpenMP runtime library (XOMP) on top of multiple runtime libraries. These rules additionally define how to build a set of translations targeting XOMP. Our work demonstrates how to reuse OpenMP translations across different runtime libraries. This work simplifies OpenMP research by decoupling the problematic dependence between the compiler translations and the runtime libraries. We present an evaluation of our work by demonstrating an analysis tool for OpenMP correctness. We also show how XOMP can be defined using both GOMP and Omni and present comparative performance results against other OpenMP compilers.

  10. A Modular GIS-Based Software Architecture for Model Parameter Estimation using the Method of Anchored Distributions (MAD)

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.

    2012-12-01

    The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well-suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes including a forward model (e.g. MODFLOW, HYDRUS) and geostatistical analysis (e.g. R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with core MAD software. This presentation gives an example of adapting the MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed and it is expected that the open source nature of the project will engender the development of additional model drivers by 3rd party scientists.

  11. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including hanging wall and directivity effects) within modern ground motion prediction equations, can have an influence on the seismic hazard at a site. Yet we also illustrate the conditions under which these effects may be partially tempered when considering the full uncertainty in rupture behaviour within the fault system. The third challenge is the development of efficient means for representing both aleatory and epistemic uncertainties from active fault models in PSHA. In implementing state-of-the-art seismic hazard models into OpenQuake, such as those recently undertaken in California and Japan, new modeling techniques are needed that redefine how we treat interdependence of ruptures within the model (such as mutual exclusivity), and the propagation of uncertainties emerging from geology. Finally, we illustrate how OpenQuake, and GEM's additional toolkits for model preparation, can be applied to address long-standing issues in active fault modeling in PSHA. These include constraining the seismogenic coupling of a fault and the partitioning of seismic moment between the active fault surfaces and the surrounding seismogenic crust. We illustrate some of the possible roles that geodesy can play in the process, but highlight where this may introduce new uncertainties and potential biases into the seismic hazard process, and how these can be addressed.

  12. Model Predictive Flight Control System with Full State Observer using H∞ Method

    NASA Astrophysics Data System (ADS)

    Sanwale, Jitu; Singh, Dhan Jeet

    2018-03-01

    This paper presents the application of the model predictive approach to design a flight control system (FCS) for the longitudinal dynamics of a fixed-wing aircraft. The longitudinal dynamics are derived for a conventional aircraft, and an open-loop aircraft response analysis is carried out. Simulation studies illustrate the efficacy of the proposed model predictive controller with an H∞ state observer. The estimation criterion used in the H∞ observer design is to minimize the worst possible effects of the modelling errors and additive noise on the parameter estimation.
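
    For reference, a generic discrete-time linear MPC formulation (a textbook form, not the paper's specific longitudinal-dynamics model) is:

```latex
% Generic linear MPC formulation, shown for illustration only:
% predict over a horizon N, penalise tracking error and control increments.
\begin{align}
  x_{k+1} &= A\,x_k + B\,u_k, \qquad y_k = C\,x_k, \\
  \min_{u_0,\dots,u_{N-1}} \;&\; \sum_{k=0}^{N-1}
      \lVert y_k - r_k \rVert_Q^2 + \lVert \Delta u_k \rVert_R^2
  \quad \text{s.t.}\; u_{\min} \le u_k \le u_{\max}
\end{align}
```

    with the H∞ observer supplying the state estimate used to initialise each prediction.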

  13. How Significant is the Slope of the Sea-side Boundary for Modelling Seawater Intrusion in Coastal Aquifers?

    NASA Astrophysics Data System (ADS)

    Walther, Marc; Graf, Thomas; Kolditz, Olaf; Lield, Rudolf; Post, Vincent

    2017-04-01

    A large number of people live in coastal areas using the available water resources, which in (semi-)arid regions are often taken from groundwater as the only sufficient source. Compared to surface water, groundwater usually provides a safe water supply due to the remediation and retention capabilities of the subsurface, its high yield, and its potentially longer-term stability. When water is withdrawn from a coastal aquifer, however, coastal water management has to ensure that seawater intrusion is held back in order to keep the water salinity at an acceptable level for all water users (e.g. agriculture, industry, households). Besides monitoring of water levels and saline intrusion, it has become common practice to use numerical modeling for evaluating coastal water resources and projecting future scenarios. When applying a model, the simplifications made during the conceptualization of the setup must still include the relevant processes (here variable-density flow and mass transport) and sensitive parameters (for a steady state, commonly hydraulic conductivity, density ratio, and dispersivity). Additionally, the model's boundary conditions are essential to the simulation results. In order to reduce the number of elements, and thus the computational burden, one simplification made in most regional-scale saltwater intrusion applications is to represent the sea-side boundary with a vertical geometry, contrary to the natural conditions, which usually show a very shallow descent of the interface between the aquifer and the open seawater. We use the scientific open-source modeling toolbox OpenGeoSys [1] to quantify the influence of this simplification on the saline intrusion, submarine groundwater discharge, and groundwater residence times. Using an ensemble of different shelf shapes for a steady state setup, we identified a significant dependency of the saline intrusion length on the geometric parameters of the sea-side boundary. Results show that the additional effort to implement a sloped sea-side boundary may have a significant impact on assessments of coastal water resources, and its influence may be of a similar magnitude to that of other common uncertainties in numerical modelling. Literature: [1] Kolditz, O., Bauer, S., Bilke, L., Böttcher, N., Delfs, J. O., Fischer, T., Görke, U. J., et al. (2012). OpenGeoSys: an open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical (THM/C) processes in porous media. Environmental Earth Sciences, 67(2), 589-599. doi:10.1007/s12665-012-1546-x
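
    As background for why the density ratio and sea-side geometry matter, the classical sharp-interface (Ghyben-Herzberg) approximation, which variable-density simulations such as these refine, relates the depth of the freshwater-saltwater interface below sea level, z, to the freshwater head above sea level, h:

```latex
% Classical sharp-interface background relation (not a result of the study):
\[
z \;=\; \frac{\rho_f}{\rho_s - \rho_f}\, h \;\approx\; 40\, h,
\qquad \rho_f \approx 1000~\mathrm{kg\,m^{-3}},\quad \rho_s \approx 1025~\mathrm{kg\,m^{-3}}
\]
```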

  14. The serotonin transporter promoter variant (5-HTTLPR) and childhood adversity are associated with the personality trait openness to experience.

    PubMed

    Rahman, Md Shafiqur; Guban, Peter; Wang, Mei; Melas, Philippe A; Forsell, Yvonne; Lavebratt, Catharina

    2017-11-01

    There is evidence supporting an association between the serotonin-transporter-linked polymorphic region (5-HTTLPR) and the Five Factor Model (FFM) of human personality. 5-HTTLPR has also been found to interact with stressful life events to increase risk of psychopathology. In the present study, by taking into account stressful life events in the form of childhood adversity, we examined the association between 5-HTTLPR and FFM traits using an adult Swedish cohort (N = 3112). We found that 5-HTTLPR was significantly associated with openness (to experience). Specifically, homozygote carriers of the short allele had lower levels of openness compared to carriers of the long allele. In addition, childhood adversity was found to influence openness. These findings support a previously reported association of 5-HTTLPR with openness in a younger cohort and may provide insights into the neurobiological basis of human personality. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. An Investigation of Jogging Biomechanics using the Full-Body Lumbar Spine Model: Model Development and Validation

    PubMed Central

    Raabe, Margaret E.; Chaudhari, Ajit M.W.

    2016-01-01

    The ability of a biomechanical simulation to produce results that can translate to real-life situations is largely dependent on the physiological accuracy of the musculoskeletal model. There are a limited number of freely-available, full-body models that exist in OpenSim, and those that do exist are very limited in terms of trunk musculature and degrees of freedom in the spine. Properly modeling the motion and musculature of the trunk is necessary to most accurately estimate lower extremity and spinal loading. The objective of this study was to develop and validate a more physiologically accurate OpenSim full-body model. By building upon three previously developed OpenSim models, the Full-Body Lumbar Spine (FBLS) model, comprised of 21 segments, 30 degrees-of-freedom, and 324 musculotendon actuators, was developed. The five lumbar vertebrae were modeled as individual bodies, and coupled constraints were implemented to describe the net motion of the spine. The eight major muscle groups of the lumbar spine were modeled (rectus abdominis, external and internal obliques, erector spinae, multifidus, quadratus lumborum, psoas major, and latissimus dorsi), and many of these muscle groups were modeled as multiple fascicles allowing the large muscles to act in multiple directions. The resulting FBLS model's trunk muscle geometry, maximal isometric joint moments, and simulated muscle activations compare well to experimental data. The FBLS model will be made freely available (https://simtk.org/home/fullbodylumbar) for others to perform additional analyses and develop simulations investigating full-body dynamics and contributions of the trunk muscles to dynamic tasks. PMID:26947033
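
    As a hedged usage sketch (assuming the OpenSim 4.x Python bindings are installed and the published model file has been downloaded; the filename below is a placeholder and method names can differ between OpenSim versions), loading and inspecting such a model might look like:

```python
# Hypothetical sketch; 'FullBodyLumbarSpine.osim' is a placeholder filename.
import opensim as osim

model = osim.Model("FullBodyLumbarSpine.osim")   # downloaded FBLS model file
state = model.initSystem()                       # build the underlying multibody system

print("Degrees of freedom:", model.getCoordinateSet().getSize())
print("Muscle actuators  :", model.getMuscles().getSize())
```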

  16. Conversion-to-open in laparoscopic appendectomy: A cohort analysis of risk factors and outcomes.

    PubMed

    Finnerty, Brendan M; Wu, Xian; Giambrone, Gregory P; Gaber-Baylis, Licia K; Zabih, Ramin; Bhat, Akshay; Zarnegar, Rasa; Pomp, Alfons; Fleischut, Peter; Afaneh, Cheguevara

    2017-04-01

    Identifying risk factors for conversion from laparoscopic to open appendectomy could select patients who may benefit from primary open appendectomy. We aimed to develop a predictive scoring model for conversion from laparoscopic to open based on pre-operative patient characteristics. A retrospective review of the State Inpatient Database (2007-2011) was performed using derivation (N = 71,617) and validation (N = 143,235) cohorts of adults ≥ 18 years with acute appendicitis treated by laparoscopic-only (LA), conversion from laparoscopic to open (CA), or primary open (OA) appendectomy. Pre-operative variables independently associated with CA were identified and reported as odds ratios (OR) with 95% confidence intervals (CI). A weighted integer-based scoring model to predict CA was designed based on pre-operative variable ORs, and complications between operative subgroups were compared. Independent predictors of CA in the derivation cohort were age ≥40 (OR 1.67; CI 1.55-1.80), male sex (OR 1.25; CI 1.17-1.34), black race (OR 1.46; CI 1.28-1.66), diabetes (OR 1.47; CI 1.31-1.65), obesity (OR 1.56; CI 1.40-1.74), and acute appendicitis with abscess or peritonitis (OR 7.00; CI 6.51-7.53). In the validation cohort, the CA predictive scoring model had an optimal cutoff score of 4 (range 0-9). The risk of conversion-to-open was ≤5% for a score <4, compared to 10-25% for a score ≥4. On composite outcomes analysis controlling for all pre-operative variables, CA had a higher likelihood of infectious/inflammatory (OR 1.44; CI 1.31-1.58), hematologic (OR 1.31; CI 1.17-1.46), and renal (OR 1.22; CI 1.06-1.39) complications compared to OA. Additionally, CA had a higher likelihood of infectious/inflammatory, respiratory, cardiovascular, hematologic, and renal complications compared to LA. CA patients have an unfavorable complication profile compared to OA. The predictors identified in this scoring model could help select for patients who may benefit from primary open appendectomy. Copyright © 2017 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.
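
    The structure of such a weighted integer score is straightforward; the sketch below uses placeholder point values only (the published weights are derived from the odds ratios above and are not reproduced in this abstract), keeping the reported 0-9 range and cutoff of 4.

```python
# Placeholder weights for illustration only -- not the published point values.
# The published score ranges 0-9 with an optimal cutoff of 4.
HYPOTHETICAL_WEIGHTS = {
    "age_ge_40": 1,
    "male_sex": 1,
    "black_race": 1,
    "diabetes": 1,
    "obesity": 1,
    "abscess_or_peritonitis": 4,   # strongest predictor (OR ~7 in the derivation cohort)
}

def conversion_risk_score(patient, weights=HYPOTHETICAL_WEIGHTS, cutoff=4):
    """Sum integer points for each pre-operative risk factor present in `patient`."""
    score = sum(points for factor, points in weights.items() if patient.get(factor, False))
    risk = "higher conversion risk" if score >= cutoff else "lower conversion risk"
    return score, risk

print(conversion_risk_score({"age_ge_40": True, "obesity": True, "abscess_or_peritonitis": True}))
# -> (6, 'higher conversion risk') with these placeholder weights
```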

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez, Jesse E.; Baptista, António M.

    A sediment model coupled to the hydrodynamic model SELFE is validated against a benchmark combining a set of idealized tests and an application to a field-data-rich, energetic estuary. After sensitivity studies, model results for the idealized tests largely agree with previously reported results from other models in addition to analytical, semi-analytical, or laboratory results. Results for suspended sediment in an open-channel test with a fixed bottom are sensitive to the turbulence closure and the treatment of the hydrodynamic bottom boundary. Results for the migration of a trench are very sensitive to the critical stress and erosion rate, but largely insensitive to the turbulence closure. The model is able to qualitatively represent sediment dynamics associated with estuarine turbidity maxima in an idealized estuary. Applied to the Columbia River estuary, the model qualitatively captures sediment dynamics observed by fixed stations and shipborne profiles. Representation of the vertical structure of suspended sediment degrades when stratification is underpredicted. Across all tests, skill metrics of suspended sediments lag those of hydrodynamics even when the dynamics are qualitatively represented. The benchmark is fully documented in an openly available repository to encourage unambiguous comparisons against other models.

  18. Numerical modelling of orthogonal cutting: application to woodworking with a bench plane.

    PubMed

    Nairn, John A

    2016-06-06

    A numerical model for orthogonal cutting using the material point method was applied to woodcutting using a bench plane. The cutting process was modelled by accounting for surface energy associated with wood fracture toughness for crack growth parallel to the grain. By using damping to deal with dynamic crack propagation and modelling all contact between wood and the plane, simulations could initiate chip formation and proceed into steady-state chip propagation including chip curling. Once steady-state conditions were achieved, the cutting forces became constant and could be determined as a function of various simulation variables. The modelling details included a cutting tool, the tool's rake and grinding angles, a chip breaker, a base plate and a mouth opening between the base plate and the tool. The wood was modelled as an anisotropic elastic-plastic material. The simulations were verified by comparison to an analytical model and then used to conduct virtual experiments on wood planing. The virtual experiments showed interactions between depth of cut, chip breaker location and mouth opening. Additional simulations investigated the role of tool grinding angle, tool sharpness and friction.

  19. A Re-Engineered Software Interface and Workflow for the Open-Source SimVascular Cardiovascular Modeling Package.

    PubMed

    Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L

    2018-02-01

    Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. SimVascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline from image segmentation, three-dimensional (3D) solid modeling, and mesh generation, to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, a plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.

  20. Recent evaluations of crack-opening-area in circumferentially cracked pipes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahman, S.; Brust, F.; Ghadiali, N.

    1997-04-01

    Leak-before-break (LBB) analyses for circumferentially cracked pipes are currently being conducted in the nuclear industry to justify elimination of pipe whip restraints and jet shields, which are present because of the expected dynamic effects from pipe rupture. The application of the LBB methodology frequently requires calculation of leak rates. The leak rates depend on the crack-opening area of the through-wall crack in the pipe. In addition to LBB analyses, which assume a hypothetical flaw size, there is also interest in the integrity of actual leaking cracks corresponding to current leakage detection requirements in NRC Regulatory Guide 1.45, or for assessing temporary repair of Class 2 and 3 pipes that have leaks, as is being evaluated in ASME Section XI. The objectives of this study were to review, evaluate, and refine current predictive models for performing crack-opening-area analyses of circumferentially cracked pipes. The results from twenty-five full-scale pipe fracture experiments, conducted in the Degraded Piping Program, the International Piping Integrity Research Group Program, and the Short Cracks in Piping and Piping Welds Program, were used to verify the analytical models. Standard statistical analyses were performed to assess quantitatively the accuracy of the predictive models. The evaluation also involved finite element analyses for determining the crack-opening profile often needed to perform leak-rate calculations.

  1. A musculoskeletal model for the lumbar spine.

    PubMed

    Christophy, Miguel; Faruk Senan, Nur Adila; Lotz, Jeffrey C; O'Reilly, Oliver M

    2012-01-01

    A new musculoskeletal model for the lumbar spine is described in this paper. This model features a rigid pelvis and sacrum, the five lumbar vertebrae, and a rigid torso consisting of a lumped thoracic spine and ribcage. The motion of the individual lumbar vertebrae was defined as a fraction of the net lumbar movement about the three rotational degrees of freedom: flexion-extension, lateral bending, and axial rotation. Additionally, the eight main muscle groups of the lumbar spine were incorporated using 238 muscle fascicles, with the parameters of the Hill-type muscle models obtained with the help of an extensive literature survey. The features of the model include the abilities to predict joint reactions, muscle forces, and muscle activation patterns. To illustrate the capabilities of the model and validate its physiological similarity, the model's predictions for the moment arms of the muscles are shown for a range of flexion-extension motions of the lower back. The model uses the OpenSim platform and is freely available at https://www.simtk.org/home/lumbarspine to other spinal researchers interested in analyzing the kinematics of the spine. The model can also be integrated with existing OpenSim models to build more comprehensive models of the human body.
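
    The Hill-type musculotendon actuators mentioned above are commonly written in the following generic form (a standard formulation given for reference, not the specific parameterisation used in this model):

```latex
% Generic Hill-type musculotendon force (standard form, for reference only):
\[
F^{M} \;=\; F^{M}_{o}\left[\, a\, f_{L}\!\big(\tilde{l}^{M}\big)\,
        f_{V}\!\big(\tilde{v}^{M}\big) \;+\; f_{P}\!\big(\tilde{l}^{M}\big) \right]\cos\alpha
\]
```

    where a is the activation, f_L and f_V are the normalised active force-length and force-velocity curves, f_P the passive force-length curve, and α the pennation angle.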

  2. MOSFiT: Modular Open Source Fitter for Transients

    NASA Astrophysics Data System (ADS)

    Guillochon, James; Nicholl, Matt; Villar, V. Ashley; Mockler, Brenna; Narayan, Gautham; Mandel, Kaisey S.; Berger, Edo; Williams, Peter K. G.

    2018-05-01

    Much of the progress made in time-domain astronomy is accomplished by relating observational multiwavelength time-series data to models derived from our understanding of physical laws. This goal is typically accomplished by dividing the task in two: collecting data (observing), and constructing models to represent that data (theorizing). Owing to the natural tendency for specialization, a disconnect can develop between the best available theories and the best available data, potentially delaying advances in our understanding of new classes of transients. We introduce MOSFiT: the Modular Open Source Fitter for Transients, a Python-based package that downloads transient data sets from open online catalogs (e.g., the Open Supernova Catalog), generates Monte Carlo ensembles of semi-analytical light-curve fits to those data sets and their associated Bayesian parameter posteriors, and optionally delivers the fitting results back to those same catalogs to make them available to the rest of the community. MOSFiT is designed to help bridge the gap between observations and theory in time-domain astronomy; in addition to making the application of existing models and creation of new models as simple as possible, MOSFiT yields statistically robust predictions for transient characteristics, with a standard output format that includes all the setup information necessary to reproduce a given result. As large-scale surveys, such as that conducted with the Large Synoptic Survey Telescope (LSST), discover entirely new classes of transients, tools such as MOSFiT will be critical for enabling rapid comparison of models against data in statistically consistent, reproducible, and scientifically beneficial ways.
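
    MOSFiT's model modules and catalog interface are documented in its own repository; purely to illustrate the kind of ensemble-MCMC light-curve fitting described above, a toy example with a generic affine-invariant sampler (assuming the emcee package is available; the declining-magnitude model is invented for illustration) is:

```python
import numpy as np
import emcee

# Toy light-curve model: linear decline in magnitude (exponential decay in flux).
t_obs = np.linspace(0.0, 60.0, 25)
true = (19.0, 15.0)                      # (peak magnitude, decay timescale in days)

def model_mag(t, m_peak, tau):
    return m_peak + 1.0857 * t / tau     # 1.0857 = 2.5 / ln(10)

mag_obs = model_mag(t_obs, *true) + np.random.default_rng(0).normal(0, 0.1, t_obs.size)

def log_prob(theta):
    m_peak, tau = theta
    if not (10.0 < m_peak < 25.0 and 1.0 < tau < 200.0):   # flat priors
        return -np.inf
    resid = mag_obs - model_mag(t_obs, m_peak, tau)
    return -0.5 * np.sum((resid / 0.1) ** 2)               # Gaussian likelihood

ndim, nwalkers = 2, 16
p0 = np.array(true) + 1e-2 * np.random.default_rng(1).normal(size=(nwalkers, ndim))
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, 1000, progress=False)
print(sampler.get_chain(discard=200, flat=True).mean(axis=0))   # posterior means
```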

  3. Benthic Light Availability Improves Predictions of Riverine Primary Production

    NASA Astrophysics Data System (ADS)

    Kirk, L.; Cohen, M. J.

    2017-12-01

    Light is a fundamental control on photosynthesis, and often the only control strongly correlated with gross primary production (GPP) in streams and rivers; yet it has received far less attention than nutrients. Because benthic light is difficult to measure in situ, surrogates such as open sky irradiance are often used. Several studies have now refined methods to quantify canopy and water column attenuation of open sky light in order to estimate the amount of light that actually reaches the benthos. Given the additional effort that measuring benthic light requires, we should ask if benthic light always improves our predictions of GPP compared to just open sky irradiance. We use long-term, high-resolution dissolved oxygen, turbidity, dissolved organic matter (fDOM), and irradiance data from streams and rivers in north-central Florida, US across gradients of size and color to build statistical models of benthic light that predict GPP. Preliminary results on a large, clear river show only modest model improvements over open sky irradiance, even in heavily canopied reaches with pulses of tannic water. However, in another spring-fed river with greater connectivity to adjacent wetlands - and hence larger, more frequent pulses of tannic water - the model improved dramatically with the inclusion of fDOM (model R2 improved from 0.28 to 0.68). River shade modeling efforts also suggest that knowing benthic light will greatly enhance our ability to predict GPP in narrower, forested streams flowing in particular directions. Our objective is to outline conditions where an assessment of benthic light conditions would be necessary for riverine metabolism studies or management strategies.
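
    A minimal sketch of the benthic-light bookkeeping implied above is given below; the attenuation coefficients are placeholders, whereas in the study such relationships are fit to the turbidity, fDOM, and irradiance time series.

```python
import numpy as np

def benthic_par(open_sky_par, canopy_transmittance, depth_m, turbidity_ntu, fdom_qsu,
                kd0=0.2, k_turb=0.05, k_fdom=0.02):
    """Estimate photosynthetically active radiation reaching the benthos.

    Open-sky PAR is reduced by canopy shading, then attenuated exponentially with
    depth; the diffuse attenuation coefficient Kd is modelled here as a simple
    linear function of turbidity and fDOM (placeholder coefficients).
    """
    kd = kd0 + k_turb * turbidity_ntu + k_fdom * fdom_qsu     # 1/m
    return open_sky_par * canopy_transmittance * np.exp(-kd * depth_m)

# Tannic-water pulse example: higher fDOM sharply reduces benthic light.
print(benthic_par(1500.0, 0.6, depth_m=2.0, turbidity_ntu=3.0, fdom_qsu=20.0))
print(benthic_par(1500.0, 0.6, depth_m=2.0, turbidity_ntu=3.0, fdom_qsu=80.0))
```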

  4. Teaching Earth Science Using Hot Air Balloons

    ERIC Educational Resources Information Center

    Kuhl, James; Shaffer, Karen

    2008-01-01

    Constructing model hot air balloons is an activity that captures the imaginations of students, enabling teachers to present required content to minds that are open to receive it. Additionally, there are few activities that lend themselves to integrating so much content across subject areas. In this article, the authors describe how they have…

  5. The Effect of Magnetic Topology on the Escape of Flare Particles

    NASA Technical Reports Server (NTRS)

    Antiochos, S. K.; Masson, S.; DeVore, C. R.

    2012-01-01

    Magnetic reconnection in the solar atmosphere is believed to be the driver of most solar explosive phenomena. Therefore, the topology of the coronal magnetic field is central to understanding the solar drivers of space weather. Of particular importance to space weather are the impulsive Solar Energetic particles that are associated with some CME/eruptive flare events. Observationally, the magnetic configuration of active regions where solar eruptions originate appears to agree with the standard eruptive flare model. According to this model, particles accelerated at the flare reconnection site should remain trapped in the corona and the ejected plasmoid. However, flare-accelerated particles frequently reach the Earth long before the CME does. We present a model that may account for the injection of energetic particles onto open magnetic flux tubes connecting to the Earth. Our model is based on the well-known 2.5D breakout topology, which has a coronal null point (null line) and a four-flux system. A key new addition, however, is that we include an isothermal solar wind with open-flux regions. Depending on the location of the open flux with respect to the null point, we find that the flare reconnection can consist of two distinct phases. At first, the flare reconnection involves only closed field, but if the eruption occurs close to the open field, we find a second phase involving interchange reconnection between open and closed. We argue that this second reconnection episode is responsible for the injection of flare-accelerated particles into the interplanetary medium. We will report on our recent work toward understanding how flare particles escape to the heliosphere. This work uses high-resolution 2.5D MHD numerical simulations performed with the Adaptively Refined MHD Solver (ARMS).

  6. Accurate structure, thermodynamics and spectroscopy of medium-sized radicals by hybrid Coupled Cluster/Density Functional Theory approaches: the case of phenyl radical

    PubMed Central

    Barone, Vincenzo; Biczysko, Malgorzata; Bloino, Julien; Egidi, Franco; Puzzarini, Cristina

    2015-01-01

    The CCSD(T) model coupled with extrapolation to the complete basis-set limit and additive approaches represents the “golden standard” for the structural and spectroscopic characterization of building blocks of biomolecules and nanosystems. However, when open-shell systems are considered, additional problems related to both specific computational difficulties and the need of obtaining spin-dependent properties appear. In this contribution, we present a comprehensive study of the molecular structure and spectroscopic (IR, Raman, EPR) properties of the phenyl radical with the aim of validating an accurate computational protocol able to deal with conjugated open-shell species. We succeeded in obtaining reliable and accurate results, thus confirming and, partly, extending the available experimental data. The main issue to be pointed out is the need of going beyond the CCSD(T) level by including a full treatment of triple excitations in order to fulfil the accuracy requirements. On the other hand, the reliability of density functional theory in properly treating open-shell systems has been further confirmed. PMID:23802956

  7. Dynamic analysis of gas-core reactor system

    NASA Technical Reports Server (NTRS)

    Turner, K. H., Jr.

    1973-01-01

    A heat transfer analysis was incorporated into a previously developed model CODYN to obtain a model of open-cycle gaseous core reactor dynamics which can predict the heat flux at the cavity wall. The resulting model was used to study the sensitivity of the model to the value of the reactivity coefficients and to determine the system response for twenty specified perturbations. In addition, the model was used to study the effectiveness of several control systems in controlling the reactor. It was concluded that control drums located in the moderator region capable of inserting reactivity quickly provided the best control.

  8. Behavioral assessments of BTBR T+Itpr3tf/J mice by tests of object attention and elevated open platform: Implications for an animal model of psychiatric comorbidity in autism.

    PubMed

    Chao, Owen Y; Yunger, Richelle; Yang, Yi-Mei

    2018-07-16

    Autism spectrum disorders (ASD) are diagnosed based on the behavioral criteria of impaired social interaction, defective communication and repetitive behaviors. Psychiatric comorbidities, such as anxiety and intellectual disability, are commonly present in ASD. The BTBR T+ Itpr3tf/J (BTBR) mice display a range of autistic phenotypes, yet whether this mouse model is appropriate to study psychiatric comorbidity in ASD remains unclear. We addressed this issue by subjecting the BTBR animals to the three-chambered apparatus, open field, object attention test and elevated open platform. Compared to C57BL/6J control mice, the BTBR mice displayed hyperactivity in most of the tests. In the three-chamber assessment, they exhibited deficits in sociability. In the open field, more grooming and thigmotaxis and less rearing behaviors were observed. They also showed impaired object-based attention. On the elevated open platform, the BTBR animals stayed more to the edges than in the center of the platform. To further examine the properties of this test, naïve C57BL/6J mice were randomly administered saline or an anxiogenic substance, caffeine. The caffeine group demonstrated a similar behavioral pattern as the BTBR mice. When the saline group was re-exposed to the same platform, the time they stayed in the center substantially increased, likely due to reduced anxiety by habituation. These results indicate that the BTBR mice were more anxious than control mice on the open platform. Taken together, the BTBR strain exhibits emotional and cognitive impairments in addition to autistic behaviors, suggesting that they can be a valid model for ASD with psychiatric comorbidity. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    NASA Astrophysics Data System (ADS)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
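
    MOOSE itself is C++; as a small, hedged illustration of the Jacobian-free Newton-Krylov idea it relies on (solving a coupled nonlinear residual without forming the Jacobian explicitly), SciPy's JFNK-style solver can be applied to a toy reaction-diffusion problem:

```python
import numpy as np
from scipy.optimize import newton_krylov

# Toy nonlinear residual: a 1D reaction-diffusion equation u'' = exp(u) on a small
# grid, standing in for the kind of coupled multiphysics residuals MOOSE assembles.
n = 50
h = 1.0 / (n + 1)

def residual(u):
    u_pad = np.concatenate(([0.0], u, [1.0]))                 # Dirichlet BCs u(0)=0, u(1)=1
    lap = (u_pad[:-2] - 2.0 * u_pad[1:-1] + u_pad[2:]) / h**2  # discrete Laplacian
    return lap - np.exp(u)                                     # diffusion minus nonlinear source

u0 = np.linspace(0.0, 1.0, n)                                  # initial guess
u = newton_krylov(residual, u0, method="lgmres", f_tol=1e-8)   # Jacobian-free Newton-Krylov
print(u[:5])
```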

  10. Radiation combined injury models to study the effects of interventions and wound biomechanics.

    PubMed

    Zawaski, Janice A; Yates, Charles R; Miller, Duane D; Kaffes, Caterina C; Sabek, Omaima M; Afshar, Solmaz F; Young, Daniel A; Yang, Yunzhi; Gaber, M Waleed

    2014-12-01

    In the event of a nuclear detonation, a considerable number of projected casualties will suffer from combined radiation exposure and burn and/or wound injury. Countermeasure assessment in the setting of radiation exposure combined with dermal injury is hampered by a lack of animal models in which the effects of interventions have been characterized. To address this need, we used two separate models to characterize wound closure. The first was an open wound model in mice to study the effect of wound size in combination with whole-body 6 Gy irradiation on the rate of wound closure, animal weight and survival (morbidity). In this model, the interventions of wound closure, subcutaneous vehicle injection, topical antiseptic and topical antibiotics were studied to measure their effect on healing and survival. The second was a rat closed wound model to study the biomechanical properties of a healed wound at 10 days postirradiation (irradiated with 6 or 7.5 Gy). In addition, complete blood counts were performed and wound pathology was assessed by staining with hematoxylin and eosin, trichrome, CD68 and Ki67. In the mouse open wound model, we found that wound size and morbidity were positively correlated, while wound size and survival were negatively correlated. Regardless of the wound size, the addition of radiation exposure delayed the healing of the wound by approximately 5-6 days. The addition of interventions caused, at a minimum, a 30% increase in survival and improved mean survival by ∼9 days. In the rat closed wound model we found that radiation exposure significantly decreased all wound biomechanical measurements as well as white blood cell, platelet and red blood cell counts at 10 days post wounding. Also, pathological changes showed a loss of dermal structure, thickening of the dermis, loss of collagen/epithelial hyperplasia and an increased density of macrophages. In conclusion, we have characterized the effect of a changing wound size in combination with radiation exposure. We also demonstrated that the most effective interventions mitigated insensible fluid loss, which could help to define the most appropriate requirements of a successful countermeasure.

  11. Making Transporter Models for Drug-Drug Interaction Prediction Mobile.

    PubMed

    Ekins, Sean; Clark, Alex M; Wright, Stephen H

    2015-10-01

    The past decade has seen increased numbers of studies publishing ligand-based computational models for drug transporters. Although they generally use small experimental data sets, these models can provide insights into structure-activity relationships for the transporter. In addition, such models have helped to identify new compounds as substrates or inhibitors of transporters of interest. We recently proposed that many transporters are promiscuous and may require profiling of new chemical entities against multiple substrates for a specific transporter. Furthermore, it should be noted that virtually all of the published ligand-based transporter models are only accessible to those involved in creating them and, consequently, are rarely shared effectively. One way to surmount this is to make models shareable or more accessible. The development of mobile apps that can access such models is highlighted here. These apps can be used to predict ligand interactions with transporters using Bayesian algorithms. We used recently published transporter data sets (MATE1, MATE2K, OCT2, OCTN2, ASBT, and NTCP) to build preliminary models in a commercial tool and in open software that can deliver the model in a mobile app. In addition, several transporter data sets extracted from the ChEMBL database were used to illustrate how such public data and models can be shared. Predicting drug-drug interactions for various transporters using computational models is potentially within reach of anyone with an iPhone or iPad. Such tools could help prioritize which substrates should be used for in vivo drug-drug interaction testing and enable open sharing of models. Copyright © 2015 by The American Society for Pharmacology and Experimental Therapeutics.
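
    As a generic, hedged sketch of the kind of Bayesian ligand classification described above (random bit vectors stand in for real molecular fingerprints, and scikit-learn's Bernoulli naive Bayes stands in for the apps' Bayesian implementation):

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

rng = np.random.default_rng(42)

# Placeholder data: 200 compounds x 512-bit fingerprints, with labels
# 1 = transporter substrate/inhibitor, 0 = inactive.
X = rng.integers(0, 2, size=(200, 512))
y = rng.integers(0, 2, size=200)

clf = BernoulliNB(alpha=1.0).fit(X[:150], y[:150])   # train on the first 150 compounds
probs = clf.predict_proba(X[150:])[:, 1]             # probability of activity for the rest
print("Predicted active fraction:", (probs > 0.5).mean())
```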

  12. Interannual sedimentary effluxes of alkalinity in the southern North Sea: model results compared with summer observations

    NASA Astrophysics Data System (ADS)

    Pätsch, Johannes; Kühn, Wilfried; Dorothea Six, Katharina

    2018-06-01

    For the sediments of the central and southern North Sea, different sources of alkalinity generation are quantified with a regional modelling system for the period 2000-2014. For this purpose, a sediment model originally developed for the global ocean and coupled with a pelagic ecosystem model is adapted to shelf sea dynamics, where much larger turnover rates than in the open and deep ocean occur. To track alkalinity changes due to different nitrogen-related processes, the open ocean sediment model was extended by the state variables particulate organic nitrogen (PON) and ammonium. Directly measured alkalinity fluxes, and those derived from Ra isotope flux observations, from the sediment into the pelagic are reproduced by the model system, but calcite building and calcite dissolution are underestimated. Both fluxes cancel out in terms of alkalinity generation and consumption. Other simulated processes altering alkalinity in the sediment, like net sulfate reduction, denitrification, nitrification, and aerobic degradation, are quantified and compare well with corresponding fluxes derived from observations. Most of these fluxes exhibit a strong positive gradient from the open North Sea to the coast, where large rivers drain nutrients and organic matter. Atmospheric nitrogen deposition also shows a positive gradient from the open sea towards land and supports alkalinity generation in the sediments. An additional source of spatial variability is introduced by the use of a 3-D heterogeneous porosity field. Due to realistic porosity variations (0.3-0.5), the alkalinity fluxes vary by about 4 %. The strongest impact on interannual variations of alkalinity fluxes comes from the temporally varying nitrogen inputs from large rivers, which directly govern the nitrate concentrations in the coastal bottom water and thus provide the nitrate necessary for benthic denitrification. Over the period investigated, the alkalinity effluxes decrease due to the decrease in the nitrogen supply by the rivers.

  13. Investigation of advanced fault insertion and simulator methods

    NASA Technical Reports Server (NTRS)

    Dunn, W. R.; Cottrell, D.

    1986-01-01

    The cooperative agreement partly supported research leading to the open-literature publication cited. Additional efforts under the agreement included research into fault modeling of semiconductor devices. Results of this research are presented in this report which is summarized in the following paragraphs. As a result of the cited research, it appears that semiconductor failure mechanism data is abundant but of little use in developing pin-level device models. Failure mode data on the other hand does exist but is too sparse to be of any statistical use in developing fault models. What is significant in the failure mode data is that, unlike classical logic, MSI and LSI devices do exhibit more than 'stuck-at' and open/short failure modes. Specifically they are dominated by parametric failures and functional anomalies that can include intermittent faults and multiple-pin failures. The report discusses methods of developing composite pin-level models based on extrapolation of semiconductor device failure mechanisms, failure modes, results of temperature stress testing and functional modeling. Limitations of this model particularly with regard to determination of fault detection coverage and latency time measurement are discussed. Indicated research directions are presented.

  14. Efficient Band-to-Trap Tunneling Model Including Heterojunction Band Offset

    DOE PAGES

    Gao, Xujiao; Huang, Andy; Kerr, Bert

    2017-10-25

    In this paper, we present an efficient band-to-trap tunneling model based on the Schenk approach, in which an analytic density-of-states (DOS) model is developed based on the open boundary scattering method. The new model explicitly includes the effect of heterojunction band offset, in addition to the well-known field effect. Its analytic form enables straightforward implementation into TCAD device simulators. It is applicable to all one-dimensional potentials, which can be approximated to a good degree such that the approximated potentials lead to piecewise analytic wave functions with open boundary conditions. The model allows for simulating both the electric-field-enhanced and band-offset-enhanced carrier recombination due to the band-to-trap tunneling near the heterojunction in a heterojunction bipolar transistor (HBT). Simulation results of an InGaP/GaAs/GaAs NPN HBT show that the proposed model predicts significantly increased base currents, due to the hole-to-trap tunneling enhanced by the emitter-base junction band offset. Finally, the results compare favorably with experimental observation.

  16. A modelling framework to simulate foliar fungal epidemics using functional–structural plant models

    PubMed Central

    Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe

    2014-01-01

    Background and Aims Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional–structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Methods Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant–environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Key Results Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. Conclusions This study provides a framework for modelling a large number of pathosystems using FSPMs. This structure can accommodate both previously developed models for individual aspects of pathosystems and new ones. Complex models are deconstructed into separate ‘knowledge sources’ originating from different specialist areas of expertise and these can be shared and reassembled into multidisciplinary models. The framework thus provides a beneficial tool for a potential diverse and dynamic research community. PMID:24925323

  17. Toric-boson model: Toward a topological quantum memory at finite temperature

    NASA Astrophysics Data System (ADS)

    Hamma, Alioscia; Castelnovo, Claudio; Chamon, Claudio

    2009-06-01

    We discuss the existence of stable topological quantum memory at finite temperature. At stake here is the fundamental question of whether it is, in principle, possible to store quantum information for macroscopic times without the intervention from the external world, that is, without error correction. We study the toric code in two dimensions with an additional bosonic field that couples to the defects, in the presence of a generic environment at finite temperature: the toric-boson model. Although the coupling constants for the bare model are not finite in the thermodynamic limit, the model has a finite spectrum. We show that in the topological phase, there is a finite temperature below which open strings are confined and therefore the lifetime of the memory can be made arbitrarily (polynomially) long in system size. The interaction with the bosonic field yields a long-range attractive force between the end points of open strings but leaves closed strings and topological order intact.

  18. Multiscale Analysis of a Collapsible Respiratory Airway

    NASA Astrophysics Data System (ADS)

    Ghadiali, Samir; Bell, E. David; Swarts, J. Douglas

    2006-11-01

    The Eustachian tube (ET) is a collapsible respiratory airway that connects the nasopharynx with the middle ear (ME). The ET normally exists in a collapsed state and must be periodically opened to maintain a healthy and sterile ME. Although the inability to open the ET (i.e. ET dysfunction) is the primary etiology responsible for several common ME diseases (i.e. Otitis Media), the mechanisms responsible for ET dysfunction are not well established. To investigate these mechanisms, we developed a multi-scale model of airflow in the ET and correlated model results with experimental data obtained in healthy and diseased subjects. The computational models utilized finite-element methods to simulate fluid-structure interactions and molecular dynamics techniques to quantify the adhesive properties of mucus glycoproteins. Results indicate that airflow in the ET is highly sensitive to both the dynamics of muscle contraction and molecular adhesion forces within the ET lumen. In addition, correlation of model results with experimental data obtained in diseased subjects was used to identify the biomechanical mechanisms responsible for ET dysfunction.

  19. PUFoam : A novel open-source CFD solver for the simulation of polyurethane foams

    NASA Astrophysics Data System (ADS)

    Karimi, M.; Droghetti, H.; Marchisio, D. L.

    2017-08-01

    In this work a transient three-dimensional mathematical model is formulated and validated for the simulation of polyurethane (PU) foams. The model is based on computational fluid dynamics (CFD) and is coupled with a population balance equation (PBE) to describe the evolution of the gas bubbles/cells within the PU foam. The front face of the expanding foam is monitored on the basis of the volume-of-fluid (VOF) method using a compressible solver available in OpenFOAM version 3.0.1. The solver is additionally supplemented to include the PBE, solved with the quadrature method of moments (QMOM), the polymerization kinetics, an adequate rheological model and a simple model for the foam thermal conductivity. The new solver is labelled as PUFoam and is, for the first time in this work, validated for 12 different mixing-cup experiments. Comparison of the time evolution of the predicted and experimentally measured density and temperature of the PU foam shows the potentials and limitations of the approach.
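
    For reference, the population balance is tracked through the moments of the bubble size distribution, which QMOM closes with a small quadrature; the generic definitions are (standard QMOM relations, not PUFoam-specific coefficients):

```latex
% Moments of the bubble size distribution n(L) and their QMOM quadrature closure:
\[
m_{k} \;=\; \int_{0}^{\infty} n(L)\, L^{k}\, \mathrm{d}L
\;\approx\; \sum_{i=1}^{N} w_{i}\, L_{i}^{k},
\qquad k = 0,\dots,2N-1
\]
```

    where the N weights w_i and abscissas L_i are recovered from the first 2N transported moments in each cell at every time step.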

  20. Open circuit voltage durability study and model of catalyst coated membranes at different humidification levels

    NASA Astrophysics Data System (ADS)

    Kundu, Sumit; Fowler, Michael W.; Simon, Leonardo C.; Abouatallah, Rami; Beydokhti, Natasha

    Fuel cell material durability is an area of extensive research today. Chemical degradation of the ionomer membrane is one important degradation mechanism leading to overall failure of fuel cells. This study examined the effects of relative humidity on the chemical degradation of the membrane during open circuit voltage testing. Five Gore™ PRIMEA® series 5510 catalyst coated membranes were degraded at 100%, 75%, 50%, and 20% RH. Open circuit potential and cumulative fluoride release were monitored over time. Additionally, scanning electron microscopy images were taken at the end of the test. The results showed that, with decreasing RH, the fluoride release rate increased, as did performance degradation. This was attributed to an increase in gas crossover with a decrease in RH. Further, interruptions in testing may heavily influence cumulative fluoride release measurements: frequent stoppages in testing will cause fluoride release to be underestimated. SEM analysis shows that degradation occurred in the ionomer layer close to the cathode catalyst. A chemical degradation model of the ionomer membrane was used to model the results. The model was able to predict fluoride release trends, including the effects of interruptions, showing that changes in gas crossover with RH could explain the experimental results.

  1. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

    Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. Vendor-supported software may also have the additional benefits of guaranteed uptime, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.

  2. Progress on the Fabric for Frontier Experiments Project at Fermilab

    NASA Astrophysics Data System (ADS)

    Box, Dennis; Boyd, Joseph; Dykstra, Dave; Garzoglio, Gabriele; Herner, Kenneth; Kirby, Michael; Kreymer, Arthur; Levshina, Tanya; Mhashilkar, Parag; Sharma, Neha

    2015-12-01

    The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access, and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services, including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  3. An investigation of jogging biomechanics using the full-body lumbar spine model: Model development and validation.

    PubMed

    Raabe, Margaret E; Chaudhari, Ajit M W

    2016-05-03

    The ability of a biomechanical simulation to produce results that can translate to real-life situations is largely dependent on the physiological accuracy of the musculoskeletal model. There are a limited number of freely-available, full-body models that exist in OpenSim, and those that do exist are very limited in terms of trunk musculature and degrees of freedom in the spine. Properly modeling the motion and musculature of the trunk is necessary to most accurately estimate lower extremity and spinal loading. The objective of this study was to develop and validate a more physiologically accurate OpenSim full-body model. By building upon three previously developed OpenSim models, the full-body lumbar spine (FBLS) model, comprising 21 segments, 30 degrees-of-freedom, and 324 musculotendon actuators, was developed. The five lumbar vertebrae were modeled as individual bodies, and coupled constraints were implemented to describe the net motion of the spine. The eight major muscle groups of the lumbar spine were modeled (rectus abdominis, external and internal obliques, erector spinae, multifidus, quadratus lumborum, psoas major, and latissimus dorsi), and many of these muscle groups were modeled as multiple fascicles allowing the large muscles to act in multiple directions. The resulting FBLS model's trunk muscle geometry, maximal isometric joint moments, and simulated muscle activations compare well to experimental data. The FBLS model will be made freely available (https://simtk.org/home/fullbodylumbar) for others to perform additional analyses and develop simulations investigating full-body dynamics and contributions of the trunk muscles to dynamic tasks. Copyright © 2016 Elsevier Ltd. All rights reserved.
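
    A minimal sketch of inspecting the published model with the OpenSim 4.x Python bindings; the model file name below is a placeholder for whatever the SimTK project page provides:

    ```python
    # Requires the OpenSim Python bindings; the .osim file name is assumed here.
    import opensim as osim

    model = osim.Model("FullBodyLumbarSpine.osim")   # placeholder path
    state = model.initSystem()

    print("bodies:", model.getNumBodies())
    print("coordinates (DOF):", model.getNumCoordinates())

    muscles = model.getMuscles()
    print("musculotendon actuators:", muscles.getSize())
    for i in range(min(5, muscles.getSize())):
        print(" ", muscles.get(i).getName())
    ```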

  4. QSAR DataBank - an approach for the digital organization and archiving of QSAR model information

    PubMed Central

    2014-01-01

    Background Research efforts in the field of descriptive and predictive Quantitative Structure-Activity Relationships or Quantitative Structure–Property Relationships produce around one thousand scientific publications annually. All the materials and results are mainly communicated using printed media. The printed media in its present form has obvious limitations when it comes to effectively representing mathematical models, including complex and non-linear ones, and large bodies of associated numerical chemical data. It does not support secondary information extraction or reuse efforts, while in silico studies pose additional requirements for accessibility, transparency and reproducibility of the research. This gap can and should be bridged by introducing domain-specific digital data exchange standards and tools. The current publication presents a formal specification of the quantitative structure-activity relationship data organization and archival format called the QSAR DataBank (QsarDB for shorter, or QDB for shortest). Results The article describes the QsarDB data schema, which formalizes QSAR concepts (objects and relationships between them), and the QsarDB data format, which formalizes their presentation for computer systems. The utility and benefits of QsarDB have been thoroughly tested by solving everyday QSAR and predictive modeling problems, with examples in the field of predictive toxicology, and can be applied to a wide variety of other endpoints. The work is accompanied by an open source reference implementation and tools. Conclusions The proposed open data, open source, and open standards design is open to public and proprietary extensions on many levels. Selected use cases exemplify the benefits of the proposed QsarDB data format. General ideas for future development are discussed. PMID:24910716
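
    Since a QsarDB archive is distributed as an ordinary container of structured entries, a first programmatic look can be as simple as the sketch below; the archive name and the entry names mentioned in the comment are assumptions for illustration, not part of the specification text above:

    ```python
    import zipfile

    # Treat the archive as a zip container and list its entries before reading
    # any specific descriptor; "example.qdb.zip" is a hypothetical file name.
    with zipfile.ZipFile("example.qdb.zip") as qdb:
        for name in qdb.namelist():
            print(name)                    # e.g. top-level XML descriptors, compounds/, models/
        # once an entry path is known, it can be read as text:
        # text = qdb.read(name).decode("utf-8")
    ```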

  5. An analysis of the North Rainier Elk Herd area, Washington: Change detection and habitat modeling with remote sensing and GIS

    NASA Astrophysics Data System (ADS)

    Benton, Joshua J.

    The North Rainier Elk Herd (NREH) is one of ten designated herds in Washington State, all managed by the Washington Department of Fish and Wildlife (WDFW). To aid in the management of the herd, the WDFW has decided to implement a spatial ecosystem analysis. This thesis partially undertakes this analysis through the use of a suite of software tools, the Westside Elk Nutrition and Habitat Use Models (WENHUM). This model analyzes four covariates that have a strong correlation to elk habitat selection: dietary digestible energy (DDE), distance to roads open to the public, mean slope, and distance to cover-forage edge, and returns areas of likely elk habitation or use. This thesis includes an update of the base vegetation layer from 2006 data to 2011; a series of clear cuts was identified as areas of change and fed into the WENHUM models. The addition of these clear cuts created improvements in the higher-quality DDE levels, and when the updated data are compared to the original, predictions of elk use are higher. The presence of open or closed roads was simulated by creating an area of possible closures, selecting candidate roads within that area, and then modeling them as either "all open" or "all closed". The simulation of the road closures produced increases in the higher levels of predicted use.
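
    A purely illustrative sketch of how such covariate rasters could be combined into a relative prediction of elk use; the weights, rescaling, and signs are hypothetical and are not the WENHUM formulation:

    ```python
    import numpy as np

    def rescale(raster):
        r = np.asarray(raster, dtype=float)
        return (r - r.min()) / (r.max() - r.min() + 1e-12)

    def predicted_use(dde, dist_open_roads, slope, dist_edge,
                      weights=(0.4, 0.3, 0.15, 0.15)):   # hypothetical weights
        layers = [rescale(dde),               # higher dietary digestible energy -> more use
                  rescale(dist_open_roads),   # farther from open roads -> more use
                  1.0 - rescale(slope),       # gentler slopes -> more use
                  1.0 - rescale(dist_edge)]   # closer to cover-forage edge -> more use
        return sum(w * layer for w, layer in zip(weights, layers))

    rng = np.random.default_rng(0)
    use = predicted_use(rng.random((10, 10)), rng.random((10, 10)),
                        rng.random((10, 10)), rng.random((10, 10)))
    print("relative use raster shape:", use.shape)
    ```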

  6. Representing Matrix Cracks Through Decomposition of the Deformation Gradient Tensor in Continuum Damage Mechanics Methods

    NASA Technical Reports Server (NTRS)

    Leone, Frank A., Jr.

    2015-01-01

    A method is presented to represent the large-deformation kinematics of intraply matrix cracks and delaminations in continuum damage mechanics (CDM) constitutive material models. The method involves the additive decomposition of the deformation gradient tensor into 'crack' and 'bulk material' components. The response of the intact bulk material is represented by a reduced deformation gradient tensor, and the opening of an embedded cohesive interface is represented by a normalized cohesive displacement-jump vector. The rotation of the embedded interface is tracked as the material deforms and as the crack opens. The distribution of the total local deformation between the bulk material and the cohesive interface components is determined by minimizing the difference between the cohesive stress and the bulk material stress projected onto the cohesive interface. The improvements to the accuracy of CDM models that incorporate the presented method over existing approaches are demonstrated for a single element subjected to simple shear deformation and for a finite element model of a unidirectional open-hole tension specimen. The material model is implemented as a VUMAT user subroutine for the Abaqus/Explicit finite element software. The presented deformation gradient decomposition method reduces the artificial load transfer across matrix cracks subjected to large shearing deformations, and avoids the spurious secondary failure modes that often occur in analyses based on conventional progressive damage models.
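
    A schematic statement of the decomposition, with notation assumed here for illustration rather than taken from the report:

    ```latex
    % F     : total deformation gradient      F_b : bulk (intact material) part
    % delta : cohesive displacement-jump vector (normalized over a length l)
    % N     : interface normal in the reference configuration
    \mathbf{F} \;=\; \mathbf{F}_{\mathrm{b}} \;+\; \frac{1}{l}\,\boldsymbol{\delta}\otimes\mathbf{N}
    ```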

  7. LERU roadmap towards Open Access.

    PubMed

    Ayris, Paul; Björnshauge, Lars; Collier, Mel; Ferwerda, Eelco; Jacobs, Neil; Sinikara, Kaisa; Swan, Alma; de Bries, Saskia; van Wesenbeeck, Astrid

    2015-09-01

    Much of what universities pay for journal subscriptions is money which is not directly spent on research and education, even though it is largely taxpayers' money. As Harvard University already denounced in 2012, many large journal publishers have rendered the situation "fiscally unsustainable and academically restrictive", with some journals costing as much as $40,000 per year (and publishers drawing profits of 35% or more). If one of the wealthiest universities in the world can no longer afford it, who can? It is easy to picture the struggle of European universities with tighter budgets. In addition to subscription costs, academic research funding is also largely affected by "Article Processing Charges" (APC), which come at an additional cost of €2000/article, on average, when making individual articles Gold Open Access. Some publishers are in this way even being paid twice for the same content ("double dipping"). In the era of Open Science, Open Access to publications is one of the cornerstones of the new research paradigm, and business models must support this transition. It should be one of the principal objectives of Commissioner Carlos Moedas and the Dutch EU Presidency (January-June 2016) to ensure that this transition happens. Further developing the EU's leadership in research and innovation largely depends on it. With this statement, "Moving Forwards on Open Access", LERU calls upon all universities, research institutes, research funders and researchers to sign this statement and send a clear signal to the European Commission and the Dutch EU Presidency. Copyright © by the Spanish Society for Microbiology and Institute for Catalan Studies.

  8. Development and Evaluation of the Brief Sexual Openness Scale—A Construal Level Theory Based Approach

    PubMed Central

    Chen, Xinguang; Wang, Yan; Li, Fang; Gong, Jie; Yan, Yaqiong

    2015-01-01

    Obtaining reliable and valid data on sensitive questions represents a longstanding challenge for public health, particularly HIV research. To overcome the challenge, we assessed a construal level theory (CLT)-based novel method. The method was previously established and pilot-tested using the Brief Sexual Openness Scale (BSOS). This scale consists of five items assessing attitudes toward premarital sex, multiple sexual partners, homosexuality, extramarital sex, and commercial sex, all rated on a standard 5-point Likert scale. In addition to self-assessment, the participants were asked to assess rural residents, urban residents, and foreigners. The self-assessment plus the assessment of the three other groups were all used as subconstructs of one latent construct: sexual openness. The method was validated with data from 1,132 rural-to-urban migrants (mean age = 32.5, SD = 7.9; 49.6% female) recruited in China. Consistent with CLT, the Cronbach alpha of the BSOS as a conventional tool increased with social distance, from .81 for self-assessment to .97 for assessing foreigners. In addition to a satisfactory fit of the data to a one-factor model (CFI = .94, TLI = .93, RMSEA = .08), a common factor was separated from the four perspective factors (i.e., migrants’ self-perspective and their perspectives of rural residents, urban residents and foreigners) through a trifactor modeling analysis (CFI = .95, TLI = .94, RMSEA = .08). Relative to its conventional form, the CLT-based BSOS was more reliable (alpha: .96 vs .81) and more valid in predicting sexual desire, frequency of dating, age of first sex, multiple sexual partners and STD history. This novel technique can be used to assess sexual openness, and possibly other sensitive questions, among Chinese domestic migrants. PMID:26308336
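
    For reference, the internal-consistency statistic quoted above (Cronbach's alpha) can be computed directly from a respondents-by-items score matrix; the data below are random placeholders, so the resulting alpha will be near zero:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, k_items) matrix of scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_vars / total_var)

    scores = np.random.randint(1, 6, size=(200, 5))   # 5 items on a 5-point Likert scale
    print("alpha:", round(cronbach_alpha(scores), 2))
    ```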

  9. The CNO Bi-cycle in the Open Cluster NGC 752

    NASA Astrophysics Data System (ADS)

    Hawkins, Keith; Schuler, S.; King, J.; The, L.

    2011-01-01

    The CNO bi-cycle is the primary energy source for main sequence stars more massive than the sun. To test our understanding of stellar evolution models using the CNO bi-cycle, we have undertaken light-element (CNO) abundance analysis of three main sequence dwarf stars and three red giant stars in the open cluster NGC 752 utilizing high resolution (R ≈ 50,000) spectroscopy from the Keck Observatory. Preliminary results indicate that, as expected, there is a depletion of carbon in the giants relative to the dwarfs. Additional analysis is needed to determine if the amount of depletion is in line with model predictions, as seen in the Hyades open cluster. Oxygen abundances are derived from the high-excitation O I triplet, and there is a 0.19 dex offset in the [O/H] abundances between the giants and dwarfs which may be explained by non-local thermodynamic equilibrium (NLTE) effects, although further analysis is needed to verify this. The standard procedure for spectroscopically determining stellar parameters used here allows for a measurement of the cluster metallicity, [Fe/H] = 0.04 ± 0.02. In addition to the Fe abundances, we have determined Na, Mg, and Al abundances to assess the status of other nucleosynthesis processes. The Na, Mg and Al abundances of the giants are enhanced relative to the dwarfs, which is consistent with similar findings in giants of other open clusters. Support for K. Hawkins was provided by the NOAO/KPNO Research Experiences for Undergraduates (REU) Program which is funded by the National Science Foundation Research Experiences for Undergraduates Program and the Department of Defense ASSURE program through Scientific Program Order No. 13 (AST-0754223) of the Cooperative Agreement No. AST-0132798 between the Association of Universities for Research in Astronomy (AURA) and the NSF.

  10. Open Vessel Data Management (OpenVDM), Open-source Software to Assist Vessel Operators with the Task of Ship-wide Data Management.

    NASA Astrophysics Data System (ADS)

    Pinner, J. W., IV

    2016-02-01

    Data from shipboard oceanographic sensors are collected in various ASCII, binary, open and proprietary formats. Acquiring all of these formats using a single, monolithic data acquisition system (DAS) can be cumbersome, complex and difficult to adapt for the ever-changing suite of emerging oceanographic sensors. Another approach to the at-sea data acquisition challenge is to utilize multiple DAS software packages and corral the resulting data files with a ship-wide data management system. The Open Vessel Data Management project (OpenVDM) implements this second approach to ship-wide data management and over the last three years has successfully demonstrated its ability to deliver a consistent cruise data package to scientists while reducing the workload placed on marine technicians. In addition to meeting the at-sea and post-cruise needs of scientists, OpenVDM is helping vessel operators better adhere to the recommendations and best practices set forth by third-party data management and data quality groups such as R2R and SAMOS. OpenVDM also includes tools for supporting telepresence-enabled ocean research/exploration such as bandwidth-efficient ship-to-shore data transfers, shore-side data access, data visualization and near-real-time data quality tests and data statistics. OpenVDM is currently operating aboard three vessels. The R/V Endeavor, operated by the University of Rhode Island, is a regional-class UNOLS research vessel operating under the traditional NSF, PI-driven model. The E/V Nautilus, operated by the Ocean Exploration Trust, specializes in ROV-based, telepresence-enabled oceanographic research. The R/V Falkor, operated by the Schmidt Ocean Institute, is an ocean research platform focusing on cutting-edge technology development. These three vessels all have different missions, sensor suites and operating models, yet all are able to leverage OpenVDM for managing their unique datasets and delivering a more consistent cruise data package to scientists and data archives.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Argo, P.E.; DeLapp, D.; Sutherland, C.D.

    TRACKER is an extension of a three-dimensional Hamiltonian raytrace code developed some thirty years ago by R. Michael Jones. Subsequent modifications to this code, which is commonly called the "Jones Code," were documented by Jones and Stephensen (1975). TRACKER incorporates an interactive user's interface, modern differential equation integrators, graphical outputs, homing algorithms, and the Ionospheric Conductivity and Electron Density (ICED) ionosphere. TRACKER predicts the three-dimensional paths of radio waves through model ionospheres by numerically integrating Hamilton's equations, which are a differential expression of Fermat's principle of least time. By using continuous models, the Hamiltonian method avoids false caustics and discontinuous raypath properties often encountered in other raytracing methods. In addition to computing the raypath, TRACKER also calculates the group path (or pulse travel time), the phase path, the geometrical (or "real") pathlength, and the Doppler shift (if the time variation of the ionosphere is explicitly included). Computational speed can be traded for accuracy by specifying the maximum allowable integration error per step in the integration. Only geometrical optics are included in the main raytrace code; no partial reflections or diffraction effects are taken into account. In addition, TRACKER does not lend itself to statistical descriptions of propagation -- it requires a deterministic model of the ionosphere.
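
    A toy two-dimensional illustration of the underlying technique (integrating Hamilton's equations for a ray in an isotropic, collisionless model ionosphere); this is not TRACKER itself, and the layer parameters and wave frequency are invented:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Hamiltonian H = 0.5*(|p|^2 - n^2(r)); on H = 0, dr/dtau = p, dp/dtau = 0.5*grad(n^2).
    # Refractive index from a parabolic plasma-frequency layer (no B field, no collisions).
    F_WAVE = 7.0e6                         # Hz, operating frequency (hypothetical)
    FP_MAX = 9.0e6                         # Hz, peak plasma frequency (hypothetical)
    Z_PEAK, HALF_THICK = 300e3, 100e3      # m, layer peak height and half-thickness

    def n_squared(z):
        u = (z - Z_PEAK) / HALF_THICK
        fp2 = np.where(np.abs(u) < 1.0, FP_MAX**2 * (1.0 - u**2), 0.0)
        return 1.0 - fp2 / F_WAVE**2

    def grad_n2_z(z, dz=1.0):
        return (n_squared(z + dz) - n_squared(z - dz)) / (2.0 * dz)

    def hamilton(tau, y):
        x, z, px, pz = y
        return [px, pz, 0.0, 0.5 * grad_n2_z(z)]

    elev = np.deg2rad(30.0)                               # launch elevation
    y0 = [0.0, 0.0, np.cos(elev), np.sin(elev)]           # |p| = n = 1 at the ground
    sol = solve_ivp(hamilton, (0.0, 1.5e6), y0, max_step=5e3)

    x_path, z_path = sol.y[0], sol.y[1]
    down = np.where((z_path[1:] <= 0.0) & (z_path[:-1] > 0.0))[0]
    if down.size:
        print("approx. ground range of first hop (km):", x_path[down[0] + 1] / 1e3)
    ```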

  12. Benchmarking an unstructured grid sediment model in an energetic estuary

    DOE PAGES

    Lopez, Jesse E.; Baptista, António M.

    2016-12-14

    A sediment model coupled to the hydrodynamic model SELFE is validated against a benchmark combining a set of idealized tests and an application to a field-data rich energetic estuary. After sensitivity studies, model results for the idealized tests largely agree with previously reported results from other models in addition to analytical, semi-analytical, or laboratory results. Results of suspended sediment in an open channel test with fixed bottom are sensitive to turbulence closure and treatment for hydrodynamic bottom boundary. Results for the migration of a trench are very sensitive to critical stress and erosion rate, but largely insensitive to turbulence closure. The model is able to qualitatively represent sediment dynamics associated with estuarine turbidity maxima in an idealized estuary. Applied to the Columbia River estuary, the model qualitatively captures sediment dynamics observed by fixed stations and shipborne profiles. Representation of the vertical structure of suspended sediment degrades when stratification is underpredicted. Across all tests, skill metrics of suspended sediments lag those of hydrodynamics even when qualitatively representing dynamics. The benchmark is fully documented in an openly available repository to encourage unambiguous comparisons against other models.
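
    One commonly used model-data agreement measure in this kind of benchmarking is the Willmott skill score, sketched below with placeholder values; the benchmark's own skill metrics are not specified in the abstract above:

    ```python
    import numpy as np

    def willmott_skill(model, obs):
        """Willmott (1981) skill score: 1 = perfect agreement, 0 = no skill."""
        model, obs = np.asarray(model, float), np.asarray(obs, float)
        num = np.sum((model - obs) ** 2)
        den = np.sum((np.abs(model - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
        return 1.0 - num / den

    obs = np.array([10.0, 20.0, 35.0, 15.0, 5.0])   # e.g. suspended sediment, mg/L (placeholder)
    mod = np.array([12.0, 18.0, 30.0, 20.0, 4.0])
    print("skill:", round(willmott_skill(mod, obs), 3))
    ```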

  13. Learning in Later Adulthood: Transitions and Engagement in Formal Study

    ERIC Educational Resources Information Center

    Jamieson, Anne

    2012-01-01

    This paper addresses the question of benefits of education from a life course perspective. Using data from a study of 1600 students (response rate 48%) on an open access program at a London University college, it explores educational activity within the framework of a transitions model. In addition to the quantitative evidence, the article uses…

  14. A Learned Society's Perspective on Publishing.

    PubMed

    Suzuki, Kunihiko; Edelson, Alan; Iversen, Leslie L; Hausmann, Laura; Schulz, Jörg B; Turner, Anthony J

    2016-10-01

    Scientific journals that are owned by a learned society, like the Journal of Neurochemistry (JNC), which is owned by the International Society for Neurochemistry (ISN), benefit the scientific community in that a large proportion of the income is returned to support the scientific mission of the Society. The income generated by the JNC enables the ISN to organize conferences as a platform for members and non-members alike to share their research, supporting researchers particularly in developing countries by travel grants and other funds, and promoting education in student schools. These direct benefits and initiatives for ISN members and non-members distinguish a society journal from pure commerce. However, the world of scholarly publishing is changing rapidly. Open access models have challenged the business model of traditional journal subscription and hence provided free access to publicly funded scientific research. In these models, the manuscript authors pay a publication cost after peer review and acceptance of the manuscript. Over the last decade, numerous new open access journals have been launched and traditional subscription journals have started to offer open access (hybrid journals). However, open access journals follow the general scheme that, of all participating parties, the publisher receives the highest financial benefit. The income is generated by researchers whose positions and research are mostly financed by taxpayers' or funders' money, and by reviewers and editors, who frequently are not reimbursed. Last but not least, the authors pay for the publication of their work after a rigorous and sometimes painful review process. JNC itself has an open access option, at a significantly reduced cost for Society members as an additional benefit. This article provides first-hand insights from two former Editors-in-Chief, Kunihiko Suzuki and Leslie Iversen, about the history of JNC's ownership and about the difficulties and battles fought along the way to its current success and reputation. This article is part of the 60th Anniversary special issue. © 2016 International Society for Neurochemistry.

  15. Exploring the Role of Value Networks for Software Innovation

    NASA Astrophysics Data System (ADS)

    Morgan, Lorraine; Conboy, Kieran

    This paper describes research in progress that aims to explore the applicability and implications of open innovation practices in two firms - one that employs agile development methods and another that utilizes open source software. The open innovation paradigm has a lot in common with open source and agile development methodologies. A particular strength of agile approaches is that they move away from 'introverted' development, involving only the development personnel, and intimately involve the customer in all areas of software creation, supposedly leading to the development of a more innovative and hence more valuable information system. Open source software (OSS) development also shares two key elements of the open innovation model, namely the collaborative development of the technology and shared rights to the use of the technology. However, one shortfall with agile development in particular is the narrow focus on a single customer representative. In response to this, we argue that current thinking regarding innovation needs to be extended to include multiple stakeholders both across and outside the organization. Additionally, for firms utilizing open source, it has been found that their position in a network of potential complementors determines the amount of superior value they create for their customers. Thus, this paper aims to develop a better understanding of the applicability and implications of open innovation practices in firms that employ open source and agile development methodologies. In particular, a conceptual framework is derived for further testing.

  16. Influences of deep learning, need for cognition and preparation time on open- and closed-book test performance.

    PubMed

    Heijne-Penninga, Marjolein; Kuks, Jan B M; Hofman, W H Adriaan; Cohen-Schotanus, Janke

    2010-09-01

    The ability to master discipline-specific knowledge is one of the competencies medical students must acquire. In this context, 'mastering' means being able to recall and apply knowledge. A way to assess this competency is to use both open- and closed-book tests. Student performance on both tests can be influenced by the way the student processes information. Deep information processing is expected to influence performance positively. The personal preferences of students in relation to how they process information in general (i.e. their level of need for cognition) may also be of importance. In this study, we examined the inter-relatedness of deep learning, need for cognition, preparation time, and scores on open- and closed-book tests. This study was conducted at the University Medical Centre Groningen. Participants were Year 2 students (n = 423). They were asked to complete a questionnaire on deep information processing and a need for cognition scale (drawn from a questionnaire on intellectualism) and, additionally, to write down the time they spent on test preparation. We related these measures to the students' scores on two tests, both consisting of open- and closed-book components, and used structural equation modelling to analyse the data. Both questionnaires were completed by 239 students (57%). The results showed that need for cognition positively influenced both open- and closed-book test scores (beta-coefficients 0.05 and 0.11, respectively). Furthermore, study outcomes measured by open-book tests predicted closed-book test results better than the other way around (beta-coefficients 0.72 and 0.11, respectively). Students with a high need for cognition performed better on open- as well as closed-book tests. Deep learning did not influence their performance. Adding open-book tests to the regularly used closed-book tests seems to improve the recall of knowledge that has to be known by heart. Need for cognition may provide a valuable addition to existing theories on learning.

  17. Open source data assimilation framework for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

    An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions and model results. The basic principle is to incorporate measurement information into a model with the aim of improving model results through error minimization. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed data retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated in hydrological models. The assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (time & space) and undetermined system uncertainty. It is therefore useful to use a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break down DA into a set of building blocks programmed in object oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get/set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models capable of all these tasks already exists: OpenMI. OpenMI is an open source standard interface already adopted by key hydrological model providers. It defines a universal approach to interact with hydrological models during simulation to exchange data during runtime, thus facilitating the interactions between models and data sources. The interface is flexible enough that models can interact even if they are coded in different languages, represent processes from different domains or have different spatial and temporal resolutions. An open source framework that bridges OpenMI and OpenDA is presented. The framework provides a generic and easy means for any OpenMI-compliant model to assimilate observation measurements. An example test case will be presented using MIKE SHE, an OpenMI-compliant, fully coupled, integrated hydrological model that can accurately simulate the feedback dynamics of overland flow, the unsaturated zone and the saturated zone.
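
    An illustrative pseudo-interface in Python for the building blocks described above (create an instance, exchange variables, propagate, release); it mirrors the ideas of the OpenMI/OpenDA coupling but is not their actual API:

    ```python
    class ModelInstance:
        """Wraps a hydrological model so a DA scheme can drive it (illustrative only)."""

        def __init__(self, config):
            self.state = {"soil_moisture": [0.25] * config.get("n_cells", 10)}

        def get_values(self, variable_id):
            return self.state[variable_id]

        def set_values(self, variable_id, values):
            self.state[variable_id] = values          # analysis update from the DA scheme

        def propagate(self, t_start, t_end):
            # stand-in for running the wrapped model over [t_start, t_end]
            self.state = {k: [v * 0.99 for v in vals] for k, vals in self.state.items()}

        def finish(self):
            pass                                       # release the model

    # Schematic ensemble loop: forecast each member, then nudge the exchanged
    # variable toward a (fake) observation of 0.30.
    ensemble = [ModelInstance({"n_cells": 4}) for _ in range(3)]
    for member in ensemble:
        member.propagate(0.0, 3600.0)
        sm = member.get_values("soil_moisture")
        member.set_values("soil_moisture", [0.5 * (v + 0.30) for v in sm])
    for member in ensemble:
        member.finish()
    print(ensemble[0].get_values("soil_moisture"))
    ```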

  18. Open science versus commercialization: a modern research conflict?

    PubMed

    Caulfield, Timothy; Harmon, Shawn He; Joly, Yann

    2012-02-27

    Efforts to improve research outcomes have resulted in genomic researchers being confronted with complex and seemingly contradictory instructions about how to perform their tasks. Over the past decade, there has been increasing pressure on university researchers to commercialize their work. Concurrently, they are encouraged to collaborate, share data and disseminate new knowledge quickly (that is, to adopt an open science model) in order to foster scientific progress, meet humanitarian goals, and to maximize the impact of their research. We present selected guidelines from three countries (Canada, United States, and United Kingdom) situated at the forefront of genomics to illustrate this potential policy conflict. Examining the innovation ecosystem and the messages conveyed by the different policies surveyed, we further investigate the inconsistencies between open science and commercialization policies. Commercialization and open science are not necessarily irreconcilable and could instead be envisioned as complementary elements of a more holistic innovation framework. Given the exploratory nature of our study, we wish to point out the need to gather additional evidence on the coexistence of open science and commercialization policies and on its impact, both positive and negative, on genomics academic research.

  19. Technical Note: FreeCT_ICD: An Open Source Implementation of a Model-Based Iterative Reconstruction Method using Coordinate Descent Optimization for CT Imaging Investigations.

    PubMed

    Hoffman, John M; Noo, Frédéric; Young, Stefano; Hsieh, Scott S; McNitt-Gray, Michael

    2018-06-01

    To facilitate investigations into the impacts of acquisition and reconstruction parameters on quantitative imaging, radiomics and CAD using CT imaging, we previously released an open source implementation of a conventional weighted filtered backprojection reconstruction called FreeCT_wFBP. Our purpose was to extend that work by providing an open-source implementation of a model-based iterative reconstruction method using coordinate descent optimization, called FreeCT_ICD. Model-based iterative reconstruction offers the potential for substantial radiation dose reduction, but can impose substantial computational processing and storage requirements. FreeCT_ICD is an open source implementation of a model-based iterative reconstruction method that provides a reasonable tradeoff between these requirements. This was accomplished by adapting a previously proposed method that allows the system matrix to be stored with a reasonable memory requirement. The method amounts to describing the attenuation coefficient using rotating slices that follow the helical geometry. In the initially proposed version, the rotating slices are themselves described using blobs. We have replaced this description by a unique model that relies on tri-linear interpolation together with the principles of Joseph's method. This model offers an improvement in memory requirement while still allowing highly accurate reconstruction for conventional CT geometries. The system matrix is stored column-wise and combined with an iterative coordinate descent (ICD) optimization. The result is FreeCT_ICD, which is a reconstruction program developed on the Linux platform using C++ libraries and the open source GNU GPL v2.0 license. The software is capable of reconstructing raw projection data of helical CT scans. In this work, the software has been described and evaluated by reconstructing datasets exported from a clinical scanner which consisted of an ACR accreditation phantom dataset and a clinical pediatric thoracic scan. For the ACR phantom, image quality was comparable to clinical reconstructions as well as reconstructions using the open-source FreeCT_wFBP software. The pediatric thoracic scan also yielded acceptable results. In addition, we did not observe any deleterious impact in image quality associated with the utilization of rotating slices. These evaluations also demonstrated reasonable tradeoffs in storage requirements and computational demands. FreeCT_ICD is an open-source implementation of a model-based iterative reconstruction method that extends the capabilities of previously released open source reconstruction software and provides the ability to perform vendor-independent reconstructions of clinically acquired raw projection data. This implementation represents a reasonable tradeoff between storage and computational requirements and has demonstrated acceptable image quality in both simulated and clinical image datasets. This article is protected by copyright. All rights reserved.
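
    A schematic numpy sketch of a single iterative coordinate descent pass for a weighted least-squares CT cost; the regularizer and the rotating-slice system model of FreeCT_ICD are omitted, and a small dense matrix stands in for the stored column-wise system matrix:

    ```python
    import numpy as np

    def icd_pass(A, W, y, x):
        """One ICD sweep minimizing 0.5 * || y - A x ||_W^2 over each x[j] in turn."""
        residual = y - A @ x
        for j in range(A.shape[1]):
            a_j = A[:, j]
            denom = a_j @ (W * a_j)
            if denom == 0.0:
                continue
            step = (a_j @ (W * residual)) / denom    # exact 1-D minimizer for coefficient j
            new_xj = max(x[j] + step, 0.0)           # non-negativity constraint
            residual -= a_j * (new_xj - x[j])        # keep the residual consistent
            x[j] = new_xj
        return x

    rng = np.random.default_rng(0)
    A = rng.random((50, 20)); x_true = rng.random(20)
    y = A @ x_true; W = np.ones(50)
    x = np.zeros(20)
    for _ in range(20):
        x = icd_pass(A, W, y, x)
    print("max coefficient error:", float(np.max(np.abs(x - x_true))))
    ```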

  20. Integrating HCI Specialists into Open Source Software Development Projects

    NASA Astrophysics Data System (ADS)

    Hedberg, Henrik; Iivari, Netta

    Typical open source software (OSS) development projects are organized around technically talented developers, whose communication is based on technical aspects and source code. Decision-making power is gained through proven competence and activity in the project, and non-technical end-user opinions are too often neglected. In addition, human-computer interaction (HCI) specialists have encountered difficulties in trying to participate in OSS projects, because there seems to be no clear authority and responsibility for them. In this paper, based on HCI and OSS literature, we introduce an extended OSS development project organization model that adds a new level of communication and roles for attending to the human aspects of software. The proposed model makes the existence of HCI specialists visible in the projects, and promotes interaction between developers and the HCI specialists in the course of a project.

  1. Development of an integrated aeroservoelastic analysis program and correlation with test data

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.; Brenner, M. J.; Voelker, L. S.

    1991-01-01

    The details and results are presented of the general-purpose finite element STructural Analysis RoutineS (STARS) program, used to perform a complete linear aeroelastic and aeroservoelastic analysis. The earlier version of the STARS computer program enabled effective finite element modeling as well as static, vibration, buckling, and dynamic response analysis of damped and undamped systems, including those with pre-stressed and spinning structures. Additions to the STARS program include aeroelastic modeling for flutter and divergence solutions, and hybrid control system augmentation for aeroservoelastic analysis. Numerical results for the X-29A aircraft pertaining to vibration, flutter-divergence, and open- and closed-loop aeroservoelastic controls analysis are compared to ground vibration, wind-tunnel, and flight-test results. The open- and closed-loop aeroservoelastic control analyses are based on a hybrid formulation representing the interaction of structural, aerodynamic, and flight-control dynamics.

  2. An information theory model for dissipation in open quantum systems

    NASA Astrophysics Data System (ADS)

    Rogers, David M.

    2017-08-01

    This work presents a general model for open quantum systems using an information game along the lines of Jaynes’ original work. It is shown how an energy based reweighting of propagators provides a novel moment generating function at each time point in the process. Derivatives of the generating function give moments of the time derivatives of observables. Aside from the mathematically helpful properties, the ansatz reproduces key physics of stochastic quantum processes. At high temperature, the average density matrix follows the Caldeira-Leggett equation. Its associated Langevin equation clearly demonstrates the emergence of dissipation and decoherence time scales, as well as an additional diffusion due to quantum confinement. A consistent interpretation of these results is that decoherence and wavefunction collapse during measurement are directly related to the degree of environmental noise, and thus occur because of subjective uncertainty of an observer.
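
    For reference, the standard high-temperature Caldeira-Leggett master equation for the reduced density matrix, which the averaged dynamics above is stated to reproduce, has the textbook form:

    ```latex
    \frac{d\rho}{dt} \;=\; -\frac{i}{\hbar}\,[H,\rho]
      \;-\; \frac{i\gamma}{\hbar}\,[x,\{p,\rho\}]
      \;-\; \frac{2 m \gamma k_{B} T}{\hbar^{2}}\,[x,[x,\rho]]
    ```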

  3. Robust, open-source removal of systematics in Kepler data

    NASA Astrophysics Data System (ADS)

    Aigrain, S.; Parviainen, H.; Roberts, S.; Reece, S.; Evans, T.

    2017-10-01

    We present ARC2 (Astrophysically Robust Correction 2), an open-source, Python-based systematics-correction pipeline, to correct Kepler prime-mission long-cadence light curves. The ARC2 pipeline identifies and corrects any isolated discontinuities in the light curves and then removes trends common to many light curves. These trends are modelled using the publicly available co-trending basis vectors, within an (approximate) Bayesian framework with 'shrinkage' priors to minimize the risk of overfitting and the injection of any additional noise into the corrected light curves, while keeping any astrophysical signals intact. We show that the ARC2 pipeline's performance matches that of the standard Kepler PDC-MAP data products using standard noise metrics, and demonstrate its ability to preserve astrophysical signals using injection tests with simulated stellar rotation and planetary transit signals. Although it is not identical, the ARC2 pipeline can thus be used as an open-source alternative to PDC-MAP, whenever the ability to model the impact of the systematics removal process on other kinds of signal is important.
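
    The core idea above (fit trends as a penalized linear combination of co-trending basis vectors and subtract them) can be sketched with ridge regression standing in for the Bayesian shrinkage priors; the CBVs and light curve below are synthetic:

    ```python
    import numpy as np

    def detrend_with_cbvs(flux, cbvs, shrinkage=1.0):
        """flux: (n_cadences,) light curve; cbvs: (n_cadences, n_cbv) basis vectors."""
        A = np.asarray(cbvs, float)
        y = np.asarray(flux, float) - np.mean(flux)
        coeffs = np.linalg.solve(A.T @ A + shrinkage * np.eye(A.shape[1]), A.T @ y)
        return flux - A @ coeffs, coeffs

    n, k = 1000, 4
    rng = np.random.default_rng(1)
    cbvs = rng.standard_normal((n, k)).cumsum(axis=0) / 30.0     # smooth-ish fake CBVs
    signal = 1.0 + 0.001 * np.sin(np.linspace(0, 20, n))         # fake astrophysical signal
    flux = signal + cbvs @ np.array([0.02, -0.01, 0.005, 0.0]) + 1e-4 * rng.standard_normal(n)
    corrected, c = detrend_with_cbvs(flux, cbvs)
    print("fitted CBV coefficients:", np.round(c, 4))
    ```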

  4. Adaption to Extreme Rainfall with Open Urban Drainage System: An Integrated Hydrological Cost-Benefit Analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Qianqian; Panduro, Toke Emil; Thorsen, Bo Jellesmark; Arnbjerg-Nielsen, Karsten

    2013-03-01

    This paper presents a cross-disciplinary framework for assessment of climate change adaptation to increased precipitation extremes considering pluvial flood risk as well as additional environmental services provided by some of the adaptation options. The ability of adaptation alternatives to cope with extreme rainfalls is evaluated using a quantitative flood risk approach based on urban inundation modeling and socio-economic analysis of corresponding costs and benefits. A hedonic valuation model is applied to capture the local economic gains or losses from more water bodies in green areas. The framework was applied to the northern part of the city of Aarhus, Denmark. We investigated four adaptation strategies that encompassed laissez-faire, larger sewer pipes, local infiltration units, and open drainage system in the urban green structure. We found that when taking into account environmental amenity effects, an integration of open drainage basins in urban recreational areas is likely the best adaptation strategy, followed by pipe enlargement and local infiltration strategies. All three were improvements compared to the fourth strategy of no measures taken.

  5. Adaption to extreme rainfall with open urban drainage system: an integrated hydrological cost-benefit analysis.

    PubMed

    Zhou, Qianqian; Panduro, Toke Emil; Thorsen, Bo Jellesmark; Arnbjerg-Nielsen, Karsten

    2013-03-01

    This paper presents a cross-disciplinary framework for assessment of climate change adaptation to increased precipitation extremes considering pluvial flood risk as well as additional environmental services provided by some of the adaptation options. The ability of adaptation alternatives to cope with extreme rainfalls is evaluated using a quantitative flood risk approach based on urban inundation modeling and socio-economic analysis of corresponding costs and benefits. A hedonic valuation model is applied to capture the local economic gains or losses from more water bodies in green areas. The framework was applied to the northern part of the city of Aarhus, Denmark. We investigated four adaptation strategies that encompassed laissez-faire, larger sewer pipes, local infiltration units, and open drainage system in the urban green structure. We found that when taking into account environmental amenity effects, an integration of open drainage basins in urban recreational areas is likely the best adaptation strategy, followed by pipe enlargement and local infiltration strategies. All three were improvements compared to the fourth strategy of no measures taken.
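
    The flood-risk side of such a cost-benefit framing typically reduces to an expected annual damage (EAD) integral over exceedance probability; the sketch below uses invented return periods and damages purely for illustration and ignores events outside the listed range:

    ```python
    import numpy as np

    return_periods = np.array([2.0, 10.0, 50.0, 100.0])   # years (hypothetical)
    damage_meur    = np.array([0.5, 4.0, 20.0, 35.0])     # damage per event, MEUR (hypothetical)

    exceed_prob = 1.0 / return_periods                    # annual exceedance probability
    order = np.argsort(exceed_prob)                       # integrate over increasing probability
    p, d = exceed_prob[order], damage_meur[order]
    ead = float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))   # trapezoidal rule
    print(f"expected annual damage ~ {ead:.2f} MEUR/yr")
    ```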

  6. The effects of central administration of physostigmine in two models of anxiety.

    PubMed

    Sienkiewicz-Jarosz, H; Maciejak, Piotr; Krzaścik, Paweł; Członkowska, Agnieszka I; Szyndler, Janusz; Bidziński, Andrzej; Kostowski, Wojciech; Płaźnik, Adam

    2003-05-01

    The effects of intracerebroventricular and intraseptal (medial septum) administration of a prototypical acetylcholinesterase inhibitor (AChE-I), physostigmine, and a classic benzodiazepine, midazolam, on behavior in the open field test of neophobia and in the conditioned fear test (freezing reaction) were examined in rats. In the open field test of neophobia, midazolam and physostigmine, over a limited dose range, increased rat exploratory activity after intracerebroventricular injection. Physostigmine additionally produced a hyperlocomotor effect. Following intraseptal injections, only physostigmine selectively prolonged the time spent by animals in the central sector of the open field. In the conditioned fear model, both midazolam and physostigmine inhibited the rat freezing reaction to the aversively conditioned context after intracerebroventricular, but not after intraseptal, pretrial drug administration. The presented data support the notion of selective anxiolytic-like effects of some AChE-Is. It appears, therefore, that the calming and sedative effects of AChE-Is observed in patients with Alzheimer's disease may be directly related to their anxiolytic action, independent of an improvement in cognitive functions, which in turn may decrease disorientation-induced distress and anxiety.

  7. Nd-isotopes in selected mantle-derived rocks and minerals and their implications for mantle evolution

    USGS Publications Warehouse

    Basu, A.R.; Tatsumoto, M.

    1980-01-01

    The Sm-Nd systematics in a variety of mantle-derived samples including kimberlites, alnoite, carbonatite, pyroxene and amphibole inclusions in alkali basalts and xenolithic eclogites, granulites and a pyroxene megacryst in kimberlites are reported. The additional data on kimberlites strengthen our earlier conclusion that kimberlites are derived from a relatively undifferentiated chondritic mantle source. This conclusion is based on the observation that the εNd values of most of the kimberlites are near zero. In contrast with the kimberlites, their garnet lherzolite inclusions show both time-averaged Nd enrichment and depletion with respect to Sm. Separated clinopyroxenes in eclogite xenoliths from the Roberts Victor kimberlite pipe show both positive and negative εNd values suggesting different genetic history. A whole rock lower crustal scapolite granulite xenolith from the Matsoku kimberlite pipe shows a negative εNd value of -4.2, possibly representative of the base of the crust in Lesotho. It appears that all inclusions, mafic and ultramafic, in kimberlites are unrelated to their kimberlite host. The above data and additional Sm-Nd data on xenoliths in alkali basalts, alpine peridotite and alnoite-carbonatites are used to construct a model for the upper 200 km of the earth's mantle - both oceanic and continental. The essential feature of this model is the increasing degree of fertility of the mantle with depth. The kimberlite's source at depths below 200 km in the subcontinental mantle is the most primitive in this model, and this primitive layer is also extended to the suboceanic mantle. However, it is clear from the Nd-isotopic data in the xenoliths of the continental kimberlites that above 200 km the continental mantle is distinctly different from their suboceanic counterpart. © 1980 Springer-Verlag.

  8. Neonatal infrared thermography imaging: Analysis of heat flux during different clinical scenarios

    NASA Astrophysics Data System (ADS)

    Abbas, Abbas K.; Heimann, Konrad; Blazek, Vladimir; Orlikowsky, Thorsten; Leonhardt, Steffen

    2012-11-01

    Introduction: An accurate skin temperature measurement of Neonatal Infrared Thermography (NIRT) imaging requires an appropriate calibration process for compensation of external effects (e.g. variation of environmental temperature, variable air velocity or humidity). Although modern infrared cameras can perform such calibration, an additional compensation is required for highly accurate thermography. This compensation, which corrects any temperature drift, should occur during the NIRT imaging process. We introduce a compensation technique which is based on modeling the physical interactions within the measurement scene and deriving the detected temperature signal of the object. Materials and methods: In this work such compensation was performed for different NIRT imaging applications in neonatology (e.g. convective incubators, kangaroo mother care (KMC), and an open radiant warmer). The spatially distributed temperatures of 12 preterm infants (average gestational age 31 weeks) were measured under these different infant care arrangements (i.e. a closed care system such as a convective incubator, and open care systems such as kangaroo mother care and an open radiant warmer). Results: As errors in measurement of temperature were anticipated, a novel compensation method derived from infrared thermography of the neonate's skin was developed. Moreover, the differences in temperature recording for the 12 preterm infants varied from subject to subject. This variation could arise from the individual experimental setting applied to the same region of interest over the neonate's body. The experimental results for the model-based corrections are verified over the selected patient group. Conclusion: The proposed technique relies on applying model-based correction to the measured temperature and reducing extraneous errors during NIRT. This application-specific method is based on the different heat flux compartments present in the neonatal thermography scene. Furthermore, these results are considered to be groundwork for further investigation, especially when using NIRT imaging arrangements with additional compensation settings together with reference temperature measurements.

  9. EASEE: an open architecture approach for modeling battlespace signal and sensor phenomenology

    NASA Astrophysics Data System (ADS)

    Waldrop, Lauren E.; Wilson, D. Keith; Ekegren, Michael T.; Borden, Christian T.

    2017-04-01

    Open architecture in the context of defense applications encourages collaboration across government agencies and academia. This paper describes a success story in the implementation of an open architecture framework that fosters transparency and modularity in the context of Environmental Awareness for Sensor and Emitter Employment (EASEE), a complex physics-based software package for modeling the effects of terrain and atmospheric conditions on signal propagation and sensor performance. Among the highlighted features in this paper are: (1) a code refactorization to separate sensitive parts of EASEE, thus allowing collaborators the opportunity to view and interact with non-sensitive parts of the EASEE framework with the end goal of supporting collaborative innovation, (2) a data exchange and validation effort to enable the dynamic addition of signatures within EASEE, thus supporting a modular notion that components can be easily added to or removed from the software without requiring recompilation by developers, and (3) a flexible and extensible XML interface, which aids in decoupling graphical user interfaces from EASEE's calculation engine, and thus encourages adaptability to many different defense applications. In addition to the outlined points above, this paper also addresses EASEE's ability to interface with proprietary systems such as ArcGIS. A specific use case regarding the implementation of an ArcGIS toolbar that leverages EASEE's XML interface and enables users to set up an EASEE-compliant configuration for probability of detection or optimal sensor placement calculations in various modalities is discussed as well.

  10. Optimal stomatal behavior with competition for water and risk of hydraulic impairment.

    PubMed

    Wolf, Adam; Anderegg, William R L; Pacala, Stephen W

    2016-11-15

    For over 40 y, the dominant theory of stomatal behavior has been that plants should open stomates until the carbon gained by an infinitesimal additional opening balances the additional water lost times a water price that is constant at least over short periods. This theory has persisted because of its remarkable success in explaining strongly supported simple empirical models of stomatal conductance, even though we have also known for over 40 y that the theory is not consistent with competition among plants for water. We develop an alternative theory in which plants maximize carbon gain without pricing water loss and also add two features to both this and the classical theory, which are strongly supported by empirical evidence: (i) water flow through xylem that is progressively impaired as xylem water potential drops and (ii) fitness or carbon costs associated with low water potentials caused by a variety of mechanisms, including xylem damage repair. We show that our alternative carbon-maximization optimization is consistent with plant competition because it yields an evolutionarily stable strategy (ESS): species with the ESS stomatal behavior will outcompete all others. We further show that, like the classical theory, the alternative theory also explains the functional forms of empirical stomatal models. We derive ways to test between the alternative optimization criteria by introducing a metric, the marginal xylem tension efficiency, which quantifies the amount of photosynthesis a plant will forego from opening stomata an infinitesimal amount more, in order to avoid a drop in water potential.
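
    The contrast between the two optimization criteria can be written schematically as follows (notation assumed here for illustration, not quoted from the paper):

    ```latex
    % Classical water-pricing criterion, with stomatal conductance g, assimilation A,
    % transpiration E, and a fixed marginal water price \lambda:
    \frac{\partial A}{\partial g} \;=\; \lambda\,\frac{\partial E}{\partial g}
      \qquad\Longleftrightarrow\qquad
      g^{*} \;=\; \arg\max_{g}\,\bigl[A(g) - \lambda E(g)\bigr]

    % Carbon-maximization alternative with an explicit hydraulic cost \Theta of the
    % xylem water potential \psi_{x}(g) implied by the chosen conductance:
    g^{*} \;=\; \arg\max_{g}\,\bigl[A(g) - \Theta\bigl(\psi_{x}(g)\bigr)\bigr]
    ```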

  11. EuroForMix: An open source software based on a continuous model to evaluate STR DNA profiles from a mixture of contributors with artefacts.

    PubMed

    Bleka, Øyvind; Storvik, Geir; Gill, Peter

    2016-03-01

    We have released a software package named EuroForMix to analyze STR DNA profiles in a user-friendly graphical user interface. The software implements a model to explain the allelic peak height on a continuous scale in order to carry out weight-of-evidence calculations for profiles which could be from a mixture of contributors. Through a properly parameterized model we are able to do inference on mixture proportions, the peak height properties, stutter proportion and degradation. In addition, EuroForMix includes models for allele drop-out, allele drop-in and sub-population structure. EuroForMix supports two inference approaches for likelihood ratio calculations. The first approach uses maximum likelihood estimation of the unknown parameters. The second approach is Bayesian and requires prior distributions to be specified for the parameters involved. The user may specify any number of known and unknown contributors in the model; however, we find that there is a practical computing time limit which restricts the model to a maximum of four unknown contributors. EuroForMix is the first freely available, open source continuous model (accommodating peak height, stutter, drop-in, drop-out, population substructure and degradation) to be reported in the literature. It therefore serves an important purpose as an unrestricted platform for comparing the different solutions that are available. The implementation of the continuous model used in the software showed close to identical results to the R package DNAmixtures, which requires a HUGIN Expert license to be used. An additional feature in EuroForMix is the ability for the user to adapt the Bayesian inference framework by incorporating their own prior information. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
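
    As a toy illustration of the maximum-likelihood approach to likelihood ratio calculation described above, the sketch below profiles out two nuisance parameters under each hypothesis and forms the ratio of the maximized likelihoods. The Gaussian peak-height "model", the peak heights and the expected contributions are invented placeholders; EuroForMix itself uses a considerably richer continuous model.

        # Toy maximum-likelihood LR sketch (not the EuroForMix model).
        import numpy as np
        from scipy.optimize import minimize

        peaks = np.array([1200.0, 800.0, 450.0, 300.0])       # observed peak heights (RFU), illustrative
        expected_hp = np.array([1.0, 1.0, 0.5, 0.5])          # relative contributions under Hp (illustrative)
        expected_hd = np.array([1.0, 0.5, 1.0, 0.5])          # relative contributions under Hd (illustrative)

        def neg_log_lik(params, expected):
            scale, sigma = params
            if scale <= 0 or sigma <= 0:
                return np.inf
            mu = scale * expected
            return 0.5 * np.sum(((peaks - mu) / sigma) ** 2) + peaks.size * np.log(sigma)

        def max_log_lik(expected):
            res = minimize(neg_log_lik, x0=[1000.0, 100.0], args=(expected,), method="Nelder-Mead")
            return -res.fun

        log_lr = max_log_lik(expected_hp) - max_log_lik(expected_hd)
        print("log10 LR ~", round(log_lr / np.log(10), 2))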

  12. Government Open Systems Interconnection Profile (GOSIP) transition strategy

    NASA Astrophysics Data System (ADS)

    Laxen, Mark R.

    1993-09-01

    This thesis analyzes the Government Open Systems Interconnection Profile (GOSIP) and the requirements of the Federal Information Processing Standard (FIPS) Publication 146-1. It begins by examining the International Organization for Standardization (ISO) Open Systems Interconnection (OSI) architecture and protocol suites and the distinctions between GOSIP versions one and two. Additionally, it explores some of the GOSIP protocol details and discusses the process by which standards organizations have developed their recommendations. Implementation considerations from both government and vendor perspectives illustrate the barriers and requirements faced by information systems managers, as well as basic transition strategies. The thesis concludes that the transition requires an extended and coordinated period of coexistence, owing to extensive legacy systems and the limited availability of GOSIP products. Recommendations for GOSIP protocol standards to include capabilities outside the OSI model are also presented.

  13. Integrated Multidisciplinary Optimization Objects

    NASA Technical Reports Server (NTRS)

    Alston, Katherine

    2014-01-01

    OpenMDAO is an open-source MDAO framework. It is used to develop an integrated analysis and design environment for engineering challenges. This Phase II project integrated additional modules and design tools into OpenMDAO to perform discipline-specific analysis across multiple flight regimes at varying levels of fidelity. It also showcased a refined system architecture that allows the system to be less customized to a specific configuration (i.e., system and configuration separation). By delivering a capable and validated MDAO system along with a set of example applications to be used as a template for future users, this work greatly expands NASA's high-fidelity, physics-based MDAO capabilities and enables the design of revolutionary vehicles in a cost-effective manner. This proposed work complements M4 Engineering's expertise in developing modeling and simulation toolsets that solve relevant subsonic, supersonic, and hypersonic demonstration applications.
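
    As a concrete picture of the kind of module integration OpenMDAO supports, the sketch below wraps a stand-in analysis (a simple paraboloid) as an OpenMDAO explicit component and drives it with a SciPy-based optimizer. This mirrors the standard introductory OpenMDAO pattern rather than any of the discipline modules developed in this project.

        # Minimal OpenMDAO component plus optimization driver (stand-in analysis function).
        import openmdao.api as om

        class Paraboloid(om.ExplicitComponent):
            """Simple analytic stand-in for a discipline analysis module."""
            def setup(self):
                self.add_input('x', val=0.0)
                self.add_input('y', val=0.0)
                self.add_output('f_xy', val=0.0)

            def setup_partials(self):
                # Finite-difference derivatives keep the example short.
                self.declare_partials('*', '*', method='fd')

            def compute(self, inputs, outputs):
                x, y = inputs['x'], inputs['y']
                outputs['f_xy'] = (x - 3.0) ** 2 + x * y + (y + 4.0) ** 2 - 3.0

        prob = om.Problem()
        prob.model.add_subsystem('parab', Paraboloid(), promotes=['*'])
        prob.driver = om.ScipyOptimizeDriver()
        prob.driver.options['optimizer'] = 'SLSQP'
        prob.model.add_design_var('x', lower=-50, upper=50)
        prob.model.add_design_var('y', lower=-50, upper=50)
        prob.model.add_objective('f_xy')
        prob.setup()
        prob.run_driver()
        print(prob.get_val('x'), prob.get_val('y'), prob.get_val('f_xy'))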

  14. Open innovation and external sources of innovation. An opportunity to fuel the R&D pipeline and enhance decision making?

    PubMed

    Schuhmacher, Alexander; Gassmann, Oliver; McCracken, Nigel; Hinder, Markus

    2018-05-08

    Historically, research and development (R&D) in the pharmaceutical sector has predominantly been an in-house activity. To enable investments for game-changing late-stage assets and to enable better and less costly go/no-go decisions, most companies have employed a fail-early paradigm through the implementation of clinical proof-of-concept organizations. To fuel their pipelines, some pioneers started to complement their internal R&D efforts through collaborations as early as the 1990s. In recent years, multiple extrinsic and intrinsic factors induced an opening towards external sources of innovation and resulted in new models for open innovation, such as open sourcing, crowdsourcing, public-private partnerships, innovation centres, and the virtualization of R&D. Three factors seem to determine the breadth and depth of how companies approach external innovation: (1) the company's legacy, (2) the company's willingness and ability to take risks and (3) the company's need to control IP and competitors. In addition, these factors often constitute the major hurdles to effectively leveraging external opportunities and assets. Conscious and differential choices of the R&D and business models for different companies, and for different divisions in the same company, seem to best allow a company to fully exploit the potential of both internal and external innovations.

  15. Dynamical gauge effects in an open quantum network

    NASA Astrophysics Data System (ADS)

    Zhao, Jianshi; Price, Craig; Liu, Qi; Gemelke, Nathan

    2016-05-01

    We describe new experimental techniques for simulation of high-energy field theories based on an analogy between open thermodynamic systems and effective dynamical gauge-fields following SU(2) × U(1) Yang-Mills models. By coupling near-resonant laser-modes to atoms moving in a disordered optical environment, we create an open system which exhibits a non-equilibrium phase transition between two steady-state behaviors, exhibiting scale-invariant behavior near the transition. By measuring transport of atoms through the disordered network, we observe two distinct scaling behaviors, corresponding to the classical and quantum limits for the dynamical gauge field. This behavior is loosely analogous to dynamical gauge effects in quantum chromodynamics, and can be mapped onto generalized open problems in the theoretical understanding of quantized non-Abelian gauge theories. Additionally, the scaling behavior can be understood from the geometric structure of the gauge potential and linked to the measure of information in the local disordered potential, reflecting an underlying holographic principle. We acknowledge support from NSF Award No. 1068570 and the Charles E. Kaufman Foundation.

  16. Development of a web application for water resources based on open source software

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.

    2014-01-01

    This article presents research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing geospatial data, (2) supporting water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, jQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.

  17. In Vivo Healing after Capsular Plication in an Ovine Shoulder Model

    PubMed Central

    Kelly, BT; Turner, AS; Bansal, M; Terry, M; Wolf, BR; Warren, RF; Altchek, DW; Allen, AA

    2005-01-01

    Traditionally, arthroscopic management of shoulder instability has been reserved for patients with isolated Bankart lesions without any capsular laxity or injury. To date, there are no animal studies evaluating the healing potential of capsular plication and/or capsulo-labral repair. The purpose of this in vivo animal study was to determine if the histological capsular healing of an open capsular plication simulating an arthroscopic plication is equivalent to the more traditional open capsular shift involving cutting and advancing the capsule. Twenty-six skeletally mature sheep were randomized to either an open capsular plication simulating arthroscopic plication (n=13), or an open traditional capsular shift (n=13). A sham operation (n=4) was also performed involving exposure to visualize the capsule. Normal non-operated control shoulders were also analyzed. A pathologist blinded to the treatment evaluated both hematoxylin and eosin (H&E) sections and polarized light microscopy. Qualitative scoring evaluated fibrosis, mucinous degeneration, fat necrosis, granuloma formation, vascularity, inflammatory infiltrate and hemosiderin (0 to 3 points). Both the capsular plication and open shift groups demonstrated healing by fibrosis at the site of surgical manipulation. There were no statistical differences in the capsular healing responses between the two groups with regard to fibrosis, granuloma formation and vascularity. The open shift group demonstrated significantly more mucinous degeneration (p=0.038). Fat necrosis was present in 4/13 specimens in the open shift group and none in the capsular plication specimens. Both groups demonstrated disorganized collagen formation under polarized light microscopy. There were no differences between non-operated control specimens and sham surgery specimens. Our findings support the hypothesis that histologic capsular healing is equivalent between the plication group and the open shift group. In addition, the open shift group demonstrated significantly more changes indicative of tissue injury. This basic science model confirms capsular healing after simulated arthroscopic plication, providing support for arthroscopic capsular plication in practice. PMID:16089080

  18. Open Source Tools for Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Powers, P.

    2010-12-01

    The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a "living" model and therefore requires robust, tested, and well-documented ETAS algorithms to ensure transparency and reproducibility. Likewise, as earthquake aftershock sequences unfold, real-time access to high-quality hypocenter data makes it possible to monitor the temporal variability of statistical properties such as the parameters of the Omori law and the Gutenberg-Richter b-value. Such statistical properties are valuable as they provide a measure of how much a particular sequence deviates from expected behavior and can be used when assigning probabilities of aftershock occurrence. To address these demands and provide public access to standard methods employed in statistical seismology, we present well-documented, open-source JavaScript and Java software libraries for the on- and off-line analysis of seismicity. The JavaScript classes facilitate web-based asynchronous access to earthquake catalog data and provide a framework for in-browser display, analysis, and manipulation of catalog statistics; implementations of this framework will be made available on the USGS Earthquake Hazards website. The Java classes, in addition to providing tools for seismicity analysis, provide tools for modeling seismicity and generating synthetic catalogs. These tools are extensible and will be released as part of the open-source OpenSHA Commons library.
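
    As a brief illustration of the catalog statistics mentioned above, the sketch below computes a maximum-likelihood Gutenberg-Richter b-value (the Aki/Utsu estimator) and evaluates a modified Omori law aftershock rate. The catalog is synthetic and the parameter values are illustrative; this is not code from the libraries described in the record.

        # b-value and Omori-law sketch with a synthetic catalog.
        import numpy as np

        def b_value(magnitudes, m_c, dm=0.1):
            """Aki/Utsu maximum-likelihood b-value for events at or above completeness m_c."""
            m = np.asarray(magnitudes)
            m = m[m >= m_c]
            return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

        def omori_rate(t_days, k, c, p):
            """Modified Omori law: aftershock rate n(t) = k / (c + t)**p."""
            return k / (c + t_days) ** p

        mags = np.random.default_rng(0).exponential(scale=1.0 / np.log(10), size=500) + 2.0
        print("b-value ~", round(b_value(mags, m_c=2.0), 2))
        print("aftershock rate at t = 1 day:", round(omori_rate(1.0, k=100.0, c=0.05, p=1.1), 1))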

  19. Quantifying Economic and Environmental Impacts of Transportation Network Disruptions with Dynamic Traffic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shekar, Venkateswaran; Fiondella, Lance; Chatterjee, Samrat

    Several transportation network vulnerability models have been proposed. However, most only consider disruptions as a static snapshot in time and the impact on total travel time. These approaches can consider neither the time-varying nature of travel demand nor other undesirable outcomes that follow from transportation network disruptions. This paper proposes an algorithmic approach to assessing the vulnerability of a transportation network that considers time-varying demand using an open source dynamic transportation simulation tool. The open source nature of the tool allows us to systematically consider many disruption scenarios and quantitatively compare their relative criticality. This is far more efficient than traditional approaches, which would require days or weeks of a transportation engineer's time to manually set up, run, and assess these simulations. In addition to travel time, we also collect statistics on additional fuel consumed and the corresponding carbon dioxide emissions. Our approach thus provides a more systematic assessment that is both time-varying and able to capture additional negative consequences of disruptions for decision makers to evaluate.
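
    The sketch below illustrates the kind of post-processing described above: comparing total delay, additional fuel consumed and the corresponding carbon dioxide emissions across disruption scenarios. The scenario numbers and the per-litre emission factor (roughly 2.3 kg CO2 per litre of gasoline) are illustrative assumptions, not outputs of the authors' simulations.

        # Scenario comparison sketch with illustrative numbers.
        CO2_PER_LITRE = 2.31  # approximate kg CO2 per litre of gasoline (assumed factor)

        scenarios = {
            "baseline":        {"vehicle_hours": 10500.0, "fuel_litres": 42000.0},
            "bridge_closure":  {"vehicle_hours": 13800.0, "fuel_litres": 51500.0},
            "arterial_outage": {"vehicle_hours": 12100.0, "fuel_litres": 47200.0},
        }

        base = scenarios["baseline"]
        for name, s in scenarios.items():
            extra_delay = s["vehicle_hours"] - base["vehicle_hours"]
            extra_fuel = s["fuel_litres"] - base["fuel_litres"]
            extra_co2 = extra_fuel * CO2_PER_LITRE
            print(f"{name:15s} +{extra_delay:7.0f} veh-h  +{extra_fuel:7.0f} L fuel  +{extra_co2:8.0f} kg CO2")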

  20. An Online Prediction Platform to Support the Environmental ...

    EPA Pesticide Factsheets

    Historical QSAR models are currently utilized across a broad range of applications within the U.S. Environmental Protection Agency (EPA). These models predict basic physicochemical properties (e.g., logP, aqueous solubility, vapor pressure), which are then incorporated into exposure, fate and transport models. Whereas the classical manner of publishing results in peer-reviewed journals remains appropriate, there are substantial benefits to be gained by providing enhanced, open access to the training data sets and resulting models. Benefits include improved transparency, more flexibility to expand training sets and improve model algorithms, and greater ability to independently characterize model performance both globally and in local areas of chemistry. We have developed a web-based prediction platform that uses open-source descriptors and modeling algorithms, employs modern cheminformatics technologies, and is tailored for ease of use by the toxicology and environmental regulatory community. This tool also provides web services to support both EPA projects and the modeling community at large. The platform hosts models developed within EPA's National Center for Computational Toxicology, as well as those developed by other EPA scientists and the outside scientific community. Recognizing that there are other on-line QSAR model platforms currently available which have additional capabilities, we connect to such services, where possible, to produce an integrated

  1. Teaching New Keynesian Open Economy Macroeconomics at the Intermediate Level

    ERIC Educational Resources Information Center

    Bofinger, Peter; Mayer, Eric; Wollmershauser, Timo

    2009-01-01

    For the open economy, the workhorse model in intermediate textbooks still is the Mundell-Fleming model, which basically extends the investment and savings, liquidity preference and money supply (IS-LM) model to open economy problems. The authors present a simple New Keynesian model of the open economy that introduces open economy considerations…

  2. Single-channel kinetics of BK (Slo1) channels

    PubMed Central

    Geng, Yanyan; Magleby, Karl L.

    2014-01-01

    Single-channel kinetics has proven a powerful tool to reveal information about the gating mechanisms that control the opening and closing of ion channels. This introductory review focuses on the gating of large conductance Ca2+- and voltage-activated K+ (BK or Slo1) channels at the single-channel level. It starts with single-channel current records and progresses to presentation and analysis of single-channel data and the development of gating mechanisms in terms of discrete state Markov (DSM) models. The DSM models are formulated in terms of the tetrameric modular structure of BK channels, consisting of a central transmembrane pore-gate domain (PGD) attached to four surrounding transmembrane voltage sensing domains (VSD) and a large intracellular cytosolic domain (CTD), also referred to as the gating ring. The modular structure and data analysis show that the Ca2+- and voltage-dependent gating, considered separately, can each be approximated by 10-state two-tiered models with five closed states on the upper tier and five open states on the lower tier. The modular structure and joint Ca2+- and voltage-dependent gating are consistent with a 50-state two-tiered model with 25 closed states on the upper tier and 25 open states on the lower tier. Adding an additional tier of brief closed (flicker) states to the 10-state or 50-state models improved the description of the gating. For fixed experimental conditions a channel would gate in only a subset of the potential number of states. The detected number of states and the correlations between adjacent interval durations are consistent with the tiered models. The examined models can account for the single-channel kinetics and the bursting behavior of gating. Ca2+ and voltage activate BK channels by predominantly increasing the effective opening rate of the channel with a smaller decrease in the effective closing rate. Ca2+ and depolarization thus activate by mainly destabilizing the closed states. PMID:25653620
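
    To make the two-tiered DSM idea concrete, the sketch below builds a deliberately tiny four-state scheme (two closed states on an upper tier, two open states on a lower tier) in which Ca2+ binding moves the channel along a tier and depolarization increases the opening rate, then solves for the steady-state open probability. The rate constants are invented for illustration and the scheme is far smaller than the 10- and 50-state models discussed in the record.

        # Minimal two-tiered Markov gating sketch (illustrative rate constants).
        import numpy as np

        def open_probability(ca_molar, v_mV):
            kon = 1e8 * ca_molar                  # Ca2+ binding rate (s^-1), assuming 1e8 M^-1 s^-1
            koff = 1000.0                         # Ca2+ unbinding from closed states (s^-1)
            alpha = 20.0 * np.exp(v_mV / 50.0)    # opening rate, voltage dependent (s^-1)
            beta = 200.0 * np.exp(-v_mV / 100.0)  # closing rate (s^-1)
            # States: 0 = C, 1 = C.Ca (closed/upper tier), 2 = O, 3 = O.Ca (open/lower tier).
            # Bound Ca2+ speeds opening (5x); Ca2+ unbinds 5x slower from open states (detailed balance).
            Q = np.array([
                [-(kon + alpha),           kon,            alpha,                0.0],
                [koff,          -(koff + 5 * alpha),         0.0,          5 * alpha],
                [beta,                     0.0,     -(beta + kon),                kon],
                [0.0,                      beta,        koff / 5, -(beta + koff / 5)],
            ])
            # Steady state: p Q = 0 with sum(p) = 1.
            A = np.vstack([Q.T, np.ones(4)])
            b = np.array([0.0, 0.0, 0.0, 0.0, 1.0])
            p, *_ = np.linalg.lstsq(A, b, rcond=None)
            return p[2] + p[3]  # occupancy of the open tier

        for ca in (1e-7, 1e-5):  # 0.1 and 10 uM Ca2+
            print(f"Ca = {ca:.0e} M: Po(0 mV) = {open_probability(ca, 0):.3f}, "
                  f"Po(+80 mV) = {open_probability(ca, 80):.3f}")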

  3. Results of tests performed on the Acoustic Quiet Flow Facility Three-Dimensional Model Tunnel: Report on the Modified D.S.M.A. Design

    NASA Technical Reports Server (NTRS)

    Barna, P. S.

    1996-01-01

    Numerous tests were performed on the original Acoustic Quiet Flow Facility Three-Dimensional Model Tunnel, scaled down from the full-scale plans. Results of tests performed on the original scale model tunnel were reported in April 1995 and clearly showed that this model was lacking in performance. Subsequently, the scale model was modified in an attempt to improve the tunnel performance. The modifications included: (a) a redesigned diffuser; (b) addition of a collector; (c) addition of a nozzle-diffuser; (d) changes in the location of vent air. Tests performed on the modified tunnel showed a marked improvement in performance, with pressure recovery in the diffuser increasing nominally from 34 percent to 54 percent. The results obtained in these tests have wider application: they may also be applied to other tunnels operating with an open test section, even when their geometry differs from that of the model under consideration.

  4. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes, including hydrological processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines and developed by different teams. In order to support collaborative work involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture that makes it possible to i) couple models developed de novo or from existing source code, which are dynamically plugged into the platform, ii) represent landscapes as hierarchical graphs, taking into account multiple scales, spatial heterogeneities and the connectivity of landscape objects, and iii) run and explore simulations in many ways: using the OpenFLUID user interfaces (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, which is based on an open source toolchain including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing existing models licensed under any license to be plugged in. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers. OpenFLUID has been involved in many research applications, such as modelling of hydrological network transfer, diagnosis and prediction of water quality taking into account human activities, study of the effect of spatial organization on hydrological fluxes, modelling of surface-subsurface water exchanges, … At the LISAH research unit, OpenFLUID is the supporting development platform of the MHYDAS model, a distributed model for agrosystems (Moussa et al., 2002, Hydrological Processes, 16, 393-412). OpenFLUID web site: http://www.openfluid-project.org

  5. Spectral-Element Seismic Wave Propagation Codes for both Forward Modeling in Complex Media and Adjoint Tomography

    NASA Astrophysics Data System (ADS)

    Smith, J. A.; Peter, D. B.; Tromp, J.; Komatitsch, D.; Lefebvre, M. P.

    2015-12-01

    We present the SPECFEM3D_Cartesian and SPECFEM3D_GLOBE open-source codes, high-performance numerical wave solvers that simulate seismic wave propagation for local-, regional-, and global-scale applications. These codes are suitable for both forward propagation in complex media and tomographic imaging. Both solvers compute highly accurate seismic wave fields using the continuous Galerkin spectral-element method on unstructured meshes. Lateral variations in compressional- and shear-wave speeds and density, as well as 3D attenuation Q models, topography and fluid-solid coupling, are all readily included in both codes. For global simulations, effects due to rotation, ellipticity, the oceans, 3D crustal models, and self-gravitation are additionally included. Both packages provide forward and adjoint functionality suitable for adjoint tomography on high-performance computing architectures. We highlight the most recent release of the global version, which includes improved performance, simultaneous MPI runs, OpenCL and CUDA support via an automatic source-to-source transformation library (BOAST), parallel I/O readers and writers for databases using ADIOS, and seismograms using the recently developed Adaptable Seismic Data Format (ASDF) with built-in provenance. This makes our spectral-element solvers current state-of-the-art, open-source community codes for high-performance seismic wave propagation on arbitrarily complex 3D models. Together with these solvers, we provide full-waveform inversion tools to image the Earth's interior at unprecedented resolution.

  6. Calcium Carbonate Dissolution Above the Lysocline: Implications of Copepod Grazing on Coccolithophores

    NASA Astrophysics Data System (ADS)

    White, M. M.; Waller, J. D.; Lubelczyk, L.; Drapeau, D.; Bowler, B.; Wyeth, A.; Fields, D.; Balch, W. M.

    2016-02-01

    Copepod-coccolithophore predator-prey interactions are of great importance because they facilitate the export of particulate inorganic and organic carbon (PIC and POC) from the surface ocean. Coccolith dissolution in acidic copepod guts has been proposed as a possible explanation for the paradox of PIC dissolution above the lysocline, but warrants further investigation. Using a new application of the 14C-microdiffusion technique, we investigated the dissolution of coccoliths in copepod guts. We considered both an estuarine predator-prey model (Acartia tonsa and Pleurochrysis carterae) and an open ocean predator-prey model (Calanus finmarchicus and Emiliania huxleyi). Additionally, we considered the impacts of pCO2 on this process to advance our understanding of the effects of ocean acidification on trophic interactions. In the estuarine predator-prey model, fecal pellets produced immediately after previously-starved copepods grazed on P. carterae had PIC/POC ratios 27-40% lower than that of the algae, indicating PIC dissolution within the copepod gut, with no impact of pCO2 on this dissolution. Subsequent fecal pellets showed increasing PIC/POC, suggesting that calcite dissolution decreases as the gut fills. The open ocean predator-prey model showed equivocal results, indicating high variability in individual grazing behavior and therefore no consistent impact of copepod grazing on coccolith dissolution above the lysocline in the open ocean. We will further discuss the effects of fecal pellet PIC/POC ratios on sinking rate.

  7. An Open Computing Infrastructure that Facilitates Integrated Product and Process Development from a Decision-Based Perspective

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.

    1996-01-01

    Computer applications for design have evolved rapidly over the past several decades, and significant payoffs are being achieved by organizations through reductions in design cycle times. These applications are overwhelmed by the requirements imposed during complex, open engineering systems design. Organizations are faced with a number of different methodologies, numerous legacy disciplinary tools, and a very large amount of data. Yet they are also faced with few interdisciplinary tools for design collaboration or methods for achieving the revolutionary product designs required to maintain a competitive advantage in the future. These organizations are looking for a software infrastructure that integrates current corporate design practices with newer simulation and solution techniques. Such an infrastructure must be robust to changes in both corporate needs and enabling technologies. In addition, this infrastructure must be user-friendly, modular and scalable. This need is the motivation for the research described in this dissertation. The research is focused on the development of an open computing infrastructure that facilitates product and process design. In addition, this research explicitly deals with human interactions during design through a model that focuses on the role of a designer as that of decision-maker. The research perspective here is taken from that of design as a discipline with a focus on Decision-Based Design, Theory of Languages, Information Science, and Integration Technology. Given this background, a Model of IPPD is developed and implemented along the lines of a traditional experimental procedure, with the steps of establishing context, formalizing a theory, building an apparatus, conducting an experiment, reviewing results, and providing recommendations. Based on this Model, Design Processes and Specification can be explored in a structured and implementable architecture. An architecture for exploring design called DREAMS (Developing Robust Engineering Analysis Models and Specifications) has been developed which supports the activities of both meta-design and actual design execution. This is accomplished through a systematic process which comprises the stages of Formulation, Translation, and Evaluation. During this process, elements from a Design Specification are integrated into Design Processes. In addition, a software infrastructure was developed and is called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment). This represents a virtual apparatus in the Design Experiment conducted in this research. IMAGE is an innovative architecture because it explicitly supports design-related activities. This is accomplished through a GUI-driven and agent-based implementation of DREAMS. An HSCT design has been adopted from the Framework for Interdisciplinary Design Optimization (FIDO) and is implemented in IMAGE. This problem shows how Design Processes and Specification interact in a design system. In addition, the problem utilizes two different solution models concurrently: optimal and satisfying. The satisfying model allows for more design flexibility and allows a designer to maintain design freedom. As a result of following this experimental procedure, this infrastructure is an open system that is robust to changes in both corporate needs and computer technologies.
The development of this infrastructure leads to a number of significant intellectual contributions: 1) A new approach to implementing IPPD with the aid of a computer; 2) A formal Design Experiment; 3) A combined Process and Specification architecture that is language-based; 4) An infrastructure for exploring design; 5) An integration strategy for implementing computer resources; and 6) A seamless modeling language. The need for these contributions is emphasized by the demand by industry and government agencies for the development of these technologies.

  8. Efficient design of CMOS TSC checkers

    NASA Technical Reports Server (NTRS)

    Biddappa, Anita; Shamanna, Manjunath K.; Maki, Gary; Whitaker, Sterling

    1990-01-01

    This paper considers the design of an efficient, robustly testable, CMOS Totally Self-Checking (TSC) Checker for k-out-of-2k codes. Most existing implementations use primitive gates and assume the single stuck-at fault model. The self-testing property has been found to fail for CMOS TSC checkers under the stuck-open fault model due to timing skews and arbitrary delays in the circuit. A new four level design using CMOS primitive gates (NAND, NOR, INVERTERS) is presented. This design retains its properties under the stuck-open fault model. Additionally, this method offers an impressive reduction (greater than 70 percent) in gate count, gate inputs, and test set size when compared to the existing method. This implementation is easily realizable and is based on Anderson's technique. A thorough comparative study has been made on the proposed implementation and Kundu's implementation and the results indicate that the proposed one is better than Kundu's in all respects for k-out-of-2k codes.

  9. Comparison between two photovoltaic module models based on transistors

    NASA Astrophysics Data System (ADS)

    Saint-Eve, Frédéric; Sawicki, Jean-Paul; Petit, Pierre; Maufay, Fabrice; Aillerie, Michel

    2018-05-01

    The main objective of this paper is to verify the possibility of reducing the behavior simulation of an un-shaded photovoltaic (PV) module to a simple electronic circuit with very few components. In particular, two models based on well-tried elementary structures are analyzed: the Darlington structure in the first model and voltage regulation with a programmable Zener diode in the second. Specifications extracted from the behavior of a real I-V characteristic of a panel are considered and the principal electrical variables are deduced. The two models are expected to match the open circuit voltage, the maximum power point (MPP) and the short circuit current, while also providing realistic current slopes on both sides of the MPP. Robustness when irradiance varies is considered as an additional fundamental property. For both models, two simulations are carried out to identify the influence of selected parameters. In the first model, a parameter allowing adjustment of the current slope on the left side of the MPP proves to be also important for the calculation of the open circuit voltage. Moreover, this model does not allow a complete adjustment of the I-V characteristic, and the MPP moves significantly away from the real value when irradiance increases. On the contrary, the second model appears to have only advantages: the open circuit voltage is easy to calculate, the current slopes are realistic, and its robustness appears good when irradiance variations are simulated by adjusting the short circuit current of the PV module. We have shown that these two simplified models are expected to make reliable and easier simulations of complex PV architectures integrating many different devices, such as PV modules or other renewable energy sources and storage capacities coupled in parallel.

  10. Stiffness degradation-based damage model for RC members and structures using fiber-beam elements

    NASA Astrophysics Data System (ADS)

    Guo, Zongming; Zhang, Yaoting; Lu, Jiezhi; Fan, Jian

    2016-12-01

    To meet the demand for an accurate and highly efficient damage model with a distinct physical meaning for performance-based earthquake engineering applications, a stiffness degradation-based damage model for reinforced concrete (RC) members and structures was developed using fiber beam-column elements. In this model, damage indices for concrete and steel fibers were defined by the degradation of the initial reloading modulus and the low-cycle fatigue law. Section, member, story and structure damage were then evaluated by the degradation of the sectional bending stiffness, rod-end bending stiffness, story lateral stiffness and structure lateral stiffness, respectively. The damage model was realized in MATLAB by reading in the outputs of OpenSees. The application of the damage model to RC columns and an RC frame indicates that it is capable of accurately predicting the magnitude, position, and evolutionary process of damage, and of estimating story damage more precisely than inter-story drift does. Additionally, the damage model establishes a close connection between damage indices at various levels without introducing weighting coefficients or force-displacement relationships. The model enhances the damage assessment capabilities of OpenSees, laying a solid foundation for damage estimation at various levels of a large-scale structure subjected to seismic loading.
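
    The core of the stiffness-degradation indexing described above reduces to comparing a current stiffness with its initial, undamaged value; the sketch below shows that idea at the story and structure levels. The index definition D = 1 - K/K0 follows the general stiffness-degradation concept, but the stiffness values and the simple aggregation used here are illustrative placeholders, not the authors' calibrated scheme.

        # Stiffness-degradation damage index sketch (illustrative values).
        def damage_index(k_current, k_initial):
            """Damage index from stiffness degradation, D = 1 - K/K0, clipped to [0, 1]."""
            return min(max(1.0 - k_current / k_initial, 0.0), 1.0)

        # Story lateral stiffness (kN/mm) before and after a seismic record (assumed numbers).
        initial = {"story1": 850.0, "story2": 780.0, "story3": 690.0}
        final   = {"story1": 510.0, "story2": 650.0, "story3": 665.0}

        story_damage = {s: damage_index(final[s], initial[s]) for s in initial}
        structure_damage = damage_index(sum(final.values()), sum(initial.values()))

        for story, d in story_damage.items():
            print(f"{story}: D = {d:.2f}")
        print(f"structure: D = {structure_damage:.2f}")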

  11. Measuring and modelling the impact of the bark beetle forest disturbance on snow accumulation and ablation at a plot scale

    NASA Astrophysics Data System (ADS)

    Jenicek, Michal; Matejka, Ondrej; Hotovy, Ondrej

    2017-04-01

    The knowledge of the water volume stored in the snowpack and its spatial distribution is important for predicting snowmelt runoff. The objective of this study was to quantify the role of different forest structures on the snowpack distribution at a plot scale during the snow accumulation and snow ablation periods. Special attention was paid to the role of forest affected by the bark beetle (Ips typographus). We performed repeated detailed manual field surveys at selected mountain plots with different canopy structures, located at the same elevation and without the influence of topography and wind on the snow distribution. The forest canopy structure was described using parameters calculated from hemispherical photographs, such as canopy closure, leaf area index (LAI) and potential irradiance. Additionally, we used shortwave radiation measured with CNR4 net radiometers placed in plots with different canopy structures. Two snow accumulation and ablation models were set up to simulate the snow water equivalent (SWE) in plots with different vegetation cover. The first model was physically based, using the energy balance approach; the second was conceptual, based on the degree-day approach. Both models accounted for snow interception in different forest types using LAI as a parameter. The measured SWE in the plot with healthy forest was on average 41% lower than in the open area during the snow accumulation period. The disturbed forest reduced SWE by 22% compared to the open area, indicating increased snow storage after forest defoliation. Snow ablation in the healthy forest was 32% slower compared to the open area. In contrast, snow ablation in the forest disturbed by the bark beetle was on average only 7% slower than in the open area. The relative decrease in incoming solar radiation in the forest compared to the open area was much larger than the relative decrease in snowmelt rates, indicating that the decrease in snowmelt rates cannot be explained by the decrease in incoming solar radiation alone. Both models reproduced the observations sufficiently well, with slightly more accurate simulations in the open area than in the healthy forest; this was expected, since both models were calibrated to fit the observations. However, the energy balance approach simulated snowmelt in the forest environment more accurately, since it also accounts for longwave radiation, which can strongly influence snowmelt in forested plots. Both models showed faster snowmelt after forest defoliation, which also resulted in earlier snow melt-out in the disturbed forest compared to the healthy coniferous forest.
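
    The conceptual model referred to above is based on the degree-day approach; the sketch below shows a generic form of that approach with a simple LAI-based reduction for canopy shading and interception. The parameter values and the exact form of the canopy corrections are illustrative assumptions, not the calibrated model from the study.

        # Generic degree-day snowmelt and canopy interception sketch (illustrative parameters).
        def degree_day_melt(temp_c, ddf=3.0, t_base=0.0, lai=0.0, shading=0.08):
            """Daily melt (mm w.e.); the degree-day factor is reduced under denser canopy."""
            ddf_eff = ddf * max(1.0 - shading * lai, 0.3)
            return ddf_eff * max(temp_c - t_base, 0.0)

        def throughfall(snowfall_mm, lai, k=0.1):
            """Snowfall reaching the ground after a simple LAI-based interception loss."""
            return snowfall_mm * max(1.0 - k * lai, 0.0)

        # Compare an open plot (LAI = 0), a disturbed stand (LAI = 1.5) and a healthy stand (LAI = 4).
        for label, lai in [("open", 0.0), ("disturbed", 1.5), ("healthy", 4.0)]:
            ground_snow = throughfall(10.0, lai)          # 10 mm snowfall event
            melt = degree_day_melt(temp_c=3.0, lai=lai)   # melt on a +3 C day
            print(f"{label:10s} throughfall = {ground_snow:4.1f} mm   melt = {melt:4.1f} mm/day")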

  12. The Effect of K and Acidity of NiW-Loaded HY Zeolite Catalyst for Selective Ring Opening of 1-Methylnaphthalene.

    PubMed

    Lee, You-Jin; Kim, Eun-Sang; Kim, Jeong-Rang; Kim, Joo-Wan; Kim, Tae-Wan; Chae, Ho-Jeong; Kim, Chul-Ung; Lee, Chang-Ha; Jeong, Soon-Yong

    2016-05-01

    Bi-functional catalysts were prepared using HY zeolites with various SiO2/Al2O3 ratios for the acidic function, NiW for the metallic function, and K for acidity control. 1-Methylnaphthalene was selected as a model compound for multi-ring aromatics in heavy oil, and its selective ring opening reaction was investigated over the prepared bi-functional catalysts with different levels of acidity in a fixed bed reactor system. In NiW/HY catalysts without K addition, the acidity decreased with increasing SiO2/Al2O3 mole ratio of the HY zeolite. The Ni1.1W1.1/HY(12) catalyst showed the highest acidity but slightly lower yields for selective ring opening than the Ni1.1W1.1/HY(30) catalyst. The acidity of the catalyst appears to play an important role as the active site for the selective ring opening of 1-methylnaphthalene, but there should be an optimum catalyst acidity for the reaction. The catalyst acidity could be controlled between those of Ni1.1W1.1/HY(12) and Ni1.1W1.1/HY(30) by adding a moderate amount of K to the Ni1.1W1.1/HY(12) catalyst. The K0.3Ni1.1W1.1/HY(12) catalyst should have the optimum acidity for the selective ring opening. The addition of a moderate amount of K to the NiW/HY catalyst thus improves the catalytic performance by optimizing the catalyst acidity.

  13. Progress on the FabrIc for Frontier Experiments project at Fermilab

    DOE PAGES

    Box, Dennis; Boyd, Joseph; Dykstra, Dave; ...

    2015-12-23

    The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating several services into experiment computing operations, including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. Progress with current experiments and plans for expansion to additional projects are discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, has aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  14. 76 FR 2931 - Self-Regulatory Organizations; Notice of Filing and Immediate Effectiveness of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-18

    ... open additional series of stock options or ETF options under certain circumstances. The proposed change... series for each class of stock options or ETF options open for trading on the Exchange; and that the Exchange may open additional series of stock options or ETF options under certain circumstances. The...

  15. Apache Open Climate Workbench: Building Open Source Climate Science Tools and Community at the Apache Software Foundation

    NASA Astrophysics Data System (ADS)

    Joyce, M.; Ramirez, P.; Boustani, M.; Mattmann, C. A.; Khudikyan, S.; McGibbney, L. J.; Whitehall, K. D.

    2014-12-01

    Apache Open Climate Workbench (OCW; https://climate.apache.org/) is a Top-Level Project at the Apache Software Foundation that aims to provide a suite of tools for performing climate science evaluations using model outputs from a multitude of different sources (ESGF, CORDEX, U.S. NCA, NARCCAP) with remote sensing data from NASA, NOAA, and other agencies. Apache OCW is the second NASA project to become a Top-Level Project at the Apache Software Foundation. It grew out of the Jet Propulsion Laboratory's (JPL) Regional Climate Model Evaluation System (RCMES) project, a collaboration between JPL and the University of California, Los Angeles' Joint Institute for Regional Earth System Science and Engineering (JIFRESSE). Apache OCW provides scientists and developers with tools for data manipulation, metrics for dataset comparisons, and a visualization suite. In addition to a powerful low-level API, Apache OCW also supports a web application for quick, browser-controlled evaluations, a command line application for local evaluations, and a virtual machine for isolated experimentation with minimal setup. This talk will look at the difficulties and successes of moving a closed community research project out into the wild world of open source. We'll explore the growing pains Apache OCW went through to become a Top-Level Project at the Apache Software Foundation as well as the benefits gained by opening up development to the broader climate and computer science communities.

  16. Computational Study of Oxidation of Guanine by Singlet Oxygen (1 Δg ) and Formation of Guanine:Lysine Cross-Links.

    PubMed

    Thapa, Bishnu; Munk, Barbara H; Burrows, Cynthia J; Schlegel, H Bernhard

    2017-04-27

    Oxidation of guanine in the presence of lysine can lead to guanine-lysine cross-links. The ratio of the C4, C5 and C8 cross-links depends on the manner of oxidation. Type II photosensitizers such as Rose Bengal and methylene blue can generate singlet oxygen, which leads to a different ratio of products than oxidation by type I photosensitizers or by one-electron oxidants. Modeling reactions of singlet oxygen can be quite challenging. The reactions have been explored using CASSCF, NEVPT2, DFT, CCSD(T), and BD(T) calculations with SMD implicit solvation. The spin contamination in open-shell calculations was corrected by Yamaguchi's approximate spin projection method. The addition of singlet oxygen to guanine to form a guanine endoperoxide proceeds stepwise via a zwitterionic peroxyl intermediate. The subsequent barrier for ring closure is smaller than the initial barrier for singlet oxygen addition. Ring opening of the endoperoxide by protonation at C4-O is followed by loss of a proton from C8 and dehydration to produce 8-oxoG(ox). The addition of lysine (modelled by methylamine) or water across the C5=N7 double bond of 8-oxoG(ox) is followed by acyl migration to form the final spiro products. The barrier for methylamine addition is significantly lower than that for water addition and should be the dominant reaction channel. These results are in good agreement with the experimental results for the formation of guanine-lysine cross-links upon oxidation by type II photosensitizers. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. OpenDMAP: An open source, ontology-driven concept analysis engine, with applications to capturing knowledge regarding protein transport, protein interactions and cell-type-specific gene expression

    PubMed Central

    Hunter, Lawrence; Lu, Zhiyong; Firby, James; Baumgartner, William A; Johnson, Helen L; Ogren, Philip V; Cohen, K Bretonnel

    2008-01-01

    Background Information extraction (IE) efforts are widely acknowledged to be important in harnessing the rapid advance of biomedical knowledge, particularly in areas where important factual information is published in a diverse literature. Here we report on the design, implementation and several evaluations of OpenDMAP, an ontology-driven, integrated concept analysis system. It significantly advances the state of the art in information extraction by leveraging knowledge in ontological resources, integrating diverse text processing applications, and using an expanded pattern language that allows the mixing of syntactic and semantic elements and variable ordering. Results OpenDMAP information extraction systems were produced for extracting protein transport assertions (transport), protein-protein interaction assertions (interaction) and assertions that a gene is expressed in a cell type (expression). Evaluations were performed on each system, resulting in F-scores ranging from .26 – .72 (precision .39 – .85, recall .16 – .85). Additionally, each of these systems was run over all abstracts in MEDLINE, producing a total of 72,460 transport instances, 265,795 interaction instances and 176,153 expression instances. Conclusion OpenDMAP advances the performance standards for extracting protein-protein interaction predications from the full texts of biomedical research articles. Furthermore, this level of performance appears to generalize to other information extraction tasks, including extracting information about predicates of more than two arguments. The output of the information extraction system is always constructed from elements of an ontology, ensuring that the knowledge representation is grounded with respect to a carefully constructed model of reality. The results of these efforts can be used to increase the efficiency of manual curation efforts and to provide additional features in systems that integrate multiple sources for information extraction. The open source OpenDMAP code library is freely available at PMID:18237434

  18. Marine infectious disease ecology

    USGS Publications Warehouse

    Lafferty, Kevin D.

    2017-01-01

    To put marine disease impacts in context requires a broad perspective on the roles infectious agents have in the ocean. Parasites infect most marine vertebrate and invertebrate species, and parasites and predators can have comparable biomass density, suggesting they play comparable parts as consumers in marine food webs. Although some parasites might increase with disturbance, most probably decline as food webs unravel. There are several ways to adapt epidemiological theory to the marine environment. In particular, because the ocean represents a three-dimensional moving habitat for hosts and parasites, models should open up the spatial scales at which infective stages and host larvae travel. In addition to open recruitment and dimensionality, marine parasites are subject to fishing, filter feeders, dose-dependent infection, environmental forcing, and death-based transmission. Adding such considerations to marine disease models will make it easier to predict which infectious diseases will increase or decrease in a changing ocean.

  19. Flow visualization methods for field test verification of CFD analysis of an open gloveport

    DOE PAGES

    Strons, Philip; Bailey, James L.

    2017-01-01

    Anemometer readings alone cannot provide a complete picture of air flow patterns at an open gloveport. Having a means to visualize air flow for field tests in general provides greater insight by indicating direction in addition to the magnitude of the air flow velocities in the region of interest. Furthermore, flow visualization is essential for Computational Fluid Dynamics (CFD) verification, where important modeling assumptions play a significant role in analyzing the chaotic nature of low-velocity air flow. A good example is shown in Figure 1, where an unexpected vortex pattern occurred during a field test that could not have been measured relying only on anemometer readings. Here, observing and measuring the patterns of the smoke flowing into the gloveport allowed the CFD model to be appropriately updated to match the actual flow velocities in both magnitude and direction.

  20. Development of Virtual Blade Model for Modelling Helicopter Rotor Downwash in OpenFOAM

    DTIC Science & Technology

    2013-12-01

    This report describes the development of a virtual blade model for simulating helicopter rotor downwash in OpenFOAM, authored by Stefano Wahono (Aerospace Division). The model predictions were compared against data from the Georgia Institute of Technology, and the OpenFOAM predicted result was also shown to compare favourably with ANSYS Fluent predictions.

  1. Open innovation: Towards sharing of data, models and workflows.

    PubMed

    Conrado, Daniela J; Karlsson, Mats O; Romero, Klaus; Sarr, Céline; Wilkins, Justin J

    2017-11-15

    Sharing of resources across organisations to support open innovation is an old idea, but one that is being taken up by the scientific community at increasing speed, particularly with regard to public sharing. The ability to address new questions, or to provide more precise answers to old questions, through merged information is among the attractive features of sharing. Increased efficiency through reuse, and increased reliability of scientific findings through enhanced transparency, are expected outcomes from sharing. In the field of pharmacometrics, efforts to publicly share data, models and workflows have recently started. Sharing of individual-level longitudinal data for modelling requires solving legal, ethical and proprietary issues, as in many other fields, but there are also pharmacometric-specific aspects regarding data formats, exchange standards, and database properties. Several organisations (CDISC, C-Path, IMI, ISoP) are working to solve these issues and propose standards. There are also a number of initiatives aimed at collecting disease-specific databases - Alzheimer's Disease (ADNI, CAMD), malaria (WWARN), oncology (PDS), Parkinson's Disease (PPMI), tuberculosis (CPTR, TB-PACTS, ReSeqTB) - suitable for drug-disease modelling. Organized sharing of pharmacometric executable model code and associated information has in the past been sparse, but a model repository (the DDMoRe Model Repository) intended for this purpose has recently been launched. In addition, several other services can facilitate model sharing more generally. Pharmacometric workflows have matured over the last decades, and initiatives to more fully capture those applied to analyses are ongoing. In order to maximize both the impact of pharmacometrics and the knowledge extracted from clinical data, the scientific community needs to take ownership of and create opportunities for open innovation. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Moving beyond Watson-Crick models of coarse grained DNA dynamics.

    PubMed

    Linak, Margaret C; Tourdot, Richard; Dorfman, Kevin D

    2011-11-28

    DNA produces a wide range of structures in addition to the canonical B-form of double-stranded DNA. Some of these structures are stabilized by Hoogsteen bonds. We developed an experimentally parameterized, coarse-grained model that incorporates such bonds. The model reproduces many of the microscopic features of double-stranded DNA and captures the experimental melting curves for a number of short DNA hairpins, even when the open state forms complicated secondary structures. We demonstrate the utility of the model by simulating the folding of a thrombin aptamer, which contains G-quartets, and strand invasion during triplex formation. Our results highlight the importance of including Hoogsteen bonding in coarse-grained models of DNA.

  3. Modeling Self-Healing of Concrete Using Hybrid Genetic Algorithm–Artificial Neural Network

    PubMed Central

    Ramadan Suleiman, Ahmed; Nehdi, Moncef L.

    2017-01-01

    This paper presents an approach to predicting the intrinsic self-healing in concrete using a hybrid genetic algorithm–artificial neural network (GA–ANN). A genetic algorithm was implemented in the network as a stochastic optimizing tool for the initial optimal weights and biases. This approach can assist the network in achieving a global optimum and avoids the possibility of the network becoming trapped at local optima. The proposed model was trained and validated using a purpose-built database compiled from various experimental studies retrieved from the open literature. The model inputs include the cement content, water-to-cement ratio (w/c), type and dosage of supplementary cementitious materials, bio-healing materials, and both expansive and crystalline additives. Self-healing, indicated by means of crack width, is the model output. The results showed that the proposed GA–ANN model is capable of capturing the complex effects of various self-healing agents (e.g., biochemical material, silica-based additive, expansive and crystalline components) on the self-healing performance in cement-based materials. PMID:28772495

  4. Modeling Self-Healing of Concrete Using Hybrid Genetic Algorithm-Artificial Neural Network.

    PubMed

    Ramadan Suleiman, Ahmed; Nehdi, Moncef L

    2017-02-07

    This paper presents an approach to predicting the intrinsic self-healing in concrete using a hybrid genetic algorithm-artificial neural network (GA-ANN). A genetic algorithm was implemented in the network as a stochastic optimizing tool for the initial optimal weights and biases. This approach can assist the network in achieving a global optimum and avoids the possibility of the network becoming trapped at local optima. The proposed model was trained and validated using a purpose-built database compiled from various experimental studies retrieved from the open literature. The model inputs include the cement content, water-to-cement ratio (w/c), type and dosage of supplementary cementitious materials, bio-healing materials, and both expansive and crystalline additives. Self-healing, indicated by means of crack width, is the model output. The results showed that the proposed GA-ANN model is capable of capturing the complex effects of various self-healing agents (e.g., biochemical material, silica-based additive, expansive and crystalline components) on the self-healing performance in cement-based materials.
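
    The sketch below illustrates the hybrid idea described in the two records above: a simple genetic algorithm searches for a good set of initial weights and biases for a small one-hidden-layer network, which would then be handed to conventional gradient-based training. The toy data, network size and GA settings are illustrative placeholders, not the authors' calibrated model.

        # Toy GA seeding of ANN initial weights (illustrative only).
        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.uniform(size=(200, 4))                                            # 4 toy mix-design inputs
        y = (0.4 * X[:, 0] - 0.3 * X[:, 1] + 0.2 * X[:, 2] * X[:, 3])[:, None]    # toy "crack width" target

        N_HID = 6
        N_W = 4 * N_HID + N_HID + N_HID + 1   # weights and biases of a 4-6-1 network

        def unpack(w):
            i = 0
            W1 = w[i:i + 4 * N_HID].reshape(4, N_HID); i += 4 * N_HID
            b1 = w[i:i + N_HID]; i += N_HID
            W2 = w[i:i + N_HID].reshape(N_HID, 1); i += N_HID
            return W1, b1, W2, w[i:]

        def mse(w):
            W1, b1, W2, b2 = unpack(w)
            h = np.tanh(X @ W1 + b1)
            return float(np.mean((h @ W2 + b2 - y) ** 2))

        # Simple GA: truncation selection, blend crossover, Gaussian mutation.
        pop = rng.normal(scale=0.5, size=(40, N_W))
        for generation in range(60):
            fitness = np.array([mse(ind) for ind in pop])
            parents = pop[np.argsort(fitness)[:20]]
            children = [0.5 * (parents[rng.integers(20)] + parents[rng.integers(20)])
                        + rng.normal(scale=0.05, size=N_W) for _ in range(20)]
            pop = np.vstack([parents, children])

        best = pop[np.argmin([mse(ind) for ind in pop])]
        print("MSE with GA-selected initial weights:", round(mse(best), 5))
        # `best` would then seed gradient-based training of the ANN.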

  5. Locating and characterizing a crack in concrete with diffuse ultrasound: A four-point bending test.

    PubMed

    Larose, Eric; Obermann, Anne; Digulescu, Angela; Planès, Thomas; Chaix, Jean-Francois; Mazerolle, Frédéric; Moreau, Gautier

    2015-07-01

    This paper describes an original imaging technique, named Locadiff, that exploits the diffuse propagation of ultrasound waves in concrete to detect and locate mechanical changes associated with the opening of pre-existing cracks and/or the development of diffuse damage at the crack tip. After giving a brief overview of the theoretical model describing the decorrelation of diffuse waveforms induced by a local change, the article introduces the inversion procedure that produces three-dimensional maps of the density of changes. These maps are interpreted in terms of mechanical changes, fracture opening, and damage development. In addition, each fracture is characterized by its effective scattering cross section.
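
    The basic measurement behind this approach is the decorrelation of diffuse waveforms recorded before and after a change; the sketch below computes a windowed decorrelation coefficient DC = 1 - CC, where CC is the normalized cross-coherence between a reference and a perturbed coda. The synthetic signals and window lengths are illustrative, and the inversion for three-dimensional maps of change density is not shown.

        # Windowed decorrelation of two diffuse waveforms (synthetic example).
        import numpy as np

        def decorrelation(ref, per, fs, win_s=0.5, step_s=0.25):
            """Return (window start time, 1 - normalized cross-coherence) pairs."""
            n_win, n_step = int(win_s * fs), int(step_s * fs)
            out = []
            for start in range(0, len(ref) - n_win, n_step):
                r = ref[start:start + n_win]
                p = per[start:start + n_win]
                cc = np.dot(r, p) / np.sqrt(np.dot(r, r) * np.dot(p, p))
                out.append((start / fs, 1.0 - cc))
            return np.array(out)

        fs = 1000.0
        t = np.arange(0.0, 5.0, 1.0 / fs)
        rng = np.random.default_rng(0)
        ref = np.exp(-t / 2.0) * rng.normal(size=t.size)                  # synthetic diffuse coda
        per = ref + 0.05 * np.exp(-t / 2.0) * rng.normal(size=t.size)     # slightly perturbed coda

        dc = decorrelation(ref, per, fs)
        late = dc[dc[:, 0] > 3.0, 1]
        print("mean decorrelation in the late coda:", round(float(late.mean()), 4))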

  6. Flexible Environmental Modeling with Python and Open - GIS

    NASA Astrophysics Data System (ADS)

    Pryet, Alexandre; Atteia, Olivier; Delottier, Hugo; Cousquer, Yohann

    2015-04-01

    Numerical modeling now represents a prominent task of environmental studies. During the last decades, numerous commercial programs have been made available to environmental modelers. These software applications offer user-friendly graphical user interfaces that allow efficient management of many case studies. However, they suffer from a lack of flexibility, and closed-source policies impede source code reviewing and enhancement for original studies. Advanced modeling studies require flexible tools capable of managing thousands of model runs for parameter optimization, uncertainty and sensitivity analysis. In addition, there is a growing need for the coupling of various numerical models associating, for instance, groundwater flow modeling to multi-species geochemical reactions. Researchers have produced hundreds of powerful open-source command line programs. However, there is a need for a flexible graphical user interface allowing efficient processing of the geospatial data that accompanies any environmental study. Here, we present the advantages of using the free and open-source QGIS platform and the Python scripting language for conducting environmental modeling studies. The interactive graphical user interface is first used for the visualization and pre-processing of input geospatial datasets. The Python scripting language is then employed for further input data processing, calls to one or several models, and post-processing of model outputs. Model results are eventually sent back to the GIS program, processed and visualized. This approach combines the advantages of interactive graphical interfaces with the flexibility of the Python scripting language for data processing and model calls. The numerous Python modules available facilitate geospatial data processing and numerical analysis of model outputs. Once the input data have been prepared with the graphical user interface, models may be run thousands of times from the command line with sequential or parallel calls. We illustrate this approach with several case studies in groundwater hydrology and geochemistry and provide links to several Python libraries that facilitate pre- and post-processing operations.
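    The "script thousands of runs from the command line" pattern described above can be sketched as follows. The executable name, its flags, and the assumption that it prints a single objective value are all hypothetical placeholders, not the interface of any particular model.

    ```python
    import csv
    import itertools
    import subprocess
    from concurrent.futures import ProcessPoolExecutor

    # Hypothetical command-line model and parameter names, used only to illustrate
    # the "prepare in the GIS, then script many runs" workflow from the abstract.
    MODEL_EXE = "./groundwater_model"


    def run_case(params):
        hk, rech = params
        out = subprocess.run(
            [MODEL_EXE, f"--hydraulic-conductivity={hk}", f"--recharge={rech}"],
            capture_output=True, text=True, check=True,
        )
        # Assume the model prints a single objective value (e.g. RMSE at observation wells).
        return hk, rech, float(out.stdout.strip())


    if __name__ == "__main__":
        grid = list(itertools.product([1e-5, 1e-4, 1e-3], [50, 100, 200]))
        with ProcessPoolExecutor() as pool:          # parallel calls, as in the abstract
            results = list(pool.map(run_case, grid))
        with open("sweep_results.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["hk_m_per_s", "recharge_mm_per_yr", "objective"])
            writer.writerows(results)
    ```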

  7. Evolution of the Campanian Ignimbrite Magmatic System II: Trace Element and Th Isotopic Evidence for Open-System Processes

    NASA Astrophysics Data System (ADS)

    Bohrson, W. A.; Spera, F. J.; Fowler, S.; Belkin, H.; de Vivo, B.

    2005-12-01

    The Campanian Ignimbrite, a large-volume (~200 km³ DRE) trachytic to phonolitic ignimbrite, was deposited at ~39.3 ka and represents the largest of a number of highly explosive volcanic events in the region near Naples, Italy. Thermodynamic modeling of the major element evolution using the MELTS algorithm (see companion contribution by Fowler et al.) provides detailed information about the identity of and changes in proportions of solids along the liquid line of descent during isobaric fractional crystallization. We have derived trace element mass balance equations that explicitly accommodate changing mineral-melt bulk distribution coefficients during crystallization and also simultaneously satisfy energy and major element mass conservation. Although major element patterns are reasonably modeled assuming closed-system fractional crystallization, modeling of trace elements that represent a range of behaviors (e.g. Zr, Nb, Th, U, Rb, Sm, Sr) yields trends for closed-system fractionation that are distinct from those observed. These results suggest open-system processes were also important in the evolution of the Campanian magmatic system. Th isotope data yield an apparent isochron that is ~20 kyr younger than the age of the deposit, and age-corrected Th isotope data indicate that the magma body was an open system at the time of eruption. Because open-system processes can profoundly change the isotopic characteristics of a magma body, these results illustrate that it is critical to understand the contribution that open-system processes make to silicic magma bodies prior to assigning relevance to age or timescale information derived from isotope systematics. Fluid-magma interaction has been proposed as a mechanism to change the isotopic and elemental characteristics of magma bodies, but an evaluation of the mass and thermal constraints on such a process suggests that large-scale fluid-melt interaction at liquidus temperatures is unlikely. In the case of the magma body associated with the Campanian Ignimbrite, the most likely source of open-system signatures is assimilation of partial melts of compositionally heterogeneous basement composed of older cumulates and intrusive equivalents of volcanic activity within the Campanian region. Additional trace element modeling, explicitly evaluating the mass and energy balance effects that fluid, solids, and melt have on trace element evolution, will further elucidate the contributions of open vs. closed system processes within the Campanian magma body.
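    As a reminder of the closed-system baseline that the observed trace element trends are compared against, the sketch below integrates Rayleigh fractional crystallization with a melt-fraction-dependent bulk distribution coefficient. The D(F) function and element values are invented for illustration; they are not the energy-constrained mass-balance equations derived in the study.

    ```python
    import numpy as np

    C0 = 100.0                       # initial trace-element concentration in the melt (ppm)
    F = np.linspace(1.0, 0.3, 500)   # remaining melt fraction during crystallization


    def bulk_D(f):
        """Illustrative bulk distribution coefficient that grows as new phases
        join the crystallizing assemblage (e.g. once feldspar saturates)."""
        return 0.2 + 1.8 * (1.0 - f) ** 2


    # Rayleigh fractionation with variable D:  ln(C/C0) = int_1^F (D(F') - 1) / F' dF'
    integrand = (bulk_D(F) - 1.0) / F
    lnC = np.concatenate([[0.0],
                          np.cumsum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(F))])
    C_liq = C0 * np.exp(lnC)

    for f, c in zip(F[::100], C_liq[::100]):
        print(f"F = {f:.2f}  C_melt = {c:6.1f} ppm")
    ```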

  8. Influence of fiber packing structure on permeability

    NASA Technical Reports Server (NTRS)

    Cai, Zhong; Berdichevsky, Alexander L.

    1993-01-01

    The study on the permeability of an aligned fiber bundle is the key building block in modeling the permeability of advanced woven and braided preforms. Available results on the permeability of fiber bundles in the literature show that a substantial difference exists between numerical and analytical calculations on idealized fiber packing structures, such as square and hexagonal packing, and experimental measurements on practical fiber bundles. The present study focuses on the variation of the permeability of a fiber bundle under practical process conditions. Fiber bundles are considered as containing openings and fiber clusters within the bundle. Numerical simulations on the influence of various openings on the permeability were conducted. Idealized packing structures are used, but with introduced openings distributed in different patterns. Both longitudinal and transverse flow are considered. The results show that openings within the fiber bundle have substantial effect on the permeability. In the longitudinal flow case, the openings become the dominant flow path. In the transverse flow case, the fiber clusters reduce the gap sizes among fibers. Therefore the permeability is greatly influenced by these openings and clusters, respectively. In addition to the porosity or fiber volume fraction, which is commonly used in the permeability expression, another fiber bundle status parameter, the ultimate fiber volume fraction, is introduced to capture the disturbance within a fiber bundle.
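    For orientation, a classical Kozeny–Carman estimate of fiber-bed permeability is sketched below as the kind of idealized baseline that the numerical and experimental results above are compared against. The fiber radius and the Kozeny "constants" (of order 0.7 for flow along fibers and of order 18 for flow across them) are typical literature-style assumptions, not values from this study.

    ```python
    import numpy as np

    r_f = 3.5e-6                               # fiber radius [m] (illustrative)
    Vf = np.array([0.40, 0.50, 0.60, 0.70])    # fiber volume fraction


    def kozeny_carman(vf, kozeny_const):
        """Kozeny-Carman permeability of a fiber bed: K = r^2 (1 - Vf)^3 / (4 k Vf^2)."""
        return r_f**2 * (1.0 - vf) ** 3 / (4.0 * kozeny_const * vf**2)


    # Assumed Kozeny constants for axial and transverse flow through aligned fibers.
    for vf in Vf:
        K_axial = kozeny_carman(vf, 0.7)
        K_trans = kozeny_carman(vf, 18.0)
        print(f"Vf = {vf:.2f}  K_axial = {K_axial:.3e} m^2  K_transverse = {K_trans:.3e} m^2")
    ```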

  9. An Inverse Modeling Plugin for HydroDesktop using the Method of Anchored Distributions (MAD)

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Osorio, C.; Over, M. W.; Rubin, Y.

    2011-12-01

    The CUAHSI Hydrologic Information System (HIS) software stack is based on an open and extensible architecture that facilitates the addition of new functions and capabilities at both the server side (using HydroServer) and the client side (using HydroDesktop). The HydroDesktop client plugin architecture is used here to expose a new scripting based plugin that makes use of the R statistics software as a means for conducting inverse modeling using the Method of Anchored Distributions (MAD). MAD is a Bayesian inversion technique for conditioning computational model parameters on relevant field observations yielding probabilistic distributions of the model parameters, related to the spatial random variable of interest, by assimilating multi-type and multi-scale data. The implementation of a desktop software tool for using the MAD technique is expected to significantly lower the barrier to use of inverse modeling in education, research, and resource management. The HydroDesktop MAD plugin is being developed following a community-based, open-source approach that will help both its adoption and long term sustainability as a user tool. This presentation will briefly introduce MAD, HydroDesktop, and the MAD plugin and software development effort.

  10. Open quantum systems, effective Hamiltonians, and device characterization

    NASA Astrophysics Data System (ADS)

    Duffus, S. N. A.; Dwyer, V. M.; Everitt, M. J.

    2017-10-01

    High fidelity models, which are able both to support accurate device characterization and to correctly account for environmental effects, are crucial to the engineering of scalable quantum technologies. As it ensures positivity of the density matrix, one preferred model of open systems describes the dynamics with a master equation in Lindblad form. In practice, Lindblad operators are rarely derived from first principles, and often a particular form of annihilator is assumed. This results in dynamical models that miss those additional terms which must generally be added for the master equation to assume the Lindblad form, together with the other concomitant terms that must be assimilated into an effective Hamiltonian to produce the correct free evolution. In first principles derivations, such additional terms are often canceled (or countered), frequently in a somewhat ad hoc manner, leading to a number of competing models. Whilst the implications of this paper are quite general, to illustrate the point we focus here on an example anharmonic system, specifically that of a superconducting quantum interference device (SQUID) coupled to an Ohmic bath. The resulting master equation implies that the environment has a significant impact on the system's energy; we discuss the prospect of keeping or canceling this impact and note that, for the SQUID, monitoring the magnetic susceptibility under control of the capacitive coupling strength and the externally applied flux results in experimentally measurable differences between a number of these models. In particular, one should be able to determine whether a squeezing term of the form X̂P̂ + P̂X̂ should be present in the effective Hamiltonian or not. If model generation is not performed correctly, device characterization will be prone to systemic errors.
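    For reference, the Lindblad (GKSL) form referred to above is the standard master equation below, where the L̂_k are the Lindblad (collapse) operators and Ĥ_eff is the effective Hamiltonian into which the concomitant terms discussed in the abstract must be absorbed.

    ```latex
    \frac{d\hat\rho}{dt} = -\frac{i}{\hbar}\left[\hat H_{\mathrm{eff}},\hat\rho\right]
      + \sum_k \left( \hat L_k \hat\rho \hat L_k^\dagger
      - \tfrac{1}{2}\left\{ \hat L_k^\dagger \hat L_k,\, \hat\rho \right\} \right)
    ```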

  11. InSAR analysis for detecting the route of hydrothermal fluid to the surface during the 2015 phreatic eruption of Hakone Volcano, Japan

    NASA Astrophysics Data System (ADS)

    Doke, Ryosuke; Harada, Masatake; Mannen, Kazutaka; Itadera, Kazuhiro; Takenaka, Jun

    2018-04-01

    Although the 2015 Hakone Volcano eruption was a small-scale phreatic eruption with a discharged mass of only about 100 tons, interferometric synthetic aperture radar successfully detected surface deformations related to the eruption. An inversion model of the underground hydrothermal system, based on ground displacements measured from ALOS-2/PALSAR-2 images, showed that a crack opened at an elevation of about 530-830 m, probably at the time of the eruption. A geomorphological analysis detected several old NW-SE trending fissures, and the open crack was located just beneath one of the fissures. Thus, the crack that opened during the 2015 eruption could have been a preexisting crack that formed during a more voluminous hydrothermal eruption. In addition, the inversion model implies that a sill deflation occurred at an elevation of about 225 m, probably at the time of the eruption. The deflating sill-like body represents a preexisting hydrothermal reservoir at an elevation of 100-400 m, which intruded fluid into the open crack prior to the eruption. The volume changes of the open crack and the sill were calculated to be 1.14 × 10⁵ m³ (inflation) and 0.49 × 10⁵ m³ (deflation), respectively. A very local swelling (about 200 m in diameter) was also detected at the eruption center 2 months before the eruption. The local swelling, whose rate in the satellite line-of-sight was 0.7-0.9 cm/day during May 2015 and declined in June, was monitored until the time of the eruption, when its uplift halted. This was modeled as a point pressure source at an elevation of about 900 m (at a depth of about 80-90 m from the ground surface) and is considered to be a minor hydrothermal reservoir just beneath the fumarolic field. Our analysis shows that the northernmost tip of the open crack reached within 200 m of the surface. Thus, it is reasonable to assume that the hydrothermal fluid in the open crack found a way to the surface and formed the eruption.
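    Point pressure sources of the kind invoked for the pre-eruptive swelling are often approximated with the classic Mogi model; a sketch of its surface displacements is given below with invented source parameters, without implying that this is the exact formulation used in the study's inversion.

    ```python
    import numpy as np

    nu = 0.25            # Poisson's ratio
    dV = 2.0e3           # source volume change [m^3] (illustrative, not an inversion result)
    d = 85.0             # source depth below the surface [m]

    r = np.linspace(0.0, 400.0, 9)                    # radial distance from source axis [m]
    R3 = (r**2 + d**2) ** 1.5

    # Mogi (1958) point-source surface displacements in an elastic half-space
    u_z = (1.0 - nu) * dV * d / (np.pi * R3)          # vertical uplift [m]
    u_r = (1.0 - nu) * dV * r / (np.pi * R3)          # radial (horizontal) displacement [m]

    for ri, uz, ur in zip(r, u_z, u_r):
        print(f"r = {ri:5.0f} m   uplift = {uz*100:5.2f} cm   radial = {ur*100:5.2f} cm")
    ```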

  12. The HYPE Open Source Community

    NASA Astrophysics Data System (ADS)

    Strömbäck, L.; Pers, C.; Isberg, K.; Nyström, K.; Arheimer, B.

    2013-12-01

    The Hydrological Predictions for the Environment (HYPE) model is a dynamic, semi-distributed, process-based, integrated catchment model. It uses well-known hydrological and nutrient transport concepts and can be applied for both small and large scale assessments of water resources and status. In the model, the landscape is divided into classes according to soil type, vegetation and altitude. The soil representation is stratified and can be divided in up to three layers. Water and substances are routed through the same flow paths and storages (snow, soil, groundwater, streams, rivers, lakes) considering turn-over and transformation on the way towards the sea. HYPE has been successfully used in many hydrological applications at SMHI. For Europe, we currently have three different models: the S-HYPE model for Sweden, the BALT-HYPE model for the Baltic Sea, and the E-HYPE model for the whole of Europe. These models simulate hydrological conditions and nutrients for their respective areas and are used for characterization, forecasts, and scenario analyses. Model data can be downloaded from hypeweb.smhi.se. In addition, we provide models for the Arctic region, the Arab (Middle East and Northern Africa) region, India, the Niger River basin, and the La Plata Basin. This demonstrates the applicability of the HYPE model for large scale modeling in different regions of the world. An important goal with our work is to make our data and tools available as open data and services. To this end, we created the HYPE Open Source Community (OSC), which makes the source code of HYPE available for anyone interested in further development of HYPE. The HYPE OSC (hype.sourceforge.net) is an open source initiative under the Lesser GNU Public License taken by SMHI to strengthen international collaboration in hydrological modeling and hydrological data production. The hypothesis is that more brains and more testing will result in better models and better code. The code is transparent and can be changed and learnt from. New versions of the main code are delivered frequently. HYPE OSC is open to everyone interested in hydrology, hydrological modeling and code development - e.g. scientists, authorities, and consultancies. By joining the HYPE OSC you get access to a state-of-the-art operational hydrological model. The HYPE source code is designed to efficiently handle large scale modeling for forecast, hindcast and climate applications. The code is under constant development to improve the hydrological processes, efficiency and readability. In the beginning of 2013 we released a version with new and better modularization based on hydrological processes. This will make the code easier to understand and further develop for a new user. An important challenge in this process is to produce code that is easy for anyone to understand and work with, while still maintaining the properties that make the code efficient enough for large scale applications. Input from the HYPE Open Source Community is an important source for future improvements of the HYPE model. Therefore, by joining the community you become an active part of the development, get access to the latest features and can influence future versions of the model.

  13. Open source electronic health records and chronic disease management.

    PubMed

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-02-01

    To study and report on the use of open source electronic health records (EHR) to assist with chronic care management within safety net medical settings, such as community health centers (CHC). The study was conducted by NORC at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to CHC that currently use an open source EHR. Two of the sites chosen by NORC were actively using an open source EHR to assist in the redesign of their care delivery system to support more effective chronic disease management. This included incorporating the chronic care model into a CHC and using the EHR to help facilitate its elements, such as care teams for patients, in addition to maintaining health records on indigent populations, such as tuberculosis status on homeless patients. The ability to modify the open-source EHR to adapt to the CHC environment and to leverage the ecosystem of providers and users to assist in this process provided significant advantages in chronic care management. Improvements in diabetes management and hypertension control, as well as increases in tuberculosis vaccinations, were assisted through the use of these open source systems. The flexibility and adaptability of open source EHR demonstrated their utility and viability in the provision of necessary and needed chronic disease care among populations served by CHC.

  14. The Ship Tethered Aerostat Remote Sensing System (STARRS): Observations of Small-Scale Surface Lateral Transport During the LAgrangian Submesoscale ExpeRiment (LASER)

    NASA Astrophysics Data System (ADS)

    Carlson, D. F.; Novelli, G.; Guigand, C.; Özgökmen, T.; Fox-Kemper, B.; Molemaker, M. J.

    2016-02-01

    The Consortium for Advanced Research on the Transport of Hydrocarbon in the Environment (CARTHE) will carry out the LAgrangian Submesoscale ExpeRiment (LASER) to study the role of small-scale processes in the transport and dispersion of oil and passive tracers. The Ship-Tethered Aerostat Remote Sensing System (STARRS) was developed to produce observational estimates of small-scale surface dispersion in the open ocean. STARRS is built around a high-lift-capacity (30 kg) helium-filled aerostat and is equipped with a high resolution digital camera. An integrated GNSS receiver and inertial navigation system permit direct geo-rectification of the imagery. Thousands of drift cards deployed in the field of view of STARRS and tracked over time provide the first observational estimates of small-scale (1-500 m) surface dispersion in the open ocean. The STARRS imagery will be combined with GPS-tracked surface drifter trajectories, shipboard observations, and aerial surveys of sea surface temperature in the DeSoto Canyon. In addition to obvious applications to oil spill modelling, the STARRS observations will provide essential benchmarks for high resolution numerical models.
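    The dispersion statistic that tracked drift-card positions feed into can be illustrated with the toy calculation below: two-particle relative dispersion D²(t) averaged over all card pairs. The positions here are random-walk surrogates; the real analysis uses geo-rectified, image-derived tracks.

    ```python
    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(2)

    # Surrogate tracks: n_cards drift cards, positions (x, y) in metres every minute.
    n_cards, n_steps, dt = 20, 60, 60.0
    steps = rng.normal(0.0, 2.0, size=(n_cards, n_steps, 2))   # random-walk displacements
    tracks = np.cumsum(steps, axis=1) + rng.uniform(0, 10, size=(n_cards, 1, 2))

    # Relative (pair) dispersion: D^2(t) = mean over pairs of |x_i(t) - x_j(t)|^2
    pairs = list(combinations(range(n_cards), 2))
    sep2 = np.array([np.sum((tracks[i] - tracks[j]) ** 2, axis=1) for i, j in pairs])
    D2 = sep2.mean(axis=0)

    for k in (0, 14, 29, 59):
        print(f"t = {k*dt/60:4.0f} min   D^2 = {D2[k]:8.1f} m^2")
    ```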

  15. Hazard Models From Periodic Dike Intrusions at Kīlauea Volcano, Hawaiʻi

    NASA Astrophysics Data System (ADS)

    Montgomery-Brown, E. K.; Miklius, A.

    2016-12-01

    The persistence and regular recurrence intervals of dike intrusions in the East Rift Zone (ERZ) of Kīlauea Volcano lead to the possibility of constructing a time-dependent intrusion hazard model. Dike intrusions are commonly observed in Kīlauea Volcano's ERZ and can occur repeatedly in regions that correlate with seismic segments (sections of rift seismicity with persistent definitive lateral boundaries) proposed by Wright and Klein (USGS PP1806, 2014). Five such ERZ intrusions have occurred since 1983 with inferred locations downrift of the bend in Kīlauea's ERZ, with the first (1983) being the start of the ongoing ERZ eruption. The ERZ intrusions occur on one of two segments that are spatially coincident with seismic segments: Makaopuhi (1993 and 2007) and Nāpau (1983, 1997, and 2011). During each intrusion, the amount of inferred dike opening was between 2 and 3 meters. The times between ERZ intrusions for same-segment pairs are all close to 14 years: 14.07 (1983-1997), 14.09 (1997-2011), and 13.95 (1993-2007) years, with the Nāpau segment becoming active about 3.5 years after the Makaopuhi segment in each case. Four additional upper ERZ intrusions are also considered here. Dikes in the upper ERZ have much smaller openings (~10 cm) and have shorter recurrence intervals of ~8 years with more variability. The amount of modeled dike opening during each of these events roughly corresponds to the amount of seaward south flank motion and deep rift opening accumulated in the time between events. Additionally, the recurrence interval of 14 years appears to be unaffected by the magma surge of 2003-2007, suggesting that flank motion, rather than magma supply, could be a controlling factor in the timing and periodicity of intrusions. Flank control over the timing of magma intrusions runs counter to the historical research suggesting that dike intrusions at Kīlauea are driven by magma overpressure. This relatively free sliding may have resulted from decreased friction following the 1975 Kalapana earthquake. A hazard model can be constructed from the historical intrusion record (i.e., how long it has been since an intrusion on that segment) and augmented by monitoring the accumulation of strain across the rift and local seismicity rates.
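    A minimal sketch of how a quasi-periodic recurrence interval translates into a conditional, time-dependent intrusion probability is given below. The normal recurrence distribution, its 14 ± 1 yr parameters, and the one-year forecast window are placeholders chosen to be consistent with the numbers quoted above; this is not an operational hazard model.

    ```python
    from math import erf, sqrt

    MU, SIGMA = 14.0, 1.0      # recurrence mean and spread [yr], motivated by the ~14 yr pairs
    WINDOW = 1.0               # forecast window [yr]


    def cdf(x):
        """Normal CDF of the assumed recurrence-interval distribution."""
        return 0.5 * (1.0 + erf((x - MU) / (SIGMA * sqrt(2.0))))


    def conditional_prob(t_since):
        """P(intrusion within WINDOW yr | no intrusion in the t_since yr so far)."""
        return (cdf(t_since + WINDOW) - cdf(t_since)) / (1.0 - cdf(t_since))


    for t in (5.0, 10.0, 12.0, 13.5, 14.5):
        print(f"{t:5.1f} yr since last intrusion -> "
              f"P(next {WINDOW:.0f} yr) = {conditional_prob(t):.3f}")
    ```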

  16. Development and testing of a numerical simulation method for thermally nonequilibrium dissociating flows in ANSYS Fluent

    NASA Astrophysics Data System (ADS)

    Shoev, G. V.; Bondar, Ye. A.; Oblapenko, G. P.; Kustova, E. V.

    2016-03-01

    Various issues of numerical simulation of supersonic gas flows with allowance for thermochemical nonequilibrium on the basis of fluid dynamic equations in the two-temperature approximation are discussed. The computational tool for modeling flows with thermochemical nonequilibrium is the commercial software package ANSYS Fluent with an additional user-defined open-code module. A comparative analysis of results obtained by various models of vibration-dissociation coupling in binary gas mixtures of nitrogen and oxygen is performed. Results of numerical simulations are compared with available experimental data.
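    One commonly compared vibration–dissociation coupling model is Park's two-temperature model, in which the Arrhenius rate is evaluated at a geometric-mean temperature. The sketch below uses that form with illustrative N2-like rate constants; the abstract does not state which constants or which specific set of models were used, so these values are assumptions.

    ```python
    import numpy as np

    # Park two-temperature model: rate controlled by T_a = sqrt(T_trans * T_vib).
    A, n, theta_d = 7.0e21, -1.6, 113200.0   # illustrative constants (cm^3 mol^-1 s^-1, -, K)


    def k_dissociation(T, Tv):
        Ta = np.sqrt(T * Tv)                 # geometric-mean "effective" temperature
        return A * Ta**n * np.exp(-theta_d / Ta)


    T = 10000.0                              # translational temperature [K]
    for Tv in (2000.0, 5000.0, 10000.0):     # vibrational temperature lagging behind T
        ratio = k_dissociation(T, Tv) / k_dissociation(T, T)
        print(f"Tv = {Tv:7.0f} K   k(T,Tv)/k_equilibrium = {ratio:.3e}")
    ```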

  17. Simulation of optically pumped intersubband laser in magnetic field

    NASA Astrophysics Data System (ADS)

    Erić, Marko; Milanović, Vitomir; Ikonić, Zoran; Indjin, Dragan

    2007-06-01

    Simulations of an optically pumped intersubband laser in magnetic field up to 60 T are performed within the steady-state rate equations model. The electron-polar optical phonon scattering is calculated using the confined and interface phonon model. A strong oscillatory optical gain vs. magnetic field dependence is found, with two dominant gain peaks occurring at 20 and 40 T, the fields which bring appropriate states into resonance with optical phonons and thus open additional relaxation paths. The peak at 20 T exceeds the value of gain achieved at zero field.

  18. The discounting model selector: Statistical software for delay discounting applications.

    PubMed

    Gilroy, Shawn P; Franck, Christopher T; Hantula, Donald A

    2017-05-01

    Original, open-source computer software was developed and validated against established delay discounting methods in the literature. The software executed approximate Bayesian model selection methods from user-supplied temporal discounting data and computed the effective delay 50 (ED50) from the best performing model. The software was custom-designed to enable behavior analysts to conveniently apply recent statistical methods to temporal discounting data with the aid of a graphical user interface (GUI). The results of independent validation of the approximate Bayesian model selection methods indicated that the program provided results identical to those of the original source paper and its methods. Monte Carlo simulation (n = 50,000) confirmed that the true model was selected most often in each setting. Simulation code and data for this study were posted to an online repository for use by other researchers. The model selection approach was applied to three existing delay discounting data sets from the literature in addition to the data from the source paper. Comparisons of model-selected ED50 values were consistent with traditional indices of discounting. Conceptual issues related to the development and use of computer software by behavior analysts and the opportunities afforded by free and open-source software are discussed, and a review of possible expansions of this software is provided. © 2017 Society for the Experimental Analysis of Behavior.
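    For readers unfamiliar with the reported quantity, here is a minimal sketch of ED50 under Mazur's hyperbolic model V = A/(1 + kD), for which the delay at which value falls to half is 1/k. The indifference points and the least-squares fit are illustrative only; the actual tool performs approximate Bayesian model selection across several candidate discounting models rather than fitting a single model.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Illustrative indifference points: subjective value (fraction of the delayed
    # amount) at each delay in days.
    delays = np.array([1, 7, 30, 90, 180, 365], dtype=float)
    values = np.array([0.95, 0.80, 0.55, 0.35, 0.25, 0.15])


    def hyperbolic(D, k):
        """Mazur's hyperbolic discounting model, V = 1 / (1 + k D)."""
        return 1.0 / (1.0 + k * D)


    (k_hat,), _ = curve_fit(hyperbolic, delays, values, p0=[0.01])
    ed50 = 1.0 / k_hat                      # delay at which value drops to 50%
    print(f"k = {k_hat:.4f} per day, ED50 = {ed50:.1f} days")
    ```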

  19. iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations

    NASA Astrophysics Data System (ADS)

    Vanfretti, L.; Rabuzin, T.; Baudette, M.; Murad, M.

    The iTesla Power Systems Library (iPSL) is a Modelica package providing a set of power system components for phasor time-domain modeling and simulation. The Modelica language provides a systematic approach to developing models using a formal mathematical description that uniquely specifies the physical behavior of a component or of the entire system. Furthermore, the standardized specification of the Modelica language (Modelica Association [1]) enables unambiguous model exchange by allowing any Modelica-compliant tool to utilize the models for simulation and analysis without the need for a specific model transformation tool. As the Modelica language is being developed with open specifications, any tool that implements these requirements can be utilized. This gives users the freedom to choose an Integrated Development Environment (IDE) of their choice. Furthermore, any integration solver can be implemented within a Modelica tool to simulate Modelica models. Additionally, Modelica is an object-oriented language, enabling code factorization and model re-use to improve the readability of a library by structuring it with an object-oriented hierarchy. The developed library is released under an open source license to enable wider distribution and let users customize it to their specific needs. This paper describes the iPSL and provides illustrative application examples.

  20. Comparing the floquet stability of open and breathing fatigue cracks in an overhung rotordynamic system

    NASA Astrophysics Data System (ADS)

    Varney, Philip; Green, Itzhak

    2017-11-01

    Rotor cracks represent an uncommon but serious threat to rotating machines and must be detected early to avoid catastrophic machine failure. An important aspect of analyzing rotor cracks is understanding their influence on the rotor stability. It is well-known that the extent of rotor instability versus shaft speed is exacerbated by deeper cracks. Consequently, crack propagation can eventually result in an unstable response even if the shaft speed remains constant. Most previous investigations of crack-induced rotor instability concern simple Jeffcott rotors. This work advances the state-of-the-art by (a) providing a novel inertial-frame model of an overhung rotor, and (b) assessing the stability of the cracked overhung rotor using Floquet stability analysis. The rotor Floquet stability analysis is performed for both an open crack and a breathing crack, and conclusions are drawn regarding the importance of appropriately selecting the crack model. The rotor stability is analyzed versus crack depth, external viscous damping ratio, and rotor inertia. In general, this work concludes that the onset of instability occurs at lower shaft speeds for thick rotors, lower viscous damping ratios, and deeper cracks. In addition, when comparing commensurate cracks, the breathing crack is shown to induce more regions of instability than the open crack, though the open crack generally predicts an unstable response for shallower cracks than the breathing crack. Keywords: rotordynamics, stability, rotor cracks.
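    The Floquet procedure itself (separate from the overhung-rotor model above) can be illustrated compactly: integrate the state-transition matrix of a periodically varying system over one period to obtain the monodromy matrix, whose eigenvalue (Floquet multiplier) magnitudes decide stability. The breathing-crack-like stiffness modulation and all parameter values below are invented for illustration.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Single-DOF oscillator with a periodically "breathing" stiffness drop,
    # loosely mimicking a breathing crack; all parameters are illustrative.
    omega_n, zeta, eps, Omega = 2 * np.pi * 10.0, 0.01, 0.30, 2 * np.pi * 6.0
    T = 2 * np.pi / Omega                                  # stiffness-modulation period


    def rhs(t, y):
        k_t = omega_n**2 * (1.0 - 0.5 * eps * (1.0 + np.cos(Omega * t)))
        x, v = y
        return [v, -2.0 * zeta * omega_n * v - k_t * x]


    # Monodromy matrix: columns are the states after one period for unit initial states.
    M = np.column_stack([
        solve_ivp(rhs, (0.0, T), col, rtol=1e-9, atol=1e-12).y[:, -1]
        for col in np.eye(2)
    ])
    multipliers = np.linalg.eigvals(M)
    print("Floquet multiplier magnitudes:", np.abs(multipliers))
    print("stable" if np.all(np.abs(multipliers) < 1.0) else "unstable")
    ```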

  1. Superior diastolic function with KATP channel opener diazoxide in a novel mouse Langendorff model.

    PubMed

    Makepeace, Carol M; Suarez-Pierre, Alejandro; Kanter, Evelyn M; Schuessler, Richard B; Nichols, Colin G; Lawton, Jennifer S

    2018-07-01

    Adenosine triphosphate-sensitive potassium (KATP) channel openers have been found to be cardioprotective in multiple animal models via an unknown mechanism. Mouse models allow genetic manipulation of KATP channel components for the investigation of this mechanism. Mouse Langendorff models using 30 min of global ischemia are known to induce measurable myocardial infarction and injury. Prolongation of global ischemia in a mouse Langendorff model could allow the determination of the mechanisms involved in KATP channel opener cardioprotection. Mouse hearts (C57BL/6) underwent baseline perfusion with Krebs-Henseleit buffer (30 min), assessment of function using a left ventricular balloon, delivery of test solution, and prolonged global ischemia (90 min). Hearts underwent reperfusion (30 min) and functional assessment. Coronary flow was measured using an inline probe. Test solutions were as follows: hyperkalemic cardioplegia alone (CPG, n = 11) or with diazoxide (CPG + DZX, n = 12). Although the CPG + DZX group had greater percent recovery of developed pressure and coronary flow, this was not statistically significant. Following a mean of 74 min (CPG) and 77 min (CPG + DZX), an additional increase in end-diastolic pressure was noted (plateau), which was significantly higher in the CPG group. Similarly, the end-diastolic pressure (at reperfusion and at the end of the experiment) was significantly higher in the CPG group. Prolongation of global ischemia demonstrated added benefit when DZX was added to traditional hyperkalemic CPG. This model will allow the investigation of the DZX mechanism of cardioprotection following manipulation of targeted KATP channel components. This model will also allow translation to prolonged ischemic episodes associated with cardiac surgery. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. A Web-based Visualization System for Three Dimensional Geological Model using Open GIS

    NASA Astrophysics Data System (ADS)

    Nemoto, T.; Masumoto, S.; Nonogaki, S.

    2017-12-01

    A three-dimensional geological model is an important source of information in various fields such as environmental assessment, urban planning, resource development, waste management and disaster mitigation. In this study, we have developed a web-based visualization system for a 3D geological model using free and open source software. The system has been successfully implemented by integrating the web mapping engine MapServer and the geographic information system GRASS. MapServer plays the role of mapping horizontal cross sections of the 3D geological model and a topographic map. GRASS provides the core components for management, analysis and image processing of the geological model. Online access to GRASS functions has been enabled using PyWPS, an implementation of the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard. The system has two main functions. The two-dimensional visualization function allows users to generate horizontal and vertical cross sections of the 3D geological model. These images are delivered via the OGC WMS (Web Map Service) and WPS standards. Horizontal cross sections are overlaid on the topographic map. A vertical cross section is generated by clicking a start point and an end point on the map. The three-dimensional visualization function allows users to visualize geological boundary surfaces and a panel diagram. The user can visualize them from various angles by mouse operation. WebGL is utilized for 3D visualization. WebGL is a web technology that brings hardware-accelerated 3D graphics to the browser without installing additional software. The geological boundary surfaces can be downloaded so that the geologic structure can be incorporated into CAD designs and into models for various simulations. This study was supported by JSPS KAKENHI Grant Number JP16K00158.

  3. From Particles and Point Clouds to Voxel Models: High Resolution Modeling of Dynamic Landscapes in Open Source GIS

    NASA Astrophysics Data System (ADS)

    Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.

    2012-12-01

    Multitemporal data acquired by modern mapping technologies provide unique insights into processes driving land surface dynamics. These high resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss the development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to representation of evolving surfaces as series of raster map layers or voxel models. For multitemporal lidar data we present workflows that combine open source point cloud processing tools with GRASS GIS and custom Python scripts to model and analyze dynamics of coastal topography (Figure 1), and we outline the development of a coastal analysis toolbox. The simulations focus on a particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to water and sediment transport models, already implemented in GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies will be used to illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS, and highlight their interoperability. Figure 1: Isosurfaces representing the evolution of the shoreline and of a z = 4.5 m contour between the years 1997-2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.

  4. OpenMx: An Open Source Extended Structural Equation Modeling Framework

    ERIC Educational Resources Information Center

    Boker, Steven; Neale, Michael; Maes, Hermine; Wilde, Michael; Spiegel, Michael; Brick, Timothy; Spies, Jeffrey; Estabrook, Ryne; Kenny, Sarah; Bates, Timothy; Mehta, Paras; Fox, John

    2011-01-01

    OpenMx is free, full-featured, open source, structural equation modeling (SEM) software. OpenMx runs within the "R" statistical programming environment on Windows, Mac OS-X, and Linux computers. The rationale for developing OpenMx is discussed along with the philosophy behind the user interface. The OpenMx data structures are…

  5. Ares I-X First Stage Internal Aft Skirt Re-Entry Heating Data and Modeling

    NASA Technical Reports Server (NTRS)

    Schmitz, Craig P.; Tashakkor, Scott B.

    2011-01-01

    The CLVSTATE engineering code is being used to predict Ares-I launch vehicle first stage reentry aerodynamic heating. An engineering analysis is developed which yields reasonable predictions for the timing of the first stage aft skirt thermal curtain failure and the resulting internal gas temperatures. The analysis is based on correlations of the Ares I-X internal aft skirt gas temperatures and has been implemented into CLVSTATE. Validation of the thermal curtain opening models has been accomplished using additional Ares I-X thermocouple, calorimeter and pressure flight data. In addition, a technique that accounts for radiation losses at high altitudes has been developed, which improves the gas temperature measurements obtained by the gas temperature probes (GTP). Updates to the CLVSTATE models are shown to improve the accuracy of the internal aft skirt heating predictions, which will result in increased confidence in future vehicle designs.

  6. A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data

    DOE PAGES

    Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...

    2016-01-01

    Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft² for over 50 building types at the national and sub-national level with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
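    The kind of Bayesian updating that PDT formalizes can be illustrated with a toy conjugate-normal example: an expert-judgment prior on ambient occupancy (people per 1000 ft²) is combined with a handful of survey observations, and the posterior keeps track of the remaining uncertainty. The numbers and the known-variance normal assumption are placeholders, not the PDT model itself.

    ```python
    import math

    # Expert-judgment prior for an office-type building: occupancy ~ N(mu0, tau0^2)
    mu0, tau0 = 4.0, 1.5          # people per 1000 ft^2 (assumed)
    sigma = 1.0                    # assumed survey measurement noise (known, for conjugacy)

    surveys = [2.8, 3.4, 3.1, 3.9, 2.6]   # illustrative survey-derived estimates

    # Conjugate normal update with known variance.
    n = len(surveys)
    xbar = sum(surveys) / n
    post_prec = 1.0 / tau0**2 + n / sigma**2
    post_mean = (mu0 / tau0**2 + n * xbar / sigma**2) / post_prec
    post_sd = math.sqrt(1.0 / post_prec)

    print(f"posterior occupancy: {post_mean:.2f} +/- {post_sd:.2f} people per 1000 ft^2")
    ```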

  7. Using CellML with OpenCMISS to Simulate Multi-Scale Physiology

    PubMed Central

    Nickerson, David P.; Ladd, David; Hussan, Jagir R.; Safaei, Soroush; Suresh, Vinod; Hunter, Peter J.; Bradley, Christopher P.

    2014-01-01

    OpenCMISS is an open-source modeling environment aimed, in particular, at the solution of bioengineering problems. OpenCMISS consists of two main parts: a computational library (OpenCMISS-Iron) and a field manipulation and visualization library (OpenCMISS-Zinc). OpenCMISS is designed for the solution of coupled multi-scale, multi-physics problems in a general-purpose parallel environment. CellML is an XML format designed to encode biophysically based systems of ordinary differential equations and both linear and non-linear algebraic equations. A primary design goal of CellML is to allow mathematical models to be encoded in a modular and reusable format to aid reproducibility and interoperability of modeling studies. In OpenCMISS, we make use of CellML models to enable users to configure various aspects of their multi-scale physiological models. This avoids the need for users to be familiar with the OpenCMISS internal code in order to perform customized computational experiments. Examples of this are: cellular electrophysiology models embedded in tissue electrical propagation models; material constitutive relationships for mechanical growth and deformation simulations; time-varying boundary conditions for various problem domains; and fluid constitutive relationships and lumped-parameter models. In this paper, we provide implementation details describing how CellML models are integrated into multi-scale physiological models in OpenCMISS. The external interface OpenCMISS presents to users is also described, including specific examples exemplifying the extensibility and usability these tools provide the physiological modeling and simulation community. We conclude with some thoughts on future extension of OpenCMISS to make use of other community developed information standards, such as FieldML, SED-ML, and BioSignalML. Plans for the integration of accelerator code (graphics processing unit and field programmable gate array) generated from CellML models are also discussed. PMID:25601911

  8. Numerical Analysis of Combined Well and Open-Closed Loops Geothermal (CWG) Systems

    NASA Astrophysics Data System (ADS)

    Park, Yu-Chul

    2016-04-01

    Open-loop and closed-loop geothermal heat pump (GHP) systems have been used in Korea to reduce emissions of greenhouse gases such as carbon dioxide (CO2). Each type of GHP system has pros and cons: for example, the open-loop GHP system is highly energy-efficient, while the closed-loop GHP system requires minimal maintenance. The open-loop GHP system can be used practically only where a large groundwater supply is available. The closed-loop GHP system, in contrast, involves high initial installation costs. The performance and efficiency of a GHP system depend on the characteristics of the GHP system itself in addition to the geologic conditions. To overcome the drawbacks of the open-loop and closed-loop GHP systems, the combined well and open-closed loops geothermal (CWG) system was designed. In the CWG system, the open-loop GHP system is surrounded by closed-loop GHP systems. The geothermal energy in the closed-loop GHP systems is supplied by the groundwater pumped by the open-loop GHP system. In this study, two different types of CWG systems (a small-aperture hybrid CWG system and a large-aperture CWG system) are evaluated for energy efficiency using numerical simulation models. This work was supported by the New & Renewable Energy Core Technology Program of the Korea Institute of Energy Technology Evaluation and Planning (KETEP), with financial resources granted by the Ministry of Trade, Industry & Energy, Republic of Korea (No. 20153030111120).

  9. Open high-level data formats and software for gamma-ray astronomy

    NASA Astrophysics Data System (ADS)

    Deil, Christoph; Boisson, Catherine; Kosack, Karl; Perkins, Jeremy; King, Johannes; Eger, Peter; Mayer, Michael; Wood, Matthew; Zabalza, Victor; Knödlseder, Jürgen; Hassan, Tarek; Mohrmann, Lars; Ziegler, Alexander; Khelifi, Bruno; Dorner, Daniela; Maier, Gernot; Pedaletti, Giovanna; Rosado, Jaime; Contreras, José Luis; Lefaucheur, Julien; Brügge, Kai; Servillat, Mathieu; Terrier, Régis; Walter, Roland; Lombardi, Saverio

    2017-01-01

    In gamma-ray astronomy, a variety of data formats and proprietary software have been traditionally used, often developed for one specific mission or experiment. Especially for ground-based imaging atmospheric Cherenkov telescopes (IACTs), data and software are mostly private to the collaborations operating the telescopes. However, there is a general movement in science towards the use of open data and software. In addition, the next-generation IACT instrument, the Cherenkov Telescope Array (CTA), will be operated as an open observatory. We have created a Github organisation at https://github.com/open-gamma-ray-astro where we are developing high-level data format specifications. A public mailing list was set up at https://lists.nasa.gov/mailman/listinfo/open-gamma-ray-astro and a first face-to-face meeting on the IACT high-level data model and formats took place in April 2016 in Meudon (France). This open multi-mission effort will help to accelerate the development of open data formats and open-source software for gamma-ray astronomy, leading to synergies in the development of analysis codes and eventually better scientific results (reproducible, multi-mission). This write-up presents this effort for the first time, explaining the motivation and context, the available resources and process we use, as well as the status and planned next steps for the data format specifications. We hope that it will stimulate feedback and future contributions from the gamma-ray astronomy community.

  10. Embracing Open Source for NASA's Earth Science Data Systems

    NASA Technical Reports Server (NTRS)

    Baynes, Katie; Pilone, Dan; Boller, Ryan; Meyer, David; Murphy, Kevin

    2017-01-01

    The overarching purpose of NASA's Earth Science program is to develop a scientific understanding of Earth as a system. Scientific knowledge is most robust and actionable when resulting from transparent, traceable, and reproducible methods. Reproducibility includes open access to the data as well as the software used to arrive at results. Additionally, software that is custom-developed for NASA should be open to the greatest degree possible, to enable re-use across Federal agencies, reduce overall costs to the government, remove barriers to innovation, and promote consistency through the use of uniform standards. Finally, Open Source Software (OSS) practices facilitate collaboration between agencies and the private sector. To best meet these ends, NASA's Earth Science Division promotes the full and open sharing of not only all data, metadata, products, information, documentation, models, images, and research results but also the source code used to generate, manipulate and analyze them. This talk focuses on the challenges of open sourcing NASA-developed software within ESD and the growing pains associated with establishing policies running the gamut of tracking issues, properly documenting build processes, engaging the open source community, maintaining internal compliance, and accepting contributions from external sources. This talk also covers the adoption of existing open source technologies and standards to enhance our custom solutions and our contributions back to the community. Finally, we will be introducing the most recent OSS contributions from NASA's Earth Science program and promoting these projects for wider community review and adoption.

  11. Molecular interactions involved in proton-dependent gating in KcsA potassium channels

    PubMed Central

    Posson, David J.; Thompson, Ameer N.; McCoy, Jason G.

    2013-01-01

    The bacterial potassium channel KcsA is gated open by the binding of protons to amino acids on the intracellular side of the channel. We have identified, via channel mutagenesis and x-ray crystallography, two pH-sensing amino acids and a set of nearby residues involved in molecular interactions that influence gating. We found that the minimal mutation of one histidine (H25) and one glutamate (E118) near the cytoplasmic gate completely abolished pH-dependent gating. Mutation of nearby residues either alone or in pairs altered the channel’s response to pH. In addition, mutations of certain pairs of residues dramatically increased the energy barriers between the closed and open states. We proposed a Monod–Wyman–Changeux model for proton binding and pH-dependent gating in KcsA, where H25 is a “strong” sensor displaying a large shift in pKa between closed and open states, and E118 is a “weak” pH sensor. Modifying model parameters that are involved in either the intrinsic gating equilibrium or the pKa values of the pH-sensing residues was sufficient to capture the effects of all mutations. PMID:24218397
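    A generic two-state allosteric (MWC-type) open-probability curve of the kind proposed above, with one "strong" and one "weak" proton sensor whose pKa values shift between closed and open states, is sketched below. The pKa values, number of subunits, and intrinsic gating constant are illustrative placeholders, not the fitted parameters of the study.

    ```python
    import numpy as np

    # Two-state allosteric (MWC-type) gating driven by proton binding.
    L0 = 1.0e4                      # closed/open equilibrium at zero protonation (assumed)
    n_sub = 4                       # KcsA is a homotetramer: one copy of each sensor per subunit

    # Illustrative pKa values: the "strong" sensor (H25-like) shifts a lot between
    # closed and open states, the "weak" sensor (E118-like) only slightly.
    sensors = [
        {"pKa_closed": 5.0, "pKa_open": 7.5},   # strong sensor
        {"pKa_closed": 5.5, "pKa_open": 6.0},   # weak sensor
    ]


    def p_open(pH):
        H = 10.0 ** (-pH)
        ratio = 1.0
        for s in sensors:
            Kc, Ko = 10.0 ** (-s["pKa_closed"]), 10.0 ** (-s["pKa_open"])
            ratio *= ((1.0 + H / Kc) / (1.0 + H / Ko)) ** n_sub
        return 1.0 / (1.0 + L0 * ratio)


    for pH in np.arange(3.0, 8.5, 1.0):
        print(f"pH = {pH:3.1f}   P_open = {p_open(pH):.3f}")
    ```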

  12. Crack Opening Displacement Behavior in Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Sevener, Kathy; Tracy, Jared; Chen, Zhe; Daly, Sam; Kiser, Doug

    2017-01-01

    Ceramic Matrix Composite (CMC) modeling and life prediction strongly depend on oxidation, and therefore require a thorough understanding of when matrix cracks occur, the extent of cracking for given conditions (time-temperature-environment-stress), and the interactions of matrix cracks with fibers and interfaces. In this work, the evolution of matrix cracks in a melt-infiltrated Silicon Carbide/Silicon Carbide (SiC/SiC) CMC under uniaxial tension was examined using scanning electron microscopy (SEM) combined with digital image correlation (DIC) and manual crack opening displacement (COD) measurements. Strain relaxation due to matrix cracking, the relationship between CODs and applied stress, and damage evolution at stresses below the proportional limit were assessed. Direct experimental observation of strain relaxation adjacent to regions of matrix cracking is presented and discussed. Additionally, crack openings were found to increase linearly with increasing applied stress, and no crack was found to pass fully through the gage cross-section. This observation is discussed in the context of the assumption of through-cracks for all loading conditions and fiber architectures in oxidation modeling. Finally, the combination of SEM with DIC is demonstrated throughout to be a powerful means for damage identification and quantification in CMCs at stresses well below the proportional limit.

  13. Additivity and Interactions in Ecotoxicity of Pollutant Mixtures: Some Patterns, Conclusions, and Open Questions

    PubMed Central

    Rodea-Palomares, Ismael; González-Pleiter, Miguel; Martín-Betancor, Keila; Rosal, Roberto; Fernández-Piñas, Francisca

    2015-01-01

    Understanding the effects of exposure to chemical mixtures is a common goal of pharmacology and ecotoxicology. In risk assessment-oriented ecotoxicology, defining the scope of application of additivity models has received utmost attention in the last 20 years, since they potentially allow one to predict the effect of any chemical mixture relying on individual chemical information only. The gold standard for additivity in ecotoxicology has proven to be Loewe additivity, which gave rise to the so-called Concentration Addition (CA) additivity model. In pharmacology, the search for interactions or deviations from additivity (synergism and antagonism) has similarly captured the attention of researchers over the last 20 years and has resulted in the definition and application of the Combination Index (CI) Theorem. CI is based on Loewe additivity, but focused on the identification and quantification of synergism and antagonism. Despite additive models demonstrating a surprisingly good predictive power in chemical mixture risk assessment, concerns still exist due to the occurrence of unpredictable synergism or antagonism in certain experimental situations. In the present work, we summarize the parallel history of development of the CA, Independent Action (IA), and CI models. We also summarize the applicability of these concepts in ecotoxicology and how their information may be integrated, as well as the possibility of predicting synergism. Inside the box, the main question remaining is whether it is worthwhile to consider departures from additivity in mixture risk assessment and how to predict interactions among certain mixture components. Outside the box, the main question is whether the results observed under the experimental constraints imposed by fractional approaches are a faithful reflection of what would be expected from chemical mixtures in real-world circumstances. PMID:29051468
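    For reference, the Loewe-additivity-based expressions discussed above can be written compactly: a mixture is Concentration-Additive at effect level x when the first relation holds, and the Combination Index quantifies departures from it (CI < 1 synergism, CI = 1 additivity, CI > 1 antagonism). The notation follows common textbook convention rather than any single source.

    ```latex
    \sum_{i=1}^{n} \frac{c_i}{EC_{x,i}} = 1
    \qquad\qquad
    \mathrm{CI}_x = \sum_{i=1}^{n} \frac{d_i}{D_{x,i}}
    ```

    Here c_i (or d_i) is the concentration of component i present in the mixture that produces effect x, and EC_{x,i} (or D_{x,i}) is the concentration of component i that alone produces the same effect.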

  14. Interactive, open source, travel time scenario modelling: tools to facilitate participation in health service access analysis.

    PubMed

    Fisher, Rohan; Lassa, Jonatan

    2017-04-18

    Modelling travel time to services has become a common public health tool for planning service provision but the usefulness of these analyses is constrained by the availability of accurate input data and limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world where access to basic data is limited and travel is often complex and multi-modal. Improving the accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more wide-spread use of GIS analysis of service access and allow for complex spatial and temporal variations in service availability. These applications are an open source GIS tool-kit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process for conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modeled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion and build understanding with the goal of producing better planning outcomes. The accessible, flexible, interactive and responsive nature of the applications described has the potential to allow complex environmental social and political considerations to be incorporated and visualised. Through supporting evidence-based planning the innovative modelling practices described have the potential to help local health and emergency response planning in the developing world.
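    A self-contained sketch of the core computation such tools wrap is given below: a least-cost (Dijkstra) accumulation over a friction raster whose cell values are traversal times in minutes. The tiny raster and service location are invented, and the real applications described above add multi-modal travel, road networks, and temporal service availability on top of this kernel.

    ```python
    import heapq
    import math

    # Friction raster: minutes needed to cross each cell (illustrative 5 x 6 grid).
    friction = [
        [1, 1, 2, 4, 4, 4],
        [1, 2, 2, 4, 9, 4],
        [1, 1, 2, 2, 2, 2],
        [6, 6, 6, 6, 2, 1],
        [6, 6, 6, 6, 1, 1],
    ]
    rows, cols = len(friction), len(friction[0])
    clinic = (0, 0)                      # hypothetical health-service location


    def travel_time_surface(source):
        """Dijkstra accumulation: minutes from the source cell to every other cell."""
        dist = [[math.inf] * cols for _ in range(rows)]
        dist[source[0]][source[1]] = 0.0
        heap = [(0.0, source)]
        while heap:
            t, (r, c) = heapq.heappop(heap)
            if t > dist[r][c]:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    # cost to move between neighbours: mean of the two cell frictions
                    nt = t + 0.5 * (friction[r][c] + friction[nr][nc])
                    if nt < dist[nr][nc]:
                        dist[nr][nc] = nt
                        heapq.heappush(heap, (nt, (nr, nc)))
        return dist


    for row in travel_time_surface(clinic):
        print(" ".join(f"{v:5.1f}" for v in row))
    ```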

  15. An Open Source Framework for Coupled Hydro-Hydrogeo-Chemical Systems in Catchment Research

    NASA Astrophysics Data System (ADS)

    Delfs, J.; Sachse, A.; Gayler, S.; Grathwohl, P.; He, W.; Jang, E.; Kalbacher, T.; Klein, C.; Kolditz, O.; Maier, U.; Priesack, E.; Rink, K.; Selle, B.; Shao, H.; Singh, A. K.; Streck, T.; Sun, Y.; Wang, W.; Walther, M.

    2013-12-01

    This poster presents an open-source framework designed to assist water scientists in the study of catchment hydraulic functions with associated chemical processes, e.g. contaminant degradation, plant nutrient turnover. The model successfully calculates the feedbacks between surface water, subsurface water and air in standard benchmarks. In specific model applications to heterogeneous catchments, subsurface water is driven by density variations and runs through double porous media. Software codes of water science are tightly coupled by iteration, namely the Storm Water Management Model (SWMM) for urban runoff, Expert-N for simulating water fluxes and nutrient turnover in agricultural and forested soils, and OpenGeoSys (OGS) for groundwater. The coupled model calculates flow of hydrostatic shallow water over the land surface with finite volume and difference methods. The flow equations for water in the porous subsurface are discretized in space with finite elements. Chemical components are transferred through 1D, 2D or 3D watershed representations with advection-dispersion solvers or, as an alternative, random walk particle tracking. Additionally, a transport solver can be run in sequence with a chemical solver, e.g. PHREEQ-C or BRNS. Besides coupled partial differential equations, the concept of hydrological response units is employed in simulations at regional scale with scarce data availability. In this case, a conceptual hydrological model, specifically the Jena Adaptable Modeling System (JAMS), passes groundwater recharge through a software interface into OGS, which solves the partial differential equations of groundwater flow. Most components of the modeling framework are open source and can be modified for individual purposes. Applications range from temperate climate regions in Germany (Ammer catchment and Hessian Ried) to arid regions in the Middle East (Oman and the Dead Sea). Some of the presented examples originate from intensively monitored research sites of the WESS research centre and the monitoring initiative TERENO. Other examples originate from the IWAS project on integrated water resources management. The model applications are primarily concerned with groundwater resources, which are endangered by overexploitation, intrusion of saltwater, and nitrate loads.
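
    The iterative (Picard-style) coupling mentioned above can be pictured with a toy operator-splitting loop in which a surface store and a groundwater store exchange a flux until the interface flux converges within each time step. The two linear stores, the exchange coefficient and all values below are invented for illustration and do not correspond to SWMM, Expert-N or OpenGeoSys internals.

      def coupled_step(h_surf, h_gw, dt, k_exch=0.2, tol=1e-8, max_iter=50):
          """One time step of a toy, iteratively coupled surface/groundwater pair.

          h_surf, h_gw -- water levels of two conceptual stores (arbitrary units)
          k_exch       -- hypothetical exchange coefficient between the stores
          """
          new_surf, new_gw = h_surf, h_gw
          exch_old = float("inf")
          for _ in range(max_iter):
              # exchange flux from the averaged (semi-implicit) head difference
              exch = k_exch * 0.5 * ((h_surf + new_surf) - (h_gw + new_gw))
              new_surf = h_surf - dt * exch        # stand-in "surface" solver
              new_gw = h_gw + dt * exch            # stand-in "groundwater" solver
              if abs(exch - exch_old) < tol:       # converged interface flux
                  break
              exch_old = exch
          return new_surf, new_gw

      h_s, h_g = 2.0, 1.0
      for step in range(5):
          h_s, h_g = coupled_step(h_s, h_g, dt=0.5)
          print(f"step {step}: surface={h_s:.3f} groundwater={h_g:.3f}")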

  16. Personalized mitral valve closure computation and uncertainty analysis from 3D echocardiography.

    PubMed

    Grbic, Sasa; Easley, Thomas F; Mansi, Tommaso; Bloodworth, Charles H; Pierce, Eric L; Voigt, Ingmar; Neumann, Dominik; Krebs, Julian; Yuh, David D; Jensen, Morten O; Comaniciu, Dorin; Yoganathan, Ajit P

    2017-01-01

    Intervention planning is essential for successful Mitral Valve (MV) repair procedures. Finite-element models (FEM) of the MV could be used to achieve this goal, but the translation to the clinical domain is challenging. Many input parameters for the FEM models, such as tissue properties, are not known. In addition, only simplified MV geometry models can be extracted from non-invasive modalities such as echocardiography imaging, lacking major anatomical details such as the complex chordae topology. A traditional approach for FEM computation is to use a simplified model (also known as parachute model) of the chordae topology, which connects the papillary muscle tips to the free-edges and select basal points. Building on the existing parachute model a new and comprehensive MV model was developed that utilizes a novel chordae representation capable of approximating regional connectivity. In addition, a fully automated personalization approach was developed for the chordae rest length, removing the need for tedious manual parameter selection. Based on the MV model extracted during mid-diastole (open MV) the MV geometric configuration at peak systole (closed MV) was computed according to the FEM model. In this work the focus was placed on validating MV closure computation. The method is evaluated on ten in vitro ovine cases, where in addition to echocardiography imaging, high-resolution μCT imaging is available for accurate validation. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. 78 FR 79498 - Notice Pursuant to the National Cooperative Research and Production Act of 1993-OpenDaylight...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-30

    ... Production Act of 1993--OpenDaylight Project, Inc. Notice is hereby given that, on November 13, 2013.... 4301 et seq. (``the Act''), OpenDaylight Project, Inc. (``OpenDaylight'') has filed written.... Membership in this group research project remains open, and OpenDaylight intends to file additional written...

  18. 47 CFR 76.1513 - Open video dispute resolution.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Open video dispute resolution. 76.1513 Section... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1513 Open video dispute resolution. (a... with the following additions or changes. (b) Alternate dispute resolution. An open video system...

  19. 47 CFR 76.1513 - Open video dispute resolution.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Open video dispute resolution. 76.1513 Section... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1513 Open video dispute resolution. (a... with the following additions or changes. (b) Alternate dispute resolution. An open video system...

  20. 47 CFR 76.1513 - Open video dispute resolution.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Open video dispute resolution. 76.1513 Section... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1513 Open video dispute resolution. (a... with the following additions or changes. (b) Alternate dispute resolution. An open video system...

  1. 47 CFR 76.1513 - Open video dispute resolution.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Open video dispute resolution. 76.1513 Section... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1513 Open video dispute resolution. (a... with the following additions or changes. (b) Alternate dispute resolution. An open video system...

  2. 47 CFR 76.1513 - Open video dispute resolution.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Open video dispute resolution. 76.1513 Section... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1513 Open video dispute resolution. (a... with the following additions or changes. (b) Alternate dispute resolution. An open video system...

  3. Local and landscape scale factors influencing edge effects on woodland salamanders.

    PubMed

    Moseley, Kurtis R; Ford, W Mark; Edwards, John W

    2009-04-01

    We examined local and landscape-scale variable influence on the depth and magnitude of edge effects on woodland salamanders in mature mixed mesophytic and northern hardwood forest adjacent to natural gas well sites maintained as wildlife openings. We surveyed woodland salamander occurrence from June-August 2006 at 33 gas well sites in the Monongahela National Forest, West Virginia. We used an information-theoretic approach to test nine a priori models explaining landscape-scale effects on woodland salamander capture proportion within 20 m of field edge. Salamander capture proportion was greater within 0-60 m than 61-100 m of field edges. Similarly, available coarse woody debris proportion was greater within 0-60 m than 61-100 m of field edge. Our ASPECT model, that incorporated the single variable aspect, received the strongest support for explaining landscape-scale effects on salamander capture proportion within 20 m of opening edge. The ASPECT model indicated that fewer salamanders occurred within 20 m of opening edges on drier, hotter southwestern aspects than in moister, cooler northeastern aspects. Our results suggest that forest habitat adjacent to maintained edges and with sufficient cover still can provide suitable habitat for woodland salamander species in central Appalachian mixed mesophytic and northern hardwood forests. Additionally, our modeling results support the contention that edge effects are more severe on southwesterly aspects. These results underscore the importance of distinguishing among different edge types as well as placing survey locations within a landscape context when investigating edge impacts on woodland salamanders.
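
    The information-theoretic ranking described above ultimately reduces to comparing AIC values and Akaike weights across the candidate models. The sketch below shows that arithmetic for a few hypothetical models; the model names, log-likelihoods and parameter counts are made up and merely stand in for the fitted salamander occurrence models.

      import numpy as np

      # hypothetical fitted models: (name, maximized log-likelihood, n. parameters)
      models = [("ASPECT", -52.1, 2), ("CWD", -54.8, 2),
                ("ASPECT+CWD", -51.9, 3), ("NULL", -58.3, 1)]

      aic = np.array([-2.0 * ll + 2.0 * k for _, ll, k in models])
      delta = aic - aic.min()                  # AIC differences from the best model
      weights = np.exp(-0.5 * delta)
      weights /= weights.sum()                 # Akaike weights

      for (name, _, _), a, d, w in zip(models, aic, delta, weights):
          print(f"{name:12s} AIC={a:7.2f}  dAIC={d:5.2f}  w={w:4.2f}")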

  4. Model Experiment on the Temporary Closure of a Breached Bank

    NASA Astrophysics Data System (ADS)

    Shimada, T.; Maeda, S.; Nakashima, Y.

    2016-12-01

    In recent years, the possibility of river bank failures has been rising due to increased occurrences of floods from localized torrential downpours and typhoons. To mitigate bank failure damage, we made an experiment to simulate the flood discharge reduction effect of a temporary closure at an opening in a breached bank. A scale river model was used. A bank was made and then breached. Then, model blocks were placed to close the breach, to observe the flood discharge reduction afforded by the closure. We assumed that the blocks would be placed by a crane or from a helicopter, so we placed the model blocks accordingly. Regardless of the placement method, the flood discharge reduction was about 20% when about 50% of the breach was closed by the placement of blocks starting from the upstream-most portion of the breach. That result was because the water flow hit the tip of the placed closure, scoured the bed near the tip, and lowered the bed at the remaining part of the breach opening, after which the area where water flows out did not decrease at the same rate as the rate of longitudinal closure for the breach. In addition, with each successive length of breach closure, the required number of blocks increased and the closure progress decreased, because of the bed degradation. The results show that it is possible to reduce the flood flow from a bank breach effectively while closing the opening by taking measures to reduce bed scouring near the breach.

  5. SedInConnect: a stand-alone, free and open source tool for the assessment of sediment connectivity

    NASA Astrophysics Data System (ADS)

    Crema, Stefano; Cavalli, Marco

    2018-02-01

    There is a growing call, within the scientific community, for solid theoretic frameworks and usable indices/models to assess sediment connectivity. Connectivity plays a significant role in characterizing structural properties of the landscape and, when considered in combination with forcing processes (e.g., rainfall-runoff modelling), can represent a valuable analysis for an improved landscape management. In this work, the authors present the development and application of SedInConnect: a free, open source and stand-alone application for the computation of the Index of Connectivity (IC), as expressed in Cavalli et al. (2013) with the addition of specific innovative features. The tool is intended to have a wide variety of users, both from the scientific community and from the authorities involved in the environmental planning. Thanks to its open source nature, the tool can be adapted and/or integrated according to the users' requirements. Furthermore, presenting an easy-to-use interface and being a stand-alone application, the tool can help management experts in the quantitative assessment of sediment connectivity in the context of hazard and risk assessment. An application to a sample dataset and an overview on up-to-date applications of the approach and of the tool shows the development potential of such analyses. The modelled connectivity, in fact, appears suitable not only to characterize sediment dynamics at the catchment scale but also to integrate prediction models and as a tool for helping geomorphological interpretation.
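
    For orientation, the Index of Connectivity computed by the tool follows the form proposed by Borselli et al. (2008) as modified by Cavalli et al. (2013); written out, with W a weighting factor, S slope, A the upslope contributing area and d_i the length of the i-th cell along the downslope flow path:

      \[ IC = \log_{10}\!\left(\frac{D_{up}}{D_{dn}}\right), \qquad D_{up} = \bar{W}\,\bar{S}\,\sqrt{A}, \qquad D_{dn} = \sum_i \frac{d_i}{W_i S_i} \]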

  6. Risk analysis of emergent water pollution accidents based on a Bayesian Network.

    PubMed

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Sun, Jie

    2016-01-01

    To guarantee the security of water quality in water transfer channels, especially in open channels, analysis of potential emergent pollution sources in the water transfer process is critical. It is also indispensable for forewarnings and protection from emergent pollution accidents. Bridges above open channels with large amounts of truck traffic are the main locations where emergent accidents could occur. A Bayesian Network model, which consists of six root nodes and three middle layer nodes, was developed in this paper, and was employed to identify the possibility of potential pollution risk. Dianbei Bridge is reviewed as a typical bridge on an open channel of the Middle Route of the South to North Water Transfer Project where emergent traffic accidents could occur. This study focuses on the risk of water pollution caused by leakage of pollutants into the water. The risk for potential traffic accidents at the Dianbei Bridge implies a risk for water pollution in the canal. Based on survey data, statistical analysis, and domain specialist knowledge, a Bayesian Network model was established. The human factor of emergent accidents has been considered in this model. Additionally, this model has been employed to describe the probability of accidents and the risk level. The factors to which pollution accidents are most sensitive have been deduced. The case in which these sensitive factors are in the states most likely to lead to accidents has also been simulated. Copyright © 2015 Elsevier Ltd. All rights reserved.
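
    A Bayesian Network of this kind can be sketched with the pgmpy library (assuming a recent version where the model class is named BayesianNetwork); the structure below is a loose, much smaller caricature of the setting above, and every node name and probability value is invented for illustration.

      from pgmpy.models import BayesianNetwork
      from pgmpy.factors.discrete import TabularCPD
      from pgmpy.inference import VariableElimination

      # toy structure: weather and truck volume influence accidents, which may cause spills
      model = BayesianNetwork([("Weather", "Accident"),
                               ("TruckVolume", "Accident"),
                               ("Accident", "Spill")])

      cpd_weather = TabularCPD("Weather", 2, [[0.7], [0.3]])        # 0 = good, 1 = bad
      cpd_truck = TabularCPD("TruckVolume", 2, [[0.6], [0.4]])      # 0 = low, 1 = high
      cpd_accident = TabularCPD("Accident", 2,
                                [[0.995, 0.98, 0.99, 0.95],         # no accident
                                 [0.005, 0.02, 0.01, 0.05]],        # accident
                                evidence=["Weather", "TruckVolume"],
                                evidence_card=[2, 2])
      cpd_spill = TabularCPD("Spill", 2, [[0.999, 0.7], [0.001, 0.3]],
                             evidence=["Accident"], evidence_card=[2])

      model.add_cpds(cpd_weather, cpd_truck, cpd_accident, cpd_spill)
      assert model.check_model()

      infer = VariableElimination(model)
      print(infer.query(["Spill"], evidence={"Weather": 1, "TruckVolume": 1}))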

  7. Combining the GW formalism with the polarizable continuum model: A state-specific non-equilibrium approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duchemin, Ivan, E-mail: ivan.duchemin@cea.fr; Jacquemin, Denis; Institut Universitaire de France, 1 rue Descartes, 75005 Paris Cedex 5

    We have implemented the polarizable continuum model within the framework of the many-body Green’s function GW formalism for the calculation of electron addition and removal energies in solution. The present formalism includes both ground-state and non-equilibrium polarization effects. In addition, the polarization energies are state-specific, allowing one to obtain the bath-induced renormalisation energy of all occupied and virtual energy levels. Our implementation is validated by comparisons with ΔSCF calculations performed at both the density functional theory and coupled-cluster single and double levels for solvated nucleobases. The present study opens the way to GW and Bethe-Salpeter calculations in disordered condensed phases of interest in organic optoelectronics, wet chemistry, and biology.

  8. The impact of hydrophobic hernia mesh coating by omega fatty acid on atraumatic fibrin sealant fixation.

    PubMed

    Gruber-Blum, S; Brand, J; Keibl, C; Redl, H; Fortelny, R H; May, C; Petter-Puchner, A H

    2015-08-01

    Fibrin sealant (FS) is a safe and efficient fixation method in open intraperitoneal hernia repair. While favourable results have been achieved with hydrophilic meshes, hydrophobic (such as Omega fatty acid coated) meshes (OFM) have not been specifically assessed so far. Atrium C-qur lite(®) mesh was tested in rats in models of open onlay and intraperitoneal hernia repair. 44 meshes (2 × 2 cm) were implanted in 30 male Sprague-Dawley rats in open (n = 2 meshes per animal) and intraperitoneal technique (IPOM; n = 1 mesh per animal). Animals were randomised to four groups: onlay and IPOM sutured vs. sealed. Follow-up was 6 weeks, sutured groups serving as controls. Evaluation criteria were mesh dislocation, adhesions and foreign body reaction. FS provided a reliable fixation in onlay technique, whereas OFM meshes dislocated in the IPOM position when sealed only. FS mesh fixation was safe with OFM meshes in open onlay repair. Intraperitoneal placement of hydrophobic meshes requires additional fixation and cannot be achieved with FS alone.

  9. Molecular Dynamics of Mouse Acetylcholinesterase Complexed with Huperzine A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tara, Sylvia; Helms, Volkhard H.; Straatsma, TP

    1999-03-16

    Two molecular dynamics simulations were performed for a modeled complex of mouse acetylcholinesterase liganded with huperzine A (HupA). Analysis of these simulations shows that HupA shifts in the active site toward Tyr 337 and Phe 338, and that several residues in the active site area reach out to make hydrogen bonds with the inhibitor. Rapid fluctuations of the gorge width are observed, ranging from widths that allow substrate access to the active site, to pinched structures that do not allow access of molecules as small as water. Additional openings or channels to the active site are found. One opening is formed in the side wall of the active site gorge by residues Val 73, Asp 74, Thr 83, Glu 84, and Asn 87. Another opening is formed at the base of the gorge by residues Trp 86, Val 132, Glu 202, Gly 448, and Ile 451. Both of these openings have been observed separately in the Torpedo californica form of the enzyme. These channels could allow transport of waters and ions to and from the bulk solution.

  10. Automatic threshold selection for multi-class open set recognition

    NASA Astrophysics Data System (ADS)

    Scherreik, Matthew; Rigling, Brian

    2017-05-01

    Multi-class open set recognition is the problem of supervised classification with additional unknown classes encountered after a model has been trained. An open set classifier often has two core components. The first component is a base classifier which estimates the most likely class of a given example. The second component consists of open set logic which estimates whether the example is truly a member of the candidate class. Such a system is operated in a feed-forward fashion. That is, a candidate label is first estimated by the base classifier, and the true membership of the example to the candidate class is estimated afterward. Previous works have developed an iterative threshold selection algorithm for rejecting examples from classes which were not present at training time. In those studies, a Platt-calibrated SVM was used as the base classifier, and the thresholds were applied to class posterior probabilities for rejection. In this work, we investigate the effectiveness of other base classifiers when paired with the threshold selection algorithm and compare their performance with the original SVM solution.
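
    A minimal version of the feed-forward scheme described above uses a probability-calibrated SVM as the base classifier and a fixed posterior-probability threshold for rejection; the iterative threshold selection of the cited works is not reproduced here, and the data, threshold value and helper names are placeholders.

      import numpy as np
      from sklearn.datasets import make_blobs
      from sklearn.svm import SVC

      # three known classes at training time; a fourth, unseen cluster appears at test time
      X_train, y_train = make_blobs(n_samples=300, centers=[(-4, 0), (0, 4), (4, 0)],
                                    random_state=0)
      X_unknown, _ = make_blobs(n_samples=50, centers=[(0, -6)], random_state=1)

      base = SVC(kernel="rbf", probability=True, random_state=0).fit(X_train, y_train)

      def open_set_predict(clf, X, threshold=0.9, reject_label=-1):
          """Feed-forward open set prediction: accept the candidate class only if
          its posterior probability exceeds the (here fixed) threshold."""
          proba = clf.predict_proba(X)
          candidate = proba.argmax(axis=1)
          accepted = proba.max(axis=1) >= threshold
          labels = clf.classes_[candidate]
          return np.where(accepted, labels, reject_label)

      # labels of -1 mark examples rejected as belonging to an unknown class
      print(open_set_predict(base, X_unknown)[:10])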

  11. Cloud regimes as phase transitions

    NASA Astrophysics Data System (ADS)

    Stechmann, Samuel; Hottovy, Scott

    2017-11-01

    Clouds are repeatedly identified as a leading source of uncertainty in future climate predictions. Of particular importance are stratocumulus clouds, which can appear as either (i) closed cells that reflect solar radiation back to space or (ii) open cells that allow solar radiation to reach the Earth's surface. Here we show that these clouds regimes - open versus closed cells - fit the paradigm of a phase transition. In addition, this paradigm characterizes pockets of open cells (POCs) as the interface between the open- and closed-cell regimes, and it identifies shallow cumulus clouds as a regime of higher variability. This behavior can be understood using an idealized model for the dynamics of atmospheric water as a stochastic diffusion process. Similar viewpoints of deep convection and self-organized criticality will also be discussed. With these new conceptual viewpoints, ideas from statistical mechanics could potentially be used for understanding uncertainties related to clouds in the climate system and climate predictions. The research of S.N.S. is partially supported by a Sloan Research Fellowship, ONR Young Investigator Award N00014-12-1-0744, and ONR MURI Grant N00014-12-1-0912.
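
    The picture of two cloud regimes as two phases of a stochastic diffusion process can be caricatured with an Euler-Maruyama simulation of a particle in a double-well potential: the two wells stand in for the two regimes and noise drives transitions between them. The potential and all parameters below are illustrative only and are not the model of the cited work.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate_double_well(n_steps=50_000, dt=1e-2, sigma=0.9):
          """Euler-Maruyama integration of dq = -V'(q) dt + sigma dW,
          with the double-well potential V(q) = (q**2 - 1)**2 / 4."""
          q = np.empty(n_steps)
          q[0] = 1.0                                        # start in one well
          for i in range(1, n_steps):
              drift = -q[i - 1] * (q[i - 1] ** 2 - 1.0)     # -dV/dq
              q[i] = q[i - 1] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
          return q

      q = simulate_double_well()
      print("fraction of time in the q > 0 well:", np.mean(q > 0).round(3))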

  12. Additively manufactured porous tantalum implants.

    PubMed

    Wauthle, Ruben; van der Stok, Johan; Amin Yavari, Saber; Van Humbeeck, Jan; Kruth, Jean-Pierre; Zadpoor, Amir Abbas; Weinans, Harrie; Mulier, Michiel; Schrooten, Jan

    2015-03-01

    The medical device industry's interest in open porous, metallic biomaterials has increased in response to additive manufacturing techniques enabling the production of complex shapes that cannot be produced with conventional techniques. Tantalum is an important metal for medical devices because of its good biocompatibility. In this study selective laser melting technology was used for the first time to manufacture highly porous pure tantalum implants with fully interconnected open pores. The architecture of the porous structure in combination with the material properties of tantalum result in mechanical properties close to those of human bone and allow for bone ingrowth. The bone regeneration performance of the porous tantalum was evaluated in vivo using an orthotopic load-bearing bone defect model in the rat femur. After 12 weeks, substantial bone ingrowth, good quality of the regenerated bone and a strong, functional implant-bone interface connection were observed. Compared to identical porous Ti-6Al-4V structures, laser-melted tantalum shows excellent osteoconductive properties, has a higher normalized fatigue strength and allows for more plastic deformation due to its high ductility. It is therefore concluded that this is a first step towards a new generation of open porous tantalum implants manufactured using selective laser melting. Copyright © 2014 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  13. A Framework for Daylighting Optimization in Whole Buildings with OpenStudio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2016-08-12

    We present a toolkit and workflow for leveraging the OpenStudio (Guglielmetti et al. 2010) platform to perform daylighting analysis and optimization in a whole building energy modeling (BEM) context. We have re-implemented OpenStudio's integrated Radiance and EnergyPlus functionality as an OpenStudio Measure. The OpenStudio Radiance Measure works within the OpenStudio Application and Parametric Analysis Tool, as well as the OpenStudio Server large scale analysis framework, allowing a rigorous daylighting simulation to be performed on a single building model or potentially an entire population of programmatically generated models. The Radiance simulation results can automatically inform the broader building energy model, and provide dynamic daylight metrics as a basis for decision. Through introduction and example, this paper illustrates the utility of the OpenStudio building energy modeling platform to leverage existing simulation tools for integrated building energy performance simulation, daylighting analysis, and reportage.

  14. Analysis of the process of representing clinical statements for decision-support applications: a comparison of openEHR archetypes and HL7 virtual medical record.

    PubMed

    González-Ferrer, A; Peleg, M; Marcos, M; Maldonado, J A

    2016-07-01

    Delivering patient-specific decision-support based on computer-interpretable guidelines (CIGs) requires mapping CIG clinical statements (data items, clinical recommendations) into patients' data. This is most effectively done via intermediate data schemas, which enable querying the data according to the semantics of a shared standard intermediate schema. This study aims to evaluate the use of HL7 virtual medical record (vMR) and openEHR archetypes as intermediate schemas for capturing clinical statements from CIGs that are mappable to electronic health records (EHRs) containing patient data and patient-specific recommendations. Using qualitative research methods, we analyzed the encoding of ten representative clinical statements taken from two CIGs used in real decision-support systems into two health information models (openEHR archetypes and HL7 vMR instances) by four experienced informaticians. Discussion among the modelers about each case study example greatly increased our understanding of the capabilities of these standards, which we share in this educational paper. Differing in content and structure, the openEHR archetypes were found to contain a greater level of representational detail and structure while the vMR representations took fewer steps to complete. The use of openEHR in the encoding of CIG clinical statements could potentially facilitate applications other than decision-support, including intelligent data analysis and integration of additional properties of data items from existing EHRs. On the other hand, due to their smaller size and fewer details, the use of vMR potentially supports quicker mapping of EHR data into clinical statements.

  15. Open access of evidence-based publications: the case of the orthopedic and musculoskeletal literature.

    PubMed

    Yammine, Kaissar

    2015-11-01

    The open access model, where researchers can publish their work and make it freely available to the whole medical community, is gaining ground over the traditional type of publication. However, fees are to be paid by either the authors or their institutions. The purpose of this paper is to assess the proportion and type of open access evidence-based articles in the form of systematic reviews and meta-analyses in the field of musculoskeletal disorders and orthopedic surgery. PubMed database was searched and the results showed a maximal number of hits for low back pain and total hip arthroplasty. We demonstrated that despite a 10-fold increase in the number of evidence-based publications in the past 10 years, the rate of free systematic reviews in the general biomedical literature did not change for the last two decades. In addition, the average percentage of free open access systematic reviews and meta-analyses for the commonest painful musculoskeletal conditions and orthopedic procedures was 20% and 18%, respectively. Those results were significantly lower than those of the systematic reviews and meta-analyses in the remaining biomedical research. Such findings could indicate a divergence between the efforts engaged at promoting evidence-based principles and those at disseminating evidence-based findings in the field of musculoskeletal disease and trauma. The high processing fee is thought to be a major limitation when considering open access model for publication. © 2015 Chinese Cochrane Center, West China Hospital of Sichuan University and Wiley Publishing Asia Pty Ltd.

  16. Further Development of Verification Check-Cases for Six- Degree-of-Freedom Flight Vehicle Simulations

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Madden, Michael M.; Shelton, Robert; Jackson, A. A.; Castro, Manuel P.; Noble, Deleena M.; Zimmerman, Curtis J.; Shidner, Jeremy D.; White, Joseph P.; Dutta, Doumyo; hide

    2015-01-01

    This follow-on paper describes the principal methods of implementing, and documents the results of exercising, a set of six-degree-of-freedom rigid-body equations of motion and planetary geodetic, gravitation and atmospheric models for simple vehicles in a variety of endo- and exo-atmospheric conditions with various NASA, and one popular open-source, engineering simulation tools. This effort is intended to provide an additional means of verification of flight simulations. The models used in this comparison, as well as the resulting time-history trajectory data, are available electronically for persons and organizations wishing to compare their flight simulation implementations of the same models.

  17. Impossibility of Classically Simulating One-Clean-Qubit Model with Multiplicative Error

    NASA Astrophysics Data System (ADS)

    Fujii, Keisuke; Kobayashi, Hirotada; Morimae, Tomoyuki; Nishimura, Harumichi; Tamate, Shuhei; Tani, Seiichiro

    2018-05-01

    The one-clean-qubit model (or the deterministic quantum computation with one quantum bit model) is a restricted model of quantum computing where all but a single input qubits are maximally mixed. It is known that the probability distribution of measurement results on three output qubits of the one-clean-qubit model cannot be classically efficiently sampled within a constant multiplicative error unless the polynomial-time hierarchy collapses to the third level [T. Morimae, K. Fujii, and J. F. Fitzsimons, Phys. Rev. Lett. 112, 130502 (2014), 10.1103/PhysRevLett.112.130502]. It was open whether we can keep the no-go result while reducing the number of output qubits from three to one. Here, we solve the open problem affirmatively. We also show that the third-level collapse of the polynomial-time hierarchy can be strengthened to the second-level one. The strengthening of the collapse level from the third to the second also holds for other subuniversal models such as the instantaneous quantum polynomial model [M. Bremner, R. Jozsa, and D. J. Shepherd, Proc. R. Soc. A 467, 459 (2011), 10.1098/rspa.2010.0301] and the boson sampling model [S. Aaronson and A. Arkhipov, STOC 2011, p. 333]. We additionally study the classical simulatability of the one-clean-qubit model with further restrictions on the circuit depth or the gate types.

  18. Numerical Modeling of Trinity River Shoaling below Wallisville, Texas

    DTIC Science & Technology

    2015-02-01

    levees, the hydraulic deltaic process of finding the most efficient pathway to open water controls the flow direction and speed. Additionally, changes...events to allow flow to pass through the structures. During the dry season the structures are normally closed to control salt water intrusion. The... levees and natural ridges, which have low spots and channels that have incised from previous floods. Second, once the flood waters are outside the

  19. Additions and improvements to the high energy density physics capabilities in the FLASH code

    NASA Astrophysics Data System (ADS)

    Lamb, D.; Bogale, A.; Feister, S.; Flocke, N.; Graziani, C.; Khiar, B.; Laune, J.; Tzeferacos, P.; Walker, C.; Weide, K.

    2017-10-01

    FLASH is an open-source, finite-volume Eulerian, spatially-adaptive radiation magnetohydrodynamics code that has the capabilities to treat a broad range of physical processes. FLASH performs well on a wide range of computer architectures, and has a broad user base. Extensive high energy density physics (HEDP) capabilities exist in FLASH, which make it a powerful open toolset for the academic HEDP community. We summarize these capabilities, emphasizing recent additions and improvements. We describe several non-ideal MHD capabilities that are being added to FLASH, including the Hall and Nernst effects, implicit resistivity, and a circuit model, which will allow modeling of Z-pinch experiments. We showcase the ability of FLASH to simulate Thomson scattering polarimetry, which measures Faraday rotation due to the presence of magnetic fields, as well as proton radiography, proton self-emission, and Thomson scattering diagnostics. Finally, we describe several collaborations with the academic HEDP community in which FLASH simulations were used to design and interpret HEDP experiments. This work was supported in part at U. Chicago by DOE NNSA ASC through the Argonne Institute for Computing in Science under FWP 57789; DOE NNSA under NLUF Grant DE-NA0002724; DOE SC OFES Grant DE-SC0016566; and NSF Grant PHY-1619573.

  20. JASPAR 2010: the greatly expanded open-access database of transcription factor binding profiles

    PubMed Central

    Portales-Casamar, Elodie; Thongjuea, Supat; Kwon, Andrew T.; Arenillas, David; Zhao, Xiaobei; Valen, Eivind; Yusuf, Dimas; Lenhard, Boris; Wasserman, Wyeth W.; Sandelin, Albin

    2010-01-01

    JASPAR (http://jaspar.genereg.net) is the leading open-access database of matrix profiles describing the DNA-binding patterns of transcription factors (TFs) and other proteins interacting with DNA in a sequence-specific manner. Its fourth major release is the largest expansion of the core database to date: the database now holds 457 non-redundant, curated profiles. The new entries include the first batch of profiles derived from ChIP-seq and ChIP-chip whole-genome binding experiments, and 177 yeast TF binding profiles. The introduction of a yeast division brings the convenience of JASPAR to an active research community. As binding models are refined by newer data, the JASPAR database now uses versioning of matrices: in this release, 12% of the older models were updated to improved versions. Classification of TF families has been improved by adopting a new DNA-binding domain nomenclature. A curated catalog of mammalian TFs is provided, extending the use of the JASPAR profiles to additional TFs belonging to the same structural family. The changes in the database set the system ready for more rapid acquisition of new high-throughput data sources. Additionally, three new special collections provide matrix profile data produced by recent alternative high-throughput approaches. PMID:19906716

  1. A one-dimensional heat-transport model for conduit flow in karst aquifers

    USGS Publications Warehouse

    Long, Andrew J.; Gilcrease, P.C.

    2009-01-01

    A one-dimensional heat-transport model for conduit flow in karst aquifers is presented as an alternative to two- or three-dimensional distributed-parameter models, which are data intensive and require knowledge of conduit locations. This model can be applied in cases where the temperature is measured at a well or spring that receives all or part of its water from a phreatic conduit. Heat transport in the conduit is simulated by using a physically-based heat-transport equation that accounts for inflow of diffuse flow from smaller openings and fissures in the surrounding aquifer during periods of low recharge. Additional diffuse flow that is within the zone of influence of the well or spring but has not interacted with the conduit is accounted for with a binary mixing equation to proportion these different water sources. The estimation of this proportion through inverse modeling is useful for the assessment of contaminant vulnerability and well-head or spring protection. The model was applied to 7 months of continuous temperature data for a sinking stream that recharges a conduit and a pumped well open to the Madison aquifer in western South Dakota. The simulated conduit-flow fraction to the well ranged from 2% to 31% of total flow, and simulated conduit velocity ranged from 44 to 353 m/d.
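
    The binary mixing step described above amounts to one linear equation in the conduit-flow fraction f. As a small numerical illustration (all temperatures invented, not taken from the study), f follows directly from the measured spring or well temperature and the two end-member temperatures:

      def conduit_fraction(t_spring, t_conduit, t_diffuse):
          """Solve t_spring = f*t_conduit + (1 - f)*t_diffuse for the
          conduit-flow fraction f (end-member mixing, temperatures in deg C)."""
          return (t_spring - t_diffuse) / (t_conduit - t_diffuse)

      # hypothetical values: warm sinking-stream water mixing with cooler diffuse flow
      print(conduit_fraction(t_spring=11.2, t_conduit=16.0, t_diffuse=10.5))  # ~0.13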

  2. LakeMetabolizer: An R package for estimating lake metabolism from free-water oxygen using diverse statistical models

    USGS Publications Warehouse

    Winslow, Luke; Zwart, Jacob A.; Batt, Ryan D.; Dugan, Hilary; Woolway, R. Iestyn; Corman, Jessica; Hanson, Paul C.; Read, Jordan S.

    2016-01-01

    Metabolism is a fundamental process in ecosystems that crosses multiple scales of organization from individual organisms to whole ecosystems. To improve sharing and reuse of published metabolism models, we developed LakeMetabolizer, an R package for estimating lake metabolism from in situ time series of dissolved oxygen, water temperature, and, optionally, additional environmental variables. LakeMetabolizer implements 5 different metabolism models with diverse statistical underpinnings: bookkeeping, ordinary least squares, maximum likelihood, Kalman filter, and Bayesian. Each of these 5 metabolism models can be combined with 1 of 7 models for computing the coefficient of gas exchange across the air–water interface (k). LakeMetabolizer also features a variety of supporting functions that compute conversions and implement calculations commonly applied to raw data prior to estimating metabolism (e.g., oxygen saturation and optical conversion models). These tools have been organized into an R package that contains example data, example use-cases, and function documentation. The release package version is available on the Comprehensive R Archive Network (CRAN), and the full open-source GPL-licensed code is freely available for examination and extension online. With this unified, open-source, and freely available package, we hope to improve access and facilitate the application of metabolism in studies and management of lentic ecosystems.

  3. Simulating wind and marine hydrokinetic turbines with actuator lines in RANS and LES

    NASA Astrophysics Data System (ADS)

    Bachant, Peter; Wosnik, Martin

    2015-11-01

    As wind and marine hydrokinetic (MHK) turbine designs mature, focus is shifting towards improving turbine array layouts for maximizing overall power output, i.e., minimizing wake interference for axial-flow or horizontal-axis turbines, or taking advantage of constructive wake interaction for cross-flow or vertical-axis turbines. Towards this goal, an actuator line model (ALM) was developed to provide a computationally feasible method for simulating full turbine arrays inside Navier-Stokes models. The ALM predicts turbine loading with the blade element method combined with sub-models for dynamic stall and flow curvature. The open-source software is written as an extension library for the OpenFOAM CFD package, which allows the ALM body force to be applied to its standard RANS and LES solvers. Turbine forcing is also applied to volume of fluid (VOF) models, e.g., for predicting free surface effects on submerged MHK devices. An additional sub-model is considered for injecting turbulence model scalar quantities based on actuator line element loading. Results are presented for the simulation of performance and wake dynamics of axial- and cross-flow turbines and compared with moderate Reynolds number experiments and body-fitted mesh, blade-resolving CFD. Work supported by NSF-CBET grant 1150797.
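
    The core of an actuator line body force, before any dynamic stall or flow curvature corrections, is a blade element calculation of the kind sketched below. The lift and drag coefficients here come from a crude thin-airfoil/quadratic-polar approximation rather than the airfoil tables a real ALM would use, and all numbers are placeholders rather than values from the cited software.

      import numpy as np

      def element_force(u_axial, omega, r, chord, twist_deg, rho=1000.0, dr=0.05):
          """Force on one actuator line element of an axial-flow rotor blade.

          u_axial -- local axial (inflow) velocity at the element [m/s]
          omega   -- rotor angular speed [rad/s]
          r       -- element radius [m]; chord and dr in m; twist in degrees
          """
          u_tan = omega * r                              # tangential blade speed
          u_rel = np.hypot(u_axial, u_tan)               # relative velocity magnitude
          phi = np.arctan2(u_axial, u_tan)               # inflow angle
          alpha = phi - np.radians(twist_deg)            # angle of attack
          cl = 2.0 * np.pi * alpha                       # thin-airfoil lift (placeholder)
          cd = 0.01 + 0.05 * alpha**2                    # crude drag polar (placeholder)
          q = 0.5 * rho * u_rel**2 * chord * dr          # dynamic pressure times area
          lift, drag = q * cl, q * cd
          # project lift/drag onto axial (thrust) and tangential (torque) directions
          f_axial = lift * np.cos(phi) + drag * np.sin(phi)
          f_tan = lift * np.sin(phi) - drag * np.cos(phi)
          return f_axial, f_tan

      print(element_force(u_axial=1.0, omega=4.0, r=1.0, chord=0.1, twist_deg=5.0))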

  4. Theoretical modeling of the MILES hit profiles in military weapon low-data rate simulators

    NASA Astrophysics Data System (ADS)

    Andrews, L. C.; Phillips, R. L.; Smith, C. A.; Belichki, S. B.; Crabbs, R.; Cofarro, J. T.; Fountain, W.; Tucker, F. M.; Parrish, B. J.

    2016-09-01

    Math modeling of a low-data-rate optical communication system is presented and compared with recent testing results over ranges up to 100 m in an indoor tunnel at Kennedy Space Center. Additional modeling of outdoor testing results at longer ranges in the open atmosphere is also presented. The system modeled is the Army's Multiple Integrated Laser Engagement System (MILES) that has been used as a tactical training system since the early 1980s. The objective of the current modeling and testing is to obtain target hit zone profiles for the M16A2/M4 rifles and establish a data baseline for MILES that will aid in its upgrade using more recently developed lasers and detectors.

  5. Temporal Decomposition of a Distribution System Quasi-Static Time-Series Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mather, Barry A; Hunsberger, Randolph J

    This paper documents the first phase of an investigation into reducing runtimes of complex OpenDSS models through parallelization. As the method seems promising, future work will quantify - and further mitigate - errors arising from this process. In this initial report, we demonstrate how, through the use of temporal decomposition, the run times of a complex distribution-system-level quasi-static time series simulation can be reduced roughly proportional to the level of parallelization. Using this method, the monolithic model runtime of 51 hours was reduced to a minimum of about 90 minutes. As expected, this comes at the expense of control- and voltage-errors at the time-slice boundaries. All evaluations were performed using a real distribution circuit model with the addition of 50 PV systems - representing a mock complex PV impact study. We are able to reduce induced transition errors through the addition of controls initialization, though small errors persist. The time savings with parallelization are so significant that we feel additional investigation to reduce control errors is warranted.
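
    The temporal decomposition itself is independent of the simulator: the yearly horizon is cut into contiguous chunks that are solved in parallel and then stitched back together, which is exactly where the boundary-initialization errors discussed above enter. A generic sketch is shown below, with run_chunk standing in as a hypothetical placeholder for a call into the actual quasi-static time-series engine.

      from concurrent.futures import ProcessPoolExecutor
      import numpy as np

      def run_chunk(bounds):
          """Placeholder for one contiguous block of time steps; a real study
          would initialize controls and call the QSTS engine here."""
          start, stop = bounds
          hours = np.arange(start, stop)
          return 1.0 + 0.1 * np.sin(2 * np.pi * hours / 24.0)   # fake per-hour result

      def parallel_qsts(n_steps=8760, n_chunks=8):
          edges = np.linspace(0, n_steps, n_chunks + 1, dtype=int)
          chunks = list(zip(edges[:-1], edges[1:]))
          with ProcessPoolExecutor(max_workers=n_chunks) as pool:
              parts = list(pool.map(run_chunk, chunks))
          return np.concatenate(parts)          # stitched year-long time series

      if __name__ == "__main__":
          series = parallel_qsts()
          print(series.shape)                   # (8760,)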

  6. Cations SkQ1 and MitoQ accumulated in mitochondria delay opening of ascorbate/FeSO4-induced nonspecific pore in the inner mitochondrial membrane.

    PubMed

    Khailova, L S; Dedukhova, V I; Mokhova, E N

    2008-10-01

    It is known that an addition of FeSO4 in the presence of ascorbic acid to cells or mitochondria can injure energy coupling and some other functions in mitochondria. The present study demonstrates that decrease in ascorbate concentration from 4 to 0.2 mM in the presence of the same low concentrations of FeSO4 accelerates the nonspecific pore opening, while cyclosporin A prevents and under some conditions reverses the pore opening. Hydrophobic cations SkQ1 and MitoQ (structural analogs of plastoquinone and coenzyme Q(10), respectively) delay pore opening, SkQ1 being more efficient. It is known that an increase in matrix ADP concentration delays pore opening, while an addition of carboxyatractylate to mitochondria accelerates the beginning of pore opening. Preliminary addition of SkQ1 into a mitochondrial suspension increased the effect of ADP and decreased the effect of carboxyatractylate. These results suggest that under the conditions used SkQ1 protects mitochondria from oxidative damage as an antioxidant when added at extremely low concentrations.

  7. 50 CFR 660.311 - Open access fishery-definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 50 Wildlife and Fisheries 13 2012-10-01 2012-10-01 false Open access fishery-definitions. 660.311... Groundfish-Open Access Fisheries § 660.311 Open access fishery—definitions. General definitions for the... specific to the open access fishery covered in this subpart and are in addition to those specified at § 660...

  8. 50 CFR 660.311 - Open access fishery-definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 50 Wildlife and Fisheries 11 2011-10-01 2011-10-01 false Open access fishery-definitions. 660.311... Groundfish-Open Access Fisheries § 660.311 Open access fishery—definitions. General definitions for the... specific to the open access fishery covered in this subpart and are in addition to those specified at § 660...

  9. 50 CFR 660.311 - Open access fishery-definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Open access fishery-definitions. 660.311... Groundfish-Open Access Fisheries § 660.311 Open access fishery—definitions. General definitions for the... specific to the open access fishery covered in this subpart and are in addition to those specified at § 660...

  10. 50 CFR 660.311 - Open access fishery-definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 50 Wildlife and Fisheries 13 2013-10-01 2013-10-01 false Open access fishery-definitions. 660.311... Groundfish-Open Access Fisheries § 660.311 Open access fishery—definitions. General definitions for the... specific to the open access fishery covered in this subpart and are in addition to those specified at § 660...

  11. 50 CFR 660.311 - Open access fishery-definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 50 Wildlife and Fisheries 13 2014-10-01 2014-10-01 false Open access fishery-definitions. 660.311... Groundfish-Open Access Fisheries § 660.311 Open access fishery—definitions. General definitions for the... specific to the open access fishery covered in this subpart and are in addition to those specified at § 660...

  12. Modelling the Composition of Outgassing Bubbles at Basaltic Open Vent Volcanoes

    NASA Astrophysics Data System (ADS)

    Edmonds, M.; Clements, N.; Houghton, B. F.; Oppenheimer, C.; Jones, R. L.; Burton, M. R.

    2015-12-01

    Basaltic open vent volcanoes exhibit a wide range in eruption styles, from passive outgassing to Strombolian and Hawaiian explosive activity. Transitions between these styles are linked to contrasting two-phase (melt and gas) flow regimes in the conduit system. A wealth of data now exists characterising the fluxes and compositions of gases emitted from these volcanoes, alongside detailed observations of patterns of outgassing at the magma free surfaces. Complex variations in gas composition are apparent from high temporal resolution measurement techniques such as open path spectroscopy. This variability with time is likely a function of individual bubbles' histories of growth during ascent, with variable degrees of kinetic inhibition. Our previous studies at Kilauea and Stromboli have, for example, linked CO2-rich gases with the bursting of bubbles that last equilibrated at some depth beneath the surface. However, very few studies have attempted to reconcile such observations with quantitative models of diffusion-limited bubble growth in magmas prior to eruption. We present here an analytical model that simulates the growth of populations of bubbles by addition of volatile mass during decompression, with growth limited by diffusion. The model simulates a range of behaviors between the end members of separated two-phase flow and homogeneous bubbly flow in the conduit, tied to thermodynamic models of solubility and partitioning of volatile species (carbon, water, sulfur). We explore the effects of the form of bubble populations at depth, melt viscosity, total volatile content, magma decompression rate and other intrinsic parameters on expected gas compositions at the surface and consider implications for transitions between eruption styles. We compare the model to data suites from Stromboli and Kilauea.

  13. Aeroacoustic Study of a High-Fidelity Aircraft Model: Part 1- Steady Aerodynamic Measurements

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Hannon, Judith A.; Neuhart, Danny H.; Markowski, Gregory A.; VandeVen, Thomas

    2012-01-01

    In this paper, we present steady aerodynamic measurements for an 18% scale model of a Gulfstream aircraft. The high fidelity and highly-instrumented semi-span model was developed to perform detailed aeroacoustic studies of airframe noise associated with main landing gear/flap components and gear-flap interaction noise, as well as to evaluate novel noise reduction concepts. The aeroacoustic tests, being conducted in the NASA Langley Research Center 14- by 22-Foot Subsonic Tunnel, are split into two entries. The first entry, completed November 2010, was entirely devoted to the detailed mapping of the aerodynamic characteristics of the fabricated model. Flap deflections of 39°, 20°, and 0° with the main landing gear on and off were tested at Mach numbers of 0.16, 0.20, and 0.24. Additionally, for each flap deflection, the model was tested with the tunnel both in the closed-wall and open-wall (jet) modes. During this first entry, global forces (lift and drag) and extensive steady and unsteady surface pressure measurements were obtained. Preliminary analysis of the measured forces indicates that lift, drag, and stall characteristics compare favorably with Gulfstream's high Reynolds number flight data. The favorable comparison between wind-tunnel and flight data allows the semi-span model to be used as a test bed for developing/evaluating airframe noise reduction concepts under a relevant environment. Moreover, initial comparison of the aerodynamic measurements obtained with the tunnel in the closed- and open-wall configurations shows similar aerodynamic behavior. This permits the acoustic and off-surface flow measurements, planned for the second entry, to be conducted with the tunnel in the open-jet mode.

  14. Forecasting Effusive Dynamics and Decompression Rates by Magmastatic Model at Open-vent Volcanoes.

    PubMed

    Ripepe, Maurizio; Pistolesi, Marco; Coppola, Diego; Delle Donne, Dario; Genco, Riccardo; Lacanna, Giorgio; Laiolo, Marco; Marchetti, Emanuele; Ulivieri, Giacomo; Valade, Sébastien

    2017-06-20

    Effusive eruptions at open-conduit volcanoes are interpreted as reactions to a disequilibrium induced by the increase in magma supply. By comparing four of the most recent effusive eruptions at Stromboli volcano (Italy), we show how the volumes of lava discharged during each eruption are linearly correlated to the topographic positions of the effusive vents. This correlation cannot be explained by an excess of pressure within a deep magma chamber and raises questions about the actual contributions of deep magma dynamics. We derive a general model based on the discharge of a shallow reservoir and the magmastatic crustal load above the vent, to explain the linear link. In addition, we show how the drastic transition from effusive to violent explosions can be related to different decompression rates. We suggest that a gravity-driven model can shed light on similar cases of lateral effusive eruptions in other volcanic systems and can provide evidence of the roles of slow decompression rates in triggering violent paroxysmal explosive eruptions, which occasionally punctuate the effusive phases at basaltic volcanoes.

  15. Online Statistical Modeling (Regression Analysis) for Independent Responses

    NASA Astrophysics Data System (ADS)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed for analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to handle various types of data and complex relationships. A rich variety of advanced and recent statistical models is available mostly through open source software (one of them being R). However, these advanced modelling tools are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and shiny), so that the most recent and advanced statistical models are readily available, accessible and applicable on the web. We have previously built interfaces in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis tools, including models using computer-intensive statistics (bootstrap and Markov Chain Monte Carlo/MCMC). All are readily accessible on our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and makes it easier to compare models in order to find the most appropriate one for the data.
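
    As a plain-script counterpart to the web interface described above, the snippet below fits one of the model families mentioned (a Poisson GLM) using the Python statsmodels package; the simulated data, coefficients and variable names are arbitrary, and the same workflow extends to the other families offered by the cited tools.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(42)
      n = 200
      x1 = rng.normal(size=n)
      x2 = rng.normal(size=n)
      mu = np.exp(0.3 + 0.5 * x1 - 0.2 * x2)          # true mean of the counts
      y = rng.poisson(mu)

      X = sm.add_constant(np.column_stack([x1, x2]))  # intercept + two predictors
      result = sm.GLM(y, X, family=sm.families.Poisson()).fit()
      print(result.summary())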

  16. An open annotation ontology for science on web 3.0

    PubMed Central

    2011-01-01

    Background: There is currently a gap between the rich and expressive collection of published biomedical ontologies, and the natural language expression of biomedical papers consumed on a daily basis by scientific researchers. The purpose of this paper is to provide an open, shareable structure for dynamic integration of biomedical domain ontologies with the scientific document, in the form of an Annotation Ontology (AO), thus closing this gap and enabling application of formal biomedical ontologies directly to the literature as it emerges. Methods: Initial requirements for AO were elicited by analysis of integration needs between biomedical web communities, and of needs for representing and integrating results of biomedical text mining. Analysis of strengths and weaknesses of previous efforts in this area was also performed. A series of increasingly refined annotation tools were then developed along with a metadata model in OWL, and deployed to users at a major pharmaceutical company and a major academic center for feedback and additional requirements on the ontology. Further requirements and critiques of the model were also elicited through discussions with many colleagues and incorporated into the work. Results: This paper presents Annotation Ontology (AO), an open ontology in OWL-DL for annotating scientific documents on the web. AO supports both human and algorithmic content annotation. It enables “stand-off” or independent metadata anchored to specific positions in a web document by any one of several methods. In AO, the document may be annotated but is not required to be under update control of the annotator. AO contains a provenance model to support versioning, and a set model for specifying groups and containers of annotation. AO is freely available under open source license at http://purl.org/ao/, and extensive documentation including screencasts is available on AO’s Google Code page: http://code.google.com/p/annotation-ontology/ . Conclusions: The Annotation Ontology meets critical requirements for an open, freely shareable model in OWL, of annotation metadata created against scientific documents on the Web. We believe AO can become a very useful common model for annotation metadata on Web documents, and will enable biomedical domain ontologies to be used quite widely to annotate the scientific literature. Potential collaborators and those with new relevant use cases are invited to contact the authors. PMID:21624159

  17. An open annotation ontology for science on web 3.0.

    PubMed

    Ciccarese, Paolo; Ocana, Marco; Garcia Castro, Leyla Jael; Das, Sudeshna; Clark, Tim

    2011-05-17

    There is currently a gap between the rich and expressive collection of published biomedical ontologies, and the natural language expression of biomedical papers consumed on a daily basis by scientific researchers. The purpose of this paper is to provide an open, shareable structure for dynamic integration of biomedical domain ontologies with the scientific document, in the form of an Annotation Ontology (AO), thus closing this gap and enabling application of formal biomedical ontologies directly to the literature as it emerges. Initial requirements for AO were elicited by analysis of integration needs between biomedical web communities, and of needs for representing and integrating results of biomedical text mining. Analysis of strengths and weaknesses of previous efforts in this area was also performed. A series of increasingly refined annotation tools were then developed along with a metadata model in OWL, and deployed to users at a major pharmaceutical company and a major academic center for feedback and additional requirements on the ontology. Further requirements and critiques of the model were also elicited through discussions with many colleagues and incorporated into the work. This paper presents Annotation Ontology (AO), an open ontology in OWL-DL for annotating scientific documents on the web. AO supports both human and algorithmic content annotation. It enables "stand-off" or independent metadata anchored to specific positions in a web document by any one of several methods. In AO, the document may be annotated but is not required to be under update control of the annotator. AO contains a provenance model to support versioning, and a set model for specifying groups and containers of annotation. AO is freely available under open source license at http://purl.org/ao/, and extensive documentation including screencasts is available on AO's Google Code page: http://code.google.com/p/annotation-ontology/ . The Annotation Ontology meets critical requirements for an open, freely shareable model in OWL, of annotation metadata created against scientific documents on the Web. We believe AO can become a very useful common model for annotation metadata on Web documents, and will enable biomedical domain ontologies to be used quite widely to annotate the scientific literature. Potential collaborators and those with new relevant use cases are invited to contact the authors.

  18. A Monte Carlo investigation of contaminant electrons due to a novel in vivo transmission detector.

    PubMed

    Asuni, G; Jensen, J M; McCurdy, B M C

    2011-02-21

    A novel transmission detector (IBA Dosimetry, Germany) developed as an IMRT quality assurance tool, intended for in vivo patient dose measurements, is studied here. The goal of this investigation is to use Monte Carlo techniques to characterize treatment beam parameters in the presence of the detector and to compare to those of a plastic block tray (a frequently used clinical device). Particular attention is paid to the impact of the detector on electron contamination model parameters of two commercial dose calculation algorithms. The linac head together with the COMPASS transmission detector (TRD) was modeled using BEAMnrc code. To understand the effect of the TRD on treatment beams, the contaminant electron fluence, energy spectra, and angular distributions at different SSDs were analyzed for open and non-open (i.e. TRD and block tray) fields. Contaminant electrons in the BEAMnrc simulations were separated according to where they were created. Calculation of surface dose and the evaluation of contributions from contaminant electrons were performed using the DOSXYZnrc user code. The effect of the TRD on contaminant electrons model parameters in Eclipse AAA and Pinnacle(3) dose calculation algorithms was investigated. Comparisons of the fluence of contaminant electrons produced in the non-open fields versus open field show that electrons created in the non-open fields increase at shorter SSD, but most of the electrons at shorter SSD are of low energy with large angular spread. These electrons are out-scattered or absorbed in air and contribute less to surface dose at larger SSD. Calculated surface doses with the block tray are higher than those with the TRD. Contribution of contaminant electrons to dose in the buildup region increases with increasing field size. The additional contribution of electrons to surface dose increases with field size for TRD and block tray. The introduction of the TRD results in a 12% and 15% increase in the Gaussian widths used in the contaminant electron source model of the Eclipse AAA dose algorithm. The off-axis coefficient in the Pinnacle(3) dose calculation algorithm decreases in the presence of TRD compared to without the device. The electron model parameters were modified to reflect the increase in electron contamination with the TRD, a necessary step for accurate beam modeling when using the device.

  19. SiGN-SSM: open source parallel software for estimating gene networks with state space models.

    PubMed

    Tamada, Yoshinori; Yamaguchi, Rui; Imoto, Seiya; Hirose, Osamu; Yoshida, Ryo; Nagasaki, Masao; Miyano, Satoru

    2011-04-15

    SiGN-SSM is an open-source gene network estimation software able to run in parallel on PCs and massively parallel supercomputers. The software estimates a state space model (SSM), a statistical dynamic model suitable for analyzing short and/or replicated time-series gene expression profiles. SiGN-SSM implements a novel parameter constraint that is effective in stabilizing the estimated models. Also, by using a supercomputer, it is able to determine the gene network structure by a statistical permutation test in a practical time. SiGN-SSM is applicable not only to analyzing temporal regulatory dependencies between genes, but also to extracting the differentially regulated genes from time series expression profiles. SiGN-SSM is distributed under the GNU Affero General Public Licence (GNU AGPL) version 3 and can be downloaded at http://sign.hgc.jp/signssm/. The pre-compiled binaries for some architectures are available in addition to the source code. The pre-installed binaries are also available on the Human Genome Center supercomputer system. The online manual and the supplementary information for SiGN-SSM are available on our web site. tamada@ims.u-tokyo.ac.jp.
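    The state space model at the core of this approach pairs a hidden linear dynamic over latent factors with a linear-Gaussian observation of gene expression. The sketch below is a minimal, illustrative linear-Gaussian SSM filtered with a standard Kalman filter in NumPy; it does not reproduce SiGN-SSM's parameter constraint, estimation procedure, or permutation test, and all matrices are toy values.

    ```python
    # Minimal linear-Gaussian state space model:  x_t = F x_{t-1} + w_t,  y_t = H x_t + v_t.
    # Illustrative only -- not SiGN-SSM's own estimation code.
    import numpy as np

    def kalman_filter(y, F, H, Q, R, x0, P0):
        """Return filtered state means for observations y (T x p)."""
        x, P = x0, P0
        means = []
        for y_t in y:
            # predict
            x = F @ x
            P = F @ P @ F.T + Q
            # update
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (y_t - H @ x)
            P = (np.eye(len(x)) - K @ H) @ P
            means.append(x.copy())
        return np.array(means)

    # Toy example: 2 hidden "modules" driving 5 genes over 10 time points.
    rng = np.random.default_rng(0)
    F = np.array([[0.9, 0.0], [0.1, 0.8]])        # state transition
    H = rng.normal(size=(5, 2))                   # gene loading (observation) matrix
    Q, R = 0.1 * np.eye(2), 0.2 * np.eye(5)
    x, obs = np.zeros(2), []
    for _ in range(10):
        x = F @ x + rng.multivariate_normal(np.zeros(2), Q)
        obs.append(H @ x + rng.multivariate_normal(np.zeros(5), R))
    states = kalman_filter(np.array(obs), F, H, Q, R, np.zeros(2), np.eye(2))
    print(states.shape)   # (10, 2)
    ```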

  20. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    PubMed

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
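    As an illustration of the kind of computation the web tool performs, the sketch below fits Holt-Winters additive and multiplicative models to a synthetic monthly test-volume series with statsmodels and ranks them by a simple in-sample criterion. The synthetic series, the AIC-based ranking, and the six-month horizon are illustrative choices for this example, not features taken from the tool itself.

    ```python
    # Hypothetical monthly test volumes; the web tool is point-and-click, this is
    # simply the equivalent computation expressed in statsmodels.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    idx = pd.date_range("2014-01-01", periods=36, freq="MS")
    volumes = pd.Series(
        10_000 + 50 * np.arange(36) + 800 * np.sin(2 * np.pi * np.arange(36) / 12),
        index=idx,
    )

    fits = {
        "HW additive": ExponentialSmoothing(
            volumes, trend="add", seasonal="add", seasonal_periods=12).fit(),
        "HW multiplicative": ExponentialSmoothing(
            volumes, trend="add", seasonal="mul", seasonal_periods=12).fit(),
    }
    # Rank the candidate models by in-sample fit (AIC) and forecast 6 months ahead.
    best = min(fits, key=lambda name: fits[name].aic)
    print(best)
    print(fits[best].forecast(6).round(0))
    ```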

  1. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis

    PubMed Central

    Mohammed, Emad A.; Naugler, Christopher

    2017-01-01

    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996

  2. The OpenCourseWare Model: High-Impact Open Educational Content

    ERIC Educational Resources Information Center

    Carson, Stephen

    2007-01-01

    OpenCourseWare (OCW) is one among several models for offering open educational resources (OER). This article explains the OCW model and its position within the broader OER context. OCW primarily represents publication of existing course materials already in use for teaching purposes. OCW projects are most often institutional, carrying the…

  3. A systemic approach for modeling biological evolution using Parallel DEVS.

    PubMed

    Heredia, Daniel; Sanz, Victorino; Urquia, Alfonso; Sandín, Máximo

    2015-08-01

    A new model for studying the evolution of living organisms is proposed in this manuscript. The proposed model is based on a non-neodarwinian systemic approach. The model is focused on considering several controversies and open discussions about modern evolutionary biology. Additionally, a simplification of the proposed model, named EvoDEVS, has been mathematically described using the Parallel DEVS formalism and implemented as a computer program using the DEVSLib Modelica library. EvoDEVS serves as an experimental platform to study different conditions and scenarios by means of computer simulations. Two preliminary case studies are presented to illustrate the behavior of the model and validate its results. EvoDEVS is freely available at http://www.euclides.dia.uned.es. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. Simulation model of fatigue crack opening/closing phenomena for predicting RPG load under arbitrary stress distribution field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toyosada, M.; Niwa, T.

    1995-12-31

    In this paper, Newman's calculation model is modified to account for the effect, neglected in his model, of the change in stress distribution ahead of a crack, and to leave elastic-plastic material along the crack surface for compatibility with the Dugdale model. In addition to the above treatment, the authors introduce plastic shrinkage at the moment new crack surfaces are generated, due to the release of internal force at the yield stress level during the unloading process. Moreover, the model is extended to arbitrary stress distribution fields. Using the model, the RPG load is simulated for a center-notched specimen under constant-amplitude loading with various stress ratios and with a decreased maximum load while keeping the minimum load constant.

  5. The Open Source Snowpack modelling ecosystem

    NASA Astrophysics Data System (ADS)

    Bavay, Mathias; Fierz, Charles; Egger, Thomas; Lehning, Michael

    2016-04-01

    As a large number of numerical snow models are available, a few stand out as quite mature and widespread. One such model is SNOWPACK, the Open Source model that is developed at the WSL Institute for Snow and Avalanche Research SLF. Over the years, various tools have been developed around SNOWPACK in order to expand its use or to integrate additional features. Today, the model is part of a whole ecosystem that has evolved to both offer seamless integration and high modularity so each tool can easily be used outside the ecosystem. Many of these Open Source tools experience their own, autonomous development and are successfully used in their own right in other models and applications. There is Alpine3D, the spatially distributed version of SNOWPACK, that forces it with terrain-corrected radiation fields and optionally with blowing and drifting snow. This model can be used on parallel systems (either with OpenMP or MPI) and has been used for applications ranging from climate change to reindeer herding. There is the MeteoIO pre-processing library that offers fully integrated data access, data filtering, data correction, data resampling and spatial interpolations. This library is now used by several other models and applications. There is the SnopViz snow profile visualization library and application that supports both measured and simulated snow profiles (relying on the CAAML standard) as well as time series. This JavaScript application can be used standalone without any internet connection or served on the web together with simulation results. There is the OSPER data platform effort with a data management service (built on the Global Sensor Network (GSN) platform) as well as a data documenting system (metadata management as a wiki). There are several distributed hydrological models for mountainous areas in ongoing development that require very little information about the soil structure, based on the assumption that in steep terrain the most relevant information is contained in the Digital Elevation Model (DEM). There is finally a set of tools making up the operational chain to automatically run, monitor and publish SNOWPACK simulations for operational avalanche warning purposes. This tool chain has been developed with the aim of offering very low-maintenance operation, very fast deployment, and easy adaptation to other avalanche services.

  6. Analyses of Fatigue Crack Growth and Closure Near Threshold Conditions for Large-Crack Behavior

    NASA Technical Reports Server (NTRS)

    Newman, J. C., Jr.

    1999-01-01

    A plasticity-induced crack-closure model was used to study fatigue crack growth and closure in thin 2024-T3 aluminum alloy under constant-R and constant-K(sub max) threshold testing procedures. Two methods of calculating crack-opening stresses were compared. One method was based on contact-K analyses and the other on crack-opening-displacement (COD) analyses. These methods gave nearly identical results under constant-amplitude loading, but under threshold simulations the contact-K analyses gave lower opening stresses than the contact-COD method. Crack-growth predictions tend to support the use of contact-K analyses. Crack-growth simulations showed that remote closure can cause a rapid rise in opening stresses in the near-threshold regime for low constraint and high applied stress levels. Under low applied stress levels and high constraint, a rise in opening stresses was not observed near threshold conditions. But crack-tip-opening displacements (CTOD) were of the order of measured oxide thicknesses in the 2024 alloy under constant-R simulations. In contrast, under constant-K(sub max) testing the CTOD near threshold conditions were an order of magnitude larger than measured oxide thicknesses. Residual plastic deformations under both constant-R and constant-K(sub max) threshold simulations were several times larger than the expected oxide thicknesses. Thus, residual plastic deformations, in addition to oxide and roughness, play an integral part in threshold development.

  7. Initial Assessment of Open Rotor Propulsion Applied to an Advanced Single-Aisle Aircraft

    NASA Technical Reports Server (NTRS)

    Guynn, Mark D.; Berton, Jeffrey J.; Hendricks, Eric S.; Tong, Michael T.; Haller, William J.; Thurman, Douglas R.

    2011-01-01

    Application of high speed, advanced turboprops, or propfans, to subsonic transport aircraft received significant attention and research in the 1970s and 1980s when fuel efficiency was the driving focus of aeronautical research. Recent volatility in fuel prices and concern for aviation's environmental impact have renewed interest in unducted, open rotor propulsion, and revived research by NASA and a number of engine manufacturers. Unfortunately, in the two decades that have passed since open rotor concepts were thoroughly investigated, NASA has lost experience and expertise in this technology area. This paper describes initial efforts to re-establish NASA's capability to assess aircraft designs with open rotor propulsion. Specifically, methodologies for aircraft-level sizing, performance analysis, and system-level noise analysis are described. Propulsion modeling techniques have been described in a previous paper. Initial results from application of these methods to an advanced single-aisle aircraft using open rotor engines based on historical blade designs are presented. These results indicate open rotor engines have the potential to provide large reductions in fuel consumption and emissions. Initial noise analysis indicates that current noise regulations can be met with old blade designs, and modern, noise-optimized blade designs are expected to result in even lower noise levels. Although an initial capability has been established and initial results obtained, additional development work is necessary to make NASA's open rotor system analysis capability on par with existing turbofan analysis capabilities.

  8. OpenNFT: An open-source Python/Matlab framework for real-time fMRI neurofeedback training based on activity, connectivity and multivariate pattern analysis.

    PubMed

    Koush, Yury; Ashburner, John; Prilepin, Evgeny; Sladky, Ronald; Zeidman, Peter; Bibikov, Sergei; Scharnowski, Frank; Nikonorov, Artem; De Ville, Dimitri Van

    2017-08-01

    Neurofeedback based on real-time functional magnetic resonance imaging (rt-fMRI) is a novel and rapidly developing research field. It allows for training of voluntary control over localized brain activity and connectivity and has demonstrated promising clinical applications. Because of the rapid technical developments of MRI techniques and the availability of high-performance computing, new methodological advances in rt-fMRI neurofeedback become possible. Here we outline the core components of a novel open-source neurofeedback framework, termed Open NeuroFeedback Training (OpenNFT), which efficiently integrates these new developments. This framework is implemented using Python and Matlab source code to allow for diverse functionality, high modularity, and rapid extendibility of the software depending on the user's needs. In addition, it provides an easy interface to the functionality of Statistical Parametric Mapping (SPM), which is also open source and one of the most widely used fMRI data analysis packages. We demonstrate the functionality of our new framework by describing case studies that include neurofeedback protocols based on brain activity levels, effective connectivity models, and pattern classification approaches. This open-source initiative provides a suitable framework to actively engage in the development of novel neurofeedback approaches, so that local methodological developments can be easily made accessible to a wider range of users. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Open source electronic health records and chronic disease management

    PubMed Central

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-01-01

    Objective To study and report on the use of open source electronic health records (EHR) to assist with chronic care management within safety net medical settings, such as community health centers (CHC). Methods and Materials The study was conducted by NORC at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to CHC that currently use an open source EHR. Results Two of the sites chosen by NORC were actively using an open source EHR to assist in the redesign of their care delivery system to support more effective chronic disease management. This included incorporating the chronic care model into a CHC and using the EHR to help facilitate its elements, such as care teams for patients, in addition to maintaining health records on indigent populations, such as tuberculosis status on homeless patients. Discussion The ability to modify the open-source EHR to adapt to the CHC environment and leverage the ecosystem of providers and users to assist in this process provided significant advantages in chronic care management. Improvements in diabetes management and hypertension control, as well as increases in tuberculosis vaccinations, were assisted through the use of these open source systems. Conclusions The flexibility and adaptability of open source EHR demonstrated its utility and viability in the provision of necessary and needed chronic disease care among populations served by CHC. PMID:23813566

  10. Open space and imagination

    Treesearch

    G. Scott Place; Bruce Hronek

    2001-01-01

    Open space is a necessary tool in our park system for fostering creativity and allowing for relaxation. In addition, open space areas allow people to exercise, find self-worth, and to use their imagination. This manuscript addresses the issue of what is happening in open space provided in several park settings. Do residents use open space as a place where they can play...

  11. Explanatory models and openness about dementia in migrant communities: A qualitative study among female family carers.

    PubMed

    van Wezel, Nienke; Francke, Anneke L; Kayan Acun, Emine; Devillé, Walter Ljm; van Grondelle, Nies J; Blom, Marco M

    2016-06-15

    The prevalence of dementia is increasing among people with a Turkish, Moroccan and Surinamese-Creole background. Because informal care is very important in these communities, it is pertinent to see what explanations female family carers have for dementia and whether they can discuss dementia openly within the community and the family. Forty-one individual interviews and six focus group interviews (n = 28) were held with female Turkish, Moroccan and Surinamese Creole family carers who are looking after a close relative with dementia, and who live in The Netherlands. Qualitative analysis has been carried out, supported by the software MaxQda. The dominant explanations of dementia given by the female family carers interviewed are in line with what Downs et al. describe as the explanatory models 'dementia as a normal ageing process' and 'dementia as a spiritual experience'. In addition, some female family carers gave explanations that were about an interplay between various factors. Turkish and Moroccan informal caregivers ascribe the causes of dementia relatively often to life events or personality traits, whereas Surinamese Creole caregivers frequently mention physical aspects, such as past dehydration. However, the explanatory model 'dementia as a neuropsychiatric condition', which is dominant in Western cultures, was rarely expressed by the informal caregivers. The female family carers generally talked openly about the dementia with their close family, whereas particularly in the Turkish and Moroccan communities open communication within the broader communities was often hampered, e.g. by feelings of shame. Female family carers of Turkish, Moroccan or Surinamese Creole backgrounds often consider dementia as a natural consequence of ageing, as a spiritual experience, and/or as an interplay between various factors. They feel they can talk openly about dementia within their close family, while outside the close family this is often more difficult. © The Author(s) 2016.

  12. Joint kinematic calculation based on clinical direct kinematic versus inverse kinematic gait models.

    PubMed

    Kainz, H; Modenese, L; Lloyd, D G; Maine, S; Walsh, H P J; Carty, C P

    2016-06-14

    Most clinical gait laboratories use the conventional gait analysis model. This model uses a computational method called Direct Kinematics (DK) to calculate joint kinematics. In contrast, musculoskeletal modelling approaches use Inverse Kinematics (IK) to obtain joint angles. IK allows additional analysis (e.g. muscle-tendon length estimates), which may provide valuable information for clinical decision-making in people with movement disorders. The twofold aims of the current study were: (1) to compare joint kinematics obtained by a clinical DK model (Vicon Plug-in-Gait) with those produced by a widely used IK model (available with the OpenSim distribution), and (2) to evaluate the difference in joint kinematics that can be solely attributed to the different computational methods (DK versus IK), anatomical models and marker sets by using MRI based models. Eight children with cerebral palsy were recruited and presented for gait and MRI data collection sessions. Differences in joint kinematics up to 13° were found between the Plug-in-Gait and the gait2392 OpenSim model. The majority of these differences (94.4%) were attributed to differences in the anatomical models, which included different anatomical segment frames and joint constraints. Different computational methods (DK versus IK) were responsible for only 2.7% of the differences. We recommend using the same anatomical model for kinematic and musculoskeletal analysis to ensure consistency between the obtained joint angles and musculoskeletal estimates. Copyright © 2016 Elsevier Ltd. All rights reserved.
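    To make the computational distinction concrete, the toy sketch below contrasts the two approaches on a planar two-segment "leg": Direct Kinematics reads the knee angle straight from marker-defined segment axes, whereas Inverse Kinematics poses a rigid, fixed-length chain so that its virtual markers best match the measured markers in a least-squares sense. This is neither Plug-in-Gait nor the OpenSim gait2392 model; all coordinates and conventions here are made up for illustration.

    ```python
    # Toy planar comparison of DK vs IK on three markers (hip, knee, ankle).
    import numpy as np
    from scipy.optimize import least_squares

    hip, knee, ankle = np.array([0.0, 1.0]), np.array([0.1, 0.55]), np.array([0.05, 0.1])

    # Direct kinematics: joint angle computed segment-by-segment from marker axes,
    # with no underlying joint-constrained model.
    thigh_axis, shank_axis = knee - hip, ankle - knee
    cross = thigh_axis[0] * shank_axis[1] - thigh_axis[1] * shank_axis[0]
    dk_knee_angle = np.degrees(np.arctan2(cross, np.dot(thigh_axis, shank_axis)))

    # Inverse kinematics: a rigid 2-link chain (fixed segment lengths) is posed so
    # that its virtual markers best match the measured ones.
    l_thigh, l_shank = np.linalg.norm(thigh_axis), np.linalg.norm(shank_axis)

    def marker_residuals(q):
        hip_flex, knee_flex = q
        knee_model = hip + l_thigh * np.array([np.sin(hip_flex), -np.cos(hip_flex)])
        ankle_model = knee_model + l_shank * np.array(
            [np.sin(hip_flex + knee_flex), -np.cos(hip_flex + knee_flex)]
        )
        return np.concatenate([knee_model - knee, ankle_model - ankle])

    sol = least_squares(marker_residuals, x0=[0.0, 0.0])
    ik_knee_angle = np.degrees(sol.x[1])
    print(round(dk_knee_angle, 1), round(ik_knee_angle, 1))
    ```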

  13. Dipy, a library for the analysis of diffusion MRI data.

    PubMed

    Garyfallidis, Eleftherios; Brett, Matthew; Amirbekian, Bagrat; Rokem, Ariel; van der Walt, Stefan; Descoteaux, Maxime; Nimmo-Smith, Ian

    2014-01-01

    Diffusion Imaging in Python (Dipy) is a free and open source software project for the analysis of data from diffusion magnetic resonance imaging (dMRI) experiments. dMRI is an application of MRI that can be used to measure structural features of brain white matter. Many methods have been developed to use dMRI data to model the local configuration of white matter nerve fiber bundles and infer the trajectory of bundles connecting different parts of the brain. Dipy gathers implementations of many different methods in dMRI, including: diffusion signal pre-processing; reconstruction of diffusion distributions in individual voxels; fiber tractography and fiber track post-processing, analysis and visualization. Dipy aims to provide transparent implementations for all the different steps of dMRI analysis with a uniform programming interface. We have implemented classical signal reconstruction techniques, such as the diffusion tensor model and deterministic fiber tractography. In addition, cutting edge novel reconstruction techniques are implemented, such as constrained spherical deconvolution and diffusion spectrum imaging (DSI) with deconvolution, as well as methods for probabilistic tracking and original methods for tractography clustering. Many additional utility functions are provided to calculate various statistics, informative visualizations, as well as file-handling routines to assist in the development and use of novel techniques. In contrast to many other scientific software projects, Dipy is not being developed by a single research group. Rather, it is an open project that encourages contributions from any scientist/developer through GitHub and open discussions on the project mailing list. Consequently, Dipy today has an international team of contributors, spanning seven different academic institutions in five countries and three continents, which is still growing.
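    A minimal sketch of the voxel-wise workflow described above, fitting Dipy's diffusion tensor model and reading off fractional anisotropy. Synthetic data are used so the snippet is self-contained; exact argument names and import paths may vary slightly between Dipy versions.

    ```python
    # Fit a diffusion tensor model voxel-wise on a tiny synthetic dataset.
    import numpy as np
    from dipy.core.gradients import gradient_table
    from dipy.reconst.dti import TensorModel

    rng = np.random.default_rng(0)

    # One b=0 volume plus 32 diffusion-weighted directions (unit vectors).
    bvals = np.concatenate([[0], np.full(32, 1000)])
    bvecs = rng.normal(size=(33, 3))
    bvecs[0] = 0.0
    bvecs[1:] /= np.linalg.norm(bvecs[1:], axis=1, keepdims=True)
    gtab = gradient_table(bvals, bvecs=bvecs)

    # Tiny "brain": 4 x 4 x 4 voxels, 33 volumes of synthetic signal.
    data = 100 + 10 * rng.random((4, 4, 4, 33))

    tenfit = TensorModel(gtab).fit(data)
    fa = tenfit.fa               # fractional anisotropy map
    print(fa.shape)              # (4, 4, 4)
    ```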

  14. DataMed - an open source discovery index for finding biomedical datasets.

    PubMed

    Chen, Xiaoling; Gururaj, Anupama E; Ozyurt, Burak; Liu, Ruiling; Soysal, Ergin; Cohen, Trevor; Tiryaki, Firat; Li, Yueling; Zong, Nansu; Jiang, Min; Rogith, Deevakar; Salimi, Mandana; Kim, Hyeon-Eui; Rocca-Serra, Philippe; Gonzalez-Beltran, Alejandra; Farcas, Claudiu; Johnson, Todd; Margolis, Ron; Alter, George; Sansone, Susanna-Assunta; Fore, Ian M; Ohno-Machado, Lucila; Grethe, Jeffrey S; Xu, Hua

    2018-01-13

    Finding relevant datasets is important for promoting data reuse in the biomedical domain, but it is challenging given the volume and complexity of biomedical data. Here we describe the development of an open source biomedical data discovery system called DataMed, with the goal of promoting the building of additional data indexes in the biomedical domain. DataMed, which can efficiently index and search diverse types of biomedical datasets across repositories, is developed through the National Institutes of Health-funded biomedical and healthCAre Data Discovery Index Ecosystem (bioCADDIE) consortium. It consists of 2 main components: (1) a data ingestion pipeline that collects and transforms original metadata information to a unified metadata model, called DatA Tag Suite (DATS), and (2) a search engine that finds relevant datasets based on user-entered queries. In addition to describing its architecture and techniques, we evaluated individual components within DataMed, including the accuracy of the ingestion pipeline, the prevalence of the DATS model across repositories, and the overall performance of the dataset retrieval engine. Our manual review shows that the ingestion pipeline could achieve an accuracy of 90% and core elements of DATS had varied frequency across repositories. On a manually curated benchmark dataset, the DataMed search engine achieved an inferred average precision of 0.2033 and a precision at 10 (P@10, the number of relevant results in the top 10 search results) of 0.6022, by implementing advanced natural language processing and terminology services. Currently, we have made the DataMed system publicly available as an open source package for the biomedical community. © The Author 2018. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
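    The ingestion step can be pictured as mapping a repository-specific record into a DATS-shaped document before indexing. The sketch below uses a deliberately simplified subset of fields chosen for illustration only; it is not the full DATS schema and not DataMed's actual pipeline code.

    ```python
    # Illustrative mapping from a repository record to a minimal DATS-like document.
    import json

    def to_dats_like(repo_record: dict) -> dict:
        return {
            "title": repo_record.get("name", ""),
            "types": [{"information": {"value": repo_record.get("data_type", "unknown")}}],
            "creators": [{"name": c} for c in repo_record.get("authors", [])],
            "distributions": [{"access": {"landingPage": repo_record.get("url", "")}}],
        }

    record = {
        "name": "RNA-seq of treated cell lines",
        "data_type": "transcriptomics",
        "authors": ["Doe J", "Roe R"],
        "url": "https://example.org/dataset/123",
    }
    print(json.dumps(to_dats_like(record), indent=2))
    ```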

  15. Scaffold Library for Tissue Engineering: A Geometric Evaluation

    PubMed Central

    Chantarapanich, Nattapon; Puttawibul, Puttisak; Sucharitpwatskul, Sedthawatt; Jeamwatthanachai, Pongnarin; Inglam, Samroeng; Sitthiseripratip, Kriskrai

    2012-01-01

    A tissue engineering scaffold is a biological substitute that aims to restore, to maintain, or to improve tissue functions. Currently available manufacturing technology, that is, additive manufacturing, is essentially applied to fabricate the scaffold according to a predefined computer aided design (CAD) model. Polyhedrons could be used as the basis for developing such scaffold CAD libraries. In the present study, one hundred and nineteen polyhedron models were evaluated according to established criteria. The proposed criteria included considerations of geometry, manufacturing feasibility, and mechanical strength of these polyhedrons. CAD and the finite element (FE) method were employed as tools in the evaluation. The evaluation revealed that suitable polyhedrons for close-cellular scaffold libraries included the truncated octahedron, rhombicuboctahedron, and rhombitruncated cuboctahedron. In addition, the suitable polyhedrons for use as open-cellular scaffold libraries included the hexahedron, truncated octahedron, truncated hexahedron, cuboctahedron, rhombicuboctahedron, and rhombitruncated cuboctahedron. However, not all pore size to beam thickness ratios (PO : BT) were good for making the open-cellular scaffold. For each library, PO : BT ratios that generated enclosed pores inside the scaffold were excluded, since material could not be removed from such pores after fabrication. The close-cellular libraries presented a constant porosity irrespective of pore size. The relationship between PO : BT ratio and porosity of the open-cellular scaffold libraries was displayed in the form of a Logistic Power function. The possibility of merging two different types of libraries to produce a composite structure was geometrically evaluated in terms of the intersection index and was mechanically evaluated by means of FE analysis to observe the stress level. Couples of polyhedrons presenting a low intersection index and a high stress level were excluded. Good couples for producing the reinforced scaffold were hexahedron-truncated hexahedron and cuboctahedron-rhombitruncated cuboctahedron. PMID:23056147

  16. Geothermal Exploration Case Studies on OpenEI (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, K.; Bennett, M.; Atkins, D.

    2014-03-01

    The U.S. Geological Survey (USGS) resource assessment (Williams et al., 2008) outlined a mean 30 GWe of undiscovered hydrothermal resource in the western United States. One goal of the U.S. Department of Energy's (DOE) Geothermal Technology Office (GTO) is to accelerate the development of this undiscovered resource. DOE has focused efforts on helping industry identify hidden geothermal resources to increase geothermal capacity in the near term. Increased exploration activity will produce more prospects, more discoveries, and more readily developable resources. Detailed exploration case studies akin to those found in oil and gas (e.g. Beaumont and Foster, 1990-1992) will give developers a central location for information, provide models for identifying new geothermal areas, and guide efficient exploration and development of these areas. To support this effort, the National Renewable Energy Laboratory (NREL) has been working with GTO to develop a template for geothermal case studies on the Geothermal Gateway on OpenEI. In 2012, the template was developed and tested with two case studies: Raft River Geothermal Area (http://en.openei.org/wiki/Raft_River_Geothermal_Area) and Coso Geothermal Area (http://en.openei.org/wiki/Coso_Geothermal_Area). In 2013, ten additional case studies were completed, and Semantic MediaWiki features were developed to allow for more data and the direct citation of these data. These case studies are now in the process of external peer review. In 2014, NREL is working with universities and industry partners to populate additional case studies on OpenEI. The goal is to provide a large enough data set to start conducting analyses of exploration programs to identify correlations between successful exploration plans for areas with similar geologic occurrence models.

  17. Dipy, a library for the analysis of diffusion MRI data

    PubMed Central

    Garyfallidis, Eleftherios; Brett, Matthew; Amirbekian, Bagrat; Rokem, Ariel; van der Walt, Stefan; Descoteaux, Maxime; Nimmo-Smith, Ian

    2014-01-01

    Diffusion Imaging in Python (Dipy) is a free and open source software project for the analysis of data from diffusion magnetic resonance imaging (dMRI) experiments. dMRI is an application of MRI that can be used to measure structural features of brain white matter. Many methods have been developed to use dMRI data to model the local configuration of white matter nerve fiber bundles and infer the trajectory of bundles connecting different parts of the brain. Dipy gathers implementations of many different methods in dMRI, including: diffusion signal pre-processing; reconstruction of diffusion distributions in individual voxels; fiber tractography and fiber track post-processing, analysis and visualization. Dipy aims to provide transparent implementations for all the different steps of dMRI analysis with a uniform programming interface. We have implemented classical signal reconstruction techniques, such as the diffusion tensor model and deterministic fiber tractography. In addition, cutting edge novel reconstruction techniques are implemented, such as constrained spherical deconvolution and diffusion spectrum imaging (DSI) with deconvolution, as well as methods for probabilistic tracking and original methods for tractography clustering. Many additional utility functions are provided to calculate various statistics, informative visualizations, as well as file-handling routines to assist in the development and use of novel techniques. In contrast to many other scientific software projects, Dipy is not being developed by a single research group. Rather, it is an open project that encourages contributions from any scientist/developer through GitHub and open discussions on the project mailing list. Consequently, Dipy today has an international team of contributors, spanning seven different academic institutions in five countries and three continents, which is still growing. PMID:24600385

  18. Vitamin D deficiency intensifies deterioration of risk factors, such as male sex and absence of vision, leading to increased postural body sway.

    PubMed

    Krause, Matthias; Anschütz, Wilma; Vettorazzi, Eik; Breer, Stefan; Amling, Michael; Barvencik, Florian

    2014-01-01

    Due to inconsistent findings, the influence of vitamin D on postural body sway (PBS) is currently under debate. This study evaluated the impact of vitamin D on PBS with regards to different foot positions and eye opening states in community-dwelling older individuals. In a cross-sectional study, we assessed PBS in 342 older individuals (264 females [average age (± SD): 68.3 ± 9.0 years], 78 males [65.7 ± 9.6 years]). A detailed medical history and vitamin D level were obtained for each individual. Fall risk was evaluated using the New York-Presbyterian Fall Risk Assessment Tool (NY PFRA). PBS parameters (area, distance, velocity, frequency) were evaluated on a pressure plate with feet in closed stance (CS) or hip-width stance (HWS), open eyes and closed eyes. Statistical analysis included logarithmic mixed models for repeated measures with the MIXED model procedure to test the influence of vitamin D (categorized in <10 μg/l, 10-20 μg/l, 21-30 μg/l, >30 μg/l), foot position, eye opening state, age, sex and frequency of physical activity on PBS. Vitamin D was not an independent risk factor for falls experienced in the last 12 months. Nonetheless, PBS was higher in patients with vitamin D deficiency (<10 μg/l) in HWS (A/P p=0.028 and area p=0.037). Additionally, vitamin D deficiency intensified the deleterious effects of male sex (distance p=0.002) and absence of vision (area p<0.001) on PBS. Independent risk factors for increased PBS like male sex and absence of vision are additionally compromised by vitamin D deficiency. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. Open Babel: An open chemical toolbox

    PubMed Central

    2011-01-01

    Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org. PMID:21982300
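    A minimal interconversion example using Open Babel's Python bindings (pybel): read a SMILES string, add hydrogens, generate 3D coordinates, and write other formats. In Open Babel 2.x the module is imported as `import pybel` rather than `from openbabel import pybel`.

    ```python
    # Format interconversion with Open Babel's pybel bindings.
    from openbabel import pybel

    mol = pybel.readstring("smi", "c1ccccc1O")   # phenol from SMILES
    mol.addh()                                   # add explicit hydrogens
    mol.make3D()                                 # generate 3D coordinates
    print(mol.write("mol2"))                     # emit Sybyl MOL2 text
    print(mol.write("can").strip())              # canonical SMILES
    ```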

  20. Binary Code Extraction and Interface Identification for Security Applications

    DTIC Science & Technology

    2009-10-02

    the functions extracted during the end-to-end applications and at the bottom some additional functions extracted from the OpenSSL library. fact that as...mentioned in Section 5.1 through Section 5.3 and some additional functions that we extract from the OpenSSL library for evaluation purposes. The... OpenSSL functions, the false positives and negatives are measured by comparison with the original C source code. For the malware samples, no source is

  1. Detection of Failure in Asynchronous Motor Using Soft Computing Method

    NASA Astrophysics Data System (ADS)

    Vinoth Kumar, K.; Sony, Kevin; Achenkunju John, Alan; Kuriakose, Anto; John, Ano P.

    2018-04-01

    This paper investigates stator winding short-circuit failures of an asynchronous motor and their effects on the motor current spectrum. A fuzzy logic approach, i.e., a model-based technique, may help to detect asynchronous motor failures. Fuzzy logic resembles human reasoning in that it enables inferences, expressed linguistically, from vague or imprecise data. A dynamic model of the asynchronous motor is developed with a fuzzy logic classifier to investigate stator inter-turn failures as well as open-phase failures. A hardware implementation was carried out with LabVIEW for the online monitoring of faults.
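    As a toy illustration of the kind of fuzzy reasoning described above (not the authors' classifier), the sketch below maps a stator-current unbalance feature through triangular membership functions to a fault label; the feature, thresholds, and labels are all assumptions made for this example.

    ```python
    # Triangular fuzzy membership over a current-unbalance feature (illustrative values).
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function peaking at b over the interval [a, c]."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def diagnose(unbalance_pct):
        low = tri(unbalance_pct, -1, 0, 5)       # healthy region
        medium = tri(unbalance_pct, 3, 8, 13)    # suspected inter-turn fault
        high = tri(unbalance_pct, 10, 20, 30)    # severe / open-phase region
        labels = {"healthy": low, "inter-turn fault": medium, "open phase": high}
        return max(labels, key=labels.get)       # label with the highest membership

    for u in (1.0, 7.5, 22.0):
        print(u, "->", diagnose(u))
    ```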

  2. Web Platform for Sharing Modeling Software in the Field of Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Dubenskaya, Julia; Kryukov, Alexander; Demichev, Andrey

    2018-02-01

    We describe the prototype of a Web platform intended for sharing software programs for computer modeling in the rapidly developing field of nonlinear optics phenomena. The suggested platform is built on top of the HUBZero open-source middleware. In addition to the basic HUBZero installation we added to our platform the capability to run Docker containers via an external application server and to send calculation programs to those containers for execution. The presented web platform provides a wide range of features and might be of benefit to nonlinear optics researchers.
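    The paper does not specify how jobs are handed to the containers; as one plausible sketch, the snippet below uses the Docker SDK for Python (docker-py) to run a placeholder calculation in a throwaway container. The image and command are placeholders, not details taken from the platform.

    ```python
    # Dispatch a placeholder calculation to a container via the Docker SDK for Python.
    import docker

    client = docker.from_env()                         # connect to the local Docker daemon
    output = client.containers.run(
        image="python:3.11-slim",                      # hypothetical runtime image
        command=["python", "-c", "print('nonlinear optics run placeholder')"],
        remove=True,                                   # clean up the container afterwards
    )
    print(output.decode())
    ```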

  3. Mathematical, numerical and experimental analysis of the swirling flow at a Kaplan runner outlet

    NASA Astrophysics Data System (ADS)

    Muntean, S.; Ciocan, T.; Susan-Resiga, R. F.; Cervantes, M.; Nilsson, H.

    2012-11-01

    The paper presents a novel mathematical model for a-priori computation of the swirling flow at Kaplan runners outlet. The model is an extension of the initial version developed by Susan-Resiga et al [1], to include the contributions of non-negligible radial velocity and of the variable rothalpy. Simple analytical expressions are derived for these additional data from three-dimensional numerical simulations of the Kaplan turbine. The final results, i.e. velocity components profiles, are validated against experimental data at two operating points, with the same Kaplan runner blades opening, but variable discharge.

  4. The Assembly and Emplacement of the Mushy Magma Model: A Historical Perspective

    NASA Astrophysics Data System (ADS)

    Bergantz, G. W.

    2012-12-01

    The "mush model" for magmatic systems has emerged as an alternative to the classic notion of a silicate liquid dominated reservoir, the so-called big tank model. The mush model is motivated by a concurrence of geochemical, geophysical and geological observations and new ideas on multiphase fluid dynamics. This presentation will review the historical development and remaining open questions about the mush model as it pertains to silicic systems. The observation that rhyolites have extreme depletions in Sr, Ba and Eu, as well as depletions in Zr and LREE, precluded an origin by direct crustal melting, instead requiring crystallization differentiation. This initially motivated the sidewall crystallization model, where less dense, evolved liquid originated and percolated upwards through a crystal-rich boundary zone, adjacent to a liquid dominated reservoir. The 'defrosting' or remobilization of this sidewall was proposed as a mechanism for producing complex temporal signatures in erupted suites, and this notion later found additional support in the recognition of so-called 'antecrysts.' Lab bench scale tank models of crystallizing salt solutions were offered as analogs for these boundary layer driven magmatic systems. However, it was recognized that features of this boundary layer model that did not agree with seismic, gravity and magnetotelluric, and geological observations. Seismic studies of silicic systems typically indicate P-wave velocity anomalies of 15% or greater for both shallow and deeper systems. But they do not show velocity anomalies that would indicate substantial regions of pure liquid in the core. Rather, the geophysical anomalies, are consistent with a with a spatially extensive crystal mush with an overlying thin melt lens. In addition the observation that erupted crystal poor liquids abruptly transition into crystal rich magmas with interstitial liquid compositions that are nearly identical to the crystal poor ones, provides evidence of a geometrical and source relationship between crystal poor and subjacent crystal rich mush. Lastly, it was appreciated that the boundary layer fluid dynamic models lacked geological verisimilitude, and invoked assumptions on heat transfer rates that were not in accord with geological conditions. Taken together this required a new conceptual model that could honor a broader range of constraints, and led to the 'full chamber' mush model as described by Hildreth (2001, 2004, 2007) and subsequently Bachmann and Bergantz (2004, 2008). However there are many open questions about this model, particularly how they are assembled, the physics of melt movement and mixing, and the way they respond to open system events. For example it is now recognized that crystal mushes can be remobilized rapidly and mineral isotopic and trace element zoning requires that the mush can go through some re-melting, consistent with the unzipping model of Burgisser and Bergantz (2011).

  5. METHODOLOGICAL ISSUES IN THE USE OF GENERALIZED ADDITIVE MODELS FOR THE ANALYSIS OF PARTICULATE MATTER; CONFERENCE PROCEEDINGS FOR 9TH INT'L. INHALATION SYMPOSIUM ON EFFECTS OF AIR CONTAMINANTS ON THE RESPIRATORY TRACT - INTERPRETATIONS FROM MOLECULES TO META ANALYSIS

    EPA Science Inventory

    Open cohort ("time-series") studies of the adverse health effects of short-term exposures to ambient particulate matter and gaseous co-pollutants have been essential in the standard setting process. Last year, a number of serious issues were raised concerning the fitting of Gener...

  6. Code-switched English Pronunciation Modeling for Swahili Spoken Term Detection (Pub Version, Open Access)

    DTIC Science & Technology

    2016-05-03

    using additional English resources. 2. Background The Babel program1 is an international collaborative effort sponsored by the US Intelligence Advanced...phenomenon is not as well studied for English / African language pairs, but some results are available8,9. 3. Experimental Setup The Swahili analysis...word pronunciations. From the analysis it was concluded that in most cases English words were pronounced using standard English letter-to-sound rules

  7. Early Experiences Writing Performance Portable OpenMP 4 Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joubert, Wayne; Hernandez, Oscar R

    In this paper, we evaluate the recently available directives in OpenMP 4 to parallelize a computational kernel using both the traditional shared memory approach and the newer accelerator targeting capabilities. In addition, we explore various transformations that attempt to increase application performance portability, and examine the expressiveness and performance implications of using these approaches. For example, we want to understand if the target map directives in OpenMP 4 improve data locality when mapped to a shared memory system, as opposed to the traditional first touch policy approach in traditional OpenMP. To that end, we use recent Cray and Intel compilers to measure the performance variations of a simple application kernel when executed on the OLCF's Titan supercomputer with NVIDIA GPUs and the Beacon system with Intel Xeon Phi accelerators attached. To better understand these trade-offs, we compare our results from traditional OpenMP shared memory implementations to the newer accelerator programming model when it is used to target both the CPU and an attached heterogeneous device. We believe the results and lessons learned as presented in this paper will be useful to the larger user community by providing guidelines that can assist programmers in the development of performance portable code.

  8. Why Do Some Estuaries Close: A Model of Estuary Entrance Morphodynamics.

    NASA Astrophysics Data System (ADS)

    McSweeney, S. L.; Kennedy, D. M.; Rutherfurd, I.

    2014-12-01

    Intermittently Closed/Open Coastal Lakes/Lagoons (ICOLLs) are a form of wave-dominated, microtidal estuary that experience periodic closure in times of low river flow. ICOLL entrance morphodynamics are complex due to the interaction between wave, tidal and fluvial processes. Managers invest substantial funds to artificially open ICOLLs as they flood surrounding property and infrastructure, and have poor water quality. Existing studies examine broad scale processes but do not identify the main drivers of entrance condition. In this research, the changes in entrance geomorphology were surveyed before and after artificial entrance openings in three ICOLLs in Victoria, Australia. Changes in morphology were related to continuous measures of sediment volume, water level, tide and wave energy. A six-stage quantitative phase model of entrance geomorphology and hydrodynamics is presented to illustrate the spatio-temporal variability in ICOLL entrance morphodynamics. Phases include: breakout; channel expansion with rapid outflow; open with tidal exchange; initial berm rebuilding with tidal attenuation; partial berm recovery with rising water levels; closed with perched water levels. Entrance breakout initiates incision of a pilot channel to the ocean, after which basin water levels decline and the channel expands as the headcut migrates landwards. Peak outflow velocities of 5 m/s were recorded, and channel dimensions increased over 6 hrs to 3.5 m deep and 140 m wide. When the entrance is tidal, a clear semi-diurnal signal is superimposed upon an otherwise stable water level. Deep-water wave energy was transferred 1.8 km upstream of the rivermouth, with bores present in the basin. Berm rebuilding occurred by littoral drift and cross-shore transport once outflow ceased, and microscale bedform features, particularly antidunes, contributed to sediment progradation. Phase duration is dependent on how high the estuary was perched above mean sea level, tidal prism extent, and onshore sediment supply. High offshore wave height and frequency, in addition to littoral drift magnitude, were the main drivers of closure. This study presents a predictive model of entrance morphodynamics whereby managers can determine proximity to natural closure or opening, and as a result identify whether implementing an artificial opening is worthwhile.

  9. Bright x-rays reveal shifting deformation states and effects of the microstructure on the plastic deformation of crystalline materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaudoin, A. J.; Shade, P. A.; Schuren, J. C.

    The plastic deformation of crystalline materials is usually modeled as smoothly progressing in space and time, yet modern studies show intermittency in the deformation dynamics of single-crystals arising from avalanche behavior of dislocation ensembles under uniform applied loads. However, once the prism of the microstructure in polycrystalline materials disperses and redistributes the load on a grain-by-grain basis, additional length and time scales are involved. Thus, the question is open as to how deformation intermittency manifests for the nonuniform grain-scale internal driving forces interacting with the finer-scale dislocation ensemble behavior. In this work we track the evolution of elastic strain within individual grains of a creep-loaded titanium alloy, revealing widely varying internal strains that fluctuate over time. Here, the findings provide direct evidence of how flow intermittency proceeds for an aggregate of ~700 grains while showing the influences of multiscale ensemble interactions and opening new avenues for advancing plasticity modeling.

  10. At-sea detection of marine debris: overview of technologies, processes, issues, and options.

    PubMed

    Mace, Thomas H

    2012-01-01

    At-sea detection of marine debris presents a difficult problem, as the debris items are often relatively small and partially submerged. However, they may accumulate in water parcel boundaries or eddy lines. The application of models, satellite radar and multispectral data, and airborne remote sensing (particularly radar) to focus the search on eddies and convergence zones in the open ocean appears to be a productive avenue of investigation. A multistage modeling and remote sensing approach is proposed for the identification of areas of the open ocean where debris items are more likely to congregate. A path forward may best be achieved through the refinement of the Ghost Net procedures with the addition of a final search stage using airborne radar from a UAS simulator aircraft to detect zones of potential accumulation for direct search. Sampling strategies, direct versus indirect measurements, remote sensing resolution, sensor/platform considerations, and future state are addressed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Thermal control of low-pressure fractionation processes. [in basaltic magma solidification

    NASA Technical Reports Server (NTRS)

    Usselman, T. M.; Hodge, D. S.

    1978-01-01

    Thermal models detailing the solidification paths for shallow basaltic magma chambers (both open and closed systems) were calculated using finite-difference techniques. The total solidification times for closed chambers are comparable to previously published calculations; however, the temperature-time paths are not. These paths are dependent on the phase relations and the crystallinity of the system, because both affect the manner in which the latent heat of crystallization is distributed. In open systems, where a chamber would be periodically replenished with additional parental liquid, calculations indicate a strong possibility that a steady-state temperature interval is achieved near a major phase boundary. In these cases it is straightforward to analyze fractionation models of the basaltic liquid evolution and their corresponding cumulate sequences. This steady thermal fractionating state can be invoked to explain large amounts of erupted basalts of similar composition over long time periods from the same volcanic center and some rhythmically layered basic cumulate sequences.
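    The finite-difference approach can be sketched in one dimension as explicit conduction with the latent heat of crystallization folded into an effective heat capacity over the solidification interval. All material values below are illustrative round numbers, not those used in the paper, and the closed-system geometry is greatly simplified.

    ```python
    # 1-D explicit finite-difference cooling of a sill with latent heat released
    # across a liquidus-solidus interval via an effective heat capacity.
    import numpy as np

    nx, dx, dt = 101, 1.0, 500.0                 # 100 m sill, 1 m cells, 500 s steps
    k, rho, cp, L = 2.5, 2700.0, 1200.0, 4.0e5   # W/m/K, kg/m3, J/kg/K, J/kg
    T_liquidus, T_solidus = 1200.0, 1000.0

    T = np.full(nx, 1200.0)                      # magma initially at the liquidus (deg C)
    T[0] = T[-1] = 300.0                         # fixed cold country-rock boundaries

    def effective_cp(temp):
        # Spread the latent heat linearly across the crystallization interval.
        in_interval = (temp > T_solidus) & (temp < T_liquidus)
        return cp + in_interval * L / (T_liquidus - T_solidus)

    for _ in range(200_000):                     # ~3 years of model time
        alpha = k / (rho * effective_cp(T))      # local thermal diffusivity
        T[1:-1] += alpha[1:-1] * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[0] = T[-1] = 300.0

    print(float(np.mean(T < T_solidus)))         # fraction of the sill below the solidus
    ```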

  12. Bright x-rays reveal shifting deformation states and effects of the microstructure on the plastic deformation of crystalline materials

    DOE PAGES

    Beaudoin, A. J.; Shade, P. A.; Schuren, J. C.; ...

    2017-11-30

    The plastic deformation of crystalline materials is usually modeled as smoothly progressing in space and time, yet modern studies show intermittency in the deformation dynamics of single-crystals arising from avalanche behavior of dislocation ensembles under uniform applied loads. However, once the prism of the microstructure in polycrystalline materials disperses and redistributes the load on a grain-by-grain basis, additional length and time scales are involved. Thus, the question is open as to how deformation intermittency manifests for the nonuniform grain-scale internal driving forces interacting with the finer-scale dislocation ensemble behavior. In this work we track the evolution of elastic strain within individual grains of a creep-loaded titanium alloy, revealing widely varying internal strains that fluctuate over time. Here, the findings provide direct evidence of how flow intermittency proceeds for an aggregate of ~700 grains while showing the influences of multiscale ensemble interactions and opening new avenues for advancing plasticity modeling.

  13. Neutrons and Fundamental Symmetries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plaster, Bradley

    2016-01-11

    The research supported by this project addressed fundamental open physics questions via experiments with subatomic particles. In particular, neutrons constitute an especially ideal "laboratory" for fundamental physics tests, as their sensitivities to the four known forces of nature permit a broad range of tests of the so-called "Standard Model", our current best physics model for the interactions of subatomic particles. Although the Standard Model has been a triumphant success for physics, it does not provide satisfactory answers to some of the most fundamental open questions in physics, such as: are there additional forces of nature beyond the gravitational, electromagnetic, weak nuclear, and strong nuclear forces, or why does our universe consist of more matter than anti-matter? This project also contributed significantly to the training of the next generation of scientists, of considerable value to the public. Young scientists, ranging from undergraduate students to graduate students to post-doctoral researchers, made significant contributions to the work carried out under this project.

  14. Wave-ice interaction, observed and modelled

    NASA Astrophysics Data System (ADS)

    Gemmrich, Johannes

    2017-04-01

    The need for wide-spread, up-to-date sea state predictions and observations in the emerging ice-free Arctic will further increase as the region opens up to marine operations. Wave models for arctic regions have to capture the additional wave physics associated with wave-ice interactions, and different prediction schemes have to be tested against observations. Here we present examples of spatial wave field parameters obtained from TerraSAR-X StripMap swaths in the southern Beaufort Sea taken as part of the "Arctic Sea State and Boundary Layer DRI". Fetch evolution of the significant wave height and length in open waters, and dominant wave lengths and the high frequency cut-off of the wave spectrum in ice are readily extracted from the SAR (synthetic aperture radar) data. A surprising result is that wave evolution in off-ice wind conditions is more rapid than the fetch evolution in off-land cases, suggesting seeding of the wave field within the ice-covered region.

  15. Brain Metastasis: Unique Challenges and Open Opportunities

    PubMed Central

    Lowery, Frank J.; Yu, Dihua

    2016-01-01

    The metastasis of cancer to the central nervous system (CNS) remains a devastating clinical reality, carrying an estimated survival time of less than one year in spite of recent therapeutic breakthroughs for other disease contexts. Advances in brain metastasis research are hindered for a number of reasons, including its complicated nature and the difficulty of modeling metastatic cancer growth in the unique brain microenvironment. In this review, we will discuss the clinical challenge, and compare the values and limitations of the available models for brain metastasis research. Additionally, we will specifically address current knowledge on how brain metastases take advantage of the unique brain environment to benefit their own growth. Finally, we will explore the distinctive metabolic and nutrient characteristics of the brain; how these paradoxically represent barriers to establishment of brain metastasis, but also provide ample supplies for metastatic cells’ growth in the brain. We envision that multi-disciplinary innovative approaches will open opportunities for the field to make breakthroughs in tackling unique challenges of brain metastasis. PMID:27939792

  16. Characterization of a microbial fuel cell with reticulated carbon foam electrodes.

    PubMed

    Lepage, Guillaume; Albernaz, Fabio Ovenhausen; Perrier, Gérard; Merlin, Gérard

    2012-11-01

    A microbial fuel cell with open-pore reticulated vitreous carbon electrodes is studied to assess the suitability of this material in batch mode, with a view towards flow-through reactors for wastewater treatment with electricity generation. The cell shows good stability and fair robustness with regard to substrate cycles. A power density of 40 W/m³ is reached. The cell efficiency is mainly limited by cathodic transfers, which represent 85% of the global overpotential in open circuit. Through impedance spectroscopy, equivalent circuit modeling reveals the complex nature of the bioelectrochemical phenomena. The global electrical behavior of the cell seems to result from the addition of three distinct anodic and two distinct cathodic phenomena. On the cathode side, the Warburg element in the model is related to the diffusion of oxygen. The Warburg resistance and time constant are respectively 2.99 kΩ cm² and 16.4 s, similar to values published elsewhere. Copyright © 2012 Elsevier Ltd. All rights reserved.
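    For reference, a finite-length Warburg element with the reported values can be evaluated as below; this is only the diffusion element, not the full equivalent circuit (three anodic and two cathodic contributions) described in the abstract, and the transmissive (short) form of the Warburg element is an assumption made for the sketch.

    ```python
    # Finite-length (transmissive) Warburg impedance with the reported parameters.
    import numpy as np

    R_W = 2.99e3        # Warburg resistance, ohm cm^2
    tau_W = 16.4        # Warburg time constant, s

    def warburg_short(freq_hz):
        """Z(omega) = R_W * tanh(sqrt(j*omega*tau)) / sqrt(j*omega*tau)."""
        s = np.sqrt(1j * 2 * np.pi * freq_hz * tau_W)
        return R_W * np.tanh(s) / s

    for f in np.logspace(-3, 2, 6):                        # 1 mHz to 100 Hz
        z = warburg_short(f)
        print(f"{f:8.3f} Hz  |Z| = {abs(z):8.1f} ohm cm^2  "
              f"phase = {np.degrees(np.angle(z)):6.1f} deg")
    ```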

  17. Post Processing Methods used to Improve Surface Finish of Products which are Manufactured by Additive Manufacturing Technologies: A Review

    NASA Astrophysics Data System (ADS)

    Kumbhar, N. N.; Mulay, A. V.

    2016-08-01

    Additive Manufacturing (AM) processes open the possibility of going directly from Computer-Aided Design (CAD) to a physical prototype. These prototypes are used as test models before the design is finalized and, in some cases, as final products. Additive manufacturing has many advantages over traditional product development processes, such as allowing early customer involvement, enabling the generation of complex shapes, and saving both time and money. Additive manufacturing also poses special challenges that are usually worth overcoming, such as poor surface quality, limited physical properties, and the need for specific raw materials. To improve surface quality, several attempts have been made by controlling various process parameters of additive manufacturing and by applying different post-processing techniques to components manufactured by additive manufacturing. The main objective of this work is to document an extensive literature review in the general area of post-processing techniques used in additive manufacturing.

  18. Molecular mechanism of pharmacological activation of BK channels

    PubMed Central

    Gessner, Guido; Cui, Yong-Mei; Otani, Yuko; Ohwada, Tomohiko; Soom, Malle; Hoshi, Toshinori; Heinemann, Stefan H.

    2012-01-01

    Large-conductance voltage- and Ca2+-activated K+ (Slo1 BK) channels serve numerous cellular functions, and their dysregulation is implicated in various diseases. Drugs activating BK channels therefore bear substantial therapeutic potential, but their deployment has been hindered in part because the mode of action remains obscure. Here we provide mechanistic insight into how the dehydroabietic acid derivative Cym04 activates BK channels. As a representative of NS1619-like BK openers, Cym04 reversibly left-shifts the half-activation voltage of Slo1 BK channels. Using an established allosteric BK gating model, the Cym04 effect can be simulated by a shift of the voltage sensor and the ion conduction gate equilibria toward the activated and open state, respectively. BK activation by Cym04 occurs in a splice variant-specific manner; it does not occur in such Slo1 BK channels using an alternative neuronal exon 9, which codes for the linker connecting the transmembrane segment S6 and the cytosolic RCK1 domain—the S6/RCK linker. In addition, Cym04 does not affect Slo1 BK channels with a two-residue deletion within this linker. Mutagenesis and model-based gating analysis revealed that BK openers, such as Cym04 and NS1619 but not mallotoxin, activate BK channels by functionally interacting with the S6/RCK linker, mimicking site-specific shortening of this purported passive spring, which transmits force from the cytosolic gating ring structure to open the channel's gate. PMID:22331907

  19. An Alternative Interpretation of the Relationship between the Inferred Open Solar Flux and the Interplanetary Magnetic Field

    NASA Technical Reports Server (NTRS)

    Riley, Pete

    2007-01-01

    Photospheric observations at the Wilcox Solar Observatory (WSO) represent an uninterrupted data set of 32 years and are therefore unique for modeling variations in the magnetic structure of the corona and inner heliosphere over three solar cycles. For many years, modelers have applied a latitudinal correction factor to these data, believing that it provided a better estimate of the line-of-sight magnetic field. Its application was defended by arguing that the computed open flux matched observations of the interplanetary magnetic field (IMF) significantly better than the original WSO correction factor. However, no physically based argument could be made for its use. In this Letter we explore the implications of using the constant correction factor on the value and variation of the computed open solar flux and its relationship to the measured IMF. We find that it does not match the measured IMF at 1 AU except at and surrounding solar minimum. However, we argue that interplanetary coronal mass ejections (ICMEs) may provide sufficient additional magnetic flux to the extent that a remarkably good match is found between the sum of the computed open flux and inferred ICME flux and the measured flux at 1 AU. If further substantiated, the implications of this interpretation may be significant, including a better understanding of the structure and strength of the coronal field and in providing constraints for theories of field line transport in the corona, the modulation of galactic cosmic rays, and even possibly terrestrial climate effects.

  20. CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.

    2013-12-01

    As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors, and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites in a region can be used to calculate a seismic hazard map, representing the seismic hazard for a region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission. We show how our approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over 1100 hazard curves. We will report on the performance of this CyberShake study, four times larger than previous studies. Additionally, we will examine the challenges we face applying these workflow techniques to additional open-science HPC systems and discuss whether our workflow solutions continue to provide value to our large-scale PSHA calculations.
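    As a purely illustrative sketch of the final hazard-curve step described above (not CyberShake code), the snippet below folds per-rupture annual rates and simulated intensity measures into a hazard curve; all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic rupture forecast: annual occurrence rate for each of 50 ruptures.
annual_rates = rng.uniform(1e-5, 1e-3, size=50)

# Synthetic intensity measures (e.g. spectral acceleration, g) for 20 realizations
# of each rupture, standing in for the reciprocity-based synthetic seismograms.
intensities = [rng.lognormal(np.log(0.05 * (i % 8 + 1)), 0.6, size=20)
               for i in range(50)]

# Hazard curve: annual rate of exceeding each intensity level, splitting each
# rupture's rate evenly across its realizations.
levels = np.logspace(-2, 0.5, 25)
exceed_rate = np.zeros_like(levels)
for rate, ims in zip(annual_rates, intensities):
    exceed_rate += rate * np.array([np.mean(ims > x) for x in levels])

# Poisson conversion of exceedance rates to probabilities over a 50-year window.
prob_50yr = 1.0 - np.exp(-exceed_rate * 50.0)
for x, p in zip(levels[::6], prob_50yr[::6]):
    print(f"P(IM > {x:5.3f} g in 50 yr) = {p:.4f}")
```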

  1. Biosecurity and Open-Source Biology: The Promise and Peril of Distributed Synthetic Biological Technologies.

    PubMed

    Evans, Nicholas G; Selgelid, Michael J

    2015-08-01

    In this article, we raise ethical concerns about the potential misuse of open-source biology (OSB): biological research and development that progresses through an organisational model of radical openness, deskilling, and innovation. We compare this organisational structure to that of the open-source software model, and detail salient ethical implications of this model. We demonstrate that OSB, in virtue of its commitment to openness, may be resistant to governance attempts.

  2. 11 CFR 9407.3 - Open meetings.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 11 Federal Elections 1 2014-01-01 2014-01-01 false Open meetings. 9407.3 Section 9407.3 Federal Elections ELECTION ASSISTANCE COMMISSION IMPLEMENTATION OF THE GOVERNMENT IN THE SUNSHINE ACT § 9407.3 Open... every Commission meeting shall be open to public observation. (c) No additional right to participate in...

  3. 11 CFR 9407.3 - Open meetings.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 11 Federal Elections 1 2011-01-01 2011-01-01 false Open meetings. 9407.3 Section 9407.3 Federal Elections ELECTION ASSISTANCE COMMISSION IMPLEMENTATION OF THE GOVERNMENT IN THE SUNSHINE ACT § 9407.3 Open... every Commission meeting shall be open to public observation. (c) No additional right to participate in...

  4. 11 CFR 9407.3 - Open meetings.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 11 Federal Elections 1 2012-01-01 2012-01-01 false Open meetings. 9407.3 Section 9407.3 Federal Elections ELECTION ASSISTANCE COMMISSION IMPLEMENTATION OF THE GOVERNMENT IN THE SUNSHINE ACT § 9407.3 Open... every Commission meeting shall be open to public observation. (c) No additional right to participate in...

  5. 11 CFR 9407.3 - Open meetings.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 11 Federal Elections 1 2013-01-01 2012-01-01 true Open meetings. 9407.3 Section 9407.3 Federal Elections ELECTION ASSISTANCE COMMISSION IMPLEMENTATION OF THE GOVERNMENT IN THE SUNSHINE ACT § 9407.3 Open... every Commission meeting shall be open to public observation. (c) No additional right to participate in...

  6. 11 CFR 9407.3 - Open meetings.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 11 Federal Elections 1 2010-01-01 2010-01-01 false Open meetings. 9407.3 Section 9407.3 Federal Elections ELECTION ASSISTANCE COMMISSION IMPLEMENTATION OF THE GOVERNMENT IN THE SUNSHINE ACT § 9407.3 Open... every Commission meeting shall be open to public observation. (c) No additional right to participate in...

  7. Colloquium: Non-Markovian dynamics in open quantum systems

    NASA Astrophysics Data System (ADS)

    Breuer, Heinz-Peter; Laine, Elsi-Mari; Piilo, Jyrki; Vacchini, Bassano

    2016-04-01

    The dynamical behavior of open quantum systems plays a key role in many applications of quantum mechanics, examples ranging from fundamental problems, such as the environment-induced decay of quantum coherence and relaxation in many-body systems, to applications in condensed matter theory, quantum transport, quantum chemistry, and quantum information. In close analogy to a classical Markovian stochastic process, the interaction of an open quantum system with a noisy environment is often modeled phenomenologically by means of a dynamical semigroup with a corresponding time-independent generator in Lindblad form, which describes a memoryless dynamics of the open system typically leading to an irreversible loss of characteristic quantum features. However, in many applications open systems exhibit pronounced memory effects and a revival of genuine quantum properties such as quantum coherence, correlations, and entanglement. Here recent theoretical results on the rich non-Markovian quantum dynamics of open systems are discussed, paying particular attention to the rigorous mathematical definition, to the physical interpretation and classification, as well as to the quantification of quantum memory effects. The general theory is illustrated by a series of physical examples. The analysis reveals that memory effects of the open system dynamics reflect characteristic features of the environment which opens a new perspective for applications, namely, to exploit a small open system as a quantum probe signifying nontrivial features of the environment it is interacting with. This Colloquium further explores the various physical sources of non-Markovian quantum dynamics, such as structured environmental spectral densities, nonlocal correlations between environmental degrees of freedom, and correlations in the initial system-environment state, in addition to developing schemes for their local detection. Recent experiments addressing the detection, quantification, and control of non-Markovian quantum dynamics are also briefly discussed.
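    For reference, the memoryless semigroup dynamics referred to above is generated by the standard Lindblad (GKSL) master equation,

    \[
    \frac{d\rho}{dt} = -\frac{i}{\hbar}[H,\rho]
      + \sum_k \gamma_k \Big( L_k \rho L_k^{\dagger}
      - \tfrac{1}{2}\{ L_k^{\dagger} L_k, \rho \} \Big),
    \]

    with constant rates \(\gamma_k \ge 0\); non-Markovian memory effects appear when the open-system dynamics can no longer be cast in this time-independent semigroup form, for example when the effective rates become time dependent or temporarily negative.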

  8. MHD natural convection and entropy generation in an open cavity having different horizontal porous blocks saturated with a ferrofluid

    NASA Astrophysics Data System (ADS)

    Gibanov, Nikita S.; Sheremet, Mikhail A.; Oztop, Hakan F.; Al-Salem, Khaled

    2018-04-01

    In this study, natural convection combined with entropy generation of Fe3O4-water nanofluid within a square open cavity filled with two different porous blocks under the influence of a uniform horizontal magnetic field is numerically studied. Porous blocks of different thermal properties, permeability and porosity are located on the bottom wall. The bottom wall of the cavity is kept at the hot temperature Th, while the upper open boundary is at the constant cold temperature Tc and the other walls of the cavity are assumed to be adiabatic. The governing equations with corresponding boundary conditions, formulated in dimensionless stream function and vorticity using the Brinkman-extended Darcy model for the porous blocks, have been solved numerically using the finite difference method. The numerical analysis has been carried out for wide ranges of Hartmann number, nanoparticle volume fraction and length of the porous blocks. It has been found that the addition of spherical ferric oxide nanoparticles can order the flow structures inside the cavity.
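    For reference, the two-dimensional stream function and vorticity variables used in such formulations are related by the standard definitions (the full dimensionless Brinkman-extended Darcy momentum and energy equations are not reproduced here):

    \[
    u = \frac{\partial \psi}{\partial y}, \qquad
    v = -\frac{\partial \psi}{\partial x}, \qquad
    \omega = \frac{\partial v}{\partial x} - \frac{\partial u}{\partial y}
    \quad\Longrightarrow\quad \nabla^{2}\psi = -\omega .
    \]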

  9. Scaling Agile Infrastructure to People

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Traylen, S.; Barrientos Arias, N.

    2015-12-01

    When CERN migrated its infrastructure away from homegrown fabric management tools to emerging industry-standard open-source solutions, the immediate technical challenges and motivation were clear. The move to a multi-site Cloud Computing model meant that the tool chains that were growing around this ecosystem would be a good choice, the challenge was to leverage them. The use of open-source tools brings challenges other than merely how to deploy them. Homegrown software, for all the deficiencies identified at the outset of the project, has the benefit of growing with the organization. This paper will examine what challenges there were in adapting open-source tools to the needs of the organization, particularly in the areas of multi-group development and security. Additionally, the increase in scale of the plant required changes to how Change Management was organized and managed. Continuous Integration techniques are used in order to manage the rate of change across multiple groups, and the tools and workflow for this will be examined.

  10. Microwave background anisotropies in quasiopen inflation

    NASA Astrophysics Data System (ADS)

    García-Bellido, Juan; Garriga, Jaume; Montes, Xavier

    1999-10-01

    Quasiopenness seems to be generic to multifield models of single-bubble open inflation. Instead of producing infinite open universes, these models actually produce an ensemble of very large but finite inflating islands. In this paper we study the possible constraints from CMB anisotropies on existing models of open inflation. The effect of supercurvature anisotropies combined with the quasiopenness of the inflating regions make some models incompatible with observations, and severely reduces the parameter space of others. Supernatural open inflation and the uncoupled two-field model seem to be ruled out due to these constraints for values of Ω0 ≲ 0.98. Others, such as the open hybrid inflation model with suitable parameters for the slow roll potential, can be made compatible with observations.

  11. Open Badges for Education: What Are the Implications at the Intersection of Open Systems and Badging?

    ERIC Educational Resources Information Center

    Ahn, June; Pellicone, Anthony; Butler, Brian S.

    2014-01-01

    Badges have garnered great interest among scholars of digital media and learning. In addition, widespread initiatives such as Mozilla's Open Badge Framework expand the potential of badging into the realm of open education. In this paper, we explicate the concept of open badges. We highlight some of the ways that researchers have examined…

  12. Introduction of a simulation model for choledocho- and pancreaticojejunostomy.

    PubMed

    Narumi, Shunji; Toyoki, Yoshikazu; Ishido, Keinosuke; Kudo, Daisuke; Umehara, Minoru; Kimura, Norihisa; Miura, Takuya; Muroya, Takahiro; Hakamada, Kenichi

    2012-10-01

    Pancreaticoduodenectomy includes choledochojejunostomy and pancreaticojejunostomy, which require hand-sewn anastomoses. Educational simulation models for choledochojejunostomy and pancreaticojejunostomy have not been designed. We introduce a simulation model for choledochojejunostomy and pancreaticojejunostomy created with a skin closure pad and a vascular model. A wound closure pad and a vein model (4 mm diameter) were used as a stump model of the pancreas. Pancreaticojejunostomy was simulated with the stump model of the pancreas and a double-layer bowel model; these models were stabilized in an end-to-side fashion on a magnetic board using magnetic clips. In addition, vein (6 or 8 mm diameter) and bowel models were used to simulate choledochojejunostomy. Pancreatic and hepatobiliary operations are relatively rare, particularly in community hospitals, although surgical residents wish to practice these procedures. Our simulator enables surgeons and surgical residents to practice choledocho- and pancreaticojejunostomy through open or laparoscopic approaches.

  13. Effect of vergence adaptation on convergence-accommodation: model simulations.

    PubMed

    Sreenivasan, Vidhyapriya; Bobier, William R; Irving, Elizabeth L; Lakshminarayanan, Vasudevan

    2009-10-01

    Several theoretical control models depict the adaptation effects observed in the accommodation and vergence mechanisms of the human visual system. Two current quantitative models differ in their approach to defining adaptation and in identifying the effect of controller adaptation on their respective cross-links between the vergence and accommodative systems. Here, we compare the simulation results of these adaptation models with empirical data obtained from emmetropic adults performing a sustained near task through a +2D lens addition. The results of our experimental study showed an initial increase in exophoria (a divergent open-loop vergence position) and convergence-accommodation (CA) when viewing through +2D lenses. Prolonged fixation through the near addition lenses initiated vergence adaptation, which reduced the lens-induced exophoria and resulted in a concurrent reduction of CA. Both models showed good agreement with empirical measures of vergence adaptation. However, only one model predicted the experimental time course of the reduction in CA. The pattern of our empirical results seems to be best described by the adaptation model in which the total vergence response is the sum of two controllers, phasic and tonic, with the output of the phasic controller providing input to the cross-link interactions.
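    As a toy illustration of the two-controller idea summarized above (this is not either published model; the time constants and gains are invented), a fast phasic element driven by the residual vergence error can feed a slow tonic integrator, so that adaptation gradually takes over the response and the phasic drive, and with it the cross-linked CA, declines:

```python
import numpy as np

dt = 0.01                          # s
t = np.arange(0.0, 300.0, dt)
demand = np.full_like(t, 6.0)      # hypothetical near-vergence demand (deg)

tau_phasic, tau_tonic_gain = 0.5, 0.02   # assumed phasic time constant, tonic gain
phasic = np.zeros_like(t)
tonic = np.zeros_like(t)

for i in range(1, t.size):
    response = phasic[i - 1] + tonic[i - 1]          # total vergence response
    error = demand[i - 1] - response                 # residual vergence error
    phasic[i] = phasic[i - 1] + dt * (-phasic[i - 1] + 10.0 * error) / tau_phasic
    tonic[i] = tonic[i - 1] + dt * tau_tonic_gain * phasic[i - 1]  # slow adaptation

# As the tonic output grows, the phasic component (the source of the cross-linked
# convergence-accommodation drive) declines, mirroring the reduction in CA.
print(round(phasic[0] + tonic[0], 2), round(phasic[-1] + tonic[-1], 2))
```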

  14. Bremerhaven's groundwater under climate change - A FREEWAT case study

    NASA Astrophysics Data System (ADS)

    Panteleit, Björn; Jensen, Sven; Seiter, Katherina; Siebert, Yvonne

    2018-01-01

    A 3D structural model was created for the state of Bremen based on an extensive borehole database. Parameters were assigned to the model by interpretation and interpolation of the borehole descriptions. This structural model was transferred into a flow model via the FREEWAT platform, an open-source plug-in of the free QGIS software, with connection to the MODFLOW code. This groundwater management tool is intended for long-term use. As a case study for the FREEWAT Project, possible effects of climate change on groundwater levels in the Bremerhaven area have been simulated. In addition to the calibration year 2010, scenarios with a sea-level rise and decreasing groundwater recharge were simulated for the years 2040, 2070 and 2100. In addition to seawater intrusion in the coastal area, declining groundwater levels are also a concern. Possibilities for future groundwater management already include active control of the water level of a lake and the harbor basin. With the help of a focused groundwater monitoring program based on the model results, the planned flow model can become an important forecasting tool for groundwater management within the framework of the planned continuous model management and for representing the effects of changing climatic conditions and mitigation measures.
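    FREEWAT itself is a QGIS plug-in, but the kind of MODFLOW setup it drives can be sketched in Python with the flopy package; the grid, boundary heads and scenario values below are invented placeholders, not the Bremerhaven model.

```python
import numpy as np
import flopy

# Minimal steady-state MODFLOW-2005 model (assumes the mf2005 executable is on PATH).
mf = flopy.modflow.Modflow(modelname="bhv_sketch", exe_name="mf2005")

nlay, nrow, ncol = 1, 20, 20
dis = flopy.modflow.ModflowDis(mf, nlay, nrow, ncol, delr=100.0, delc=100.0,
                               top=0.0, botm=-20.0)

ibound = np.ones((nlay, nrow, ncol), dtype=int)
ibound[:, :, 0] = -1       # fixed head: hypothetical sea / harbour-basin boundary
ibound[:, :, -1] = -1      # fixed head: hypothetical inland boundary
strt = np.zeros((nlay, nrow, ncol))
strt[:, :, 0] = 0.5        # invented sea-level scenario value (m)
strt[:, :, -1] = 2.0       # invented inland head (m)
bas = flopy.modflow.ModflowBas(mf, ibound=ibound, strt=strt)

lpf = flopy.modflow.ModflowLpf(mf, hk=10.0, vka=1.0)    # invented conductivities
rch = flopy.modflow.ModflowRch(mf, rech=2.0e-4)         # recharge; lowered in scenarios
pcg = flopy.modflow.ModflowPcg(mf)
oc = flopy.modflow.ModflowOc(mf)

mf.write_input()
success, _ = mf.run_model(silent=True)
print("converged:", success)
```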

  15. The Future of ECHO: Evaluating Open Source Possibilities

    NASA Astrophysics Data System (ADS)

    Pilone, D.; Gilman, J.; Baynes, K.; Mitchell, A. E.

    2012-12-01

    NASA's Earth Observing System ClearingHOuse (ECHO) is a format agnostic metadata repository supporting over 3000 collections and 100M science granules. ECHO exposes FTP and RESTful Data Ingest APIs in addition to both SOAP and RESTful search and order capabilities. Built on top of ECHO is a human facing search and order web application named Reverb. ECHO processes hundreds of orders, tens of thousands of searches, and 1-2M ingest actions each week. As ECHO's holdings, metadata format support, and visibility have increased, the ECHO team has received requests by non-NASA entities for copies of ECHO that can be run locally against their data holdings. ESDIS and the ECHO Team have begun investigations into various deployment and Open Sourcing models that can balance the real constraints faced by the ECHO project with the benefits of providing ECHO capabilities to a broader set of users and providers. This talk will discuss several release and Open Source models being investigated by the ECHO team along with the impacts those models are expected to have on the project. We discuss: - Addressing complex deployment or setup issues for potential users - Models of vetting code contributions - Balancing external (public) user requests versus our primary partners - Preparing project code for public release, including navigating licensing issues related to leveraged libraries - Dealing with non-free project dependencies such as commercial databases - Dealing with sensitive aspects of project code such as database passwords, authentication approaches, security through obscurity, etc. - Ongoing support for the released code including increased testing demands, bug fixes, security fixes, and new features.

  16. Comparison of endovascular repair with branched stent graft and open repair for aortic arch aneurysm.

    PubMed

    Kawatou, Masahide; Minakata, Kenji; Sakamoto, Kazuhisa; Nakatsu, Taro; Tazaki, Junichi; Higami, Hirooki; Uehara, Kyokun; Yamazaki, Kazuhiro; Inoue, Kanji; Kimura, Takeshi; Sakata, Ryuzo

    2017-08-01

    Although conventional open repair is our preference for patients with aortic arch aneurysms, we have often chosen thoracic endovascular aneurysm repair (TEVAR) with a handmade branched stent graft (bTEVAR) in high-risk patients. The aim of this study was to compare the midterm clinical outcomes of our bTEVAR technique to those of open repair. Between January 2007 and December 2014, we treated 129 patients with aortic arch aneurysm by means of either conventional open repair (OPEN, n = 61) or bTEVAR (n = 68) at our institution. The mean ages were 70.5 ± 12.7 years in the OPEN group and 72.7 ± 12.5 years in the bTEVAR group (P = 0.32). The aetiologies included true aneurysm in 101 patients (78.3%) and chronic dissection in 26 (20.1%). There were 2 (3.3%) in-hospital deaths in the OPEN group and 3 (4.4%) in the bTEVAR group. The mean follow-up duration was 3.0 ± 2.1 years (2.4 ± 1.9 years in the OPEN group and 3.6 ± 2.3 years in the bTEVAR group). There was no difference in 5-year aneurysm-related mortality between groups (10.7% in OPEN vs 12.8% in bTEVAR, P = 0.50). In terms of late additional procedures, however, none were required in the OPEN group, whereas 10 (15.4%) additional endovascular repairs and 4 (6.2%) open repairs were required in the bTEVAR group. Our bTEVAR could be performed with low early mortality, and it yielded midterm aneurysm-related mortality similar to that of conventional open repair. However, patients undergoing this technique required more late additional procedures than those undergoing conventional open repair. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  17. Personality and HIV Disease Progression: Role of NEO-PI-R Openness, Extraversion, and Profiles of Engagement

    PubMed Central

    O'Cleirigh, Conall; Schneiderman, Neil; Weiss, Alexander; Costa, Paul T.

    2008-01-01

    Objective To examine the role of the big five personality domains (Neuroticism, Extraversion, Openness, Agreeableness, Conscientiousness) and their respective facets and profiles on change in CD4 and log HIV-RNA copies/ml (VL) over 4 years. The examination of psychosocial predictors of disease progression in human immunodeficiency virus (HIV) has focused primarily on depression, coping, and stress, with little attention paid to stable individual differences. Methods A diverse sample of HIV-seropositive patients (n = 104) completed personality assessment (NEO-PI-R) and underwent comprehensive psychological assessment and blood sampling every 6 months for 4 years. Linear rates of change for CD4 cells and VL were modeled using Hierarchical Linear Modeling, controlling for antiretrovirals (time-dependent covariate), initial disease status, age, gender, ethnicity, and education. Results Domains that were significantly associated with slower disease progression over 4 years included Openness (CD4, VL), Extraversion (CD4, VL), and Conscientiousness (VL). Facets of the above domains that were significantly related to slower disease progression were assertiveness, positive emotions, and gregariousness (Extraversion); ideas, esthetics (Openness); achievement striving and order (Conscientiousness). In addition, profile analyses suggested that personality styles which underscore the importance of remaining engaged (e.g., Creative Interactors (E+O+), Upbeat Optimists (N−E+), Welcomers (E+A+), Go Getters (C+E+), and Directed (N−C+)) had slower disease progression, whereas the “homebody” profile (Low Extraversion-Low Openness) was significantly associated with faster disease progression. Conclusions These results provide good initial evidence of the relationship between personality and disease progression in HIV and suggest protective aspects of profiles of engagement. These findings may help identify those individuals at risk for a poorer disease course and specify targets for psychosocial interventions. PMID:18256349

  18. Evolution of dike opening during the March 2011 Kamoamoa fissure eruption, Kīlauea Volcano, Hawai`i

    USGS Publications Warehouse

    Lundgren, Paul; Poland, Michael; Miklius, Asta; Orr, Tim R.; Yun, Sang-Ho; Fielding, Eric; Liu, Zhen; Tanaka, Akiko; Szeliga, Walter; Hensley, Scott; Owen, Susan

    2013-01-01

    The 5–9 March 2011 Kamoamoa fissure eruption along the east rift zone of Kīlauea Volcano, Hawai`i, followed months of pronounced inflation at Kīlauea summit. We examine dike opening during and after the eruption using a comprehensive interferometric synthetic aperture radar (InSAR) data set in combination with continuous GPS data. We solve for distributed dike displacements using a whole Kīlauea model with dilating rift zones and possibly a deep décollement. Modeled surface dike opening increased from nearly 1.5 m to over 2.8 m from the first day to the end of the eruption, in agreement with field observations of surface fracturing. Surface dike opening ceased following the eruption, but subsurface opening in the dike continued into May 2011. Dike volumes increased from 15, to 16, to 21 million cubic meters (MCM) after the first day, eruption end, and 2 months following, respectively. Dike shape is distinctive, with a main limb plunging from the surface to 2–3 km depth in the up-rift direction toward Kīlauea's summit, and a lesser projection extending in the down-rift direction toward Pu`u `Ō`ō at 2 km depth. Volume losses beneath Kīlauea summit (1.7 MCM) and Pu`u `Ō`ō (5.6 MCM) crater, relative to dike plus erupted volume (18.3 MCM), yield a dike to source volume ratio of 2.5 that is in the range expected for compressible magma without requiring additional sources. Inflation of Kīlauea's summit in the months before the March 2011 eruption suggests that the Kamoamoa eruption resulted from overpressure of the volcano's magmatic system.
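    The quoted ratio follows directly from the stated volumes:

    \[
    \frac{V_{\text{dike}} + V_{\text{erupted}}}{V_{\text{summit}} + V_{\text{Pu`u `Ō`ō}}}
      = \frac{18.3\ \text{MCM}}{(1.7 + 5.6)\ \text{MCM}} \approx 2.5 .
    \]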

  19. Evolution of dike opening during the March 2011 Kamoamoa fissure eruption, Kīlauea Volcano, Hawai`i

    NASA Astrophysics Data System (ADS)

    Lundgren, Paul; Poland, Michael; Miklius, Asta; Orr, Tim; Yun, Sang-Ho; Fielding, Eric; Liu, Zhen; Tanaka, Akiko; Szeliga, Walter; Hensley, Scott; Owen, Susan

    2013-03-01

    The 5-9 March 2011 Kamoamoa fissure eruption along the east rift zone of Kīlauea Volcano, Hawai`i, followed months of pronounced inflation at Kīlauea summit. We examine dike opening during and after the eruption using a comprehensive interferometric synthetic aperture radar (InSAR) data set in combination with continuous GPS data. We solve for distributed dike displacements using a whole Kīlauea model with dilating rift zones and possibly a deep décollement. Modeled surface dike opening increased from nearly 1.5 m to over 2.8 m from the first day to the end of the eruption, in agreement with field observations of surface fracturing. Surface dike opening ceased following the eruption, but subsurface opening in the dike continued into May 2011. Dike volumes increased from 15, to 16, to 21 million cubic meters (MCM) after the first day, eruption end, and 2 months following, respectively. Dike shape is distinctive, with a main limb plunging from the surface to 2-3 km depth in the up-rift direction toward Kīlauea's summit, and a lesser projection extending in the down-rift direction toward Pu`u `Ō`ō at 2 km depth. Volume losses beneath Kīlauea summit (1.7 MCM) and Pu`u `Ō`ō (5.6 MCM) crater, relative to dike plus erupted volume (18.3 MCM), yield a dike to source volume ratio of 2.5 that is in the range expected for compressible magma without requiring additional sources. Inflation of Kīlauea's summit in the months before the March 2011 eruption suggests that the Kamoamoa eruption resulted from overpressure of the volcano's magmatic system.

  20. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C.; Somerville, P.; Jordan, T. H.

    2013-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving SCEC researchers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Broadband Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms of a historical earthquake for which observed strong ground motion data is available. Also in validation mode, the Broadband Platform calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. During the past year, we have modified the software to enable the addition of a large number of historical events, and we are now adding validation simulation inputs and observational data for 23 historical events covering the Eastern and Western United States, Japan, Taiwan, Turkey, and Italy. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. By establishing an interface between scientific modules with a common set of input and output files, the Broadband Platform facilitates the addition of new scientific methods, which are written by earth scientists in a number of languages such as C, C++, Fortran, and Python. The Broadband Platform's modular design also supports the reuse of existing software modules as building blocks to create new scientific methods. Additionally, the Platform implements a wrapper around each scientific module, converting input and output files to and from the specific formats required (or produced) by individual scientific codes. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes the addition of 3 new simulation methods and several new data products, such as map and distance-based goodness of fit plots. Finally, as the number and complexity of scenarios simulated using the Broadband Platform increase, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.
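    The per-module wrapper pattern described above can be sketched generically; the class, executable and file names below are hypothetical and are not the Broadband Platform's actual interfaces.

```python
import subprocess
from pathlib import Path

class ModuleWrapper:
    """Illustrative wrapper: convert a common input format to a module's native
    format, run the module's executable, and convert its output back."""

    def __init__(self, exe, to_native, from_native):
        self.exe = exe                  # path to the scientific module binary
        self.to_native = to_native      # callable: common input -> native input
        self.from_native = from_native  # callable: native output -> common output

    def run(self, common_input: Path, workdir: Path) -> Path:
        native_input = self.to_native(common_input, workdir)
        subprocess.run([str(self.exe), str(native_input)], cwd=workdir, check=True)
        native_output = workdir / "module_output.dat"   # hypothetical file name
        return self.from_native(native_output, workdir)

# Modules written in C, Fortran, or Python can then be chained behind the same
# interface: rupture generation -> low/high-frequency synthesis -> site effects.
```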

  1. SPIN EVOLUTION OF ACCRETING YOUNG STARS. I. EFFECT OF MAGNETIC STAR-DISK COUPLING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matt, Sean P.; Greene, Thomas P.; Pinzon, Giovanni

    2010-05-10

    We present a model for the rotational evolution of a young, solar mass star interacting with an accretion disk. The model incorporates a description of the angular momentum transfer between the star and the disk due to a magnetic connection, and includes changes in the star's mass and radius and a decreasing accretion rate. The model also includes, for the first time in a spin evolution model, the opening of the stellar magnetic field lines, as expected to arise from twisting via star-disk differential rotation. In order to isolate the effect that this has on the star-disk interaction torques, we neglect the influence of torques that may arise from open field regions connected to the star or disk. For a range of magnetic field strengths, accretion rates, and initial spin rates, we compute the stellar spin rates of pre-main-sequence stars as they evolve on the Hayashi track to an age of 3 Myr. How much the field opening affects the spin depends on the strength of the coupling of the magnetic field to the disk. For the relatively strong coupling (i.e., high magnetic Reynolds number) expected in real systems, all models predict spin periods of less than ≈3 days, in the age range of 1-3 Myr. Furthermore, these systems typically do not reach an equilibrium spin rate within 3 Myr, so that the spin at any given time depends upon the choice of initial spin rate. This corroborates earlier suggestions that, in order to explain the full range of observed rotation periods of approximately 1-10 days, additional processes, such as the angular momentum loss from powerful stellar winds, are necessary.

  2. Intelligent Multi-Media Presentation Using Rhetorical Structure Theory

    DTIC Science & Technology

    2015-01-01

    information repeatedly, on demand, and without imposing an additional manning burden. Virtual Advisers can be delivered in several ways: as a...up text which identifies what content is to be said in addition to how that content is to be emotionally expressed. </say> <say> Using real-time...development of new rendering engines. These toolkits provide additional common underlying functionality such as: pluggable audio (via OpenAL4/JOAL5

  3. Pyvolve: A Flexible Python Module for Simulating Sequences along Phylogenies.

    PubMed

    Spielman, Stephanie J; Wilke, Claus O

    2015-01-01

    We introduce Pyvolve, a flexible Python module for simulating genetic data along a phylogeny using continuous-time Markov models of sequence evolution. Easily incorporated into Python bioinformatics pipelines, Pyvolve can simulate sequences according to most standard models of nucleotide, amino-acid, and codon sequence evolution. All model parameters are fully customizable. Users can additionally specify custom evolutionary models, with custom rate matrices and/or states to evolve. This flexibility makes Pyvolve a convenient framework not only for simulating sequences under a wide variety of conditions, but also for developing and testing new evolutionary models. Pyvolve is an open-source project under a FreeBSD license, and it is available for download, along with a detailed user-manual and example scripts, from http://github.com/sjspielman/pyvolve.
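    A minimal usage sketch based on Pyvolve's documented interface (the tree string, sequence length and output file name are arbitrary examples):

```python
import pyvolve

# Phylogeny from a Newick string; branch lengths are in substitutions per site.
tree = pyvolve.read_tree(tree="(((t1:0.1,t2:0.1):0.05,t3:0.2):0.05,t4:0.3);")

# Built-in nucleotide model with default parameters; custom rate matrices,
# amino-acid and codon models are configured the same way.
model = pyvolve.Model("nucleotide")

# Evolve a 300-site partition under this model along the tree.
partition = pyvolve.Partition(models=model, size=300)
evolver = pyvolve.Evolver(partitions=partition, tree=tree)
evolver(seqfile="simulated_alignment.fasta")
```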

  4. Mouth opening in patients irradiated for head and neck cancer: a prospective repeated measures study.

    PubMed

    Kamstra, J I; Dijkstra, P U; van Leeuwen, M; Roodenburg, J L N; Langendijk, J A

    2015-05-01

    Aims of this prospective cohort study were (1) to analyze the course of mouth opening up to 48 months post-radiotherapy (RT), (2) to assess risk factors predicting decrease in mouth opening, and (3) to develop a multivariable prediction model for change in mouth opening in a large sample of patients irradiated for head and neck cancer. Mouth opening was measured prior to RT (baseline) and at 6, 12, 18, 24, 36, and 48 months post-RT. The primary outcome variable was mouth opening. Potential risk factors were entered into a linear mixed model analysis (manual backward-stepwise elimination) to create a multivariable prediction model. The interaction terms between time and risk factors that were significantly related to mouth opening were explored. The study population consisted of 641 patients: 70.4% male, mean age at baseline 62.3 years (sd 12.5). Primary tumors were predominantly located in the oro- and nasopharynx (25.3%) and oral cavity (20.6%). Mean mouth opening at baseline was 38.7 mm (sd 10.8). Six months post-RT, mean mouth opening was smallest, 36.7 mm (sd 10.0). In the linear mixed model analysis, mouth opening was statistically predicted by the location of the tumor, natural logarithm of time post-RT in months (Ln (months)), gender, baseline mouth opening, and baseline age. All main effects interacted with Ln (months). The mean mouth opening decreased slightly over time. Mouth opening was predicted by tumor location, time, gender, baseline mouth opening, and age. The model can be used to predict mouth opening. Copyright © 2015 Elsevier Ltd. All rights reserved.
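    As an illustration of the kind of linear mixed model described (the column and file names are hypothetical; this is not the authors' analysis code), using statsmodels:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per patient per post-RT visit, with
# columns patient_id, months_post_rt, mouth_opening, tumor_site, gender,
# baseline_mo, baseline_age.
df = pd.read_csv("mouth_opening_long.csv")          # hypothetical file name
df = df[df["months_post_rt"] > 0].copy()
df["ln_months"] = np.log(df["months_post_rt"])

# Random intercept per patient; main effects plus their interactions with
# ln(months), mirroring the predictors retained in the published model.
model = smf.mixedlm(
    "mouth_opening ~ ln_months * (tumor_site + gender + baseline_mo + baseline_age)",
    data=df,
    groups=df["patient_id"],
)
result = model.fit()
print(result.summary())
```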

  5. Structure of the dimeric RC–LH1–PufX complex from Rhodobaca bogoriensis investigated by electron microscopy

    PubMed Central

    Semchonok, Dmitry A.; Chauvin, Jean-Paul; Frese, Raoul N.; Jungas, Colette; Boekema, Egbert J.

    2012-01-01

    Electron microscopy and single-particle averaging were performed on isolated reaction centre (RC)—antenna complexes (RC–LH1–PufX complexes) of Rhodobaca bogoriensis strain LBB1, with the aim of establishing the LH1 antenna conformation, and, in particular, the structural role of the PufX protein. Projection maps of dimeric complexes were obtained at 13 Å resolution and show the positions of the 2 × 14 LH1 α- and β-subunits. This new dimeric complex displays two open, C-shaped LH1 aggregates of 13 αβ polypeptides partially surrounding the RCs plus two LH1 units forming the dimer interface in the centre. Between the interface and the two half rings are two openings on each side. Next to the openings, there are four additional densities present per dimer, considered to be occupied by four copies of PufX. The position of the RC in our model was verified by comparison with RC–LH1–PufX complexes in membranes. Our model differs from previously proposed configurations for Rhodobacter species in which the LH1 ribbon is continuous in the shape of an S, and the stoichiometry is of one PufX per RC. PMID:23148268

  6. Additional disturbances as a beneficial tool for restoration of post-mining sites: a multi-taxa approach.

    PubMed

    Řehounková, Klára; Čížek, Lukáš; Řehounek, Jiří; Šebelíková, Lenka; Tropek, Robert; Lencová, Kamila; Bogusch, Petr; Marhoul, Pavel; Máca, Jan

    2016-07-01

    Open interior sands represent a highly threatened habitat in Europe. In recent times, their associated organisms have often found secondary refuges outside their natural habitats, mainly in sand pits. We investigated the effects of different restoration approaches, i.e. spontaneous succession without additional disturbances, spontaneous succession with additional disturbances caused by recreational activities, and forestry reclamation, on the diversity and conservation values of spiders, beetles, flies, bees and wasps, orthopterans and vascular plants in a large sand pit in the Czech Republic, Central Europe. Out of 406 species recorded in total, 112 were classified as open sand specialists and 71 as threatened. The sites restored through spontaneous succession with additional disturbances hosted the largest proportion of open sand specialists and threatened species. The forestry reclamations, in contrast, hosted few such species. The sites with spontaneous succession without disturbances represent a transition between these two approaches. While restoration through spontaneous succession favours biodiversity in contrast to forestry reclamation, additional disturbances are necessary to maintain early successional habitats essential for threatened species and open sand specialists. Therefore, recreational activities seem to be an economically efficient restoration tool that will also benefit biodiversity in sand pits.

  7. OpenSesame: an open-source, graphical experiment builder for the social sciences.

    PubMed

    Mathôt, Sebastiaan; Schreij, Daniel; Theeuwes, Jan

    2012-06-01

    In the present article, we introduce OpenSesame, a graphical experiment builder for the social sciences. OpenSesame is free, open-source, and cross-platform. It features a comprehensive and intuitive graphical user interface and supports Python scripting for complex tasks. Additional functionality, such as support for eyetrackers, input devices, and video playback, is available through plug-ins. OpenSesame can be used in combination with existing software for creating experiments.

  8. Impact of mechanism vibration characteristics by joint clearance and optimization design of its multi-objective robustness

    NASA Astrophysics Data System (ADS)

    Zeng, Baoping; Wang, Chao; Zhang, Yu; Gong, Yajun; Hu, Sanbao

    2017-12-01

    Joint clearances and friction characteristics significantly influence the vibration characteristics of a mechanism. In a clearance joint, the shaft and bearing collide during operation, producing a dynamic normal contact force and a tangential Coulomb friction force; under these forces the whole system may vibrate, the mechanism switches between free motion and contact-impact, and its topology changes. The constraint relationship between joints is established by a repeated, complex nonlinear dynamic process (idle stroke - contact-impact - elastic compression - rebound - impact relief - idle stroke - contact-impact). Analyzing the vibration characteristics of joints therefore remains a challenging open task. The dynamic equations of a mechanism with clearance are typically a set of strongly coupled, high-dimensional, time-varying nonlinear differential equations that are difficult to solve. Moreover, the impact- and clearance-induced motions can be chaotic and highly sensitive to initial conditions, which makes high-precision simulation and prediction of the dynamic behavior even harder; in addition, subsequent wear causes the clearance parameters to fluctuate, which is itself a primary source of vibration in the mechanical system. In this study, a dynamic model of a deepwater-robot cabin-door opening mechanism with joint clearance was established using the finite element method, and its vibration characteristics were analyzed. A response model was then built using the design-of-experiments (DOE) method, and a robust optimization of the joint clearance sizes and the range of the friction coefficient was performed, so that the results can serve as reference data for selecting bearings and controlling manufacturing process parameters of the opening mechanism. The optimization model included several objectives, such as the x/y/z accelerations at various measuring points and the dynamic reaction forces of the mounting brackets, together with constraints including manufacturing process limits, and was solved with the multi-objective genetic algorithm NSGA-II. The vibration characteristics of the optimized opening mechanism are superior to those of the original design, and the numerical predictions agree well with the test results from the prototype.
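    The record names NSGA-II as the solver. As a purely illustrative sketch (not the authors' optimization model), the snippet below runs NSGA-II from the pymoo package (pymoo >= 0.6) on a standard two-objective test problem; in the study the objectives would instead be the measured-point accelerations and bracket reaction forces evaluated through the DOE response model.

```python
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize
from pymoo.problems import get_problem

# Stand-in problem (ZDT1); a real run would wrap the response-surface model of
# clearance size and friction-coefficient range as a custom Problem subclass.
problem = get_problem("zdt1")
algorithm = NSGA2(pop_size=100)

res = minimize(problem, algorithm, ("n_gen", 200), seed=1, verbose=False)
print(res.F[:5])   # first few points of the Pareto front (objective values)
```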

  9. A feasibility study on porting the community land model onto accelerators using OpenACC

    DOE PAGES

    Wang, Dali; Wu, Wei; Winkler, Frank; ...

    2014-01-01

    As environmental models (such as the Accelerated Climate Model for Energy (ACME), the Parallel Reactive Flow and Transport Model (PFLOTRAN), the Arctic Terrestrial Simulator (ATS), etc.) become more and more complicated, we face enormous challenges in porting those applications onto hybrid computing architectures. OpenACC appears to be a very promising technology; therefore, we have conducted a feasibility analysis of porting the Community Land Model (CLM), a terrestrial ecosystem model within the Community Earth System Model (CESM). Specifically, we used an automated function testing platform to extract a small computing kernel out of CLM, applied this kernel within the actual CLM dataflow procedure, and investigated the strategy of data parallelization and the benefit of data movement provided by the current implementation of OpenACC. Even though it is a non-intensive kernel, on a single 16-core computing node the performance (based on the actual computation time using one GPU) of the OpenACC implementation is 2.3 times faster than that of the OpenMP implementation using a single OpenMP thread, but 2.8 times slower than the performance of the OpenMP implementation using 16 threads. On multiple nodes, the MPI+OpenACC implementation demonstrated very good scalability on up to 128 GPUs on 128 computing nodes. This study also provides useful information for looking into the potential benefits of the "deep copy" capability and "routine" feature of the OpenACC standards. In conclusion, we believe that our experience with the environmental model CLM can be beneficial to many other scientific research programs that are interested in porting their large-scale scientific code onto high-end computers with hybrid computing architectures using OpenACC.

  10. Experimental and modelling of Arthrospira platensis cultivation in open raceway ponds.

    PubMed

    Ranganathan, Panneerselvam; Amal, J C; Savithri, S; Haridas, Ajith

    2017-10-01

    In this study, the growth of Arthrospira platensis was studied in an open raceway pond. Furthermore, a dynamic model for algal growth and a CFD model of the hydrodynamics in the open raceway pond were developed. The dynamic behaviour of the algal system was modelled by solving mass balance equations for the various components, considering light intensity and gas-liquid mass transfer. The CFD model of the hydrodynamics of the open raceway pond was developed by solving mass and momentum balance equations for the liquid medium. The prediction of algae concentration from the dynamic model was compared with the experimental data. The hydrodynamic behaviour of the open raceway pond was compared with literature data for model validation. The model predictions match the experimental findings. Furthermore, the hydrodynamic behaviour and residence time distribution in our small raceway pond were predicted. These models can serve as a tool to assess pond performance criteria. Copyright © 2017 Elsevier Ltd. All rights reserved.
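    As a generic illustration of a dynamic growth model of this type (light-limited Monod kinetics with a logistic crowding term; all parameter values are invented and this is not the published model):

```python
import numpy as np
from scipy.integrate import solve_ivp

MU_MAX = 1.2      # 1/day, assumed maximum specific growth rate
K_I = 100.0       # umol/m^2/s, assumed light half-saturation constant
X_MAX = 2.0       # g/L, assumed carrying capacity of the pond
I0 = 400.0        # umol/m^2/s, assumed midday surface irradiance

def light(t_days):
    # Crude day/night cycle: positive half-sine during daytime, zero at night.
    return max(0.0, I0 * np.sin(2.0 * np.pi * t_days))

def growth(t, y):
    biomass = y[0]
    irradiance = light(t)
    mu = MU_MAX * irradiance / (K_I + irradiance)        # light-limited Monod term
    return [mu * biomass * (1.0 - biomass / X_MAX)]      # logistic crowding term

sol = solve_ivp(growth, (0.0, 12.0), [0.05], max_step=0.01)
print(f"biomass after 12 days: {sol.y[0, -1]:.2f} g/L")
```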

  11. Archetype-based conversion of EHR content models: pilot experience with a regional EHR system.

    PubMed

    Chen, Rong; Klein, Gunnar O; Sundvall, Erik; Karlsson, Daniel; Ahlfeldt, Hans

    2009-07-01

    Exchange of Electronic Health Record (EHR) data between systems from different suppliers is a major challenge. EHR communication based on archetype methodology has been developed by openEHR and CEN/ISO. The experience of using archetypes in deployed EHR systems is quite limited today. Currently deployed EHR systems with large user bases have their own proprietary way of representing clinical content using various models. This study was designed to investigate the feasibility of representing EHR content models from a regional EHR system as openEHR archetypes and inversely to convert archetypes to the proprietary format. The openEHR EHR Reference Model (RM) and Archetype Model (AM) specifications were used. The template model of the Cambio COSMIC, a regional EHR product from Sweden, was analyzed and compared to the openEHR RM and AM. This study was focused on the convertibility of the EHR semantic models. A semantic mapping between the openEHR RM/AM and the COSMIC template model was produced and used as the basis for developing prototype software that performs automated bi-directional conversion between openEHR archetypes and COSMIC templates. Automated bi-directional conversion between openEHR archetype format and COSMIC template format has been achieved. Several archetypes from the openEHR Clinical Knowledge Repository have been imported into COSMIC, preserving most of the structural and terminology related constraints. COSMIC templates from a large regional installation were successfully converted into the openEHR archetype format. The conversion from the COSMIC templates into archetype format preserves nearly all structural and semantic definitions of the original content models. A strategy of gradually adding archetype support to legacy EHR systems was formulated in order to allow sharing of clinical content models defined using different formats. The openEHR RM and AM are expressive enough to represent the existing clinical content models from the template based EHR system tested and legacy content models can automatically be converted to archetype format for sharing of knowledge. With some limitations, internationally available archetypes could be converted to the legacy EHR models. Archetype support can be added to legacy EHR systems in an incremental way allowing a migration path to interoperability based on standards.

  12. Characterization of Air Emissions from Open Burning and ...

    EPA Pesticide Factsheets

    Emissions from open burning (OB) and open detonation (OD) of military ordnance and static fires (SF) of rocket motors were sampled in fall 2013 at the Dundurn Depot (Saskatchewan, Canada). Emission sampling was conducted with an aerostat-lofted instrument package termed the "Flyer" that was maneuvered into the downwind plumes. Forty-nine OB events, 94 OD events, and 16 SF on four propellant types (Triple base, 105 M1, 155 M4A2 white bag, and 155 M6 red bag), two smokes (HC grenade and red phosphorus), five explosive types (Trigran, C4, ANFO, ANFO+HC grenade, and ANFO+Flare), and two rocket motor types (CVR-7 and MK 58) resulted in emission factors for particulate matter (PM), carbon dioxide (CO2), carbon monoxide (CO), methane (CH4), volatile organic compounds (VOCs), chlorine species (HCl, chloride, chlorate, perchlorate), polychlorinated dibenzodioxins and polychlorinated dibenzofurans (PCDDs/PCDFs) and PM-based metals. These data provide Canada and the United States with additional air emissions data to support health risk assessments and permitting for safe treatment of military ordnance by OB/OD/SF. In addition, the data will be used to conduct air dispersion modelling assessing the impact of the treatment of various ordnance on air quality, to support mandatory reporting requirements of the Canadian Environmental Protection Act (CEPA) and the National Pollutant Release Inventory (NPRI), and to update the Canadian Ammunition Chemical Database.

  13. Open source software to control Bioflo bioreactors.

    PubMed

    Burdge, David A; Libourel, Igor G L

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.

  14. Open Source Software to Control Bioflo Bioreactors

    PubMed Central

    Burdge, David A.; Libourel, Igor G. L.

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828

  15. Screening the Medicines for Malaria Venture Pathogen Box across Multiple Pathogens Reclassifies Starting Points for Open-Source Drug Discovery

    PubMed Central

    Sykes, Melissa L.; Jones, Amy J.; Shelper, Todd B.; Simpson, Moana; Lang, Rebecca; Poulsen, Sally-Ann; Sleebs, Brad E.

    2017-01-01

    Open-access drug discovery provides a substantial resource for diseases primarily affecting the poor and disadvantaged. The open-access Pathogen Box collection is comprised of compounds with demonstrated biological activity against specific pathogenic organisms. The supply of this resource by the Medicines for Malaria Venture has the potential to provide new chemical starting points for a number of tropical and neglected diseases, through repurposing of these compounds for use in drug discovery campaigns for these additional pathogens. We tested the Pathogen Box against kinetoplastid parasites and malaria life cycle stages in vitro. Consequently, chemical starting points for malaria, human African trypanosomiasis, Chagas disease, and leishmaniasis drug discovery efforts have been identified. Inclusive of this in vitro biological evaluation, outcomes from extensive literature reviews and database searches are provided. This information encompasses commercial availability, literature reference citations, other aliases and ChEMBL number with associated biological activity, where available. The release of this new data for the Pathogen Box collection into the public domain will aid the open-source model of drug discovery. Importantly, this will provide novel chemical starting points for drug discovery and target identification in tropical disease research. PMID:28674055

  16. Implementation of a near-real time cross-border web-mapping platform on airborne particulate matter (PM) concentration with open-source software

    NASA Astrophysics Data System (ADS)

    Knörchen, Achim; Ketzler, Gunnar; Schneider, Christoph

    2015-01-01

    Although Europe has been growing together for the past decades, cross-border information platforms on environmental issues are still scarce. With regard to the establishment of a web-mapping tool on airborne particulate matter (PM) concentration for the Euregio Meuse-Rhine located in the border region of Belgium, Germany and the Netherlands, this article describes the research on methodical and technical backgrounds implementing such a platform. An open-source solution was selected for presenting the data in a Web GIS (OpenLayers/GeoExt; both JavaScript-based), applying other free tools for data handling (Python), data management (PostgreSQL), geo-statistical modelling (Octave), geoprocessing (GRASS GIS/GDAL) and web mapping (MapServer). The multilingual, made-to-order online platform provides access to near-real time data on PM concentration as well as additional background information. In an open data section, commented configuration files for the Web GIS client are being made available for download. Furthermore, all geodata generated by the project is being published under public domain and can be retrieved in various formats or integrated into Desktop GIS as Web Map Services (WMS).
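
    As a rough illustration of how such a platform can be consumed programmatically, the following Python snippet requests a map image through the standard OGC WMS GetMap interface. The endpoint URL, layer name and bounding box are placeholders, not the project's actual service.

        # Sketch of retrieving a PM-concentration map layer via the OGC WMS standard.
        # The endpoint URL and layer name below are placeholders, not the project's actual service.
        import requests

        WMS_URL = "https://example.org/cgi-bin/mapserv"   # hypothetical MapServer endpoint
        params = {
            "SERVICE": "WMS",
            "VERSION": "1.3.0",
            "REQUEST": "GetMap",
            "LAYERS": "pm10_interpolated",                # hypothetical layer name
            "CRS": "EPSG:4326",
            "BBOX": "50.3,5.5,51.2,6.6",                  # lat/lon axis order for EPSG:4326 in WMS 1.3.0
            "WIDTH": "800",
            "HEIGHT": "600",
            "FORMAT": "image/png",
        }

        response = requests.get(WMS_URL, params=params, timeout=30)
        response.raise_for_status()
        with open("pm10_map.png", "wb") as f:
            f.write(response.content)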

  17. [Effects of psychological stress on performances in open-field test of rats and tyrosine's modulation].

    PubMed

    Chen, Wei-Qiang; Cheng, Yi-Yong; Li, Shu-Tian; Hong, Yan; Wang, Dong-Lan; Hou, Yue

    2009-02-01

    To explore the effects of different doses of tyrosine on behavioral performance in the open-field test of psychologically stressed rats, an animal model of psychological stress was developed by restraint stress for 21 days. Wistar rats were randomly assigned to five groups (n = 10): a control group (CT), a stress control group (SCT), and low-, medium- and high-dose tyrosine stress groups (SLT, SMT and SHT). Changes in behavioral performance were examined by the open-field test, and serum levels of cortisol, norepinephrine and dopamine were measured. Serum cortisol levels were markedly increased in all four stress groups, and their body weight gains were diminished. The open-field behavior of SCT rats changed significantly relative to CT rats, whereas the behavior of SMT and SHT rats did not differ from that of CT rats. In addition, serum norepinephrine and dopamine levels were markedly downregulated in the SCT and SLT groups, with no differences observed in the other groups. Psychological stress can impair behavioral performance, and moderate tyrosine supplementation may ameliorate these abnormal changes; the underlying mechanisms may involve changes in norepinephrine and dopamine.

  18. Understanding spatio-temporal strategies of adult zebrafish exploration in the open field test.

    PubMed

    Stewart, Adam Michael; Gaikwad, Siddharth; Kyzar, Evan; Kalueff, Allan V

    2012-04-27

    Zebrafish (Danio rerio) are emerging as a useful model organism for neuroscience research. Mounting evidence suggests that various traditional rodent paradigms may be adapted for testing zebrafish behavior. The open field test is a popular rodent test of novelty exploration, recently applied to zebrafish research. To better understand fish novelty behavior, we exposed adult zebrafish to two different open field arenas for 30 min, assessing the amount and temporal patterning of their exploration. While (similar to rodents) zebrafish scale their locomotory activity depending on the size of the tank, the temporal patterning of their activity was independent of arena size. These observations strikingly parallel similar rodent behaviors, suggesting that spatio-temporal strategies of animal exploration may be evolutionarily conserved across vertebrate species. In addition, we found interesting oscillations in zebrafish exploration, with the per-minute distribution of their horizontal activity demonstrating sinusoidal-like patterns. While such patterning is not reported for rodents and other higher vertebrates, a nonlinear regression analysis confirmed the oscillation patterning of all assessed zebrafish behavioral endpoints in both open field arenas, revealing a potentially important aspect of novelty exploration in lower vertebrates. Copyright © 2012 Elsevier B.V. All rights reserved.
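
    The oscillation analysis described above amounts to fitting a sinusoidal model to per-minute activity data. The following Python sketch, using synthetic data and illustrative parameter values rather than the study's measurements, shows one way such a nonlinear regression can be set up.

        # Minimal sketch (with synthetic data) of the kind of nonlinear regression used to test
        # for sinusoidal patterning in per-minute activity; all parameter values are illustrative.
        import numpy as np
        from scipy.optimize import curve_fit

        def sine_model(t, amplitude, period, phase, baseline):
            return amplitude * np.sin(2 * np.pi * t / period + phase) + baseline

        minutes = np.arange(1, 31)                      # 30-min session, per-minute bins
        rng = np.random.default_rng(0)
        distance = sine_model(minutes, 1.5, 10.0, 0.3, 6.0) + rng.normal(0, 0.4, minutes.size)

        popt, pcov = curve_fit(sine_model, minutes, distance,
                               p0=[1.0, 10.0, 0.0, distance.mean()])
        residuals = distance - sine_model(minutes, *popt)
        r_squared = 1 - residuals.var() / distance.var()
        print("fitted amplitude, period, phase, baseline:", popt, "R^2:", r_squared)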

  19. Screening the Medicines for Malaria Venture Pathogen Box across Multiple Pathogens Reclassifies Starting Points for Open-Source Drug Discovery.

    PubMed

    Duffy, Sandra; Sykes, Melissa L; Jones, Amy J; Shelper, Todd B; Simpson, Moana; Lang, Rebecca; Poulsen, Sally-Ann; Sleebs, Brad E; Avery, Vicky M

    2017-09-01

    Open-access drug discovery provides a substantial resource for diseases primarily affecting the poor and disadvantaged. The open-access Pathogen Box collection is comprised of compounds with demonstrated biological activity against specific pathogenic organisms. The supply of this resource by the Medicines for Malaria Venture has the potential to provide new chemical starting points for a number of tropical and neglected diseases, through repurposing of these compounds for use in drug discovery campaigns for these additional pathogens. We tested the Pathogen Box against kinetoplastid parasites and malaria life cycle stages in vitro. Consequently, chemical starting points for malaria, human African trypanosomiasis, Chagas disease, and leishmaniasis drug discovery efforts have been identified. Inclusive of this in vitro biological evaluation, outcomes from extensive literature reviews and database searches are provided. This information encompasses commercial availability, literature reference citations, other aliases and ChEMBL number with associated biological activity, where available. The release of this new data for the Pathogen Box collection into the public domain will aid the open-source model of drug discovery. Importantly, this will provide novel chemical starting points for drug discovery and target identification in tropical disease research. Copyright © 2017 Duffy et al.

  20. Molecular dynamics simulations give insight into the conformational change, complex formation, and electron transfer pathway for cytochrome P450 reductase

    PubMed Central

    Sündermann, Axel; Oostenbrink, Chris

    2013-01-01

    Cytochrome P450 reductase (CYPOR) undergoes a large conformational change to allow for an electron transfer to a redox partner to take place. After an internal electron transfer over its cofactors, it opens up to facilitate the interaction and electron transfer with a cytochrome P450. The open conformation appears difficult to crystallize. Therefore, a model of a human CYPOR in the open conformation was constructed to be able to investigate the stability and conformational change of this protein by means of molecular dynamics simulations. Since the role of the protein is to provide electrons to a redox partner, the interactions with cytochrome P450 2D6 (2D6) were investigated and a possible complex structure is suggested. Additionally, electron pathway calculations with a newly written program were performed to investigate which amino acids relay the electrons from the FMN cofactor of CYPOR to the HEME of 2D6. Several possible interacting amino acids in the complex, as well as a possible electron transfer pathway, were identified and open the way for further investigation by site-directed mutagenesis studies. PMID:23832577

  1. Strand swapping regulates the iron-sulfur cluster in the diabetes drug target mitoNEET

    PubMed Central

    Baxter, Elizabeth Leigh; Jennings, Patricia A.; Onuchic, José N.

    2012-01-01

    MitoNEET is a recently identified diabetes drug target that coordinates a transferable 2Fe-2S cluster, and additionally contains an unusual strand swap. In this manuscript, we use a dual basin structure-based model to predict and characterize the folding and functionality of strand swapping in mitoNEET. We demonstrate that a strand unswapped conformation is kinetically accessible and that multiple levels of control are employed to regulate the conformational dynamics of the system. Environmental factors such as temperature can shift route preference toward the unswapped pathway. Additionally we see that a region recently identified as contributing to frustration in folding acts as a regulatory hinge loop that modulates conformational balance. Interestingly, strand unswapping transfers strain specifically to cluster-coordinating residues, opening the cluster-coordinating pocket. Strengthening contacts within the cluster-coordinating pocket opens a new pathway between the swapped and unswapped conformation that utilizes cracking to bypass the unfolded basin. These results suggest that local control within distinct regions affect motions important in regulating mitoNEET’s 2Fe-2S clusters. PMID:22308404

  2. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    NASA Astrophysics Data System (ADS)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water-scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision-making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for their fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation is performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners, etc.) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and institutional logic from higher level processes (engines) suits JWP's requirements. The use of Hydra Platform and Pynsim helps make complex customised models such as the JWP model easier to run and manage with international groups of researchers.
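
    The node/link/institution/engine pattern described above can be illustrated with a few lines of Python. The class and method names below are invented for this sketch and are not Pynsim's actual API; they only mirror the idea of autonomous agents plus a network-wide engine.

        # Generic illustration of the node/link/institution/engine pattern (not Pynsim's API).
        class Node:
            def __init__(self, name, demand):
                self.name, self.demand, self.supplied = name, demand, 0.0
            def step(self, t):
                pass  # node-level logic would run here each timestep

        class Institution:
            """A grouping of nodes (e.g. a utility) with its own decision logic."""
            def __init__(self, name, nodes):
                self.name, self.nodes = name, nodes
            def step(self, t):
                pass

        class AllocationEngine:
            """An 'engine' acting over the whole network: proportional water allocation."""
            def run(self, nodes, available):
                total_demand = sum(n.demand for n in nodes) or 1.0
                for n in nodes:
                    n.supplied = available * n.demand / total_demand

        nodes = [Node("Amman", 90.0), Node("Irbid", 40.0), Node("Zarqa", 55.0)]
        utility = Institution("national_utility", nodes)
        engine = AllocationEngine()
        for t in range(3):                       # three timesteps
            for agent in nodes + [utility]:
                agent.step(t)                    # autonomous node/institution code
            engine.run(nodes, available=120.0)   # network-wide sub-model
            print(t, {n.name: round(n.supplied, 1) for n in nodes})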

  3. Numerical investigation of unsteady cavitation around a NACA 66 hydrofoil using OpenFOAM

    NASA Astrophysics Data System (ADS)

    Hidalgo, V. H.; Luo, X. W.; Escaler, X.; Ji, J.; Aguinaga, A.

    2014-03-01

    The prediction and control of cavitation damage in pumps, propellers, hydro turbines and fluid machinery in general is necessary during the design stage. The present paper deals with a numerical investigation of unsteady cloud cavitation around a NACA 66 hydrofoil. The current study is focused on understanding the dynamic pressures generated during the cavity collapses as a fundamental characteristic in cavitation erosion. 2D and 3D unsteady flow simulations have been carried out using OpenFOAM. Then, ParaView and the Python programming language have been used to characterize the dynamic pressure field. An adapted Large Eddy Simulation (LES) approach and the Zwart cavitation model have been implemented to improve the analysis of cloud motion and to visualize the bubble expansions. Additional results also confirm the correlation between cavity formation and the generated pressures.
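
    Characterizing the dynamic pressure field typically reduces to post-processing probe time series. The generic Python sketch below flags pressure peaks (candidate collapse events); the file name and two-column layout are assumptions for illustration, not the exact output format used in the study.

        # Generic post-processing sketch: detect pressure peaks (candidate collapse events) in a
        # probe time series. The file name and two-column layout are assumptions for illustration.
        import numpy as np

        data = np.loadtxt("surface_probe.dat")      # hypothetical: column 0 = time [s], column 1 = p [Pa]
        t, p = data[:, 0], data[:, 1]

        threshold = p.mean() + 3 * p.std()          # flag pressures well above the mean
        is_peak = (p[1:-1] > p[:-2]) & (p[1:-1] > p[2:]) & (p[1:-1] > threshold)
        peak_idx = np.where(is_peak)[0] + 1

        for i in peak_idx:
            print(f"t = {t[i]:.5f} s, p = {p[i]:.1f} Pa")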

  4. Investigation of Blade Angle of an Open Cross-Flow Runner

    NASA Astrophysics Data System (ADS)

    Katayama, Yusuke; Iio, Shouichiro; Veerapun, Salisa; Uchiyama, Tomomi

    2015-04-01

    The aim of this study was to develop a nano-hydraulic turbine utilizing drop structures in irrigation channels or industrial waterways. The study focused on an open-type cross-flow turbine without any attached equipment, for cost reduction and easy maintenance. The authors used an artificial indoor waterfall as a lab model. The test runner, a simple structure of 20 circular arc-shaped blades sandwiched between two circular plates, was used. The optimum inlet blade angle and the relationship between power performance and flow rate were investigated theoretically and experimentally. The results show that the optimum inlet blade angle changes with the flow rate. Additionally, the allocation of power output between the 1st and 2nd stages changes with the inlet blade angle.

  5. A Generic Software Architecture For Prognostics

    NASA Technical Reports Server (NTRS)

    Teubert, Christopher; Daigle, Matthew J.; Sankararaman, Shankar; Goebel, Kai; Watkins, Jason

    2017-01-01

    Prognostics is a systems engineering discipline focused on predicting the end of life of components and systems. Because prognostics is a relatively new and emerging technology, there are few fielded implementations, due in part to practitioners perceiving a large hurdle in developing the models, algorithms, architecture, and integration pieces. As a result, no open software frameworks for applying prognostics currently exist. This paper introduces the Generic Software Architecture for Prognostics (GSAP), an open-source, cross-platform, object-oriented software framework and support library for creating prognostics applications. GSAP was designed to make prognostics more accessible and enable faster adoption and implementation by industry, by reducing the effort and investment required to develop, test, and deploy prognostics. This paper describes the requirements, design, and testing of GSAP. Additionally, a detailed case study involving battery prognostics demonstrates its use.
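
    To make the prognostics idea concrete, the following Python sketch propagates a simple battery capacity-fade model under parameter uncertainty and reports when capacity crosses an end-of-life threshold. It is a minimal illustration of the concept, not GSAP code or its API, and all numbers are invented.

        # Minimal illustration of end-of-life prediction (not GSAP): Monte Carlo propagation of a
        # linear battery capacity-fade model with an uncertain degradation rate.
        import numpy as np

        rng = np.random.default_rng(1)
        c0, threshold, n_samples, max_cycles = 2.0, 1.6, 1000, 5000   # Ah; end of life at 80% of c0

        eol_cycles = []
        for _ in range(n_samples):
            fade_per_cycle = rng.normal(1.5e-4, 3e-5)                 # sampled degradation rate
            cycles = np.arange(max_cycles)
            capacity = c0 - fade_per_cycle * cycles
            below = np.nonzero(capacity <= threshold)[0]
            eol_cycles.append(below[0] if below.size else max_cycles)

        eol = np.array(eol_cycles)
        print("median EOL:", int(np.median(eol)), "cycles;",
              "5th-95th percentile:", np.percentile(eol, [5, 95]).astype(int))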

  6. Power Watch - A global, open database of power plants that supports research on climate, water and air pollution impact of the global power sector.

    NASA Astrophysics Data System (ADS)

    Friedrich, J.; Kressig, A.; Van Groenou, S.; McCormick, C.

    2017-12-01

    Challenge: The lack of transparent, accessible, and centralized power sector data inhibits the ability to research the impact of the global power sector. Information gaps for citizens, analysts, and decision makers worldwide create barriers to sustainable development efforts. The need for transparent, accessible, and centralized information is especially important to enhance the commitments outlined in the recently adopted Paris Agreement and Sustainable Development Goals. Offer: Power Watch will address this challenge by creating a comprehensive, open-source platform on the world's power systems. The platform hosts data on 85% of global installed electrical capacity and, for each power plant, includes data points on installed capacity, fuel type, annual generation, and commissioning year, with more characteristics such as emissions, particulate matter, and annual water demand added over time. Most of the data is reported from national-level sources, but annual generation and other operational characteristics are estimated via machine learning modeling and remotely sensed data when not officially reported. In addition, Power Watch plans to provide a suite of tools that address specific decision-maker needs, such as water risk assessments and air pollution modeling. Impact: Through open data, the platform and its tools will allow researchers to do more analysis of power sector impacts and perform energy modeling. It will help catalyze accountability for policy makers, businesses, and investors and will inform and drive the transition to a clean energy future while reaching development targets.

  7. CO2 enrichment and N addition increase nutrient loss from decomposing leaf litter in subtropical model forest ecosystems

    PubMed Central

    Liu, Juxiu; Fang, Xiong; Deng, Qi; Han, Tianfeng; Huang, Wenjuan; Li, Yiyong

    2015-01-01

    As atmospheric CO2 concentration increases, many experiments have been carried out to study the effects of CO2 enrichment on litter decomposition and nutrient release. However, the results are still uncertain. Meanwhile, the impact of CO2 enrichment on nutrients other than N and P is far less studied. Using open-top chambers, we examined the effects of elevated CO2 and N addition on leaf litter decomposition and nutrient release in subtropical model forest ecosystems. We found that both elevated CO2 and N addition increased nutrient (C, N, P, K, Ca, Mg and Zn) loss from the decomposing litter. The N, P, Ca and Zn losses were more than tripled in chambers exposed to both elevated CO2 and N addition compared with the control chambers after 21 months of treatment. The stimulation of nutrient loss under elevated CO2 was associated with increased soil moisture, higher leaf litter quality and greater soil acidity. Accelerated nutrient release under N addition was related to higher leaf litter quality, increased soil microbial biomass and greater soil acidity. Our results imply that elevated CO2 and N addition will increase nutrient cycling in subtropical China under future global change. PMID:25608664

  8. DEVELOPMENT OF CAPE-OPEN COMPLIANT PROCESS MODELING COMPONENTS IN MICROSOFT .NET

    EPA Science Inventory

    The CAPE-OPEN middleware standards were created to allow process modeling components (PMCs) developed by third parties to be used in any process modeling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compone...

  9. Response of Solar Irradiance to Sunspot-area Variations

    NASA Astrophysics Data System (ADS)

    Dudok de Wit, T.; Kopp, G.; Shapiro, A.; Witzke, V.; Kretzschmar, M.

    2018-02-01

    One of the important open questions in solar irradiance studies is whether long-term variability (i.e., on timescales of years and beyond) can be reconstructed by means of models that describe short-term variability (i.e., days) using solar proxies as inputs. Preminger & Walton showed that the relationship between spectral solar irradiance and proxies of magnetic-flux emergence, such as the daily sunspot area, can be described in the framework of linear system theory by means of the impulse response. We significantly refine that empirical model by removing spurious solar-rotational effects and by including an additional term that captures long-term variations. Our results show that long-term variability cannot be reconstructed from the short-term response of the spectral irradiance, which questions the extension of solar proxy models to these timescales. In addition, we find that the solar response is nonlinear in a way that cannot be corrected simply by applying a rescaling to a sunspot area.
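
    The proxy-model idea discussed above can be sketched as a convolution of the daily sunspot area with an impulse-response kernel plus a slowly varying long-term term. The Python snippet below is schematic only: the kernel shape, time constants and amplitudes are arbitrary illustrative choices, not the fitted model of the paper.

        # Schematic sketch: irradiance variability reconstructed as proxy * impulse response
        # plus a slow long-term term. All shapes and coefficients are invented for illustration.
        import numpy as np

        rng = np.random.default_rng(2)
        days = 3000
        sunspot_area = rng.gamma(shape=1.2, scale=150.0, size=days)      # synthetic daily proxy

        lag = np.arange(60)                                              # 60-day response window
        kernel = -np.exp(-lag / 5.0) + 0.4 * np.exp(-lag / 30.0)         # dip then weak recovery

        short_term = np.convolve(sunspot_area, kernel, mode="full")[:days]
        long_term = 0.02 * np.cumsum(sunspot_area - sunspot_area.mean()) / days
        tsi_anomaly = short_term + long_term
        print(tsi_anomaly[:5])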

  10. An open ecosystem engagement strategy through the lens of global food safety

    PubMed Central

    Stacey, Paul; Fons, Garin; Bernardo, Theresa M

    2015-01-01

    The Global Food Safety Partnership (GFSP) is a public/private partnership established through the World Bank to improve food safety systems through a globally coordinated and locally-driven approach. This concept paper aims to establish a framework to help GFSP fully leverage the potential of open models. In preparing this paper the authors spoke to many different GFSP stakeholders who asked questions about open models such as: What is it? What's in it for me? Why use an open rather than a proprietary model? How will open models generate equivalent or greater sustainable revenue streams compared to the current “traditional” approaches? This last question came up many times with assertions that traditional service providers need to see opportunity for equivalent or greater revenue dollars before they will buy in. This paper identifies open value propositions for GFSP stakeholders and proposes a framework for creating and structuring that value. Open Educational Resources (OER) were the primary open practice GFSP partners spoke to us about, as they provide a logical entry point for collaboration. Going forward, funders should consider requiring that educational resources and concomitant data resulting from their sponsorship be open, as a public good. There are, however, many other forms of open practice that bring value to the GFSP. Nine different open strategies and tactics (Appendix A) are described, including: open content (including OER and open courseware), open data, open access (research), open government, open source software, open standards, open policy, open licensing and open hardware. It is recommended that all stakeholders proactively pursue "openness" as an operating principle. This paper presents an overall GFSP Open Ecosystem Engagement Strategy within which specific local case examples can be situated. Two different case examples, China and Colombia, are presented to show both project-based and crowd-sourced, direct-to-public paths through this ecosystem. PMID:26213614

  11. Assessment of an approach to printed polymer lenses

    NASA Astrophysics Data System (ADS)

    Marasco, Peter L.; Foote, Bob

    2017-05-01

    Additive manufacturing is proving its relevancy across a wide spectrum of development, prototyping and manufacturing in the US. However, there is a desire to move the capability beyond modeling and structural components. The use of additive manufacturing techniques to fabricate low-cost optics and optical systems is highly desirable in a number of markets. But processes and techniques for successfully printing an optic are currently very new. This paper discusses early advances in printing optics suitable for commercial and military applications. Data from and analysis of early prototype lenses fabricated using one possible technique will be included and discussed. The potential for additive manufacturing of optics to open the design space for complex optics and reduce development time, lowering cost and speeding up time to market, will also be discussed.

  12. Prediction models for CO2 emission in Malaysia using best subsets regression and multi-linear regression

    NASA Astrophysics Data System (ADS)

    Tan, C. H.; Matjafri, M. Z.; Lim, H. S.

    2015-10-01

    This paper presents prediction models that analyze and compute CO2 emission in Malaysia. Each prediction model for CO2 emission is analyzed for one of three main groups: transportation; electricity and heat production; and residential buildings and commercial and public services. The prediction models were generated using data obtained from World Bank Open Data. The best subsets method was used to remove irrelevant predictors, followed by multiple linear regression to produce the prediction models. High R-squared (prediction) values were obtained, implying that the models can reliably predict CO2 emission from the specified data. In addition, the CO2 emissions from these three groups are forecast using trend analysis plots for observation purposes.
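
    A best-subsets search combined with multiple linear regression can be written compactly; the Python sketch below evaluates every predictor subset by adjusted R-squared on synthetic data. The predictor names and data are invented, since the paper's actual inputs come from World Bank Open Data.

        # Illustrative best-subsets + multiple linear regression on synthetic data.
        import itertools
        import numpy as np

        rng = np.random.default_rng(3)
        n = 40
        X = rng.normal(size=(n, 4))      # e.g. transport, electricity/heat, buildings, plus a noise column
        y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + 1.0 * X[:, 2] + rng.normal(0, 0.5, n)

        def adjusted_r2(X_sub, y):
            A = np.column_stack([np.ones(len(y)), X_sub])        # intercept + chosen predictors
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            r2 = 1 - resid.var() / y.var()
            k = X_sub.shape[1]
            return 1 - (1 - r2) * (len(y) - 1) / (len(y) - k - 1)

        best = max(
            (cols for r in range(1, X.shape[1] + 1)
                  for cols in itertools.combinations(range(X.shape[1]), r)),
            key=lambda cols: adjusted_r2(X[:, list(cols)], y),
        )
        print("best subset of predictor columns:", best)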

  13. Evaluation of CHO Benchmarks on the Arria 10 FPGA using Intel FPGA SDK for OpenCL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Zheming; Yoshii, Kazutomo; Finkel, Hal

    The OpenCL standard is an open programming model for accelerating algorithms on heterogeneous computing systems. OpenCL extends the C-based programming language for developing portable codes on different platforms such as CPUs, graphics processing units (GPUs), digital signal processors (DSPs) and field-programmable gate arrays (FPGAs). The Intel FPGA SDK for OpenCL is a suite of tools that allows developers to abstract away the complex FPGA-based development flow for a high-level software development flow. Users can focus on the design of hardware-accelerated kernel functions in OpenCL and then direct the tools to generate the low-level FPGA implementations. The approach makes FPGA-based development more accessible to software users as the needs for hybrid computing using CPUs and FPGAs are increasing. It can also significantly reduce the hardware development time as users can evaluate different ideas with a high-level language without deep FPGA domain knowledge. Benchmarking an OpenCL-based framework is an effective way of analyzing the performance of a system by studying the execution of benchmark applications. CHO is a suite of benchmark applications that provides support for OpenCL [1]. The authors presented CHO as an OpenCL port of the CHStone benchmark. Using the Altera OpenCL (AOCL) compiler to synthesize the benchmark applications, they listed the resource usage and performance of each kernel that can be successfully synthesized by the compiler. In this report, we evaluate the resource usage and performance of the CHO benchmark applications using the Intel FPGA SDK for OpenCL and a Nallatech 385A FPGA board that features an Arria 10 FPGA device. The focus of the report is to have a better understanding of the resource usage and performance of the kernel implementations using Arria-10 FPGA devices compared to Stratix-5 FPGA devices. In addition, we also gain knowledge about the limitations of the current compiler when it fails to synthesize a benchmark application.
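
    For readers unfamiliar with the programming model, the following minimal host program (using the PyOpenCL bindings) shows the basic OpenCL flow of building a kernel and offloading it to a device. It runs on whatever OpenCL device is available (CPU or GPU); targeting an FPGA additionally requires the vendor's offline kernel compilation flow, which is not shown here.

        # Minimal OpenCL example using PyOpenCL to illustrate the kernel-offload programming model.
        import numpy as np
        import pyopencl as cl

        kernel_src = """
        __kernel void vadd(__global const float *a, __global const float *b, __global float *c) {
            int gid = get_global_id(0);
            c[gid] = a[gid] + b[gid];
        }
        """

        ctx = cl.create_some_context()
        queue = cl.CommandQueue(ctx)
        a = np.random.rand(1024).astype(np.float32)
        b = np.random.rand(1024).astype(np.float32)
        mf = cl.mem_flags
        a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
        b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
        c_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

        program = cl.Program(ctx, kernel_src).build()
        program.vadd(queue, a.shape, None, a_buf, b_buf, c_buf)   # enqueue kernel over 1024 work-items

        c = np.empty_like(a)
        cl.enqueue_copy(queue, c, c_buf)
        assert np.allclose(c, a + b)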

  14. HELI-DEM portal for geo-processing services

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Antonovic, Milan; Molinari, Monia

    2014-05-01

    HELI-DEM (Helvetia-Italy Digital Elevation Model) is a project developed in the framework of the Italy/Switzerland Operational Programme for Trans-frontier Cooperation 2007-2013, whose major aim is to create a unified digital terrain model that includes the alpine and sub-alpine areas between Italy and Switzerland. The partners of the project are: Lombardy Region, Piedmont Region, Polytechnic of Milan, Polytechnic of Turin and Fondazione Politecnico from Italy; Institute of Earth Sciences (SUPSI) from Switzerland. The digital terrain model has been produced by integrating and validating the different elevation data available for the areas of interest, characterized by different reference frames, resolutions and accuracies: DHM at 25 m resolution from Swisstopo, DTM at 20 m resolution from Lombardy Region, DTM at 5 m resolution from Piedmont Region and DTM LiDAR PST-A at about 1 m resolution, which covers the main river bed areas and is produced by the Italian Ministry of the Environment. Further results of the project are: the generation of a unique Italian-Swiss geoid with an accuracy of a few centimeters (Gilardoni et al. 2012); the establishment of a GNSS permanent network, a prototype of a transnational positioning service; and the development of a geo-portal, entirely based on open source technologies and open standards, which provides the cross-border DTM and offers some capabilities of analysis and processing through the Internet. With this talk, the authors want to present the main steps of the project with a focus on the HELI-DEM geo-portal development carried out by the Institute of Earth Sciences, which is the access point to the DTM produced by the project. The portal, accessible at http://geoservice.ist.supsi.ch/helidem, is a demonstration of open source technologies combined to provide access to geospatial functionalities for a wide non-GIS-expert public. In fact, the system is entirely developed using only Open Standards and Free and Open Source Software (FOSS), both on the server side (services) and on the client side (interface). In addition to self-developed code, the system relies mainly on the software GRASS 7 [1], ZOO-project [2], GeoServer [3] and OpenLayers [4] and the standards WMS [5], WCS [6] and WPS [7]. At the time of writing, the portal offers features like profiling, contour extraction, watershed delineation and analysis, derivatives calculation, data extraction and coordinate conversion, but it is evolving and it is planned to extend it to a series of environmental models that the IST developed in the past, such as dam break simulation, landslide run-out estimation and floods due to landslide impact in artificial basins. [1] Neteler M., Mitasova H., Open Source GIS: A GRASS GIS Approach. 3rd Ed. 406 pp, Springer, New York, 2008. [2] Fenoy G., Bozon N., Raghavan V., ZOO Project: The Open Wps Platform. Proceedings of the 1st International Workshop on Pervasive Web Mapping, Geoprocessing and Services (WebMGS). Como, http://www.isprs.org/proceedings/XXXVIII/4-W13/ID_32.pdf, 26-27 August 2010. [3] Giannecchini S., Aime A., GeoServer, il server open source per la gestione interoperabile dei dati geospaziali. Atti 15a Conferenza Nazionale ASITA. Reggia di Colorno, 15-18 November 2011. [4] Perez A.S., OpenLayers Cookbook. Packt Publishing, 2012. ISBN 1849517843. [5] OGC, OpenGIS Web Map Server Implementation Specification, http://www.opengeospatial.org/standards/wms, 2006. [6] OGC, OGC WCS 2.0 Interface Standard - Core, http://portal.opengeospatial.org/files/?artifact_id=41437, 2010b.
[7] OGC, OpenGIS Web Processing Service, http://portal.opengeospatial.org/files/?artifact_id=24151, 2007.

  15. Laparoscopic versus open-component separation: a comparative analysis in a porcine model.

    PubMed

    Rosen, Michael J; Williams, Christina; Jin, Judy; McGee, Michael F; Schomisch, Steve; Marks, Jeffrey; Ponsky, Jeffrey

    2007-09-01

    The ideal surgical treatment for complicated ventral hernias remains elusive. Traditional component separation provides local advancement of native tissue for tension-free closure without prosthetic materials. This technique requires an extensive subcutaneous dissection with division of perforating vessels predisposing to skin-flap necrosis and complicated wound infections. A minimally invasive component separation may decrease wound complication rates; however, the adequacy of the myofascial advancement has not been studied. Five 25-kg pigs underwent bilateral laparoscopic component separation. A 10-mm incision was made lateral to the rectus abdominus muscle. The external oblique fascia was incised, and a dissecting balloon was inflated between the internal and external oblique muscles. Two additional ports were placed in the intermuscular space. The external oblique was incised from the costal margin to the inguinal ligament. The maximal abdominal wall advancement was recorded. A formal open-component separation was performed and maximal advancement 5 cm superior and 5 cm inferior to the umbilicus was recorded for comparison. Groups were compared using standard statistical analysis. The laparoscopic component separation was completed successfully in all animals, with a mean of 22 min/side. Laparoscopic component separation yielded 3.9 cm (SD 1.1) of fascial advancement above the umbilicus, whereas 4.4 cm (1.2) was obtained after open release (P = .24). Below the umbilicus, laparoscopic release achieved 5.0 cm (1.0) of advancement, whereas 5.8 cm (1.2) was gained after open release (P = .13). The minimally invasive component separation achieved an average of 86% of the myofascial advancement compared with a formal open release. The laparoscopic approach does not require extensive subcutaneous dissection and might theoretically result in a decreased incidence or decreased complexity of postoperative wound infections or skin-flap necrosis. Based on our preliminary data in this porcine model, further comparative studies of laparoscopic versus open component separation in complex ventral hernia repair are warranted to evaluate postoperative morbidity and long-term hernia recurrence rates.

  16. Vertical Interaction in Open Software Engineering Communities

    DTIC Science & Technology

    2009-03-01

    Program in CASOS (NSF, DGE-9972762), the Office of Naval Research under Dynamic Network Analysis program (N00014-02-1-0973, the Air Force Office of...W91WAW07C0063) for research in the area of dynamic network analysis. Additional support was provided by CASOS - the center for Computational Analysis of Social...methods across the domain. For a given project, developers can choose from dozens of models, tools, platforms, and languages for specification, design

  17. Basis set and electron correlation effects on the polarizability and second hyperpolarizability of model open-shell π-conjugated systems

    NASA Astrophysics Data System (ADS)

    Champagne, Benoît; Botek, Edith; Nakano, Masayoshi; Nitta, Tomoshige; Yamaguchi, Kizashi

    2005-03-01

    The basis set and electron correlation effects on the static polarizability (α) and second hyperpolarizability (γ) are investigated ab initio for two model open-shell π-conjugated systems, the C5H7 radical and the C6H8 radical cation in their doublet state. Basis set investigations evidence that the linear and nonlinear responses of the radical cation necessitate the use of a less extended basis set than its neutral analog. Indeed, double-zeta-type basis sets supplemented by a set of d polarization functions but no diffuse functions already provide accurate (hyper)polarizabilities for C6H8 whereas diffuse functions are compulsory for C5H7, in particular, p diffuse functions. In addition to the 6-31G*+pd basis set, basis sets resulting from removing not necessary diffuse functions from the augmented correlation consistent polarized valence double zeta basis set have been shown to provide (hyper)polarizability values of similar quality as more extended basis sets such as augmented correlation consistent polarized valence triple zeta and doubly augmented correlation consistent polarized valence double zeta. Using the selected atomic basis sets, the (hyper)polarizabilities of these two model compounds are calculated at different levels of approximation in order to assess the impact of including electron correlation. As a function of the method of calculation antiparallel and parallel variations have been demonstrated for α and γ of the two model compounds, respectively. For the polarizability, the unrestricted Hartree-Fock and unrestricted second-order Møller-Plesset methods bracket the reference value obtained at the unrestricted coupled cluster singles and doubles with a perturbative inclusion of the triples level whereas the projected unrestricted second-order Møller-Plesset results are in much closer agreement with the unrestricted coupled cluster singles and doubles with a perturbative inclusion of the triples values than the projected unrestricted Hartree-Fock results. Moreover, the differences between the restricted open-shell Hartree-Fock and restricted open-shell second-order Møller-Plesset methods are small. In what concerns the second hyperpolarizability, the unrestricted Hartree-Fock and unrestricted second-order Møller-Plesset values remain of similar quality while using spin-projected schemes fails for the charged system but performs nicely for the neutral one. The restricted open-shell schemes, and especially the restricted open-shell second-order Møller-Plesset method, provide for both compounds γ values close to the results obtained at the unrestricted coupled cluster level including singles and doubles with a perturbative inclusion of the triples. Thus, to obtain well-converged α and γ values at low-order electron correlation levels, the removal of spin contamination is a necessary but not a sufficient condition. Density-functional theory calculations of α and γ have also been carried out using several exchange-correlation functionals. Those employing hybrid exchange-correlation functionals have been shown to reproduce fairly well the reference coupled cluster polarizability and second hyperpolarizability values. In addition, inclusion of Hartree-Fock exchange is of major importance for determining accurate polarizability whereas for the second hyperpolarizability the gradient corrections are large.

  18. Modeling Reconnection-Driven Solar Polar Jets with Gravity and Wind

    NASA Astrophysics Data System (ADS)

    Karpen, Judith T.; DeVore, C. R.; Antiochos, S. K.

    2013-07-01

    Solar polar jets are dynamic, narrow, radially extended structures observed in EUV emission. They have been found to originate within the open magnetic field of coronal holes in “anemone” regions, which are generally accepted to be intrusions of opposite polarity. The associated embedded-dipole topology consists of a spine line emanating from a null point atop a dome-shaped fan surface. Previous work (Pariat et al. 2009, 2010) has validated the idea that magnetic free energy stored on twisted closed field lines within the fan surface can be released explosively by the onset of fast reconnection between the highly stressed closed field inside the null and the unstressed open field outside (Antiochos 1996). The simulations showed that a dense jet comprising a nonlinear, torsional Alfven wave is ejected into the outer corona on the newly reconnected open field lines. While proving the principle of the basic model, those simulations neglected the important effects of gravity, the solar wind, and an expanding spherical geometry. We introduce those additional physical processes in new simulations of reconnection-driven jets, to determine whether the model remains robust in the resulting more realistic setting, and to begin establishing the signatures of the jets in the inner heliosphere for comparison with observations. Initial results demonstrate explosive energy release and a jet in the low corona very much like that in the earlier Cartesian, gravity-free, static-atmosphere runs. We report our analysis of the results, their comparison with previous work, and their implications for observations. This work was supported by NASA’s LWS TR&T program.

  19. Virtual Hubs for facilitating access to Open Data

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo; Latre, Miguel Á.; Ernst, Julia; Brumana, Raffaella; Brauman, Stefan; Nativi, Stefano

    2015-04-01

    In October 2014 the ENERGIC-OD (European NEtwork for Redistributing Geospatial Information to user Communities - Open Data) project, funded by the European Union under the Competitiveness and Innovation framework Programme (CIP), started. In response to the EU call, the general objective of the project is to "facilitate the use of open (freely available) geographic data from different sources for the creation of innovative applications and services through the creation of Virtual Hubs". In ENERGIC-OD, Virtual Hubs are conceived as information systems supporting the full life cycle of Open Data: publishing, discovery and access. They facilitate the use of Open Data by lowering and possibly removing the main barriers which hamper geo-information (GI) usage by end-users and application developers. Data and data services heterogeneity is recognized as one of the major barriers to Open Data (re-)use. It forces end-users and developers to spend a lot of effort in accessing different infrastructures and harmonizing datasets. Such heterogeneity cannot be completely removed through the adoption of standard specifications for service interfaces, metadata and data models, since different infrastructures adopt different standards to answer to specific challenges and to address specific use-cases. Thus, beyond a certain extent, heterogeneity is irreducible especially in interdisciplinary contexts. ENERGIC-OD Virtual Hubs address heterogeneity by adopting a mediation and brokering approach: specific components (brokers) are dedicated to harmonize service interfaces, metadata and data models, enabling seamless discovery and access to heterogeneous infrastructures and datasets. As an innovation project, ENERGIC-OD will integrate several existing technologies to implement Virtual Hubs as single points of access to geospatial datasets provided by new or existing platforms and infrastructures, including INSPIRE-compliant systems and Copernicus services. ENERGIC-OD will deploy a set of five Virtual Hubs (VHs) at national level in France, Germany, Italy, Poland, Spain and an additional one at the European level. VHs will be provided according to the cloud Software-as-a-Service model. The main expected impact of VHs is the creation of new business opportunities opening up access to Research Data and Public Sector Information. Therefore, ENERGIC-OD addresses not only end-users, who will have the opportunity to access the VH through a geo-portal, but also application developers who will be able to access VH functionalities through simple Application Programming Interfaces (API). The ENERGIC-OD Consortium will develop ten different applications on top of the deployed VHs. They aim to demonstrate how VHs facilitate the development of new and multidisciplinary applications based on the full exploitation of (open) GI, hence stimulating innovation and business activities.

  20. Extending enterprise architecture modelling with business goals and requirements

    NASA Astrophysics Data System (ADS)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  1. 33 CFR 117.686 - Yazoo River.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) bridge shall open on signal if at least four hours notice is given. When a vessel has given notice and fails to arrive within the four hour period specified, the drawtender shall remain on duty for two additional hours and open the draw if the requesting vessel appears. After this time, an additional four hour...

  2. Towards community-driven paleogeographic reconstructions: integrating open-access paleogeographic and paleobiology data with plate tectonics

    NASA Astrophysics Data System (ADS)

    Wright, N.; Zahirovic, S.; Müller, R. D.; Seton, M.

    2013-03-01

    A variety of paleogeographic reconstructions have been published, with applications ranging from paleoclimate, ocean circulation and faunal radiation models to resource exploration; yet their uncertainties remain difficult to assess as they are generally presented as low-resolution static maps. We present a methodology for ground-truthing the digital Palaeogeographic Atlas of Australia by linking the GPlates plate reconstruction tool to the global Paleobiology Database and a Phanerozoic plate motion model. We develop a spatio-temporal data mining workflow to validate the Phanerozoic Palaeogeographic Atlas of Australia with paleoenvironments derived from fossil data. While there is general agreement between fossil data and the paleogeographic model, the methodology highlights key inconsistencies. The Early Devonian paleogeographic model of southeastern Australia insufficiently describes the Emsian inundation and may be refined using biofacies distributions. Additionally, the paleogeographic model and fossil data can be used to strengthen numerical models, such as models of dynamic topography and the associated inundation of eastern Australia during the Cretaceous. Although paleobiology data provide constraints only for paleoenvironments with high preservation potential of organisms, our approach enables the use of additional proxy data to generate improved paleogeographic reconstructions.

  3. A 1D-2D coupled SPH-SWE model applied to open channel flow simulations in complicated geometries

    NASA Astrophysics Data System (ADS)

    Chang, Kao-Hua; Sheu, Tony Wen-Hann; Chang, Tsang-Jung

    2018-05-01

    In this study, a one- and two-dimensional (1D-2D) coupled model is developed to solve the shallow water equations (SWEs). The solutions are obtained using a Lagrangian meshless method called smoothed particle hydrodynamics (SPH) to simulate shallow water flows in converging, diverging and curved channels. A buffer zone is introduced to exchange information between the 1D and 2D SPH-SWE models. Interpolated water discharge values and water surface levels at the internal boundaries are prescribed as the inflow/outflow boundary conditions in the two SPH-SWE models. In addition, instead of using the SPH summation operator, we directly solve the continuity equation by introducing a diffusive term to suppress oscillations in the predicted water depth. The performance of the two approaches in calculating the water depth is comprehensively compared through a case study of a straight channel. Additionally, three benchmark cases involving converging, diverging and curved channels are adopted to demonstrate the ability of the proposed 1D and 2D coupled SPH-SWE model through comparisons with measured data and predicted mesh-based numerical results. The proposed model provides satisfactory accuracy and guaranteed convergence.
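
    The diffusive stabilization of the continuity equation mentioned above can be sketched in a few lines. The 1D Python snippet below applies a generic Molteni-Colagrossi-type diffusive correction to the water-depth update; the Gaussian kernel, the coefficients and the exact form of the correction are simplifying assumptions for illustration and not necessarily the authors' formulation.

        # Generic 1D sketch: SPH continuity update of water depth with an added diffusive term
        # to damp oscillations. Kernel choice and coefficients are illustrative only.
        import numpy as np

        def grad_w(dx, h):
            """d/dx of a 1D Gaussian kernel W(dx) = exp(-(dx/h)^2) / (h*sqrt(pi))."""
            w = np.exp(-(dx / h) ** 2) / (h * np.sqrt(np.pi))
            return -2.0 * dx / h**2 * w

        n, h, delta, c0, dt = 51, 0.08, 0.1, 3.0, 1e-3
        x = np.linspace(0.0, 2.0, n)
        m = np.full(n, 2.0 / n)                  # particle volume per unit width (schematic)
        depth = np.where(x < 1.0, 1.0, 0.6)      # initial water depth with a step
        vel = np.zeros(n)

        dxij = x[:, None] - x[None, :]           # x_i - x_j for all pairs
        gw = grad_w(dxij, h)                     # dW_ij/dx_i
        np.fill_diagonal(gw, 0.0)
        safe_dx = np.where(dxij == 0.0, 1.0, dxij)

        # dh_i/dt = sum_j m_j (v_i - v_j) dW_ij/dx
        #           + 2*delta*h*c0 * sum_j m_j (h_j - h_i) (x_j - x_i) dW_ij/dx / (x_j - x_i)^2
        dvel = vel[:, None] - vel[None, :]
        diff = (depth[None, :] - depth[:, None]) * (-safe_dx) * gw / safe_dx**2
        dh_dt = (m * dvel * gw).sum(axis=1) + 2.0 * delta * h * c0 * (m * diff).sum(axis=1)
        depth = depth + dt * dh_dt               # one explicit Euler step of the depth update
        print(depth[23:28].round(4))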

  4. Mathematical and computational approaches can complement experimental studies of host-pathogen interactions.

    PubMed

    Kirschner, Denise E; Linderman, Jennifer J

    2009-04-01

    In addition to traditional and novel experimental approaches to study host-pathogen interactions, mathematical and computer modelling have recently been applied to address open questions in this area. These modelling tools not only offer an additional avenue for exploring disease dynamics at multiple biological scales, but also complement and extend knowledge gained via experimental tools. In this review, we outline four examples where modelling has complemented current experimental techniques in a way that can or has already pushed our knowledge of host-pathogen dynamics forward. Two of the modelling approaches presented go hand in hand with articles in this issue exploring fluorescence resonance energy transfer and two-photon intravital microscopy. Two others explore virtual or 'in silico' deletion and depletion as well as a new method to understand and guide studies in genetic epidemiology. In each of these examples, the complementary nature of modelling and experiment is discussed. We further note that multi-scale modelling may allow us to integrate information across length (molecular, cellular, tissue, organism, population) and time (e.g. seconds to lifetimes). In sum, when combined, these compatible approaches offer new opportunities for understanding host-pathogen interactions.

  5. DigitalHuman (DH): An Integrative Mathematical Model of Human Physiology

    NASA Technical Reports Server (NTRS)

    Hester, Robert L.; Summers, Richard L.; Iliescu, Radu; Esters, Joyee; Coleman, Thomas G.

    2010-01-01

    Mathematical models and simulation are important tools in discovering the key causal relationships governing physiological processes and improving medical intervention when physiological complexity is a central issue. We have developed a model of integrative human physiology called DigitalHuman (DH) consisting of ~5000 variables describing cardiovascular, renal, respiratory, endocrine, neural and metabolic physiology. Users can view time-dependent solutions and interactively introduce perturbations by altering numerical parameters to investigate new hypotheses. The variables, parameters and quantitative relationships as well as all other model details are described in XML text files. All aspects of the model, including the mathematical equations describing the physiological processes, are written in open-source, text-readable XML files. Model structure is based upon empirical data of physiological responses documented within the peer-reviewed literature. The model can be used to understand proposed physiological mechanisms and physiological interactions that may not be otherwise intuitively evident. Some of the current uses of this model include the analyses of renal control of blood pressure, the central role of the liver in creating and maintaining insulin resistance, and the mechanisms causing orthostatic hypotension in astronauts. Additionally, the open-source aspect of the modeling environment allows any investigator to add detailed descriptions of human physiology to test new concepts. The model accurately predicts both qualitative and, more importantly, quantitative changes in clinically and experimentally observed responses. DigitalHuman provides scientists a modeling environment to understand the complex interactions of integrative physiology. This research was supported by NIH HL 51971, NSF EPSCoR, and NASA.

  6. Quantum Control of Open Systems and Dense Atomic Ensembles

    NASA Astrophysics Data System (ADS)

    DiLoreto, Christopher

    Controlling the dynamics of open quantum systems, i.e. quantum systems that decohere because of interactions with the environment, is an active area of research with many applications in quantum optics and quantum computation. My thesis expands the scope of this inquiry by seeking to control open systems in proximity to an additional system. The latter could be a classical system such as metal nanoparticles, or a quantum system such as a cluster of similar atoms. By modelling the interactions between the systems, we are able to expand the accessible state space of the quantum system in question. For a single, three-level quantum system, I examine isolated systems that have only normal spontaneous emission. I then show that intensity-intensity correlation spectra, which depend directly on the density matrix of the system, can be used to detect whether transitions share a common energy level. This detection is possible due to the presence of quantum interference effects between two transitions if they are connected. This effect allows one to assess energy level structure diagrams in complex atoms/molecules. By placing an open quantum system near a nanoparticle dimer, I show that the spontaneous emission rate of the system can be changed "on demand" by changing the polarization of an incident, driving field. In a three-level, Lambda system, this allows a qubit to both retain high qubit fidelity when it is operating, and to be rapidly initialized to a pure state once it is rendered unusable by decoherence. This type of behaviour is not possible in a single open quantum system; therefore adding a classical system nearby extends the overall control space of the quantum system. An open quantum system near identical neighbours in a dense ensemble is another example of how the accessible state space can be expanded. I show that a dense ensemble of atoms rapidly becomes disordered, with states that are not directly excited by an incident field becoming significantly populated. This effect motivates the need for using multi-directional basis sets in theoretical analysis of dense quantum systems. My results demonstrate the shortcomings of short-pulse techniques used in many recent studies. Based on my numerical studies, I hypothesize that the dense ensemble can be modelled by an effective single quantum system that has a decoherence rate that changes over time. My effective single particle model provides a way in which computational time can be reduced, and also a model in which the underlying physical processes involved in the system's evolution are much easier to understand. I then use this model to provide an elegant theoretical explanation for an unusual experimental result called "transverse optical magnetism".

  7. Perception of passage through openings depends on the size of the body in motion

    PubMed Central

    Franchak, John M.; Celano, Emma C.; Adolph, Karen E.

    2012-01-01

    Walkers need to modify their ongoing actions to meet the demands of everyday environments. Navigating through openings requires gait modifications if the size of the opening is too small relative to the body. Here we ask if the spatial requirements for navigating horizontal and vertical openings differ, and, if so, whether walkers are sensitive to those requirements. To test walkers’ sensitivity to demands for gait modification, we asked participants to judge whether they could walk through horizontal openings without shoulder rotation and through vertical openings without ducking. Afterward, participants walked through the openings so that we could determine which opening sizes elicited gait modifications. Participants turned their shoulders with more space available than the space they left themselves for ducking. Larger buffers for horizontal openings may reflect different spatial requirements created by lateral sway of the body during walking compared to vertical bounce. In addition, greater variability of turning from trial to trial compared with ducking may lead walkers to adopt a more conservative buffer to avoid errors. Verbal judgments accurately predicted whether openings required gait modifications. For horizontal openings, participants’ judgments were best predicted by the body’s dynamic abilities, not static shoulder width. The differences between horizontal and vertical openings illustrate that walkers account for the dynamic properties of walking in addition to scaling decisions to body dimensions. PMID:22990292

  8. Perception of passage through openings depends on the size of the body in motion.

    PubMed

    Franchak, John M; Celano, Emma C; Adolph, Karen E

    2012-11-01

    Walkers need to modify their ongoing actions to meet the demands of everyday environments. Navigating through openings requires gait modifications if the size of the opening is too small relative to the body. Here we ask whether the spatial requirements for navigating horizontal and vertical openings differ, and, if so, whether walkers are sensitive to those requirements. To test walkers' sensitivity to demands for gait modification, we asked participants to judge whether they could walk through horizontal openings without shoulder rotation and through vertical openings without ducking. Afterward, participants walked through the openings, so that we could determine which opening sizes elicited gait modifications. Participants turned their shoulders with more space available than the space they left themselves for ducking. Larger buffers for horizontal openings may reflect different spatial requirements created by lateral sway of the body during walking compared to vertical bounce. In addition, greater variability of turning from trial to trial compared with ducking may lead walkers to adopt a more conservative buffer to avoid errors. Verbal judgments accurately predicted whether openings required gait modifications. For horizontal openings, participants' judgments were best predicted by the body's dynamic abilities, not static shoulder width. The differences between horizontal and vertical openings illustrate that walkers account for the dynamic properties of walking in addition to scaling decisions to body dimensions.

  9. Open data models for smart health interconnected applications: the example of openEHR.

    PubMed

    Demski, Hans; Garde, Sebastian; Hildebrand, Claudia

    2016-10-22

    Smart Health is a concept built on networking, intelligent data processing, and the combination of patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.
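
    The core model-driven idea above (deriving implementation artefacts such as data-entry forms from a declarative clinical model) can be sketched as follows; the dictionary-based "clinical model" and the widget names are deliberately simplified, hypothetical stand-ins and are not openEHR ADL or any real archetype syntax.

        # Hypothetical, simplified clinical model; not openEHR ADL.
        blood_pressure_model = {
            "name": "blood_pressure",
            "fields": [
                {"id": "systolic",  "type": "quantity", "unit": "mm[Hg]", "range": (0, 1000)},
                {"id": "diastolic", "type": "quantity", "unit": "mm[Hg]", "range": (0, 1000)},
                {"id": "position",  "type": "coded",    "codes": ["sitting", "standing", "lying"]},
            ],
        }

        def generate_form(model):
            """Turn the declarative clinical model into a list of UI widget descriptions."""
            widgets = []
            for field in model["fields"]:
                if field["type"] == "quantity":
                    widgets.append({"widget": "number_input", "label": field["id"],
                                    "unit": field["unit"],
                                    "min": field["range"][0], "max": field["range"][1]})
                elif field["type"] == "coded":
                    widgets.append({"widget": "dropdown", "label": field["id"],
                                    "options": field["codes"]})
            return widgets

        for widget in generate_form(blood_pressure_model):
            print(widget)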

  10. 12 CFR Appendix G to Part 1026 - Open-End Model Forms and Clauses

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false Open-End Model Forms and Clauses G Appendix G...) Pt. 1026, App. G Appendix G to Part 1026—Open-End Model Forms and Clauses G-1Balance Computation Methods Model Clauses (Home-equity Plans) (§§ 1026.6 and 1026.7) G-1(A)Balance Computation Methods Model...

  11. 12 CFR Appendix G to Part 1026 - Open-End Model Forms and Clauses

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Open-End Model Forms and Clauses G Appendix G...) Pt. 1026, App. G Appendix G to Part 1026—Open-End Model Forms and Clauses G-1Balance Computation Methods Model Clauses (Home-equity Plans) (§§ 1026.6 and 1026.7) G-1(A)Balance Computation Methods Model...

  12. 12 CFR Appendix G to Part 1026 - Open-End Model Forms and Clauses

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 9 2014-01-01 2014-01-01 false Open-End Model Forms and Clauses G Appendix G...) Pt. 1026, App. G Appendix G to Part 1026—Open-End Model Forms and Clauses G-1Balance Computation Methods Model Clauses (Home-equity Plans) (§§ 1026.6 and 1026.7) G-1(A)Balance Computation Methods Model...

  13. A Kind of Optimization Method of Loading Documents in OpenOffice.org

    NASA Astrophysics Data System (ADS)

    Lan, Yuqing; Li, Li; Zhou, Wenbin

    As a giant in the open source community, OpenOffice.org has become the most popular office suite within the Linux community. However, OpenOffice.org is relatively slow when loading documents. Research shows that the most time-consuming part is importing the individual pages of a document; if a document has many pages, the accumulated time can be substantial. Therefore, this paper proposes a solution that improves the speed of loading documents through an asynchronous importing mechanism: a document is not imported as a whole; instead, only part of the document is imported at first for display, then a background mechanism is started to asynchronously import the remaining parts and insert them into the drawing queue of OpenOffice.org for display. In this way, users do not have to wait for a long time. An application start-up time testing tool was used to measure the time consumed in loading documents of different page counts before and after the optimization of OpenOffice.org, and regression analysis was applied to examine the correlation between the page count of a document and its loading time. In addition, the experimental data were visualized with the aid of MATLAB. A comparison of the time consumed to load a document before and after the solution was adopted shows an obvious increase in loading speed. Moreover, a comparison with Microsoft Office shows that the optimized OpenOffice.org loads documents at almost the same speed. The results of the experiments show the effectiveness of this solution.
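
    The asynchronous importing mechanism described above can be illustrated with a small sketch (hypothetical page-import cost and document name; this is not OpenOffice.org's actual C++ implementation): the first page is imported synchronously so the user sees content immediately, while the remaining pages are imported in a background thread and appended to a drawing queue.

        # Minimal sketch of asynchronous partial loading.
        import queue
        import threading
        import time

        def import_page(doc, page_no):
            time.sleep(0.05)                      # stand-in for the costly page import
            return f"{doc}:page{page_no}"

        def open_document(doc, n_pages, draw_queue):
            draw_queue.put(import_page(doc, 1))   # first page: user sees content at once
            def import_rest():
                for page in range(2, n_pages + 1):
                    draw_queue.put(import_page(doc, page))
            threading.Thread(target=import_rest, daemon=True).start()

        draw_queue = queue.Queue()
        open_document("report.odt", n_pages=10, draw_queue=draw_queue)
        print("first page ready:", draw_queue.get())   # remaining pages arrive in background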

  14. Relative economic values of open space provided by National Forest and military lands to surrounding communities in Colorado

    Treesearch

    Charlotte Ham; John B. Loomis; Patricia A. Champ

    2015-01-01

    Open space lands are provided by a variety of entities from private individuals to the federal government and these entities make management decisions based on a very broad range of priorities. The net benefits of additional open space depend on the number, quality, and composition of existing open space in the vicinity. In areas where open space is abundant and there...

  15. Open-source software: not quite endsville.

    PubMed

    Stahl, Matthew T

    2005-02-01

    Open-source software will never achieve ubiquity. There are environments in which it simply does not flourish. By its nature, open-source development requires free exchange of ideas, community involvement, and the efforts of talented and dedicated individuals. However, pressures can come from several sources that prevent this from happening. In addition, openness and complex licensing issues invite misuse and abuse. Care must be taken to avoid the pitfalls of open-source software.

  16. A distributed data component for the open modeling interface

    USDA-ARS?s Scientific Manuscript database

    As the volume of collected data continues to increase in the environmental sciences, so does the need for effective means for accessing those data. We have developed an Open Modeling Interface (OpenMI) data component that retrieves input data for model components from environmental information syste...

  17. Multiple loop conformations of peptides predicted by molecular dynamics simulations are compatible with nuclear magnetic resonance.

    PubMed

    Carstens, Heiko; Renner, Christian; Milbradt, Alexander G; Moroder, Luis; Tavan, Paul

    2005-03-29

    The affinity and selectivity of protein-protein interactions can be fine-tuned by varying the size, flexibility, and amino acid composition of involved surface loops. As a model for such surface loops, we study the conformational landscape of an octapeptide whose flexibility is chemically steered by a covalent ring closure integrating an azobenzene dye into, and by a disulfide bridge additionally constraining, the peptide backbone. Because the covalently integrated azobenzene dyes can be switched by light between a bent cis state and an elongated trans state, six cyclic peptide models of strongly different flexibilities are obtained. The conformational states of these peptide models are sampled by NMR and by unconstrained molecular dynamics (MD) simulations. Prototypical conformations and the free-energy landscapes in the high-dimensional space spanned by the phi/psi angles at the peptide backbone are obtained by clustering techniques from the MD trajectories. Multiple open-loop conformations are predicted by MD, particularly in the very flexible cases, and are shown to comply with the NMR data despite the fact that such open-loop conformations are missing in the refined NMR structures.

  18. Modelling and Analysis of Hydrodynamics and Water Quality for Rivers in the Northern Cold Region of China

    PubMed Central

    Tang, Gula; Zhu, Yunqiang; Wu, Guozheng; Li, Jing; Li, Zhao-Liang; Sun, Jiulin

    2016-01-01

    In this study, the Mudan River, the most typical river in the northern cold region of China, was selected as the research object. The Environmental Fluid Dynamics Code (EFDC) was adopted to construct a new two-dimensional water quality model for the urban sections of the Mudan River, and concentrations of CODCr and NH3-N during ice-covered and open-water periods were simulated and analyzed. Results indicated that the roughness coefficient and the comprehensive pollutant decay rate differed significantly between the two periods: the roughness coefficient in the ice-covered period was larger than that of the open-water period, while the decay rate in the former period was smaller than that in the latter. In addition, according to the analysis of the simulated results, the main reasons for the decay rate reduction during the ice-covered period are the temperature drop, the decrease in upstream inflow and the ice layer cover; among them, the ice sheet is the major contributor to the increase in roughness. These aspects are discussed in more detail in this work. The model can also be generalized to hydrodynamic and water quality simulations of rivers in other cold regions. PMID:27070631
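
    To make the role of the calibrated decay rate concrete, the sketch below applies the classic first-order decay relation C = C0·exp(-k·t) with a smaller rate for the ice-covered period than for the open-water period; the rate constants, upstream concentration and travel time are hypothetical illustrations, not the paper's calibrated values.

        # First-order pollutant decay along a reach, under hypothetical rates.
        import math

        def downstream_concentration(c0, k_per_day, travel_time_days):
            """C = C0 * exp(-k * t): first-order decay of a pollutant load."""
            return c0 * math.exp(-k_per_day * travel_time_days)

        c0 = 20.0   # upstream CODCr, mg/L (hypothetical)
        t = 2.0     # travel time through the urban reach, days (hypothetical)
        print("open water :", downstream_concentration(c0, k_per_day=0.30, travel_time_days=t))
        print("ice covered:", downstream_concentration(c0, k_per_day=0.10, travel_time_days=t))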

  19. The role of personal values and basic traits in perceptions of the consequences of immigration: a three-nation study.

    PubMed

    Vecchione, Michele; Caprara, Gianvittorio; Schoen, Harald; Castro, Josè Luis Gonzàlez; Schwartz, Shalom H

    2012-08-01

    Using data from Italy, Spain, and Germany (N= 1,569), this study investigated the role of basic values (universalism and security) and basic traits (openness and agreeableness) in predicting perceptions of the consequences of immigration. In line with Schwartz's (1992) theory, we conceptualized security as having two distinct components, one concerned with safety of the self (personal security) and the other with harmony and stability of larger groups and of society (group security). Structural equation modelling revealed that universalism values underlie perceptions that immigration has positive consequences and group security values underlie perceptions that it has negative consequences. Personal security makes no unique, additional contribution. Multi-group analyses revealed that these associations are invariant across the three countries except for a stronger link between universalism and perceptions of the consequences of immigration in Spain. To examine whether values mediate relations of traits to perceptions of immigration, we used the five-factor model. Findings supported a full mediation model. Individuals' traits of openness and agreeableness explained significant variance in security and universalism values. Basic values, in turn, explained perceptions of the consequences of immigration. ©2011 The British Psychological Society.

  20. Extrinsic and intrinsic index finger muscle attachments in an OpenSim upper-extremity model.

    PubMed

    Lee, Jong Hwa; Asakawa, Deanna S; Dennerlein, Jack T; Jindrich, Devin L

    2015-04-01

    Musculoskeletal models allow estimation of muscle function during complex tasks. We used objective methods to determine possible attachment locations for index finger muscles in an OpenSim upper-extremity model. Data-driven optimization algorithms, Simulated Annealing and Hooke-Jeeves, estimated tendon locations crossing the metacarpophalangeal (MCP), proximal interphalangeal (PIP) and distal interphalangeal (DIP) joints by minimizing the difference between model-estimated and experimentally measured moment arms. Sensitivity analysis revealed that multiple sets of muscle attachments with similar optimized moment arms are possible, requiring additional assumptions or data to select a single set of values. The smoothest muscle paths were assumed to be biologically reasonable. Estimated tendon attachments resulted in variance accounted for (VAF) between calculated moment arms and measured values of 78% for flexion/extension and 81% for ab/adduction at the MCP joint. VAF averaged 67% at the PIP joint and 54% at the DIP joint. VAF values at the PIP and DIP joints partially reflected the constant moment arms reported for muscles about these joints. However, all moment arm values found through optimization were non-linear and non-constant. Relationships between moment arms and joint angles were best described by quadratic equations for tendons at the PIP and DIP joints.
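
    A hedged sketch of the fitting idea follows (this is not the study's OpenSim workflow): a planar attachment parameterization is tuned so that the model's moment arm best matches measured values across joint angles, with scipy's dual_annealing standing in for the Simulated Annealing / Hooke-Jeeves optimizers named above, and a quadratic fit summarizing the resulting moment arm-angle relationship. The "measured" values and the wrapping model are hypothetical.

        # Hypothetical data and model; illustrates minimizing moment-arm error.
        import numpy as np
        from scipy.optimize import dual_annealing

        angles = np.deg2rad(np.linspace(0, 90, 10))             # joint flexion angles (rad)
        measured = 0.009 + 0.002 * angles - 0.001 * angles**2   # "measured" moment arms (m)

        def model_moment_arm(params, theta):
            r, phi0 = params                     # attachment radius and angular offset
            return r * np.cos(theta - phi0)      # simple planar wrapping stand-in

        def cost(params):
            return np.sum((model_moment_arm(params, angles) - measured) ** 2)

        result = dual_annealing(cost, bounds=[(0.005, 0.02), (-1.0, 1.0)])
        quad = np.polyfit(angles, model_moment_arm(result.x, angles), deg=2)
        print("optimized attachment parameters:", result.x)
        print("quadratic moment-arm fit coefficients:", quad)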

  1. Using WNTR to Model Water Distribution System Resilience ...

    EPA Pesticide Factsheets

    The Water Network Tool for Resilience (WNTR) is a new open source Python package developed by the U.S. Environmental Protection Agency and Sandia National Laboratories to model and evaluate resilience of water distribution systems. WNTR can be used to simulate a wide range of disruptive events, including earthquakes, contamination incidents, floods, climate change, and fires. The software includes the EPANET solver as well as a WNTR solver with the ability to model pressure-driven demand hydraulics, pipe breaks, component degradation and failure, changes to supply and demand, and cascading failure. Damage to individual components in the network (i.e. pipes, tanks) can be selected probabilistically using fragility curves. WNTR can also simulate different types of resilience-enhancing actions, including scheduled pipe repair or replacement, water conservation efforts, addition of back-up power, and use of contamination warning systems. The software can be used to estimate potential damage in a network, evaluate preparedness, prioritize repair strategies, and identify worst-case scenarios. As a Python package, WNTR takes advantage of many existing Python capabilities, including parallel processing of scenarios and graphics capabilities. This presentation will outline the modeling components in WNTR, demonstrate their use, give the audience information on how to get started using the code, and invite others to participate in this open source project.
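
    A minimal usage sketch is shown below. It follows WNTR's documented entry points (a WaterNetworkModel built from an EPANET .inp file and the WNTRSimulator), but exact class names and options may differ between releases, and the input file name here is only a placeholder.

        # Indicative only; consult the WNTR documentation for the current API.
        import wntr

        wn = wntr.network.WaterNetworkModel('Net3.inp')   # network defined in an EPANET .inp file
        sim = wntr.sim.WNTRSimulator(wn)                  # WNTR solver (pressure-driven capable)
        results = sim.run_sim()

        pressure = results.node['pressure']               # pandas DataFrame: time x node
        print(pressure.head())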

  2. RF Wave Simulation Using the MFEM Open Source FEM Package

    NASA Astrophysics Data System (ADS)

    Stillerman, J.; Shiraiwa, S.; Bonoli, P. T.; Wright, J. C.; Green, D. L.; Kolev, T.

    2016-10-01

    A new plasma wave simulation environment based on the finite element method is presented. MFEM, a scalable open-source FEM library, is used as the basis for this capability. MFEM allows for assembling an FEM matrix of arbitrarily high order in a parallel computing environment. A 3D frequency domain RF physics layer was implemented using a python wrapper for MFEM and a cold collisional plasma model was ported. This physics layer allows for defining the plasma RF wave simulation model without user knowledge of the FEM weak-form formulation. A graphical user interface is built on πScope, a python-based scientific workbench, such that a user can build a model definition file interactively. Benchmark cases have been ported to this new environment, with results being consistent with those obtained using COMSOL multiphysics, GENRAY, and TORIC/TORLH spectral solvers. This work is a first step in bringing to bear the sophisticated computational tool suite that MFEM provides (e.g., adaptive mesh refinement, solver suite, element types) to the linear plasma-wave interaction problem, and within more complicated integrated workflows, such as coupling with core spectral solver, or incorporating additional physics such as an RF sheath potential model or kinetic effects. USDoE Awards DE-FC02-99ER54512, DE-FC02-01ER54648.

  3. Diverse data supports the transition of filamentous fungal model organisms into the post-genomics era

    DOE PAGES

    McCluskey, Kevin; Baker, Scott E.

    2017-02-17

    As model organisms filamentous fungi have been important since the beginning of modern biological inquiry and have benefitted from open data since the earliest genetic maps were shared. From early origins in simple Mendelian genetics of mating types, parasexual genetics of colony colour, and the foundational demonstration of the segregation of a nutritional requirement, the contribution of research systems utilising filamentous fungi has spanned the biochemical genetics era, through the molecular genetics era, and now are at the very foundation of diverse omics approaches to research and development. Fungal model organisms have come from most major taxonomic groups although Ascomycete filamentous fungi have seen the most major sustained effort. In addition to the published material about filamentous fungi, shared molecular tools have found application in every area of fungal biology. Likewise, shared data has contributed to the success of model systems. Furthermore, the scale of data supporting research with filamentous fungi has grown by 10 to 12 orders of magnitude. From genetic to molecular maps, expression databases, and finally genome resources, the open and collaborative nature of the research communities has assured that the rising tide of data has lifted all of the research systems together.

  4. Diverse data supports the transition of filamentous fungal model organisms into the post-genomics era

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCluskey, Kevin; Baker, Scott E.

    As model organisms filamentous fungi have been important since the beginning of modern biological inquiry and have benefitted from open data since the earliest genetic maps were shared. From early origins in simple Mendelian genetics of mating types, parasexual genetics of colony colour, and the foundational demonstration of the segregation of a nutritional requirement, the contribution of research systems utilising filamentous fungi has spanned the biochemical genetics era, through the molecular genetics era, and now are at the very foundation of diverse omics approaches to research and development. Fungal model organisms have come from most major taxonomic groups although Ascomycete filamentous fungi have seen the most major sustained effort. In addition to the published material about filamentous fungi, shared molecular tools have found application in every area of fungal biology. Likewise, shared data has contributed to the success of model systems. Furthermore, the scale of data supporting research with filamentous fungi has grown by 10 to 12 orders of magnitude. From genetic to molecular maps, expression databases, and finally genome resources, the open and collaborative nature of the research communities has assured that the rising tide of data has lifted all of the research systems together.

  5. Assessing the relationship between groundwater nitrate and animal feeding operations in Iowa (USA)

    USGS Publications Warehouse

    Zirkle, Keith W.; Nolan, Bernard T.; Jones, Rena R.; Weyer, Peter J.; Ward, Mary H.; Wheeler, David C.

    2016-01-01

    Nitrate-nitrogen is a common contaminant of drinking water in many agricultural areas of the United States of America (USA). Ingested nitrate from contaminated drinking water has been linked to an increased risk of several cancers, specific birth defects, and other diseases. In this research, we assessed the relationship between animal feeding operations (AFOs) and groundwater nitrate in private wells in Iowa. We characterized AFOs by swine and total animal units and type (open, confined, or mixed), and we evaluated the number and spatial intensities of AFOs in proximity to private wells. The types of AFO indicate the extent to which a facility is enclosed by a roof. Using linear regression models, we found significant positive associations between the total number of AFOs within 2 km of a well (p trend < 0.001), number of open AFOs within 5 km of a well (p trend < 0.001), and number of mixed AFOs within 30 km of a well (p trend < 0.001) and the log nitrate concentration. Additionally, we found significant increases in log nitrate in the top quartiles for AFO spatial intensity, open AFO spatial intensity, and mixed AFO spatial intensity compared to the bottom quartile (0.171 log(mg/L), 0.319 log(mg/L), and 0.541 log(mg/L), respectively; all p < 0.001). We also explored the spatial distribution of nitrate-nitrogen in drinking wells and found significant spatial clustering of high-nitrate wells (> 5 mg/L) compared with low-nitrate (≤ 5 mg/L) wells (p = 0.001). A generalized additive model for high-nitrate status identified statistically significant areas of risk for high levels of nitrate. Adjustment for some AFO predictor variables explained a portion of the elevated nitrate risk. These results support a relationship between animal feeding operations and groundwater nitrate concentrations and differences in nitrate loss from confined AFOs vs. open or mixed types.
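
    The kind of regression described above can be sketched as follows; the CSV file, column names and model specification are hypothetical illustrations of the approach (log nitrate regressed on AFO counts within buffer distances), not the authors' exact dataset or model.

        # Hypothetical input table: one row per well, with nitrate and AFO counts.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        wells = pd.read_csv("iowa_wells.csv")
        wells["log_nitrate"] = np.log(wells["nitrate_mg_per_L"])

        predictors = wells[["afo_total_2km", "afo_open_5km", "afo_mixed_30km"]]
        model = sm.OLS(wells["log_nitrate"], sm.add_constant(predictors)).fit()
        print(model.summary())   # coefficients: change in log nitrate per additional AFO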

  6. 77 FR 31400 - Nixon Presidential Historical Materials: Opening of Materials

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-25

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Nixon Presidential Historical Materials: Opening of Materials AGENCY: National Archives and Records Administration. ACTION: Notice of Opening of Additional... by the Richard Nixon Presidential Library and Museum, a division of the National Archives and Records...

  7. 77 FR 58179 - Nixon Presidential Historical Materials: Opening of Materials

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-19

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Nixon Presidential Historical Materials: Opening of Materials AGENCY: National Archives and Records Administration ACTION: Notice of opening of additional... by the Richard Nixon Presidential Library and Museum, a division of the National Archives and Records...

  8. 76 FR 27092 - Nixon Presidential Historical Materials: Opening of Materials

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-10

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Nixon Presidential Historical Materials: Opening of Materials AGENCY: National Archives and Records Administration. ACTION: Notice of opening of additional... by the Richard Nixon Presidential Library and Museum, a division of the National Archives and Records...

  9. 75 FR 68384 - Nixon Presidential Historical Materials: Opening of Materials

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-05

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Nixon Presidential Historical Materials: Opening of Materials AGENCY: National Archives and Records Administration. ACTION: Notice of Opening of Additional... by the Richard Nixon Presidential Library and Museum, a division of the National Archives and Records...

  10. A discrete-space urban model with environmental amenities

    Treesearch

    Liaila Tajibaeva; Robert G. Haight; Stephen Polasky

    2008-01-01

    This paper analyzes the effects of providing environmental amenities associated with open space in a discrete-space urban model and characterizes optimal provision of open space across a metropolitan area. The discrete-space model assumes distinct neighborhoods in which developable land is homogeneous within a neighborhood but heterogeneous across neighborhoods. Open...

  11. OpenWorm: an open-science approach to modeling Caenorhabditis elegans.

    PubMed

    Szigeti, Balázs; Gleeson, Padraig; Vella, Michael; Khayrulin, Sergey; Palyanov, Andrey; Hokanson, Jim; Currie, Michael; Cantarelli, Matteo; Idili, Giovanni; Larson, Stephen

    2014-01-01

    OpenWorm is an international collaboration with the aim of understanding how the behavior of Caenorhabditis elegans (C. elegans) emerges from its underlying physiological processes. The project has developed a modular simulation engine to create computational models of the worm. The modularity of the engine makes it possible to easily modify the model, incorporate new experimental data and test hypotheses. The modeling framework incorporates both biophysical neuronal simulations and a novel fluid-dynamics-based soft-tissue simulation for physical environment-body interactions. The project's open-science approach is aimed at overcoming the difficulties of integrative modeling within a traditional academic environment. In this article the rationale is presented for creating the OpenWorm collaboration, the tools and resources developed thus far are outlined and the unique challenges associated with the project are discussed.

  12. Three-Dimensional Modeling of Aircraft High-Lift Components with Vehicle Sketch Pad

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.

    2016-01-01

    Vehicle Sketch Pad (OpenVSP) is a parametric geometry modeler that has been used extensively for conceptual design studies of aircraft, including studies using higher-order analysis. OpenVSP can model flap and slat surfaces using simple shearing of the airfoil coordinates, which is an appropriate level of complexity for lower-order aerodynamic analysis methods. For three-dimensional analysis, however, there is not a built-in method for defining the high-lift components in OpenVSP in a realistic manner, or for controlling their complex motions in a parametric manner that is intuitive to the designer. This paper seeks instead to utilize OpenVSP's existing capabilities, and establish a set of best practices for modeling high-lift components at a level of complexity suitable for higher-order analysis methods. Techniques are described for modeling the flap and slat components as separate three-dimensional surfaces, and for controlling their motion using simple parameters defined in the local hinge-axis frame of reference. To demonstrate the methodology, an OpenVSP model for the Energy-Efficient Transport (EET) AR12 wind-tunnel model has been created, taking advantage of OpenVSP's Advanced Parameter Linking capability to translate the motions of the high-lift components from the hinge-axis coordinate system to a set of transformations in OpenVSP's frame of reference.
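
    The hinge-axis bookkeeping described above can be illustrated with a short sketch (hypothetical hinge-line coordinates; this is not OpenVSP's Advanced Parameter Linking itself): a flap deflection specified about an arbitrary hinge axis is converted to a rotation in the global frame via Rodrigues' formula, which could then be mapped onto a component's transform parameters.

        # Rotate a point on the flap about a hinge line defined by two points.
        import numpy as np

        def hinge_rotation(p0, p1, deflection_deg):
            """Rotation matrix for a deflection (deg) about the axis p0 -> p1."""
            k = np.asarray(p1, float) - np.asarray(p0, float)
            k /= np.linalg.norm(k)
            a = np.deg2rad(deflection_deg)
            K = np.array([[0.0, -k[2], k[1]],
                          [k[2], 0.0, -k[0]],
                          [-k[1], k[0], 0.0]])
            return np.eye(3) + np.sin(a) * K + (1.0 - np.cos(a)) * (K @ K)

        def rotate_about_hinge(point, p0, p1, deflection_deg):
            p0 = np.asarray(p0, float)
            R = hinge_rotation(p0, p1, deflection_deg)
            return p0 + R @ (np.asarray(point, float) - p0)

        # Hypothetical swept hinge line and a point on the flap surface (x, y, z).
        print(rotate_about_hinge([5.2, 3.0, 0.1], [4.8, 1.0, 0.0], [5.6, 6.0, 0.0], 30.0))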

  13. Perspective: Markov models for long-timescale biomolecular dynamics.

    PubMed

    Schwantes, C R; McGibbon, R T; Pande, V S

    2014-09-07

    Molecular dynamics simulations have the potential to provide atomic-level detail and insight to important questions in chemical physics that cannot be observed in typical experiments. However, simply generating a long trajectory is insufficient, as researchers must be able to transform the data in a simulation trajectory into specific scientific insights. Although this analysis step has often been taken for granted, it deserves further attention as large-scale simulations become increasingly routine. In this perspective, we discuss the application of Markov models to the analysis of large-scale biomolecular simulations. We draw attention to recent improvements in the construction of these models as well as several important open issues. In addition, we highlight recent theoretical advances that pave the way for a new generation of models of molecular kinetics.
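
    The core construction step behind such Markov models can be sketched in a few lines: discretize the trajectory into states, count lag-time transitions, and row-normalize. Real MSM packages add reversible estimation, implied-timescale tests and coarse-graining on top of this minimal version; the toy trajectory below is purely illustrative.

        # Row-normalized count matrix from a discretized trajectory.
        import numpy as np

        def transition_matrix(state_traj, n_states, lag=1):
            """Estimate T[i, j] = P(state j at t+lag | state i at t)."""
            counts = np.zeros((n_states, n_states))
            for i, j in zip(state_traj[:-lag], state_traj[lag:]):
                counts[i, j] += 1
            row_sums = counts.sum(axis=1, keepdims=True)
            row_sums[row_sums == 0] = 1.0      # avoid division by zero for unvisited states
            return counts / row_sums

        traj = np.array([0, 0, 1, 1, 2, 1, 0, 0, 2, 2, 1, 0])   # toy discretized trajectory
        T = transition_matrix(traj, n_states=3, lag=1)
        print(T)
        print("rows sum to one:", np.allclose(T.sum(axis=1), 1.0))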

  14. Applying Multivariate Discrete Distributions to Genetically Informative Count Data.

    PubMed

    Kirkpatrick, Robert M; Neale, Michael C

    2016-03-01

    We present a novel method of conducting biometric analysis of twin data when the phenotypes are integer-valued counts, which often show an L-shaped distribution. Monte Carlo simulation is used to compare five likelihood-based approaches to modeling: our multivariate discrete method, when its distributional assumptions are correct, when they are incorrect, and three other methods in common use. With data simulated from a skewed discrete distribution, recovery of twin correlations and proportions of additive genetic and common environment variance was generally poor for the Normal, Lognormal and Ordinal models, but good for the two discrete models. Sex-separate applications to substance-use data from twins in the Minnesota Twin Family Study showed superior performance of two discrete models. The new methods are implemented using R and OpenMx and are freely available.

  15. Plant hormone signaling during development: insights from computational models.

    PubMed

    Oliva, Marina; Farcot, Etienne; Vernoux, Teva

    2013-02-01

    Recent years have seen an impressive increase in our knowledge of the topology of plant hormone signaling networks. The complexity of these topologies has motivated the development of models for several hormones to aid understanding of how signaling networks process hormonal inputs. Such work has generated essential insights into the mechanisms of hormone perception and of regulation of cellular responses such as transcription in response to hormones. In addition, modeling approaches have contributed significantly to exploring how spatio-temporal regulation of hormone signaling contributes to plant growth and patterning. New tools have also been developed to obtain quantitative information on hormone distribution during development and to test model predictions, opening the way for quantitative understanding of the developmental roles of hormones. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts are, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  17. OpenMI: the essential concepts and their implications for legacy software

    NASA Astrophysics Data System (ADS)

    Gregersen, J. B.; Gijsbers, P. J. A.; Westen, S. J. P.; Blind, M.

    2005-08-01

    Information & Communication Technology (ICT) tools such as computational models are very helpful in designing river basin management plans (RBMPs). However, in the scientific world there is consensus that a single integrated modelling system to support, e.g., the implementation of the Water Framework Directive cannot be developed and that integrated systems need to be tailored closely to the local situation. As a consequence there is an urgent need to increase the flexibility of modelling systems, such that dedicated model systems can be developed from available building blocks. The HarmonIT project aims at precisely that. Its objective is to develop and implement a standard interface for modelling components and other relevant tools: the Open Modelling Interface (OpenMI) standard. The OpenMI standard has been completed and documented. It relies entirely on the "pull" principle, where data are pulled by one model from the previous model in the chain. This paper gives an overview of the OpenMI standard, explains the foremost concepts and the rationale behind it.
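
    The "pull" principle can be illustrated with a deliberately simplified sketch (the two toy models and the get_values method are hypothetical and do not reproduce the actual OpenMI linkable-component interface): the downstream component requests values from its upstream provider at the time it needs them, and the provider computes them on demand.

        # Hypothetical pull-based component chain.
        class RainfallModel:
            def get_values(self, quantity, time):
                # Stand-in computation; a real component would advance its own time step here.
                return {"precipitation_mm": 2.0 + 0.1 * time}

        class RunoffModel:
            def __init__(self, provider):
                self.provider = provider   # the upstream model in the chain

            def get_values(self, quantity, time):
                rain = self.provider.get_values("precipitation_mm", time)   # pull from upstream
                return {"discharge_m3s": 0.4 * rain["precipitation_mm"]}

        chain = RunoffModel(RainfallModel())
        print(chain.get_values("discharge_m3s", time=6.0))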

  18. Archetype-based conversion of EHR content models: pilot experience with a regional EHR system

    PubMed Central

    2009-01-01

    Background Exchange of Electronic Health Record (EHR) data between systems from different suppliers is a major challenge. EHR communication based on archetype methodology has been developed by openEHR and CEN/ISO. The experience of using archetypes in deployed EHR systems is quite limited today. Currently deployed EHR systems with large user bases have their own proprietary way of representing clinical content using various models. This study was designed to investigate the feasibility of representing EHR content models from a regional EHR system as openEHR archetypes and inversely to convert archetypes to the proprietary format. Methods The openEHR EHR Reference Model (RM) and Archetype Model (AM) specifications were used. The template model of the Cambio COSMIC, a regional EHR product from Sweden, was analyzed and compared to the openEHR RM and AM. This study was focused on the convertibility of the EHR semantic models. A semantic mapping between the openEHR RM/AM and the COSMIC template model was produced and used as the basis for developing prototype software that performs automated bi-directional conversion between openEHR archetypes and COSMIC templates. Results Automated bi-directional conversion between openEHR archetype format and COSMIC template format has been achieved. Several archetypes from the openEHR Clinical Knowledge Repository have been imported into COSMIC, preserving most of the structural and terminology related constraints. COSMIC templates from a large regional installation were successfully converted into the openEHR archetype format. The conversion from the COSMIC templates into archetype format preserves nearly all structural and semantic definitions of the original content models. A strategy of gradually adding archetype support to legacy EHR systems was formulated in order to allow sharing of clinical content models defined using different formats. Conclusion The openEHR RM and AM are expressive enough to represent the existing clinical content models from the template based EHR system tested and legacy content models can automatically be converted to archetype format for sharing of knowledge. With some limitations, internationally available archetypes could be converted to the legacy EHR models. Archetype support can be added to legacy EHR systems in an incremental way allowing a migration path to interoperability based on standards. PMID:19570196

  19. Inward open characterization of EmrD transporter with molecular dynamics simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Xianwei; Wang, Boxiong, E-mail: boxiong_wang@yahoo.com

    EmrD is a member of the multidrug resistance exporter family. Up to now, little is known about the structural dynamics that underlie the function of the EmrD protein in the inward-facing open state and how EmrD transits from an occluded state to an inward open state. For the first time, the article applied the AT simulation to investigate the membrane transporter protein EmrD, and described the dynamic features of the whole protein, the domains, the helices, and the amino acid residues during an inward-open process from its occluded state. The gradual inward-open process is different from the current model of rigid-body domain motion in the alternating-access mechanism. Simulation results show that the EmrD inward-open conformational fluctuation propagates from the C-terminal domain to the N-terminal domain via the linker region during the transition from its occluded state. The conformational fluctuation of the C-terminal domain is larger than that of the N-terminal domain. In addition, it is observed that the helices exposed to the surrounding membrane show a higher level of flexibility than the other regions, and the protonated E227 plays a key role in the transition from the occluded to the open state. -- Highlights: •This study described the dynamic features of the whole EmrD protein during an inward-open process from its occluded state. •The EmrD inward-open conformational fluctuation propagates from the C-terminal domain to the N-terminal domain via the linker region during the transition from its occluded state. •The conformational fluctuation of the C-terminal domain is larger than that of the N-terminal domain. •The protonated E227 plays a key role in the transition from the occluded to the open state.

  20. A mechanical analysis of conduit arteries accounting for longitudinal residual strains.

    PubMed

    Wang, Ruoya; Gleason, Rudolph L

    2010-04-01

    Identification of an appropriate stress-free reference configuration is critically important in providing a reasonable prediction of the intramural stress distribution when performing biomechanical analyses on arteries. The stress-free state is commonly approximated as a radially cut ring that typically opens into a nearly circular sector, relieving much of the circumferential residual strains that exist in the traction-free configuration. An opening angle is often used to characterize this sector. In this study, we first present experimental results showing significant residual deformations in the longitudinal direction of two commonly studied arteries in the pig: the common carotid artery and the left anterior descending coronary artery. We concluded that a radially cut ring cannot completely describe the stress-free state of the arteries. Instead, we propose the use of a longitudinal opening angle, in conjunction with the traditional circumferential opening angle, to experimentally quantify the stress-free state of an artery. Secondly, we propose a new kinematic model to account for the addition of longitudinal residual strains through employing the longitudinal opening angle and performed a stress analysis. We found that with the inclusion of longitudinal residual strains in the stress analysis, the predicted circumferential stress gradient was decreased by 3-fold and the predicted longitudinal stress gradient was increased by 5.7-fold. Thus, inclusion of longitudinal residual strains has a significant effect on the predicted stress distribution in arteries.

  1. Innovative Liner Concepts: Experiments and Impedance Modeling of Liners Including the Effect of Bias Flow

    NASA Technical Reports Server (NTRS)

    Kelly, Jeff; Betts, Juan Fernando; Fuller, Chris

    2000-01-01

    The normal impedance of perforated plate acoustic liners, including the effect of bias flow, was studied. Two impedance models were developed by modeling the internal flows of perforate orifices as infinite tubes with the inclusion of end corrections to handle finite length effects. These models assumed incompressible and compressible flows, respectively, between the far field and the perforate orifice. The incompressible model was used to predict impedance results for perforated plates with percent open areas ranging from 5% to 15%. The predicted resistance results showed better agreement with experiments for the higher percent open area samples. The agreement also tended to deteriorate as bias flow was increased. For perforated plates with percent open areas ranging from 1% to 5%, the compressible model was used to predict impedance results. The model predictions were closer to the experimental resistance results for the 2% to 3% open area samples. The predictions tended to deteriorate as bias flow was increased. The reactance results were well predicted by the models for the higher percent open areas, but deteriorated as the percent open area was lowered (5%) and bias flow was increased. A fit of the incompressible model to the experimental database was performed using an optimization routine that found the optimal set of multiplication coefficients on the non-dimensional groups that minimized the least squares slope error between predictions and experiments. The result of the fit indicated that terms not associated with bias flow required a greater degree of correction than the terms associated with bias flow. The fitted model improved agreement with experiments by nearly 15% for the low percent open area (5%) samples when compared to the unfitted model. The fitted model and the unfitted model performed equally well for the higher percent open areas (10% and 15%).
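
    The coefficient-fitting step can be sketched generically as a linear least-squares problem (the non-dimensional groups, the data and the linear form below are hypothetical stand-ins, not the report's actual impedance model): multiplication coefficients on the model's groups are chosen so that the predicted resistance best matches the measurements.

        # Hypothetical data illustrating least-squares fitting of multiplication coefficients.
        import numpy as np

        rng = np.random.default_rng(0)
        groups = rng.uniform(0.5, 2.0, size=(40, 3))        # non-dimensional model groups
        true_coeffs = np.array([1.2, 0.8, 1.5])             # "unknown" coefficients
        measured = groups @ true_coeffs + rng.normal(0.0, 0.05, size=40)

        coeffs, *_ = np.linalg.lstsq(groups, measured, rcond=None)
        print("fitted multiplication coefficients:", coeffs)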

  2. Pececillo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, Neil; Jibben, Zechariah; Brady, Peter

    2017-06-28

    Pececillo is a proxy-app for the open source Truchas metal processing code (LA-CC-15-097). It implements many of the physics models used in Truchas: free-surface, incompressible Navier-Stokes fluid dynamics (e.g., water waves); heat transport, material phase change, and view factor thermal radiation; species advection-diffusion; quasi-static, elastic/plastic solid mechanics with contact; and electromagnetics (Maxwell's equations). The models are simplified versions that retain the fundamental computational complexity of the Truchas models while omitting many non-essential features and modeling capabilities. The purpose is to expose Truchas algorithms in a greatly simplified context where computer science problems related to parallel performance on advanced architectures can be more easily investigated. While Pececillo is capable of performing simulations representative of typical Truchas metal casting, welding, and additive manufacturing simulations, it lacks many of the modeling capabilities needed for real applications.

  3. Lattice QCD simulations using the OpenACC platform

    NASA Astrophysics Data System (ADS)

    Majumdar, Pushan

    2016-10-01

    In this article we will explore the OpenACC platform for programming Graphics Processing Units (GPUs). The OpenACC platform offers a directive based programming model for GPUs which avoids the detailed data flow control and memory management necessary in a CUDA programming environment. In the OpenACC model, programs can be written in high level languages with OpenMP like directives. We present some examples of QCD simulation codes using OpenACC and discuss their performance on the Fermi and Kepler GPUs.

  4. A portable approach for PIC on emerging architectures

    NASA Astrophysics Data System (ADS)

    Decyk, Viktor

    2016-03-01

    A portable approach for designing Particle-in-Cell (PIC) algorithms on emerging exascale computers is based on the recognition that 3 distinct programming paradigms are needed. They are: low level vector (SIMD) processing, middle level shared memory parallel programming, and high level distributed memory programming. In addition, there is a memory hierarchy associated with each level. Such algorithms can be initially developed using vectorizing compilers, OpenMP, and MPI. This is the approach recommended by Intel for the Phi processor. These algorithms can then be translated and possibly specialized to other programming models and languages, as needed. For example, the vector processing and shared memory programming might be done with CUDA instead of vectorizing compilers and OpenMP, but generally the algorithm itself is not greatly changed. The UCLA PICKSC web site at http://www.idre.ucla.edu/ contains example open source skeleton codes (mini-apps) illustrating each of these three programming models, individually and in combination. Fortran 2003 now supports abstract data types, and design patterns can be used to support a variety of implementations within the same code base. Fortran 2003 also supports interoperability with C, so that implementations in C languages are also easy to use. Finally, main codes can be translated into dynamic environments such as Python, while still taking advantage of high performing compiled languages. Parallel languages are still evolving, with interesting developments in co-Array Fortran, UPC, and OpenACC, among others, and these can also be supported within the same software architecture. Work supported by NSF and DOE Grants.

  5. Analysis of aeromedical retrieval coverage using elliptical isochrones: An evaluation of helicopter fleet size configurations in Scotland.

    PubMed

    Dodds, Naomi; Emerson, Philip; Phillips, Stephanie; Green, David R; Jansen, Jan O

    2017-03-01

    Trauma systems in remote and rural regions often rely on helicopter emergency medical services to facilitate access to definitive care. The siting of such resources is key, but often relies on simplistic modeling of coverage, using circular isochrones. Scotland is in the process of implementing a national trauma network, and there have been calls for an expansion of aeromedical retrieval capacity. The aim of this study was to analyze population and area coverage of the current retrieval service configuration, with three aircraft, and a configuration with an additional helicopter, in the North East of Scotland, using a novel methodology. Both overall coverage and coverage by physician-staffed aircraft, with enhanced clinical capability, were analyzed. This was a geographical analysis based on calculation of elliptical isochrones, which consider the "open-jaw" configuration of many retrieval flights. Helicopters are not always based at hospitals. We modeled coverage based on different outbound and inbound flights. Areally referenced population data were obtained from the Scottish Government. The current helicopter network configuration provides 94.2% population coverage and 59.0% area coverage. The addition of a fourth helicopter would marginally increase population coverage to 94.4% and area coverage to 59.1%. However, when considering only physician-manned aircraft, the current configuration provides only 71.7% population coverage and 29.4% area coverage, which would be increased to 91.1% and 51.2%, respectively, with a second aircraft. Scotland's current helicopter network configuration provides good population coverage for retrievals to major trauma centers, which would only be increased minimally by the addition of a fourth aircraft in the North East. The coverage provided by the single physician-staffed aircraft is more limited, however, and would be increased considerably by a second physician-staffed aircraft in the North East. Elliptical isochrones provide a useful means of modeling "open-jaw" retrieval missions and provide a more realistic estimate of coverage. Epidemiological study, level IV; therapeutic study, level IV.
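
    The elliptical-isochrone idea reduces to a simple geometric test, sketched below with hypothetical coordinates, speed and time budget (not the study's data): a point is covered if the outbound leg from the helicopter base plus the inbound leg to the receiving hospital fits within the mission time budget, i.e., the point lies inside an ellipse whose foci are the base and the hospital.

        # Coverage test for an "open-jaw" retrieval mission.
        import math

        def within_isochrone(point, base, hospital, speed_kmh, budget_h):
            dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
            total_km = dist(base, point) + dist(point, hospital)   # outbound + inbound legs
            return total_km / speed_kmh <= budget_h

        base, hospital = (0.0, 0.0), (120.0, 40.0)                 # km grid coordinates (hypothetical)
        print(within_isochrone((80.0, 90.0), base, hospital, speed_kmh=220.0, budget_h=1.0))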

  6. An open-source library for the numerical modeling of mass-transfer in solid oxide fuel cells

    NASA Astrophysics Data System (ADS)

    Novaresio, Valerio; García-Camprubí, María; Izquierdo, Salvador; Asinari, Pietro; Fueyo, Norberto

    2012-01-01

    The generation of direct current electricity using solid oxide fuel cells (SOFCs) involves several interplaying transport phenomena. Their simulation is crucial for the design and optimization of reliable and competitive equipment, and for the eventual market deployment of this technology. An open-source library for the computational modeling of mass-transport phenomena in SOFCs is presented in this article. It includes several multicomponent mass-transport models (i.e. Fickian, Stefan-Maxwell and Dusty Gas Model), which can be applied both within porous media and in porosity-free domains, and several diffusivity models for gases. The library has been developed for its use with OpenFOAM®, a widespread open-source code for fluid and continuum mechanics. The library can be used to model any fluid flow configuration involving multicomponent transport phenomena and it is validated in this paper against the analytical solution of one-dimensional test cases. In addition, it is applied for the simulation of a real SOFC and further validated using experimental data.
    Program summary
    Program title: multiSpeciesTransportModels
    Catalogue identifier: AEKB_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKB_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License
    No. of lines in distributed program, including test data, etc.: 18 140
    No. of bytes in distributed program, including test data, etc.: 64 285
    Distribution format: tar.gz
    Programming language: C++
    Computer: Any x86 (the instructions reported in the paper consider only the 64 bit case for the sake of simplicity)
    Operating system: Generic Linux (the instructions reported in the paper consider only the open-source Ubuntu distribution for the sake of simplicity)
    Classification: 12
    External routines: OpenFOAM® (version 1.6-ext) (http://www.extend-project.de)
    Nature of problem: This software provides a library of models for the simulation of the steady state mass and momentum transport in a multi-species gas mixture, possibly in a porous medium. The software is particularly designed to be used as the mass-transport library for the modeling of solid oxide fuel cells (SOFC). When supplemented with other sub-models, such as thermal and charge-transport ones, it allows the prediction of the cell polarization curve and hence the cell performance.
    Solution method: Standard finite volume method (FVM) is used for solving all the conservation equations. The pressure-velocity coupling is solved using the SIMPLE algorithm (possibly adding a porous drag term if required). The mass transport can be calculated using different alternative models, namely Fick, Maxwell-Stefan or dusty gas model. The code adopts a segregated method to solve the resulting linear system of equations. The different regions of the SOFC, namely gas channels, electrodes and electrolyte, are solved independently, and coupled through boundary conditions.
    Restrictions: When extremely large species fluxes are considered, the current implementation of the Neumann and Robin boundary conditions does not avoid negative values of molar and/or mass fractions, which finally end up with numerical instability. However this never happened in the documented runs. Eventually these boundary conditions could be reformulated to become more robust.
    Running time: From seconds to hours depending on the mesh size and number of species. For example, on a 64 bit machine with Intel Core Duo T8300 and 3 GBytes of RAM, the provided test run requires less than 1 second.

  7. INTERIOR OF VESTIBULE SHOWING OPEN 1-LIGHT FRONT DOOR AT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. INTERIOR OF VESTIBULE SHOWING OPEN 1-LIGHT FRONT DOOR AT PHOTO RIGHT, AND OPEN PANEL DOOR TO BEDROOM ADDITION (BEDROOM NUMBER TWO) AT PHOTO CENTER. VIEW TO WEST. - Rush Creek Hydroelectric System, Worker Cottage, Rush Creek, June Lake, Mono County, CA

  8. 7. INTERIOR OF SOUTH MAIN BUILDING ROOM AND OPEN DOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. INTERIOR OF SOUTH MAIN BUILDING ROOM AND OPEN DOOR TO SHED ADDITION, OPEN DOOR TO NORTH MAIN BUILDING ROOM, AND CLOSED DOOR TO BATHROOM. VIEW TO NORTHWEST. - Bishop Creek Hydroelectric System, Control Station, Hydrographer's Office, Bishop Creek, Bishop, Inyo County, CA

  9. GPU Computing in Bayesian Inference of Realized Stochastic Volatility Model

    NASA Astrophysics Data System (ADS)

    Takaishi, Tetsuya

    2015-01-01

    The realized stochastic volatility (RSV) model that utilizes the realized volatility as additional information has been proposed to infer the volatility of financial time series. We consider the Bayesian inference of the RSV model by the Hybrid Monte Carlo (HMC) algorithm. The HMC algorithm can be parallelized and thus performed on the GPU for speedup. The GPU code is developed with CUDA Fortran. We compare the computational time of performing the HMC algorithm on a GPU (GTX 760) and a CPU (Intel i7-4770, 3.4 GHz) and find that the GPU can be up to 17 times faster than the CPU. We also code the program with OpenACC and find that appropriate coding can achieve a speedup similar to that of CUDA Fortran.

  10. First experience of vectorizing electromagnetic physics models for detector simulation

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Bianchini, C.; Bitzes, G.; Brun, R.; Canal, P.; Carminati, F.; de Fine Licht, J.; Duhem, L.; Elvira, D.; Gheata, A.; Jun, S. Y.; Lima, G.; Novak, M.; Presbyterian, M.; Shadura, O.; Seghal, R.; Wenzel, S.

    2015-12-01

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. The GeantV vector prototype for detector simulations has been designed to exploit both the vector capability of mainstream CPUs and multi-threading capabilities of coprocessors including NVidia GPUs and Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth, parallelization needed to achieve optimal performance or memory access latency and speed. An additional challenge is to avoid the code duplication often inherent to supporting heterogeneous platforms. In this paper we present the first experience of vectorizing electromagnetic physics models developed for the GeantV project.

  11. Hybrid cloud: bridging of private and public cloud computing

    NASA Astrophysics Data System (ADS)

    Aryotejo, Guruh; Kristiyanto, Daniel Y.; Mufadhol

    2018-05-01

    Cloud Computing is quickly emerging as a promising paradigm in recent years, especially for the business sector. In addition, through cloud service providers, cloud computing is widely used by Information Technology (IT) based startup companies to grow their business. However, most businesses' awareness of data security issues is low, since some Cloud Service Providers (CSPs) could decrypt their data. The Hybrid Cloud Deployment Model (HCDM) is open source, which makes it one of the more secure cloud computing models, and thus HCDM may solve these data security issues. The objective of this study is to design, deploy and evaluate an HCDM as Infrastructure as a Service (IaaS). In the implementation process, the Metal as a Service (MAAS) engine was used as a base to build an actual server and node, followed by installation of the vsftpd application, which serves as the FTP server. For comparison with HCDM, a public cloud was adopted through a public cloud interface. As a result, the design and deployment of HCDM were conducted successfully; besides providing good security, HCDM was able to transfer data significantly faster than the public cloud. To the best of our knowledge, the Hybrid Cloud Deployment Model is one of the more secure cloud computing models due to its open-source characteristics. Furthermore, this study will serve as a base for future studies on the Hybrid Cloud Deployment Model, which may be relevant for solving the security issues of IT-based startup companies, especially in Indonesia.

  12. CHARMM-GUI ligand reader and modeler for CHARMM force field generation of small molecules.

    PubMed

    Kim, Seonghoon; Lee, Jumin; Jo, Sunhwan; Brooks, Charles L; Lee, Hui Sun; Im, Wonpil

    2017-06-05

    Reading ligand structures into any simulation program is often nontrivial and time consuming, especially when the force field parameters and/or structure files of the corresponding molecules are not available. To address this problem, we have developed Ligand Reader & Modeler in CHARMM-GUI. Users can upload ligand structure information in various forms (using PDB ID, ligand ID, SMILES, MOL/MOL2/SDF file, or PDB/mmCIF file), and the uploaded structure is displayed on a sketchpad for verification and further modification. Based on the displayed structure, Ligand Reader & Modeler generates the ligand force field parameters and necessary structure files by searching for the ligand in the CHARMM force field library or using the CHARMM general force field (CGenFF). In addition, users can define chemical substitution sites and draw substituents in each site on the sketchpad to generate a set of combinatorial structure files and corresponding force field parameters for throughput or alchemical free energy simulations. Finally, the output from Ligand Reader & Modeler can be used in other CHARMM-GUI modules to build a protein-ligand simulation system for all supported simulation programs, such as CHARMM, NAMD, GROMACS, AMBER, GENESIS, LAMMPS, Desmond, OpenMM, and CHARMM/OpenMM. Ligand Reader & Modeler is available as a functional module of CHARMM-GUI at http://www.charmm-gui.org/input/ligandrm. © 2017 Wiley Periodicals, Inc.
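
    As a hedged sketch of how such output might be consumed downstream (the file names are assumptions for a typical Ligand Reader & Modeler job, not prescribed by the paper), the generated CHARMM PSF/CRD and parameter files can be loaded in OpenMM:

      from openmm import LangevinMiddleIntegrator, unit
      from openmm.app import (CharmmCrdFile, CharmmParameterSet, CharmmPsfFile,
                              NoCutoff, Simulation)

      params = CharmmParameterSet("lig.rtf", "lig.prm", "lig.str")   # CGenFF output (assumed names)
      psf = CharmmPsfFile("lig.psf")
      crd = CharmmCrdFile("lig.crd")

      system = psf.createSystem(params, nonbondedMethod=NoCutoff)    # gas-phase ligand system
      integrator = LangevinMiddleIntegrator(300 * unit.kelvin, 1 / unit.picosecond,
                                            0.002 * unit.picoseconds)
      sim = Simulation(psf.topology, system, integrator)
      sim.context.setPositions(crd.positions)
      sim.minimizeEnergy()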

  13. Mouth-opening device as a treatment modality in trismus patients with head and neck cancer and oral submucous fibrosis: a prospective study.

    PubMed

    Li, Yu-Hsuan; Chang, Wei-Chin; Chiang, Tien-En; Lin, Chiun-Shu; Chen, Yuan-Wu

    2018-04-26

    This study investigated the clinical effectiveness of intervention with an open-mouth exercise device designed to facilitate maximal interincisal opening (MIO) and improve quality of life in patients with head and neck (H&N) cancer and oral submucous fibrosis (OSF). Sixty patients with H&N cancer, OSF, and trismus (MIO < 35 mm) participated in the functional rehabilitation program. An open-mouth exercise device intervention group and a conventional group, each consisting of 20 patients, underwent a 12-week training and exercise program and follow-up. For the control group, an additional 20 patients were randomly selected to match the demographic characteristics of the aforementioned two groups. The MIO improvements in the three groups were 14.0, 10.5, and 1.3 mm, respectively. The results of this study confirm a significant improvement in average mouth-opening range. In addition, according to patient feedback, significant improvements in health-related quality of life and reductions in trismus symptoms occurred in the open-mouth exercise device group. This newly designed open-mouth exercise device can help patients with trismus due to H&N cancer and OSF improve their mouth-opening range and quality of life.

  14. Learning through vulnerability: a mentor-mentee experience.

    PubMed

    Jones, Kohar; Reis, Shmuel

    2010-01-01

    The following essay, drawn from the journals and work notebook of a family medicine resident and a visiting clinical mentor, chronicles their work together in an Advanced Clinical Mentoring program. This program included afternoons of direct clinical observation immediately followed by feedback sessions. In addition to addressing specific professional issues, such as time management, limiting patient encounters, agenda matching, and the One-Minute Preceptor model, the authors developed personally as they opened themselves to learning and growing as a clinician and a teacher.

  15. Structural Modeling and Response of Command, Control and Communication Shelter Systems for Event DICE THROW.

    DTIC Science & Technology

    1980-03-01

    … systems consist of a basic shelter structure whose side walls are of sandwich construction with internal stiffeners. Channel extrusions along each free edge of the shelter provide additional strength and stiffening. The shelters contain electronic equipment racks of open framework construction using …

  16. Intrinsic Variability in Multiple Systems and Clusters: Open Questions

    NASA Astrophysics Data System (ADS)

    Lampens, P.

    2006-04-01

    It is most interesting and rewarding to probe the stellar structure of stars which belong to a system originating from the same parent cloud as this provides additional and more accurate constraints for the models. New results on pulsating components in multiple systems and clusters are beginning to emerge regularly. Based on concrete studies, I will present still unsolved problems and discuss some of the issues which may affect our understanding of the pulsation physics in such systems but also in general.

  17. Abstracting data warehousing issues in scientific research.

    PubMed

    Tews, Cody; Bracio, Boris R

    2002-01-01

    This paper presents the design and implementation of the Idaho Biomedical Data Management System (IBDMS). The system preprocesses biomedical data from the IMPROVE (Improving Control of Patient Status in Critical Care) library via an Open Database Connectivity (ODBC) connection. The ODBC connection allows local and remote simulations to access filtered, joined, and sorted data using the Structured Query Language (SQL). The tool is capable of providing an overview of available data in addition to user-defined data subsets for the verification of models of the human respiratory system.
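
    A minimal sketch of the kind of ODBC access described (the DSN, table, and column names are hypothetical; the actual IBDMS schema is not given in the abstract), using Python's pyodbc:

      import pyodbc

      conn = pyodbc.connect("DSN=IBDMS;UID=analyst;PWD=secret")   # hypothetical data source
      cursor = conn.cursor()
      cursor.execute(
          "SELECT patient_id, t, airway_pressure "
          "FROM respiratory_samples "
          "WHERE patient_id = ? AND t BETWEEN ? AND ? "
          "ORDER BY t",
          ("P001", 0.0, 60.0),
      )
      rows = cursor.fetchall()   # filtered, sorted subset handed to a simulation run
      conn.close()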

  18. Peer work in Open Dialogue: A discussion paper.

    PubMed

    Bellingham, Brett; Buus, Niels; McCloughen, Andrea; Dawson, Lisa; Schweizer, Richard; Mikes-Liu, Kristof; Peetz, Amy; Boydell, Katherine; River, Jo

    2018-03-25

    Open Dialogue is a resource-oriented approach to mental health care that originated in Finland. As Open Dialogue has been adopted across diverse international healthcare settings, it has been adapted according to contextual factors. One important development in Open Dialogue has been the incorporation of paid, formal peer work. Peer work draws on the knowledge and wisdom gained through lived experience of distress and hardship to establish mutual, reciprocal, and supportive relationships with service users. As Open Dialogue is now being implemented across mental health services in Australia, stakeholders are beginning to consider the role that peer workers might have in this model of care. Open Dialogue was not, initially, conceived to include a specific role for peers, and there is little available literature, and even less empirical research, in this area. This discussion paper aims to surface some of the current debates and ideas about peer work in Open Dialogue. Examples and models of peer work in Open Dialogue are examined, and the potential benefits and challenges of adopting this approach in health services are discussed. Peer work in Open Dialogue could potentially foster democracy and disrupt clinical hierarchies, but could also move peer work from reciprocal to a less symmetrical relationship of 'giver' and 'receiver' of care. Other models of care, such as lived experience practitioners in Open Dialogue, can be conceived. However, it remains uncertain whether the hierarchical structures in healthcare and current models of funding would support any such models. © 2018 Australian College of Mental Health Nurses Inc.

  19. The sources of Antarctic bottom water in a global ice ocean model

    NASA Astrophysics Data System (ADS)

    Goosse, Hugues; Campin, Jean-Michel; Tartinville, Benoı̂t

    Two mechanisms contribute to the formation of Antarctic bottom water (AABW). The first, and probably the most important, is initiated by the brine released on the Antarctic continental shelf during ice formation, which is responsible for an increase in salinity. After mixing with ambient water at the shelf break, this salty and dense water sinks along the shelf slope and invades the deepest part of the global ocean. For the second mechanism, the increase of surface water density is due to strong cooling at the ocean-atmosphere interface, together with a contribution from brine release. This induces deep convection and the renewal of deep waters. The relative importance of these two mechanisms is investigated in a global coupled ice-ocean model. Chlorofluorocarbon (CFC) concentrations simulated by the model compare favourably with observations, suggesting a reasonable deep water ventilation in the Southern Ocean, except close to Antarctica where concentrations are too high. Two artificial passive tracers released at the surface on the Antarctic continental shelf and in the open ocean clearly show that both mechanisms contribute significantly to the renewal of AABW in the model. This indicates that open-ocean convection is overestimated in our simulation. Additional experiments show that the amount of AABW production due to the export of dense shelf waters is quite sensitive to the parameterisation of the effects of downsloping and meso-scale eddies. Nevertheless, shelf waters always contribute significantly to deep water renewal. Moreover, increasing the Gent-McWilliams [P.R. Gent, J.C. McWilliams, Journal of Physical Oceanography 20 (1990) 150-155] thickness diffusion can nearly suppress AABW formation by open-ocean convection.

  20. Compartmental analysis of washout effect in rat brain: in-beam OpenPET measurement using a 11C beam

    NASA Astrophysics Data System (ADS)

    Hirano, Yoshiyuki; Kinouchi, Shoko; Ikoma, Yoko; Yoshida, Eiji; Wakizaka, Hidekazu; Ito, Hiroshi; Yamaya, Taiga

    2013-12-01

    In-beam positron emission tomography (PET) is expected to enable dose verification by visualizing positron emitters (β+ decay). For accurate dose verification, the washout of the positron emitters should be corrected for. In addition, the quantitative washout rate is potentially useful as a diagnostic index, but its modeling has not yet been studied. In this paper, therefore, we applied compartment analyses to in-beam PET data acquired by our small OpenPET prototype, which has a physically opened field-of-view (FOV) between two detector rings. A rat brain was located in the FOV and was irradiated by a 11C beam. Time activity curves of the irradiated field were measured immediately after the irradiations, and the washout rate was obtained based on two models: the two-washout model (medium decay, k2m; slow decay, k2s) developed in a study of rabbit irradiation, and the two-compartment model used in nuclear medicine, where the efflux from tissue to blood (k2) and the influx (k3) and efflux (k4) between the first and second compartments in tissue were evaluated. The observed k2m and k2s were 0.34 and 0.005 min-1, respectively, which is consistent with the rabbit study. Also, k2m was close to the washout rate in cerebral blood flow (CBF) measurements by dynamic PET with 15O-water, while k2, k3, and k4 were 0.16, 0.15 and 0.007 min-1. Our present work suggests that the dynamics of 11C may be related to CBF, or that the permeability of molecules containing 11C atoms may be regulated by a transporter, because k2 was relatively low compared with that of a simple diffusion tracer.
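
    For orientation, the two-compartment (two-tissue) model referred to above can be written in its standard generic form (not necessarily the exact parameterization used in the paper):

      \frac{dC_1}{dt} = K_1\,C_p(t) - (k_2 + k_3)\,C_1(t) + k_4\,C_2(t), \qquad
      \frac{dC_2}{dt} = k_3\,C_1(t) - k_4\,C_2(t),

    where C_p is the input (blood) activity, C_1 and C_2 are the first and second tissue compartments, k_2 is the efflux from tissue to blood, and k_3 and k_4 are the exchange rates between the two tissue compartments.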

  1. 12 CFR Appendix G to Part 226 - Open-End Model Forms and Clauses

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 3 2013-01-01 2013-01-01 false Open-End Model Forms and Clauses G Appendix G... RESERVE SYSTEM (CONTINUED) TRUTH IN LENDING (REGULATION Z) Pt. 226, App. G Appendix G to Part 226—Open-End Model Forms and Clauses G-1Balance Computation Methods Model Clauses (Home-equity Plans) (§§ 226.6 and...

  2. Development of Automated Procedures to Generate Reference Building Models for ASHRAE Standard 90.1 and India’s Building Energy Code and Implementation in OpenStudio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parker, Andrew; Haves, Philip; Jegi, Subhash

    This paper describes a software system for automatically generating a reference (baseline) building energy model from the proposed (as-designed) building energy model. This system is built using the OpenStudio Software Development Kit (SDK) and is designed to operate on building energy models in the OpenStudio file format.

  3. SMILI: A Framework for Interfaces to Learning Data in Open Learner Models, Learning Analytics and Related Fields

    ERIC Educational Resources Information Center

    Bull, Susan; Kay, Judy

    2016-01-01

    The SMILI (Student Models that Invite the Learner In) Open Learner Model Framework was created to provide a coherent picture of the many and diverse forms of Open Learner Models (OLMs). The aim was for SMILI to provide researchers with a systematic way to describe, compare and critique OLMs. We expected it to highlight those areas where there…

  4. Lubrication Theory Model to Evaluate Surgical Alterations in Flow Mechanics of the Lower Esophageal Sphincter

    NASA Astrophysics Data System (ADS)

    Ghosh, Sudip K.; Brasseur, James G.; Zaki, Tamer; Kahrilas, Peter J.

    2003-11-01

    Surgery is commonly used to rebuild a weak lower esophageal sphincter (LES) and reduce reflux. Because the driving pressure (DP) is proportional to the muscle tension generated in the esophagus, we developed models using lubrication theory to evaluate the consequences of surgery on the muscle force required to open the LES and drive the flow. The models relate time changes in DP to lumen geometry and trans-LES flow with a manometric catheter. Inertial effects were included and found to be negligible. Two models, direct (opening specified) and indirect (opening predicted), were combined with manometric pressure and imaging data from normal and post-surgery LES. A very high sensitivity of LES opening to the details of the DP was predicted. The indirect model accurately captured LES opening and predicted a three-phase emptying process, with phases I and III requiring rapid generation of muscle tone to open the LES and empty the esophagus. Data showed that phases I and III are adversely altered by surgery, causing incomplete emptying. Parametric model studies indicated that changes to the surgical procedure can positively alter LES flow mechanics and improve clinical outcomes.
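
    As a hedged illustration of the modeling approach (a generic lubrication-theory balance, not the authors' exact formulation), flow through a slowly varying axisymmetric lumen of radius R(x,t) couples a local Poiseuille law with mass conservation:

      Q(x,t) = -\frac{\pi R^4(x,t)}{8\mu}\,\frac{\partial p}{\partial x}, \qquad
      \frac{\partial}{\partial t}\bigl[\pi R^2(x,t)\bigr] + \frac{\partial Q}{\partial x} = 0,

    so a specified opening R(x,t) determines the driving pressure (direct model), while a specified pressure history predicts the opening (indirect model).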

  5. An OpenACC-Based Unified Programming Model for Multi-accelerator Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jungwon; Lee, Seyong; Vetter, Jeffrey S

    2015-01-01

    This paper proposes a novel SPMD programming model of OpenACC. Our model integrates the different granularities of parallelism from vector-level parallelism to node-level parallelism into a single, unified model based on OpenACC. It allows programmers to write programs for multiple accelerators using a uniform programming model whether they are in shared or distributed memory systems. We implement a prototype of our model and evaluate its performance with a GPU-based supercomputer using three benchmark applications.

  6. Multi-gap high impedance plasma opening switch

    DOEpatents

    Mason, Rodney J.

    1996-01-01

    A high impedance plasma opening switch having an anode and a cathode and at least one additional electrode placed between the anode and cathode. The presence of the additional electrodes leads to the creation of additional plasma gaps which are in series, increasing the net impedance of the switch. An equivalent effect can be obtained by using two or more conventional plasma switches with their plasma gaps wired in series. Higher impedance switches can provide high current and voltage to higher impedance loads such as plasma radiation sources.

  7. The continued movement for open access to peer-reviewed literature.

    PubMed

    Liesegang, Thomas J

    2013-09-01

    To provide a current overview of the movement for open access to the peer-reviewed literature. Perspective. Literature review of recent advances in the open access movement with a personal viewpoint of the nuances of the movement. The open access movement is complex, with many different constituents. The idealists for the open access movement are seeking open access to the literature but also to the data that constitute the research within the manuscript. The business model of the traditional subscription journal is being scrutinized in relation to the surge in the number of open access journals. Within this environment authors should beware of predatory practices. More governments and funding agencies are mandating open access to their funded research. This open access movement will continue to be disruptive until a business model ensures continuity of the scientific record. A flood of open access articles that might enrich, but also might pollute or confuse, the medical literature has altered the filtering mechanism provided by the traditional peer review system. At some point there may be a shake-out, with some literature being lost in cyberspace. The open access movement is maturing and must be embraced in some format. The challenge is to establish a sustainable financial business model that will permit the use of digital technology but not endanger the decades-old traditional publication model and peer review system. Authors seem to be slower in adopting open access than the idealists in the movement. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Support of Multidimensional Parallelism in the OpenMP Programming Model

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Jost, Gabriele

    2003-01-01

    OpenMP is the current standard for shared-memory programming. While providing ease of parallel programming, the OpenMP programming model also has limitations which often affect the scalability of applications. Examples of these limitations are work distribution and point-to-point synchronization among threads. We propose extensions to the OpenMP programming model which allow the user to easily distribute the work in multiple dimensions and synchronize the workflow among the threads. The proposed extensions include four new constructs and the associated runtime library. They do not require changes to the source code and can be implemented based on the existing OpenMP standard. We illustrate the concept in a prototype translator and test it with benchmark codes and a cloud modeling code.

  9. Sea-Salt Aerosol Forecasts Compared with Wave and Sea-Salt Measurements in the Open Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Kishcha, P.; Starobinets, B.; Bozzano, R.; Pensieri, S.; Canepa, E.; Nickovie, S.; di Sarra, A.; Udisti, R.; Becagli, S.; Alpert, P.

    2012-03-01

    Sea-salt aerosol (SSA) could influence the Earth's climate by acting as cloud condensation nuclei. However, there have been no regular measurements of SSA in the open sea. At Tel-Aviv University, the DREAM-Salt prediction system has been producing daily forecasts of the 3-D distribution of sea-salt aerosol concentrations over the Mediterranean Sea (http://wind.tau.ac.il/saltina/salt.html). In order to evaluate the model performance in the open sea, daily modeled concentrations were compared directly with SSA measurements taken at the tiny island of Lampedusa, in the Central Mediterranean. In order to further test the robustness of the model, the model performance over the open sea was indirectly verified by comparing modeled SSA concentrations with wave height measurements collected by the ODAS Italia 1 buoy and the Llobregat buoy. Model-vs.-measurement comparisons show that the model is capable of producing realistic SSA concentrations and their day-to-day variations over the open sea, in accordance with observed wave height and wind speed.

  10. OpenDA Open Source Generic Data Assimilation Environment and its Application in Process Models

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada; Verlaan, Martin; Hummel, Stef; Weerts, Albrecht; Dhondia, Juzer

    2010-05-01

    Data assimilation techniques are essential elements in the state-of-the-art development of models and their optimization with data in the fields of groundwater, surface water and soil systems. They are essential tools in the calibration of complex modelling systems and the improvement of model forecasts. OpenDA is a new and generic open source data assimilation environment for application to a choice of physical process models, applied to case-dependent domains. OpenDA was introduced recently when the developers of Costa, an open-source TU Delft project [http://www.costapse.org; Van Velzen and Verlaan, 2007], and those of the DATools from the former WL|Delft Hydraulics [El Serafy et al. 2007; Weerts et al. 2009] decided to join forces. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modelling geophysical processes. It provides a generic interfacing protocol that allows the implemented data assimilation techniques to be combined with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water level, sea surface temperature, soil systems, groundwater, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications. References: Application of data assimilation in portable operational forecasting systems—the DATools assimilation environment, El Serafy G.Y., H. Gerritsen, S. Hummel, A.H. Weerts, A.E. Mynett and M. Tanaka (2007), Journal of Ocean Dynamics, DOI 10.1007/s10236-007-0124-3, pp. 485-499. COSTA a problem solving environment for data assimilation applied for hydrodynamical modelling, Van Velzen and Verlaan (2007), Meteorologische Zeitschrift, Volume 16, Number 6, December 2007, pp. 777-793(17). Application of generic data assimilation tools (DATools) for flood forecasting purposes, A.H. Weerts, G.Y.H. El Serafy, S. Hummel, J. Dhondia, and H. Gerritsen (2009), accepted by Geoscience & Computers.
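
    As an illustrative sketch only (not OpenDA's interfaces), the kind of filtering update such an environment applies to a model state can be written in a few lines of NumPy:

      import numpy as np

      def kalman_analysis(x_f, P_f, y, H, R):
          """Kalman analysis step: blend a forecast state x_f with observations y."""
          S = H @ P_f @ H.T + R                       # innovation covariance
          K = P_f @ H.T @ np.linalg.inv(S)            # Kalman gain
          x_a = x_f + K @ (y - H @ x_f)               # analysed state
          P_a = (np.eye(len(x_f)) - K @ H) @ P_f      # analysed covariance
          return x_a, P_a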

  11. Openness as a buffer against cognitive decline: The Openness-Fluid-Crystallized-Intelligence (OFCI) model applied to late adulthood.

    PubMed

    Ziegler, Matthias; Cengia, Anja; Mussel, Patrick; Gerstorf, Denis

    2015-09-01

    Explaining cognitive decline in late adulthood is a major research area. Models using personality traits as possible influential variables are rare. This study tested assumptions based on an adapted version of the Openness-Fluid-Crystallized-Intelligence (OFCI) model. The OFCI model adapted to late adulthood predicts that openness is related to the decline in fluid reasoning (Gf) through environmental enrichment. Gf should be related to the development of comprehension knowledge (Gc; investment theory). It was also assumed that Gf predicts changes in openness as suggested by the environmental success hypothesis. Finally, the OFCI model proposes that openness has an indirect influence on the decline in Gc through its effect on Gf (mediation hypothesis). Using data from the Berlin Aging Study (N = 516, 70-103 years at T1), these predictions were tested using latent change score and latent growth curve models with indicators of each trait. The current findings and prior research support environmental enrichment and success, investment theory, and partially the mediation hypotheses. Based on a summary of all findings, the OFCI model for late adulthood is suggested. (c) 2015 APA, all rights reserved.

  12. Performance evaluation of Space Shuttle SRB parachutes from air drop and scaled model wind tunnel tests. [Solid Rocket Booster recovery system

    NASA Technical Reports Server (NTRS)

    Moog, R. D.; Bacchus, D. L.; Utreja, L. R.

    1979-01-01

    The aerodynamic performance characteristics have been determined for the Space Shuttle Solid Rocket Booster drogue, main, and pilot parachutes. The performance evaluation on the 20-degree conical ribbon parachutes is based primarily on air drop tests of full scale prototype parachutes. In addition, parametric wind tunnel tests were performed and used in parachute configuration development and preliminary performance assessments. The wind tunnel test data are compared to the drop test results and both sets of data are used to determine the predicted performance of the Solid Rocket Booster flight parachutes. Data from other drop tests of large ribbon parachutes are also compared with the Solid Rocket Booster parachute performance characteristics. Parameters assessed include full open terminal drag coefficients, reefed drag area, opening characteristics, clustering effects, and forebody interference.

  13. Factorization of chiral string amplitudes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Yu-tin; Siegel, Warren; Yuan, Ellis Ye

    We re-examine a closed-string model defined by altering the boundary conditions for one handedness of two-dimensional propagators in otherwise-standard string theory. We evaluate the amplitudes using Kawai-Lewellen-Tye factorization into open-string amplitudes. The only modification to standard string theory is effectively that the spacetime Minkowski metric changes overall sign in one open-string factor. This cancels all but a finite number of states: as found in earlier approaches, with enough supersymmetry (e.g., type II) the tree amplitudes reproduce those of the massless truncation of ordinary string theory. However, we now find for the other cases that additional fields, formerly thought to be auxiliary, describe new spin-2 states at the two adjacent mass levels (tachyonic and tardyonic). The tachyon is always a ghost, but can be avoided in the heterotic case.

  14. Factorization of chiral string amplitudes

    DOE PAGES

    Huang, Yu-tin; Siegel, Warren; Yuan, Ellis Ye

    2016-09-16

    We re-examine a closed-string model defined by altering the boundary conditions for one handedness of two-dimensional propagators in otherwise-standard string theory. We evaluate the amplitudes using Kawai-Lewellen-Tye factorization into open-string amplitudes. The only modification to standard string theory is effectively that the spacetime Minkowski metric changes overall sign in one open-string factor. This cancels all but a finite number of states: as found in earlier approaches, with enough supersymmetry (e.g., type II) the tree amplitudes reproduce those of the massless truncation of ordinary string theory. However, we now find for the other cases that additional fields, formerly thought to be auxiliary, describe new spin-2 states at the two adjacent mass levels (tachyonic and tardyonic). The tachyon is always a ghost, but can be avoided in the heterotic case.

  15. eLoom and Flatland: specification, simulation and visualization engines for the study of arbitrary hierarchical neural architectures.

    PubMed

    Caudell, Thomas P; Xiao, Yunhai; Healy, Michael J

    2003-01-01

    eLoom is an open source graph simulation software tool, developed at the University of New Mexico (UNM), that enables users to specify and simulate neural network models. Its specification language and libraries enable users to construct and simulate arbitrary, potentially hierarchical network structures on serial and parallel processing systems. In addition, eLoom is integrated with UNM's Flatland, an open source virtual environments development tool, to provide real-time visualizations of network structure and activity. Visualization is a useful method for understanding both learning and computation in artificial neural networks. Through 3D animated pictorial representations of the state and flow of information in the network, a better understanding of network functionality is achieved. ART-1, LAPART-II, MLP, and SOM neural networks are presented to illustrate eLoom and Flatland's capabilities.

  16. An open source Bayesian Monte Carlo isotope mixing model with applications in Earth surface processes

    NASA Astrophysics Data System (ADS)

    Arendt, Carli A.; Aciego, Sarah M.; Hetland, Eric A.

    2015-05-01

    The implementation of isotopic tracers as constraints on source contributions has become increasingly relevant to understanding Earth surface processes. Interpretation of these isotopic tracers has become more accessible with the development of Bayesian Monte Carlo (BMC) mixing models, which allow uncertainty in mixing end-members and provide methodology for systems with multicomponent mixing. This study presents an open source multiple isotope BMC mixing model that is applicable to Earth surface environments with sources exhibiting distinct end-member isotopic signatures. Our model is first applied to new δ18O and δD measurements from the Athabasca Glacier, which showed expected seasonal melt evolution trends and vigorously assessed the statistical relevance of the resulting fraction estimations. To highlight the broad applicability of our model to a variety of Earth surface environments and relevant isotopic systems, we expand our model to two additional case studies: deriving melt sources from δ18O, δD, and 222Rn measurements of Greenland Ice Sheet bulk water samples and assessing nutrient sources from ɛNd and 87Sr/86Sr measurements of Hawaiian soil cores. The model produces results for the Greenland Ice Sheet and Hawaiian soil data sets that are consistent with the originally published fractional contribution estimates. The advantage of this method is that it quantifies the error induced by variability in the end-member compositions, unrealized by the models previously applied to the above case studies. Results from all three case studies demonstrate the broad applicability of this statistical BMC isotopic mixing model for estimating source contribution fractions in a variety of Earth surface systems.
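
    A toy NumPy version of the Bayesian Monte Carlo mixing idea (illustrative only: the end-member values and uncertainties below are made up, and the published model handles several isotope systems and more than two sources):

      import numpy as np

      rng = np.random.default_rng(0)
      d18O_mix = -20.0                                      # hypothetical measured mixture (permil)
      ends = {"snow": (-24.0, 1.0), "ice": (-17.0, 1.0)}    # end-member mean and 1-sigma

      accepted = []
      for _ in range(200_000):
          f = rng.uniform()                                 # candidate snow-melt fraction
          snow = rng.normal(*ends["snow"])                  # sample end-members with uncertainty
          ice = rng.normal(*ends["ice"])
          if abs(f * snow + (1.0 - f) * ice - d18O_mix) < 0.2:   # keep fractions that reproduce the mixture
              accepted.append(f)

      accepted = np.array(accepted)
      print(f"snow fraction: {accepted.mean():.2f} +/- {accepted.std():.2f}")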

  17. Open Science and the Monitoring of Aquatic Ecosystems

    EPA Science Inventory

    Open science represents both a philosophy and a set of tools that can be leveraged for more effective scientific analysis. At the core of the open science movement is the concept that research should be reproducible and transparent, in addition to having long-term provenance thro...

  18. 76 FR 77229 - Sunshine Act Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-12

    ... EXPORT-IMPORT BANK OF THE UNITED STATES Sunshine Act Meeting ACTION: Notice of a Partially Open...., Washington, DC 20571. OPEN AGENDA ITEM: Item No. 1: Ex-Im Bank Advisory Committee for 2012 (Additional Members). PUBLIC PARTICIPATION: The meeting will be open to public observation for Item No. 1 only...

  19. Simulating closed- and open-loop voluntary movement: a nonlinear control-systems approach.

    PubMed

    Davidson, Paul R; Jones, Richard D; Andreae, John H; Sirisena, Harsha R

    2002-11-01

    In many recent human motor control models, including feedback-error learning and adaptive model theory (AMT), feedback control is used to correct errors while an inverse model is simultaneously tuned to provide accurate feedforward control. This popular and appealing hypothesis, based on a combination of psychophysical observations and engineering considerations, predicts that once the tuning of the inverse model is complete the role of feedback control is limited to the correction of disturbances. This hypothesis was tested by looking at the open-loop behavior of the human motor system during adaptation. An experiment was carried out involving 20 normal adult subjects who learned a novel visuomotor relationship on a pursuit tracking task with a steering wheel for input. During learning, the response cursor was periodically blanked, removing all feedback about the external system (i.e., about the relationship between hand motion and response cursor motion). Open-loop behavior was not consistent with a progressive transfer from closed- to open-loop control. Our recently developed computational model of the brain--a novel nonlinear implementation of AMT--was able to reproduce the observed closed- and open-loop results. In contrast, other control-systems models exhibited only minimal feedback control following adaptation, leading to incorrect open-loop behavior. This is because our model continues to use feedback to control slow movements after adaptation is complete. This behavior enhances the internal stability of the inverse model. In summary, our computational model is currently the only motor control model able to accurately simulate the closed- and open-loop characteristics of the experimental response trajectories.
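
    A minimal discrete-time sketch of the hypothesis being tested (a generic feedforward-plus-feedback controller, not the AMT implementation): the inverse model supplies the feedforward command, while a feedback term corrects whatever error remains.

      import numpy as np

      a, b = 0.9, 0.5            # "true" first-order plant: y[k+1] = a*y[k] + b*u[k]
      a_hat, b_hat = 0.85, 0.45  # imperfect internal (inverse) model of the plant
      Kp = 0.8                   # feedback gain

      y = 0.0
      target = np.sin(0.05 * np.arange(400))               # pursuit-style reference
      for k in range(len(target) - 1):
          u_ff = (target[k + 1] - a_hat * y) / b_hat       # feedforward from the inverse model
          u_fb = Kp * (target[k] - y)                      # feedback correction of the current error
          y = a * y + b * (u_ff + u_fb)                    # open-loop behaviour would use u_ff alone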

  20. Open-loop GPS signal tracking at low elevation angles from a ground-based observation site

    NASA Astrophysics Data System (ADS)

    Beyerle, Georg; Zus, Florian

    2016-04-01

    For more than a decade, space-based global navigation satellite system (GNSS) radio occultation (RO) observations have been used by meteorological services world-wide for their numerical weather prediction models. In addition, climate studies increasingly rely on validated GNSS-RO data sets of atmospheric parameters. GNSS-RO profiles typically cover an altitude range from the boundary layer up to the upper stratosphere; their highest accuracy and precision, however, are attained at the tropopause level. In the lower troposphere, multipath ray propagation tends to induce signal amplitude and frequency fluctuations, which led to the development and implementation of open-loop signal tracking methods in GNSS-RO receiver firmware. In open-loop mode the feedback values for the carrier tracking loop are derived not from measured data, but from a Doppler frequency model which usually is extracted from an atmospheric climatology. In order to ensure that this receiver-internal parameter set does not bias the carrier phase path observables, dual-channel open-loop GNSS-RO signal tracking was suggested. Following this proposal the ground-based "GLESER" (GPS low-elevation setting event recorder) campaign was established. Its objective was to disprove the existence of model-induced frequency biases using ground-based GPS observations at very low elevation angles. Between January and December 2014 about 2600 validated setting events, starting at geometric elevation angles of +2° and extending to -1°… - 1.5°, were recorded by the single frequency "OpenGPS" GPS receiver at a measurement site located close to Potsdam, Germany (52.3808°N, 13.0642°E). The study is based on the assumption that these ground-based observations may be used as proxies for space-based RO measurements, even if the latter occur on a one order of magnitude faster temporal scale. The "GLESER" data analysis shows that the open-loop Doppler model has negligible influence on the derived frequency profile provided signal-to-noise density ratios remain above about 30 dB Hz. At low signal levels, however, the dual-channel open-loop design, which tracks the same signal using two Doppler models separated by a 10 Hz offset, reveals a notable bias. A significant fraction of this bias is caused by frequency aliasing. The receiver's dual-channel setup, however, allows for unambiguous identification of the affected observation samples. Finally, the repeat patterns in terms of azimuth angle of the GPS orbit traces reveal characteristic signatures in both signal amplitude and Doppler frequency with respect to the topography close to the observation site. On the other hand, mean vertical refractivity gradients extracted from ECMWF meteorological fields exhibit moderate correlations with observed signal amplitude fluctuations at negative elevation angles, emphasizing the information content of low-elevation GPS signals with respect to the atmospheric state in the boundary layer.

  1. Closed-loop suppression of chaos in nonlinear driven oscillators

    NASA Astrophysics Data System (ADS)

    Aguirre, L. A.; Billings, S. A.

    1995-05-01

    This paper discusses the suppression of chaos in nonlinear driven oscillators via the addition of a periodic perturbation. Given a system originally undergoing chaotic motions, it is desired that such a system be driven to some periodic orbit. This can be achieved by the addition of a weak periodic signal to the oscillator input. This is usually accomplished in open loop, but this procedure presents some difficulties which are discussed in the paper. To ensure that this is attained despite uncertainties and possible disturbances on the system, a procedure is suggested to perform control in closed loop. In addition, it is illustrated how a model, estimated from input/output data, can be used in the design. Numerical examples which use the Duffing-Ueda and modified van der Pol oscillators are included to illustrate some of the properties of the new approach.
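
    An illustrative integration of a Duffing-Ueda-type oscillator with a weak additional periodic perturbation (the parameter values are arbitrary, not those used in the paper):

      import numpy as np
      from scipy.integrate import solve_ivp

      k, B = 0.1, 12.0          # damping and main drive amplitude (arbitrary)
      eps, w2 = 0.5, 2.0        # weak perturbation amplitude and frequency (arbitrary)

      def duffing_ueda(t, state):
          x, v = state
          drive = B * np.cos(t) + eps * np.cos(w2 * t)     # main drive plus weak added signal
          return [v, -k * v - x**3 + drive]

      sol = solve_ivp(duffing_ueda, (0.0, 500.0), [0.1, 0.0], max_step=0.01)
      x_late = sol.y[0][sol.t > 400.0]                     # inspect the post-transient behaviour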

  2. The Open Learning Object Model to Promote Open Educational Resources

    ERIC Educational Resources Information Center

    Fulantelli, Giovanni; Gentile, Manuel; Taibi, Davide; Allegra, Mario

    2008-01-01

    In this paper we present the results of research work, that forms part of the activities of the EU-funded project SLOOP: Sharing Learning Objects in an Open Perspective, aimed at encouraging the definition, development and management of Open Educational Resources based on the Learning Object paradigm (Wiley, 2000). We present a model of Open…

  3. The Emergence of Open-Source Software in North America

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    Unlike conventional models of software development, the open source model is based on the collaborative efforts of users who are also co-developers of the software. Interest in open source software has grown exponentially in recent years. A "Google" search for the phrase open source in early 2005 returned 28.8 million webpage hits, while…

  4. OpenMDAO Framework Status

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia Gutierrez

    2010-01-01

    Advancing and exploring the science of Multidisciplinary Analysis & Optimization (MDAO) capabilities are high-level goals in the Fundamental Aeronautics Program's Subsonic Fixed Wing (SFW) project. The OpenMDAO team has made significant progress toward completing the Alpha OpenMDAO deliverable due in September 2010. Included in the presentation are: details of progress on developing the OpenMDAO framework, example usage of OpenMDAO, technology transfer plans, near-term plans, progress toward establishing partnerships with external parties, and discussion of additional potential collaborations.

  5. Collaborative data model and data base development for paleoenvironmental and archaeological domain using Semantic MediaWiki

    NASA Astrophysics Data System (ADS)

    Willmes, C.

    2017-12-01

    In the frame of the Collaborative Research Centre 806 (CRC 806), an interdisciplinary research project that needs to manage data, information and knowledge from heterogeneous domains such as archaeology, the cultural sciences, and the geosciences, a collaborative internal knowledge base system was developed. The system is based on the open source MediaWiki software, well known as the software behind Wikipedia, for providing a web-based collaborative knowledge and information management platform. This software is additionally enhanced with the Semantic MediaWiki (SMW) extension, which allows structured data to be stored and managed within the wiki platform and provides complex query and API interfaces to the structured data stored in the SMW database. Using an additional open source software tool called mobo, it is possible to improve the data model development process as well as automated data imports, from small spreadsheets to large relational databases. Mobo is a command line tool that helps build and deploy SMW structures in an agile, schema-driven development way, and allows the data model formalizations, written in JSON-Schema format, to be managed and collaboratively developed using version control systems like git. The combination of a well-equipped collaborative web platform facilitated by MediaWiki, the ability to store and query structured data in this collaborative database provided by SMW, and the automated data import and data model development enabled by mobo results in a powerful but flexible system for building and developing a collaborative knowledge base. Furthermore, SMW allows the application of Semantic Web technology: the structured data can be exported to RDF, so a triple store with a SPARQL endpoint can be set up on top of the database. The JSON-Schema based data models can be enhanced into JSON-LD to facilitate, and profit from, the possibilities of Linked Data technology.

  6. Gap opening after merger events of 3-Earth-mass protoplanets

    NASA Astrophysics Data System (ADS)

    Broz, Miroslav; Chrenko, Ondrej

    2017-10-01

    While several-Earth-mass protoplanets can gain non-negligible eccentricities due to their interactions with the gaseous disk and ongoing pebble accretion (the so-called hot trail effect; see the contribution of Chrenko et al. 2017 for details), there is an open pathway for giant-planet core formation by means of close encounters and eventual merging. As soon as a massive (~13 M_E) merger is formed, it seems necessary to account for one additional term in the set of hydrodynamic equations, namely gas accretion, which may affect subsequent orbital evolution and eventually change Type-I migration to Type-II. Using similar approximations as Crida and Bitsch (2017), we prolong our previous simulations towards the onset of gap opening. At the same time, we try to address the observability of these events, e.g. by ALMA in its full configuration. Because the disk is still mostly optically thick in the vertical direction (tau =~ 100), it is necessary to properly model the disk atmosphere. In the midplane, the mean free path of gas molecules is small enough to assure a sufficient thermal contact and equilibrium between the gas and dust. This is no longer true far from the midplane, and one has to use a non-equilibrium model (e.g. the Radmc-3d code) for the description of dust grain temperatures, the resulting synthetic image, or the emergent spectrum.

  7. Contribution to an effective design method for stationary reaction-diffusion patterns.

    PubMed

    Szalai, István; Horváth, Judit; De Kepper, Patrick

    2015-06-01

    The British mathematician Alan Turing predicted, in his seminal 1952 publication, that stationary reaction-diffusion patterns could spontaneously develop in reacting chemical or biochemical solutions. The first two clear experimental demonstrations of such a phenomenon were not made before the early 1990s, when the design of new chemical oscillatory reactions and appropriate open spatial chemical reactors had been invented. Yet the number of pattern-producing reactions had not grown until 2009, when we developed an operational design method which takes into account the feeding conditions and other specificities of real open spatial reactors. Since then, on the basis of this method, five additional reactions were shown to produce stationary reaction-diffusion patterns. To gain a clearer view of where our methodical approach to the patterning capacity of a reaction stands, numerical studies in conditions that mimic true open spatial reactors were made. In these numerical experiments, we explored the patterning capacity of Rabai's model for pH-driven Landolt-type reactions as a function of experimentally attainable parameters that control the main time and length scales. Because of the straightforward reversible binding of protons to carboxylate-carrying polymer chains, this class of reaction is at the base of the chemistry leading to most of the stationary reaction-diffusion patterns presently observed. We compare our model predictions with experimental observations and comment on agreements and differences.
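
    For orientation (a generic two-variable formulation, not Rabai's specific kinetics), stationary Turing patterns arise in reaction-diffusion systems of the form

      \frac{\partial u}{\partial t} = D_u \nabla^2 u + f(u, v), \qquad
      \frac{\partial v}{\partial t} = D_v \nabla^2 v + g(u, v),

    when the local kinetics are stable but the ratio of diffusivities is sufficiently far from unity; in the pH-driven systems discussed here, the reversible binding of protons to low-mobility carboxylate-carrying polymer chains is what lowers the effective diffusivity of the activator.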

  8. Openness, Web 2.0 Technology, and Open Science

    ERIC Educational Resources Information Center

    Peters, Michael A.

    2010-01-01

    Open science is a term that is being used in the literature to designate a form of science based on open source models or that utilizes principles of open access, open archiving and open publishing to promote scientific communication. Open science increasingly also refers to open governance and more democratized engagement and control of science…

  9. GiPSi: a framework for open source/open architecture software development for organ-level surgical simulation.

    PubMed

    Cavuşoğlu, M Cenk; Göktekin, Tolga G; Tendick, Frank

    2006-04-01

    This paper presents the architectural details of an evolving open source/open architecture software framework for developing organ-level surgical simulations. Our goal is to facilitate shared development of reusable models, to accommodate heterogeneous models of computation, and to provide a framework for interfacing multiple heterogeneous models. The framework provides an application programming interface for interfacing dynamic models defined over spatial domains. It is specifically designed to be independent of the specifics of the modeling methods used, and therefore facilitates seamless integration of heterogeneous models and processes. Furthermore, each model has separate geometries for visualization, simulation, and interfacing, allowing the model developer to choose the most natural geometric representation for each case. Input/output interfaces for visualization and haptics for real-time interactive applications have also been provided.

  10. Geochemical study of groundwater at Sandia National Laboratories/New Mexico and Kirtland Air Force Base

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The U.S. Department of Energy (DOE) Grand Junction Projects Office (GJPO) and its contractor, Rust Geotech, support the Kirtland Area Office by assisting Sandia National Laboratories/New Mexico (Sandia/NM) with remedial action, remedial design, and technical support of its Environmental Restoration Program. To aid in determining groundwater origins and flow paths, the GJPO was tasked to provide interpretation of groundwater geochemical data. The purpose of this investigation was to describe and analyze the groundwater geochemistry of the Sandia/NM Kirtland Air Force Base (KAFB). Interpretations of groundwater origins are made by using these data and the results of "mass balance" and "reaction path" modeling. Additional maps and plots were compiled to more fully comprehend the geochemical distributions. A more complete set of these data representations is provided in the appendices. Previous interpretations of groundwater-flow paths that were based on well-head, geologic, and geochemical data are presented in various reports and were used as the basis for developing the models presented in this investigation.

  11. Gabapentin-lactam, but not gabapentin, reduces protein aggregates and improves motor performance in a transgenic mouse model of Huntington's disease.

    PubMed

    Zucker, Birgit; Ludin, Dagmar E; Gerds, Thomas A; Lücking, Carl H; Landwehrmeyer, G Bernhard; Feuerstein, Thomas J

    2004-08-01

    Gabapentin (GBP), an anti-convulsant widely used in the treatment of neuropathic pain syndromes, has been suggested to have neuroprotective properties. There is evidence, however, that the neuroprotective properties attributed to GBP are rather associated with a derivative of GBP, gabapentin-lactam (GBP-L), which opens mitochondrial ATP-dependent K+ channels, in contrast to GBP. We explored whether GBP and GBP-L may attenuate the course of a monogenetic autosomal neurodegenerative disorder, Huntington's disease (HD), using a transgenic mouse model. R6/2 mice treated with GBP-L performed walking on a narrow beam better than mice receiving no treatment, vehicle or GBP, suggesting a beneficial effect of GBP-L on motor function. In addition, a marked reduction of neuronal nuclear and cytoplasmic inclusions was observed in brains of mice treated with GBP-L. The pharmacokinetics of GBP-L yielded a mean plasma concentration near the EC50 of GBP-L to open mitochondrial ATP-dependent K+ channels. These findings support the role of GBP-L as a novel neuroprotective substance in vivo.

  12. Effects of HIV-1 on Cognition in Humanized NSG Mice

    NASA Astrophysics Data System (ADS)

    Akhter, Sidra Pervez

    The host species specificity of human immunodeficiency virus (HIV) makes it challenging to study the pathology and to develop diagnostic tools and therapeutic agents. The closely related simian immunodeficiency virus and studies of neurocognitive impairments in transgenic animals expressing a partial viral genome have significant limitations. The humanized mouse model provides a small animal system in which a human immune system can be engrafted and the immunopathobiology of HIV-1 infection can be studied. However, features of HIV-associated neurocognitive disorders (HAND) had not been evaluated in this model. The open field activity test was selected to characterize the behavior of the original NOD/scid-IL-2Rgammac null (NSG) strain, the effects of engraftment of human CD34+ hematopoietic stem cells (HSCs) and a functional human immune system (huNSG), and, finally, the behavioral changes induced by chronic HIV-1 infection. Long-term infected huNSG mice showed loss of working memory and increased anxiety in the open field. Additionally, these animals were utilized to evaluate central nervous system metabolic and structural changes. The detected behavioral abnormalities correlate with the published neuroimaging and histological abnormalities that were obtained.

  13. Resonances in the cumulative reaction probability for a model electronically nonadiabatic reaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qi, J.; Bowman, J.M.

    1996-05-01

    The cumulative reaction probability, flux-flux correlation function, and rate constant are calculated for a model, two-state, electronically nonadiabatic reaction given by Shin and Light [S. Shin and J. C. Light, J. Chem. Phys. 101, 2836 (1994)]. We apply straightforward generalizations of the flux matrix/absorbing boundary condition approach of Miller and co-workers to obtain these quantities. The upper adiabatic electronic potential supports bound states, and these manifest themselves as "recrossing" resonances in the cumulative reaction probability at total energies above the barrier to reaction on the lower adiabatic potential. At energies below the barrier, the cumulative reaction probability for the coupled system is shifted to higher energies relative to the one obtained for the ground state potential. This is due to the effect of an additional effective barrier caused by the nuclear kinetic operator acting on the ground state, adiabatic electronic wave function, as discussed earlier by Shin and Light. Calculations are reported for five sets of electronically nonadiabatic coupling parameters. (c) 1996 American Institute of Physics.

  14. 12 CFR Appendix G to Part 226 - Open-End Model Forms and Clauses

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 3 2012-01-01 2012-01-01 false Open-End Model Forms and Clauses G Appendix G... RESERVE SYSTEM TRUTH IN LENDING (REGULATION Z) Pt. 226, App. G Appendix G to Part 226—Open-End Model Forms and Clauses G-1Balance Computation Methods Model Clauses (Home-equity Plans) (§§ 226.6 and 226.7) G-1...

  15. Energy Spectral Behaviors of Communication Networks of Open-Source Communities

    PubMed Central

    Yang, Jianmei; Yang, Huijie; Liao, Hao; Wang, Jiangtao; Zeng, Jinqun

    2015-01-01

    Large-scale online collaborative production activities in open-source communities must be accompanied by large-scale communication activities. Nowadays, the production activities of open-source communities, and especially their communication activities, have attracted more and more attention. Taking the CodePlex C# community as an example, this paper constructs complex network models of 12 periods of the community's communication structure based on real data; it then discusses the basic concepts of quantum mappings of complex networks, pointing out that the purpose of the mapping is to study the structures of complex networks following the ideas used in quantum-mechanical studies of the structures of large molecules; finally, following this idea, it analyzes and compares the fractal features of the spectra under different quantum mappings of the networks, and concludes that the communication structures of the community exhibit multiple self-similarity and criticality. In addition, this paper discusses the insights offered by different quantum mappings, and the conditions for applying them, in revealing the characteristics of the structures. The proposed quantum mapping method can also be applied to structural studies of other large-scale organizations. PMID:26047331

  16. An open repository of earthquake-triggered ground-failure inventories

    USGS Publications Warehouse

    Schmitt, Robert G.; Tanyas, Hakan; Nowicki Jessee, M. Anna; Zhu, Jing; Biegel, Katherine M.; Allstadt, Kate E.; Jibson, Randall W.; Thompson, Eric M.; van Westen, Cees J.; Sato, Hiroshi P.; Wald, David J.; Godt, Jonathan W.; Gorum, Tolga; Xu, Chong; Rathje, Ellen M.; Knudsen, Keith L.

    2017-12-20

    Earthquake-triggered ground failure, such as landsliding and liquefaction, can contribute significantly to losses, but our current ability to accurately include them in earthquake-hazard analyses is limited. The development of robust and widely applicable models requires access to numerous inventories of ground failures triggered by earthquakes that span a broad range of terrains, shaking characteristics, and climates. We present an openly accessible, centralized earthquake-triggered ground-failure inventory repository in the form of a ScienceBase Community to provide open access to these data with the goal of accelerating research progress. The ScienceBase Community hosts digital inventories created by both U.S. Geological Survey (USGS) and non-USGS authors. We present the original digital inventory files (when available) as well as an integrated database with uniform attributes. We also summarize the mapping methodology and level of completeness as reported by the original author(s) for each inventory. This document describes the steps taken to collect, process, and compile the inventories and the process for adding additional ground-failure inventories to the ScienceBase Community in the future.

  17. The Langley 14- by 22-Foot Subsonic Tunnel: Description, Flow Characteristics, and Guide for Users

    NASA Technical Reports Server (NTRS)

    Gentry, Garl L., Jr.; Quinto, P. Frank; Gatlin, Gregory M.; Applin, Zachary T.

    1990-01-01

    The Langley 14- by 22-foot Subsonic Tunnel is a closed circuit, single-return atmospheric wind tunnel with a test section that can be operated in a variety of configurations (closed, slotted, partially open, and open). The closed test section configuration is 14.5 ft high by 21.75 ft wide and 50 ft long with a maximum speed of about 338 ft/sec. The open test section configuration has a maximum speed of about 270 ft/sec, and is formed by raising the ceiling and walls to form a floor-only configuration. The tunnel may be configured with a moving-belt ground plane and a floor boundary-layer removal system at the entrance to the test section for ground effect testing. In addition, the tunnel has a two-component laser velocimeter, a frequency modulated (FM) tape system for dynamic data acquisition, flow visualization equipment, and acoustic testing capabilities. Users of the 14- by 22-foot Subsonic Tunnel are provided with information required for planning of experimental investigations, including test hardware and model support systems.

  18. FP7 GLOWASIS - A new collaborative project aimed at pre-validation of a GMES Global Water Scarcity Information Service

    NASA Astrophysics Data System (ADS)

    Westerhoff, R.; Levizzani, V.; Pappenberger, F.; de Roo, A.; Lange, R. D.; Wagner, W.; Bierkens, M. F.; Ceran, M.; Weerts, A.; Sinclair, S.; Miguez-Macho, G.; Langius, E.; Glowasis Team

    2011-12-01

    The main objective of the project GLOWASIS is to pre-validate a GMES Global Service for Water Scarcity Information. It will be set up as a one-stop-shop portal for water scarcity information, focused on: monitoring data from satellites and in-situ sensors; improving forecasting models with better monitoring data; linking statistical water data into forecasting; and promoting GMES services and European satellites. In European and global pilots at the river-catchment scale, it combines hydrological models with in-situ and satellite-derived water cycle information, as well as governmental statistical water demand data. By linking water demand and supply in three pilot studies with existing platforms (the European Drought Observatory and PCR-GLOBWB) for medium- and long-term forecasting in Europe, Africa, and worldwide, GLOWASIS contributes both to near-real-time reporting on emerging drought events and to the provision of climate change time series. By combining complex water cycle variables, governmental issues, and economic relations with respect to water demand, GLOWASIS aims to streamline the wide variety of relevant water scarcity information. The project will raise awareness of the complexity of the water scarcity problem and promote additional capabilities of satellite-measured water cycle parameters. The service uses data from the GMES Land Monitoring Core Service (Geoland2) and Marine Core Service (MyOcean) (land use, soil moisture, soil sealing, sea level), in-situ data from GEWEX initiatives (e.g., the International Soil Moisture Network), agricultural and industrial water use and demand data (statistical, from AQUASTAT and SEEAW, and modelled), and additional water-cycle information from existing global satellite services. In-depth interviews with, among others, the EEA and the Australian Bureau of Meteorology are taking place. GLOWASIS aims for an open-source, open-standard information portal on water scarcity that makes use of modern media (forums, Twitter, etc.). The GLOWASIS portal infrastructure is set up, using open standards, for the dissemination and inclusion of current and future innovative, integrated, multi-purpose products for research and operational applications. The project started in January 2011 and has a duration of 24 months.

  19. Microstructural architecture developed in the fabrication of solid and open-cellular copper components by additive manufacturing using electron beam melting

    NASA Astrophysics Data System (ADS)

    Ramirez, Diana Alejandra

    Cu components were first fabricated by additive manufacturing using electron beam melting (EBM) from low-purity, atomized Cu powder containing a high density of Cu2O precipitates, leading to a novel example of precipitate-dislocation architecture. These microstructures exhibit cell-like arrays (1-3 µm) in the horizontal reference plane perpendicular to the build direction, with columnar-like arrays extending from ~12 to >60 µm in length and corresponding spatial dimensions of 1-3 µm. The microstructures were characterized by optical metallography and by scanning and transmission electron microscopy. Hardness measurements were taken on both the atomized powder and the Cu components. The hardness for these architectures ranged from ~HV 83 to 88, in contrast to the original Cu powder microindentation hardness of HV 72 and the commercial Cu base plate hardness of HV 57. These observations were then applied to the fabrication of open-cellular copper structures by EBM, illustrating the ability to produce a degree of controlled microstructural architecture by altering or optimizing the EBM parameters. The fabricated structures, corresponding to four different articulated mesh arrays, ranged in density from 0.73 g/cm3 to 6.67 g/cm3. While these components contained some porosity as a consequence of unmelted regions, the Cu2O precipitates also contributed to a reduced density; X-ray diffraction gave an estimated precipitate volume fraction of ~2%. The precipitates created in the EBM melt scan formed microstructural arrays that contributed to hardening and, in turn, to the strength of the mesh struts and foam ligaments. Plots of relative stiffness versus relative density for Cu compared very closely with those for Ti-6Al-4V open cellular structures, both mesh and foam. The Cu reticulated mesh structures exhibit a slope of n = 2, in contrast to a slope of n = 2.4 for the stochastic Cu foams, consistent with the Gibson-Ashby foam model where n = 2. These open cellular components exhibit considerable potential for novel, complex, multi-functional electrical and thermal management systems, especially complex, monolithic heat-exchange devices.
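    The slopes quoted above are those of the Gibson-Ashby scaling law for open-cell structures, which in its standard textbook form (a general relation, not a result specific to this work) reads:

```latex
% Gibson-Ashby scaling of relative stiffness with relative density for
% open-cell structures; n is the slope on a log-log plot (n = 2 ideally).
\[
  \frac{E}{E_s} = C_1 \left( \frac{\rho}{\rho_s} \right)^{n},
  \qquad n \approx 2 ,
\]
% where E and rho are the stiffness and density of the cellular structure,
% E_s and rho_s are those of the fully dense solid, and C_1 is a geometry constant.
```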

  20. Influences of current collector foils with different opening ratios in passive polymer electrolyte membrane fuel cells

    NASA Astrophysics Data System (ADS)

    Krumbholz, S.; Kaiser, J.; Weiland, M.; Hahn, R.; Reichl, H.

    Although many fuel cell applications are ready to enter the market, further research is needed to improve the power densities currently achieved. In the power range of about 10-20 W, micro-PEM fuel cells have high improvement potential with respect to the current collector design and the design of the passive air supply. These two aspects have a strong impact on the water management of a PEM fuel cell and allow a significant reduction in the size and weight of the fuel cell system. The present work shows calculations of the fuel cell impedance based on a mathematical resistance model that was previously presented for similarly constructed direct methanol fuel cells (DMFCs) [4]. Selected publications on water uptake and membrane humidification for the Gore MEAs used here [6,7] are taken into account. The model is evaluated against realized versions of cathode-side current collector designs, which influence the maximum power density and the self-heating of the fuel cell stack. Several measurement results are presented that confirm the validity of the model. A very low opening ratio of less than 0.1 induces a very high concentration gradient of the generated water relative to the net water removal. It follows that the cell impedance is very low and the membrane has a very high ionic conductivity. Additionally, it is shown that the power density of these fuel cells is twice as high as that of cells with an opening ratio greater than 0.45.

  1. Relationships between salt marsh loss and dredged canals in three Louisiana Estuaries

    USGS Publications Warehouse

    Bass, A.S.; Turner, R.E.

    1997-01-01

    Coastal land loss rates were quantified for 27 salt marshes in three estuaries of the Louisiana Mississippi Deltaic Plain: Barataria, Terrebonne, and St. Bernard. The sites ranged from 23 ha to 908 ha, and the total area of all sites was 6,367 ha. Two methods were used to calculate open water and canal density in each of five years: (1) a Geographic Information System for 1956 and 1978, and (2) a point grid method for 1974, 1988, and 1990. A General Linear Model explained 79% of the variance (R2 = 0.79; P ≥ 0.95) among basins for all years and provided an estimate of the impacts of canals in each basin. The land loss rates, occurring virtually entirely as wetland to open water conversions, differed among basins. The 'background' land loss rates from 1956 to 1990 (exclusive of the direct or indirect effects of canals; %/yr; ± 1 Std. Dev.) for each basin were estimated to be: Barataria 0.71 ± 0.12, Terrebonne 0.47 ± 0.09, and St. Bernard 0.08 ± 0.14. Canals were equally and directly correlated with land loss in each basin. There were 2.85 ha of open water formed with each ha of canal dredged (inclusive of the canal area) and an additional 1 ha of wetland converted to spoil bank vegetation. Additional losses may occur if loss rates continue for periods longer than the mapping intervals. There are documented causal mechanisms involving wetland hydrologic changes that can explain these wetland losses.
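    The abstract does not specify the model formula, but the general analysis, a linear model of open-water conversion with basin as a factor and canal density as a covariate, can be sketched as follows. The data frame and column names here are hypothetical placeholders, not the study's measurements.

```python
# Minimal sketch of a General Linear Model relating land loss to canal density,
# with basin as a categorical factor. Data and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "basin":         ["Barataria", "Barataria", "Terrebonne", "Terrebonne", "StBernard", "StBernard"],
    "canal_density": [0.02, 0.05, 0.01, 0.04, 0.005, 0.02],   # fraction of site area in canals
    "open_water":    [0.10, 0.19, 0.06, 0.15, 0.02, 0.06],    # fraction of site converted to open water
})

# Ordinary least squares with a basin factor and a canal-density slope.
model = smf.ols("open_water ~ C(basin) + canal_density", data=data).fit()
print(model.summary())
print("R-squared:", round(model.rsquared, 2))
```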

  2. Arctic Forecasts Available from Polar Bear Exhibit as an Example of Formal/Informal Collaboration

    NASA Astrophysics Data System (ADS)

    Landis, C. E.; Cervenec, J.

    2012-12-01

    A subset of the general population enjoys and frequents informal education venues, offering an opportunity for lifelong learning that also enhances and supports formal education efforts. The Byrd Polar Research Center (BPRC) at The Ohio State University collaborated with the Columbus Zoo & Aquarium (CZA) in the development of their Polar Frontier exhibit, from its initial planning through the Grand Opening of the exhibit to the present. Of course, the addition to the Zoo of polar bears and Arctic fox in Polar Frontier has been very popular, with almost a 7% increase in visitors in 2010 when the exhibit opened. The CZA and BPRC are now investigating ways to increase the climate literacy impact of the exhibit and to increase engagement with its topics through follow-on activities. For example, individuals or classes anywhere in the world can check forecasts from the Polar Weather Research and Forecasting model and compare them to observed conditions, allowing deep investigation into changes in the Arctic. In addition, opportunities exist to adapt the Zoo School experience (affecting several Central Ohio school districts) and/or to enable regular participation through social media such as Facebook, Twitter, and other forms of digital communication. BPRC's sustained engagement with the CZA is an example of a trusted and meaningful partnership in which open dialogue exists about providing the best learning experience for visitors. This presentation will share some of the lessons learned from this unique partnership and the strategies adopted to move it forward.

  3. Reduced-order modeling for hyperthermia control.

    PubMed

    Potocki, J K; Tharp, H S

    1992-12-01

    This paper analyzes the feasibility of using reduced-order modeling techniques in the design of multiple-input, multiple-output (MIMO) hyperthermia temperature controllers. State space thermal models are created based upon a finite difference expansion of the bioheat transfer equation model of a scanned focused ultrasound system (SFUS). These thermal state space models are reduced using the balanced realization technique, and an order reduction criterion is tabulated. Results show that a drastic reduction in model dimension can be achieved using the balanced realization. The reduced-order model is then used to design a reduced-order optimal servomechanism controller for a two-scan input, two thermocouple output tissue model. In addition, a full-order optimal servomechanism controller is designed for comparison and validation purposes. These two controllers are applied to a variety of perturbed tissue thermal models to test the robust nature of the reduced-order controller. A comparison of the two controllers validates the use of open-loop balanced reduced-order models in the design of MIMO hyperthermia controllers.
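    As an illustration of the balanced-realization reduction step described above, the following sketch balances and truncates an arbitrary stable two-input, two-output state-space model using only numpy/scipy. The system matrices are made up for the example and are not the finite-difference bioheat transfer model used in the paper.

```python
# Sketch of balanced-realization model reduction (balanced truncation).
# The 4-state system below is an arbitrary stable example, not the paper's model.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

# Arbitrary stable state-space model: 4 states, 2 inputs, 2 outputs.
A = np.diag([-1.0, -2.0, -20.0, -50.0])
B = np.array([[1.0, 0.0], [0.0, 1.0], [0.1, 0.1], [0.05, 0.02]])
C = np.array([[1.0, 1.0, 0.1, 0.05], [0.0, 1.0, 0.1, 0.02]])

# Controllability and observability Gramians: A Wc + Wc A' + B B' = 0, etc.
Wc = solve_continuous_lyapunov(A, -B @ B.T)
Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

# Balancing transformation built from Cholesky factors of the Gramians.
Lc = cholesky(Wc, lower=True)
Lo = cholesky(Wo, lower=True)
U, hsv, Vt = svd(Lo.T @ Lc)                 # hsv = Hankel singular values
T = Lc @ Vt.T @ np.diag(hsv ** -0.5)
Tinv = np.diag(hsv ** -0.5) @ U.T @ Lo.T

Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T  # balanced realization

# Truncate: keep the r states with the largest Hankel singular values.
r = 2
Ar, Br, Cr = Ab[:r, :r], Bb[:r, :], Cb[:, :r]
print("Hankel singular values:", np.round(hsv, 4))
print("reduced-order A matrix:\n", np.round(Ar, 3))
```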

  4. How Well Can Saliency Models Predict Fixation Selection in Scenes Beyond Central Bias? A New Approach to Model Evaluation Using Generalized Linear Mixed Models.

    PubMed

    Nuthmann, Antje; Einhäuser, Wolfgang; Schütz, Immo

    2017-01-01

    Since the turn of the millennium, a large number of computational models of visual salience have been put forward. How best to evaluate a given model's ability to predict where human observers fixate in images of real-world scenes remains an open research question. Assessing the role of spatial biases is a challenging issue; this is particularly true given the tendency for high-salience items to appear in the image center, combined with the tendency to look straight ahead ("central bias"). The problem is further exacerbated in the context of model comparisons, because some, but not all, models implicitly or explicitly incorporate a center preference to improve performance. To address this and other issues, we propose to combine a priori parcellation of scenes with generalized linear mixed models (GLMM), building upon previous work. With this method, we can explicitly model the central bias of fixation by including a central-bias predictor in the GLMM. A second predictor captures how well the saliency model predicts human fixations, above and beyond the central bias. By-subject and by-item random effects account for individual differences and differences across scene items, respectively. Moreover, we can directly assess whether a given saliency model performs significantly better than others. In this article, we describe the data processing steps required by our analysis approach. In addition, we demonstrate the GLMM analyses by evaluating the performance of different saliency models on a new eye-tracking corpus. To facilitate the application of our method, we make the open-source Python toolbox "GridFix" available.
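    The core idea, one predictor for central bias and one for model saliency, can be sketched in simplified form. The sketch below uses a plain logistic GLM on hypothetical grid-cell data rather than the paper's full GLMM with by-subject and by-item random effects, and none of the variable names come from GridFix.

```python
# Simplified sketch: model the probability that a grid cell is fixated from
# (i) distance to the image center and (ii) mean saliency in the cell.
# A plain logistic GLM is used for brevity; the paper's approach additionally
# includes by-subject and by-scene random effects (a GLMM). Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_cells = 400
df = pd.DataFrame({
    "center_dist": rng.uniform(0, 1, n_cells),   # normalized distance from image center
    "saliency":    rng.uniform(0, 1, n_cells),   # mean model saliency in the grid cell
})
# Simulated fixation indicator: more fixations near the center and at high saliency.
logit = 1.0 - 3.0 * df["center_dist"] + 2.0 * df["saliency"]
df["fixated"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.glm("fixated ~ center_dist + saliency", data=df,
                family=sm.families.Binomial()).fit()
print(model.summary())  # the saliency coefficient is its effect beyond the central bias
```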

  5. A Biomechanical Model of the Scapulothoracic Joint to Accurately Capture Scapular Kinematics during Shoulder Movements

    PubMed Central

    Seth, Ajay; Matias, Ricardo; Veloso, António P.; Delp, Scott L.

    2016-01-01

    The complexity of shoulder mechanics combined with the movement of skin relative to the scapula makes it difficult to measure shoulder kinematics with sufficient accuracy to distinguish between symptomatic and asymptomatic individuals. Multibody skeletal models can improve motion capture accuracy by reducing the space of possible joint movements, and models are used widely to improve measurement of lower limb kinematics. In this study, we developed a rigid-body model of a scapulothoracic joint to describe the kinematics of the scapula relative to the thorax. This model describes scapular kinematics with four degrees of freedom: 1) elevation and 2) abduction of the scapula on an ellipsoidal thoracic surface, 3) upward rotation of the scapula normal to the thoracic surface, and 4) internal rotation of the scapula to lift the medial border of the scapula off the surface of the thorax. The surface dimensions and joint axes can be customized to match an individual’s anthropometry. We compared the model to “gold standard” bone-pin kinematics collected during three shoulder tasks and found modeled scapular kinematics to be accurate to within 2mm root-mean-squared error for individual bone-pin markers across all markers and movement tasks. As an additional test, we added random and systematic noise to the bone-pin marker data and found that the model reduced kinematic variability due to noise by 65% compared to Euler angles computed without the model. Our scapulothoracic joint model can be used for inverse and forward dynamics analyses and to compute joint reaction loads. The computational performance of the scapulothoracic joint model is well suited for real-time applications; it is freely available for use with OpenSim 3.2, and is customizable and usable with other OpenSim models. PMID:26734761
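    The first two degrees of freedom amount to moving a contact point over an ellipsoidal approximation of the thorax. The following is a minimal numerical sketch of that idea, independent of OpenSim and using made-up ellipsoid radii rather than subject-specific dimensions.

```python
# Sketch of the ellipsoidal-surface idea behind the first two scapulothoracic
# degrees of freedom: elevation and abduction move a contact point over an
# ellipsoid approximating the thorax. Radii here are arbitrary placeholders.
import numpy as np

def thorax_point(abduction_rad, elevation_rad, radii=(0.09, 0.12, 0.17)):
    """Return a point on an ellipsoidal thorax surface for given angles."""
    rx, ry, rz = radii  # ellipsoid semi-axes (m), hypothetical values
    x = rx * np.cos(elevation_rad) * np.cos(abduction_rad)
    y = ry * np.cos(elevation_rad) * np.sin(abduction_rad)
    z = rz * np.sin(elevation_rad)
    return np.array([x, y, z])

# Example: sweep abduction at fixed elevation and trace the scapular path.
for abd_deg in (0, 15, 30, 45):
    p = thorax_point(np.radians(abd_deg), np.radians(10))
    print(f"abduction {abd_deg:2d} deg -> point on thorax surface {np.round(p, 3)}")
```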

  6. A Biomechanical Model of the Scapulothoracic Joint to Accurately Capture Scapular Kinematics during Shoulder Movements.

    PubMed

    Seth, Ajay; Matias, Ricardo; Veloso, António P; Delp, Scott L

    2016-01-01

    The complexity of shoulder mechanics combined with the movement of skin relative to the scapula makes it difficult to measure shoulder kinematics with sufficient accuracy to distinguish between symptomatic and asymptomatic individuals. Multibody skeletal models can improve motion capture accuracy by reducing the space of possible joint movements, and models are used widely to improve measurement of lower limb kinematics. In this study, we developed a rigid-body model of a scapulothoracic joint to describe the kinematics of the scapula relative to the thorax. This model describes scapular kinematics with four degrees of freedom: 1) elevation and 2) abduction of the scapula on an ellipsoidal thoracic surface, 3) upward rotation of the scapula normal to the thoracic surface, and 4) internal rotation of the scapula to lift the medial border of the scapula off the surface of the thorax. The surface dimensions and joint axes can be customized to match an individual's anthropometry. We compared the model to "gold standard" bone-pin kinematics collected during three shoulder tasks and found modeled scapular kinematics to be accurate to within 2 mm root-mean-squared error for individual bone-pin markers across all markers and movement tasks. As an additional test, we added random and systematic noise to the bone-pin marker data and found that the model reduced kinematic variability due to noise by 65% compared to Euler angles computed without the model. Our scapulothoracic joint model can be used for inverse and forward dynamics analyses and to compute joint reaction loads. The computational performance of the scapulothoracic joint model is well suited for real-time applications; it is freely available for use with OpenSim 3.2, and is customizable and usable with other OpenSim models.

  7. Preclinical evaluation of the effect of the combined use of the Ethicon Securestrap® Open Absorbable Strap Fixation Device and Ethicon Physiomesh™ Open Flexible Composite Mesh Device on surgeon stress during ventral hernia repair

    PubMed Central

    Sutton, Nadia; MacDonald, Melinda H; Lombard, John; Ilie, Bodgan; Hinoul, Piet; Granger, Douglas A

    2018-01-01

    Aim To evaluate whether performing ventral hernia repairs using the Ethicon Physiomesh™ Open Flexible Composite Mesh Device in conjunction with the Ethicon Securestrap® Open Absorbable Strap Fixation Device reduces surgical time and surgeon stress levels, compared with traditional surgical repair methods. Methods To repair a simulated ventral incisional hernia, two surgeries were performed by eight experienced surgeons using a live porcine model. One procedure involved traditional suture methods and a flat mesh, and the other procedure involved a mechanical fixation device and a skirted flexible composite mesh. A Surgery Task Load Index questionnaire was administered before and after the procedure to establish the surgeons’ perceived stress levels, and saliva samples were collected before, during, and after the surgical procedures to assess the biologically expressed stress (cortisol and salivary alpha amylase) levels. Results For mechanical fixation using the Ethicon Physiomesh Open Flexible Composite Mesh Device in conjunction with the Ethicon Securestrap Open Absorbable Strap Fixation Device, surgeons reported a 46.2% reduction in perceived workload stress. There was also a lower physiological reactivity to the intraoperative experience and the total surgical procedure time was reduced by 60.3%. Conclusions This study provides preliminary findings suggesting that the combined use of a mechanical fixation device and a skirted flexible composite mesh in an open intraperitoneal onlay mesh repair has the potential to reduce surgeon stress. Additional studies are needed to determine whether a reduction in stress is observed in a clinical setting and, if so, confirm that this results in improved clinical outcomes. PMID:29296101

  8. Ring-opening of σ-thienyl and σ-furyl ligands at ditungsten (M=M) centers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chisholm, M.H.; Haubrich, S.T.; Huffman, J.C.

    1997-02-19

    A series of compounds of formula 1,2-M₂(σ-Th)₂(NMe₂)₄, 1, has been prepared, where M = Mo and/or W and Th = 2-thienyl [2-Th], 3-thienyl [3-Th], 5-methyl-2-thienyl [2,5-MeTh], and 2-benzothienyl [2-BTh]. Addition of ᵗBuOH or CF₃Me₂COH to hydrocarbon solutions of 1, where M = W, leads to ring-opened products, 2, when the thienyl ligand is attached via the 2-carbon position. No ring-opening occurs for 3-thienyl derivatives. W₂(OR)₅(μ-CCH₂CHCHS)(σ-2-Th), 2, in which one of the 2-thienyl rings has been opened, has been fully characterized and shown to be derived from a ring-opened μ-vinylidene intermediate W₂(OᵗBu)₄(μ-CCHCHCHS)(σ-2-Th). The compound W₂(σ-2-Fu)₂(NMe₂)₄ was prepared and characterized (2-Fu = 2-furyl) and shown to undergo ring-opening in its reaction with ᵗBuOH to give W₂(OᵗBu)₅(μ-CCH₂CHCHO)(σ-2-Fu), 4, in an analogous manner to the 2-Th complex. The complexes 1 (M = W, 2-Th), 2, 3, and 4 have been characterized by single-crystal X-ray studies. The results described herein are compared to other ring-opening reactions of S-, N-, and O-containing organic heterocyclic compounds as models for the activation of S-, N-, and O-containing fossil fuels in hydrodesulfurization (HDS), hydrodenitrogenation (HDN), and hydrodeoxygenation (HDO) processes. 36 refs., 7 figs., 5 tabs.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dana L. Kelly

    Typical engineering systems in applications with high failure consequences, such as nuclear reactor plants, often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
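    As a generic illustration of the copula sampling idea (not the paper's specific copula family, marginals, or Bayesian parameter estimates), dependent failure times can be drawn by sampling a correlated multivariate normal, mapping it to uniforms, and applying the inverse CDFs of the marginal failure-time distributions. The paper uses R and WinBUGS; the sketch below uses Python for consistency with the other examples in this listing.

```python
# Minimal sketch of simulating dependent failure times with a Gaussian copula.
# The marginal distributions, correlation value, and component count are
# illustrative assumptions, not parameters estimated in the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_samples = 10_000
rho = 0.6                      # copula correlation between the two components
mean_life = 1000.0             # exponential mean time to failure (hours), per component

# 1) Sample from a bivariate normal with the chosen correlation.
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n_samples)

# 2) Transform to uniforms via the standard normal CDF (the Gaussian copula).
u = stats.norm.cdf(z)

# 3) Map uniforms through the inverse CDF of the chosen marginals.
failure_times = stats.expon.ppf(u, scale=mean_life)   # shape: (n_samples, 2)

# Dependence shows up as joint early failures of both components.
both_fail_early = np.mean(np.all(failure_times < 100.0, axis=1))
print(f"P(both components fail before 100 h) ~= {both_fail_early:.4f}")
```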

  10. Development and Implementation of Dynamic Scripts to Execute Cycled WRF/GSI Forecasts

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley; Srikishen, Jayanthi; Berndt, Emily; Li, Quanli; Watson, Leela

    2014-01-01

    Automating the coupling of data assimilation (DA) and modeling systems is a unique challenge in the numerical weather prediction (NWP) research community. In recent years, the Development Testbed Center (DTC) has released well-documented tools such as the Weather Research and Forecasting (WRF) model and the Gridpoint Statistical Interpolation (GSI) DA system that can be easily downloaded, installed, and run by researchers on their local systems. However, developing a coupled system in which the various preprocessing, DA, model, and postprocessing capabilities are all integrated can be labor-intensive if one has little experience with any of these individual systems. Additionally, operational modeling entities generally have specific coupling methodologies that can take time to understand and develop code to implement properly. To better enable collaborating researchers to perform modeling and DA experiments with GSI, the Short-term Prediction Research and Transition (SPoRT) Center has developed a set of Perl scripts that couple GSI and WRF in a cycling methodology consistent with the use of real-time, regional observation data from the National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center (EMC). Because Perl is open source, the code can be easily downloaded and executed regardless of the user's native shell environment. This paper will provide a description of this open-source code and descriptions of a number of the use cases that have been performed by SPoRT collaborators using the scripts on different computing systems.
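    The cycling logic itself is straightforward to sketch: each cycle runs the DA step on the previous forecast (the background), then advances the model to produce the next background. The sketch below is schematic Python with placeholder script names, arguments, and cycle interval; the actual SPoRT implementation is a set of Perl scripts tailored to NCEP/EMC conventions.

```python
# Schematic sketch of a cycled DA/forecast driver: analyze, forecast, repeat.
# Script names, arguments, and the 6-hour cycle interval are placeholders.
from datetime import datetime, timedelta

def run(cmd):
    """Dry-run one workflow step; a real driver would call subprocess.run(cmd, check=True)."""
    print("would run:", " ".join(cmd))

start = datetime(2014, 1, 1, 0)
cycle_hours = 6
background = "wrfinput_d01"                      # first-guess file for the first cycle

for i in range(4):                               # four consecutive analysis cycles
    stamp = (start + timedelta(hours=i * cycle_hours)).strftime("%Y%m%d%H")

    # 1) Data assimilation: GSI updates the background with new observations.
    run(["./run_gsi.sh", "--background", background, "--obs", f"obs_{stamp}.bufr"])

    # 2) Forecast: WRF advances the analysis to the next cycle time.
    run(["./run_wrf.sh", "--analysis", f"analysis_{stamp}.nc",
         "--forecast-hours", str(cycle_hours)])

    # 3) The forecast valid at the next cycle time becomes the next background.
    background = f"wrfout_{stamp}_f{cycle_hours:03d}.nc"
```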

  11. Can Economic Model Transparency Improve Provider Interpretation of Cost-effectiveness Analysis? Evaluating Tradeoffs Presented by the Second Panel on Cost-effectiveness in Health and Medicine.

    PubMed

    Padula, William V; McQueen, Robert Brett; Pronovost, Peter J

    2017-11-01

    The Second Panel on Cost-Effectiveness in Health and Medicine convened on December 7, 2016 at the National Academy of Medicine to disseminate its recommendations for the conduct, methodological practices, and reporting of cost-effectiveness analyses (CEAs). Following its summary, panel proceedings included lengthy discussion of, among other things, the field's struggle to disseminate findings efficiently through the peer-reviewed literature to target audiences. With editors of several medical and outcomes research journals in attendance, there was consensus that the findings of cost-effectiveness analyses do not effectively reach other researchers or health care providers. Audience members suggested several solutions, including providing additional training to clinicians in cost-effectiveness research and requiring that cost-effectiveness models be made publicly available. However, questions remain as to whether making economic modelers' work open access through journals is fair, given the argument that these models remain the modelers' own intellectual property, and whether journals can properly manage the peer-review process specifically for cost-effectiveness analyses. In this article, we elaborate on these issues and provide some suggested solutions that may increase the dissemination and application of the cost-effectiveness literature so that it reaches its intended audiences and ultimately benefits the patient. Ultimately, it is our combined view as economic modelers and clinicians that cost-effectiveness results need to reach the clinician to improve the efficiency of medical practice, but that open-access models do not improve clinician access to, or interpretation of, the economics of medicine.

  12. OpenCOR: a modular and interoperable approach to computational biology

    PubMed Central

    Garny, Alan; Hunter, Peter J.

    2015-01-01

    Computational biologists have been developing standards and formats for nearly two decades, with the aim of easing the description and exchange of experimental data, mathematical models, simulation experiments, etc. One of those efforts is CellML (cellml.org), an XML-based markup language for the encoding of mathematical models. Early CellML-based environments include COR and OpenCell. However, both of those tools have limitations and were eventually replaced with OpenCOR (opencor.ws). OpenCOR is an open-source modeling environment that is supported on Windows, Linux, and OS X. It relies on a modular approach, which means that all of its features come in the form of plugins. Those plugins can be used to organize, edit, simulate, and analyze models encoded in the CellML format. We start with an introduction to CellML and two of its early adopters, whose limitations eventually led to the development of OpenCOR. We then go on to describe the general philosophy behind OpenCOR, as well as its openness and its development process. Next, we illustrate various aspects of OpenCOR, such as its user interface and some of the plugins that come bundled with it (e.g., its editing and simulation plugins). Finally, we discuss some of the advantages and limitations of OpenCOR before drawing some concluding remarks. PMID:25705192

  13. Low Openness on the Revised NEO Personality Inventory as a Risk Factor for Treatment-Resistant Depression

    PubMed Central

    Takahashi, Michio; Shirayama, Yukihiko; Muneoka, Katsumasa; Suzuki, Masatoshi; Sato, Koichi; Hashimoto, Kenji

    2013-01-01

    Background Recently, we reported that low reward dependence, and to a lesser extent, low cooperativeness in the Temperament and Character Inventory (TCI) may be risk factors for treatment-resistant depression. Here, we analyzed additional psychological traits in these patients. Methods We administered Costa and McCrae's five-factor model personality inventory, the NEO Personality Inventory-Revised (NEO-PI-R), to antidepressant-treatment-resistant depressed patients (n = 35), remitted depressed patients (n = 27), and healthy controls (n = 66). We also evaluated the relationships between scores on the NEO and TCI, using the same cohort of patients with treatment-resistant depression as our previous study. Results Patients with treatment-resistant depression showed high scores for neuroticism and low scores for extraversion, openness, and conscientiousness, without changes in agreeableness, on the NEO. However, patients with remitted depression showed no significant differences on the NEO. In patients with treatment-resistant depression, low openness on the NEO showed positive relationships with reward dependence and cooperativeness on the TCI. Conclusions Many studies have reported that depressed patients show high neuroticism, low extraversion, and low conscientiousness on the NEO. Our study highlights low openness on the NEO as a risk mediator in treatment-resistant depression. This newly identified trait should be included as a risk factor in treatment-resistant depression. PMID:24019864

  14. Low openness on the revised NEO personality inventory as a risk factor for treatment-resistant depression.

    PubMed

    Takahashi, Michio; Shirayama, Yukihiko; Muneoka, Katsumasa; Suzuki, Masatoshi; Sato, Koichi; Hashimoto, Kenji

    2013-01-01

    Recently, we reported that low reward dependence, and to a lesser extent, low cooperativeness in the Temperament and Character Inventory (TCI) may be risk factors for treatment-resistant depression. Here, we analyzed additional psychological traits in these patients. We administered Costa and McCrae's five-factor model personality inventory, the NEO Personality Inventory-Revised (NEO-PI-R), to antidepressant-treatment-resistant depressed patients (n=35), remitted depressed patients (n=27), and healthy controls (n=66). We also evaluated the relationships between scores on the NEO and TCI, using the same cohort of patients with treatment-resistant depression as our previous study. Patients with treatment-resistant depression showed high scores for neuroticism and low scores for extraversion, openness, and conscientiousness, without changes in agreeableness, on the NEO. However, patients with remitted depression showed no significant differences on the NEO. In patients with treatment-resistant depression, low openness on the NEO showed positive relationships with reward dependence and cooperativeness on the TCI. Many studies have reported that depressed patients show high neuroticism, low extraversion, and low conscientiousness on the NEO. Our study highlights low openness on the NEO as a risk mediator in treatment-resistant depression. This newly identified trait should be included as a risk factor in treatment-resistant depression.

  15. Early deprivation increases high-leaning behavior, a novel anxiety-like behavior, in the open field test in rats.

    PubMed

    Kuniishi, Hiroshi; Ichisaka, Satoshi; Yamamoto, Miki; Ikubo, Natsuko; Matsuda, Sae; Futora, Eri; Harada, Riho; Ishihara, Kohei; Hata, Yoshio

    2017-10-01

    The open field test is one of the most popular ethological tests to assess anxiety-like behavior in rodents. In the present study, we examined the effect of early deprivation (ED), a model of early life stress, on anxiety-like behavior in rats. In ED animals, we failed to find significant changes in the time spent in the center or thigmotaxis area of the open field, the common indexes of anxiety-like behavior. However, we found a significant increase in high-leaning behavior in which animals lean against the wall standing on their hindlimbs while touching the wall with their forepaws at a high position. The high-leaning behavior was decreased by treatment with an anxiolytic, diazepam, and it was increased under intense illumination as observed in the center activity. In addition, we compared the high-leaning behavior and center activity under various illumination intensities and found that the high-leaning behavior is more sensitive to illumination intensity than the center activity in the particular illumination range. These results suggest that the high-leaning behavior is a novel anxiety-like behavior in the open field test that can complement the center activity to assess the anxiety state of rats. Copyright © 2017 Elsevier Ireland Ltd and Japan Neuroscience Society. All rights reserved.
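    For readers unfamiliar with the conventional open-field indexes mentioned above, the center-time measure reduces to a simple computation over tracked positions. The sketch below uses a hypothetical square arena, zone definition, and random trajectory, not the study's apparatus or data.

```python
# Sketch of computing a common open-field index: fraction of time spent in the
# center zone, from tracked (x, y) positions. Arena size, zone definition, and
# the random trajectory are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(1)
arena = 100.0                      # square arena side length (cm), hypothetical
center_margin = 25.0               # center zone excludes a 25 cm band along each wall
xy = rng.uniform(0, arena, size=(3000, 2))   # fake 5-min track sampled at 10 Hz

in_center = np.all((xy > center_margin) & (xy < arena - center_margin), axis=1)
print(f"fraction of samples in center zone: {in_center.mean():.2f}")
```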

  16. The OpenPicoAmp: an open-source planar lipid bilayer amplifier for hands-on learning of neuroscience.

    PubMed

    Shlyonsky, Vadim; Dupuis, Freddy; Gall, David

    2014-01-01

    Understanding the electrical biophysical properties of the cell membrane can be difficult for neuroscience students when teaching relies solely on lectures about theoretical models without practical hands-on experiments. To address this issue, we developed an open-source lipid bilayer amplifier, the OpenPicoAmp, which is appropriate for use in introductory undergraduate courses in biophysics or neuroscience dealing with the electrical properties of the cell membrane. The amplifier is designed using the common lithographic printed circuit board fabrication process and off-the-shelf electronic components. In addition, we propose a specific design for experimental chambers allowing the insertion of a commercially available polytetrafluoroethylene film. We provide complete documentation for building the amplifier and the experimental chamber. A student hand-out giving step-by-step instructions for performing a recording is also included. Our experimental setup can be used in basic experiments in which students monitor bilayer formation by capacitance measurement and record unitary currents produced by ion channels such as gramicidin A dimers. Used in combination with a low-cost data acquisition board, this system provides a complete solution for hands-on lessons, thereby improving the effectiveness of teaching basic neuroscience or biophysics.
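    The capacitance monitoring mentioned above is commonly done by applying a triangular voltage command and reading the resulting square-wave capacitive current; the relation below is the standard textbook one, not a formula taken from the OpenPicoAmp documentation.

```latex
% Bilayer capacitance from the current response to a triangular voltage command.
\[
  I(t) = C\,\frac{dV}{dt}
  \qquad\Longrightarrow\qquad
  C = \frac{I}{\lvert dV/dt \rvert}.
\]
% Example: a triangle wave of +/-10 mV at 100 Hz has |dV/dt| = 4 V/s, so a
% 400 pA square-wave current corresponds to C = 100 pF.
```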

  17. The OpenPicoAmp: An Open-Source Planar Lipid Bilayer Amplifier for Hands-On Learning of Neuroscience

    PubMed Central

    Shlyonsky, Vadim; Dupuis, Freddy; Gall, David

    2014-01-01

    Understanding the electrical biophysical properties of the cell membrane can be difficult for neuroscience students when teaching relies solely on lectures about theoretical models without practical hands-on experiments. To address this issue, we developed an open-source lipid bilayer amplifier, the OpenPicoAmp, which is appropriate for use in introductory undergraduate courses in biophysics or neuroscience dealing with the electrical properties of the cell membrane. The amplifier is designed using the common lithographic printed circuit board fabrication process and off-the-shelf electronic components. In addition, we propose a specific design for experimental chambers allowing the insertion of a commercially available polytetrafluoroethylene film. We provide complete documentation for building the amplifier and the experimental chamber. A student hand-out giving step-by-step instructions for performing a recording is also included. Our experimental setup can be used in basic experiments in which students monitor bilayer formation by capacitance measurement and record unitary currents produced by ion channels such as gramicidin A dimers. Used in combination with a low-cost data acquisition board, this system provides a complete solution for hands-on lessons, thereby improving the effectiveness of teaching basic neuroscience or biophysics. PMID:25251830

  18. Article processing charges, funding, and open access publishing at Journal of Experimental & Clinical Assisted Reproduction.

    PubMed

    Sills, Eric Scott; Vincent, Tina Thibault; Palermo, Gianpiero D

    2005-01-13

    Journal of Experimental & Clinical Assisted Reproduction is an Open Access, online, electronic journal published by BioMed Central with full contents available to the scientific and medical community free of charge to all readers. Authors maintain the copyright to their own work, a policy facilitating dissemination of data to the widest possible audience without requiring permission from the publisher. This Open Access publishing model is subsidized by authors (or their institutions/funding agencies) in the form of a single £330 article processing charge (APC), due at the time of manuscript acceptance for publication. Payment of the APC is not a condition for formal peer review and does not apply to articles rejected after review. Additionally, this fee is waived for authors whose institutions are BioMed Central members or where genuine financial hardship exists. Considering ordinary publication fees related to page charges and reprints, the APC at Journal of Experimental & Clinical Assisted Reproduction is comparable to costs associated with publishing in some traditional print journals, and is less expensive than many. Implementation of the APC within this Open Access framework is envisioned as a modern research-friendly policy that supports networking among investigators, brings new research into reach rapidly, and empowers authors with greater control over their own scholarly publications.

  19. Open Energy Info (OpenEI) (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2010-12-01

    The Open Energy Information (OpenEI.org) initiative is a free, open-source, knowledge-sharing platform. OpenEI was created to provide access to data, models, tools, and information that accelerate the transition to clean energy systems through informed decisions.

  20. 12 CFR Appendix G to Part 226 - Open-End Model Forms and Clauses

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    12 CFR (Banks and Banking), 2014 edition: Federal Reserve System (Continued), Truth in Lending (Regulation Z), Part 226, Appendix G, Open-End Model Forms and Clauses. The appendix provides model forms and clauses for open-end credit disclosures, including card accounts under an open-end (not home-secured) consumer credit plan (model clause beginning "Interest will be charged to...").
