Sample records for current modeling approaches

  1. A partial Hamiltonian approach for current value Hamiltonian systems

    NASA Astrophysics Data System (ADS)

    Naz, R.; Mahomed, F. M.; Chaudhry, Azam

    2014-10-01

    We develop a partial Hamiltonian framework to obtain reductions and closed-form solutions via first integrals of current value Hamiltonian systems of ordinary differential equations (ODEs). The approach is algorithmic and applies to systems with many state and costate variables, although here we apply it to models with one control, one state and one costate variable to illustrate its effectiveness. Current value Hamiltonian systems arise in economic growth theory and other economic models. We explain our approach with the help of a simple illustrative example and then apply it to two widely used economic growth models: the Ramsey model with a constant relative risk aversion (CRRA) utility function and Cobb-Douglas technology, and a one-sector AK model of endogenous growth. We show that our newly developed systematic approach can be used to deduce results given in the literature and also to find new solutions.
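
    For reference, a minimal sketch in generic notation (not the authors' specific formulation) of the current value Hamiltonian setup the paper builds on:

    ```latex
    % Discounted problem:  max \int_0^\infty e^{-\rho t} L(x,u)\,dt
    % subject to \dot{x} = f(x,u).  Current value Hamiltonian and the
    % associated necessary conditions:
    H(x,u,\lambda) = L(x,u) + \lambda f(x,u), \qquad
    \frac{\partial H}{\partial u} = 0, \qquad
    \dot{\lambda} = \rho\lambda - \frac{\partial H}{\partial x}, \qquad
    \dot{x} = \frac{\partial H}{\partial \lambda}.
    ```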

  2. Distinguishing Continuous and Discrete Approaches to Multilevel Mixture IRT Models: A Model Comparison Perspective

    ERIC Educational Resources Information Center

    Zhu, Xiaoshu

    2013-01-01

    The current study introduced a general modeling framework, multilevel mixture IRT (MMIRT), which detects and describes characteristics of population heterogeneity while accommodating the hierarchical data structure. In addition to introducing both continuous and discrete approaches to MMIRT, the main focus of the current study was to distinguish…

  3. A Sub-filter Scale Noise Equation for Hybrid LES Simulations

    NASA Technical Reports Server (NTRS)

    Goldstein, Marvin E.

    2006-01-01

    Hybrid LES/subscale modeling approaches have an important advantage over current noise prediction methods in that they only involve modeling of the relatively universal subscale motion and not the configuration-dependent larger scale turbulence. Previous hybrid approaches use approximate statistical techniques or extrapolation methods to obtain the requisite information about the sub-filter scale motion. An alternative approach would be to adopt the modeling techniques used in current noise prediction methods and determine the unknown stresses from experimental data. The present paper derives an equation for predicting the subscale sound from information that can be obtained with currently available experimental procedures. The resulting prediction method would then be intermediate between current noise prediction codes and previously proposed hybrid techniques.

  4. Limitations Of The Current State Space Modelling Approach In Multistage Machining Processes Due To Operation Variations

    NASA Astrophysics Data System (ADS)

    Abellán-Nebot, J. V.; Liu, J.; Romero, F.

    2009-11-01

    The State Space modelling approach has been recently proposed as an engineering-driven technique for part quality prediction in Multistage Machining Processes (MMP). Current State Space models incorporate fixture and datum variations in the multi-stage variation propagation, without explicitly considering common operation variations such as machine-tool thermal distortions, cutting-tool wear, cutting-tool deflections, etc. This paper shows the limitations of the current State Space model through an experimental case study where the effects of spindle thermal expansion, cutting-tool flank wear and locator errors are introduced. The paper also discusses the extension of the current State Space model to include operation variations and its potential benefits.

  5. A Survey of Model Evaluation Approaches with a Tutorial on Hierarchical Bayesian Methods

    ERIC Educational Resources Information Center

    Shiffrin, Richard M.; Lee, Michael D.; Kim, Woojae; Wagenmakers, Eric-Jan

    2008-01-01

    This article reviews current methods for evaluating models in the cognitive sciences, including theoretically based approaches, such as Bayes factors and minimum description length measures; simulation approaches, including model mimicry evaluations; and practical approaches, such as validation and generalization measures. This article argues…

  6. Investigation of tDCS volume conduction effects in a highly realistic head model

    NASA Astrophysics Data System (ADS)

    Wagner, S.; Rampersad, S. M.; Aydin, Ü.; Vorwerk, J.; Oostendorp, T. F.; Neuling, T.; Herrmann, C. S.; Stegeman, D. F.; Wolters, C. H.

    2014-02-01

    Objective. We investigate volume conduction effects in transcranial direct current stimulation (tDCS) and present a guideline for efficient and yet accurate volume conductor modeling in tDCS using our newly developed finite element (FE) approach. Approach. We developed a new, accurate and fast isoparametric FE approach for high-resolution geometry-adapted hexahedral meshes and tissue anisotropy. To attain a deeper insight into tDCS, we performed computer simulations, starting with a homogenized three-compartment head model and extending this step by step to a six-compartment anisotropic model. Main results. We are able to demonstrate important tDCS effects. First, we find channeling effects of the skin, the skull spongiosa and the cerebrospinal fluid compartments. Second, current vectors tend to be oriented towards the closest higher conducting region. Third, anisotropic WM conductivity causes current flow in directions more parallel to the WM fiber tracts. Fourth, the highest cortical current magnitudes are not only found close to the stimulation sites. Fifth, the median brain current density decreases with increasing distance from the electrodes. Significance. Our results allow us to formulate a guideline for volume conductor modeling in tDCS. We recommend accurately modeling the major tissues between the stimulating electrodes and the target areas; for efficient yet accurate modeling, an exact representation of other tissues is less important. Because the quasi-static approach is justified in the low-frequency regime of electrophysiology, our results should also be valid for at least low-frequency (e.g., below 100 Hz) transcranial alternating current stimulation.
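
    For context, a generic statement of the quasi-static volume conduction problem that such FE approaches discretize (this is the standard form, not the authors' specific isoparametric formulation):

    ```latex
    % Potential \varphi in the head volume \Omega with conductivity tensor \sigma;
    % the injected current enters through the electrode surfaces \Gamma_e and
    % no current crosses the rest of the boundary:
    \nabla\cdot(\sigma\nabla\varphi) = 0 \ \text{in}\ \Omega, \qquad
    \sigma\,\frac{\partial\varphi}{\partial n} = j_e \ \text{on}\ \Gamma_e, \qquad
    \sigma\,\frac{\partial\varphi}{\partial n} = 0 \ \text{on}\ \partial\Omega\setminus\Gamma_e
    ```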

  7. State Higher Education Funding Models: An Assessment of Current and Emerging Approaches

    ERIC Educational Resources Information Center

    Layzell, Daniel T.

    2007-01-01

    This article provides an assessment of the current and emerging approaches used by state governments in allocating funding for higher education institutions and programs. It reviews a number of desired characteristics or outcomes for state higher education funding models, including equity, adequacy, stability, and flexibility. Although there is…

  8. An Overview of the Effectiveness of Adolescent Substance Abuse Treatment Models.

    ERIC Educational Resources Information Center

    Muck, Randolph; Zempolich, Kristin A.; Titus, Janet C.; Fishman, Marc; Godley, Mark D.; Schwebel, Robert

    2001-01-01

    Describes current approaches to adolescent substance abuse treatment, including the 12-step treatment approach, behavioral treatment approach, family-based treatment approach, and therapeutic community approach. Summarizes research that assesses the effectiveness of these models, offering findings from the Center for Substance Abuse Treatment's…

  9. The induced electric field due to a current transient

    NASA Astrophysics Data System (ADS)

    Beck, Y.; Braunstein, A.; Frankental, S.

    2007-05-01

    Calculations and measurements of the electric fields, induced by a lightning strike, are important for understanding the phenomenon and developing effective protection systems. In this paper, a novel approach to the calculation of the electric fields due to lightning strikes, using a relativistic approach, is presented. This approach is based on a known current wave-pair model, representing the lightning current wave. The model presented is one that describes the lightning current wave, either at the first stage of the descending charge wave from the cloud or at the later stage of the return stroke. The electric fields computed are cylindrically symmetric. A simplified method for the calculation of the electric field is achieved by using special relativity theory and relativistic considerations. The proposed approach, described in this paper, is based on simple expressions (by applying Coulomb's law) compared with the much more complicated partial differential equations based on Maxwell's equations. A straightforward method of calculating the electric field due to a lightning strike, modelled as a negative-positive (NP) wave-pair, is determined by using special relativity theory to calculate the 'velocity field' and relativistic concepts to calculate the 'acceleration field'. These fields are the basic elements required for calculating the total field resulting from the current wave-pair model. Moreover, a modified, simpler method using sub-models is presented. The sub-models are filaments of either static charges or charges at constant velocity only. Combining these simple sub-models yields the total wave-pair model. The results fully agree with those obtained by solving Maxwell's equations for the discussed problem.
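
    The "velocity field" invoked above has a standard closed form for a charge in uniform motion, sketched here in generic notation (the wave-pair model superposes such contributions):

    ```latex
    % Field of a point charge q moving at constant velocity \beta c, evaluated
    % at distance R from the charge's present position; \theta is the angle
    % between the velocity and \hat{\mathbf{R}}:
    \mathbf{E} = \frac{q}{4\pi\varepsilon_0}\,
      \frac{1-\beta^2}{\left(1-\beta^2\sin^2\theta\right)^{3/2}}\,
      \frac{\hat{\mathbf{R}}}{R^2}
    ```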

  10. Approaches to quantifying long-term continental shelf sediment transport with an example from the Northern California STRESS mid-shelf site

    NASA Astrophysics Data System (ADS)

    Harris, Courtney K.; Wiberg, Patricia L.

    1997-09-01

    Modeling shelf sediment transport rates and bed reworking depths is problematic when the wave and current forcing conditions are not precisely known, as is usually the case when long-term sedimentation patterns are of interest. Two approaches to modeling sediment transport under such circumstances are considered. The first relies on measured or simulated time series of flow conditions to drive model calculations. The second approach uses as model input probability distribution functions of bottom boundary layer flow conditions developed from wave and current measurements. Sediment transport rates, frequency of bed resuspension by waves and currents, and bed reworking calculated using the two methods are compared at the mid-shelf STRESS (Sediment TRansport on Shelves and Slopes) site on the northern California continental shelf. Current, wave and resuspension measurements at the site are used to generate model inputs and test model results. An 11-year record of bottom wave orbital velocity, calculated from surface wave spectra measured by the National Data Buoy Center (NDBC) Buoy 46013 and verified against bottom tripod measurements, is used to characterize the frequency and duration of wave-driven transport events and to estimate the joint probability distribution of wave orbital velocity and period. A 109-day record of hourly current measurements 10 m above bottom is used to estimate the probability distribution of bottom boundary layer current velocity at this site and to develop an auto-regressive model to simulate current velocities for times when direct measurements of currents are not available. Frequency of transport, the maximum volume of suspended sediment, and average flux calculated using measured wave and simulated current time series agree well with values calculated using measured time series. A probabilistic approach is more amenable to calculations over time scales longer than existing wave records, but it tends to underestimate net transport because it does not capture the episodic nature of transport events. Both methods enable estimates to be made of the uncertainty in transport quantities that arise from an incomplete knowledge of the specific timing of wave and current conditions.
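
    A minimal sketch of the simulated-series ingredient, assuming a first-order autoregressive (AR(1)) model fitted to a measured near-bottom current record; the AR order, names and values are illustrative, not taken from the paper:

    ```python
    import numpy as np

    def fit_ar1(u):
        """Least-squares fit of u[t] = c + phi*u[t-1] + eps, eps ~ N(0, sigma^2)."""
        phi, c = np.polyfit(u[:-1], u[1:], 1)        # slope, intercept
        sigma = np.std(u[1:] - (c + phi * u[:-1]))   # residual standard deviation
        return c, phi, sigma

    def simulate_ar1(c, phi, sigma, n, u0, rng):
        """Generate a synthetic current series for times without measurements."""
        u = np.empty(n)
        u[0] = u0
        for t in range(1, n):
            u[t] = c + phi * u[t - 1] + rng.normal(0.0, sigma)
        return u

    rng = np.random.default_rng(0)
    measured = 0.1 + 0.05 * rng.standard_normal(109 * 24)  # stand-in for the 109-day hourly record
    c, phi, sigma = fit_ar1(measured)
    simulated = simulate_ar1(c, phi, sigma, n=11 * 365 * 24, u0=measured[-1], rng=rng)
    ```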

  11. Positive feedback: exploring current approaches in iterative travel demand model implementation.

    DOT National Transportation Integrated Search

    2012-01-01

    Currently, the models that TxDOT's Transportation Planning and Programming Division (TPP) developed are traditional three-step models (i.e., trip generation, trip distribution, and traffic assignment) that are sequentially applied. A limitation...

  12. Electroencephalography (EEG) forward modeling via H(div) finite element sources with focal interpolation.

    PubMed

    Pursiainen, S; Vorwerk, J; Wolters, C H

    2016-12-21

    The goal of this study is to develop focal, accurate and robust finite element method (FEM) based approaches which can predict the electric potential on the surface of the computational domain given its structure and internal primary source current distribution. In an EEG evaluation, placing the source currents in the geometrically complex grey matter compartment is a challenging but necessary task for avoiding forward errors attributable to tissue conductivity jumps. Here, this task is approached via a mathematically rigorous formulation, in which the current field is modeled via divergence conforming H(div) basis functions. Both linear and quadratic functions are used while the potential field is discretized via the standard linear Lagrangian (nodal) basis. The resulting model includes dipolar sources which are interpolated into a random set of positions and orientations utilizing two alternative approaches: the position-based optimization (PBO) and the mean position/orientation (MPO) method. These results demonstrate that the present dipolar approach can reach or even surpass, at least in some respects, the accuracy of two classical reference methods, the partial integration (PI) and St. Venant (SV) approach which utilize monopolar loads instead of dipolar currents.
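
    The forward problem that these H(div) source discretizations address takes the standard form below (generic notation, independent of the PI/SV/PBO/MPO specifics):

    ```latex
    % EEG forward problem: potential u generated by a primary source current
    % density \mathbf{j}^p in the head domain \Omega with conductivity \sigma:
    \nabla\cdot(\sigma\nabla u) = \nabla\cdot\mathbf{j}^p \ \text{in}\ \Omega, \qquad
    \sigma\,\frac{\partial u}{\partial n} = 0 \ \text{on}\ \partial\Omega
    ```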

  13. An overview of quantitative approaches in Gestalt perception.

    PubMed

    Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H

    2016-09-01

    Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory.

  14. A quantum wave based compact modeling approach for the current in ultra-short DG MOSFETs suitable for rapid multi-scale simulations

    NASA Astrophysics Data System (ADS)

    Hosenfeld, Fabian; Horst, Fabian; Iñíguez, Benjamín; Lime, François; Kloes, Alexander

    2017-11-01

    Source-to-drain (SD) tunneling decreases the device performance in MOSFETs falling below the 10 nm channel length. Modeling quantum mechanical effects including SD tunneling has gained importance, especially for compact model developers. The non-equilibrium Green's function (NEGF) method has become a state-of-the-art approach for nano-scaled device simulation in recent years. In the sense of a multi-scale simulation approach, it is necessary to bridge the gap between compact models, with their fast and efficient calculation of the device current, and numerical device models, which consider quantum effects of nano-scaled devices. In this work, an NEGF-based analytical model for nano-scaled double-gate (DG) MOSFETs is introduced. The model consists of a closed-form potential solution of a classical compact model and a 1D NEGF formalism for calculating the device current, taking into account quantum mechanical effects. The potential calculation omits the iterative coupling and allows a straightforward current calculation. The model is based on a ballistic NEGF approach whereby backscattering is considered as a second-order effect in closed form. The accuracy and scalability of the non-iterative DG MOSFET model are inspected in comparison with numerical NanoMOS TCAD data for various channel lengths. With the help of this model, investigations of short-channel and temperature effects are performed.
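
    In ballistic NEGF treatments of this kind, the device current is commonly written in the Landauer form below; this is the generic textbook expression, not the authors' closed-form compact model:

    ```latex
    % Ballistic current from the transmission T(E) of the 1D NEGF solution;
    % f_S and f_D are the source and drain Fermi functions, q the elementary
    % charge and h Planck's constant (the factor 2 accounts for spin degeneracy):
    I = \frac{2q}{h}\int T(E)\,\bigl[f_S(E) - f_D(E)\bigr]\,dE
    ```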

  15. Towards Current Profile Control in ITER: Potential Approaches and Research Needs

    NASA Astrophysics Data System (ADS)

    Schuster, E.; Barton, J. E.; Wehner, W. P.

    2014-10-01

    Many challenging plasma control problems still need to be addressed in order for the ITER Plasma Control System (PCS) to be able to successfully achieve the ITER project goals. For instance, setting up a suitable toroidal current density profile is key for one possible advanced scenario characterized by noninductive sustainment of the plasma current and steady-state operation. The nonlinearity and high dimensionality exhibited by the plasma demand a model-based current-profile control synthesis procedure that can accommodate this complexity through embedding the known physics within the design. The development of a model capturing the dynamics of the plasma relevant for control design enables not only the design of feedback controllers for regulation or tracking but also the design of optimal feedforward controllers for a systematic model-based approach to scenario planning, the design of state estimators for a reliable real-time reconstruction of the plasma internal profiles based on limited and noisy diagnostics, and the development of a fast predictive simulation code for closed-loop performance evaluation before implementation. Progress towards control-oriented modeling of the current profile evolution and associated control design has been reported following both data-driven and first-principles-driven approaches. An overview of these two approaches will be provided, as well as a discussion on research needs associated with each one of the model applications described above. Supported by the US Department of Energy under DE-SC0001334 and DE-SC0010661.

  16. Next generation system modeling of NTR systems

    NASA Technical Reports Server (NTRS)

    Buksa, John J.; Rider, William J.

    1993-01-01

    The topics are presented in viewgraph form and include the following: nuclear thermal rocket (NTR) modeling challenges; current approaches; shortcomings of current analysis method; future needs; and present steps to these goals.

  17. Mathematical modeling and computational prediction of cancer drug resistance.

    PubMed

    Sun, Xiaoqiang; Hu, Bin

    2017-06-23

    Diverse forms of resistance to anticancer drugs can lead to the failure of chemotherapy. Drug resistance is one of the most intractable issues for successfully treating cancer in current clinical practice. Effective clinical approaches that could counter drug resistance by restoring the sensitivity of tumors to the targeted agents are urgently needed. As numerous experimental results on resistance mechanisms have been obtained and a mass of high-throughput data has been accumulated, mathematical modeling and computational predictions using systematic and quantitative approaches have become increasingly important, as they can potentially provide deeper insights into resistance mechanisms, generate novel hypotheses or suggest promising treatment strategies for future testing. In this review, we first briefly summarize the current progress of experimentally revealed resistance mechanisms of targeted therapy, including genetic mechanisms, epigenetic mechanisms, posttranslational mechanisms, cellular mechanisms, microenvironmental mechanisms and pharmacokinetic mechanisms. Subsequently, we list several currently available databases and Web-based tools related to drug sensitivity and resistance. Then, we focus primarily on introducing some state-of-the-art computational methods used in drug resistance studies, including mechanism-based mathematical modeling approaches (e.g. molecular dynamics simulation, kinetic model of molecular networks, ordinary differential equation model of cellular dynamics, stochastic model, partial differential equation model, agent-based model, pharmacokinetic-pharmacodynamic model, etc.) and data-driven prediction methods (e.g. omics data-based conventional screening approach for node biomarkers, static network approach for edge biomarkers and module biomarkers, dynamic network approach for dynamic network biomarkers and dynamic module network biomarkers, etc.). Finally, we discuss several further questions and future directions for the use of computational methods for studying drug resistance, including inferring drug-induced signaling networks, multiscale modeling, drug combinations and precision medicine.
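
    As a minimal illustration of the "ordinary differential equation model of cellular dynamics" category listed above, here is a hedged sketch of a two-population (drug-sensitive/resistant) logistic model; the structure and all parameter values are hypothetical, not taken from the review:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def tumor(t, y, r_s, r_r, K, d):
        """Sensitive (S) and resistant (R) cells share a carrying capacity K;
        the drug adds an extra death rate d to sensitive cells only."""
        S, R = y
        crowding = 1.0 - (S + R) / K
        return [r_s * S * crowding - d * S,
                r_r * R * crowding]

    # treatment selects for the initially rare resistant clone
    sol = solve_ivp(tumor, (0.0, 120.0), [1e6, 1e3],
                    args=(0.05, 0.04, 1e9, 0.08), dense_output=True)
    t = np.linspace(0.0, 120.0, 200)
    S, R = sol.sol(t)
    print(f"resistant fraction at day 120: {R[-1] / (S[-1] + R[-1]):.2f}")
    ```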

  18. Systems and context modeling approach to requirements analysis

    NASA Astrophysics Data System (ADS)

    Ahuja, Amrit; Muralikrishna, G.; Patwari, Puneet; Subhrojyoti, C.; Swaminathan, N.; Vin, Harrick

    2014-08-01

    Ensuring completeness and correctness of the requirements for a complex system such as the SKA is challenging. Current system engineering practice includes developing a stakeholder needs definition, a concept of operations, and defining system requirements in terms of use cases and requirements statements. We present a method that enhances this current practice into a collection of system models with mutual consistency relationships. These include stakeholder goals, needs definition and system-of-interest models, together with a context model that participates in the consistency relationships among these models. We illustrate this approach by using it to analyze the SKA system requirements.

  19. 3-D time-domain induced polarization tomography: a new approach based on a source current density formulation

    NASA Astrophysics Data System (ADS)

    Soueid Ahmed, A.; Revil, A.

    2018-04-01

    Induced polarization (IP) of porous rocks can be associated with a secondary source current density, which is proportional to both the intrinsic chargeability and the primary (applied) current density. This gives the possibility of reformulating the time domain induced polarization (TDIP) problem as a time-dependent self-potential-type problem. This new approach implies a change of strategy regarding data acquisition and inversion, allowing major time savings for both. For inverting TDIP data, we first retrieve the electrical resistivity distribution. Then, we use this electrical resistivity distribution to reconstruct the primary current density during the injection/retrieval of the (primary) current between the current electrodes A and B. The time-lapse secondary source current density distribution is determined given the primary source current density and a distribution of chargeability (forward modelling step). The inverse problem is linear between the secondary voltages (measured at all the electrodes) and the computed secondary source current density. A kernel matrix relating the observed secondary voltage data to the source current density model is computed once (using the electrical conductivity distribution), and then used throughout the inversion process. This recovered source current density model is in turn used to estimate the time-dependent chargeability (normalized voltages) in each cell of the domain of interest. Assuming a Cole-Cole model for simplicity, we can reconstruct the 3-D distributions of the relaxation time τ and the Cole-Cole exponent c by fitting the intrinsic chargeability decay curve to a Cole-Cole relaxation model for each cell. Two simple cases are studied in detail to explain this new approach. In the first case, we estimate the Cole-Cole parameters as well as the source current density field from a synthetic TDIP data set. Our approach is successfully able to reveal the presence of the anomaly and to invert its Cole-Cole parameters. In the second case, we perform a laboratory sandbox experiment in which we mix a volume of burning coal and sand. The algorithm is able to localize the burning coal both in terms of electrical conductivity and chargeability.
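
    For reference, the Cole-Cole model mentioned above is commonly written as the complex resistivity below (the standard Pelton form; m is the chargeability, τ the relaxation time, c the Cole-Cole exponent):

    ```latex
    \rho(\omega) = \rho_0\left[1 - m\left(1 - \frac{1}{1 + (i\omega\tau)^c}\right)\right]
    ```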

  20. Modeling Latent Interactions at Level 2 in Multilevel Structural Equation Models: An Evaluation of Mean-Centered and Residual-Centered Unconstrained Approaches

    ERIC Educational Resources Information Center

    Leite, Walter L.; Zuo, Youzhen

    2011-01-01

    Among the many methods currently available for estimating latent variable interactions, the unconstrained approach is attractive to applied researchers because of its relatively easy implementation with any structural equation modeling (SEM) software. Using a Monte Carlo simulation study, we extended and evaluated the unconstrained approach to…

  1. Modeling energy/economy interactions for conservation and renewable energy-policy analysis

    NASA Astrophysics Data System (ADS)

    Groncki, P. J.

    Energy policy and the implications for policy analysis and the methodological tools are discussed. The evolution of one methodological approach and the combined modeling system of the component models, their evolution in response to changing analytic needs, and the development of the integrated framework are reported. The analyses performed over the past several years are summarized. The current philosophy behind energy policy is discussed and compared to recent history. Implications for current policy analysis and methodological approaches are drawn.

  2. Development of a time-dependent hurricane evacuation model for the New Orleans area : research project capsule.

    DOT National Transportation Integrated Search

    2008-08-01

    Current hurricane evacuation transportation modeling uses an approach fashioned after the : traditional four-step procedure applied in urban transportation planning. One of the limiting : features of this approach is that it models traffic in a stati...

  3. Efficient non-hydrostatic modelling of 3D wave-induced currents using a subgrid approach

    NASA Astrophysics Data System (ADS)

    Rijnsdorp, Dirk P.; Smit, Pieter B.; Zijlema, Marcel; Reniers, Ad J. H. M.

    2017-08-01

    Wave-induced currents are a ubiquitous feature in coastal waters that can spread material over the surf zone and the inner shelf. These currents are typically under-resolved in non-hydrostatic wave-flow models due to computational constraints. Specifically, the low vertical resolutions adequate to describe the wave dynamics - and required to feasibly compute at the scales of a field site - are too coarse to account for the relevant details of the three-dimensional (3D) flow field. To describe the relevant dynamics of both waves and currents, while retaining a model framework that can be applied at field scales, we propose a two-grid approach to solve the governing equations. With this approach, the vertical accelerations and non-hydrostatic pressures are resolved on a relatively coarse vertical grid (which is sufficient to accurately resolve the wave dynamics), whereas the horizontal velocities and turbulent stresses are resolved on a much finer subgrid (of which the resolution is dictated by the vertical scale of the mean flows). This approach ensures that the discrete pressure Poisson equation - the solution of which dominates the computational effort - is evaluated on the coarse grid scale, thereby greatly improving efficiency, while providing a fine vertical resolution to resolve the vertical variation of the mean flow. This work presents the general methodology, and discusses the numerical implementation in the SWASH wave-flow model. Model predictions are compared with observations of three flume experiments to demonstrate that the subgrid approach captures both the nearshore evolution of the waves, and the wave-induced flows like the undertow profile and longshore current. The accuracy of the subgrid predictions is comparable to fully resolved 3D simulations - but at much reduced computational costs. The findings of this work thereby demonstrate that the subgrid approach has the potential to make 3D non-hydrostatic simulations feasible at the scale of a realistic coastal region.

  4. Validation of Finite-Element Models of Persistent-Current Effects in Nb3Sn Accelerator Magnets

    DOE PAGES

    Wang, X.; Ambrosio, G.; Chlachidze, G.; ...

    2015-01-06

    Persistent magnetization currents are induced in superconducting filaments during the current ramping in magnets. The resulting perturbation to the design magnetic field leads to field quality degradation, in particular at low field where the effect is stronger relative to the main field. The effects observed in NbTi accelerator magnets were reproduced well with the critical-state model. However, this approach becomes less accurate for the calculation of the persistent-current effects observed in Nb3Sn accelerator magnets. Here a finite-element method based on the measured strand magnetization is validated against three state-of-the-art Nb3Sn accelerator magnets featuring different subelement diameters, critical currents, magnet designs and measurement temperatures. The temperature dependence of the persistent-current effects is reproduced. Based on the validated model, the impact of conductor design on the persistent-current effects is discussed. The performance, limitations and possible improvements of the approach are also discussed.

  5. Workshop on Current Issues in Predictive Approaches to Intelligence and Security Analytics: Fostering the Creation of Decision Advantage through Model Integration and Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.

    2010-05-23

    The increasing asymmetric nature of threats to the security, health and sustainable growth of our society requires that anticipatory reasoning become an everyday activity. Currently, the use of anticipatory reasoning is hindered by the lack of systematic methods for combining knowledge- and evidence-based models, integrating modeling algorithms, and assessing model validity, accuracy and utility. The workshop addresses these gaps with the intent of fostering the creation of a community of interest on model integration and evaluation that may serve as an aggregation point for existing efforts and a launch pad for new approaches.

  6. Combustion system CFD modeling at GE Aircraft Engines

    NASA Technical Reports Server (NTRS)

    Burrus, D.; Mongia, H.; Tolpadi, Anil K.; Correa, S.; Braaten, M.

    1995-01-01

    This viewgraph presentation discusses key features of current combustion system CFD modeling capabilities at GE Aircraft Engines provided by the CONCERT code; CONCERT development history; modeling applied for designing engine combustion systems; modeling applied to improve fundamental understanding; CONCERT3D results for current production combustors; CONCERT3D model of NASA/GE E3 combustor; HYBRID CONCERT CFD/Monte-Carlo modeling approach; and future modeling directions.

  7. Combustion system CFD modeling at GE Aircraft Engines

    NASA Astrophysics Data System (ADS)

    Burrus, D.; Mongia, H.; Tolpadi, Anil K.; Correa, S.; Braaten, M.

    1995-03-01

    This viewgraph presentation discusses key features of current combustion system CFD modeling capabilities at GE Aircraft Engines provided by the CONCERT code; CONCERT development history; modeling applied for designing engine combustion systems; modeling applied to improve fundamental understanding; CONCERT3D results for current production combustors; CONCERT3D model of NASA/GE E3 combustor; HYBRID CONCERT CFD/Monte-Carlo modeling approach; and future modeling directions.

  8. Automated Verification of Specifications with Typestates and Access Permissions

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Catano, Nestor

    2011-01-01

    We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).

  9. Improving homology modeling of G-protein coupled receptors through multiple-template derived conserved inter-residue interactions

    NASA Astrophysics Data System (ADS)

    Chaudhari, Rajan; Heim, Andrew J.; Li, Zhijun

    2015-05-01

    Evidenced by the three rounds of G-protein coupled receptor (GPCR) Dock competitions, improving homology modeling methods for helical transmembrane proteins, including the GPCRs, based on templates of low sequence identity remains a major challenge. Current approaches addressing this challenge adopt the philosophy of "modeling first, refinement next". In the present work, we developed an alternative modeling approach through the novel application of available multiple templates. First, conserved inter-residue interactions are derived from each additional template through conservation analysis of each template-target pairwise alignment. Then, these interactions are converted into distance restraints and incorporated into the homology modeling process. This approach was applied to modeling of the human β2 adrenergic receptor using bovine rhodopsin and the human protease-activated receptor 1 as templates, and improved model quality was demonstrated compared to the homology models generated by standard single-template and multiple-template methods. This method of "refined restraints first, modeling next" provides a fast and complementary way to the current modeling approaches. It allows rational identification and implementation of additional conserved distance restraints extracted from multiple templates and/or experimental data, and has the potential to be applicable to modeling of all helical transmembrane proteins.
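
    A minimal, library-free sketch of the "refined restraints first" idea: keep only inter-residue contacts present in every template-target alignment and emit them as distance restraints. The data layout, 8 Å cutoff and helper names are illustrative, not the authors' implementation:

    ```python
    def conserved_restraints(template_contacts, cutoff=8.0):
        """template_contacts: one dict per template mapping aligned target
        residue pairs (i, j) -> CA-CA distance observed in that template.
        Returns pairs conserved across all templates with the mean distance."""
        common = set(template_contacts[0])
        for contacts in template_contacts[1:]:
            common &= set(contacts)
        restraints = []
        for pair in sorted(common):
            mean_d = sum(c[pair] for c in template_contacts) / len(template_contacts)
            if mean_d <= cutoff:                 # keep only genuine contacts
                restraints.append((*pair, mean_d))
        return restraints

    # toy usage with two "templates" sharing one conserved contact
    t1 = {(10, 54): 6.2, (11, 60): 7.9}
    t2 = {(10, 54): 6.5, (12, 61): 7.0}
    print(conserved_restraints([t1, t2]))        # [(10, 54, 6.35)]
    ```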

  10. Surface-Charge-Based Micro-Models--A Solid Foundation for Learning about Direct Current Circuits

    ERIC Educational Resources Information Center

    Hirvonen, P. E.

    2007-01-01

    This study explores how the use of a surface-charge-based instructional approach affects introductory university level students' understanding of direct current (dc) circuits. The introduced teaching intervention includes electrostatics, surface-charge-based micro-models that explain the existence of an electric field inside the current-carrying…

  11. Toward refined environmental scenarios for ecological risk assessment of down-the-drain chemicals in freshwater environments.

    PubMed

    Franco, Antonio; Price, Oliver R; Marshall, Stuart; Jolliet, Olivier; Van den Brink, Paul J; Rico, Andreu; Focks, Andreas; De Laender, Frederik; Ashauer, Roman

    2017-03-01

    Current regulatory practice for chemical risk assessment suffers from the lack of realism in conventional frameworks. Despite significant advances in exposure and ecological effect modeling, the implementation of novel approaches as high-tier options for prospective regulatory risk assessment remains limited, particularly among general chemicals such as down-the-drain ingredients. While reviewing the current state of the art in environmental exposure and ecological effect modeling, we propose a scenario-based framework that enables a better integration of exposure and effect assessments in a tiered approach. Global- to catchment-scale spatially explicit exposure models can be used to identify areas of higher exposure and to generate ecologically relevant exposure information for input into effect models. Numerous examples of mechanistic ecological effect models demonstrate that it is technically feasible to extrapolate from individual-level effects to effects at higher levels of biological organization and from laboratory to environmental conditions. However, the data required to parameterize effect models that can embrace the complexity of ecosystems are large and require a targeted approach. Experimental efforts should, therefore, focus on vulnerable species and/or traits and ecological conditions of relevance. We outline key research needs to address the challenges that currently hinder the practical application of advanced model-based approaches to risk assessment of down-the-drain chemicals. Integr Environ Assess Manag 2017;13:233-248. © 2016 SETAC.

  12. Non-animal models of epithelial barriers (skin, intestine and lung) in research, industrial applications and regulatory toxicology.

    PubMed

    Gordon, Sarah; Daneshian, Mardas; Bouwstra, Joke; Caloni, Francesca; Constant, Samuel; Davies, Donna E; Dandekar, Gudrun; Guzman, Carlos A; Fabian, Eric; Haltner, Eleonore; Hartung, Thomas; Hasiwa, Nina; Hayden, Patrick; Kandarova, Helena; Khare, Sangeeta; Krug, Harald F; Kneuer, Carsten; Leist, Marcel; Lian, Guoping; Marx, Uwe; Metzger, Marco; Ott, Katharina; Prieto, Pilar; Roberts, Michael S; Roggen, Erwin L; Tralau, Tewes; van den Braak, Claudia; Walles, Heike; Lehr, Claus-Michael

    2015-01-01

    Models of the outer epithelia of the human body - namely the skin, the intestine and the lung - have found valid applications in both research and industrial settings as attractive alternatives to animal testing. A variety of approaches to model these barriers are currently employed in such fields, ranging from the utilization of ex vivo tissue to reconstructed in vitro models, and further to chip-based technologies, synthetic membrane systems and, of increasing current interest, in silico modeling approaches. An international group of experts in the field of epithelial barriers was convened from academia, industry and regulatory bodies to present both the current state of the art of non-animal models of the skin, intestinal and pulmonary barriers in their various fields of application, and to discuss research-based, industry-driven and regulatory-relevant future directions for both the development of new models and the refinement of existing test methods. Issues of model relevance and preference, validation and standardization, acceptance, and the need for simplicity versus complexity were focal themes of the discussions. The outcomes of workshop presentations and discussions, in relation to both current status and future directions in the utilization and development of epithelial barrier models, are presented by the attending experts in the current report.

  13. Paraboloid magnetospheric magnetic field model and the status of the model as an ISO standard

    NASA Astrophysics Data System (ADS)

    Alexeev, I.

    A reliable representation of the magnetic field is crucial in the framework of radiation belt modelling, especially for disturbed conditions. The empirical model developed by Tsyganenko (T96) is constructed by minimizing the rms deviation from a large magnetospheric database. The applicability of the T96 model is limited mainly to quiet conditions in the solar wind along the Earth's orbit. But contrary to the internal planetary field, the external magnetospheric magnetic field sources are much more time-dependent. This is the reason why the paraboloid magnetospheric model is constructed by a more accurate and physically consistent approach, in which each source of the magnetic field has its own relaxation timescale and a driving function based on an individual best-fit combination of the solar wind and IMF parameters. Such an approach is based on a priori information about the structure of the global magnetospheric current systems. Each current system is included as a separate block module in the magnetospheric model. As shown by spacecraft magnetometer data, there are three current systems that are the main contributors to the external magnetospheric magnetic field: the magnetopause currents, the ring current and the tail current sheet. The paraboloid model is based on an analytical solution of the Laplace equation for each of these large-scale current systems in the magnetosphere with a…
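
    As a compact restatement of the modular structure described above (module symbols are illustrative), the model field is a superposition of the internal dipole and the three external current-system modules:

    ```latex
    \mathbf{B} = \mathbf{B}_{dipole} + \mathbf{B}_{magnetopause} + \mathbf{B}_{ring\ current} + \mathbf{B}_{tail}
    ```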

  14. Bayesian Model Development for Analysis of Open Source Information to Support the Assessment of Nuclear Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe N.; Whitney, Paul D.; White, Amanda M.

    2013-07-15

    Pacific Northwest National Laboratory has spent several years researching, developing, and validating large Bayesian network models to support integration of open source data sets for nuclear proliferation research. Our current work focuses on generating a set of interrelated models for multi-source assessment of nuclear programs, as opposed to a single comprehensive model. By using this approach, we can break down the models to cover logical sub-problems that can utilize different expertise and data sources. This approach allows researchers to utilize the models individually or in combination to detect and characterize a nuclear program and identify data gaps. The models operate at various levels of granularity, covering a combination of state-level assessments with more detailed models of site or facility characteristics. This paper will describe the current open source-driven, nuclear nonproliferation models under development, the pros and cons of the analytical approach, and areas for additional research.

  15. Logistic regression modeling to assess groundwater vulnerability to contamination in Hawaii, USA

    NASA Astrophysics Data System (ADS)

    Mair, Alan; El-Kadi, Aly I.

    2013-10-01

    Capture zone analysis combined with a subjective susceptibility index is currently used in Hawaii to assess vulnerability to contamination of drinking water sources derived from groundwater. In this study, we developed an alternative objective approach that combines well capture zones with multiple-variable logistic regression (LR) modeling and applied it to the highly-utilized Pearl Harbor and Honolulu aquifers on the island of Oahu, Hawaii. Input for the LR models utilized explanatory variables based on hydrogeology, land use, and well geometry/location. A suite of 11 target contaminants detected in the region, including elevated nitrate (> 1 mg/L), four chlorinated solvents, four agricultural fumigants, and two pesticides, was used to develop the models. We then tested the ability of the new approach to accurately separate groups of wells with low and high vulnerability, and the suitability of nitrate as an indicator of other types of contamination. Our results produced contaminant-specific LR models that accurately identified groups of wells with the lowest/highest reported detections and the lowest/highest nitrate concentrations. Current and former agricultural land uses were identified as significant explanatory variables for eight of the 11 target contaminants, while elevated nitrate was a significant variable for five contaminants. The utility of the combined approach is contingent on the availability of hydrologic and chemical monitoring data for calibrating groundwater and LR models. Application of the approach using a reference site with sufficient data could help identify key variables in areas with similar hydrogeology and land use but limited data. In addition, elevated nitrate may also be a suitable indicator of groundwater contamination in areas with limited data. The objective LR modeling approach developed in this study is flexible enough to address a wide range of contaminants and represents a suitable addition to the current subjective approach.
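
    A minimal sketch of the multiple-variable LR step, assuming a per-well table of hydrogeology/land-use covariates and a binary detection flag; the covariates, data and threshold here are synthetic placeholders:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    n = 200
    # hypothetical covariates: recharge rate, % agricultural land in the
    # capture zone, and well depth
    X = np.column_stack([rng.gamma(2.0, 1.0, n),
                         rng.uniform(0.0, 100.0, n),
                         rng.uniform(50.0, 500.0, n)])
    y = (X[:, 1] + 10.0 * rng.standard_normal(n) > 60.0).astype(int)  # synthetic detections

    model = make_pipeline(StandardScaler(), LogisticRegression())
    model.fit(X, y)
    vulnerability = model.predict_proba(X)[:, 1]   # per-well probability of detection
    ```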

  16. Analytical modeling and analysis of magnetic field and torque for novel axial flux eddy current couplers with PM excitation

    NASA Astrophysics Data System (ADS)

    Li, Zhao; Wang, Dazhi; Zheng, Di; Yu, Linxin

    2017-10-01

    Rotational permanent magnet eddy current couplers are promising devices for torque and speed transmission without any mechanical contact. In this study, flux-concentration disk-type permanent magnet eddy current couplers with a double conductor rotor are investigated. Given the computational cost of the accurate three-dimensional finite element method, this paper proposes a mixed two-dimensional analytical modeling approach. Based on this approach, closed-form expressions of the magnetic field, eddy current, electromagnetic force and torque for such devices are obtained. Finally, a three-dimensional finite element method is employed to validate the analytical results. In addition, a prototype is manufactured and tested for the torque-speed characteristic.

  17. A methodology for achieving high-speed rates for artificial conductance injection in electrically excitable biological cells.

    PubMed

    Butera, R J; Wilson, C G; Delnegro, C A; Smith, J C

    2001-12-01

    We present a novel approach to implementing the dynamic-clamp protocol (Sharp et al., 1993), commonly used in neurophysiology and cardiac electrophysiology experiments. Our approach is based on real-time extensions to the Linux operating system. Conventional PC-based approaches have typically utilized single-cycle computational rates of 10 kHz or slower. In this paper, we demonstrate reliable cycle-to-cycle rates as fast as 50 kHz. Our system, which we call model reference current injection (MRCI; pronounced 'merci'), is also capable of episodic logging of internal state variables and interactive manipulation of model parameters. The limiting factor in achieving high speeds was not processor speed or model complexity, but cycle jitter inherent in the CPU/motherboard performance. We demonstrate these high speeds and flexibility with two examples: 1) adding action-potential ionic currents to a mammalian neuron under whole-cell patch clamp and 2) altering a cell's intrinsic dynamics via MRCI while simultaneously coupling it via artificial synapses to an internal computational model cell. These higher rates greatly extend the applicability of this technique to the study of fast electrophysiological currents such as fast A-type currents and fast excitatory/inhibitory synapses.
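
    A hedged sketch of what one dynamic-clamp cycle computes (the real system runs this in a hard real-time loop, e.g. every 20 microseconds at 50 kHz); the gating kinetics and all parameter values are illustrative:

    ```python
    import math

    def mrci_cycle(v_m, m, dt, g_max, e_rev, tau_m, m_inf):
        """One cycle: advance a first-order gating variable by forward Euler,
        then return the command current I = g_max * m * (V_m - E_rev)."""
        m += (dt / tau_m) * (m_inf(v_m) - m)
        return g_max * m * (v_m - e_rev), m

    m_inf = lambda v: 1.0 / (1.0 + math.exp(-(v + 40.0) / 5.0))   # toy activation curve
    i_cmd, m = mrci_cycle(v_m=-55.0, m=0.1, dt=2e-5,
                          g_max=10.0, e_rev=-80.0, tau_m=5e-3, m_inf=m_inf)
    ```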

  18. My Response to the Systemic Approach to Gifted Education

    ERIC Educational Resources Information Center

    Lee, Seon-Young

    2012-01-01

    As an alternative to the current paradigm of gifted education, Ziegler and Phillipson proposed a systemic approach and argued that factors in the current mechanistic model of giftedness are not good predictors for exceptionality. They pinpointed that a single factor identified as an indicator of giftedness, ineffective measures, inappropriate…

  19. X-56A MUTT: Aeroservoelastic Modeling

    NASA Technical Reports Server (NTRS)

    Ouellette, Jeffrey A.

    2015-01-01

    For the NASA X-56A Program, Armstrong Flight Research Center has been developing a set of linear state-space models that integrate the flight dynamics and structural dynamics. These high-order models are needed for control design, control evaluation, and test input design. The current focus has been on developing stiff-wing models to validate the current modeling approach. The extension of the modeling approach to the flexible wings requires only a change in the structural model. Individual subsystem models (actuators, inertial properties, etc.) have been validated by component-level ground tests. Closed-loop simulation of maneuvers designed to validate the flight dynamics of these models correlates very well with flight-test data. The open-loop structural dynamics are also shown to correlate well with the flight-test data.

  20. Nitrous oxide emissions from agricultural landscapes: quantification tools, policy development, and opportunities for improved management

    NASA Astrophysics Data System (ADS)

    Tonitto, C.; Gurwick, N. P.

    2012-12-01

    Policy initiatives to reduce greenhouse gas emissions (GHG) have promoted the development of agricultural management protocols to increase SOC storage and reduce GHG emissions. We review approaches for quantifying N2O flux from agricultural landscapes. We summarize the temporal and spatial extent of observations across representative soil classes, climate zones, cropping systems, and management scenarios. We review applications of simulation and empirical modeling approaches and compare validation outcomes across modeling tools. Subsequently, we review current model application in agricultural management protocols. In particular, we compare approaches adapted for compliance with the California Global Warming Solutions Act, the Alberta Climate Change and Emissions Management Act, and those used by the American Carbon Registry. In the absence of regional data to drive model development, policies that require GHG quantification often use simple empirical models based on highly aggregated data of N2O flux as a function of applied N - Tier 1 models according to IPCC categorization. As participants in the development of protocols that could be used in carbon offset markets, we observed that stakeholders outside of the biogeochemistry community favored outcomes from simulation modeling (Tier 3) rather than empirical modeling (Tier 2). In contrast, scientific advisors were more accepting of outcomes based on statistical approaches that rely on local observations, and their views sometimes swayed policy practitioners over the course of policy development. Both Tier 2 and Tier 3 approaches have been implemented in current policy development, and it is important that the strengths and limitations of both approaches, in the face of available data, be well understood by those drafting and adopting policies and protocols. The reliability of all models is contingent on sufficient observations for model development and validation. Simulation models applied without site calibration generally perform poorly in validation, and this point particularly needs to be emphasized during policy development. For cases where sufficient calibration data are available, simulation models have demonstrated the ability to capture seasonal patterns of N2O flux. The reliability of statistical models likewise depends on data availability. Because soil moisture is a significant driver of N2O flux, the best outcomes occur when empirical models are applied to systems with relevant soil classification and climate. The structure of current carbon offset protocols is not well aligned with a budgetary approach to GHG accounting. Current protocols credit field-scale reduction in N2O flux as a result of reduced fertilizer use. Protocols do not award farmers credit for reductions in CO2 emissions resulting from reduced production of synthetic N fertilizer. To achieve the greatest GHG emission reductions through reduced synthetic N production and reduced landscape N saturation requires a re-envisioning of the agricultural landscape to include cropping systems with legume and manure N sources. The current focus on on-farm GHG sources directs credits toward simple reductions of N applied in conventional systems rather than toward cropping systems that promote higher recycling and retention of N.
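
    For concreteness, a sketch of the Tier 1 style calculation referred to above, using the IPCC default direct emission factor (EF1 = 1% of applied N emitted as N2O-N) and the 44/28 mass conversion; treat the numbers as generic defaults, not protocol-specific values:

    ```python
    def tier1_direct_n2o(n_applied_kg, ef1=0.01):
        """Direct soil N2O from applied N: N2O-N = N * EF1, then convert
        N2O-N to N2O mass with the 44/28 molecular weight ratio."""
        return n_applied_kg * ef1 * 44.0 / 28.0

    print(tier1_direct_n2o(150.0))   # 150 kg N/ha applied -> ~2.36 kg N2O/ha
    ```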

  1. Teaching Direct Current Theory Using a Field Model

    ERIC Educational Resources Information Center

    Stocklmayer, Susan

    2010-01-01

    Principles of direct current have long been recognised in the literature as presenting difficulties for learners. Most of these difficulties have been reported in the context of the traditional electron flow model. In this paper, an alternative approach for high school students using a field model is explored. Findings from a range of short pilot…

  2. Validation of finite element model of transcranial electrical stimulation using scalp potentials: implications for clinical dose

    NASA Astrophysics Data System (ADS)

    Datta, Abhishek; Zhou, Xiang; Su, Yuzhou; Parra, Lucas C.; Bikson, Marom

    2013-06-01

    Objective. During transcranial electrical stimulation, current passage across the scalp generates voltage across the scalp surface. The goal was to characterize these scalp voltages for the purpose of validating subject-specific finite element method (FEM) models of current flow. Approach. Using a recording electrode array, we mapped skin voltages resulting from low-intensity transcranial electrical stimulation. These voltage recordings were used to compare the predictions obtained from the high-resolution model based on the subject undergoing transcranial stimulation. Main results. Each of the four stimulation electrode configurations tested resulted in a distinct distribution of scalp voltages; these spatial maps were linear with applied current amplitude (0.1 to 1 mA) over low frequencies (1 to 10 Hz). The FEM model accurately predicted the distinct voltage distributions and correlated the induced scalp voltages with current flow through cortex. Significance. Our results provide the first direct model validation for these subject-specific modeling approaches. In addition, the monitoring of scalp voltages may be used to verify electrode placement to increase transcranial electrical stimulation safety and reproducibility.

  3. Automated visualization of rule-based models

    PubMed Central

    Tapia, Jose-Juan; Faeder, James R.

    2017-01-01

    Frameworks such as BioNetGen, Kappa and Simmune use “reaction rules” to specify biochemical interactions compactly, where each rule specifies a mechanism such as binding or phosphorylation and its structural requirements. Current rule-based models of signaling pathways have tens to hundreds of rules, and these numbers are expected to increase as more molecule types and pathways are added. Visual representations are critical for conveying rule-based models, but current approaches to show rules and interactions between rules scale poorly with model size. Also, inferring design motifs that emerge from biochemical interactions is an open problem, so current approaches to visualize model architecture rely on manual interpretation of the model. Here, we present three new visualization tools that constitute an automated visualization framework for rule-based models: (i) a compact rule visualization that efficiently displays each rule, (ii) the atom-rule graph that conveys regulatory interactions in the model as a bipartite network, and (iii) a tunable compression pipeline that incorporates expert knowledge and produces compact diagrams of model architecture when applied to the atom-rule graph. The compressed graphs convey network motifs and architectural features useful for understanding both small and large rule-based models, as we show by application to specific examples. Our tools also produce more readable diagrams than current approaches, as we show by comparing visualizations of 27 published models using standard graph metrics. We provide an implementation in the open source and freely available BioNetGen framework, but the underlying methods are general and can be applied to rule-based models from the Kappa and Simmune frameworks also. We expect that these tools will promote communication and analysis of rule-based models and their eventual integration into comprehensive whole-cell models. PMID:29131816
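
    A minimal sketch of the atom-rule graph idea as a bipartite network, with toy atoms and rules (not the BioNetGen implementation):

    ```python
    import networkx as nx

    G = nx.DiGraph()
    atoms = ["R.free", "L.free", "R-L.bond", "R.phos"]   # toy structural "atoms"
    rules = ["binding", "phosphorylation"]
    G.add_nodes_from(atoms, kind="atom")
    G.add_nodes_from(rules, kind="rule")

    # atom -> rule: the rule consumes/requires the atom;
    # rule -> atom: the rule produces the atom
    G.add_edges_from([("R.free", "binding"), ("L.free", "binding"),
                      ("binding", "R-L.bond"),
                      ("R-L.bond", "phosphorylation"),
                      ("phosphorylation", "R.phos")])

    print(list(G.successors("binding")))   # ['R-L.bond']
    ```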

  4. Assessment of Current Process Modeling Approaches to Determine Their Limitations, Applicability and Developments Needed for Long-Fiber Thermoplastic Injection Molded Composites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Holbery, Jim; Smith, Mark T.

    2006-11-30

    This report describes the status of the current process modeling approaches to predict the behavior and flow of fiber-filled thermoplastics under injection molding conditions. Previously, models have been developed to simulate the injection molding of short-fiber thermoplastics, and an as-formed composite part or component can then be predicted that contains a microstructure resulting from the constituents’ material properties and characteristics as well as the processing parameters. Our objective is to assess these models in order to determine their capabilities and limitations, and the developments needed for long-fiber injection-molded thermoplastics (LFTs). First, the concentration regimes are summarized to facilitate the understanding of different types of fiber-fiber interaction that can occur for a given fiber volume fraction. After the formulation of the fiber suspension flow problem and the simplification leading to the Hele-Shaw approach, the interaction mechanisms are discussed. Next, the establishment of the rheological constitutive equation is presented that reflects the coupled flow/orientation nature. The decoupled flow/orientation approach is also discussed, which constitutes a good simplification for many applications involving flows in thin cavities. Finally, before outlining the necessary developments for LFTs, some applications of the current orientation model and the so-called modified Folgar-Tucker model are illustrated through the fiber orientation predictions for selected LFT samples.

  5. Heuristic-Leadership Model: Adapting to Current Training and Changing Times.

    ERIC Educational Resources Information Center

    Danielson, Mary Ann

    A model was developed for training individuals to adapt better to the changing work environment by focusing on the subordinate to supervisor relationship and providing a heuristic approach to leadership. The model emphasizes a heuristic approach to decision-making through the active participation of both members of the dyad. The demand among…

  6. Numerical modeling of hydrodynamics and sediment transport—an integrated approach

    NASA Astrophysics Data System (ADS)

    Gic-Grusza, Gabriela; Dudkowska, Aleksandra

    2017-10-01

    Point measurement-based estimation of bedload transport in the coastal zone is very difficult. The only way to assess the magnitude and direction of bedload transport in larger areas, particularly those characterized by complex bottom topography and hydrodynamics, is to use a holistic approach. This requires modeling of waves, currents, and the critical bed shear stress and bedload transport magnitude, with due consideration of the realistic bathymetry and the distribution of surface sediment types. Such a holistic approach is presented in this paper which describes modeling of bedload transport in the Gulf of Gdańsk. Extreme storm conditions defined based on 138-year NOAA data were assumed. The SWAN model (Booij et al. 1999) was used to define wind-wave fields, whereas wave-induced currents were calculated using the Kołodko and Gic-Grusza (2015) model, and the magnitude of bedload transport was estimated using the modified Meyer-Peter and Müller (1948) formula. The calculations were performed using a GIS model. The results obtained are innovative. The approach presented appears to be a valuable source of information on bedload transport in the coastal zone.
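    For reference, the classical Meyer-Peter and Müller (1948) relation underlying the modified formula links the dimensionless bedload rate Φ to the Shields parameter θ (the specific modification used in the paper is not reproduced here):

    ```latex
    \Phi = \frac{q_b}{\sqrt{(s-1)\,g\,d^{3}}} = 8\,(\theta - \theta_c)^{3/2},
    \qquad \theta_c \approx 0.047
    ```

    where q_b is the volumetric bedload transport rate per unit width, s the sediment specific gravity, g gravity, d the grain diameter, and θ_c the critical Shields stress.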

  7. AIR QUALITY MODELING OF HAZARDOUS POLLUTANTS: CURRENT STATUS AND FUTURE DIRECTIONS

    EPA Science Inventory

    The paper presents a review of current air toxics modeling applications and discusses possible advanced approaches. Many applications require the ability to predict hot spots from industrial sources or large roadways that are needed for community health and Environmental Justice...

  8. SMALL POPULATIONS REQUIRE SPECIFIC MODELING APPROACHES FOR ASSESSING RISK

    EPA Science Inventory

    All populations face non-zero risks of extinction. However, the risks for small populations, and therefore the modeling approaches necessary to predict them, are different from those of large populations. These differences are currently hindering assessment of risk to small pop...

  9. Accessing the dynamics of end-grafted flexible polymer chains by atomic force-electrochemical microscopy. Theoretical modeling of the approach curves by the elastic bounded diffusion model and Monte Carlo simulations. Evidence for compression-induced lateral chain escape.

    PubMed

    Abbou, Jeremy; Anne, Agnès; Demaille, Christophe

    2006-11-16

    The dynamics of a molecular layer of linear poly(ethylene glycol) (PEG) chains of molecular weight 3400, bearing at one end a ferrocene (Fc) label and thiol end-grafted at a low surface coverage onto a gold substrate, is probed using combined atomic force-electrochemical microscopy (AFM-SECM), at the scale of approximately 100 molecules. Force and current approach curves are simultaneously recorded as a force-sensing microelectrode (tip) is inserted within the approximately 10 nm thick, redox labeled, PEG chain layer. Whereas the force approach curve gives access to the structure of the compressed PEG layer, the tip-current, resulting from tip-to-substrate redox cycling of the Fc head of the chain, is controlled by chain dynamics. The elastic bounded diffusion model, which considers the motion of the Fc head as diffusion in a conformational field, complemented by Monte Carlo (MC) simulations, from which the chain conformation can be derived for any degree of confinement, allows the theoretical tip-current approach curve to be calculated. The experimental current approach curve can then be very satisfyingly reproduced by theory, down to a tip-substrate separation of approximately 2 nm, using only one adjustable parameter characterizing the chain dynamics: the effective diffusion coefficient of the chain head. At closer tip-substrate separations, an unpredicted peak is observed in the experimental current approach curve, which is shown to find its origin in a compression-induced escape of the chain from within the narrowing tip-substrate gap. MC simulations provide quantitative support for lateral chain elongation as the escape mechanism.

  10. Mathematical modeling of cancer metabolism.

    PubMed

    Medina, Miguel Ángel

    2018-04-01

    Systemic approaches are needed and useful for the study of the very complex issue of cancer. Modeling has a central position in these systemic approaches. Metabolic reprogramming is nowadays acknowledged as an essential hallmark of cancer. Mathematical modeling could contribute to a better understanding of cancer metabolic reprogramming and to identify new potential ways of therapeutic intervention. Herein, I review several alternative approaches to metabolic modeling and their current and future impact in oncology. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Decision Support for Renewal of Wastewater Collection and Water Distribution Systems

    EPA Science Inventory

    The objective of this study was to identify the current decision support methodologies, models and approaches being used for determining how to rehabilitate or replace underground utilities; identify the critical gaps of these current models through comparison with case history d...

  12. Monitoring tooth profile faults in epicyclic gearboxes using synchronously averaged motor currents: Mathematical modeling and experimental validation

    NASA Astrophysics Data System (ADS)

    Ottewill, J. R.; Ruszczyk, A.; Broda, D.

    2017-02-01

    Time-varying transmission paths and inaccessibility can increase the difficulty in both acquiring and processing vibration signals for the purpose of monitoring epicyclic gearboxes. Recent work has shown that the synchronous signal averaging approach may be applied to measured motor currents in order to diagnose tooth faults in parallel shaft gearboxes. In this paper we further develop the approach, so that it may also be applied to monitor tooth faults in epicyclic gearboxes. A low-degree-of-freedom model of an epicyclic gearbox which incorporates the possibility of simulating tooth faults, as well as any subsequent tooth contact loss due to these faults, is introduced. By combining this model with a simple space-phasor model of an induction motor it is possible to show that, in theory, tooth faults in epicyclic gearboxes may be identified from motor currents. Applying the synchronous averaging approach to experimentally recorded motor currents and angular displacements from a shaft-mounted encoder validates this finding. Comparison between experiments and theory highlights the influence of operating conditions, backlash and shaft couplings on the transient response excited in the currents by the tooth fault. The results obtained suggest that the method may be a viable alternative or complement to more traditional methods for monitoring gearboxes. However, general observations also indicate that further investigations into the sensitivity and robustness of the method would be beneficial.
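    The synchronous averaging step can be sketched as follows: the measured current is resampled onto a uniform shaft-angle grid using the encoder signal, then averaged over complete revolutions so that components asynchronous with the shaft are attenuated. This is a generic illustration, not the authors' implementation; the synthetic signal and parameters are invented.

    ```python
    # Generic synchronous signal averaging over shaft revolutions (numpy only).
    import numpy as np

    def synchronous_average(current, angle, samples_per_rev=256):
        """Resample `current` onto a uniform grid of shaft `angle` (radians,
        increasing) and average over complete revolutions."""
        n_revs = int(angle[-1] // (2 * np.pi))
        grid = np.linspace(0.0, n_revs * 2 * np.pi,
                           n_revs * samples_per_rev, endpoint=False)
        resampled = np.interp(grid, angle, current)
        # One row per revolution; the mean suppresses asynchronous content.
        return resampled.reshape(n_revs, samples_per_rev).mean(axis=0)

    # Illustrative use: ~25 revolutions with a slight speed fluctuation.
    t = np.linspace(0.0, 1.0, 50_000)
    angle = 2 * np.pi * 25 * t + 0.01 * np.sin(2 * np.pi * t)
    current = np.sin(3 * angle) + 0.5 * np.random.randn(t.size)  # gear tone + noise
    print(synchronous_average(current, angle).shape)  # (256,)
    ```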

  13. Evaluation of training programs and entry-level qualifications for nuclear-power-plant control-room personnel based on the systems approach to training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haas, P M; Selby, D L; Hanley, M J

    1983-09-01

    This report summarizes results of research sponsored by the US Nuclear Regulatory Commission (NRC) Office of Nuclear Regulatory Research to initiate the use of the Systems Approach to Training in the evaluation of training programs and entry level qualifications for nuclear power plant (NPP) personnel. Variables (performance shaping factors) of potential importance to personnel selection and training are identified, and research to more rigorously define an operationally useful taxonomy of those variables is recommended. A high-level model of the Systems Approach to Training for use in the nuclear industry, which could serve as a model for NRC evaluation of industry programs, is presented. The model is consistent with current publically stated NRC policy, with the approach being followed by the Institute for Nuclear Power Operations, and with current training technology. Checklists to be used by NRC evaluators to assess training programs for NPP control-room personnel are proposed which are based on this model.

  14. Application Perspective of 2D+SCALE Dimension

    NASA Astrophysics Data System (ADS)

    Karim, H.; Rahman, A. Abdul

    2016-09-01

    Different applications or users need different abstractions of spatial models, dimensionalities and specifications of their datasets due to variations in required analysis and output. Various approaches, data models and data structures are now available to support most current application models in Geographic Information System (GIS). One of the focus trends in the GIS multi-dimensional research community is the implementation of a scale dimension with spatial datasets to suit various scale application needs. In this paper, 2D spatial datasets that have been scaled up as the third dimension are addressed as 2D+scale (or 3D-scale) dimension. Nowadays, various data structures, data models, approaches, schemas, and formats have been proposed as the best approaches to support a variety of applications and dimensionality in 3D topology. However, only a few of them consider the element of scale as their targeted dimension. As far as the scale dimension is concerned, the implementation approach can be either multi-scale or vario-scale (with any available data structures and formats) depending on application requirements (topology, semantics and function). This paper attempts to discuss the current and new potential applications which could be integrated with the 3D-scale dimension approach. The previous and current works on the scale dimension, as well as the requirements to be preserved for any given applications, implementation issues and future potential applications, form the major discussion of this paper.

  15. The Continuum of Pharmacist Prescriptive Authority.

    PubMed

    Adams, Alex J; Weaver, Krystalyn K

    2016-09-01

    Recently momentum has been building behind pharmacist prescriptive authority for certain products such as oral contraceptives or naloxone. To some, prescriptive authority by pharmacists represents a departure from the traditional role of pharmacists in dispensing medications. Nearly all states, however, currently enable pharmacist prescriptive authority in some form or fashion. The variety of different state approaches makes it difficult for pharmacists to ascertain the pros and cons of different models. We leverage data available from the National Alliance of State Pharmacy Associations (NASPA), a trade association that tracks pharmacy legislation and regulations across all states, to characterize models of pharmacist prescriptive authority along a continuum from most restrictive to least restrictive. We identify 2 primary categories of current pharmacist prescriptive authority: (1) collaborative prescribing and (2) autonomous prescribing. Collaborative prescribing models provide a broad framework for the treatment of acute or chronic disease. Current autonomous prescribing models have focused on a limited range of medications for which a specific diagnosis is not needed. Approaches to pharmacist prescriptive authority are not mutually exclusive. We anticipate that more states will pursue the less-restrictive approaches in the years ahead. © The Author(s) 2016.

  16. Why Doesn't the "High School Drop Out Rate" Drop?

    ERIC Educational Resources Information Center

    Truby, William F.

    2016-01-01

    This article provides information, questions, and answers about current approaches to dropping the dropout rate of our students. For example, our current model of education is based on the mass production or assembly line model promoted by Henry Ford back in the early years of the 1900s (1900-1920). This model served both factory production and…

  17. Dynamic-landscape metapopulation models predict complex response of wildlife populations to climate and landscape change

    Treesearch

    Thomas W. Bonnot; Frank R. Thompson; Joshua J. Millspaugh

    2017-01-01

    The increasing need to predict how climate change will impact wildlife species has exposed limitations in how well current approaches model important biological processes at scales at which those processes interact with climate. We used a comprehensive approach that combined recent advances in landscape and population modeling into dynamic-landscape metapopulation...

  18. Comparison of statistical and theoretical habitat models for conservation planning: the benefit of ensemble prediction

    Treesearch

    D. Todd Jones-Farrand; Todd M. Fearer; Wayne E. Thogmartin; Frank R. Thompson; Mark D. Nelson; John M. Tirpak

    2011-01-01

    Selection of a modeling approach is an important step in the conservation planning process, but little guidance is available. We compared two statistical and three theoretical habitat modeling approaches representing those currently being used for avian conservation planning at landscape and regional scales: hierarchical spatial count (HSC), classification and...

  19. A simple method for EEG guided transcranial electrical stimulation without models

    NASA Astrophysics Data System (ADS)

    Cancelli, Andrea; Cottone, Carlo; Tecchio, Franca; Truong, Dennis Q.; Dmochowski, Jacek; Bikson, Marom

    2016-06-01

    Objective. There is longstanding interest in using EEG measurements to inform transcranial Electrical Stimulation (tES) but adoption is lacking because users need a simple and adaptable recipe. The conventional approach is to use anatomical head-models for both source localization (the EEG inverse problem) and current flow modeling (the tES forward model), but this approach is computationally demanding, and requires an anatomical MRI and strict assumptions about the target brain regions. We evaluate techniques whereby tES dose is derived from EEG without the need for an anatomical head model, target assumptions, difficult case-by-case conjecture, or many stimulation electrodes. Approach. We developed a simple two-step approach to EEG-guided tES that, based on the topography of the EEG, (1) selects locations to be used for stimulation and (2) determines the current applied to each electrode. Each step is performed based solely on the EEG with no need for head models or source localization. Cortical dipoles represent idealized brain targets. EEG-guided tES strategies are verified using a finite element method simulation of the EEG generated by a dipole, oriented either tangential or radial to the scalp surface, and then simulating the tES-generated electric field produced by each model-free technique. These model-free approaches are compared to a ‘gold standard’ numerically optimized dose of tES that assumes perfect understanding of the dipole location and head anatomy. We vary the number of electrodes from a few to over three hundred, with focality or intensity as optimization criterion. Main results. Model-free approaches evaluated include (1) voltage-to-voltage, (2) voltage-to-current; (3) Laplacian; and two Ad-Hoc techniques (4) dipole sink-to-sink; and (5) sink to concentric. Our results demonstrate that simple ad hoc approaches can achieve reasonable targeting for the case of a cortical dipole, remarkably with only 2-8 electrodes and no need for a model of the head. Significance. Our approach is verified directly only for a theoretically localized source, but may be potentially applied to an arbitrary EEG topography. For its simplicity and linearity, our recipe for model-free EEG guided tES lends itself to broad adoption and can be applied to static (tDCS), time-variant (e.g., tACS, tRNS, tPCS), or closed-loop tES.
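    As a rough illustration of this family of model-free recipes (close in spirit to the voltage-to-current mapping, though not necessarily the authors' exact procedure), electrode currents can be taken proportional to the recorded scalp voltages, re-referenced to sum to zero, and scaled to a fixed current budget:

    ```python
    # Hedged sketch of a voltage-to-current style model-free tES dose (numpy).
    import numpy as np

    def voltage_to_current(topography_uV, total_mA=2.0):
        """Map an EEG scalp topography (one value per electrode) to tES
        electrode currents: proportional, zero net current, fixed budget."""
        v = np.asarray(topography_uV, dtype=float)
        i = v - v.mean()                # enforce zero net injected current
        # Half the summed |current| equals the total delivered current.
        return i * (total_mA / (0.5 * np.abs(i).sum()))

    topo = [4.0, 1.5, -0.5, -2.0, -3.0]   # hypothetical 5-electrode topography
    print(voltage_to_current(topo))        # per-electrode currents in mA
    ```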

  20. Modeling approaches for characterizing and evaluating environmental exposure to engineered nanomaterials in support of risk-based decision making.

    PubMed

    Hendren, Christine Ogilvie; Lowry, Michael; Grieger, Khara D; Money, Eric S; Johnston, John M; Wiesner, Mark R; Beaulieu, Stephen M

    2013-02-05

    As the use of engineered nanomaterials becomes more prevalent, the likelihood of unintended exposure to these materials also increases. Given the current scarcity of experimental data regarding fate, transport, and bioavailability, determining potential environmental exposure to these materials requires an in-depth analysis of modeling techniques that can be used in both the near- and long-term. Here, we provide a critical review of traditional and emerging exposure modeling approaches to highlight the challenges that scientists and decision-makers face when developing environmental exposure and risk assessments for nanomaterials. We find that accounting for nanospecific properties, overcoming data gaps, realizing model limitations, and handling uncertainty are key to developing informative and reliable environmental exposure and risk assessments for engineered nanomaterials. We find methods suited to recognizing and addressing significant uncertainty to be most appropriate for near-term environmental exposure modeling, given the current state of information and the current insufficiency of established deterministic models to address environmental exposure to engineered nanomaterials.

  1. Conversion of IVA Human Computer Model to EVA Use and Evaluation and Comparison of the Result to Existing EVA Models

    NASA Technical Reports Server (NTRS)

    Hamilton, George S.; Williams, Jermaine C.

    1998-01-01

    This paper describes the methods, rationale, and comparative results of the conversion of an intravehicular (IVA) 3D human computer model (HCM) to extravehicular (EVA) use and compares the converted model to an existing model on another computer platform. The task of accurately modeling a spacesuited human figure in software is daunting: the suit restricts the human's joint range of motion (ROM) and does not have joints collocated with human joints. The modeling of the variety of materials needed to construct a space suit (e.g. metal bearings, rigid fiberglass torso, flexible cloth limbs and rubber coated gloves) attached to a human figure is currently out of reach of desktop computer hardware and software. Therefore a simplified approach was taken. The HCM's body parts were enlarged and the joint ROM was restricted to match the existing spacesuit model. This basic approach could be used to model other restrictive environments in industry such as chemical or fire protective clothing. In summary, the approach provides a moderate fidelity, usable tool which will run on current notebook computers.

  2. A biophysical observation model for field potentials of networks of leaky integrate-and-fire neurons.

    PubMed

    Beim Graben, Peter; Rodrigues, Serafim

    2012-01-01

    We present a biophysical approach for the coupling of neural network activity as resulting from proper dipole currents of cortical pyramidal neurons to the electric field in extracellular fluid. Starting from a reduced three-compartment model of a single pyramidal neuron, we derive an observation model for dendritic dipole currents in extracellular space and thereby for the dendritic field potential (DFP) that contributes to the local field potential (LFP) of a neural population. This work aligns with and satisfies the widespread dipole assumption that is motivated by the "open-field" configuration of the DFP around cortical pyramidal cells. Our reduced three-compartment scheme allows us to derive networks of leaky integrate-and-fire (LIF) models, which facilitates comparison with existing neural network and observation models. In particular, by means of numerical simulations we compare our approach with an ad hoc model by Mazzoni et al. (2008), and conclude that our biophysically motivated approach yields substantial improvement.

  3. A biophysical observation model for field potentials of networks of leaky integrate-and-fire neurons

    PubMed Central

    beim Graben, Peter; Rodrigues, Serafim

    2013-01-01

    We present a biophysical approach for the coupling of neural network activity as resulting from proper dipole currents of cortical pyramidal neurons to the electric field in extracellular fluid. Starting from a reduced three-compartment model of a single pyramidal neuron, we derive an observation model for dendritic dipole currents in extracellular space and thereby for the dendritic field potential (DFP) that contributes to the local field potential (LFP) of a neural population. This work aligns with and satisfies the widespread dipole assumption that is motivated by the “open-field” configuration of the DFP around cortical pyramidal cells. Our reduced three-compartment scheme allows us to derive networks of leaky integrate-and-fire (LIF) models, which facilitates comparison with existing neural network and observation models. In particular, by means of numerical simulations we compare our approach with an ad hoc model by Mazzoni et al. (2008), and conclude that our biophysically motivated approach yields substantial improvement. PMID:23316157
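    For reference, the leaky integrate-and-fire dynamics underlying these network models take the standard form

    ```latex
    \tau_m \frac{dV}{dt} = -(V - V_{\mathrm{rest}}) + R\, I(t),
    \qquad V \leftarrow V_{\mathrm{reset}} \ \text{when} \ V \ge V_{\theta}
    ```

    with membrane time constant τ_m, resting potential V_rest, membrane resistance R, input current I(t), and firing threshold V_θ.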

  4. Accurate analytical modeling of junctionless DG-MOSFET by Green's function approach

    NASA Astrophysics Data System (ADS)

    Nandi, Ashutosh; Pandey, Nilesh

    2017-11-01

    An accurate analytical model of Junctionless double gate MOSFET (JL-DG-MOSFET) in the subthreshold regime of operation is developed in this work using a Green's function approach. The approach considers 2-D mixed boundary conditions and multi-zone techniques to provide an exact analytical solution to 2-D Poisson's equation. The Fourier coefficients are calculated correctly to derive the potential equations that are further used to model the channel current and subthreshold slope of the device. The threshold voltage roll-off is computed from parallel shifts of Ids-Vgs curves between the long channel and short-channel devices. It is observed that the Green's function approach to solving 2-D Poisson's equation in both oxide and silicon regions can accurately predict channel potential, subthreshold current (Isub), threshold voltage (Vt) roll-off and subthreshold slope (SS) of both long & short channel devices designed with different doping concentrations and higher as well as lower tsi/tox ratio. All the analytical model results are verified through comparisons with TCAD Sentaurus simulation results. It is observed that the model matches quite well with TCAD device simulations.

  5. Design of Accelerator Online Simulator Server Using Structured Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Guobao; Chu, Chungming

    2012-07-06

    Model based control plays an important role for a modern accelerator during beam commissioning, beam study, and even daily operation. With a realistic model, beam behaviour can be predicted and therefore effectively controlled. The approach used by most current high level application environments is to use a built-in simulation engine and feed a realistic model into that simulation engine. Instead of this traditional monolithic structure, a new approach using a client-server architecture is under development. An on-line simulator server is accessed via network accessible structured data. With this approach, a user can easily access multiple simulation codes. This paper describes the design, implementation, and current status of PVData, which defines the structured data, and PVAccess, which provides network access to the structured data.

  6. Vehicle track segmentation using higher order random fields

    DOE PAGES

    Quach, Tu-Thach

    2017-01-09

    Here, we present an approach to segment vehicle tracks in coherent change detection images, a product of combining two synthetic aperture radar images taken at different times. The approach uses multiscale higher order random field models to capture track statistics, such as curvatures and their parallel nature, that are not currently utilized in existing methods. These statistics are encoded as 3-by-3 patterns at different scales. The model can complete disconnected tracks often caused by sensor noise and various environmental effects. Coupling the model with a simple classifier, our approach is effective at segmenting salient tracks. We improve the F-measure on a standard vehicle track data set to 0.963, up from 0.897 obtained by the current state-of-the-art method.

  8. Improving clinical models based on knowledge extracted from current datasets: a new approach.

    PubMed

    Mendes, D; Paredes, S; Rocha, T; Carvalho, P; Henriques, J; Morais, J

    2016-08-01

    The Cardiovascular Diseases (CVD) are the leading cause of death in the world, with prevention recognized as a key intervention to counter this reality. In this context, although there are several models and scores currently used in clinical practice to assess the risk of a new cardiovascular event, they present some limitations. The goal of this paper is to improve CVD risk prediction taking into account the current models as well as information extracted from real and recent datasets. This approach is based on a decision tree scheme in order to assure the clinical interpretability of the model. An innovative optimization strategy is developed in order to adjust the decision tree thresholds (the rule structure is fixed) based on recent clinical datasets. A real dataset collected in the ambit of the National Registry on Acute Coronary Syndromes, Portuguese Society of Cardiology, is applied to validate this work. In order to assess the performance of the new approach, the metrics sensitivity, specificity and accuracy are used. The new approach achieves sensitivity, specificity and accuracy values of 80.52%, 74.19% and 77.27% respectively, which represents an improvement of about 26% in relation to the accuracy of the original score.
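    The three evaluation metrics follow directly from confusion-matrix counts; a minimal sketch with invented counts (not the paper's data):

    ```python
    # Sensitivity, specificity and accuracy from confusion-matrix counts.
    def clinical_metrics(tp, fn, tn, fp):
        sensitivity = tp / (tp + fn)              # true positive rate
        specificity = tn / (tn + fp)              # true negative rate
        accuracy = (tp + tn) / (tp + fn + tn + fp)
        return sensitivity, specificity, accuracy

    sens, spec, acc = clinical_metrics(tp=80, fn=20, tn=70, fp=30)
    print(f"sens={sens:.1%} spec={spec:.1%} acc={acc:.1%}")  # 80.0% 70.0% 75.0%
    ```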

  9. Analytical modeling of eddy-current losses caused by pulse-width-modulation switching in permanent-magnet brushless direct-current motors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, F.; Nehl, T.W.

    1998-09-01

    Because of their high efficiency and power density, PM brushless dc motors are strong candidates for electric and hybrid vehicle propulsion systems. An analytical approach is developed to predict the eddy-current losses in a permanent magnet brushless dc motor caused by inverter high-frequency pulse width modulation (PWM) switching. The model uses polar coordinates to take curvature effects into account, and is also capable of including the space harmonic effect of the stator magnetic field and the stator lamination effect on the losses. The model was applied to an existing motor design and was verified with the finite element method. Good agreement was achieved between the two approaches. Hence, the model is expected to be very helpful in predicting PWM switching losses in permanent magnet machine design.

  10. CP violation in multibody B decays from QCD factorization

    NASA Astrophysics Data System (ADS)

    Klein, Rebecca; Mannel, Thomas; Virto, Javier; Vos, K. Keri

    2017-10-01

    We test a data-driven approach based on QCD factorization for charmless three-body B-decays by confronting it with measurements of CP violation in B⁻ → π⁻π⁺π⁻. While some of the needed non-perturbative objects can be directly extracted from data, some others can, so far, only be modelled. Although this approach is currently model dependent, we comment on the prospects for reducing this model dependence. While our model naturally accommodates the gross features of the Dalitz distribution, it cannot quantitatively explain the details seen in the current experimental data on local CP asymmetries. We comment on possible refinements of our simple model and conclude by briefly discussing a possible extension of the model to large invariant masses, where large local CP asymmetries have been measured.

  11. A New Trans-Disciplinary Approach to Regional Integrated Assessment of Climate Impact and Adaptation in Agricultural Systems (Invited)

    NASA Astrophysics Data System (ADS)

    Antle, J. M.; Valdivia, R. O.; Jones, J.; Rosenzweig, C.; Ruane, A. C.

    2013-12-01

    This presentation provides an overview of the new methods developed by researchers in the Agricultural Model Inter-comparison and Improvement Project (AgMIP) for regional climate impact assessment and analysis of adaptation in agricultural systems. This approach represents a departure from approaches in the literature in several dimensions. First, the approach is based on the analysis of agricultural systems (not individual crops) and is inherently trans-disciplinary: it is based on a deep collaboration among a team of climate scientists, agricultural scientists and economists to design and implement regional integrated assessments of agricultural systems. Second, in contrast to previous approaches that have imposed future climate on models based on current socio-economic conditions, this approach combines bio-physical and economic models with a new type of pathway analysis (Representative Agricultural Pathways) to parameterize models consistent with a plausible future world in which climate change would be occurring. Third, adaptation packages for the agricultural systems in a region are designed by the research team with a level of detail that is useful to decision makers, such as research administrators and donors, who are making agricultural R&D investment decisions. The approach is illustrated with examples from AgMIP's projects currently being carried out in Africa and South Asia.

  12. From Databases to Modelling of Functional Pathways

    PubMed Central

    2004-01-01

    This short review comments on current informatics resources and methodologies in the study of functional pathways in cell biology. It highlights recent achievements in unveiling the structural design of protein and gene networks and discusses current approaches to model and simulate the dynamics of regulatory pathways in the cell. PMID:18629070

  13. From databases to modelling of functional pathways.

    PubMed

    Nasi, Sergio

    2004-01-01

    This short review comments on current informatics resources and methodologies in the study of functional pathways in cell biology. It highlights recent achievements in unveiling the structural design of protein and gene networks and discusses current approaches to model and simulate the dynamics of regulatory pathways in the cell.

  14. A New Approach to Special Education Finance: The Resource Cost Model.

    ERIC Educational Resources Information Center

    Geske, Terry G.; Johnston, Mary Jo

    1985-01-01

    Describes current practices in Illinois where a personnel reimbursement formula is used to finance special education. Summarizes the basic components of the Resource Cost Model (RCM), a complex school finance formula, and compares and contrasts RCM with Illinois' current method of financing special education. (MLF)

  15. Personalized medicine and chronic obstructive pulmonary disease.

    PubMed

    Wouters, E F M; Wouters, B B R A F; Augustin, I M L; Franssen, F M E

    2017-05-01

    The current review summarizes ongoing developments in personalized medicine and precision medicine in chronic obstructive pulmonary disease (COPD). Our current approach is far from personalized management algorithms, as current recommendations for COPD are largely based on a reductionist disease description, operationally defined by results of spirometry. Besides precision medicine developments, a personalized medicine approach in COPD is described based on a holistic approach to the patient, considering illness as the consequence of dynamic interactions within and between multiple interacting and self-adjusting systems. Pulmonary rehabilitation is described as a model of personalized medicine. Largely based on current understanding of inflammatory processes in COPD, targeted interventions in COPD are reviewed. Augmentation therapy for α-1-antitrypsin deficiency is described as a model of precision medicine in COPD, based on a profound understanding of the related genetic endotype. Future developments of precision medicine in COPD require identification of relevant endotypes combined with proper identification of phenotypes involved in the complex and heterogeneous manifestations of COPD.

  16. Mechanistic materials modeling for nuclear fuel performance

    DOE PAGES

    Tonks, Michael R.; Andersson, David; Phillpot, Simon R.; ...

    2017-03-15

    Fuel performance codes are critical tools for the design, certification, and safety analysis of nuclear reactors. However, their ability to predict fuel behavior under abnormal conditions is severely limited by their considerable reliance on empirical materials models correlated to burn-up (a measure of the number of fission events that have occurred, but not a unique measure of the history of the material). In this paper, we propose a different paradigm for fuel performance codes to employ mechanistic materials models that are based on the current state of the evolving microstructure rather than burn-up. In this approach, a series of state variables are stored at material points and define the current state of the microstructure. The evolution of these state variables is defined by mechanistic models that are functions of fuel conditions and other state variables. The material properties of the fuel and cladding are determined from microstructure/property relationships that are functions of the state variables and the current fuel conditions. Multiscale modeling and simulation is being used in conjunction with experimental data to inform the development of these models. Finally, this mechanistic, microstructure-based approach has the potential to provide a more predictive fuel performance capability, but will require a team of researchers to complete the required development and to validate the approach.
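    A schematic of the state-variable paradigm described above, with invented names and placeholder rate laws (an actual code would use mechanistic models informed by multiscale simulation and experiment):

    ```python
    # Schematic state-variable material point; names and laws are placeholders.
    from dataclasses import dataclass

    @dataclass
    class FuelState:
        porosity: float         # evolving microstructure state variables
        bubble_density: float   # (illustrative choices, not a real model)

    def evolve(s: FuelState, temperature_K: float, fission_rate: float,
               dt_s: float) -> FuelState:
        """Advance state variables using rate laws that depend on local
        conditions and the current state, not on burn-up."""
        d_bub = 1e-21 * fission_rate * dt_s               # placeholder kinetics
        d_por = 1e-6 * d_bub * temperature_K / 1000.0
        return FuelState(s.porosity + d_por, s.bubble_density + d_bub)

    def conductivity(s: FuelState) -> float:
        """Microstructure/property relationship (placeholder form, W/m-K)."""
        return 4.5 * (1.0 - 2.5 * s.porosity)

    s = FuelState(porosity=0.05, bubble_density=1e23)
    s = evolve(s, temperature_K=1200.0, fission_rate=1e19, dt_s=3600.0)
    print(conductivity(s))
    ```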

  17. Modeling Educational Content: The Cognitive Approach of the PALO Language

    ERIC Educational Resources Information Center

    Rodriguez-Artacho, Miguel; Verdejo Maillo, M. Felisa

    2004-01-01

    This paper presents a reference framework to describe educational material. It introduces the PALO Language as a cognitive based approach to Educational Modeling Languages (EML). In accordance with recent trends for reusability and interoperability in Learning Technologies, EML constitutes an evolution of the current content-centered…

  18. A mixture theory approach to model co- and counter-current two-phase flow in porous media accounting for viscous coupling

    NASA Astrophysics Data System (ADS)

    Qiao, Y.; Andersen, P. Ø.; Evje, S.; Standnes, D. C.

    2018-02-01

    It is well known that relative permeabilities can depend on the flow configuration and they are commonly lower during counter-current flow as compared to co-current flow. Conventional models must deal with this by manually changing the relative permeability curves depending on the observed flow regime. In this paper we use a novel two-phase momentum-equation-approach based on general mixture theory to generate effective relative permeabilities where this dependence (and others) is automatically captured. In particular, this formulation includes two viscous coupling effects: (i) Viscous drag between the flowing phases and the stagnant porous rock; (ii) viscous drag caused by momentum transfer between the flowing phases. The resulting generalized model will predict that during co-current flow the faster moving fluid accelerates the slow fluid, but is itself decelerated, while for counter-current flow they are both decelerated. The implications of these mechanisms are demonstrated by investigating recovery of oil from a matrix block surrounded by water due to a combination of gravity drainage and spontaneous imbibition, a situation highly relevant for naturally fractured reservoirs. We implement relative permeability data obtained experimentally through co-current flooding experiments and then explore the model behavior for different flow cases ranging from counter-current dominated to co-current dominated. In particular, it is demonstrated how the proposed model seems to offer some possible interesting improvements over conventional modeling by providing generalized mobility functions that automatically are able to capture more correctly different flow regimes for one and the same parameter set.

  19. Towards predictive models of the human gut microbiome

    PubMed Central

    2014-01-01

    The intestinal microbiota is an ecosystem susceptible to external perturbations such as dietary changes and antibiotic therapies. Mathematical models of microbial communities could be of great value in the rational design of microbiota-tailoring diets and therapies. Here, we discuss how advances in another field, engineering of microbial communities for wastewater treatment bioreactors, could inspire development of mechanistic mathematical models of the gut microbiota. We review the current state-of-the-art in bioreactor modeling and current efforts in modeling the intestinal microbiota. Mathematical modeling could benefit greatly from the deluge of data emerging from metagenomic studies, but data-driven approaches such as network inference that aim to predict microbiome dynamics without explicit mechanistic knowledge seem better suited to model these data. Finally, we discuss how the integration of microbiome shotgun sequencing and metabolic modeling approaches such as flux balance analysis may fulfill the promise of a mechanistic model of the intestinal microbiota. PMID:24727124
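    Flux balance analysis, mentioned above as one route to mechanistic models, is the linear program

    ```latex
    \max_{v}\; c^{\mathsf{T}} v
    \quad \text{subject to} \quad
    S\,v = 0, \qquad v_{\min} \le v \le v_{\max}
    ```

    where S is the stoichiometric matrix, v the vector of reaction fluxes, and c encodes a cellular objective such as biomass production.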

  20. Bioregulatory systems medicine: an innovative approach to integrating the science of molecular networks, inflammation, and systems biology with the patient's autoregulatory capacity?

    PubMed Central

    Goldman, Alyssa W.; Burmeister, Yvonne; Cesnulevicius, Konstantin; Herbert, Martha; Kane, Mary; Lescheid, David; McCaffrey, Timothy; Schultz, Myron; Seilheimer, Bernd; Smit, Alta; St. Laurent, Georges; Berman, Brian

    2015-01-01

    Bioregulatory systems medicine (BrSM) is a paradigm that aims to advance current medical practices. The basic scientific and clinical tenets of this approach embrace an interconnected picture of human health, supported largely by recent advances in systems biology and genomics, and focus on the implications of multi-scale interconnectivity for improving therapeutic approaches to disease. This article introduces the formal incorporation of these scientific and clinical elements into a cohesive theoretical model of the BrSM approach. The authors review this integrated body of knowledge and discuss how the emergent conceptual model offers the medical field a new avenue for extending the armamentarium of current treatment and healthcare, with the ultimate goal of improving population health. PMID:26347656

  1. Logistic regression modeling to assess groundwater vulnerability to contamination in Hawaii, USA.

    PubMed

    Mair, Alan; El-Kadi, Aly I

    2013-10-01

    Capture zone analysis combined with a subjective susceptibility index is currently used in Hawaii to assess vulnerability to contamination of drinking water sources derived from groundwater. In this study, we developed an alternative objective approach that combines well capture zones with multiple-variable logistic regression (LR) modeling and applied it to the highly-utilized Pearl Harbor and Honolulu aquifers on the island of Oahu, Hawaii. Input for the LR models utilized explanatory variables based on hydrogeology, land use, and well geometry/location. A suite of 11 target contaminants detected in the region, including elevated nitrate (>1 mg/L), four chlorinated solvents, four agricultural fumigants, and two pesticides, was used to develop the models. We then tested the ability of the new approach to accurately separate groups of wells with low and high vulnerability, and the suitability of nitrate as an indicator of other types of contamination. Our results produced contaminant-specific LR models that accurately identified groups of wells with the lowest/highest reported detections and the lowest/highest nitrate concentrations. Current and former agricultural land uses were identified as significant explanatory variables for eight of the 11 target contaminants, while elevated nitrate was a significant variable for five contaminants. The utility of the combined approach is contingent on the availability of hydrologic and chemical monitoring data for calibrating groundwater and LR models. Application of the approach using a reference site with sufficient data could help identify key variables in areas with similar hydrogeology and land use but limited data. In addition, elevated nitrate may also be a suitable indicator of groundwater contamination in areas with limited data. The objective LR modeling approach developed in this study is flexible enough to address a wide range of contaminants and represents a suitable addition to the current subjective approach. © 2013 Elsevier B.V. All rights reserved.
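    A minimal sketch of the multiple-variable LR step, assuming scikit-learn; the data and feature names are invented stand-ins for the hydrogeology, land-use, and well-geometry variables used in the study:

    ```python
    # Hedged sketch of a logistic regression vulnerability model (scikit-learn).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 200
    X = np.column_stack([
        rng.uniform(0, 1, n),    # fraction of capture zone in agricultural use
        rng.uniform(0, 300, n),  # depth to water table (m)
        rng.uniform(0, 5, n),    # recharge (mm/day)
    ])
    # Synthetic detection labels loosely tied to land use, for illustration.
    y = (X[:, 0] + rng.normal(0, 0.3, n) > 0.6).astype(int)

    model = LogisticRegression().fit(X, y)
    print(model.coef_, model.intercept_)
    print(model.predict_proba(X[:3])[:, 1])  # P(detection) for three wells
    ```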

  2. A Bayesian Hierarchical Modeling Approach to Predicting Flow in Ungauged Basins

    NASA Astrophysics Data System (ADS)

    Gronewold, A.; Alameddine, I.; Anderson, R. M.

    2009-12-01

    Recent innovative approaches to identifying and applying regression-based relationships between land use patterns (such as increasing impervious surface area and decreasing vegetative cover) and rainfall-runoff model parameters represent novel and promising improvements to predicting flow from ungauged basins. In particular, these approaches allow for predicting flows under uncertain and potentially variable future conditions due to rapid land cover changes, variable climate conditions, and other factors. Despite the broad range of literature on estimating rainfall-runoff model parameters, however, the absence of a robust set of modeling tools for identifying and quantifying uncertainties in (and correlation between) rainfall-runoff model parameters represents a significant gap in current hydrological modeling research. Here, we build upon a series of recent publications promoting novel Bayesian and probabilistic modeling strategies for quantifying rainfall-runoff model parameter estimation uncertainty. Our approach applies alternative measures of rainfall-runoff model parameter joint likelihood (including Nash-Sutcliffe efficiency, among others) to simulate samples from the joint parameter posterior probability density function. We then use these correlated samples as response variables in a Bayesian hierarchical model with land use coverage data as predictor variables in order to develop a robust land use-based tool for forecasting flow in ungauged basins while accounting for, and explicitly acknowledging, parameter estimation uncertainty. We apply this modeling strategy to low-relief coastal watersheds of Eastern North Carolina, an area representative of coastal resource waters throughout the world because of its sensitive embayments and because of the abundant (but currently threatened) natural resources it hosts. Consequently, this area is the subject of several ongoing studies and large-scale planning initiatives, including those conducted through the United States Environmental Protection Agency (USEPA) total maximum daily load (TMDL) program, as well as those addressing coastal population dynamics and sea level rise. Our approach has several advantages, including the propagation of parameter uncertainty through a nonparametric probability distribution which avoids common pitfalls of fitting parameters and model error structure to a predetermined parametric distribution function. In addition, by explicitly acknowledging correlation between model parameters (and reflecting those correlations in our predictive model) our model yields relatively efficient prediction intervals (unlike those in the current literature which are often unnecessarily large, and may lead to overly-conservative management actions). Finally, our model helps improve understanding of the rainfall-runoff process by identifying model parameters (and associated catchment attributes) which are most sensitive to current and future land use change patterns. Disclaimer: Although this work was reviewed by EPA and approved for publication, it may not necessarily reflect official Agency policy.
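    One of the joint-likelihood measures named above, the Nash-Sutcliffe efficiency, has a simple closed form; a small sketch:

    ```python
    # Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    import numpy as np

    def nash_sutcliffe(observed, simulated):
        obs = np.asarray(observed, float)
        sim = np.asarray(simulated, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    print(nash_sutcliffe([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))  # 0.98
    ```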

  3. An experimental study of nonlinear dynamic system identification

    NASA Technical Reports Server (NTRS)

    Stry, Greselda I.; Mook, D. Joseph

    1990-01-01

    A technique for robust identification of nonlinear dynamic systems is developed and illustrated using both simulations and analog experiments. The technique is based on the Minimum Model Error optimal estimation approach. A detailed literature review is included in which fundamental differences between the current approach and previous work are described. The most significant feature of the current work is the ability to identify nonlinear dynamic systems without prior assumptions regarding the form of the nonlinearities, in contrast to existing nonlinear identification approaches which usually require detailed assumptions of the nonlinearities. The example illustrations indicate that the method is robust with respect to prior ignorance of the model, and with respect to measurement noise, measurement frequency, and measurement record length.

  4. High-precision radiometric tracking for planetary approach and encounter in the inner solar system

    NASA Technical Reports Server (NTRS)

    Christensen, C. S.; Thurman, S. W.; Davidson, J. M.; Finger, M. H.; Folkner, W. M.

    1989-01-01

    The benefits of improved radiometric tracking data have been studied for planetary approach within the inner Solar System using the Mars Rover Sample Return trajectory as a model. It was found that the benefit of improved data to approach and encounter navigation was highly dependent on the a priori uncertainties assumed for several non-estimated parameters, including those for frame-tie, Earth orientation, troposphere delay, and station locations. With these errors at their current levels, navigational performance was found to be insensitive to enhancements in data accuracy. However, when expected improvements in these errors are modeled, performance with current-accuracy data significantly improves, with substantial further improvements possible with enhancements in data accuracy.

  5. A short review of perfectionism in sport, dance and exercise: out with the old, in with the 2×2.

    PubMed

    Hill, Andrew P; Madigan, Daniel J

    2017-08-01

    The purpose of the current paper is to review research examining multidimensional perfectionism in sport, dance, and exercise. We start by providing a conceptual overview of perfectionism. We then describe three main approaches to examining perfectionism. These approaches are an independent effects approach, the tripartite model, and the 2×2 model of perfectionism. Alongside the description of each approach, research findings are summarized. We close the paper by explaining how the development of the 2×2 model has likely rendered the tripartite model obsolete. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Intrinsic ethics regarding integrated assessment models for climate management.

    PubMed

    Schienke, Erich W; Baum, Seth D; Tuana, Nancy; Davis, Kenneth J; Keller, Klaus

    2011-09-01

    In this essay we develop and argue for the adoption of a more comprehensive model of research ethics than is included within current conceptions of responsible conduct of research (RCR). We argue that our model, which we label the ethical dimensions of scientific research (EDSR), is a more comprehensive approach to encouraging ethically responsible scientific research compared to the currently typically adopted approach in RCR training. This essay focuses on developing a pedagogical approach that enables scientists to better understand and appreciate one important component of this model, what we call intrinsic ethics. Intrinsic ethical issues arise when values and ethical assumptions are embedded within scientific findings and analytical methods. Through a close examination of a case study and its application in teaching, namely, evaluation of climate change integrated assessment models, this paper develops a method and case for including intrinsic ethics within research ethics training to provide scientists with a comprehensive understanding and appreciation of the critical role of values and ethical choices in the production of research outcomes.

  7. Current Approaches to Intervention in Children with Developmental Coordination Disorder

    ERIC Educational Resources Information Center

    Sugden, David

    2007-01-01

    This review analyzes approaches to intervention in children with developmental coordination disorder within the framework of how children develop and learn motor skills, drawing upon maturational, cognitive, and dynamic systems models. The approaches to intervention are divided into two categories: (1) process or deficit-oriented approaches; and…

  8. A simple method for EEG guided transcranial electrical stimulation without models.

    PubMed

    Cancelli, Andrea; Cottone, Carlo; Tecchio, Franca; Truong, Dennis Q; Dmochowski, Jacek; Bikson, Marom

    2016-06-01

    There is longstanding interest in using EEG measurements to inform transcranial Electrical Stimulation (tES) but adoption is lacking because users need a simple and adaptable recipe. The conventional approach is to use anatomical head-models for both source localization (the EEG inverse problem) and current flow modeling (the tES forward model), but this approach is computationally demanding, and requires an anatomical MRI and strict assumptions about the target brain regions. We evaluate techniques whereby tES dose is derived from EEG without the need for an anatomical head model, target assumptions, difficult case-by-case conjecture, or many stimulation electrodes. We developed a simple two-step approach to EEG-guided tES that, based on the topography of the EEG, (1) selects locations to be used for stimulation and (2) determines the current applied to each electrode. Each step is performed based solely on the EEG with no need for head models or source localization. Cortical dipoles represent idealized brain targets. EEG-guided tES strategies are verified using a finite element method simulation of the EEG generated by a dipole, oriented either tangential or radial to the scalp surface, and then simulating the tES-generated electric field produced by each model-free technique. These model-free approaches are compared to a 'gold standard' numerically optimized dose of tES that assumes perfect understanding of the dipole location and head anatomy. We vary the number of electrodes from a few to over three hundred, with focality or intensity as optimization criterion. Model-free approaches evaluated include (1) voltage-to-voltage, (2) voltage-to-current; (3) Laplacian; and two Ad-Hoc techniques (4) dipole sink-to-sink; and (5) sink to concentric. Our results demonstrate that simple ad hoc approaches can achieve reasonable targeting for the case of a cortical dipole, remarkably with only 2-8 electrodes and no need for a model of the head. Our approach is verified directly only for a theoretically localized source, but may be potentially applied to an arbitrary EEG topography. For its simplicity and linearity, our recipe for model-free EEG guided tES lends itself to broad adoption and can be applied to static (tDCS), time-variant (e.g., tACS, tRNS, tPCS), or closed-loop tES.

  9. Numerically pricing American options under the generalized mixed fractional Brownian motion model

    NASA Astrophysics Data System (ADS)

    Chen, Wenting; Yan, Bowen; Lian, Guanghua; Zhang, Ying

    2016-06-01

    In this paper, we introduce a robust numerical method, based on the upwind scheme, for the pricing of American puts under the generalized mixed fractional Brownian motion (GMFBM) model. By using portfolio analysis and applying the Wick-Itô formula, a partial differential equation (PDE) governing the prices of vanilla options under the GMFBM is successfully derived for the first time. Based on this, we formulate the pricing of American puts under the current model as a linear complementarity problem (LCP). Unlike the classical Black-Scholes (B-S) model or the generalized B-S model discussed in Cen and Le (2011), the newly obtained LCP under the GMFBM model is difficult to be solved accurately because of the numerical instability which results from the degeneration of the governing PDE as time approaches zero. To overcome this difficulty, a numerical approach based on the upwind scheme is adopted. It is shown that the coefficient matrix of the current method is an M-matrix, which ensures its stability in the maximum-norm sense. Remarkably, we have managed to provide a sharp theoretic error estimate for the current method, which is further verified numerically. The results of various numerical experiments also suggest that this new approach is quite accurate, and can be easily extended to price other types of financial derivatives with an American-style exercise feature under the GMFBM model.
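    Schematically, the LCP referred to above has the standard American-put complementarity form, where the operator \mathcal{L} is the pricing operator (its GMFBM-specific, time-dependent coefficients are omitted here) and ψ(S) = max(K - S, 0) is the put payoff:

    ```latex
    \frac{\partial V}{\partial \tau} - \mathcal{L}V \ge 0, \qquad
    V - \psi \ge 0, \qquad
    \left( \frac{\partial V}{\partial \tau} - \mathcal{L}V \right) (V - \psi) = 0
    ```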

  10. Parameter reduction in nonlinear state-space identification of hysteresis

    NASA Astrophysics Data System (ADS)

    Fakhrizadeh Esfahani, Alireza; Dreesen, Philippe; Tiels, Koen; Noël, Jean-Philippe; Schoukens, Johan

    2018-05-01

    Recent work on black-box polynomial nonlinear state-space modeling for hysteresis identification has provided promising results, but struggles with a large number of parameters due to the use of multivariate polynomials. This drawback is tackled in the current paper by applying a decoupling approach that results in a more parsimonious representation involving univariate polynomials. This work is carried out numerically on input-output data generated by a Bouc-Wen hysteretic model and follows up on earlier work of the authors. The current article discusses the polynomial decoupling approach and explores the selection of the number of univariate polynomials together with the polynomial degree. We have found that the presented decoupling approach is able to reduce the number of parameters of the full nonlinear model by up to about 50%, while maintaining a comparable output error level.
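    The decoupling idea summarized above replaces a multivariate polynomial map f with univariate polynomials sandwiched between two linear transformations, schematically

    ```latex
    f(u) \;\approx\; W\, g\!\left(V^{\mathsf{T}} u\right),
    \qquad g(z) = \big( g_1(z_1), \ldots, g_r(z_r) \big)
    ```

    where each g_i is a univariate polynomial; the number of branches r and the polynomial degree together set the parameter count.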

  11. Using Web-Based Knowledge Extraction Techniques to Support Cultural Modeling

    NASA Astrophysics Data System (ADS)

    Smart, Paul R.; Sieck, Winston R.; Shadbolt, Nigel R.

    The World Wide Web is a potentially valuable source of information about the cognitive characteristics of cultural groups. However, attempts to use the Web in the context of cultural modeling activities are hampered by the large-scale nature of the Web and the current dominance of natural language formats. In this paper, we outline an approach to support the exploitation of the Web for cultural modeling activities. The approach begins with the development of qualitative cultural models (which describe the beliefs, concepts and values of cultural groups), and these models are subsequently used to develop an ontology-based information extraction capability. Our approach represents an attempt to combine conventional approaches to information extraction with epidemiological perspectives of culture and network-based approaches to cultural analysis. The approach can be used, we suggest, to support the development of models providing a better understanding of the cognitive characteristics of particular cultural groups.

  12. Theoretical Framework for Interaction Game Design

    DTIC Science & Technology

    2016-05-19

    modeling. We take a data-driven quantitative approach to understand conversational behaviors by measuring conversational behaviors using advanced sensing...current state of the art, human computing is considered to be a reasonable approach to break through the current limitation. To solicit high quality and...proper resources in conversation to enable smooth and effective interaction. The last technique is about conversation measurement, analysis, and

  13. Shared decision making in senior medical students: results from a national survey.

    PubMed

    Zeballos-Palacios, Claudia; Quispe, Renato; Mongilardi, Nicole; Diaz-Arocutipa, Carlos; Mendez-Davalos, Carlos; Lizarraga, Natalia; Paz, Aldo; Montori, Victor M; Malaga, German

    2015-05-01

    To explore perceptions and experiences of Peruvian medical students about observed, preferred, and feasible decision-making approaches. We surveyed senior medical students from 19 teaching hospitals in 4 major cities in Peru. The self-administered questionnaire collected demographic information, current approach, exposure to role models for and training in shared decision making, and perceptions of the pertinence and feasibility of the different decision-making approaches in general as well as in challenging scenarios. A total of 327 senior medical students (51% female) were included. The mean age was 25 years. Among all respondents, 2% reported receiving both theoretical and practical training in shared decision making. While 46% of students identified their current decision-making approach as clinician-as-perfect-agent, 50% of students identified their teachers with the paternalistic approach. Remarkably, 53% of students thought shared decision making should be the preferred approach and 50% considered it feasible in Peru. Among the 10 challenging scenarios, shared decision making reached a plurality (40%) in only one scenario (terminally ill patients). Despite limited exposure and training, Peruvian medical students aspire to practice shared decision making but their current attitude reflects the less participatory approaches they see role modeled by their teachers. © The Author(s) 2015.

  14. Modelling rotational and cyclical spectral solar irradiance variations

    NASA Astrophysics Data System (ADS)

    Unruh, Yvonne

    Solar irradiance changes are highly wavelength dependent: solar-cycle variations in the UV can be on the order of tens of percent, while changes in the visible are typically only of the order of one or two permille. With the launch of a number of instruments to measure spectral solar irradiance, we are now for the first time in a good position to explore the changing solar irradiance over a large range of wavelengths and to test our irradiance models as well as some of their underlying assumptions. I will introduce some of the current modelling approaches and present model-data comparisons, using the SATIRE irradiance model and SORCE/SIM measurements as an example. I will conclude by highlighting a number of outstanding questions regarding the modelling of spectral irradiance and current approaches to address these.

  15. 75 FR 75162 - Protection of Cleared Swaps Customers Before and After Commodity Broker Bankruptcies

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-02

    ... are different from the current model for protecting futures customer collateral would bring... 10-11. (4) Baseline Model--The current approach to futures. The rights and obligations arising out of... COMMODITY FUTURES TRADING COMMISSION 17 CFR Part 190 RIN 3038-AD99 Protection of Cleared Swaps...

  16. Modeling current climate conditions for forest pest risk assessment

    Treesearch

    Frank H. Koch; John W. Coulston

    2010-01-01

    Current information on broad-scale climatic conditions is essential for assessing potential distribution of forest pests. At present, sophisticated spatial interpolation approaches such as the Parameter-elevation Regressions on Independent Slopes Model (PRISM) are used to create high-resolution climatic data sets. Unfortunately, these data sets are based on 30-year...

  17. Advances in Bayesian Modeling in Educational Research

    ERIC Educational Resources Information Center

    Levy, Roy

    2016-01-01

    In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…

  18. Phytoplankton as Particles - A New Approach to Modeling Algal Blooms

    DTIC Science & Technology

    2013-07-01

    Figure 69. Amplitudes of lunar semi-diurnal and diurnal harmonics of observed and computed...particle behavior when the trajectory takes a particle outside the model domain. The rules associated with the present particle-tracking algorithms are...landward, although occasional reversals occurred. Amplitude of the current fluctuations was ≈ 20 cm s-1. Model residual currents for one year were

  19. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    ERIC Educational Resources Information Center

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose graph theoretic approach for quantification of quality-enabled…

  20. Automatic Generation of Customized, Model Based Information Systems for Operations Management.

    DTIC Science & Technology

    The paper discusses the need for developing a customized, model based system to support management decision making in the field of operations management. It provides a critique of the current approaches available, formulates a framework to classify logistics decisions, and suggests an approach for the automatic development of logistics systems. (Author)

  1. Using eddy covariance and flux partitioning to assess basal, soil, and stress coefficients for crop evapotranspiration models

    USDA-ARS?s Scientific Manuscript database

    Current approaches to scheduling crop irrigation using reference evapotranspiration (ET0) recommend using a dual-coefficient approach using basal (Kcb) and soil (Ke) coefficients along with a stress coefficient (Ks) to model crop evapotranspiration (ETc), [e.g. ETc=(Ks*Kcb+Ke)*ET0]. However, indepe...
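
    For concreteness, the dual-coefficient formula quoted above can be evaluated directly; the coefficient values below are illustrative, not from the manuscript.

        def crop_et(et0_mm, ks, kcb, ke):
            """Crop evapotranspiration ETc (mm/day): ETc = (Ks*Kcb + Ke) * ET0."""
            return (ks * kcb + ke) * et0_mm

        # e.g. mild water stress (Ks=0.9), mid-season basal Kcb=1.1, wet soil Ke=0.2
        print(crop_et(6.0, ks=0.9, kcb=1.1, ke=0.2))   # -> 7.14 mm/day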

  2. Develop a Systems Approach to Characterizing and Predicting Thyroid Toxicity using an Amphibian Model

    EPA Science Inventory

    This research makes use of in vitro and in vivo approaches to understand and discriminate the compensatory and toxicological responses of the highly regulated HPT system. Development of an initial systems model will be based on the current understanding of the HPT axis and the co...

  3. A Comparison of Methods for Estimating Quadratic Effects in Nonlinear Structural Equation Models

    ERIC Educational Resources Information Center

    Harring, Jeffrey R.; Weiss, Brandi A.; Hsu, Jui-Chen

    2012-01-01

    Two Monte Carlo simulations were performed to compare methods for estimating and testing hypotheses of quadratic effects in latent variable regression models. The methods considered in the current study were (a) a 2-stage moderated regression approach using latent variable scores, (b) an unconstrained product indicator approach, (c) a latent…

  4. Current models broadly neglect specific needs of biodiversity conservation in protected areas under climate change.

    PubMed

    Sieck, Mungla; Ibisch, Pierre L; Moloney, Kirk A; Jeltsch, Florian

    2011-05-03

    Protected areas are the most common and important instrument for the conservation of biological diversity and are called for under the United Nations' Convention on Biological Diversity. Growing human population densities, intensified land-use, invasive species and increasing habitat fragmentation threaten ecosystems worldwide and protected areas are often the only refuge for endangered species. Climate change is posing an additional threat that may also impact ecosystems currently under protection. Therefore, it is of crucial importance to include the potential impact of climate change when designing future nature conservation strategies and implementing protected area management. This approach would go beyond reactive crisis management and, by necessity, would include anticipatory risk assessments. One avenue for doing so is being provided by simulation models that take advantage of the increase in computing capacity and performance that has occurred over the last two decades. Here we review the literature to determine the state of the art in modeling terrestrial protected areas under climate change, with the aim of evaluating and detecting trends and gaps in the current approaches being employed, as well as to provide a useful overview and guidelines for future research. Most studies apply statistical, bioclimatic envelope models and focus primarily on plant species as compared to other taxa. Very few studies utilize a mechanistic, process-based approach and none examine biotic interactions like predation and competition. Important factors like land-use, habitat fragmentation, invasion and dispersal are rarely incorporated, restricting the informative value of the resulting predictions considerably. The general impression that emerges is that biodiversity conservation in protected areas could benefit from the application of modern modeling approaches to a greater extent than is currently reflected in the scientific literature. It is particularly true that existing models have been underutilized in testing different management options under climate change. Based on these findings we suggest a strategic framework for more effectively incorporating the impact of climate change in models exploring the effectiveness of protected areas.

  5. The Risk Need Responsivity Model of Offender Rehabilitation: Is There Really a Need for a Paradigm Shift?

    ERIC Educational Resources Information Center

    Looman, Jan; Abracen, Jeffrey

    2013-01-01

    The current paper critically reviews the Risk-Need-Responsivity (RNR) and Good Lives Model (GLM) approaches to correctional treatment. Research, or the lack thereof, is discussed in terms of whether there is a need for a new model of offender rehabilitation. We argue that although there is a wealth of research in support of RNR approaches, there…

  6. Moving university hydrology education forward with geoinformatics, data and modeling approaches

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Ruddell, B. L.

    2012-02-01

    In this opinion paper, we review recent literature related to data and modeling driven instruction in hydrology, and present our findings from surveying the hydrology education community in the United States. This paper presents an argument that Data and Modeling Driven Geoscience Cybereducation (DMDGC) approaches are valuable for teaching the conceptual and applied aspects of hydrology, as a part of the broader effort to improve Science, Technology, Engineering, and Mathematics (STEM) education at the university level. The authors have undertaken a series of surveys and a workshop involving the community of university hydrology educators to determine the state of the practice of DMDGC approaches to hydrology. We identify the most common tools and approaches currently utilized, quantify the extent of the adoption of DMDGC approaches in the university hydrology classroom, and explain the community's views on the challenges and barriers preventing DMDGC approaches from wider use. DMDGC approaches are currently emphasized at the graduate level of the curriculum, and only the most basic modeling and visualization tools are in widespread use. The community identifies the greatest barriers to greater adoption as a lack of access to easily adoptable curriculum materials and a lack of time and training to learn constantly changing tools and methods. The community's current consensus is that DMDGC approaches should emphasize conceptual learning, and should be used to complement rather than replace lecture-based pedagogies. Inadequate online material publication and sharing systems, and a lack of incentives for faculty to develop and publish materials via such systems, are also identified as a challenge. Based on these findings, we suggest that a number of steps should be taken by the community to develop the potential of DMDGC in university hydrology education, including formal development and assessment of curriculum materials integrating lecture-format and DMDGC approaches, incentivizing the publication by faculty of excellent DMDGC curriculum materials, and implementing the publication and dissemination cyberinfrastructure necessary to support the unique DMDGC digital curriculum materials.

  7. Modeling of phonon scattering in n-type nanowire transistors using one-shot analytic continuation technique

    NASA Astrophysics Data System (ADS)

    Bescond, Marc; Li, Changsheng; Mera, Hector; Cavassilas, Nicolas; Lannoo, Michel

    2013-10-01

    We present a one-shot current-conserving approach to model the influence of electron-phonon scattering in nano-transistors using the non-equilibrium Green's function formalism. The approach is based on the lowest order approximation (LOA) to the current and its simplest analytic continuation (LOA+AC). By means of a scaling argument, we show how both LOA and LOA+AC can be easily obtained from the first iteration of the usual self-consistent Born approximation (SCBA) algorithm. Both LOA and LOA+AC are then applied to model n-type silicon nanowire field-effect-transistors and are compared to SCBA current characteristics. In this system, the LOA fails to describe electron-phonon scattering, mainly because of the interactions with acoustic phonons at the band edges. In contrast, the LOA+AC still well approximates the SCBA current characteristics, thus demonstrating the power of analytic continuation techniques. The limits of validity of LOA+AC are also discussed, and more sophisticated and general analytic continuation techniques are suggested for more demanding cases.

  8. A META-COMPOSITE SOFTWARE DEVELOPMENT APPROACH FOR TRANSLATIONAL RESEARCH

    PubMed Central

    Sadasivam, Rajani S.; Tanik, Murat M.

    2013-01-01

    Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users’ needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements. PMID:23504436

  9. A meta-composite software development approach for translational research.

    PubMed

    Sadasivam, Rajani S; Tanik, Murat M

    2013-06-01

    Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users' needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements.

  10. Modeling AEC—New Approaches to Study Rare Genetic Disorders

    PubMed Central

    Koch, Peter J.; Dinella, Jason; Fete, Mary; Siegfried, Elaine C.; Koster, Maranke I.

    2015-01-01

    Ankyloblepharon-ectodermal defects-cleft lip/palate (AEC) syndrome is a rare monogenetic disorder that is characterized by severe abnormalities in ectoderm-derived tissues, such as skin and its appendages. A major cause of morbidity among affected infants is severe and chronic skin erosions. Currently, supportive care is the only available treatment option for AEC patients. Mutations in TP63, a gene that encodes key regulators of epidermal development, are the genetic cause of AEC. However, it is currently not clear how mutations in TP63 lead to the various defects seen in the patients’ skin. In this review, we will discuss current knowledge of the AEC disease mechanism obtained by studying patient tissue and genetically engineered mouse models designed to mimic aspects of the disorder. We will then focus on new approaches to model AEC, including the use of patient cells and stem cell technology to replicate the disease in a human tissue culture model. The latter approach will advance our understanding of the disease and will allow for the development of new in vitro systems to identify drugs for the treatment of skin erosions in AEC patients. Further, the use of stem cell technology, in particular induced pluripotent stem cells (iPSC), will enable researchers to develop new therapeutic approaches to treat the disease using the patient’s own cells (autologous keratinocyte transplantation) after correction of the disease-causing mutations. PMID:24665072

  11. From mitochondrial ion channels to arrhythmias in the heart: computational techniques to bridge the spatio-temporal scales

    PubMed Central

    Plank, Gernot; Zhou, Lufang; Greenstein, Joseph L; Cortassa, Sonia; Winslow, Raimond L; O'Rourke, Brian; Trayanova, Natalia A

    2008-01-01

    Computer simulations of electrical behaviour in the whole ventricles have become commonplace during the last few years. The goals of this article are (i) to review the techniques that are currently employed to model cardiac electrical activity in the heart, discussing the strengths and weaknesses of the various approaches, and (ii) to implement a novel modelling approach, based on physiological reasoning, that lifts some of the restrictions imposed by current state-of-the-art ionic models. To illustrate the latter approach, the present study uses a recently developed ionic model of the ventricular myocyte that incorporates an excitation–contraction coupling and mitochondrial energetics model. A paradigm to bridge the vastly disparate spatial and temporal scales, from subcellular processes to the entire organ, and from sub-microseconds to minutes, is presented. Achieving sufficient computational efficiency is the key to success in the quest to develop multiscale realistic models that are expected to lead to better understanding of the mechanisms of arrhythmia induction following failure at the organelle level, and ultimately to the development of novel therapeutic applications. PMID:18603526

  12. Development of a new model for short period ocean tidal variations of Earth rotation

    NASA Astrophysics Data System (ADS)

    Schuh, Harald

    2015-08-01

    Within project SPOT (Short Period Ocean Tidal variations in Earth rotation) we develop a new high frequency Earth rotation model based on empirical ocean tide models. The main purpose of the SPOT model is its application to space geodetic observations such as GNSS and VLBI. We consider an empirical ocean tide model, which does not require hydrodynamic ocean modeling to determine ocean tidal angular momentum. We use here the EOT11a model of Savcenko & Bosch (2012), which is extended for some additional minor tides (e.g. M1, J1, T2). As empirical tidal models do not provide ocean tidal currents, which are required for the computation of oceanic relative angular momentum, we implement an approach first published by Ray (2001) to estimate ocean tidal current velocities for all tides considered in the extended EOT11a model. The approach itself is tested by application to tidal heights from hydrodynamic ocean tide models, which also provide tidal current velocities. Based on the tidal heights and the associated current velocities the oceanic tidal angular momentum (OTAM) is calculated. For the computation of the related short period variation of Earth rotation, we have re-examined the Euler-Liouville equation for an elastic Earth model with a liquid core. The focus here is on the consistent calculation of the elastic Love numbers and associated Earth model parameters, which are considered in the Euler-Liouville equation for diurnal and sub-diurnal periods in the frequency domain.

  13. An Additional Approach to Model Current Followers and Amplifiers with Electronically Controllable Parameters from Commercially Available ICs

    NASA Astrophysics Data System (ADS)

    Sotner, R.; Kartci, A.; Jerabek, J.; Herencsar, N.; Dostal, T.; Vrba, K.

    2012-12-01

    Several behavioral models of current active elements for experimental purposes are introduced in this paper. These models are based on commercially available devices. They are suitable for experimental tests of current- and mixed-mode filters, oscillators, and other circuits (employing current-mode active elements) frequently used in analog signal processing, without the necessity of on-chip fabrication of a proper active element. Several methods of electronic control of intrinsic resistance in the proposed behavioral models are discussed. All predictions and theoretical assumptions are supported by simulations and experiments. This contribution helps to find a cheaper and more effective route to preliminary laboratory tests without expensive on-chip fabrication of special active elements.

  14. A Hybrid RANS/LES Approach for Predicting Jet Noise

    NASA Technical Reports Server (NTRS)

    Goldstein, Marvin E.

    2006-01-01

    Hybrid acoustic prediction methods have an important advantage over the current Reynolds averaged Navier-Stokes (RANS) based methods in that they only involve modeling of the relatively universal subscale motion and not the configuration dependent larger scale turbulence. Unfortunately, they are unable to account for the high frequency sound generated by the turbulence in the initial mixing layers. This paper introduces an alternative approach that directly calculates the sound from a hybrid RANS/LES flow model (which can resolve the steep gradients in the initial mixing layers near the nozzle lip) and adopts modeling techniques similar to those used in current RANS based noise prediction methods to determine the unknown sources in the equations for the remaining unresolved components of the sound field. The resulting prediction method would then be intermediate between the current noise prediction codes and previously proposed hybrid noise prediction methods.

  15. Do we have the right models for scaling up health services to achieve the Millennium Development Goals?

    PubMed

    Subramanian, Savitha; Naimoli, Joseph; Matsubayashi, Toru; Peters, David H

    2011-12-14

    There is widespread agreement on the need for scaling up in the health sector to achieve the Millennium Development Goals (MDGs). But many countries are not on track to reach the MDG targets. The dominant approach used by global health initiatives promotes uniform interventions and targets, assuming that specific technical interventions tested in one country can be replicated across countries to rapidly expand coverage. Yet countries scale up health services and progress against the MDGs at very different rates. Global health initiatives need to take advantage of what has been learned about scaling up. A systematic literature review was conducted to identify conceptual models for scaling up health in developing countries, with the articles assessed according to the practical concerns of how to scale up, including the planning, monitoring and implementation approaches. We identified six conceptual models for scaling up in health based on experience with expanding pilot projects and diffusion of innovations. They place importance on paying attention to enhancing organizational, functional, and political capabilities through experimentation and adaptation of strategies in addition to increasing the coverage and range of health services. These scaling up approaches focus on fostering sustainable institutions and the constructive engagement between end users and the provider and financing organizations. The current approaches to scaling up health services to reach the MDGs are overly simplistic and not working adequately. Rather than relying on blueprint planning and raising funds, an approach characteristic of current global health efforts, experience with alternative models suggests that more promising pathways involve "learning by doing" in ways that engage key stakeholders, use data to address constraints, and incorporate results from pilot projects. Such approaches should be applied to current strategies to achieve the MDGs.

  16. From Network Analysis to Functional Metabolic Modeling of the Human Gut Microbiota.

    PubMed

    Bauer, Eugen; Thiele, Ines

    2018-01-01

    An important hallmark of the human gut microbiota is its species diversity and complexity. Various diseases have been associated with a decreased diversity leading to reduced metabolic functionalities. Common approaches to investigate the human microbiota include high-throughput sequencing with subsequent correlative analyses. However, to understand the ecology of the human gut microbiota and consequently design novel treatments for diseases, it is important to represent the different interactions between microbes with their associated metabolites. Computational systems biology approaches can give further mechanistic insights by constructing data- or knowledge-driven networks that represent microbe interactions. In this minireview, we will discuss current approaches in systems biology to analyze the human gut microbiota, with a particular focus on constraint-based modeling. We will discuss various community modeling techniques with their advantages and differences, as well as their application to predict the metabolic mechanisms of intestinal microbial communities. Finally, we will discuss future perspectives and current challenges of simulating realistic and comprehensive models of the human gut microbiota.
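
    A minimal sketch of the constraint-based idea (flux balance analysis) mentioned above: maximize a biomass flux subject to the steady-state mass balance S v = 0 and flux bounds. The three-reaction toy network is invented for illustration; community models couple many such networks.

        import numpy as np
        from scipy.optimize import linprog

        # Metabolite A: uptake (v0) feeds either biomass (v1) or a byproduct (v2).
        S = np.array([[1.0, -1.0, -1.0]])          # d[A]/dt = v0 - v1 - v2 = 0
        bounds = [(0, 10.0), (0, None), (0, 5.0)]  # uptake and secretion limits
        c = np.array([0.0, -1.0, 0.0])             # maximize v1 -> minimize -v1

        res = linprog(c, A_eq=S, b_eq=[0.0], bounds=bounds)
        print("optimal biomass flux:", res.x[1])   # hits the uptake limit, 10.0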

  17. Current reversals and metastable states in the infinite Bose-Hubbard chain with local particle loss

    NASA Astrophysics Data System (ADS)

    Kiefer-Emmanouilidis, M.; Sirker, J.

    2017-12-01

    We present an algorithm which combines the quantum trajectory approach to open quantum systems with a density-matrix renormalization-group scheme for infinite one-dimensional lattice systems. We apply this method to investigate the long-time dynamics in the Bose-Hubbard model with local particle loss starting from a Mott-insulating initial state with one boson per site. While the short-time dynamics can be described even quantitatively by an equation of motion (EOM) approach at the mean-field level, many-body interactions lead to unexpected effects at intermediate and long times: local particle currents far away from the dissipative site start to reverse direction ultimately leading to a metastable state with a total particle current pointing away from the lossy site. An alternative EOM approach based on an effective fermion model shows that the reversal of currents can be understood qualitatively by the creation of holon-doublon pairs at the edge of the region of reduced particle density. The doublons are then able to escape while the holes move towards the dissipative site, a process reminiscent—in a loose sense—of Hawking radiation.

  18. Risk prediction model: Statistical and artificial neural network approach

    NASA Astrophysics Data System (ADS)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approach to, development of, and validation process for risk prediction models. A qualitative review of the aims, methods and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was done. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, artificial neural network approaches to developing prediction models were more accurate than statistical approaches. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.
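
    The statistical-versus-neural-network comparison the review describes can be sketched as below, on synthetic data with illustrative hyperparameters; a real risk model would add calibration assessment and external validation.

        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.linear_model import LogisticRegression
        from sklearn.neural_network import MLPClassifier
        from sklearn.metrics import roc_auc_score

        X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        models = [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("neural network", MLPClassifier(hidden_layer_sizes=(16,),
                                                   max_iter=2000, random_state=0))]
        for name, model in models:
            model.fit(X_tr, y_tr)
            auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
            print(f"{name}: held-out AUC = {auc:.3f}")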

  19. Moving university hydrology education forward with community-based geoinformatics, data and modeling resources

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Ruddell, B. L.

    2012-08-01

    In this opinion paper, we review recent literature related to data and modeling driven instruction in hydrology, and present our findings from surveying the hydrology education community in the United States. This paper presents an argument that data and modeling driven geoscience cybereducation (DMDGC) approaches are essential for teaching the conceptual and applied aspects of hydrology, as a part of the broader effort to improve science, technology, engineering, and mathematics (STEM) education at the university level. The authors have undertaken a series of surveys and a workshop involving university hydrology educators to determine the state of the practice of DMDGC approaches to hydrology. We identify the most common tools and approaches currently utilized, quantify the extent of the adoption of DMDGC approaches in the university hydrology classroom, and explain the community's views on the challenges and barriers preventing DMDGC approaches from wider use. DMDGC approaches are currently emphasized at the graduate level of the curriculum, and only the most basic modeling and visualization tools are in widespread use. The community identifies the greatest barriers to greater adoption as a lack of access to easily adoptable curriculum materials and a lack of time and training to learn constantly changing tools and methods. The community's current consensus is that DMDGC approaches should emphasize conceptual learning, and should be used to complement rather than replace lecture-based pedagogies. Inadequate online material publication and sharing systems, and a lack of incentives for faculty to develop and publish materials via such systems, are also identified as a challenge. Based on these findings, we suggest that a number of steps should be taken by the community to develop the potential of DMDGC in university hydrology education, including formal development and assessment of curriculum materials, integrating lecture-format and DMDGC approaches, incentivizing the publication by faculty of excellent DMDGC curriculum materials, and implementing the publication and dissemination cyberinfrastructure necessary to support the unique DMDGC digital curriculum materials.

  20. Prediction of Tidal Elevations and Barotropic Currents in the Gulf of Bone

    NASA Astrophysics Data System (ADS)

    Purnamasari, Rika; Ribal, Agustinus; Kusuma, Jeffry

    2018-03-01

    Tidal elevation and barotropic current predictions in the Gulf of Bone have been carried out in this work based on a two-dimensional, depth-integrated Advanced Circulation (ADCIRC-2DDI) model for 2017. Eight tidal constituents obtained from FES2012 have been imposed along the open boundary conditions. However, even with these very high-resolution tidal constituents, the discrepancy between the model and the tide-gauge data is still very high. To overcome this issue, a Green's function approach has been applied, which reduced the root-mean-square error (RMSE) significantly. Two different starting times are used for the predictions, namely 2015 and 2016. After improving the open boundary conditions, the RMSE between observation and model decreased significantly; in fact, the RMSEs for 2015 and 2016 decreased by 75.30% and 88.65%, respectively. Furthermore, the prediction of tidal elevations as well as tidal (barotropic) currents is carried out. This prediction was compared with the one produced by the Geospatial Information Agency (GIA) of Indonesia, and we found that our prediction is much better. Finally, since there is no tidal current observation available in this area, we assume that, once the tidal elevations are correct, the predicted tidal current will approach the actual current velocity.
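
    The error metric and the quoted percentage reductions can be reproduced with a few lines; the elevation series below are placeholders, not the Gulf of Bone data.

        import numpy as np

        def rmse(obs, model):
            return np.sqrt(np.mean((np.asarray(obs) - np.asarray(model)) ** 2))

        obs       = np.array([0.9, 1.4, 0.2, -0.8, -1.1])   # observed elevations (m)
        raw       = np.array([1.3, 1.9, 0.7, -0.2, -0.5])   # before boundary correction
        corrected = np.array([1.0, 1.5, 0.3, -0.7, -1.0])   # after Green's function step

        reduction = 100.0 * (1.0 - rmse(obs, corrected) / rmse(obs, raw))
        print(f"RMSE reduced by {reduction:.1f}%")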

  1. The use of discrete-event simulation modeling to compare handwritten and electronic prescribing systems.

    PubMed

    Ghany, Ahmad; Vassanji, Karim; Kuziemsky, Craig; Keshavjee, Karim

    2013-01-01

    Electronic prescribing (e-prescribing) is expected to bring many benefits to Canadian healthcare, such as a reduction in errors and adverse drug reactions. As there currently is no functioning e-prescribing system in Canada that is completely electronic, we are unable to evaluate the performance of a live system. An alternative approach is to use simulation modeling for evaluation. We developed two discrete-event simulation models, one of the current handwritten prescribing system and one of a proposed e-prescribing system, to compare the performance of these two systems. We were able to compare the number of processes in each model, workflow efficiency, and the distribution of patients or prescriptions. Although we were able to compare these models to each other, using discrete-event simulation software was challenging. We were limited in the number of variables we could measure. We discovered non-linear processes and feedback loops in both models that could not be adequately represented using discrete-event simulation software. Finally, interactions between entities in both models could not be modeled using this type of software. We have come to the conclusion that a more appropriate approach to modeling both the handwritten and electronic prescribing systems would be to use a complex adaptive systems approach using agent-based modeling or systems-based modeling.
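
    A minimal discrete-event sketch in this spirit, using the SimPy library: patients arrive, a prescriber writes the prescription, a pharmacist fills it. Stages, rates, and durations are invented; the handwritten versus electronic comparison would swap in different process steps and service times.

        import random
        import simpy

        def patient(env, prescriber, pharmacist, completed):
            with prescriber.request() as req:                    # wait for the prescriber
                yield req
                yield env.timeout(random.expovariate(1 / 5.0))   # ~5 min to write
            with pharmacist.request() as req:                    # wait for the pharmacist
                yield req
                yield env.timeout(random.expovariate(1 / 8.0))   # ~8 min to fill
            completed.append(env.now)

        def arrivals(env, prescriber, pharmacist, completed):
            while True:
                yield env.timeout(random.expovariate(1 / 6.0))   # one patient ~6 min
                env.process(patient(env, prescriber, pharmacist, completed))

        random.seed(1)
        env = simpy.Environment()
        prescriber = simpy.Resource(env, capacity=1)
        pharmacist = simpy.Resource(env, capacity=1)
        completed = []
        env.process(arrivals(env, prescriber, pharmacist, completed))
        env.run(until=480)                                       # one 8-hour day, in minutes
        print(f"{len(completed)} prescriptions completed")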

  2. When can time-dependent currents be reproduced by the Landauer steady-state approximation?

    NASA Astrophysics Data System (ADS)

    Carey, Rachel; Chen, Liping; Gu, Bing; Franco, Ignacio

    2017-05-01

    We establish well-defined limits in which the time-dependent electronic currents across a molecular junction subject to a fluctuating environment can be quantitatively captured via the Landauer steady-state approximation. For this, we calculate the exact time-dependent non-equilibrium Green's function (TD-NEGF) current along a model two-site molecular junction, in which the site energies are subject to correlated noise, and contrast it with that obtained from the Landauer approach. The ability of the steady-state approximation to capture the TD-NEGF behavior at each instant of time is quantified via the same-time correlation function of the currents obtained from the two methods, while their global agreement is quantified by examining differences in the average currents. The Landauer steady-state approach is found to be a useful approximation when (i) the fluctuations do not disrupt the degree of delocalization of the molecular eigenstates responsible for transport and (ii) the characteristic time for charge exchange between the molecule and leads is fast with respect to the molecular correlation time. For resonant transport, when these conditions are satisfied, the Landauer approach is found to accurately describe the current, both on average and at each instant of time. For non-resonant transport, we find that while the steady-state approach fails to capture the time-dependent transport at each instant of time, it still provides a good approximation to the average currents. These criteria can be employed to adopt effective modeling strategies for transport through molecular junctions in interaction with a fluctuating environment, as is necessary to describe experiments.
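
    The Landauer steady-state current used as the benchmark above has the generic form I = (2e/h) * integral of T(E) [f_L(E) - f_R(E)] dE; below is a numerical sketch with an illustrative Lorentzian transmission through a single resonant level (not the two-site model of the paper).

        import numpy as np

        e, h, kT = 1.602e-19, 6.626e-34, 0.025       # charge (C), Planck (J*s), kT (eV)
        mu_L, mu_R = 0.05, -0.05                     # 0.1 V bias window (in eV)
        eps0, gamma = 0.0, 0.05                      # level at the Fermi energy (resonant)

        E = np.linspace(-1.0, 1.0, 20001)            # energy grid (eV)
        T = gamma**2 / ((E - eps0)**2 + gamma**2)    # Lorentzian transmission
        f = lambda E, mu: 1.0 / (1.0 + np.exp((E - mu) / kT))

        integral_eV = np.sum(T * (f(E, mu_L) - f(E, mu_R))) * (E[1] - E[0])
        I = (2 * e / h) * integral_eV * e            # convert the eV integral to joules
        print(f"steady-state current: {I:.3e} A")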

  3. How Do Tides and Tsunamis Interact in a Highly Energetic Channel? The Case of Canal Chacao, Chile

    NASA Astrophysics Data System (ADS)

    Winckler, Patricio; Sepúlveda, Ignacio; Aron, Felipe; Contreras-López, Manuel

    2017-12-01

    This study aims at understanding the role of tidal level, speed, and direction in tsunami propagation in highly energetic tidal channels. The main goal is to comprehend whether tide-tsunami interactions enhance/reduce elevation, current speeds, and arrival times, when compared to pure tsunami models and to simulations in which tides and tsunamis are linearly superimposed. We designed various numerical experiments to compute the tsunami propagation along Canal Chacao, a highly energetic channel in the Chilean Patagonia lying on a subduction margin prone to megathrust earthquakes. Three modeling approaches were implemented under the same seismic scenario: a tsunami model with a constant tide level, a series of six composite models in which independent tide and tsunami simulations are linearly superimposed, and a series of six tide-tsunami nonlinear interaction models (full models). We found that hydrodynamic patterns differ significantly among approaches, with the composite and full models being sensitive to both the tidal phase at which the tsunami is triggered and the local depth of the channel. When compared to full models, composite models adequately predicted the maximum surface elevation, but largely overestimated currents. The amplitude and arrival time of the tsunami-leading wave computed with the full model was found to be strongly dependent on the direction of the tidal current and less responsive to the tide level and the tidal current speed. These outcomes emphasize the importance of addressing more carefully the interactions of tides and tsunamis in hazard assessment studies.

  4. Heat capacities and volumetric changes in the glass transition range: a constitutive approach based on the standard linear solid

    NASA Astrophysics Data System (ADS)

    Lion, Alexander; Mittermeier, Christoph; Johlitz, Michael

    2017-09-01

    A novel approach to represent the glass transition is proposed. It is based on a physically motivated extension of the linear viscoelastic Poynting-Thomson model. In addition to a temperature-dependent damping element and two linear springs, two thermal strain elements are introduced. In order to take the process dependence of the specific heat into account and to model its characteristic behaviour below and above the glass transition, the Helmholtz free energy contains an additional contribution which depends on the temperature history and on the current temperature. The model describes the process-dependent volumetric and caloric behaviour of glass-forming materials, and defines a functional relationship between pressure, volumetric strain, and temperature. If a model for the isochoric part of the material behaviour is already available, for example a model of finite viscoelasticity, the caloric and volumetric behaviour can be represented with the current approach. The proposed model allows computing the isobaric and isochoric heat capacities in closed form. The difference c_p -c_v is process-dependent and tends towards the classical expression in the glassy and equilibrium ranges. Simulations and theoretical studies demonstrate the physical significance of the model.

  5. Hierarchical multi-scale approach to validation and uncertainty quantification of hyper-spectral image modeling

    NASA Astrophysics Data System (ADS)

    Engel, Dave W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David L.; Thompson, Sandra E.

    2016-05-01

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  6. Synthesizing Technology Adoption and Learners' Approaches towards Active Learning in Higher Education

    ERIC Educational Resources Information Center

    Chan, Kevin; Cheung, George; Wan, Kelvin; Brown, Ian; Luk, Green

    2015-01-01

    In understanding how active and blended learning approaches and engagement with learning technologies play out in undergraduate education, current research models tend to undermine the effect of learners' variations, particularly regarding their styles and approaches to learning, on intention and use of learning technologies. This study contributes to further…

  7. New approach based on tetrahedral-mesh geometry for accurate 4D Monte Carlo patient-dose calculation

    NASA Astrophysics Data System (ADS)

    Han, Min Cheol; Yeom, Yeon Soo; Kim, Chan Hyeong; Kim, Seonghoon; Sohn, Jason W.

    2015-02-01

    In the present study, to achieve accurate 4D Monte Carlo dose calculation in radiation therapy, we devised a new approach that combines (1) modeling of the patient body using tetrahedral-mesh geometry based on the patient’s 4D CT data, (2) continuous movement/deformation of the tetrahedral patient model by interpolation of deformation vector fields acquired through deformable image registration, and (3) direct transportation of radiation particles during the movement and deformation of the tetrahedral patient model. The results of our feasibility study show that it is certainly possible to construct 4D patient models (= phantoms) with sufficient accuracy using the tetrahedral-mesh geometry and to directly transport radiation particles during continuous movement and deformation of the tetrahedral patient model. This new approach not only produces more accurate dose distribution in the patient but also replaces the current practice of using multiple 3D voxel phantoms and combining multiple dose distributions after Monte Carlo simulations. For routine clinical application of our new approach, the use of fast automatic segmentation algorithms is a must. In order to achieve, simultaneously, both dose accuracy and computation speed, the number of tetrahedrons for the lungs should be optimized. Although the current computation speed of our new 4D Monte Carlo simulation approach is slow (i.e. ~40 times slower than that of the conventional dose accumulation approach), this problem is resolvable by developing, in Geant4, a dedicated navigation class optimized for particle transportation in tetrahedral-mesh geometry.
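
    One ingredient of such tetrahedral-mesh transport is interpolating the deformation vector field inside an element from its four vertex displacements via barycentric coordinates; the sketch below uses invented geometry and displacements.

        import numpy as np

        verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
        disp = np.array([[0, 0, 0], [0.1, 0, 0], [0, 0.05, 0], [0, 0, 0.2]])

        def interp_displacement(p):
            # Solve for barycentric weights (w1, w2, w3); w0 = 1 - w1 - w2 - w3.
            T = (verts[1:] - verts[0]).T
            w = np.linalg.solve(T, p - verts[0])
            bary = np.concatenate([[1.0 - w.sum()], w])
            return bary @ disp          # weighted average of vertex displacements

        print(interp_displacement(np.array([0.25, 0.25, 0.25])))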

  8. A self-sensing active magnetic bearing based on a direct current measurement approach.

    PubMed

    Niemann, Andries C; van Schoor, George; du Rand, Carel P

    2013-09-11

    Active magnetic bearings (AMBs) have become a key technology in various industrial applications. Self-sensing AMBs provide an integrated sensorless solution for position estimation, consolidating the sensing and actuating functions into a single electromagnetic transducer. The approach aims to reduce possible hardware failure points, production costs, and system complexity. Despite these advantages, self-sensing methods must address various technical challenges to maximize the performance thereof. This paper presents the direct current measurement (DCM) approach for self-sensing AMBs, denoting the direct measurement of the current ripple component. In AMB systems, switching power amplifiers (PAs) modulate the rotor position information onto the current waveform. Demodulation self-sensing techniques then use bandpass and lowpass filters to estimate the rotor position from the voltage and current signals. However, the additional phase-shift introduced by these filters results in lower stability margins. The DCM approach utilizes a novel PA switching method that directly measures the current ripple to obtain duty-cycle invariant position estimates. Demodulation filters are largely excluded to minimize additional phase-shift in the position estimates. Basic functionality and performance of the proposed self-sensing approach are demonstrated via a transient simulation model as well as a high current (10 A) experimental system. A digital implementation of amplitude modulation self-sensing serves as a comparative estimator.

  9. INTERACTIVE PIT LAKES 2004 CONFERENCE

    EPA Science Inventory

    This CD and the workshop provide a pit lakes forum for the exchange of scientific information on current domestic and international approaches, including arid and wet regions throughout the world. These approaches include characterization, modeling/monitoring, and treatment and r...

  10. A genetic algorithm based global search strategy for population pharmacokinetic/pharmacodynamic model selection

    PubMed Central

    Sale, Mark; Sherer, Eric A

    2015-01-01

    The current algorithm for selecting a population pharmacokinetic/pharmacodynamic model is based on the well-established forward addition/backward elimination method. A central strength of this approach is the opportunity for a modeller to continuously examine the data and postulate new hypotheses to explain observed biases. This algorithm has served the modelling community well, but the model selection process has essentially remained unchanged for the last 30 years. During this time, more robust approaches to model selection have been made feasible by new technology and dramatic increases in computation speed. We review these methods, with emphasis on genetic algorithm approaches and discuss the role these methods may play in population pharmacokinetic/pharmacodynamic model selection. PMID:23772792
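
    A compact sketch of genetic-algorithm model selection in this spirit: candidate models are bit-strings over optional features, scored here by BIC on a toy linear regression as a stand-in for a population PK/PD objective function.

        import numpy as np

        rng = np.random.default_rng(0)
        n, p = 200, 8
        X = rng.standard_normal((n, p))
        y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.standard_normal(n)  # true model: features 0, 3

        def bic(mask):
            k = int(mask.sum())
            if k == 0:
                rss = float(np.sum(y**2))
            else:
                beta, *_ = np.linalg.lstsq(X[:, mask], y, rcond=None)
                rss = float(np.sum((y - X[:, mask] @ beta)**2))
            return n * np.log(rss / n) + k * np.log(n)

        pop = rng.integers(0, 2, (30, p)).astype(bool)
        for _ in range(40):
            order = np.argsort([bic(m) for m in pop])
            parents = pop[order[:10]]                            # selection: keep the 10 best
            children = []
            while len(children) < len(pop):
                a, b = parents[rng.integers(0, 10, 2)]
                cut = rng.integers(1, p)
                child = np.concatenate([a[:cut], b[cut:]])       # one-point crossover
                children.append(child ^ (rng.random(p) < 0.05))  # bit-flip mutation
            pop = np.array(children)

        best = min(pop, key=bic)
        print("selected features:", np.where(best)[0])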

  11. Assessment of the magnetic field exposure due to the battery current of digital mobile phones.

    PubMed

    Jokela, Kari; Puranen, Lauri; Sihvonen, Ari-Pekka

    2004-01-01

    Hand-held digital mobile phones generate pulsed magnetic fields associated with the battery current. The peak value and the waveform of the battery current were measured for seven different models of digital mobile phones, and the results were applied to compute approximately the magnetic flux density and induced currents in the phone-user's head. A simple circular loop model was used for the magnetic field source and a homogeneous sphere consisting of average brain tissue equivalent material simulated the head. The broadband magnetic flux density and the maximal induced current density were compared with the guidelines of ICNIRP using two different approaches. In the first approach the relative exposure was determined separately at each frequency and the exposure ratios were summed to obtain the total exposure (multiple-frequency rule). In the second approach the waveform was weighted in the time domain with a simple low-pass RC filter and the peak value was divided by a peak limit, both derived from the guidelines (weighted peak approach). With the maximum transmitting power (2 W) the measured peak current varied from 1 to 2.7 A. The ICNIRP exposure ratio based on the current density varied from 0.04 to 0.14 for the weighted peak approach and from 0.08 to 0.27 for the multiple-frequency rule. The latter values are considerably greater than the corresponding exposure ratios 0.005 (min) to 0.013 (max) obtained by applying the evaluation based on frequency components presented by the new IEEE standard. Hence, the exposure does not seem to exceed the guidelines. The computed peak magnetic flux density exceeded substantially the derived peak reference level of ICNIRP, but it should be noted that in a near-field exposure the external field strengths are not valid indicators of exposure. Currently, no biological data exist to give a reason for concern about the health effects of magnetic field pulses from mobile phones.
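
    The circular-loop source model admits a simple on-axis estimate, B = mu0 * I * R^2 / (2 * (R^2 + z^2)^(3/2)); the sketch below uses an assumed loop radius and head distance for an order-of-magnitude check.

        import math

        mu0 = 4e-7 * math.pi   # vacuum permeability (T*m/A)
        I = 2.0                # peak battery current (A), cf. the 1-2.7 A measured
        R = 0.02               # effective loop radius (m), assumed
        z = 0.03               # distance from loop to tissue (m), assumed

        B = mu0 * I * R**2 / (2 * (R**2 + z**2) ** 1.5)
        print(f"peak on-axis flux density ~ {B * 1e6:.1f} microtesla")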

  12. Magnetic Field of Conductive Objects as Superposition of Elementary Eddy Currents and Eddy Current Tomography

    NASA Astrophysics Data System (ADS)

    Sukhanov, D. Ya.; Zav'yalova, K. V.

    2018-03-01

    The paper represents induced currents in an electrically conductive object as a totality of elementary eddy currents. The proposed scanning method includes measurements of only one component of the secondary magnetic field. Reconstruction of the current distribution is performed by deconvolution with regularization. Numerical modeling supported by field experiments shows that this approach is of direct practical relevance.
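
    The reconstruction step can be sketched in one dimension: a synthetic field is formed by convolving a sparse current distribution with an elementary-eddy-current kernel, and the distribution is recovered by a Tikhonov-regularized inverse filter in the Fourier domain. Kernel, noise level, and regularization parameter are all illustrative.

        import numpy as np

        n = 256
        x = np.linspace(-1, 1, n)
        kernel = np.exp(-(x / 0.05) ** 2)           # point-spread of one elementary eddy current
        current = np.zeros(n)
        current[[80, 150]] = 1.0                    # two localized current sources
        field = np.real(np.fft.ifft(np.fft.fft(current) * np.fft.fft(kernel)))
        field += 0.01 * np.random.default_rng(0).standard_normal(n)   # measurement noise

        K = np.fft.fft(kernel)
        alpha = 1e-2                                # Tikhonov regularization parameter
        est = np.real(np.fft.ifft(np.fft.fft(field) * np.conj(K) /
                                  (np.abs(K) ** 2 + alpha)))
        print("recovered source positions:", sorted(np.argsort(est)[-2:]))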

  13. Bridging the gap between habitat-modeling research and bird conservation with dynamic landscape and population models

    Treesearch

    Frank R., III Thompson

    2009-01-01

    Habitat models are widely used in bird conservation planning to assess current habitat or populations and to evaluate management alternatives. These models include species-habitat matrix or database models, habitat suitability models, and statistical models that predict abundance. While extremely useful, these approaches have some limitations.

  14. Co-occurring Substance Abuse and Mental Health Problems among Homeless Persons: Suggestions for Research and Practice.

    PubMed

    Polcin, Douglas L

    Communities throughout the U.S. are struggling to find solutions for serious and persistent homelessness. Alcohol and drug problems can be causes and consequences of homelessness, as well as co-occurring problems that complicate efforts to succeed in finding stable housing. Two prominent service models exist, one known as "Housing First" takes a harm reduction approach and the other known as the "linear" model typically supports a goal of abstinence from alcohol and drugs. Despite their popularity, the research supporting these models suffers from methodological problems and inconsistent findings. One purpose of this paper is to describe systematic reviews of the homelessness services literature, which illustrate weaknesses in research designs and inconsistent conclusions about the effectiveness of current models. Problems among some of the seminal studies on homelessness include poorly defined inclusion and exclusion criteria, inadequate measures of alcohol and drug use, unspecified or poorly implemented comparison conditions, and lack of procedures documenting adherence to service models. Several recent papers have suggested broader based approaches for homeless services that integrate alternatives and respond better to consumer needs. Practical considerations for implementing a broader system of services are described and peer managed recovery homes are presented as examples of services that address some of the gaps in current approaches. Three issues are identified that need more attention from researchers: 1) improving upon the methodological limitations in current studies, 2) assessing the impact of broader based, integrated services on outcome, and 3) assessing approaches to the service needs of homeless persons involved in the criminal justice system.

  15. Co-occurring substance abuse and mental health problems among homeless persons: Suggestions for research and practice

    PubMed Central

    Polcin, Douglas L.

    2016-01-01

    Communities throughout the U.S. are struggling to find solutions for serious and persistent homelessness. Alcohol and drug problems can be causes and consequences of homelessness, as well as co-occurring problems that complicate efforts to succeed in finding stable housing. Two prominent service models exist, one known as “Housing First” takes a harm reduction approach and the other known as the “linear” model typically supports a goal of abstinence from alcohol and drugs. Despite their popularity, the research supporting these models suffers from methodological problems and inconsistent findings. One purpose of this paper is to describe systematic reviews of the homelessness services literature, which illustrate weaknesses in research designs and inconsistent conclusions about the effectiveness of current models. Problems among some of the seminal studies on homelessness include poorly defined inclusion and exclusion criteria, inadequate measures of alcohol and drug use, unspecified or poorly implemented comparison conditions, and lack of procedures documenting adherence to service models. Several recent papers have suggested broader based approaches for homeless services that integrate alternatives and respond better to consumer needs. Practical considerations for implementing a broader system of services are described and peer-managed recovery homes are presented as examples of services that address some of the gaps in current approaches. Three issues are identified that need more attention from researchers: (1) improving upon the methodological limitations in current studies, (2) assessing the impact of broader based, integrated services on outcome, and (3) assessing approaches to the service needs of homeless persons involved in the criminal justice system. PMID:27092027

  16. Decision analysis and risk models for land development affecting infrastructure systems.

    PubMed

    Thekdi, Shital A; Lambert, James H

    2012-07-01

    Coordination and layering of models to identify risks in complex systems such as large-scale infrastructure of energy, water, and transportation is of current interest across application domains. Such infrastructures are increasingly vulnerable to adjacent commercial and residential land development. Land development can compromise the performance of essential infrastructure systems and increase the costs of maintaining or increasing performance. A risk-informed approach to this topic would be useful to avoid surprise, regret, and the need for costly remedies. This article develops a layering and coordination of models for risk management of land development affecting infrastructure systems. The layers are: system identification, expert elicitation, predictive modeling, comparison of investment alternatives, and implications of current decisions for future options. The modeling layers share a focus on observable factors that most contribute to volatility of land development and land use. The relevant data and expert evidence include current and forecasted growth in population and employment, conservation and preservation rules, land topography and geometries, real estate assessments, market and economic conditions, and other factors. The approach integrates to a decision framework of strategic considerations based on assessing risk, cost, and opportunity in order to prioritize needs and potential remedies that mitigate impacts of land development to the infrastructure systems. The approach is demonstrated for a 5,700-mile multimodal transportation system adjacent to 60,000 tracts of potential land development. © 2011 Society for Risk Analysis.

  17. Physics-of-Failure Approach to Prognostics

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.

    2017-01-01

    As electric vehicles become an increasingly common part of daily operations, a critical challenge lies in accurately predicting the behavior of the electrical components in the system. For electric vehicles, computing the remaining battery charge is safety-critical. Tackling the prediction problem requires awareness of the current state and health of the system, since the predictions must be condition-based. Predicting the future state of the system also requires knowledge of the current and future operations of the vehicle. In this presentation, our approach to developing a system-level health-monitoring safety indicator for different electronic components is presented; it runs estimation and prediction algorithms to determine state of charge and estimate the remaining useful life of the respective components. Given models of current and future system behavior, the general approach of model-based prognostics can be employed as a solution to the prediction problem and, further, for decision making.
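
    To make the estimation-then-prediction structure concrete, the sketch below pairs a bare Coulomb-counting state-of-charge estimator with a prediction loop run under an assumed future load. It is a generic illustration of the model-based prognostics pattern, not the presentation's algorithm; the capacity, currents, and end-of-discharge threshold are invented values.

      # Minimal model-based prognostics sketch: Coulomb-counting estimation,
      # then prediction to end-of-discharge. All values are illustrative.
      import numpy as np

      capacity_ah = 50.0                 # usable pack capacity [Ah] (assumed)
      soc = 0.9                          # current state-of-charge estimate
      dt_h = 1.0 / 60.0                  # 1-minute steps [h]

      # Estimation phase: integrate measured current (negative = discharge).
      measured_current = np.full(30, -20.0)   # 30 min at 20 A discharge
      for i_a in measured_current:
          soc += i_a * dt_h / capacity_ah

      # Prediction phase: propagate the state under the expected future load
      # until the end-of-discharge threshold is crossed.
      future_current, soc_eod = -25.0, 0.2
      soc_pred, minutes = soc, 0
      while soc_pred > soc_eod:
          soc_pred += future_current * dt_h / capacity_ah
          minutes += 1

      print(f"SOC now: {soc:.2f}; predicted time to end-of-discharge: {minutes} min")

    In practice the estimation phase would run a filter over a richer battery model, and the prediction would be repeated over many load hypotheses to quantify uncertainty.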

  18. Towards a whole-cell modeling approach for synthetic biology

    NASA Astrophysics Data System (ADS)

    Purcell, Oliver; Jain, Bonny; Karr, Jonathan R.; Covert, Markus W.; Lu, Timothy K.

    2013-06-01

    Despite rapid advances over the last decade, synthetic biology lacks the predictive tools needed to enable rational design. Unlike established engineering disciplines, the engineering of synthetic gene circuits still relies heavily on experimental trial-and-error, a time-consuming and inefficient process that slows down the biological design cycle. This reliance on experimental tuning is because current modeling approaches are unable to make reliable predictions about the in vivo behavior of synthetic circuits. A major reason for this lack of predictability is that current models view circuits in isolation, ignoring the vast number of complex cellular processes that impinge on the dynamics of the synthetic circuit and vice versa. To address this problem, we present a modeling approach for the design of synthetic circuits in the context of cellular networks. Using the recently published whole-cell model of Mycoplasma genitalium, we examined the effect of adding genes into the host genome. We also investigated how codon usage correlates with gene expression and found agreement with existing experimental results. Finally, we successfully implemented a synthetic Goodwin oscillator in the whole-cell model. We provide an updated software framework for the whole-cell model that lays the foundation for the integration of whole-cell models with synthetic gene circuit models. This software framework is made freely available to the community to enable future extensions. We envision that this approach will be critical to transforming the field of synthetic biology into a rational and predictive engineering discipline.

  19. A New Multivariate Approach in Generating Ensemble Meteorological Forcings for Hydrological Forecasting

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Moradkhani, Hamid

    2015-04-01

    Hydrologic ensemble forecasts are subject to various sources of uncertainty, including meteorological forcing, initial conditions, model structure, and model parameters. Producing reliable and skillful precipitation ensemble forecasts is one approach to reducing the total uncertainty in hydrological applications. Currently, Numerical Weather Prediction (NWP) models produce ensemble forecasts for various temporal ranges, but raw NWP products are known to be biased in both mean and spread. There is therefore a need for methods that can generate reliable ensemble forecasts for hydrological applications. One common technique is to apply statistical procedures to generate an ensemble forecast from NWP-generated single-value forecasts, based on the bivariate probability distribution between observations and the single-value precipitation forecast. However, the current method assumes Gaussian marginal distributions for the observed and modeled climate variables. Here, we describe and evaluate a Bayesian approach based on copula functions to develop an ensemble precipitation forecast from the conditional distribution of single-value precipitation forecasts. Copula functions express the multivariate joint distribution in terms of univariate marginal distributions and are presented as an alternative procedure for capturing the uncertainties related to meteorological forcing; they can model the joint distribution of two variables with any level of correlation and dependency. The study is conducted over a sub-basin of the Columbia River Basin in the USA, using monthly precipitation forecasts from the Climate Forecast System (CFS) at 0.5° × 0.5° spatial resolution to reproduce the observations. Verification is conducted on an independent period, and the procedure is compared against the Ensemble Pre-Processor approach currently used by the National Weather Service River Forecast Centers in the USA.
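
    The conditioning step at the heart of the copula approach can be sketched in a few lines. The fragment below is not the authors' implementation: it assumes a Gaussian copula with empirical margins, synthetic training data, and illustrative parameter values throughout.

      # Gaussian-copula conditional ensemble sketch (illustrative assumptions).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      obs_hist = rng.gamma(2.0, 30.0, size=500)              # observed precip [mm]
      fcst_hist = obs_hist * 0.8 + rng.normal(0, 15, 500)    # biased single-value forecasts

      def to_normal_scores(x):
          # Empirical-CDF transform to uniforms, then to standard-normal scores.
          return stats.norm.ppf(stats.rankdata(x) / (len(x) + 1.0))

      z_obs, z_fcst = to_normal_scores(obs_hist), to_normal_scores(fcst_hist)
      rho = np.corrcoef(z_obs, z_fcst)[0, 1]                 # copula dependence

      # Condition on a new single-value forecast: in normal-score space,
      # obs | fcst is N(rho * z_f, 1 - rho**2); map draws back through the
      # observed margin to get the precipitation ensemble.
      fcst_new = 55.0
      z_f = stats.norm.ppf(stats.percentileofscore(fcst_hist, fcst_new) / 100.0)
      z_draws = rng.normal(rho * z_f, np.sqrt(1.0 - rho**2), size=50)
      ensemble = np.quantile(obs_hist, stats.norm.cdf(z_draws))

      print(f"rho = {rho:.2f}, ensemble mean = {ensemble.mean():.1f} mm")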

  20. Angular velocity estimation based on star vector with improved current statistical model Kalman filter.

    PubMed

    Zhang, Hao; Niu, Yanxiong; Lu, Jiazhen; Zhang, He

    2016-11-20

    Angular velocity information is a requisite for spacecraft guidance, navigation, and control systems. In this paper, an approach for angular velocity estimation based solely on star vector measurements, using an improved current statistical model Kalman filter, is proposed. High-precision angular velocity estimation can be achieved under dynamic conditions, and the computational load is reduced compared with a conventional Kalman filter. Different trajectories are simulated to test the approach, and experiments with real starry-sky observations are implemented for further confirmation. The estimation accuracy is shown to be better than 10⁻⁴ rad/s under various conditions. Both the simulations and the experiments demonstrate that the described approach is effective and performs excellently under both static and dynamic conditions.
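
    The filter itself follows the standard predict/update cycle. The sketch below is a heavily simplified single-axis illustration: angular acceleration is modeled as a first-order Gauss-Markov process in the spirit of a current statistical (maneuvering) model, and all matrices and noise levels are assumed rather than taken from the paper.

      # One-axis Kalman filter sketch; state is [omega, omega_dot].
      import numpy as np

      dt, tau = 0.1, 2.0                        # step [s], maneuver correlation time [s]
      F = np.array([[1.0, dt],
                    [0.0, np.exp(-dt / tau)]])  # Gauss-Markov acceleration decay
      H = np.array([[1.0, 0.0]])                # we measure a star-vector-derived rate
      Q = np.diag([1e-8, 1e-6])                 # process noise (tuned in practice)
      R = np.array([[1e-6]])                    # measurement noise

      x, P = np.zeros(2), np.eye(2) * 1e-3

      def kf_step(x, P, z):
          x = F @ x                             # predict
          P = F @ P @ F.T + Q
          S = H @ P @ H.T + R                   # update
          K = P @ H.T @ np.linalg.inv(S)
          x = x + (K @ (z - H @ x)).ravel()
          P = (np.eye(2) - K @ H) @ P
          return x, P

      rng = np.random.default_rng(0)
      true_omega = 0.01                         # rad/s
      for _ in range(100):
          z = np.array([true_omega + rng.normal(0.0, 1e-3)])
          x, P = kf_step(x, P, z)
      print(f"estimated omega = {x[0]:.4f} rad/s")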

  1. Reverse engineering systems models of regulation: discovery, prediction and mechanisms.

    PubMed

    Ashworth, Justin; Wurtmann, Elisabeth J; Baliga, Nitin S

    2012-08-01

    Biological systems can now be understood in comprehensive and quantitative detail using systems biology approaches. Putative genome-scale models can be built rapidly based upon biological inventories and strategic system-wide molecular measurements. Current models combine statistical associations, causative abstractions, and known molecular mechanisms to explain and predict quantitative and complex phenotypes. This top-down 'reverse engineering' approach generates useful organism-scale models despite noise and incompleteness in data and knowledge. Here we review and discuss the reverse engineering of biological systems using top-down data-driven approaches, in order to improve discovery, hypothesis generation, and the inference of biological properties. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. A complex systems approach to constructing better models for managing financial markets and the economy

    NASA Astrophysics Data System (ADS)

    Farmer, J. Doyne; Gallegati, M.; Hommes, C.; Kirman, A.; Ormerod, P.; Cincotti, S.; Sanchez, A.; Helbing, D.

    2012-11-01

    We outline a vision for an ambitious program to understand the economy and financial markets as a complex evolving system of coupled networks of interacting agents. This is a completely different vision from that currently used in most economic models. This view implies new challenges and opportunities for policy and managing economic crises. The dynamics of such models inherently involve sudden and sometimes dramatic changes of state. Further, the tools and approaches we use emphasize the analysis of crises rather than of calm periods. In this they respond directly to the calls of Governors Bernanke and Trichet for new approaches to macroeconomic modelling.

  3. Application of soft computing based hybrid models in hydrological variables modeling: a comprehensive review

    NASA Astrophysics Data System (ADS)

    Fahimi, Farzad; Yaseen, Zaher Mundher; El-shafie, Ahmed

    2017-05-01

    Since the middle of the twentieth century, artificial intelligence (AI) models have been used widely in engineering and science problems. Water resource variable modeling and prediction are among the most challenging issues in water engineering. The artificial neural network (ANN) is a common approach used to tackle this problem with viable and efficient models. Numerous ANN models have been successfully developed to achieve more accurate results. In the current review, different ANN models in water resource applications and hydrological variable predictions are reviewed and outlined. In addition, recent hybrid models and their structures, input preprocessing, and optimization techniques are discussed, and the results are compared with similar previous studies. Moreover, to achieve a comprehensive view of the literature, many articles that applied ANN models together with other techniques are included. Consequently, coupling procedures, model evaluation, and performance comparisons of hybrid models with conventional ANN models are assessed, as well as the taxonomy and structures of hybrid ANN models. Finally, current challenges and recommendations for future research are indicated and new hybrid approaches are proposed.

  4. Optimization of the short-circuit current in an InP nanowire array solar cell through opto-electronic modeling.

    PubMed

    Chen, Yang; Kivisaari, Pyry; Pistol, Mats-Erik; Anttu, Nicklas

    2016-09-23

    InP nanowire arrays with axial p-i-n junctions are promising devices for next-generation photovoltaics, with a demonstrated efficiency of 13.8%. However, the short-circuit current in such arrays does not match their absorption performance. Here, through combined optical and electrical modeling, we study how the absorption of photons and separation of the resulting photogenerated electron-hole pairs define and limit the short-circuit current in the nanowires. We identify how photogenerated minority carriers in the top n segment (i.e. holes) diffuse to the ohmic top contact where they recombine without contributing to the short-circuit current. In our modeling, such contact recombination can lead to a 60% drop in the short-circuit current. To hinder such hole diffusion, we include a gradient doping profile in the n segment to create a front surface barrier. This approach leads to a modest 5% increase in the short-circuit current, limited by Auger recombination with increased doping. A more efficient approach is to switch the n segment to a material with a higher band gap, like GaP. Then, a much smaller number of holes is photogenerated in the n segment, strongly limiting the amount that can diffuse and disappear into the top contact. For a 500 nm long top segment, the GaP approach leads to a 50% higher short-circuit current than with an InP top segment. Such a long top segment could facilitate the fabrication and contacting of nanowire array solar cells. Such design schemes for managing minority carriers could open the door to higher performance in single- and multi-junction nanowire-based solar cells.

  5. Modeling microbial communities: current, developing, and future technologies for predicting microbial community interaction.

    PubMed

    Larsen, Peter; Hamada, Yuki; Gilbert, Jack

    2012-07-31

    Never has there been a greater opportunity for investigating microbial communities. Not only are the profound effects of microbial ecology on every aspect of Earth's geochemical cycles beginning to be understood, but also the analytical and computational tools for investigating microbial Earth are undergoing a rapid revolution. This environmental microbial interactome, the system of interactions between the microbiome and the environment, has shaped the planet's past and will undoubtedly continue to do so in the future. We review recent approaches for modeling microbial community structures and the interactions of microbial populations with their environments. Different modeling approaches consider the environmental microbial interactome from different aspects, and each provides insights to different facets of microbial ecology. We discuss the challenges and opportunities for the future of microbial modeling and describe recent advances in microbial community modeling that are extending current descriptive technologies into a predictive science. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Electricity Market Manipulation: How Behavioral Modeling Can Help Market Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallo, Giulia

    The question of how to best design electricity markets to integrate variable and uncertain renewable energy resources is becoming increasingly important as more renewable energy is added to electric power systems. Current markets were designed based on a set of assumptions that are not always valid in scenarios of high penetrations of renewables. In a future where renewables might have a larger impact on market mechanisms as well as financial outcomes, there is a need for modeling tools and power system modeling software that can provide policy makers and industry actors with more realistic representations of wholesale markets. One option includes using agent-based modeling frameworks. This paper discusses how key elements of current and future wholesale power markets can be modeled using an agent-based approach and how this approach may become a useful paradigm that researchers can employ when studying and planning for power systems of the future.

  7. A coarse-grained DNA model for the prediction of current signals in DNA translocation experiments

    NASA Astrophysics Data System (ADS)

    Weik, Florian; Kesselheim, Stefan; Holm, Christian

    2016-11-01

    We present an implicit solvent coarse-grained double-stranded DNA (dsDNA) model confined to an infinite cylindrical pore that reproduces the experimentally observed current modulations of a KCl solution at various concentrations. Our model extends previous coarse-grained and mean-field approaches by incorporating a position dependent friction term on the ions, which Kesselheim et al. [Phys. Rev. Lett. 112, 018101 (2014)] identified as an essential ingredient to correctly reproduce the experimental data of Smeets et al. [Nano Lett. 6, 89 (2006)]. Our approach reduces the computational effort by orders of magnitude compared with all-atom simulations and serves as a promising starting point for modeling the entire translocation process of dsDNA. We achieve a consistent description of the system's electrokinetics by using explicitly parameterized ions, a friction term between the DNA beads and the ions, and a lattice-Boltzmann model for the solvent.

  8. The development and evaluation of a hydrological seasonal forecast system prototype for predicting spring flood volumes in Swedish rivers

    NASA Astrophysics Data System (ADS)

    Foster, Kean; Bertacchi Uvo, Cintia; Olsson, Jonas

    2018-05-01

    Hydropower makes up nearly half of Sweden's electrical energy production. However, the distribution of the water resources is not aligned with demand, as most of the inflows to the reservoirs occur during the spring flood period. Carefully planned reservoir management is therefore required to redistribute water resources for optimal production, and accurate forecasts of the spring flood volume (SFV) are essential for this. The current operational SFV forecasts use a historical ensemble approach in which the HBV model is forced with historical observations of precipitation and temperature. In this work we develop and test a multi-model prototype, building on previous work, and evaluate its ability to forecast the SFV in 84 sub-basins in northern Sweden. The hypothesis explored is that a multi-model seasonal forecast system incorporating different modelling approaches is generally more skilful at forecasting the SFV in snow-dominated regions than a forecast system that uses only one approach. The testing is done using cross-validated hindcasts for the period 1981-2015, and the results are evaluated against both climatology and the current system to determine skill. Both of the multi-model methods considered showed skill over the reference forecasts. The version that combined the historical, dynamical, and statistical modelling chains performed better than the other and was chosen for the prototype. The prototype outperformed the current operational system 57 % of the time on average and reduced the error in the SFV by ~6 % across all sub-basins and forecast dates.
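
    One simple way to combine such chains, shown purely as an illustration, is to weight each chain by its inverse hindcast error variance; the prototype's actual combination scheme may differ, and the data below are synthetic.

      # Inverse-MSE weighting of three forecast chains (synthetic data).
      import numpy as np

      rng = np.random.default_rng(1)
      truth = rng.normal(100, 20, size=35)          # hindcast "true" SFV, 1981-2015
      chains = {
          "historical":  truth + rng.normal(0, 15, 35),
          "dynamical":   truth + rng.normal(0, 12, 35),
          "statistical": truth + rng.normal(0, 18, 35),
      }

      mse = {k: np.mean((v - truth) ** 2) for k, v in chains.items()}
      w = {k: 1.0 / m for k, m in mse.items()}
      total = sum(w.values())
      w = {k: v / total for k, v in w.items()}

      multi = sum(w[k] * chains[k] for k in chains)
      skill = 1.0 - np.mean((multi - truth) ** 2) / np.var(truth)  # vs climatology
      print({k: round(v, 2) for k, v in w.items()}, f"skill = {skill:.2f}")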

  9. Finite element analysis of gradient z-coil induced eddy currents in a permanent MRI magnet.

    PubMed

    Li, Xia; Xia, Ling; Chen, Wufan; Liu, Feng; Crozier, Stuart; Xie, Dexin

    2011-01-01

    In permanent magnetic resonance imaging (MRI) systems, pulsed gradient fields induce strong eddy currents in the conducting structures of the magnet body. The gradient field for image encoding is perturbed by these eddy currents leading to MR image distortions. This paper presents a comprehensive finite element (FE) analysis of the eddy current generation in the magnet conductors. In the proposed FE model, the hysteretic characteristics of ferromagnetic materials are considered and a scalar Preisach hysteresis model is employed. The developed FE model was applied to study gradient z-coil induced eddy currents in a 0.5 T permanent MRI device. The simulation results demonstrate that the approach could be effectively used to investigate eddy current problems involving ferromagnetic materials. With the knowledge gained from this eddy current model, our next step is to design a passive magnet structure and active gradient coils to reduce the eddy current effects. Copyright © 2010 Elsevier Inc. All rights reserved.

  10. Legacy nutrient dynamics and patterns of catchment response under changing land use and management

    NASA Astrophysics Data System (ADS)

    Attinger, S.; Van, M. K.; Basu, N. B.

    2017-12-01

    Watersheds are complex heterogeneous systems that store, transform, and release water and nutrients under a broad distribution of both natural and anthropogenic controls. Many current watershed models, from complex numerical models to simpler reservoir-type models, are considered to be well-developed in their ability to predict fluxes of water and nutrients to streams and groundwater. They are generally less adept, however, at capturing watershed storage dynamics. In other words, many current models are run with an assumption of steady-state dynamics, and focus on nutrient flows rather than changes in nutrient stocks within watersheds. Although these commonly used modeling approaches may be able to adequately capture short-term watershed dynamics, they are unable to represent the clear nonlinearities or hysteresis responses observed in watersheds experiencing significant changes in nutrient inputs. To address this gap, we develop a parsimonious modeling approach designed to capture long-term catchment responses to spatial and temporal changes in nutrient inputs. In this approach, we conceptualize the catchment as a biogeochemical reactor that is driven by nutrient inputs, characterized internally by both biogeochemical degradation and residence or travel time distributions, resulting in a specific nutrient output. For the model simulations, we define a range of different scenarios to represent real-world changes in land use and management implemented to improve water quality. We then introduce the concept of state-space trajectories to describe system responses to these potential changes in anthropogenic forcings. We also increase model complexity, in a stepwise fashion, by dividing the catchment into multiple biogeochemical reactors, coupled in series or in parallel. Using this approach, we attempt to answer the following questions: (1) What level of model complexity is needed to capture observed system responses? (2) How can we explain different patterns of nonlinearity in watershed nutrient dynamics? And (3) how does the accumulation of nutrient legacies within watersheds impact current and future water quality?
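
    In its simplest form, the reactor conceptualization amounts to convolving the nutrient-input history with a travel-time distribution attenuated by first-order degradation. The sketch below illustrates the idea with an assumed gamma travel-time distribution and rate constant; it is a toy version of the concept, not the study's model.

      # Catchment-as-reactor sketch: output = input history convolved with a
      # travel-time distribution weighted by first-order degradation survival.
      import numpy as np
      from scipy import stats

      dt = 1.0                                        # years
      t = np.arange(0, 100, dt)
      inputs = np.where(t < 50, 10.0, 2.0)            # load reduction at year 50

      k = 0.05                                        # degradation rate [1/yr] (assumed)
      ttd = stats.gamma(a=2.0, scale=10.0).pdf(t)     # travel-time distribution (assumed)
      transfer = ttd * np.exp(-k * t)                 # survival over travel time

      output = np.convolve(inputs, transfer)[: len(t)] * dt
      print(f"output at year 49: {output[49]:.2f}, at year 99: {output[99]:.2f}")
      # The slow decline after year 50 is the legacy-nutrient memory effect.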

  11. Linking Goal-Oriented Requirements and Model-Driven Development

    NASA Astrophysics Data System (ADS)

    Pastor, Oscar; Giachetti, Giovanni

    In the context of Goal-Oriented Requirement Engineering (GORE) there are interesting modeling approaches for the analysis of complex scenarios that are oriented to obtain and represent the relevant requirements for the development of software products. However, the way to use these GORE models in an automated Model-Driven Development (MDD) process is not clear, and, in general terms, the translation of these models into the final software products is still performed manually. Therefore, in this chapter, we show an approach to automatically link GORE models and MDD processes, which has been elaborated by considering the experience obtained from linking the i* framework with an industrially applied MDD approach. The linking approach proposed is formulated by means of a generic process that is based on current modeling standards and technologies in order to facilitate its application for different MDD and GORE approaches. Special attention is paid to how this process generates appropriate model transformation mechanisms to automatically obtain MDD conceptual models from GORE models, and how it can be used to specify validation mechanisms to assure the correct model transformations.

  12. Two approaches to improving mental health care: positivist/quantitative versus skill-based/qualitative.

    PubMed

    Luchins, Daniel

    2012-01-01

    The quality improvement model currently used in medicine and mental health was adopted from industry, where it developed out of early 20th-century efforts to apply a positivist/quantitative agenda to improving manufacturing. This article questions the application of this model to mental health care. It argues that (1) developing "operational definitions" for something as value-laden as "quality" risks conflating two realms, what we measure with what we value; (2) when measurements that are tied to individuals are aggregated to establish benchmarks and goals, unwarranted mathematical assumptions are made; (3) choosing clinical outcomes is problematic; (4) there is little relationship between process measures and clinical outcomes; and (5) since changes in quality indices do not relate to improved clinical care, management's reliance on such indices provides an illusory sense of control. An alternative model is the older, skill-based/qualitative approach to knowing, which relies on "implicit/ expert" knowledge. These two approaches offer a series of contrasts: quality versus excellence, competence versus expertise, management versus leadership, extrinsic versus intrinsic rewards. The article concludes that we need not totally dispense with the current quality improvement model, but rather should balance quantitative efforts with the older qualitative approach in a mixed methods model.

  13. Working To Learn: A Holistic Approach to Young People's Education and Training.

    ERIC Educational Resources Information Center

    Senker, Peter; Rainbird, Helen; Evans, Karen; Hodkinson, Phil; Keep, Ewart; Maguire, Malcolm; Raffe, David; Unwin, Lorna

    2000-01-01

    Highlights deficiencies in current British policies on work-based learning for 16-19 year-olds. Discusses problems arising from employers' voluntary participation. Outlines a holistic approach based on the community of practice model. (SK)

  14. General Information: Chapman Conference on Magnetospheric Current Systems

    NASA Technical Reports Server (NTRS)

    Spicer, Daniel S.; Curtis, Steven

    1999-01-01

    The goal of this conference is to address recent achievements of observational, computational, theoretical, and modeling studies, and to foster communication among people working with different approaches. Electric current systems play an important role in the energetics of the magnetosphere. This conference will target outstanding issues related to magnetospheric current systems, placing its emphasis on interregional processes and driving mechanisms of current systems.

  15. Influence of boundary conditions on the hydrodynamic forces of an oscillating sphere

    NASA Astrophysics Data System (ADS)

    Mirauda, Domenica; Negri, Marco; Martinelli, Luca; Malavasi, Stefano

    2018-06-01

    The design of submerged structures in sea currents presents problems connected not only to the shape of the obstacle but also to the number of acting forces and the correct modelling of the structure's dynamic response. Currently, the common approach is integrated numerical modelling, which considers the contribution of both the current and the structure. The reliability of such an approach is best verified with experimental tests performed on models of simple geometry. On the basis of these considerations, the present work analyses the hydrodynamic forces acting on a sphere characterised by a low mass ratio and damping. The sphere is immersed in a free-surface flow and can oscillate in the streamwise and transverse flow directions. It is located at three different positions in the current: close to the channel bottom, near the free surface, and in the middle, equally distant from both the bottom and the free surface. The results obtained for different boundary and flow kinematic conditions show a relevant influence of the free surface on the hydrodynamic forces along both the streamwise and transverse flow directions.

  16. Students’ mental model in electric current

    NASA Astrophysics Data System (ADS)

    Pramesti, Y. S.; Setyowidodo, I.

    2018-05-01

    Electricity is an essential topic in learning physics, studied from elementary school through university. Although electricity is part of our daily activities, this does not ensure that students hold the correct concepts. The aim of this research was to investigate and then categorize students' mental models. The subjects were 59 mechanical engineering students taking Physics for Engineering. The study used a qualitative, phenomenological approach. Data were analyzed qualitatively using a pre-test, a post-test, and follow-up investigation to discover further information. Three models were reported, showing patterns in how individuals think about electric current. The mental models discovered in this research are: (1) electric current as a flow; (2) electric current as a source of energy; and (3) electric current as a moving charge.

  17. Population Density and Moment-based Approaches to Modeling Domain Calcium-mediated Inactivation of L-type Calcium Channels.

    PubMed

    Wang, Xiao; Hardcastle, Kiah; Weinberg, Seth H; Smith, Gregory D

    2016-03-01

    We present a population density and moment-based description of the stochastic dynamics of domain Ca2+-mediated inactivation of L-type Ca2+ channels. Our approach accounts for the effect of heterogeneity of local Ca2+ signals on whole cell Ca2+ currents; however, in contrast with prior work, e.g., Sherman et al. (Biophys J 58(4):985-995, 1990), we do not assume that Ca2+ domain formation and collapse are fast compared to channel gating. We demonstrate the population density and moment-based modeling approaches using a 12-state Markov chain model of an L-type Ca2+ channel introduced by Greenstein and Winslow (Biophys J 83(6):2918-2945, 2002). Simulated whole cell voltage clamp responses yield an inactivation function for the whole cell Ca2+ current that agrees with the traditional approach when domain dynamics are fast. We analyze the voltage-dependence of Ca2+ inactivation that may occur via slow heterogeneous domain [Ca2+].
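
    The flavor of a population-density treatment can be conveyed with a deliberately minimal two-state channel, far simpler than the paper's 12-state model: evolve the master equation for the open-state fraction and compare it against brute-force simulation of an explicit channel population. The rates and step sizes are assumptions.

      # Two-state (closed/open) channel: master equation vs. Monte Carlo.
      import numpy as np

      kco, koc = 2.0, 1.0            # closed->open, open->closed rates [1/ms] (assumed)
      dt, T = 0.001, 5.0

      # Density (master-equation) evolution of the open probability.
      p_open = 0.0
      for _ in range(int(T / dt)):
          p_open += dt * (kco * (1.0 - p_open) - koc * p_open)

      # Explicit population of 10,000 channels, for comparison.
      rng = np.random.default_rng(2)
      state = np.zeros(10000, dtype=bool)            # all closed initially
      for _ in range(int(T / dt)):
          flip_co = ~state & (rng.random(state.size) < kco * dt)
          flip_oc = state & (rng.random(state.size) < koc * dt)
          state = (state | flip_co) & ~flip_oc

      print(f"master equation: {p_open:.3f}; Monte Carlo: {state.mean():.3f}")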

  18. POPULATION EXPOSURES TO PARTICULATE MATTER: A COMPARISON OF EXPOSURE MODEL PREDICTIONS AND MEASUREMENT DATA

    EPA Science Inventory

    The US EPA National Exposure Research Laboratory (NERL) is currently developing an integrated human exposure source-to-dose modeling system (HES2D). This modeling system will incorporate models that use a probabilistic approach to predict population exposures to environmental ...

  19. Desiderata: Towards Indigenous Models of Vocational Psychology

    ERIC Educational Resources Information Center

    Leong, Frederick T. L.; Pearce, Marina

    2011-01-01

    As a result of a relative lack of cross-cultural validity in most current (Western) psychological models, indigenous models of psychology have recently become a popular approach for understanding behaviour in specific cultures. Such models would be valuable to vocational psychology research with culturally diverse populations. Problems facing…

  20. Pupils' Representations of Electric Current before, during and after Instruction on DC Circuits.

    ERIC Educational Resources Information Center

    Psillos, D.; And Others

    1987-01-01

    Reported are compulsory education pupils' representations of electric current in a constructivist approach to introducing direct current (DC) circuits. Suggests that the pupils' views can be modelled after an energy framework. Makes suggestions about the content, the apparatus, and the experiments used in teaching DC circuits. (CW)

  1. Glass Property Models and Constraints for Estimating the Glass to be Produced at Hanford by Implementing Current Advanced Glass Formulation Efforts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vienna, John D.; Kim, Dong-Sang; Skorski, Daniel C.

    2013-07-01

    Recent glass formulation and melter testing data have suggested that significant increases in waste loading in HLW and LAW glasses are possible over current system planning estimates. The data (although limited in some cases) were evaluated to determine a set of constraints and models that could be used to estimate the maximum loading of specific waste compositions in glass. It is recommended that these models and constraints be used to estimate the likely HLW and LAW glass volumes that would result if the current glass formulation studies are successfully completed. It is recognized that some of the models are preliminary in nature and will change in the coming years. In addition, the models do not currently address the prediction uncertainties that would be needed before they could be used in plant operations. The models and constraints are only meant to give an indication of rough glass volumes and are not intended to be used in plant operation or waste form qualification activities. A current research program is in place to develop the data, models, and uncertainty descriptions for that purpose. A fundamental tenet underlying the research reported in this document is to try to be less conservative than previous studies when developing constraints for estimating the glass to be produced by implementing current advanced glass formulation efforts. The less conservative approach documented herein should allow for the estimation of glass masses that may be realized if the current efforts in advanced glass formulations are completed over the coming years and are as successful as early indications suggest they may be. Because of this approach, there is an unquantifiable uncertainty in the ultimate glass volume projections, due to model prediction uncertainties, that has to be considered along with other system uncertainties such as the waste compositions and amounts to be immobilized, split factors between LAW and HLW, etc.

  2. Remote sensing of ocean surface currents: a review of what is being observed and what is being assimilated

    NASA Astrophysics Data System (ADS)

    Isern-Fontanet, Jordi; Ballabrera-Poy, Joaquim; Turiel, Antonio; García-Ladona, Emilio

    2017-10-01

    Ocean currents play a key role in Earth's climate - they impact almost any process taking place in the ocean and are of major importance for navigation and human activities at sea. Nevertheless, their observation and forecasting are still difficult. First, no observing system is able to provide direct measurements of global ocean currents on synoptic scales. Consequently, it has been necessary to use sea surface height and sea surface temperature measurements and refer to dynamical frameworks to derive the velocity field. Second, the assimilation of the velocity field into numerical models of ocean circulation is difficult mainly due to lack of data. Recent experiments that assimilate coastal-based radar data have shown that ocean currents will contribute to increasing the forecast skill of surface currents, but require application in multidata assimilation approaches to better identify the thermohaline structure of the ocean. In this paper we review the current knowledge in these fields and provide a global and systematic view of the technologies to retrieve ocean velocities in the upper ocean and the available approaches to assimilate this information into ocean models.

  3. Models for forecasting hospital bed requirements in the acute sector.

    PubMed Central

    Farmer, R D; Emami, J

    1990-01-01

    STUDY OBJECTIVE--The aim was to evaluate the current approach to forecasting hospital bed requirements. DESIGN--The study was a time series and regression analysis. The time series of mean duration of stay for general surgery in the age group 15-44 years (1969-1982) was used to evaluate different methods of forecasting future values of mean duration of stay and their subsequent use in estimating hospital bed requirements. RESULTS--It has been suggested that the simple trend fitting approach suffers from model specification error and imposes unjustified restrictions on the data. The time series approach (Box-Jenkins method) was shown to be a more appropriate way of modelling the data. CONCLUSION--The simple trend fitting approach is inferior to the time series approach in modelling hospital bed requirements. PMID:2277253
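
    The contrast between the two methods is easy to demonstrate. The sketch below fits both a least-squares trend and a Box-Jenkins (ARIMA) model to a synthetic duration-of-stay series using statsmodels; the study's own data are not reproduced here.

      # Trend fitting vs. Box-Jenkins on a synthetic mean-duration-of-stay series.
      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(7)
      years = np.arange(1969, 1983)
      stay = 8.0 - 0.25 * (years - 1969) + rng.normal(0, 0.3, len(years))  # days

      # Simple trend fitting: ordinary least squares on time.
      slope, intercept = np.polyfit(years, stay, 1)
      trend_forecast = intercept + slope * 1983

      # Box-Jenkins: an ARIMA(1,1,0) lets the data decide how the level evolves.
      arima_forecast = ARIMA(stay, order=(1, 1, 0)).fit().forecast(1)[0]

      print(f"trend: {trend_forecast:.2f} days, ARIMA: {arima_forecast:.2f} days")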

  4. Life extending control for rocket engines

    NASA Technical Reports Server (NTRS)

    Lorenzo, C. F.; Saus, J. R.; Ray, A.; Carpino, M.; Wu, M.-K.

    1992-01-01

    The concept of life extending control is defined. A brief discussion of current fatigue life prediction methods is given, and the need for an alternative life prediction model based on a continuous functional relationship is established. Two approaches to life extending control are considered: (1) the implicit approach, which uses cyclic fatigue life prediction as a basis for control design; and (2) the continuous life prediction approach, which requires a continuous damage law. Progress on an initial formulation of a continuous (in time) fatigue model is presented. Finally, nonlinear programming is used to develop initial results for life extension of a simplified rocket engine model.

  5. Team Approach to Staffing the Reference Center: A Speculation.

    ERIC Educational Resources Information Center

    Lawson, Mollie D.; And Others

    This document applies theories of participatory management to a proposal for a model that uses a team approach to staffing university library reference centers. In particular, the Ward Edwards Library at Central Missouri State University is examined in terms of the advantages and disadvantages of its current approach. Special attention is given to…

  6. Advanced EUV mask and imaging modeling

    NASA Astrophysics Data System (ADS)

    Evanschitzky, Peter; Erdmann, Andreas

    2017-10-01

    The exploration and optimization of image formation in partially coherent EUV projection systems with complex source shapes requires flexible, accurate, and efficient simulation models. This paper reviews advanced mask diffraction and imaging models for the highly accurate and fast simulation of EUV lithography systems, addressing important aspects of the current technical developments. The simulation of light diffraction from the mask employs an extended rigorous coupled wave analysis (RCWA) approach, which is optimized for EUV applications. In order to be able to deal with current EUV simulation requirements, several additional models are included in the extended RCWA approach: a field decomposition and a field stitching technique enable the simulation of larger complex structured mask areas. An EUV multilayer defect model including a database approach makes the fast and fully rigorous defect simulation and defect repair simulation possible. A hybrid mask simulation approach combining real and ideal mask parts allows the detailed investigation of the origin of different mask 3-D effects. The image computation is done with a fully vectorial Abbe-based approach. Arbitrary illumination and polarization schemes and adapted rigorous mask simulations guarantee a high accuracy. A fully vectorial sampling-free description of the pupil with Zernikes and Jones pupils and an optimized representation of the diffraction spectrum enable the computation of high-resolution images with high accuracy and short simulation times. A new pellicle model supports the simulation of arbitrary membrane stacks, pellicle distortions, and particles/defects on top of the pellicle. Finally, an extension for highly accurate anamorphic imaging simulations is included. The application of the models is demonstrated by typical use cases.

  7. Models of current sintering

    NASA Astrophysics Data System (ADS)

    Angst, Sebastian; Engelke, Lukas; Winterer, Markus; Wolf, Dietrich E.

    2017-06-01

    Densification of (semi-)conducting particle agglomerates with the help of an electrical current is much faster and more energy efficient than traditional thermal sintering or powder compression. Therefore, this method becomes more and more common among experimentalists, engineers, and in industry. The mechanisms at work at the particle scale are highly complex because of the mutual feedback between current and pore structure. This paper extends previous modelling approaches in order to study mixtures of particles of two different materials. In addition to the delivery of Joule heat throughout the sample, especially in current bottlenecks, thermoelectric effects must be taken into account. They lead to segregation or spatial correlations in the particle arrangement. Various model extensions are possible and will be discussed.
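
    The core mechanism, Joule heat concentrating in current bottlenecks, can be illustrated with a tiny resistor-network sketch: solve Kirchhoff's equations on a few particle contacts and compute the heat dissipated in each. The geometry and conductances below are invented, and a real sintering model would couple this solve to neck growth and heat transport.

      # Resistor-network Joule heating sketch: heat peaks at the bottleneck.
      import numpy as np

      # 4-node chain with a poor contact ("bottleneck") between nodes 1 and 2.
      edges = [(0, 1, 1.0), (1, 2, 0.1), (2, 3, 1.0)]   # (i, j, conductance)
      n = 4
      G = np.zeros((n, n))                               # network Laplacian
      for i, j, g in edges:
          G[i, i] += g; G[j, j] += g
          G[i, j] -= g; G[j, i] -= g

      # Dirichlet boundary conditions: V = 1 at node 0, V = 0 at node 3.
      known = {0: 1.0, 3: 0.0}
      free = [k for k in range(n) if k not in known]
      rhs = -sum(G[np.ix_(free, [k])] * v for k, v in known.items()).ravel()
      V = np.zeros(n)
      V[free] = np.linalg.solve(G[np.ix_(free, free)], rhs)
      for k, v in known.items():
          V[k] = v

      # Joule heat per contact: P = g * (Vi - Vj)^2.
      for i, j, g in edges:
          print(f"contact {i}-{j}: P = {g * (V[i] - V[j]) ** 2:.4f}")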

  8. Integrating models with data in ecology and palaeoecology: advances towards a model-data fusion approach.

    PubMed

    Peng, Changhui; Guiot, Joel; Wu, Haibin; Jiang, Hong; Luo, Yiqi

    2011-05-01

    It is increasingly being recognized that global ecological research requires novel methods and strategies in which to combine process-based ecological models and data in cohesive, systematic ways. Model-data fusion (MDF) is an emerging area of research in ecology and palaeoecology. It provides a new quantitative approach that offers a high level of empirical constraint over model predictions based on observations using inverse modelling and data assimilation (DA) techniques. Increasing demands to integrate model and data methods over the past decade have led to MDF utilization in palaeoecology, ecology and earth system sciences. This paper reviews key features and principles of MDF and highlights different approaches with regard to DA. After providing a critical evaluation of the numerous benefits of MDF and its current applications in palaeoecology (i.e., palaeoclimatic reconstruction, palaeovegetation and palaeocarbon storage) and ecology (i.e., parameter and uncertainty estimation, model error identification, remote sensing and ecological forecasting), the paper discusses method limitations, current challenges and future research directions. In the ongoing data-rich era of today's world, MDF could become an important diagnostic and prognostic tool with which to improve our understanding of ecological processes while testing ecological theory and hypotheses and forecasting changes in ecosystem structure, function and services. © 2011 Blackwell Publishing Ltd/CNRS.

  9. Hierarchical Multi-Scale Approach To Validation and Uncertainty Quantification of Hyper-Spectral Image Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Reichardt, Thomas A.; Kulp, Thomas J.

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  10. Current models broadly neglect specific needs of biodiversity conservation in protected areas under climate change

    PubMed Central

    2011-01-01

    Background Protected areas are the most common and important instrument for the conservation of biological diversity and are called for under the United Nations' Convention on Biological Diversity. Growing human population densities, intensified land-use, invasive species and increasing habitat fragmentation threaten ecosystems worldwide and protected areas are often the only refuge for endangered species. Climate change is posing an additional threat that may also impact ecosystems currently under protection. Therefore, it is of crucial importance to include the potential impact of climate change when designing future nature conservation strategies and implementing protected area management. This approach would go beyond reactive crisis management and, by necessity, would include anticipatory risk assessments. One avenue for doing so is being provided by simulation models that take advantage of the increase in computing capacity and performance that has occurred over the last two decades. Here we review the literature to determine the state-of-the-art in modeling terrestrial protected areas under climate change, with the aim of evaluating and detecting trends and gaps in the current approaches being employed, as well as to provide a useful overview and guidelines for future research. Results Most studies apply statistical, bioclimatic envelope models and focus primarily on plant species as compared to other taxa. Very few studies utilize a mechanistic, process-based approach and none examine biotic interactions like predation and competition. Important factors like land-use, habitat fragmentation, invasion and dispersal are rarely incorporated, restricting the informative value of the resulting predictions considerably. Conclusion The general impression that emerges is that biodiversity conservation in protected areas could benefit from the application of modern modeling approaches to a greater extent than is currently reflected in the scientific literature. It is particularly true that existing models have been underutilized in testing different management options under climate change. Based on these findings we suggest a strategic framework for more effectively incorporating the impact of climate change in models exploring the effectiveness of protected areas. PMID:21539736

  11. Bridging paradigms: hybrid mechanistic-discriminative predictive models.

    PubMed

    Doyle, Orla M; Tsaneva-Atansaova, Krasimira; Harte, James; Tiffin, Paul A; Tino, Peter; Díaz-Zuccarini, Vanessa

    2013-03-01

    Many disease processes are extremely complex and characterized by multiple stochastic processes interacting simultaneously. Current analytical approaches have included mechanistic models and machine learning (ML), which are often treated as orthogonal viewpoints. However, to facilitate truly personalized medicine, new perspectives may be required. This paper reviews the use of both mechanistic models and ML in healthcare as well as emerging hybrid methods, which are an exciting and promising approach for biologically based, yet data-driven advanced intelligent systems.

  12. DATA ASSIMILATION APPROACH FOR FORECAST OF SOLAR ACTIVITY CYCLES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kitiashvili, Irina N., E-mail: irina.n.kitiashvili@nasa.gov

    Numerous attempts to predict future solar cycles are mostly based on empirical relations derived from observations of previous cycles, and they yield a wide range of predicted strengths and durations of the cycles. Results obtained with current dynamo models also deviate strongly from each other, thus raising questions about criteria to quantify the reliability of such predictions. The primary difficulties in modeling future solar activity are shortcomings of both the dynamo models and observations that do not allow us to determine the current and past states of the global solar magnetic structure and its dynamics. Data assimilation is a relatively new approach to develop physics-based predictions and estimate their uncertainties in situations where the physical properties of a system are not well-known. This paper presents an application of the ensemble Kalman filter method for modeling and prediction of solar cycles through use of a low-order nonlinear dynamo model that includes the essential physics and can describe general properties of the sunspot cycles. Despite the simplicity of this model, the data assimilation approach provides reasonable estimates for the strengths of future solar cycles. In particular, the prediction of Cycle 24 calculated and published in 2008 is so far holding up quite well. In this paper, I will present my first attempt to predict Cycle 25 using the data assimilation approach, and discuss the uncertainties of that prediction.
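
    The analysis step of a stochastic (perturbed-observation) ensemble Kalman filter, of the kind that could constrain a low-order dynamo model with sunspot observations, can be sketched as follows; the toy state, observation operator, and noise levels are assumptions, not the paper's configuration.

      # Perturbed-observation EnKF analysis step on a toy two-variable state.
      import numpy as np

      rng = np.random.default_rng(3)
      n_ens, n_state = 50, 2

      # Forecast ensemble (e.g., two field amplitudes of a toy dynamo).
      X = rng.normal([1.0, 0.5], [0.3, 0.2], size=(n_ens, n_state))

      H = np.array([[1.0, 0.0]])    # only the first component is observed
      R = np.array([[0.05]])        # observation-error covariance
      y_obs = np.array([1.3])       # observed sunspot-number proxy

      # Sample covariance from the ensemble, then the Kalman gain.
      A = X - X.mean(axis=0)
      P = A.T @ A / (n_ens - 1)
      K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)

      # Update each member against a perturbed copy of the observation.
      for m in range(n_ens):
          y_pert = y_obs + rng.normal(0.0, np.sqrt(R[0, 0]), 1)
          X[m] += (K @ (y_pert - H @ X[m])).ravel()

      print("analysis mean:", X.mean(axis=0))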

  13. UPDATE ON EPA'S URBAN WATERSHED MANAGEMENT BRANCH MODELING ACTIVITIES

    EPA Science Inventory

    This paper provides the Stormwater Management Model (SWMM) user community with a description of the Environmental Protection Agency (EPA's) Office of Research and Development (ORD) approach to urban watershed modeling research and provides an update on current ORD SWMM-related pr...

  14. Women's Self-definition in Adulthood: From a Different Model?

    ERIC Educational Resources Information Center

    Peck, Teresa A.

    1986-01-01

    Examines criticisms of existing models of adult development from both feminist and developmental psychologists. A model of women's adult self-definition is presented, based upon current research on women's adult experience. The model combines a dialectical approach, which considers the effects of social/historical factors, with a feminist…

  15. Building Path Diagrams for Multilevel Models

    ERIC Educational Resources Information Center

    Curran, Patrick J.; Bauer, Daniel J.

    2007-01-01

    Multilevel models have come to play an increasingly important role in many areas of social science research. However, in contrast to other modeling strategies, there is currently no widely used approach for graphically diagramming multilevel models. Ideally, such diagrams would serve two functions: to provide a formal structure for deriving the…

  16. The Chronic Care Model: A Collaborative Approach to Preventing and Treating Asthma in Infants and Young Children

    ERIC Educational Resources Information Center

    Wessel, Lois; Spain, Jacqueline

    2005-01-01

    The authors argue that a collaborative approach between parents and professionals is the best way to care for a young child with asthma. They use Ed Wagner's 1998 transdisciplinary Chronic Care Model as their preferred method for collaboration. More than 5 million children in the U.S. are currently affected by asthma, and a growing body of evidence…

  17. Field-dependent critical state of high-Tc superconducting strip simultaneously exposed to transport current and perpendicular magnetic field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xue, Cun; He, An; Yong, Huadong

    We present an exact analytical approach for the arbitrary field-dependent critical state of a high-Tc superconducting strip carrying a transport current. The sheet current and flux-density profiles are derived by solving the integral equations, and they agree well with experiments. For small transport currents, approximate explicit expressions for the sheet current, flux density, and penetration depth under the Kim model are derived using the mean value theorem for integration. We also extend the results to the field-dependent critical state of a superconducting strip in the simultaneous presence of an applied field and a transport current. The sheet current distributions calculated with the Kim model agree with experiments better than those calculated with the Bean model. Moreover, the lines in the Ia-Ba plane for the Kim model are not monotonic, quite unlike those of the Bean model. The results reveal that the maximum transport current in a thin superconducting strip decreases with increasing applied field, an effect that vanishes for the Bean model. The results of this paper are useful for calculating ac susceptibility and ac loss.
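
    The distinction driving these results is the field dependence of the critical current density: the Bean model takes Jc as constant, while the Kim model takes Jc(B) = Jc0 / (1 + |B|/B0). A minimal sketch with illustrative parameter values:

      # Bean vs. Kim critical current density (illustrative parameters).
      import numpy as np

      Jc0, B0 = 1.0e10, 0.1          # zero-field Jc [A/m^2], field scale [T]

      def jc_bean(B):
          return np.full_like(B, Jc0)              # field-independent

      def jc_kim(B):
          return Jc0 / (1.0 + np.abs(B) / B0)      # falls with local field

      B = np.linspace(0.0, 0.5, 6)
      print("B [T]:", B)
      print("Bean :", jc_bean(B))
      print("Kim  :", jc_kim(B))
      # The suppression of Jc at higher fields is why the maximum transport
      # current of a strip decreases with applied field under the Kim model.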

  18. An overview of modelling approaches and potential solution towards an endgame of tobacco

    NASA Astrophysics Data System (ADS)

    Halim, Tisya Farida Abdul; Sapiri, Hasimah; Abidin, Norhaslinda Zainal

    2015-12-01

    Premature mortality due to tobacco use has increased worldwide. Despite the control policies implemented to reduce it, smoking prevalence remains high. Moreover, tobacco issues have become increasingly difficult to address, since many aspects must be considered simultaneously. The purpose of this paper is therefore to present an overview of existing modelling studies of tobacco control systems. The background section describes tobacco issues and current trends. The models are categorised according to their modelling approach, either individual or integrated. Next, a framework of modelling approaches based on the integration of multi-criteria decision making, system dynamics, and nonlinear programming is proposed, which is expected to help reduce smoking prevalence. This framework provides a guideline for modelling the interaction between smoking behaviour and its impacts, tobacco control policies, and the effectiveness of each strategy in healthcare.

  19. Episodes of care: is emergency medicine ready?

    PubMed

    Wiler, Jennifer L; Beck, Dennis; Asplin, Brent R; Granovsky, Michael; Moorhead, John; Pilgrim, Randy; Schuur, Jeremiah D

    2012-05-01

    Optimizing resource use, eliminating waste, aligning provider incentives, reducing overall costs, and coordinating the delivery of quality care while improving outcomes have been major themes of health care reform initiatives. Recent legislation contains several provisions designed to move away from the current fee-for-service payment mechanism toward a model that reimburses providers for caring for a population of patients over time while shifting more financial risk to providers. In this article, we review current approaches to episode of care development and reimbursement. We describe the challenges of incorporating emergency medicine into the episode of care approach and the uncertain influence this delivery model will have on emergency medicine care, including quality outcomes. We discuss the limitations of the episode of care payment model for emergency services and advocate retention of the current fee-for-service payment model, as well as identify research gaps that, if addressed, could be used to inform future policy decisions of emergency medicine health policy leaders. We then describe a meaningful role for emergency medicine in an episode of care setting. Copyright © 2011. Published by Mosby, Inc.

  20. Modelling of sediment transport and morphological evolution under the combined action of waves and currents

    NASA Astrophysics Data System (ADS)

    Franz, Guilherme; Delpey, Matthias T.; Brito, David; Pinto, Lígia; Leitão, Paulo; Neves, Ramiro

    2017-09-01

    Coastal defence structures are often constructed to prevent beach erosion. However, poorly designed structures may cause serious erosion problems in the downdrift direction. Morphological models are useful tools to predict such impacts and assess the efficiency of defence structures for different scenarios. Nevertheless, morphological modelling is still a topic under intense research effort. The processes simulated by a morphological model depend on model complexity. For instance, undertow currents are neglected in coastal area models (2DH), which is a limitation for simulating the evolution of beach profiles for long periods. Model limitations are generally overcome by predefining invariant equilibrium profiles that are allowed to shift offshore or onshore. A more flexible approach is described in this paper, which can be generalised to 3-D models. The present work is based on the coupling of the MOHID modelling system and the SWAN wave model. The impacts of different designs of detached breakwaters and groynes were simulated in a schematic beach configuration following a 2DH approach. The results of bathymetry evolution are in agreement with the patterns found in the literature for several existing structures. The model was also tested in a 3-D test case to simulate the formation of sandbars by undertow currents. The findings of this work confirmed the applicability of the MOHID modelling system to study sediment transport and morphological changes in coastal zones under the combined action of waves and currents. The same modelling methodology was applied to a coastal zone (Costa da Caparica) located at the mouth of a mesotidal estuary (Tagus Estuary, Portugal) to evaluate the hydrodynamics and sediment transport both in calm water conditions and during events of highly energetic waves. The MOHID code is available in the GitHub repository.

  1. Toward A Simulation-Based Tool for the Treatment of Vocal Fold Paralysis

    PubMed Central

    Mittal, Rajat; Zheng, Xudong; Bhardwaj, Rajneesh; Seo, Jung Hee; Xue, Qian; Bielamowicz, Steven

    2011-01-01

    Advances in high-performance computing are enabling a new generation of software tools that employ computational modeling for surgical planning. Surgical management of laryngeal paralysis is one area where such computational tools could have a significant impact. The current paper describes a comprehensive effort to develop a software tool for planning medialization laryngoplasty where a prosthetic implant is inserted into the larynx in order to medialize the paralyzed vocal fold (VF). While this is one of the most common procedures used to restore voice in patients with VF paralysis, it has a relatively high revision rate, and the tool being developed is expected to improve surgical outcomes. This software tool models the biomechanics of airflow-induced vibration in the human larynx and incorporates sophisticated approaches for modeling the turbulent laryngeal flow, the complex dynamics of the VFs, as well as the production of voiced sound. The current paper describes the key elements of the modeling approach, presents computational results that demonstrate the utility of the approach and also describes some of the limitations and challenges. PMID:21556320

  2. Putting the psychology back into psychological models: mechanistic versus rational approaches.

    PubMed

    Sakamoto, Yasuaki; Jones, Matt; Love, Bradley C

    2008-09-01

    Two basic approaches to explaining the nature of the mind are the rational and the mechanistic approaches. Rational analyses attempt to characterize the environment and the behavioral outcomes that humans seek to optimize, whereas mechanistic models attempt to simulate human behavior using processes and representations analogous to those used by humans. We compared these approaches with regard to their accounts of how humans learn the variability of categories. The mechanistic model departs in subtle ways from rational principles. In particular, the mechanistic model incrementally updates its estimates of category means and variances through error-driven learning, based on discrepancies between new category members and the current representation of each category. The model yields a prediction, which we verify, regarding the effects of order manipulations that the rational approach does not anticipate. Although both rational and mechanistic models can successfully postdict known findings, we suggest that psychological advances are driven primarily by consideration of process and representation and that rational accounts trail these breakthroughs.
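
    The incremental, error-driven updating described above can be made concrete with a small delta-rule sketch; the learning rate and the exact update form are illustrative assumptions rather than the authors' published equations.

      def update_category(mu, var, x, lr=0.1):
          """Delta-rule update of a category's running mean and variance
          after observing a new category member x."""
          err = x - mu
          mu += lr * err                # shift the mean toward the new member
          var += lr * (err ** 2 - var)  # shift the variance toward the squared error
          return mu, var

      # Order matters: early extreme members move the estimates more, which is
      # the kind of presentation-order effect a purely rational account misses.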

  3. Real-time monitoring of a microbial electrolysis cell using an electrical equivalent circuit model.

    PubMed

    Hussain, S A; Perrier, M; Tartakovsky, B

    2018-04-01

    Efforts in developing microbial electrolysis cells (MECs) resulted in several novel approaches for wastewater treatment and bioelectrosynthesis. Practical implementation of these approaches necessitates the development of an adequate system for real-time (on-line) monitoring and diagnostics of MEC performance. This study describes a simple MEC equivalent electrical circuit (EEC) model and a parameter estimation procedure, which enable such real-time monitoring. The proposed approach involves MEC voltage and current measurements during its operation with periodic power supply connection/disconnection (on/off operation) followed by parameter estimation using either numerical or analytical solution of the model. The proposed monitoring approach is demonstrated using a membraneless MEC with flow-through porous electrodes. Laboratory tests showed that changes in the influent carbon source concentration and composition significantly affect MEC total internal resistance and capacitance estimated by the model. Fast response of these EEC model parameters to changes in operating conditions enables the development of a model-based approach for real-time monitoring and fault detection.
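
    A minimal sketch of how the internal resistance and capacitance might be recovered from one on/off cycle, assuming the voltage relaxes as a single first-order (RC) transient after disconnection; the function names and initial guesses are invented, and the paper's estimation procedure may differ.

      import numpy as np
      from scipy.optimize import curve_fit

      def relaxation(t, v_inf, dv, tau):
          """First-order voltage transient after the power supply is switched off."""
          return v_inf + dv * np.exp(-t / tau)

      def estimate_eec(t, v, i_on):
          """Fit the 'off' transient; i_on is the steady current before switching."""
          (v_inf, dv, tau), _ = curve_fit(relaxation, t, v,
                                          p0=(v[-1], v[0] - v[-1], 1.0))
          r_int = dv / i_on   # total internal resistance from the voltage step
          c = tau / r_int     # since tau = R * C for the equivalent RC branch
          return r_int, c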

  4. Stochastic and deterministic multiscale models for systems biology: an auxin-transport case study.

    PubMed

    Twycross, Jamie; Band, Leah R; Bennett, Malcolm J; King, John R; Krasnogor, Natalio

    2010-03-26

    Stochastic and asymptotic methods are powerful tools in developing multiscale systems biology models; however, little has been done in this context to compare the efficacy of these methods. The majority of current systems biology modelling research, including that of auxin transport, uses numerical simulations to study the behaviour of large systems of deterministic ordinary differential equations, with little consideration of alternative modelling frameworks. In this case study, we solve an auxin-transport model using analytical methods, deterministic numerical simulations and stochastic numerical simulations. Although the three approaches in general predict the same behaviour, the approaches provide different information that we use to gain distinct insights into the modelled biological system. We show in particular that the analytical approach readily provides straightforward mathematical expressions for the concentrations and transport speeds, while the stochastic simulations naturally provide information on the variability of the system. Our study provides a constructive comparison which highlights the advantages and disadvantages of each of the considered modelling approaches. This will prove helpful to researchers when weighing up which modelling approach to select. In addition, the paper goes some way to bridging the gap between these approaches, which in the future we hope will lead to integrative hybrid models.
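
    To see how the deterministic and stochastic framings differ in practice, the toy sketch below simulates a single birth-death compartment (a stand-in for one cell's auxin pool) with Gillespie's algorithm; the rates are arbitrary, and the real model is far richer.

      import numpy as np

      def gillespie_birth_death(k_in, k_out, n0, t_end, seed=0):
          """Exact stochastic simulation of influx (rate k_in) and first-order
          efflux (rate k_out * n); compare with the deterministic ODE
          dn/dt = k_in - k_out * n, whose steady state is k_in / k_out."""
          rng = np.random.default_rng(seed)
          t, n = 0.0, n0
          ts, ns = [t], [n]
          while t < t_end:
              rates = (k_in, k_out * n)
              total = rates[0] + rates[1]
              t += rng.exponential(1.0 / total)
              n += 1 if rng.random() < rates[0] / total else -1
              ts.append(t); ns.append(n)
          return np.array(ts), np.array(ns)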

  5. Remaining lifetime modeling using State-of-Health estimation

    NASA Astrophysics Data System (ADS)

    Beganovic, Nejra; Söffker, Dirk

    2017-08-01

    Technical systems and their components undergo gradual degradation over time. Continuous degradation is reflected in decreased system reliability and unavoidably leads to system failure. Continuous evaluation of State-of-Health (SoH) is therefore essential to guarantee at least the predefined lifetime specified by the manufacturer or, even better, to extend it. A precondition for lifetime extension is accurate estimation of SoH as well as estimation and prediction of the Remaining Useful Lifetime (RUL). For this purpose, lifetime models describing the relation between system/component degradation and consumed lifetime have to be established. This contribution discusses the modeling and selection of suitable lifetime models from a database based on current SoH conditions. The main contribution of this paper is the development of new modeling strategies capable of describing complex relations between measurable system variables, the related system degradation, and RUL. Two approaches, with their accompanying advantages and disadvantages, are introduced and compared. Both approaches can model the stochastic aging processes of a system by simultaneously adapting RUL models to the current SoH. The first approach requires a priori knowledge about aging processes in the system and accurate estimation of SoH. Here, the estimation of SoH is conditioned on tracking the actual damage accumulated in the system, so that particular model parameters are defined according to a priori assumptions about the system's aging. Prediction accuracy in this case depends strongly on accurate SoH estimation, and the model includes a high number of degrees of freedom. The second approach does not require a priori knowledge about the system's aging, as particular model parameters are defined by a multi-objective optimization procedure; its prediction accuracy depends less on the estimated SoH, and the model has fewer degrees of freedom. Both approaches rely on previously developed lifetime models, each corresponding to a predefined SoH. In the first approach, model selection is aided by a state-machine-based algorithm; in the second, model selection is conditioned on the exceedance of predefined thresholds. The approach is applied to data generated from tribological systems. By calculating the Root Squared Error (RSE), Mean Squared Error (MSE), and Absolute Error (ABE), the accuracy of the proposed models/approaches is discussed along with their related advantages and disadvantages. The approach is verified using cross-fold validation, exchanging training and test data. It can be stated that the newly introduced data-driven parametric models can be easily established, providing detailed information about remaining useful/consumed lifetime valid for systems with constant load but stochastically occurring damage.
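
    A minimal sketch of the three reported error measures, assuming RSE denotes the root of the summed squared errors; the paper's exact definitions may differ.

      import numpy as np

      def rul_errors(rul_true, rul_pred):
          """Accuracy measures for comparing predicted and true RUL."""
          e = np.asarray(rul_true) - np.asarray(rul_pred)
          return {"RSE": float(np.sqrt(np.sum(e ** 2))),
                  "MSE": float(np.mean(e ** 2)),
                  "ABE": float(np.mean(np.abs(e)))}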

  6. Shear-induced opening of the coronal magnetic field

    NASA Technical Reports Server (NTRS)

    Wolfson, Richard

    1995-01-01

    This work describes the evolution of a model solar corona in response to motions of the footpoints of its magnetic field. The mathematics involved is semianalytic, with the only numerical solution being that of an ordinary differential equation. This approach, while lacking the flexibility and physical details of full MHD simulations, allows for very rapid computation along with complete and rigorous exploration of the model's implications. We find that the model coronal field bulges upward, at first slowly and then more dramatically, in response to footpoint displacements. The energy in the field rises monotonically from that of the initial potential state, and the field configuration and energy approach asymptotically those of a fully open field. Concurrently, electric currents develop and concentrate into a current sheet as the limiting case of the open field is approached. Examination of the equations shows rigorously that in the asymptotic limit of the fully open field, the current layer becomes a true ideal MHD singularity.

  7. Evaluating disease management program effectiveness: an introduction to survival analysis.

    PubMed

    Linden, Ariel; Adams, John L; Roberts, Nancy

    2004-01-01

    Currently, the most widely used method in the disease management industry for evaluating program effectiveness is the "total population approach." This model is a pretest-posttest design, whose most basic limitation is that, without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer a plausible rationale explaining the change from baseline. Survival analysis allows for the inclusion of data from censored cases: those subjects who either "survived" the program without experiencing the event (e.g., achievement of target clinical levels, hospitalization), left the program prematurely due to disenrollment from the health plan or program, or were lost to follow-up. Additionally, independent variables may be included in the model to help explain the variability in the outcome measure. In order to maximize the potential of this statistical method, the validity of the model and research design must be assured. This paper reviews survival analysis as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.
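
    A survival analysis along these lines is straightforward to sketch, here with a Kaplan-Meier estimator and the (assumed) lifelines package; the enrollment data are invented for illustration.

      import pandas as pd
      from lifelines import KaplanMeierFitter

      # Months in the DM program and whether the event (e.g., hospitalization)
      # occurred; event = 0 marks censored cases (disenrollment, loss to
      # follow-up, or event-free at the end of observation).
      df = pd.DataFrame({"months": [3, 8, 12, 12, 5, 9, 12, 7],
                         "event":  [1, 0,  0,  1, 1, 0,  0, 1]})

      kmf = KaplanMeierFitter()
      kmf.fit(df["months"], event_observed=df["event"])
      print(kmf.survival_function_)  # estimated P(event-free) over time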

  8. Current Source Based on H-Bridge Inverter with Output LCL Filter

    NASA Astrophysics Data System (ADS)

    Blahnik, Vojtech; Talla, Jakub; Peroutka, Zdenek

    2015-09-01

    The paper deals with the control of a current source with an LCL output filter. The controlled current source is realized as a single-phase inverter whose output LCL filter provides low ripple of the output current. However, systems incorporating LCL filters require more complex control strategies, and there are several interesting approaches to the control of this type of converter. This paper presents an inverter control algorithm which combines model-based control with direct current control based on resonant controllers and single-phase vector control. The primary goals are to reduce the current ripple and distortion below required limits and to provide fast and precise control of the output current. The proposed control technique is verified by measurements on the laboratory model.
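
    As an illustration of the resonant-controller ingredient, the sketch below discretizes an ideal proportional-resonant (PR) controller tuned to the grid fundamental; the gains, frequencies and the use of scipy are illustrative assumptions, not the paper's implementation.

      import numpy as np
      from scipy import signal

      def make_pr_controller(kp, kr, f0, fs):
          """Discretize G(s) = kp + kr*s / (s^2 + w0^2) with the bilinear
          (Tustin) transform; returns digital filter coefficients."""
          w0 = 2 * np.pi * f0
          num = [kp, kr, kp * w0 ** 2]   # kp*s^2 + kr*s + kp*w0^2
          den = [1.0, 0.0, w0 ** 2]      # s^2 + w0^2
          bz, az, _ = signal.cont2discrete((num, den), 1.0 / fs,
                                           method='bilinear')
          return bz.flatten(), az

      # Apply to a sampled current-error signal e:
      # bz, az = make_pr_controller(kp=1.0, kr=200.0, f0=50.0, fs=10e3)
      # u = signal.lfilter(bz, az, e)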

  9. Crash Certification by Analysis - Are We There Yet?

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Fasanella, Edwin L.; Lyle, Karen H.

    2006-01-01

    This paper addresses the issue of crash certification by analysis. This broad topic encompasses many ancillary issues including model validation procedures, uncertainty in test data and analysis models, probabilistic techniques for test-analysis correlation, verification of the mathematical formulation, and establishment of appropriate qualification requirements. This paper will focus on certification requirements for crashworthiness of military helicopters; capabilities of the current analysis codes used for crash modeling and simulation, including some examples of simulations from the literature to illustrate the current approach to model validation; and future directions needed to achieve "crash certification by analysis."

  10. A Research Roadmap for Computation-Based Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  11. Practical examples of modeling choices and their consequences for risk assessment

    EPA Science Inventory

    Although benchmark dose (BMD) modeling has become the preferred approach to identifying a point of departure (POD) over the No Observed Adverse Effect Level, there remain challenges to its application in human health risk assessment. BMD modeling, as currently implemented by the...

  12. ASSESSING MULTIMEDIA/MULTIPATHWAY EXPOSURE TO ARSENIC USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    A series of case studies is presented focusing on multimedia/multipathway population exposures to arsenic, employing the Population Based Modeling approach of the MENTOR (Modeling Environment for Total Risks) framework. This framework considers currently five exposure routes: i...

  13. Robust Synchronization Models for Presentation System Using SMIL-Driven Approach

    ERIC Educational Resources Information Center

    Asnawi, Rustam; Ahmad, Wan Fatimah Wan; Rambli, Dayang Rohaya Awang

    2013-01-01

    Current common Presentation System (PS) models are slide based oriented and lack synchronization analysis either with temporal or spatial constraints. Such models, in fact, tend to lead to synchronization problems, particularly on parallel synchronization with spatial constraints between multimedia element presentations. However, parallel…

  14. ASSESSMENT OF DYNAMIC PRA TECHNIQUES WITH INDUSTRY AVERAGE COMPONENT PERFORMANCE DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yadav, Vaibhav; Agarwal, Vivek; Gribok, Andrei V.

    In the nuclear industry, risk monitors are intended to provide a point-in-time estimate of the system risk given the current plant configuration. Current risk monitors are limited in that they do not properly take into account the deteriorating states of plant equipment, which are unit-specific. Current approaches to computing risk monitors use probabilistic risk assessment (PRA) techniques, but the assessment is typically a snapshot in time. Living PRA models attempt to address limitations of traditional PRA models in a limited sense by including temporary changes in plant and system configurations. However, information on plant component health is not considered. This often leaves risk monitors using living PRA models incapable of conducting evaluations with dynamic degradation scenarios evolving over time. There is a need to develop enabling approaches to solidify risk monitors to provide time- and condition-dependent risk by integrating traditional PRA models with condition monitoring and prognostic techniques. This paper presents estimation of system risk evolution over time by integrating plant risk monitoring data with dynamic PRA methods incorporating aging and degradation. Several online, non-destructive approaches have been developed for diagnosing plant component conditions in the nuclear industry, e.g., condition indication indices using vibration analysis, current signatures, and operational history [1]. In this work the component performance measures at U.S. commercial nuclear power plants (NPP) [2] are incorporated within the various dynamic PRA methodologies [3] to provide better estimates of probability of failures. Aging and degradation is modeled within the Level-1 PRA framework and is applied to several failure modes of pumps; it can be extended to a range of components, viz. valves, generators, batteries, and pipes.
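
    As a generic sketch of how component aging can enter such a risk monitor, the fragment below turns a Weibull degradation model into a time-dependent basic-event probability; the distribution and parameter values are placeholders, not those of the referenced studies.

      import numpy as np

      def conditional_failure_prob(t, dt, beta, eta):
          """P(fail in (t, t+dt] | survived to t) for a Weibull(beta, eta)
          aging model, via the cumulative hazard H(t) = (t / eta)**beta."""
          H = lambda x: (x / eta) ** beta
          return 1.0 - np.exp(-(H(t + dt) - H(t)))

      # Example: a wear-out dominated pump (beta > 1) over the next month.
      p_month = conditional_failure_prob(t=5e4, dt=720.0, beta=2.2, eta=8e4)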

  15. Mesoscopic kinetic Monte Carlo modeling of organic photovoltaic device characteristics

    NASA Astrophysics Data System (ADS)

    Kimber, Robin G. E.; Wright, Edward N.; O'Kane, Simon E. J.; Walker, Alison B.; Blakesley, James C.

    2012-12-01

    Measured mobility and current-voltage characteristics of single layer and photovoltaic (PV) devices composed of poly{9,9-dioctylfluorene-co-bis[N,N'-(4-butylphenyl)]bis(N,N'-phenyl-1,4-phenylene)diamine} (PFB) and poly(9,9-dioctylfluorene-co-benzothiadiazole) (F8BT) have been reproduced by a mesoscopic model employing the kinetic Monte Carlo (KMC) approach. Our aim is to show how to avoid the uncertainties common in electrical transport models arising from the need to fit a large number of parameters when little information is available, for example, a single current-voltage curve. Here, simulation parameters are derived from a series of measurements using a self-consistent “building-blocks” approach, starting from data on the simplest systems. We found that site energies show disorder and that correlations in the site energies and a distribution of deep traps must be included in order to reproduce measured charge mobility-field curves at low charge densities in bulk PFB and F8BT. The parameter set from the mobility-field curves reproduces the unipolar current in single layers of PFB and F8BT and allows us to deduce charge injection barriers. Finally, by combining these disorder descriptions and injection barriers with an optical model, the external quantum efficiency and current densities of blend and bilayer organic PV devices can be successfully reproduced across a voltage range encompassing reverse and forward bias, with the recombination rate the only parameter to be fitted, found to be 1×10⁷ s⁻¹. These findings demonstrate an approach that removes some of the arbitrariness present in transport models of organic devices, which validates the KMC as an accurate description of organic optoelectronic systems, and provides information on the microscopic origins of the device behavior.
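
    A toy one-carrier version of the KMC transport step conveys the flavour of the approach; the Miller-Abrahams rates, 1D lattice and parameter values are illustrative simplifications of the paper's mesoscopic model.

      import numpy as np

      def kmc_hop_1d(energies, n_steps, nu0=1e12, kT=0.025, seed=1):
          """Single carrier hopping on a periodic 1D chain of site energies
          [eV] with Miller-Abrahams nearest-neighbour rates."""
          rng = np.random.default_rng(seed)
          site, t, n = 0, 0.0, len(energies)
          for _ in range(n_steps):
              rates = []
              for nb in ((site - 1) % n, (site + 1) % n):
                  dE = energies[nb] - energies[site]
                  rates.append(nu0 * np.exp(-dE / kT) if dE > 0 else nu0)
              total = rates[0] + rates[1]
              t += rng.exponential(1.0 / total)  # advance the KMC clock
              step = 1 if rng.random() < rates[1] / total else -1
              site = (site + step) % n
          return site, t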

  16. Fate of microplastics and mesoplastics carried by surface currents and wind waves: A numerical model approach in the Sea of Japan.

    PubMed

    Iwasaki, Shinsuke; Isobe, Atsuhiko; Kako, Shin'ichiro; Uchida, Keiichi; Tokai, Tadashi

    2017-08-15

    A numerical model was established to reproduce the oceanic transport processes of microplastics and mesoplastics in the Sea of Japan. A particle tracking model, where surface ocean currents were given by a combination of a reanalysis ocean current product and Stokes drift computed separately by a wave model, simulated particle movement. The model results corresponded with the field survey. Modeled results indicated the micro- and mesoplastics are moved northeastward by the Tsushima Current. Subsequently, Stokes drift selectively moves mesoplastics during winter toward the Japanese coast, resulting in increased contributions of mesoplastics south of 39°N. Additionally, Stokes drift also transports micro- and mesoplastics out to the sea area south of the subpolar front, where the northeastward Tsushima Current carries them into the open ocean via the Tsugaru and Soya straits. The average transit time of modeled particles in the Sea of Japan is drastically reduced when including Stokes drift in the model. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
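
    The role of Stokes drift in such a particle tracking model can be sketched with the deep-water surface estimate U_s = π²H²/(TL) (H wave height, T period, L wavelength) added to the ambient current; the wave inputs below are invented, and the actual model uses full wave-model output rather than this closed form.

      import numpy as np

      def track_particle(x0, u_current, H, T, L, dt, n_steps):
          """Forward-Euler advection of a surface particle by current + Stokes drift."""
          u_stokes = np.pi ** 2 * H ** 2 / (T * L)  # m/s at the surface
          x = np.empty(n_steps + 1)
          x[0] = x0
          for k in range(n_steps):
              x[k + 1] = x[k] + (u_current + u_stokes) * dt
          return x

      # e.g. 0.1 m/s current plus 2 m, 8 s, 100 m waves, tracked for a day:
      path = track_particle(0.0, 0.1, 2.0, 8.0, 100.0, dt=3600.0, n_steps=24)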

  17. APPROACH FOR ESTIMATING GLOBAL LANDFILL METHANE EMISSIONS

    EPA Science Inventory

    The report is an overview of available country-specific data and modeling approaches for estimating global landfill methane. Current estimates of global landfill methane indicate that landfills account for between 4 and 15% of the global methane budget. The report describes an ap...

  18. Multi-tiered Approach to Development of Increased Throughput Assay Models to Assess Endocrine-Disrupting Activity of Chemicals

    EPA Science Inventory

    Screening for endocrine-disrupting chemicals (EDCs) requires sensitive, scalable assays. Current high-throughput screening (HTPS) approaches for estrogenic and androgenic activity yield rapid results, but many are not sensitive to physiological hormone concentrations, suggesting ...

  19. Proposal: A Hybrid Dictionary Modelling Approach for Malay Tweet Normalization

    NASA Astrophysics Data System (ADS)

    Muhamad, Nor Azlizawati Binti; Idris, Norisma; Arshi Saloot, Mohammad

    2017-02-01

    Malay Twitter messages present a special deviation from the original language. Malay Tweets are currently widely used by Twitter users, especially in the Malay Archipelago. Thus, it is important to build a normalization system which can translate Malay Tweet language into the standard Malay language. Research in natural language processing has mainly focused on normalizing English Twitter messages, while few studies have addressed the normalization of Malay Tweets. This paper proposes an approach to normalize Malay Twitter messages based on hybrid dictionary modelling methods. This approach normalizes noisy Malay Twitter messages such as colloquial language, novel words, and interjections into standard Malay language. The research uses a language model and an N-gram model.
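
    A minimal sketch of hybrid dictionary-plus-language-model normalization; the dictionary entries, bigram probabilities and example tokens are invented for illustration.

      # Noisy token -> candidate standard Malay forms (toy dictionary).
      NOISY2STD = {"sy": ["saya"], "x": ["tak", "tidak"], "dgn": ["dengan"]}
      BIGRAM_P = {("saya", "tidak"): 0.4, ("saya", "tak"): 0.2}  # toy bigram LM

      def normalize(tokens):
          out = []
          for tok in tokens:
              cands = NOISY2STD.get(tok, [tok])
              prev = out[-1] if out else "<s>"
              # Pick the candidate the bigram language model prefers.
              out.append(max(cands, key=lambda c: BIGRAM_P.get((prev, c), 1e-6)))
          return out

      print(normalize(["sy", "x", "pergi"]))  # -> ['saya', 'tidak', 'pergi']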

  20. Implementation of a Goal-Based Systems Engineering Process Using the Systems Modeling Language (SysML)

    NASA Technical Reports Server (NTRS)

    Patterson, Jonathan D.; Breckenridge, Jonathan T.; Johnson, Stephen B.

    2013-01-01

    Building upon the purpose, theoretical approach, and use of a Goal-Function Tree (GFT) being presented by Dr. Stephen B. Johnson, described in a related Infotech 2013 ISHM abstract titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management", this paper will describe the core framework used to implement the GFTbased systems engineering process using the Systems Modeling Language (SysML). These two papers are ideally accepted and presented together in the same Infotech session. Statement of problem: SysML, as a tool, is currently not capable of implementing the theoretical approach described within the "Goal-Function Tree Modeling for Systems Engineering and Fault Management" paper cited above. More generally, SysML's current capabilities to model functional decompositions in the rigorous manner required in the GFT approach are limited. The GFT is a new Model-Based Systems Engineering (MBSE) approach to the development of goals and requirements, functions, and its linkage to design. As a growing standard for systems engineering, it is important to develop methods to implement GFT in SysML. Proposed Method of Solution: Many of the central concepts of the SysML language are needed to implement a GFT for large complex systems. In the implementation of those central concepts, the following will be described in detail: changes to the nominal SysML process, model view definitions and examples, diagram definitions and examples, and detailed SysML construct and stereotype definitions.

  1. An iterative model for the steady state current distribution in oxide-confined vertical-cavity surface-emitting lasers (VCSELs)

    NASA Astrophysics Data System (ADS)

    Chuang, Hsueh-Hua

    The purpose of this dissertation is to develop an iterative model for the analysis of the current distribution in vertical-cavity surface-emitting lasers (VCSELs) using a circuit network modeling approach. This iterative model divides the VCSEL structure into numerous annular elements and uses a circuit network consisting of resistors and diodes. The measured sheet resistance of the p-distributed Bragg reflector (DBR), the measured sheet resistance of the layers under the oxide layer, and two empirical adjustable parameters are used as inputs to the iterative model to determine the resistance of each resistor. The two empirical values are related to the anisotropy of the resistivity of the p-DBR structure. The spontaneous current, stimulated current, and surface recombination current are accounted for by the diodes. The lateral carrier transport in the quantum well region is analyzed using drift and diffusion currents. The optical gain is calculated as a function of wavelength and carrier density from fundamental principles. The predicted threshold current densities for these VCSELs match the experimentally measured current densities over the wavelength range of 0.83 μm to 0.86 μm with an error of less than 5%. This model includes the effects of the resistance of the p-DBR mirrors, the oxide current-confining layer, and spatial hole burning. Our model shows that higher sheet resistance under the oxide layer reduces the threshold current, but also reduces the current range over which single transverse mode operation occurs. The spatial hole burning profile depends on the lateral drift and diffusion of carriers in the quantum wells but is dominated by the voltage drop across the p-DBR region. To my knowledge, for the first time, the drift current and the diffusion current are treated separately. Previous work uses an ambipolar approach, which underestimates the total charge transferred in the quantum well region, especially under the oxide region. However, the combined effect of the drift and diffusion currents is less significant than the Ohmic current, especially in the cavity region. This simple iterative model is applied to commercially available oxide-confined VCSELs. The simulation results show excellent agreement with experimentally measured voltage-current curves (within 3.7% for a 10 μm and within 4% for a 5 μm diameter VCSEL) and light-current curves (within 2% for a 10 μm and within 9% for a 5 μm diameter VCSEL), and provide insight into the detailed distributions of current and voltage within a VCSEL. The difference between the theoretically calculated results and the measured results is less than the variation shown in the data sheets for production VCSELs.

  2. A Self-Consistent Model of the Interacting Ring Current Ions with Electromagnetic ICWs

    NASA Technical Reports Server (NTRS)

    Khazanov, G. V.; Gamayunov, K. V.; Jordanova, V. K.; Krivorutsky, E. N.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    Initial results from a newly developed model of the interacting ring current ions and ion cyclotron waves are presented. The model is based on a system of two coupled kinetic equations: one equation describes the ring current ion dynamics, and the other describes wave evolution. The system gives a self-consistent description of ring current ions and ion cyclotron waves in a quasilinear approach. These two equations were solved on a global scale under non-steady-state conditions during the May 2-5, 1998 storm. The structure and dynamics of the ring current proton precipitating flux regions and the wave active zones at three time cuts, around the initial, main, and late recovery phases of the May 4, 1998 storm, are presented and discussed in detail. Comparisons of the model wave-ion data with results from the Polar/HYDRA and Polar/MFE instruments are presented.

  3. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  4. Modular techniques for dynamic fault-tree analysis

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.

  5. Best practices for evaluating the capability of nondestructive evaluation (NDE) and structural health monitoring (SHM) techniques for damage characterization

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Annis, Charles; Sabbagh, Harold A.; Lindgren, Eric A.

    2016-02-01

    A comprehensive approach to NDE and SHM characterization error (CE) evaluation is presented that follows the framework of the 'ahat-versus-a' regression analysis for POD assessment. Characterization capability evaluation is typically more complex than current POD evaluations and thus requires engineering and statistical expertise in the model-building process to ensure all key effects and interactions are addressed. Justifying the statistical model choice with underlying assumptions is key. Several sizing case studies are presented with detailed evaluations of the most appropriate statistical model for each data set. The use of a model-assisted approach is introduced to help assess the reliability of NDE and SHM characterization capability under a wide range of part, environmental and damage conditions. Best practices of using models are presented for both an eddy current NDE sizing case study and a vibration-based SHM case study. The results of these studies highlight the general protocol feasibility, emphasize the importance of evaluating key application characteristics prior to the study, and demonstrate an approach to quantify the role of varying SHM sensor durability and environmental conditions on characterization performance.
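
    The 'ahat-versus-a' machinery itself is compact enough to sketch: regress the measured response on true flaw size, then convert the fit plus residual scatter into a POD curve at a decision threshold. This follows the familiar MIL-HDBK-1823-style framework under a normal-residual assumption, not the paper's full case-study models.

      import numpy as np
      from scipy import stats

      def ahat_vs_a_pod(a, ahat, a_dec):
          """Return POD(a) from a linear ahat-versus-a fit with normal residuals."""
          a, ahat = np.asarray(a), np.asarray(ahat)
          slope, intercept, *_ = stats.linregress(a, ahat)
          sigma = (ahat - (intercept + slope * a)).std(ddof=2)
          # POD(a) = P(ahat > a_dec) under the fitted model
          return lambda a_: stats.norm.cdf((intercept + slope * a_ - a_dec) / sigma)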

  6. Modeling Interfacial Glass-Water Reactions: Recent Advances and Current Limitations

    DOE PAGES

    Pierce, Eric M.; Frugier, Pierre; Criscenti, Louise J.; ...

    2014-07-12

    Describing the reactions that occur at the glass-water interface and control the development of the altered layer constitutes one of the main scientific challenges impeding existing models from providing accurate radionuclide release estimates. Radionuclide release estimates are a critical component of the safety basis for geologic repositories. The altered layer (i.e., amorphous hydrated surface layer and crystalline reaction products) represents a complex region, both physically and chemically, sandwiched between two distinct boundaries: pristine glass surface at the innermost interface and aqueous solution at the outermost interface. Computational models, spanning different length and time scales, are currently being developed to improve our understanding of this complex and dynamic process with the goal of accurately describing the pore-scale changes that occur as the system evolves. These modeling approaches include geochemical simulations [i.e., classical reaction path simulations and glass reactivity in allowance for alteration layer (GRAAL) simulations], Monte Carlo simulations, and Molecular Dynamics methods. Finally, in this manuscript, we discuss the advances and limitations of each modeling approach placed in the context of the glass-water reaction and how collectively these approaches provide insights into the mechanisms that control the formation and evolution of altered layers.

  7. A Formal Approach to Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.

  8. Approaches to simulating the “March of Bricks and Mortar”

    USGS Publications Warehouse

    Goldstein, Noah Charles; Candau, J.T.; Clarke, K.C.

    2004-01-01

    Re-creation of the extent of urban land use at different periods in time is valuable for examining how cities grow and how policy changes influence urban dynamics. To date, there has been little focus on the modeling of historical urban extent (other than for ancient cities). Instead, current modeling research has emphasized simulating the cities of the future. Predictive models can provide insights into urban growth processes and are valuable for land-use and urban planners, yet historical trends are largely ignored. This is unfortunate since historical data exist for urban areas and can be used to quantitatively test dynamic models and theory. We maintain that understanding the growth dynamics of a region's past allows more intelligent forecasts of its future. We compare using a spatio-temporal interpolation method with an agent-based simulation approach to recreate the urban extent of Santa Barbara, California, annually from 1929 to 2001. The first method uses current yet incomplete data on the construction of homes in the region. The latter uses a Cellular Automata based model, SLEUTH, to back- or hind-cast the urban extent. The success at historical urban growth reproduction of the two approaches used in this work was quantified for comparison. The performance of each method is described, as well as the utility of each model in re-creating the history of Santa Barbara. Additionally, the models’ assumptions about space are contrasted. As a consequence, we propose that both approaches are useful in historical urban simulations, yet the cellular approach is more flexible as it can be extended for spatio-temporal extrapolation.

  9. A Model-Driven Approach for Telecommunications Network Services Definition

    NASA Astrophysics Data System (ADS)

    Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.

    The present-day telecommunications market imposes a short concept-to-market time on service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool with support for collaborative work and for checking properties on models. We started by defining a prototype of the Meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific to service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstraction Layers (NALs). In the future, we will investigate approaches to ensure support for collaborative work and for checking properties on models.

  10. Integrated earth system dynamic modeling for life cycle impact assessment of ecosystem services.

    PubMed

    Arbault, Damien; Rivière, Mylène; Rugani, Benedetto; Benetto, Enrico; Tiruta-Barna, Ligia

    2014-02-15

    Despite the increasing awareness of our dependence on Ecosystem Services (ES), Life Cycle Impact Assessment (LCIA) does not explicitly and fully assess the damages caused by human activities on ES generation. Recent improvements in LCIA focus on specific cause-effect chains, mainly related to land use changes, leading to Characterization Factors (CFs) at the midpoint assessment level. However, despite the complexity and temporal dynamics of ES, current LCIA approaches consider the environmental mechanisms underneath ES to be independent from each other and devoid of dynamic character, leading to constant CFs whose representativeness is debatable. This paper takes a step forward and is aimed at demonstrating the feasibility of using an integrated earth system dynamic modeling perspective to retrieve time- and scenario-dependent CFs that consider the complex interlinkages between natural processes delivering ES. The GUMBO (Global Unified Metamodel of the Biosphere) model is used to quantify changes in ES production in physical terms - leading to midpoint CFs - and changes in human welfare indicators, which are considered here as endpoint CFs. The interpretation of the obtained results highlights the key methodological challenges to be solved to consider this approach as a robust alternative to the mainstream rationale currently adopted in LCIA. Further research should focus on increasing the granularity of environmental interventions in the modeling tools to match current standards in LCA and on adapting the conceptual approach to a spatially-explicit integrated model. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. A Bayesian model averaging method for improving SMT phrase table

    NASA Astrophysics Data System (ADS)

    Duan, Nan

    2013-03-01

    Previous methods for improving translation quality by employing multiple SMT models usually operate as a second-pass decision procedure on hypotheses from multiple systems, using extra features instead of exploiting the features of existing models in more depth. In this paper, we propose translation model generalization (TMG), an approach that updates probability feature values for the translation model being used based on the model itself and a set of auxiliary models, aiming to alleviate the over-estimation problem and enhance translation quality in the first-pass decoding phase. We validate our approach for translation models based on auxiliary models built in two different ways. We also introduce novel probability variance features into the log-linear models for further improvements. We conclude that our approach can be developed independently and integrated directly into a current SMT pipeline. We demonstrate BLEU improvements on the NIST Chinese-to-English MT tasks for single-system decodings.
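
    The core of TMG — replacing each phrase-probability feature with a weighted average over the main and auxiliary models — is easy to sketch; the phrase pairs, weights and probabilities below are toy values, not the paper's configuration.

      main = {("ni hao", "hello"): 0.9, ("ni hao", "hi"): 0.1}
      aux  = {("ni hao", "hello"): 0.6, ("ni hao", "hi"): 0.4}

      def generalize(main_tbl, aux_tbls, w_main=0.7):
          """Weighted-average phrase probabilities to damp over-estimation."""
          w_aux = (1.0 - w_main) / len(aux_tbls)
          return {pair: w_main * p + w_aux * sum(t.get(pair, 0.0) for t in aux_tbls)
                  for pair, p in main_tbl.items()}

      print(generalize(main, [aux]))  # {('ni hao', 'hello'): 0.81, ('ni hao', 'hi'): 0.19}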

  12. The mathematics of cancer: integrating quantitative models.

    PubMed

    Altrock, Philipp M; Liu, Lin L; Michor, Franziska

    2015-12-01

    Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.

  13. Comparison of vertical discretization techniques in finite-difference models of ground-water flow; example from a hypothetical New England setting

    USGS Publications Warehouse

    Harte, Philip T.

    1994-01-01

    Proper discretization of a ground-water-flow field is necessary for the accurate simulation of ground-water flow by models. Although discretization guidelines are available to ensure numerical stability, current guidelines are flexible enough (particularly in vertical discretization) to allow for some ambiguity of model results. Two common types of vertical-discretization schemes (the horizontal and nonhorizontal-model-layer approaches) were tested to simulate sloping hydrogeologic units characteristic of New England. Differences in the results of model simulations using these two approaches are small. Numerical errors associated with the use of nonhorizontal model layers are small (4 percent), even though this discretization technique does not adhere to the strict formulation of the finite-difference method. It was concluded that vertical discretization by means of the nonhorizontal-layer approach has advantages in representing the hydrogeologic units tested and in simplicity of model-data input. In addition, vertical distortion of model cells by this approach may improve the representation of shallow flow processes.

  14. Technical note: Comparison of methane ebullition modelling approaches used in terrestrial wetland models

    NASA Astrophysics Data System (ADS)

    Peltola, Olli; Raivonen, Maarit; Li, Xuefei; Vesala, Timo

    2018-02-01

    Emission via bubbling, i.e. ebullition, is one of the main methane (CH4) emission pathways from wetlands to the atmosphere. Direct measurement of gas bubble formation, growth and release in the peat-water matrix is challenging and in consequence these processes are relatively unknown and are coarsely represented in current wetland CH4 emission models. In this study we aimed to evaluate three ebullition modelling approaches and their effect on model performance. This was achieved by implementing the three approaches in one process-based CH4 emission model. All the approaches were based on some kind of threshold: either on CH4 pore water concentration (ECT), pressure (EPT) or free-phase gas volume (EBG) threshold. The model was run using 4 years of data from a boreal sedge fen and the results were compared with eddy covariance measurements of CH4 fluxes.

    Modelled annual CH4 emissions were largely unaffected by the different ebullition modelling approaches; however, temporal variability in CH4 emissions varied by an order of magnitude between the approaches. Hence the ebullition modelling approach drives the temporal variability in modelled CH4 emissions and therefore significantly impacts, for instance, high-frequency (daily scale) model comparison and calibration against measurements. The modelling approach based on the most recent knowledge of the ebullition process (volume threshold, EBG) agreed best with the measured fluxes (R² = 0.63) and hence produced the most reasonable results, although there was a scale mismatch between the measurements (ecosystem scale with heterogeneous ebullition locations) and model results (single horizontally homogeneous peat column). The approach should be favoured over the two other more widely used ebullition modelling approaches, and researchers are encouraged to implement it into their CH4 emission models.
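
    The three threshold rules can be contrasted with a schematic release step; the variable names, units and threshold values are placeholders rather than the study's parameterization.

      def ebullition_release(conc, pressure, gas_vol, thr):
          """CH4 released this time step under each threshold rule: pore-water
          concentration (ECT), pressure (EPT) or free-phase gas volume (EBG);
          each rule vents the excess above its threshold."""
          return {"ECT": max(0.0, conc - thr["conc"]),
                  "EPT": max(0.0, pressure - thr["press"]),
                  "EBG": max(0.0, gas_vol - thr["vol"])}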

  15. Determination of plasma density from data on the ion current to cylindrical and planar probes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voloshin, D. G., E-mail: dvoloshin@mics.msu.su; Vasil’eva, A. N.; Kovalev, A. S.

    2016-12-15

    To improve probe methods of plasma diagnostics, special probe measurements were performed and numerical models describing ion transport to a probe with allowance for collisions were developed. The current–voltage characteristics of cylindrical and planar probes were measured in an RF capacitive discharge in argon at a frequency of 81 MHz and plasma densities of 10¹⁰–10¹¹ cm⁻³, typical of modern RF reactors. 1D and 2D numerical models based on the particle-in-cell method with Monte Carlo collisions for simulating ion motion and the Boltzmann equilibrium for electrons are developed to describe current collection by a probe. The models were used to find the plasma density from the ion part of the current–voltage characteristic, study the effect of ion collisions, and verify simplified approaches to determining the plasma density. A 1D hydrodynamic model of the ion current to a cylindrical probe with allowance for ion collisions is proposed. For a planar probe, a method to determine the plasma density from the averaged numerical results is developed. A comparative analysis of different approaches to calculating the plasma density from the ion current to a probe is performed.

  16. Model Uncertainty and Bayesian Model Averaged Benchmark Dose Estimation for Continuous Data

    EPA Science Inventory

    The benchmark dose (BMD) approach has gained acceptance as a valuable risk assessment tool, but risk assessors still face significant challenges associated with selecting an appropriate BMD/BMDL estimate from the results of a set of acceptable dose-response models. Current approa...

  17. A MANUAL OF INSTRUCTIONAL PROBLEMS FOR THE U.S.G.S. MODFLOW MODEL

    EPA Science Inventory

    A recent report by the United States Environmental Protection Agency Groundwater Modeling Policy Study Group (van der Heijde and Park, 1986) offered several approaches to training Agency staff in the application of groundwater modeling. They identified the problem that current t...

  18. SIMULATION OF AEROSOL DYNAMICS: A COMPARATIVE REVIEW OF ALGORITHMS USED IN AIR QUALITY MODELS

    EPA Science Inventory

    A comparative review of algorithms currently used in air quality models to simulate aerosol dynamics is presented. This review addresses coagulation, condensational growth, nucleation, and gas/particle mass transfer. Two major approaches are used in air quality models to repres...

  19. Addressing the Barriers to Agile Development in the Department of Defense: Program Structure, Requirements, and Contracting

    DTIC Science & Technology

    2015-04-30

    approach directly contrast with the traditional DoD acquisition model designed for a single big-bang waterfall approach (Broadus, 2013). Currently...progress, reduce technical and programmatic risk, and respond to feedback and changes more quickly than traditional waterfall methods (Modigliani...requirements, and contracting. The DoD can address these barriers by utilizing a proactively tailored Agile acquisition model , implementing an IT Box

  20. Corima: A Bilingual Experiment in the Tarahumara Region in the State of Chihuahua, Mexico. How Does It Measure against Transitional Bilingual Programs in the United States?

    ERIC Educational Resources Information Center

    Nunez, Mario A.

    This report explores two bilingual educational approaches currently in use in Mexico and the United States. The study pursues a limited comparison between two modalities of bilingual instruction, as observed and reported in the consulted literature. The U.S. model featured is known as the two-way bilingual model, an additive approach to…

  1. Modelling fatigue and the use of fatigue models in work settings.

    PubMed

    Dawson, Drew; Ian Noy, Y; Härmä, Mikko; Akerstedt, Torbjorn; Belenky, Gregory

    2011-03-01

    In recent years, theoretical models of the sleep and circadian system developed in laboratory settings have been adapted to predict fatigue and, by inference, performance. This is typically done using the timing of prior sleep and waking or working hours as the primary input and the time course of the predicted variables as the primary output. The aim of these models is to provide employers, unions and regulators with quantitative information on the likely average level of fatigue, or risk, associated with a given pattern of work and sleep with the goal of better managing the risk of fatigue-related errors and accidents/incidents. The first part of this review summarises the variables known to influence workplace fatigue and draws attention to the considerable variability attributable to individual and task variables not included in current models. The second part reviews the current fatigue models described in the scientific and technical literature and classifies them according to whether they predict fatigue directly by using the timing of prior sleep and wake (one-step models) or indirectly by using work schedules to infer an average sleep-wake pattern that is then used to predict fatigue (two-step models). The third part of the review looks at the current use of fatigue models in field settings by organizations and regulators. Given their limitations it is suggested that the current generation of models may be appropriate for use as one element in a fatigue risk management system. The final section of the review looks at the future of these models and recommends a standardised approach for their use as an element of the 'defenses-in-depth' approach to fatigue risk management. Copyright © 2010 Elsevier Ltd. All rights reserved.

  2. Model-based classification of CPT data and automated lithostratigraphic mapping for high-resolution characterization of a heterogeneous sedimentary aquifer

    PubMed Central

    Rogiers, Bart; Mallants, Dirk; Batelaan, Okke; Gedeon, Matej; Huysmans, Marijke; Dassargues, Alain

    2017-01-01

    Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type classification (SBT) of CPT data however have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km2 with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches and yields at least equally accurate results. PMID:28467468
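
    A model-based clustering step of this kind can be sketched with a Gaussian mixture over standard CPT features; the feature choice, class count and use of scikit-learn are assumptions for illustration, not the paper's exact setup.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      def classify_sbt(qc, fs, n_classes=5):
          """Cluster CPT records by log cone resistance and log friction ratio."""
          X = np.column_stack([np.log10(qc), np.log10(100.0 * fs / qc)])
          gmm = GaussianMixture(n_components=n_classes, covariance_type="full",
                                random_state=0).fit(X)
          return gmm.predict(X)  # one soil behaviour type label per record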

  3. Colorectal Cancer Deaths Attributable to Nonuse of Screening in the United States

    PubMed Central

    Meester, Reinier G.S.; Doubeni, Chyke A.; Lansdorp-Vogelaar, Iris; Goede, S.L.; Levin, Theodore R.; Quinn, Virginia P.; van Ballegooijen, Marjolein; Corley, Douglas A.; Zauber, Ann G.

    2015-01-01

    Purpose: Screening is a major contributor to colorectal cancer (CRC) mortality reductions in the U.S., but is underutilized. We estimated the fraction of CRC deaths attributable to nonuse of screening to demonstrate the potential benefits from targeted interventions. Methods: The established MISCAN-colon microsimulation model was used to estimate the population attributable fraction (PAF) in people aged ≥50 years. The model incorporates long-term patterns and effects of screening by age and type of screening test. PAF for 2010 was estimated using currently available data on screening uptake; PAF was also projected assuming constant future screening rates to incorporate lagged effects from past increases in screening uptake. We also computed PAF using Levin's formula to gauge how this simpler approach differs from the model-based approach. Results: There were an estimated 51,500 CRC deaths in 2010, about 63% (N∼32,200) of which were attributable to non-screening. The PAF decreases slightly to 58% in 2020. Levin's approach yielded a considerably more conservative PAF of 46% (N∼23,600) for 2010. Conclusions: The majority of current U.S. CRC deaths are attributable to non-screening. This underscores the potential benefits of increasing screening uptake in the population. Traditional methods of estimating PAF underestimated screening effects compared with model-based approaches. PMID:25721748
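
    For reference, Levin's classical formula computes the attributable fraction from the prevalence p of the exposure (here, non-screening) and the relative risk RR of CRC death among the non-screened; this is the standard epidemiological identity rather than the MISCAN-specific computation:

      PAF = p(RR - 1) / (1 + p(RR - 1))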

  4. Inelastic cotunneling with energy-dependent contact transmission

    NASA Astrophysics Data System (ADS)

    Blok, S.; Agundez Mojarro, R. R.; Maduro, L. A.; Blaauboer, M.; Van Der Molen, S. J.

    2017-03-01

    We investigate inelastic cotunneling in a model system where the charging island is connected to the leads through molecules with energy-dependent transmission functions. To study this problem, we propose two different approaches. The first is a pragmatic approach that assumes Lorentzian-like transmission functions that determine the transmission probability to the island. Using this model, we calculate current versus voltage (IV) curves for increasing resonance level positions of the molecule. We find that shifting the resonance energy of the molecule away from the Fermi energy of the contacts leads to a decreased current at low bias, but as bias increases, this difference decreases and eventually inverts. This is markedly different from IV behavior outside the cotunneling regime. The second approach involves multiple cotunneling, where the molecules themselves are also considered to be in the Coulomb blockade regime. We find here that when E_c ≫ eV, k_B T, the IV behavior approaches the original cotunneling behavior proposed by Averin and Nazarov [Phys. Rev. Lett. 65, 2446-2449 (1990)].

  6. Systematic Applications of Metabolomics in Metabolic Engineering

    PubMed Central

    Dromms, Robert A.; Styczynski, Mark P.

    2012-01-01

    The goals of metabolic engineering are well-served by the biological information provided by metabolomics: information on how the cell is currently using its biochemical resources is perhaps one of the best ways to inform strategies to engineer a cell to produce a target compound. Using the analysis of extracellular or intracellular levels of the target compound (or a few closely related molecules) to drive metabolic engineering is quite common. However, there is surprisingly little systematic use of metabolomics datasets, which simultaneously measure hundreds of metabolites rather than just a few, for that same purpose. Here, we review the most common systematic approaches to integrating metabolite data with metabolic engineering, with emphasis on existing efforts to use whole-metabolome datasets. We then review some of the most common approaches for computational modeling of cell-wide metabolism, including constraint-based models, and discuss current computational approaches that explicitly use metabolomics data. We conclude with discussion of the broader potential of computational approaches that systematically use metabolomics data to drive metabolic engineering. PMID:24957776
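
    As a concrete illustration of the constraint-based class mentioned above, flux balance analysis solves a linear program: maximize an objective flux subject to steady-state mass balance S v = 0 and flux bounds. A toy sketch with an invented three-reaction network, nothing like a genome-scale model:

```python
import numpy as np
from scipy.optimize import linprog

# Metabolites: A, B. Reactions: uptake -> A, A -> B, B -> biomass.
S = np.array([[1, -1,  0],      # mass balance on A
              [0,  1, -1]])     # mass balance on B
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 units

# linprog minimizes, so negate the biomass objective (third reaction).
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds)
print("optimal biomass flux:", -res.fun, "fluxes:", res.x)
```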

  7. Social Work Education on Mental Health: Postmodern Discourse and the Medical Model

    ERIC Educational Resources Information Center

    Casstevens, W. J.

    2010-01-01

    This article provides a pedagogical approach to presenting alternatives along with the traditional medical model in the context of mental health treatment and service provision. Given the current influence of the medical model in community mental health, this article outlines a rationale for challenging the model and considering alternative models…

  8. Incorporating uncertainty into mercury-offset decisions with a probabilistic network for National Pollutant Discharge Elimination System permit holders: an interim report

    USGS Publications Warehouse

    Wood, Alexander

    2004-01-01

    This interim report describes an alternative approach for evaluating the efficacy of using mercury (Hg) offsets to improve water quality. Hg-offset programs may allow dischargers facing higher pollution-control costs to meet their regulatory obligations by making more cost-effective pollutant-reduction decisions. Efficient Hg management requires methods to translate the science and economics into a regulatory decision framework. This report documents the work in progress by the U.S. Geological Survey's Western Geographic Science Center in collaboration with Stanford University toward developing this decision framework to help managers, regulators, and other stakeholders decide whether offsets can cost-effectively meet the Hg total maximum daily load (TMDL) requirements in the Sacramento River watershed. Two key approaches being considered are: (1) a probabilistic approach that explicitly incorporates scientific uncertainty, cost information, and value judgments; and (2) a quantitative approach that captures uncertainty in testing the feasibility of Hg offsets. Current fate and transport-process models commonly attempt to predict chemical transformations and transport pathways deterministically. However, the physical, chemical, and biologic processes controlling the fate and transport of Hg in aquatic environments are complex and poorly understood. Deterministic models of Hg environmental behavior contain large uncertainties, reflecting this lack of understanding. The uncertainty in these underlying physical processes may produce similarly large uncertainties in the decisionmaking process. However, decisions about control strategies are still being made despite the large uncertainties in current Hg loadings, the relations between total Hg (HgT) loading and methylmercury (MeHg) formation, and the relations between control efforts and Hg content in fish. The research presented here focuses on an alternative analytical approach to the current use of safety factors and deterministic methods for Hg TMDL decision support, one that is fully compatible with an adaptive management approach. This alternative approach uses empirical data and informed judgment to provide a scientific and technical basis for helping National Pollutant Discharge Elimination System (NPDES) permit holders make management decisions. An Hg-offset system would be an option if a wastewater-treatment plant could not achieve NPDES permit requirements for HgT reduction. We develop a probabilistic decision-analytical model consisting of three submodels for HgT loading, MeHg, and cost mitigation within a Bayesian network that integrates information of varying rigor and detail into a simple model of a complex system. Hg processes are identified and quantified by using a combination of historical data, statistical models, and expert judgment. Such an integrated approach to uncertainty analysis allows easy updating of prediction and inference when observations of model variables are made. We demonstrate our approach with data from the Cache Creek watershed (a subbasin of the Sacramento River watershed). The empirical models used to generate the needed probability distributions are based on the same empirical models currently being used by the Central Valley Regional Water Quality Control Board's Cache Creek Hg TMDL working group. The significant difference is that input uncertainty and error are explicitly included in the model and propagated throughout its algorithms.
This work demonstrates how to integrate uncertainty into the complex and highly uncertain Hg TMDL decisionmaking process. The various sources of uncertainty are propagated as decision risk, which allows decisionmakers to simultaneously consider uncertainties in remediation/implementation costs while attempting to meet environmental/ecologic targets. We must note that this research is ongoing. As more data are collected, the HgT and cost-mitigation submodels are updated and the uncer…
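
    A minimal sketch of the Bayesian-network style of model described above: discrete nodes chained from HgT loading through MeHg formation to fish Hg, with inference by marginalizing over the intermediate node. The structure and every probability below are hypothetical placeholders, not the report's submodels.

```python
# P(high HgT loading), P(high MeHg | loading), P(high fish Hg | MeHg).
P_load = {True: 0.3, False: 0.7}
P_mehg = {True: {True: 0.8, False: 0.2},
          False: {True: 0.3, False: 0.7}}
P_fish = {True: {True: 0.9, False: 0.1},
          False: {True: 0.2, False: 0.8}}

def p_fish_high(load: bool) -> float:
    # Marginalize over the MeHg node: sum_m P(fish | m) * P(m | load).
    return sum(P_fish[m][True] * P_mehg[load][m] for m in (True, False))

print("P(high fish Hg | high loading):", p_fish_high(True))   # 0.76
print("P(high fish Hg | low loading): ", p_fish_high(False))  # 0.41
```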

  9. Modeling fuels and fire effects in 3D: Model description and applications

    Treesearch

    Francois Pimont; Russell Parsons; Eric Rigolot; Francois de Coligny; Jean-Luc Dupuy; Philippe Dreyfus; Rodman R. Linn

    2016-01-01

    Scientists and managers critically need ways to assess how fuel treatments alter fire behavior, yet few tools currently exist for this purpose. We present a spatially explicit fuel-modeling system, FuelManager, which models fuels, vegetation growth, fire behavior (using a physics-based model, FIRETEC), and fire effects. FuelManager's flexible approach facilitates...

  10. Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Model View Definition

    DTIC Science & Technology

    2013-06-01

    building information models (BIM) at the coordinated design stage of building construction. 1.3 Approach To... standard for exchanging Building Information Modeling (BIM) data, which defines hundreds of classes for common use in software, currently supported by... specifications, Construction Operations Building information exchange (COBie), Building Information Modeling (BIM)

  11. Prediction of car cabin environment by means of 1D and 3D cabin model

    NASA Astrophysics Data System (ADS)

    Fišer, J.; Pokorný, J.; Jícha, M.

    2012-04-01

    Thermal comfort, and the reduction of the energy requirements of air-conditioning systems in vehicle cabins, are currently the subject of intensive investigation. The article deals with two approaches to modelling the car cabin environment: the first model was created in the simulation language Modelica (a typical 1D approach without cabin geometry) and the second in the specialized software Theseus-FE (a 3D approach with cabin geometry). The performance and capabilities of these tools are demonstrated on the example of a car cabin, and the simulation results are compared with measurements from a real car cabin in a climate chamber.

  12. Gene Therapy and Targeted Toxins for Glioma

    PubMed Central

    Castro, Maria G.; Candolfi, Marianela; Kroeger, Kurt; King, Gwendalyn D.; Curtin, James F.; Yagiz, Kader; Mineharu, Yohei; Assi, Hikmat; Wibowo, Mia; Muhammad, AKM Ghulam; Foulad, David; Puntel, Mariana; Lowenstein, Pedro R.

    2011-01-01

    The most common primary brain tumor in adults is glioblastoma. These tumors are highly invasive and aggressive, with a mean survival time of nine to twelve months from diagnosis to death. Current treatment modalities are unable to significantly prolong survival in patients diagnosed with glioblastoma. As such, glioma is an attractive target for developing novel therapeutic approaches utilizing gene therapy. This review will examine the available preclinical models for glioma, including xenograft, syngeneic, and genetic models. Several promising therapeutic targets are currently being pursued in pre-clinical investigations. These targets will be reviewed by mechanism of action, i.e., conditional cytotoxicity, targeted toxins, oncolytic viruses, tumor suppressors/oncogenes, and immune stimulatory approaches. Preclinical gene therapy paradigms aim to determine which strategies will provide rapid tumor regression and long-term protection from recurrence. While a wide range of potential targets are being investigated preclinically, only the most efficacious are further transitioned into clinical trial paradigms. Clinical trials reported to date are summarized, including results from conditionally cytotoxic, targeted toxin, oncolytic virus, and oncogene targeting approaches. Clinical trial results have not been as robust as preclinical models predicted; this could be due to the limitations of the GBM models employed. Once this is addressed, and we develop effective gene therapies in models that better replicate the clinical scenario, gene therapy will provide a powerful approach to treat and manage brain tumors. PMID:21453286

  13. On tridimensional rip current modeling

    NASA Astrophysics Data System (ADS)

    Marchesiello, Patrick; Benshila, Rachid; Almar, Rafael; Uchiyama, Yusuke; McWilliams, James C.; Shchepetkin, Alexander

    2015-12-01

    Do lateral shear instabilities of nearshore circulation account for a substantial part of Very Low-Frequency (VLF) variability? If so, such instabilities would promote stirring and mixing of coastal waters and surf-shelf exchanges. Another question is whether tridimensional transient processes are important for instability generation. An innovative modeling system with tridimensional wave-current interactions was designed to investigate transient nearshore currents and interactions between nearshore and inner-shelf circulations. We present here a validation of rip current modeling for the Aquitanian coast of France, using in-situ and remote video sensing. We then proceed to show the benefits of 3D versus 2D (depth-mean flow) modeling of rip currents and their low-frequency variability. It appears that a large part of VLF motions is due to intrinsic variability of the tridimensional flow. 3D models may thus provide a valuable, only marginally more expensive alternative to conventional 2D approaches, which miss the vertical flow structure and its nonlinear interaction with the depth-averaged flow.

  14. Agent-Based Modeling in Public Health: Current Applications and Future Directions.

    PubMed

    Tracy, Melissa; Cerdá, Magdalena; Keyes, Katherine M

    2018-04-01

    Agent-based modeling is a computational approach in which agents with a specified set of characteristics interact with each other and with their environment according to predefined rules. We review key areas in public health where agent-based modeling has been adopted, including both communicable and noncommunicable disease, health behaviors, and social epidemiology. We also describe the main strengths and limitations of this approach for questions with public health relevance. Finally, we describe both methodologic and substantive future directions that we believe will enhance the value of agent-based modeling for public health. In particular, advances in model validation, comparisons with other causal modeling procedures, and the expansion of the models to consider comorbidity and joint influences more systematically will improve the utility of this approach to inform public health research, practice, and policy.
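
    A minimal sketch of the paradigm as defined above: agents with a state, interacting with randomly met others according to predefined rules. The toy SIR contagion below is illustrative only; every rule and rate is invented.

```python
import random

random.seed(1)
N, P_INFECT, P_RECOVER, CONTACTS = 500, 0.05, 0.1, 5
state = ["S"] * N
state[0] = "I"                                   # seed one infectious agent

for step in range(100):
    for i in [j for j, s in enumerate(state) if s == "I"]:
        # Rule 1: each infectious agent meets a few random others.
        for j in random.sample(range(N), CONTACTS):
            if state[j] == "S" and random.random() < P_INFECT:
                state[j] = "I"
        # Rule 2: recovery with a fixed per-step probability.
        if random.random() < P_RECOVER:
            state[i] = "R"

print({s: state.count(s) for s in "SIR"})
```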

  15. Model-Driven Useware Engineering

    NASA Astrophysics Data System (ADS)

    Meixner, Gerrit; Seissler, Marc; Breiner, Kai

    User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.

  16. Efficient prediction of terahertz quantum cascade laser dynamics from steady-state simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agnew, G.; Lim, Y. L.; Nikolić, M.

    2015-04-20

    Terahertz-frequency quantum cascade lasers (THz QCLs) based on bound-to-continuum active regions are difficult to model owing to their large number of quantum states. We present a computationally efficient reduced rate equation (RE) model that reproduces the experimentally observed variation of THz power with respect to drive current and heat-sink temperature. We also present dynamic (time-domain) simulations under a range of drive currents and predict an increase in modulation bandwidth as the current approaches the peak of the light–current curve, as observed experimentally in mid-infrared QCLs. We account for temperature and bias dependence of the carrier lifetimes, gain, and injection efficiency, calculated from a full rate equation model. The temperature dependence of the simulated threshold current, emitted power, and cut-off current are thus all reproduced accurately with only one fitting parameter, the interface roughness, in the full REs. We propose that the model could therefore be used for rapid dynamical simulation of QCL designs.
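
    A generic two-level reduced rate-equation sketch of this modeling style: carrier density n and photon density s integrated in time, then swept over pump strength to trace a light-current-style curve. These are the textbook laser REs in normalized units with made-up coefficients, not the paper's THz QCL model.

```python
from scipy.integrate import solve_ivp

TAU_N, TAU_P, G, N_TR, BETA = 1.0, 0.1, 100.0, 1.0, 1e-4  # normalized units

def rhs(t, y, pump):
    n, s = y
    gain = G * (n - N_TR)
    dn = pump - n / TAU_N - gain * s               # pump, decay, stimulated
    ds = gain * s - s / TAU_P + BETA * n / TAU_N   # gain, loss, spontaneous
    return [dn, ds]

# With these numbers the threshold sits near pump ~ 1.1; output switches on
# above it, mimicking a light-current sweep.
for pump in (0.8, 1.1, 2.0):
    sol = solve_ivp(rhs, (0.0, 50.0), [0.0, 0.0], args=(pump,),
                    method="LSODA", rtol=1e-8)
    print(f"pump={pump:.1f}  photon density -> {sol.y[1, -1]:.4f}")
```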

  17. Aphasia: Current Concepts in Theory and Practice

    PubMed Central

    Tippett, Donna C.; Niparko, John K.; Hillis, Argye E.

    2014-01-01

    Recent advances in neuroimaging contribute new insights regarding brain-behavior relationships and expand understanding of the functional neuroanatomy of language. Modern concepts of the functional neuroanatomy of language invoke rich and complex models of language comprehension and expression, such as dual stream networks. Increasingly, aphasia is seen as a disruption of cognitive processes underlying language. Rehabilitation of aphasia incorporates evidence-based and person-centered approaches. Novel techniques for delivering cortical brain stimulation to modulate cortical excitability, such as repetitive transcranial magnetic stimulation and transcranial direct current stimulation, are just beginning to be explored. In this review, we discuss the historical context of the foundations of neuroscientific approaches to language. We sample the emergent theoretical models of the neural substrates of language and of the cognitive processes underlying aphasia that contribute to more refined and nuanced concepts of language. Current concepts of aphasia rehabilitation are reviewed, including the promising role of cortical stimulation as an adjunct to behavioral therapy and changes in therapeutic approaches based on principles of neuroplasticity and evidence-based/person-centered practice to optimize functional outcomes. PMID:24904925

  18. Efficacy of monitoring and empirical predictive modeling at improving public health protection at Chicago beaches

    USGS Publications Warehouse

    Nevers, Meredith B.; Whitman, Richard L.

    2011-01-01

    Efforts to improve public health protection in recreational swimming waters have focused on obtaining real-time estimates of water quality. Current monitoring techniques rely on the time-intensive culturing of fecal indicator bacteria (FIB) from water samples, but rapidly changing FIB concentrations result in management errors that lead to the public being exposed to high FIB concentrations (type II error) or beaches being closed despite acceptable water quality (type I error). Empirical predictive models may provide a rapid solution, but their effectiveness at improving health protection has not been adequately assessed. We sought to determine if emerging monitoring approaches could effectively reduce risk of illness exposure by minimizing management errors. We examined four monitoring approaches (inactive, current protocol, a single predictive model for all beaches, and individual models for each beach) with increasing refinement at 14 Chicago beaches, using historical monitoring and hydrometeorological data, and compared management outcomes using different standards for decision-making. Predictability (R2) of FIB concentration improved with model refinement at all beaches but one. Predictive models did not always reduce the number of management errors and therefore the overall illness burden. Use of a Chicago-specific single-sample standard, rather than the widely used default of 235 E. coli CFU/100 ml, together with predictive modeling resulted in the greatest number of open beach days without any increase in public health risk. These results emphasize that emerging monitoring approaches such as empirical models are not equally applicable at all beaches, and combining monitoring approaches may expand beach access.
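
    A sketch of how the management errors defined above can be tallied once a predictive model and a single-sample standard are in hand; the data here are synthetic stand-ins for culture results and model predictions.

```python
import numpy as np

rng = np.random.default_rng(7)
true_log = rng.normal(1.8, 0.6, 500)            # "observed" log10 E. coli
pred_log = true_log + rng.normal(0, 0.3, 500)   # model prediction with error
STD = np.log10(235)                             # single-sample standard

exceed = true_log > STD
closed = pred_log > STD
type_i = int(np.sum(closed & ~exceed))    # closed, water actually acceptable
type_ii = int(np.sum(~closed & exceed))   # open, standard actually exceeded
print(f"type I errors: {type_i}, type II errors: {type_ii}")
```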

  19. Current views on HIV-1 latency, persistence, and cure.

    PubMed

    Melkova, Zora; Shankaran, Prakash; Madlenakova, Michaela; Bodor, Josef

    2017-01-01

    HIV-1 infection cannot be cured, as it persists in latently infected cells that are targeted neither by the immune system nor by available therapeutic approaches. Consequently, a lifelong therapy suppressing only the actively replicating virus is necessary. The latent reservoir has been defined and characterized in various experimental models and in human patients, allowing research and development of approaches targeting individual steps critical for HIV-1 latency establishment, maintenance, and reactivation. However, additional mechanisms and processes driving the remaining low-level HIV-1 replication in the presence of suppressive therapy still remain to be identified and targeted. Current approaches toward an HIV-1 cure include attempts to reactivate and purge HIV latently infected cells (the so-called "shock and kill" strategy), as well as approaches involving gene therapy and/or gene editing and stem cell transplantation aimed at the generation of cells resistant to HIV-1. This review summarizes current views and concepts underlying different approaches aiming at a functional or sterilizing cure of HIV-1 infection.

  20. Design principles for shift current photovoltaics

    DOE PAGES

    Cook, Ashley M.; Fregoso, Benjamin M.; de Juan, Fernando; ...

    2017-01-25

    While the basic principles of conventional solar cells are well understood, little attention has gone towards maximizing the efficiency of photovoltaic devices based on shift currents. By analysing effective models, here we outline simple design principles for the optimization of shift currents for frequencies near the band gap. This method allows us to express the band edge shift current in terms of a few model parameters and to show it depends explicitly on wavefunctions in addition to standard band structure. We use our approach to identify two classes of shift current photovoltaics, ferroelectric polymer films and single-layer orthorhombic monochalcogenides such as GeS, which display the largest band edge responsivities reported so far. Moreover, exploring the parameter space of the tight-binding models that describe them, we find photoresponsivities that can exceed 100 mA W−1. Our results illustrate the great potential of shift current photovoltaics to compete with conventional solar cells.

  1. Design principles for shift current photovoltaics

    PubMed Central

    Cook, Ashley M.; Fregoso, Benjamin M.; de Juan, Fernando; Coh, Sinisa; Moore, Joel E.

    2017-01-01

    While the basic principles of conventional solar cells are well understood, little attention has gone towards maximizing the efficiency of photovoltaic devices based on shift currents. By analysing effective models, here we outline simple design principles for the optimization of shift currents for frequencies near the band gap. Our method allows us to express the band edge shift current in terms of a few model parameters and to show it depends explicitly on wavefunctions in addition to standard band structure. We use our approach to identify two classes of shift current photovoltaics, ferroelectric polymer films and single-layer orthorhombic monochalcogenides such as GeS, which display the largest band edge responsivities reported so far. Moreover, exploring the parameter space of the tight-binding models that describe them we find photoresponsivities that can exceed 100 mA W−1. Our results illustrate the great potential of shift current photovoltaics to compete with conventional solar cells. PMID:28120823

  2. Design principles for shift current photovoltaics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Ashley M.; Fregoso, Benjamin M.; de Juan, Fernando

    While the basic principles of conventional solar cells are well understood, little attention has gone towards maximizing the efficiency of photovoltaic devices based on shift currents. By analysing effective models, here we outline simple design principles for the optimization of shift currents for frequencies near the band gap. This method allows us to express the band edge shift current in terms of a few model parameters and to show it depends explicitly on wavefunctions in addition to standard band structure. We use our approach to identify two classes of shift current photovoltaics, ferroelectric polymer films and single-layer orthorhombic monochalcogenides such as GeS, which display the largest band edge responsivities reported so far. Moreover, exploring the parameter space of the tight-binding models that describe them, we find photoresponsivities that can exceed 100 mA W−1. Our results illustrate the great potential of shift current photovoltaics to compete with conventional solar cells.

  3. Causal Models for Mediation Analysis: An Introduction to Structural Mean Models.

    PubMed

    Zheng, Cheng; Atkins, David C; Zhou, Xiao-Hua; Rhew, Isaac C

    2015-01-01

    Mediation analyses are critical to understanding why behavioral interventions work. To yield a causal interpretation, common mediation approaches must make an assumption of "sequential ignorability." The current article describes an alternative approach to causal mediation called structural mean models (SMMs). A specific SMM called a rank-preserving model (RPM) is introduced in the context of an applied example. Particular attention is given to the assumptions of both approaches to mediation. Applying both mediation approaches to the college student drinking data yields notable differences in the magnitude of effects. Simulated examples reveal instances in which the traditional approach can yield strongly biased results, whereas the RPM approach remains unbiased in these cases. At the same time, the RPM approach has its own assumptions that must be met for correct inference, such as the existence of a covariate that strongly moderates the effect of the intervention on the mediator, and the absence of unmeasured confounders that also moderate the effect of the intervention or the mediator on the outcome. The RPM approach to mediation offers an alternative way to perform mediation analysis when there may be unmeasured confounders.
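
    A small simulation sketch of the kind of bias at stake when sequential ignorability fails: an unmeasured confounder of the mediator-outcome relation distorts the usual regression-based mediation estimate. The data-generating numbers are invented; this is not the article's RPM estimator.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
u = rng.normal(size=n)                        # unmeasured confounder
x = rng.integers(0, 2, n).astype(float)       # randomized intervention
m = 0.5 * x + u + rng.normal(size=n)          # mediator
y = 0.0 * m + u + rng.normal(size=n)          # true mediated effect is zero

# Naive step: regress y on (1, m, x); the m coefficient absorbs u's effect.
X = np.column_stack([np.ones(n), m, x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print("naive mediator coefficient (true value 0):", round(beta[1], 3))
```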

  4. Integrated Workforce Modeling System

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.

    2000-01-01

    There are several computer-based systems, currently in various phases of development at KSC, which encompass some component, aspect, or function of workforce modeling. These systems may offer redundant capabilities and/or incompatible interfaces. A systems approach to workforce modeling is necessary in order to identify and better address user requirements. This research has consisted of two primary tasks. Task 1 provided an assessment of existing and proposed KSC workforce modeling systems for their functionality and applicability to the workforce planning function. Task 2 resulted in the development of a proof-of-concept design for a systems approach to workforce modeling. The model incorporates critical aspects of workforce planning, including hires, attrition, and employee development.
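
    A toy stock-and-flow sketch of the planning aspects listed above (hires, attrition, development); the categories and rates are illustrative assumptions, not KSC data or the proof-of-concept design itself.

```python
junior, senior = 100.0, 40.0                     # headcount stocks
HIRES, ATTRITION, PROMOTION = 12.0, 0.10, 0.08   # per-year flows and rates

for year in range(1, 6):
    promoted = PROMOTION * junior                # employee development flow
    junior += HIRES - ATTRITION * junior - promoted
    senior += promoted - ATTRITION * senior
    print(f"year {year}: junior={junior:.1f} senior={senior:.1f}")
```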

  5. A modular approach to addressing model design, scale, and parameter estimation issues in distributed hydrological modelling

    USGS Publications Warehouse

    Leavesley, G.H.; Markstrom, S.L.; Restrepo, Pedro J.; Viger, R.J.

    2002-01-01

    A modular approach to model design and construction provides a flexible framework in which to focus the multidisciplinary research and operational efforts needed to facilitate the development, selection, and application of the most robust distributed modelling methods. A variety of modular approaches have been developed, but with little consideration for compatibility among systems and concepts. Several systems are proprietary, limiting any user interaction. The US Geological Survey modular modelling system (MMS) is a modular modelling framework that uses an open source software approach to enable all members of the scientific community to address collaboratively the many complex issues associated with the design, development, and application of distributed hydrological and environmental models. Implementation of a common modular concept is not a trivial task. However, it brings the resources of a larger community to bear on the problems of distributed modelling, provides a framework in which to compare alternative modelling approaches objectively, and provides a means of sharing the latest modelling advances. The concepts and components of the MMS are described and an example application of the MMS, in a decision-support system context, is presented to demonstrate current system capabilities. Copyright © 2002 John Wiley and Sons, Ltd.

  6. Detecting Abnormal Vehicular Dynamics at Intersections Based on an Unsupervised Learning Approach and a Stochastic Model

    PubMed Central

    Jiménez-Hernández, Hugo; González-Barbosa, Jose-Joel; Garcia-Ramírez, Teresa

    2010-01-01

    This investigation demonstrates an unsupervised approach for modeling traffic flow and detecting abnormal vehicle behaviors at intersections. In the first stage, the approach reveals and records the different states of the system. These states are the result of coding and grouping the historical motion of vehicles as long binary strings. In the second stage, using sequences of the recorded states, a stochastic graph model based on a Markovian approach is built. A behavior is labeled abnormal when the current motion pattern cannot be recognized as any state of the system, or when a particular sequence of states cannot be parsed with the stochastic model. The approach is tested with several sequences of images acquired from a vehicular intersection where the traffic flow and the traffic-light timing are continuously changed throughout the day. Finally, the low complexity and the flexibility of the approach make it reliable for use in real-time systems. PMID:22163616
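
    A minimal sketch of the two-stage idea: record the states seen in historical data and the transitions between them, then flag any pattern containing an unknown state or a never-observed transition. The state coding below is a stand-in for the binary-string encoding step.

```python
from collections import defaultdict

history = ["stop", "go_N", "go_N", "stop", "go_E", "go_E", "stop",
           "go_N", "stop", "go_E"]

states = set(history)                      # stage 1: recorded system states
trans = defaultdict(int)                   # stage 2: transition counts
for a, b in zip(history, history[1:]):
    trans[(a, b)] += 1

def is_abnormal(sequence):
    if any(s not in states for s in sequence):
        return True                        # unrecognized state
    return any(trans[(a, b)] == 0 for a, b in zip(sequence, sequence[1:]))

print(is_abnormal(["stop", "go_N"]))       # False: seen in history
print(is_abnormal(["go_N", "go_E"]))       # True: transition never observed
```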

  7. Detecting abnormal vehicular dynamics at intersections based on an unsupervised learning approach and a stochastic model.

    PubMed

    Jiménez-Hernández, Hugo; González-Barbosa, Jose-Joel; Garcia-Ramírez, Teresa

    2010-01-01

    This investigation demonstrates an unsupervised approach for modeling traffic flow and detecting abnormal vehicle behaviors at intersections. In the first stage, the approach reveals and records the different states of the system. These states are the result of coding and grouping the historical motion of vehicles as long binary strings. In the second stage, using sequences of the recorded states, a stochastic graph model based on a Markovian approach is built. A behavior is labeled abnormal when the current motion pattern cannot be recognized as any state of the system, or when a particular sequence of states cannot be parsed with the stochastic model. The approach is tested with several sequences of images acquired from a vehicular intersection where the traffic flow and the traffic-light timing are continuously changed throughout the day. Finally, the low complexity and the flexibility of the approach make it reliable for use in real-time systems.

  8. Effect of boundary treatments on quantum transport current in the Green's function and Wigner distribution methods for a nano-scale DG-MOSFET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Haiyan (Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223-0001); Cai, Wei

    2010-06-20

    In this paper, we conduct a study of quantum transport models for a two-dimensional nano-size double gate (DG) MOSFET using two approaches: non-equilibrium Green's function (NEGF) and Wigner distribution. Both methods are implemented in the framework of the mode space methodology, where the electron confinements below the gates are pre-calculated to produce subbands along the vertical direction of the device, while the transport along the horizontal channel direction is described by either approach. Each approach handles the open quantum system along the transport direction in a different manner. The NEGF treats the open boundaries with a boundary self-energy defined by a Dirichlet-to-Neumann mapping, which ensures non-reflection at the device boundaries for electron waves leaving the quantum device active region. On the other hand, the Wigner equation method imposes an inflow boundary treatment for the Wigner distribution, which in contrast ensures non-reflection at the boundaries for free electron waves entering the device active region. In both cases the space-charge effect is accounted for by a self-consistent coupling with a Poisson equation. Our goals are to study how the treatment of the device boundaries in the two transport models affects the current calculations, and to investigate the performance of both approaches in modeling the DG-MOSFET. Numerical results show mostly consistent quantum transport characteristics of the DG-MOSFET using both methods, though with a higher transport current for the Wigner equation method, and also provide the current-voltage (I-V) curve dependence on various physical parameters such as the gate voltage and the oxide thickness.
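
    A textbook one-dimensional tight-binding NEGF sketch (not the paper's two-dimensional mode-space solver): the semi-infinite leads enter only through boundary self-energies, and the transmission is T(E) = Γ_L |G_1N|² Γ_R. The barrier height and device size are arbitrary assumptions.

```python
import numpy as np

T_HOP, N = 1.0, 20                       # hopping (eV) and device sites
H = (np.diag([0.3] * N)                  # a 0.3 eV barrier in the channel
     - T_HOP * np.eye(N, k=1) - T_HOP * np.eye(N, k=-1))

def transmission(E, eta=1e-9):
    # 1D lead self-energy: Sigma = -t exp(ik), with E = -2t cos(k).
    k = np.arccos(np.clip(-E / (2 * T_HOP), -1.0, 1.0))
    sigma = -T_HOP * np.exp(1j * k)
    S = np.zeros((N, N), complex)
    S[0, 0] = S[-1, -1] = sigma          # contacts at both device ends
    G = np.linalg.inv((E + 1j * eta) * np.eye(N) - H - S)
    gamma = -2.0 * sigma.imag            # broadening of each contact
    return float(gamma**2 * abs(G[0, -1])**2)

for E in (0.1, 0.3, 0.5):                # below, at, and above the barrier
    print(E, round(transmission(E), 4))
```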

  9. An analysis of household waste management policy using system dynamics modelling.

    PubMed

    Inghels, Dirk; Dullaert, Wout

    2011-04-01

    This paper analyses the Flemish household waste management policy. Based on historical data from the period 1991-2006, literature reviews and interviews, both mathematical and descriptive relationships are derived that describe Flemish waste collection, reuse, recycling and disposal behaviour. This provides insights into how gross domestic product (GDP), population and selective collection behaviour have influenced household waste production and collection over time. These relationships are used to model the dynamic relationships underlying household waste management in Flanders using a system dynamics (SD) modelling approach. Where most SD models in the literature are conceptual and descriptive, in the present study a real-life case with both correlational and descriptive relationships was modelled for Flanders, a European region with an outstanding waste management track record. This model was used to evaluate the current Flemish household waste management policy based on the principles of the waste hierarchy, also referred to as the Lansink ranking. The results show that Flemish household waste targets up to 2015 can be achieved by the current waste policy measures. The model also shows the sensitivity of the results to some key policy parameters, such as prevention and reuse. Given the general nature of the model and its limited data requirements, the authors believe that the approach implemented in this model can also assist waste policy makers in other regions or countries to meet their policy targets by simulating the effect of their current and potential household waste policy measures.
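
    A bare-bones stock-and-flow sketch in the SD spirit of the paper: waste generation driven by GDP and population, split by a recycling fraction that ramps up as a policy lever. All coefficients are invented, not the Flemish calibration.

```python
pop, gdp_index, recycle_frac = 6.0e6, 1.0, 0.60

for year in range(2007, 2012):
    gdp_index *= 1.02                        # assumed 2 %/yr GDP growth
    waste_per_cap = 0.45 * gdp_index         # tonnes/person/yr, assumed
    total = pop * waste_per_cap
    residual = (1.0 - recycle_frac) * total  # to incineration/landfill
    print(f"{year}: total={total/1e6:.2f} Mt, residual={residual/1e6:.2f} Mt")
    recycle_frac = min(0.75, recycle_frac + 0.01)  # policy ramp-up
```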

  10. Survey of air cargo forecasting techniques

    NASA Technical Reports Server (NTRS)

    Kuhlthan, A. R.; Vermuri, R. S.

    1978-01-01

    Forecasting techniques currently in use for estimating or predicting the demand for air cargo in various markets are discussed, with emphasis on the fundamentals of the different forecasting approaches. References to specific studies are cited when appropriate. The effectiveness of current methods is evaluated and several prospects for future activities or approaches are suggested. Appendices contain summary-type analyses of about 50 specific publications on forecasting, and selected bibliographies on air cargo forecasting, air passenger demand forecasting, and general demand and modal-split modeling.

  11. Bioprocess systems engineering: transferring traditional process engineering principles to industrial biotechnology.

    PubMed

    Koutinas, Michalis; Kiparissides, Alexandros; Pistikopoulos, Efstratios N; Mantalaris, Athanasios

    2012-01-01

    The complexity of the regulatory network and the interactions that occur in the intracellular environment of microorganisms highlight the importance of developing tractable mechanistic models of cellular functions and systematic approaches for modelling biological systems. To this end, the existing process systems engineering approaches can serve as a vehicle for understanding, integrating and designing biological systems and processes. Here, we review the application of a holistic approach for the development of mathematical models of biological systems, from the initial conception of the model to its final application in model-based control and optimisation. We also discuss the use of mechanistic models that account for gene regulation, in an attempt to advance the empirical expressions traditionally used to describe micro-organism growth kinetics, and we highlight current and future challenges in mathematical biology. The modelling research framework discussed herein could prove beneficial for the design of optimal bioprocesses, employing rational and feasible approaches towards the efficient production of chemicals and pharmaceuticals.

  12. Bioprocess systems engineering: transferring traditional process engineering principles to industrial biotechnology

    PubMed Central

    Koutinas, Michalis; Kiparissides, Alexandros; Pistikopoulos, Efstratios N.; Mantalaris, Athanasios

    2013-01-01

    The complexity of the regulatory network and the interactions that occur in the intracellular environment of microorganisms highlight the importance of developing tractable mechanistic models of cellular functions and systematic approaches for modelling biological systems. To this end, the existing process systems engineering approaches can serve as a vehicle for understanding, integrating and designing biological systems and processes. Here, we review the application of a holistic approach for the development of mathematical models of biological systems, from the initial conception of the model to its final application in model-based control and optimisation. We also discuss the use of mechanistic models that account for gene regulation, in an attempt to advance the empirical expressions traditionally used to describe micro-organism growth kinetics, and we highlight current and future challenges in mathematical biology. The modelling research framework discussed herein could prove beneficial for the design of optimal bioprocesses, employing rational and feasible approaches towards the efficient production of chemicals and pharmaceuticals. PMID:24688682

  13. An integrated uncertainty analysis and data assimilation approach for improved streamflow predictions

    NASA Astrophysics Data System (ADS)

    Hogue, T. S.; He, M.; Franz, K. J.; Margulis, S. A.; Vrugt, J. A.

    2010-12-01

    The current study presents an integrated uncertainty analysis and data assimilation approach to improve streamflow predictions while simultaneously providing meaningful estimates of the associated uncertainty. Study models include the National Weather Service (NWS) operational snow model (SNOW17) and rainfall-runoff model (SAC-SMA). The proposed approach uses the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to simultaneously estimate uncertainties in model parameters, forcing, and observations. An ensemble Kalman filter (EnKF) is configured with the DREAM-identified uncertainty structure and applied to assimilate snow water equivalent data into the SNOW17 model for improved snowmelt simulations. Snowmelt estimates then serve as input to the SAC-SMA model to provide streamflow predictions at the basin outlet. The robustness and usefulness of the approach are evaluated for a snow-dominated watershed in the northern Sierra Mountains. This presentation describes the implementation of DREAM and the EnKF into the coupled SNOW17 and SAC-SMA models and summarizes study results and findings.
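
    A bare-bones EnKF update for a scalar state such as snow water equivalent, using the perturbed-observation variant; a sketch of the machinery only, not the SNOW17/SAC-SMA configuration or the DREAM-derived error statistics.

```python
import numpy as np

rng = np.random.default_rng(42)
N_ENS, OBS, R = 50, 120.0, 25.0              # members, obs (mm), obs variance

ensemble = rng.normal(100.0, 15.0, N_ENS)    # forecast (prior) members

# Kalman gain from the ensemble sample variance (direct observation, H = 1).
P = np.var(ensemble, ddof=1)
K = P / (P + R)

# Update each member against an independently perturbed observation.
perturbed = OBS + rng.normal(0.0, np.sqrt(R), N_ENS)
analysis = ensemble + K * (perturbed - ensemble)
print(f"prior mean {ensemble.mean():.1f} -> analysis mean {analysis.mean():.1f}")
```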

  14. Conditions and limitations on learning in the adaptive management of mallard harvests

    USGS Publications Warehouse

    Johnson, F.A.; Kendall, W.L.; Dubovsky, J.A.

    2002-01-01

    In 1995, the United States Fish and Wildlife Service adopted a protocol for the adaptive management of waterfowl hunting regulations (AHM) to help reduce uncertainty about the magnitude of sustainable harvests. To date, the AHM process has focused principally on the midcontinent population of mallards (Anas platyrhynchos), whose dynamics are described by 4 alternative models. Collectively, these models express uncertainty (or disagreement) about whether harvest is an additive or a compensatory form of mortality and whether the reproductive process is weakly or strongly density-dependent. Each model is associated with a probability or 'weight,' which describes its relative ability to predict changes in population size. These Bayesian probabilities are updated annually using a comparison of population size predicted under each model with that observed by a monitoring program. The current AHM process is passively adaptive, in the sense that there is no a priori consideration of how harvest decisions might affect discrimination among models. We contrast this approach with an actively adaptive approach, in which harvest decisions are used in part to produce the learning needed to increase long-term management performance. Our investigation suggests that the passive approach is expected to perform nearly as well as an optimal actively adaptive approach, particularly considering the nature of the model set, management objectives and constraints, and current regulatory alternatives. We offer some comments about the nature of the biological hypotheses being tested and describe some of the inherent limitations on learning in the AHM process.
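
    A sketch of the annual weight update described above: each model's weight is multiplied by the likelihood of the monitored population size under that model's prediction, then the weights are renormalized. The predictions, observation, and Gaussian likelihood form below are illustrative assumptions.

```python
import numpy as np

weights = np.array([0.25, 0.25, 0.25, 0.25])   # four alternative models
predictions = np.array([8.1, 7.6, 8.9, 7.2])   # predicted pop. (millions)
observed, sd = 7.8, 0.5                        # monitoring estimate, error

likes = np.exp(-0.5 * ((observed - predictions) / sd) ** 2)
weights = weights * likes                      # Bayes update ...
weights /= weights.sum()                       # ... and renormalization
print(np.round(weights, 3))
```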

  15. Theoretical Approaches in Evolutionary Ecology: Environmental Feedback as a Unifying Perspective.

    PubMed

    Lion, Sébastien

    2018-01-01

    Evolutionary biology and ecology have a strong theoretical underpinning, and this has fostered a variety of modeling approaches. A major challenge of this theoretical work has been to unravel the tangled feedback loop between ecology and evolution. This has prompted the development of two main classes of models. While quantitative genetics models jointly consider the ecological and evolutionary dynamics of a focal population, a separation of timescales between ecology and evolution is assumed by evolutionary game theory, adaptive dynamics, and inclusive fitness theory. As a result, theoretical evolutionary ecology tends to be divided among different schools of thought, with different toolboxes and motivations. My aim in this synthesis is to highlight the connections between these different approaches and clarify the current state of theory in evolutionary ecology. Central to this approach is to make explicit the dependence on environmental dynamics of the population and evolutionary dynamics, thereby materializing the eco-evolutionary feedback loop. This perspective sheds light on the interplay between environmental feedback and the timescales of ecological and evolutionary processes. I conclude by discussing some potential extensions and challenges to our current theoretical understanding of eco-evolutionary dynamics.

  16. Resource Letter MPCVW-1: Modeling Political Conflict, Violence, and Wars: A Survey

    NASA Astrophysics Data System (ADS)

    Morgenstern, Ana P.; Velásquez, Nicolás; Manrique, Pedro; Hong, Qi; Johnson, Nicholas; Johnson, Neil

    2013-11-01

    This Resource Letter provides a guide to the literature on modeling and explaining political conflict, violence, and wars. Although this literature is dominated by social scientists, multidisciplinary work is currently being developed in the wake of the myriad methodological approaches that have sought to analyze and predict political violence. The works covered herein present an overview of this abundance of methodological approaches. Since there is a variety of possible data sets and theoretical approaches, the level of detail and scope of models can vary quite considerably. The review does not provide a summary of the available data sets, but instead highlights recent works on quantitative or multi-method approaches to modeling different forms of political violence. Journal articles and books are organized under the following topics: social movements, diffusion of social movements, political violence, insurgencies and terrorism, and civil wars.

  17. Evaluating disease management program effectiveness: an introduction to time-series analysis.

    PubMed

    Linden, Ariel; Adams, John L; Roberts, Nancy

    2003-01-01

    Currently, the most widely used method in the disease management (DM) industry for evaluating program effectiveness is referred to as the "total population approach." This model is a pretest-posttest design, with the most basic limitation being that without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer a plausible rationale explaining the change from baseline. Furthermore, with the current inclination of DM programs to use financial indicators rather than program-specific utilization indicators as the principal measure of program success, additional biases are introduced that may cloud evaluation results. This paper presents a non-technical introduction to time-series analysis (using disease-specific utilization measures) as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.
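
    A sketch of the alternative the authors advocate, in its simplest segmented-regression form: fit a level change and a slope change at the program start month on a disease-specific utilization series. The data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(36)                        # 36 months of utilization data
post = (t >= 18).astype(float)           # program starts at month 18
y = 50 - 0.2 * t - 6 * post - 0.5 * post * (t - 18) + rng.normal(0, 2, 36)

# Design: intercept, baseline trend, level change, post-program trend change.
X = np.column_stack([np.ones(36), t, post, post * (t - 18)])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print("level change: %.2f, slope change: %.2f" % (beta[2], beta[3]))
```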

  18. School Leadership Models: What Do We Know?

    ERIC Educational Resources Information Center

    Bush, Tony; Glover, Derek

    2014-01-01

    The growth in the importance of school leadership has been accompanied by theory development, with new models emerging and established approaches being redefined and further developed. The purpose of this paper is to review current and recent writing on leadership models. The paper examines theoretical literature, to see how leadership is…

  19. [Anorexia nervosa and bulimia nervosa. Psychological considerations for its treatment].

    PubMed

    Barriguete Meléndez, J Armando; Rojo, Luis; Emmelhainz, Marisa

    2004-11-01

    This article presents current perspectives on the study and treatment of eating disorders, specifically anorexia nervosa and bulimia nervosa: their epidemiology, the interface among the different medical specialties, nutrition, and the behavioral sciences, diagnostic approaches and instruments, and current therapeutic models.

  20. A Self-Consistent Model of the Interacting Ring Current Ions and Electromagnetic Ion Cyclotron Waves, Initial Results: Waves and Precipitating Fluxes

    NASA Technical Reports Server (NTRS)

    Khazanov, G. V.; Gamayunov, K. V.; Jordanova, V. K.; Krivorutsky, E. N.

    2002-01-01

    Initial results from a newly developed model of the interacting ring current ions and ion cyclotron waves are presented. The model is based on the system of two kinetic equations: one equation describes the ring current ion dynamics, and another equation describes wave evolution. The system gives a self-consistent description of the ring current ions and ion cyclotron waves in a quasilinear approach. These equations for the ion phase space distribution function and for the wave power spectral density were solved on a global magnetospheric scale under nonsteady-state conditions during the 2-5 May 1998 storm. The structure and dynamics of the ring current proton precipitating flux regions and the ion cyclotron wave-active zones during extreme geomagnetic disturbances on 4 May 1998 are presented and discussed in detail.

  1. A Self-Consistent Model of the Interacting Ring Current Ions and Electromagnetic ICWs. Initial Results: Waves and Precipitation Fluxes

    NASA Technical Reports Server (NTRS)

    Khazanov, G. V.; Gamayunov, K. V.; Jordanova, V. K.; Krivorutsky, E. N.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    Initial results from the newly developed model of the interacting ring current ions and ion cyclotron waves are presented. The model is described by a system of two coupled kinetic equations: one equation describes the ring current ion dynamics, and the other describes wave evolution. This system gives a self-consistent description of the ring current ions and ion cyclotron waves in a quasilinear approach. Calculating ion-wave relationships on a global scale under non-steady-state conditions during the 2-5 May 1998 storm, we present data at three time cuts around the initial, main, and late recovery phases of the 4 May 1998 storm. The structure and dynamics of the ring current proton precipitating flux regions and the wave-active zones are discussed in detail.

  2. Modeling dilute pyroclastic density currents on Earth and Mars

    NASA Astrophysics Data System (ADS)

    Clarke, A. B.; Brand, B. D.; De'Michieli Vitturi, M.

    2013-12-01

    The surface of Mars has been shaped extensively by volcanic activity, including explosive eruptions that may have been heavily influenced by water- or ice-magma interaction. However, the dynamics of the associated pyroclastic density currents (PDCs) under Martian atmospheric conditions, and the controls on deposition and runout from such currents, are poorly understood. This work combines numerical modeling with terrestrial field measurements to explore the dynamics of dilute PDCs on Earth and Mars, especially as they relate to deposit characteristics. We employ two numerical approaches. Model (1) simulates axisymmetric flow and sedimentation from a steady-state, depth-averaged density current. Equations for conservation of mass, momentum, and energy are solved simultaneously, and the effects of atmospheric entrainment, particle sedimentation, basal friction, temperature changes, and variations in current thickness and density are explored. The Rouse number and Brunt-Väisälä frequency are used to estimate the wavelength of internal gravity waves in a density-stratified current, which allows us to predict deposit dune wavelengths. The model predicts realistic runout distances and bedform wavelengths for several well-documented field cases on Earth. The model results also suggest that dilute PDCs on Mars would have runout distances up to three times those of equivalent currents on Earth and would produce longer-wavelength bedforms. In both cases results are heavily dependent on source conditions, grain-size characteristics, and entrainment and friction parameters. Model (2) relaxes several key simplifications, resulting in a fully 3D, multiphase, unsteady model that captures more details of propagation, including density stratification, and of depositional processes. Using this more complex approach, we focus on the role of unsteady or pulsatory vent conditions typically associated with phreatomagmatic eruptions. Runout distances from Model (2) agree reasonably well with Model (1) results, but details of deposit distribution vary between the two models. Model (2) shows that the Earth case initially outpaces the Mars case due to faster propagation velocities associated with higher gravitational acceleration. However, the Mars currents ultimately out-distance the Earth currents due to slower particle settling rates, which also largely explain the longer-wavelength bedforms. Model (2) also predicts a peak in the streamwise distribution of deposits farther from the source compared to equivalent results from Model (1), and produces more complex patterns of vertical distribution of particles in the moving current, which vary significantly in time and space. This combination of modeling and deposit data results in a powerful tool for testing hypotheses related to PDCs on Mars, potentially improving our capacity to interpret Martian features at both the outcrop (e.g., Home Plate) and regional scale (e.g., Apollinaris Mons).
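
    A back-of-envelope sketch of the two diagnostics named above: the Rouse number as a suspension criterion, and an internal-wave wavelength estimated as λ ≈ 2πU/N from the Brunt-Väisälä frequency. The closure actually used in Model (1) is not spelled out here, so every input below is a placeholder.

```python
import math

w_s, kappa, u_star = 0.05, 0.41, 0.30   # settling vel., von Karman const., shear vel.
rouse = w_s / (kappa * u_star)          # << 1 means a well-suspended load
print(f"Rouse number: {rouse:.2f}")

g, rho, drho_dz = 9.81, 1.5, -0.05      # gravity, current density, density gradient
N = math.sqrt(-(g / rho) * drho_dz)     # buoyancy (Brunt-Vaisala) frequency (1/s)
U = 10.0                                # current speed (m/s)
print(f"estimated bedform wavelength ~ {2 * math.pi * U / N:.0f} m")
```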

  3. Treatment of Sexual Disorders in the 1990s: An Integrated Approach.

    ERIC Educational Resources Information Center

    Rosen, Raymond C.; Leiblum, Sandra R.

    1995-01-01

    Reviews existing data regarding the etiology and treatment of male and female sexual dysfunctions. Discusses the use of multidimensional assessment models, especially in the evaluation of erectile dysfunction and sexual pain disorders. Despite the conceptual and technological sophistication of current approaches, treatment outcomes are…

  4. Merging field mapping and modeling to interpret the lithofacies variations from unsteady ash-rich pyroclastic density currents on uneven topography

    NASA Astrophysics Data System (ADS)

    Doronzo, Domenico; Dellino, Pierfrancesco; Sulpizio, Roberto; Lucchi, Federico

    2017-04-01

    In order to obtain significant volcanological results from computer simulations of explosive eruptions, one either needs a systematic statistical approach to test a wide range of initial and boundary conditions, or a well-constrained field case study. Here we followed the second approach, using data obtained from field mapping of the Grotta dei Palizzi 2 pyroclastic deposits (Vulcano Island, Italy) as input for numerical modeling. This case study deals with impulsive phreatomagmatic explosions that generated ash-rich pyroclastic density currents, interacting with the high topographic obstacle of the La Fossa Caldera rim. We demonstrate that by merging field data with 3D numerical simulation it is possible to highlight the details of the dynamic current-terrain interaction, and to interpret the lithofacies variations of the associated deposits as a function of topography-induced sedimentation rate. Results suggest that at a sedimentation rate lower than 5 kg/m²s the bed load can still be sheared by the overlying current, producing tractional structures in the deposit. Instead, a sedimentation rate in excess of that threshold can preclude the formation of tractional structures, producing thick massive deposits. We think that the approach used in this study could be applied to other case studies to confirm or refine this threshold value of the sedimentation rate, which is to be considered an upper value given the limitations of the numerical model.

  5. Past, present and prospect of an Artificial Intelligence (AI) based model for sediment transport prediction

    NASA Astrophysics Data System (ADS)

    Afan, Haitham Abdulmohsin; El-shafie, Ahmed; Mohtar, Wan Hanna Melini Wan; Yaseen, Zaher Mundher

    2016-10-01

    An accurate model for sediment prediction is a priority for all hydrological researchers. Many conventional methods have shown an inability to achieve an accurate prediction of suspended sediment. These methods are unable to capture the behaviour of sediment transport in rivers due to the complexity, noise, non-stationarity, and dynamism of the sediment pattern. In the past two decades, Artificial Intelligence (AI) and computational approaches have become a remarkable tool for developing accurate models. These approaches are considered a powerful tool for solving non-linear models, as they can deal easily with large amounts of data and sophisticated model structures. This paper is a review of the AI approaches that have been applied in sediment modelling. The current research focuses on the development of AI applications in sediment transport. In addition, the review identifies major challenges and opportunities for prospective research. Throughout the literature, such complementary models have proved superior to classical modelling.
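
    A sketch of the modelling style reviewed: a small neural network mapping discharge and its lag to suspended sediment load. The data are synthetic and the architecture is arbitrary; the point is only the shape of the approach.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(11)
q = rng.lognormal(3.0, 0.5, 400)                  # synthetic discharge series
s = 0.01 * q**1.6 * rng.lognormal(0, 0.2, 400)    # synthetic sediment rating

X = np.column_stack([q[1:], q[:-1]])              # current + lagged discharge
y = np.log(s[1:])                                 # predict log sediment load
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                     random_state=0).fit(X[:300], y[:300])
print("held-out R^2:", round(model.score(X[300:], y[300:]), 3))
```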

  6. The Layer-Oriented Approach to Declarative Languages for Biological Modeling

    PubMed Central

    Raikov, Ivan; De Schutter, Erik

    2012-01-01

    We present a new approach to modeling languages for computational biology, which we call the layer-oriented approach. The approach stems from the observation that many diverse biological phenomena are described using a small set of mathematical formalisms (e.g. differential equations), while at the same time different domains and subdomains of computational biology require that models are structured according to the accepted terminology and classification of that domain. Our approach uses distinct semantic layers to represent the domain-specific biological concepts and the underlying mathematical formalisms. Additional functionality can be transparently added to the language by adding more layers. This approach is specifically concerned with declarative languages, and throughout the paper we note some of the limitations inherent to declarative approaches. The layer-oriented approach is a way to specify explicitly how high-level biological modeling concepts are mapped to a computational representation, while abstracting away details of particular programming languages and simulation environments. To illustrate this process, we define an example language for describing models of ionic currents, and use a general mathematical notation for semantic transformations to show how to generate model simulation code for various simulation environments. We use the example language to describe a Purkinje neuron model and demonstrate how the layer-oriented approach can be used for solving several practical issues of computational neuroscience model development. We discuss the advantages and limitations of the approach in comparison with other modeling language efforts in the domain of computational biology and outline some principles for extensible, flexible modeling language design. We conclude by describing in detail the semantic transformations defined for our language. PMID:22615554
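
    In the spirit of the ionic-current example mentioned above (though with an invented schema, not the paper's actual language), a toy declarative record and a tiny "generator" that turns it into executable simulation code:

```python
# Declarative layer: domain concepts only, no behaviour.
na_current = {
    "name": "I_Na",
    "gbar": 120.0,                   # max conductance (mS/cm^2)
    "e_rev": 50.0,                   # reversal potential (mV)
    "gates": [("m", 3), ("h", 1)],   # gate names and exponents
}

# Mathematical layer: compile the record into an evaluator for
# I = gbar * m^3 * h * (V - E_rev).
def compile_current(spec):
    def evaluate(v, gate_values):
        g = spec["gbar"]
        for name, power in spec["gates"]:
            g *= gate_values[name] ** power
        return g * (v - spec["e_rev"])
    return evaluate

i_na = compile_current(na_current)
print(i_na(-20.0, {"m": 0.6, "h": 0.4}))   # current at one instant (uA/cm^2)
```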

  7. The layer-oriented approach to declarative languages for biological modeling.

    PubMed

    Raikov, Ivan; De Schutter, Erik

    2012-01-01

    We present a new approach to modeling languages for computational biology, which we call the layer-oriented approach. The approach stems from the observation that many diverse biological phenomena are described using a small set of mathematical formalisms (e.g. differential equations), while at the same time different domains and subdomains of computational biology require that models are structured according to the accepted terminology and classification of that domain. Our approach uses distinct semantic layers to represent the domain-specific biological concepts and the underlying mathematical formalisms. Additional functionality can be transparently added to the language by adding more layers. This approach is specifically concerned with declarative languages, and throughout the paper we note some of the limitations inherent to declarative approaches. The layer-oriented approach is a way to specify explicitly how high-level biological modeling concepts are mapped to a computational representation, while abstracting away details of particular programming languages and simulation environments. To illustrate this process, we define an example language for describing models of ionic currents, and use a general mathematical notation for semantic transformations to show how to generate model simulation code for various simulation environments. We use the example language to describe a Purkinje neuron model and demonstrate how the layer-oriented approach can be used for solving several practical issues of computational neuroscience model development. We discuss the advantages and limitations of the approach in comparison with other modeling language efforts in the domain of computational biology and outline some principles for extensible, flexible modeling language design. We conclude by describing in detail the semantic transformations defined for our language.

  8. Predicting Trophic Interactions and Habitat Utilization in the California Current Ecosystem

    DTIC Science & Technology

    2013-09-30

    Jerome Fiechter, UC Santa Cruz Institute of Marine Sciences, Santa Cruz, CA. … California Current Large Marine Ecosystem (CCLME): the long-term goal of our modeling approach is to better understand and characterize biological “hotspots” (i.e., the aggregation of multiple marine organisms over multiple trophic levels) off the U.S. west coast and in other regions where similar fully-coupled ecosystem models may …

  9. Uncertainty quantification of fast sodium current steady-state inactivation for multi-scale models of cardiac electrophysiology.

    PubMed

    Pathmanathan, Pras; Shotwell, Matthew S; Gavaghan, David J; Cordeiro, Jonathan M; Gray, Richard A

    2015-01-01

    Perhaps the most mature area of multi-scale systems biology is the modelling of the heart. Current models are grounded in over fifty years of research in the development of biophysically detailed models of the electrophysiology (EP) of cardiac cells, but one aspect which is inadequately addressed is the incorporation of uncertainty and physiological variability. Uncertainty quantification (UQ) is the identification and characterisation of the uncertainty in model parameters derived from experimental data, and the computation of the resultant uncertainty in model outputs. It is a necessary tool for establishing the credibility of computational models, and will likely be expected of EP models for future safety-critical clinical applications. The focus of this paper is formal UQ of one major sub-component of cardiac EP models, the steady-state inactivation of the fast sodium current, INa. To better capture average behaviour and quantify variability across cells, we have applied for the first time an 'individual-based' statistical methodology to assess voltage clamp data. Advantages of this approach over a more traditional 'population-averaged' approach are highlighted. The method was used to characterise variability amongst cells isolated from canine epicardium and endocardium, and this variability was then 'propagated forward' through a canine model to determine the resultant uncertainty in model predictions at different scales, such as upstroke velocity and spiral wave dynamics. Statistically significant differences between epicardial and endocardial cells (greater half-inactivation and a less steep slope of the steady-state inactivation curve for endocardium) were observed, and the forward propagation revealed a lack of robustness of the model to underlying variability, but also surprising robustness to variability at the tissue scale. Overall, the methodology can be used to: (i) better analyse voltage clamp data; (ii) characterise underlying population variability; (iii) investigate consequences of variability; and (iv) improve the ability to validate a model. To our knowledge this article is the first to quantify population variability in membrane dynamics in this manner, and the first to perform formal UQ for a component of a cardiac model. The approach is likely to find much wider applicability across systems biology as current application domains reach greater levels of maturity. Published by Elsevier Ltd.
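
    For concreteness, the steady-state inactivation curve discussed above is conventionally summarised by a two-parameter Boltzmann function of the prepulse voltage, with a half-inactivation voltage V_half and a slope factor k; fitting it cell by cell, rather than to pooled data, yields the per-cell estimates whose population variability can then be propagated forward. The sketch below shows such a fit on hypothetical data values; the Boltzmann form is standard, but nothing here comes from the paper's own code.

```python
# Minimal sketch: fit a Boltzmann steady-state inactivation curve to one
# cell's voltage-clamp data (data values below are hypothetical).
import numpy as np
from scipy.optimize import curve_fit

def boltzmann(v, v_half, k):
    """Steady-state availability of INa as a function of prepulse voltage (mV)."""
    return 1.0 / (1.0 + np.exp((v - v_half) / k))

v_test = np.array([-120, -110, -100, -90, -80, -70, -60, -50, -40])  # mV
h_inf = np.array([1.00, 0.99, 0.97, 0.88, 0.62, 0.28, 0.08, 0.02, 0.01])

(v_half, k), _ = curve_fit(boltzmann, v_test, h_inf, p0=(-75.0, 6.0))
print(f"V_half = {v_half:.1f} mV, slope k = {k:.1f} mV")
```

    Repeating the fit for every cell gives a sample of (V_half, k) pairs whose spread quantifies the population variability described in the abstract.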

  10. A REVIEW OF BIOACCUMULATION MODELING APPROACHES FOR PERSISTENT ORGANIC POLLUTANTS

    EPA Science Inventory

    Persistent organic pollutants and mercury are likely to bioaccumulate in biological components of the environment, including fish and wildlife. The complex and long-term dynamics involved with bioaccumulation are often represented with models. Current scientific developments in t...

  11. Democratic Postsecondary Vocational Education

    ERIC Educational Resources Information Center

    Molnar, Christopher J.

    2010-01-01

    This dissertation offers a critique of current approaches to postsecondary vocational education. It concludes that the traditional apprenticeship and didactic model leads to inadequate preparation of students for independent thinking and problem solving. An alternative model is proposed that uses "democratic education" principles. These include…

  12. Uncertainties in Emissions Inputs for Near-Road Assessments

    EPA Science Inventory

    Emissions, travel demand, and dispersion models are all needed to obtain temporally and spatially resolved pollutant concentrations. Current methodology combines these three models in a bottom-up approach based on hourly traffic and emissions estimates, and hourly dispersion conc...

  13. Tolerance in Nonhuman Primates by Delayed Mixed Chimerism

    DTIC Science & Technology

    2017-12-01

    … principle of delayed induction of mixed chimerism in a non-human primate (NHP) model. This approach, in contrast to protocols which have already reached clinical trials, …

  14. Multi-scale heat and mass transfer modelling of cell and tissue cryopreservation

    PubMed Central

    Xu, Feng; Moon, Sangjun; Zhang, Xiaohui; Shao, Lei; Song, Young Seok; Demirci, Utkan

    2010-01-01

    Cells and tissues undergo complex physical processes during cryopreservation. Understanding the underlying physical phenomena is critical to improve current cryopreservation methods and to develop new techniques. Here, we describe multi-scale approaches for modelling cell and tissue cryopreservation including heat transfer at macroscale level, crystallization, cell volume change and mass transport across cell membranes at microscale level. These multi-scale approaches allow us to study cell and tissue cryopreservation. PMID:20047939

  15. Assessing habitat risk from human activities to inform coastal and marine spatial planning: a demonstration in Belize

    NASA Astrophysics Data System (ADS)

    Arkema, Katie K.; Verutes, Gregory; Bernhardt, Joanna R.; Clarke, Chantalle; Rosado, Samir; Canto, Maritza; Wood, Spencer A.; Ruckelshaus, Mary; Rosenthal, Amy; McField, Melanie; de Zegher, Joann

    2014-11-01

    Integrated coastal and ocean management requires transparent and accessible approaches for understanding the influence of human activities on marine environments. Here we introduce a model for assessing the combined risk to habitats from multiple ocean uses. We apply the model to coral reefs, mangrove forests and seagrass beds in Belize to inform the design of the country’s first Integrated Coastal Zone Management (ICZM) Plan. Based on extensive stakeholder engagement, review of existing legislation and data collected from diverse sources, we map the current distribution of coastal and ocean activities and develop three scenarios for zoning these activities in the future. We then estimate ecosystem risk under the current and three future scenarios. Current levels of risk vary spatially among the nine coastal planning regions in Belize. Empirical tests of the model are strong—three-quarters of the measured data for coral reef health lie within the 95% confidence interval of interpolated model data and 79% of the predicted mangrove occurrences are associated with observed responses. The future scenario that harmonizes conservation and development goals results in a 20% reduction in the area of high-risk habitat compared to the current scenario, while increasing the extent of several ocean uses. Our results are a component of the ICZM Plan for Belize that will undergo review by the national legislature in 2015. This application of our model to marine spatial planning in Belize illustrates an approach that can be used broadly by coastal and ocean planners to assess risk to habitats under current and future management scenarios.
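
    A common way to score risk in habitat risk assessments of this kind is the Euclidean distance of a habitat-activity pair's exposure and consequence ratings from the lowest-impact point; this formulation is used, for example, in the InVEST habitat risk assessment tool, and is assumed here purely for illustration, with hypothetical scores.

```python
# Illustrative Euclidean risk score for one habitat-activity pair; ratings
# are on an ordinal scale where 1 is lowest impact (values are hypothetical).
import math

def euclidean_risk(exposure, consequence):
    """Risk grows with distance from the lowest-impact corner (E=1, C=1)."""
    return math.sqrt((exposure - 1.0) ** 2 + (consequence - 1.0) ** 2)

print(euclidean_risk(exposure=2.6, consequence=2.2))  # current zoning
print(euclidean_risk(exposure=1.8, consequence=2.2))  # conservation scenario
```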

  16. The influence of media role models on gay, lesbian, and bisexual identity.

    PubMed

    Gomillion, Sarah C; Giuliano, Traci A

    2011-01-01

    The current investigation examined the influence of the media on gay, lesbian, and bisexual (GLB) identity using both survey and in-depth interview approaches. In Study 1, 126 GLB survey respondents (11 unreported) in Texas indicated that the media influenced their self-realization, coming out, and current identities by providing role models and inspiration. In Study 2, 15 interviewees (6 women and 9 men) revealed that media role models serve as sources of pride, inspiration, and comfort. Our findings suggest that increasing the availability of GLB role models in the media may positively influence GLB identity.

  17. Update in Infectious Diseases 2017.

    PubMed

    Candel, F J; Peñuelas, M; Lejárraga, C; Emilov, T; Rico, C; Díaz, I; Lázaro, C; Viñuela-Prieto, J M; Matesanz, M

    2017-09-01

    Antimicrobial resistance in complex models of continuous infection is a current issue. The Update 2017 course addresses microbiological, epidemiological and clinical aspects useful for a current approach to infectious disease. During the last year, guidelines on the approach to nosocomial pneumonia, recommendations for the management of yeast and filamentous fungal infections, review papers on the empirical approach to peritonitis, and extensive guidelines on stewardship have been published. HIV infection is being treated earlier and more intensively. The addition of molecular biology, spectrometry and immunology to the traditional techniques of staining and culture achieves better and faster microbiological diagnosis. Finally, the management of infection is increasingly integrated, with non-antibiotic aspects of treatment also being assessed.

  18. A framework for testing and comparing binaural models.

    PubMed

    Dietz, Mathias; Lestang, Jean-Hugues; Majdak, Piotr; Stern, Richard M; Marquardt, Torsten; Ewert, Stephan D; Hartmann, William M; Goodman, Dan F M

    2018-03-01

    Auditory research has a rich history of combining experimental evidence with computational simulations of auditory processing in order to deepen our theoretical understanding of how sound is processed in the ears and in the brain. Despite significant progress in the amount of detail and breadth covered by auditory models, for many components of the auditory pathway there are still different model approaches that are often not equivalent but rather in conflict with each other. Similarly, some experimental studies yield conflicting results, which has led to controversies. These can best be resolved by a systematic comparison of multiple experimental data sets and model approaches. Binaural processing is a prominent example of how the development of quantitative theories can advance our understanding of the phenomena, but there remain several unresolved questions for which competing model approaches exist. This article discusses a number of current unresolved or disputed issues in binaural modelling, as well as some of the significant challenges in comparing binaural models with each other and with the experimental data. We introduce an auditory model framework, which we believe can become a useful infrastructure for resolving some of the current controversies. It runs models on the same paradigms that are used experimentally. The core of the proposed framework is an interface that connects three components irrespective of their underlying programming language: the experiment software, an auditory pathway model, and task-dependent decision stages called artificial observers that provide the same output format as the test subject. Copyright © 2017 Elsevier B.V. All rights reserved.
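
    The three-component split described above can be pictured with a minimal sketch; the class and method names below are assumptions for illustration only, not the published framework's API.

```python
# Minimal sketch of the experiment/model/observer split: the experiment
# software presents stimuli, the auditory pathway model transforms them, and
# the artificial observer answers in the same format as a human subject.
from abc import ABC, abstractmethod

class AuditoryModel(ABC):
    @abstractmethod
    def process(self, stimulus):
        """Map a sound waveform to an internal representation."""

class ArtificialObserver(ABC):
    @abstractmethod
    def decide(self, representations):
        """Map the internal representations of a trial's intervals to a response."""

def run_trial(stimuli, model: AuditoryModel, observer: ArtificialObserver):
    # The experiment software is agnostic to what happens inside the model or
    # observer; it only receives a response in the subject's response format.
    return observer.decide([model.process(s) for s in stimuli])
```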

  19. Respiratory sensitization and allergy: Current research approaches and needs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boverhof, Darrell R.; Billington, Richard; Gollapudi, B. Bhaskar

    2008-01-01

    There are currently no accepted regulatory models for assessing the potential of a substance to cause respiratory sensitization and allergy. In contrast, a number of models exist for the assessment of contact sensitization and allergic contact dermatitis (ACD). Research indicates that respiratory sensitizers may be identified through contact sensitization assays such as the local lymph node assay, although only a small subset of the compounds that yield positive results in these assays are actually respiratory sensitizers. Due to the increasing health concerns associated with occupational asthma and the impending directives on the regulation of respiratory sensitizers and allergens, an approach which can identify these compounds and distinguish them from contact sensitizers is required. This report discusses some of the important contrasts between respiratory allergy and ACD, and highlights several prominent in vivo, in vitro and in silico approaches that are being applied or could be further developed to identify compounds capable of causing respiratory allergy. Although a number of animal models have been used for researching respiratory sensitization and allergy, protocols and endpoints for these approaches are often inconsistent, costly and difficult to reproduce, thereby limiting meaningful comparisons of data between laboratories and development of a consensus approach. A number of emerging in vitro and in silico models show promise for use in the characterization of contact sensitization potential and should be further explored for their ability to identify and differentiate contact and respiratory sensitizers. Ultimately, the development of a consistent, accurate and cost-effective model will likely incorporate a number of these approaches and will require effective communication, collaboration and consensus among all stakeholders.

  20. Mapping primary health care renewal in South America.

    PubMed

    Acosta Ramírez, Naydú; Giovanella, Ligia; Vega Romero, Roman; Tejerina Silva, Herland; de Almeida, Patty Fidelis; Ríos, Gilberto; Goede, Hedwig; Oliveira, Suelen

    2016-06-01

    Primary health care (PHC) renewal processes are currently ongoing in South America (SA), but their characteristics have not been systematically described. The study aimed to describe and contrast the PHC approaches being implemented in SA to provide knowledge of current conceptions, models and challenges. This multiple case study used a qualitative approach, with technical visits to health ministries to conduct key-informant interviews with 129 PHC national policy makers and 53 local managers, together with field observation of 57 selected PHC providers and document analysis, using a common matrix for data collection and analysis. PHC approaches were analysed by triangulating sources using the following categories: PHC philosophy and conception, service provision organization, intersectoral collaboration and social participation. Primary health care models were identified in association with existing health system types and the dynamics of PHC renewal in each country. A neo-selective model was found in three countries where coverage is segmented by private and public regimes; here, individual and collective care are separated. A comprehensive approach similar to the Alma-Ata model was found in seven countries where the public sector predominates and individual, family and community care are coordinated under the responsibility of the same health care team. The process of implementing a renewed PHC approach is affected by how health systems are funded and organized. Both models face many obstacles. In addition, care system organization, intersectoral coordination and social participation are weak in most of the countries. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. From Single-Cell Dynamics to Scaling Laws in Oncology

    NASA Astrophysics Data System (ADS)

    Chignola, Roberto; Sega, Michela; Stella, Sabrina; Vyshemirsky, Vladislav; Milotti, Edoardo

    We are developing a biophysical model of tumor biology. We follow a strictly quantitative approach where each step of model development is validated by comparing simulation outputs with experimental data. While this strategy may slow down our advancements, at the same time it provides an invaluable reward: we can trust simulation outputs and use the model to explore territories of cancer biology where current experimental techniques fail. Here, we review our multi-scale biophysical modeling approach and show how a description of cancer at the cellular level has led us to general laws obeyed by both in vitro and in vivo tumors.

  2. Children's Early Approaches to Learning and Academic Trajectories through Fifth Grade

    ERIC Educational Resources Information Center

    Li-Grining, Christine P.; Votruba-Drzal, Elizabeth; Maldonado-Carreno, Carolina; Haas, Kelly

    2010-01-01

    Children's early approaches to learning (ATL) enhance their adaptation to the demands they experience with the start of formal schooling. The current study uses individual growth modeling to investigate whether children's early ATL, which includes persistence, emotion regulation, and attentiveness, explain individual differences in their academic…

  3. An examination of fuel particle heating during fire spread

    Treesearch

    Jack D. Cohen; Mark A. Finney

    2010-01-01

    Recent high intensity wildfires and our demonstrated inability to control extreme fire behavior suggest a need for alternative approaches for preventing wildfire disasters. Current fire spread models are not sufficiently based on a basic understanding of fire spread processes to provide more effective management alternatives. An experimental and theoretical approach...

  4. IMPROVING PARTICULATE MATTER SOURCE APPORTIONMENT FOR HEALTH STUDIES: A TRAINED RECEPTOR MODELING APPROACH WITH SENSITIVITY, UNCERTAINTY AND SPATIAL ANALYSES

    EPA Science Inventory

    An approach for conducting PM source apportionment will be developed, tested, and applied that directly addresses limitations in current SA methods, in particular variability, biases, and intensive resource requirements. Uncertainties in SA results and sensitivities to SA inpu...

  5. Assessing FAO-56 dual crop coefficients using eddy covariance flux partitioning

    USDA-ARS?s Scientific Manuscript database

    Current approaches to scheduling crop irrigation using reference evapotranspiration (ET0) recommend using a dual-coefficient approach using basal (Kcb) and soil (Ke) coefficients along with a stress coefficient (Ks) to model crop evapotranspiration (ETc), [e.g. ETc=(Ks*Kcb+Ke)*ET0]. However, determi...
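
    The bracketed equation above translates directly into code; a minimal sketch follows, with illustrative coefficient values (the function name and example numbers are not from the source).

```python
# Direct transcription of the FAO-56 dual-coefficient equation quoted above:
# ETc = (Ks*Kcb + Ke) * ET0. Coefficient values below are illustrative only.
def crop_et(et0_mm_day, kcb, ke, ks=1.0):
    """Crop evapotranspiration (mm/day) from reference ET and FAO-56 coefficients."""
    return (ks * kcb + ke) * et0_mm_day

# Example: mid-season crop, unstressed (Ks = 1), partially wet soil surface.
print(crop_et(et0_mm_day=6.0, kcb=1.10, ke=0.15))  # -> 7.5 mm/day
```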

  6. Comparison of modeling approaches to prioritize chemicals based on estimates of exposure and exposure potential

    PubMed Central

    Mitchell, Jade; Arnot, Jon A.; Jolliet, Olivier; Georgopoulos, Panos G.; Isukapalli, Sastry; Dasgupta, Surajit; Pandian, Muhilan; Wambaugh, John; Egeghy, Peter; Cohen Hubal, Elaine A.; Vallero, Daniel A.

    2014-01-01

    While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecological health outcomes. Recent improvements and advances such as high throughput data gathering, high performance computational capabilities, and predictive chemical inherency methodology make this an opportune time to develop an exposure-based prioritization approach that can systematically utilize and link the asymmetrical bodies of knowledge for hazard and exposure. In response to the US EPA’s need to develop novel approaches and tools for rapidly prioritizing chemicals, a “Challenge” was issued to several exposure model developers to aid the understanding of current systems in a broader sense and to assist the US EPA’s effort to develop an approach comparable to other international efforts. A common set of chemicals were prioritized under each current approach. The results are presented herein along with a comparative analysis of the rankings of the chemicals based on metrics of exposure potential or actual exposure estimates. The analysis illustrates the similarities and differences across the domains of information incorporated in each modeling approach. The overall findings indicate a need to reconcile exposures from diffuse, indirect sources (far-field) with exposures from directly applied chemicals in consumer products or resulting from the presence of a chemical in a microenvironment like a home or vehicle. Additionally, the exposure scenario, including the mode of entry into the environment (i.e. through air, water or sediment) appears to be an important determinant of the level of agreement between modeling approaches. PMID:23707726

  7. Comparison of modeling approaches to prioritize chemicals based on estimates of exposure and exposure potential.

    PubMed

    Mitchell, Jade; Arnot, Jon A; Jolliet, Olivier; Georgopoulos, Panos G; Isukapalli, Sastry; Dasgupta, Surajit; Pandian, Muhilan; Wambaugh, John; Egeghy, Peter; Cohen Hubal, Elaine A; Vallero, Daniel A

    2013-08-01

    While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecological health outcomes. Recent improvements and advances such as high throughput data gathering, high performance computational capabilities, and predictive chemical inherency methodology make this an opportune time to develop an exposure-based prioritization approach that can systematically utilize and link the asymmetrical bodies of knowledge for hazard and exposure. In response to the US EPA's need to develop novel approaches and tools for rapidly prioritizing chemicals, a "Challenge" was issued to several exposure model developers to aid the understanding of current systems in a broader sense and to assist the US EPA's effort to develop an approach comparable to other international efforts. A common set of chemicals were prioritized under each current approach. The results are presented herein along with a comparative analysis of the rankings of the chemicals based on metrics of exposure potential or actual exposure estimates. The analysis illustrates the similarities and differences across the domains of information incorporated in each modeling approach. The overall findings indicate a need to reconcile exposures from diffuse, indirect sources (far-field) with exposures from directly applied chemicals in consumer products or resulting from the presence of a chemical in a microenvironment like a home or vehicle. Additionally, the exposure scenario, including the mode of entry into the environment (i.e. through air, water or sediment) appears to be an important determinant of the level of agreement between modeling approaches. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Estimation and modeling of forest attributes across large spatial scales using BiomeBGC, high-resolution imagery, LiDAR data, and inventory data

    NASA Astrophysics Data System (ADS)

    Golinkoff, Jordan Seth

    The accurate estimation of forest attributes at many different spatial scales is a critical problem. Forest landowners may be interested in estimating timber volume, forest biomass, and forest structure to determine their forest's condition and value. Counties and states may be interested to learn about their forests to develop sustainable management plans and policies related to forests, wildlife, and climate change. Countries and consortiums of countries need information about their forests to set global and national targets to deal with issues of climate change and deforestation as well as to set national targets and understand the state of their forest at a given point in time. This dissertation approaches these questions from two perspectives. The first perspective uses the process model Biome-BGC paired with inventory and remote sensing data to make inferences about a current forest state given known climate and site variables. Using a model of this type, future climate data can be used to make predictions about future forest states as well. An example of this work applied to a forest in northern California is presented. The second perspective of estimating forest attributes uses high resolution aerial imagery paired with light detection and ranging (LiDAR) remote sensing data to develop statistical estimates of forest structure. Two approaches within this perspective are presented: a pixel based approach and an object based approach. Both approaches can serve as the platform on which models (either empirical growth and yield models or process models) can be run to generate inferences about future forest state and current forest biogeochemical cycling.

  9. Recent Advances in Simulation of Eddy Current Testing of Tubes and Experimental Validations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reboud, C.; Premel, D.; Lesselier, D.

    2007-03-21

    Eddy current testing (ECT) is widely used in the iron and steel industry for the inspection of tubes during manufacturing. A collaboration between CEA and the Vallourec Research Center led to the development of new numerical functionalities dedicated to the simulation of ECT of non-magnetic tubes by external probes. Following experimental validation, these models were integrated into the CIVA platform. The modeling approach and validation results are discussed here. A new numerical scheme is also proposed in order to improve the accuracy of the model.

  10. Recent Advances in Simulation of Eddy Current Testing of Tubes and Experimental Validations

    NASA Astrophysics Data System (ADS)

    Reboud, C.; Prémel, D.; Lesselier, D.; Bisiaux, B.

    2007-03-01

    Eddy current testing (ECT) is widely used in the iron and steel industry for the inspection of tubes during manufacturing. A collaboration between CEA and the Vallourec Research Center led to the development of new numerical functionalities dedicated to the simulation of ECT of non-magnetic tubes by external probes. Following experimental validation, these models were integrated into the CIVA platform. The modeling approach and validation results are discussed here. A new numerical scheme is also proposed in order to improve the accuracy of the model.

  11. Categorical and dimensional approaches in the evaluation of the relationship between attachment and personality disorders: an empirical study.

    PubMed

    Chiesa, Marco; Cirasola, Antonella; Williams, Riccardo; Nassisi, Valentina; Fonagy, Peter

    2017-04-01

    Although several studies have highlighted the relationship between attachment states of mind and personality disorders, their findings have not been consistent, possibly due to the application of the traditional taxonomic classification model of attachment. A more recently developed dimensional classification of attachment representations, including more specific aspects of trauma-related representations, may have advantages. In this study, we compare specific associations and predictive power of the categorical attachment and dimensional models applied to 230 Adult Attachment Interview transcripts obtained from personality disordered and nonpsychiatric subjects. We also investigate the role that current levels of psychiatric distress may have in the prediction of PD. The results showed that both models predict the presence of PD, with the dimensional approach doing better in discriminating overall diagnosis of PD. However, both models are less helpful in discriminating specific PD diagnostic subtypes. Current psychiatric distress was found to be the most consistent predictor of PD capturing a large share of the variance and obscuring the role played by attachment variables. The results suggest that attachment parameters correlate with the presence of PD alone and have no specific associations with particular PD subtypes when current psychiatric distress is taken into account.

  12. Pharmaceutical interventions for mitigating an influenza pandemic: modeling the risks and health-economic impacts.

    PubMed

    Postma, Maarten J; Milne, George; Nelson, E Anthony S; Pyenson, Bruce; Basili, Marcello; Coker, Richard; Oxford, John; Garrison, Louis P

    2010-12-01

    Model-based analyses built on burden-of-disease and cost-effectiveness theory predict that pharmaceutical interventions may efficiently mitigate both the epidemiologic and economic impact of an influenza pandemic. Pharmaceutical interventions typically encompass the application of (pre)pandemic influenza vaccines, other vaccines (notably pneumococcal), antiviral treatments and other drug treatment (e.g., antibiotics to target potential complications of influenza). However, these models may be too limited to capture the full macro-economic impact of pandemic influenza. The aim of this article is to summarize current health-economic modeling approaches to recognize the strengths and weaknesses of these approaches, and to compare these with more recently proposed alternative methods. We conclude that it is useful, particularly for policy and planning purposes, to extend modeling concepts through the application of alternative approaches, including insurers' risk theories, human capital approaches and sectoral and full macro-economic modeling. This article builds on a roundtable meeting of the Pandemic Influenza Economic Impact Group that was held in Boston, MA, USA, in December 2008.

  13. Generation of net sediment transport by velocity skewness in oscillatory sheet flow

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Li, Yong; Chen, Genfa; Wang, Fujun; Tang, Xuelin

    2018-01-01

    This study utilizes a qualitative approach and a two-phase numerical model to investigate net sediment transport caused by velocity skewness beneath oscillatory sheet flow and current. The qualitative approach is derived from a pseudo-laminar approximation of the boundary layer velocity and an exponential approximation of the concentration. The two-phase model reproduces well the instantaneous erosion depth, sediment flux, boundary layer thickness, and sediment transport rate. In particular, it can illustrate the difference between the positive and negative flow stages caused by velocity skewness, which is considerably important in determining the net boundary layer flow and the sediment transport direction. The two-phase model also explains the effects of sediment diameter and phase-lag on sediment transport through comparison with instantaneous-type formulas, to better illustrate the velocity skewness effect. In previous studies of sheet flow transport in pure velocity-skewed flows, net sediment transport was attributed only to the phase-lag effect. In the present study, using the qualitative approach and the two-phase model, the phase-lag effect is shown to be important but not sufficient for the net sediment transport beneath pure velocity-skewed flow and current; the asymmetric wave boundary layer development between the positive and negative flow stages also contributes to the sediment transport.
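
    A standard idealization of the forcing considered above, assumed here only for illustration, is a velocity-skewed free-stream velocity built from a second-order Stokes wave plus a steady current; the positive half-cycle then peaks higher than the negative one, which is the asymmetry driving the net transport.

```python
# Generate a velocity-skewed oscillatory free-stream velocity with a
# superimposed steady current (illustrative, not the paper's two-phase model).
import numpy as np

T = 7.0                                    # wave period (s)
t = np.linspace(0.0, T, 1000, endpoint=False)
omega = 2.0 * np.pi / T
u1, u2, u_c = 1.0, 0.25, 0.1               # harmonic amplitudes and current (m/s)

u = u_c + u1 * np.cos(omega * t) + u2 * np.cos(2.0 * omega * t)
print(f"positive peak {u.max():.2f} m/s, negative peak {u.min():.2f} m/s")
```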

  14. Bromamine Decomposition Revisited: A Holistic Approach for Analyzing Acid and Base Catalysis Kinetics.

    PubMed

    Wahman, David G; Speitel, Gerald E; Katz, Lynn E

    2017-11-21

    Chloramine chemistry is complex, with a variety of reactions occurring in series and parallel and many that are acid or base catalyzed, resulting in numerous rate constants. Bromide presence increases system complexity even further with possible bromamine and bromochloramine formation. Therefore, techniques for parameter estimation must address this complexity through thoughtful experimental design and robust data analysis approaches. The current research outlines a rational basis for constrained data fitting using Brønsted theory, application of the microscopic reversibility principle to reversible acid or base catalyzed reactions, and characterization of the relative significance of parallel reactions using fictive product tracking. This holistic approach was used on a comprehensive and well-documented data set for bromamine decomposition, allowing new interpretations of existing data by revealing that a previously published reaction scheme was not robust; it was not able to describe monobromamine or dibromamine decay outside of the conditions for which it was calibrated. The current research's simplified model (3 reactions, 17 constants) represented the experimental data better than the previously published model (4 reactions, 28 constants). A final model evaluation was conducted based on representative drinking water conditions to determine a minimal model (3 reactions, 8 constants) applicable for drinking water conditions.

  15. Contact drying: a review of experimental and mechanistic modeling approaches.

    PubMed

    Sahni, Ekneet Kaur; Chaudhuri, Bodhisattwa

    2012-09-15

    Drying is one of the most complex unit operations with simultaneous heat and mass transfer. The contact drying process is also not well understood as several physical phenomena occur concurrently. This paper reviews current experimental and modeling approaches employed towards a better understanding of the contact drying operation. Additionally, an overview of some fundamental aspects relating to contact drying is provided. A brief discussion of some model extensions such as incorporation of noncontact forces, interstitial fluids and attrition rate is also presented. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. Detailed numerical investigation of the dissipative stochastic mechanics based neuron model.

    PubMed

    Güler, Marifi

    2008-10-01

    Recently, a physical approach for the description of neuronal dynamics under the influence of ion channel noise was proposed in the realm of dissipative stochastic mechanics (Güler, Phys Rev E 76:041918, 2007). Motivated by the presence of multiple gates in an ion channel, the approach establishes the viewpoint that ion channels are exposed to two kinds of noise: the intrinsic noise, associated with the stochasticity in the movement of gating particles between the inner and the outer faces of the membrane, and the topological noise, associated with the uncertainty in accessing the permissible topological states of open gates. Renormalizations of the membrane capacitance and of a membrane voltage dependent potential function were found to arise from the mutual interaction of the two noisy systems. The formalism therein was scrutinized using a special membrane with some tailored properties giving the Rose-Hindmarsh dynamics in the deterministic limit. In this paper, the resultant computational neuron model of the above approach is investigated in detail numerically for its dynamics using time-independent input currents. The following are the major findings obtained. The intrinsic noise gives rise to two significant coexisting effects: it initiates spiking activity even in some range of input currents for which the corresponding deterministic model is quiet, and it causes bursting in some other range of input currents for which the deterministic model fires tonically. The renormalization corrections are found to augment the above behavioral transitions from quiescence to spiking and from tonic firing to bursting, and, therefore, the bursting activity is found to take place in a wider range of input currents for larger values of the correction coefficients. Some findings concerning the diffusive behavior in the voltage space are also reported.

  17. Current Status and Challenges of Atmospheric Data Assimilation

    NASA Astrophysics Data System (ADS)

    Atlas, R. M.; Gelaro, R.

    2016-12-01

    The issues of modern atmospheric data assimilation are fairly simple to comprehend but difficult to address, involving the combination of literally billions of model variables and tens of millions of observations daily. In addition to traditional meteorological variables such as wind, temperature, pressure and humidity, model state vectors are being expanded to include explicit representation of precipitation, clouds, aerosols and atmospheric trace gases. At the same time, model resolutions are approaching single-kilometer scales globally and new observation types have error characteristics that are increasingly non-Gaussian. This talk describes the current status and challenges of atmospheric data assimilation, including an overview of current methodologies, the difficulty of estimating error statistics, and progress toward coupled earth system analyses.

  18. Analytical approach to an integrate-and-fire model with spike-triggered adaptation

    NASA Astrophysics Data System (ADS)

    Schwalger, Tilo; Lindner, Benjamin

    2015-12-01

    The calculation of the steady-state probability density for multidimensional stochastic systems that do not obey detailed balance is a difficult problem. Here we present the analytical derivation of the stationary joint and various marginal probability densities for a stochastic neuron model with adaptation current. Our approach assumes weak noise but is valid for arbitrary adaptation strength and time scale. The theory predicts several effects of adaptation on the statistics of the membrane potential of a tonically firing neuron: (i) a membrane potential distribution with a convex shape, (ii) a strongly increased probability of hyperpolarized membrane potentials induced by strong and fast adaptation, and (iii) a maximized variability associated with the adaptation current at a finite adaptation time scale.
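
    The model class analysed above can be stated compactly: between spikes the membrane potential follows a noisy leaky integrator reduced by an adaptation current, and each spike resets the voltage and increments the adaptation variable. A minimal simulation sketch follows (the paper's treatment is analytical; parameter values here are illustrative, with time in units of the membrane time constant).

```python
# Minimal leaky integrate-and-fire neuron with spike-triggered adaptation.
import numpy as np

rng = np.random.default_rng(0)
dt, t_max = 1e-4, 50.0               # time step and duration (membrane time units)
mu, D = 1.5, 0.02                    # mean input current and noise intensity
tau_a, delta_a = 5.0, 0.3            # adaptation time scale and per-spike jump
v, a = 0.0, 0.0                      # membrane potential and adaptation current
spikes = []

for step in range(int(t_max / dt)):
    v += dt * (-v + mu - a) + np.sqrt(2.0 * D * dt) * rng.standard_normal()
    a += dt * (-a / tau_a)
    if v >= 1.0:                     # threshold crossing: reset and adapt
        spikes.append(step * dt)
        v = 0.0
        a += delta_a
print(f"{len(spikes)} spikes, rate {len(spikes) / t_max:.2f} per time unit")
```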

  19. (Q)SARs to predict environmental toxicities: current status and future needs.

    PubMed

    Cronin, Mark T D

    2017-03-22

    The current state of the art of (Quantitative) Structure-Activity Relationships ((Q)SARs) to predict environmental toxicity is assessed, along with recommendations to develop these models further. The acute toxicity of compounds acting by the non-polar narcotic mechanism of action can be well predicted; however, other approaches, including read-across, may be required for compounds acting by specific mechanisms of action. The chronic toxicity of compounds to environmental species is more difficult to predict from (Q)SARs, with robust data sets and more mechanistic information required. In addition, the toxicity of mixtures is little addressed by (Q)SAR approaches. Developments in environmental toxicology, including Adverse Outcome Pathways (AOPs) and omics responses, should be utilised to develop better, more mechanistically relevant, (Q)SAR models.

  20. Did we choose the best one? A new site selection approach based on exposure and uptake potential for waste incineration.

    PubMed

    Demirarslan, K Onur; Korucu, M Kemal; Karademir, Aykan

    2016-08-01

    Ecological problems arising after the construction and operation of a waste incineration plant generally originate from incorrect decisions made during the selection of the location of the plant. The main objective of this study is to investigate how the selection method for the location of a new municipal waste incineration plant can be improved by using a dispersion modelling approach supported by geographical information systems and multi-criteria decision analysis. Considering this aim, the appropriateness of the current location of an existing plant was assessed by applying a pollution dispersion model. Using this procedure, the site ranking for a total of 90 candidate locations and the site of the existing incinerator were determined by a new location selection practice, and the current location of the plant was evaluated by ANOVA and Tukey tests. The initial ranking, made without the use of modelling approaches, was then re-evaluated by CALPUFF modelling of various variables, including pollutant concentrations, population and population density, demography, the temporality of meteorological data, pollutant type and risk formation type, and the results were re-ranked. The findings clearly indicate the unsuitability of the current plant location, as the pollution distribution model showed that it was the fourth-worst choice among 91 possibilities. It was concluded that location selection procedures for waste incinerators should benefit from the improvements obtained by combining pollution dispersion studies with population density data to obtain the most suitable location. © The Author(s) 2016.

  1. Planning for Capital Reinvestment.

    ERIC Educational Resources Information Center

    Biedenweg, Frederick; Weisburg-Swanson, Lynda; Gardner, Catherine

    1998-01-01

    Describes and evaluates four alternatives for planning and budgeting for capital reinvestment for college and university facilities: physical plant auditing; a depreciation-based approach; percentage of current replacement value; and facility subsystem modeling, or life-cycle modeling. Each has advantages and limitations in budgeting for and…

  2. Merging field mapping and numerical simulation to interpret the lithofacies variations from unsteady pyroclastic density currents on uneven terrain: The case of La Fossa di Vulcano (Aeolian Islands, Italy)

    NASA Astrophysics Data System (ADS)

    Doronzo, Domenico M.; Dellino, Pierfrancesco; Sulpizio, Roberto; Lucchi, Federico

    2017-01-01

    In order to obtain results from computer simulations of explosive volcanic eruptions, one either needs a statistical approach to test a wide range of initial and boundary conditions, or needs a well-constrained field case study via stratigraphy. Here we followed the second approach, using data obtained from field mapping of the Grotta dei Palizzi 2 pyroclastic deposits (Vulcano Island, Italy) as input for numerical modeling. This case study deals with impulsive phreatomagmatic explosions of La Fossa Cone that generated ash-rich pyroclastic density currents, interacting with the topographic high of the La Fossa Caldera rim. One of the simplifications in dealing with well-sorted ash (one particle size in the model) is that it highlights the topographic effects on the same pyroclastic material in an unsteady current. We demonstrate that by merging field data with 3D numerical simulation results it is possible to see key details of the dynamical current-terrain interaction, and to interpret the lithofacies variations of the associated deposits as a function of topography-induced sedimentation (settling) rate. Results suggest that a value of the sedimentation rate lower than 5 kg/m2 s at the bed load can still be sheared by the overlying current, producing tractional structures (laminae) in the deposits. Instead, a sedimentation rate higher than that threshold can preclude the formation of tractional structures, producing thicker massive deposits. We think that the approach used in this study could be applied to other case studies (of both active and ancient volcanoes) to confirm or refine this threshold value of the sedimentation rate, which should be regarded as an upper bound given the limitations of the numerical model.

  3. Regression analysis of informative current status data with the additive hazards model.

    PubMed

    Zhao, Shishun; Hu, Tao; Ma, Ling; Wang, Peijie; Sun, Jianguo

    2015-04-01

    This paper discusses regression analysis of current status failure time data arising from the additive hazards model in the presence of informative censoring. Many methods have been developed for regression analysis of current status data under various regression models if the censoring is noninformative, and also there exists a large literature on parametric analysis of informative current status data in the context of tumorgenicity experiments. In this paper, a semiparametric maximum likelihood estimation procedure is presented and in the method, the copula model is employed to describe the relationship between the failure time of interest and the censoring time. Furthermore, I-splines are used to approximate the nonparametric functions involved and the asymptotic consistency and normality of the proposed estimators are established. A simulation study is conducted and indicates that the proposed approach works well for practical situations. An illustrative example is also provided.
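
    For reference, the additive hazards model referred to above specifies that covariates shift the hazard additively rather than multiplicatively (in contrast to the Cox model), and with current status data each subject contributes only an indicator of whether failure preceded the single monitoring time. A standard statement, written for a time-fixed covariate vector and noninformative censoring for simplicity (the paper's copula model handles the informative case), is:

```latex
% Additive hazards model and the induced current status likelihood contribution:
% T is the failure time, C the monitoring time, Z the covariate vector.
\lambda(t \mid Z) = \lambda_0(t) + \beta^{\top} Z,
\qquad
F(t \mid Z) = 1 - \exp\!\left\{ -\Lambda_0(t) - t\,\beta^{\top} Z \right\},
\qquad
L_i = F(C_i \mid Z_i)^{\delta_i} \left\{ 1 - F(C_i \mid Z_i) \right\}^{1-\delta_i},
\quad \delta_i = \mathbf{1}\{T_i \le C_i\}.
```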

  4. Comparative analysis of numerical models of pipe handling equipment used in offshore drilling applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawlus, Witold, E-mail: witold.p.pawlus@ieee.org; Ebbesen, Morten K.; Hansen, Michael R.

    Design of offshore drilling equipment is a task that involves not only analysis of strict machine specifications and safety requirements but also consideration of changeable weather conditions and harsh environment. These challenges call for a multidisciplinary approach and make the design process complex. Various modeling software products are currently available to aid design engineers in their effort to test and redesign equipment before it is manufactured. However, given the number of available modeling tools and methods, the choice of the proper modeling methodology becomes not obvious and – in some cases – troublesome. Therefore, we present a comparative analysis of two popular approaches used in modeling and simulation of mechanical systems: multibody and analytical modeling. A gripper arm of the offshore vertical pipe handling machine is selected as a case study for which both models are created. In contrast to some other works, the current paper shows verification of both systems by benchmarking their simulation results against each other. Such criteria as modeling effort and results accuracy are evaluated to assess which modeling strategy is the most suitable given its eventual application.

  5. Distributed, cooperating knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt

    1991-01-01

    Some current research in the development and application of distributed, cooperating knowledge-based systems technology is addressed. The focus of the current research is the spacecraft ground operations environment. The underlying hypothesis is that, because of the increasing size, complexity, and cost of planned systems, conventional procedural approaches to the architecture of automated systems will give way to a more comprehensive knowledge-based approach. A hallmark of these future systems will be the integration of multiple knowledge-based agents which understand the operational goals of the system and cooperate with each other and the humans in the loop to attain the goals. The current work includes the development of a reference model for knowledge-base management, the development of a formal model of cooperating knowledge-based agents, the use of a testbed for prototyping and evaluating various knowledge-based concepts, and beginning work on the establishment of an object-oriented model of an intelligent end-to-end (spacecraft to user) system. An introductory discussion of these activities is presented, the major concepts and principles being investigated are highlighted, and their potential use in other application domains is indicated.

  6. Anatomy of health care reform proposals.

    PubMed Central

    Soffel, D; Luft, H S

    1993-01-01

    The current proliferation of proposals for health care reform makes it difficult to sort out the differences among plans and the likely outcome of different approaches to reform. The current health care system has two basic features. The first, enrollment and eligibility functions, includes how people get into the system and gain coverage for health care services. We describe 4 models, ranging from an individual, voluntary approach to a universal, tax-based model. The second, the provision of health care, includes how physician services are organized, how they are paid for, what mechanisms are in place for quality assurance, and the degree of organization and oversight of the health care system. We describe 7 models of the organization component, including the current fee-for-service system with no national health budget, managed care, salaried providers under a budget, and managed competition with and without a national health budget. These 2 components provide the building blocks for health care plans, presented as a matrix. We also evaluate several reform proposals by how they combine these 2 elements. PMID:8273344

  7. An Algorithm for Interactive Modeling of Space-Transportation Engine Simulations: A Constraint Satisfaction Approach

    NASA Technical Reports Server (NTRS)

    Mitra, Debasis; Thomas, Ajai; Hemminger, Joseph; Sakowski, Barbara

    2001-01-01

    In this research we have developed an algorithm for constraint processing that utilizes relational algebraic operators. Van Beek and others have previously investigated this type of constraint processing within a relational algebraic framework, producing some unique results. Apart from providing new theoretical angles, this approach also gives the opportunity to use existing efficient implementations of relational database management systems as the underlying data structures for any relevant algorithm. Our algorithm enhances that framework. The algorithm is quite general in its current form. Weak heuristics (like forward checking) developed within the constraint-satisfaction problem (CSP) area could also be plugged easily into this algorithm for further gains in efficiency. The algorithm as developed here is targeted toward a component-oriented modeling problem that we are currently working on, namely, the problem of interactive modeling for batch-simulation of engineering systems (IMBSES). However, it could be adapted to many other CSP problems as well. The research addresses the algorithm and many aspects of the IMBSES problem that we are currently handling.
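
    The idea of constraint processing with relational operators can be illustrated compactly: represent each constraint as a relation of its allowed tuples and compute the globally consistent assignments by natural joins. The sketch below uses pandas merges as stand-in relational operators on a toy problem; it illustrates the idea only and is not the authors' algorithm.

```python
# Solve a toy CSP by joining constraint relations: each DataFrame lists the
# allowed tuples of one constraint; the natural join on the shared variable
# leaves exactly the assignments satisfying both constraints.
import pandas as pd

c_xy = pd.DataFrame({"x": [1, 1, 2, 3], "y": [2, 3, 3, 1]})  # allowed (x, y)
c_yz = pd.DataFrame({"y": [2, 3, 3], "z": [3, 1, 2]})        # allowed (y, z)

solutions = c_xy.merge(c_yz, on="y")  # natural join on the shared variable y
print(solutions)                      # each row is a consistent (x, y, z)
```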

  8. Simulation of Wave and Current Processes Using Novel, Phase Resolving Models

    DTIC Science & Technology

    2013-09-30

    … fundamental technical approach is to represent nearshore water wave systems by retaining Boussinesq scaling assumptions, but without any assumption of … Boussinesq approach that allows for much more freedom in determining the system properties. The resulting systems can have two forms: a classic … of a pressure-Poisson approach to Boussinesq systems. The wave generation-absorption system has now been shown to provide highly accurate results.

  9. Predicting Future-Year Ozone Concentrations: Integrated Observational-Modeling Approach for Probabilistic Evaluation of the Efficacy of Emission Control strategies

    EPA Science Inventory

    Regional-scale air quality models are being used to demonstrate attainment of the ozone air quality standard. In current regulatory applications, a regional-scale air quality model is applied for a base year and a future year with reduced emissions using the same meteorological ...

  10. THE CONTRIBUTION OF AMBIENT PM2.5 TO TOTAL PERSONAL EXPOSURES: RESULTS FROM A POPULATION EXPOSURE MODEL FOR PHILADELPHIA, PA

    EPA Science Inventory

    The US EPA National Exposure Research Laboratory (NERL) is currently developing an integrated human exposure source-to-dose modeling system (HES2D). This modeling system will incorporate population exposure modules that use a probabilistic approach to predict population exposu...

  11. The Substitution Augmentation Modification Redefinition (SAMR) Model: A Critical Review and Suggestions for Its Use

    ERIC Educational Resources Information Center

    Hamilton, Erica R.; Rosenberg, Joshua M.; Akcaoglu, Mete

    2016-01-01

    The Substitution, Augmentation, Modification, and Redefinition (SAMR) model is a four-level, taxonomy-based approach for selecting, using, and evaluating technology in K-12 settings (Puentedura 2006). Despite its increasing popularity among practitioners, the SAMR model is not currently represented in the extant literature. To focus the ongoing…

  12. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    ERIC Educational Resources Information Center

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling stems from current trends toward strengthening the general-education and worldview functions of computer science, which call for additional research on the…

  13. A Theoretical Model of Children's Storytelling Using Physically-Oriented Technologies (SPOT)

    ERIC Educational Resources Information Center

    Guha, Mona Leigh; Druin, Allison; Montemayor, Jaime; Chipman, Gene; Farber, Allison

    2007-01-01

    This paper develops a model of children's storytelling using Physically-Oriented Technology (SPOT). The SPOT model draws upon literature regarding current physical storytelling technologies and was developed using a grounded theory approach to qualitative research. This empirical work focused on the experiences of 18 children, ages 5-6, who worked…

  14. Metabolic Network Modeling of Microbial Interactions in Natural and Engineered Environmental Systems

    PubMed Central

    Perez-Garcia, Octavio; Lear, Gavin; Singhal, Naresh

    2016-01-01

    We review approaches to characterize metabolic interactions within microbial communities using Stoichiometric Metabolic Network (SMN) models for applications in environmental and industrial biotechnology. SMN models are computational tools used to evaluate the metabolic engineering potential of various organisms. They have successfully been applied to design and optimize the microbial production of antibiotics, alcohols and amino acids by single strains. To date however, such models have been rarely applied to analyze and control the metabolism of more complex microbial communities. This is largely attributed to the diversity of microbial community functions, metabolisms, and interactions. Here, we firstly review different types of microbial interaction and describe their relevance for natural and engineered environmental processes. Next, we provide a general description of the essential methods of the SMN modeling workflow including the steps of network reconstruction, simulation through Flux Balance Analysis (FBA), experimental data gathering, and model calibration. Then we broadly describe and compare four approaches to model microbial interactions using metabolic networks, i.e., (i) lumped networks, (ii) compartment per guild networks, (iii) bi-level optimization simulations, and (iv) dynamic-SMN methods. These approaches can be used to integrate and analyze diverse microbial physiology, ecology and molecular community data. All of them (except the lumped approach) are suitable for incorporating species abundance data but so far they have been used only to model simple communities of two to eight different species. Interactions based on substrate exchange and competition can be directly modeled using the above approaches. However, interactions based on metabolic feedbacks, such as product inhibition and syntrophy, require extensions to current models, incorporating gene regulation and compound accumulation mechanisms. SMN models of microbial interactions can be used to analyze complex “omics” data and to infer and optimize metabolic processes. Thereby, SMN models are suitable to capitalize on advances in high-throughput molecular and metabolic data generation. SMN models are starting to be applied to describe microbial interactions during wastewater treatment, in-situ bioremediation, microalgae blooms, methanogenic fermentation, and bioplastic production. Despite their current challenges, we envisage that SMN models have future potential for the design and development of novel growth media, biochemical pathways and synthetic microbial associations. PMID:27242701
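
    The FBA step mentioned above reduces to a linear program: maximize an objective flux subject to steady-state mass balance S v = 0 and flux bounds. A minimal sketch on a toy three-reaction network follows (illustrative only, not a published reconstruction).

```python
# Toy Flux Balance Analysis: uptake -> A, A -> B, B -> biomass. Rows of S are
# the internal metabolites A and B; columns are the three reactions.
import numpy as np
from scipy.optimize import linprog

S = np.array([[1, -1, 0],
              [0, 1, -1]])
bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10 flux units

# linprog minimizes, so negate the biomass objective (reaction index 2).
c = np.array([0.0, 0.0, -1.0])
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(f"optimal biomass flux: {res.x[2]:.1f}")  # -> 10.0
```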

  15. Monte Carlo Bayesian inference on a statistical model of sub-gridcolumn moisture variability using high-resolution cloud observations. Part 1: Method.

    PubMed

    Norris, Peter M; da Silva, Arlindo M

    2016-07-01

    A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC.
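
    The accept/reject step that lets the chain jump into regions of non-zero cloud probability, as noted above, is short enough to sketch on a toy one-dimensional target; this stands in for, and is far simpler than, the paper's sub-gridcolumn moisture model.

```python
# Minimal random-walk Metropolis sampler on a toy bimodal target.
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Unnormalised log density of a two-component Gaussian mixture.
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

x, chain = 0.0, []
for _ in range(20000):
    proposal = x + rng.normal(scale=1.0)       # symmetric random-walk proposal
    if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
        x = proposal                           # accept; otherwise keep x
    chain.append(x)
print(f"posterior mean ~ {np.mean(chain):.2f}, sd ~ {np.std(chain):.2f}")
```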

  16. Monte Carlo Bayesian Inference on a Statistical Model of Sub-Gridcolumn Moisture Variability Using High-Resolution Cloud Observations. Part 1: Method

    NASA Technical Reports Server (NTRS)

    Norris, Peter M.; Da Silva, Arlindo M.

    2016-01-01

    A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC.

  17. Monte Carlo Bayesian inference on a statistical model of sub-gridcolumn moisture variability using high-resolution cloud observations. Part 1: Method

    PubMed Central

    Norris, Peter M.; da Silva, Arlindo M.

    2018-01-01

    A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC. PMID:29618847

  18. Using Data Assimilation Methods for Prediction of Solar Activity

    NASA Technical Reports Server (NTRS)

    Kitiashvili, Irina N.; Collins, Nancy S.

    2017-01-01

    The variable solar magnetic activity known as the 11-year solar cycle has the longest history of solar observations. These cycles dramatically affect conditions in the heliosphere and the Earth's space environment. Our current understanding of the physical processes that make up global solar dynamics and the dynamo that generates the magnetic fields is sketchy, resulting in unrealistic descriptions in theoretical and numerical models of the solar cycles. The absence of long-term observations of solar interior dynamics and photospheric magnetic fields hinders development of accurate dynamo models and their calibration. In such situations, mathematical data assimilation methods provide an optimal approach for combining the available observational data and their uncertainties with theoretical models in order to estimate the state of the solar dynamo and predict future cycles. In this presentation, we will discuss the implementation and performance of an Ensemble Kalman Filter data assimilation method based on the Parker migratory dynamo model, complemented by the equation of magnetic helicity conservation and long-term sunspot data series. This approach has allowed us to reproduce the general properties of solar cycles and has already demonstrated a good predictive capability for the current cycle, 24. We will discuss further development of this approach, which includes a more sophisticated dynamo model, synoptic magnetogram data, and employs the DART Data Assimilation Research Testbed.
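
    The analysis step of an Ensemble Kalman Filter of the kind described above fits in a few lines. A hedged sketch with an invented three-variable state and one observed quantity (all numbers are placeholders, not the dynamo model):

        # One EnKF analysis step; state size, H and R are toy assumptions.
        import numpy as np

        rng = np.random.default_rng(1)
        Ne, Nx = 20, 3                       # ensemble size, state dimension
        X = rng.normal(size=(Nx, Ne))        # forecast ensemble
        H = np.array([[1.0, 0.0, 0.0]])      # observe first state component
        R = np.array([[0.1]])                # observation error variance
        y = np.array([0.8])                  # the observation

        A = X - X.mean(axis=1, keepdims=True)           # ensemble anomalies
        P = A @ A.T / (Ne - 1)                          # sample covariance
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
        for i in range(Ne):                  # perturbed-observation update
            d = y + rng.normal(scale=np.sqrt(R[0, 0])) - H @ X[:, i]
            X[:, i] += (K @ d).ravel()
        print("analysis mean:", X.mean(axis=1))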

  19. Risk Management and Physical Modelling for Mountainous Natural Hazards

    NASA Astrophysics Data System (ADS)

    Lehning, Michael; Wilhelm, Christian

    Population growth and climate change cause rapid changes in mountainous regions, resulting in increased risks of floods, avalanches, debris flows and other natural hazards. X-events are of particular concern, since attempts to protect against them result in exponentially growing costs. In this contribution, we suggest an integral risk management approach to dealing with natural hazards that occur in mountainous areas. Using the example of a mountain pass road, which can be protected from the danger of an avalanche by engineering (galleries) and/or organisational (road closure) measures, we show the advantage of an optimal combination of both versus the traditional approach, which is to rely solely on engineering structures. Organisational measures become especially important for X-events because engineering structures cannot be designed for those events. However, organisational measures need a reliable and objective forecast of the hazard. Therefore, we further suggest that such forecasts should be developed using physical numerical modelling. We present the status of current approaches to using physical modelling to predict snow cover stability for avalanche warnings and peak runoff from mountain catchments for flood warnings. While detailed physical models can already predict peak runoff reliably, they are only used to support avalanche warnings. With increased process knowledge and computer power, current developments should lead to an enhanced role for detailed physical models in natural mountain hazard prediction.

  20. Early Estimation of Solar Activity Cycle: Potential Capability and Limits

    NASA Technical Reports Server (NTRS)

    Kitiashvili, Irina N.; Collins, Nancy S.

    2017-01-01

    The variable solar magnetic activity known as the 11-year solar cycle has the longest history of solar observations. These cycles dramatically affect conditions in the heliosphere and the Earth's space environment. Our current understanding of the physical processes that make up global solar dynamics and the dynamo that generates the magnetic fields is sketchy, resulting in unrealistic descriptions in theoretical and numerical models of the solar cycles. The absence of long-term observations of solar interior dynamics and photospheric magnetic fields hinders development of accurate dynamo models and their calibration. In such situations, mathematical data assimilation methods provide an optimal approach for combining the available observational data and their uncertainties with theoretical models in order to estimate the state of the solar dynamo and predict future cycles. In this presentation, we will discuss the implementation and performance of an Ensemble Kalman Filter data assimilation method based on the Parker migratory dynamo model, complemented by the equation of magnetic helicity conservation and long-term sunspot data series. This approach has allowed us to reproduce the general properties of solar cycles and has already demonstrated a good predictive capability for the current cycle, 24. We will discuss further development of this approach, which includes a more sophisticated dynamo model, synoptic magnetogram data, and employs the DART Data Assimilation Research Testbed.

  1. Application of Precipitate Free Zone Growth Kinetics to the β-Phase Depletion Behavior in a CoNiCrAlY Coating Alloy: An Analytical Approach

    NASA Astrophysics Data System (ADS)

    Chen, H.

    2018-06-01

    This paper concerns the β-phase depletion kinetics of a thermally sprayed free-standing CoNiCrAlY (Co-31.7 pct Ni-20.8 pct Cr-8.1 pct Al-0.5 pct Y, all in wt pct) coating alloy. An analytical β-phase depletion model based on precipitate free zone growth kinetics was developed to calculate the β-phase depletion kinetics during isothermal oxidation. This approach, which accounts for the molar volume of the alloy, the interfacial energy of the γ/β interface, and the Al concentration at the γ/γ + β boundary, requires the Al concentrations in the β-phase depletion zone as the input rather than the oxidation kinetics at the oxide/coating interface. The calculated β-phase depletion zones derived from the current model were compared with experimental results. It is shown that the calculated β-phase depletion zones using the current model are in reasonable agreement with those obtained experimentally. The constant compositional terms used in the model are likely to cause the discrepancies between the model predictions and experimental results. This analytical approach, which shows a reasonable correlation with experimental results, demonstrates good reliability for fast lifetime prediction of MCrAlY coatings.

  2. Application of Precipitate Free Zone Growth Kinetics to the β-Phase Depletion Behavior in a CoNiCrAlY Coating Alloy: An Analytical Approach

    NASA Astrophysics Data System (ADS)

    Chen, H.

    2018-03-01

    This paper concerns the β-phase depletion kinetics of a thermally sprayed free-standing CoNiCrAlY (Co-31.7 pct Ni-20.8 pct Cr-8.1 pct Al-0.5 pct Y, all in wt pct) coating alloy. An analytical β-phase depletion model based on precipitate free zone growth kinetics was developed to calculate the β-phase depletion kinetics during isothermal oxidation. This approach, which accounts for the molar volume of the alloy, the interfacial energy of the γ/β interface, and the Al concentration at the γ/γ + β boundary, requires the Al concentrations in the β-phase depletion zone as the input rather than the oxidation kinetics at the oxide/coating interface. The calculated β-phase depletion zones derived from the current model were compared with experimental results. It is shown that the calculated β-phase depletion zones using the current model are in reasonable agreement with those obtained experimentally. The constant compositional terms used in the model are likely to cause the discrepancies between the model predictions and experimental results. This analytical approach, which shows a reasonable correlation with experimental results, demonstrates good reliability for fast lifetime prediction of MCrAlY coatings.

  3. Modeling and stabilization results for a charge or current-actuated active constrained layer (ACL) beam model with the electrostatic assumption

    NASA Astrophysics Data System (ADS)

    Özer, Ahmet Özkan

    2016-04-01

    An infinite-dimensional model for a three-layer active constrained layer (ACL) beam, consisting of a piezoelectric elastic layer at the top and an elastic host layer at the bottom constraining a viscoelastic layer in the middle, is obtained for clamped-free boundary conditions by using a thorough variational approach. The Rao-Nakra thin compliant layer approximation is adopted to model the sandwich structure, and the electrostatic approach (magnetic effects are ignored) is assumed for the piezoelectric layer. Instead of voltage actuation, the piezoelectric layer is proposed to be activated by a charge (or current) source. We show that the closed-loop system with all-mechanical feedback is uniformly exponentially stable. Our result is the outcome of a compact perturbation argument and a unique continuation result for the spectral problem, which relies on the multipliers method. Finally, the modeling methodology of the paper is generalized to multilayer ACL beams, and the uniform exponential stabilizability result is established analogously.

  4. Tailoring Mathematical Models to Stem-Cell Derived Cardiomyocyte Lines Can Improve Predictions of Drug-Induced Changes to Their Electrophysiology.

    PubMed

    Lei, Chon Lok; Wang, Ken; Clerx, Michael; Johnstone, Ross H; Hortigon-Vinagre, Maria P; Zamora, Victor; Allan, Andrew; Smith, Godfrey L; Gavaghan, David J; Mirams, Gary R; Polonchuk, Liudmila

    2017-01-01

    Human induced pluripotent stem cell derived cardiomyocytes (iPSC-CMs) have applications in disease modeling, cell therapy, drug screening and personalized medicine. Computational models can be used to interpret experimental findings in iPSC-CMs, provide mechanistic insights, and translate these findings to adult cardiomyocyte (CM) electrophysiology. However, different cell lines display different expression of ion channels, pumps and receptors, and show differences in electrophysiology. In this exploratory study, we use a mathematical model based on iPSC-CMs from Cellular Dynamics International (CDI, iCell), and compare its predictions to novel experimental recordings made with the Axiogenesis Cor.4U line. We show that tailoring this model to the specific cell line, even using limited data and a relatively simple approach, leads to improved predictions of baseline behavior and response to drugs. This demonstrates the need for, and the feasibility of, tailoring models to individual cell lines, although a more refined approach will be needed to characterize individual currents, address differences in ion current kinetics, and further improve these results.

  5. Lexical is as lexical does: computational approaches to lexical representation

    PubMed Central

    Woollams, Anna M.

    2015-01-01

    In much of neuroimaging and neuropsychology, regions of the brain have been associated with ‘lexical representation’, with little consideration as to what this cognitive construct actually denotes. Within current computational models of word recognition, there are a number of different approaches to the representation of lexical knowledge. Structural lexical representations, found in original theories of word recognition, have been instantiated in modern localist models. However, such a representational scheme lacks neural plausibility in terms of economy and flexibility. Connectionist models have therefore adopted distributed representations of form and meaning. Semantic representations in connectionist models necessarily encode lexical knowledge. Yet when equipped with recurrent connections, connectionist models can also develop attractors for familiar forms that function as lexical representations. Current behavioural, neuropsychological and neuroimaging evidence shows a clear role for semantic information, but also suggests some modality- and task-specific lexical representations. A variety of connectionist architectures could implement these distributed functional representations, and further experimental and simulation work is required to discriminate between these alternatives. Future conceptualisations of lexical representations will therefore emerge from a synergy between modelling and neuroscience. PMID:25893204

  6. Comparative study of two approaches to model the offshore fish cages

    NASA Astrophysics Data System (ADS)

    Zhao, Yun-peng; Wang, Xin-xin; Decew, Jud; Tsukrov, Igor; Bai, Xiao-dong; Bi, Chun-wei

    2015-06-01

    The goal of this paper is to provide a comparative analysis of two commonly used approaches to discretize offshore fish cages: the lumped-mass approach and the finite element technique. Two case studies are chosen to compare predictions of the LMA (lumped-mass approach) and FEA (finite element analysis) based numerical modeling techniques. In both case studies, we consider several loading conditions consisting of different uniform currents and monochromatic waves. We investigate the motion of the cage, its deformation, and the resultant tension in the mooring lines. Both models' predictions are sufficiently close to the experimental data, but for the first experiment the DUT-FlexSim predictions are slightly more accurate than those provided by Aqua-FE™. According to the comparisons, both models can be successfully applied to the design and analysis of offshore fish cages, provided that an appropriate safety factor is chosen.

  7. Spatial modeling in ecology: the flexibility of eigenfunction spatial analyses.

    PubMed

    Griffith, Daniel A; Peres-Neto, Pedro R

    2006-10-01

    Recently, analytical approaches based on the eigenfunctions of spatial configuration matrices have been proposed in order to explicitly consider spatial predictors. The present study demonstrates the usefulness of eigenfunctions in spatial modeling applied to ecological problems and shows equivalencies of and differences between the two current implementations of this methodology. The two approaches in this category are the distance-based (DB) eigenvector maps proposed by P. Legendre and his colleagues, and spatial filtering based upon geographic connectivity matrices (i.e., topology-based; CB) developed by D. A. Griffith and his colleagues. In both cases, the goal is to create spatial predictors that can be easily incorporated into conventional regression models. One important advantage of these two approaches over any other spatial approach is that they provide a flexible tool that allows the full range of general and generalized linear modeling theory to be applied to ecological and geographical problems in the presence of nonzero spatial autocorrelation.
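
    In the topology-based (CB) variant, the spatial predictors are eigenvectors of a doubly-centred connectivity matrix. A minimal sketch, assuming a toy 5x5 lattice with rook contiguity (grid and neighbour rule are illustrative assumptions):

        # Topology-based spatial filtering sketch: eigenvectors of the
        # doubly-centred connectivity matrix become regression predictors.
        import numpy as np

        n = 25                                        # 5x5 lattice of sites
        xy = np.array([(i, j) for i in range(5) for j in range(5)])
        C = (np.abs(xy[:, None] - xy[None, :]).sum(-1) == 1).astype(float)

        I = np.eye(n); one = np.ones((n, n)) / n
        M = (I - one) @ C @ (I - one)                 # double centring
        vals, vecs = np.linalg.eigh(M)
        E = vecs[:, np.argsort(vals)[::-1][:5]]       # top-5 spatial maps

        # E's columns can now enter an ordinary regression, e.g. y ~ X + E
        print("leading eigenvalues:", np.sort(vals)[::-1][:5].round(2))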

  8. Next generation initiation techniques

    NASA Technical Reports Server (NTRS)

    Warner, Tom; Derber, John; Zupanski, Milija; Cohn, Steve; Verlinde, Hans

    1993-01-01

    Four-dimensional data assimilation strategies can generally be classified as either current or next generation, depending upon whether they are used operationally or not. Current-generation data-assimilation techniques are those that are presently used routinely in operational-forecasting or research applications. They can be classified into the following categories: intermittent assimilation, Newtonian relaxation, and physical initialization. It should be noted that these techniques are the subject of continued research, and their improvement will parallel the development of next-generation techniques described by the other speakers. Next-generation assimilation techniques are those that are under development but are not yet used operationally. Most of these procedures are derived from control theory or variational methods and primarily represent continuous assimilation approaches, in which the data and model dynamics are 'fitted' to each other in an optimal way. Another next-generation category is the initialization of convective-scale models. Intermittent assimilation systems use an objective analysis to combine all observations within a time window that is centered on the analysis time. Continuous current-generation assimilation systems are usually based on the Newtonian-relaxation or 'nudging' techniques. Physical initialization procedures generally involve the use of standard or nonstandard data to force some physical process in the model during an assimilation period. Under the topic of next-generation assimilation techniques, variational approaches are currently being actively developed. Variational approaches seek to minimize a cost or penalty function which measures a model's fit to observations, background fields and other imposed constraints. Alternatively, the Kalman filter technique, which is also under investigation as a data assimilation procedure for numerical weather prediction, can yield acceptable initial conditions for mesoscale models. The third kind of next-generation technique involves strategies to initialize convective-scale (non-hydrostatic) models.
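
    Newtonian relaxation ("nudging"), one of the current-generation categories above, simply adds a term proportional to the model-observation mismatch to the tendency. A toy sketch, with placeholder dynamics and nudging coefficient:

        # Minimal nudging sketch: tendency = dynamics + G * (obs - state).
        import numpy as np

        def f(x):                 # placeholder model dynamics
            return -0.5 * x

        G, dt = 0.3, 0.1          # nudging coefficient, time step
        x, x_obs = 2.0, 1.0       # model state, (constant) observation
        for _ in range(100):
            x += dt * (f(x) + G * (x_obs - x))   # relax toward the obs
        print("nudged state:", round(x, 3))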

  9. Determinants of CD4 cell count change and time-to default from HAART; a comparison of separate and joint models.

    PubMed

    Tegegne, Awoke Seyoum; Ndlovu, Principal; Zewotir, Temesgen

    2018-04-27

    HIV has its most serious effects in Sub-Saharan African countries compared to countries in other parts of the world. Among these countries, Ethiopia has been affected significantly by the disease, and the burden of the disease is worst in the Amhara Region, one of the eleven regions of the country. Patients defaulting or dropping out of treatment plays a significant role in treatment failure. The current research was conducted with the objective of comparing the performance of the joint and the separate modeling approaches in determining important factors that affect HIV patients' longitudinal CD4 cell count change and time to default from treatment. Longitudinal data were obtained from the records of 792 adult HIV patients at Felege-Hiwot Teaching and Specialized Hospital in Ethiopia. Two alternative approaches, namely separate and joint modeling data analyses, were conducted in the current study. Joint modeling was conducted for an analysis of the change of CD4 cell count and the time to default from treatment. In the joint model, a generalized linear mixed-effects model and a Weibull survival sub-model were combined for the repeated measures of CD4 cell count change and the number of follow-ups patients remained in treatment. Finally, the two models were linked through their shared unobserved random effects using a shared-parameter model. Both the separate and the joint modeling approaches revealed consistent results. However, the joint modeling approach was more parsimonious and fitted the given data better than the separate one. Age, baseline CD4 cell count, marital status, sex, ownership of a cell phone, adherence to HAART, disclosure of the disease and the number of follow-ups were important predictors of both the fluctuation of CD4 cell count and the time to default from treatment. The inclusion of patient-specific variations in the analyses of the two outcomes improved the model significantly. Certain groups of patients were identified in the current investigation. The groups identified had high fluctuation in CD4 cell count and defaulted from HAART without any convincing reasons. Such patients need intensive intervention to adhere to the prescribed medication.

  10. Modeling Current-Voltage Characteristics of Proteorhodopsin and Bacteriorhodopsin: Towards an Optoelectronics Based on Proteins.

    PubMed

    Alfinito, Eleonora; Reggiani, Lino

    2016-10-01

    Current-voltage characteristics of metal-protein-metal structures made of proteorhodopsin and bacteriorhodopsin are modeled by using a percolation-like approach. Starting from the tertiary structure pertaining to the single protein, an analogous resistance network is created. Charge transfer inside the network is described as a sequential tunneling mechanism and the current is calculated for each value of the given voltage. The theory is validated with available experiments, in dark and light. The role of the tertiary structure of the single protein and of the mechanisms responsible for the photo-activity is discussed.
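
    The percolation-like idea of mapping a protein onto a resistance network can be illustrated with Kirchhoff's laws on a toy graph; the 1-D chain topology and the single tunneling-like conductance below are assumptions for illustration, not the authors' network:

        # Resistance-network sketch: nodes as residues, links as conductances;
        # node potentials follow from Kirchhoff's laws via the graph Laplacian.
        import numpy as np

        n = 6
        g = np.zeros((n, n))
        for i in range(n - 1):
            g[i, i + 1] = g[i + 1, i] = np.exp(-1.0)  # assumed conductance
        L = np.diag(g.sum(1)) - g                     # graph Laplacian

        V0, Vn = 1.0, 0.0                             # electrode potentials
        interior = np.arange(1, n - 1)
        b = -L[np.ix_(interior, [0, n - 1])] @ np.array([V0, Vn])
        V = np.empty(n); V[0], V[-1] = V0, Vn
        V[interior] = np.linalg.solve(L[np.ix_(interior, interior)], b)
        I = g[0, 1] * (V[0] - V[1])                   # injected current
        print("node potentials:", V.round(3), "current:", round(I, 4))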

  11. Kuiper Prize Lecture - Present and past climates of the terrestrial planets

    NASA Technical Reports Server (NTRS)

    Pollack, James B.

    1991-01-01

    An evaluation is undertaken of the current understanding of factors shaping the current climates of Venus, Mars, and the Earth, in conjunction with the ways in which these planetary climates may have been different in the past. Attention is given to modeling approaches of various levels of sophistication which both characterize current climates and elucidate prior climatic epochs; these are assessed in light of observational data in order to judge degrees of success thus far and formulate major remaining questions for future investigations. Venus is noted to offer excellent opportunities for modeling the greenhouse effect.

  12. Differential geometry based model for eddy current inspection of U-bend sections in steam generator tubes

    NASA Astrophysics Data System (ADS)

    Mukherjee, Saptarshi; Rosell, Anders; Udpa, Lalita; Udpa, Satish; Tamburrino, Antonello

    2017-02-01

    The modeling of the U-bend segment in steam generator tubes for predicting eddy current probe signals from cracks, wear and pitting in this region poses challenges and is non-trivial. Meshing the geometry in the Cartesian coordinate system might require a large number of elements to model the U-bend region. Also, since the lift-off distance between the probe and the tube wall is usually very small, a very fine mesh is required near the probe region to accurately describe the eddy current field. This paper presents a U-bend model using differential geometry principles that exploits the result that Maxwell's equations are covariant with respect to changes of coordinates and independent of metrics. The equations remain unaltered in their form, regardless of the choice of coordinate system, provided the field quantities are represented in the proper covariant and contravariant form. The complex shapes are mapped into simple straight sections, while the small lift-off is mapped to larger values, thus reducing the intrinsic dimension of the mesh and stiffness matrix. In this contribution, the numerical implementation of the above approach will be discussed with regard to field and current distributions within the U-bend tube wall. For the sake of simplicity, a two-dimensional test case will be considered. The approach is evaluated in terms of efficiency and accuracy by comparing the results with those obtained using a conventional FE model in Cartesian coordinates.

  13. Gene Therapy Models of Alzheimer’s Disease and Other Dementias

    PubMed Central

    Combs, Benjamin; Kneynsberg, Andrew; Kanaan, Nicholas M.

    2016-01-01

    Dementias are among the most common neurological disorders, and Alzheimer’s disease (AD) is the most common cause of dementia worldwide. AD remains a looming health crisis despite great efforts to learn the mechanisms surrounding the neuron dysfunction and neurodegeneration that accompanies AD, primarily in the medial temporal lobe. In addition to AD, a group of diseases known as frontotemporal dementias (FTDs) are degenerative diseases involving atrophy and degeneration in the frontal and temporal lobe regions. Importantly, AD and a number of FTDs are collectively known as tauopathies due to the abundant accumulation of pathological tau inclusions in the brain. The precise role tau plays in disease pathogenesis remains an area of strong research focus. A critical component to effectively study any human disease is the availability of models that recapitulate key features of the disease. Accordingly, a number of animal models are currently being pursued to fill the current gaps in our knowledge of the causes of dementias and to develop effective therapeutics. Recent developments in gene therapy-based approaches, particularly in recombinant adeno-associated viruses (rAAVs), have provided new tools to study AD and other related neurodegenerative disorders. Additionally, gene therapy approaches have emerged as an intriguing possibility for treating these diseases in humans. This chapter explores the current state of rAAV models of AD and other dementias, discusses recent efforts to improve these models, and describes current and future possibilities in the use of rAAVs and other viruses in treatments of disease. PMID:26611599

  14. Mechanical Behavior and Fatigue Studies of Rubber Components in Army Tracked Vehicles

    DTIC Science & Technology

    2010-08-13

    strategy moved to glassy polymers (Bouvard et al., 2010)
      – Current efforts apply the ISV modeling strategy to elastomers
    • Fatigue approach
      – Researchers ... metals at CAVS
      – Researchers have typically only investigated long cracks for elastomers (Mars and Fatemi, 2003; Busfield et al., 2002; Chou et al., 2007)
      – Current efforts are to add MSC/PSC, INC to fatigue modeling of elastomers and incorporate microstructure

  15. Optimisation of a Generic Ionic Model of Cardiac Myocyte Electrical Activity

    PubMed Central

    Guo, Tianruo; Al Abed, Amr; Lovell, Nigel H.; Dokos, Socrates

    2013-01-01

    A generic cardiomyocyte ionic model, whose complexity lies between a simple phenomenological formulation and a biophysically detailed ionic membrane current description, is presented. The model provides a user-defined number of ionic currents, employing two-gate Hodgkin-Huxley type kinetics. Its generic nature allows accurate reconstruction of action potential waveforms recorded experimentally from a range of cardiac myocytes. Using a multiobjective optimisation approach, the generic ionic model was optimised to accurately reproduce multiple action potential waveforms recorded from central and peripheral sinoatrial nodes and right atrial and left atrial myocytes from rabbit cardiac tissue preparations, under different electrical stimulus protocols and pharmacological conditions. When fitted simultaneously to multiple datasets, the time course of several physiologically realistic ionic currents could be reconstructed. Model behaviours tend to be well identified when extra experimental information is incorporated into the optimisation. PMID:23710254
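
    A single two-gate Hodgkin-Huxley-type current of the kind the generic model stacks up can be sketched as follows; the gate parameters, clamp voltage and conductance are illustrative assumptions, not fitted values from the paper:

        # One two-gate HH-type current: I = g * m * h * (V - E_rev).
        import numpy as np
        from scipy.integrate import solve_ivp

        g, E_rev = 1.0, -85.0               # assumed conductance, reversal

        def inf_tau(V, Vh, k, tau):         # steady state and time constant
            return 1.0 / (1.0 + np.exp((Vh - V) / k)), tau

        def rhs(t, y, V=-20.0):             # voltage clamp; evolve gates m, h
            m, h = y
            m_inf, tau_m = inf_tau(V, -30.0,  5.0,  2.0)
            h_inf, tau_h = inf_tau(V, -60.0, -5.0, 50.0)
            return [(m_inf - m) / tau_m, (h_inf - h) / tau_h]

        sol = solve_ivp(rhs, [0, 200], [0.0, 1.0])
        m, h = sol.y[:, -1]
        print("late current:", g * m * h * (-20.0 - E_rev))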

  16. Principles of Design for High Performing Organizations: An Assessment of the State of the Field of Organizational Design Research

    DTIC Science & Technology

    1994-03-01

    asked whether the planned structure considered (a) all objectives, (b) all functions, (c) all relevant units of analysis such as the plant, the ... literature and provides an integrative model of design for high performing organizations. The model is based on an analysis of current theories of ... important midrange theories underlie much of the work on organizational analysis. • Systems Approaches: these approaches emphasize the rational, goal

  17. Inter-model analysis of tsunami-induced coastal currents

    NASA Astrophysics Data System (ADS)

    Lynett, Patrick J.; Gately, Kara; Wilson, Rick; Montoya, Luis; Arcas, Diego; Aytore, Betul; Bai, Yefei; Bricker, Jeremy D.; Castro, Manuel J.; Cheung, Kwok Fai; David, C. Gabriel; Dogan, Gozde Guney; Escalante, Cipriano; González-Vida, José Manuel; Grilli, Stephan T.; Heitmann, Troy W.; Horrillo, Juan; Kânoğlu, Utku; Kian, Rozita; Kirby, James T.; Li, Wenwen; Macías, Jorge; Nicolsky, Dmitry J.; Ortega, Sergio; Pampell-Manis, Alyssa; Park, Yong Sung; Roeber, Volker; Sharghivand, Naeimeh; Shelby, Michael; Shi, Fengyan; Tehranirad, Babak; Tolkova, Elena; Thio, Hong Kie; Velioğlu, Deniz; Yalçıner, Ahmet Cevdet; Yamazaki, Yoshiki; Zaytsev, Andrey; Zhang, Y. J.

    2017-06-01

    To help produce accurate and consistent maritime hazard products, the National Tsunami Hazard Mitigation Program organized a benchmarking workshop to evaluate the numerical modeling of tsunami currents. Thirteen teams of international researchers, using a set of tsunami models currently utilized for hazard mitigation studies, presented results for a series of benchmarking problems; these results are summarized in this paper. Comparisons focus on physical situations where the currents are shear and separation driven, and are thus de-coupled from the incident tsunami waveform. In general, we find that models of increasing physical complexity provide better accuracy, and that low-order three-dimensional models are superior to high-order two-dimensional models. Inside separation zones and in areas strongly affected by eddies, the magnitude of both model-data errors and inter-model differences can be the same as the magnitude of the mean flow. Thus, we make arguments for the need of an ensemble modeling approach for areas affected by large-scale turbulent eddies, where deterministic simulation may be misleading. As a result of the analyses presented herein, we expect that tsunami modelers now have a better awareness of their ability to accurately capture the physics of tsunami currents, and therefore a better understanding of how to use these simulation tools for hazard assessment and mitigation efforts.

  18. Developments in the Gung Ho dynamical core

    NASA Astrophysics Data System (ADS)

    Melvin, Thomas

    2017-04-01

    Gung Ho is the new dynamical core being developed for the next-generation Met Office weather and climate model, suitable for meeting the exascale challenge on emerging computer architectures. It builds upon the earlier collaborative project of the same name between the Met Office, NERC and STFC Daresbury to investigate suitable numerical methods for dynamical cores. A mixed finite element approach is used, where different finite element spaces are used to represent various fields. This method provides a number of improvements over the current model, such as compatibility and inherent conservation on quasi-uniform unstructured meshes, whilst maintaining the accuracy and good dispersion properties of the staggered grid currently used. Furthermore, the mixed finite element approach allows a large degree of flexibility in the type of mesh, order of approximation and discretisation, providing a simple way to test alternative options to obtain the best model possible.

  19. Modeling of delays in PKPD: classical approaches and a tutorial for delay differential equations.

    PubMed

    Koch, Gilbert; Krzyzanski, Wojciech; Pérez-Ruixo, Juan Jose; Schropp, Johannes

    2014-08-01

    In pharmacokinetics/pharmacodynamics (PKPD), the measured response is often delayed relative to drug administration, individuals in a population have a certain lifespan until they maturate, or a change in biomarkers does not immediately affect the primary endpoint. The classical approach in PKPD is to apply transit compartment models (TCMs) based on ordinary differential equations to handle such delays. However, an alternative way to deal with delays is delay differential equations (DDEs). DDEs feature additional flexibility and properties, realize more complex dynamics and can be used in a complementary fashion together with TCMs. We introduce several delay-based PKPD models and investigate mathematical properties of general DDE-based models, which serve as subunits in order to build larger PKPD models. Finally, we review current PKPD software with respect to the implementation of DDEs for PKPD analysis.
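
    The classical TCM route mentioned above encodes a delay as a chain of n transit compartments with rate ktr, approximating a lag of n/ktr; as n grows, the chain approaches a discrete (DDE-style) delay. A short sketch with invented parameters:

        # Transit-compartment chain approximating a PKPD delay of n/ktr.
        import numpy as np
        from scipy.integrate import solve_ivp

        n, ktr = 4, 0.8                       # assumed steps and transit rate

        def tcm(t, a):
            inflow = 1.0 if t < 1.0 else 0.0  # brief dosing pulse
            da = np.empty(n)
            da[0] = inflow - ktr * a[0]
            for i in range(1, n):
                da[i] = ktr * (a[i - 1] - a[i])   # signal cascades down
            return da

        sol = solve_ivp(tcm, [0, 20], np.zeros(n), max_step=0.1)
        peak = sol.t[np.argmax(sol.y[-1])]
        print(f"last compartment peaks near t = {peak:.1f} (n/ktr = {n/ktr})")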

  20. Spatially resolved, in-situ monitoring of crack growth via the coupling current in aluminum alloy 5083

    NASA Astrophysics Data System (ADS)

    Williams, Krystaufeux D.

    The work discussed in this dissertation is an experimental validation of a body of research that was created to model stress corrosion cracking phenomena in 304 stainless steels in boiling water reactors. This coupled environment fracture model (CEFM) incorporates the natural laws of the conservation of charge and the differential aeration hypothesis to predict the amount of stress corrosion crack growth as a function of many external environmental variables, including potential, stress intensity, solution conductivity, oxidizer concentrations, and various other environmental parameters. Out of this approach came the concept of the coupling current: a local corrosion current that flows from within cracks, crevices, pits, etc., of a metal or alloy to the external surface. Because of the deterministic approach taken in the research mentioned, the coupling current analysis and CEFM model can be applied to the specific problem of SCC in aluminum alloy 5083 (the alloy of interest for this dissertation, highly sought after today because of its corrosion resistance and high strength-to-weight ratio). This dissertation research is specifically devoted to the experimental verification of the coupling current, which results from a coupling between the crack's internal and external environments, by spatially resolving it using the scanning vibrating probe (SVP) as a tool. Hence, through the use of a unique fracture mechanics setup, simultaneous mechanical and local electrochemical data may be obtained in situ.

  1. AQMEII: A New International Initiative on Air Quality Model Evaluation

    EPA Science Inventory

    We provide a conceptual view of the process of evaluating regional-scale three-dimensional numerical photochemical air quality modeling systems, based on an examination of existing approaches to the evaluation of such systems as they are currently used in a variety of application....

  2. Capturing well-being in activity pattern models within activity-based travel demand models.

    DOT National Transportation Integrated Search

    2013-03-01

    The activity-based approach, which is based on the premise that the demand for travel is derived from the demand for activities, currently constitutes the state of the art in metropolitan travel demand forecasting and particularly in a form known ...

  3. Capturing well-being in activity pattern models within activity-based travel demand models.

    DOT National Transportation Integrated Search

    2013-04-01

    The activity-based approach, which is based on the premise that the demand for travel is derived from the demand for activities, currently constitutes the state of the art in metropolitan travel demand forecasting and particularly in a form known ...

  4. Treatment of Sexual Offenders: Research, Best Practices, and Emerging Models

    ERIC Educational Resources Information Center

    Yates, Pamela M.

    2013-01-01

    Treatment of sexual offenders has evolved substantially over the years; various theoretical and practice models of treatment have been developed, modified, refined, and proposed over time. The predominant currently recommended approach, supported by research, adheres to specific principles of effective correctional intervention, follows a…

  5. A Chimpanzee (Pan troglodytes) Model of Triarchic Psychopathy Constructs: Development and Initial Validation

    PubMed Central

    Latzman, Robert D.; Drislane, Laura E.; Hecht, Lisa K.; Brislin, Sarah J.; Patrick, Christopher J.; Lilienfeld, Scott O.; Freeman, Hani J.; Schapiro, Steven J.; Hopkins, William D.

    2015-01-01

    The current work sought to operationalize constructs of the triarchic model of psychopathy in chimpanzees (Pan troglodytes), a species well-suited for investigations of basic biobehavioral dispositions relevant to psychopathology. Across three studies, we generated validity evidence for scale measures of the triarchic model constructs in a large sample (N=238) of socially-housed chimpanzees. Using a consensus-based rating approach, we first identified candidate items for the chimpanzee triarchic (CHMP-Tri) scales from an existing primate personality instrument and refined these into scales. In Study 2, we collected data for these scales from human informants (N=301), and examined their convergent and divergent relations with scales from another triarchic inventory developed for human use. In Study 3, we undertook validation work examining associations between CHMP-Tri scales and task measures of approach-avoidance behavior (N=73) and ability to delay gratification (N=55). Current findings provide support for a chimpanzee model of core dispositions relevant to psychopathy and other forms of psychopathology. PMID:26779396

  6. 1992 NASA Life Support Systems Analysis workshop

    NASA Technical Reports Server (NTRS)

    Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.

    1992-01-01

    The 1992 Life Support Systems Analysis Workshop was sponsored by NASA's Office of Aeronautics and Space Technology (OAST) to integrate the inputs from, disseminate information to, and foster communication among NASA, industry, and academic specialists. The workshop continued discussion and definition of key issues identified in the 1991 workshop, including: (1) modeling and experimental validation; (2) definition of systems analysis evaluation criteria; (3) integration of modeling at multiple levels; and (4) assessment of process control modeling approaches. Through both the 1991 and 1992 workshops, NASA has continued to seek input from industry and university chemical process modeling and analysis experts, and to introduce and apply new systems analysis approaches to life support systems. The workshop included technical presentations, discussions, and interactive planning, with sufficient time allocated for discussion of both technology status and technology development recommendations. Key personnel currently involved with life support technology developments from NASA, industry, and academia provided input to the status and priorities of current and future systems analysis methods and requirements.

  7. Characterizing and Assessing a Large-Scale Software Maintenance Organization

    NASA Technical Reports Server (NTRS)

    Briand, Lionel; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1995-01-01

    One important component of a software process is the organizational context in which the process is enacted. This component is often missing or incomplete in current process modeling approaches. One technique for modeling this perspective is the Actor-Dependency (AD) Model. This paper reports on a case study which used this approach to analyze and assess a large software maintenance organization. Our goal was to identify the approach's strengths and weaknesses while providing practical recommendations for improvement and research directions. The AD model was found to be very useful in capturing the important properties of the organizational context of the maintenance process, and aided in the understanding of the flaws found in this process. However, a number of opportunities for extending and improving the AD model were identified. Among others, there is a need to incorporate quantitative information to complement the qualitative model.

  8. Hidden Markov Item Response Theory Models for Responses and Response Times.

    PubMed

    Molenaar, Dylan; Oberski, Daniel; Vermunt, Jeroen; De Boeck, Paul

    2016-01-01

    Current approaches to model responses and response times to psychometric tests solely focus on between-subject differences in speed and ability. Within subjects, speed and ability are assumed to be constants. Violations of this assumption are generally absorbed in the residual of the model. As a result, within-subject departures from the between-subject speed and ability level remain undetected. These departures may be of interest to the researcher as they reflect differences in the response processes adopted on the items of a test. In this article, we propose a dynamic approach for responses and response times based on hidden Markov modeling to account for within-subject differences in responses and response times. A simulation study is conducted to demonstrate acceptable parameter recovery and acceptable performance of various fit indices in distinguishing between different models. In addition, both a confirmatory and an exploratory application are presented to demonstrate the practical value of the modeling approach.
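
    The forward recursion underlying such hidden Markov approaches is compact; a sketch for a two-state model over item responses (transition, initial and emission values are invented, and response times are omitted for brevity):

        # Forward algorithm for a two-state HMM over item responses.
        import numpy as np

        A  = np.array([[0.9, 0.1],        # transitions between latent states
                       [0.2, 0.8]])       # (e.g., guessing vs. solving)
        pi = np.array([0.5, 0.5])         # initial state probabilities
        p_correct = np.array([0.6, 0.9])  # P(correct | state)

        responses = [1, 1, 0, 1, 1]       # one test taker's item scores
        alpha = pi * np.where(responses[0], p_correct, 1 - p_correct)
        for r in responses[1:]:
            alpha = (alpha @ A) * np.where(r, p_correct, 1 - p_correct)
        print("likelihood of response string:", alpha.sum())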

  9. Correlation techniques to determine model form in robust nonlinear system realization/identification

    NASA Technical Reports Server (NTRS)

    Stry, Greselda I.; Mook, D. Joseph

    1991-01-01

    The fundamental challenge in the identification of nonlinear dynamic systems is determining the appropriate form of the model. A robust technique is presented which essentially eliminates this problem for many applications. The technique is based on the Minimum Model Error (MME) optimal estimation approach. A detailed literature review is included in which fundamental differences between the current approach and previous work are described. The most significant feature is the ability to identify nonlinear dynamic systems without prior assumption regarding the form of the nonlinearities, in contrast to existing nonlinear identification approaches which usually require detailed assumptions of the nonlinearities. Model form is determined via statistical correlation of the MME optimal state estimates with the MME optimal model error estimates. The example illustrations indicate that the method is robust with respect to prior ignorance of the model, and with respect to measurement noise, measurement frequency, and measurement record length.

  10. Development of a structured approach for decomposition of complex systems on a functional basis

    NASA Astrophysics Data System (ADS)

    Yildirim, Unal; Felician Campean, I.

    2014-07-01

    The purpose of this paper is to present the System State Flow Diagram (SSFD) as a structured and coherent methodology to decompose a complex system on a solution-independent functional basis. The paper starts by reviewing common function modelling frameworks in the literature and discusses the practical requirements of the SSFD in the context of the current literature and current approaches in industry. The proposed methodology is illustrated through the analysis of a case study: design analysis of a generic Bread Toasting System (BTS).

  11. An efficiency-decay model for Lumen maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bobashev, Georgiy; Baldasaro, Nicholas G.; Mills, Karmann C.

    Proposed is a multicomponent model for the estimation of light-emitting diode (LED) lumen maintenance using test data that were acquired in accordance with the test standards of the Illumination Engineering Society of North America, i.e., LM-80-08. Lumen maintenance data acquired with this test do not always follow exponential decay, particularly data collected in the first 1000 h or under low-stress (e.g., low temperature) conditions. This deviation from true exponential behavior makes it difficult to use the full data set in models for the estimation of the lumen maintenance decay coefficient. As a result, critical information that is relevant to the early life or low-stress operation of LED light sources may be missed. We present an efficiency-decay model approach, where all lumen maintenance data can be used to provide an alternative estimate of the decay rate constant. The approach considers a combined model wherein one part describes an initial “break-in” period and another part describes the decay in lumen maintenance. During the break-in period, several mechanisms within the LED can act to produce a small (typically < 10%) increase in luminous flux. The effect of the break-in period and its longevity is more likely to be present at low ambient temperatures and currents, where the discrepancy between a standard TM-21 approach and our proposed model is the largest. For high temperatures and currents, the difference between the estimates becomes nonsubstantial. Finally, our approach makes use of all the collected data and avoids producing unrealistic estimates of the decay coefficient.

  12. An efficiency-decay model for Lumen maintenance

    DOE PAGES

    Bobashev, Georgiy; Baldasaro, Nicholas G.; Mills, Karmann C.; ...

    2016-08-25

    Proposed is a multicomponent model for the estimation of light-emitting diode (LED) lumen maintenance using test data that were acquired in accordance with the test standards of the Illumination Engineering Society of North America, i.e., LM-80-08. Lumen maintenance data acquired with this test do not always follow exponential decay, particularly data collected in the first 1000 h or under low-stress (e.g., low temperature) conditions. This deviation from true exponential behavior makes it difficult to use the full data set in models for the estimation of the lumen maintenance decay coefficient. As a result, critical information that is relevant to the early life or low-stress operation of LED light sources may be missed. We present an efficiency-decay model approach, where all lumen maintenance data can be used to provide an alternative estimate of the decay rate constant. The approach considers a combined model wherein one part describes an initial “break-in” period and another part describes the decay in lumen maintenance. During the break-in period, several mechanisms within the LED can act to produce a small (typically < 10%) increase in luminous flux. The effect of the break-in period and its longevity is more likely to be present at low ambient temperatures and currents, where the discrepancy between a standard TM-21 approach and our proposed model is the largest. For high temperatures and currents, the difference between the estimates becomes nonsubstantial. Finally, our approach makes use of all the collected data and avoids producing unrealistic estimates of the decay coefficient.
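
    One plausible way to realize such a combined "break-in plus decay" model is to multiply a saturating break-in term by an exponential decay and fit both parts to the full data record. The functional form and numbers below are assumptions for illustration, not the authors' exact parameterization:

        # Fit a two-part break-in + decay curve to synthetic LM-80-style data.
        import numpy as np
        from scipy.optimize import curve_fit

        def lumen(t, b, tau, alpha):
            # saturating break-in (rise) times exponential lumen decay
            return (1.0 + b * (1.0 - np.exp(-t / tau))) * np.exp(-alpha * t)

        t = np.linspace(0, 6000, 40)                  # hours
        rng = np.random.default_rng(2)
        y = lumen(t, 0.05, 800.0, 2e-5) + rng.normal(0, 2e-3, t.size)

        p, _ = curve_fit(lumen, t, y, p0=[0.05, 500.0, 1e-5])
        print("b=%.3f tau=%.0f h alpha=%.2e /h" % tuple(p))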

  13. The North American Multi-Model Ensemble (NMME): Phase-1 Seasonal to Interannual Prediction, Phase-2 Toward Developing Intra-Seasonal Prediction

    NASA Technical Reports Server (NTRS)

    Kirtman, Ben P.; Min, Dughong; Infanti, Johnna M.; Kinter, James L., III; Paolino, Daniel A.; Zhang, Qin; vandenDool, Huug; Saha, Suranjana; Mendez, Malaquias Pena; Becker, Emily

    2013-01-01

    The recent US National Academies report "Assessment of Intraseasonal to Interannual Climate Prediction and Predictability" was unequivocal in recommending the need for the development of a North American Multi-Model Ensemble (NMME) operational predictive capability. Indeed, this effort is required to meet the specific tailored regional prediction and decision support needs of a large community of climate information users. The multi-model ensemble approach has proven extremely effective at quantifying prediction uncertainty due to uncertainty in model formulation, and has proven to produce better prediction quality (on average) than any single-model ensemble. This multi-model approach is the basis for several international collaborative prediction research efforts and an operational European system, and there are numerous examples of how this multi-model ensemble approach yields superior forecasts compared to any single model. Based on two NOAA Climate Test Bed (CTB) NMME workshops (February 18 and April 8, 2011), a collaborative and coordinated implementation strategy for a NMME prediction system has been developed and is currently delivering real-time seasonal-to-interannual predictions on the NOAA Climate Prediction Center (CPC) operational schedule. The hindcast and real-time prediction data are readily available (e.g., http://iridl.ldeo.columbia.edu/SOURCES/.Models/.NMME/) and in graphical format from CPC (http://origin.cpc.ncep.noaa.gov/products/people/wd51yf/NMME/index.html). Moreover, the NMME forecasts are already being used as guidance for operational forecasters. This paper describes the new NMME effort, presents an overview of the multi-model forecast quality, and discusses the complementary skill associated with individual models.
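
    The core multi-model ensemble operation is simply averaging across models and using the inter-model spread as an uncertainty estimate. A trivial sketch with invented model names and anomaly values:

        # Multi-model ensemble mean and spread; data are invented placeholders.
        import numpy as np

        forecasts = {                      # hypothetical SST anomaly forecasts
            "modelA": np.array([0.4, 0.5, 0.6]),
            "modelB": np.array([0.2, 0.3, 0.5]),
            "modelC": np.array([0.5, 0.4, 0.4]),
        }
        members = list(forecasts.values())
        nmme_mean = np.mean(members, axis=0)          # ensemble-mean forecast
        spread = np.std(members, axis=0)              # uncertainty estimate
        print("ensemble mean:", nmme_mean, "spread:", spread)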

  14. A Realization of Bias Correction Method in the GMAO Coupled System

    NASA Technical Reports Server (NTRS)

    Chang, Yehui; Koster, Randal; Wang, Hailan; Schubert, Siegfried; Suarez, Max

    2018-01-01

    Over the past several decades, a tremendous effort has been made to improve model performance in the simulation of the climate system. The cold or warm sea surface temperature (SST) bias in the tropics is still a problem common to most coupled ocean-atmosphere general circulation models (CGCMs). The precipitation biases in CGCMs are also accompanied by SST and surface wind biases. The deficiencies and biases over the equatorial oceans likely contribute to the precipitation biases over land surfaces through their influence on the Walker circulation. In this study, we introduce an approach in CGCM modeling to correct model biases. This approach utilizes the history of the model's short-term forecasting errors and their seasonal dependence to modify the model's tendency term and to minimize its climate drift. The study shows that such an approach removes most of the model's climate biases. A number of other aspects of the model simulation (e.g., extratropical transient activities) are also improved considerably due to the imposed pre-processed initial 3-hour model drift corrections. Because many regional biases in the GEOS-5 CGCM are common amongst other current models, our approaches and findings are applicable to these other models as well.
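
    A toy version of the tendency correction described here: a pre-computed, season-dependent mean short-term drift is subtracted from the model tendency at every step. The drift climatology and dynamics below are invented stand-ins, not GEOS-5 values:

        # Tendency bias correction sketch: subtract a known seasonal drift.
        drift = {"DJF": 0.04, "MAM": 0.01, "JJA": -0.03, "SON": 0.00}

        def step(x, season, dt=1.0):
            tendency = -0.1 * x          # placeholder model physics
            tendency -= drift[season]    # remove pre-computed seasonal drift
            return x + dt * tendency

        x = 1.0
        for _ in range(10):
            x = step(x, "DJF")
        print("corrected state after 10 steps:", round(x, 3))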

  15. A longitudinal multilevel CFA-MTMM model for interchangeable and structurally different methods

    PubMed Central

    Koch, Tobias; Schultze, Martin; Eid, Michael; Geiser, Christian

    2014-01-01

    One of the key interests in the social sciences is the investigation of change and stability of a given attribute. Although numerous models have been proposed in the past for analyzing longitudinal data, including multilevel and/or latent variable modeling approaches, only few modeling approaches have been developed for studying construct validity in longitudinal multitrait-multimethod (MTMM) measurement designs. The aim of the present study was to extend the spectrum of current longitudinal modeling approaches for MTMM analysis. Specifically, a new longitudinal multilevel CFA-MTMM model for measurement designs with structurally different and interchangeable methods (called the Latent-State-Combination-Of-Methods model, LS-COM) is presented. Interchangeable methods are methods that are randomly sampled from a set of equivalent methods (e.g., multiple student ratings for teaching quality), whereas structurally different methods are methods that cannot be easily replaced by one another (e.g., teacher ratings, self-ratings, principal ratings). Results of a simulation study indicate that the parameters and standard errors in the LS-COM model are well recovered even in conditions with only five observations per estimated model parameter. The advantages and limitations of the LS-COM model relative to other longitudinal MTMM modeling approaches are discussed. PMID:24860515

  16. A Framework for Action: Intervening to Increase Adoption of Transformative Web 2.0 Learning Resources

    ERIC Educational Resources Information Center

    Hughes, Joan E.; Guion, James M.; Bruce, Kama A.; Horton, Lucas R.; Prescott, Amy

    2011-01-01

    Web 2.0 tools have emerged as conducive to innovative pedagogy and transformative learning opportunities for youth. Currently, Web 2.0 is often adopted into teachers' practice to simply replace or amplify traditional instructional approaches rather than to promote or facilitate transformative educational change. Current models of innovation…

  17. BinQuasi: a peak detection method for ChIP-sequencing data with biological replicates.

    PubMed

    Goren, Emily; Liu, Peng; Wang, Chao; Wang, Chong

    2018-04-19

    ChIP-seq experiments that are aimed at detecting DNA-protein interactions require biological replication to draw inferential conclusions; however, there is no current consensus on how to analyze ChIP-seq data with biological replicates. Very few methodologies exist for the joint analysis of replicated ChIP-seq data, with approaches ranging from combining the results of analyzing replicates individually to joint modeling of all replicates. Combining the results of individual replicates analyzed separately can lead to reduced peak classification performance compared to joint modeling. Currently available methods for joint analysis may fail to control the false discovery rate at the nominal level. We propose BinQuasi, a peak caller for replicated ChIP-seq data, that jointly models biological replicates using a generalized linear model framework and employs a one-sided quasi-likelihood ratio test to detect peaks. When applied to simulated data and real datasets, BinQuasi performs favorably compared to existing methods, including better control of the false discovery rate than existing joint modeling approaches. BinQuasi offers a flexible approach to joint modeling of replicated ChIP-seq data which is preferable to combining the results of replicates analyzed individually. Source code is freely available for download at https://cran.r-project.org/package=BinQuasi, implemented in R. Contact: pliu@iastate.edu or egoren@iastate.edu. Supplementary material is available at Bioinformatics online.
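
    The flavor of a one-sided peak test that pools replicates can be sketched with a plain Poisson likelihood ratio; BinQuasi itself uses a quasi-likelihood GLM, so this conveys only the general idea, and the counts and background rate are invented:

        # One-sided LRT for enrichment in one window, pooling replicates.
        import numpy as np
        from scipy.stats import poisson, chi2

        window = np.array([14, 17, 12])   # ChIP counts, 3 replicates
        background = 6.0                  # expected background per replicate

        ll0 = poisson.logpmf(window, background).sum()   # null: background
        mle = max(window.mean(), background)             # enrichment >= 0 only
        ll1 = poisson.logpmf(window, mle).sum()
        lr = 2 * (ll1 - ll0)
        pval = 0.5 * chi2.sf(lr, df=1)    # one-sided boundary-case p-value
        print("LR =", round(lr, 2), "p ~", f"{pval:.2e}")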

  18. An Exploratory Study of Sustainable Development at Italian Universities

    ERIC Educational Resources Information Center

    Vagnoni, Emidia; Cavicchi, Caterina

    2015-01-01

    Purpose: This paper aims to outline the current status of the implementation of sustainability practices in the context of Italian public universities, highlighting the strengths and gaps. Design/methodology/approach: Based on a qualitative approach, an exploratory study design has been outlined using the model of Glavic and Lukman (2007) focusing…

  19. The LOM Approach--a Call for Concern?

    ERIC Educational Resources Information Center

    Armitage, Nicholas; Bowerman, Chris

    2005-01-01

    The LOM (Learning Object Model) approach to courseware design seems to be driven by a desire to increase access to education as well as use technology to enable a higher staff-student ratio than is currently possible. The LOM standard involves the use of standard metadata descriptions of content and adaptive content engines to deliver the…

  20. The LOM Approach -- A CALL for Concern?

    ERIC Educational Resources Information Center

    Armitage, Nicholas; Bowerman, Chris

    2005-01-01

    The LOM (Learning Object Model) approach to courseware design seems to be driven by a desire to increase access to education as well as use technology to enable a higher staff-student ratio than is currently possible. The LOM standard involves the use of standard metadata descriptions of content and adaptive content engines to deliver the…

  1. Information Literacy in Oman's Higher Education: A Descriptive-Inferential Approach

    ERIC Educational Resources Information Center

    Al-Aufi, Ali; Al-Azri, Hamed

    2013-01-01

    This study aims to identify the current status of information literacy among students at Sultan Qaboos University in their final year, using the Big6 model for solving information problems. The study utilizes a self-assessment survey approach, with a questionnaire as the tool for data collection. It surveyed undergraduate students of…

  2. The ESA21 Project: A Model for Civic Engagement

    ERIC Educational Resources Information Center

    Pratte, John; Laposata, Matt

    2005-01-01

    There have been many systematic approaches to solving the problem of how to make science courses interesting to students. One that is currently receiving attention in the sciences is the use of civic engagement within the classroom. This approach works well in small enrollment courses, but it is logistically difficult to implement in large…

  3. Multi-Level Alignment Model: Transforming Face-to-Face into E-Instructional Programs

    ERIC Educational Resources Information Center

    Byers, Celina

    2005-01-01

    Purpose--To suggest to others in the field an approach equally valid for transforming existing courses into online courses and for creating new online courses. Design/methodology/approach--Using the literature for substantiation, this article discusses the current rapid change within organizations, the role of technology in that change, and the…

  4. Learning from Japanese Approach to Teachers' Professional Development: Can "Jugyou Kenkyu" Work in Other Countries?

    ERIC Educational Resources Information Center

    Masami, Matoba; Reza, Sarkar Arani M.

    2005-01-01

    This paper presents a careful analysis of current trends and challenges in importing the Japanese model of teachers' professional development. The objective is to examine what "we" can learn from the Japanese approach to improving instruction, especially "Jugyou Kenkyu" (Lesson Study) as a collaborative research on the…

  5. An Alternative Approach to the Operation of Multinational Reservoir Systems: Application to the Amistad & Falcon System (Lower Rio Grande/Río Bravo)

    NASA Astrophysics Data System (ADS)

    Serrat-Capdevila, A.; Valdes, J. B.

    2005-12-01

    An optimization approach for the operation of international multi-reservoir systems is presented. The approach uses Stochastic Dynamic Programming (SDP) algorithms, both steady-state and real-time, to develop two models. In the first model, the reservoirs and flows of the system are aggregated to yield an equivalent reservoir, and the resulting operating policies are disaggregated using a non-linear optimization procedure for each reservoir and for each nation's water balance. In the second model a multi-reservoir approach is applied, disaggregating the releases for each country's water share in each reservoir. The non-linear disaggregation algorithm uses SDP-derived operating policies as boundary conditions for a local time-step optimization. Finally, the performance of the different approaches and methods is compared. These models are applied to the Amistad-Falcon International Reservoir System as part of a binational dynamic modeling effort to develop a decision support system tool for better management of the water resources in the Lower Rio Grande Basin, currently enduring a severe drought.
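
    For orientation, the backward recursion at the heart of such an SDP model can be sketched for a single (aggregated) reservoir with Markov inflow classes. This is a minimal illustration under assumed grids and a placeholder benefit function, not the authors' model, which adds the binational disaggregation step:

    ```python
    # Generic discounted SDP sketch over discretized storage and inflow classes.
    import numpy as np

    def sdp_release_policy(storages, releases, inflows, P, benefit,
                           gamma=0.95, tol=1e-6):
        """P[i, j]: transition probability from inflow class i to class j;
        benefit(release): immediate benefit of a release decision.
        Assumes every (storage, inflow) state admits a feasible release."""
        nS, nI = len(storages), len(inflows)
        V = np.zeros((nS, nI))
        policy = np.zeros((nS, nI), dtype=int)
        while True:
            V_new = np.full_like(V, -np.inf)
            for s, st in enumerate(storages):
                for i, q in enumerate(inflows):
                    for r, rel in enumerate(releases):
                        nxt = st + q - rel                     # mass balance
                        if not storages[0] <= nxt <= storages[-1]:
                            continue                           # infeasible release
                        s2 = int(np.abs(storages - nxt).argmin())
                        val = benefit(rel) + gamma * P[i] @ V[s2]
                        if val > V_new[s, i]:
                            V_new[s, i], policy[s, i] = val, r
            if np.max(np.abs(V_new - V)) < tol:
                return policy, V_new
            V = V_new
    ```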

  6. Electromagnetic nonlinearities in a Roebel-cable-based accelerator magnet prototype: variational approach

    NASA Astrophysics Data System (ADS)

    Ruuskanen, J.; Stenvall, A.; Lahtinen, V.; Pardo, E.

    2017-02-01

    Superconducting magnets are the most expensive series of components produced for the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN). When developing such magnets beyond state-of-the-art technology, one possible option is to use high-temperature superconductors (HTS), which are capable of tolerating much higher magnetic fields than low-temperature superconductors (LTS) while simultaneously carrying high current densities. Significant cost reductions due to decreased prototype construction needs can be achieved by careful modelling of the magnets. Simulations are used, e.g., for designing magnets that fulfil the field quality requirements of the beampipe, and for ensuring adequate protection by studying the losses occurring during charging and discharging. We model the hysteresis losses and the magnetic field nonlinearity in the beampipe as a function of the magnet's current. These simulations rely on the minimum magnetic energy variation principle, with optimization algorithms provided by the open-source Interior Point OPTimizer (IPOPT) library. We utilize this methodology to investigate a research and development accelerator magnet prototype made of REBCO Roebel cable. The applicability of this approach, when the magnetic field dependence of the superconductor's critical current density is considered, is discussed. We also scrutinize the influence of the necessary modelling decisions one needs to make with this approach. The results show that different decisions can lead to notably different results, and experiments are required to study the electromagnetic behaviour of such magnets further.

  7. A nonlinear viscoelastic approach to durability predictions for polymer based composite structures

    NASA Technical Reports Server (NTRS)

    Brinson, Hal F.

    1991-01-01

    Current industry approaches for the durability assessment of metallic structures are briefly reviewed. For polymer based composite structures, it is suggested that new approaches must be adopted to include memory or viscoelastic effects which could lead to delayed failures that might not be predicted using current techniques. A durability or accelerated life assessment plan for fiber reinforced plastics (FRP) developed and documented over the last decade or so is reviewed and discussed. Limitations to the plan are outlined and suggestions to remove the limitations are given. These include the development of a finite element code to replace the previously used lamination theory code and the development of new specimen geometries to evaluate delamination failures. The new DCB model is reviewed and results are presented. Finally, it is pointed out that new procedures are needed to determine interfacial properties and current efforts underway to determine such properties are reviewed. Suggestions for additional efforts to develop a consistent and accurate durability predictive approach for FRP structures are outlined.

  8. A nonlinear viscoelastic approach to durability predictions for polymer based composite structures

    NASA Technical Reports Server (NTRS)

    Brinson, Hal F.; Hiel, C. C.

    1990-01-01

    Current industry approaches for the durability assessment of metallic structures are briefly reviewed. For polymer based composite structures, it is suggested that new approaches must be adopted to include memory or viscoelastic effects which could lead to delayed failures that might not be predicted using current techniques. A durability or accelerated life assessment plan for fiber reinforced plastics (FRP) developed and documented over the last decade or so is reviewed and discussed. Limitations to the plan are outlined and suggestions to remove the limitations are given. These include the development of a finite element code to replace the previously used lamination theory code and the development of new specimen geometries to evaluate delamination failures. The new DCB model is reviewed and results are presented. Finally, it is pointed out that new procedures are needed to determine interfacial properties and current efforts underway to determine such properties are reviewed. Suggestions for additional efforts to develop a consistent and accurate durability predictive approach for FRP structures are outlined.

  9. Angular approach combined to mechanical model for tool breakage detection by eddy current sensors

    NASA Astrophysics Data System (ADS)

    Ritou, M.; Garnier, S.; Furet, B.; Hascoet, J. Y.

    2014-02-01

    The paper presents a new complete approach to Tool Condition Monitoring (TCM) in milling. The aim is the early detection of small damages so that catastrophic tool failures are prevented. A versatile in-process monitoring system is introduced for reliability concerns. The tool condition is determined by estimates of the radial eccentricity of the teeth. An adequate criterion is proposed, combining a mechanical model of milling with an angular approach. Then, a new solution is proposed for estimating the cutting force using eddy current sensors implemented close to the spindle nose. Signals are analysed in the angular domain, notably by a synchronous averaging technique. Phase shifts induced by changes of machining direction are compensated. Results are compared with cutting forces measured with a dynamometer table. The proposed method is implemented in an industrial case of a pocket machining operation. One of the cutting edges was slightly damaged during the machining, as shown by a direct measurement of the tool. A control chart is established with the estimates of cutter eccentricity obtained during machining from the eddy current sensor signals. The efficiency and reliability of the method are demonstrated by a successful detection of the damage.
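
    The angular-domain synchronous averaging step can be illustrated briefly. The Python sketch below is a minimal version of ours with hypothetical names, not the paper's implementation; it resamples a sensor signal onto spindle angle and averages over revolutions so that tooth-to-tooth eccentricity appears as a stable angular pattern:

    ```python
    import numpy as np

    def synchronous_average(signal, angle, n_bins=360):
        """signal: sensor samples; angle: cumulative spindle angle (revolutions)."""
        frac = angle % 1.0                       # position within one revolution
        bins = (frac * n_bins).astype(int)       # angular bin of each sample
        sums = np.bincount(bins, weights=signal, minlength=n_bins)
        counts = np.bincount(bins, minlength=n_bins)
        return sums / np.maximum(counts, 1)      # mean signal vs. spindle angle
    ```

    Peaks of the averaged pattern, one per tooth, yield per-tooth estimates that can feed a control chart of the kind the abstract describes.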

  10. Magnetic flux-load current interactions in ferrous conductors

    NASA Astrophysics Data System (ADS)

    Cannell, Michael J.; McConnell, Richard A.

    1992-06-01

    A modeling technique has been developed to account for interactions between load current and magnetic flux in an iron conductor. Such a conductor would be used in the active region of a normally conducting homopolar machine. This approach has been experimentally verified and its application to a real machine demonstrated. Additionally, measurements of the resistivity of steel under the combined effects of magnetic field and current have been conducted.

  11. Monitoring estuarine circulation and ocean waste dispersion using an integrated satellite-aircraft-drogue approach. [Delaware coast and Delaware Bay

    NASA Technical Reports Server (NTRS)

    Klemas, V. (Principal Investigator); Davis, G.; Wang, H.

    1975-01-01

    The author has identified the following significant results. An inexpensive, integrated drogue-aircraft-satellite approach was developed which is based on the Lagrangian technique and employs remotely tracked drogues and dyes together with satellite observation of natural tracers, such as suspended sediment. Results include current circulation studies in Delaware Bay in support of an oil slick movement model; investigations of the dispersion and movement of acid wastes dumped 40 miles off the Delaware coast; and coastal current circulation. In each case, the integrated drogue-aircraft-satellite approach compares favorably with other techniques on the basis of accuracy, cost effectiveness, and performance under severe weather conditions.

  12. Self-consistent modeling of the dynamic evolution of magnetic island growth in the presence of stabilizing electron-cyclotron current drive

    NASA Astrophysics Data System (ADS)

    Chatziantonaki, Ioanna; Tsironis, Christos; Isliker, Heinz; Vlahos, Loukas

    2013-11-01

    The most promising technique for the control of neoclassical tearing modes in tokamak experiments is the compensation of the missing bootstrap current with an electron-cyclotron current drive (ECCD). In this frame, the dynamics of magnetic islands has been studied extensively in terms of the modified Rutherford equation (MRE), including the presence of a current drive, either analytically described or computed by numerical methods. In this article, a self-consistent model for the dynamic evolution of the magnetic island and the driven current is derived, which takes into account the island's magnetic topology and its effect on the current drive. The model combines the MRE with a ray-tracing approach to electron-cyclotron wave propagation and absorption. Numerical results exhibit a decrease in the time required for complete stabilization with respect to the conventional computation (which does not take the island geometry into account), an effect that grows with increasing initial island size and radial misalignment of the deposition.
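
    For orientation, the MRE balances the classical tearing drive against bootstrap destabilization and ECCD stabilization. The schematic form below is included for context only; the coefficients and the efficiency function are model-dependent, and this is not the paper's exact expression:

    ```latex
    % Schematic MRE: resistive growth = classical drive + bootstrap drive
    % - ECCD stabilization (a_bs, a_cd, w_d and eta_CD are model-dependent).
    \frac{\tau_R}{r_s}\frac{dw}{dt} \;=\; r_s\,\Delta'(w)
      \;+\; a_{\mathrm{bs}}\,\frac{w}{w^{2} + w_{d}^{2}}
      \;-\; a_{\mathrm{cd}}\,\eta_{\mathrm{CD}}(w)
    ```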

  13. Validation of conducting wall models using magnetic measurements

    DOE PAGES

    Hanson, Jeremy M.; Bialek, James M.; Turco, Francesca; ...

    2016-08-16

    The impact of conducting wall eddy currents on perturbed magnetic field measurements is a key issue for understanding the measurement and control of long-wavelength MHD stability in tokamak devices. As plasma response models have grown in sophistication, the need to understand and resolve small changes in these measurements has become more important, motivating increased fidelity in simulations of externally applied fields and the wall eddy current response. In this manuscript, we describe thorough validation studies of the wall models in the MARS-F and VALEN stability codes, using coil–sensor vacuum coupling measurements from the DIII-D tokamak. The VALEN formulation treats conducting structures with arbitrary three-dimensional geometries, while MARS-F uses an axisymmetric wall model and a spectral decomposition of the problem geometry with a fixed toroidal harmonic n. The vacuum coupling measurements have a strong sensitivity to wall eddy currents induced by time-changing coil currents, owing to the close proximity of both the sensors and coils to the wall. Measurements from individual coil and sensor channels are directly compared with VALEN predictions. It is found that straightforward improvements to the VALEN model, such as refining the wall mesh and simulating the vertical extent of the DIII-D poloidal field sensors, lead to good agreement with the experimental measurements. In addition, couplings to multi-coil, n = 1 toroidal mode perturbations are calculated from the measurements and compared with predictions from both codes. Lastly, the toroidal mode comparisons favor the fully three-dimensional simulation approach, likely because this approach naturally treats n > 1 sidebands generated by the coils and wall eddy currents, as well as the n = 1 fundamental.

  14. Validation of conducting wall models using magnetic measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, Jeremy M.; Bialek, James M.; Turco, Francesca

    The impact of conducting wall eddy currents on perturbed magnetic field measurements is a key issue for understanding the measurement and control of long-wavelength MHD stability in tokamak devices. As plasma response models have grown in sophistication, the need to understand and resolve small changes in these measurements has become more important, motivating increased fidelity in simulations of externally applied fields and the wall eddy current response. In this manuscript, we describe thorough validation studies of the wall models in the MARS-F and VALEN stability codes, using coil–sensor vacuum coupling measurements from the DIII-D tokamak. The VALEN formulation treats conducting structures with arbitrary three-dimensional geometries, while MARS-F uses an axisymmetric wall model and a spectral decomposition of the problem geometry with a fixed toroidal harmonic n. The vacuum coupling measurements have a strong sensitivity to wall eddy currents induced by time-changing coil currents, owing to the close proximity of both the sensors and coils to the wall. Measurements from individual coil and sensor channels are directly compared with VALEN predictions. It is found that straightforward improvements to the VALEN model, such as refining the wall mesh and simulating the vertical extent of the DIII-D poloidal field sensors, lead to good agreement with the experimental measurements. In addition, couplings to multi-coil, n = 1 toroidal mode perturbations are calculated from the measurements and compared with predictions from both codes. Lastly, the toroidal mode comparisons favor the fully three-dimensional simulation approach, likely because this approach naturally treats n > 1 sidebands generated by the coils and wall eddy currents, as well as the n = 1 fundamental.

  15. Integrating pixel- and polygon-based approaches to wildfire risk assessment: Application to a high-value watershed on the Pike and San Isabel National Forests, Colorado, USA

    Treesearch

    Matthew P. Thompson; Julie W. Gilbertson-Day; Joe H. Scott

    2015-01-01

    We develop a novel risk assessment approach that integrates complementary, yet distinct, spatial modeling approaches currently used in wildfire risk assessment. Motivation for this work stems largely from limitations of existing stochastic wildfire simulation systems, which can generate pixel-based outputs of fire behavior as well as polygon-based outputs of simulated...

  16. Toward Self-Referential Autonomous Learning of Object and Situation Models.

    PubMed

    Damerow, Florian; Knoblauch, Andreas; Körner, Ursula; Eggert, Julian; Körner, Edgar

    2016-01-01

    Most current approaches to scene understanding lack the capability to adapt object and situation models to behavioral needs not anticipated by the human system designer. Here, we give a detailed description of a system architecture for self-referential autonomous learning which enables the refinement of object and situation models during operation in order to optimize behavior. This includes structural learning of hierarchical models for situations and behaviors that is triggered by a mismatch between expected and actual action outcome. Besides proposing architectural concepts, we also describe a first implementation of our system within a simulated traffic scenario to demonstrate the feasibility of our approach.

  17. Predictive QSAR modeling workflow, model applicability domains, and virtual screening.

    PubMed

    Tropsha, Alexander; Golbraikh, Alexander

    2007-01-01

    Quantitative Structure Activity Relationship (QSAR) modeling has been traditionally applied as an evaluative approach, i.e., with the focus on developing retrospective and explanatory models of existing data. Model extrapolation was considered, if at all, only in a hypothetical sense, in terms of potential modifications of known biologically active chemicals that could improve compounds' activity. This critical review re-examines the strategy and the output of modern QSAR modeling approaches. We provide examples and arguments suggesting that current methodologies may afford robust and validated models capable of accurate prediction of compound properties for molecules not included in the training sets. We discuss a data-analytical modeling workflow developed in our laboratory that incorporates modules for combinatorial QSAR model development (i.e., using all possible binary combinations of available descriptor sets and statistical data modeling techniques), rigorous model validation, and virtual screening of available chemical databases to identify novel biologically active compounds. Our approach places particular emphasis on model validation as well as the need to define model applicability domains in the chemistry space. We present examples of studies where the application of rigorously validated QSAR models to virtual screening identified computational hits that were confirmed by subsequent experimental investigations. The emerging focus of QSAR modeling on target property forecasting establishes it as a predictive, as opposed to evaluative, modeling approach.
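
    The applicability-domain idea is easy to illustrate. The Python sketch below follows a common distance-based cutoff, the training-set mean nearest-neighbor distance plus Z standard deviations; it is a generic illustration with assumed parameter choices, not the authors' workflow code:

    ```python
    # Refuse predictions for query molecules outside the applicability domain.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.neighbors import NearestNeighbors

    def fit_with_domain(X_train, y_train, k=5, z=0.5):
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(X_train, y_train)
        nn = NearestNeighbors(n_neighbors=k).fit(X_train)
        d, _ = nn.kneighbors(X_train)            # column 0 is the self-distance
        cutoff = d[:, 1:].mean() + z * d[:, 1:].std()
        return model, nn, cutoff

    def predict_in_domain(model, nn, cutoff, X_query):
        d, _ = nn.kneighbors(X_query)
        in_domain = d.mean(axis=1) <= cutoff
        preds = np.where(in_domain, model.predict(X_query), np.nan)  # NaN = outside
        return preds, in_domain
    ```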

  18. Potential for Inclusion of Information Encountering within Information Literacy Models

    ERIC Educational Resources Information Center

    Erdelez, Sanda; Basic, Josipa; Levitov, Deborah D.

    2011-01-01

    Introduction: Information encountering (finding information while searching for some other information), is a type of opportunistic discovery of information that complements purposeful approaches to finding information. The motivation for this paper was to determine if the current models of information literacy instruction refer to information…

  19. Integrating Research Competencies in Massage Therapy Education.

    ERIC Educational Resources Information Center

    Hymel, Glenn M.

    The massage therapy profession is currently engaged in a competency-based education movement that includes an emphasis on promoting massage therapy research competencies (MTRCs). A systems-based model for integrating MTRCs into massage therapy education was therefore proposed. The model and an accompanying checklist describe an approach to…

  20. Consequences of Psychotherapy Clients' Mental Health Ideology.

    ERIC Educational Resources Information Center

    Milling, Len; Kirsch, Irving

    Current theoretical approaches to understanding emotional difficulties are dominated by the medical model of mental illness, which assumes that emotional dysfunction can be viewed the same way as physical dysfunction. To examine the relationship between psychotherapy clients' beliefs about the medical model of psychotherapy and their behavior…

  1. The Bayesian Revolution Approaches Psychological Development

    ERIC Educational Resources Information Center

    Shultz, Thomas R.

    2007-01-01

    This commentary reviews five articles that apply Bayesian ideas to psychological development, some with psychology experiments, some with computational modeling, and some with both experiments and modeling. The reviewed work extends the current Bayesian revolution into tasks often studied in children, such as causal learning and word learning, and…

  2. Public Libraries and Internet Public Access Models: Describing Possible Approaches.

    ERIC Educational Resources Information Center

    Tomasello, Tami K.; McClure, Charles R.

    2002-01-01

    Discusses ways of providing Internet access to the general public and analyzes eight models currently in use: public schools, public libraries, cybermobiles, public housing, community technology centers, community networks, kiosks, and cyber cafes. Concludes that public libraries may wish to develop collaborative strategies with other…

  3. Introduction to the Special Issue: Advancing the State-of-the-Science in Reading Research through Modeling.

    PubMed

    Zevin, Jason D; Miller, Brett

    Reading research is increasingly a multi-disciplinary endeavor involving more complex, team-based science approaches. These approaches offer the potential of capturing the complexity of reading development, the emergence of individual differences in reading performance over time, how these differences relate to the development of reading difficulties and disability, and more fully understanding the nature of skilled reading in adults. This special issue focuses on the potential opportunities and insights that early and richly integrated advanced statistical and computational modeling approaches can provide to our foundational (and translational) understanding of reading. The issue explores how computational and statistical modeling, using both observed and simulated data, can serve as a contact point among research domains and topics, complement other data sources and, critically, provide analytic advantages over current approaches.

  4. Formulating "Principles of Procedure" for the Foreign Language Classroom: A Framework for Process Model Language Curricula

    ERIC Educational Resources Information Center

    Villacañas de Castro, Luis S.

    2016-01-01

    This article aims to apply Stenhouse's process model of curriculum to foreign language (FL) education, a model which is characterized by enacting "principles of procedure" specific to the discipline to which the school subject belongs. Rather than to replace or dissolve current approaches to FL teaching and curriculum…

  5. Testing Mediation in Structural Equation Modeling: The Effectiveness of the Test of Joint Significance

    ERIC Educational Resources Information Center

    Leth-Steensen, Craig; Gallitto, Elena

    2016-01-01

    A large number of approaches have been proposed for estimating and testing the significance of indirect effects in mediation models. In this study, four sets of Monte Carlo simulations involving full latent variable structural equation models were run in order to contrast the effectiveness of the currently popular bias-corrected bootstrapping…

  6. Identifying Successful Advancement Approaches in Four Catholic Universities: The Effectiveness of the Four Advancement Models of Communication

    ERIC Educational Resources Information Center

    Bonglia, Jean-Pierre K.

    2010-01-01

    The current longitudinal study of the most successful Catholic universities in the United States identifies the prevalence of four advancement models of communication that have contributed to make those institutions successful in their philanthropic efforts. While research by Grunig and Kelly maintained that the two-way symmetrical model of…

  7. A Communication Model for Teaching a Course in Mass Media and Society.

    ERIC Educational Resources Information Center

    Crumley, Wilma; Stricklin, Michael

    Many professors of mass media and society courses have relied on a teaching model implying that students are sponges soaking up information. A more appropriate model invites concern with an active audience, transaction, the interpersonal mass media mix, a general systems approach, and process and change--in other words, utilization of current and…

  8. Learning with Interactive Whiteboards: Determining the Factors on Promoting Interactive Whiteboards to Students by Technology Acceptance Model

    ERIC Educational Resources Information Center

    Kilic, Eylem; Güler, Çetin; Çelik, H. Eray; Tatli, Cemal

    2015-01-01

    Purpose: The purpose of this study is to investigate the factors which might affect the intention to use interactive whiteboards (IWBs) by university students, using Technology Acceptance Model by the structural equation modeling approach. The following hypothesis guided the current study: H1. There is a positive relationship between IWB…

  9. An efficient and scalable graph modeling approach for capturing information at different levels in next generation sequencing reads

    PubMed Central

    2013-01-01

    Background: Next generation sequencing technologies have greatly advanced many research areas of the biomedical sciences through their capability to generate massive amounts of genetic information at unprecedented rates. The advent of next generation sequencing has led to the development of numerous computational tools to analyze and assemble the millions to billions of short sequencing reads produced by these technologies. While these tools filled an important gap, current approaches for storing, processing, and analyzing short read datasets generally have remained simple and lack the complexity needed to efficiently model the produced reads and assemble them correctly. Results: Previously, we presented an overlap graph coarsening scheme for modeling read overlap relationships on multiple levels. Most current read assembly and analysis approaches use a single graph or set of clusters to represent the relationships among a read dataset. Instead, we use a series of graphs to represent the reads and their overlap relationships across a spectrum of information granularity. At each information level our algorithm is capable of generating clusters of reads from the reduced graph, forming an integrated graph modeling and clustering approach for read analysis and assembly. Previously we applied our algorithm to simulated and real 454 datasets to assess its ability to efficiently model and cluster next generation sequencing data. In this paper we extend our algorithm to large simulated and real Illumina datasets to demonstrate that our algorithm is practical for both sequencing technologies. Conclusions: Our overlap graph theoretic algorithm is able to model next generation sequencing reads at various levels of granularity through the process of graph coarsening. Additionally, our model allows for efficient representation of the read overlap relationships, is scalable for large datasets, and is practical for both Illumina and 454 sequencing technologies. PMID:24564333
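
    One level of such coarsening can be sketched as heavy-edge matching on the overlap graph. The Python fragment below is a simplified illustration using networkx, not the authors' implementation; edge weights are assumed to be overlap lengths:

    ```python
    import networkx as nx

    def coarsen_once(G):
        """G: undirected graph, nodes = reads/clusters, edge 'weight' = overlap."""
        matched, mapping = set(), {}
        # Heavy-edge matching: visit edges from the strongest overlap down.
        for u, v, w in sorted(G.edges(data="weight"), key=lambda e: -e[2]):
            if u not in matched and v not in matched:
                matched |= {u, v}
                mapping[u] = mapping[v] = (u, v)     # merged super-node
        for node in G.nodes:
            mapping.setdefault(node, node)           # unmatched nodes carry over
        H = nx.Graph()
        H.add_nodes_from(mapping.values())
        for u, v, w in G.edges(data="weight"):
            cu, cv = mapping[u], mapping[v]
            if cu != cv:                             # drop now-internal edges
                old = H.get_edge_data(cu, cv, {"weight": 0})["weight"]
                H.add_edge(cu, cv, weight=old + w)   # accumulate parallel edges
        return H, mapping
    ```

    Applying coarsen_once repeatedly yields the series of graphs, and the mapping at each level induces the read clusters.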

  10. Ocean Dynamics in the Key Regions of North Atlantic-Arctic Exchanges: Evaluation of Global Multi-Resolution FESOM and CMIP-type INMCM Models with Long-Term Observations

    NASA Astrophysics Data System (ADS)

    Beszczynska-Moeller, A.; Gürses, Ö.; Sidorenko, D.; Goessling, H.; Volodin, E. M.; Gritsun, A.; Iakovlev, N. G.; Andrzejewski, J.

    2017-12-01

    Enhancing the fidelity of climate models in the Arctic and North Atlantic in order to improve Arctic predictions requires better understanding of the underlying causes of common biases. The main focus of the ERA.Net project NAtMAP (Amending North Atlantic Model Biases to Improve Arctic Predictions) is on the dynamics of the key regions connecting the Arctic and the North Atlantic climate. The study aims not only at increased model realism, but also at a deeper understanding of North Atlantic-Arctic links and their contribution to Arctic predictability. Two complementary approaches employing different global coupled climate models, ECHAM6-FESOM and INMCM4/5, were adopted. The first approach is based on a recent development of climate models with ocean components based on unstructured meshes, allowing eddies and narrow boundary currents to be resolved in the most crucial regions while keeping a moderate resolution elsewhere. The multi-resolution sea ice-ocean component of ECHAM6-FESOM allows studying the benefits of very high resolution in key areas of the North Atlantic. An alternative approach to addressing the North Atlantic and Arctic biases is also tried by tuning the performance of the relevant sub-grid-scale parameterizations in an eddy-resolving version of the CMIP5 climate model INMCM4. Using long-term in situ and satellite observations and available climatologies, we attempt to evaluate to what extent a higher resolution, allowing the explicit representation of eddies and narrow boundary currents in the North Atlantic and Nordic Seas, can alleviate the common model errors. The effects of better resolving the Labrador Sea area on reducing the model bias in surface hydrography and improving the representation of ocean currents are addressed. Resolving the eddy field in the Greenland Sea is assessed in terms of reducing the deep thermocline bias. The impact of increased resolution on the modeled characteristics of Atlantic water transport into the Arctic is examined with a special focus on the separation of the Atlantic inflow between Fram Strait and the Barents Sea, lateral exchanges in the Nordic Seas, and the role of eddies in modulating the poleward flow of Atlantic water. We also explore the effects of resolving boundary currents in the Arctic basin on the representation of the adjacent sea ice.

  11. From cancer genomes to cancer models: bridging the gaps

    PubMed Central

    Baudot, Anaïs; Real, Francisco X.; Izarzugaza, José M. G.; Valencia, Alfonso

    2009-01-01

    Cancer genome projects are now being expanded in an attempt to provide complete landscapes of the mutations that exist in tumours. Although the importance of cataloguing genome variations is well recognized, there are obvious difficulties in bridging the gaps between high-throughput resequencing information and the molecular mechanisms of cancer evolution. Here, we describe the current status of the high-throughput genomic technologies, and the current limitations of the associated computational analysis and experimental validation of cancer genetic variants. We emphasize how the current cancer-evolution models will be influenced by the high-throughput approaches, in particular through efforts devoted to monitoring tumour progression, and how, in turn, the integration of data and models will be translated into mechanistic knowledge and clinical applications. PMID:19305388

  12. Quasielastic neutrino charged-current scattering off 12C: Effects of the meson exchange currents and large nucleon axial mass

    NASA Astrophysics Data System (ADS)

    Butkevich, A. V.; Luchuk, S. V.

    2018-04-01

    The quasielastic scattering of muon neutrinos and electrons on a carbon target is analyzed using the relativistic distorted-wave impulse approximation (RDWIA). We also evaluate the contribution of the two-particle-two-hole meson exchange currents (2p-2h MEC) to the electroweak response functions. The nuclear model dependence of the (anti)neutrino cross sections is studied within the RDWIA+MEC approach and the RDWIA model with a large nucleon axial mass. It is shown that the results for the squared-momentum-transfer distribution dσ/dQ² and for the distribution dσ/dW in the invariant mass of the final hadronic system obtained within these models are substantially different.

  13. Probabilistic inversion of expert assessments to inform projections about Antarctic ice sheet responses.

    PubMed

    Fuller, Robert William; Wong, Tony E; Keller, Klaus

    2017-01-01

    The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections.
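
    A minimal form of probabilistic inversion can be sketched as reweighting broad parameter samples so that the pushed-forward model output matches expert-assessed quantiles. The Python fragment below is a conceptual illustration under strong simplifications (a cheap deterministic model, one output quantity, equal weights within each expert bin), not the authors' method:

    ```python
    import numpy as np

    def inversion_weights(f, theta_samples, expert_quantiles):
        """expert_quantiles: {prob: value} in output units, e.g. metres of
        sea-level rise: {0.05: 0.05, 0.5: 0.35, 0.95: 1.1} (hypothetical)."""
        y = np.array([f(t) for t in theta_samples])
        probs = np.array(sorted(expert_quantiles))
        edges = np.array([expert_quantiles[p] for p in probs])
        # Mass the experts place between consecutive assessed quantiles.
        target = np.diff(np.concatenate(([0.0], probs, [1.0])))
        bins = np.searchsorted(edges, y)        # expert bin hit by each sample
        counts = np.maximum(np.bincount(bins, minlength=len(target)), 1)
        w = target[bins] / counts[bins]         # spread bin mass over its samples
        return w / w.sum()
    ```

    Resampling the parameter draws with these weights gives a joint prior consistent with the assessment, which can then be confronted with observations in a Bayesian update.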

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potash, Peter J.; Bell, Eric B.; Harrison, Joshua J.

    Predictive models for tweet deletion have been a relatively unexplored area of Twitter-related computational research. We first approach the deletion of tweets as a spam detection problem, applying a small set of handcrafted features to improve upon the current state-of-the-art in predicting deleted tweets. Next, we apply our approach to a dataset of deleted tweets that better reflects the current deletion rate. Since tweets are deleted for reasons beyond just the presence of spam, we apply topic modeling and text embeddings in order to capture the semantic content of tweets that can lead to tweet deletion. Our goal is to create an effective model that has a low-dimensional feature space and is also language-independent. A lean model would be computationally advantageous for processing high volumes of Twitter data, which can reach 9,885 tweets per second. Our results show that a small set of spam-related features combined with word topics and character-level text embeddings provide the best F1 score when trained with a random forest model. The highest precision on the deleted-tweet class is achieved by a modification of paragraph2vec that captures author identity.
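
    The modeling recipe the abstract describes, a handful of language-independent surface features concatenated with low-dimensional text representations and fed to a random forest, can be sketched as follows; the feature choices and names here are hypothetical illustrations, not the paper's feature set:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def surface_features(tweets):
        # Language-independent counts rather than word lists.
        return np.array([[t.count("#"), t.count("@"), t.count("http"), len(t)]
                         for t in tweets], dtype=float)

    def train_deletion_model(tweets, embeddings, deleted):
        """embeddings: precomputed low-dimensional text vectors (n x d);
        deleted: binary labels (1 = tweet was later deleted)."""
        X = np.hstack([surface_features(tweets), embeddings])  # lean feature space
        clf = RandomForestClassifier(n_estimators=300, random_state=0)
        print("CV F1:", cross_val_score(clf, X, deleted, scoring="f1", cv=5).mean())
        return clf.fit(X, deleted)
    ```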

  15. Low cost solar silicon production

    NASA Astrophysics Data System (ADS)

    Mede, Matt

    2009-08-01

    The worldwide demand for solar grade silicon reached an all-time high between 2007 and 2008. Although growth in the solar industry is slowing due to the current economic downturn, demand is expected to rebound in 2011 based on current cost models. However, demand will increase even more than currently anticipated if costs are reduced. This situation creates an opportunity for new and innovative approaches to the production of photovoltaic grade silicon, especially methods which can demonstrate cost reductions over currently utilized processes.

  16. The Earth's magnetosphere modeling and ISO standard

    NASA Astrophysics Data System (ADS)

    Alexeev, I.

    The empirical T96 model developed by Tsyganenko is constructed by minimizing the rms deviation from the large magnetospheric database of Fairfield et al. (1994), which contains Earth's magnetospheric magnetic field measurements accumulated over many years. The applicability of the T96 model is limited mainly to quiet conditions in the solar wind along the Earth's orbit. In contrast to the internal planetary field, the external magnetospheric magnetic field sources are much more time-dependent. A reliable representation of the magnetic field is crucial in the framework of radiation belt modelling, especially for disturbed conditions. The latest version of the Tsyganenko model has been constructed for a geomagnetic storm time interval. This version is based on a more accurate and physically consistent approach in which each source of the magnetic field has its own relaxation timescale and a driving function based on an individual best-fit combination of solar wind and IMF parameters. The same method has been used previously for the paraboloid model construction. This method is based on a priori information about the structure of the global magnetospheric current systems. Each current system is included as a separate block (module) in the magnetospheric model. As shown by spacecraft magnetometer data, three current systems are the main contributors to the external magnetospheric magnetic field: magnetopause currents, the ring current, and the tail current sheet. The paraboloid model is based on an analytical solution of the Laplace equation.

  17. Implementing an ally development model to promote safer schools for LGB youth: a trans-disciplinary approach.

    PubMed

    Zammitt, Kimberly A; Pepperell, Jennifer; Coe, Megan

    2015-01-01

    Lesbian, gay, and bisexual (LGB) students experience ongoing bullying, harassment, and lack of safety in school. Specialized instructional support personnel (SISPs), such as school counselors, school social workers, and school psychologists, are in a unique position to advocate for LGB students and to implement an ally development model. The purpose of this article is to describe the current climate for LGB students, to discuss the current barriers facing SISPs in advocating for change, and to provide a model of ally development for use at each level of the K-12 system.

  18. Application of zonal model on indoor air sensor network design

    NASA Astrophysics Data System (ADS)

    Chen, Y. Lisa; Wen, Jin

    2007-04-01

    Growing concerns over the safety of the indoor environment have made the use of sensors ubiquitous. Sensors that detect chemical and biological warfare agents can offer early warning of dangerous contaminants. However, current sensor system design is informed more by intuition and experience than by systematic analysis. To develop a sensor system design methodology, a proper indoor airflow modeling approach is needed. Various indoor airflow modeling techniques, from complicated computational fluid dynamics approaches to simplified multi-zone approaches, exist in the literature. In this study, the effects of two airflow modeling techniques, the multi-zone modeling technique and the zonal modeling technique, on indoor air protection sensor system design are discussed. Common building attack scenarios, using a typical CBW agent, are simulated. Both multi-zone and zonal models are used to predict airflows and contaminant dispersion. A genetic algorithm is then applied to optimize sensor location and quantity. Differences in the sensor system design resulting from the two airflow models are discussed for a typical office environment and a large hall environment.
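
    The optimization step can be illustrated with a toy genetic algorithm. The Python sketch below assumes detection times per (scenario, zone) pair have been precomputed with an airflow model and selects k sensor zones minimizing mean detection time; it is a generic GA illustration, not the study's implementation:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def mean_detection_time(sensors, detect_time):
        # detect_time[scenario, zone]: time until a sensor in that zone alarms;
        # a sensor set detects a scenario at its earliest member's alarm time.
        return detect_time[:, sensors].min(axis=1).mean()

    def ga_place_sensors(detect_time, k, pop_size=60, gens=300, mut_rate=0.2):
        n_zones = detect_time.shape[1]
        pop = [rng.choice(n_zones, size=k, replace=False) for _ in range(pop_size)]
        for _ in range(gens):
            cost = [mean_detection_time(p, detect_time) for p in pop]
            keep = np.argsort(cost)[: pop_size // 2]       # elitist selection
            pop = [pop[i] for i in keep]
            while len(pop) < pop_size:
                a, b = rng.choice(len(pop), size=2, replace=False)
                pool = np.union1d(pop[a], pop[b])          # crossover: merge parents
                if len(pool) < k:                          # pad if parents overlap
                    extra = np.setdiff1d(np.arange(n_zones), pool)
                    pool = np.concatenate(
                        [pool, rng.choice(extra, size=k - len(pool), replace=False)])
                child = rng.choice(pool, size=k, replace=False)
                if rng.random() < mut_rate:                # mutation: swap one zone
                    child[rng.integers(k)] = rng.integers(n_zones)
                pop.append(child)  # a duplicated zone merely wastes one sensor
        return min(pop, key=lambda p: mean_detection_time(p, detect_time))
    ```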

  19. Marginal regression approach for additive hazards models with clustered current status data.

    PubMed

    Su, Pei-Fang; Chi, Yunchan

    2014-01-15

    Current status data arise naturally from tumorigenicity experiments, epidemiology studies, biomedicine, econometrics, and demographic and sociology studies. Moreover, clustered current status data may occur with animals from the same litter in tumorigenicity experiments or with subjects from the same family in epidemiology studies. Because the only information extracted from current status data is whether the survival times are before or after the monitoring or censoring times, the nonparametric maximum likelihood estimator of the survival function converges at a rate of n^(1/3) to a complicated limiting distribution. Hence, semiparametric regression models such as the additive hazards model have been extended for independent current status data to derive test statistics, whose distributions converge at a rate of n^(1/2), for testing the regression parameters. However, a straightforward application of these statistical methods to clustered current status data is not appropriate because intracluster correlation needs to be taken into account. Therefore, this paper proposes two estimating functions for estimating the parameters in the additive hazards model for clustered current status data. The comparative results from simulation studies are presented, and the application of the proposed estimating functions to one real data set is illustrated. Copyright © 2013 John Wiley & Sons, Ltd.
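
    For orientation, the additive hazards model and the reason current status data remain informative can be stated in two lines; this sketches the setting only and does not reproduce the paper's clustered estimating functions:

    ```latex
    % Additive hazards with a time-constant covariate vector Z, and the
    % current status observation scheme with monitoring time C:
    \lambda(t \mid Z) = \lambda_0(t) + \beta^{\top} Z
    \quad\Longrightarrow\quad
    S(t \mid Z) = \exp\!\big\{-\Lambda_0(t) - t\,\beta^{\top} Z\big\},
    \qquad
    \mathbb{E}\,[\,\delta \mid C, Z\,] = 1 - S(C \mid Z),
    \quad \delta = \mathbf{1}\{T \le C\}.
    ```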

  20. Assimilation of the seabird and ship drift data in the north-eastern sea of Japan into an operational ocean nowcast/forecast system

    PubMed Central

    Miyazawa, Yasumasa; Guo, Xinyu; Varlamov, Sergey M.; Miyama, Toru; Yoda, Ken; Sato, Katsufumi; Kano, Toshiyuki; Sato, Keiji

    2015-01-01

    At the present time, ocean currents are operationally monitored mainly through the combined use of numerical ocean nowcast/forecast models and satellite remote sensing data. Improvement in the accuracy of ocean current nowcasts/forecasts requires additional measurements with higher spatial and temporal resolution than can be expected from the current observation network. Here we show the feasibility of assimilating high-resolution seabird and ship drift data into an operational ocean forecast system. Data assimilation of the geostrophic current contained in the observed drift leads to refinement of the gyre mode events of the Tsugaru warm current in the north-eastern sea of Japan represented by the model. Fitting the observed drift to the model depends on the ability of the drift data to represent geostrophic currents rather than directly wind-driven components. A preferable horizontal scale of 50 km indicated for the seabird drift data assimilation implies their capability of capturing eddies with a smaller horizontal scale than the minimum scale of 100 km resolved by satellite altimetry. The present study demonstrates that transdisciplinary approaches combining bio-/ship-logging and numerical modeling could be effective for enhancing ocean current monitoring. PMID:26633309

  1. Identification of tumor evolution patterns by means of inductive logic programming.

    PubMed

    Bevilacqua, Vitoantonio; Chiarappa, Patrizia; Mastronardi, Giuseppe; Menolascina, Filippo; Paradiso, Angelo; Tommasi, Stefania

    2008-06-01

    In considering key events of genomic disorders in the development and progression of cancer, the correlation between genomic instability and carcinogenesis is currently under investigation. In this work, we propose an inductive logic programming approach to the problem of modeling evolution patterns for breast cancer. Using this approach, it is possible to extract fingerprints of stages of the disease that can be used to develop and deliver the most adequate therapies to patients. Furthermore, such a model can help physicians and biologists in the elucidation of the molecular dynamics underlying the aberrations-waterfall model behind carcinogenesis. By showing results obtained on a real-world dataset, we give some hints about further approaches to the knowledge-driven validation of such hypotheses.

  2. Practical Results from the Application of Model Checking and Test Generation from UML/SysML Models of On-Board Space Applications

    NASA Astrophysics Data System (ADS)

    Faria, J. M.; Mahomad, S.; Silva, N.

    2009-05-01

    The deployment of complex safety-critical applications requires rigorous techniques and powerful tools for both the development and V&V stages. Model-based technologies are increasingly being used to develop safety-critical software, and arguably, turning to them can bring significant benefits to such processes, along with new challenges. This paper presents the results of a research project in which we tried to extend current V&V methodologies to UML/SysML models in order to answer validation demands. Two quite different but complementary approaches were investigated: (i) model checking and (ii) the extraction of robustness test cases from the same models. These two approaches do not overlap and, when combined, provide a wider-reaching model/design validation ability than each one alone, thus offering improved safety assurance. Results are encouraging, even though model checking fell short of the desired outcome and robustness test-case extraction has not yet fully matured. For model checking, it was verified that the automatic model validation process can become fully operational, and even expand in scope, once tool vendors help (inevitably) to improve the XMI standard interoperability situation. For the robustness test-case extraction methodology, the early approach produced interesting results but needs further systematisation and consolidation in order to produce results in a more predictable fashion and to reduce reliance on experts' heuristics. Finally, further improvement and innovation opportunities were immediately apparent for both approaches: circumventing current limitations in XMI interoperability on the one hand, and bringing test-case specification onto the same graphical level as the models themselves so that the generation of executable test cases can be automated from standard UML notation on the other.

  3. Molecular level in silico studies for oncology. Direct models review

    NASA Astrophysics Data System (ADS)

    Psakhie, S. G.; Tsukanov, A. A.

    2017-09-01

    The combination of therapy and diagnostics in one process, "theranostics," is a trend in modern medicine, especially in oncology. Such an approach requires the development and use of multifunctional hybrid nanoparticles with a hierarchical structure. Numerical methods and mathematical models play a significant role in the design of hierarchical nanoparticles and allow looking inside the nanoscale mechanisms of agent-cell interactions. The current position of the in silico approach in biomedicine and oncology is discussed. A review of molecular-level in silico studies in oncology that use direct models is presented.

  4. Estimation of hyper-parameters for a hierarchical model of combined cortical and extra-brain current sources in the MEG inverse problem.

    PubMed

    Morishige, Ken-ichi; Yoshioka, Taku; Kawawaki, Dai; Hiroe, Nobuo; Sato, Masa-aki; Kawato, Mitsuo

    2014-11-01

    One of the major obstacles in estimating cortical currents from MEG signals is the disturbance caused by magnetic artifacts derived from extra-cortical current sources such as heartbeats and eye movements. To remove the effect of such extra-brain sources, we improved the hybrid hierarchical variational Bayesian method (hyVBED) proposed by Fujiwara et al. (NeuroImage, 2009). hyVBED simultaneously estimates cortical and extra-brain source currents by placing dipoles on cortical surfaces as well as at extra-brain sources. This method requires EOG data for an EOG forward model that describes the relationship between eye dipoles and electric potentials. In contrast, our improved approach requires no EOG and less a priori knowledge about the current variance of extra-brain sources. We propose a new method, "extra-dipole," that optimally selects hyper-parameter values for the current variances of the cortical surface and extra-brain source dipoles. With the selected parameter values, the cortical and extra-brain dipole currents were accurately estimated from the simulated MEG data. The performance of this method was demonstrated to be better than that of conventional approaches, such as principal component analysis and independent component analysis, which use only statistical properties of the MEG signals. Furthermore, we applied the proposed method to MEG data measured during covert pursuit of a smoothly moving target and confirmed its effectiveness. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Modeling and simulation of enhancement mode p-GaN Gate AlGaN/GaN HEMT for RF circuit switch applications

    NASA Astrophysics Data System (ADS)

    Panda, D. K.; Lenka, T. R.

    2017-06-01

    An enhancement mode p-GaN gate AlGaN/GaN HEMT is proposed, and a physics-based virtual-source charge model with a Landauer approach for electron transport has been developed using Verilog-A and simulated using Cadence Spectre, in order to predict device characteristics such as threshold voltage, drain current, and gate capacitance. The drain current model incorporates important physical effects such as velocity saturation, short-channel effects like DIBL (drain-induced barrier lowering), channel length modulation (CLM), and mobility degradation due to self-heating. The predicted Id-Vds, Id-Vgs, and C-V characteristics show excellent agreement with the experimental data for both drain current and capacitance, which validates the model. The developed model was then utilized to design and simulate a single-pole single-throw (SPST) RF switch.
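
    The virtual-source structure of such a drain current model can be sketched generically. The Python fragment below shows the usual ingredients, a smooth empirical sheet charge above threshold and a saturation function interpolating between linear and saturated transport; all parameter values are illustrative placeholders, not fitted to the reported device, and the paper's Landauer transport and self-heating terms are omitted:

    ```python
    import numpy as np

    def vs_drain_current(Vgs, Vds, W=1e-3, Cg=5e-3, Vt=1.5, n=1.2,
                         vx0=1.2e5, Vdsat=0.5, beta=1.8, phi_t=0.0259):
        """W gate width [m], Cg gate capacitance [F/m^2], vx0 injection
        velocity [m/s]; every default here is a placeholder, not a fit."""
        # Smooth turn-on of the 2DEG sheet charge above threshold
        Qix0 = Cg * n * phi_t * np.logaddexp(0.0, (Vgs - Vt) / (n * phi_t))
        # ~Vds/Vdsat in the linear region, saturating toward 1
        Fsat = (Vds / Vdsat) / (1.0 + (Vds / Vdsat) ** beta) ** (1.0 / beta)
        return W * Qix0 * vx0 * Fsat             # drain current [A]
    ```

    Sweeping Vgs at fixed Vds (and vice versa) reproduces the qualitative Id-Vgs and Id-Vds shapes that compact models of this kind are fitted against.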

  6. Current opinion in Alzheimer's disease therapy by nanotechnology-based approaches.

    PubMed

    Ansari, Shakeel Ahmed; Satar, Rukhsana; Perveen, Asma; Ashraf, Ghulam Md

    2017-03-01

    Nanotechnology typically deals with the measuring and modeling of matter at the nanometer scale by incorporating the fields of engineering and technology. The most prominent feature of these engineered materials is their manipulation/modification to impart new functional properties. The current review covers the most recent findings on Alzheimer's disease (AD) therapeutics based on nanoscience and technology. Current studies involve the application of nanotechnology in developing novel diagnostic and therapeutic tools for neurological disorders. Nanotechnology-based approaches can be exploited for limiting/reversing these diseases and promoting functional regeneration of damaged neurons. These strategies offer neuroprotection by facilitating the delivery of drugs and small molecules more effectively across the blood-brain barrier. Nanotechnology-based approaches show promise in improving AD therapeutics. Further replication work on the synthesis and surface modification of nanoparticles, longer-term clinical trials, and attempts to increase their impact in treating AD are required.

  7. A multilayer approach for turbidity currents

    NASA Astrophysics Data System (ADS)

    Fernandez-Nieto, Enrique; Castro Díaz, Manuel J.; Morales de Luna, Tomás

    2017-04-01

    When a river that carries sediment in suspension enters a lake or the ocean, it can form a plume classified as hyperpycnal or hypopycnal. Hypopycnal plumes occur if the combined density of the sediment and interstitial fluid is lower than that of the ambient. Hyperpycnal plumes are a class of sediment-laden gravity current commonly referred to as turbidity currents [7, 9]. Some layer-averaged models have been developed previously (see [3, 4, 8] among others). Although this layer-averaged approach gives fast and valuable information, it has the disadvantage that the vertical distribution of the sediment in suspension is lost. A recent technique based on a multilayer approach [1, 2, 6] has proven especially useful for generalizing shallow water type models in order to keep track of the vertical components of the averaged variables in the classical shallow water equations. In [5] a multilayer model is obtained using a vertical discontinuous Galerkin approach in which the vertical velocity is assumed piecewise linear and the horizontal velocity piecewise constant. In this work the technique introduced in [5] is generalized to derive a model for turbidity currents. This model makes it possible to simulate hyperpycnal as well as hypopycnal plumes. Several numerical tests will be presented. References: [1] E. Audusse, M. Bristeau, B. Perthame, and J. Sainte-Marie. A multilayer Saint-Venant system with mass exchanges for shallow water flows: derivation and numerical validation. ESAIM: Mathematical Modelling and Numerical Analysis, 45(1):169-200, (2010). [2] E. Audusse, M.-O. Bristeau, M. Pelanti, and J. Sainte-Marie. Approximation of the hydrostatic Navier–Stokes system for density stratified flows by a multilayer model: kinetic interpretation and numerical solution. Journal of Computational Physics, 230(9):3453-3478, (2011). [3] S. F. Bradford and N. D. Katopodes. Hydrodynamics of turbid underflows. I: Formulation and numerical analysis. Journal of Hydraulic Engineering, 125(10):1006-1015, (1999). [4] F. H. Chu, W. D. Pilkey, and O. H. Pilkey. An analytical study of turbidity current steady flow. Marine Geology, 33(3-4):205-220, (1979). [5] E. D. Fernández-Nieto, E. H. Koné, and T. C. Rebollo. A multilayer method for the hydrostatic Navier–Stokes equations: a particular weak solution. Journal of Scientific Computing, 60(2):408-437, (2013). [6] E. D. Fernández-Nieto, E. H. Koné, T. Morales de Luna, and R. Bürger. A multilayer shallow water system for polydisperse sedimentation. Journal of Computational Physics, 238:281-314, (2013). [7] T. Mulder and J. P. M. Syvitski. Turbidity currents generated at river mouths during exceptional discharges to the world oceans. The Journal of Geology, 103(3):285-299, (1995). [8] G. Parker, Y. Fukushima, and H. M. Pantin. Self-accelerating turbidity currents. Journal of Fluid Mechanics, 171:145-181, (1986). [9] J. D. Parsons, J. W. M. Bush, and J. P. M. Syvitski. Hyperpycnal plume formation from riverine outflows with small sediment concentrations. Sedimentology, 48(2):465-478, (2001).

  8. Three-dimensional wave-induced current model equations and radiation stresses

    NASA Astrophysics Data System (ADS)

    Xia, Hua-yong

    2017-08-01

    Following the approach of Mellor (2003, 2008), the present paper reports on a renewed effort to derive the equations for three-dimensional wave-induced currents. Via the vertical momentum equation and a proper coordinate transformation, the phase-averaged wave dynamic pressure is treated properly, and a continuous, depth-dependent radiation stress tensor is obtained, rather than the controversial Dirac delta function at the surface that appears in Mellor (2008). In addition, a phase-averaged vertical momentum flux over a sloping bottom is introduced. All the inconsistencies in Mellor (2003, 2008) pointed out by Ardhuin et al. (2008) and Bennis and Ardhuin (2011) are overcome in the revised equations presented here. In a test case with a sloping sea bed, as shown in Ardhuin et al. (2008), the wave-driving forces derived from the present equations are in good balance, and no spurious vertical circulation occurs outside the surf zone, indicating that Airy wave theory and the approach of Mellor (2003, 2008) are applicable to the derivation of wave-induced current models.

  9. Contribution to the modelling and analysis of logistics system performance by Petri nets and simulation models: Application in a supply chain

    NASA Astrophysics Data System (ADS)

    Azougagh, Yassine; Benhida, Khalid; Elfezazi, Said

    2016-02-01

    This paper studies the performance of complex systems in a supply chain context by developing a structured modelling approach based on the ASDI methodology (Analysis, Specification, Design and Implementation), combining Petri net modelling with simulation in ARENA. The linear approach typically followed for this kind of problem faces modelling difficulties owing to the complexity and the number of parameters involved; the approach used in this work structures the modelling so as to cover all aspects of the performance study. The structured modelling approach is first introduced and then applied to an industrial system in the phosphate sector. The performance indicators obtained from the developed models made it possible to test the behaviour and fluctuations of this system and to develop improved models of the current situation. The paper also shows how the ARENA software can be used to simulate complex systems effectively. The method in this research can be applied to investigate various improvement scenarios and their consequences before implementing them in reality.

  10. A developmental, biopsychosocial model for the treatment of children with gender identity disorder.

    PubMed

    Zucker, Kenneth J; Wood, Hayley; Singh, Devita; Bradley, Susan J

    2012-01-01

    This article provides a summary of the therapeutic model and approach used in the Gender Identity Service at the Centre for Addiction and Mental Health in Toronto. The authors describe their assessment protocol and their current multifactorial case formulation model, which places a strong emphasis on developmental factors, and provide clinical examples of how the model is used in treatment.

  11. Flamelet Model Application for Non-Premixed Turbulent Combustion

    NASA Technical Reports Server (NTRS)

    Secundov, A.; Bezgin, L.; Buriko, Yu.; Guskov, O.; Kopchenov, V.; Laskin, I.; Lomkov, K.; Tshepin, S.; Volkov, D.; Zaitsev, S.

    1996-01-01

    This Final Report contains the results of a study performed at the Scientific Research Center 'ECOLEN' (Moscow, Russia). The study concerns the development and verification of an inexpensive approach for modeling supersonic turbulent diffusion flames based on a flamelet treatment of the chemistry/turbulence interaction (FL approach). The research work included: development of the approach and CFD tests of the flamelet model for supersonic jet flames; development of a simplified procedure for solving the flamelet equations based on a partial-equilibrium chemistry assumption; and a study of the flame ignition/extinction predictions provided by the flamelet model. The investigation demonstrated that the FL approach describes the main features of supersonic H2/air jet flames satisfactorily. The model also demonstrated a strong capability to reduce the computational expense of CFD modeling of supersonic flames while accounting for detailed oxidation chemistry. However, some disadvantages and restrictions of the existing version of the approach were found in this study: (1) inaccuracy in the predictions of passive scalar statistics by our turbulence model for one of the considered test cases; and (2) applicability of the available version of the flamelet model only to flames without a large ignition delay distance. Based on the results of this investigation, we formulated and submitted to the National Aeronautics and Space Administration a Project Proposal for the next research step, directed toward further improvement of the FL approach.

  12. Multilevel Molecular Modeling Approach for a Rational Design of Ionic Current Sensors for Nanofluidics.

    PubMed

    Kirch, Alexsandro; de Almeida, James M; Miranda, Caetano R

    2018-05-10

    The complexity displayed by nanofluidic-based systems involves electronic and dynamic aspects occurring across different size and time scales. To model such systems properly, we introduced a top-down multilevel approach combining molecular dynamics (MD) simulations with first-principles electronic transport calculations. The potential of this technique was demonstrated by investigating how water and ionic flow through a (6,6) carbon nanotube (CNT) influences its electronic transport properties. We showed that confinement in the CNT favors charge exchange between the partially hydrated Na, Cl, and Li ions and the nanotube. This leads to a change in the electronic transmittance that makes it possible to distinguish cations from anions. Such an ionic trace may enable an indirect measurement of the ionic current, recorded as a sensing output. With this case study, we demonstrate the potential of this top-down multilevel approach for the design of novel nanofluidic devices.

  13. NASA Occupant Protection Standards Development

    NASA Technical Reports Server (NTRS)

    Somers, Jeffrey; Gernhardt, Michael; Lawrence, Charles

    2012-01-01

    Historically, spacecraft landing systems have been tested with human volunteers, because analytical methods for estimating injury risk were insufficient. These tests were conducted with flight-like suits and seats to verify the safety of the landing systems. Currently, NASA uses the Brinkley Dynamic Response Index to estimate injury risk, although applying it to the NASA environment has drawbacks: (1) it does not indicate the severity or anatomical location of injury, and (2) it is unclear whether the model applies to NASA applications. Because of these limitations, a new validated analytical approach was desired. Leveraging the current state of the art in automotive and racing safety, a new approach was developed with several aspects: (1) define the acceptable level of injury risk by injury severity; (2) determine the appropriate human surrogate for testing and modeling; (3) mine existing human injury data to determine appropriate Injury Assessment Reference Values (IARVs); (4) rigorously validate the IARVs with sub-injurious human testing; and (5) use the validated IARVs to update standards and vehicle requirements.
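
    For context, the Brinkley model mentioned above reduces the occupant to a single-degree-of-freedom spring-damper system; in its commonly quoted form (axis-specific parameter values omitted here), the compression δ(t) driven by the seat acceleration a(t) gives a dimensionless dynamic response

    $$\ddot{\delta} + 2\zeta\omega_n\dot{\delta} + \omega_n^2\,\delta = a(t), \qquad \mathrm{DR}(t) = \frac{\omega_n^2\,\delta(t)}{g},$$

    and the peak |DR| along each axis is compared against risk limits. The limitations listed above stem directly from this lumped formulation: a single scalar per axis cannot localize or grade an injury.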

  14. An ensemble approach to predicting the impact of vaccination on rotavirus disease in Niger.

    PubMed

    Park, Jaewoo; Goldstein, Joshua; Haran, Murali; Ferrari, Matthew

    2017-10-13

    Recently developed vaccines provide a new way of controlling rotavirus in sub-Saharan Africa. Models for the transmission dynamics of rotavirus are critical both for estimating the current burden from imperfect surveillance and for assessing the potential effects of vaccine intervention strategies. We examine rotavirus infection in the Maradi area in southern Niger using hospital surveillance data provided by Epicentre collected over two years. Additionally, a cluster survey of households in the region allows us to estimate the proportion of children with diarrhea who consulted at a health structure. Model fit and future projections are necessarily particular to a given model; thus, where there are competing models for the underlying epidemiology, an ensemble approach can account for that uncertainty. We compare our results across several variants of Susceptible-Infectious-Recovered (SIR) compartmental models to quantify the impact of modeling assumptions on our estimates. Model-specific parameters are estimated by Bayesian inference using Markov chain Monte Carlo. We then use Bayesian model averaging to generate ensemble estimates of the current dynamics, including R0 and the burden of infection in the region, as well as the impact of vaccination on both the short-term dynamics and the long-term reduction of rotavirus incidence under varying levels of coverage. The ensemble of models predicts that the current burden of severe rotavirus disease is 2.6-3.7% of the population each year and that a 2-dose vaccine schedule achieving 70% coverage could reduce this burden by 39-42%. Copyright © 2017. Published by Elsevier Ltd.
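
    A minimal sketch of the Bayesian model averaging step described above, assuming each SIR variant's marginal likelihood and posterior burden draws are already available from MCMC (model names, values, and file-free toy data are illustrative, not from the paper):

    ```python
    import numpy as np

    # Hypothetical per-model outputs: log marginal likelihoods and
    # posterior draws of annual severe-disease burden (fractions).
    log_evidence = {"sir_basic": -1202.3, "sir_age": -1198.7, "sir_waning": -1199.5}
    burden_samples = {m: np.random.default_rng(i).normal(0.03, 0.004, 5000)
                      for i, m in enumerate(log_evidence)}

    # Posterior model weights w_m ∝ exp(log evidence), equal model priors.
    logz = np.array(list(log_evidence.values()))
    weights = np.exp(logz - logz.max())
    weights /= weights.sum()

    # Ensemble posterior: a mixture of the model-specific posteriors.
    rng = np.random.default_rng(0)
    models = list(log_evidence)
    picks = rng.choice(len(models), size=20000, p=weights)
    ensemble = np.concatenate([
        rng.choice(burden_samples[models[k]], size=(picks == k).sum())
        for k in range(len(models))])
    print("ensemble burden 95% interval:", np.percentile(ensemble, [2.5, 97.5]))
    ```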

  15. Signal timing on a shoestring

    DOT National Transportation Integrated Search

    2005-03-01

    The conventional approach to signal timing optimization and field deployment requires current traffic flow data, experience with optimization models, familiarity with the signal controller hardware, and knowledge of field operations including signal ...

  17. Modeling approaches for the simulation of ultrasonic inspections of anisotropic composite structures in the CIVA software platform

    NASA Astrophysics Data System (ADS)

    Jezzine, Karim; Imperiale, Alexandre; Demaldent, Edouard; Le Bourdais, Florian; Calmon, Pierre; Dominguez, Nicolas

    2018-04-01

    Models for the simulation of ultrasonic inspections of flat and curved plate-like composite structures, as well as stiffeners, are available in the CIVA-COMPOSITE module released in 2016. A first modelling approach, using a ray-based model, predicts the ultrasonic propagation in an anisotropic effective medium obtained by homogenizing the composite laminate. Fast 3D computations can be performed on configurations featuring, for example, delaminations, flat-bottom holes or inclusions. In addition, computations on ply waviness using this model will be available in CIVA 2017. Another approach proposed in the CIVA-COMPOSITE module is based on coupling the CIVA ray-based model with a finite-difference time-domain (FDTD) scheme developed by AIRBUS. The ray model handles the ultrasonic propagation between the transducer and the FDTD computation zone that surrounds the composite part. In this way, the computational efficiency is preserved and the scattering of ultrasound by the composite structure can be predicted. Alternatively, a high-order finite element approach is currently being developed at CEA but is not yet integrated in CIVA. The advantages of this approach will be discussed and first simulation results on Carbon Fiber Reinforced Polymers (CFRP) will be shown. Finally, the application of these modelling tools to the construction of metamodels is discussed.

  18. Toward a 35-years North American Precipitation and Surface Reanalysis

    NASA Astrophysics Data System (ADS)

    Gasset, N.; Fortin, V.

    2017-12-01

    In support of the International Watersheds Initiative (IWI) of the International Joint Commission (IJC), a 35-year precipitation and surface reanalysis covering North America at 3-hour and 15-km resolution is currently being developed at the Canadian Meteorological Centre (CMC). A deterministic reforecast / dynamical downscaling approach is followed in which a global reanalysis (ERA-Interim) provides the initial conditions for the Global Environmental Multi-scale model (GEM). Moreover, the latter is coupled with precipitation and surface data assimilation systems, namely the Canadian Precipitation Analysis (CaPA) and the Canadian Land Data Assimilation System (CaLDAS). While optimized for computational efficiency in the context of a reforecast experiment, all systems used are closely related to model versions and configurations currently run operationally at CMC, meaning they have undergone a strict and thorough validation procedure. As a proof of concept, and in order to identify the optimal set-up before producing the 35-year reanalysis, several configurations of the approach are evaluated for the years 2010-2014 using both the standard CMC validation methodology and more dedicated scores, such as comparison against currently available products (the North American Regional Reanalysis, MERRA-Land and the newly released ERA5 reanalysis). Special attention is dedicated to the evaluation of the analysed variables, i.e. precipitation, snow depth, surface/ground temperature and moisture, over the whole domain of interest. Results from these preliminary samples are very encouraging, and the optimal set-up has been identified. The coupled approach, i.e. GEM+CaPA/CaLDAS, always shows clear improvements over classical reforecasting and dynamical downscaling where surface observations are present. Furthermore, the results are in line with or better than currently available products and the reference CMC operational approach that was operated from 2012 to 2016 (GEM 3.3, 10-km resolution). This reanalysis will allow for bias correction of current estimates and forecasts, and will help decision makers understand and communicate how much the current forecasted state of the system differs from the recent past.

  19. Linking adverse outcome pathways and population models: Current state of the science and future directions

    EPA Science Inventory

    Analysis of population impacts of chemical stressors through the use of modeling provides a linkage between endpoints observed in the individual and ecological risk to the population as a whole. In this presentation, we describe the evolution of an approach developed in our labor...

  20. Communication Policy and Theory: Current Perspectives on Mass Communication Research.

    ERIC Educational Resources Information Center

    Bybee, Carl R.; Cahn, Dudley D.

    The integration of American and European mass communication research models would provide a broader sociocultural framework for formulating communication policy. Emphasizing a functional approach, the American diffusionist model assumes that society is a system of interrelated parts naturally tending toward a state of dynamic equilibrium. The…

  1. The Structures of Centralized Governmental Privacy Protection: Approaches, Models, and Analysis.

    ERIC Educational Resources Information Center

    Jaeger, Paul T.; McClure, Charles R.; Fraser, Bruce T.

    2002-01-01

    Asserts that the federal government should adopt a centralized governmental structure for the privacy protection of personal information and data. Discusses the roles of federal law, federal agencies, and the judiciary; the concept of information privacy; the impact of current technologies; and models of centralized government structures for…

  2. Estimating the spatial scales of landscape effects on abundance

    Treesearch

    Richard Chandler; Jeffrey Hepinstall-Cymerman

    2016-01-01

    Spatial variation in abundance is influenced by local- and landscape-level environmental variables, but modeling landscape effects is challenging because the spatial scales of the relationships are unknown. Current approaches involve buffering survey locations with polygons of various sizes and using model selection to identify the best scale. The buffering...

  3. Designing a Pedagogical Model for Web Engineering Education: An Evolutionary Perspective

    ERIC Educational Resources Information Center

    Hadjerrouit, Said

    2005-01-01

    In contrast to software engineering, which relies on relatively well established development approaches, there is a lack of a proven methodology that guides Web engineers in building reliable and effective Web-based systems. Currently, Web engineering lacks process models, architectures, suitable techniques and methods, quality assurance, and a…

  4. Hybrid quantum teleportation: A theoretical model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takeda, Shuntaro; Mizuta, Takahiro; Fuwa, Maria

    2014-12-04

    Hybrid quantum teleportation – continuous-variable teleportation of qubits – is a promising approach for deterministically teleporting photonic qubits. We propose how to implement it with current technology. Our theoretical model shows that faithful qubit transfer can be achieved for this teleportation by choosing an optimal gain for the teleporter’s classical channel.

  5. Application of Complex Adaptive Systems in Portfolio Management

    ERIC Educational Resources Information Center

    Su, Zheyuan

    2017-01-01

    Simulation-based methods are becoming a promising research tool in financial markets. A general Complex Adaptive System can be tailored to different application scenarios. Based on the current research, we built two models that would benefit portfolio management by utilizing Complex Adaptive Systems (CAS) in Agent-based Modeling (ABM) approach.…

  6. NASA's Use of Human Behavior Models for Concept Development and Evaluation

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2012-01-01

    Overview of NASA's use of computational approaches and methods, in particular human performance models, to support research goals, with a focus on examples of the methods used in Code TH and TI at NASA Ames, followed by an in-depth review of MIDAS' current FAA work.

  7. Simulating potato gas exchange as influenced by CO2 and irrigation

    USDA-ARS?s Scientific Manuscript database

    Recent research suggests that an energy balance approach is required for crop models to adequately respond to current and future climatic conditions associated with elevated CO2, higher temperatures, and water scarcity. More realistic models are needed in order to understand the impact of, and deve...

  8. CONCEPTUAL BASIS FOR MULTI-ROUTE INTAKE DOSE MODELING USING AN ENERGY EXPENDITURE APPROACH

    EPA Science Inventory

    This paper provides the conceptual basis for a modeling logic that is currently being developed in the National Exposure Research Laboratory (NERL) of the U.S. Environmental Protection Agency (EPA) for use in intake dose assessments involving substances that can enter the body...

  9. Large-Scale Modeling of Wordform Learning and Representation

    ERIC Educational Resources Information Center

    Sibley, Daragh E.; Kello, Christopher T.; Plaut, David C.; Elman, Jeffrey L.

    2008-01-01

    The forms of words as they appear in text and speech are central to theories and models of lexical processing. Nonetheless, current methods for simulating their learning and representation fail to approach the scale and heterogeneity of real wordform lexicons. A connectionist architecture termed the "sequence encoder" is used to learn…

  10. Toward a More Comprehensive Model of Teacher Pay. Research Brief

    ERIC Educational Resources Information Center

    National Center on Performance Incentives, 2008

    2008-01-01

    In "Toward a More Comprehensive Model of Teacher Pay"--a paper presented at the February 2008 National Center on Performance Incentives research to policy conference--Julia Koppich examines recent policy initiatives implementing new approaches to teacher pay. Her discussion focuses on four current initiatives: ProComp in Denver, Toledo…

  11. Current Density and Continuity in Discretized Models

    ERIC Educational Resources Information Center

    Boykin, Timothy B.; Luisier, Mathieu; Klimeck, Gerhard

    2010-01-01

    Discrete approaches have long been used in numerical modelling of physical systems in both research and teaching. Discrete versions of the Schrodinger equation employing either one or several basis functions per mesh point are often used by senior undergraduates and beginning graduate students in computational physics projects. In studying…

  12. Contemporary Culture: A Model for Teaching a Culture's Heritage.

    ERIC Educational Resources Information Center

    Carr, Tom

    Current approaches to teaching culture which have adapted the anthropological model to contemporary life situations can serve as a guide to the organization of traditional civilization course material, from which exercises can be developed. Culture instruction should incorporate a cross-cultural dimension, be authentically contemporary, and be…

  13. Building Public Health Capacity through Online Global Learning

    ERIC Educational Resources Information Center

    Madhok, Rajan; Frank, Erica; Heller, Richard Frederick

    2018-01-01

    Rising disease burden and health inequalities remain global concerns, highlighting the need for health systems strengthening with a sufficient and appropriately trained workforce. The current models for developing such a workforce are inadequate and newer approaches are needed. In this paper we describe a model for public health capacity building…

  14. Errors of Inference in Structural Equation Modeling

    ERIC Educational Resources Information Center

    McCoach, D. Betsy; Black, Anne C.; O'Connell, Ann A.

    2007-01-01

    Although structural equation modeling (SEM) is one of the most comprehensive and flexible approaches to data analysis currently available, it is nonetheless prone to researcher misuse and misconceptions. This article offers a brief overview of the unique capabilities of SEM and discusses common sources of user error in drawing conclusions from…

  15. The Mystery Tubes: Teaching Pupils about Hypothetical Modelling

    ERIC Educational Resources Information Center

    Kenrick, Carole

    2017-01-01

    This article recounts the author's working experience of one method by which pupils' understanding of the epistemologies of science can be developed, specifically how scientists can develop hypothetical models and test them through simulations. She currently uses this approach for transition lessons with pupils in upper primary or lower secondary…

  16. A New Theoretical Approach to Postsecondary Student Disability: Disability-Diversity (Dis)Connect Model

    ERIC Educational Resources Information Center

    Aquino, Katherine C.

    2016-01-01

    Disability is often viewed as an obstacle to postsecondary inclusion, but not a characteristic of student diversity. Additionally, current theoretical frameworks isolate disability from other student diversity characteristics. In response, a new conceptual framework, the Disability-Diversity (Dis)Connect Model (DDDM), was created to address…

  17. The Planning Wheel: Value Added Performance.

    ERIC Educational Resources Information Center

    Murk, Peter J.; Walls, Jeffrey L.

    The "Planning Wheel" is an evolution of the original Systems Approach Model (SAM) that was introduced in 1986 by Murk and Galbraith. Unlike most current planning models, which are linear in design and concept, the Planning Wheel bridges the gap between linear and nonlinear processes. The "Program Planning Wheel" is designed to…

  18. An analysis of electrical conductivity model in saturated porous media

    NASA Astrophysics Data System (ADS)

    Cai, J.; Wei, W.; Qin, X.; Hu, X.

    2017-12-01

    The electrical conductivity of saturated porous media has numerous applications in many fields. In recent years, the number of theoretical methods for modeling the electrical conductivity of complex porous media has increased dramatically. Nevertheless, modeling the spatial conductivity distribution function continues to present challenges when these models are used in reservoirs, particularly in porous media with strongly heterogeneous pore-space distributions. Many experiments show a more complex distribution of electrical conductivity data than the predictions derived from empirical models. Studies have observed anomalously high electrical conductivity in some low-porosity (tight) formations compared to more porous reservoir rocks, which indicates that current flow in porous media is complex and difficult to predict. Moreover, the change in electrical conductivity depends not only on the pore volume fraction but also on several geometric properties of the wider pore network, including pore interconnection and tortuosity. To improve our understanding of electrical conductivity models in porous media, we study the applicability of several well-known methods/theories to the electrical characteristics of porous rocks as a function of pore volume, tortuosity and interconnection, in order to estimate electrical conductivity from the micro-geometrical properties of rocks. We analyze the state of the art of scientific knowledge and practice for modeling porous structural systems, with the purpose of identifying current limitations and defining a blueprint for future modeling advances. We compare conceptual descriptions of electrical current flow processes in pore space across several distinct modeling approaches, and discuss ways of obtaining more reasonable electrical conductivity models. Experiments suggest more complex relationships between electrical conductivity and porosity than empirical models capture, particularly in low-porosity formations. However, the available theoretical models combined with simulations do provide insight into how microscale physics affects macroscale electrical conductivity in porous media.
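
    The empirical models contrasted above are typically of the Archie type; in its simplest fully saturated form (conventional symbols assumed, not taken from the abstract),

    $$\sigma_{\mathrm{eff}} = \frac{\sigma_w\,\phi^m}{a},$$

    where σ_w is the pore-fluid conductivity, φ the porosity, m the cementation exponent and a a tortuosity factor. Because this relation depends on porosity alone, it cannot capture the connectivity and tortuosity effects pointed to by the experiments above, which is precisely the gap the geometric models aim to fill.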

  19. Trends in Mediation Analysis in Nursing Research: Improving Current Practice.

    PubMed

    Hertzog, Melody

    2018-06-01

    The purpose of this study was to describe common approaches used by nursing researchers to test mediation models and evaluate them within the context of current methodological advances. MEDLINE was used to locate studies testing a mediation model and published from 2004 to 2015 in nursing journals. Design (experimental/correlation, cross-sectional/longitudinal, model complexity) and analysis (method, inclusion of test of mediated effect, violations/discussion of assumptions, sample size/power) characteristics were coded for 456 studies. General trends were identified using descriptive statistics. Consistent with findings of reviews in other disciplines, evidence was found that nursing researchers may not be aware of the strong assumptions and serious limitations of their analyses. Suggestions for strengthening the rigor of such studies and an overview of current methods for testing more complex models, including longitudinal mediation processes, are presented.

  20. A machine learning approach for real-time modelling of tissue deformation in image-guided neurosurgery.

    PubMed

    Tonutti, Michele; Gras, Gauthier; Yang, Guang-Zhong

    2017-07-01

    Accurate reconstruction and visualisation of soft tissue deformation in real time is crucial in image-guided surgery, particularly in augmented reality (AR) applications. Current deformation models are characterised by a trade-off between accuracy and computational speed. We propose an approach to derive a patient-specific deformation model for brain pathologies by combining the results of pre-computed finite element method (FEM) simulations with machine learning algorithms. The models can be computed instantaneously and offer an accuracy comparable to FEM models. A brain tumour is used as the subject of the deformation model. Load-driven FEM simulations are performed on a tetrahedral brain mesh afflicted by a tumour. Forces of varying magnitudes, positions, and inclination angles are applied onto the brain's surface. Two machine learning algorithms, artificial neural networks (ANNs) and support vector regression (SVR), are employed to derive a model that can predict the resulting deformation for each node in the tumour's mesh. The tumour deformation can be predicted in real time given relevant information about the geometry of the anatomy and the load, all of which can be measured instantly during a surgical operation. The models can predict the position of the nodes with errors below 0.3 mm, well within the general threshold of surgical accuracy and suitable for high-fidelity AR systems. The SVR models perform better than the ANNs, with positional errors for SVR models reaching under 0.2 mm. The results represent an improvement over existing deformation models for real-time applications, providing smaller errors and high patient-specificity. The proposed approach addresses the current needs of image-guided surgical systems and has the potential to be employed to model the deformation of any type of soft tissue. Copyright © 2017 Elsevier B.V. All rights reserved.
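
    A minimal sketch of the FEM-surrogate idea described above (file names, feature dimensions, and hyperparameters are illustrative assumptions, not the paper's):

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.multioutput import MultiOutputRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Hypothetical offline training data from pre-computed FEM runs:
    # inputs = load parameters (force magnitude, position, inclination),
    # outputs = resulting (x, y, z) displacement of one mesh node.
    X = np.load("fem_load_params.npy")        # shape (n_sims, 5)
    Y = np.load("fem_node_displacement.npy")  # shape (n_sims, 3)

    # SVR is single-output, so wrap it to predict all 3 components.
    model = MultiOutputRegressor(
        make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1e-4)))
    model.fit(X, Y)

    # Intra-operative use: a measured load maps to a displacement in ms.
    x_now = np.array([[1.2, 0.3, -0.1, 15.0, 42.0]])
    print("predicted displacement (mm):", model.predict(x_now))
    ```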

  1. Multi-Criteria Approach in Multifunctional Building Design Process

    NASA Astrophysics Data System (ADS)

    Gerigk, Mateusz

    2017-10-01

    The paper presents a new approach to the multifunctional building design process and defines problems related to the design of complex multifunctional buildings. Contemporary urban areas are characterized by very intensive use of space: buildings are being built bigger and contain more diverse functions in order to meet the needs of a large number of users within one envelope. These trends show the need to organize design objects into a structured system that meets current design criteria. The design process, viewed as a complex system, is a theoretical model that forms the basis for optimizing solutions over the entire life cycle of the building. From the concept phase through the exploitation phase to the disposal phase, multipurpose spaces should guarantee aesthetics, functionality, system efficiency, system safety and environmental protection in the best possible way. The result of the analysis of the design process is presented as a theoretical model of the multifunctional structure. Expressing the multi-criteria model in the form of a Cartesian product allows a holistic representation of the designed building to be created in the form of a graph model. The proposed network is the theoretical basis that can be used in the design process of complex engineering systems. The systematic multi-criteria approach makes it possible to maintain control over the entire design process and to provide the best possible performance. With respect to current design requirements, there are no established design rules for multifunctional buildings in relation to their operating phase. Enriching the basic criteria with a functional flexibility criterion makes it possible to extend the exploitation phase, which brings advantages on many levels.

  2. Path integral approach to closed form pricing formulas in the Heston framework.

    NASA Astrophysics Data System (ADS)

    Lemmens, Damiaan; Wouters, Michiel; Tempere, Jacques; Foulon, Sven

    2008-03-01

    We present a path integral approach for finding closed-form formulas for option prices in the framework of the Heston model. The first model for determining option prices was the Black-Scholes model, which assumed that the logreturn followed a Wiener process with a given drift and constant volatility. To provide a realistic description of the market, the Black-Scholes results must be extended to include stochastic volatility. This is achieved by the Heston model, which assumes that the volatility follows a mean-reverting square root process. Current applications of the Heston model are hampered by the unavailability of fast numerical methods, due to a lack of closed-form formulae. The search for closed-form solutions is therefore an essential step before the qualitatively better stochastic volatility models will be used in practice. To attain this goal we outline a simplified path integral approach yielding straightforward results for vanilla Heston options with correlation. Extensions to barrier options and other path-dependent options are discussed, and the new derivation is compared to existing results obtained from alternative path-integral approaches (Dragulescu, Kleinert).
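
    For reference, the Heston dynamics described above are (standard notation assumed):

    $$dS_t = \mu S_t\,dt + \sqrt{v_t}\,S_t\,dW_t^S,$$
    $$dv_t = \kappa(\theta - v_t)\,dt + \xi\sqrt{v_t}\,dW_t^v, \qquad d\langle W^S, W^v\rangle_t = \rho\,dt,$$

    where v_t is the mean-reverting square-root variance process with long-run level θ, reversion rate κ and volatility-of-volatility ξ; the Black-Scholes model is recovered when v_t is frozen at a constant σ².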

  3. To Control False Positives in Gene-Gene Interaction Analysis: Two Novel Conditional Entropy-Based Approaches

    PubMed Central

    Lin, Meihua; Li, Haoli; Zhao, Xiaolei; Qin, Jiheng

    2013-01-01

    Genome-wide analysis of gene-gene interactions has been recognized as a powerful avenue for identifying the missing genetic components that cannot be detected by current single-point association analysis. Recently, several model-free methods (e.g. the commonly used information-based metrics and several logistic regression-based metrics) were developed for detecting non-linear dependence between genetic loci, but they are potentially at risk of inflated false positive error, in particular when the main effects at one or both loci are salient. In this study, we proposed two conditional entropy-based metrics to address this limitation. Extensive simulations demonstrated that the two proposed metrics, provided the disease is rare, maintain a consistently correct false positive rate. In scenarios for a common disease, our proposed metrics achieved better or comparable control of false positive error, compared to four previously proposed model-free metrics. In terms of power, our methods outperformed several competing metrics in a range of common disease models. Furthermore, in real data analyses, both metrics succeeded in detecting interactions and were competitive with the originally reported results or the logistic regression approaches. In conclusion, the proposed conditional entropy-based metrics are promising alternatives to current model-based approaches for detecting genuine epistatic effects. PMID:24339984
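
    The exact definitions of the two proposed metrics are not reproduced in this abstract; as a generic illustration of the conditional-entropy building block they are based on, the following sketch computes H(D | G1, G2) from discrete samples (all data here are toy placeholders):

    ```python
    import numpy as np

    def conditional_entropy(y, x):
        """H(Y | X) in bits from paired discrete samples,
        H(Y|X) = -sum_{x,y} p(x,y) log2( p(x,y) / p(x) )."""
        n = len(x)
        h = 0.0
        for xv in np.unique(x):
            mask = (x == xv)
            px = mask.sum() / n
            for yv in np.unique(y):
                pxy = np.logical_and(mask, y == yv).sum() / n
                if pxy > 0:
                    h -= pxy * np.log2(pxy / px)
        return h

    # Toy usage: disease status conditioned on joint two-locus genotypes.
    rng = np.random.default_rng(1)
    g1, g2 = rng.integers(0, 3, 2000), rng.integers(0, 3, 2000)
    disease = rng.integers(0, 2, 2000)
    joint = g1 * 3 + g2  # encode the 9 two-locus genotype combinations
    print("H(D | G1,G2) =", conditional_entropy(disease, joint))
    ```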

  4. A template-based approach for responsibility management in executable business processes

    NASA Astrophysics Data System (ADS)

    Cabanillas, Cristina; Resinas, Manuel; Ruiz-Cortés, Antonio

    2018-05-01

    Process-oriented organisations need to manage the different types of responsibilities their employees may have with respect to the activities involved in their business processes. Although several approaches provide support for responsibility modelling, in current Business Process Management Systems (BPMS) the only responsibility considered at runtime is the one related to performing the work required for activity completion. Others, like accountability or consultation, must be implemented by manually adding activities to the executable process model, which is time-consuming and error-prone. In this paper, we address this limitation by enabling current BPMS to execute processes in which people with different responsibilities interact to complete the activities. We introduce a metamodel based on Responsibility Assignment Matrices (RAM) to model the responsibility assignment for each activity, and a flexible template-based mechanism that automatically transforms such information into BPMN elements, which can be interpreted and executed by a BPMS. Thus, our approach does not enforce any specific behaviour for the different responsibilities; instead, new templates can be modelled to specify the interaction that best suits the activity requirements. Furthermore, libraries of templates can be created and reused in different processes. We provide a reference implementation and build a library of templates for a well-known set of responsibilities.

  5. Disrupting Traditions: Swimming against the Current of Adolescent Bullying

    ERIC Educational Resources Information Center

    Khasnabis, Debi; Upton, Kevin

    2013-01-01

    Advances in technology have aggravated the generations-old problem of bullying in schools. In this article, the authors attend to the impact of social media on bullying and advocate an approach to teaching anti-bullying that incorporates a project-based learning approach for young adolescents. Process drama as a model of learning and the use of…

  6. Interdisciplinary Approaches--Past and Present--to Foreign Language and Literature Studies.

    ERIC Educational Resources Information Center

    Jaffe, Samuel

    This paper discusses some past and present interdisciplinary approaches to foreign language and literature (FLL) studies. It is argued that the current unpopularity of FLL studies in the United States is the consequence of falling away from viable traditions of scholarship and teaching in this area. A model for reform that would reawaken interest in…

  7. An Assessment of the Department of Education's Approach and Model for Analyzing Lender Profitability.

    ERIC Educational Resources Information Center

    Jenkins, Sarah; And Others

    An assessment was done of the Department of Education's (ED) approach to determining lender profitability for Guaranteed Student Loans. The assessment described the current net present value (NPV) method as well as discussing its strengths and weaknesses. The NPV method has been widely accepted for determining the profitability of different…

  8. Mathematical modelling of clostridial acetone-butanol-ethanol fermentation.

    PubMed

    Millat, Thomas; Winzer, Klaus

    2017-03-01

    Clostridial acetone-butanol-ethanol (ABE) fermentation features a remarkable shift in cellular metabolic activity from acid formation, acidogenesis, to the production of industrially relevant solvents, solventogenesis. In recent decades, mathematical models have been employed to elucidate the complex interlinked regulation and conditions that determine these two distinct metabolic states and govern the transition between them. In this review, we discuss these models with a focus on the mechanisms controlling intra- and extracellular changes between acidogenesis and solventogenesis. In particular, we critically evaluate underlying model assumptions and predictions in the light of current experimental knowledge. Towards this end, we briefly introduce key ideas and assumptions applied in the discussed modelling approaches, but waive a comprehensive mathematical presentation. We distinguish between structural and dynamical models, which are discussed in chronological order to illustrate how new biological information facilitates the 'evolution' of mathematical models. Mathematical models and their analysis have significantly contributed to our knowledge of ABE fermentation and the underlying regulatory network, which spans all levels of biological organization. However, the ties between the different levels of cellular regulation are not well understood. Furthermore, contradictory experimental and theoretical results challenge our current notion of the ABE metabolic network structure. Thus, clostridial ABE fermentation still poses theoretical as well as experimental challenges, which are best approached in close collaboration between modellers and experimentalists.

  9. Evidence-based dentistry: a clinician's perspective.

    PubMed

    Bauer, Janet; Spackman, Sue; Chiappelli, Francesco; Prolo, Paolo; Stevenson, Richard

    2006-07-01

    Evidence-based dentistry is a discipline that provides the best explicit evidence to dentists and their patients in shared decision-making. Currently, dentists are being trained and directed to adopt the role of translational researchers in developing evidence-based dental practices. Practically, however, evidence-based dentistry is not usable in its current form for the provision of the labor-intensive services that characterize current dental practice. The purpose of this article is to introduce a model of evidence-based dental practice. This model conceptualizes a team approach in explaining problems and solutions to change current dental practice. These changes constitute an evidence-based dental practice that involves the electronic chart, a centralized database, knowledge management software, and personnel in optimizing effective oral health care for dental patients.

  10. Current algebra, statistical mechanics and quantum models

    NASA Astrophysics Data System (ADS)

    Vilela Mendes, R.

    2017-11-01

    Results obtained in the past for free boson systems at zero and nonzero temperatures are revisited to clarify the physical meaning of current algebra reducible functionals which are associated to systems with density fluctuations, leading to observable effects on phase transitions. To use current algebra as a tool for the formulation of quantum statistical mechanics amounts to the construction of unitary representations of diffeomorphism groups. Two mathematical equivalent procedures exist for this purpose. One searches for quasi-invariant measures on configuration spaces, the other for a cyclic vector in Hilbert space. Here, one argues that the second approach is closer to the physical intuition when modelling complex systems. An example of application of the current algebra methodology to the pairing phenomenon in two-dimensional fermion systems is discussed.

  11. Dynamics of Charge Transfer in DNA Wires: A Proton-Coupled Approach

    NASA Astrophysics Data System (ADS)

    Behnia, Sohrab; Fathizadeh, Samira; Ziaei, Javid; Akhshani, Afshin

    2017-12-01

    The advent of molecular electronics has fueled interest in studying DNA as a nanowire. The well-known Peyrard-Bishop-Dauxois (PBD) model, which was proposed to explain the mechanism of DNA denaturation, has a limited number of degrees of freedom. Likewise, studying the charge transfer effect with the Peyrard-Bishop-Holstein (PBH) model, in which the dynamical motion is described via the PBD model, may limit the range of phenomena that can be observed. We have therefore added the mutual interaction of a proton and an electron, in the form of proton-coupled electron transfer (PCET), to the PBH model. PCET has been implicated in a variety of oxidative processes that ultimately lead to mutations. By applying the PCET approach to DNA based on a proton-combined PBH model, we were able to extract the electron and proton currents independently, taking the reciprocal influence of the electron and proton currents into account. This interaction does not affect the general form of the electronic current in DNA, but it changes the threshold for the occurrence of phenomena such as negative differential resistance. It is worth mentioning that perceiving the structural properties of the attractors in phase space via the Rényi dimension, and concentrating on the critical regions through a scalogram, can present a clear picture of the critical points in such phenomena.

  12. PACE: Probabilistic Assessment for Contributor Estimation- A machine learning-based assessment of the number of contributors in DNA mixtures.

    PubMed

    Marciano, Michael A; Adelman, Jonathan D

    2017-03-01

    The deconvolution of DNA mixtures remains one of the most critical challenges in the field of forensic DNA analysis. In addition, of all the data features required to perform such deconvolution, the number of contributors in the sample is widely considered the most important, and, if incorrectly chosen, the most likely to negatively influence the mixture interpretation of a DNA profile. Unfortunately, most current approaches to mixture deconvolution require the assumption that the number of contributors is known by the analyst, an assumption that can prove to be especially faulty when faced with increasingly complex mixtures of 3 or more contributors. In this study, we propose a probabilistic approach for estimating the number of contributors in a DNA mixture that leverages the strengths of machine learning. To assess this approach, we compare classification performances of six machine learning algorithms and evaluate the model from the top-performing algorithm against the current state of the art in the field of contributor number classification. Overall results show over 98% accuracy in identifying the number of contributors in a DNA mixture of up to 4 contributors. Comparative results showed 3-person mixtures had a classification accuracy improvement of over 6% compared to the current best-in-field methodology, and that 4-person mixtures had a classification accuracy improvement of over 20%. The Probabilistic Assessment for Contributor Estimation (PACE) also accomplishes classification of mixtures of up to 4 contributors in less than 1s using a standard laptop or desktop computer. Considering the high classification accuracy rates, as well as the significant time commitment required by the current state of the art model versus seconds required by a machine learning-derived model, the approach described herein provides a promising means of estimating the number of contributors and, subsequently, will lead to improved DNA mixture interpretation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
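
    A minimal sketch in the spirit of PACE (the paper's actual feature set, top-performing algorithm, and training data are not reproduced here; file names and features are illustrative assumptions):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Hypothetical summary features of an electropherogram, e.g.
    # per-locus allele counts, peak-height ratios, maximum allele counts.
    X = np.load("mixture_features.npy")
    y = np.load("n_contributors.npy")   # labels in {1, 2, 3, 4}

    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

    clf.fit(X, y)
    # A probability distribution over contributor numbers supports the
    # "probabilistic assessment" framing of the abstract.
    print("P(n = 1..4):", clf.predict_proba(X[:1]))
    ```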

  13. Equivalence of MAXENT and Poisson point process models for species distribution modeling in ecology.

    PubMed

    Renner, Ian W; Warton, David I

    2013-03-01

    Modeling the spatial distribution of a species is a fundamental problem in ecology. A number of modeling methods have been developed, an extremely popular one being MAXENT, a maximum entropy modeling approach. In this article, we show that MAXENT is equivalent to a Poisson regression model and hence is related to a Poisson point process model, differing only in the intercept term, which is scale-dependent in MAXENT. We illustrate a number of improvements to MAXENT that follow from these relations. In particular, a point process model approach facilitates methods for choosing the appropriate spatial resolution, assessing model adequacy, and choosing the LASSO penalty parameter, all currently unavailable to MAXENT. The equivalence result represents a significant step in the unification of the species distribution modeling literature. Copyright © 2013, The International Biometric Society.
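
    The equivalence can be seen from the Poisson point process log-likelihood for presence locations s_1, …, s_n in a region A with log-linear intensity λ(s) = exp(α + β·x(s)):

    $$\log L = \sum_{i=1}^{n} \log\lambda(s_i) - \int_A \lambda(s)\,ds.$$

    MAXENT fits the same exponential family for the normalized density of presences, so the slope coefficients β coincide; only the intercept α, which sets the overall scale and therefore depends on the spatial resolution in MAXENT, differs (notation here is generic, not quoted from the paper).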

  14. An alternative approach for socio-hydrology: case study research

    NASA Astrophysics Data System (ADS)

    Mostert, Erik

    2018-01-01

    Currently the most popular approach in socio-hydrology is to develop coupled human-water models. This article proposes an alternative approach, qualitative case study research, involving a systematic review of (1) the human activities affecting the hydrology in the case, (2) the main human actors, and (3) the main factors influencing the actors and their activities. Moreover, this article presents a case study of the Dommel Basin in Belgium and the Netherlands, and compares this with a coupled model of the Kissimmee Basin in Florida. In both basins a pendulum swing from water resources development and control to protection and restoration can be observed. The Dommel case study moreover points to the importance of institutional and financial arrangements, community values, and broader social, economic, and technical developments. These factors are missing from the Kissimmee model. Generally, case studies can result in a more complete understanding of individual cases than coupled models, and if the cases are selected carefully and compared with previous studies, it is possible to generalize from them. Case studies also offer more levers for management and facilitate interdisciplinary cooperation. Coupled models, on the other hand, can be used to generate possible explanations of past developments and quantitative scenarios for future developments. The article concludes that, given the limited attention they currently receive and their potential benefits, case studies deserve more attention in socio-hydrology.

  15. Metabolic modeling of dynamic 13C NMR isotopomer data in the brain in vivo: Fast screening of metabolic models using automated generation of differential equations

    PubMed Central

    Tiret, Brice; Shestov, Alexander A.; Valette, Julien; Henry, Pierre-Gilles

    2017-01-01

    Most current brain metabolic models are not capable of taking into account the dynamic isotopomer information available from fine structure multiplets in 13C spectra, due to the difficulty of implementing such models. Here we present a new approach that allows automatic implementation of multi-compartment metabolic models capable of fitting any number of 13C isotopomer curves in the brain. The new automated approach also makes it possible to quickly modify and test new models to best describe the experimental data. We demonstrate the power of the new approach by testing the effect of adding separate pyruvate pools in astrocytes and neurons, and adding a vesicular neuronal glutamate pool. Including both changes reduced the global fit residual by half and pointed to dilution of label prior to entry into the astrocytic TCA cycle as the main source of glutamine dilution. The glutamate-glutamine cycle rate was particularly sensitive to changes in the model. PMID:26553273
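
    A minimal sketch of the automated-ODE idea (pool names, fluxes, and the enrichment-balance form are illustrative assumptions, not the paper's implementation):

    ```python
    import sympy as sp

    # Hypothetical reaction list: (source pool, destination pool, flux).
    reactions = [
        ("Pyr_a", "OAA_a", "Vpc"),    # pyruvate carboxylase (astrocyte)
        ("OAA_a", "Glu_a", "Vtca_a"), # lumped TCA-cycle step
        ("Glu_a", "Gln_a", "Vsyn"),   # glutamine synthetase
    ]
    pools = sorted({p for r in reactions for p in r[:2]})
    t = sp.symbols("t")
    conc = {p: sp.Function(p)(t) for p in pools}          # 13C enrichments
    flux = {v: sp.Symbol(v) for _, _, v in reactions}     # metabolic fluxes
    size = {p: sp.Symbol(f"{p}_size") for p in pools}     # constant pool sizes

    # d(enrichment)/dt = flux * (inflow labeling - outflow labeling) / pool size
    rhs = {p: 0 for p in pools}
    for src, dst, v in reactions:
        rhs[dst] += flux[v] * (conc[src] - conc[dst]) / size[dst]

    for p in pools:
        sp.pprint(sp.Eq(sp.Derivative(conc[p], t), sp.simplify(rhs[p])))
    ```

    Adding or removing a pool (e.g. a separate neuronal pyruvate pool or a vesicular glutamate pool, as tested in the paper) then only changes the reaction list, not hand-written differential equations.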

  16. Identification of time-varying structural dynamic systems - An artificial intelligence approach

    NASA Technical Reports Server (NTRS)

    Glass, B. J.; Hanagud, S.

    1992-01-01

    An application of the artificial intelligence-derived methodologies of heuristic search and object-oriented programming to the problem of identifying the form of the model and the associated parameters of a time-varying structural dynamic system is presented in this paper. Possible model variations due to changes in boundary conditions or configurations of a structure are organized into a taxonomy of models, and a variant of best-first search is used to identify the model whose simulated response best matches that of the current physical structure. Simulated model responses are verified experimentally. An output-error approach is used in a discontinuous model space, and an equation-error approach is used in the parameter space. The advantages of the AI methods used, compared with conventional programming techniques for implementing knowledge structuring and inheritance, are discussed. Convergence conditions and example problems have been discussed. In the example problem, both the time-varying model and its new parameters have been identified when changes occur.

  17. Computational aspects in mechanical modeling of the articular cartilage tissue.

    PubMed

    Mohammadi, Hadi; Mequanint, Kibret; Herzog, Walter

    2013-04-01

    This review focuses on the modeling of articular cartilage (at the tissue level), chondrocyte mechanobiology (at the cell level) and a combination of both in a multiscale computation scheme. The primary objective is to evaluate the advantages and disadvantages of conventional models implemented to study the mechanics of the articular cartilage tissue and chondrocytes. From monophasic material models as the simplest form to more complicated multiscale theories, these approaches have been frequently used to model articular cartilage and have contributed significantly to modeling joint mechanics, addressing and resolving numerous issues regarding cartilage mechanics and function. It should be noted that attentiveness is important when using different modeling approaches, as the choice of the model limits the applications available. In this review, we discuss the conventional models applicable to some of the mechanical aspects of articular cartilage such as lubrication, swelling pressure and chondrocyte mechanics and address some of the issues associated with the current modeling approaches. We then suggest future pathways for a more realistic modeling strategy as applied for the simulation of the mechanics of the cartilage tissue using multiscale and parallelized finite element method.

  18. Circulation-based Modeling of Gravity Currents

    NASA Astrophysics Data System (ADS)

    Meiburg, E. H.; Borden, Z.

    2013-05-01

    Atmospheric and oceanic flows driven by predominantly horizontal density differences, such as sea breezes, thunderstorm outflows, powder snow avalanches, and turbidity currents, are frequently modeled as gravity currents. Efforts to develop simplified models of such currents date back to von Karman (1940), who considered a two-dimensional gravity current in an inviscid, irrotational and infinitely deep ambient. Benjamin (1968) presented an alternative model, focusing on the inviscid, irrotational flow past a gravity current in a finite-depth channel. More recently, Shin et al. (2004) proposed a model for gravity currents generated by partial-depth lock releases, considering a control volume that encompasses both fronts. All of the above models, in addition to the conservation of mass and horizontal momentum, invoke Bernoulli's law along some specific streamline in the flow field, in order to obtain a closed system of equations that can be solved for the front velocity as function of the current height. More recent computational investigations based on the Navier-Stokes equations, on the other hand, reproduce the dynamics of gravity currents based on the conservation of mass and momentum alone. We propose that it should therefore be possible to formulate a fundamental gravity current model without invoking Bernoulli's law. The talk will show that the front velocity of gravity currents can indeed be predicted as a function of their height from mass and momentum considerations alone, by considering the evolution of interfacial vorticity. This approach does not require information on the pressure field and therefore avoids the need for an energy closure argument such as those invoked by the earlier models. Predictions by the new theory are shown to be in close agreement with direct numerical simulation results. References Von Karman, T. 1940 The engineer grapples with nonlinear problems, Bull. Am. Math Soc. 46, 615-683. Benjamin, T.B. 1968 Gravity currents and related phenomena, J. Fluid Mech. 31, 209-248. Shin, J.O., Dalziel, S.B. and Linden, P.F. 2004 Gravity currents produced by lock exchange, J. Fluid Mech. 521, 1-34.
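
    A schematic of the circulation-based closure (Boussinesq setting, notation assumed here rather than quoted from the talk): balancing the flux of interfacial vorticity leaving a control volume against its baroclinic generation gives, for a velocity jump Δu across the interface of a current of height h with reduced gravity g',

    $$\frac{\Delta u^2}{2} = g' h,$$

    and combining this with mass conservation in the front-fixed frame, U H = u_2 (H − h), yields

    $$U = \left(1 - \frac{h}{H}\right)\sqrt{2 g' h},$$

    which recovers the classical deep-ambient Froude number Fr → √2 as h/H → 0, with no Bernoulli or energy-closure argument anywhere in the derivation.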

  19. Computational intelligence approaches for pattern discovery in biological systems.

    PubMed

    Fogel, Gary B

    2008-07-01

    Biology, chemistry and medicine are faced by tremendous challenges caused by an overwhelming amount of data and the need for rapid interpretation. Computational intelligence (CI) approaches such as artificial neural networks, fuzzy systems and evolutionary computation are being used with increasing frequency to contend with this problem, in light of noise, non-linearity and temporal dynamics in the data. Such methods can be used to develop robust models of processes either on their own or in combination with standard statistical approaches. This is especially true for database mining, where modeling is a key component of scientific understanding. This review provides an introduction to current CI methods, their application to biological problems, and concludes with a commentary about the anticipated impact of these approaches in bioinformatics.

  20. Introducing a model of cardiovascular prevention in Nairobi's slums by integrating a public health and private-sector approach: the SCALE-UP study.

    PubMed

    van de Vijver, Steven; Oti, Samuel; Tervaert, Thijs Cohen; Hankins, Catherine; Kyobutungi, Catherine; Gomez, Gabriela B; Brewster, Lizzy; Agyemang, Charles; Lange, Joep

    2013-10-21

    Cardiovascular disease (CVD) is a leading cause of death in sub-Saharan Africa (SSA), with annual deaths expected to increase to 2 million by 2030. Currently, most national health systems in SSA are not adequately prepared for this epidemic. This is especially so in slum settlements, where access to formal healthcare and resources is limited. The aim was to develop and introduce a model of cardiovascular prevention in the slums of Nairobi by integrating public health and private-sector approaches. Two non-profit organizations that conduct public health research, the Amsterdam Institute for Global Health and Development (AIGHD) and the African Population and Health Research Center (APHRC), collaborated with the private-sector Boston Consulting Group (BCG) to develop a service delivery package for CVD prevention in slum settings. A theoretical model was designed based on the integration of public and private sector approaches, with a focus on costs and feasibility. The final model includes components that aim to improve community awareness, a home-based screening service, patient and provider incentives to seek and deliver treatment specifically for hypertension, and adherence support. The expected outcomes projected by this model could prove potentially cost effective and affordable (1 USD/person/year). The model is currently being implemented in a Nairobi slum and is closely followed by key stakeholders in Kenya, including the Ministry of Health, the World Health Organization (WHO), and leading non-governmental organizations (NGOs). Through the collaboration of the public health and private sectors, a theoretically cost-effective model was developed for the prevention of CVD and is currently being implemented in the slums of Nairobi. If results are in line with the theoretical projections and first impressions on the ground, scale-up of the service delivery package could be planned in other poor urban areas in Kenya by relevant policymakers and NGOs.

  1. Assimilating the Future for Better Forecasts and Earlier Warnings

    NASA Astrophysics Data System (ADS)

    Du, H.; Wheatcroft, E.; Smith, L. A.

    2016-12-01

    Multi-model ensembles have become popular tools to account for some of the uncertainty due to model inadequacy in weather and climate simulation-based predictions. Current multi-model forecasts focus on combining single-model ensemble forecasts by means of statistical post-processing. Assuming each model is developed independently or with different primary target variables, each is likely to contain different dynamical strengths and weaknesses. Under statistical post-processing, such information is carried only by the simulations within a single model ensemble: no advantage is taken of it to influence simulations under the other models. A novel methodology, named Multi-model Cross Pollination in Time, is proposed as a multi-model ensemble scheme with the aim of integrating the dynamical information regarding the future from each individual model operationally. The proposed approach generates model states in time by applying data assimilation scheme(s) to yield truly "multi-model trajectories". It is demonstrated to outperform traditional statistical post-processing in the 40-dimensional Lorenz96 flow. Data assimilation approaches were originally designed to improve state estimation from the past to the current time. The aim of this talk is to introduce a framework that uses data assimilation to improve model forecasts at future times (not to argue for any one particular data assimilation scheme). An illustration of applying data assimilation "in the future" to provide early warning of future high-impact events is also presented.
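    The 40-dimensional Lorenz96 system used as the testbed above is a standard toy model; a minimal sketch of its integration follows (fourth-order Runge-Kutta, forcing F = 8). The cross-pollination scheme itself, which exchanges states across models via data assimilation, is not reproduced; this only sets up the dynamics on which such experiments are typically run, with arbitrary step size and spin-up length.

        import numpy as np

        def lorenz96_rhs(x, forcing=8.0):
            """Lorenz96 tendencies: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F."""
            return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

        def rk4_step(x, dt, forcing=8.0):
            """One fourth-order Runge-Kutta step of the Lorenz96 system."""
            k1 = lorenz96_rhs(x, forcing)
            k2 = lorenz96_rhs(x + 0.5 * dt * k1, forcing)
            k3 = lorenz96_rhs(x + 0.5 * dt * k2, forcing)
            k4 = lorenz96_rhs(x + dt * k3, forcing)
            return x + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

        rng = np.random.default_rng(0)
        x = 8.0 + 0.01 * rng.standard_normal(40)   # 40-dimensional state near the fixed point
        for _ in range(1000):                      # spin up onto the attractor
            x = rk4_step(x, dt=0.01)
        print(x[:3])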

  2. Suppression cost forecasts in advance of wildfire seasons

    Treesearch

    Jeffrey P. Prestemon; Karen Abt; Krista Gebert

    2008-01-01

    Approaches for forecasting wildfire suppression costs in advance of a wildfire season are demonstrated for two lead times: fall and spring of the current fiscal year (Oct. 1–Sept. 30). Model functional forms are derived from aggregate expressions of a least cost plus net value change model. Empirical estimates of these models are used to generate advance-of-season...

  3. Evaluation of Student Models on Current Socio-Scientific Topics Based on System Dynamics

    ERIC Educational Resources Information Center

    Nuhoglu, Hasret

    2014-01-01

    This study aims to 1) enable primary school students to develop models that will help them understand and analyze a system, through a learning process based on system dynamics approach, 2) examine and evaluate students' models related to socio-scientific issues using certain criteria. The research method used is a case study. The study sample…

  4. A Bayesian Model for the Estimation of Latent Interaction and Quadratic Effects When Latent Variables Are Non-Normally Distributed

    ERIC Educational Resources Information Center

    Kelava, Augustin; Nagengast, Benjamin

    2012-01-01

    Structural equation models with interaction and quadratic effects have become a standard tool for testing nonlinear hypotheses in the social sciences. Most of the current approaches assume normally distributed latent predictor variables. In this article, we present a Bayesian model for the estimation of latent nonlinear effects when the latent…

  5. A Hybrid Satellite-Terrestrial Approach to Aeronautical Communication Networks

    NASA Technical Reports Server (NTRS)

    Kerczewski, Robert J.; Chomos, Gerald J.; Griner, James H.; Mainger, Steven W.; Martzaklis, Konstantinos S.; Kachmar, Brian A.

    2000-01-01

    Rapid growth in air travel has been projected to continue for the foreseeable future. To maintain a safe and efficient national and global aviation system, significant advances in communications systems supporting aviation are required. Satellites will increasingly play a critical role in the aeronautical communications network. At the same time, current ground-based communications links, primarily very high frequency (VHF), will continue to be employed due to cost advantages and legacy issues. Hence a hybrid satellite-terrestrial network, or group of networks, will emerge. The increased complexity of future aeronautical communications networks dictates that system-level modeling be employed to obtain an optimal system fulfilling a majority of user needs. The NASA Glenn Research Center is investigating the current and potential future state of aeronautical communications, and is developing a simulation and modeling program to research future communications architectures for national and global aeronautical needs. This paper describes the primary requirements, the current infrastructure, and emerging trends of aeronautical communications, including a growing role for satellite communications. The need for a hybrid communications system architecture approach including both satellite and ground-based communications links is explained. Future aeronautical communication network topologies and key issues in simulation and modeling of future aeronautical communications systems are described.

  6. A dynamical model of plasma turbulence in the solar wind

    PubMed Central

    Howes, G. G.

    2015-01-01

    A dynamical approach, rather than the usual statistical approach, is taken to explore the physical mechanisms underlying the nonlinear transfer of energy, the damping of the turbulent fluctuations, and the development of coherent structures in kinetic plasma turbulence. It is argued that the linear and nonlinear dynamics of Alfvén waves are responsible, at a very fundamental level, for some of the key qualitative features of plasma turbulence that distinguish it from hydrodynamic turbulence, including the anisotropic cascade of energy and the development of current sheets at small scales. The first dynamical model of kinetic turbulence in the weakly collisional solar wind plasma that combines self-consistently the physics of Alfvén waves with the development of small-scale current sheets is presented and its physical implications are discussed. This model leads to a simplified perspective on the nature of turbulence in a weakly collisional plasma: the nonlinear interactions responsible for the turbulent cascade of energy and the formation of current sheets are essentially fluid in nature, while the collisionless damping of the turbulent fluctuations and the energy injection by kinetic instabilities are essentially kinetic in nature. PMID:25848075

  7. Multiscale modeling of ductile failure in metallic alloys

    NASA Astrophysics Data System (ADS)

    Pardoen, Thomas; Scheyvaerts, Florence; Simar, Aude; Tekoğlu, Cihan; Onck, Patrick R.

    2010-04-01

    Micromechanical models for ductile failure were developed in the 1970s and 1980s essentially to address cracking in structural applications and complement the fracture mechanics approach. Later, this approach became attractive to physical metallurgists interested in the prediction of failure during forming operations and as a guide for the design of more ductile and/or high-toughness microstructures. Nowadays, a realistic treatment of damage evolution in complex metallic microstructures is becoming feasible when sufficiently sophisticated constitutive laws are used within the context of a multilevel modelling strategy. The current understanding and the state-of-the-art models for the nucleation, growth and coalescence of voids are reviewed with a focus on the underlying physics. Consideration is given to the introduction of the different length scales associated with the microstructure and damage process. Two applications of the methodology are then described to illustrate the potential of the current models. The first application concerns the competition between intergranular and transgranular ductile fracture in aluminum alloys involving soft precipitate-free zones along the grain boundaries. The second application concerns the modeling of ductile failure in friction stir welded joints, a problem which also involves soft and hard zones, albeit at a larger scale.

  8. Coastal strategies to predict Escherichia coli concentrations for beaches along a 35 km stretch of southern Lake Michigan

    USGS Publications Warehouse

    Nevers, M.B.; Whitman, R.L.

    2008-01-01

    To understand the fate and movement of Escherichia coli in beach water, numerous modeling studies have been undertaken, including mechanistic predictions of currents and plumes and empirical modeling based on hydrometeorological variables. Most approaches are limited in scope by nearshore currents or physical obstacles and data limitations; few examine the issue from a larger spatial scale. Given the similarities between variables typically included in these models, we attempted to take a broader view of E. coli fluctuations by simultaneously examining twelve beaches along 35 km of Indiana's Lake Michigan coastline that includes five point-source outfalls. The beaches had similar E. coli fluctuations, and a best-fit empirical model included two variables: wave height and an interactive term comprised of wind direction and creek turbidity. Individual beach R² values were 0.32-0.50. Data training-set results were comparable to validation results (R² = 0.48). The amount of variation explained by the model was similar to previous reports for individual beaches. By extending the modeling approach to include more coastline distance, broader-scale spatial and temporal changes in bacteria concentrations and the influencing factors can be characterized. © 2008 American Chemical Society.
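    A minimal sketch of the kind of two-variable empirical model described above (wave height plus a wind-direction/turbidity interaction), fitted by ordinary least squares. The data, variable scalings, coefficients and noise level are synthetic placeholders, not the study's fitted values.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200
        wave_height = rng.gamma(2.0, 0.3, n)        # m (hypothetical)
        onshore_wind = rng.uniform(-1.0, 1.0, n)    # cosine of wind direction vs shore-normal
        creek_turbidity = rng.gamma(3.0, 10.0, n)   # NTU (hypothetical)
        interaction = onshore_wind * creek_turbidity

        # Hypothetical "true" relationship used only to generate illustrative data
        log_ecoli = 1.5 + 0.8 * wave_height + 0.02 * interaction + rng.normal(0, 0.5, n)

        # Ordinary least squares with an intercept, wave height, and the interaction term
        X = np.column_stack([np.ones(n), wave_height, interaction])
        beta, *_ = np.linalg.lstsq(X, log_ecoli, rcond=None)
        fitted = X @ beta
        r2 = 1 - np.sum((log_ecoli - fitted) ** 2) / np.sum((log_ecoli - log_ecoli.mean()) ** 2)
        print(f"coefficients: {beta.round(3)}, R^2 = {r2:.2f}")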

  9. A validated approach for modeling collapse of steel structures

    NASA Astrophysics Data System (ADS)

    Saykin, Vitaliy Victorovich

    A civil engineering structure is faced with many hazardous conditions such as blasts, earthquakes, hurricanes, tornadoes, floods, and fires during its lifetime. Even though structures are designed for credible events that can happen during a lifetime of the structure, extreme events do happen and cause catastrophic failures. Understanding the causes and effects of structural collapse is now at the core of critical areas of national need. One factor that makes studying structural collapse difficult is the lack of full-scale structural collapse experimental test results against which researchers could validate their proposed collapse modeling approaches. The goal of this work is the creation of an element deletion strategy based on fracture models for use in validated prediction of collapse of steel structures. The current work reviews the state of the art in finite element deletion strategies for use in collapse modeling of structures. It is shown that current approaches to element deletion in collapse modeling do not take into account stress triaxiality in vulnerable areas of the structure, which is important for proper fracture and element deletion modeling. The report then reviews triaxiality and its role in fracture prediction. It is shown that fracture in ductile materials is a function of triaxiality. It is also shown that, depending on the triaxiality range, different fracture mechanisms are active and should be accounted for. An approach using semi-empirical fracture models expressed as functions of triaxiality is employed. The models to determine fracture initiation, softening and subsequent finite element deletion are outlined. This procedure allows for stress-displacement softening at an integration point of a finite element in order to subsequently remove the element. This approach avoids abrupt changes in the stress that would create dynamic instabilities, thus making the results more reliable and accurate. The calibration and validation of these models are shown. The calibration is performed using a particle swarm optimization algorithm to establish accurate parameters when calibrated to circumferentially notched tensile coupons. It is shown that consistent, accurate predictions are attained using the chosen models. The variation of triaxiality in steel material during plastic hardening and softening is reported. The range of triaxiality in steel structures undergoing collapse is investigated in detail and the accuracy of the chosen finite element deletion approaches is discussed. This is done through validation of different structural components and structural frames undergoing severe fracture and collapse.

  10. Assessing socioeconomic vulnerability to dengue fever in Cali, Colombia: statistical vs expert-based modeling

    PubMed Central

    2013-01-01

    Background As a result of changes in climatic conditions and greater resistance to insecticides, many regions across the globe, including Colombia, have been facing a resurgence of vector-borne diseases, and dengue fever in particular. Timely information on both (1) the spatial distribution of the disease, and (2) prevailing vulnerabilities of the population is needed to adequately plan targeted preventive intervention. We propose a methodology for the spatial assessment of current socioeconomic vulnerabilities to dengue fever in Cali, a tropical urban environment of Colombia. Methods Based on a set of socioeconomic and demographic indicators derived from census data and ancillary geospatial datasets, we develop a spatial approach for both expert-based and purely statistical-based modeling of current vulnerability levels across 340 neighborhoods of the city using a Geographic Information System (GIS). The results of both approaches are comparatively evaluated by means of spatial statistics. A web-based approach is proposed to facilitate the visualization and the dissemination of the output vulnerability index to the community. Results The statistical and the expert-based modeling approaches exhibit high concordance, both globally and spatially. The expert-based approach indicates a slightly higher vulnerability mean (0.53) and vulnerability median (0.56) across all neighborhoods, compared to the purely statistical approach (mean = 0.48; median = 0.49). Both approaches reveal that high values of vulnerability tend to cluster in the eastern, north-eastern, and western parts of the city. These are poor neighborhoods with high percentages of young (i.e., < 15 years) and illiterate residents, as well as a high proportion of individuals who are either unemployed or doing housework. Conclusions Both modeling approaches reveal similar outputs, indicating that in the absence of local expertise, statistical approaches could be used, with caution. By decomposing identified vulnerability “hotspots” into their underlying factors, our approach provides valuable information on both (1) the location of neighborhoods, and (2) vulnerability factors that should be given priority in the context of targeted intervention strategies. The results support decision makers in allocating resources in a manner that may reduce existing susceptibilities and strengthen resilience, and thus help to reduce the burden of vector-borne diseases. PMID:23945265
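    A compact sketch contrasting the two index constructions described above: an expert-based weighted sum of normalized indicators versus a purely statistical index taken here as the first principal component. The indicator values and expert weights are synthetic placeholders, not the study's data; PCA is one common choice for the statistical variant, assumed here for illustration.

        import numpy as np

        rng = np.random.default_rng(4)
        n_hoods = 340                                  # neighborhoods, as in the study

        # Hypothetical normalized indicators in [0, 1]: % young, % illiterate, % unemployed
        X = rng.beta(2.0, 5.0, size=(n_hoods, 3))

        # Expert-based index: weighted linear combination with elicited (here: made-up) weights
        expert_w = np.array([0.4, 0.35, 0.25])
        v_expert = X @ expert_w

        # Statistical index: first principal component of the centered indicators,
        # oriented so that "higher = more vulnerable", then rescaled to [0, 1]
        Xc = X - X.mean(axis=0)
        _, _, vt = np.linalg.svd(Xc, full_matrices=False)
        pc1 = Xc @ vt[0]
        pc1 = pc1 * np.sign(np.corrcoef(pc1, v_expert)[0, 1])
        v_stat = (pc1 - pc1.min()) / (pc1.max() - pc1.min())

        print(f"expert vs statistical correlation: {np.corrcoef(v_expert, v_stat)[0, 1]:.2f}")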

  11. Electrochemical kinetic and mass transfer model for direct ethanol alkaline fuel cell (DEAFC)

    NASA Astrophysics Data System (ADS)

    Abdullah, S.; Kamarudin, S. K.; Hasran, U. A.; Masdar, M. S.; Daud, W. R. W.

    2016-07-01

    A mathematical model is developed for a liquid-feed DEAFC incorporating an alkaline anion-exchange membrane. The one-dimensional mass transport of chemical species is modelled using isothermal, single-phase and steady-state assumptions. The anode and cathode electrochemical reactions are described with a Tafel kinetics approach, with two limiting cases for the reaction order. The model fully accounts for the mixed potential effects of ethanol oxidation at the cathode due to ethanol crossover via the alkaline anion-exchange membrane. In contrast to a polymer electrolyte membrane model, the current model considers the flux of ethanol at the membrane as the difference between diffusive and electroosmotic effects. The model is used to investigate the effects of the ethanol and alkali inlet feed concentrations at the anode. The model predicts that the cell performance is almost identical for different ethanol concentrations at a low current density. Moreover, the model results show that feeding the DEAFC with 5 M NaOH and 3 M ethanol at specific operating conditions yields a better performance at a higher current density. Furthermore, the model indicates that crossover effects on the DEAFC performance are significant. The cell performance decreases from its theoretical value when a parasitic current is enabled in the model.
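    A minimal sketch of the Tafel-type polarization calculation such a model builds on: cell voltage as the reversible voltage minus anode and cathode activation overpotentials and an ohmic loss. All parameter values (E_rev, i0_a, i0_c, alpha_a, alpha_c, r_ohmic) are illustrative placeholders, not the paper's fitted DEAFC parameters, and crossover/mixed-potential effects are omitted.

        import numpy as np

        # Hypothetical parameters for illustration only
        R, T, F = 8.314, 333.0, 96485.0        # gas constant (J/mol/K), 60 C, Faraday (C/mol)
        E_rev = 1.14                            # V, approximate reversible cell voltage
        i0_a, i0_c = 1e-4, 1e-6                 # exchange current densities, A/cm^2
        alpha_a, alpha_c = 0.5, 0.5             # transfer coefficients
        r_ohmic = 0.3                           # ohm*cm^2, lumped membrane/contact resistance

        def cell_voltage(i):
            """Cell voltage from Tafel activation overpotentials plus an ohmic loss.

            Tafel approximation: eta = (R*T / (alpha*F)) * ln(i / i0), valid for i >> i0.
            """
            eta_a = (R * T) / (alpha_a * F) * np.log(i / i0_a)   # anode overpotential
            eta_c = (R * T) / (alpha_c * F) * np.log(i / i0_c)   # cathode overpotential
            return E_rev - eta_a - eta_c - i * r_ohmic

        for i in (0.01, 0.05, 0.1):             # current densities, A/cm^2
            print(f"i = {i:5.2f} A/cm^2 -> V = {cell_voltage(i):.3f} V")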

  12. Systems metabolic engineering: genome-scale models and beyond.

    PubMed

    Blazeck, John; Alper, Hal

    2010-07-01

    The advent of high throughput genome-scale bioinformatics has led to an exponential increase in available cellular system data. Systems metabolic engineering attempts to use data-driven approaches--based on the data collected with high throughput technologies--to identify gene targets and optimize phenotypic properties on a systems level. Current systems metabolic engineering tools are limited in their ability to predict and define complex phenotypes such as chemical tolerances and other global, multigenic traits. The most pragmatic systems-based tool for metabolic engineering to arise is the in silico genome-scale metabolic reconstruction. This tool has seen wide adoption for modeling cell growth and predicting beneficial gene knockouts, and we examine here how this approach can be expanded for novel organisms. This review will highlight advances of the systems metabolic engineering approach with a focus on de novo development and use of genome-scale metabolic reconstructions for metabolic engineering applications. We will then discuss the challenges and prospects for this emerging field to enable model-based metabolic engineering. Specifically, we argue that current state-of-the-art systems metabolic engineering techniques represent a viable first step for improving product yield that still must be followed by combinatorial techniques or random strain mutagenesis to achieve optimal cellular systems.

  13. Performance Analysis of GFDL's GCM Line-By-Line Radiative Transfer Model on GPU and MIC Architectures

    NASA Astrophysics Data System (ADS)

    Menzel, R.; Paynter, D.; Jones, A. L.

    2017-12-01

    Due to their relatively low computational cost, radiative transfer models in global climate models (GCMs) run on traditional CPU architectures generally consist of shortwave and longwave parameterizations over a small number of wavelength bands. With the rise of newer GPU and MIC architectures, however, the performance of high resolution line-by-line radiative transfer models may soon approach that of the physical parameterizations currently employed in GCMs. Here we present an analysis of the performance of a new line-by-line radiative transfer model under development at GFDL. Although originally designed to specifically exploit GPU architectures through the use of CUDA, the radiative transfer model has recently been extended to include OpenMP in an effort to also effectively target MIC architectures such as Intel's Xeon Phi. Using input data provided by the upcoming Radiative Forcing Model Intercomparison Project (RFMIP, as part of CMIP 6), we compare model results and performance data for various model configurations and spectral resolutions run on both GPU and Intel Knights Landing architectures to analogous runs of the standard Oxford Reference Forward Model on traditional CPUs.

  14. Future Approach to tier-0 extension

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Cordeiro, C.; Giordano, D.; Traylen, S.; Moreno García, D.

    2017-10-01

    The current tier-0 processing at CERN is done on two managed sites, the CERN computer centre and the Wigner computer centre. With the proliferation of public cloud resources at increasingly competitive prices, we have been investigating how to transparently increase our compute capacity to include these providers. The approach taken has been to integrate these resources using our existing deployment and computer management tools and to provide them in a way that exposes them to users as part of the same site. The paper will describe the architecture, the toolset and the current production experiences of this model.

  15. CFD Methods and Tools for Multi-Element Airfoil Analysis

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; George, Michael W. (Technical Monitor)

    1995-01-01

    This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, shortcomings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary layer coupling methods, and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined: one uses multi-block patched grids, the other overset chimera grids. Turbulence and transition modeling will be discussed.

  16. Forecasting need and demand for home health care: a selective review

    PubMed Central

    Sharma, Rabinder K.

    1980-01-01

    Three models for forecasting home health care (HHC) needs are analyzed: HSA/SP model (Health Systems Agency of Southwestern Pennsylvania); Florida model (Florida State Department of Health and Rehabilitative Services); and Rhode Island model (Rhode Island Department of Community Affairs). A utilization approach to forecasting is also presented. In the HSA/SP and Florida models, need for HHC is based on a certain proportion of (a) hospital admissions and (b) patients entering HHC from other sources. The major advantage of these models is that they are relatively easy to use and explain; their major weaknesses are an imprecise definition of need and an incomplete model specification. The Rhode Island approach defines need for HHC in terms of the health status of the population as measured by chronic activity limitations. The major strengths of this approach are its explicit assumptions and its emphasis on consumer needs. The major drawback is that it requires considerable local area data. The utilization approach is based on extrapolation from observed utilization experience of the target population. Its main limitation is that it is based on current market imperfections; its major advantage is that it exposes existing deficiencies in HHC. The author concludes that each approach should be tested empirically in order to refine it, and that need and demand approaches be used jointly in the planning process. PMID:6893631

  17. Dynamic output feedback control of a flexible air-breathing hypersonic vehicle via T-S fuzzy approach

    NASA Astrophysics Data System (ADS)

    Hu, Xiaoxiang; Wu, Ligang; Hu, Changhua; Wang, Zhaoqiang; Gao, Huijun

    2014-08-01

    By utilising the Takagi-Sugeno (T-S) fuzzy set approach, this paper addresses the robust H∞ dynamic output feedback control for the non-linear longitudinal model of flexible air-breathing hypersonic vehicles (FAHVs). The flight control of FAHVs is highly challenging due to the unique dynamic characteristics, the intricate couplings between the engine and flight dynamics, and external disturbances. Because of the dynamics' enormous complexity, currently only the longitudinal dynamics models of FAHVs have been used for controller design. In this work, the T-S fuzzy modelling technique is utilised to approximate the non-linear dynamics of FAHVs, and a fuzzy model is developed for the output tracking problem of FAHVs. The fuzzy model contains parameter uncertainties and disturbance, which allows it to approximate the non-linear dynamics of FAHVs more accurately. The flexible states of FAHVs are difficult to measure because of the complex dynamics and the strong couplings; thus, a full-order dynamic output feedback controller is designed for the fuzzy model. A robust H∞ controller is designed for the obtained closed-loop system. By utilising the Lyapunov functional approach, sufficient solvability conditions for such controllers are established in terms of linear matrix inequalities. Finally, the effectiveness of the proposed T-S fuzzy dynamic output feedback control method is demonstrated by numerical simulations.
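    A minimal sketch of the T-S idea used above: the non-linear dynamics are represented as a membership-weighted blend of local linear models. The two local models, the premise variable and the triangular memberships here are hypothetical stand-ins, not the FAHV longitudinal model.

        import numpy as np

        # Two hypothetical local linear models x_dot = A_i x + B_i u, scheduled on a single
        # premise variable z (e.g., an angle-of-attack-like quantity); values are illustrative.
        A1 = np.array([[0.0, 1.0], [-1.0, -0.5]])
        A2 = np.array([[0.0, 1.0], [-2.0, -0.8]])
        B1 = np.array([[0.0], [1.0]])
        B2 = np.array([[0.0], [1.5]])

        def memberships(z, z_lo=-1.0, z_hi=1.0):
            """Triangular membership functions that sum to one over [z_lo, z_hi]."""
            w2 = np.clip((z - z_lo) / (z_hi - z_lo), 0.0, 1.0)
            return 1.0 - w2, w2

        def ts_fuzzy_rhs(x, u, z):
            """Blended dynamics: the T-S model is a membership-weighted sum of local models."""
            w1, w2 = memberships(z)
            A = w1 * A1 + w2 * A2
            B = w1 * B1 + w2 * B2
            return A @ x + B @ u

        x = np.array([0.1, 0.0])
        u = np.array([0.05])
        print(ts_fuzzy_rhs(x, u, z=0.3))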

  18. Computational Approach for Improving Three-Dimensional Sub-Surface Earth Structure for Regional Earthquake Hazard Simulations in the San Francisco Bay Area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodgers, A. J.

    In our Exascale Computing Project (ECP) we seek to simulate earthquake ground motions at much higher frequency than is currently possible. Previous simulations in the SFBA were limited to 0.5-1 Hz or lower (Aagaard et al. 2008, 2010), while we have recently simulated the response to 5 Hz. In order to improve confidence in simulated ground motions, we must accurately represent the three-dimensional (3D) sub-surface material properties that govern seismic wave propagation over a broad region. We are currently focusing on the San Francisco Bay Area (SFBA) with a Cartesian domain of size 120 x 80 x 35 km, but this area will be expanded to cover a larger domain. Currently, the United States Geologic Survey (USGS) has a 3D model of the SFBA for seismic simulations. However, this model suffers from two serious shortcomings relative to our application: 1) it does not fit most of the available low frequency (< 1 Hz) seismic waveforms from moderate (magnitude M 3.5-5.0) earthquakes; and 2) it is represented with much lower resolution than necessary for the high frequency simulations (> 5 Hz) we seek to perform. The current model will serve as a starting model for full waveform tomography based on 3D sensitivity kernels. This report serves as the deliverable for our ECP FY2017 Quarter 4 milestone to FY 2018 “Computational approach to developing model updates”. We summarize the current state of 3D seismic simulations in the SFBA and demonstrate the performance of the USGS 3D model for a few selected paths. We show the available open-source waveform data sets for model updates, based on moderate earthquakes recorded in the region. We present a plan for improving the 3D model utilizing the available data and further development of our SW4 application. We project how the model could be improved and present options for further improvements focused on the shallow geotechnical layers using dense passive recordings of ambient and human-induced noise.

  19. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study.

    PubMed

    MacLean, Adam L; Harrington, Heather A; Stumpf, Michael P H; Byrne, Helen M

    2016-01-01

    The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.

  20. An online mineral dust model within the global/regional NMMB: current progress and plans

    NASA Astrophysics Data System (ADS)

    Perez, C.; Haustein, K.; Janjic, Z.; Jorba, O.; Baldasano, J. M.; Black, T.; Nickovic, S.

    2008-12-01

    While mineral dust distribution and effects are important on global scales, they strongly depend on dust emissions that occur on small spatial and temporal scales. Indeed, the accuracy of the surface wind speed used in dust models is crucial. Due to the high-order power dependency on wind friction velocity and the threshold behaviour of dust emissions, small errors in surface wind speed lead to large dust emission errors. Most global dust models use prescribed wind fields provided by major meteorological centres (e.g., NCEP and ECMWF), and their spatial resolution is currently about 1 degree x 1 degree. Such wind speeds tend to be strongly underestimated over arid and semi-arid areas and do not account for mesoscale systems responsible for a significant fraction of dust emissions regionally and globally. Other significant uncertainties in dust emissions resulting from such approaches are related to the misrepresentation of high subgrid-scale spatial heterogeneity in soil and vegetation boundary conditions, mainly in semi-arid areas. In order to significantly reduce these uncertainties, the Barcelona Supercomputing Center is currently implementing a mineral dust model coupled on-line with the new global/regional NMMB atmospheric model using the ESMF framework under development in NOAA/NCEP/EMC. The NMMB is an evolution of the operational WRF-NMM extending from meso to global scales and including a non-hydrostatic option and improved tracer advection. This model is planned to become the next-generation NCEP mesoscale model for operational weather forecasting in North America. The current implementation is based on the well-established regional dust model and forecast system Eta/DREAM (http://www.bsc.es/projects/earthscience/DREAM/). First successful global simulations show the potential of such an approach and compare well with DREAM regionally. Ongoing developments include improvements in dust size distribution representation, sedimentation, dry deposition, wet scavenging and dust-radiation feedback, as well as the efficient implementation of the model on High Performance Supercomputers for global simulations and forecasts at high resolution.

  1. Radiogenomics and radiotherapy response modeling

    NASA Astrophysics Data System (ADS)

    El Naqa, Issam; Kerns, Sarah L.; Coates, James; Luo, Yi; Speers, Corey; West, Catharine M. L.; Rosenstein, Barry S.; Ten Haken, Randall K.

    2017-08-01

    Advances in patient-specific information and biotechnology have contributed to a new era of computational medicine. Radiogenomics has emerged as a new field that investigates the role of genetics in treatment response to radiation therapy. Radiation oncology is currently attempting to embrace these recent advances and add to its rich history by maintaining its prominent role as a quantitative leader in oncologic response modeling. Here, we provide an overview of radiogenomics starting with genotyping, data aggregation, and application of different modeling approaches based on modifying traditional radiobiological methods or application of advanced machine learning techniques. We highlight the current status and potential for this new field to reshape the landscape of outcome modeling in radiotherapy and drive future advances in computational oncology.

  2. Integrative Modeling of Electrical Properties of Pacemaker Cardiac Cells

    NASA Astrophysics Data System (ADS)

    Grigoriev, M.; Babich, L.

    2016-06-01

    This work presents modeling of the electrical properties of pacemaker (sinus) cardiac cells. Special attention is paid to the electrical potential arising from the transmembrane current of Na+, K+ and Ca2+ ions. This potential is calculated using the NaCaX model. In this respect, the molar concentration of ions in the intercellular space, calculated on the basis of the GENTEX model, is essential. The combined use of two different models allows this approach to be classified as integrative modeling.

  3. A moni-modelling approach to manage groundwater risk to pesticide leaching at regional scale.

    PubMed

    Di Guardo, Andrea; Finizio, Antonio

    2016-03-01

    Historically, the approach used to manage risk of chemical contamination of water bodies is based on the use of monitoring programmes, which provide a snapshot of the presence/absence of chemicals in water bodies. Monitoring is required in the current EU regulations, such as the Water Framework Directive (WFD), as a tool to record temporal variation in the chemical status of water bodies. More recently, a number of models have been developed and used to forecast chemical contamination of water bodies. These models combine information on chemical properties, their use, and environmental scenarios. Both approaches are useful for risk assessors in decision processes. However, in our opinion, both show flaws and strengths when taken alone. This paper proposes an integrated approach (moni-modelling approach) where monitoring data and modelling simulations work together in order to provide a common decision framework for the risk assessor. This approach would be very useful, particularly for the risk management of pesticides at a territorial level. It fulfils the requirement of the recent Sustainable Use of Pesticides Directive. In fact, the moni-modelling approach could be used to identify sensitive areas where mitigation measures or restrictions on pesticide use should be implemented, but also to effectively re-design future monitoring networks or to better calibrate the pedo-climatic input data for the environmental fate models. A case study is presented, where the moni-modelling approach is applied in the Lombardy region (North of Italy) to identify groundwater areas vulnerable to pesticides. The approach has been applied to six active substances with different leaching behaviour, in order to highlight the advantages of using the proposed methodology. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Early experiences building a software quality prediction model

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.

  5. An Efficient Approach to Modeling the Topographic Control of Surface Hydrology for Regional and Global Climate Modeling.

    NASA Astrophysics Data System (ADS)

    Stieglitz, Marc; Rind, David; Famiglietti, James; Rosenzweig, Cynthia

    1997-01-01

    The current generation of land-surface models used in GCMs views the soil column as the fundamental hydrologic unit. While this may be effective in simulating such processes as the evolution of ground temperatures and the growth/ablation of a snowpack at the soil plot scale, it effectively ignores the role topography plays in the development of soil moisture heterogeneity and the subsequent impacts of this soil moisture heterogeneity on watershed evapotranspiration and the partitioning of surface fluxes. This view also ignores the role topography plays in the timing of discharge and the partitioning of discharge into surface runoff and baseflow. In this paper an approach to land-surface modeling is presented that allows us to view the watershed as the fundamental hydrologic unit. The analytic form of the TOPMODEL equations is incorporated into the soil column framework, and the resulting model is used to predict the saturated fraction of the watershed and baseflow in a consistent fashion. Soil moisture heterogeneity represented by saturated lowlands subsequently impacts the partitioning of surface fluxes, including evapotranspiration and runoff. The approach is computationally efficient, allows for a greatly improved simulation of the hydrologic cycle, and is easily coupled into the existing framework of the current generation of single-column land-surface models. Because this approach uses the statistics of the topography rather than the details of the topography, it is compatible with the large spatial scales of today's regional and global climate models. Five years of meteorological and hydrological data from the Sleepers River watershed, located in the northeastern United States where winter snow cover is significant, were used to drive the new model. Site validation data were sufficient to evaluate model performance with regard to various aspects of the watershed water balance, including snowpack growth/ablation, the spring snowmelt hydrograph, storm hydrographs, and the seasonal development of watershed evapotranspiration and soil moisture.
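    A minimal sketch of the TOPMODEL relations referred to above: local saturation deficits follow from the topographic index distribution, the saturated fraction is the area where the deficit vanishes, and baseflow decays exponentially with the mean deficit. The index distribution and the parameter values below are synthetic placeholders.

        import numpy as np

        def topmodel_saturation(lam, mean_deficit, m):
            """Saturated fraction and local deficits from the topographic index.

            lam: array of topographic index values ln(a / tan(beta)) over the watershed
            mean_deficit: watershed-average saturation deficit S_bar (m)
            m: exponential transmissivity decay parameter (m)
            Local deficit: S_i = S_bar + m * (lam_mean - lam_i); S_i <= 0 means saturated.
            """
            local_deficit = mean_deficit + m * (lam.mean() - lam)
            return (local_deficit <= 0.0).mean(), local_deficit

        def baseflow(q0, mean_deficit, m):
            """TOPMODEL baseflow recession: Q_b = Q_0 * exp(-S_bar / m)."""
            return q0 * np.exp(-mean_deficit / m)

        rng = np.random.default_rng(2)
        lam = rng.gamma(4.0, 1.5, 10_000)      # hypothetical index distribution
        sat_frac, _ = topmodel_saturation(lam, mean_deficit=0.05, m=0.02)
        print(f"saturated fraction = {sat_frac:.2f}, "
              f"baseflow = {baseflow(1e-3, 0.05, 0.02):.2e}")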

  6. Current modeling practice may lead to falsely high benchmark dose estimates.

    PubMed

    Ringblom, Joakim; Johanson, Gunnar; Öberg, Mattias

    2014-07-01

    Benchmark dose (BMD) modeling is increasingly used as the preferred approach to define the point-of-departure for health risk assessment of chemicals. As data are inherently variable, there is always a risk of selecting a model whose lower confidence bound of the BMD (BMDL), contrary to expectation, exceeds the true BMD. The aim of this study was to investigate how often and under what circumstances such anomalies occur under current modeling practice. Continuous data were generated from a realistic dose-effect curve by Monte Carlo simulations using four dose groups and a set of five different dose placement scenarios, group sizes between 5 and 50 animals and coefficients of variation of 5-15%. The BMD calculations were conducted using nested exponential models, as most BMD software use nested approaches. "Non-protective" BMDLs (higher than true BMD) were frequently observed, in some scenarios reaching 80%. The phenomenon was mainly related to the selection of the non-sigmoidal exponential model (Effect = a·e^(b·dose)). In conclusion, non-sigmoid models should be used with caution as they may underestimate the risk, illustrating that awareness of the model selection process and sound identification of the point-of-departure is vital for health risk assessment. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
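    A simplified Monte Carlo sketch of the misspecification effect described above: data generated from a sigmoidal dose-effect curve are fitted with the non-sigmoidal exponential model, and the fitted BMD frequently lands above the true BMD. This uses the BMD point estimate only; the study's BMDL (lower confidence bound) computation is not reproduced, and all curve shapes and parameters are hypothetical.

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(3)

        def true_curve(dose):
            """Hypothetical sigmoidal dose-effect curve used to generate the data."""
            return 1.0 - 0.4 * dose**3 / (0.5**3 + dose**3)

        def exp_model(dose, a, b):
            """Non-sigmoidal exponential model: Effect = a * exp(b * dose)."""
            return a * np.exp(b * dose)

        doses = np.array([0.0, 0.25, 0.5, 1.0])
        n_per_group, cv, bmr = 10, 0.10, 0.05     # group size, coeff. of variation, 5% change

        # True BMD: lowest dose giving a 5% decrease of the control response
        grid = np.linspace(0.0, 1.0, 100_001)
        true_bmd = grid[np.argmax(true_curve(grid) <= (1 - bmr) * true_curve(0.0))]

        exceed, n_sim = 0, 500
        for _ in range(n_sim):
            d = np.repeat(doses, n_per_group)
            y = true_curve(d) * (1 + cv * rng.standard_normal(d.size))
            popt, _ = curve_fit(exp_model, d, y, p0=(1.0, -0.1))
            bmd_hat = np.log(1 - bmr) / popt[1]   # solve a*e^(b*BMD) = (1 - bmr)*a
            exceed += bmd_hat > true_bmd
        print(f"true BMD = {true_bmd:.3f}; fitted BMD exceeded it in {exceed / n_sim:.0%} of runs")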

  7. Coarse Grained Model for Biological Simulations: Recent Refinements and Validation

    PubMed Central

    Vicatos, Spyridon; Rychkova, Anna; Mukherjee, Shayantani; Warshel, Arieh

    2014-01-01

    Exploring the free energy landscape of proteins and modeling the corresponding functional aspects presents a major challenge for computer simulation approaches. This challenge is due to the complexity of the landscape and the enormous computer time needed for converging simulations. The use of various simplified coarse grained (CG) models offers an effective way of sampling the landscape, but most current models are not expected to give a reliable description of protein stability and functional aspects. The main problem is associated with insufficient focus on the electrostatic features of the model. In this respect our recent CG model offers a significant advantage as it has been refined while focusing on its electrostatic free energy. Here we review the current state of our model, describing recent refinements, extensions and validation studies while focusing on demonstrating key applications. These include studies of protein stability, extending the model to include membranes, electrolytes and electrodes, as well as studies of voltage-activated proteins, protein insertion through the translocon, the action of molecular motors and even the coupling of the stalled ribosome and the translocon. These examples illustrate the general potential of our approach in overcoming major challenges in studies of structure-function correlation in proteins and large macromolecular complexes. PMID:25050439

  8. Rethinking developmental toxicity testing: Evolution or revolution?

    PubMed

    Scialli, Anthony R; Daston, George; Chen, Connie; Coder, Prägati S; Euling, Susan Y; Foreman, Jennifer; Hoberman, Alan M; Hui, Julia; Knudsen, Thomas; Makris, Susan L; Morford, LaRonda; Piersma, Aldert H; Stanislaus, Dinesh; Thompson, Kary E

    2018-06-01

    Current developmental toxicity testing adheres largely to protocols suggested in 1966 involving the administration of test compound to pregnant laboratory animals. After more than 50 years of embryo-fetal development testing, are we ready to consider a different approach to human developmental toxicity testing? A workshop was held under the auspices of the Developmental and Reproductive Toxicology Technical Committee of the ILSI Health and Environmental Sciences Institute to consider how we might design developmental toxicity testing if we started over with 21st century knowledge and techniques (revolution). We first consider what changes to the current protocols might be recommended to make them more predictive for human risk (evolution). The evolutionary approach includes modifications of existing protocols and can include humanized models, disease models, more accurate assessment and testing of metabolites, and informed approaches to dose selection. The revolution could start with hypothesis-driven testing where we take what we know about a compound or close analog and answer specific questions using targeted experimental techniques rather than a one-protocol-fits-all approach. Central to the idea of hypothesis-driven testing is the concept that testing can be done at the level of mode of action. It might be feasible to identify a small number of key events at a molecular or cellular level that predict an adverse outcome and for which testing could be performed in vitro or in silico or, rarely, using limited in vivo models. Techniques for evaluating these key events exist today or are in development. Opportunities exist for refining and then replacing current developmental toxicity testing protocols using techniques that have already been developed or are within reach. © 2018 The Authors. Birth Defects Research Published by Wiley Periodicals, Inc.

  9. What is behind the priority heuristic? A mathematical analysis and comment on Brandstätter, Gigerenzer, and Hertwig (2006).

    PubMed

    Rieger, Marc Oliver; Wang, Mei

    2008-01-01

    Comments on the article by E. Brandstätter, G. Gigerenzer, and R. Hertwig. The authors discuss the priority heuristic, a recent model for decisions under risk. They reanalyze the experimental validity of this approach and discuss how these results compare with cumulative prospect theory, the currently most established model in behavioral economics. They also discuss how general models for decisions under risk based on a heuristic approach can be understood mathematically to gain some insight into their limitations. They finally consider whether the priority heuristic model can lead to some understanding of the decision process of individuals or whether it is better seen as an as-if model. © 2008 APA, all rights reserved

  10. Probable Maximum Precipitation in the U.S. Pacific Northwest in a Changing Climate

    NASA Astrophysics Data System (ADS)

    Chen, Xiaodong; Hossain, Faisal; Leung, L. Ruby

    2017-11-01

    The safety of large and aging water infrastructures is gaining attention in water management given the accelerated rate of change in landscape, climate, and society. In current engineering practice, such safety is ensured by the design of infrastructure for the Probable Maximum Precipitation (PMP). Recently, several numerical modeling approaches have been proposed to modernize the conventional and ad hoc PMP estimation approach. However, the underlying physics have not been fully investigated and thus differing PMP estimates are sometimes obtained without physics-based interpretations. In this study, we present a hybrid approach that takes advantage of both traditional engineering practice and modern climate science to estimate PMP for current and future climate conditions. The traditional PMP approach is modified and applied to five statistically downscaled CMIP5 model outputs, producing an ensemble of PMP estimates in the Pacific Northwest (PNW) during the historical (1970-2016) and future (2050-2099) time periods. The hybrid approach produced historical PMP estimates consistent with the traditional estimates. PMP in the PNW will increase by 50% ± 30% of the current design PMP by 2099 under the RCP8.5 scenario. Most of the increase is caused by warming, which mainly affects moisture availability through increased sea surface temperature, with minor contributions from changes in storm efficiency in the future. Changes in moisture tracks tend to reduce the future PMP. Compared with extreme precipitation, PMP exhibits higher internal variability. Thus, long-term records of high-quality data in both precipitation and related meteorological fields (temperature, wind fields) are required to reduce uncertainties in the ensemble PMP estimates.
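    The traditional component of such hybrid approaches rests on in-place moisture maximization; a minimal sketch of that ratio follows. The storm values and the cap on the maximization ratio are hypothetical, and the paper applies the maximization to downscaled CMIP5 fields rather than to a single storm as here.

        def moisture_maximized_pmp(storm_depth_mm, storm_pw_mm, max_pw_mm, ratio_cap=2.0):
            """Traditional in-place moisture maximization.

            PMP candidate = observed storm depth * (climatological maximum precipitable
            water / storm precipitable water), with the ratio commonly capped.
            """
            ratio = min(max_pw_mm / storm_pw_mm, ratio_cap)
            return storm_depth_mm * ratio

        # Hypothetical storm: 300 mm in 72 h, storm PW of 40 mm, seasonal max PW of 55 mm
        print(moisture_maximized_pmp(300.0, 40.0, 55.0))   # -> 412.5 mm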

  11. Probable Maximum Precipitation in the U.S. Pacific Northwest in a Changing Climate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xiaodong; Hossain, Faisal; Leung, Lai-Yung

    2017-12-22

    The safety of large and aging water infrastructures is gaining attention in water management given the accelerated rate of change in landscape, climate and society. In current engineering practice, such safety is ensured by the design of infrastructure for the Probable Maximum Precipitation (PMP). Recently, several physics-based numerical modeling approaches have been proposed to modernize the conventional and ad hoc PMP estimation approach. However, the underlying physics has not been investigated and thus differing PMP estimates are obtained without clarity on their interpretation. In this study, we present a hybrid approach that takes advantage of both traditional engineering wisdom and modern climate science to estimate PMP for current and future climate conditions. The traditional PMP approach is improved and applied to outputs from an ensemble of five CMIP5 models. This hybrid approach is applied in the Pacific Northwest (PNW) to produce ensemble PMP estimation for the historical (1970-2016) and future (2050-2099) time periods. The new historical PMP estimates are verified by comparing them with the traditional estimates. PMP in the PNW will increase by 50% of the current level by 2099 under the RCP8.5 scenario. Most of the increase is caused by warming, which mainly affects moisture availability, with minor contributions from changes in storm efficiency in the future. Changes in moisture tracks tend to reduce the future PMP. Compared with extreme precipitation, ensemble PMP exhibits higher internal variation. Thus high-quality data of both precipitation and related meteorological fields (temperature, wind fields) are required to reduce uncertainties in the ensemble PMP estimates.

  12. Current State of the Art Historic Building Information Modelling

    NASA Astrophysics Data System (ADS)

    Dore, C.; Murphy, M.

    2017-08-01

    In an extensive review of existing literature a number of observations were made in relation to the current approaches for recording and modelling existing buildings and environments: Data collection and pre-processing techniques are becoming increasingly automated to allow for near real-time data capture and fast processing of this data for later modelling applications. Current BIM software is almost completely focused on new buildings and has very limited tools and pre-defined libraries for modelling existing and historic buildings. The development of reusable parametric library objects for existing and historic buildings supports modelling with high levels of detail while decreasing the modelling time. Mapping these parametric objects to survey data, however, is still a time-consuming task that requires further research. Promising developments have been made towards automatic object recognition and feature extraction from point clouds for as-built BIM. However, results are currently limited to simple and planar features. Further work is required for automatic accurate and reliable reconstruction of complex geometries from point cloud data. Procedural modelling can provide an automated solution for generating 3D geometries but lacks the detail and accuracy required for most as-built applications in AEC and heritage fields.

  13. Prediction of Patient-Controlled Analgesic Consumption: A Multimodel Regression Tree Approach.

    PubMed

    Hu, Yuh-Jyh; Ku, Tien-Hsiung; Yang, Yu-Hung; Shen, Jia-Ying

    2018-01-01

    Several factors contribute to individual variability in postoperative pain, therefore, individuals consume postoperative analgesics at different rates. Although many statistical studies have analyzed postoperative pain and analgesic consumption, most have identified only the correlation and have not subjected the statistical model to further tests in order to evaluate its predictive accuracy. In this study involving 3052 patients, a multistrategy computational approach was developed for analgesic consumption prediction. This approach uses data on patient-controlled analgesia demand behavior over time and combines clustering, classification, and regression to mitigate the limitations of current statistical models. Cross-validation results indicated that the proposed approach significantly outperforms various existing regression methods. Moreover, a comparison between the predictions by anesthesiologists and medical specialists and those of the computational approach for an independent test data set of 60 patients further evidenced the superiority of the computational approach in predicting analgesic consumption because it produced markedly lower root mean squared errors.

  14. Mapping the distribution of malaria: current approaches and future directions

    USGS Publications Warehouse

    Johnson, Leah R.; Lafferty, Kevin D.; McNally, Amy; Mordecai, Erin A.; Paaijmans, Krijn P.; Pawar, Samraat; Ryan, Sadie J.; Chen, Dongmei; Moulin, Bernard; Wu, Jianhong

    2015-01-01

    Mapping the distribution of malaria has received substantial attention because the disease is a major source of illness and mortality in humans, especially in developing countries. It also has a defined temporal and spatial distribution. The distribution of malaria is most influenced by its mosquito vector, which is sensitive to extrinsic environmental factors such as rainfall and temperature. Temperature also affects the development rate of the malaria parasite in the mosquito. Here, we review the range of approaches used to model the distribution of malaria, from spatially explicit to implicit, mechanistic to correlative. Although current methods have significantly improved our understanding of the factors influencing malaria transmission, significant gaps remain, particularly in incorporating nonlinear responses to temperature and temperature variability. We highlight new methods to tackle these gaps and to integrate new data with models.
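    One of the nonlinear temperature responses highlighted above is commonly represented in this literature with a Briere function for mosquito and parasite traits; a minimal sketch follows. The parameter values are hypothetical placeholders, not fitted trait values.

        import numpy as np

        def briere(temp_c, c, t_min, t_max):
            """Briere thermal response, widely used for temperature-dependent traits:
            B(T) = c * T * (T - T_min) * sqrt(T_max - T) for T_min < T < T_max, else 0.
            """
            temp_c = np.asarray(temp_c, dtype=float)
            out = np.zeros_like(temp_c)
            ok = (temp_c > t_min) & (temp_c < t_max)
            out[ok] = c * temp_c[ok] * (temp_c[ok] - t_min) * np.sqrt(t_max - temp_c[ok])
            return out

        # Hypothetical parameters for a biting-rate-like trait; peaks in the high 20s C
        temps = np.linspace(10.0, 40.0, 7)
        print(briere(temps, c=2e-4, t_min=13.0, t_max=40.0))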

  15. Modeling radiation loads in the ILC main linac and a novel approach to treat dark current

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mokhov, Nikolai V.; Rakhno, Igor L.; Tropin, Igor S.

    Electromagnetic and hadron showers generated by electrons of dark current (DC) can represent a significant radiation threat to the ILC linac equipment and personnel. In this study, a commissioning scenario is analysed which is considered as the worst-case scenario for the main linac regarding the DC contribution to the radiation environment in the tunnel. A normal operation scenario is analysed as well. Emphasis is placed on the radiation load to sensitive electronic equipment—cryogenic thermometers inside the cryomodules. Prompt and residual dose rates in the ILC main linac tunnels were also calculated in these new high-statistics runs. A novel approach was developed—as a part of the general purpose Monte Carlo code MARS15—to model generation, acceleration and transport of DC electrons in electromagnetic fields inside SRF cavities. Comparisons were made with a standard approach in which a set of pre-calculated DC electron trajectories is used, with a proper normalization, as a source for Monte Carlo modelling. Results of MARS15 Monte Carlo calculations, performed for the current main linac tunnel design, reveal that the peak absorbed dose in the cryogenic thermometers in the main tunnel for 20 years of operation is about 0.8 MGy. The calculated contact residual dose on cryomodules and tunnel walls in the main tunnel for typical irradiation and cooling conditions is 0.1 and 0.01 mSv/hr, respectively.

  16. Modeling the transition region

    NASA Technical Reports Server (NTRS)

    Singer, Bart A.

    1993-01-01

    The current status of transition-region models is reviewed in this report. To understand modeling problems, various flow features that influence the transition process are discussed first. Then an overview of the different approaches to transition-region modeling is given. This is followed by a detailed discussion of turbulence models and the specific modifications that are needed to predict flows undergoing laminar-turbulent transition. Methods for determining the usefulness of the models are presented, and an outlook for the future of transition-region modeling is suggested.

  17. Leveraging model-informed approaches for drug discovery and development in the cardiovascular space.

    PubMed

    Dockendorf, Marissa F; Vargo, Ryan C; Gheyas, Ferdous; Chain, Anne S Y; Chatterjee, Manash S; Wenning, Larissa A

    2018-06-01

    Cardiovascular disease remains a significant global health burden, and development of cardiovascular drugs in the current regulatory environment often demands large and expensive cardiovascular outcome trials. Thus, the use of quantitative pharmacometric approaches which can help enable early Go/No Go decision making, ensure appropriate dose selection, and increase the likelihood of successful clinical trials, have become increasingly important to help reduce the risk of failed cardiovascular outcomes studies. In addition, cardiovascular safety is an important consideration for many drug development programs, whether or not the drug is designed to treat cardiovascular disease; modeling and simulation approaches also have utility in assessing risk in this area. Herein, examples of modeling and simulation applied at various stages of drug development, spanning from the discovery stage through late-stage clinical development, for cardiovascular programs are presented. Examples of how modeling approaches have been utilized in early development programs across various therapeutic areas to help inform strategies to mitigate the risk of cardiovascular-related adverse events, such as QTc prolongation and changes in blood pressure, are also presented. These examples demonstrate how more informed drug development decisions can be enabled by modeling and simulation approaches in the cardiovascular area.

  18. Regional 3-D Modeling of Ground Geoelectric Field for the Northeast United States due to Realistic Geomagnetic Disturbances

    NASA Astrophysics Data System (ADS)

    Ivannikova, E.; Kruglyakov, M.; Kuvshinov, A. V.; Rastaetter, L.; Pulkkinen, A. A.; Ngwira, C. M.

    2017-12-01

    During extreme space weather events electric currents in the Earth's magnetosphere and ionosphere experience large variations, which leads to dramatic intensification of the fluctuating magnetic field at the surface of the Earth. According to Faraday's law of induction, the fluctuating geomagnetic field in turn induces electric field that generates harmful currents (so-called "geomagnetically induced currents"; GICs) in grounded technological systems. Understanding (via modeling) of the spatio-temporal evolution of the geoelectric field during enhanced geomagnetic activity is a key consideration in estimating the hazard to technological systems from space weather. We present the results of ground geoelectric field modeling for the Northeast United States, which is performed with the use of our novel numerical tool based on integral equation approach. The tool exploits realistic regional three-dimensional (3-D) models of the Earth's electrical conductivity and realistic global models of the spatio-temporal evolution of the magnetospheric and ionospheric current systems responsible for geomagnetic disturbances. We also explore in detail the manifestation of the coastal effect (anomalous intensification of the geoelectric field near the coasts) in this region.
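    For contrast with the 3-D integral-equation tool described above, the simplest widely used alternative is the 1-D plane-wave estimate, in which the surface geoelectric field follows from the magnetic field through a half-space impedance. A minimal sketch follows, with a uniform resistivity and simplified component/sign conventions; it is not the authors' 3-D method.

        import numpy as np

        MU0 = 4e-7 * np.pi   # vacuum permeability, H/m

        def geoelectric_field(b_north_t, dt, rho_ohm_m=100.0):
            """1-D plane-wave geoelectric field from a northward magnetic field series.

            In the frequency domain E(w) = Z(w) * H(w) with H = B / mu0 and the uniform
            half-space impedance Z(w) = sqrt(i * w * mu0 * rho). Components and signs
            are simplified for illustration.
            """
            n = b_north_t.size
            bw = np.fft.rfft(b_north_t)
            omega = 2.0 * np.pi * np.fft.rfftfreq(n, d=dt)
            z = np.sqrt(1j * omega * MU0 * rho_ohm_m)   # surface impedance, ohm
            return np.fft.irfft(z * bw / MU0, n=n)

        # Synthetic 1 mHz magnetic oscillation of 100 nT amplitude, sampled every 10 s
        t = np.arange(0.0, 3600.0, 10.0)
        b = 100e-9 * np.sin(2.0 * np.pi * 1e-3 * t)
        e = geoelectric_field(b, dt=10.0)
        print(f"peak |E| = {np.max(np.abs(e)) * 1e6:.1f} mV/km")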

  19. A simplified, data-constrained approach to estimate the permafrost carbon-climate feedback: The PCN Incubation-Panarctic Thermal (PInc-PanTher) Scaling Approach

    NASA Astrophysics Data System (ADS)

    Koven, C. D.; Schuur, E.; Schaedel, C.; Bohn, T. J.; Burke, E.; Chen, G.; Chen, X.; Ciais, P.; Grosse, G.; Harden, J. W.; Hayes, D. J.; Hugelius, G.; Jafarov, E. E.; Krinner, G.; Kuhry, P.; Lawrence, D. M.; MacDougall, A.; Marchenko, S. S.; McGuire, A. D.; Natali, S.; Nicolsky, D.; Olefeldt, D.; Peng, S.; Romanovsky, V. E.; Schaefer, K. M.; Strauss, J.; Treat, C. C.; Turetsky, M. R.

    2015-12-01

    We present an approach to estimating the feedback from large-scale thawing of permafrost soils using a simplified, data-constrained model that combines three elements: soil carbon (C) maps and profiles to identify the distribution and type of C in permafrost soils; incubation experiments to quantify the rates of C loss after thaw; and models of soil thermal dynamics in response to climate warming. We call the approach the Permafrost Carbon Network Incubation-Panarctic Thermal scaling approach (PInc-PanTher). The approach assumes that C stocks do not decompose at all while frozen but, once thawed, follow fixed decomposition trajectories as a function of soil temperature. The trajectories are determined by a 3-pool decomposition model fitted to incubation data, with parameters specific to each soil horizon type. We calculate the litterfall C inputs required to maintain steady-state C balance under the current climate and hold those inputs constant. Soil temperatures are taken from the soil thermal modules of ecosystem model simulations forced by a common set of future climate change anomalies under two warming scenarios over the period 2010 to 2100.
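
    A minimal numerical sketch of the pool-based scheme described above: each pool of thawed carbon decays at a first-order rate scaled by a Q10 temperature response, and frozen soil does not decompose at all. The pool sizes, base rates, and Q10 value below are illustrative placeholders, not the fitted PInc-PanTher parameters:

        import numpy as np

        def decompose(c_pools, k_base, q10, temps, dt=1.0, t_ref=5.0):
            """First-order, 3-pool decomposition of thawed permafrost carbon.

            c_pools : initial C in fast/slow/passive pools (kg C m-2)
            k_base  : decay rates at the reference temperature t_ref (yr-1)
            q10     : multiplicative rate increase per 10 C of warming
            temps   : annual mean soil temperature trajectory (C)
            Frozen soil (T <= 0 C) is assumed not to decompose at all.
            Returns cumulative C released (kg C m-2).
            """
            c = np.array(c_pools, dtype=float)
            released = 0.0
            for T in temps:
                if T <= 0.0:               # still frozen: no loss
                    continue
                k = k_base * q10 ** ((T - t_ref) / 10.0)
                loss = c * (1.0 - np.exp(-k * dt))
                c -= loss
                released += loss.sum()
            return released

        # Illustrative run: 90-year linear warming from -2 C to +4 C
        temps = np.linspace(-2.0, 4.0, 90)
        out = decompose(c_pools=[5.0, 20.0, 50.0],
                        k_base=np.array([0.5, 0.05, 0.001]),
                        q10=2.5, temps=temps)
        print(f"cumulative C release, 2010-2100: {out:.1f} kg C m^-2")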

  20. Model-based Optimization and Feedback Control of the Current Density Profile Evolution in NSTX-U

    NASA Astrophysics Data System (ADS)

    Ilhan, Zeki Okan

    Nuclear fusion research is a highly challenging, multidisciplinary field drawing on both plasma physics and multiple engineering areas. As an application of plasma control engineering, this dissertation explores methods to control the current density profile evolution in the National Spherical Torus eXperiment-Upgrade (NSTX-U), a substantial upgrade of the NSTX device at the Princeton Plasma Physics Laboratory (PPPL) in Princeton, NJ. Active control of the toroidal current density profile is among the plasma control milestones that the NSTX-U program must achieve to realize its next-step operational goals, which are characterized by high-performance, long-pulse, MHD-stable plasma operation with neutral beam heating. The aim of this work is therefore to develop model-based feedforward and feedback controllers that can regulate the current density profile in NSTX-U over time by actuating the total plasma current, the electron density, and the powers of the individual neutral beam injectors. Motivated by the coupled, nonlinear, multivariable, distributed-parameter plasma dynamics, the first step toward control design is the development of a physics-based, control-oriented model of the current profile evolution in NSTX-U in response to the non-inductive current drives and heating systems. Numerical simulations of the proposed control-oriented model show qualitative agreement with the high-fidelity physics code TRANSP. The next step is to use the proposed control-oriented model to design an open-loop actuator trajectory optimizer: given a desired operating state, the optimizer produces the actuator trajectories that can steer the plasma to that state. The feedforward control design aims to provide a more systematic approach to advanced scenario planning in NSTX-U, since such scenarios are conventionally developed experimentally by modifying the tokamak's actuator trajectories and analyzing the resulting plasma evolution. Finally, the proposed control-oriented model is embedded in feedback control schemes based on optimal control and Model Predictive Control (MPC). Integrators are added to the standard Linear Quadratic Gaussian (LQG) and MPC formulations to provide robustness against modeling uncertainties and external disturbances. The effectiveness of the proposed feedback controllers in regulating the current density profile in NSTX-U is demonstrated in closed-loop nonlinear simulations. Moreover, the optimal feedback control algorithm has been implemented successfully in closed-loop control simulations within TRANSP through the recently developed Expert routine. (Abstract shortened by ProQuest.)
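
    To illustrate the integrator augmentation the abstract refers to, the sketch below adds an integral state on the tracking error to a discrete-time LQR design for a toy two-state linear model. The system matrices are arbitrary placeholders standing in for a linearized, reduced-order current-profile response model, not the actual NSTX-U control-oriented model:

        import numpy as np
        from scipy.linalg import solve_discrete_are

        # Toy discrete-time linear model x[k+1] = A x[k] + B u[k], y = C x.
        A = np.array([[0.95, 0.10], [0.00, 0.90]])
        B = np.array([[0.00], [0.05]])
        C = np.array([[1.0, 0.0]])

        # Augment with an integrator on the tracking error e = r - y so the
        # closed loop rejects constant disturbances and model bias:
        #   z[k+1] = z[k] + (r - C x[k])
        Aa = np.block([[A, np.zeros((2, 1))], [-C, np.eye(1)]])
        Ba = np.vstack([B, np.zeros((1, 1))])

        Q = np.diag([1.0, 0.1, 5.0])   # weight the integrator state heavily
        R = np.array([[0.1]])
        P = solve_discrete_are(Aa, Ba, Q, R)
        K = np.linalg.solve(R + Ba.T @ P @ Ba, Ba.T @ P @ Aa)  # u = -K [x; z]

        # Closed-loop step-tracking simulation toward reference r
        x = np.zeros((2, 1)); z = np.zeros((1, 1)); r = 1.0
        for k in range(200):
            u = -K @ np.vstack([x, z])
            x = A @ x + B @ u
            z = z + (r - C @ x)
        print(f"steady-state output: {(C @ x).item():.4f} (reference {r})")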
